Chiemezuo

How to Test a Web Platform for Accessibility

Disclaimer

Although the title of this article is generic, I will be drawing very specific instances and experiences from my time working on the Ushahidi platform so far.

Introduction

Accessibility is a topic that would have been considered very strange many years ago but is something of a 'buzzword' these days. What does it mean, though?
Simply put, and without any computer or Internet context attached to it, 'accessibility' means 'the quality of being able to be entered or reached.'
When you prepend the word 'web' to it, the puzzle starts to complete itself. 'Web accessibility' initially meant 'making websites accessible and/or navigable to people with disabilities, conditions (physical or mental), or impairments.' However, since findings have shown that even people without disabilities can have trouble navigating some websites, the definition has been expanded to include everyone.
The leading document for web accessibility is the WCAG (Web Content Accessibility Guidelines), and it will serve as a lamplight for you and me.

For an article of this nature to be practical, I will walk you through my thought process and action plan for checking how accessible a website is, using the Ushahidi client platform as my reference tool.

Note: Ushahidi is an open-source tool that enables rapid collection, management, and analysis of crowdsourced information. It is used to amplify voices in society, empower communities, and foster change in the modern day.

Setup

Chances are that if you want to test a platform for web accessibility, you're going to need to set it up. You can find the guide for creating your setup to explore the new Ushahidi client platform here. Connecting with the community's Discord server also proved helpful for me.
With that out of the way, let's get down to the action plan.

Action Plan

As with all things in software development (or life in general), it's best to have a plan, and the plan should take into account what you hope to have achieved by the end of its execution. With testing for web accessibility in mind, there were some important questions I had to ask myself about the Ushahidi platform as a starting point.

  • What is the user experience for a new user with no previous experience on this platform?
  • What would the experience be like for a blind person?
  • What would the experience be like for someone with other forms of visual impairments?
  • What would the experience be like for someone using alternative forms of navigation?
  • What would the experience be like for someone using programmatic tools to extract meaningful information?

These questions helped me prepare the types of testing to run and decide how much weight to give each metric. I decided to perform the following types of tests to match each criterion, and I will explain each form of testing in its own section.

  1. Self/Manual testing
  2. Keyboard testing
  3. Screen-reader testing
  4. Automated testing

Self/Manual testing

This is where I navigate around the way most of the world would. I look at things like navigation and how easy it is to find useful information (such as whether there's an About page, and so on). The idea is to gauge just how intuitive the website is without a manual or guide, because the sad reality is that most users rely on intuition rather than an external guide when using services.
While doing this testing, I found my first source of confusion on the platform: Posts, Surveys, and Categories had a seemingly odd relationship. To include a new category in a survey, you first needed to create the category, and only then could you add that survey to a post. However, you wouldn't realize this until you tried creating a post.
The next thing I noticed right away was that every page showed the same browser tab title: Mzima Deployment Stage.
This was what I could make out of the website using the simplest means of testing: myself.

Note: While there was no violation by Ushahidi here, something to look out for when testing manually is to inspect pictures and media files in your browser's dev tools. Often, images will have alt text, but it won't be descriptive enough to give full contextual meaning in the absence of the image itself.
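As an illustration (this snippet is my own, not from the Ushahidi codebase), compare a technically present alt attribute with a genuinely descriptive one:

```html
<!-- Present but unhelpful: a screen reader announces only "photo" -->
<img src="report-123.jpg" alt="photo">

<!-- Descriptive: conveys the meaning the image carries -->
<img src="report-123.jpg"
     alt="Flooded street in downtown Nairobi with stranded vehicles">
```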
The beauty of manual testing is that if things don't make sense to you, or if the flow seems rough for a user without impairments, it will most likely be inaccessible to a user with impairments, and the first true fix is to improve the website as it currently stands.

Keyboard testing

Testing how accessible a website is via keyboard navigation is one of the most crucial testing methods, for two major reasons:

  1. Not all users can navigate your page via a mouse or trackpad.
  2. Successful keyboard navigation is a prerequisite for screen readers or assistive technologies to work properly.

So, it's very important that a website accounts for the following cornerstones:

  • There is visible keyboard focus, which primarily allows sighted keyboard users to know which element on the page currently has focus (a minimal sketch follows this list).
  • The keyboard tab order allows users to navigate the page in a logically consistent way: with the Tab key, users should move across elements in an order that matches their arrangement on the screen.
  • There are no keyboard traps that prevent users from navigating further on a page.
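On the first point, a visible focus indicator can be tiny. Here is a minimal sketch (my own illustration, not Ushahidi's actual stylesheet):

```css
/* Show a clear outline only when focus comes from the keyboard,
   so mouse users don't see it on every click */
:focus-visible {
  outline: 3px solid #005fcc;
  outline-offset: 2px;
}
```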

It was during keyboard testing that I found the first problem with the web platform client (and likely with Angular tab navigation as a whole): at the time of writing, you cannot toggle between the login and signup buttons with the keyboard. I also spotted a problem on the 'Help & Support' screen where, even though tab navigation worked, pressing Enter produced no action.

These are some very easy things to spot, provided you give keyboard navigation a try.

Screen-reader testing

This is where things start getting fun. When I perform this test, I do it blindfolded; it gives you a sense of what a blind user will experience. The screen-reading tool I use is NVDA. Screen readers are particularly good at exposing missing or non-descriptive aria-labels.
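For example, an icon-only button is meaningless to a screen reader unless it carries an accessible name. This snippet is my own illustration, not code from the platform:

```html
<!-- Announced only as "button": the user has no idea what it does -->
<button>
  <svg aria-hidden="true"><!-- icon --></svg>
</button>

<!-- Announced as "Close dialog, button" -->
<button aria-label="Close dialog">
  <svg aria-hidden="true"><!-- icon --></svg>
</button>
```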
With a screen reader running, I try to visit every page of a website to see what the experience is like, and I navigate through each page in full. I also periodically took off my blindfold to make certain that the sounds I was hearing matched the actions going on.
Here, I found that only one screen in its current state was fully navigable by a screen reader: the Activity page. The rest were either partially or completely unreadable by my screen reader. I also noticed another problem: links that open in a new window can be very tricky for screen readers and should be avoided when possible, especially because the Ushahidi platform was set up so that most of the modals automatically close when a button or list item inside them is clicked.
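When a new window is unavoidable, a common mitigation (a general pattern, not a change I made to Ushahidi) is to announce the behavior in the link's own text:

```html
<!-- Warn assistive-technology users before the context switch -->
<a href="https://docs.example.org" target="_blank" rel="noopener noreferrer">
  Documentation (opens in a new window)
</a>
```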

Automated testing

This is the easiest type of testing, but it is quite powerful. It involves using web accessibility checkers that run checks for you automatically. The two most popular tools are Google Lighthouse and the WAVE Evaluation Tool. I use the former when I want metrics or quantified numbers such as percentages and numerical ratings, and the latter when I want a more thorough highlight of accessibility violations.
Some of the categories where automated tools outshine other forms of testing are in finding the following:

  • WCAG 1.1.1 Non-text Content: whether or not an image has alternative text.
  • WCAG 1.4.3 Contrast (Minimum): whether text and images of text meet the recommended contrast ratio (>= 4.5:1) for readability.
  • WCAG 1.4.11 Non-text Contrast: whether the colors of non-text content such as diagrams have a sufficient contrast ratio (>= 3:1) against adjacent colors.
  • WCAG 2.4.4 Link Purpose: whether links contain discernible text, and whether linked images have alternative text.
  • WCAG 2.4.6 Headings and Labels: whether forms have labels, whether webpages are missing primary heading levels, and whether headings are properly arranged.

These are just a subset of the things automated software can help you find.
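If you want those Lighthouse numbers outside the browser's DevTools panel, the CLI works too. A minimal sketch, with a placeholder URL:

```bash
npm install -g lighthouse
lighthouse https://your-deployment.example --only-categories=accessibility --view
```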

Grouping Issues

After deciding on and running the tests, I write my findings down for each page and try to group them. The advantage of this is that some accessibility problems may stem from shared template files, so fixing one template file can resolve issues site-wide.
Another reason for grouping issues is that some might share an umbrella fix. A case of this on the Ushahidi platform was that, due to stylistic choices, some forms were designed without visible labels and some buttons without visible text. The absence of these would trigger errors from automated checkers. After grouping the problems, I realized that creating a visually-hidden CSS class, which hides labels and button text from view while keeping them available to assistive technologies, was an appropriate fix: it preserves the stylistic choices and still removes the errors from checkers.
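A common community implementation of such a class looks like the following; the platform's actual rules may differ:

```css
/* Hidden visually, but still announced by screen readers */
.visually-hidden {
  position: absolute;
  width: 1px;
  height: 1px;
  padding: 0;
  margin: -1px;
  overflow: hidden;
  clip: rect(0, 0, 0, 0);
  white-space: nowrap;
  border: 0;
}
```

This roundabout technique exists because display: none or visibility: hidden would also remove the element from the accessibility tree, defeating the purpose.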

Prioritizing

That's about it for testing. If you want to progress to making fixes for the problems you have identified, the ideal next step is to prioritize issues, because some problems are simply more urgent than others. Emphasis should go first to the parts of the website with the most traffic, the ones users can hardly miss. Sections of the website that are rarely visited can be handled much later.
In addition, some flagged issues cause no real problems. They do not affect assistive technologies, alternative navigation methods, screen readers, or even regular users; they are just there. An example would be a lack of contrast between the logo and the background: it's technically an accessibility issue, but it does not affect overall site usability. Issues like this should fall lower on the list of priorities.

Conclusion

This article set out to explain my thought process when testing a website for accessibility standards compliance. I tried not to limit the scope to just one platform, as much of the information here applies to any platform. I thoroughly enjoyed every bit of the testing I did, and I strongly recommend you incorporate accessibility checks wherever you can.

Thanks for giving this a read.
Cheers.

Top comments (2)

Eriol Fox they/them 🌈🦊🇪🇺

Fantastic blog/document - this lays out how to apply critical thinking from user perspectives via the questions posed in 'Action Plan'. I really enjoyed the example sections too. Fantastic work!

Chiemezuo

Thank you!! @erioldoesdesign
:)