
Learn accessibility testing for web development. This guide covers general best practices, demonstrated through hands-on testing, along with tips for building accessibility testing into your workflow from the start.


Testing


So far in this course, you have learned about the individual, business, and legal aspects of digital accessibility, and the basics of digital accessibility conformance. You have explored specific topics related to inclusive design and coding, including when to use ARIA versus HTML, how to measure color contrast, when JavaScript is essential, amongst other topics.

In the remaining modules, we shift gears from designing and building to testing for accessibility. We'll utilize a three-step testing process that includes automated, manual, and assistive technology testing tools and techniques. We'll use the same demo throughout these testing modules to progress the web page from inaccessible to accessible.

Each test—automated, manual, and assistive tech—is critical to achieving the most accessible product possible.

Our tests rely on the Web Content Accessibility Guidelines (WCAG) 2.1 conformance level A and AA as our standards. Remember that your industry, product type, local/country laws and policies, or overall accessibility goals will dictate which guidelines to follow and levels to meet. If you don't require a specific standard for your project, the recommendation is to follow the latest version of WCAG. Refer back to "How is digital accessibility measured?" for general information on accessibility audits, conformance types/levels, WCAG, and POUR.

As you now know, accessibility conformance is not the full story when it comes to supporting people with disabilities. But, it's a good starting point as it provides a metric you can test against. We encourage you to take additional actions outside of accessibility conformance testing, such as running usability tests with people with disabilities, hiring people with disabilities to work on your team, or consulting an individual or company with digital accessibility expertise to help you build more inclusive products.

Automated testing basics

Automated accessibility testing uses software to scan your digital product for accessibility issues against pre-defined accessibility conformance standards.

Pros of automated accessibility tests:

  • Easy to repeat tests at different stages of the product lifecycle
  • Just a few steps to run and very quick results
  • Little accessibility knowledge is required to run the tests or understand the results

Cons of automated accessibility tests:

  • Automated tools don't catch all of the accessibility errors in your product
  • Reported false positives (an issue is reported that isn't a true WCAG violation)
  • Multiple tools may be needed for different product types and roles

Automated testing is a great first step to check your website or app for accessibility, but not all checks can be automated. We'll go into more detail on how to check the accessibility of elements that cannot be automated in the manual accessibility testing module.

Types of automated tools

One of the first online automated accessibility testing tools was developed in 1996 by the Center for Applied Special Technology (CAST), called "The Bobby Report." Today, there are over 100 automated testing tools to choose from!

Automated tool implementation varies from accessibility browser extensions to code linters, desktop and mobile applications, online dashboards, and even open-source APIs you can use to build your own automated tooling.

Which automated tool you decide to use can depend on many factors, including:

  • Which conformance standards and levels are you testing against? This may include WCAG 2.1, WCAG 2.0, U.S. Section 508, or a modified list of accessibility rules.
  • What type of digital product are you testing? This could be a website, web app, native mobile app, PDF, kiosk, or other product.
  • At what point in the software development life cycle are you testing your product?
  • How much time does it take to set up and use the tool? For an individual, team, or company?
  • Who is conducting the test: designers, developers, QA, etc.?
  • How often do you want the accessibility to be checked? What details should be included in the report? Should issues be directly linked to a ticketing system?
  • Which tools work best in your environment? For your team?

There are many additional factors to consider as well. Check out WAI's article on "Selecting Web Accessibility Evaluation Tools" for more information on how to select the best tool for you and your team.

Demo: Automated test

For the automated accessibility testing demo, we'll be using Chrome's Lighthouse. Lighthouse is an open-source, automated tool created to improve the quality of web pages through different types of audits, such as performance, SEO, and accessibility.

Our demo is a website built for a made-up organization, the Medical Mysteries Club. This site is intentionally made inaccessible for the demo. Some of this inaccessibility may be visible to you, and some (but not all) will be caught in our automated test.

Step 1

Using your Chrome browser, install the Lighthouse extension.

There are many ways to integrate Lighthouse into your testing workflow. We'll use the Chrome extension for this demo.
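For example, teams often wire Lighthouse into continuous integration with Lighthouse CI so that accessibility regressions fail the build. A minimal `lighthouserc.json` sketch, where the URL and score threshold are placeholder values you would adapt to your project:

```json
{
  "ci": {
    "collect": {
      "url": ["http://localhost:8080/"],
      "numberOfRuns": 3
    },
    "assert": {
      "assertions": {
        "categories:accessibility": ["error", { "minScore": 0.9 }]
      }
    }
  }
}
```

With a config like this, running `npx lhci autorun` collects the reports and errors out when the accessibility score drops below the threshold.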

Step 2

Medical Mysteries Club website, outside of the iframe.

We have built a demo in CodePen. View it in debug mode to proceed with the next tests. This is important, as it removes the <iframe> which surrounds the demo web page, which may interfere with some testing tools. Learn more about CodePen's debug mode.

Step 3

Open Chrome DevTools and navigate to the Lighthouse tab. Uncheck all of the category options except for "Accessibility." Keep the mode as the default and choose the device type you're running the tests on.

Medical Mysteries Club website, with the Lighthouse report DevTools panel open.

Step 4

Click the "Analyze page load" button and give Lighthouse time to run its tests.

Once the tests are complete, Lighthouse displays a score that measures how accessible the page you're testing is. The score is calculated from the number of issues detected, their types, and their impact on users.

Beyond a score, the Lighthouse report includes detailed information about what issues it has detected and links to resources to learn more about remedying them. The report also includes tests that are passed or not applicable and a list of additional items to check manually.

The Medical Mysteries Club website received a Lighthouse accessibility score of 62 in our December 2022 test.

Step 5

Now, let's go through an example of each automated accessibility issue discovered and fix the relevant styles and markup.

Issue 1: ARIA roles

The first issue states: "Elements with an ARIA [role] that require children to contain a specific [role] are missing some or all of those required children. Some ARIA parent roles must contain specific child roles to perform their intended accessibility functions." Learn more about ARIA role rules.

In our demo, the newsletter subscribe button fails:

<button role="list" type="submit" tabindex="1">Subscribe</button>
                                
Let's fix it.

The "subscribe" button next to the input field has an incorrect ARIA role applied to it. In this case, the role can be removed completely.

<button type="submit" tabindex="1">Subscribe</button>
                                

Issue 2: ARIA hidden

"[aria-hidden="true"] elements contain focusable descendants. Focusable descendants within an [aria-hidden="true"] element prevent those interactive elements from being available to users of assistive technologies like screen readers." Learn more about aria-hidden rules.

<input type="email" placeholder="Enter your e-mail address" aria-hidden="true" tabindex="-1" required>
                                
Let's fix it.

The input field had an aria-hidden="true" attribute applied to it. Adding this attribute hides the element (and everything nested under it) from assistive tech.

<input type="email" placeholder="Enter your e-mail address" tabindex="-1" required>
                                

In this case, you should remove this attribute from the input to allow people using assistive technology to access and enter information into the form field.

Issue 3: Button name

Buttons do not have an accessible name. When a button doesn't have an accessible name, screen readers announce it as "button," making it unusable for users who rely on screen readers. Learn more about button name rules.

<button role="list" type="submit" tabindex="1">Subscribe</button>
                                
Let's fix it.

When you remove the inaccurate ARIA role from the button element in issue 1, the word "Subscribe" becomes the accessible button name. This functionality is built into the semantic HTML button element. There are additional pattern options to consider for more complex situations.

<button type="submit" tabindex="1">Subscribe</button>
                                
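Under the hood, browsers and assistive technologies derive this name through the accessible name computation. The following is a greatly simplified, hypothetical sketch of the fallback behavior for a button; the real AccName algorithm has many more steps:

```javascript
// Greatly simplified sketch of accessible-name fallback for a button:
// an aria-label wins; otherwise the visible text content is used.
// (The real accessible name computation has many more steps.)
function buttonName({ ariaLabel, textContent }) {
  return (ariaLabel || textContent || '').trim();
}

console.log(buttonName({ textContent: 'Subscribe' })); // → 'Subscribe'
console.log(buttonName({})); // → '' (announced as just "button")
```

This is why a semantic `<button>` with visible text usually needs no extra attributes: the text itself becomes the accessible name.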

Issue 4: Image alt attributes

Image elements are missing [alt] attributes. Informative elements should aim for short, descriptive alternate text. Decorative elements can be ignored with an empty alt attribute. Learn more about image alternative text rules.

<a href="index.html">
  <img src="https://upload.wikimedia.org/wikipedia/commons/….png">
</a>

Let's fix it.

Since the logo image is also a link, you know from the image module that it is called an actionable image, and it requires alternative text that describes the purpose of the link. Normally, the first image on the page is a logo, so you can reasonably assume AT users will know this, and you may decide not to add that additional contextual information to your image description.

<a href="index.html">
  <img src="https://upload.wikimedia.org/wikipedia/commons/….png"
    alt="Go to the home page.">
</a>


Issue 5: Link text

Links do not have a discernible name. Link text (and alternate text for images, when used as links) that is discernible, unique, and focusable improves the navigation experience for screen reader users. Learn more about link text rules.

<a href="#!"><svg><path>...</path></svg></a>
                                
Let's fix it.

All of the actionable images on the page must include information about where the link will send users. One way to remedy this issue is to add alternative text describing the purpose of the link, as you did for the logo image in the example above. This works great for images using an <img> tag, but <svg> tags cannot use this method.

For the social media icons, which use <svg> tags, you can use a different alternative description pattern that targets SVGs: add the information between the <a> and <svg> tags and visually hide it, add a supported ARIA attribute, or choose another option. Depending on your environment and code restrictions, one method might be preferable over another. Let's use the simplest pattern option with the most assistive technology coverage: adding role="img" to the <svg> tag and including a <title> element.

<a href="#!">
  <svg role="img">
    <title>Connect on our Twitter page.</title>
    <path>...</path>
  </svg>
</a>


Issue 6: Color contrast

Background and foreground colors don't have a sufficient contrast ratio. Low-contrast text is difficult or impossible for many users to read. Learn more about color contrast rules.

Two examples were reported.

Lighthouse flags the club name: the teal contrast ratio is too low. The club name, Medical Mysteries Club, has a color hex value of #01aa9d on a background hex value of #ffffff, for a color contrast ratio of 2.9:1.

Lighthouse also flags the mermaid syndrome copy: the gray contrast ratio is too low. Mermaid syndrome has a text hex value of #7c7c7c on a background of #ffffff, for a color contrast ratio of 4.2:1.
Let's fix it.

There are many color contrast issues detected on the web page. As you learned in the color and contrast module, regular-sized text (less than 18pt / 24px) has a color contrast requirement of 4.5:1, while large-sized text (at least 18pt / 24px or 14pt / 18.5px bold) and essential icons must meet the 3:1 requirement.
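These thresholds come from WCAG's relative luminance and contrast-ratio definitions, so you can compute the ratios Lighthouse reports yourself. A minimal sketch in JavaScript, using the hex values from the demo:

```javascript
// Per-channel transform from the WCAG 2.1 relative luminance definition (sRGB).
function channel(value) {
  const c = value / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Relative luminance of a "#rrggbb" color.
function luminance(hex) {
  const [r, g, b] = [1, 3, 5].map((i) => parseInt(hex.slice(i, i + 2), 16));
  return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);
}

// Contrast ratio = (L1 + 0.05) / (L2 + 0.05), lighter color first.
function contrastRatio(fg, bg) {
  const [light, dark] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (light + 0.05) / (dark + 0.05);
}

// The failing teal from the demo: roughly 2.9:1 against white.
console.log(contrastRatio('#01aa9d', '#ffffff').toFixed(2)); // → "2.90"
// The fixed gray: roughly 4.5:1 against white.
console.log(contrastRatio('#767676', '#ffffff').toFixed(2)); // → "4.54"
```

A helper like this is handy for quickly checking candidate colors before re-running a full audit.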

For the page title, the teal-colored text needs to meet the 3:1 color contrast requirement since it is large-sized text at 24px. However, the teal buttons are considered regular-sized text at 16px bold, so they must meet the 4.5:1 color contrast requirement.

In this case, we could find a teal color that was dark enough to meet 4.5:1, or we could increase the size of the button text to 18.5px bold and change the teal color value slightly. Either method will stay in line with the design aesthetics.

All the gray text on the white background also fails for color contrast, except for the two largest headings on the page. This text must be darkened to meet the 4.5:1 color contrast requirements.

The teal has been fixed and no longer fails: the club name, Medical Mysteries Club, has been given a color value of #008576, the background remains #ffffff, and the updated color contrast ratio is 4.5:1.

The gray has been fixed as well: Mermaid syndrome now has a color value of #767676, the background remains #ffffff, and the color contrast ratio is 4.5:1.

Issue 7: List structure

List items (<li>) are not contained within <ul> or <ol> parent elements. Screen readers require list items (<li>) to be contained within a parent <ul> or <ol> to be announced properly.

Learn more about list rules.

<div class="ul">
  <li><a href="#">About</a></li>
  <li><a href="#">Community</a></li>
  <li><a href="#">Donate</a></li>
  <li><a href="#">Q&A</a></li>
  <li><a href="#">Subscribe</a></li>
</div>

Let's fix it.

We used a CSS class in this demo to simulate the unordered list instead of using a <ul> tag. When we wrote this code improperly, we removed the inherent semantic HTML features built into this tag. By replacing the class with a real <ul> tag and modifying the related CSS, we resolve this accessibility issue.

<ul>
  <li><a href="#">About</a></li>
  <li><a href="#">Community</a></li>
  <li><a href="#">Donate</a></li>
  <li><a href="#">Q&A</a></li>
  <li><a href="#">Subscribe</a></li>
</ul>


Issue 8: Tabindex

Some elements have a [tabindex] value greater than 0. A value greater than 0 implies an explicit navigation ordering. Although technically valid, this often creates frustrating experiences for users who rely on assistive technologies. Learn more about tabindex rules.

<button type="submit" tabindex="1">Subscribe</button>
                                
Let's fix it.

Unless there is a specific reason to disrupt the natural tabbing order on a web page, there is no need to have a positive integer on a tabindex attribute. To keep the natural tabbing order, we can either change the tabindex to 0 or remove the attribute altogether.

<button type="submit">Subscribe</button>
                                
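To see why positive values are disruptive, here is a toy model (a simplification of the HTML spec's sequential focus rules, with hypothetical element names): positive tabindex values come first in ascending order, then tabindex="0" and naturally focusable elements in DOM order, while negative values are skipped entirely.

```javascript
// Toy model of browser tab order: positive tabindex values first
// (ascending), then tabindex="0" / naturally focusable elements in
// DOM order; negative tabindex values are skipped entirely.
function tabOrder(elements) {
  const positive = elements
    .filter((el) => el.tabindex > 0)
    .sort((a, b) => a.tabindex - b.tabindex);
  const natural = elements.filter((el) => el.tabindex === 0);
  return [...positive, ...natural].map((el) => el.name);
}

// With tabindex="1", the Subscribe button jumps ahead of every other
// element on the page -- a confusing order for keyboard users.
const demoPage = [
  { name: 'About link', tabindex: 0 },
  { name: 'Subscribe button', tabindex: 1 },
  { name: 'Email input', tabindex: -1 }, // removed from the tab order
];
console.log(tabOrder(demoPage));
// → ['Subscribe button', 'About link']
```

Removing the positive value restores the natural DOM-based order, which is almost always what users expect.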

Step 6

Now that you've fixed all the automated accessibility issues, open up a new debug mode page. Run the Lighthouse accessibility audit again. Your score should be much better than on the first run.

Manual testing basics

Manual accessibility testing uses keyboard, visual, and cognitive tests, tools, and techniques to find issues that automated tooling cannot. As automated tooling does not cover all of the success criteria identified in WCAG, it's vital that you do not run automated accessibility tests and then stop testing!

As technology advances, more tests could be covered by automated tooling alone, but today, both manual and assistive technology checks need to be added to your testing protocols to cover all of the applicable WCAG checkpoints.

Pros of manual accessibility tests:

  • Reasonably straightforward and quick to run
  • Catch a higher percentage of issues than automated tests alone
  • Little tooling and expertise needed for success

Cons of manual accessibility tests:

  • More complex and time-consuming than automated tests
  • May be difficult to repeat at scale
  • Require more accessibility expertise to run tests and interpret the results

Let's compare what accessibility elements and details can currently be detected by an automated tool, versus those that won't be detected.

Can be automated | Can't be automated
Color contrast of text on solid backgrounds | Color contrast of text on gradients/images
Image alternative text exists | Image alternative text is accurate and is properly assigned
Headings, lists, and landmarks exist | Headings, lists, and landmarks are correctly marked up and all elements are accounted for
ARIA is present | ARIA is being used appropriately and applied to the correct element(s)
Identifying keyboard-focusable elements | Which elements are missing keyboard focus, whether the focus order makes logical sense, and whether the focus indicator is visible
iFrame title exists | iFrame title is accurate and properly describes the embedded content
Video element is present | Video element has appropriate alternative media present (such as captions and transcripts)


Types of manual tests

There are many manual tools and techniques to consider when looking at your web page or app for digital accessibility. The three biggest focus areas in manual testing are keyboard functionality, visually-focused reviews, and general content checks.

We will cover each of these topics at a high level in this module, but the following tests are not meant to be an exhaustive list of all the manual tests you can or should run. We encourage you to start with a manual accessibility checklist from a reputable source and develop your own focused manual testing checklist for your specific digital product and team needs.

Keyboard checks

It's estimated that about 25% of all digital accessibility issues are related to a lack of keyboard support. As you learned in the keyboard focus module, this affects all types of users, including sighted keyboard-only users, low-vision or blind screen reader users, and people using voice recognition software, which relies on content being keyboard accessible under the hood.

Keyboard tests answer questions such as:

  • Does the web page or feature require a mouse to function?
  • Is the tabbing order logical and intuitive?
  • Is the keyboard focus indicator always visible?
  • Can you get stuck in an element that shouldn't trap focus?
  • Can you navigate behind or around an element that should be trapping focus?
  • When closing an element that received focus, did the focus indicator return to a logical place?

While the impact of keyboard functionality is huge, the testing procedure is quite simple. All you need to do is set aside your mouse or install a small JavaScript package and test your website using only your keyboard. The following commands are essential for keyboard testing.

Key | Result
Tab | Moves forward from one active element to the next
Shift + Tab | Moves backward from one active element to the previous
Arrows | Cycle through related controls
Spacebar | Toggles states and moves down the page
Shift + Spacebar | Moves up the page
Enter | Triggers specific controls
Escape | Dismisses dynamically displayed objects

Visual checks

Visual checks focus on visual elements of the page and utilize tools such as screen magnification or browser zoom to review the website or app for accessibility.

Visual checks can tell you:

  • Are there color contrast issues that an automated tool could not pick up, such as text on top of a gradient or image?
  • Are there any elements that look like headings, lists, and other structural elements but are not coded as such?
  • Are navigation links and form inputs consistent throughout the website or app?
  • Is there any flashing, strobing, or animation that exceeds the recommendations?
  • Does the content have proper spacing? For letters, words, lines, and paragraphs?
  • Can you see all the content using a screen magnifier or browser zoom?

Content checks

Unlike visual tests that focus on layouts, movement, and colors, content checks focus on the words on the page. Not only should you be looking at the copy itself, but you should review the context to be sure it makes sense to others.

Content checks answer questions such as:

  • Are page titles, headings, and form labels clear and descriptive?
  • Are image alternatives concise, accurate, and useful?
  • Is color alone used as the only way of conveying meaning or information?
  • Are links descriptive or do you use generic text such as “read more” or “click here?”
  • Are there any changes to the language within a page?
  • Is plain language being used and are all acronyms spelled out when first referenced?

Some content checks can be automated, in part. For example, you could write a JavaScript linter that checks for "Click here" and suggests you make a change. However, these custom solutions often still need a human to change the copy to something contextual.
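Such a check could be as simple as matching link text against a list of generic phrases. A hypothetical sketch, where the phrase list is illustrative rather than exhaustive:

```javascript
// Hypothetical content check: flag links whose visible text is
// generic and gives no context on its own.
const GENERIC_LINK_TEXT = ['click here', 'read more', 'learn more', 'here'];

function findGenericLinks(linkTexts) {
  return linkTexts.filter((text) =>
    GENERIC_LINK_TEXT.includes(text.trim().toLowerCase())
  );
}

console.log(findGenericLinks(['Click here', 'Subscribe to our newsletter']));
// → ['Click here']
```

A script like this can flag the problem, but a human still needs to rewrite the flagged links into something descriptive.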

Demo: Manual test

So far, we have run automated tests on our demo web page and found and remediated eight different issue types. We are now ready to run manual checks to see if we can discover even more accessibility issues.

Step 1

Our updated CodePen demo has all of the automated accessibility updates applied.

View it in debug mode to proceed with the next tests. This is important, as it removes the <iframe> which surrounds the demo web page, which may interfere with some testing tools. Learn more about CodePen's debug mode.

Step 2

Start your manual testing process by setting your mouse or trackpad aside and navigate up and down the DOM using only your keyboard.

Issue 1: Visible focus indicator

You should see the first keyboard issue right away—or rather, you shouldn't see it—as the visible focus indicator has been removed. When you scan the CSS in the demo, you should find the dreaded “outline: none” added to the codebase.

  :focus {
    outline: none;
  }
Let's fix it.

As you learned in the Keyboard focus module, you need to remove this line of code to allow web browsers to add a visible focus for users. You can go one step further and create a focus indicator styled to meet the aesthetics of your digital product.

:focus {
  outline: 3px dotted #008576;
}

Issue 2: Focus order

Once you have modified the focus indicator and it's visible, be sure to tab through the page. As you do so, you should notice that the form input field used to subscribe to the newsletter does not receive focus. It has been removed from the natural focus order by a negative tabindex.

<input type="email" placeholder="Enter your e-mail address" tabindex="-1" required>
Let's fix it.

Since we would like people to use this field to sign-up for our newsletter, all we need to do is remove the negative tabindex or set it to zero to allow the input to become keyboard focusable again.

<input type="email" placeholder="Enter your e-mail address" required>

Step 3

Once keyboard focus has been checked, we move on to visual and content checks.

Issue 3: Link styling

As you tabbed up and down the demo page during the keyboard tests, you probably noticed that keyboard focus landed on three links in the paragraphs about the different medical conditions that are visually indistinguishable from the surrounding text.

For our page to be accessible, links must stand out from the surrounding text and include a non-color style change on mouse hover and keyboard focus.

Let's fix it.

A quick solution is to add an underline to the links inside the paragraphs to make them stand out. This would solve the accessibility issue, but it might not suit the overall design aesthetics you hope to achieve.

If you choose not to add an underline, you will need to modify the colors in such a way as to meet the requirements for both the background and copy.

When looking at the demo using a link contrast checker tool, you will see that the link color meets the 4.5:1 color contrast requirement between regular-sized text and the background. However, non-underlined links must also meet a 3:1 color contrast requirement against the surrounding text.

One option is to change the link color to match the other elements on the page. But if you change the link color to green, the body copy must also be modified to meet the overall color contrast requirements between all three elements: links, background, and surrounding text.

Screenshot of the WebAIM link contrast checker showing that the link-to-body-text contrast fails WCAG Level A. When the link and body text are the same color, the test fails.
Screenshot of WebAIM showing that all tests pass when the link color is green. When the link and body text colors are different, the test passes.

Issue 4: Icon color contrast

Another missed color contrast issue is the social media icons. In the color and contrast module, you learned that essential icons need to meet a 3:1 color contrast against the background. However, in the demo, the social media icons have a contrast ratio of 1.3:1.

Let's fix it.

To meet the 3:1 color contrast requirements, the social media icons are changed to a darker gray.

A screenshot of the demo with the color analyzer showing failing icon color contrast.

Issue 5: Content layout

If you look at the layout of the paragraph content, the text is fully justified. As you learned in the Typography module, this creates "rivers of space," which may make the text difficult for some users to read.

p.bullet {
   text-align: justify;
}
Let's fix it.

To reset the text alignment in the demo, you can update the code to text-align: left; or remove that line entirely from the CSS, as left is the default alignment for browsers. Be sure to test the code, in case other inherited styles remove the default text alignment.

p.bullet {
   text-align: left;
}

Step 4

Screenshot of the Medical Mysteries Club demo site.
All manual issues have now been addressed in the demo, as shown in this image.

Once you've identified and fixed all the manual accessibility issues outlined in the previous steps, your page should look similar to our screenshot.

It's possible that you'll find more accessibility issues in your manual checks than we covered in this module. We'll discover many of these issues in the next module.

Next step

Way to go! You have completed the automated and manual testing modules. You can view our updated CodePen, which has all the automated and manual accessibility fixes applied.

Now, head over to the last testing module focused on assistive technology testing.

This module focuses on using assistive technology (AT) for accessibility testing. People with disabilities use AT to increase, maintain, or improve their ability to perform tasks.

In the digital space, ATs can be:

  • No/Low-tech: head/mouth sticks, hand-held magnifiers, devices with large buttons
  • High-tech: voice-activated devices, eye-tracking devices, adaptive keyboards/mice
  • Hardware: switch buttons, ergonomic keyboards, refreshable Braille displays
  • Software: text-to-speech programs, live captions, screen readers

We encourage you to use multiple types of ATs in your overall testing workflow.

Screen reader testing basics

In this module, we focus on one of the most popular digital ATs, screen readers. A screen reader is a piece of software that reads the underlying code of a website or app. It then converts that information into speech or Braille output for the user.

Screen readers are essential for people who are blind and deafblind, but they also could benefit people with low vision, reading disorders, or cognitive disabilities.

Browser compatibility

There are multiple screen reader options available. The most popular screen readers today are JAWS, NVDA, and VoiceOver for desktop computers, and VoiceOver and TalkBack for mobile devices.

Depending on your operating system (OS), favorite browser, and the device that you use, one screen reader may stand out as the best option. Most screen readers are built with specific hardware and web browsers in mind. When you use a screen reader with a browser it was not calibrated for, you may encounter more "bugs" or unexpected behavior. Screen readers work best when used in the following combinations.

Screen reader | OS | Browser compatibility
Job Access With Speech (JAWS) | Windows | Chrome, Firefox, Edge
Non-Visual Desktop Access (NVDA) | Windows | Chrome and Firefox
Narrator | Windows | Edge
VoiceOver | macOS | Safari
Orca | Linux | Firefox
TalkBack | Android | Chrome and Firefox
VoiceOver (for mobile) | iOS | Safari
ChromeVox | ChromeOS | Chrome

Screen reader commands

Once you have the proper set-up for your screen reader software for your desktop or mobile device, you should look at the screen reader documentation (linked in the preceding table) and run through some essential screen reader commands to familiarize yourself with the technology. If you have used a screen reader before, consider trying out a new one!

When using a screen reader for accessibility testing, your goal is to detect problems in your code that interfere with the usage of your website or app, not to emulate the experience of a screen reader user. As such, there is a lot you can do with some foundational knowledge, a few screen reader commands, and a bit—or a lot—of practice.

If you need to further understand the user experience of people using screen readers and other ATs, you can engage with many organizations and individuals to gain this valuable insight. Remember that using an AT to test code against a set of rules and asking users about their experience often yields different results. Both are important aspects to create fully inclusive products.

Key commands for desktop screen readers

Element | NVDA (Windows) | VoiceOver (macOS)
Command | Insert (NVDA key) | Control + Option (VO key)
Stop audio | Control | Control
Read next/prev | ↓ or ↑ | VO + → or ←
Start reading | NVDA + ↓ | VO + A
Element List/Rotor | NVDA + F7 | VO + U
Landmarks | D | VO + U
Headings | H | VO + Command + H
Links | K | VO + Command + L
Form controls | F | VO + Command + J
Tables | T | VO + Command + T
Within tables | Ctrl + Alt + ↓ ↑ ← → | VO + ↓ ↑ ← →

Key commands for mobile screen readers

Element | TalkBack (Android) | VoiceOver (iOS)
Explore | Drag one finger around the screen | Drag one finger around the screen
Select or activate | Double tap | Double tap
Move up/down | Swipe up or down with two fingers | Swipe up or down with three fingers
Change pages | Swipe left or right with two fingers | Swipe left or right with three fingers
Next/previous | Swipe left/right with one finger | Swipe left/right with one finger

Screen reader testing demo

To test our demo, we used VoiceOver with Safari on a laptop running macOS and captured the audio output. You can walk through these steps using any screen reader, but the way you encounter some errors may differ from how it's described in this module.

Step 1

Visit the updated CodePen, which has all the automated and manual accessibility updates applied.

View it in debug mode to proceed with the next tests. This is important, as it removes the <iframe> which surrounds the demo webpage, which may interfere with some testing tools. Learn more about CodePen's debug mode.

Step 2

Activate the screen reader of your choice and go to the demo page. You may consider navigating through the entire page from top to bottom before focusing on specific issues.

We've recorded our screen reader output for each issue, both before and after the fixes are applied to the demo. We encourage you to run through the demo with your own screen reader.

Issue 1: Content structure

Headings and landmarks are two of the primary ways people navigate using screen readers. If they are not present, a screen reader user has to read the entire page to understand the context, which can take a lot of time and cause frustration. If you try to navigate by either element type in the demo, you will quickly discover that they do not exist.

  • Landmark example: <div class="main">...</div>
  • Heading example: <p class="h1">Join the Club</p>


Listen to the screen reader navigate through this issue.
Let's fix it.

Some inaccessible elements can't be observed by just looking at the site. You may remember the importance of heading levels and semantic HTML from the Content structure module. A piece of content may look like a heading, but the content is actually wrapped in a stylized <div>.

To fix the issue with headings and landmarks, you must first identify each element that should be marked up as such and update the related HTML. Be sure to update the related CSS as well.

Landmark example: <main>...</main>

Heading example: <h1>Join the Club</h1>

If you have updated everything correctly, there should not be any visual changes, but your screen reader experience will have dramatically improved.
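The CSS side of this change can be as simple as retargeting selectors, since the old class hooks become redundant once the semantic elements exist. A sketch, assuming the demo's stylesheet used class-based selectors like `.h1` and `.main` (the property values here are placeholders for illustration):

```css
/* Before: styles were attached to class hooks on generic elements,
   e.g. .h1 { ... } and .main { ... } */

/* After: retarget the same rules to the semantic elements, so swapping
   <p class="h1"> for <h1> and <div class="main"> for <main> causes no
   visual change. */
h1 {
  font-size: 2.5rem; /* assumed value for illustration */
}

main {
  padding: 2rem; /* assumed value for illustration */
}
```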

Now that we've fixed the content structure, listen to the screen reader navigate through the demo again.

Issue 2: Link context

It's important to give screen reader users context about the purpose of a link and whether the link redirects them to a new location outside of the website or app.

In our demo, we fixed most of the links when we updated the active image alternative text, but there are a few additional links about the various rare diseases that could benefit from additional context—especially since they redirect to a new location.

<a href="https://rarediseases.org/rare-diseases/maple-syrup-urine-disease">
  Maple syrup urine disease (MSUD)
</a>
Listen to the screen reader navigate through this issue.
Let's fix it.

To fix this issue for screen reader users, we update the code to add more information without affecting the visual elements. Or, to help even more people, such as those with reading and cognitive disorders, we may choose to add additional visible text instead.

There are many different patterns we may consider to add additional link information. Based on our simple environment that supports just one language, an ARIA label is a straightforward option in this situation. You may notice that the ARIA label overrides the original link text, so make sure to include that information in your update.

<a href="https://rarediseases.org/rare-diseases/maple-syrup-urine-disease"
  aria-label="Learn more about Maple syrup urine disease on the Rare Diseases website.">
  Maple syrup urine disease (MSUD)
</a>
Now that we've fixed the link context, listen to the screen reader navigate through the demo again.
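One alternative to the ARIA label, not used in the demo, is visually hidden text: extra words placed inside the link that are removed from the visual layout but still read by screen readers. A sketch, assuming a `visually-hidden` utility class is added to the stylesheet:

```html
<a href="https://rarediseases.org/rare-diseases/maple-syrup-urine-disease">
  Maple syrup urine disease (MSUD)
  <span class="visually-hidden">(opens the Rare Diseases website)</span>
</a>

<style>
  /* Keeps the text available to screen readers while removing it from
     the visual layout. */
  .visually-hidden {
    position: absolute;
    width: 1px;
    height: 1px;
    overflow: hidden;
    clip: rect(0 0 0 0);
    white-space: nowrap;
  }
</style>
```

Unlike `aria-label`, this pattern keeps the extra text translatable and exposed to all assistive technologies, at the cost of a small amount of extra markup.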

Issue 3: Decorative image

In our automated testing module, Lighthouse was unable to pick up on the inline SVG that acts as the main splash image on our demo page, but the screen reader finds it and announces it as "image" without additional information. This is true even without explicitly adding the role="img" attribute to the SVG.

<div class="section-right">
  <svg>...</svg>
</div>
Listen to the screen reader navigate through this issue.
Let's fix it.

To fix this issue, we first need to decide if the image is informative or decorative. Based on that decision, we need to add the appropriate image alternative text (informative image) or hide the image from screen reader users (decorative).

We weighed the pros and cons of how best to categorize the image and decided it was decorative, which means we want to add or modify the code to hide the image. A quick method is to add a role="presentation" to the SVG image directly. This sends a signal to the screen reader to skip over this image and not list it in the images group.

<div class="section-right">
  <svg role="presentation">...</svg>
</div>
Now that we've fixed the decorative image, listen to the screen reader navigate through the demo.
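An equivalent approach, not applied in the demo, is to set `aria-hidden="true"` on the SVG. Like `role="presentation"`, it removes the image from the accessibility tree; it is only safe here because the image is purely decorative, since `aria-hidden` hides the element and all of its contents from screen readers:

```html
<div class="section-right">
  <!-- aria-hidden="true" removes the decorative SVG and everything
       inside it from the accessibility tree. -->
  <svg aria-hidden="true">...</svg>
</div>
```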

Issue 4: Bullet decoration

You may have noticed that the screen reader reads the CSS bullet image under the rare diseases sections. While not the traditional type of image we discussed in the Images module, the image still must be modified as it disrupts the flow of the content and could distract or confuse a screen reader user.

<p class="bullet">...</p>
Listen to the screen reader navigate through this issue.
Let's fix it.

Much like the decorative image example discussed earlier, you can add role="presentation" to the HTML element with the bullet class to hide it from the screen reader. Similarly, role="none" would work. Just be sure not to use aria-hidden="true" on the paragraph, or you will hide all of the paragraph's information from screen reader users.

<p class="bullet" role="none">...</p>
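A more structural alternative, not applied in the demo, is to mark the items up as a genuine list. Screen readers then announce the list and item count, and the decoration moves into CSS, which assistive technology ignores. A sketch, assuming the bullet visual can be reproduced with a CSS marker (the class name and asset path are hypothetical):

```html
<ul class="disease-list">
  <!-- Real list items: screen readers announce the list and its length,
       while the decorative marker lives purely in CSS. -->
  <li>Maple syrup urine disease (MSUD)</li>
  <li>...</li>
</ul>

<style>
  .disease-list li {
    /* Hypothetical asset path for illustration. */
    list-style-image: url("bullet.svg");
  }
</style>
```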

Issue 5: Form field

In the Forms module, we learned that all form fields must have both a visual and a programmatic label. The label must remain visible at all times.

In our demo, we're missing both a visual and programmatic label on our newsletter sign-up email field. There is a text placeholder element, but this does not replace the label as it's not visually persistent and is not fully compatible with all screen readers.

<form>
  <div class="form-group">
    <input type="email" placeholder="Enter your e-mail address" required>
    <button type="submit">Subscribe</button>
  </div>
</form>
Listen to the screen reader navigate through this issue.
Let's fix it.

To fix this issue, we replace the text placeholder with a look-alike label element. The label element is programmatically connected to the form field, and movement is added with JavaScript to keep the label visible even when content is entered into the field.

<form>
  <div class="form-group">
    <input type="email" required id="youremail" name="youremail">
    <label for="youremail">Enter your e-mail address</label>
    <button type="submit" aria-label="Subscribe to our newsletter">Subscribe</button>
  </div>
</form>
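The JavaScript "movement" mentioned above can be reduced to a small piece of state logic plus event wiring. This is a sketch, not the demo's actual script: the class names `label--raised` and `label--resting` are hypothetical hooks for CSS that slides the label above the field so it never overlaps typed content.

```javascript
// Pure decision logic: the label is raised while the field is focused
// or already contains text; otherwise it rests over the empty field.
function labelClass(value, focused) {
  return value.length > 0 || focused ? "label--raised" : "label--resting";
}

// Wiring sketch (browser-only): keep the label's class in sync with the
// input on every relevant event.
function attachFloatingLabel(input, label) {
  const sync = () => {
    label.className = labelClass(input.value, document.activeElement === input);
  };
  input.addEventListener("input", sync);
  input.addEventListener("focus", sync);
  input.addEventListener("blur", sync);
  sync(); // set the initial state on page load
}
```

Because the label is a real `<label for="youremail">`, the field stays correctly announced by screen readers even if this script fails to load; the JavaScript only affects the visual treatment.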
Now that we've fixed the form, listen to the screen reader navigate through the demo.

Wrap up

Congratulations! You have completed all of the testing for this demo. You can review all of these changes in the updated CodePen.

Now, you can use what you've learned to review the accessibility of your own websites and apps.

The goal of all of this accessibility testing is to address as many issues as possible that a user may encounter. However, this does not mean that your website or app will be perfectly accessible when you're finished. You'll find the most success by designing your website or app with accessibility in mind throughout the process and incorporating these tests alongside your other pre-launch testing.

Updated on April 20, 2024 by Datarist.