
Approaching Accessibility Auditing as a Team

There are plenty of wonderful articles that go into great detail about the what and how of website accessibility audits, but I've yet to find any that delve into how to do an accessibility audit as a team. That additional layer of complexity requires a level of coordination and collaboration that can be tricky if it isn't considered early. After some trial and error honing the process, here's one way to audit as a team that's worked for me.

Target Audience

Before we get started, I want to outline a few assumptions for the target audience of this article:

  • This method should work with any digital property but is coming from a website accessibility background specifically. It's also coming from a consultancy/digital agency perspective that doesn't necessarily have a dedicated accessibility team to do this work.
  • You're an individual pulling together a team that'll be tasked with generating a website accessibility audit report.
    If you're not the ones actively doing the accessibility auditing and are instead looking for tips on how to assess and choose external experts, I highly recommend What to look for in an accessibility audit.
  • You have some understanding of the required expertise of successfully performing an accessibility audit.
    If you need some more direction here, Using Combined Expertise to Evaluate Web Accessibility is a great place to start.
  • You have some experience "shifting left" and testing for accessibility best practices from the design/development/planning side, but now need to incorporate a website audit template into your workflow.
    If you need more of an introduction to evaluating web accessibility, I highly recommend https://www.w3.org/WAI/test-evaluate/, and in particular the Easy Checks subpage.

Team-based Website Audit Checklist

This checklist is deeply condensed so you can copy it and fit it into your unique workflow; I'll go into more detail on each checklist item in the next section.

  1. Identify your team - 2-4 people of differing professional backgrounds (Design, Development, QA), expertise (knowledge of testing tools, WCAG, and disability barriers) and abilities is ideal.
  2. Define your Scope - Determine conformance target, browsers/technologies, and testing tools that will be used. Assign unique combinations of browsers and tools to each team member based on expertise and background.
  3. Explore Target Website - Have everyone on the team independently identify common pages, essential functionality, and unique page types. Combine and condense them together.
  4. Select a Representative Sample - Based on the target site exploration, determine the number of sample pages you can support testing and select them. Working individually here as well can be beneficial if you're looking for varied opinions on the sample.
  5. Audit the Selected Sample - Each team member should audit one page at a time and no two auditors should be working on the same page at the same time. One team member should be tasked with reviewing for completeness, duplication, and shared voice/tone.
  6. Accessibility Audit Report - Get the whole team together to collaborate on the Executive Summary and the Recommendations.

Team-based Website Audit Detail

If you're familiar with the guided setup for projects on Be Inclusive or the Website Accessibility Conformance Evaluation Methodology (WCAG-EM), then much of this is going to sound familiar. Feel free to skim to get the gist of how a team-based approach is woven in.

Identify your Team

Team size really depends on the size, speed, and scope of your audit needs. I've seen 2-4 people work out really well. They should be of differing professional backgrounds (Design, Development, QA), expertise (knowledge of testing tools, WCAG, and disability barriers), and abilities to get a more diverse set of observations.

Define your Scope

This is where you define your Conformance Target (our default lately tends to be WCAG 2.1 AA until 2.2 becomes official), jot down the scope of the audit and additional requirements, and determine what browsers and tools you're going to use for testing. Check out WCAG-EM Step 1: Define the Evaluation Scope for more detail on this step of the process.

Like the team size, the number of browsers and tools is likely going to depend on the size, speed, and scope of the audit, but a good rule of thumb that has worked well for me is to have at least one manual and one automated testing tool for each auditor. Different automated tools often find different things, so having a few in the mix is really helpful. There are a lot of wonderful Web Accessibility Testing Tools listed if you'd like a list to choose from.
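As a rough sketch of that "one manual plus one automated tool per auditor" rule of thumb, here's one way to spread unique browser and tool combinations across a small team. The tool names, browsers, and roles below are purely illustrative; substitute whatever your team actually uses.

```python
from itertools import cycle

# Hypothetical inventory -- swap in your own tools, browsers, and people.
manual_tools = ["Accessibility Insights", "keyboard-only pass", "VoiceOver walkthrough"]
automated_tools = ["axe DevTools", "WAVE", "Lighthouse"]
browsers = ["Firefox", "Chrome", "Safari"]
auditors = ["design lead", "front-end dev", "QA analyst"]

# Pair each auditor with a browser plus one manual and one automated tool,
# cycling through the lists so each person gets a distinct combination.
assignments = {
    auditor: {"browser": b, "manual": m, "automated": a}
    for auditor, b, m, a in zip(
        auditors, cycle(browsers), cycle(manual_tools), cycle(automated_tools)
    )
}

for auditor, combo in assignments.items():
    print(f"{auditor}: {combo['browser']} + {combo['manual']} / {combo['automated']}")
```

In practice this "assignment" usually lives in a shared document rather than code, but the idea is the same: every auditor gets a distinct vantage point.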

Assigning tools to each auditor on the team is a bit of an art form. If you're pulling in people that have an interest in learning more but currently have limited experience testing for accessibility, then try to assign them the tools that provide more guidance. A favorite of mine for these situations is Accessibility Insights because of its comprehensive 24-step guided process for semi-manual testing; it provides instructions and visual helpers every step of the way.

It's also helpful to take individual experience into account when assigning tools. There tends to be a lot of variation here, so every situation is unique, but to illustrate, someone with a strong background in:

  • Content authoring, wireframing, and information architecture would be great at checking focus order, headlines, info and relationships, and an overall meaningful sequence
  • User Interface design would excel at reviewing use of color, contrast, text size, and visually complex interfaces
  • Front-end development would be able to provide helpful remediation notes for any issues noticed in complex JavaScript-based custom interfaces, forms, animations, and input modalities

It sounds obvious, but utilizing people's strengths when picking tools can take some time to get right. There's nothing wrong with a bit of trial and error here either; with some good timing, you can have folks swap tools mid-audit if it's just not working the way you'd like.

Explore Target Website

Now it's time to explore the site and gather required technologies, capture some notes on essential functionality, and try to identify some common pages and templates. Check out WCAG-EM Step 2: Explore the Target Website for more detail on this step of the process.

This is where the benefits of a team of people auditing really starts to shine. Have everyone on the team independently explore the site and identify the following:

  • Common Web Pages
  • Essential Functionality
  • Variety of Page Types and Components

This has the dual benefit of letting everyone become more familiar with the site they're about to audit and of making it possible to compare what each person perceived as essential and common. Comparing further against what the evaluation commissioner (your client, probably) considers essential and common is crucial here as well. It can often lead to some very interesting conversations that may end up affecting details of the audit before you really get started.
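To make that comparison concrete, simple set operations are enough to surface where two independent explorations agree and where they diverge. The page URLs below are hypothetical:

```python
# Illustrative page lists from two auditors' independent site exploration.
auditor_a = {"/", "/contact", "/products", "/checkout", "/blog"}
auditor_b = {"/", "/contact", "/products", "/search", "/account"}

agreed = auditor_a & auditor_b   # pages both flagged as common/essential
discuss = auditor_a ^ auditor_b  # pages only one person flagged -- talk about these

print("Agreed sample candidates:", sorted(agreed))
print("Worth a conversation:", sorted(discuss))
```

The "worth a conversation" set is often the most valuable output: it's where differing backgrounds reveal functionality someone else overlooked.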

Select a Representative Sample

Based on the details gathered in the first two steps, this one is all about identifying the representative sample of pages and/or page states that will be audited in detail. It's often not feasible to audit every single page, so intentionally choosing samples that include common and relevant pages, essential functionality, and a variety of page components is the next best thing. Check out WCAG-EM Step 3: Select a Representative Sample for more detail on this step of the process.

Determine the number of sample pages you can support testing - again based on the size, speed, and scope of your audit. Smaller audits tend to be no less than 10 pages, unless we're dealing with a very small site or want to get the audit done very quickly.

Once you have a total number of pages defined, working individually again to select that number of representative samples can be particularly enlightening. It takes some additional time, but a more diverse set of opinions on which samples to test can often lead to further interesting conversation. Where'd you overlap and why? Where'd you differ and why?

If, after combining your lists, you have more samples than agreed upon and need to prune, some reasonable tie breakers include page complexity and analytics. Prioritizing more complex pages may lead to more helpful observations; prioritizing more highly trafficked pages may lead to better user experiences once remediated.
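Those tie breakers can be sketched as a simple sort. Everything here is hypothetical (the URLs, the 1-5 complexity scores, and the pageview counts), but it shows the pruning idea:

```python
# Hypothetical candidate pages with rough complexity scores (1-5)
# and monthly pageviews pulled from analytics.
candidates = [
    {"url": "/checkout", "complexity": 5, "views": 12000},
    {"url": "/blog", "complexity": 2, "views": 30000},
    {"url": "/account", "complexity": 4, "views": 8000},
    {"url": "/faq", "complexity": 1, "views": 5000},
]

SAMPLE_SIZE = 3  # however many pages your audit can support

# Sort by complexity first and traffic second, then keep the top N.
pruned = sorted(
    candidates, key=lambda p: (p["complexity"], p["views"]), reverse=True
)[:SAMPLE_SIZE]

print([p["url"] for p in pruned])  # -> ['/checkout', '/account', '/blog']
```

Which key comes first is a team decision; flipping the sort to lead with `views` would favor high-traffic pages instead.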

Audit the Selected Sample

Now the real fun begins! I won't get into dictating how to perform audit testing in detail here, but I do think it's important to have a well defined process as it pertains to working as a team.

The idea here is that everyone on the team is going to test every page defined in the last step, but each using the different browsers and tools they were assigned. Multiple passes from different vantage points and tools often leads to a more comprehensive list of observations.

It's going to be important to maintain communication throughout the audit: everyone should work on one page at a time, and no two auditors should be working on the same page at the same time. This is critical in order to avoid excessive duplicate observations. This way, when one person is done with a page and the next takes it on, they're going to be able to build on that progress. If the same observation is noted, they can edit the existing observation to add more detail and screenshots from a different browser or testing tool.
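The "claim a page, finish it, hand it off" discipline above can be sketched as a tiny tracker. In practice this is usually just a status column in a shared spreadsheet or audit tool, but the rules are the same:

```python
from dataclasses import dataclass, field

@dataclass
class PageTracker:
    """Minimal sketch of a shared 'who has which page' tracker."""
    in_progress: dict = field(default_factory=dict)   # page -> auditor currently on it
    completed_by: dict = field(default_factory=dict)  # page -> auditors who finished it

    def claim(self, page, auditor):
        """Claim a page only if no one else is on it and this auditor hasn't done it."""
        if page in self.in_progress:
            return False  # someone else is mid-review on this page
        if auditor in self.completed_by.get(page, []):
            return False  # this auditor has already covered the page
        self.in_progress[page] = auditor
        return True

    def finish(self, page):
        auditor = self.in_progress.pop(page)
        self.completed_by.setdefault(page, []).append(auditor)

tracker = PageTracker()
tracker.claim("/checkout", "alice")  # True: Alice starts the page
tracker.claim("/checkout", "bob")    # False: Bob picks a different page for now
tracker.finish("/checkout")
tracker.claim("/checkout", "bob")    # True: Bob builds on Alice's observations
```

The auditor names are placeholders; the point is that each page moves through the team sequentially, never in parallel.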

As everyone completes their review of a page, one team member should be tasked with reviewing all the resulting observations for completeness, duplication, and a shared voice/tone. Here are a few questions I ask myself as I do this final review of observations for a page:

  • Do I understand the issue that needs to be resolved, where to find it, and how to trigger it? Are there relevant screenshots or video to help me with this understanding?
  • Is there an estimate of priority? An estimate of effort to remediate?
  • Are there remediation notes to help the person that will be tasked with addressing this?

In my opinion, the whole point of an audit is to focus attention on how to make the site usable to as many people as possible. It's critical for the audit results to be as easy to understand - and to act on - as possible. It can sometimes be weeks or months between when an observation is logged and when it's prioritized for remediation, and the chances of remembering specifics in order to answer future questions drop considerably as time goes on.

Accessibility Audit Report

You now have a complete list of detailed and peer reviewed observations, wonderful! All that detail is going to be invaluable during remediation.

But there's one little problem.

There's a whole phase between the audit and the remediation where we'll need to provide a summary of what was observed and recommendations on how to address them. Often, we'll have to make the case for prioritizing remediation efforts into busy sprints already full of bugfixes, enhancements, and new features. A well crafted summary and carefully considered list of recommendations can provide the clarity and urgency that ensures this becomes a high priority.

I highly recommend following the Template for Accessibility Evaluation Reports; get the whole team together to collaborate on the Executive Summary and the Recommendations in particular. This is yet another area where that diversity of thought and opinion will strengthen the end result.

TL;DR

I hope this provides a useful peek into how to do an accessibility audit with a team of people. Here are some key points I want to highlight:

  • Pay it forward, working with multiple people is a great way to get someone relatively new to accessibility testing up to speed.
  • Be sure to take advantage of the diversity of experience, opinions, and expertise of your team every step of the way. It may take a bit more time but the results are often much better.
  • For best results, everyone should work on one page at a time and no two auditors should be working on the same page at the same time.
  • Review all the observations for completeness, duplication, and a shared voice/tone. Also, put significant effort into the Executive Summary and Recommendations. They're both vital differentiators for the next steps of the process.

Wrapping Up

As you may have guessed, all the above advice has deeply influenced the workflow and features available on Be Inclusive. It was built from the start to make the accessibility audit process easier to manage, both for individuals and for teams, including:

  • the same three step process (Define Scope, Explore Website, Select Sample) for creating projects
  • helpful tooltips every step of the way
  • a customizable audit report that generates stats and charts
  • easy export options to quickly get set up in Jira, Azure Dev Ops, or any workflow that allows CSV imports

I hope you consider giving it a try with the no obligation 7 day free trial. Happy auditing!

Have any feedback about this blog post? Let us know!
