Key Takeaways
- Automated tools catch 25-40% of issues—essential but not sufficient alone
- Free browser extensions (WAVE, axe) handle most development testing needs
- CI/CD integration catches regressions before deployment
- Enterprise tools add monitoring, reporting, and organization features
- Combine automated testing with manual testing and user testing for comprehensive coverage
The Role of Automated Testing
The first time I ran an automated accessibility scan on a site I thought was pretty accessible, I found over 200 issues. Most were easy fixes—missing alt attributes on decorative images, insufficient color contrast on a few elements, form inputs without proper labels. The scan took seconds and saved hours of manual inspection. That's the power of automated testing tools, and that's also their limitation: they find what's machine-detectable, which is only part of the picture.
Accessibility testing tools have improved dramatically over the past few years. Modern tools can scan entire pages in seconds, identifying issues that would take hours to find manually. They check color contrast ratios against WCAG standards. They verify that images have alt attributes. They ensure form fields are properly labeled. They detect ARIA misuse. For these machine-verifiable issues, automated tools are invaluable.
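Contrast checking is a good illustration of what "machine-verifiable" means: WCAG defines the math exactly, so a tool can compute it with no judgment involved. Here's a minimal sketch of that calculation in JavaScript; the function names are mine, but the constants come straight from the WCAG 2.x definitions of relative luminance and contrast ratio.

```javascript
// Relative luminance per WCAG 2.x, from 8-bit sRGB channels.
function relativeLuminance([r, g, b]) {
  const [R, G, B] = [r, g, b].map((c) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : ((s + 0.055) / 1.055) ** 2.4;
  });
  return 0.2126 * R + 0.7152 * G + 0.0722 * B;
}

// Contrast ratio: (L1 + 0.05) / (L2 + 0.05), lighter color as L1.
// Ranges from 1:1 (identical) to 21:1 (black on white).
function contrastRatio(fg, bg) {
  const l1 = relativeLuminance(fg);
  const l2 = relativeLuminance(bg);
  const [lighter, darker] = l1 >= l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}

// WCAG AA requires 4.5:1 for normal text, 3:1 for large text.
console.log(contrastRatio([0, 0, 0], [255, 255, 255]).toFixed(1)); // "21.0"
console.log(contrastRatio([119, 119, 119], [255, 255, 255]) >= 4.5);
// false: #777 on white is about 4.48:1, narrowly failing AA
```

This is exactly the kind of check tools run thousands of times per page in milliseconds, and exactly the kind of check a human would get wrong by eyeballing it.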
But automated tools can't evaluate whether alt text is meaningful—just that it exists. They can't determine whether content makes sense when read by a screen reader. They can't assess whether keyboard navigation flows logically. They can't judge whether an interaction pattern is intuitive for users with disabilities. These evaluations require human judgment. Understanding what automated tools can and can't do is essential for using them effectively.
Tools Are Starting Points
Automated testing finds obvious issues quickly. Manual testing finds everything else. The most effective approach combines automated scanning for efficiency with manual testing for completeness. Relying on automated tools alone leaves significant accessibility gaps.
Browser Extensions
Free browser extensions are where most people should start with accessibility testing. They're easy to install, require no configuration, and provide immediate feedback as you develop. I keep both WAVE and axe DevTools installed and use them constantly during development.
WAVE (WebAIM)
WAVE has been my go-to for quick accessibility checks for years. What makes it special is the visual overlay approach—instead of showing you a list of issues, it annotates the actual page with icons indicating errors, alerts, and features. You can see exactly where problems are in the context of your layout. This visual feedback makes it excellent for learning accessibility principles, not just finding issues.
The tool categorizes issues clearly: errors are genuine WCAG violations, alerts are potential problems worth checking, and features highlight accessibility-positive elements like alt text and headings. This categorization helps you prioritize. WAVE is completely free and available for Chrome, Firefox, and Edge.
axe DevTools
The axe DevTools extension integrates directly into browser developer tools, making it feel native to the development workflow. Powered by the axe-core engine (which we'll discuss more in CI/CD integration), it provides detailed, consistent scanning. The free tier offers comprehensive automated testing; paid tiers add guided manual testing workflows.
What I appreciate about axe is the consistency—the same engine powers the browser extension, CI/CD integrations, and enterprise tools. Results are comparable across contexts. This makes it easy to use axe during development and then automate the same checks in your build pipeline.
Accessibility Insights (Microsoft)
Microsoft's Accessibility Insights offers something others don't: a guided assessment mode that walks you through manual testing procedures. FastPass handles automated checks, but the Assessment mode systematically guides you through keyboard testing, screen reader verification, and other manual checks. It also includes a tab stops visualizer that shows exactly how keyboard focus moves through a page.
This guided approach is particularly valuable for teams building accessibility skills. Instead of just knowing something needs manual testing, you get step-by-step instructions for performing that testing.
Lighthouse
Lighthouse is built into Chrome DevTools and includes accessibility as one of its audit categories alongside performance, SEO, and best practices. It's convenient for quick checks since it's always available without installing anything. However, its accessibility coverage is less thorough than dedicated tools. I use it when I want a quick general overview but rely on WAVE or axe for serious accessibility work.
| Tool | Cost | Strengths | Best For |
|---|---|---|---|
| WAVE | Free | Visual context, educational | Learning, quick checks |
| axe DevTools | Free/Paid | Detailed, consistent engine | Development workflow |
| Accessibility Insights | Free | Guided testing, tab stops | Comprehensive testing |
| Lighthouse | Free | Built-in, multi-purpose | Quick overview |
CI/CD Integration Tools
Browser extensions work during development, but they depend on someone remembering to run them. CI/CD integration makes accessibility testing automatic—every pull request, every deployment gets checked. Issues are caught before reaching production. This is where accessibility testing moves from occasional check to systematic practice.
axe-core
The axe-core JavaScript library is the engine behind axe DevTools, and it integrates with nearly every testing framework. Selenium, Playwright, Cypress, Puppeteer—there are packages for all of them. Write your functional tests normally, then add axe checks. The consistency with axe DevTools means issues found in CI match what you'd see in the browser extension.
Integration is straightforward. In a Playwright test, you import AxeBuilder from @axe-core/playwright and call its analyze() method after navigating to a page. In Cypress, you install cypress-axe, call cy.injectAxe() after the page loads, then cy.checkA11y(). The library handles scanning and reports violations. You decide what happens with that information—warnings, build failures, or logged reports.
Pa11y
Pa11y provides command-line accessibility testing that's easy to add to any CI pipeline. Point it at a URL, get accessibility results. It supports both HTML_CodeSniffer and axe as underlying engines. Pa11y Dashboard adds monitoring and reporting for ongoing tracking.
I find Pa11y particularly useful for testing production URLs rather than test environments. A scheduled job running Pa11y against your live site catches issues that slip through—new content with accessibility problems, theme changes that affect contrast, third-party scripts that add inaccessible elements.
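A scheduled job like that needs little more than a config file. Here's a minimal `.pa11yci` sketch with placeholder URLs; Pa11y CI reads this file, scans each page, and exits non-zero when any page has errors. The runner and action values shown are real Pa11y options, but treat the specifics as a starting point, not a recommended configuration.

```json
{
  "defaults": {
    "timeout": 30000,
    "runners": ["axe", "htmlcs"]
  },
  "urls": [
    "https://www.example.edu/",
    "https://www.example.edu/admissions",
    {
      "url": "https://www.example.edu/contact",
      "actions": ["wait for element #contact-form to be visible"]
    }
  ]
}
```

Run it with `npx pa11y-ci` from the directory containing the file, then wire that command into a cron job or scheduled CI workflow pointed at production.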
Benefits of CI/CD Integration
The real value of automated accessibility testing in CI/CD isn't the technology—it's the shift in how teams think about accessibility. When accessibility checks are part of the pipeline, they become normal. Developers see results on every pull request. Teams discuss whether to fail builds on errors. Accessibility becomes part of quality, not an afterthought.
Fail the Build?
Start with warnings so teams can learn without blocking work. As maturity grows, enforce standards through build failures to prevent regressions. The goal is making accessibility part of the definition of "done."
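The warnings-to-failures progression is easy to sketch in code. Every violation axe-core reports carries an impact of minor, moderate, serious, or critical; a small triage function can turn that into a build decision. The threshold policy and function names here are illustrative choices of mine, not part of axe itself.

```javascript
// Impact levels axe-core assigns to violations, least to most severe.
const IMPACT_ORDER = ["minor", "moderate", "serious", "critical"];

// Decide whether a scan should fail the build or just warn.
// `violations` follows the shape of axe-core's results array (each
// item has an `impact` field); `failAt` is our policy, not axe's.
function triage(violations, failAt = "serious") {
  const threshold = IMPACT_ORDER.indexOf(failAt);
  const failures = violations.filter(
    (v) => IMPACT_ORDER.indexOf(v.impact) >= threshold
  );
  return {
    fail: failures.length > 0,
    failures: failures.length,
    warnings: violations.length - failures.length,
  };
}

// Example: two violations from a hypothetical scan.
const scan = [
  { id: "color-contrast", impact: "serious" },
  { id: "region", impact: "moderate" },
];
console.log(triage(scan)); // fails: one serious violation, one warning
console.log(triage(scan, "critical")); // passes: two warnings
```

Starting with `failAt: "critical"` and ratcheting down over time gives teams the gradual tightening described above without changing any test code.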
Enterprise Platforms
When organizations need to manage accessibility across dozens or hundreds of sites, browser extensions and CI/CD integration aren't enough. Enterprise platforms add monitoring, reporting, issue tracking, and organizational features that help manage accessibility at scale.
Siteimprove
Siteimprove is popular in higher education for good reason. It combines accessibility scanning with SEO, analytics, and content quality tools in one platform. For universities managing many sites with limited centralized web staff, having everything in one dashboard simplifies monitoring. The accessibility scoring and issue prioritization help non-specialists understand what to fix first.
Beyond scanning, Siteimprove provides workflow tools for assigning and tracking remediation, reports for stakeholders, and trend tracking over time. This organizational layer is what justifies the enterprise cost.
Level Access (formerly SSB BART Group)
Level Access pairs their platform with professional services, which is valuable for organizations that need guidance along with tools. Their Accessibility Management Platform (AMP) provides scanning and tracking, but their expertise in accessibility compliance helps organizations understand what issues mean and how to fix them properly.
For organizations facing regulatory pressure or legal concerns around accessibility, having expert guidance alongside testing tools reduces risk.
Deque (axe Enterprise)
Deque's enterprise offering builds on axe-core, providing management features around the same engine used in free tools. This consistency is valuable—the same rules apply across browser testing, CI/CD, and enterprise monitoring. Their training resources and accessibility expertise complement the technical tools.
Pope Tech
Pope Tech offers WAVE-based scanning specifically designed for higher education. It's more affordable than some enterprise options while providing the reporting and management features institutions need. For colleges and universities specifically, Pope Tech understands the context—decentralized content creation, limited web resources, compliance requirements.
When to Go Enterprise
Enterprise platforms make sense when you need organization-wide monitoring, compliance reporting, workflow management, or when site volume exceeds what free tools handle efficiently. The cost is justified by reduced management overhead at scale.
Free Tools Still Work
Don't assume you need enterprise tools. For many organizations, free browser extensions plus CI/CD integration provide excellent coverage. Add enterprise when scale or compliance requirements genuinely justify the investment—not because it feels more professional.
Manual Testing Essentials
Automated tools catch perhaps a third of accessibility issues. The rest require human evaluation. Manual testing isn't optional—it's where you find the problems that actually block users with disabilities from using your site.
Keyboard Testing
Put your mouse aside and try navigating your site with only the keyboard. Can you reach every interactive element using Tab? Is focus always visible so you know where you are? Does the focus order make sense, or does it jump around illogically? Can you operate dropdown menus, modals, and custom widgets with keyboard alone?
Keyboard testing reveals issues that automated tools can't detect. A custom dropdown might have a focus indicator, but if the dropdown doesn't open with Enter or Space, keyboard users can't use it. That's a WCAG failure no scanner will catch.
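That dropdown failure is concrete enough to sketch. The WAI-ARIA keyboard conventions expect Enter, Space, and Arrow Down to open a closed listbox, Escape to close it, and arrow keys to move the highlight. Keeping the key-to-action policy in a pure function, an illustrative structure rather than a required one, makes the behavior testable without a browser.

```javascript
// Map a keypress on a custom dropdown trigger to an action,
// following WAI-ARIA listbox keyboard conventions. Pure function
// (no DOM) so the keyboard policy itself can be unit tested.
function dropdownKeyAction(key, isOpen) {
  if (!isOpen) {
    // Closed: Enter, Space, and Arrow Down all open the list.
    if (key === "Enter" || key === " " || key === "ArrowDown") return "open";
    return "ignore";
  }
  // Open: Escape closes, Enter selects, arrows move the highlight.
  if (key === "Escape") return "close";
  if (key === "Enter") return "select";
  if (key === "ArrowDown") return "next";
  if (key === "ArrowUp") return "previous";
  return "ignore";
}

// Wiring it to a trigger element would look roughly like:
// trigger.addEventListener("keydown", (e) => {
//   const action = dropdownKeyAction(e.key, isOpen);
//   if (action !== "ignore") e.preventDefault();
// });
```

A scanner can verify the trigger has a role and a label; only pressing the keys, or testing the policy directly, reveals whether this wiring exists at all.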
Screen Reader Testing
Actually using a screen reader transforms your understanding of accessibility. Content that seems perfectly structured visually might be confusing when read aloud. Images with technically-present alt text might have descriptions that make no sense. Forms might be technically labeled but practically unusable.
You don't need to become a screen reader expert, but basic familiarity helps. VoiceOver is built into macOS—turn it on with Command+F5 and try navigating your site. NVDA is a free, widely-used screen reader for Windows. Learning basic navigation commands takes an hour and provides insights no automated tool can offer.
Content Review
Automated tools verify that elements exist but can't evaluate quality. Alt text exists—but is it meaningful? Headings are present—but do they create logical structure? Links are labeled—but is "click here" actually helpful? Manual review ensures that technically-correct implementations actually help users.
Automation Isn't Enough
Automated scans catch the machine-detectable third of accessibility issues. Keyboard testing, screen reader testing, and content review find the rest, including the barriers that most directly block real users.
Building Your Testing Strategy
Effective accessibility testing combines multiple approaches at different stages of development. No single tool or method is sufficient. Building a layered strategy ensures comprehensive coverage without overwhelming your workflow.
- Development: Browser extensions. Run WAVE or axe while building. Catch issues immediately, when they're easy to fix. Make testing as routine as checking your layout in different browsers.
- Code review: CI/CD integration. Automated accessibility tests run on every pull request. Catch regressions before they reach production. Make accessibility part of the merge criteria.
- Deployment: Automated scanning. Scan production after deployments. Monitor for issues that slip through—new content problems, third-party script issues, dynamic content accessibility.
- Regular: Manual testing. Keyboard and screen reader testing on key user flows. Do this quarterly, after major changes, or whenever automated tests suggest areas of concern.
- Periodic: Professional audits. Expert accessibility review annually or for compliance documentation. Professional auditors catch issues internal testing misses and provide actionable remediation guidance.
The best testing strategy is one you'll actually use. A sophisticated approach that never runs is worse than a simple approach that runs consistently. Start with browser extensions, add CI/CD integration, then expand to manual testing and professional audits as your accessibility practice matures.
Getting Started
If you're not doing accessibility testing now, start today. Install WAVE or axe in your browser—it takes one minute. Run a scan on your site. You'll find issues. Fix the easy ones. That's more progress than most organizations make in months of planning.
After you have browser testing established, add keyboard navigation to your routine. Tab through your key pages. You'll immediately notice problems: invisible focus, trapped keyboard users, elements you can't reach. These are often the most impactful accessibility issues, and they're free to test.
Then consider CI/CD integration. Adding axe-core to your test suite takes an afternoon and catches regressions automatically. Every pull request gets accessibility feedback. Over time, teams internalize accessibility standards because they see results constantly.
Try using a screen reader. VoiceOver is built into your Mac. Turn it on for ten minutes and navigate your site. The experience will change how you think about accessibility—it becomes real in a way that automated scan results never achieve.
No single tool solves accessibility. The combination of automated scanning, manual testing, and real user feedback creates comprehensive coverage. Start simple, build habits, and expand your testing as your practice matures. The goal isn't perfection—it's continuous improvement toward a site that works for everyone.
Frequently Asked Questions
Can automated tools catch all accessibility issues?
No. Automated tools catch roughly 25-40% of issues, the machine-verifiable ones: missing alt attributes, contrast failures, unlabeled form fields, ARIA misuse. Judging whether alt text is meaningful, focus order is logical, or content makes sense through a screen reader requires manual testing.
Which tool should I use?
Start with a free browser extension: WAVE for visual, educational feedback, or axe DevTools if you live in the developer tools. Add CI/CD integration (axe-core or Pa11y) once browser testing is routine, and consider enterprise platforms only when scale or compliance genuinely demands them.
How often should I run accessibility tests?
Continuously, in layers: browser extensions while developing, automated checks on every pull request, production scans after deployment, manual keyboard and screen reader testing quarterly or after major changes, and a professional audit annually.
Are accessibility overlays a valid testing approach?
No. Overlay widgets attempt to patch pages in the visitor's browser; they are not testing tools and do not fix the underlying code. Find and fix issues at the source using the layered approach described here: automated scanning plus manual and screen reader testing.
Need help with accessibility testing?
I help organizations establish accessibility testing practices that fit their workflow and catch real issues. Let's discuss how to improve your accessibility testing.