Automated Accessibility Testing in 2025: What Actually Works
Hard truth: Automated accessibility testing finds 20-35% of WCAG issues.
The other 65-80%? Manual testing, user testing, or production bugs.
But here's the thing: that 20-35% accounts for roughly 80% of the issues you'd otherwise ship, low-hanging fruit like missing alt text, poor color contrast, and broken ARIA.
Zander Whitehurst's accessibility audits show the same pattern: automated testing prevents 80% of user complaints despite catching only 35% of potential violations.
Here's how to build a testing stack that actually works.
The Limits of Automation
What Automation Can Catch
✅ WCAG Level A (mostly):
- Missing alt attributes
- Color contrast ratios
- HTML validity
- Form labels
- Heading hierarchy
- Landmark regions
✅ Some Level AA:
- Focus indicators (only obvious failures, such as outlines removed outright)
- Resize text (zoom)
- Some keyboard navigation
✅ Common ARIA mistakes:
- Invalid roles
- Missing required attributes
- Incorrect relationships
What Automation Cannot Catch
❌ Context-dependent issues:
- Alt text that's present but meaningless ("image.png")
- Heading text that's not descriptive
- Link text like "click here"
❌ Complex interactions:
- Custom keyboard navigation
- Focus traps in modals
- Live regions that don't announce
- Single-page app route changes
❌ User experience:
- Whether captions are accurate
- If the page makes sense to screen readers
- Cognitive load for users with disabilities
The Testing Stack
Layer 1: Linting (Pre-Commit)
Tool: eslint-plugin-jsx-a11y
// .eslintrc.json
{
  "extends": ["plugin:jsx-a11y/recommended"],
  "rules": {
    "jsx-a11y/alt-text": "error",
    "jsx-a11y/anchor-is-valid": "error",
    "jsx-a11y/aria-props": "error",
    "jsx-a11y/aria-role": "error",
    "jsx-a11y/label-has-associated-control": "error"
  }
}
Catches: ARIA misuse, missing labels, invalid roles
Coverage: ~15% of WCAG issues
Speed: Instant (pre-commit hook)
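To actually run this at commit time, wire ESLint into your staged files. A minimal sketch, assuming husky + lint-staged (neither is prescribed above; any pre-commit runner works):

// lint-staged.config.mjs: run the a11y-aware ESLint config on staged files only
// (assumes husky and lint-staged are installed)
export default {
  '*.{js,jsx,ts,tsx}': 'eslint --max-warnings=0',
};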
Layer 2: Component Testing (CI)
Tool: @axe-core/react + jest-axe
import { render } from '@testing-library/react';
import { axe, toHaveNoViolations } from 'jest-axe';

expect.extend(toHaveNoViolations);

test('Button is accessible', async () => {
  const { container } = render(<Button>Click me</Button>);
  const results = await axe(container);
  expect(results).toHaveNoViolations();
});
Catches: Component-level issues (contrast, focus, ARIA)
Coverage: ~25% of WCAG issues
Speed: <1min per test
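The single test above only covers the default render, but components often regress in specific states (disabled, loading, error), so audit each one. A sketch; the Button import path and the loading prop are assumptions:

import { render } from '@testing-library/react';
import { axe, toHaveNoViolations } from 'jest-axe';
import { Button } from './Button'; // hypothetical import path

expect.extend(toHaveNoViolations);

// Audit each visual state, not just the default render
const states = [
  { name: 'default', props: {} },
  { name: 'disabled', props: { disabled: true } },
  { name: 'loading', props: { loading: true } }, // assumes Button has a loading prop
];

for (const { name, props } of states) {
  test(`Button (${name}) has no axe violations`, async () => {
    const { container } = render(<Button {...props}>Click me</Button>);
    expect(await axe(container)).toHaveNoViolations();
  });
}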
Layer 3: E2E Testing (Pre-Deploy)
Tool: @axe-core/playwright
import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

test('Checkout flow is accessible', async ({ page }) => {
  await page.goto('/checkout');
  const results = await new AxeBuilder({ page })
    .withTags(['wcag2a', 'wcag2aa', 'wcag21a', 'wcag21aa'])
    .analyze();
  expect(results.violations).toEqual([]);
});
Catches: Page-level issues (landmarks, headings, tab order)
Coverage: ~35% of WCAG issues
Speed: 5-30s per page
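The same pattern scales to your whole route map: loop over a route list and generate one test per page. A sketch; the routes below are illustrative:

import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

// Swap in your own user-facing routes
const routes = ['/', '/pricing', '/checkout', '/account'];

for (const route of routes) {
  test(`${route} has no WCAG A/AA violations`, async ({ page }) => {
    await page.goto(route);
    const results = await new AxeBuilder({ page })
      .withTags(['wcag2a', 'wcag2aa', 'wcag21a', 'wcag21aa'])
      .analyze();
    expect(results.violations).toEqual([]);
  });
}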
Layer 4: Continuous Monitoring (Production)
Tool: VertaaUX API (or similar)
# Daily cron job
curl -X POST https://vertaaux.ai/api/v1/audit \
  -H "Authorization: Bearer ${API_KEY}" \
  -H "Content-Type: application/json" \
  -d '{"url": "https://app.example.com/dashboard", "mode": "deep"}'
Catches: Regressions, new pages, dynamic content issues
Coverage: ~35% of WCAG issues (same as E2E, but on real production)
Speed: 8-15s per page
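If you'd rather fail a CI cron job than read email reports, wrap the same call in a small script. A sketch only: the endpoint and request body come from the curl example above, but the response shape (a violations array) and the env var name are assumptions:

// audit.mts: run from a daily CI cron with Node 18+ (global fetch, top-level await)
const API_KEY = process.env.VERTAAUX_API_KEY; // hypothetical env var name

async function auditPage(url: string) {
  const res = await fetch('https://vertaaux.ai/api/v1/audit', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ url, mode: 'deep' }),
  });
  if (!res.ok) throw new Error(`Audit failed for ${url}: ${res.status}`);
  return res.json() as Promise<{ violations?: unknown[] }>; // assumed response field
}

const pages = ['https://app.example.com/dashboard']; // from the curl example
for (const url of pages) {
  const report = await auditPage(url);
  if (report.violations?.length) {
    console.error(`${url}: ${report.violations.length} violations`);
    process.exitCode = 1; // fail the job so the regression gets noticed
  }
}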
The Missing Piece: Manual Testing
Even with perfect automation, you need manual accessibility testing.
When to Manual Test
- Before major releases (quarterly)
- After redesigns
- New interaction patterns (custom date pickers, drag-and-drop)
- User-reported issues
Manual Testing Checklist
Keyboard navigation (15 min):
- Tab through entire page
- All interactive elements reachable?
- Focus indicators visible?
- Modal traps focus correctly?
- Esc closes dialogs?
Screen reader test (30 min):
- Use NVDA (Windows) or VoiceOver (Mac)
- Navigate by headings (H key in NVDA; the rotor in VoiceOver)
- Navigate by landmarks (D key in NVDA; the rotor in VoiceOver)
- Forms announce labels correctly?
- Buttons announce purpose?
- Page updates announce via live regions?
Zoom test (10 min):
- 200% zoom doesn't break layout
- Text doesn't truncate
- No horizontal scrolling up to 400% zoom (WCAG 1.4.10 Reflow)
Common Pitfalls
1. Over-Reliance on Automation
Mistake: "Axe passes, ship it!"
Reality: Axe can't detect meaningless alt text:
<!-- Passes automation, fails manual review -->
<img src="hero.jpg" alt="image" />
<!-- Correct -->
<img src="hero.jpg" alt="Developer reviewing accessibility test results in code editor" />
2. Testing in Isolation
Mistake: Testing components in Storybook only
Reality: Accessibility is contextual. A component might be accessible alone but cause issues when composed:
// The component is accessible in isolation
<Modal><Form /></Modal>

// But in context...
<Page>
  <SkipLink /> {/* Hidden by the modal's z-index */}
  <Modal><Form /></Modal>
</Page>
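One mitigation: run the page-level audit with the modal actually open, so axe sees the composed DOM instead of the isolated component. (It still won't catch the z-index occlusion itself; that's what the manual keyboard pass is for.) A sketch with a hypothetical route, trigger label, and dialog:

import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

test('page stays accessible with the modal open', async ({ page }) => {
  await page.goto('/settings');                                     // hypothetical route
  await page.getByRole('button', { name: 'Edit profile' }).click(); // hypothetical trigger
  await expect(page.getByRole('dialog')).toBeVisible();

  // Audit the whole page in its composed state
  const results = await new AxeBuilder({ page }).analyze();
  expect(results.violations).toEqual([]);
});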
3. Ignoring False Positives
Mistake: Disabling rules because they fire incorrectly
Reality: Fix the code, not the test:
// Bad: disable the rule
<button aria-label="Close"> {/* eslint-disable-line */}
  <X />
</button>

// Good: fix the issue
<button aria-label="Close dialog">
  <X aria-hidden="true" />
</button>
Advanced Techniques
1. Accessibility Regression Testing
Track violations over time:
import fs from 'node:fs';
import AxeBuilder from '@axe-core/playwright';

// Save a baseline once
const baseline = await new AxeBuilder({ page }).analyze();
fs.writeFileSync('a11y-baseline.json', JSON.stringify(baseline));

// Later: compare the current run against the baseline
const current = await new AxeBuilder({ page }).analyze();
const newViolations = current.violations.filter(v =>
  !baseline.violations.some(b => b.id === v.id)
);
expect(newViolations).toEqual([]);
2. Custom Axe Rules
import axe from 'axe-core';

axe.configure({
  // Checks hold the evaluate logic; rules bind checks to a selector
  checks: [
    {
      id: 'descriptive-link-text',
      evaluate: (node) => {
        const text = node.textContent.trim().toLowerCase();
        return text.length > 0 && !['click here', 'read more', 'learn more'].includes(text);
      },
      metadata: {
        impact: 'serious',
        messages: {
          pass: 'Link text is descriptive',
          fail: 'Avoid generic link text'
        }
      }
    }
  ],
  rules: [
    {
      id: 'link-name-required',
      enabled: true,
      selector: 'a[href]',
      any: ['descriptive-link-text'],
      metadata: {
        description: 'Links must have descriptive text',
        help: 'Avoid generic link text'
      }
    }
  ]
});
3. Focus Order Testing
import { test } from '@playwright/test';

test('Focus order is logical', async ({ page }) => {
  await page.goto('/checkout');
  const focusOrder = [];
  await page.keyboard.press('Tab');
  while (focusOrder.length < 20) {
    const focused = await page.evaluate(() => ({
      tag: document.activeElement?.tagName,
      text: document.activeElement?.textContent?.trim().slice(0, 30),
      id: document.activeElement?.id
    }));
    // Stop once we revisit an element; compare tag + id + text because many
    // focusable elements have no id
    const key = (f) => `${f.tag}|${f.id}|${f.text}`;
    if (focusOrder.some(f => key(f) === key(focused))) break;
    focusOrder.push(focused);
    await page.keyboard.press('Tab');
  }
  // Log for manual review: does the focus order match the visual order?
  console.log('Focus order:', focusOrder);
});
Metrics to Track
Coverage Metrics
type A11yMetrics = {
  totalPages: number;
  pagesWithViolations: number;
  totalViolations: number;
  criticalViolations: number; // axe impact: critical
  seriousViolations: number;  // axe impact: serious
  moderateViolations: number; // axe impact: moderate
  minorViolations: number;    // axe impact: minor
};
Goals:
- Critical violations: 0
- Serious violations: <5 per page
- Test coverage: 100% of user-facing pages
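A small helper can fold raw axe results into that shape, keyed off axe's impact field. A sketch; PageResult is a minimal structural subset of axe-core's result object, one entry per audited page:

// Minimal subset of an axe result: we only need the violations and their impact
type PageResult = { violations: { impact?: string }[] };

function summarizeA11y(results: PageResult[]): A11yMetrics {
  const all = results.flatMap((r) => r.violations);
  const count = (impact: string) => all.filter((v) => v.impact === impact).length;
  return {
    totalPages: results.length,
    pagesWithViolations: results.filter((r) => r.violations.length > 0).length,
    totalViolations: all.length,
    criticalViolations: count('critical'),
    seriousViolations: count('serious'),
    moderateViolations: count('moderate'),
    minorViolations: count('minor'),
  };
}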
Trend Metrics
Track over time:
- Violations per deploy
- Time to fix (from detection to resolution)
- % of violations caught pre-production
Red flags:
- Violations increasing week-over-week
- Same violations recurring (indicates process gap)
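One lightweight way to spot those red flags: append each deploy's totals to a history file and warn when the count is higher than it was a week ago. A sketch; the file path and GIT_SHA env var are assumptions:

import fs from 'node:fs';

type TrendEntry = { deploy: string; date: string; violations: number };
const historyPath = 'a11y-history.json'; // hypothetical location

export function recordDeploy(violations: number) {
  const history: TrendEntry[] = fs.existsSync(historyPath)
    ? JSON.parse(fs.readFileSync(historyPath, 'utf8'))
    : [];

  history.push({
    deploy: process.env.GIT_SHA ?? 'unknown', // hypothetical env var
    date: new Date().toISOString(),
    violations,
  });
  fs.writeFileSync(historyPath, JSON.stringify(history, null, 2));

  // Red flag: more violations than the most recent run from a week (or more) ago
  const weekOld = [...history].reverse()
    .find((e) => Date.now() - Date.parse(e.date) >= 7 * 24 * 3600 * 1000);
  if (weekOld && violations > weekOld.violations) {
    console.warn(`Violations rising week-over-week: ${weekOld.violations} -> ${violations}`);
  }
}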
The 80/20 Rule for Accessibility
Focus on these high-impact, automatable checks:
- Alt text (20% effort, 30% impact)
- Color contrast (10% effort, 20% impact)
- Keyboard navigation (30% effort, 25% impact)
- Form labels (15% effort, 15% impact)
- Headings (10% effort, 10% impact)
That's 85% of the effort for 100% of the low-hanging fruit.
Conclusion
Automated accessibility testing is necessary but not sufficient.
The winning formula:
- Lint in development (catch 15% instantly)
- Component tests in CI (catch 25% before deploy)
- E2E tests pre-deploy (catch 35% on real pages)
- Manual testing quarterly (catch the other 65%)
- User testing with people with disabilities (catch edge cases)
Real numbers from our clients:
- Before: 340 accessibility issues in production
- After (with this stack): 12 issues in production (-96%)
- User complaints: -89%
- Legal risk: eliminated
Accessibility isn't just compliance—it's usability for everyone. And that's good business.
VertaaUX automates steps 1-3 and provides manual testing checklists for step 4. Run a free accessibility audit →