End-to-end testing leveled up the way Asana engineers think about accessibility
February 7, 2025 by Ryan Farley

- 24/7/365 automated monitoring
- 2x automated WCAG Success Criteria coverage
Before partnering with Assistiv Labs, Asana’s accessibility team generally grouped testing into two categories: automated and manual. Automated testing quickly caught algorithmically detectable, low-hanging fruit and fit well into rapid processes like continuous integration (CI); manual testing covered everything else, requiring careful coordination with skilled testers to find the issues automated tools couldn’t.
Today, new accessibility governance principles and processes, combined with Assistiv Labs’ end-to-end accessibility testing service, have produced a testing workflow that redefines the line between automated and manual. It’s something in between: a hybrid approach that dramatically increases automated coverage and reduces manual coordination overhead.
Asana is notified in hours about actionable, easy-to-reproduce issues that otherwise would have required manual testing to detect. New processes automatically determine priority and route issues to the appropriate teams, which fix most within a few days.
The benefits of this novel arrangement have been both technical and cultural. Accessibility feedback loops are far shorter and bugs are resolved faster, while engineers across the company have begun viewing accessibility issues differently.
Cameron Cundiff, technical lead for Asana’s accessibility efforts, says that because of how long it previously took to identify accessibility issues, “Many engineers put accessibility bugs in a different mental silo than other bugs.” Reports from manual QA would arrive after something was shipped, and they didn’t fit into the same mental model as non-accessibility bugs that were documented days after a pull request. That’s now changed: “Assistiv Labs’ tooling essentially eliminates the distinction.”
Laying the foundations 🧱
Reducing the time it takes to find and fix accessibility bugs didn’t begin with a new testing methodology. Cameron says it started much earlier, when Asana leadership laid the foundations by emphasizing accessibility as an integral component of Asana’s mission to “help humanity thrive by enabling the world’s teams to work together effortlessly.” Around 1 out of every 6 people in the world are disabled and workplace collaboration—by definition—requires equitable, inclusive user experiences.
Next came the people who operationalized a culture of continuous accessibility, building a dedicated team to set standards, explore new approaches that flourish within the organization, and work across the company to keep everyone aligned on accessibility commitments and priorities.
Which is no small feat—Asana is a highly configurable work management application. Yet, despite its colossal scope and near-constant app updates, they’re more on top of accessibility than ever. That’s a massive win for a product used by 85% of Fortune 100 companies¹, a product where every moment an issue lingers in production equates to reduced productivity for customers.
How Asana built processes to prioritize accessibility bugs
At Asana’s scale, a strong prioritization framework supported by processes and governance principles is essential. Teams need to know what should be scheduled into their backlogs and when, and what needs attention right now.
But it’s easy to underestimate how hard it is to prioritize accessibility bugs. Priority is based on a dizzying matrix of unique factors including assistive technologies, browsers, impacted user groups, user feedback, product area, fix difficulty, workarounds, WCAG conformance, and historical context. Some bugs may have existed already, but others appear with new features. And some are regressions—functionality that used to be accessible but now isn’t.
Jiaxin Zheng, the Asana accessibility team’s Technical Program Manager, took the initiative to formally define different types of accessibility bugs to enable streamlined workflows around each. An essential step was “defining what constituted a regression vs. what didn’t.”
She realized that regressions could naturally receive high priority—the last thing anyone wanted was for accessibility to backslide. That alone greatly simplifies prioritization in many cases.
However, there was a catch. Regressions are defined by before and after snapshots. For example, a before snapshot could be a video of the functionality working and an after snapshot could be a video of a new problematic result. She said bugs naturally “had the ‘after’ snapshots, whenever the product had regressed. But we had to go digging to find the ‘before’ snapshots, the last known time something was working as intended. Those were hard to find.”
Without a reliable source of before snapshots, it often wasn’t worth the time to investigate whether a bug could be prioritized as a regression. This is where a new kind of automation proved immensely helpful.
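To make the idea concrete: if test results are recorded continuously, the “before” snapshot is simply the most recent passing run prior to the first failure. A minimal sketch of that lookup—the types and function names here are hypothetical illustrations, not Asana’s or Assistiv’s actual tooling:

```typescript
// A recorded outcome of one automated end-to-end test run (hypothetical shape).
interface TestRun {
  runId: number;
  timestamp: string; // ISO 8601
  passed: boolean;
}

// Return the most recent passing run that precedes the first failure:
// the "before" snapshot proving the flow used to work.
// Assumes history is ordered oldest to newest.
function lastKnownGood(history: TestRun[]): TestRun | undefined {
  const firstFailure = history.findIndex((run) => !run.passed);
  if (firstFailure === -1) return history[history.length - 1]; // never failed
  for (let i = firstFailure - 1; i >= 0; i--) {
    if (history[i].passed) return history[i];
  }
  return undefined; // the flow was never observed working
}
```

With only spot-check manual audits there may be no passing run on record at all—exactly the “digging” Jiaxin describes. A continuously running suite turns `lastKnownGood` into a trivial lookup.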
End-to-end accessibility automation
Fully automated tests are not new to accessibility testing or to Asana. “axe DevTools and jsx-a11y for React provided us with broad, horizontal coverage. But they’re shallow,” explains Cameron. “While I think the token 30% estimate for automated tests across the industry is usually a misunderstood generalization, it was fairly close to the WCAG criteria we were achieving.” Limited coverage meant manual testing was still finding bugs that automated tools had missed.
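The “broad but shallow” character of rule-based tools comes from checking static, machine-decidable properties. A toy rule in that style—not axe’s actual implementation, just an illustration of the category—might flag images with no text alternative:

```typescript
// Simplified element model, for illustration only.
interface Element {
  tag: string;
  attributes: Record<string, string>;
}

// A rule-based check in the spirit of WCAG 1.1.1 (Non-text Content):
// flag <img> elements that have no alt attribute at all.
// Tools like axe-core bundle many rules of this shape, but no such rule
// can tell whether an alt text is *meaningful* — that still takes a human,
// or an end-to-end test exercising real assistive technology.
function findMissingAltText(elements: Element[]): Element[] {
  return elements.filter(
    (el) => el.tag === "img" && !("alt" in el.attributes)
  );
}
```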
The team needed tools that could go deeper. Tools more closely aligned with Asana’s user research and the governance principles Jiaxin and the rest of the team had created. Which is what they found with Assistiv. Tests for Assistiv’s end-to-end service are written from the ground up by Assistiv engineers based on provided user flows and test parameters. The suites incorporate keyboard shortcuts, real screen readers, browsers, and machine vision powered by the Assistiv Labs cloud. The result is more than simulation, with events transmitted to a machine the same way a human user would, maximizing coverage of WCAG and broader accessibility concerns.
“It’s the holy grail.” 🏆
End-to-end accessibility automation is radically different from traditional automation, interacting with Asana to accomplish tasks in the same way a manual tester would.
“It’s the holy grail,” Cameron says. “The possible range of things you cover is something like between 60 and 75% of WCAG criteria, depending on the test scenario.” And there are still live, critical thinking experts watching over all of it. People from both Asana and Assistiv are involved in designing representative user actions and reviewing the outcomes, drastically raising the automated testing floor in terms of scope, frequency, and accuracy.
Combining forces
With a strong prioritization framework in place, and new automation that catches the majority of bugs rather than a minority, Asana implemented a powerful prioritization workflow.
First, automated tests are synced with Asana’s existing engineering pipeline. New issues are detected in near real time and correlated with the code changes that likely caused them.
Next, an Assistiv engineer reviews any test failure to filter out false positives and writes up an issue in Asana’s backlog with contextualized user impact and remediation guidance. Because automated tests are running continuously, a before snapshot is readily available and regressions can be easily classified. Jiaxin maintains automated Asana workflows that route the issue to the correct team.
In practice, this means that regressions are usually flagged within 24 hours of a deployment and documented in a way that is easy for engineers without a background in accessibility to understand. That allows Asana’s accessibility team to set an SLA for addressing regressions and leave product teams to it. No one has to make the case for which regression comes first or second or last. They’re just bugs that need attention.
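Under that workflow, triage becomes almost mechanical: a failure backed by a recorded passing run is a regression and goes straight to the owning team under the SLA; one without goes to the backlog for scheduling. A hypothetical sketch of that rule—the names and shapes are illustrative, not Asana’s internal tooling:

```typescript
type Priority = "regression-sla" | "backlog";

// A confirmed test failure, after an Assistiv engineer has filtered
// out false positives (hypothetical shape).
interface Failure {
  flow: string;               // the user flow the test exercises
  hasBeforeSnapshot: boolean; // a recorded passing run exists for this flow
  team: string;               // owning team, resolved from the code change
}

interface TriagedIssue {
  flow: string;
  team: string;
  priority: Priority;
}

// Route a confirmed failure: regressions get regression priority and
// go to the owning team under the SLA; everything else is backlogged.
function triage(failure: Failure): TriagedIssue {
  return {
    flow: failure.flow,
    team: failure.team,
    priority: failure.hasBeforeSnapshot ? "regression-sla" : "backlog",
  };
}
```

The point of the sketch is that no one has to argue priority case by case: the presence or absence of a “before” snapshot decides it.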
For Cameron, this decentralization directly translates into a more sustainable program and more inclusive end user experiences. “We have a lot of agency and impact, a lot of space for creativity because we’re not constantly putting out fires. And that in turn means that we’re just happier and more effective.”
The ROI of faster feedback loops ♻️
A bug that goes undetected for an extended period of time is expensive to fix. Someone has to triage it to the appropriate team and ensure it’s prioritized. The engineer it’s assigned to likely isn’t the same engineer who caused it. Even when they are, it’s difficult to dig up the forgotten context, shift gears, or coordinate with other teams to untangle technical debt that has sprung up in the intervening weeks and months.
Putting numbers to the problem, a highly cited IBM study found that a bug discovered in production costs 30 times more to fix than one caught during the design phase. That rings true for Asana. Before the end-to-end service, Cameron recalls, “we might hear about a new bug weeks later. And we had no way of separating existing historical bugs from recent regressions.”
Now, triage overhead is eliminated and engineers can ship a fix while the project is still fresh in their memory. But faster turnaround is only part of the process. Jiaxin remembers how Project Engineering Leads of Accessibility, or PELs, used to spend a lot of time “reading between the lines when interpreting feedback from customers, product teams, and client teams. The guidance just wasn’t written in engineering language. Now, bug reports written by engineers for engineers have turbocharged the system that we already had by cutting down a lot of the back and forth during discovery.”
When Asana engineers deploy an update in the morning, any evidence of regressions will normally start popping up before the end of the work day. The feedback loop is so short that “Assistiv has on multiple occasions been the first to alert us to generic UI bugs that are not restricted to screen readers and keyboard navigation,” according to Cameron. “They’re just so hands-on, technical, and timely with their reporting.”
As engineers started receiving faster feedback from end-to-end testing, he noticed “It reduced the cognitive overhead of prioritization. The ambiguity went away. Engineers would think ‘I’m accountable for this because I shipped it this morning. It worked yesterday and it’s broken today, I need to fix it now.’ Fixing bugs is like muscle memory and they don’t have to think about it.”
At Asana, accessibility bugs are just bugs. And they get fixed.
A better way to automate accessibility
Before onboarding end-to-end accessibility testing, Asana’s accessibility team had made significant headway. On the operations side, design and engineering teams had documented review processes, regression definitions, and SLAs in place. When it came to testing, there were automated solutions for swift but shallow reports and manual QA for thorough but time-consuming audits.
What Assistiv brought to the table was a technical solution that complemented and sat atop existing efforts. Regressions are being documented sooner, in more detail, with more evidence. Virtually no one is wasting time on bug reports that aren’t actionable or reproducible.
End-to-end accessibility testing saves money, sure. And time. But that’s not what the team sees as the main benefit.
Asana has a clear picture of which bugs are new and which are old, which user flows they affect and who is responsible for remediation. They can get ahead of the curve.
But best of all, Cameron sees Asana’s engineers internalizing accessibility as part of the job. It’s a reflex built on Asana’s continuous investment in accessibility training, documentation, automation, and guidance. “We can be much more proactive now and much more visionary about our program and what we want to achieve.”
Learn more about Assistiv Labs’ accessibility-first end-to-end testing service then get in touch to schedule a demo.
Footnotes
1. As of December 2023 ↩