The State of Digital Quality in Accessibility 2026

2026 Annual Report

See the latest trends in digital accessibility and inclusive design, and learn how leading organizations benefit from their commitment to creating inclusive digital experiences.

    In a world that increasingly prioritizes digital interactions and experiences, the human element is all too easy to forget, especially when those humans have characteristics, needs or behaviors that differ from the majority of users. Most people who encounter software defects find them frustrating or irritating. But for the 1.3 billion people living with some form of disability, the results can be isolating and even discriminatory when software development and test teams disregard their digital experiences.

    AI tools are becoming embedded into software development and testing processes — including digital accessibility and inclusive design efforts. But in a discipline where empathy and focus on the end user are essential, is AI truly improving accessibility? Explore the findings from this year’s digital accessibility survey.

    How AI is Changing Accessibility Testing

    In a survey of more than 500 software development, QA, product, compliance and accessibility professionals, 79% reported that the organizations where they work are using AI to improve digital accessibility in their websites and applications. Teams are using AI code assist to help address existing issues and develop accessible new features. AI’s ability to quickly generate captions, transcripts and alt text for images is also proving useful — though the quality of AI outputs can vary wildly.

    Top use cases for AI in digital accessibility programs (n=383):

    • Use AI coding tools to address/remediate accessibility issues: 60.8%
    • Use coding agents to generate accessible code on new features: 57.7%
    • Generate captions or subtitles for audio or video: 55.7%
    • Provide AI-powered features for users: 55.4%
    • Write test cases to check accessibility: 54.0%
    • Generate alt text for images: 50.7%
    • Scan sites or apps for accessibility issues: 47.8%


    Like their automated counterparts, AI accessibility scanning tools can create the illusion of thorough coverage as teams work to conduct comprehensive audits for conformance. As AI tools get smarter and learn over time, they may eventually deliver more accurate results.
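Automated scanners excel at mechanical checks, such as detecting images that lack alt text, but they cannot judge whether a description is actually meaningful. A minimal sketch of that kind of mechanical check, using only the Python standard library (the class name and structure are illustrative, not any particular tool's implementation):

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Flags <img> tags whose alt attribute is missing or empty."""

    def __init__(self):
        super().__init__()
        self.issues = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            alt = dict(attrs).get("alt")
            if alt is None:
                self.issues.append("img missing alt attribute")
            elif not alt.strip():
                self.issues.append("img has empty alt attribute")

checker = AltTextChecker()
checker.feed('<img src="chart.png"><img src="logo.png" alt="Company logo">')
print(checker.issues)  # ['img missing alt attribute']
```

A check like this reports a missing attribute instantly, but alt text that reads "image123" would pass, which is one reason human validation remains part of most teams' workflows.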

    How accurate do teams rate AI accessibility scans?

    Percentage of accessibility issues respondents report their AI auditing tools accurately identify (n=1,097):

    • More than 75% of issues: 21.5%
    • 51% to 75% of issues: 40.9%
    • 25% to 49% of issues: 28.7%
    • Less than 25% of issues: 6.1%
    • Unsure: 2.8%

    Though the majority of respondents reported that their AI tools accurately identify 50% or more of accessibility issues, 89.3% still validate those test results with human testers.

    Issues with AI accessibility scanning tools (n=181):

    • Misses accessibility issues
    • Misses issues and flags false issues
    • Falsely identifies things as issues

    William Reuschel, Inclusive Design Practice Lead at Applause, is cautiously optimistic about AI’s potential to improve experiences for people with disabilities. “I'm excited that AI is lowering the barrier for companies to produce more usable products for people with disabilities, but we still need to acknowledge the limits. Product teams, developers and QA teams need to understand that human insight — and in particular, input from the disability community — is still the foundation for creating apps that are truly usable for everyone.”

    As development teams rely on AI code assist to remediate accessibility issues and develop inclusive new features, those AI development tools themselves must model accessibility and inclusivity. Applause worked with one leading technology company to evaluate its low-code AI development platform, providing targeted feedback from developers with disabilities on the platform’s usability and accessibility. Testers conducted evaluations using various tools such as JAWS and NVDA screen readers, magnification software, and head-pointer devices. The testers identified a wide array of high-impact issues, including page layout friction, tabbing order problems, filter control defects, and extraneous text read by screen readers. Our client’s teams interacted directly with Applause project leads to fully understand insights and prioritize next steps. Ongoing collaboration provided continuous education for the tech company, effectively building inclusivity and empathy into the product development lifecycle.

    Despite developer and QA confidence in AI tools, people using assistive technology still often come up against critical blockers. Applause’s 2026 accessibility survey included insights from more than 1,000 people who use assistive technology to interact with apps and websites.

    Since January 1, how often have you encountered accessibility issues that prevented you from completing your task on a website or app? (n=954)

    • Rarely
    • About once a month
    • About once a week
    • More than once a week

    Patrick Cullen, Associate Director of Accessibility at Applause, outlined some of the challenges in testing how well apps work for assistive technology users. “Testing with assistive technology is quite difficult because of the sheer range of it. That's compounded by the fact that some of it is very technical and it's quite difficult to understand how it works and why, especially if you don't use it day to day. It would be great to learn every assistive technology, but that's not realistic.” Instead, Applause teaches teams how to effectively pare down to understand what tools they actually need to use and test. “We'll consult with users about where the gaps may be,” he said. Ultimately, developing a consistent methodology for ongoing accessibility testing and addressing issues allows teams to improve accessibility and inclusivity over time.

    Accessibility and Inclusivity Drive Consumer Loyalty

    To better understand how digital accessibility impacts consumer loyalty, Applause asked the same group which types of assistive technologies they use to interact with apps and websites:

    Assistive technologies used:

    Distribution of the primary accessibility aid respondents rely on (n=1,005):

    • Alternative navigation: 46.1%
    • Captioning: 22.5%
    • Screen readers: 16.0%
    • Magnification or font adjustments: 15.4%

    About two-thirds of respondents, 66%, reported that assistive technology is important or essential to their digital interactions. The remaining third stated that while they do not need assistive technology all the time, they see it as a nice-to-have that makes it easier for them to use websites and apps.

    How likely assistive technology users are to stay with brands that offer accessible digital experiences (n=949):

    • Extremely loyal: I will use that brand as much as possible
    • Somewhat loyal: I will use that brand, but may also use competitors
    • Not loyal: other factors are more important to me than accessibility

    The majority of people with disabilities are loyal to the apps and websites that offer them easy, intuitive digital experiences. Focusing on digital accessibility can create value for organizations in a variety of ways.

    The 16% Opportunity: How Accessible Products Earn Loyal Customers

    Explore some of the business benefits of prioritizing inclusive experiences.

    What the Most Inclusive Teams Do Differently

    Consult with assistive technology users throughout the design and development process

    Ask real users what they want and need, and listen to their feedback. Observe their user journeys to really understand how software works for them and where friction occurs – don’t just rely on automated tools or simulations. Human empathy, feedback and expertise are essential for developing the methodologies that allow organizations to create more accessible and inclusive apps.

    Blend human insight, automation and AI for thorough coverage

    While AI and automation can create some process efficiencies, human input is an essential component of developing accessible digital experiences. For example, AI tools may be able to quickly generate closed captions or alt text for images, but humans must review them for accuracy and contextual relevance.
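One practical way to blend the two is to let AI generate a first draft and route doubtful outputs to human reviewers. A hypothetical triage heuristic in Python illustrates the idea; the function name, phrase list, and length threshold are all illustrative assumptions, not any specific product's logic:

```python
# Hypothetical triage step: decide whether AI-generated alt text
# should be routed to a human reviewer before publishing.
GENERIC_PHRASES = {"image", "picture", "photo", "graphic", "screenshot"}

def needs_human_review(alt_text: str, max_len: int = 125) -> bool:
    """Return True when AI-generated alt text warrants human review."""
    text = alt_text.strip().lower()
    if not text:                 # empty alt text on an informative image
        return True
    if len(text) > max_len:      # overly long descriptions hurt screen-reader UX
        return True
    if text in GENERIC_PHRASES:  # a bare "image" conveys nothing to the user
        return True
    return False

print(needs_human_review("photo"))                             # True
print(needs_human_review("Bar chart of 2026 survey results"))  # False
```

Heuristics like these only catch obvious failures; judging whether a caption or description is accurate and contextually relevant still requires a human.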

    Focus on continuous improvement

    Accessibility is an ongoing commitment, not a one-time project. As technologies, standards, legislation and user expectations evolve, companies must ensure they stay current. Prioritizing accessibility improves the user experience for all and can deliver a number of other benefits to the business, such as improved discoverability and API integration.

    Report Methodology

    In April 2026, Applause conducted a survey of members of the uTest community as well as other software development, QA, product and accessibility professionals, with the following goals:

    • Understand how organizations are using AI in accessibility efforts
    • Learn how digital accessibility impacts brand loyalty for people with disabilities

    We also conducted interviews with technology and accessibility leaders.

Explore Additional Digital Quality Insights

BLOG

Are AI Tools Improving Accessibility in 2026?

Read the highlights from Applause’s annual survey on the State of Digital Accessibility.

Read Now

VIDEO

Building An AI Testing Toolkit

Learn the critical elements for a comprehensive AI testing plan. Experts share how to blend AI, automation, and human testing for the best outcomes.

Watch Now

REPORT

Applause’s Testing and UX Frameworks

Learn how to improve your testing efforts across multiple dimensions

Learn More