A11y’s AI Future (The Future is Now)

Featured image: an abstract illustration of AI's impact on web accessibility, with an eyeball at the center surrounded by neural-network motifs and accessibility symbols in shades of green.

As a web accessibility professional, I spend my time understanding how technology interacts with content. This article explores how AI-powered tools will affect web accessibility and how they are transforming A11y tech.

Apple Intelligence: A11y Game Changer

Apple recently released a beta version of their AI tool, Apple Intelligence.

Apple Intelligence revolutionizes user interaction, enabling control through natural language. Users can say “write an email,” dictate the content, proofread, and send it—all without viewing the screen.

This innovation provides a new generation of disabled users with swift access to essential tools. For example, a blind user no longer needs to navigate cumbersome swipe gestures or verbose screen readers to perform basic tasks that sighted users take for granted.

A Future Without Screen Readers

Tools like Apple Intelligence immediately impact the usefulness of screen readers.

Screen readers currently interpret web pages by reading the underlying code. For instance, they recognize a heading through tags like <h1>.

Generative AI doesn’t rely on strict code semantics. AI tools are primarily context-aware, recognizing headings by the <h1> tag, visual appearance, or contextual language.
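To picture the difference, here is a small, hypothetical snippet (the heading text is made up for illustration). A screen reader only announces the first line as a heading because the semantics live in the tag; a context-aware AI could treat both lines as headings based on their appearance and wording.

```html
<!-- A screen reader announces this as "heading, level 1"
     because the <h1> tag carries the semantics. -->
<h1>Request a Demo</h1>

<!-- A screen reader reads this as plain text, but a
     context-aware AI could still infer it is a heading
     from its size, weight, and placement on the page. -->
<div style="font-size: 2rem; font-weight: bold;">Request a Demo</div>
```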

Imagine: Your iPhone quickly summarizes a web page, then tells you what you can do next. An example conversation:

Me: Siri, read “equalify.app”.

Siri: Equalify.app introduces Equalify, a revolutionary website accessibility platform. You can request a demo, buy Equalify, or learn more about the platform. What would you like to do?

Semantic markup like heading tags becomes a fallback tool for people who are blind or who don't want to think about how content appears on a page.

A11y Pros Must Embrace Newness

Accessibility professionals are also technology experts, and technology evolves. Therefore, I advocate for every A11y pro to become proficient in AI. Understanding tools like Apple Intelligence involves observing how people with disabilities use these technologies.

Recently, I attended a conference with my blind friend Kevin, who used the BeMyAI app to read airport signs and describe paintings in a bar. The experience taught me to appreciate the tools Kevin relies on and revealed clear shortcomings I need to be aware of.

If our goal is to help people with disabilities, we will soon find ourselves showing folks how they can make their websites better for new tech like AI.

A11y Pros as AI Fact-Checkers and Advocates

The biggest role I see for accessibility pros is fact-checking.

Describing images is particularly challenging. Even the best AI tools have biases and struggle to comprehend the nuances of images.

We must identify AI errors and know how to correct them. Additionally, staying in touch with companies developing AI tools is essential for advocating for people with disabilities.

Those two roles—fact-checker and advocate for people with disabilities—will not disappear. Our jobs are secure as long as we continue to adopt new tools and ask, “How does this tool affect people’s lives?”

A11y Tech’s AI Future

A11y tech must also adapt to new technological advancements.

For example, axe-core relies on rules-based tests of website source code. Deque, the company behind axe-core, is developing future tests to analyze a page’s appearance rather than its code.
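To make "rules-based" concrete, here is a minimal sketch of how a page might run axe-core against its own markup today. The script path and logging are assumptions for illustration, not Deque's recommended setup.

```html
<!-- Load axe-core however your project normally does;
     this local path is just a placeholder. -->
<script src="/js/axe.min.js"></script>
<script>
  // axe.run() checks the rendered DOM against axe-core's rules
  // (missing alt text, broken heading order, low contrast, etc.)
  // and resolves with any violations found in the markup.
  axe.run(document).then(function (results) {
    results.violations.forEach(function (violation) {
      console.log(violation.id, violation.impact, violation.description);
    });
  });
</script>
```

A vision-based test would start from a screenshot of the page rather than the DOM, which is why it could flag issues that never appear in the source code.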

My company builds web accessibility reporting software, focused on reporting and remediation rather than testing, so we can integrate with any test, AI-based or otherwise.

Other technologies, such as screen readers, might evolve significantly with AI. For instance, NV Access, the maker of NVDA, could build an AI voice agent designed specifically for blind users.

The survival of A11y tech hinges on its ability to adapt to new technologies while addressing specific challenges.

What do you think?

I’m very curious to hear the role you see for accessibility professionals in the age of AI. Will AI replace screen readers? When will we no longer use web browsers?

Drop a comment below!
