Apple unveiled a comprehensive suite of accessibility enhancements coming to iOS, macOS, iPadOS, and watchOS later this year, coinciding with Global Accessibility Awareness Day. These updates represent some of the most significant accessibility improvements in recent Apple history, introducing entirely new capabilities alongside meaningful upgrades to existing features.
The announcement, made ahead of Apple’s Worldwide Developers Conference, showcases the company’s continued commitment to making technology more inclusive. From breakthrough features like Brain Computer Interface support to practical improvements like system-wide reading modes, these updates will meaningfully impact how millions of users interact with their devices.
Apple is bringing its popular Magnifier app to Mac computers for the first time, creating new possibilities for users with visual impairments. The Mac version will integrate with Continuity Camera (Apple’s feature that lets your iPhone serve as a wireless camera for your Mac) as well as USB-connected cameras.
Users can point their camera at documents and view magnified text on the larger Mac screen, with full control over brightness, contrast, and zoom levels. The app allows custom view configurations that can be saved and organized for different use cases, whether reading small print or examining detailed objects.
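Apple hasn't published the Mac Magnifier's internals, but a minimal sketch of such a pipeline is possible with public frameworks: Continuity Camera iPhones appear on macOS as external capture devices, and Core Image can apply brightness, contrast, and zoom per frame. The `zoom`, `brightness`, and `contrast` parameters below are hypothetical stand-ins for the user-facing controls, not Apple's implementation.

```swift
import AVFoundation
import CoreImage

// Continuity Camera iPhones show up on macOS as external capture devices,
// so discovery looks the same as for a USB camera (macOS 14+ device type).
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.external], mediaType: .video, position: .unspecified)
guard let camera = discovery.devices.first else {
    fatalError("no external or Continuity camera found")
}

let session = AVCaptureSession()
session.addInput(try! AVCaptureDeviceInput(device: camera))

// Per-frame adjustment: zoom via center-crop-and-scale, brightness and
// contrast via the CIColorControls filter.
func magnify(_ frame: CIImage, zoom: CGFloat,
             brightness: Float, contrast: Float) -> CIImage {
    let e = frame.extent
    let crop = CGRect(x: e.midX - e.width / (2 * zoom),
                      y: e.midY - e.height / (2 * zoom),
                      width: e.width / zoom, height: e.height / zoom)
    return frame.cropped(to: crop)
        .transformed(by: CGAffineTransform(scaleX: zoom, y: zoom))
        .applyingFilter("CIColorControls", parameters: [
            kCIInputBrightnessKey: brightness,
            kCIInputContrastKey: contrast,
        ])
}
```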
Apple will introduce dedicated “Accessibility Nutrition Labels” on App Store product pages, similar to how nutrition labels work on food packaging. These labels will clearly highlight which accessibility features each app supports, helping users make informed decisions about downloads.
This transparency addresses a longstanding challenge where users often discovered accessibility limitations only after purchasing or downloading apps. The labels will cover features like VoiceOver compatibility, switch control support, and visual accessibility options.
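The labels themselves will be declared by developers in App Store Connect rather than in code, but what an entry like "VoiceOver supported" reflects is markup along these lines (a minimal SwiftUI sketch, not Apple's certification criteria):

```swift
import SwiftUI

struct PlayButton: View {
    @State private var isPlaying = false

    var body: some View {
        Button {
            isPlaying.toggle()
        } label: {
            Image(systemName: isPlaying ? "pause.fill" : "play.fill")
        }
        // VoiceOver announces this instead of describing the raw icon.
        .accessibilityLabel(isPlaying ? "Pause" : "Play")
        .accessibilityHint("Toggles playback")
    }
}
```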
Perhaps the most groundbreaking addition is Switch Control support for brain-computer interfaces (BCIs), technology that allows device control through thought alone, without any physical movement. This marks Apple's entry into assistive technology that can help users with severe mobility limitations interact with their devices using only neural signals.
While BCIs are still emerging technology, Apple’s integration suggests these interfaces are becoming mature enough for mainstream accessibility applications.
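Switch Control is already a public accessibility protocol: it scans whatever elements an app exposes to the accessibility system, and a BCI would act as one more input driving that scan. As a sketch of the app-side contract (standard UIKit accessibility overrides, nothing BCI-specific), a custom-drawn control opts in like this:

```swift
import UIKit

final class DrawnSlider: UIView {
    var value: Int = 5

    override var isAccessibilityElement: Bool {
        get { true }          // make the view scannable by Switch Control
        set { }
    }
    override var accessibilityTraits: UIAccessibilityTraits {
        get { .adjustable }   // exposes increment/decrement actions
        set { }
    }
    override var accessibilityValue: String? {
        get { "\(value)" }
        set { }
    }
    override func accessibilityIncrement() { value += 1 }
    override func accessibilityDecrement() { value -= 1 }
}
```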
Apple is introducing comprehensive Braille Access, described as a “full-featured braille note taker” deeply integrated throughout the Apple ecosystem. Users will be able to take notes in braille format and perform mathematical calculations using Nemeth Braille, a specialized braille code for mathematical and scientific notation.
This integration goes beyond basic braille display support, creating a complete digital braille workspace across Apple devices.
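Apple hasn't documented Braille Access's internals, but as a general illustration of how braille is represented digitally: the Unicode braille patterns block encodes each cell as U+2800 plus a bitmask of raised dots, where dot n sets bit n-1. The helper below is illustrative only; note that Nemeth code assigns different dot patterns to digits than the literary braille shown here.

```swift
// Build a Unicode braille cell from a set of raised dot numbers (1-8).
func brailleCell(dots: Set<Int>) -> Character {
    let mask = dots.reduce(0) { $0 | (1 << ($1 - 1)) }
    return Character(UnicodeScalar(UInt32(0x2800 + mask))!)
}

print(brailleCell(dots: [1]))           // ⠁  (letter a)
print(brailleCell(dots: [1, 2]))        // ⠃  (letter b)
print(brailleCell(dots: [3, 4, 5, 6]))  // ⠼  (literary braille number sign)
```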
A new Accessibility Reader will debut across iPhone, iPad, Mac, and Apple Vision Pro, providing a consistent reading experience for users with dyslexia, low vision, or other reading-related disabilities. Users can customize text font, size, color, contrast, and spacing to optimize readability across any app or website.
This system-wide approach means users won’t need to configure reading preferences separately for each application.
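The reader builds on text settings that apps already opt into. A minimal SwiftUI sketch of that app-side contract, using the public Dynamic Type and legibility environment values (the spacing numbers are arbitrary):

```swift
import SwiftUI

struct ArticleBody: View {
    @Environment(\.dynamicTypeSize) private var typeSize
    @Environment(\.legibilityWeight) private var weight

    var body: some View {
        Text("Body copy goes here.")
            .font(.body)                                        // system style: scales with user settings
            .fontWeight(weight == .bold ? .bold : .regular)     // honors Bold Text
            .lineSpacing(typeSize.isAccessibilitySize ? 8 : 4)  // extra spacing at large sizes
    }
}
```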
Apple Watch will gain Live Captions alongside remote controls for Live Listen, a pairing particularly valuable for users who are deaf or hard of hearing. Live Listen turns the paired iPhone into a remote microphone, streaming audio directly to AirPods, Made for iPhone hearing aids, or compatible Beats headphones; the watch can now display real-time captions of that audio and start, stop, or adjust the session from the wrist.
This creates a discreet way to follow speech in settings like meetings or lectures, where pulling out an iPhone to read captions isn't always practical.
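Apple hasn't said how Live Captions transcribes audio internally, but the public Speech framework shows the general shape of on-device live transcription that caption-style features rely on (a sketch, not Apple's implementation):

```swift
import Speech

// Requires user permission first (SFSpeechRecognizer.requestAuthorization).
let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))!

let request = SFSpeechAudioBufferRecognitionRequest()
request.requiresOnDeviceRecognition = true  // keep audio on the device where supported

// Keep `task` alive for the duration of transcription.
let task = recognizer.recognitionTask(with: request) { result, _ in
    if let result {
        // Partial results stream in as the user speaks; a caption view
        // would render this string live.
        print(result.bestTranscription.formattedString)
    }
}
// Microphone buffers from an AVAudioEngine tap are fed in via
// request.append(_:), and request.endAudio() closes the stream.
```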
Apple Vision Pro will receive Enhanced View, a feature that enlarges everything in the user’s field of view, including their physical surroundings. This spatial computing accessibility feature addresses low vision challenges in mixed reality environments, ensuring the technology remains inclusive as it evolves.
Personal Voice, Apple’s feature that creates synthetic speech matching a user’s natural voice, will become dramatically faster. The updated system can generate a personal voice in under one minute using just 10 recorded phrases, thanks to improved on-device machine learning algorithms.
This speed improvement is crucial for users who may lose their speaking ability due to medical conditions and need to create their Personal Voice quickly.
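Once created, a Personal Voice is usable by third-party apps through the standard speech synthesis API on iOS 17 and later; a minimal sketch:

```swift
import AVFoundation

let synthesizer = AVSpeechSynthesizer()  // keep a long-lived reference

AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
    guard status == .authorized else { return }

    // Personal voices are flagged in the system voice list.
    let personalVoice = AVSpeechSynthesisVoice.speechVoices()
        .first { $0.voiceTraits.contains(.isPersonalVoice) }

    let utterance = AVSpeechUtterance(string: "Hello, this is my voice.")
    utterance.voice = personalVoice  // nil falls back to the default voice
    synthesizer.speak(utterance)
}
```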
Background Sounds, designed to help users minimize distractions and improve focus, will gain new equalization settings for personalized audio environments. Users can fine-tune ambient sounds like rain, ocean waves, or white noise to match their specific concentration needs.
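Apple hasn't detailed the new equalization settings, but per-band EQ is a standard audio technique. As a general illustration (a generic AVAudioEngine graph, not Apple's implementation), softening the highs and boosting the low rumble of an ambient loop looks like this:

```swift
import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let eq = AVAudioUnitEQ(numberOfBands: 2)

// Tame high frequencies, lift the low rumble slightly.
eq.bands[0].filterType = .highShelf
eq.bands[0].frequency = 4_000   // Hz
eq.bands[0].gain = -6           // dB
eq.bands[0].bypass = false
eq.bands[1].filterType = .lowShelf
eq.bands[1].frequency = 200
eq.bands[1].gain = 3
eq.bands[1].bypass = false

engine.attach(player)
engine.attach(eq)
engine.connect(player, to: eq, format: nil)
engine.connect(eq, to: engine.mainMixerNode, format: nil)
// After engine.start(), schedule an ambient-sound buffer on `player`.
```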
Vehicle Motion Cues, previously available only on mobile devices, will expand to Mac computers. This feature helps reduce motion sickness by displaying subtle visual indicators that correspond to vehicle movement, particularly helpful for users who experience nausea when using devices while traveling.
iPhone and iPad will receive improved Eye Tracking functionality, allowing users to make selections using either switch controls or “Dwell” – a technique where users look at an interface element for a specified time to activate it. This provides more precise and comfortable navigation for users with limited hand mobility.
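Apple hasn't published its dwell parameters, but the core logic is simple to sketch: a target activates once the gaze point has stayed inside it for a threshold duration. The `DwellDetector` type and its one-second default below are hypothetical placeholders.

```swift
import Foundation

struct DwellDetector {
    var dwellDuration: TimeInterval = 1.0   // hypothetical threshold
    private var gazeStart: Date?

    // Call once per gaze sample; returns true when the dwell completes.
    mutating func update(gazeInsideTarget: Bool, now: Date = Date()) -> Bool {
        guard gazeInsideTarget else {
            gazeStart = nil                 // gaze left the target: reset
            return false
        }
        let start = gazeStart ?? now
        gazeStart = start
        if now.timeIntervalSince(start) >= dwellDuration {
            gazeStart = nil                 // fire once, then re-arm
            return true
        }
        return false
    }
}
```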
Head Tracking improvements will make iPhone and iPad control more intuitive and responsive for users who navigate their devices through head movements. These enhancements promise smoother cursor control and more accurate gesture recognition.
The Apple TV app will integrate Assistive Access, a simplified interface designed specifically for users with intellectual and developmental disabilities. This creates a more navigable entertainment experience with reduced complexity and clearer visual organization.
Sound Recognition will expand in two significant ways. First, Name Recognition will help users who are deaf or hard of hearing know when someone calls their name. Second, Sound Recognition will integrate with CarPlay, alerting drivers and passengers with hearing difficulties to important sounds like crying babies or emergency vehicles.
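The system feature is configured in Settings rather than through an API, but the public SoundAnalysis framework illustrates the kind of on-device classification such features rely on (a sketch; the printed labels depend on the built-in classifier's vocabulary):

```swift
import SoundAnalysis
import AVFoundation

final class SoundObserver: NSObject, SNResultsObserving {
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first else { return }
        // Prints the most likely label (e.g. a siren class) and its confidence.
        print(top.identifier, top.confidence)
    }
}

let format = AVAudioFormat(standardFormatWithSampleRate: 16_000, channels: 1)!
let analyzer = SNAudioStreamAnalyzer(format: format)
let observer = SoundObserver()
try! analyzer.add(SNClassifySoundRequest(classifierIdentifier: .version1),
                  withObserver: observer)
// Microphone buffers from an AVAudioEngine tap are then passed to
// analyzer.analyze(buffer, atAudioFramePosition: position).
```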
Voice Control and Live Captions will add support for additional languages, making these accessibility features available to more users worldwide. Apple will also introduce the ability to share accessibility settings between devices, simplifying setup for users with multiple Apple products.
These accessibility features will roll out throughout the remainder of 2025, with most expected to arrive alongside the annual iOS, macOS, iPadOS, and watchOS updates typically released in the fall. Apple's advance announcement gives developers time to optimize their apps for the new capabilities and lets users plan for the enhanced accessibility options.
The comprehensive nature of these updates reflects Apple’s recognition that accessibility improvements benefit all users, not just those with specific disabilities. Features like system-wide reading modes and customizable audio environments often enhance the experience for anyone using Apple devices in challenging conditions or environments.