Building accessible apps

With built-in accessibility features, accessibility APIs, and developer tools, Apple operating systems provide extraordinary opportunities to deliver high-quality experiences to everyone, including people with disabilities. Take advantage of VoiceOver — the revolutionary screen reader for blind and low-vision users — Music Haptics, Switch Control, Guided Access, Text to Speech, closed‑captioned or audio‑described video, and more.

A MacBook Pro with Xcode, Accessibility Inspector, and Simulator windows open. Accessibility Inspector is being used to review the simulated iPhone 13 Pro app’s hierarchy.

    What’s new

    Breakthrough features like Eye Tracking and Hover Typing on iPhone and iPad, along with new accessibility APIs like Music Haptics, can help make your app more inclusive. And when integrated with existing features like Dynamic Type and VoiceOver, these enhancements make it easier than ever to deliver high-quality experiences to everyone.


    Vision

    Apple devices include a wide variety of features and assistive technologies to support users who are blind or have low vision, such as display and text settings, screen and cursor magnification, a full-featured screen reader, and much more.

    Learn more about Vision


    VoiceOver

    VoiceOver is a screen reader that enables people to experience an app’s interface without having to see the screen. Using touch gestures on iOS and iPadOS, keyboard commands on macOS, and remote buttons on tvOS, VoiceOver users can move through your app’s UI in reading order, from the top of the screen to the bottom, and receive spoken or braille descriptions of UI controls, text, and images in more than 60 languages and locales.

    iOS, iPadOS, macOS, tvOS, watchOS
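A minimal UIKit sketch of making a control legible to VoiceOver. The `playbackLabel` helper and `PlayerViewController` are hypothetical names; `accessibilityLabel`, `accessibilityTraits`, and `UIAccessibility.isVoiceOverRunning` are the standard APIs:

```swift
import UIKit

// Hypothetical helper: build a state-aware label for a play/pause control.
func playbackLabel(isPlaying: Bool) -> String {
    isPlaying ? "Pause" : "Play"
}

final class PlayerViewController: UIViewController {
    let playButton = UIButton(type: .system)
    var isPlaying = false

    override func viewDidLoad() {
        super.viewDidLoad()
        // Describe the control for VoiceOver instead of relying on its icon.
        playButton.accessibilityLabel = playbackLabel(isPlaying: isPlaying)
        playButton.accessibilityTraits = .button
        // Optionally adapt behavior while VoiceOver is running.
        if UIAccessibility.isVoiceOverRunning {
            // e.g. keep playback controls from auto-hiding
        }
    }
}
```

Keeping the label in sync with state means VoiceOver users always hear the action the button will actually perform.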

    Dim Flashing Lights

    Dim Flashing Lights is a setting that allows people to indicate that they want to avoid bright, frequent flashes of light in video. When someone enables this setting on an Apple device, the device automatically dims video when flashes or strobe effects are detected. Use the Media Accessibility API to check if someone has chosen to dim flashing lights on their Apple device. If you work with media content outside of Apple platforms, learn about the science behind reducing flashing lights in video content to create an enjoyable media-viewing experience for everyone.

    iOS, iPadOS, macOS, tvOS
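As a sketch, the check might look like this. `MADimFlashingLightsEnabled()` is the MediaAccessibility query for the user's setting; `shouldDim` is a hypothetical helper standing in for your own playback logic:

```swift
import MediaAccessibility

// Hypothetical helper: dim only when the setting is on and this
// content actually contains flashing sequences.
func shouldDim(settingEnabled: Bool, contentFlashes: Bool) -> Bool {
    settingEnabled && contentFlashes
}

// Query the user's system-wide Dim Flashing Lights preference.
let userWantsDimming = MADimFlashingLightsEnabled()
if shouldDim(settingEnabled: userWantsDimming, contentFlashes: true) {
    // Lower brightness, or swap in a pre-dimmed rendition of the clip.
}
```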


    Text to Speech

    Apple devices can read selected text from your app out loud in over 60 languages and locales, and you can adjust the voice’s dialect and speaking rate. The AVSpeechSynthesizer class produces synthesized speech from text on a device, and provides methods for controlling or monitoring the progress of ongoing speech.

    iOS, iPadOS, macOS, tvOS, watchOS
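A small AVFoundation sketch. The `clampedRate` helper is hypothetical; `AVSpeechUtterance`, `AVSpeechSynthesisVoice`, and `AVSpeechSynthesizer` are the classes named above (utterance rates run from 0.0 to 1.0, with 0.5 the default):

```swift
import AVFoundation

// Hypothetical helper: clamp a requested speaking rate into 0...1.
func clampedRate(_ rate: Float) -> Float {
    min(max(rate, 0.0), 1.0)
}

let synthesizer = AVSpeechSynthesizer()
let utterance = AVSpeechUtterance(string: "Welcome back.")
utterance.voice = AVSpeechSynthesisVoice(language: "en-US")  // dialect
utterance.rate = clampedRate(0.45)                           // speaking rate
synthesizer.speak(utterance)
```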

    Dynamic Type

    Dynamic Type allows users to choose the text size of content displayed on the screen for better readability. It also accommodates those who can read smaller text, allowing more information to appear on the screen. Apps that support Dynamic Type also provide a more consistent reading experience.

    iOS, iPadOS, watchOS
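In UIKit, text styles opt into Dynamic Type. The snippet below is a sketch; `useVerticalLayout` is a hypothetical layout policy for the largest accessibility sizes:

```swift
import UIKit

// Hypothetical policy: switch to a stacked layout at accessibility sizes.
func useVerticalLayout(isAccessibilityCategory: Bool) -> Bool {
    isAccessibilityCategory
}

let label = UILabel()
// Text styles track the user's preferred size automatically.
label.font = UIFont.preferredFont(forTextStyle: .body)
label.adjustsFontForContentSizeCategory = true
label.numberOfLines = 0  // let long text wrap at large sizes

// Custom fonts can scale too, via UIFontMetrics.
let custom = UIFont(name: "Avenir-Book", size: 17) ?? .systemFont(ofSize: 17)
label.font = UIFontMetrics(forTextStyle: .body).scaledFont(for: custom)

// Re-arrange when the size category demands it.
let category = UITraitCollection.current.preferredContentSizeCategory
if useVerticalLayout(isAccessibilityCategory: category.isAccessibilityCategory) {
    // stack controls vertically instead of side by side
}
```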

    Display customization

    There are a range of features to customize the display, including Bold Text, Increase Contrast, Reduce Transparency, Smart Invert, Differentiate Without Color, On/Off Labels, Button Shapes, Dark Mode, and Reduce Motion. Use UIAccessibility APIs to detect when these settings are enabled so that your app behaves correctly.

    iOS, iPadOS, macOS, tvOS, watchOS
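A sketch of reading those settings through UIAccessibility. `statusSymbol` is a hypothetical helper showing one way to honor Differentiate Without Color:

```swift
import UIKit

// Hypothetical helper: add a glyph when color alone shouldn't carry meaning.
func statusSymbol(isError: Bool, differentiateWithoutColor: Bool) -> String {
    guard differentiateWithoutColor else { return "" }
    return isError ? "✕" : "✓"
}

// Read the user's display settings and adapt.
let reduceMotion = UIAccessibility.isReduceMotionEnabled
let reduceTransparency = UIAccessibility.isReduceTransparencyEnabled
let boldText = UIAccessibility.isBoldTextEnabled
let noColorOnly = UIAccessibility.shouldDifferentiateWithoutColor

// Observe changes so the UI updates live; retain the returned token.
let observer = NotificationCenter.default.addObserver(
    forName: UIAccessibility.reduceMotionStatusDidChangeNotification,
    object: nil, queue: .main
) { _ in
    // Re-render without animations when Reduce Motion turns on.
}
```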

    Audio descriptions and captions

    Let people watch movies with detailed audio descriptions of every scene on iPhone, iPad, Mac, Apple TV, or iPod touch. Use AVFoundation with built-in support for captioning and audio descriptions during media playback in your apps.

    iOS, iPadOS, macOS, tvOS
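A sketch of switching to an audio-described track with AVFoundation. `selectAudioDescription` and the tag-matching helper are hypothetical names; the media-selection APIs and the `.describesVideoForAccessibility` characteristic are standard:

```swift
import AVFoundation

// Hypothetical pure helper mirroring the selection logic:
// keep options that carry a given characteristic tag.
func matching(_ options: [(name: String, tags: Set<String>)],
              tag: String) -> [String] {
    options.filter { $0.tags.contains(tag) }.map { $0.name }
}

// Pick an audio track that describes the video, when the asset has one.
func selectAudioDescription(for item: AVPlayerItem) {
    guard let group = item.asset.mediaSelectionGroup(
        forMediaCharacteristic: .audible) else { return }
    let described = AVMediaSelectionGroup.mediaSelectionOptions(
        from: group.options,
        withMediaCharacteristics: [.describesVideoForAccessibility])
    if let option = described.first {
        item.select(option, in: group)
    }
}
```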


    Mobility

    Apple has multiple solutions that help users with limited physical or motor abilities use your apps. Both Voice Control and Switch Control use the accessibility hierarchy to interact with elements within your app.

    Learn more about Mobility

    Voice Control

    If your app uses accessibility APIs, Voice Control users can navigate your app’s interface using just their voice. Commands like “click,” “swipe,” and “tap” allow them to interact with elements within your app through the APIs.

    iOS, iPadOS, macOS
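A sketch using `accessibilityUserInputLabels`, which lets Voice Control match shorter spoken names for a control. `spokenAliases` is a hypothetical convenience:

```swift
import UIKit

// Hypothetical helper: offer the full label plus its first word
// as spoken aliases.
func spokenAliases(for title: String) -> [String] {
    var labels = [title]
    if let first = title.split(separator: " ").first, first != title {
        labels.append(String(first))
    }
    return labels
}

let sendButton = UIButton(type: .system)
sendButton.accessibilityLabel = "Send message"
// Users can now say "Tap Send" as well as "Tap Send message".
sendButton.accessibilityUserInputLabels = spokenAliases(for: "Send message")
```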

    Switch Control

    With Switch Control, users can navigate your app’s interface using a variety of adaptive devices, such as a switch, joystick, keyboard Space bar, or trackpad. They can move through your app by scanning each UI item, either by manually activating their switch or by using auto scanning. Once they reach the desired item, they can perform the appropriate action with their device. To provide a great experience, make sure to use the accessibility APIs.

    iOS, iPadOS, macOS, tvOS
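Custom accessibility actions give Switch Control (and VoiceOver) users a direct, scannable way to trigger secondary actions that would otherwise hide behind gestures. This is a sketch, with `actionNames` as a hypothetical helper:

```swift
import UIKit

// Hypothetical helper: decide which actions an item exposes.
func actionNames(canArchive: Bool, canFlag: Bool) -> [String] {
    var names: [String] = []
    if canArchive { names.append("Archive") }
    if canFlag { names.append("Flag") }
    return names
}

let cell = UITableViewCell()
// Expose swipe-only actions as explicit, scannable choices.
cell.accessibilityCustomActions = actionNames(canArchive: true, canFlag: true)
    .map { name in
        UIAccessibilityCustomAction(name: name) { _ in
            // perform the action; return true on success
            true
        }
    }
```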

    Keyboard support

    Provide keyboard shortcuts so that people who don’t navigate using a mouse due to limited motor skills can still fully access your app’s features.


    iOS, iPadOS, tvOS, watchOS
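In UIKit this can be as simple as overriding `keyCommands`. The view controller and the `shortcutDisplay` helper are hypothetical; `UIKeyCommand` is the standard API:

```swift
import UIKit

// Hypothetical helper: render a shortcut for display, e.g. in help text.
func shortcutDisplay(input: String) -> String {
    "⌘" + input.uppercased()
}

final class EditorViewController: UIViewController {
    // Command-N creates a note without touching the screen or a mouse.
    override var keyCommands: [UIKeyCommand]? {
        [UIKeyCommand(title: "New Note",
                      action: #selector(newNote),
                      input: "n",
                      modifierFlags: .command)]
    }

    @objc func newNote() { /* create the note */ }
}
```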


    Haptics

    Haptic feedback on Apple Watch can provide valuable information to everyone, but it can be particularly useful to those with a range of disabilities. With access to the Taptic Engine, you can add haptic feedback in your apps.

    iOS, iPadOS, macOS, tvOS, watchOS
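On watchOS, playing a haptic is one call. `shouldPlayHaptic` is a hypothetical in-app mute check; `WKInterfaceDevice.play(_:)` drives the Taptic Engine:

```swift
import WatchKit

// Hypothetical helper: respect an in-app haptics toggle.
func shouldPlayHaptic(inAppHapticsEnabled: Bool) -> Bool {
    inAppHapticsEnabled
}

if shouldPlayHaptic(inAppHapticsEnabled: true) {
    // .success, .failure, .notification, .click, and others are available.
    WKInterfaceDevice.current().play(.success)
}
```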

    Quick Actions

    Quick Actions in watchOS 9 let people quickly perform common tasks within your app. When your app uses the Quick Actions API, people can assign an action and activate that action with a double-pinch gesture. For example, with Quick Actions built into Apple Watch, users can:

    • Answer and end a phone call.
    • Dismiss a notification.
    • Take a photo.
    • Pause and resume an active workout.



    Hearing

    Apple accessibility technologies contain multiple features to accommodate people who are deaf or hard of hearing, including captions, system translation, Made for iPhone (MFi) hearing aid support, sound recognition, and background sounds.

    Learn more about Hearing


    Captions

    Let people watch movies with closed captions or subtitles for the deaf and hard of hearing (SDH) for all audio in every scene on iPhone, iPad, Mac, and Apple TV. Use AVFoundation with built-in support for captioning during media playback in your apps.

    iOS, iPadOS, macOS, tvOS
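A sketch of selecting an SDH caption track with AVFoundation. `selectSDHCaptions` and the tag helper are hypothetical names; the legible media-selection group and the two accessibility characteristics are standard (and note that AVPlayer applies the user's system caption preference automatically by default):

```swift
import AVFoundation

// Hypothetical pure helper mirroring the filter: SDH tracks both
// transcribe dialog and describe music and sound.
func sdhNames(_ tracks: [(name: String, tags: Set<String>)]) -> [String] {
    tracks.filter { $0.tags.contains("dialog") && $0.tags.contains("sound") }
          .map { $0.name }
}

// Explicitly select an SDH option, when the asset provides one.
func selectSDHCaptions(for item: AVPlayerItem) {
    guard let group = item.asset.mediaSelectionGroup(
        forMediaCharacteristic: .legible) else { return }
    let sdh = AVMediaSelectionGroup.mediaSelectionOptions(
        from: group.options,
        withMediaCharacteristics: [.transcribesSpokenDialogForAccessibility,
                                   .describesMusicAndSoundForAccessibility])
    if let option = sdh.first {
        item.select(option, in: group)
    }
}
```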

    Hearing devices

    Top manufacturers have created hearing aids and sound processors designed specifically for iPhone and iPad. These advanced hearing devices provide outstanding sound quality, offer many helpful features, and are as easy to set up and use as any other Bluetooth device.

    People can instantly apply their audiologist’s suggested environmental presets as they go outdoors or enter noisy locations, such as restaurants, without having to rely on additional remotes. If you’re a hearing aid manufacturer and want to make your devices compatible with Apple devices, consult the resources below.

    iOS, iPadOS


    Cognitive

    Apple products contain many technologies that can fit the different ways users learn or communicate, including Guided Access, captions, and word prediction.

    Learn more about Cognitive

    Assistive Access

    Assistive Access, a game-changing new feature, makes technology — and your apps — more accessible to users with cognitive disabilities on iPhone and iPad. Cutting-edge advancements in speech synthesis let you create even more custom experiences.

    iOS, iPadOS, macOS, tvOS

    Guided Access

    Guided Access helps people with autism or other attention and sensory challenges stay focused on the task at hand. Implementing the Guided Access protocol in your app lets you specify which parts of your app are functional, depending on the user’s needs.

    iOS, iPadOS, macOS, tvOS
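A sketch of the Guided Access custom-restriction hooks. The restriction identifier and the `deletionAllowed` helper are hypothetical; the delegate requirements come from `UIGuidedAccessRestrictionDelegate`:

```swift
import UIKit

// Hypothetical helper: translate a restriction state into app policy.
func deletionAllowed(stateIsAllow: Bool) -> Bool {
    stateIsAllow
}

class AppDelegate: UIResponder, UIApplicationDelegate,
                   UIGuidedAccessRestrictionDelegate {
    // Declare a custom restriction a caregiver can toggle.
    var guidedAccessRestrictionIdentifiers: [String]? {
        ["com.example.app.restriction.delete"]  // hypothetical identifier
    }

    func textForGuidedAccessRestriction(withIdentifier id: String) -> String? {
        "Delete Items"
    }

    func guidedAccessRestrictionState(
        forIdentifier id: String,
        didChange newState: UIAccessibility.GuidedAccessRestrictionState
    ) {
        if !deletionAllowed(stateIsAllow: newState == .allow) {
            // hide delete controls while the restriction is active
        }
    }
}
```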


    Resources

    Learn how to build accessible apps with Apple developer tools, documentation, videos, and sample code.

    View resources