Prepare your app for Accessibility Nutrition Labels
Learn how to prepare your app for Accessibility Nutrition Labels by supporting essential accessibility features. Discover how you can enhance interaction methods like VoiceOver and Voice Control by properly configuring accessibility labels, traits, and values for custom controls and gestures. Find out how you can support larger text sizes using Dynamic Type, and prevent content truncation with flexible layouts. And learn how to make your app design more inclusive by adopting Dark Mode, responding to preferences like Differentiate Without Color, and ensuring sufficient contrast.
Chapters
- 0:01 - Welcome
- 2:18 - Interaction methods
- 18:04 - Larger text
- 26:27 - Color
- 32:55 - Next steps
Resources
- Overview of Accessibility Nutrition Labels
- Accessibility
- Learn more about designing for accessibility
Related Videos
WWDC25
WWDC24
Hi everyone. My name is Ryan and I’m a software engineer on the accessibility team. Today I'm excited to be talking about some strategies to prepare your app for Accessibility Nutrition Labels. Apple products are designed with everyone in mind – with an extensive list of built-in features. These features help people customize the ways their devices work to meet their needs. Some of these features fundamentally change the ways that people interact with their device. And others change how color, text, and other UI appears.
Many have APIs that allow you as developers to check these preferences, like text size or Increase Contrast. With Accessibility Nutrition Labels in the App Store, you can indicate which features your app supports.
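As a minimal sketch of what checking one of these preferences can look like, here is how an app might respect Reduce Motion in SwiftUI and UIKit. The view and its count property are hypothetical, just for illustration.

```swift
import SwiftUI

// A hypothetical view that skips its animation when Reduce Motion is on.
struct BadgeCountView: View {
    @Environment(\.accessibilityReduceMotion) private var reduceMotion
    var count: Int

    var body: some View {
        Text("\(count)")
            // Animate the count change only when Reduce Motion is off.
            .animation(reduceMotion ? nil : .default, value: count)
    }
}

// UIKit equivalent: check the preference directly.
let reduceMotionEnabled = UIAccessibility.isReduceMotionEnabled
```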
This empowers people to be confident that your app will work for them before downloading it. There are nine accessibility features that you can indicate support for with Accessibility Nutrition Labels across a range of categories.
You can indicate your support for VoiceOver and Voice Control, two assistive technologies that offer different interaction methods within your app.
You can also indicate whether your app supports five different visual accessibility features: Sufficient Contrast, Dark Interface, Larger Text, Differentiate Without Color Alone, and Reduce Motion.
And if your app features media like audio and video, you can indicate support for captions and audio descriptions. Today, I'll go through a few of these areas and go over some specific APIs that you can use to make your apps more accessible as you extend your support for more accessibility features. I'll begin by discussing the different interaction methods and the considerations you'll need to keep in mind to allow people to interact with your app, even without sight or touch.
Next, I'll explore ways you can adapt your app for larger text sizes.
Finally, I'll talk about ways you can make sure that your app's use of color is accessible to everyone.
All right, I'll start by going over a couple of the different ways that people with disabilities might use accessibility features to interact with your app, and what you can do to make your app more accessible for them.
VoiceOver is a built in screen reader that describes what's on the screen with spoken audio, sound effects, and haptics, or with Braille. When someone using VoiceOver touches the screen, focus moves to the item under their finger. “Collections, button.” And swiping right and left moves the focus to the next or previous item on the screen.
“Map, button. Landmarks, button.” Double tapping the screen activates the focused item.
“Featured landmark, Mount Fuji, button.” And there is a rich set of multi-touch gestures that can be used to control VoiceOver, such as swiping with three fingers to scroll – “Page two of three, Australia slash Oceania, heading.” Twisting with two fingers to select a rotor to move between things like containers or headings – “Containers. Headings.” – and swiping vertically to move between options in the selected rotor.
“Europe, heading. North America, heading.” Voice Control allows people with motor disabilities to fully navigate their devices with voice commands, without needing to touch the screen. Items on the screen can be shown with labels or with numbers, or they can have no annotations. People using Voice Control can speak commands like tap Mount Fuji, scroll down.
And go back.
VoiceOver and Voice Control are the two alternate interaction methods with Accessibility Nutrition Labels. But there are several other ways that people may interact with your app.
With Switch Control, people can use a button, joystick, game controller, or even mouth sounds and face and hand gestures to control their devices.
People can connect a hardware keyboard to iPhone, iPad, Apple Vision Pro, or Mac to fully navigate their device with full keyboard access.
And with Head Tracking and Eye Tracking, people can control a cursor on their screen by moving only their head or eyes. These features all rely on the same underlying system and APIs to expose items in your apps to the assistive technologies. So to narrow your focus, I recommend starting with VoiceOver. Having good support for VoiceOver should get you most of the way there to providing a good experience with voice control and other interaction methods. A great initial check to do with VoiceOver is to swipe through the screens of your app and make sure that VoiceOver can focus on all content. There are three main pieces of information that VoiceOver should read where applicable: The label, a name for the element, the traits, what the element’s role is, and the value, what the current state of the element is.
For example, I'm working toward another badge in the Landmarks app, and I'm checking the activities I've completed so far. When focused on this completed activity, VoiceOver says, “Read the landmark description, completed, button.” “Read the landmark description” is the label. “Button” is the trait and “Completed” is the current value of this item. I'll go over each of these in more detail later.
Before doing that, I want to point out that if you're using default system components and controls for things like buttons, sliders, and toggles, I have great news for you. System frameworks will provide a strong foundation with built-in accessibility for system controls. You'll just want to consider how VoiceOver reads your content in these views, like providing labels to describe images or adding heading traits to text that you add visual weighting to. And if you're creating custom controls, there will be more that you need to do yourself. To set what VoiceOver reads as the name of an element, use the accessibilityLabel modifier in SwiftUI or set the accessibilityLabel property on a UIView.
Text views will automatically have an accessibility label matching their text. But for images and graphics, you may need to set a label yourself. A good label should include any text within the element and describe what is shown. For this picture of Mount Fuji from the Landmarks app, a good accessibility label might be, “Vibrant cherry blossoms frame a snow-capped mountain rising in the background.”
For buttons with images, the label should describe what the button does, not what the icon is. For the heart-shaped favorite button in Landmarks, a good label is “favorite” rather than “heart”.
The label should also not include the role of the element, like “image” or “button”, which VoiceOver will automatically read if you correctly set accessibility traits.
By default, people using Voice Control can refer to an element by its accessibilityLabel. If there are multiple words or phrases that someone might use to refer to an element, you can specify several options with accessibilityInputLabels.
In this example, someone could say "tap favorite”, ”tap heart”, “tap like”, or “tap love” to activate the favorite button.
Accessibility traits help describe what an element is or what it does. For example, whether it's a heading, a button, or an image.
All of these and more can be set with the accessibilityAddTraits SwiftUI modifier, or by setting a UIView's accessibilityTraits property, and VoiceOver will announce the role in a consistent way.
For example, text that serves as a header often has visual prominence, like a larger or bolder font. The header trait conveys that same information to people who rely on VoiceOver. For most traits, all you need to do is set the accessibility traits property, but one trait in particular requires some extra implementation to define behaviors for adjusting an element's value.
I'm adding a new feature to the Landmarks app, so that I can rate each landmark that I’ve visited on a scale from 0 to 5. The adjustable trait is implemented by SwiftUI and UIKit for slider views.
For the slider where I set my rating, VoiceOver reads, “Rating, four, adjustable. Swipe up or down with one finger to adjust the value.” If you're using a custom component, you'll need to define what happens when someone swipes up or down. In SwiftUI, add the accessibilityAdjustableAction modifier and handle the increment and decrement cases. In UIKit, you can add the adjustable trait and implement the accessibilityIncrement and accessibilityDecrement functions. An element with a value that can be changed, like a slider or text field, should use accessibilityValue for the value that’s being set, while keeping accessibilityLabel as a description of what the value or field represents.
For example, when updating the name of a collection in the Landmarks app, VoiceOver reads, “Title, Towering Peaks, Text field.” “Title” is the accessibility label, the name of the field that I’m setting. “Towering Peaks” is the accessibility value, the current value of the field. And “Text field” indicates its trait.
So far, I've covered how you can make sure all of your app's content has a label, value and traits, but you might notice as you use your app with VoiceOver, that some screens take a lot of swipes to get through all the content, or that it's difficult to understand how different elements relate to one another.
As long as all content and functionality is available to VoiceOver, it's actually okay for VoiceOver to not focus on each element individually.
In fact, sometimes you can significantly improve how easy it is to navigate and understand your app by hiding or combining elements.
One case of this is decorative images: images that may help add visual context but don't necessarily provide additional useful information when spoken by VoiceOver. Often, these are images accompanied by text that conveys the same meaning. For example, when I'm adding landmarks to a collection in the Landmarks app, the important thing to know is which landmarks I've added.
The images can help sighted people to more quickly find what they're looking for. But for VoiceOver, the description of the image is less relevant in this screen, and people can already hear the full description of the image when they view the detail page for the landmark.
In this case, VoiceOver should not focus on the thumbnail image and the text separately, which only adds extra swipes and extra verbosity.
“Select, button. Mount Fuji.” “Vibrant cherry blossoms frame a snow-capped mountain rising in the background. Image.” “Mount Everest. A snowy mountain peak is illuminated by the rising sun. Image.” To prevent VoiceOver from focusing on a decorative image in SwiftUI, use the image decorative initializer or set accessibilityHidden to true. In UIKit, set isAccessibilityElement to false.
With this change, checking which landmarks are in my collection is much more streamlined.
“Select, button. Mount Fuji.” “Mount Everest. Rocky mountains.” “Kirkjufell mountain.” Another time when it may be preferable for VoiceOver to not focus on each individual element is when there is a visual or logical grouping between multiple views.
For example, when multiple text elements are put in a stack for a certain visual arrangement. If visually someone would read them together, you should make sure that VoiceOver reads them together also. I'm working on a new feature for the Landmarks app to show flights to landmarks I'm interested in visiting. My view shows information about the origin and destination airports, as well as information about the departure or arrival time. VoiceOver reads each view individually, which takes a lot of swipes to get through and makes it difficult to understand how all the information fits together. “Arriving in, SFO. Airplane departure, image. HND. 15 Minutes.” I can use accessibilityElement(children: .combine) on the entire stack view to group it together for VoiceOver and automatically combine the labels of the child views.
“SFO to HND arriving in 15 minutes” All right, now I'll put it all together with an example. I’m updating my view to rate landmarks from a slider to an HStack with five stars that you can tap to enter your rating.
By default, VoiceOver focuses on each star separately. Even if I set a label for each star to indicate whether it's filled or empty, it's difficult to understand what's going on and what the current rating is.
“Filled star, image.” “Filled star, image.” “Filled star, image.” “Filled star, image.” “Empty star, image.” “Empty star, image.” For a more cohesive experience with VoiceOver, I want VoiceOver to focus on this entire control as one element with an easy way to adjust the rating. If I combine this into one accessibilityElement, set the label and value and add an adjustable action to increment and decrement the rating, now it is much easier to navigate and understand with VoiceOver.
“Rating, four, adjustable. Swipe up or down with one finger to adjust the value.” “Three.” “Two.” “Three.” “Four.” “Five.” With the changes I just made, VoiceOver is essentially just treating this like a standard slider.
If there is a system equivalent to your custom control, there is also a powerful SwiftUI modifier that you can use to have your custom controls inherit accessibility behaviors from the system version.
In the same example, if I apply the accessibilityRepresentation modifier with a system Slider view, VoiceOver will automatically apply the proper label, value, and adjustable behavior, making the rating component behave in the same way I just showed.
One more important consideration is to review any custom gestures in your app. Most places that you're using UIGestureRecognizers or SwiftUI gestures, you’ll need to do some accessibility work also.
Some apps might use single tap gestures for interactive elements instead of button or link views, but sighted people may still be able to use visual indicators like color or symbols to tell that the element is interactive.
For example, you might have text that says “Learn more,” has your app’s accent color, and is accompanied by a chevron symbol. But if it just has a tap recognizer rather than truly being a button, VoiceOver won't announce it as a button.
If you do this, make sure that you add the button trait so that people using VoiceOver can also know that it is interactive even without seeing the visual indicators.
Remember that gestures are mapped differently with VoiceOver, and people using Voice Control may not be able to touch the screen. So if you have gestures with multiple taps or touches or with swipes or drags, there needs to be an alternative way for people using VoiceOver or Voice Control to perform the same action. I added a quick way for people to add landmarks to their favorites by double tapping the image at the top of the detail page.
I can use the accessibilityAction modifier to allow VoiceOver to access this same functionality in the actions rotor. To go even deeper on accessibility custom actions and other SwiftUI accessibility modifiers, check out “Catch up on accessibility in SwiftUI” from WWDC24.
Next, I’ll talk about supporting larger text sizes. Vision is a wide spectrum, and increasing the text size is one of the most commonly used accessibility settings. Some people may have had some degree of vision loss for their entire life, while others may find they need to increase the size a bit as they get older. To make sure everyone can view the content in your app at their preferred text size, it's important that your content scales with Dynamic Type and does not truncate or overlap. In Display & Brightness settings, people can increase the text size from the default 100% up to 135%.
If you go into Display & Text Size settings under Accessibility and turn on Larger Accessibility Sizes, you can increase the text size up to 310%. To add Larger Text support to your Accessibility Nutrition Labels, you should make sure that your app works well up to at least 200%.
The best way to make sure that text in your app scales with Dynamic Type is to use standard system font styles like title, headline, body, and caption. These fonts automatically adapt to the device's text size setting. For example, a body font at the default text size on iOS is a 17 point font, while it scales up to 28 points at the first larger accessibility size and all the way up to 53 points at the largest accessibility size. Note that different font styles scale at different rates.
Titles are large to begin with and don't need to scale by as large of a multiplier as a caption font, which starts off small. While the exact percentage change at a given text size is not identical for all font styles, the relative size of font styles is maintained across all text sizes.
To use these system font styles in SwiftUI, just use the font modifier with one of the predefined styles, and your text will automatically scale as the text size setting is adjusted. In UIKit, set the font with UIFont.preferredFont(forTextStyle:) and set adjustsFontForContentSizeCategory to true to have the text automatically scale.
If you use custom fonts, you can also have them match the auto scaling behavior of system font styles. In SwiftUI, just use the font modifier with a custom font and specify the name, size, and what font style you want to scale relative to.
In UIKit, create your custom font and then use UIFontMetrics for the text style you want to scale with to set your label's font. Remember that different font styles scale at different rates, so make sure to use a text style that is similar in size to your custom font.
When designing an app, it's common to work with designers to produce a detailed specification and then build to match it.
Having good support for larger text sizes starts at the design stage. Make sure the entire team understands that text can scale based on preferences, and layouts may need to change based on Dynamic Type. This is a process that happens all the time here at Apple. Designers start by providing an initial spec for roughly how things should be laid out, and then will follow up with more specific metrics for different text sizes and how layouts should change if needed. Take Control Center, for example, a one by two control can show the name of the control, but the layout doesn't allow much room for the text to grow.
At larger accessibility sizes, these controls can grow to one by four to give text more room to expand across the width of the screen.
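In SwiftUI, one common way to implement this kind of layout change is to switch the container based on the current Dynamic Type size. This is a sketch, assuming iOS 16's AnyLayout and a simple control row invented for illustration.

```swift
import SwiftUI

// A sketch: switch from a horizontal to a vertical layout
// when an accessibility text size is in effect.
struct ControlRow: View {
    @Environment(\.dynamicTypeSize) private var dynamicTypeSize

    var body: some View {
        // AnyLayout (iOS 16+) lets the container change
        // without rebuilding its children.
        let layout = dynamicTypeSize.isAccessibility
            ? AnyLayout(VStackLayout(alignment: .leading))
            : AnyLayout(HStackLayout())

        layout {
            Image(systemName: "flashlight.on.fill")
            Text("Flashlight")
        }
    }
}
```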
If you've encountered challenges from inflexible designs that limit your support for Dynamic Type, I encourage you to make some improvements to your design process. The Human Interface Guidelines for typography have specifications for what size text will be for each style at each setting, which can help figure out a good mapping for how the fonts you use in your app can scale. Once you’re using fonts that can scale, Xcode Previews are a great way to test how different screens in your app respond to Dynamic Type. You can easily add previews for different text sizes to identify where text may be truncating or overlapping. I'll go through an example of how I can use Xcode Previews to identify and fix large text bugs.
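Such previews might look like the following sketch, where BadgeEarnedView stands in for whatever screen you want to check.

```swift
import SwiftUI

// Preview the same screen at the default text size and at a large
// accessibility size, to spot truncation or overlap early.
// BadgeEarnedView is a hypothetical view for illustration.
#Preview("Default text size") {
    BadgeEarnedView()
}

#Preview("Accessibility size 5") {
    BadgeEarnedView()
        .environment(\.dynamicTypeSize, .accessibility5)
}
```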
I'm adding a new screen to the Landmarks app to congratulate people when they earn a new badge. I have a VStack with an image, a title, and a message in the middle of the screen,
6:39 - Add descriptive accessibility labels
// Add descriptive accessibility labels

// SwiftUI
Image(landmark.backgroundImageName)
    .accessibilityLabel("Vibrant cherry blossoms frame a snow-capped mountain rising in the background.")

// UIKit
let imageView = UIImageView(image: UIImage(named: landmark.backgroundImageName))
imageView.isAccessibilityElement = true
imageView.accessibilityLabel = "Vibrant cherry blossoms frame a snow-capped mountain rising in the background."
7:43 - Add alternate labels for Voice Control
// Add alternate labels for Voice Control
Button {
    addFavorite()
} label: {
    Image(systemName: "heart")
}
.accessibilityLabel("Favorite")
.accessibilityInputLabels(["Favorite", "Heart", "Like", "Love"])
8:23 - Set accessibility traits to match an element’s role
// Set accessibility traits to match an element’s role

// SwiftUI
Text("Accessibility Nutrition Labels")
    .font(.title)
    .accessibilityAddTraits(.isHeader)

// UIKit
let label = UILabel()
/*...*/
label.accessibilityTraits = .header
9:29 - Implement custom adjustable behavior
// Implement custom adjustable behavior

// SwiftUI
MySliderView()
    .accessibilityAdjustableAction { direction in
        switch direction {
        case .increment:
            // Increase value
        case .decrement:
            // Decrease value
        }
    }

// UIKit
class MySliderView: UIView {
    override var accessibilityTraits: UIAccessibilityTraits {
        get { .adjustable }
        set {}
    }

    override func accessibilityIncrement() { /* Increase value */ }
    override func accessibilityDecrement() { /* Decrease value */ }
}
9:54 - Set an accessibility value
// Set an accessibility value

// SwiftUI
MyView()
    .accessibilityValue("Value")

// UIKit
myView.accessibilityValue = "Value"
12:26 - Hide decorative images from VoiceOver
// Hide decorative images from VoiceOver

// SwiftUI
Image(decorative: landmark.thumbnailImageName)
/* or */
Image(landmark.thumbnailImageName)
    .accessibilityHidden(true)

// UIKit
imageView.isAccessibilityElement = false
12:59 - Combine elements to improve VoiceOver navigation
// Combine elements to improve VoiceOver navigation
HStack {
    HStack {
        Text("SFO")
        Image(systemName: "airplane.departure")
            .accessibilityLabel("to")
        Text("HND")
    }
    .font(.title3.bold())

    Spacer()

    VStack {
        Text("Arriving in")
        Text("15").font(.title.bold())
        Text("minutes")
    }
}
.accessibilityElement(children: .combine)
15:03 - Make a custom component accessible
// Make a custom component accessible
HStack {
    ForEach(1...5, id: \.self) { index in
        Image(systemName: index <= Int(badgeProgress.rating) ? "star.fill" : "star")
    }
}
.accessibilityElement()
.accessibilityLabel("Rating")
.accessibilityValue("\(Int(badgeProgress.rating))")
.accessibilityAdjustableAction { direction in
    switch direction {
    case .increment:
        badgeProgress.rating = min(5, badgeProgress.rating + 1)
    case .decrement:
        badgeProgress.rating = max(0, badgeProgress.rating - 1)
    }
}
15:37 - Make a custom component accessible
// Make a custom component accessible
HStack {
    ForEach(1...5, id: \.self) { index in
        Image(systemName: index <= Int(badgeProgress.rating) ? "star.fill" : "star")
    }
}
.accessibilityRepresentation {
    Slider(value: $badgeProgress.rating, in: 0...5, step: 1.0) {
        Text("Rating")
    }
}
16:30 - Add accessibility traits for tap gestures
// Add accessibility traits for tap gestures
HStack {
    Text("Learn more")
    Image(systemName: "chevron.forward")
}
.foregroundColor(.blue)
.onTapGesture { /*...*/ }
.accessibilityAddTraits(.isButton)
17:13 - Add accessibility actions for custom gestures
// Add accessibility actions for custom gestures
Image(landmark.backgroundImageName)
    .accessibilityLabel(landmark.imageDescription ?? landmark.name)
    .onTapGesture(count: 2) {
        modelData.addFavorite(landmark)
    }
    .accessibilityAction(named: "Favorite") {
        modelData.addFavorite(landmark)
    }
20:07 - Adopt system text styles for automatic scaling
// Adopt system text styles for automatic scaling

// SwiftUI
Text("Hello World")
    .font(.body)

// UIKit
label.font = UIFont.preferredFont(forTextStyle: .body)
label.adjustsFontForContentSizeCategory = true
20:33 - Make custom fonts scale proportionally with system font styles
// Make custom fonts scale proportionally with system font styles

// SwiftUI
Text("Hello World")
    .font(.custom("MyFont", size: 17, relativeTo: .body))

// UIKit
guard let customFont = UIFont(name: "MyFont", size: 17) else { return }
label.font = UIFontMetrics(forTextStyle: .body).scaledFont(for: customFont)
label.adjustsFontForContentSizeCategory = true
22:57 - Embed content in ScrollView to avoid truncation
// Embed content in ScrollView to avoid truncation
ScrollView {
    VStack(spacing: 24) {
        EarnedBadgeView(badge: badge)
        Text("Congratulations!")
        Text("...")
    }
}
.scrollBounceBehavior(.basedOnSize)
.safeAreaBar(edge: .bottom) {
    VStack {
        Button("Share badge") { }
        Button("Close") { }
    }
}
25:17 - Set number of lines to 0 to avoid truncation
// Set number of lines to 0 to avoid truncation

// SwiftUI
Text("Some longer text that takes up multiple lines.")
    .lineLimit(nil)

// UIKit
label.numberOfLines = 0
26:53 - Check the differentiate without color setting
// Check the differentiate without color setting

// SwiftUI
@Environment(\.accessibilityDifferentiateWithoutColor) var differentiateWithoutColor

// UIKit
let differentiateWithoutColor = UIAccessibility.shouldDifferentiateWithoutColor
NotificationCenter.default.addObserver(self,
    selector: #selector(diffWithoutColorDidChange),
    name: UIAccessibility.differentiateWithoutColorDidChangeNotification,
    object: nil)
27:42 - Differentiate without color alone in Swift Charts
// Differentiate without color alone in Swift Charts
Chart(visitorData) { data in
    LineMark(
        x: .value("Month", data.month),
        y: .value("Visitors", data.numVisitors)
    )
    .foregroundStyle(by: .value("Landmark", data.landmark))

    PointMark(
        x: .value("Month", data.month),
        y: .value("Visitors", data.numVisitors)
    )
    .foregroundStyle(by: .value("Landmark", data.landmark))
    .symbol(by: .value("Landmark", data.landmark))
}
30:36 - Check the preference for Reduce Transparency
// Check the preference for Reduce Transparency

// SwiftUI
@Environment(\.accessibilityReduceTransparency) var reduceTransparencyEnabled

// UIKit
let reduceTransparencyEnabled = UIAccessibility.isReduceTransparencyEnabled
NotificationCenter.default.addObserver(self,
    selector: #selector(reduceTransparencyDidChange),
    name: UIAccessibility.reduceTransparencyStatusDidChangeNotification,
    object: nil)
31:22 - Check the preference for Increase Contrast
// Check the preference for Increase Contrast

// SwiftUI
@Environment(\.colorSchemeContrast) var colorSchemeContrast

// UIKit
let increaseContrastEnabled = view.traitCollection.accessibilityContrast == .high
registerForTraitChanges([UITraitAccessibilityContrast.self],
    action: #selector(accessibilityContrastDidChange))