I am testing the accessibility feature available in the Settings app called "Speak Screen". The help text in the Settings app states that swiping down with two fingers will cause the screen content to be spoken. However, I've been unable to get this feature to work. Every time I try the two-finger swipe down, it behaves the same as the single-finger swipe-down gesture. Usually this manifests as making scroll views bounce.
I've tried toggling the feature on and off, turning off Reachability, and rebooting my phone, but I can't get the Speak Screen gesture to work. If I access the Speak Screen feature from the "Speech Controller" button, the screen's content is spoken as expected, so I know the feature is enabled. It's just the gesture that doesn't work.
Is there something else I need to do to get this gesture to work? I don't want to tell my users to turn this feature on if I can't verify that the gesture will work with my app.
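A minimal UIKit sketch for checking the setting at runtime; it only reports that the feature is enabled, not that the gesture is recognized:
import UIKit

if UIAccessibility.isSpeakScreenEnabled {
    // Speak Screen is on in Settings; the two-finger swipe down from the top of the screen should speak the screen.
}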
Accessibility
Make your apps function for a broad range of users using Accessibility APIs across all Apple platforms.
Posts under Accessibility tag
143 Posts
I have users who need to be able to hear the content of SwiftUI Text views. I have specified the .textSelection(.enabled) modifier for the text views. Adding this modifier causes a "copy" option to appear on long press, but it doesn't enable the visible selection of text, nor does it provide the "Speak" menu item that UIKit allows on text selection.
Is the "Speak Selection" accessibility feature broken for SwiftUI Text views? I've found that there's another accessibility feature that does work (enabling the Speech Controller button for "Speak Screen"). Do I need to tell my users that Apple is deprecating the "Speak Selection" accessibility feature, and that they need to use the Speech Controller instead? Or is there something else I can do to my SwiftUI to get that feature to work?
Hi,
I want to detect if there is a fullscreen window on each screen.
_AXUIElementGetWindow and kAXFullscreenAttribute methods work, but I have to be in a non-sandbox environment to use them.
Is there any other way that also works? I don't think it's enough to judge whether a window is fullscreen by comparing the window size to the screen size, since that doesn't work on a MacBook with a notch, or when the menu bar is set to auto-hide.
Thanks.
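A minimal sketch of the attribute-based check, assuming a non-sandboxed process that has been granted Accessibility permission; "AXFullScreen" is the attribute string commonly used for this:
import ApplicationServices

func isFullscreen(_ window: AXUIElement) -> Bool {
    var value: CFTypeRef?
    let result = AXUIElementCopyAttributeValue(window, "AXFullScreen" as CFString, &value)
    return result == .success && (value as? Bool) == true
}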
Topic:
Accessibility & Inclusion
SubTopic:
General
Tags:
Accessibility
Mac App Store
Core Graphics
App Sandbox
I'm not seeing the Accessibility or Attributes Inspectors in Xcode 26. Were these features removed?
Xcode 16
Xcode 26
I have sent in a feedback report (FB18222398) but I have no idea if anyone has looked at it. I know from past experiences that Apple devs do look at these forums.
This applies to each of the betas: 1, 2, and 3. I have created a new Personal Voice with each beta. I create a personal voice in English. When it's done processing, I tap Preview and it says in English what is expected. But after some time, an hour or a day, the language of the voice file changes and it no longer works properly. If I press Preview, it is no longer intelligible. I have a text-to-speech app, and initially the created voice works, but then when the language of the file changes, it no longer works. I have run an app on my iPhone through Xcode that prints to the console the voices installed on the device along with their language. Currently this is the voice file:
Voice Identifier: com.apple.speech.personalvoice.AAA9C6F2-9125-475F-BA2F-22C63274991D
Language: es-MX
and on a second device the same personal voice is in a different language:
Voice Identifier: com.apple.speech.personalvoice.AAA9C6F2-9125-475F-BA2F-22C63274991D
Language: zh-CN
A previous personal voice file that was listed as Spanish (Mexico) still played English text in English with a Spanish accent, and when playing Spanish text it sounded almost perfect. The current personal voice doesn't do that and is unintelligible. Previous attempts have converted to Chinese.
I hope someone can look into this.
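A sketch of the kind of listing described above, using AVSpeechSynthesisVoice; filtering Personal Voices via voiceTraits assumes iOS 17 or later:
import AVFoundation

for voice in AVSpeechSynthesisVoice.speechVoices() where voice.voiceTraits.contains(.isPersonalVoice) {
    print("Voice Identifier: \(voice.identifier)")
    print("Language: \(voice.language)")
}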
How can I force VoiceOver to read parentheses for math expressions like this:
Text("(2+3)×4") // VoiceOver: Two plus three, times four
I’m looking for a way to have VoiceOver announce parentheses (e.g. “left paren”, “right paren”) without relying on NumberFormatter.Style.spellOut or .speechAlwaysIncludesPunctuation(), as both have drawbacks.
Using .spellOut breaks braille output and Rotor › Characters menu by turning numbers and symbols into words. And .speechAlwaysIncludesPunctuation() makes VoiceOver overly verbose—for example, it reads “21” as “twenty hyphen one.”
Is there a better way to selectively announce specific punctuation like parentheses while keeping numbers and symbols intact for braille and Rotor use?
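One workaround sketch, not an official "math mode": keep the visible text symbolic and supply an explicit spoken label that names the parentheses. The trade-off is that braille output may then show the label text rather than the symbols.
Text("(2+3)×4")
    .accessibilityLabel("left paren 2 plus 3 right paren times 4")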
Hello! I'm adding VoiceOver support for my app, but I'm having an issue where my accessibility value is not being spoken. I have made a helper class that creates an NSString from a double and converts it to the user's region currency.
CurrencyFormatter.m
+ (NSString *)localizedCurrencyStringFromDouble:(double)value {
    NSNumberFormatter *formatter = [[NSNumberFormatter alloc] init];
    formatter.numberStyle = NSNumberFormatterCurrencyStyle;
    formatter.locale = [NSLocale currentLocale];
    // stringFromNumber: returns an autoreleased string, so it remains valid after the formatter is released.
    NSString *currencyString = [formatter stringFromNumber:@(value)];
    [formatter release]; // manual retain/release; omit this line under ARC
    return currencyString;
}
View Controller
self.checkTotalLabel.accessibilityLabel = NSLocalizedString(@"Total Amount", @"Accessibility Label for Total");
self.checkTotalLabel.accessibilityValue = [CurrencyFormatter localizedCurrencyStringFromDouble: total];
I'm confused about whether the value should go into the accessibility label or not. When the currency is just USD and the language is English, it's a simple fix. But when the currency needs to be converted, I'm not sure where to go from here.
If anyone has any guidance, it would help me a lot!
Thank you!
I'm developing a calculator app and working to ensure a great experience for both VoiceOver and Braille display users.
For expressions like (2+3)×5, I need two different accessibility outputs:
VoiceOver (spoken): A descriptive string like “left paren two plus three right paren times five,” provided via .accessibilityValue. I'm using a custom spellOut function since VoiceOver doesn't announce parentheses—which are kind of important when doing math!
Braille (symbolic): The literal math string (2+3)×5, provided using .accessibilityCustomContent("", ...), with an empty label so it’s not spoken aloud.
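A condensed sketch of this dual-output approach; whether braille displays actually pick up the custom content is exactly the open question below:
Text("(2+3)×5")
    .accessibilityValue("left paren two plus three right paren times five") // spoken by VoiceOver
    .accessibilityCustomContent("", "(2+3)×5") // symbolic form intended for braille output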
The issue: I don’t have access to a Braille display device and Xcode’s Accessibility Inspector doesn’t seem to show the custom content.
Is there any way to confirm that custom Braille content is being set correctly in Simulator or with other tools?
Or…is there a "math mode" in VoiceOver that forces it to announce parentheses?
Any advice or workarounds would be much appreciated!
Thanks,
Uhl
Topic:
Accessibility & Inclusion
SubTopic:
General
Tags:
External Accessory
iOS
Accessibility
SwiftUI
I am building a language learning app for an unlisted primary language. My plan is to select English and go with it. Any other suggestions or heads-up? Check screenshot.
It's unfortunate that I have to tag a language learning app incorrectly, and a tag for that language probably does not exist across the Apple system.
Between the two languages, my app would have, let's say, 10% and 90%. I consider whatever is inside the app as the primary language, and that won't be English.
A Summary of the WWDC25 Group Lab - Accessibility
At WWDC25 we launched a new type of Lab event for the developer community - Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers. Group Labs are a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for Accessibility.
Accessibility Nutrition Labels are a really big step forward for the experience people have on the App Store to find apps that will work for them. How should developers get started with Accessibility Nutrition Labels?
A good starting point is to review the Accessibility Nutrition Label evaluation criteria on App Store Connect Help. It's a concise document, roughly 10 pages, and you can approach it section by section after the introduction. Even with prior experience using accessibility features like VoiceOver, the criteria offer valuable insights that might not be immediately apparent. For those newer to accessibility, a good entry point might be one of the visual feature labels, such as Dark Interface, which is a popular and frequently used feature.
Which accessibility features can I indicate support for in Accessibility Nutrition Labels?
The accessibility features covered include support for assistive technologies like VoiceOver and Voice Control, media enhancements such as captions and audio descriptions, and display accommodations. These display accommodations cover options like larger text, dark interface, differentiating without color alone, sufficient contrast, and reduced motion.
With the new Accessibility Nutrition Labels, will app store reviewers validate what we select?
The Accessibility Nutrition Label can be edited at any time without requiring a new app submission. However, if an app inaccurately claims feature support, App Review may contact the developer and request an update to the label or the app.
Are there any updates to tools for analyzing the accessibility of our apps?
Although there aren't new updates this year, continued support for Accessibility Audits is available through Xcode's built-in Accessibility Inspector. XCTest also supports accessibility audits, enabling developers to test app accessibility with every build. These audits analyze aspects like contrast, dynamic type, text clipping, element labels, and more within each view. For a deeper dive, the "Perform accessibility audits for your app" session from WWDC 2023 is a valuable resource.
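A minimal sketch of such an audit in a UI test target (Xcode 15 and later); the test class name is a placeholder:
import XCTest

final class AccessibilityAuditTests: XCTestCase {
    func testAccessibilityAudit() throws {
        let app = XCUIApplication()
        app.launch()
        // Audits the on-screen view for issues such as contrast, Dynamic Type, clipped text, and missing element labels.
        try app.performAccessibilityAudit()
    }
}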
What are accessibility features you wish more people integrated?
Accessibility features such as user input labels optimized for Voice Control, keyboard navigation and shortcuts, and Dynamic Type support could be adopted more widely to benefit users.
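As an example of one of these, a SwiftUI sketch of user input labels for Voice Control; the button action and label strings are illustrative:
Button("Play") { /* start playback */ }
    .accessibilityInputLabels(["Play", "Start", "Go"]) // alternative phrases a Voice Control user can speak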
What were some of the biggest accessibility challenges your team encountered while developing Liquid Glass?
Apple is known for its innovation and strives to deliver a high-quality experience for everyone. Accessibility is considered a core component of visual design from the outset. For example, the Liquid Glass design inherently supports reduced transparency and increased contrast. As design continues to evolve, user feedback submitted through Feedback Assistant is invaluable.
How does Liquid Glass respond to contrast? Especially for text and low contrast environments.
Content legibility is a crucial aspect of the Liquid Glass design. It inherently supports accessibility features like reduced transparency and increased contrast. Your feedback during the beta period and beyond is essential to ensuring Liquid Glass provides a great experience within your apps.
What are some Apple apps that stand out for their accessibility?
Apps like Keynote in the iWork suite offer groundbreaking VoiceOver features to enhance creative productivity for all users. Assistive Access makes core apps such as Messages, Photos, Camera, Phone, and Music more accessible. Podcasts provides transcripts to broaden its reach, and frameworks like SwiftUI ensure that apps built with the latest UI frameworks have excellent built-in accessibility.
Say I have a UI element that moves on the screen. Is it possible to update its accessibility frame as it moves while VoiceOver is focused on it? From my tests, VoiceOver ignores UIAccessibilityLayoutChangedNotification if it's sent repeatedly in a short period of time on iOS, while sending NSAccessibilityLayoutChangedNotification on macOS triggers VoiceOver to reannounce the focused element repeatedly.
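For reference, the iOS notification being described, posted as the element moves; movedView is a placeholder for the element whose accessibility frame changed:
UIAccessibility.post(notification: .layoutChanged, argument: movedView)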
While it is possible to scroll content using VoiceOver on macOS, I was not able to find any NSAccessibility APIs related to it (such as accessibilityScroll: on iOS).
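For comparison, a sketch of the iOS API mentioned above as it would be overridden in a UIView subclass; the class name and page announcement are illustrative:
import UIKit

class CarouselView: UIView {
    override func accessibilityScroll(_ direction: UIAccessibilityScrollDirection) -> Bool {
        // Perform the scroll for the given direction here, then announce the new position.
        UIAccessibility.post(notification: .pageScrolled, argument: "Page 2 of 5")
        return true
    }
}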
I have two overlay views on each side of a horizontal scroll. The overlay views are helper arrow buttons that can be used to scroll quickly. This issue occurs when I use either ZStack or .overlay modifier for layout. I am using accessibilitySortPriority modifier to maintain this reading order.
Left Overlay View
Horizontal Scroll Items
Right Overlay View
When VoiceOver is on and I single-tap on the views, the focus shifts to that particular view as expected. But for the trailing overlay view, the focus does not shift to it; instead, the focus goes to the scroll item behind it.
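A condensed SwiftUI sketch of the layout described above; the arrow-button views are placeholders:
ZStack {
    ScrollView(.horizontal) {
        HStack { /* scroll items */ }
    }
    .accessibilitySortPriority(1)

    HStack {
        LeftArrowButton()
            .accessibilitySortPriority(2)
        Spacer()
        RightArrowButton()
            .accessibilitySortPriority(0) // single tap moves focus to the scroll item behind this view instead
    }
}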
When VoiceOver reads decimal numbers with six or more digits after the decimal, it stops announcing the decimal separator and also adds pauses between each digit.
Text("0.12345") // VoiceOver: "zero **point** one two three four five"
Text("0.123456") // VoiceOver: "zero one, two, three, four, five, six"
How can I force VoiceOver to announce the decimal separator ("point") and not insert pauses regardless of the number of decimal digits?
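One workaround sketch: keep the visible text as-is and supply an explicit spoken label:
Text("0.123456")
    .accessibilityLabel("zero point one two three four five six")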
I’ve noticed that VoiceOver reads currency amounts correctly when they are below a thousand.
Then, for higher amounts, for example 12.225,34 €, VoiceOver reads ‘twelve point two two five thirty four euros’
If the amount is formatted without the thousand separator (12225,34 €) this problem doesn’t exist. (VO reads twelve thousand two hundred and twenty five euros and thirty four cents)
Why is the thousand separator a problem for VoiceOver if this formatting is coming from the currency and locale?
This issue exists in English. I changed my device language to Italian and German and in both cases the number was read correctly even with the separator.
Is there a way to make it work in English?
On iOS, there is accessibilityLanguage.
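For reference, a UIKit sketch of that property; amountLabel is a placeholder UILabel and the locale choice is illustrative:
amountLabel.accessibilityLanguage = "it-IT" // ask VoiceOver to read this element with an Italian voice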
Hello!
My question is about 1) whether we can use any or all accessibility features within a sandboxed app, and 2) what steps we need to take to do so.
Using accessibility permissions, my app was working fine in Xcode. It used NSEvent.addGlobalMonitorForEvents and NSEvent.addLocalMonitorForEvents, along with CGEvent.tapCreate. However, after downloading the same app from the App Store, the code was not working. I believe this was due to differences in how permissions for accessibility are managed in Xcode compared to production.
Is it possible for my app to get access to all accessibility features, while being distributed on the App Store though? Do I need to add / request any special entitlements like com.apple.security.accessibility?
Thanks so much for the help. I have done a lot of research on this online but found some conflicting information, so wanted to post here for a clear answer.
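A sketch for checking, and prompting for, the Accessibility permission at runtime. Note that this only reports trust status; it does not by itself make global event monitors or event taps work from a sandboxed App Store build:
import ApplicationServices

let options = [kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String: true] as CFDictionary
let trusted = AXIsProcessTrustedWithOptions(options)
print("Accessibility permission granted: \(trusted)")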
We are unable to programmatically enable AppleScript automation for VoiceOver on macOS 15 (Sequoia)
In macOS 15, Apple moved the VoiceOver configuration from:
~/Library/Preferences/com.apple.VoiceOver4/default.plist
to a sandboxed path:
~/Library/Group Containers/group.com.apple.VoiceOver/Library/Preferences/com.apple.VoiceOver4/default.plist
Steps to Reproduce:
Use a macOS 15 (ARM64) machine (or GitHub Actions runner image with macOS 15 ARM).
Open VoiceOver:
open /System/Library/CoreServices/VoiceOver.app
Set the SCREnableAppleScript flag to true in the new sandboxed .plist:
plutil -replace SCREnableAppleScript -bool true ~/Library/Group\ Containers/group.com.apple.VoiceOver/Library/Preferences/com.apple.VoiceOver4/default.plist
Confirm csrutil status is either disabled or not enforced.
Attempt to control VoiceOver via AppleScript (e.g., using osascript voiceOverPerform.applescript).
Observe that the AppleScript command fails with no useful output (exit code 1), and VoiceOver does not respond to automation.
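A sketch of the kind of command being attempted, driven from Swift via NSAppleScript; "output" is assumed here as an example VoiceOver scripting command, and on macOS 15 this is where the failure described above appears:
import Foundation

var errorInfo: NSDictionary?
let script = NSAppleScript(source: "tell application \"VoiceOver\" to output \"hello\"")
script?.executeAndReturnError(&errorInfo)
if let errorInfo { print("AppleScript failed: \(errorInfo)") }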
Topic:
Accessibility & Inclusion
SubTopic:
General
Tags:
macOS
Accessibility
App Sandbox
AppleScript