Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.

Posts under Accessibility & Inclusion topic

Post · Replies · Boosts · Views · Activity

How to combine child views and expose the container view for accessibility while, at the same time, exposing the child views for automation?
I have an image and a label inside a UIStackView, which is inside a view controller. I want to make the stack view accessible, so I wrote the code below (all are UIKit elements):

    var image: UIImage!
    var myView: UIStackView!
    var label: UILabel!

    myView.isAccessibilityElement = true
    myView.accessibilityLabel = "Hello"
    myView.accessibilityIdentifier = "test_view"
    image.accessibilityIdentifier = "test_image"
    label.accessibilityIdentifier = "test_label"

How do I expose myView to accessibility and only the children to automation? I tried the two approaches below, and neither worked.

First:

    self.view.accessibilityElements = [myView]
    myView.accessibilityElements = []

From the Apple documentation: "If the object is a view and it's an accessibility element, and accessibilityElements is empty, the system assigns the list of subviews that have an accessibilityIdentifier to automationElements."

Second:

    myView.automationElements = [myView, image, label]

From the Apple documentation: "In some cases, you might want to expose elements for automation but not for accessibility, or vice versa. In a view containing an image with a label under it, for example, you might choose to expose only the label for accessibility. For automation, however, you might include both the image and the label in a test to confirm that both objects exist. In this case, add both the image and the label to automationElements."

I have been stuck on this for days. Help is very much appreciated.
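One possible configuration, sketched under the assumption that the goal is for VoiceOver to land on the container only while XCUITest can still query the children. This is untested; imageView is a hypothetical UIImageView standing in for the poster's image property, and automationElements requires iOS 17 or later:

    import UIKit

    func configureAccessibility(myView: UIStackView, imageView: UIImageView, label: UILabel) {
        // Accessibility: a single element; VoiceOver lands on the container.
        myView.isAccessibilityElement = true
        myView.accessibilityLabel = "Hello"
        myView.accessibilityIdentifier = "test_view"

        // The children keep identifiers so UI tests can find them.
        imageView.accessibilityIdentifier = "test_image"
        label.accessibilityIdentifier = "test_label"

        // Automation: expose only the children to XCUITest queries.
        myView.automationElements = [imageView, label]
    }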
0
0
469
Oct ’24
Why does the macOS window sharing indicator appear for some windows but not others?
On recent versions of macOS, when a window is being shared (via the system screen-capture APIs), the OS sometimes shows a small "shared window" badge in the title bar. I've noticed that this indicator is not consistent:

- For some windows, the badge reliably appears when they are being shared.
- For other windows, the badge never appears, even though the window is actively shared.

In particular, windows that use a standard system title bar seem to show the indicator more often, while windows with custom-drawn or non-standard chrome do not.

My questions are:

- What are the exact conditions under which macOS decides to draw the "shared window" indicator in a window's title bar?
- Is this strictly tied to certain NSWindow styles or masks (e.g. titled vs. borderless)?
- Is there any API or flag I can use to detect programmatically whether a given window will display this system indicator when shared?
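As far as I know there is no documented API for querying the indicator, but as a starting point one could encode the post's own observation as a heuristic. This is an assumption, not documented behavior:

    import AppKit

    // Hypothetical heuristic (from the observation above): the sharing
    // badge has only been seen on windows whose style mask includes a
    // standard system title bar.
    func mightShowSharingIndicator(_ window: NSWindow) -> Bool {
        window.styleMask.contains(.titled)
    }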
0
0
1k
1w
Random Disconnection Issues with iStorage diskAshur on macOS
Hello everyone, I am experiencing random, unexpected disconnections of my iStorage diskAshur external HDD/SSD while using macOS. I have already contacted Apple Support, but unfortunately they could not provide a solution and suggested I reach out to this forum for further assistance. Has anyone else faced similar issues with the diskAshur on macOS? If so, what steps did you take to resolve the problem? Any help or advice would be greatly appreciated. Thank you!
0
0
332
Oct ’24
Screen capture on macOS 15 not working through SSH
Steps: connect to the Mac through SSH and execute "screencapture abc.jpg". It shows the attached popup, even though we have added "ssh-keygen-wrapper" under Privacy & Security > Accessibility and under Privacy & Security > Screen & System Audio Recording in System Settings. Ideally, we would not see this popup at all, since we have already granted the permissions needed to take the screenshot as described above; the same setup works fine on macOS 14 and earlier versions. Even after clicking "Allow for one week", a new SSH session shows the popup again.
0
0
316
Sep ’24
Personal Voice authorization requires app restart
I'm trying to include Apple's Personal Voice feature in an app I'm working on, but I want to use a button or toggle to request access, rather than firing the request on first launch. The problem is that, if AVSpeechSynthesizer is used during the same session before Personal Voice is authorized, the app has to be restarted to use the feature.

Here is a basic example that demonstrates the issue on my iPhone (running 18.1 beta, but the issue was present at least in 18.0, maybe before):

    import AVFoundation
    import SwiftUI

    struct TestView: View {
        let synthesizer = AVSpeechSynthesizer()
        @State private var personalVoices: [AVSpeechSynthesisVoice] = []

        var body: some View {
            VStack(spacing: 100) {
                Text("Personal Voices Available: \(personalVoices.count)")
                Button {
                    speakUtterance(string: "Hello, world!")
                } label: {
                    Image(systemName: "hand.wave.fill")
                        .font(.system(size: 100))
                }
                Button("Fetch Personal Voices") {
                    Task { await fetchPersonalVoices() }
                }
            }
        }

        func fetchPersonalVoices() async {
            AVSpeechSynthesizer.requestPersonalVoiceAuthorization() { status in
                if status == .authorized {
                    personalVoices = AVSpeechSynthesisVoice.speechVoices().filter { $0.voiceTraits.contains(.isPersonalVoice) }
                }
            }
        }

        func speakUtterance(string: String) {
            let utterance = AVSpeechUtterance(string: string)
            if let voice = personalVoices.first {
                utterance.voice = voice
            } else {
                utterance.voice = AVSpeechSynthesisVoice(language: Locale.preferredLanguages[0])
            }
            synthesizer.speak(utterance)
        }
    }

If you tap the hand symbol first (before authorizing Personal Voice), you'll probably notice that the Personal Voices Available number never increases. If you authorize Personal Voice before tapping the hand symbol, it should speak using your Personal Voice as expected. The example code is mostly taken directly from this WWDC23 video (Personal Voice info begins around the 10-minute mark). Does anyone have any idea what could be causing this?

Note: Personal Voice can't be tested in the Simulator. To test, the code needs to run on a physical device that has Personal Voice set up.
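A possible workaround sketch, untested: defer creating the AVSpeechSynthesizer until after authorization succeeds, under the assumption that an instance created before authorization never sees the personal voices within the same session:

    import AVFoundation

    final class PersonalVoiceSpeaker {
        // Retained so speech isn't cut off when the method returns.
        private var synthesizer: AVSpeechSynthesizer?

        func authorizeThenSpeak(_ text: String) {
            AVSpeechSynthesizer.requestPersonalVoiceAuthorization { [weak self] status in
                guard status == .authorized else { return }
                let synth = AVSpeechSynthesizer()  // created only after authorization
                self?.synthesizer = synth
                let utterance = AVSpeechUtterance(string: text)
                if let personal = AVSpeechSynthesisVoice.speechVoices()
                    .first(where: { $0.voiceTraits.contains(.isPersonalVoice) }) {
                    utterance.voice = personal
                }
                synth.speak(utterance)
            }
        }
    }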
0
0
481
Sep ’24
Focus issues with ScrollView on iOS 18
When using an app via an external keyboard, FocusState and .focused used to work just fine until iOS 17. Vertical-axis text fields were also accessible without any issues. But after the iOS 18 update, adding the focused modifier removes elements from the focus order of the external keyboard. One such example: when a button using the focused modifier and @FocusState is inside a ScrollView, and that view is opened via a NavigationLink, the button is not accessible via a Bluetooth (external) keyboard. TextEditor / vertical-axis TextFields also seem to be impacted in external-keyboard focus order when placed inside a ScrollView. Is this a known iOS 18 issue with ScrollView, or is there any tip to get this fixed?

Sample code that can reproduce this issue:

    struct ContentView: View {
        @State private var showBottomSheet: Bool = false
        @State private var goToNextView: Bool = false
        @FocusState private var focused: Bool
        @AccessibilityFocusState private var voFocused: Bool

        var body: some View {
            NavigationView {
                VStack {
                    Text("Hello, world!")
                    // This button works fine with a Bluetooth keyboard in all versions
                    Button("Trigger a bottomsheet") {
                        showBottomSheet = true
                    }
                    .focused($focused)
                    .accessibilityFocused($voFocused)
                    Button("Goto another view") {
                        goToNextView = true
                    }
                    NavigationLink(
                        destination: View2(),
                        isActive: $goToNextView
                    ) {
                        EmptyView()
                    }
                    .accessibility(hidden: true)
                }
                .sheet(isPresented: $showBottomSheet, onDismiss: {
                    focused = true
                    voFocused = true
                }, content: {
                    VStack() {
                        Text("Hello World ! I'm in a bottomsheet")
                        Button("Close me") {
                            showBottomSheet = false
                        }
                    }
                })
                .padding()
            }
        }
    }

    #Preview {
        ContentView()
    }

    struct View2: View {
        @FocusState private var focused: Bool
        @AccessibilityFocusState private var voFocused: Bool
        @State private var showBottomSheet: Bool = false

        var body: some View {
            ScrollView {
                VStack {
                    Text("check")
                    // In iOS 18, this button doesn't get focused with a Bluetooth / external keyboard.
                    // The issue occurs when these three combine in iOS 18: a button using FocusState,
                    // inside a view that has a ScrollView, opened via NavigationLink
                    Button("Trigger a bottomsheet") {
                        showBottomSheet = true
                    }
                    .focused($focused)
                    .accessibilityFocused($voFocused)
                    Button("Test button") {
                    }
                }
                .sheet(isPresented: $showBottomSheet, onDismiss: {
                    focused = true
                    voFocused = true
                }, content: {
                    VStack() {
                        Text("Hello World ! I'm in a bottomsheet")
                        Button("Close me") {
                            showBottomSheet = false
                        }
                    }
                })
                .padding()
            }
        }
    }
0
1
556
Feb ’25
[Technical Issue] BDK Native: Unexpected UI element (notch) appearing only in iPad environment
Hi, my name is Yuki. I'm developing an application with generative AI for junior high and high school students, based on the no-code tool "bubble" (BDK Native). I want to upload this app to the App Store; however, a notch-like interface element appears only in the iPad environment, which is causing my app to fail App Store review. I've reached out to the BDK Native support team about this issue, but they were unable to identify the cause and only offered a refund as a solution. This is particularly frustrating, as I'm unable to proceed with the App Store publication and time is passing without a resolution.

Technical details:

- The notch appears only on iPad devices
- The issue is not present on iPhone versions
- The app was built using bubble/BDK Native
- Multiple App Store submissions have been rejected due to this UI issue

Has anyone encountered a similar issue or knows how to resolve this iPad-specific interface problem? Any guidance or suggestions would be greatly appreciated, as this is blocking our app's release. Thank you in advance for your help!
0
1
578
Nov ’24
VoiceOver Headings Accessibility Rotor with SwiftUI on iOS
Hi,

On iOS, I'd like to mark views that are inside a LazyVStack as headers for VoiceOver (make them appear in the headings rotor). In a VStack, you just add .accessibilityAddTraits(.isHeader) to your header view. However, if your view is in a LazyVStack, that won't work while the view is not visible. As its name implies, LazyVStack is lazy, so that makes sense.

There is very little information online about system rotors, but it seems you are supposed to use .accessibilityRotor() with the headings system rotor (.accessibilityRotor(.headings)) outside of the LazyVStack. Something like the following:

    .accessibilityRotor(.headings) {
        ForEach(entries) { entry in
            // entry.id must be the same as the id of the SwiftUI view it is about
            AccessibilityRotorEntry(entry.name, id: entry.id)
        }
    }

It kind of works, but only kind of. When using .accessibilityAddTraits(.isHeader) in a VStack, the view is in the headings rotor as soon as you change screens. However, when using .accessibilityRotor(.headings), the headers (headings?) are not in the headings rotor at the time the screen appears. You have to move the accessibility focus inside the screen before your headers show up. I'm a beginner in regards to VoiceOver, so I don't know how a blind user used to VoiceOver would perceive this, but it feels to me that having to move the focus before the headers are in the headings rotor means some users would miss them.

So my question is: is there a way to have headers inside a LazyVStack (which are not necessarily visible at first) appear in the headings rotor as soon as the screen appears? (be it using .accessibilityRotor(.headings) or anything else)

The "SwiftUI Accessibility: Beyond the basics" talk from WWDC 2021 mentions custom rotors, not system rotors, but that should be close enough. It mentions that for accessibilityRotor to work properly it has to be applied on an accessibility container, so just in case I tried to move my .accessibilityRotor(.headings) to multiple places, with and without the accessibilityElement(children: .contain) modifier, but that did not seem to change the behavior (and I could not understand why accessibilityRotor could not automatically make the view it is applied on an accessibility container if needed).

Also, a related question: when using .accessibilityRotor(.headings) on a screen, is it fine to mix uses of .accessibilityAddTraits(.isHeader) and .accessibilityRotor(.headings)? In a screen with multiple types of content (something like ScrollView { VStack { MyHeader(); LazyVStack { /* some content */ }; LazyVStack { /* something else */ } } }), having to declare all headers in one place would make code reusability harder.

Thanks
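For reference, a self-contained sketch of the pattern the post describes; Entry and entries are hypothetical stand-ins for the poster's model, and this reproduces the setup in question rather than fixing the rotor-timing behavior:

    import SwiftUI

    struct Entry: Identifiable {
        let id = UUID()
        let name: String
        let isHeader: Bool
    }

    struct SectionedList: View {
        let entries: [Entry]

        var body: some View {
            ScrollView {
                LazyVStack(alignment: .leading) {
                    ForEach(entries) { entry in
                        Text(entry.name)
                            // Takes effect only once the row has been realized.
                            .accessibilityAddTraits(entry.isHeader ? .isHeader : [])
                    }
                }
            }
            // Declared outside the LazyVStack so the entries are known
            // even before the corresponding rows become visible.
            .accessibilityRotor(.headings) {
                ForEach(entries.filter(\.isHeader)) { entry in
                    AccessibilityRotorEntry(entry.name, id: entry.id)
                }
            }
        }
    }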
0
0
72
Jun ’25
Custom prediction panel not working in Google Docs
I'm working on a macOS Accessibility setup for a French-speaking user and I've hit a wall. (I'm not a developer and I'm trying to help my kid with dyslexia.)

I successfully built a custom word prediction panel using the Panel Editor (Keyboard) in macOS Accessibility > Keyboard > Accessibility Keyboard. Here's what I have so far:

• The prediction panel works system-wide: I can use it to type in Finder, Safari, Notes, TextEdit, and even browser search bars.
• The panel appears above all applications and suggestions show up correctly.
• However, it does not work inside Google Docs (tested in Chrome, Safari, and Firefox). Selecting a word from the panel does nothing in the Docs editor.

I suspect this is because:

• Google Docs does not use a standard macOS text input field.
• Docs is a web app that relies on custom JavaScript editors, contentEditable elements, and canvas rendering, so macOS Accessibility APIs (AXTextField, AXInsertText, etc.) don't register or inject text events.
• Accessibility tools like the Accessibility Keyboard rely on native macOS text input methods, which don't hook into Google Docs' custom editor.

Important: I'm not a programmer. I'd like to know if there is an easy fix or option in macOS, Google Chrome, or Google Docs that would make my custom prediction panel work, before going into custom development.

Technical setup:

• MacBook Air (M2, 2022)
• RAM: 8 GB
• macOS: Sequoia 15.3.1
• Language: French (system and keyboard)
• Accessibility Keyboard: enabled via Settings > Accessibility > Keyboard
• Custom panel: built using Panel Editor (Keyboard), named "Philemon Prédiction"
• Browsers tested: Chrome, Safari, Firefox (same issue)
• Behavior: panel is visible, suggestions appear, but inserting text does nothing in Google Docs

Has anyone worked around this limitation? Is there a simple setting, workaround, or accessibility option to bridge macOS Accessibility input with Google Docs' editor? Thanks a lot!
0
1
906
2w
When using UIScrollView and UITextView together, inserting a link causes the following text to disappear.
Since UITextView does not support zooming on its own, I add the UITextView as a subview of a UIScrollView and use the scroll view's zoom. However, when a link is inserted, the text following it goes missing. For example, with "https://appstoreconnect.apple.com/login\nApple Developer Login", the text "Apple Developer Login" does not appear. If anyone has experienced the same problem or knows a solution, please leave a comment.

Note: it works normally on iOS 16, but the text after the link disappears on iOS 18. The text is not visible, but you can copy and paste it to retrieve the missing text.
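For context, a minimal sketch of the setup the post describes (names are illustrative); this reproduces the configuration rather than fixing the iOS 18 behavior:

    import UIKit

    final class ZoomableTextViewController: UIViewController, UIScrollViewDelegate {
        private let scrollView = UIScrollView()
        private let textView = UITextView()

        override func viewDidLoad() {
            super.viewDidLoad()
            scrollView.frame = view.bounds
            scrollView.delegate = self
            scrollView.minimumZoomScale = 1.0
            scrollView.maximumZoomScale = 3.0
            textView.frame = scrollView.bounds
            textView.isScrollEnabled = false  // let the outer scroll view scroll
            scrollView.addSubview(textView)
            view.addSubview(scrollView)
        }

        // Zoom the text view through the enclosing scroll view,
        // since UITextView has no zoom of its own.
        func viewForZooming(in scrollView: UIScrollView) -> UIView? {
            textView
        }
    }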
0
0
220
Feb ’25
Avoid trackpad gesture conflict between dragging and accessibility zooming when using three fingers
"Double-tap three fingers and drag to change zoom" should suppress "Three Finger Drag". Currently these gestures trigger simultaneously, for no real reason. I saw different behaviors in different environments, but none is the desired one. This seems like a bug, so I filed a feedback report.
0
0
698
3w
Getting isAuthenticated: false when trying to retrieve a book for India
URL:

    https://uclient-api.itunes.apple.com/WebObjects/MZStorePlatform.woa/wa/lookup?version=2&id=1515995528&p=mdm-lockup&caller=MDM&platform=enterprisestore&cc=IN

Response:

    {
      "results": {},
      "version": 2,
      "isAuthenticated": false,
      "meta": {
        "storefront": {
          "id": "143467",
          "cc": "IN"
        },
        "language": {
          "tag": "en-gb"
        }
      }
    }

Any solution for this?
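For anyone trying to reproduce, a minimal unauthenticated fetch of the same endpoint (URL copied verbatim from the post, to be run where the process stays alive, e.g. an app or playground). This only shows the call; it does not answer which credentials the lookup service expects before isAuthenticated becomes true:

    import Foundation

    let url = URL(string: "https://uclient-api.itunes.apple.com/WebObjects/MZStorePlatform.woa/wa/lookup?version=2&id=1515995528&p=mdm-lockup&caller=MDM&platform=enterprisestore&cc=IN")!
    let task = URLSession.shared.dataTask(with: url) { data, _, _ in
        // With no authentication attached, the body matches the response
        // above: empty results and "isAuthenticated": false.
        if let data, let body = String(data: data, encoding: .utf8) {
            print(body)
        }
    }
    task.resume()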
0
0
342
Nov ’24
App in Unlisted Language
I am building a language learning app for an unlisted primary language. Any suggestions or heads-ups? My plan is to select English and go with it. It's unfortunate that I have to list a language learning app incorrectly, and a tag for that language probably does not exist across the Apple system.
0
0
143
Jul ’25
A Summary of the WWDC25 Group Lab - Accessibility
At WWDC25 we launched a new type of Lab event for the developer community - Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers. Group Labs are a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for Accessibility.

Accessibility Nutrition Labels are a really big step forward for the experience people have on the App Store to find apps that will work for them. How should developers get started with Accessibility Nutrition Labels?

A good starting point is to review the Accessibility Nutrition Label evaluation criteria on App Store Connect Help. It's a concise document, roughly 10 pages, and you can approach it section by section after the introduction. Even with prior experience using accessibility features like VoiceOver, the criteria offer valuable insights that might not be immediately apparent. For those newer to accessibility, a good entry point might be one of the visual feature labels, such as Dark Interface, which is a popular and frequently used feature.

Which accessibility features can I indicate support for in Accessibility Nutrition Labels?

The accessibility features covered include support for assistive technologies like VoiceOver and Voice Control, media enhancements such as captions and audio descriptions, and display accommodations. These display accommodations cover options like larger text, dark interface, differentiating without color alone, sufficient contrast, and reduced motion.

With the new Accessibility Nutrition Labels, will App Store reviewers validate what we select?

The Accessibility Nutrition Label can be edited at any time without requiring a new app submission. However, if an app inaccurately claims feature support, App Review may contact the developer and request an update to the label or the app.

Are there any updates to tools for analyzing the accessibility of our apps?

Although there aren't new updates this year, continued support for accessibility audits is available through Xcode's built-in Accessibility Inspector. XCTest also supports accessibility audits, enabling developers to test app accessibility with every build. These audits analyze aspects like contrast, Dynamic Type, text clipping, element labels, and more within each view. For a deeper dive, the "Perform accessibility audits for your app" session from WWDC 2023 is a valuable resource.

What are accessibility features you wish more people integrated?

User input labels optimized for Voice Control, keyboard navigation and shortcuts, and Dynamic Type support could all see wider adoption, to users' benefit.

What were some of the biggest accessibility challenges your team encountered while developing Liquid Glass?

Apple is known for its innovation and strives to deliver a high-quality experience for everyone. Accessibility is considered a core component of visual design from the outset. For example, the Liquid Glass design inherently supports reduced transparency and increased contrast. As the design continues to evolve, user feedback submitted through Feedback Assistant is invaluable.

How does Liquid Glass respond to contrast, especially for text and in low-contrast environments?

Content legibility is a crucial aspect of the Liquid Glass design. It inherently supports accessibility features like reduced transparency and increased contrast. Your feedback during the beta period and beyond is essential to ensuring Liquid Glass provides a great experience within your apps.

What are some Apple apps that stand out for their accessibility?

Apps like Keynote in the iWork suite offer groundbreaking VoiceOver features to enhance creative productivity for all users. Assistive Access makes core apps such as Messages, Photos, Camera, Phone, and Music more accessible. Podcasts provides transcripts to broaden its reach, and frameworks like SwiftUI ensure that apps built with the latest UI frameworks have excellent built-in accessibility.
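As a pointer for the audit question above, a minimal sketch of an XCTest accessibility audit (available in Xcode 15 and later; the test class name is illustrative):

    import XCTest

    final class AccessibilityAuditTests: XCTestCase {
        func testAccessibilityAudit() throws {
            let app = XCUIApplication()
            app.launch()
            // Audits the on-screen view for contrast, Dynamic Type,
            // element-label, and related issues; fails on any finding.
            try app.performAccessibilityAudit()
        }
    }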
0
0
822
Jul ’25