Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.

Posts under Accessibility & Inclusion topic

Post

Replies

Boosts

Views

Activity

How to Enable Group Navigation Behavior for Custom Views in VoiceOver?
In VoiceOver, when using the Group Navigation style, the cursor first focuses on the semantic group. To navigate inside the group, a two-finger swipe (left or right) can be used. This behavior works for default containers like the Navigation Bar, Tab Bar, and Toolbar. How can I achieve the same behavior for a custom view? I tried setting accessibilityContainerType = .semanticGroup, but it only works on Mac Catalyst. Is there an equivalent approach for iOS?
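For reference, a minimal sketch of the configuration described above (the container type is set on a hypothetical custom container view; on iOS this alone may not produce the two-finger-swipe group behavior, which is exactly the open question):

```swift
import UIKit

// Hypothetical custom container whose children should be treated
// as one semantic group by VoiceOver.
final class CardContainerView: UIView {

    override func didMoveToSuperview() {
        super.didMoveToSuperview()

        // Expose the view as a container, not as a single element.
        isAccessibilityElement = false

        // The configuration the post describes trying: mark the
        // container as a semantic group, analogous to navigation
        // bars, tab bars, and toolbars.
        accessibilityContainerType = .semanticGroup

        // Label announced when the group itself gains focus.
        accessibilityLabel = "Card"
    }
}
```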
0
0
410
Mar ’25
External Keyboard + VoiceOver focus not working with .searchable + List
While editing the search text using the external keyboard (with VoiceOver on), if I try to navigate to the List using the keyboard, the focus jumps back to the search field immediately, preventing selection of list items. It's important to note that VoiceOver navigation alone, without a keyboard, works as expected. It’s as if the List never gains focus—every attempt to move focus lands back on the search field. The code:

```swift
struct ContentView: View {
    @State var searchText = ""
    let items = ["Apple", "Banana", "Cherry", "Date", "Elderberry", "Fig", "Grape"]

    var filteredItems: [String] {
        if searchText.isEmpty {
            return items
        } else {
            return items.filter { $0.localizedCaseInsensitiveContains(searchText) }
        }
    }

    var body: some View {
        if #available(iOS 16.0, *) {
            NavigationStack {
                List(filteredItems, id: \.self) { item in
                    Text(item)
                }
                .navigationTitle("Fruits")
                .searchable(text: $searchText)
            }
        } else {
            NavigationView {
                List(filteredItems, id: \.self) { item in
                    Text(item)
                }
                .navigationTitle("Fruits")
                .searchable(text: $searchText)
            }
        }
    }
}
```
1
0
89
Jun ’25
Best Way to Navigate to the Top Element Using VoiceOver
I’m currently focused on an element at the bottom of the screen. What is the proper way to quickly navigate to the top element? By default, there’s a four-finger single tap to move to the first element, but should I use the Rotor action instead to focus on the element I need? For example, in the Contacts app while adding a new contact, if I enter a value in a field at the bottom, there’s no quick way to directly save the contact. I have to manually navigate all the way to the top to tap the Done button, which feels a bit inconvenient. Is there a better way to handle this using VoiceOver?
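One app-side workaround (not a VoiceOver gesture) is to expose the primary action as an accessibility custom action on the lower fields themselves, so a VoiceOver user can trigger it from the Actions rotor without travelling back to the top. A minimal sketch, assuming a hypothetical save() handler:

```swift
import SwiftUI

struct NewContactPhoneField: View {
    @Binding var phoneNumber: String
    let save: () -> Void   // hypothetical save handler

    var body: some View {
        TextField("Phone", text: $phoneNumber)
            // VoiceOver users can choose "Save Contact" from the
            // Actions rotor instead of navigating up to "Done".
            .accessibilityAction(named: "Save Contact") {
                save()
            }
    }
}
```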
2
0
343
Mar ’25
How to Ensure Data Privacy with VoiceOver Reading Sensitive Information?
VoiceOver reads out all visible content on the screen, which is essential for visually challenged users. However, this raises a privacy concern—what if a user accidentally focuses on sensitive information, like a bank account password, and it gets read aloud? How can developers prevent VoiceOver from exposing confidential data while still maintaining accessibility? Are there best practices or recommended approaches to handle such scenarios effectively?
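Two commonly used mitigations, sketched below: use a secure text field so VoiceOver announces masked characters rather than the typed secret, and override the spoken value of any label that displays sensitive data. The names here are illustrative, not a prescribed pattern:

```swift
import UIKit

final class AccountViewController: UIViewController {

    private let passwordField = UITextField()
    private let accountNumberLabel = UILabel()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Secure entry: VoiceOver reads bullets, not the typed characters.
        passwordField.isSecureTextEntry = true

        // For a label that shows sensitive data visually, override what
        // VoiceOver speaks so the actual value is never read aloud.
        accountNumberLabel.text = "1234 5678 9012"
        accountNumberLabel.accessibilityLabel = "Account number hidden"
    }
}
```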
1
0
340
Mar ’25
App in Unlisted Language
I am building a language-learning app for an unlisted primary language. Any suggestions or heads-ups? My plan is to select English and go with it. It's unfortunate that I have to list a language-learning app incorrectly, and a tag for that language probably does not exist anywhere in the Apple system.
0
0
169
Jul ’25
How to force VoiceOver to read the decimal point even when there are 6 or more decimal digits?
When VoiceOver reads decimal numbers with six or more digits after the decimal, it stops announcing the decimal separator and also adds pauses between each digit.

```swift
Text("0.12345")  // VoiceOver: "zero **point** one two three four five"
Text("0.123456") // VoiceOver: "zero one, two, three, four, five, six"
```

How can I force VoiceOver to announce the decimal separator ("point") and not insert pauses regardless of the number of decimal digits?
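One possible workaround, sketched below, is to bypass the automatic number reading by supplying an explicit accessibility label that spells out the digits and the separator yourself:

```swift
import SwiftUI

struct DecimalText: View {
    let value: String   // e.g. "0.123456"

    // Builds a spoken form such as "0 point 1 2 3 4 5 6", so the
    // separator is always announced regardless of digit count.
    private var spokenValue: String {
        value.map { character in
            character == "." ? "point" : String(character)
        }
        .joined(separator: " ")
    }

    var body: some View {
        Text(value)
            .accessibilityLabel(spokenValue)
    }
}
```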
1
0
262
Jun ’25
Registering a macOS app for dynamic text sizing in macOS 15
macOS 15 includes a neat section in System Settings to change the dynamic text size, as outlined here: https://support.apple.com/guide/mac-help/make-text-and-icons-bigger-mchld786f2cd/mac However, it's not immediately clear a) how to get one's app into this list, and b) whether the usual methods from iOS for reacting to text size even work on macOS. Does anyone have any experience here? Or should I implement my own controls in my app's settings and call it a day? For context, my app is a macOS-native SwiftUI app.
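I can't speak to how the per-app list in System Settings is populated, but for part b): in a macOS-native SwiftUI app the usual Dynamic Type plumbing is the dynamicTypeSize environment value plus the semantic text styles. A minimal sketch, assuming the system propagates the setting to the app:

```swift
import SwiftUI

struct SummaryView: View {
    // Reflects the current system text size, including any
    // per-app override the OS applies.
    @Environment(\.dynamicTypeSize) private var typeSize

    var body: some View {
        VStack(alignment: .leading) {
            // Semantic styles (.body, .headline, ...) scale with the
            // dynamic type size automatically.
            Text("Current size: \(String(describing: typeSize))")
                .font(.body)

            if typeSize.isAccessibilitySize {
                // Simplify the layout at accessibility sizes if needed.
                Text("Accessibility size active")
                    .font(.headline)
            }
        }
        .padding()
    }
}
```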
1
0
605
Jan ’25
accessibilityRespondsToUserInteraction returns true on Simulator but false on device
I am seeing a strange issue where NSObject's accessibilityRespondsToUserInteraction returns true on the Simulator but false on a device. Checking the same object on the Simulator with Accessibility Inspector, I see the object's trait is image, so why would it return true in that case? Is there any other way to check whether an item responds to user interaction or is clickable, besides that property and the traits? (Or is it just another bug?)
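For the last question, a hedged sketch of a combined check that falls back to inspecting the traits directly; this is only a heuristic, and the helper name is illustrative:

```swift
import UIKit

// Heuristic: treat an object as "clickable" if it reports
// accessibilityRespondsToUserInteraction, or if its traits
// imply an interactive element.
func isLikelyInteractive(_ object: NSObject) -> Bool {
    if object.accessibilityRespondsToUserInteraction {
        return true
    }

    let interactiveTraits: UIAccessibilityTraits = [
        .button, .link, .adjustable, .keyboardKey
    ]
    return !object.accessibilityTraits.intersection(interactiveTraits).isEmpty
}
```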
1
0
93
Jun ’25
Accessibility & Inclusion
We are developing Apple AI for foreign markets and adapting it for iPhone models 17 and above. When the system language and Siri language are not the same—for example, if the system is in English and Siri is in Chinese—it can cause a situation where Apple AI cannot be used. So, may I ask if there are any other reasons that could cause Apple AI to be unavailable within the app, even if it has been enabled?
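Assuming "Apple AI" here refers to Apple Intelligence, one way an app can inspect availability and the reason for unavailability (device eligibility, the feature not being enabled, model assets not ready) is the Foundation Models framework's SystemLanguageModel. This is a sketch based on my understanding of that framework and is an assumption, not a confirmed answer to the language-mismatch case:

```swift
import FoundationModels

// Sketch: query on-device model availability and log the reason
// when it is unavailable.
func checkAppleIntelligenceAvailability() {
    let model = SystemLanguageModel.default

    switch model.availability {
    case .available:
        print("Apple Intelligence model is available")
    case .unavailable(let reason):
        print("Unavailable: \(reason)")
    }
}
```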
0
0
158
1d
AXSpeech Thread Crash SEGV_ACCERR
Hi everyone, I've encountered a rare and strange crash in my app that I can't consistently reproduce. The crash seems to occur deep within Apple's internal frameworks, and I can't pinpoint which line of my own code is causing it. Here's the crash stack trace:

```
#44 AXSpeech  SIGSEGV SEGV_ACCERR
0   CoreFoundation           ___CFCheckCFInfoPACSignature + 4
1   CoreFoundation           _CFRunLoopSourceSignal + 28
2   Foundation               _performQueueDequeue + 492
3   Foundation               ___NSThreadPerformPerform + 88
4   CoreFoundation           ___CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 28
5   CoreFoundation           ___CFRunLoopDoSource0 + 176
6   CoreFoundation           ___CFRunLoopDoSources0 + 340
7   CoreFoundation           ___CFRunLoopRun + 828
8   CoreFoundation           _CFRunLoopRunSpecific + 608
9   Foundation               -[NSRunLoop(NSRunLoop) runMode:beforeDate:] + 212
10  TextToSpeech             _TTSCFAttributedStringCreateStringByBracketingAttributeWithString + 776
11  Foundation               ___NSThread__start__ + 732
12  libsystem_pthread.dylib  __pthread_start + 136
```

Sometimes, instead of line 10 referencing _TTSCFAttributedStringCreateStringByBracketingAttributeWithString, it shows:

```
10  TextToSpeech             LogWarning(char const*, ...) + 7288
```

Has anyone experienced a similar issue or know what might be triggering this crash? Any guidance on how to investigate or resolve this would be greatly appreciated. Thank you!
1
0
147
Jun ’25
TabItems in SwiftUI do not scale
I have a TabView with a sample tabItem as follows:

```swift
.tabItem {
    Label("Import", systemImage: "doc.on.doc")
        .accessibilityLabel("Import Text")
}
```

But the accessibility setting for larger display sizes does not seem to have any effect, nor do dynamic font sizes:

```swift
.tabItem {
    Label("Import", systemImage: "doc.on.doc")
        .font(.largeTitle)
        .accessibilityLabel("Import Text")
}
```

The tabItems appear at a fixed size. The tab contents scale well, so this does not look pleasant at all. Is this a known bug in SwiftUI?
0
0
733
Jul ’25
SwiftUI Accessibility Inspector?
Please excuse me if this is obvious; I'm new to Apple development. Is there a SwiftUI Accessibility Inspector? I run the standard one, in Xcode 26b3, and it shows me warnings for things that I didn't create in SwiftUI. I presume that SwiftUI is primarily implemented using macros and that these things are either generated or boilerplate lower-level things. But if so, why would they trip Accessibility Inspector warnings? Is there something I can do from SwiftUI to clear them? Or is there a demangler somewhere that will translate these names into something this human might recognize? I'm targeting macOS, by the way, if that makes any difference.
1
0
1.3k
Jul ’25
Audio DriverKit read operation sync error
When my USB interface is recording, synchronization does not work well. I found that in_io_buffer_frame_size is the same for every IO cycle, and it is not synchronized with UpdateCurrentZeroTimestamp. So is the AudioDriverKit read operation not handled the same way as the write operation? What is the correct way to sync with incoming USB data? If I only play audio, syncing with UpdateCurrentZeroTimestamp works fine. Thanks!
5
0
677
Jan ’25
How to lookup keybinding translation across input sources
I have an application that binds a menu item to trigger on ⌘]. When I set the US input source, I press ⌘] in order to trigger that item. However, when I switch the input source to QWERTZ (German), the trigger is changed to ⌘Ä automatically by the OS. It seems to translate keystrokes for different input sources. The problem is that I also render the keybindings in a window in my application, and my application is not aware of this translation. Furthermore, I have other keyboard shortcuts in my application which are not bound to menu items, and I want to make sure those get translated too. Does AppKit expose a way to look up what a keystroke will be when macOS translates it, i.e. look up ⌘Ä from ⌘] when the current layout is QWERTZ? I can't find anything in Apple's docs. I tried converting a character to a virtual key code based on the US layout and then mapping it back to a character based on the QWERTZ layout. That doesn't seem to be the same, because that ends up converting ] to + instead, which seems to be based on physical key location, different from how the keybindings are handled.

Update: I notice similar behavior for VS Code's menu bar, e.g. in their "Terminal" menu. Switching to German changes some bindings. This does not occur at all in iTerm's menu bar, I suspect because their menu items are specified in a different way: xib files with hard-coded key equivalents.
0
0
433
Jan ’25
Proposal: Using ARKit Body Tracking & LiDAR for Sign Language Education (Real-time Feedback)
Hi everyone, I’ve been analyzing the current state of Sign Language accessibility tools, and I noticed a significant gap in learning tools: we lack real-time feedback for students (e.g., "Is my hand position correct?"). Most current solutions rely on 2D video processing, which struggles with depth perception and occlusion (hand-over-hand or hand-over-face gestures), which are critical in Sign Language grammar. I'd like to propose/discuss an architecture leveraging the current LiDAR + Neural Engine capabilities found in iPhone devices to solve this.

The Concept: Skeleton-based Normalization
Instead of training ML models on raw video frames (which introduces noise from lighting, skin tone, and clothing), we could use ARKit's Body Tracking to abstract the input.
- Capture: Use ARKit/LiDAR to track the user's upper body and hand joints in 3D space.
- Data Normalization: Extract only the vector coordinates (X, Y, Z of joints). This creates a "clean" dataset, effectively normalizing the user regardless of physical appearance.
- Comparison: Feed these vectors into a CoreML model trained on "Reference Skeletons" (recorded by native signers).
- Feedback Loop: The app calculates the geometric distance between the user's pose and the reference pose to provide specific correction (e.g., "Raise your elbow 10 degrees").

Why this approach?
- Solves Occlusion: LiDAR handles depth much better than standard RGB cameras when hands cross the body.
- Privacy: We are processing coordinates, not video streams.
- Efficiency: Comparing vector sequences is computationally cheaper than video analysis, preserving battery life.

Has anyone experimented with using ARKit Body Anchors specifically for comparing complex gesture sequences against a stored "correct" database? I believe this "Skeleton First" approach is the key to scalable Sign Language education apps. Looking forward to hearing your thoughts.
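A minimal sketch of the "geometric distance" step described above, comparing a live ARBodyAnchor's joint positions against a stored reference pose; the reference dictionary and the joint selection are hypothetical:

```swift
import ARKit
import simd

// Compare selected joints of a live body anchor against a stored
// reference pose and return the mean joint-to-joint distance in meters.
func poseDeviation(of bodyAnchor: ARBodyAnchor,
                   from reference: [String: SIMD3<Float>]) -> Float? {
    // Joints to compare; keys in `reference` use the same raw joint names.
    let joints: [ARSkeleton.JointName] = [.head, .leftShoulder, .rightShoulder, .leftHand, .rightHand]

    var total: Float = 0
    var count = 0

    for joint in joints {
        // Joint transform expressed relative to the body anchor's root.
        guard let transform = bodyAnchor.skeleton.modelTransform(for: joint),
              let target = reference[joint.rawValue] else { continue }

        let position = SIMD3<Float>(transform.columns.3.x,
                                    transform.columns.3.y,
                                    transform.columns.3.z)
        total += simd_distance(position, target)
        count += 1
    }

    guard count > 0 else { return nil }
    return total / Float(count)
}
```

The threshold for "close enough" and the mapping from deviation to spoken feedback would be up to the app.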
1
0
537
2w
Assistive Access Bugs
Hi! I have noticed a few glitches as well as some overall unfortunate cons with the Assistive Access mode.
- Alarms, timers, stopwatch, etc. do not sound or alert. However, I have an infant monitor app and I do get that sound alert, so I know it is possible. Do I need to download a separate alarm app for it to work?
- Cannot make FaceTime calls with favorite contacts.
- Find My iPhone cannot jump to the Maps app.
- Camera cannot zoom in or out.
- Photos cannot be deleted, edited, or shared in a shared album in the Photos app.
- Photos/videos cannot be sent in Messages.
- Spotify cannot be accessed from the lock screen.
- Apps do not stay open if you lock the phone screen or leave it on too long without touching the screen (auto-lock).
- There is no flashlight option. I downloaded an app to have this feature, but without being touched the screen will lock, which shuts off the flashlight feature in the app until I unlock the phone again.
1
0
115
Mar ’25