Accessibility

Make your apps function for a broad range of users using Accessibility APIs across all Apple platforms.

Posts under Accessibility tag

138 Posts

Post | Replies | Boosts | Views | Activity

A Summary of the WWDC25 Group Lab - Accessibility
At WWDC25 we launched a new type of Lab event for the developer community - Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers, and a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for Accessibility.

Accessibility Nutrition Labels are a really big step forward in helping people find apps on the App Store that will work for them. How should developers get started with Accessibility Nutrition Labels?
A good starting point is to review the Accessibility Nutrition Label evaluation criteria on App Store Connect Help. It's a concise document, roughly 10 pages, and you can approach it section by section after the introduction. Even with prior experience using accessibility features like VoiceOver, the criteria offer valuable insights that might not be immediately apparent. For those newer to accessibility, a good entry point might be one of the visual feature labels, such as Dark Interface, which is a popular and frequently used feature.

Which accessibility features can I indicate support for in Accessibility Nutrition Labels?
The covered features include support for assistive technologies like VoiceOver and Voice Control, media enhancements such as captions and audio descriptions, and display accommodations. The display accommodations cover options like larger text, dark interface, differentiating without color alone, sufficient contrast, and reduced motion.

With the new Accessibility Nutrition Labels, will App Store reviewers validate what we select?
The Accessibility Nutrition Label can be edited at any time without requiring a new app submission. However, if an app inaccurately claims feature support, App Review may contact the developer and request an update to the label or the app.

Are there any updates to tools for analyzing the accessibility of our apps?
Although there are no new updates this year, continued support for accessibility audits is available through Xcode's built-in Accessibility Inspector. XCTest also supports accessibility audits, enabling developers to test app accessibility with every build. These audits analyze aspects like contrast, Dynamic Type, text clipping, element labels, and more within each view. For a deeper dive, the "Perform accessibility audits for your app" session from WWDC 2023 is a valuable resource.

What are accessibility features you wish more people integrated?
User input labels optimized for Voice Control, keyboard navigation and shortcuts, and Dynamic Type support are all features that could be adopted more widely to benefit users.

What were some of the biggest accessibility challenges your team encountered while developing Liquid Glass?
Apple is known for its innovation and strives to deliver a high-quality experience for everyone. Accessibility is considered a core component of visual design from the outset. For example, the Liquid Glass design inherently supports reduced transparency and increased contrast. As the design continues to evolve, user feedback submitted through Feedback Assistant is invaluable.

How does Liquid Glass respond to contrast, especially for text in low-contrast environments?
Content legibility is a crucial aspect of the Liquid Glass design. It inherently supports accessibility features like reduced transparency and increased contrast. Your feedback during the beta period and beyond is essential to ensuring Liquid Glass provides a great experience within your apps.

What are some Apple apps that stand out for their accessibility?
Apps like Keynote in the iWork suite offer groundbreaking VoiceOver features to enhance creative productivity for all users. Assistive Access makes core apps such as Messages, Photos, Camera, Phone, and Music more accessible. Podcasts provides transcripts to broaden its reach, and frameworks like SwiftUI ensure that apps built with the latest UI frameworks have excellent built-in accessibility.
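To make the audit workflow mentioned above concrete, here is a minimal sketch of an XCTest UI test that runs Xcode's accessibility audit on every build. The test class name and launch setup are assumptions; performAccessibilityAudit() is the XCUIApplication API the summary refers to (Xcode 15 and later).

import XCTest

final class AccessibilityAuditTests: XCTestCase {
    func testLaunchScreenPassesAccessibilityAudit() throws {
        // Launch the app under test (hypothetical target).
        let app = XCUIApplication()
        app.launch()

        // Audits the on-screen view for issues such as insufficient
        // contrast, Dynamic Type clipping, and missing element labels.
        try app.performAccessibilityAudit()
    }
}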
0 | 0 | 848 | Jul ’25
Binary executable requires Accessibility Permissions in Tahoe
I have a binary executable which needs to be given Accessibility permissions so it can inject keypresses and mouse moves. This was always possible up to macOS 15: when the first keypress arrived, the Accessibility permissions window would open and allow me to add the executable. However, this no longer works in macOS 26. The window still opens, and I can navigate to the executable file and select it, but it doesn't appear in the list, and no error message appears. I'm guessing this may be due to some tightening of security in Tahoe, but I need to figure out what to change in my executable to allow it to work.
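One hedged suggestion for diagnosing this: have the executable request Accessibility trust explicitly at launch, rather than waiting for the first injected keypress to trigger the prompt. A minimal Swift sketch using the real ApplicationServices API (the launch-time placement is an assumption):

import ApplicationServices

// Ask TCC whether this process is trusted for Accessibility,
// prompting the user (and registering the binary) if it is not.
let options = [kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String: true] as CFDictionary
let trusted = AXIsProcessTrustedWithOptions(options)
print("Accessibility trusted:", trusted)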
4 | 1 | 244 | 16h
Proposal: Using ARKit Body Tracking & LiDAR for Sign Language Education (Real-time Feedback)
Hi everyone, I’ve been analyzing the current state of Sign Language accessibility tools, and I noticed a significant gap in learning tools: we lack real-time feedback for students (e.g., "Is my hand position correct?"). Most current solutions rely on 2D video processing, which struggles with depth perception and occlusion (hand-over-hand or hand-over-face gestures), both of which are critical in Sign Language grammar. I'd like to propose/discuss an architecture leveraging the LiDAR and Neural Engine capabilities in current iPhone devices to solve this.

The Concept: Skeleton-Based Normalization
Instead of training ML models on raw video frames (which introduces noise from lighting, skin tone, and clothing), we could use ARKit's Body Tracking to abstract the input:
Capture: Use ARKit/LiDAR to track the user's upper body and hand joints in 3D space.
Data Normalization: Extract only the vector coordinates (X, Y, Z of joints). This creates a "clean" dataset, effectively normalizing the user regardless of physical appearance.
Comparison: Feed these vectors into a Core ML model trained on "reference skeletons" (recorded by native signers).
Feedback Loop: The app calculates the geometric distance between the user's pose and the reference pose to provide specific corrections (e.g., "Raise your elbow 10 degrees").

Why this approach?
Solves occlusion: LiDAR handles depth much better than standard RGB cameras when hands cross the body.
Privacy: We are processing coordinates, not video streams.
Efficiency: Comparing vector sequences is computationally cheaper than video analysis, preserving battery life.

Has anyone experimented with using ARKit Body Anchors specifically for comparing complex gesture sequences against a stored "correct" database? I believe this "skeleton first" approach is the key to scalable Sign Language education apps. Looking forward to hearing your thoughts.
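To make the comparison step concrete, here is a minimal sketch of extracting joint positions from an ARBodyAnchor and scoring them against a stored reference pose. The joint subset, the reference data, and the mean-distance scoring are illustrative assumptions; ARSkeleton3D's modelTransform(for:) is the real ARKit API.

import ARKit
import simd

// Joints compared against the reference pose (illustrative subset).
let trackedJoints: [ARSkeleton.JointName] = [.leftShoulder, .rightShoulder, .leftHand, .rightHand]

// Extracts model-space joint positions from a body anchor.
func jointPositions(for anchor: ARBodyAnchor) -> [SIMD3<Float>] {
    trackedJoints.compactMap { joint in
        guard let transform = anchor.skeleton.modelTransform(for: joint) else { return nil }
        return SIMD3<Float>(transform.columns.3.x, transform.columns.3.y, transform.columns.3.z)
    }
}

// Mean Euclidean distance between the user's pose and a reference pose
// (hypothetical reference recorded from a native signer).
func poseError(user: [SIMD3<Float>], reference: [SIMD3<Float>]) -> Float {
    guard user.count == reference.count, !user.isEmpty else { return .infinity }
    let total = zip(user, reference).reduce(Float(0)) { $0 + simd_distance($1.0, $1.1) }
    return total / Float(user.count)
}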
1 | 0 | 445 | 1d
iOS 26 regression: `DeviceActivityEvent`: `eventDidReachThreshold` called immediately (instead of waiting till threshold is reached)
Hello Albert! I am experiencing some strange bugs around DeviceActivityEvents (part of the DeviceActivity framework) on iOS 26 / iOS 26.1 / iOS 26.2 beta. When creating a DeviceActivityEvent, we can assign a threshold and applicationTokens. The idea is that after the user has spent said threshold on said apps, eventDidReachThreshold() is called. The property includesPastActivity is set to false. On iOS 26, however, it happens quite often (quite reliably after updating to a new beta seed) that eventDidReachThreshold() is called immediately (after a couple of seconds) instead of waiting for the threshold to be met. Is anyone else seeing similar issues on iOS 26 / iOS 26.1 / iOS 26.2 beta? The only workaround I have found is to ask users to revoke and re-grant Screen Time permissions. This only holds for about two weeks, though, or at most until the next iOS 26 beta update is installed, so it is not a permanent solution, unfortunately. Feedback (incl. sysdiagnoses and sample project) is filed under FB18061981 and FB18927456. One of our users has filed their own feedback request as well: FB20817853. Thanks a lot for any help on this!
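For anyone wanting to reproduce this, a minimal sketch of the setup described above. The schedule values, names, and empty token set are illustrative assumptions (real tokens come from a FamilyActivityPicker); the monitoring calls are the real DeviceActivity APIs.

import DeviceActivity
import ManagedSettings

// Hypothetical selection; in a real app this comes from a FamilyActivityPicker.
let selectedTokens: Set<ApplicationToken> = []

// A repeating all-day monitoring window (illustrative values).
let schedule = DeviceActivitySchedule(
    intervalStart: DateComponents(hour: 0, minute: 0),
    intervalEnd: DateComponents(hour: 23, minute: 59),
    repeats: true
)

// eventDidReachThreshold() should fire only after 15 minutes of usage.
let event = DeviceActivityEvent(
    applications: selectedTokens,
    threshold: DateComponents(minute: 15)
)

do {
    try DeviceActivityCenter().startMonitoring(
        DeviceActivityName("daily"),
        during: schedule,
        events: [DeviceActivityEvent.Name("fifteenMinutes"): event]
    )
} catch {
    print("Failed to start monitoring: \(error)")
}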
0 | 0 | 265 | 1w
How to Constrain a TableView Cell Similarly to Apple's Settings App
Hello! I'm creating a settings page for my app and I want it to look as native as possible. I want to know if it's possible to add constraints that make the second label go to the bottom when the text size gets really large (see Picture 1) instead of having to force it to be on the right (see Picture 2). I've left my constraint code for this cell below, too. I'm still learning constraints and best practices, so if there's any feedback, I'd love to hear it. Thank you!

Picture 1
Picture 2

- (void)setConstraints {
    [NSLayoutConstraint activateConstraints:@[
        // Cell Title Label
        [self.themeColorLabel.leadingAnchor constraintEqualToAnchor:self.contentView.layoutMarginsGuide.leadingAnchor],
        [self.themeColorLabel.trailingAnchor constraintEqualToAnchor:self.contentView.layoutMarginsGuide.trailingAnchor],
        [self.themeColorLabel.topAnchor constraintEqualToAnchor:self.contentView.layoutMarginsGuide.topAnchor],
        [self.themeColorLabel.bottomAnchor constraintEqualToAnchor:self.contentView.layoutMarginsGuide.bottomAnchor],

        // Selected Theme Color Label
        [self.selectedColorLabel.trailingAnchor constraintEqualToAnchor:self.contentView.layoutMarginsGuide.trailingAnchor],
        [self.selectedColorLabel.topAnchor constraintEqualToAnchor:self.contentView.layoutMarginsGuide.topAnchor],
        [self.selectedColorLabel.bottomAnchor constraintEqualToAnchor:self.contentView.layoutMarginsGuide.bottomAnchor],
    ]];
}
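One possible answer, offered as a Swift sketch rather than the definitive fix: instead of hand-building constraints, UIKit's list content configurations reproduce the Settings-style value cell, showing the texts side by side and stacking them vertically at accessibility Dynamic Type sizes. The label strings are placeholders.

import UIKit

func configure(_ cell: UITableViewCell) {
    // The .valueCell style shows text and secondaryText side by side,
    // and adapts the layout automatically at large Dynamic Type sizes.
    var content = UIListContentConfiguration.valueCell()
    content.text = "Theme Color"     // placeholder title
    content.secondaryText = "Blue"   // placeholder value
    cell.contentConfiguration = content
}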
3 | 0 | 99 | 1w
Potential Structural Swift Concurrency Issue: unsafeForcedSync called from Swift Concurrent context
I occasionally get this error in Xcode’s console: Potential Structural Swift Concurrency Issue: unsafeForcedSync called from Swift Concurrent context. What does this mean, and how can I resolve it? Googling it doesn’t turn up any results. This doesn't crash the app - it’s just an error diagnostic that I see in the Xcode console. The app keeps running before and after the issue. Is there a way I can set a breakpoint to catch this where it happens?
8 | 2 | 1.9k | 1w
VoiceOver accessibility issue in UIKit for line granularity
Context: We are using UIKit to provide accessibility in our app for our iOS users. Our app mainly contains documents/books that the user can read.

Issue: VoiceOver is skipping lines when they contain leading spaces. We have observed this issue in different languages. It only happens for line granularity; the other granularities seem to work as expected.

Implementation: We are using the APIs below to provide line content to VoiceOver:
UIAccessibilityReadingContent
accessibilityPageContent
accessibilityFrameForLineNumber
accessibilityContentForLineNumber

We create UIAccessibilityElement objects to pass to VoiceOver, and each UIAccessibilityElement implements UIAccessibilityReadingContent to provide readable content. We also use the APIs below to cross element boundaries for all granular navigation:
accessibilityNextTextNavigationElement
accessibilityPreviousTextNavigationElement

We want to know whether skipping lines that have leading spaces is expected behavior or a bug in UIKit.
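For reference, a minimal sketch of the UIAccessibilityReadingContent setup described above, with a hypothetical mitigation of trimming leading spaces from each line before handing it to VoiceOver (whether that trimming is acceptable depends on your content; the lines and lineFrames storage are assumptions):

import UIKit

final class PageElement: UIAccessibilityElement, UIAccessibilityReadingContent {
    // Hypothetical per-line content and frames for the current page.
    var lines: [String] = []
    var lineFrames: [CGRect] = []

    func accessibilityLineNumber(for point: CGPoint) -> Int {
        lineFrames.firstIndex { $0.contains(point) } ?? NSNotFound
    }

    func accessibilityContent(forLineNumber lineNumber: Int) -> String? {
        guard lines.indices.contains(lineNumber) else { return nil }
        // Possible workaround for the skipping issue: strip leading spaces.
        return String(lines[lineNumber].drop(while: { $0 == " " }))
    }

    func accessibilityFrame(forLineNumber lineNumber: Int) -> CGRect {
        lineFrames.indices.contains(lineNumber) ? lineFrames[lineNumber] : .zero
    }

    func accessibilityPageContent() -> String? {
        lines.joined(separator: "\n")
    }
}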
1 | 0 | 253 | 2w
UI tests blocked by “bash requesting screen access” popup in macOS 15
On macOS, I get a system popup when running UI tests in GitHub Actions saying: “bash” is requesting to bypass the system private window picker and directly access your screen and audio. How can I prevent these login and screen-access popups from appearing during automated UI tests? Is there an official setup or configuration for running IntelliJ UI tests in CI environments (macOS, Linux, Windows) to avoid such dialogs? My builds run in GitHub Actions VMs, so I can’t manually grant these permissions, and they block the tests.
0 | 0 | 49 | 3w
Unable to submit my macOS window‑manager app
Hello Apple Developer Support,

I'm writing with a mix of enthusiasm and frustration after more than six months of full-time development on my macOS window manager TilesWM (a feature-rich competitor to Magnet, Divvy, BetterSnapTool, etc.). I have completed the application, the product page, a knowledge base with 90+ entries, an in-app onboarding flow, a feedback hub for submissions, and all required marketing assets, and finally signed up for the $99 Developer Program. I am now blocked at App Store Connect validation.

What I'm trying to submit
App name: TilesWM
Bundle ID: dev.steinhorst.tileswm
Core functionality: Detect window movement and resize windows, optional global hotkeys; persistent user settings are stored in a SQLite DB located at ~/Library/Application Support/<bundle-identifier>
Privacy: No analytics, no data collection, no runtime downloads.
Tested on: macOS 15.6.1 (Apple Silicon M1) and macOS 26.0.1 (M3 Max).
The app works exactly like the existing mainstream window managers: it runs non-sandboxed and requests Accessibility (AX) permissions on demand to control other windows' dimensions and positions.

Validation errors
Validation failed: Invalid Code Signing Entitlements. Your application bundle's signature contains code signing entitlements that are not supported on macOS. Specifically, key 'com.apple.security.accessibility' in 'dev.steinhorst.tileswm.pkg/Payload/TilesWM.app/Contents/MacOS/TilesWM' is not supported. (ID: 13b13813-edd6-4be6-b392-9db5bddd39a0)
Validation failed: App sandbox not enabled. The following executables must include the "com.apple.security.app-sandbox" entitlement with a Boolean value of true in the entitlements property list: [( "dev.steinhorst.tileswm.pkg/Payload/TilesWM.app/Contents/MacOS/TilesWM" )] Refer to the App Sandbox page at https://developer.apple.com/documentation/security/app_sandbox for more information on sandboxing your app. (ID: 28aa17e8-e7b2-4f3f-8def-15922c68ec8a)
In short, App Store Connect refuses to accept an app that uses the Accessibility API and is not sandboxed. Yet the same capability is openly used by Magnet, Divvy, BetterSnapTool, and other competitors that are currently on the Mac App Store.

Why this matters to me
I am a full-stack engineer with 15+ years of enterprise experience; side projects keep my skills sharp and give back to the macOS community. This would be my entry to the software side of macOS, and the next product ideas are scribbled down already. Over the last six months I have designed, coded, documented, created marketing assets, purchased a domain, paid for hosting, and funded the Apple Developer Program, all in good faith that the app could be submitted.

What I need help with
Clarification: Is the com.apple.security.accessibility entitlement truly unsupported for macOS distribution? If so, how can Magnet and the other competitors exist on the store; shouldn't they be open to competition?
Guidance: If sandboxing is mandatory (even though the competition doesn't use it either, judging by their entitlements via codesign -d --entitlements :- <path>), what is the recommended way to retain full window-management functionality while remaining within Apple's policies? I tried sandboxing the app, but the only window I was able to "resize" was TilesWM's (my app's) own.

Additional resources
A basic demo video, feature comparisons, FAQ and knowledge base, as well as the feedback hub: https://www.tileswm.app

I appreciate any insight you can provide. My goal is to bring a polished, useful tool to the Mac App Store while fully respecting Apple's security requirements, without having to discard months of work or resort to an external distribution model. Thank you for your time and assistance.
Best regards,
Denis Steinhorst
Full-Stack Engineer, macOS enthusiast
Bundle ID: dev.steinhorst.tileswm
6 | 1 | 237 | Oct ’25
How to prevent VoiceOver from reading text INSIDE an image?
In the example below, VoiceOver (in both iOS 18 and 26) reads the text contained within the image after the .accessibilityLabel, introduced by a “beep”:

VoiceOver: Purple rounded square with the word 'Foo' in white letters. Image [beep] foo.

I’d like it to only read the accessibility label. As a developer focused on accessibility, I make sure every image already has an appropriate label, so having iOS read the image text is redundant.

Sample Code

import SwiftUI

struct ContentView: View {
    var body: some View {
        Image("TextInImage")
            .resizable()
            .scaledToFit()
            .frame(width: 64, height: 64)
            .accessibilityLabel("Purple rounded square with the word 'Foo' in white letters.")
    }
}

Sample Image
Drop this image into Assets.xcassets and confirm it's named TextInImage.
4 | 0 | 181 | Oct ’25
Proposal: Capacitive swipe-based volume control integrated into iPhone frame
I would like to propose a design enhancement for future iPhone models: using the existing bottom-right antenna line (next to the power button area) as a capacitive “volume control zone” that supports swipe gestures. Today this line is a structural antenna break, but it is also located exactly where the thumb naturally rests when holding the phone in one hand. With a small embedded capacitive/force sensor, the user could slide a finger along this zone to control volume without reaching for the physical buttons.

Why this makes sense:
• Perfect ergonomic thumb position in both portrait and landscape
• One-handed volume adjustment becomes easier for large-screen devices
• Silent and frictionless vs. clicking buttons (useful in meetings / night mode)
• Consistent with Apple’s recent move toward contextual hardware input (Action Button, Capture Button, Vision Pro gestures)

The interaction model would be:
• Swipe up → increase volume
• Swipe down → decrease volume
• (Optional) long-press with haptic = mute toggle

This could also enhance accessibility, especially for users with reduced hand mobility who struggle to press mechanical buttons on tall devices. Technically, this would be similar to the Capture Button (capacitive + pressure layers), but linear instead of pressure-based. It does not replace the physical buttons; it complements them as a silent, gesture-based alternative. Thank you for considering this as a future interaction refinement for iPhone hardware design.
2 | 0 | 533 | Oct ’25
AVSpeechSynthesisVoice ignores user-selected voices in iOS 26 (Regression)
We've identified a regression in iOS 26.0 and 26.1 Beta 4 where AVSpeechSynthesisVoice(language:) no longer respects user-selected voices from Accessibility settings.

Issue: When users select a specific voice in Settings → Accessibility → Spoken Content → Voices, calling AVSpeechSynthesisVoice(language:) returns the system default voice instead of the user's selection. This worked correctly in iOS 18.6.2.

Particularly affects:
Third-party speech synthesis voices (CereProc, Grammatek, etc.)
Apps relying on automatic voice selection based on user preferences

Example:

// User selected CereProc Heather for en-GB in Accessibility settings
let voice = AVSpeechSynthesisVoice(language: "en-GB")
print(voice?.name) // iOS 18.6.2: "HEATHER", iOS 26: "Daniel" (system default)

Interesting observation: the new Accessibility Reader feature in iOS 26 correctly uses the user-selected voice, but Tap to Speak and the API both ignore the setting.

Tested methods:
AVSpeechSynthesisVoice(language:)
AVSpeechUtterance auto-selection
Reflection for new APIs

All return the system default voice, not the user's preference.

Filed: FB20271264

Has anyone else encountered this? Any known workarounds to programmatically access the user's preferred voice selection?
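As a possible in-app fallback (not a fix for the regression), one could enumerate the installed voices for a language, let the user pick one explicitly, and persist that identifier. A minimal sketch; where the chosen identifier is stored is an assumption:

import AVFoundation

// List every installed voice for en-GB, including third-party ones.
let voices = AVSpeechSynthesisVoice.speechVoices()
    .filter { $0.language == "en-GB" }
voices.forEach { print($0.name, $0.identifier) }

// Speak with an explicitly chosen voice instead of relying on
// AVSpeechSynthesisVoice(language:) to pick the user's preference.
let utterance = AVSpeechUtterance(string: "Hello")
if let identifier = voices.first?.identifier { // hypothetical: use a saved user choice
    utterance.voice = AVSpeechSynthesisVoice(identifier: identifier)
}
AVSpeechSynthesizer().speak(utterance)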
4 | 1 | 316 | Oct ’25
Accessibility Permission In Sandbox For Keyboard
Hello! My question is about 1) whether we can use any and all accessibility features within a sandboxed app, and 2) what steps we need to take to do so. Using accessibility permissions, my app was working fine in Xcode. It used NSEvent.addGlobalMonitorForEvents and addLocalMonitorForEvents, along with CGEvent.tapCreate. However, after downloading the same app from the App Store, the code was not working. I believe this is due to differences in how accessibility permissions are managed in Xcode compared to production. Is it possible for my app to get access to all accessibility features while being distributed on the App Store, though? Do I need to add or request any special entitlements like com.apple.security.accessibility? Thanks so much for the help. I have done a lot of research on this online but found some conflicting information, so I wanted to post here for a clear answer.
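One hedged pointer for debugging the App Store build: global event monitors and event taps depend on Input Monitoring and/or Accessibility trust, which you can check and request explicitly at runtime. A minimal sketch using the real CoreGraphics and ApplicationServices calls (which permission applies depends on your tap configuration):

import ApplicationServices
import CoreGraphics

// Input Monitoring: needed for listen-only event taps and global monitors.
if !CGPreflightListenEventAccess() {
    // Prompts the user and registers the app in System Settings.
    _ = CGRequestListenEventAccess()
}

// Accessibility: needed when a CGEvent tap modifies or injects events.
let prompt = [kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String: true] as CFDictionary
let axTrusted = AXIsProcessTrustedWithOptions(prompt)
print("Accessibility trusted:", axTrusted)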
8 | 0 | 244 | Oct ’25
Illegible navigation title when presenting Map view
When building with the iOS 26 SDK (currently beta 4), the navigation title is often illegible when rendering a Map view. For example, note how the title "Choose Location" is obscured by the map's text ("South America") in the screenshot below. This screenshot is the result of the following view code:

import MapKit
import SwiftUI

struct Demo: View {
    var body: some View {
        NavigationStack {
            Map()
                .navigationTitle(Text("Choose Location"))
                .navigationBarTitleDisplayMode(.inline)
        }
    }
}

I tried using the scrollEdgeEffectStyle(_:for:) modifier to apply a scroll edge effect to the top of the screen, in hopes of making the title legible, but that doesn't seem to have any effect. Specifically, the following code seems to produce the exact same result shown in the screenshot above:

import MapKit
import SwiftUI

struct Demo: View {
    var body: some View {
        NavigationStack {
            Map()
                .scrollEdgeEffectStyle(.hard, for: .top) // ⬅️ no apparent effect
                .navigationTitle(Text("Choose Location"))
                .navigationBarTitleDisplayMode(.inline)
        }
    }
}

Is there a recommended way to resolve this issue so that the navigation title is always readable?
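One hedged idea, untested against the iOS 26 beta behavior described above: forcing a visible bar background is the pre-existing SwiftUI way to keep titles legible over busy content, at the cost of the transparent look.

import MapKit
import SwiftUI

struct Demo: View {
    var body: some View {
        NavigationStack {
            Map()
                .navigationTitle(Text("Choose Location"))
                .navigationBarTitleDisplayMode(.inline)
                // Opaque bar background so the title never competes with map labels.
                .toolbarBackground(.visible, for: .navigationBar)
        }
    }
}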
7 | 3 | 268 | Oct ’25
iOS 26 VoiceOver is reporting an extra tab
Feedback number: FB20451665

When building with Xcode 26, VoiceOver reports an extra tab when swiping through tabs. Please see the sample project below:

/*
 This is a sample project to show that I believe there is a VoiceOver bug in iOS 26.
 When swiping through tabs with VoiceOver active, there always appears to be an extra tab.
 Here I have 5 tabs; when on tab one, VO reads out "tab 1 of 6", then "tab 2 of 6", all the
 way to the last tab, when VoiceOver reads out "tab 5 of 6". Never "tab 6 of 6".
 Is there a possibility that VoiceOver is picking up the underlying `more` tab and reading
 that out? This has also reportedly been found in the Files app here:
 https://www.applevis.com/comment/195441#comment-195441
 */
import SwiftUI

struct ContentView: View {
    var body: some View {
        TabView {
            /// Activating this has VoiceOver telling us there are 6 tabs.
            Tab(RootTab.home.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.home.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.home.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.home.title.capitalized) tab")

            Tab(RootTab.diary.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.diary.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.diary.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.diary.title.capitalized) tab")

            Tab(RootTab.meals.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.meals.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.meals.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.meals.title.capitalized) tab")

            Tab(RootTab.knowledge.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.knowledge.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.knowledge.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.knowledge.title.capitalized) tab")

            Tab(RootTab.profile.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.profile.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.profile.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.profile.title.capitalized) tab")

            /// Activating this also has VoiceOver telling us there are 6 tabs.
            // ForEach(RootTab.allCases, id: \.self) { tab in
            //     Text("This is the \(tab.title.capitalized) screen")
            //         .tabItem {
            //             Label(tab.title.capitalized, systemImage: "circle.fill")
            //         }
            //         .accessibilityLabel("\(tab.title.capitalized) tab")
            //         .accessibilityHint("Double tap to open the \(tab.title.capitalized) tab")
            // }
        }
    }

    enum RootTab: CaseIterable {
        case home
        case diary
        case meals
        case knowledge
        case profile

        var title: String {
            switch self {
            case .home: "home"
            case .diary: "diary"
            case .meals: "meals"
            case .knowledge: "knowledge"
            case .profile: "profile"
            }
        }
    }
}

I'm curious if anyone else can see this issue, or if anyone knows of a workaround for it.
3 | 0 | 2k | Oct ’25
Silent push throttling breaking accessibility app for neurodivergent users
Hello all 👋 We're developing an app for families with neurodivergent members (primarily autistic children) and have run into a critical reliability issue with silent push notifications that breaks core functionality.

Our current implementation: When a caretaker updates the person's daily routine/schedule in our system, we send a silent push notification to the user's device. The app wakes, connects to our server, downloads the updated schedule, and creates/updates local notifications for upcoming activities.

The problem: Because the app is rarely or never directly interacted with by the end user (the child doesn't open the app; caregivers configure it on their behalf), silent push notifications get progressively throttled and eventually stop being delivered entirely. This means schedule changes made by caregivers never reach the device, breaking the app's core value proposition. Uninstalling and reinstalling doesn't reset the throttling state.

Questions:
Is there any way to reset or mitigate throttling for devices that legitimately need background updates but have little or no user interaction? This is an accessibility use case where the end user (the child) doesn't interact with the app, but the app must reliably receive updates.
Would switching to regular (visible) push notifications avoid this throttling even if the app is not interacted with?
We already have the Critical Alerts entitlement, but for regular updates we're worried that the "CRITICAL ALERT" banner will be too upsetting for the child. Is there any exception process for accessibility apps to change the way Critical Alerts are presented?

For neurodivergent individuals, predictable routines are essential. When schedule updates don't reach their device, it can cause significant distress. This is a genuine accessibility need, not a "nice-to-have" feature. Any guidance from Apple engineers or developers who've solved similar challenges would be greatly appreciated. Thank you!
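For readers unfamiliar with the mechanism being described: a silent push is an APNs payload carrying content-available: 1 and no alert (sent with the apns-push-type: background header), delivered to the app-side hook below. The refresh logic here is a hypothetical stand-in; the delegate method is the standard UIKit API, and it requires the remote-notification background mode.

import UIKit

final class ScheduleStore {
    static let shared = ScheduleStore()
    // Hypothetical sync: fetch the latest schedule from the server,
    // then reschedule local notifications for upcoming activities.
    func refresh(completion: @escaping (Bool) -> Void) { completion(true) }
}

class AppDelegate: UIResponder, UIApplicationDelegate {
    // APNs payload for a silent push: { "aps": { "content-available": 1 } }
    func application(_ application: UIApplication,
                     didReceiveRemoteNotification userInfo: [AnyHashable: Any],
                     fetchCompletionHandler completionHandler: @escaping (UIBackgroundFetchResult) -> Void) {
        ScheduleStore.shared.refresh { changed in
            // Report an honest result; the system factors this into scheduling.
            completionHandler(changed ? .newData : .noData)
        }
    }
}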
6 | 0 | 172 | Oct ’25
RTT call option and confirmation dialog missing when dialing emergency numbers
Hello, in our app we provide a button that initiates a phone call using tel://. For normal numbers, tapping the button presents the standard iOS confirmation sheet with Call and Cancel. If RTT is enabled on the device, the sheet instead shows three options: Call, Cancel, and RTT Call. However, when dialing a national emergency number, this confirmation dialog does not appear at all: the call is placed immediately, without giving the user the choice between voice or RTT. Is this the expected system behavior for emergency numbers on iOS? And if so, how does RTT get applied in the emergency-call flow? Is it managed entirely by the OS rather than exposed as a user-facing option? Thanks in advance for clarifying.
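For context, the call-button pattern being described is typically just opening a tel: URL and letting the system present the confirmation sheet; a minimal sketch (the number is a placeholder):

import UIKit

// Opens the system call confirmation sheet (Call / Cancel, plus RTT Call
// when RTT is enabled on the device). The number here is a placeholder.
if let url = URL(string: "tel://0123456789"),
   UIApplication.shared.canOpenURL(url) {
    UIApplication.shared.open(url)
}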
2 | 0 | 615 | Sep ’25