Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.

Posts under Accessibility & Inclusion topic

Post · Replies · Boosts · Views · Activity

Font too small in the Notes app
Dear developer team, after updating to iOS 18.3.1 I noticed that the font in the Notes app has become too small to read comfortably, and I already have poor eyesight. There is no way to increase the font size. When I select my preferred text size in Accessibility settings, it only changes the size of headings in the Notes app; the body text of the note itself remains too small. I'm using an iPhone 13. I searched online and it seems other users are also unhappy that there is no way to change the text size in Notes to a comfortable level. I hope this will be addressed in the next version of iOS, because the reading size in a standard app matters for people with tired or diminished eyesight. Kind regards, Maria
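For developers reading along: behavior like this usually comes down to whether a given piece of text adopts Dynamic Type, since the accessibility text-size setting only affects text that opts in. As a hedged illustration (not Notes' actual code, and the class name is made up), a label that tracks the user's preferred size might look like this:

    import UIKit

    // Minimal sketch: only text that adopts Dynamic Type follows the
    // accessibility text-size setting; fixed point sizes do not.
    final class NoteBodyLabel: UILabel {
        override init(frame: CGRect) {
            super.init(frame: frame)
            font = UIFont.preferredFont(forTextStyle: .body) // tracks the user's size setting
            adjustsFontForContentSizeCategory = true          // keeps adjusting while the app runs
            numberOfLines = 0
        }

        required init?(coder: NSCoder) {
            super.init(coder: coder)
        }
    }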
2 replies · 0 boosts · 502 views · Feb ’25
My application for the COMMUNICATION capability failed; what should I do next?
Hi, I applied for the CarPlay COMMUNICATION capability, but received a message saying that I already have the driving task app entitlement. I applied one more time after that and have had no reply since. I still do not have the com.apple.developer.carplay-communication capability. Does that mean I cannot apply for it? What should I do next to get this capability? Thanks
2 replies · 0 boosts · 1.1k views · Jul ’25
Programmatically force VoiceOver to read parentheses for math expressions
How can I force VoiceOver to read parentheses for math expressions like this: Text("(2+3)×4") // VoiceOver: Two plus three, times four I’m looking for a way to have VoiceOver announce parentheses (e.g. “left paren”, “right paren”) without relying on NumberFormatter.Style.spellOut or .speechAlwaysIncludesPunctuation(), as both have drawbacks. Using .spellOut breaks braille output and Rotor › Characters menu by turning numbers and symbols into words. And .speechAlwaysIncludesPunctuation() makes VoiceOver overly verbose—for example, it reads “21” as “twenty hyphen one.” Is there a better way to selectively announce specific punctuation like parentheses while keeping numbers and symbols intact for braille and Rotor use?
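One avenue worth trying (a hedged sketch, not a confirmed fix): UIKit lets you attach the spoken-punctuation speech attribute to just a range of an accessibility label, so the parentheses could be announced while the digits and the multiplication sign stay untouched for braille and the Rotor. Whether VoiceOver honors the per-range attribute in every context is an assumption here, and the helper function name is illustrative:

    import UIKit

    // Hedged sketch: mark only the parenthesis characters as "speak punctuation".
    func expressionAccessibilityLabel(for expression: String) -> NSAttributedString {
        let label = NSMutableAttributedString(string: expression)
        for (index, character) in zip(expression.indices, expression) where character == "(" || character == ")" {
            let range = NSRange(index...index, in: expression)
            label.addAttribute(.accessibilitySpeechPunctuation, value: true, range: range)
        }
        return label
    }

    let mathLabel = UILabel()
    mathLabel.text = "(2+3)×4"
    mathLabel.accessibilityAttributedLabel = expressionAccessibilityLabel(for: "(2+3)×4")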
2 replies · 0 boosts · 344 views · Jul ’25
Tahoe Launchpad
I wrote this in the regular forums and they deleted it and told me to write it here because it was dealing with unreleased software. I read that Launchpad is disappearing in Tahoe, and I have real concerns about that. For me, that is an accessibility issue. I have both memory problems and scanning problems, so having my apps organized into categories is extremely important to me. Just today I needed to find an app whose name I didn't remember and that I rarely use, but when I need it, it is important to me. Just to see if I could find it without Launchpad, I scanned my Applications folder and couldn't find it. I went to Launchpad, straight to the category I knew it was in, and it was right there, easy for me to find. Please don't take away our organization options.
2 replies · 1 boost · 1.2k views · Jul ’25
Is it possible to animate the accessibility frame on iOS and macOS?
Say I have a UI element that moves on the screen. Is it possible to update its accessibility frame as it moves while VoiceOver is focused on it? From my tests, VoiceOver ignores UIAccessibilityLayoutChangedNotification if it's sent repeatedly in a short period of time on iOS, while sending NSAccessibilityLayoutChangedNotification on macOS triggers VoiceOver to reannounce the focused element repeatedly.
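In case it helps others, here is a hedged sketch of an alternative on iOS: instead of re-posting layout-changed notifications, keep accessibilityFrame synchronized with the animated (presentation-layer) position from a display link. The class name is illustrative, and whether VoiceOver redraws its focus ring smoothly this way is an assumption, not something the original post confirms:

    import UIKit

    // Hedged sketch: continuously mirror the animated position into accessibilityFrame.
    final class AccessibilityFrameTracker {
        private var displayLink: CADisplayLink?
        private weak var trackedView: UIView?

        func start(tracking view: UIView) {
            trackedView = view
            displayLink = CADisplayLink(target: self, selector: #selector(tick))
            displayLink?.add(to: .main, forMode: .common)
        }

        func stop() {
            displayLink?.invalidate()
            displayLink = nil
        }

        @objc private func tick() {
            guard let view = trackedView,
                  let superview = view.superview,
                  let presentation = view.layer.presentation() else { return }
            // accessibilityFrame is expressed in screen coordinates.
            view.accessibilityFrame = UIAccessibility.convertToScreenCoordinates(presentation.frame, in: superview)
        }
    }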
2 replies · 0 boosts · 233 views · Jul ’25
Having trouble with the Accessibility API of the ApplicationServices framework
After updating macOS Big Sur from 11.0 to the latest 11.5, my app's AXObserverAddNotification method fails. Here is sample code I tested from Stack Overflow: https://stackoverflow.com/questions/853833/how-can-my-app-detect-a-change-to-another-apps-window

    AXUIElementRef app = AXUIElementCreateApplication(82695); // the pid for the front-running Xcode 12.5.1
    CFTypeRef frontWindow = NULL;
    AXError err = AXUIElementCopyAttributeValue(app, kAXFocusedWindowAttribute, &frontWindow);
    if (err != kAXErrorSuccess) {
        NSLog(@"failed with error: %i", err);
    }
    NSLog(@"app: %@  frontWindow: %@", app, frontWindow);

The frontWindow reference is never created and I get error number -25204. It seems as if the latest Big Sur 11.5 has revised the Accessibility API, or perhaps there is some permission switch I am unaware of that would make things work. What am I doing wrong?
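For anyone hitting the same error: -25204 maps to kAXErrorCannotComplete. One thing worth ruling out after an OS update (a hedged suggestion, not a confirmed cause) is that the app's Accessibility permission was reset, since AX calls fail until the process is trusted again. A quick Swift check might look like this:

    import ApplicationServices

    // Hedged sketch: confirm the process is trusted for Accessibility before using AXUIElement APIs.
    func ensureAccessibilityPermission() -> Bool {
        let options = [kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String: true] as CFDictionary
        return AXIsProcessTrustedWithOptions(options) // prompts the user if not yet trusted
    }

    if ensureAccessibilityPermission() {
        let app = AXUIElementCreateApplication(82695) // pid taken from the original post
        var frontWindow: CFTypeRef?
        let err = AXUIElementCopyAttributeValue(app, kAXFocusedWindowAttribute as CFString, &frontWindow)
        print("error: \(err.rawValue), frontWindow: \(String(describing: frontWindow))")
    } else {
        print("Grant Accessibility access in System Preferences > Security & Privacy > Privacy, then retry.")
    }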
2 replies · 0 boosts · 770 views · Jun ’25
VoiceOver does not focus App Store subscription modal when shown via AppStore.showManageSubscriptions(in:)
Description
When calling AppStore.showManageSubscriptions(in:), the system modal for managing subscriptions appears visually. However, it is not automatically focused by VoiceOver, and in some cases VoiceOver still allows interaction with elements in the underlying view controller, such as buttons and labels. This creates confusion and violates accessibility expectations.

Steps to Reproduce
1. In a UIKit app, present the system subscription sheet via AppStore.showManageSubscriptions(in:).
2. Ensure VoiceOver is enabled on the device.
3. Observe the focus behavior when the modal appears.
4. Try swiping right/left: VoiceOver continues to announce items in the presenting view controller.

Expected Result
The modal should automatically take VoiceOver focus, and all elements behind it should be non-accessible until dismissed.

Actual Result
VoiceOver continues to focus and interact with elements behind the presented modal.

Notes
• Tested on iOS 18.5
• Reproducible on device
• Using Swift/UIKit (not SwiftUI)
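While waiting on a system-level fix, one hedged workaround sketch is to hide the presenting hierarchy from assistive technologies while the sheet is up and nudge VoiceOver with a screen-changed notification. Whether showManageSubscriptions(in:) returns only after the sheet is dismissed can vary by iOS version, so treat the defer-based restore here as an assumption:

    import StoreKit
    import UIKit

    // Hedged workaround sketch: keep VoiceOver off the presenting view while the sheet is shown.
    @MainActor
    func presentManageSubscriptions(from viewController: UIViewController) async {
        guard let scene = viewController.view.window?.windowScene else { return }

        viewController.view.accessibilityElementsHidden = true
        UIAccessibility.post(notification: .screenChanged, argument: nil)
        defer {
            viewController.view.accessibilityElementsHidden = false
            UIAccessibility.post(notification: .screenChanged, argument: nil)
        }

        do {
            try await AppStore.showManageSubscriptions(in: scene)
        } catch {
            print("showManageSubscriptions failed: \(error)")
        }
    }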
2 replies · 0 boosts · 178 views · Jul ’25
To say I'm extremely hurt is an understatement! 😔
Voice Control Disabling System Services After Reboot

I recently learned from Apple Accessibility Support that the issue I’m experiencing with Voice Control is now affecting multiple users. When I first reported the problem, I appeared to be the first case—what you might call “patient zero.” I have provided extensive feedback and system logs, but now that the issue is more widespread, I have been told that I will not be informed of the cause or notified directly when a fix is found. Instead, updates will be released as solutions are identified, and support staff will not necessarily know the details of the underlying problem.

To summarize my experience: after enabling Voice Control and rebooting my MacBook Pro (14.2-inch, M4 chip), critical Apple system services—including FaceTime, Apple Music, and News—stop functioning. Dictation remains available, but it is not as accurate or effective for my needs as Voice Control. I rely on these accessibility features daily due to my disability and cerebral palsy, and this issue has persisted for over five months.

I have always valued contributing to the developer program and supporting Apple’s efforts to improve accessibility. However, I find it discouraging that there is no clear communication about the status of this issue or its resolution. My theory is that there may be a hardware interaction—perhaps between the neural engine and the new Wi-Fi chip—rather than a purely software problem. I understand that some information may not be immediately available, but I believe that users who rely on accessibility features should be kept informed about major issues and their progress toward resolution.

I appreciate the dedication of the accessibility and development teams, and I want to continue supporting Apple’s mission of inclusion. Thank you for your attention to this matter.

Sincerely, Donald Spencer Kirby
Dayton, Ohio
2 replies · 0 boosts · 147 views · Jun ’25
iOS 18 crash
Our in-house (enterprise-distributed) iOS app may crash on iOS 18.3 and later, while it works normally on other phones, and the signing certificate is not the problem. We have found a total of 3 affected devices so far, with no pattern between device model and OS version. On a device that crashes, the same app downloaded from the store does not crash, but if the same app is packaged and installed directly from the development tool, it crashes. Is this related to compatibility with the new version of iOS? Is there a solution? Has anyone else run into this?
2 replies · 0 boosts · 107 views · Jun ’25
AssistiveTouch pointer cannot move past center of screen in landscape orientation on iPhone
Hello everyone, I’d like to report an issue I’ve encountered when using a Bluetooth mouse together with AssistiveTouch on iPhone running iOS 16.5. This has also been reported via Feedback Assistant with Feedback ID: FB17806167

Description:
When using a Bluetooth mouse together with AssistiveTouch on iPhone (iOS), the pointer behaves incorrectly in landscape orientation. Specifically:
• The pointer cannot move past the center of the screen
• Horizontal and vertical (X/Y) movements appear to be swapped or misaligned
• Natural movement of the pointer is not possible
It seems as if the internal coordinate mapping remains locked in portrait orientation, even when the device is physically rotated to landscape. This issue occurs system-wide, regardless of the current app. It is observable in Settings, on the Home screen, and in third-party apps.

Steps to Reproduce:
1. Enable AssistiveTouch
2. Connect a Bluetooth mouse to the iPhone
3. Rotate the device to landscape orientation
4. Try moving the mouse pointer across the screen
→ Notice that:
• The pointer cannot move past the center
• Horizontal/vertical input is interpreted incorrectly (as if still in portrait)

Expected Behavior:
The mouse pointer should move across the entire screen correctly, regardless of device orientation.

Actual Behavior:
In landscape orientation, the pointer is either restricted to part of the screen or misaligned. It behaves as if the device is still in portrait. Horizontal mouse movement causes vertical pointer movement, and vice versa. The user experience feels broken and unintuitive.

Feature Suggestion:
Please improve the synchronization between physical device orientation and AssistiveTouch pointer mapping on iOS. I also suggest exposing AssistiveTouch orientation control via a public API, so developers can help maintain consistent pointer behavior.

Thanks in advance for any insights or suggestions.
Best regards, Jannis
2 replies · 0 boosts · 200 views · Jun ’25
Solo Developer User Feedback Avenues
I have a couple of follow-up questions after the "Accessibility technologies group lab". It was briefly mentioned that user feedback is an excellent way to grow inclusivity in the design of an app, and that these forums are one avenue for it. Is inviting folks here on the forum to test via TestFlight a reasonable approach for a solo developer? Are there other strategies, avenues, or examples for promoting user feedback?
2 replies · 0 boosts · 80 views · Jun ’25
System Clock ANCS notification is not sent on iPhone 14 and later
I'm working on a BLE-connected device that uses ANCS and the system Clock app to receive alarm notification events for hearing-impaired people. It works up to and including the iPhone 13 on the latest iOS 18.x. Starting with the iPhone 14 (also on iOS 18.x), the system Clock alarm notification is no longer sent. Is there any reason for this happening? Is anyone aware of this behaviour? Any suggestion would be really appreciated. Cheers
2 replies · 0 boosts · 130 views · Jun ’25
AXChildren does not return all children
I'd like to add borders to all buttons in the iOS Simulator from my Mac app. First I get the Simulator window. Then I access the children of every AXGroup, and if a child is a button or static text, I add a border. But for some buttons this does not work: in the example image the NavigationBarButtons are not found. I guess the problem is that for some AXGroup the children array accessed with AXChildren is empty. Here is some relevant code:

    - (NSArray<DDHOverlayElement *> *)overlayChildrenOfUIElement:(AXUIElementRef)element index:(NSInteger)index {
        NSMutableArray<DDHOverlayElement *> *tempOverlayElements = [[NSMutableArray alloc] init];

        // Log the element itself, its lineage and its direct children.
        NSLog(@">>> -----------------------------------------------------");
        NSString *role = [UIElementUtilities roleOfUIElement:element];
        NSRect frame = [UIElementUtilities frameOfUIElement:element];
        NSLog(@"%@, role: %@, %@", element, role, [NSValue valueWithRect:frame]);
        NSArray *lineage = [UIElementUtilities lineageOfUIElement:element];
        NSLog(@"lineage: %@", lineage);

        NSArray<NSValue *> *children = [UIElementUtilities childrenOfUIElement:element];
        if (children.count < 1) {
            NSLog(@"NO CHILDREN");
        }
        for (NSInteger i = 0; i < [children count]; i++) {
            NSValue *child = children[i];
            AXUIElementRef uiElement = (__bridge AXUIElementRef)child;
            NSString *role = [UIElementUtilities roleOfUIElement:uiElement];
            NSRect frame = [UIElementUtilities frameOfUIElement:uiElement];
            NSLog(@"----%@, role: %@, %@", child, role, [NSValue valueWithRect:frame]);
        }
        NSLog(@"<<< -----------------------------------------------------");

        // Collect buttons, text fields and static text; recurse into groups, toolbars and windows.
        for (NSInteger i = 0; i < [children count]; i++) {
            NSValue *child = children[i];
            AXUIElementRef uiElement = (__bridge AXUIElementRef)child;
            NSString *role = [UIElementUtilities roleOfUIElement:uiElement];
            NSRect frame = [UIElementUtilities frameOfUIElement:uiElement];
            NSLog(@"%@, role: %@, %@", child, role, [NSValue valueWithRect:frame]);
            if ([role isEqualToString:@"AXButton"] || [role isEqualToString:@"AXTextField"] || [role isEqualToString:@"AXStaticText"]) {
                NSString *tag = [NSString stringWithFormat:@"%ld%ld", (long)index, (long)i];
                NSLog(@"tag: %@", tag);
                DDHOverlayElement *overlayElement = [[DDHOverlayElement alloc] initWithUIElementValue:child tag:tag];
                [tempOverlayElements addObject:overlayElement];
            } else if ([role isEqualToString:@"AXGroup"] || [role isEqualToString:@"AXToolbar"]) {
                [tempOverlayElements addObjectsFromArray:[self overlayChildrenOfUIElement:uiElement index:++index]];
            } else if ([role isEqualToString:@"AXWindow"]) {
                [self.overlayWindowController setFrame:[UIElementUtilities frameOfUIElement:uiElement]];
                [tempOverlayElements addObjectsFromArray:[self overlayChildrenOfUIElement:uiElement index:index]];
            }
        }
        return [tempOverlayElements copy];
    }

For some AXGroup the children are found; for some they are empty. I cannot figure out why. Does anyone have an idea what I'm doing wrong?
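Not a confirmed answer, but one quirk worth checking: when a Mac app inspects the iOS Simulator, the simulated app may not expose its full accessibility tree until accessibility is explicitly enabled on the Simulator. Inspection tools commonly do this by setting the undocumented "AXManualAccessibility" attribute on the Simulator's application element, and missing children (such as navigation bar buttons) can be a symptom. A hedged Swift sketch of that step:

    import ApplicationServices

    // Hedged sketch: ask the Simulator to expose the full accessibility hierarchy.
    // "AXManualAccessibility" is an undocumented attribute that inspection tools set by convention.
    func enableManualAccessibility(simulatorPid pid: pid_t) {
        let simulatorApp = AXUIElementCreateApplication(pid)
        let result = AXUIElementSetAttributeValue(simulatorApp,
                                                  "AXManualAccessibility" as CFString,
                                                  kCFBooleanTrue)
        if result != .success {
            print("Could not set AXManualAccessibility: \(result.rawValue)")
        }
    }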
2 replies · 0 boosts · 121 views · May ’25
Discussion on Location Services and Green light (Will someone deaf or blind ever know when their location was last on?)
Could there be a haptic or sound cue, for accessibility, so that blind users (via sound) and deaf users (via haptics) can know when Location Services and the camera were last used? Also, the grey notification, rather than the purple notification, for Location Services should appear for the full 24 hours after an application has used location, if the correct description is in the wording in Settings. The green light lets users know that an application has switched to the camera, and the orange light fades out; both could even have subtle, simple click sounds, like a shutter, with a bigger haptic and a softer sound, editable in Settings, of course.
2 replies · 1 boost · 174 views · May ’25