Could there be a haptic or sound cue, so blind users (sound) and deaf users (haptic) can tell when Location Services and the camera were last used?
Also, the grey notification, rather than the purple one, for Location Services should appear for the full 24 hours after an application has used the service, provided the correct description is in the Settings copy.
The green light lets users know that an application has switched to the camera; both it and the fading orange light could even have subtle click sounds, like a shutter: a strong haptic with a softer sound, editable in Settings, of course.
I would like to enable the option to resize windows with the Apple Pencil Pro. I tried, but it seems this feature is not enabled.
Topic:
Accessibility & Inclusion
SubTopic:
General
My team has a robust digital accessibility program and processes for WCAG conformance in our apps. Because of this, there are definitely accessibility defects that get caught and addressed in order of impact and business priority like any other bug. Obviously we want to aim for 100% accessibility for our users, but it's a continual work in progress as new enhancements or changes are released.
I'm stuck on the appropriate measurement to indicate support. If we have 50 common tasks and the most central 10 tasks are solid, but some supporting (yet also common) tasks have a contrast failure or a missing accessibility label, does that make the whole app not support the feature? If "completing the task" is the rubric, there is a whole range of interpretations of that.
In a complex app, I anticipate that a group like ours will have strong support for many of the Accessibility Nutrition Labels accessibility features across tasks and devices, but realistically never be 100% free of defects for a given Apple Accessibility feature, even among core tasks.
As I consider the next steps for Nutrition Labels, I do not see anything in the documentation that gives a sort of baseline or measurement for inclusion. We plan to test all steps to complete a task, and log defects accordingly with an assigned timeline for fixing them (as would be true for functional defects).
Hello,
I’m in the process of enrolling my business (Carzo Rent A Car, Prishtine, Kosovo) in the Apple Developer Program, but I have been waiting for my D-U-N-S number to be issued.
I submitted the request to Dun & Bradstreet on July 28, 2025 (Case #9142648) and have only received a system-generated email with a tracking ID (#9086421). There has been no further update.
My questions are:
Is there a way for Apple to expedite or provisionally approve my enrollment while the D-U-N-S number is pending?
How long does Apple typically wait for D&B updates before the enrollment is affected?
Are there any alternative steps I can take to avoid further delays?
Thank you for your guidance.
Topic:
Accessibility & Inclusion
SubTopic:
General
When my app is in the background, I create a Live Activity through a push notification, with a token obtained from pushToStartTokenUpdates, and this process works fine. However, without opening the app, how can I retrieve the new push token for this Live Activity and use it for subsequent updates to the Live Activity content?
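For reference, here is a minimal sketch of what I mean, using a hypothetical DeliveryAttributes type; my understanding is that the system briefly runs the app in the background when a Live Activity is started via push, which is when loops like these can pick up the token:
import ActivityKit

// Hypothetical attributes type, for illustration only.
struct DeliveryAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var progress: Double
    }
}

func observeLiveActivityTokens() {
    Task {
        // Activities started remotely via push-to-start arrive here.
        for await activity in Activity<DeliveryAttributes>.activityUpdates {
            Task {
                // Each activity has its own update token, separate from
                // the push-to-start token; forward it to the server.
                for await tokenData in activity.pushTokenUpdates {
                    let token = tokenData.map { String(format: "%02x", $0) }.joined()
                    print("Update token for activity \(activity.id): \(token)")
                }
            }
        }
    }
}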
I made a (very simple) custom tab bar in SwiftUI. It's simply an HStack containing two buttons. These buttons control the selection of a paged TabView. This works well, but in VoiceOver they don't behave like the bottom tab bar or e.g. a segmented picker. Specifically, VoiceOver does not say something like "tab one of two" when the first button is focused.
According to my research, in UIKit this can be accomplished by giving the container view the accessibility trait tabBar, hiding it as an accessibility element, and giving it the accessibility container type semanticGroup.
In SwiftUI, there is also the trait isTabBar, but that does not seem to have any impact for VoiceOver. I don't see an equivalent of semanticGroup in SwiftUI. I tried accessibilityElement(children: .contain) but that also does not seem to have any impact.
So, is there any way in SwiftUI to make a button behave like a tab-button in VoiceOver? And how is SwiftUI's isTabBar accessibility trait supposed to be used?
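For context, here is a minimal sketch of the setup I described, plus the manual approximation I'm currently falling back to (titles and tab count are placeholders):
import SwiftUI

// Two buttons driving a paged TabView selection, as described above.
struct CustomTabBar: View {
    @Binding var selection: Int
    private let titles = ["First", "Second"]

    var body: some View {
        HStack {
            ForEach(titles.indices, id: \.self) { index in
                Button(titles[index]) { selection = index }
                    // Mark the active tab so VoiceOver says "selected".
                    .accessibilityAddTraits(selection == index ? .isSelected : [])
                    // Hand-rolled stand-in for "tab, 1 of 2".
                    .accessibilityValue("tab \(index + 1) of \(titles.count)")
            }
        }
        .accessibilityElement(children: .contain)
        .accessibilityAddTraits(.isTabBar) // audibly a no-op in my testing
    }
}
This at least gets the position and selection state announced, though it still doesn't behave exactly like the built-in tab bar.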
I'd like to add borders to all buttons in the iOS simulator from my Mac app. First I get the simulator window. Then I access the children of all AXGroup and if it's a button or a static text, I add a border.
But for some buttons this does not work. In the example image, the NavigationBarButtons are not found. I guess the problem is that for some AXGroups the children array accessed with AXChildren is empty.
Here is some relevant code:
- (NSArray<DDHOverlayElement *> *)overlayChildrenOfUIElement:(AXUIElementRef)element index:(NSInteger)index {
    NSMutableArray<DDHOverlayElement *> *tempOverlayElements = [[NSMutableArray alloc] init];

    // Log the element itself: role, frame, and lineage.
    NSLog(@">>> -----------------------------------------------------");
    NSString *role = [UIElementUtilities roleOfUIElement:element];
    NSRect frame = [UIElementUtilities frameOfUIElement:element];
    NSLog(@"%@, role: %@, %@", element, role, [NSValue valueWithRect:frame]);
    NSArray *lineage = [UIElementUtilities lineageOfUIElement:element];
    NSLog(@"lineage: %@", lineage);

    NSArray<NSValue *> *children = [UIElementUtilities childrenOfUIElement:element];
    if (children.count < 1) {
        NSLog(@"NO CHILDREN");
    }

    // First pass: only log each child for debugging.
    for (NSInteger i = 0; i < [children count]; i++) {
        NSValue *child = children[i];
        AXUIElementRef uiElement = (__bridge AXUIElementRef)child;
        NSString *childRole = [UIElementUtilities roleOfUIElement:uiElement];
        NSRect childFrame = [UIElementUtilities frameOfUIElement:uiElement];
        NSLog(@"----%@, role: %@, %@", child, childRole, [NSValue valueWithRect:childFrame]);
    }
    NSLog(@"<<< -----------------------------------------------------");

    // Second pass: collect overlay elements, recursing into containers.
    for (NSInteger i = 0; i < [children count]; i++) {
        NSValue *child = children[i];
        AXUIElementRef uiElement = (__bridge AXUIElementRef)child;
        NSString *childRole = [UIElementUtilities roleOfUIElement:uiElement];
        NSRect childFrame = [UIElementUtilities frameOfUIElement:uiElement];
        NSLog(@"%@, role: %@, %@", child, childRole, [NSValue valueWithRect:childFrame]);
        if ([childRole isEqualToString:@"AXButton"] ||
            [childRole isEqualToString:@"AXTextField"] ||
            [childRole isEqualToString:@"AXStaticText"]) {
            // Leaf elements get an overlay tagged with their position.
            NSString *tag = [NSString stringWithFormat:@"%ld%ld", (long)index, (long)i];
            NSLog(@"tag: %@", tag);
            DDHOverlayElement *overlayElement = [[DDHOverlayElement alloc] initWithUIElementValue:child tag:tag];
            [tempOverlayElements addObject:overlayElement];
        } else if ([childRole isEqualToString:@"AXGroup"] ||
                   [childRole isEqualToString:@"AXToolbar"]) {
            // Containers: recurse one level deeper. Note that ++index also
            // changes the tag prefix for the remaining siblings.
            [tempOverlayElements addObjectsFromArray:[self overlayChildrenOfUIElement:uiElement index:++index]];
        } else if ([childRole isEqualToString:@"AXWindow"]) {
            [self.overlayWindowController setFrame:[UIElementUtilities frameOfUIElement:uiElement]];
            [tempOverlayElements addObjectsFromArray:[self overlayChildrenOfUIElement:uiElement index:index]];
        }
    }
    return [tempOverlayElements copy];
}
For some AXGroups the children are found; for others they are empty. I cannot figure out why.
Does anyone have an idea what I'm doing wrong?
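One diagnostic I'm trying (sketched in Swift, though the AX calls are the same C API): ask for AXChildren directly and check the AXError, since a genuinely empty children array and a failed attribute copy can look identical through convenience wrappers like childrenOfUIElement:.
import ApplicationServices

func logChildren(of element: AXUIElement) {
    var value: CFTypeRef?
    let error = AXUIElementCopyAttributeValue(element, kAXChildrenAttribute as CFString, &value)
    guard error == .success, let children = value as? [AXUIElement] else {
        // .apiDisabled, for example, means the app lacks Accessibility permission.
        print("AXChildren failed: \(error.rawValue)")
        return
    }
    print("\(children.count) children")
}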
Topic:
Accessibility & Inclusion
SubTopic:
General
How can I force VoiceOver to read parentheses for math expressions like this:
Text("(2+3)×4") // VoiceOver: Two plus three, times four
I’m looking for a way to have VoiceOver announce parentheses (e.g. “left paren”, “right paren”) without relying on NumberFormatter.Style.spellOut or .speechAlwaysIncludesPunctuation(), as both have drawbacks.
Using .spellOut breaks braille output and Rotor › Characters menu by turning numbers and symbols into words. And .speechAlwaysIncludesPunctuation() makes VoiceOver overly verbose—for example, it reads “21” as “twenty hyphen one.”
Is there a better way to selectively announce specific punctuation like parentheses while keeping numbers and symbols intact for braille and Rotor use?
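One workaround I'm experimenting with (a sketch, and it has its own trade-off: VoiceOver typically sends a custom accessibility label to the braille display as well):
import SwiftUI

// Keep the visual text intact, but hand VoiceOver an explicit spoken
// form that names the parentheses. The word table is illustrative.
struct ExpressionView: View {
    let expression = "(2+3)×4"

    var body: some View {
        Text(expression)
            .accessibilityLabel(spokenForm(of: expression))
    }

    // Maps each symbol to the words VoiceOver should speak.
    private func spokenForm(of expr: String) -> String {
        let words: [Character: String] = [
            "(": "left paren", ")": "right paren",
            "+": "plus", "×": "times",
        ]
        return expr.map { words[$0] ?? String($0) }.joined(separator: " ")
    }
}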
Dear Team,
We are stuck in the membership enrollment process at the purchase stage. We enter the payment details and get the message that our purchase will be completed in 48 hours, but after some time it reverts back to the same "purchase your membership" stage without any error or information. This has happened 3-4 times so far.
Kindly let us know how to resolve this issue and move forward with the enrollment process.
Thanks.
Hi, I applied for the COMMUNICATION capability, but got a message that I already have the driving task app entitlement.
After that, I applied one more time, but there has been no reply.
I do not have the com.apple.developer.carplay-communication capability. Does that mean I cannot apply for this capability?
What should I do next to get this capability?
Thanks
I wrote an app just for my own use and am testing it on my own phone. After about a week it can no longer be opened and shows as no longer available.
P.S. I haven't paid for a developer account, and I don't plan to release the app.
Topic:
Accessibility & Inclusion
SubTopic:
General
Is it possible to position windows on the floor by changing some setting? Currently, they cannot be placed on the floor due to drag limitations.
If you are on the TikTok website in Safari, you can view the video that originally brought you to the website from the search log. But if you want to click on another video listed on the website, it claims you need to use the app to go farther, and proceeding just brings you to the App Store, regardless of whether you already have the app. You are unable to view the video displayed on the website without searching for it separately in the app.
Topic:
Accessibility & Inclusion
SubTopic:
General
I have updated to iPadOS 26 and my pointer is still the circle one, not the arrow cursor.
Topic:
Accessibility & Inclusion
SubTopic:
General
While it is possible to scroll content using VoiceOver on macOS, I was not able to find any NSAccessibility APIs related to it (such as accessibilityScroll: on iOS).
I updated to developer beta 18.3 on my iPad (7th generation). After the update, sound isn't working. I tried restarting and completely resetting the device, but no luck. Also, the YouTube app isn't working after the update.
I’m experiencing an issue where Siri incorrectly announces currency values in notifications. Instead of reading the local currency correctly, it always reads amounts as US dollars.
Issue details:
My iPhone is set to Region: Chile and Language: Spanish (Chile).
In Chile, the currency symbol $ represents Chilean Pesos (CLP), not US dollars.
A notification with the text:
let content = UNMutableNotificationContent()
content.body = "¡Has recibido un pago por $5.000!"
is read aloud by Siri as:
”¡Has recibido un pago por 5.000 dólares!”
(English: “You have received a payment of five thousand dollars!”)
instead of the correct:
”¡Has recibido un pago por 5.000 pesos!”
(English: “You have received a payment of five thousand pesos!”)
Another developer already reported the same issue back in 2023, and it remains unresolved: https://developer.apple.com/forums/thread/723177
This incorrect behavior is not limited to iOS notifications; it also occurs in other Apple services:
watchOS, iPadOS, and macOS (Siri misreads currency values in various system interactions).
Siri’s currency conversion feature misinterprets $ as USD even when the device is set to a region where $ represents a different currency.
Announce Notifications on AirPods also exhibits this issue, making it confusing when Siri announces transaction amounts incorrectly.
Apple Intelligence interactions are also affected—for example, asking Siri to “read my latest emails” when one of them contains a monetary value results in Siri misreading the currency.
I have submitted a bug report via Feedback Assistant, and the Feedback ID is FB16561348.
This issue significantly impacts accessibility and localization for users in regions where the currency symbol $ is not associated with US dollars.
Has anyone found a workaround, or is there any update from Apple on this?
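The only workaround I've found so far is defensive: spell the currency out in the notification text so there is no bare "$" for Siri to misinterpret. A sketch using NumberFormatter's currencyPlural style:
import Foundation
import UserNotifications

// The exact wording comes from the locale data, so verify the output.
let formatter = NumberFormatter()
formatter.numberStyle = .currencyPlural
formatter.locale = Locale(identifier: "es_CL")

let amount = formatter.string(from: 5000) ?? "5.000 pesos" // e.g. "5.000 pesos chilenos"
let content = UNMutableNotificationContent()
content.body = "¡Has recibido un pago por \(amount)!"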
Topic:
Accessibility & Inclusion
SubTopic:
General
Tags:
Siri and Voice
User Notifications
Localization
Apple Intelligence
After installing iOS 26 beta 2 on my iPhone 13, I can't take a screenshot using AssistiveTouch or Back Tap.
Hello,
I am a student studying accessibility.
I aim to analyze the smartphone usage patterns of visually impaired individuals.
Therefore, I would like to log the VoiceOver usage records of visually impaired iPhone users.
Is there a way to output VoiceOver logs, similar to the AccessibilityService API on Android?
Thank you in advance for your responses.
CallKit and WebRTC are used to implement the call functionality.
You can select video, voice, or text calling as the calling method.
When making a text call, the voice input is grayed out and cannot be used. Is there a solution?