Explore best practices for creating inclusive apps that cater to users with diverse abilities


Posts under General subtopic

Post · Replies · Boosts · Views · Activity

Having trouble with Accessibility API of the ApplicationServices framework
After updating from macOS Big Sur 11.0 to the latest 11.5, my app's AXObserverAddNotification method fails. Here is sample code I tested from Stack Overflow: https://stackoverflow.com/questions/853833/how-can-my-app-detect-a-change-to-another-apps-window

    AXUIElementRef app = AXUIElementCreateApplication(82695); // the pid for the front-running Xcode 12.5.1
    CFTypeRef frontWindow = NULL;
    AXError err = AXUIElementCopyAttributeValue(app, kAXFocusedWindowAttribute, &frontWindow);
    if (err != kAXErrorSuccess) {
        NSLog(@"failed with error: %i", err);
    }
    NSLog(@"app: %@  frontWindow: %@", app, frontWindow);

The 'frontWindow' reference is never created and I get error number -25204. It seems like the latest Big Sur 11.5 has revised the Accessibility API, or perhaps there is some permission switch I am unaware of that would make things work. What am I doing wrong?
2 replies · 0 boosts · 771 views · Jun ’25
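A frequent cause of AX calls failing after a macOS update is that the app is no longer trusted for Accessibility (the permission list can reset), so treating error -25204 as a permissions problem is an assumption worth testing before suspecting an API change. A minimal Swift sketch of the trust check:

    import ApplicationServices

    // Minimal sketch: confirm this process is trusted for Accessibility and,
    // if it is not, ask macOS to show the prompt that sends the user to
    // System Preferences > Security & Privacy > Privacy > Accessibility.
    let promptKey = kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String
    let trusted = AXIsProcessTrustedWithOptions([promptKey: true] as CFDictionary)
    print("Accessibility access granted: \(trusted)")
    // Re-run the AXUIElementCopyAttributeValue call from the question only
    // once this prints true; an untrusted process gets AX errors instead.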
Best Way to Navigate to the Top Element Using VoiceOver
I’m currently focused on an element at the bottom of the screen. What is the proper way to quickly navigate to the top element? By default, there’s a four-finger single tap to move to the first element, but should I use the Rotor action instead to focus on the element I need? For example, in the Contacts app while adding a new contact, if I enter a value in a field at the bottom, there’s no quick way to directly save the contact. I have to manually navigate all the way to the top to tap the Done button, which feels a bit inconvenient. Is there a better way to handle this using VoiceOver?
2 replies · 0 boosts · 309 views · Mar ’25
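For the Contacts-style flow described above, one app-side option (a sketch under the assumption that the screen has a single primary action, not a statement about how Contacts itself is built) is to support VoiceOver's Magic Tap gesture (two-finger double tap), which is routed to accessibilityPerformMagicTap() and can trigger the save without navigating back to the top:

    import UIKit

    // Hypothetical form screen: Magic Tap (two-finger double tap) performs the
    // primary action so a VoiceOver user never has to walk back to the Done
    // button at the top of the screen.
    final class NewContactFormViewController: UIViewController {

        override func accessibilityPerformMagicTap() -> Bool {
            saveContact()   // hypothetical save routine
            return true     // tells VoiceOver the gesture was handled
        }

        private func saveContact() {
            // validate the fields and persist the new contact...
        }
    }

On screens that adopt this, the user can save from any field without moving focus; the four-finger single tap remains the gesture for jumping to the first element.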
Too many verification codes have been sent.
Hello, I have the following problem. I'm developing a no-code app on the FlutterFlow platform and have been working on it for over a year. This time, after publishing a new version of the app through FlutterFlow, I tried logging into App Store Connect, but I got an error saying that I had made too many login attempts and needed to try again later. However, I hadn't attempted to log in before that at all. No matter how long I wait (24 hours, 48 hours), the same error keeps appearing, meaning I still can't access my account. Apple Support hasn't responded for 4 days, and in total I've been locked out of my account for over 9 days. Please help me understand what might be causing this issue. App Store Connect refuses to send me an SMS with the login code.
2 replies · 1 boost · 441 views · Feb ’25
Do Rotors add more Complexity to VoiceOver?
This may sound like a bit of an odd question, but this is what I was told this morning by one of our Accessibility managers.

This past June at WWDC, I scheduled a lab session with Apple's accessibility folks for a review. I had the pleasure of working with Ryan, who helped give the great VoiceOver Testing talk from WWDC 2018. I believe I've worked with him before in the labs, but regardless, no matter who I meet with in the Accessibility Labs, they always provide me with some new nugget of information, no matter how well versed I might think I am in accessibility. After the labs, I made all the changes Ryan suggested and also told the other developers on my team what I was taught.

In our app we provide various forms, and each field component that appears in a form has a header text to which we apply a header trait. This allows the Headings rotor to be used to quickly navigate between all the questions in the form, say if a user wants to return to a previous field. I even suggested we take the time to provide a custom rotor that lets users navigate to fields in an error state: if the user submits the form and the responses are validated, and one or more fields are in error, we should have a rotor that lets the user jump directly to those fields, since they may not be able to see the red text / red outlines on those fields.

This morning, I was told that I needed to undo that, and that our headerLabel properties should not be marked with the UIAccessibilityTrait.header trait. When I stated that it makes navigating the form much easier via the Headings rotor, the Accessibility manager told me this is not the case. I have the MS Teams transcript in front of me, which reads as follows (give or take a few transcript errors):

"So I went ahead and I just double checked with two of my friends, who are blind and for them on their end, they both said that they would not actually use that, and could add more complexity, because they have—in addition to being blind—but there's also mobility limitations. So they actually can't even use the Rotor at all. They only can use the swipes."

Does this make sense to anyone? Because it doesn't to me. Thoughts on this?
2 replies · 0 boosts · 409 views · Dec ’24
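Independent of the header-trait debate, here is a minimal sketch of the custom "errors" rotor described above, assuming the form tracks which field views failed validation (the names FormViewController and fieldsInError are hypothetical):

    import UIKit

    // Let VoiceOver users jump directly between form fields that failed validation.
    final class FormViewController: UIViewController {

        // Hypothetical collection of field views currently in an error state.
        var fieldsInError: [UIView] = []

        func installErrorRotor() {
            let errorRotor = UIAccessibilityCustomRotor(name: "Errors") { [weak self] predicate in
                guard let self = self, !self.fieldsInError.isEmpty else { return nil }

                // Find where the search starts and step forward or backward.
                let current = predicate.currentItem.targetElement as? UIView
                let currentIndex = current.flatMap { self.fieldsInError.firstIndex(of: $0) } ?? -1
                let nextIndex = predicate.searchDirection == .next ? currentIndex + 1 : currentIndex - 1

                guard self.fieldsInError.indices.contains(nextIndex) else { return nil }
                return UIAccessibilityCustomRotorItemResult(targetElement: self.fieldsInError[nextIndex],
                                                            targetRange: nil)
            }
            accessibilityCustomRotors = [errorRotor]
        }
    }

A custom rotor is additive: users who cannot operate the rotor still reach the same fields by swiping, so it does not remove anything from linear navigation.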
Accessibility full keyboard access issue.
In our application we use UIAlertController. When accessibility Full Keyboard Access is enabled and we try to dismiss that alert controller with the Esc key on an external keyboard, it does not work. We present the alert controller as a popover. We need to dismiss the alert controller with an Esc key press from the external keyboard.
2 replies · 0 boosts · 526 views · Mar ’25
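One workaround to experiment with (an assumption rather than a confirmed Full Keyboard Access fix): have the presenting view controller expose an Escape key command and dismiss the presented alert itself. The controller name is hypothetical:

    import UIKit

    final class SettingsViewController: UIViewController {   // hypothetical presenter

        // Expose an Esc key command while this controller is in the responder chain.
        override var keyCommands: [UIKeyCommand]? {
            [UIKeyCommand(input: UIKeyCommand.inputEscape,
                          modifierFlags: [],
                          action: #selector(dismissPresentedAlert))]
        }

        @objc private func dismissPresentedAlert() {
            // Dismiss the alert controller presented as a popover, if any.
            presentedViewController?.dismiss(animated: true)
        }
    }

Note that key commands are only delivered while the controller is in the active responder chain with the popover up, which is part of what would need verifying here.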
RTT call option and confirmation dialog missing when dialing emergency numbers
Hello, In our app we provide a button that initiates a phone call using tel://. For normal numbers, tapping the button presents the standard iOS confirmation sheet with Call and Cancel. If RTT is enabled on the device, the sheet instead shows three options: Call, Cancel, and RTT Call. However, when dialing a national emergency number, this confirmation dialog does not appear at all; the call is placed immediately, without giving the user the choice between voice or RTT. Is this the expected system behavior for emergency numbers on iOS? And if so, how does RTT get applied in the emergency-call flow: is it managed entirely by the OS rather than exposed as a user-facing option? Thanks in advance for clarifying.
2 replies · 0 boosts · 576 views · 3d
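For context, a minimal sketch of how such a call button typically triggers the flow (the number is a placeholder); the confirmation sheet, including the RTT Call option, is put up by the system rather than by the app:

    import UIKit

    // Placeholder number. For ordinary numbers iOS shows Call / Cancel
    // (plus RTT Call when RTT is enabled); emergency numbers are reported
    // to dial immediately, as described above.
    func dial(_ number: String) {
        guard let url = URL(string: "tel://\(number)") else { return }
        UIApplication.shared.open(url, options: [:], completionHandler: nil)
    }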
xcstrings file is not being updated
I'm using Xcode 15.2 and have migrated my (macOS) project to use an xcstrings file a while back. Now when I check the xcstrings file, all items are marked as "stale". When I add new localized strings in code, they don't show up in the xcstrings file. The xcstrings file is built correctly (into .lproj/Localizable.strings) when building. Where can I check which source files are checked to update xcstrings status? "xcstringstool" appears to have a "sync" feature which reads "stringsdata" files, but there is no information in the xcstringstool help on where the stringsdata files come from. If I create a new project I can see a "stringsdata" file being generated for each source file in the intermediate build products folder.
2 replies · 0 boosts · 2.6k views · Oct ’24
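For reference, a small sketch of the kind of string references the compiler-driven extraction is meant to pick up, under the assumption that the target's "Use Compiler to Extract Swift Strings" build setting (SWIFT_EMIT_LOC_STRINGS) is enabled; that setting is what produces the per-file .stringsdata files the sync step reads. The keys are placeholders:

    import Foundation

    // Placeholder keys; both forms are normally extracted into the string catalog
    // when the project builds with compiler-based extraction enabled.
    let title = String(localized: "settings.window.title")
    let footer = NSLocalizedString("settings.window.footer",
                                   comment: "Footer shown at the bottom of the settings window")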
PHPickerViewController No Auto Focus
The issue is that I cannot get Bluetooth keyboard focus automatically in PHPickerViewController after enabling Full Keyboard Access on my iPhone 14 running iOS 18.3.1. The keyboard focus in PHPickerViewController does show, but only after I tap on the blank space of the PHPickerViewController. How can I make the focus appear there in the first place? I'm using a UINavigationController and calling setNavigationBarHidden(true, animated: false). Then I use this controller to present PHPickerViewController with the configuration setup below.

    self.configuration = PHPickerConfiguration()
    configuration.filter = .any(of: filters)
    configuration.selectionLimit = selectionLimit
    if #available(iOS 15.0, *), allowOrdering {
        configuration.selection = .ordered
    }
    configuration.preferredAssetRepresentationMode = .current

Finally I set the delegate on the PHPickerViewController and call UINavigationController.present(PHPickerViewController, animated: true) to render it. I also notice an animation showing on the first video and then it disappears.
2 replies · 0 boosts · 262 views · Mar ’25
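One thing that might be worth trying (purely a speculative workaround sketch, not a documented fix): after presenting the picker, ask the focus system to move focus to it so Full Keyboard Access has an initial focus target. The helper name is hypothetical:

    import UIKit
    import PhotosUI

    // Present the picker, then request a focus update onto it so the Full
    // Keyboard Access focus ring has somewhere to land without a tap.
    func presentPicker(from presenter: UIViewController,
                       configuration: PHPickerConfiguration,
                       delegate: PHPickerViewControllerDelegate) {
        let picker = PHPickerViewController(configuration: configuration)
        picker.delegate = delegate
        presenter.present(picker, animated: true) {
            UIFocusSystem.focusSystem(for: picker)?.requestFocusUpdate(to: picker)
        }
    }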
AVSpeechSynthesisProviderVoice audioFileSettings field
Hello, AVSpeechSynthesisVoice has an audioFileSettings attribute:

    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = AVSpeechSynthesisVoice(identifier: voiceSelected!)
    print("- voice \(utterance.voice!.audioFileSettings)")
    // ["AVLinearPCMIsBigEndianKey": 0, "AVLinearPCMIsFloatKey": 1, "AVLinearPCMIsNonInterleaved": 1,
    //  "AVNumberOfChannelsKey": 1, "AVSampleRateKey": 22050, "AVFormatIDKey": 1819304813,
    //  "AVLinearPCMBitDepthKey": 32]

It is declared in AVSpeechSynthesisVoice:

    @available(iOS 13.0, *)
    open var audioFileSettings: [String : Any] { get }

    @available(iOS 17.0, *)
    open var voiceTraits: AVSpeechSynthesisVoice.Traits { get }

How can we specify the audioFileSettings attributes for an AVSpeechSynthesisProviderVoice? Because AVSpeechSynthesisProviderVoice has no such field:

    open var name: String { get }
    open var identifier: String { get }
    open var primaryLanguages: [String] { get }
    open var supportedLanguages: [String] { get }
    open var voiceSize: Int64
    open var version: String
    open var gender: AVSpeechSynthesisVoiceGender
    open var age: Int

Regards
2 replies · 0 boosts · 111 views · Mar ’25
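Not an answer to the provider-voice question, but a small sketch of how those settings can at least be inspected and turned into a concrete format on the consuming side, for example to compare against what a provider's audio unit renders (the voice identifier is a placeholder):

    import AVFoundation

    // Placeholder identifier; build an AVAudioFormat from the PCM settings a
    // system voice reports via audioFileSettings.
    if let voice = AVSpeechSynthesisVoice(identifier: "com.apple.voice.compact.en-US.Samantha"),
       let format = AVAudioFormat(settings: voice.audioFileSettings) {
        print("channels: \(format.channelCount), sampleRate: \(format.sampleRate)")
    }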
Keyboard navigation not working in native iOS Wallet interface
Hi guys, I'm facing an issue with the native interface for adding a card to the Wallet. Does anyone have ideas on how to fix or work around it?

STEPS TO REPRODUCE:
1. Disable VoiceOver (Settings → Accessibility → VoiceOver → Off).
2. Connect an external keyboard and confirm that you can navigate other iOS interfaces using it.
3. In any app, present a PKAddPassesViewController with a valid .pkpass file.
4. When the Wallet "Add Pass" sheet appears, attempt to navigate using only the external keyboard (Tab/Arrow/Enter).
5. Observe that focus does not move to the Cancel or Add buttons, and no elements receive keyboard focus.

EXPECTED RESULT:
All interactive elements in PKAddPassesViewController (e.g., Cancel and Add) should be fully keyboard accessible without requiring VoiceOver. Users should be able to navigate, select, and complete actions using only a hardware keyboard.

ACTUAL RESULT:
Keyboard navigation is not possible. No elements receive focus. Users cannot activate the Cancel or Add buttons using keyboard input. The only way to interact is by touch or by enabling VoiceOver, which does not satisfy keyboard accessibility requirements.

IMPACT:
- Violates WCAG 2.1 Success Criterion 2.1.1 (Keyboard Accessible).
- Prevents keyboard-only users (including users with motor disabilities) from adding passes to Wallet.
- Affects users of external keyboards who rely on tab/arrow navigation.
- Creates an inconsistent accessibility experience compared to other iOS system modals.
2 replies · 0 boosts · 1.3k views · Aug ’25
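For anyone reproducing this, a minimal sketch of step 3 (the pass file name is a placeholder):

    import PassKit
    import UIKit

    // Load a bundled .pkpass (placeholder name) and present the system
    // "Add Pass" sheet, which is where keyboard focus never arrives.
    func presentAddPassSheet(from presenter: UIViewController) {
        guard let url = Bundle.main.url(forResource: "SamplePass", withExtension: "pkpass"),
              let data = try? Data(contentsOf: url),
              let pass = try? PKPass(data: data),
              let addPassesController = PKAddPassesViewController(pass: pass) else { return }
        presenter.present(addPassesController, animated: true)
    }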
Too small font in Notes app
Dear developer team, After updating to iOS 18.3.1 I noticed that the font in the Notes app became too small to read comfortably, and I already have poor eyesight. There is no way to increase the font size. When I select my preferred text size in the Accessibility settings, it only changes the size of headings in the Notes app; the text inside the note itself remains too small. I'm using an iPhone 13. I googled the issue, and it seems other users across the Internet are also unhappy about not being able to change the text size in Notes to a comfortable level. I hope this issue will be addressed in the next version of iOS, because the reading size in a standard app matters a great deal for tired and diminished eyesight. Kind regards, Maria
2 replies · 0 boosts · 502 views · Feb ’25
Custom tab bar in SwiftUI
I made a (very simple) custom tab bar in SwiftUI. It's simply an HStack containing two buttons. These buttons control the selection of a paged TabView. This works well, but in VoiceOver they don't behave like the bottom tab bar or e.g. a segmented picker. Specifically, VoiceOver does not say something like "tab one of two" when the first button is focused. According to my research, in UIKit this can be accomplished by giving the container view the accessibility trait tabBar, hiding it as an accessibility element and give it the accessibility container type semanticGroup. In SwiftUI, there is also the trait isTabBar, but that does not seem to have any impact for VoiceOver. I don't see an equivalent of semanticGroup in SwiftUI. I tried accessibilityElement(children: .contain) but that also does not seem to have any impact. So, is there any way in SwiftUI to make a button behave like a tab-button in VoiceOver? And how is SwiftUI's isTabBar accessibility trait supposed to be used?
2 replies · 0 boosts · 301 views · Aug ’25
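For what it's worth, a sketch of one workaround (an assumption, not a documented SwiftUI equivalent of UIKit's semanticGroup): keep the container's isTabBar trait, but also spell out the position on each button via accessibilityValue so VoiceOver announces something like "Tab 1 of 2" even where the trait alone has no audible effect. The view and tab titles are hypothetical:

    import SwiftUI

    struct CustomTabBar: View {
        @Binding var selection: Int
        private let titles = ["Home", "Profile"]   // hypothetical tabs

        var body: some View {
            HStack {
                ForEach(titles.indices, id: \.self) { index in
                    Button(titles[index]) { selection = index }
                        // Manually announce the position, since the tab-bar trait
                        // alone does not appear to do it for custom containers.
                        .accessibilityValue("Tab \(index + 1) of \(titles.count)")
                        .accessibilityAddTraits(selection == index ? .isSelected : [])
                }
            }
            .accessibilityElement(children: .contain)
            .accessibilityAddTraits(.isTabBar)
        }
    }

This duplicates information the system normally derives itself, so it is worth re-testing on each OS release in case the isTabBar trait starts being honored.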
DUNS Number
Hello, I’m in the process of enrolling my business (Carzo Rent A Car, Prishtine, Kosovo) in the Apple Developer Program, but I have been waiting for my D-U-N-S number to be issued. I submitted the request to Dun & Bradstreet on July 28, 2025 (Case #9142648) and have only received a system-generated email with a tracking ID (#9086421). There has been no further update. My questions are: Is there a way for Apple to expedite or provisionally approve my enrollment while the D-U-N-S number is pending? How long does Apple typically wait for D&B updates before the enrollment is affected? Are there any alternative steps I can take to avoid further delays? Thank you for your guidance.
2 replies · 0 boosts · 139 views · Jul ’25
Text Replacement preferences are not exportable: is there a PLIST file or similar that stores these values?
macOS > Settings > Keyboard > Text Replacement is a macOS/iOS feature I use extensively, but it is unavailable in quite a few macOS applications, including some Apple apps like Xcode. Other apps I want Text Replacement to work in, but where it doesn't, are the Adobe CC suite, all my code text editors (VC Studio, Sublime Text, BBEdit, emacs, etc.) and Firefox.

Does anybody know the file path of the file that stores the Text Replacement key/string pair data? If I can access the .plist file (or similar) where the Text Replacement key/string pairs are stored, I will be able to convert it to JSON using regex and import it into the Firefox plugin that replicates Text Replacement functionality in Firefox for me. Ditto other code editor applications with their own particular text substitution functionality.

Background

Some of these apps have plugins or functional equivalents of Text Replacement, but I need a way to do the import/export dance to keep them in sync with my Text Replacement text pairs. Sadly, even though we can select the Text Replacement table (in macOS, but not in the iOS version), we can't copy that information. This seems to me a violation of good GUI design principles: why allow selection of the entire table if we cannot copy it to the clipboard? Nor can we import or export the table of text tuples; the Text Replacement GUI has no buttons for this (consider this post also a feature request for that). Screen-capturing and running text recognition software over the PNG is not an option, given a) all the unusual UTF-16 glyphs and combining glyphs I use, and b) that I'd like to script this as a multi-directional syncing application I can run periodically.

Typically, macOS and app preferences are stored in plist files. I want to find such a file and convert it to JSON for importing into a Firefox plugin that replicates Text Replacement within Firefox. I tried modifying Text Replacements by adding a new item to the list and clicking "Done", then filtering ~/Library for .plist files sorted by "Date Modified", but nothing shows up containing these values.
2 replies · 0 boosts · 526 views · Oct ’24
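A speculative Swift sketch of the export half, with loud caveats: older macOS releases exposed the pairs through the global-defaults key NSUserDictionaryReplacementItems with "replace"/"with" entries, but recent releases appear to keep them in a synced store instead, in which case this prints zero entries. The key names and the output path are assumptions to verify, not documented API:

    import Foundation

    // Read text-replacement pairs from the global defaults domain (assumed key)
    // and write them out as JSON for import into other tools.
    let items = UserDefaults.standard.array(forKey: "NSUserDictionaryReplacementItems") as? [[String: Any]] ?? []
    let entries: [[String: String]] = items.compactMap { item -> [String: String]? in
        guard let shortcut = item["replace"] as? String,   // assumed dictionary keys
              let phrase = item["with"] as? String else { return nil }
        return ["replace": shortcut, "with": phrase]
    }
    if let json = try? JSONSerialization.data(withJSONObject: entries, options: [.prettyPrinted]) {
        try? json.write(to: URL(fileURLWithPath: "/tmp/text-replacements.json"))  // assumed output path
        print("Exported \(entries.count) text replacement pairs")
    }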