Explore best practices for creating inclusive apps that cater to users with diverse abilities


Posts under General subtopic

Post · Replies · Boosts · Views · Activity

Female English (India) Siri Voice Pronounces Certain Words with the American Pronunciation Instead of the Correct One
I just downloaded the public release of iOS 18.2 and found a bug in the English (India) female Siri voice: it pronounces certain words with the American pronunciation instead of the pronunciation the voice is supposed to have. For example, the Siri voice uses the American pronunciation of the word "privacy". This is definitely a bug that needs to be resolved as soon as possible. Can you please fix this right away?
4
0
513
Dec ’24
FKA Accessibility focus seems broken in SwiftUI
There are several ways we are supposed to be able to control a11y (accessibility) focus in FKA (Full Keyboard Access) mode. We should be able to set up an @AccessibilityFocusState variable containing an enum case for each view that we want to receive a11y focus. That works from VO (VoiceOver) but not from FKA mode. See the sample project linked from this Stack Overflow question: https://stackoverflow.com/questions/79067665/how-to-manage-accessibilityfocusstate-for-swiftui-accessibility-keyboard

Similarly, we are supposed to be able to use accessibilitySortPriority to control the order in which views are selected when an FKA user tabs between views. That also works from VO but not from FKA mode. In the sample code below, the .accessibilitySortPriority() view modifiers cause VO to use a non-standard order when you swipe between views, but they have no effect in FKA mode.

Is there a way to either set the a11y focus or change the order in which the views are selected that actually works in SwiftUI when the user is in FKA mode?

Code that should cause FKA to tab between text fields in a custom order:

```swift
struct ContentView: View {
    @State private var val1: String = "val 1"
    @State private var val2: String = "val 2"
    @State private var val3: String = "val 3"
    @State private var val4: String = "val 4"

    var body: some View {
        VStack {
            TextField("Value 1", text: $val1)
                .accessibilitySortPriority(3)
            VStack {
                TextField("Value 2", text: $val2)
                    .accessibilitySortPriority(1)
            }
            HStack {
                TextField("Value 3", text: $val3)
                    .accessibilitySortPriority(2)
                TextField("Value 4", text: $val4)
                    .accessibilitySortPriority(4)
            }
        }
        .padding()
    }
}
```
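For reference, a minimal sketch of the @AccessibilityFocusState pattern the question describes (the enum cases, field names, and views are illustrative, not from the original project):

```swift
import SwiftUI

struct FocusExampleView: View {
    enum Field: Hashable {
        case first, second
    }

    // Binding between program state and accessibility focus.
    @AccessibilityFocusState private var focusedField: Field?
    @State private var name = ""
    @State private var email = ""

    var body: some View {
        VStack {
            TextField("Name", text: $name)
                .accessibilityFocused($focusedField, equals: .first)
            TextField("Email", text: $email)
                .accessibilityFocused($focusedField, equals: .second)
            Button("Focus email") {
                // Moves VoiceOver focus; per the report above, FKA
                // focus does not follow this assignment.
                focusedField = .second
            }
        }
        .padding()
    }
}
```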
4
0
174
Jul ’25
AVSpeechSynthesisMarker - iOS 18 - Sync lost
Hello, in an AVSpeechSynthesisProviderAudioUnit, sending word positions to the host using AVSpeechSynthesisMarker / AVSpeechSynthesisMarker.Mark.word seems to be broken on iOS 18. On the app/client side, all the events are received immediately, whereas they should arrive in sync with the audio. The exact same code works perfectly on iOS 17.

In the AVSpeechSynthesisProviderAudioUnit, the AVSpeechSynthesisMarkers are appended with the correct position/sample offset:

```swift
let wordPos = NSMakeRange(characterRange.location, characterRange.length)
let marker = AVSpeechSynthesisMarker(markerType: AVSpeechSynthesisMarker.Mark.word, forTextRange: wordPos, atByteSampleOffset: byteSampleOffset)
// also tried with
// let marker = AVSpeechSynthesisMarker(wordRange: wordPos, atByteSampleOffset: byteSampleOffset)
markerArray.append(marker)
print("word : pos \(characterRange) - offset \(byteSampleOffset)")

// send events to host
speechSynthesisOutputMetadataBlock?(markerArray, self.request!)
```

```
word : pos {7, 7} - offset 2208
word : pos {15, 8} - offset 37612
word : pos {24, 6} - offset 80368
word : pos {31, 3} - offset 118652
word : pos {35, 2} - offset 128796
...
```

But on the client side they are all received at the same time (at the beginning of speech), whereas on iOS 17 they arrive in sync with the audio:

```swift
func speechSynthesizer(_ synthesizer: AVSpeechSynthesizer, willSpeakRangeOfSpeechString characterRange: NSRange, utterance: AVSpeechUtterance) {
    print("characterRange : \(characterRange)")
}
```

Using the Apple voice/engine works, so there is obviously something to change, but the documentation for AVSpeechSynthesisProviderAudioUnit / AVSpeechSynthesisMarker seems unchanged. Thanks in advance.
4
0
554
Dec ’24
VoiceOver spells word letter by letter
We currently have an odd issue where VoiceOver spells a word letter by letter, while the same word is spoken as a whole for other items. The app is in German. I have a View in SwiftUI whose button traits are removed, then a label "Start Tab 1 von 5" is added. "Tab" is spoken as a whole word here; all fine. If I change the label to "Tab-Schaltfläche" or, for example, "SimplyGo Tab 3 von 5", then "Tab" is spoken as "T A B", letter by letter. Is there a way to force VoiceOver to speak it as a whole word?
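One workaround sometimes suggested for this kind of mispronunciation is an attributed accessibility label with an IPA hint on the affected word. A hedged sketch, where the view and the IPA value (a rough guess at the German pronunciation) are placeholders to tune:

```swift
import UIKit

let text = "SimplyGo Tab 3 von 5"
let attributed = NSMutableAttributedString(string: text)
if let range = text.range(of: "Tab") {
    // Pronunciation hint for just this word; "tap" is an assumed IPA value.
    attributed.addAttribute(
        .accessibilitySpeechIPANotation,
        value: "tap",
        range: NSRange(range, in: text)
    )
}

let tabView = UIView() // stands in for the real element's UIKit backing
tabView.accessibilityAttributedLabel = attributed
```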
4
0
1.2k
Jan ’25
VoiceOver cursor focus tracking
In some places in our app we use NSAccessibilityElement subclasses to vend extra items to accessibility clients. We need to know which item has the VoiceOver focus so we can keep track of it. setAccessibilityFocused: does not get called when accessibility clients focus NSAccessibilityElements; it is only called when accessibility clients focus view-based accessibility elements (i.e., when an NSView subclass gets focused).

At the same time, we need to programmatically move VoiceOver focus to those items when something happens. Those accessibility elements inherit from NSObject, so we can't make them first responder.

Is this the expected behavior? What are our options for reacting to the VoiceOver cursor moving around? What are our options for programmatically moving the VoiceOver cursor to a different element?

Here's a sample project that demonstrates the first part of the issue: https://github.com/vendruscolo/apple-rdars/tree/master/DTS12368714%20-%20NSAccessibilityElement%20focus%20tracking

If you run the app, a window will show up containing a button and a red square. If you enable VoiceOver, you'll be able to move the cursor over the red square, and a message will be logged. You'll also notice there's an extra element after the red square. That element is available to VoiceOver; however, when it gets focused, no message gets logged.
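For context, a sketch of one approach people try for the second part: mark the vended element focused and post a focus-change notification. Whether this actually moves the cursor for non-view elements is exactly the open question here; the class and logging are illustrative:

```swift
import AppKit

final class ExtraElement: NSAccessibilityElement {
    override func setAccessibilityFocused(_ focused: Bool) {
        super.setAccessibilityFocused(focused)
        // Log to observe whether accessibility clients ever call this.
        NSLog("setAccessibilityFocused(\(focused)) on \(self)")
    }
}

func moveVoiceOverCursor(to element: ExtraElement) {
    element.setAccessibilityFocused(true)
    // Tell accessibility clients that the focused element changed.
    NSAccessibility.post(element: element, notification: .focusedUIElementChanged)
}
```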
4
0
451
Mar ’25
Critical Bug: Children Can Disable Screen Time Apps Like Choreio Without Parental Approval
Dear Apple Support,

I am reporting a critical issue affecting parental control apps like my app, Choreio, which is live on the App Store.

When Screen Time settings are configured to require a parent's password for changes, parents must log in on their child's device to make any adjustments. This restriction is expected to extend to apps using the Screen Time API, such as Choreio. However, I've discovered a significant bug: children can bypass this restriction by simply toggling off Choreio in the Screen Time settings, without needing the parent's password. This effectively disables the app and defeats its purpose as a parental control tool.

Please address this issue as soon as possible to ensure the intended functionality of parental controls. Let me know if you need any additional information to assist with resolving this. Thank you for your attention to this matter.

Best regards,
Jeff Houston

STEPS TO REPRODUCE

1. Install Choreio from the App Store on the child's phone.
2. Enable parental controls in Screen Time and set it to require the parent's password for any changes to Screen Time settings.
3. Go to the Screen Time settings on the child's phone.
4. Observe that the child can simply toggle off Choreio, effectively deactivating the app, without needing the parent's password.

Expected behavior: toggling off Choreio should require the parent's password, just like it does for other Screen Time settings.

Let me know if additional details are needed!
4
0
319
Feb ’25
VoiceOver and currency - high amounts
I've noticed that VoiceOver reads currency amounts correctly when they are below one thousand. For higher amounts, for example 12.225,34 €, VoiceOver reads "twelve point two two five thirty four euros". If the amount is formatted without the thousands separator (12225,34 €), the problem doesn't exist: VO reads "twelve thousand two hundred and twenty five euros and thirty four cents". Why is the thousands separator a problem for VoiceOver if this formatting comes from the currency and locale? This issue exists in English. I changed my device language to Italian and German, and in both cases the number was read correctly even with the separator. Is there a way to make it work in English?
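Until the underlying reading bug is addressed, one workaround sketch (an assumption, not a fix) is to keep the grouped string for display but hand VoiceOver an explicit spoken label generated from the raw number:

```swift
import UIKit

let amount = 12225.34

// Locale-correct display string, e.g. "12.225,34 €".
let display = NumberFormatter()
display.numberStyle = .currency
display.locale = Locale(identifier: "de_DE")

// Spelled-out form for VoiceOver, so it never parses the grouped string.
let spellOut = NumberFormatter()
spellOut.numberStyle = .spellOut

let amountLabel = UILabel()
amountLabel.text = display.string(from: NSNumber(value: amount))
if let spoken = spellOut.string(from: NSNumber(value: amount)) {
    // Approximate phrasing; cents handling would need extra formatting.
    amountLabel.accessibilityLabel = "\(spoken) euros"
}
```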
4
1
2.2k
Jun ’25
Already Enrolled, but Now Asked to Re-Enroll – Certificates Revoked, No Response
Our company enrolled in the Apple Developer Program as an organization in July 2024. Everything was fine for several months, but in early January 2025, our developer noticed that the certificates were missing. When we logged into our developer account, we were shocked to see a page prompting us to “Enroll Today”—as if we had never joined in the first place. Clicking the enrollment button led us to an error page stating we cannot enroll. We immediately reached out to Apple Developer Support via email, but despite multiple attempts, we received no response. Strangely, our apps remain live on the App Store, App Store Connect functions as usual, and we continue receiving payments every month. However, we are completely blocked from developing and releasing updates. Today, I managed to reach Apple by phone. After being transferred to a senior representative, I was told they couldn’t tell me why this was happening. They only confirmed that a request had been made and that I should “wait.” That’s it—no explanation, no timeline, nothing. While it’s somewhat reassuring that they acknowledge the issue, I’ve already seen other developers with the same problem go unanswered for months. My suspicion? This account might be linked to an individual developer account from way back in 2015 when Apple’s registration process was far less strict. Could that be the issue? No idea—because Apple won’t say a word. Meanwhile, both of our apps have been exposed to several bugs, and customers are waiting for updates. If there’s still no response from Apple, I have no choice but to register a new account—purely to continue supporting our users. CASE ID: 102508598957
4
1
440
Mar ’25
VoiceOver navigation in carousels
Hi all, I’ve got a usability question about accessibility navigation. My app has a lot of carousels (horizontally scrolling lists of content with far more elements than can fit on the screen). Often, these are just images, but sometimes, they’re cards with multiple subelements. In our previous implementation, each card was a single accessibility element, and we exposed the subelements as accessibility custom actions. Despite this, users frequently mentioned navigating with VoiceOver as a pain point. It takes a long time to navigate through and navigate past these carousels. To solve this, I converted my carousels into a single adjustable element, so users can navigate through it with one swipe, and they can still access the elements by adjusting the values up and down. I got this advice from this 2018 WWDC talk. Is this still the recommended advice? Or is there a new, preferred way to do this? Additionally, I had to get a little creative with the second carousel, the one with multiple subelements. Some of these were interactive (imagine a card with a description, an upvote button, and a downvote button). Adjustable elements override the accessibility custom actions VoiceOver gesture, so I can’t expose the individual buttons as actions. Instead, I made each subelement in each card in the carousel one of the adjustable values. Swiping up would go from description 1 to upvote button 1 to downvote button 1 to description 2, etc. Double tapping with VoiceOver would perform whatever action the carousel is currently on. So if I adjust the value to the element at index 2 (say, downvote 1), double tapping would trigger the downvote button’s action. Does this make sense? Is there a better way to do this? This seemed to be the best compromise between screenreader navigation speed, exposing all actions, and the existing UI.
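For readers unfamiliar with the adjustable-element pattern referenced above, a minimal UIKit sketch (the `items` data source and scrolling hook are illustrative):

```swift
import UIKit

final class CarouselView: UIView {
    var items: [String] = []
    private var index = 0

    override var isAccessibilityElement: Bool {
        get { true }
        set { }
    }

    // Adjustable elements respond to VoiceOver swipe up/down.
    override var accessibilityTraits: UIAccessibilityTraits {
        get { .adjustable }
        set { }
    }

    override var accessibilityValue: String? {
        get { items.isEmpty ? nil : "\(items[index]), \(index + 1) of \(items.count)" }
        set { }
    }

    override func accessibilityIncrement() {
        index = min(index + 1, items.count - 1)
        // Scroll the carousel to `index` here.
    }

    override func accessibilityDecrement() {
        index = max(index - 1, 0)
        // Scroll the carousel to `index` here.
    }
}
```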
4
3
266
Jun ’25
SpaceBar Not functioning as expected
When I am doing a file search, in TextEdit, and on certain websites, the space bar quits functioning as soon as I start typing. If I hold down the Option key, the space bar works as normal. I have checked every setting I can think of and nothing has helped.
3
0
112
Apr ’25
`accessibilityUserInputLabels` is ignored on `UIBarButtonItem`
accessibilityUserInputLabels works fine with any view I tried it on, meaning the control can be toggled with the provided alternative names when using Voice Control. When setting this property on any UIBarButtonItem, though, Voice Control seems to ignore the alternative names provided by accessibilityUserInputLabels. For comparison, accessibilityLabel works perfectly when set on a UIBarButtonItem. Is anyone facing the same issue? Using Xcode 16.0 (16A242) on iOS 18.
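A hedged workaround sketch: back the bar item with a custom view and set the input labels on that view, since Voice Control appears to read the labels from views but not from the bar item itself (the button configuration here is illustrative):

```swift
import UIKit

final class EditorViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        let button = UIButton(type: .system)
        button.setImage(UIImage(systemName: "square.and.arrow.up"), for: .normal)
        button.accessibilityLabel = "Share"
        // Alternative names Voice Control users can say.
        button.accessibilityUserInputLabels = ["Share", "Send", "Export"]

        // Wrap the view in the bar item instead of configuring the item.
        navigationItem.rightBarButtonItem = UIBarButtonItem(customView: button)
    }
}
```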
3
0
547
Aug ’25
Programmatically setting accessibility focus broken?
Hello! I'm trying to improve the accessibility of a UIKit login form in our iOS app. If an error occurs, an error message is shown in a label that is hidden by default. For our VoiceOver users, I want to move focus to the error message label so that VoiceOver reads out the error message. I'm trying to achieve this using UIAccessibility.post, but try as I might, it does not work.

To better understand the problem, I created a very simple app that shows a button and a label (always visible); on pressing the button, I post an accessibility notification:

```swift
UIAccessibility.post(notification: .layoutChanged, argument: label)
```

What I expect to happen is for the focus to move from the button to the label. What happens instead is that the focus stays on the button and VoiceOver reads out the button's label again. So it seems to process the notification but ignore the argument. Am I misunderstanding how accessibility notifications work, or is this simply broken at the moment? I am testing this with my iPhone on the current iOS version, 18.2.1.

By the way, using the more modern variant leads to the same result:

```swift
AccessibilityNotification.LayoutChanged(label).post()
```
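A commonly suggested workaround for the original form scenario (an assumption, not a documented fix) is to post the notification after a short delay, so the label is visible and attached when VoiceOver handles it; the class and label names here are illustrative:

```swift
import UIKit

final class LoginViewController: UIViewController {
    let errorLabel = UILabel()

    func showLoginError(_ message: String) {
        errorLabel.text = message
        errorLabel.isHidden = false
        // Defer the post until layout has settled.
        DispatchQueue.main.asyncAfter(deadline: .now() + 0.5) { [weak self] in
            guard let label = self?.errorLabel, !label.isHidden else { return }
            UIAccessibility.post(notification: .layoutChanged, argument: label)
        }
    }
}
```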
3
0
488
Jan ’25
iOS 18.3.1+ widget: loading a local color image crashes the widget
Environment: Xcode 16.2, WidgetKit.

```swift
Image(uiImage: UIImage(named: "jp_jump")!)
    .resizable()
    .scaledToFit()
    .frame(width: 58, height: 16)
    .padding(EdgeInsets(top: 0, leading: 16, bottom: 0, trailing: 0))
```

"jp_jump" is a local color image. Loading it crashes the widget with:

```
Thread 4: EXC_RESOURCE (RESOURCE_TYPE_MEMORY: high watermark memory limit exceeded) (limit=30 MB)
```
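A common mitigation for the widget memory ceiling shown in that crash is to downsample the image to the size it is actually drawn at before handing it to SwiftUI. A sketch, reusing the asset name and 58x16 size from the post (the helper is illustrative):

```swift
import SwiftUI
import UIKit

// Redraw the asset at its display size so the widget never decodes
// the full-resolution bitmap.
func downsampledImage(named name: String, to size: CGSize) -> UIImage? {
    guard let image = UIImage(named: name) else { return nil }
    let renderer = UIGraphicsImageRenderer(size: size)
    return renderer.image { _ in
        image.draw(in: CGRect(origin: .zero, size: size))
    }
}

struct JumpBadge: View {
    var body: some View {
        if let image = downsampledImage(named: "jp_jump", to: CGSize(width: 58, height: 16)) {
            Image(uiImage: image)
                .resizable()
                .scaledToFit()
                .frame(width: 58, height: 16)
        }
    }
}
```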
3
0
79
Mar ’25
accessibilityElements excludes the unlisted subviews – How to Fix?
I have a parent view containing 10 subviews. To control the VoiceOver navigation order, I set only a few elements in accessibilityElements. However, the remaining elements are not being focused or are completely inaccessible. Is this the expected behavior? If I only specify a subset of elements in accessibilityElements, does it exclude the rest? What’s the best way to ensure all elements remain accessible while customising the order?
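For what it's worth, the documented behavior is that a non-nil accessibilityElements array is treated as the complete, ordered list of elements for the container, so every subview that should stay reachable must be included. A minimal sketch (view names are illustrative):

```swift
import UIKit

func configureAccessibilityOrder(
    parentView: UIView,
    titleLabel: UILabel,
    subtitleLabel: UILabel,
    primaryButton: UIButton,
    secondaryButton: UIButton
) {
    // Exhaustive, ordered list: anything omitted becomes unreachable.
    parentView.accessibilityElements = [
        titleLabel,       // read first
        subtitleLabel,
        primaryButton,
        secondaryButton,  // ...and so on for all ten subviews
    ]
}
```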
3
0
130
Apr ’25
Developer Mode Restart without HomeButton
After enabling Developer Mode on my iPhone and restarting it, the device asks me to press the Home button to confirm. Unfortunately, my Home button is broken, so I can’t access Developer Mode. The iPhone itself still works, but I can’t enable the mode. Is there any way to bypass this without the Home button?
3
0
76
Mar ’25
Components with Earcon haptic feedback for VoiceOver users
I want to understand which component types are intended to have an associated hint text, haptic feedback, or earcon for VoiceOver screen reader users. Is there a list somewhere, or an HIG guideline, for which transition types should have a sound?

Some transitions in Apple apps include different beep sounds, such as:

- opening a new screen
- screen dimming when a VoiceOver user swipes from the header/navbar to the body
- a scraping sound when swiping up or down a page
- the beginning or end of the body section
- in Calculator, when swiping from one row to the next
- opening a pop-up menu

I would also appreciate any direction on which code strings are associated with these sounds and how custom components can adopt these sounds, haptics, or hints where they're expected. On the other hand, I don't want to take that information and then dictate that every component needs a specific beep type, since these sounds appear to be used for specific purposes.
3
1
752
May ’25
Add words to Voice Control
I want to create a utility to import a list of words to the Voice Control user custom vocabulary. Is there an API to do this? I noticed if you use the built-in export vocabulary functionality (Settings > Accessibility > Voice Control > ...) the file that gets exported is a plist document type. If there is no API to add words programmatically should I just create a utility that generates a plist file and import it using the built-in import vocabulary functionality (Settings > Accessibility > Voice Control > ...)?
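Since no public API for adding Voice Control vocabulary appears to be documented, the plist-generation route described above seems like the practical option. A hedged sketch; the top-level "Vocabulary" key is an assumption, so inspect a real export from Settings > Accessibility > Voice Control and mirror its actual schema:

```swift
import Foundation

let words = ["earcon", "Choreio", "SimplyGo"]
let payload: [String: Any] = ["Vocabulary": words]

do {
    // Serialize the word list in the XML plist format.
    let data = try PropertyListSerialization.data(
        fromPropertyList: payload,
        format: .xml,
        options: 0
    )
    try data.write(to: URL(fileURLWithPath: "CustomVocabulary.plist"))
} catch {
    print("Failed to write vocabulary plist: \(error)")
}
```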
3
0
470
Oct ’24
When to use Numbered Lists in VoiceOver Accessibility
When would it be a good idea to use numbered lists with VoiceOver accessibility? As an example, for tab bars VoiceOver will read out "2 of 5". In speaking with an accessibility engineer at WWDC this year, they said that it's good practice, but we ran out of time in the call to dig deeper. When or how do you know when you should read out the item count, "2 of 5"? It makes sense to me in, say, a tab group or a chip group, but I don't see it being good in, say, a UITableView with a variable number of sections, each of which can contain multiple cells. I've had trouble finding additional guidance on this topic; if anyone can provide recommendations or their thoughts, that would be great.
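For the chip-group case where a spoken position does seem helpful, one option is to expose it manually. An illustrative sketch; `chips` is a hypothetical array of chip buttons, and using the hint (rather than the value) is just one way to phrase it:

```swift
import UIKit

func exposePositions(in chips: [UIButton]) {
    for (index, chip) in chips.enumerated() {
        // Spoken after the label, mimicking the built-in "2 of 5" phrasing.
        chip.accessibilityHint = "\(index + 1) of \(chips.count)"
    }
}
```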
3
0
796
Oct ’24
WKWebView not scrollable with a physical keyboard
Hello all. Currently I am trying to get WKWebView to scroll with a physical keyboard and it just will not work. I tried allowsKeyboardScrolling() and it did not work. UIWebView works, but it's no longer supported. I'm trying to get Full Keyboard Access to work to make our app more accessible, but WKWebView does not want to play nice. Has anyone else had issues trying to use WKWebView with an external keyboard, and if so, did you find any solutions? Greatly appreciated!
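One hedged workaround sketch: drive the web view's underlying scroll view from key commands in the containing view controller, since hardware arrow keys reportedly don't scroll WKWebView directly (the step size and class are illustrative):

```swift
import UIKit
import WebKit

final class WebViewController: UIViewController {
    let webView = WKWebView()

    override var keyCommands: [UIKeyCommand]? {
        [
            UIKeyCommand(input: UIKeyCommand.inputDownArrow, modifierFlags: [],
                         action: #selector(scrollDown)),
            UIKeyCommand(input: UIKeyCommand.inputUpArrow, modifierFlags: [],
                         action: #selector(scrollUp)),
        ]
    }

    @objc private func scrollDown() { scroll(by: 60) }
    @objc private func scrollUp() { scroll(by: -60) }

    private func scroll(by delta: CGFloat) {
        let scrollView = webView.scrollView
        var offset = scrollView.contentOffset
        // Clamp to the scrollable range.
        let maxY = max(0, scrollView.contentSize.height - scrollView.bounds.height)
        offset.y = min(max(0, offset.y + delta), maxY)
        scrollView.setContentOffset(offset, animated: true)
    }
}
```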
3
0
433
Feb ’25
How to use Core Spotlight?
I watched videos and blog posts and downloaded their projects, and there Core Spotlight works as expected. I copied the code into an empty project and did the same as they did, but it still isn't working. OS: macOS and iOS.

On the Core Data object I set an attribute to be indexed for Spotlight, and in the object itself I put the attribute name in the Display Name for Spotlight.

```swift
struct PersistenceController {
    static let shared = PersistenceController()

    var spotlightDelegate: NSCoreDataCoreSpotlightDelegate?

    @MainActor
    static let preview: PersistenceController = {
        let result = PersistenceController(inMemory: true)
        let viewContext = result.container.viewContext
        for _ in 0..<10 {
            let newItem = Item(context: viewContext)
            newItem.timestamp = Date()
        }
        do {
            try viewContext.save()
        } catch {
            let nsError = error as NSError
            fatalError("Unresolved error \(nsError), \(nsError.userInfo)")
        }
        return result
    }()

    let container: NSPersistentContainer

    init(inMemory: Bool = false) {
        container = NSPersistentContainer(name: "SpotLightSearchTest")
        if inMemory {
            container.persistentStoreDescriptions.first!.url = URL(fileURLWithPath: "/dev/null")
        }
        container.loadPersistentStores(completionHandler: { [weak self] (storeDescription, error) in
            if let error = error as NSError? {
                fatalError("Unresolved error \(error), \(error.userInfo)")
            }
            if let description = self?.container.persistentStoreDescriptions.first {
                description.setOption(true as NSNumber, forKey: NSPersistentHistoryTrackingKey)
                description.type = NSSQLiteStoreType

                if let coordinator = self?.container.persistentStoreCoordinator {
                    self?.spotlightDelegate = NSCoreDataCoreSpotlightDelegate(
                        forStoreWith: description,
                        coordinator: coordinator
                    )
                    self?.spotlightDelegate?.startSpotlightIndexing()
                }
            }
        })
        container.viewContext.automaticallyMergesChangesFromParent = true
    }
}
```

In my @main view:

```swift
@main
struct SpotLightSearchTestApp: App {
    let persistenceController = PersistenceController.shared

    var body: some Scene {
        WindowGroup {
            ContentView()
                .environment(\.managedObjectContext, persistenceController.container.viewContext)
                .onContinueUserActivity(CSSearchableItemActionType) { _ in
                    print("")
                }
        }
    }
}
```

The onContinueUserActivity(CSSearchableItemActionType) closure never gets triggered. So what am I missing that they don't explain in the blog post or videos?
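Two things worth checking here. First, persistent history tracking generally has to be enabled on the store description before loadPersistentStores is called; in the code above the options are set inside the completion handler, which may be too late for the Spotlight delegate to index anything. Second, the continuation closure only fires when the app is launched from a tapped Spotlight result, and the tapped object's identifier comes from the activity's userInfo. A sketch of how that handler is usually written:

```swift
import SwiftUI
import CoreSpotlight

@main
struct SpotLightSearchTestApp: App {
    let persistenceController = PersistenceController.shared

    var body: some Scene {
        WindowGroup {
            ContentView()
                .environment(\.managedObjectContext, persistenceController.container.viewContext)
                .onContinueUserActivity(CSSearchableItemActionType) { activity in
                    // For NSCoreDataCoreSpotlightDelegate this is the managed
                    // object's URI representation as a string.
                    if let id = activity.userInfo?[CSSearchableItemActivityIdentifier] as? String {
                        print("Opened from Spotlight: \(id)")
                    }
                }
        }
    }
}
```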
3
0
189
Mar ’25