Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.

Can’t connect to App Store (iOS 15.8.3, iPhone SE 1st generation)
I need to update an app, but I can’t access the App Store; other apps appear to work normally. I also can’t connect the phone by cable to the PC. Both things work on my iPhone 8. No idea if the two problems are connected.
0 replies · 0 boosts · 389 views · Dec ’24

Calling Sign in with Apple fails with AKAuthenticationError Code=-7074
I created a command line tool application and then a dylib whose core function is to invoke Sign in with Apple. Following the Sign in with Apple requirements, I first generated a bundle ID, registered it in the developer portal, and enabled the Sign in with Apple capability for it. I then implemented the ASAuthorizationAppleIDRequest logic. At runtime, the console reports: [siwa] Authorization failed: Error Domain=AKAuthenticationError Code=-7074 "(null)". The same steps in an app target successfully bring up the Apple authorization login page. Comparing the two, I see three differences:
1. In Signing & Capabilities, an app target can add the Sign in with Apple capability, but a command line tool cannot.
2. The signing differs: the app's signature includes provisioning profile information (including the bundle ID), while the command line tool's does not.
3. The target types themselves differ: command line tool vs. app.
I don't know which of these causes the error.
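For reference, a minimal sketch of the standard request flow the post describes, as it would look in an app target (function and parameter names here are placeholders; this is the documented API, not a fix for the command line tool case, though the missing capability and provisioning profile are consistent with the differences listed above):

    import AuthenticationServices

    func startSignInWithApple(
        delegate: any ASAuthorizationControllerDelegate & ASAuthorizationControllerPresentationContextProviding
    ) {
        // Build the Apple ID request (the ASAuthorizationAppleIDRequest the post refers to).
        let provider = ASAuthorizationAppleIDProvider()
        let request = provider.createRequest()
        request.requestedScopes = [.fullName, .email]

        // The controller presents the Apple authorization sheet.
        let controller = ASAuthorizationController(authorizationRequests: [request])
        controller.delegate = delegate
        controller.presentationContextProvider = delegate
        controller.performRequests()
    }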
0 replies · 0 boosts · 278 views · Dec ’24

Female English (India) Siri voice pronounces certain words with the American pronunciation instead of the correct one
I just downloaded the public release of iOS 18.2. I found a bug when using the English (India) female Siri voice: it pronounces certain words using the American pronunciation instead of the pronunciation the voice is supposed to have. For example, the Siri voice uses the American pronunciation of the word "privacy". This is definitely a bug that needs to be resolved as soon as possible. Can you please fix this right away?
4 replies · 0 boosts · 395 views · Dec ’24

iOS 18 open settings URLs
A lot of apps use undocumented App-prefs URLs to help users get to the iOS Settings screen needed to set up the app. In iOS 18, it seems like these all stopped working. Here are the ones I currently use:
App-prefs:MESSAGES - broken in iOS 18; used for SMS Protection.
App-prefs:Phone - broken in iOS 18; used for Live Voicemail, Silence Unknown Callers, and SMS Reporting.
Some but not most paths have specific documented replacements. E.g. for Call Blocking & Identification you can use CXCallDirectoryManager.sharedInstance.openSettings(), and this still works in iOS 18 (see the sketch below). But I don't see any other direct replacements. Apple probably doesn't consider this a bug, but I filed FB14378568 anyway. I consider this an accessibility issue because many users who are older, less experienced, or have disabilities have trouble finding the right Settings screen from a textual description alone.
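For context, a minimal sketch of the one documented replacement the post mentions (the completion handler is optional):

    import CallKit

    // Opens the Call Blocking & Identification screen in Settings.
    // Per the post, this documented API still works in iOS 18,
    // unlike the App-prefs URLs above.
    CXCallDirectoryManager.sharedInstance.openSettings { error in
        if let error {
            print("Could not open settings: \(error)")
        }
    }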
11 replies · 5 boosts · 6.1k views · Jul ’24

Accessibility Localization Questions
For practice, I have implemented accessibility labels and announcements in a very simple test app (all SwiftUI, all iOS 18). The app is not localized; the default language is English. When running this on a German phone, odd things happen in the localization. My accessibility labels are read with an accent, but when they contain a URL, the dots are read as the German "Punkt" (with an English accent). When I provide the same English text as an accessibility announcement, it is read with a German voice. I also provide a Button with an "arrow.clockwise" image, and VoiceOver reads this in an English voice as "Refresh, Button". This is great and was to be expected. However, when the button is disabled, VoiceOver reads "Refresh, grau dargestellt, Button", all in an English voice. Is this an error? Am I doing it wrong? The video at the link should show the issue: https://share.icloud.com/photos/0757FJW2Q3fsA_cdhMX6ls46Q
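A minimal sketch of the two patterns being described, assuming an iOS 17+ target (view name, label text, and URL are placeholders; AccessibilityNotification lives in the Accessibility framework):

    import SwiftUI
    import Accessibility

    struct RefreshView: View {
        var body: some View {
            Button {
                // Announcements are spoken by the system voice; the post observes
                // a German voice reading this English text on a German phone.
                AccessibilityNotification.Announcement("Loaded example.com").post()
            } label: {
                Image(systemName: "arrow.clockwise")
            }
            // VoiceOver reads this label; the post observes "Punkt" for the
            // dot when the label contains a URL such as "example.com".
            .accessibilityLabel("Refresh example.com")
        }
    }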
2 replies · 0 boosts · 705 views · Dec ’24

macOS avoid VoiceOver audio ducking for certain sounds
Is it possible to play certain sounds from a macOS app that remain at full volume while VoiceOver is speaking? Here's some background: I want to play sounds from my app (even when it's not in focus) to notify a VoiceOver user that an action is available (this action can be triggered even when the app is not in focus, and is configurable by the user). I tried using an NSSound for this, but VoiceOver ducks my sound's audio while it is speaking. Is there some way to exempt certain sounds from audio ducking? Or is there another, perhaps lower-level, audio API that I can use to achieve this?
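A minimal sketch of the NSSound approach the post says it tried ("Ping" is a built-in macOS sound used here as a placeholder; this reproduces the ducking problem rather than solving it):

    import AppKit

    // Plays a system sound at full volume; VoiceOver nonetheless
    // ducks this output while it is speaking.
    if let sound = NSSound(named: "Ping") {
        sound.volume = 1.0
        sound.play()
    }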
1 reply · 0 boosts · 371 views · Dec ’24

Fullscreen API web standard is unacceptably missing on iPhone
It is outrageous that Apple continues to fail to implement the Fullscreen API web standard for web apps on iPhone only, a standard that is important to accessibility and web app functionality. The only plausible reason for this block is commercial: to promote iOS apps over browser-based web apps.

To quote a client from a major agency just now, a typical enquiry: We value accessibility greatly, and we noticed that the embedded player is missing a full screen button on iPhone. Everything else works perfectly fine, including a full screen button that appears on the mobile webpage on Android devices. Is there any way we can include a button to enable full screen view in your player for our viewers who are going to watch it on iOS devices?

To which, as usual, we have to reply: Apple unfortunately blocks fullscreen mode from being used by all web applications on iPhone. Apple allows it on MacBooks and iPads, but currently not on iPhone, so we have to hide the fullscreen button there. Fullscreen therefore works on all devices and browsers apart from iPhone. As you've seen with Android, all other devices and browsers follow the universal Fullscreen API web standard to allow full screen. You're probably familiar with seeing the fullscreen button on normal linear videos on iPhone. Those use Apple's native video player, which doesn't let buttons and scripts be used on top of it: just a single video, not an interactive web application. Our player looks like a video player, but it is actually a web app combining multiple different video clips connected together by code and styling. Apple blocks it on iPhones for reasons known only to them, but the assumption is that it is to incentivise people to make iOS apps instead of web apps. The web development community is hopeful that Apple will change this unfortunate restriction soon, but we have been waiting a long time in vain.

We have to send this to a lot of people. It's a very bad look for Apple. In less than a month it will be 2025; we have been waiting years for this. The web standard documentation showing universal support on other devices and browsers is here: https://developer.mozilla.org/en-US/docs/Web/API/Fullscreen_API

This is not acceptable. It is time for Apple to stop blocking this important accessibility web standard for commercial reasons, only on iPhone. To whoever is in charge of these decisions in the Safari/WebKit team: please enable the Fullscreen API for web apps on iPhone as soon as possible.
3 replies · 2 boosts · 509 views · Dec ’24

Xcode 16.1 broken accessibilityLabel
Accessibility broke after updating to Xcode 16.1. There is a call to accessibilityLabel that sets an a11y label on a view's title. This used to work (pronounced by VoiceOver) with Xcode 15.4 + iOS 17.5. With Xcode 16.1 + iOS 18.1, on a physical device or the iOS Simulator, Accessibility Inspector shows no a11y label set. I tried Xcode 16.2 beta 3 with the same result: accessibilityLabel does not work; the a11y label is not set.
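A minimal sketch of the pattern being described, assuming a UIKit title label (the post does not say whether the app is UIKit or SwiftUI; names and strings are placeholders):

    import UIKit

    // Sets an accessibility label on the view's title.
    // Per the post, VoiceOver picks this up under Xcode 15.4 / iOS 17.5,
    // but Accessibility Inspector shows no label under Xcode 16.1 / iOS 18.1.
    let titleLabel = UILabel()
    titleLabel.text = "Settings"
    titleLabel.accessibilityLabel = "Settings"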
1 reply · 0 boosts · 406 views · Dec ’24

iOS 18 - Link to VoiceOver & Display settings
We have an app with a large audience (around 2.1M DAUs), and because of this we build it with accessibility first in mind. In that app, we link to specific iOS accessibility settings (such as VoiceOver, Display & Text, etc.) in our menu screens, to offer the user a shortcut to customize VoiceOver behaviour, text size, and so on. Unfortunately, since iOS 18 these links no longer work: they all open the Settings app but don't navigate any further. Support tells us users rely on these links to reach the settings easily, mostly older people trained to go this way in computer courses. We used to open the Settings app through the App-prefs scheme, e.g. App-prefs:root=ACCESSIBILITY&path=VOICEOVER_TITLE, but that seems broken in iOS 18. I know about the AccessibilitySettings API, but it seems limited to one specific feature. Is there a way we can get these links to work again?
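For comparison, the only documented general-purpose settings link still works; a minimal sketch (note it opens the app's own page in Settings, not the VoiceOver or Display & Text screens the post needs):

    import UIKit

    // Opens this app's page in the Settings app. Documented and stable,
    // but it cannot deep-link to system accessibility screens.
    if let url = URL(string: UIApplication.openSettingsURLString) {
        UIApplication.shared.open(url)
    }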
1 reply · 1 boost · 286 views · Dec ’24

German VoiceOver says "millibars" instead of "megabytes"
In SwiftUI (iOS 18.1.1, Xcode 16.1), the following control: Text(12345678, format: .byteCount(style: .binary)) displays text with an MB (megabytes) unit, but German VoiceOver reads it as "millibars". I tried explicitly specifying the units with: Text(12345678, format: .byteCount(style: .memory, allowedUnits: .mb)) but the result is the same (German VoiceOver still says "millibars"). Aside from creating my own accessibility label, is there any way to work around this?
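The workaround the post would rather avoid (an explicit accessibility label) might look like this minimal sketch; the view name and the spelled-out German label text are assumptions:

    import SwiftUI

    struct FileSizeView: View {
        let bytes: Int64 = 12_345_678
        var body: some View {
            // Visually identical to the original, but VoiceOver now speaks
            // the explicit label instead of mis-expanding "MB" as "millibars".
            Text(bytes, format: .byteCount(style: .memory))
                .accessibilityLabel("11,8 Megabyte") // hypothetical spelled-out wording
        }
    }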
3 replies · 0 boosts · 383 views · Nov ’24

iPhone 15 lags
Hello. Basically, when I play a game and get a notification, it lags badly, down to maybe 20-30 fps, and stutters; once the notification is gone it works normally. Also, sometimes when I open Control Center the animation is slow and laggy. I bought this phone about a week ago and this makes me sad :(.
1 reply · 1 boost · 289 views · Dec ’24

AVSpeechSynthesisMarker - iOS 18 - Sync lost
Hello, in an AVSpeechSynthesisProviderAudioUnit, sending word positions to the host using AVSpeechSynthesisMarker / AVSpeechSynthesisMarker.Mark.word seems to be broken on iOS 18. On the app/client side all the events are received immediately, whereas they should arrive synchronized with the audio. The exact same code works perfectly on iOS 17.

In the AVSpeechSynthesisProviderAudioUnit, the AVSpeechSynthesisMarker values are appended with the correct position/sample offset:

    let wordPos = NSMakeRange(characterRange.location, characterRange.length)
    let marker = AVSpeechSynthesisMarker(markerType: AVSpeechSynthesisMarker.Mark.word, forTextRange: wordPos, atByteSampleOffset: byteSampleOffset)
    // also tried with
    // let marker = AVSpeechSynthesisMarker(wordRange: wordPos, atByteSampleOffset: byteSampleOffset)
    markerArray.append(marker)
    print("word : pos \(characterRange) - offset \(byteSampleOffset)")
    // send events to host
    speechSynthesisOutputMetadataBlock?(markerArray, self.request!)

    word : pos {7, 7} - offset 2208
    word : pos {15, 8} - offset 37612
    word : pos {24, 6} - offset 80368
    word : pos {31, 3} - offset 118652
    word : pos {35, 2} - offset 128796
    ...

But on the client side they are all received at the same time (at the beginning of speech), whereas on iOS 17 they arrive in sync with the audio:

    func speechSynthesizer(_ synthesizer: AVSpeechSynthesizer, willSpeakRangeOfSpeechString characterRange: NSRange, utterance: AVSpeechUtterance) {
        print("characterRange : \(characterRange)")
    }

Using the Apple voice/engine works, so there is obviously something to change, but the documentation for AVSpeechSynthesisProviderAudioUnit / AVSpeechSynthesisMarker seems unchanged. Thanks in advance.
2 replies · 0 boosts · 380 views · Nov ’24

Do Rotors add more Complexity to VoiceOver?
This may sound like a bit of an odd question, but this is what I was told this morning by one of our accessibility managers.

This past June at WWDC, I scheduled a lab session with Apple's accessibility folks for a review. I had the pleasure of working with Ryan, who helped give the great VoiceOver testing talk from WWDC 2018. I believe I've worked with him before in the labs, but regardless, no matter who I meet with in the Accessibility Labs, they always provide me with some new nugget of information, no matter how well versed I might think I am in accessibility. After the labs, I made all the changes that Ryan suggested and also told the other developers on my team what I was taught.

In our app we provide various forms, and each field component that appears in a form has a header text to which we apply a header trait. This allows the Headings rotor to quickly navigate between all the questions in the form, say if a user wants to return to a previous field. I even suggested we take the time to provide a custom rotor that would allow users to navigate to fields in an error state: if the user submits the form and validation finds one or more fields in error, we should have a rotor that lets the user jump directly to those fields, since they may not be able to see the red text / red outlines on them.

This morning, I was told that I needed to undo that, and that our headerLabel properties should not be marked with the UIAccessibilityTrait.header trait. When I stated that the header trait makes navigation of the form much easier via the Headings rotor, the accessibility manager told me this is not the case. I have the MS Teams transcript in front of me, which reads as follows (give or take a few transcript errors): So I went ahead and I just double checked with two of my friends, who are blind, and for them on their end, they both said that they would not actually use that, and it could add more complexity, because in addition to being blind there are also mobility limitations. So they actually can't even use the rotor at all. They can only use the swipes.

Does this make sense to anyone? Because it doesn't to me. Thoughts on this?
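For reference, the custom error rotor idea from the post might look like this minimal UIKit sketch (the field list, names, and attachment point are hypothetical):

    import UIKit

    // A custom "Errors" rotor that cycles through only the fields
    // currently failing validation.
    func makeErrorRotor(errorFields: [UIView]) -> UIAccessibilityCustomRotor {
        UIAccessibilityCustomRotor(name: "Errors") { predicate in
            guard !errorFields.isEmpty else { return nil }
            let current = predicate.currentItem.targetElement as? UIView
            let index = current.flatMap { errorFields.firstIndex(of: $0) }
            let next: Int
            switch predicate.searchDirection {
            case .next:
                next = ((index ?? -1) + 1) % errorFields.count
            case .previous:
                next = ((index ?? 1) - 1 + errorFields.count) % errorFields.count
            @unknown default:
                next = 0
            }
            return UIAccessibilityCustomRotorItemResult(targetElement: errorFields[next], targetRange: nil)
        }
    }

    // Attach to the form's container view after validation, e.g.:
    // formView.accessibilityCustomRotors = [makeErrorRotor(errorFields: invalidFields)]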
2 replies · 0 boosts · 325 views · Nov ’24

Mongolian Spellchecking Support Across Apple Ecosystem
As a Mongolian user, I’ve observed that the Apple ecosystem (macOS, iOS, iPadOS) currently lacks native spellchecking support for the Mongolian language in Cyrillic script. This absence poses significant challenges for users who rely on Apple devices for communication, education, and professional work in Mongolian. Could you share if there are any plans or roadmaps to address this gap? Additionally, I’m eager to contribute ideas, resources, or insights to help make Mongolian language support more accessible within the Apple ecosystem. If there are any guidelines or steps I could take to advocate for or help implement this feature, I’d greatly appreciate your guidance.
1 reply · 0 boosts · 276 views · Nov ’24

Major unnoticed issues with the Photos app
I am an artist (singer-songwriter) and I use the Photos app to manage albums related to my various creative projects. These are some big issues that I am surprised were never accounted for, or maybe were overlooked:
Missing search bar when adding photos to albums: why is there no search bar when adding a photo to one of hundreds of albums? (Artists like me like to organise things into different albums and folders.) Since the iOS 18 update I can no longer search for albums by name, which was previously very helpful in quickly locating them.
Albums can be arranged and moved within the same folder, but there is no way to move albums between different folders; the only way is to create a new album in the target folder, select and transfer everything, and delete the old album.
1 reply · 0 boosts · 425 views · Dec ’24

VisionOS - Enhancing Accessibility for Individuals with Visual Impairments
Hello, I am reaching out because I believe your product, the Vision Pro, could significantly improve the quality of life for individuals with visual impairments, and I thought my personal experience might be of interest to you. We could discuss this in more detail, but to respect your time, I’ll get straight to the point: I have retinitis pigmentosa, a rare retinal disease for which there is currently no treatment. This condition causes a progressive narrowing of the visual field (potentially leading to blindness) and a deficit in photoreceptors (let’s just say I’m not exactly a night owl). In my case, it has become impossible to go out alone in the dark or even see in dim light. (Goodbye evening parties—I can’t even find the entrance to a nightclub, let alone navigate the dance floor!). However, I’ve discovered that sometimes, simply looking through my phone screen and using its brightness helps me see much better. Over the years, I’ve imagined how amazing it would be if a pair of glasses could simply display the image my eyes are supposed to perceive, but with enhanced brightness. It would allow me to live my life as freely as others, whether that’s venturing out at night or finding that elusive pen lost in the depths of my apartment. I initially looked into the Google Glass project, for example, but it pales in comparison to what Apple is now creating, don’t you think? What amuses me most is that what some see as a tool that isolates users from reality could actually become an inclusion device for people like me, who would use it to go out and engage with the world. (I can’t count how many times I’ve gone home early in winter because of the anxiety caused by the early darkness, or turned down after-work gatherings with my DevOps colleagues.) The Vision Pro could simply restore reality for us by enhancing what has been progressively lost. And that’s just for nighttime! I can only imagine how helpful it could be during the day—for instance, by detecting obstacles or highlighting dangerous zones in a person’s limited field of vision. One could even use OCR technology to map the results of a visual field test and provide tailored assistance. What incredible potential… I dream of a day when ideas like these become a reality, and I wanted to share them with you. This wouldn’t just help me—it could help many others as well. Thank you for taking the time to read this message. I would be delighted to contribute in any way, should these development directions resonate with you now or in the future. Wishing you an excellent evening, Hugo Bled
1 reply · 0 boosts · 394 views · Nov ’24