Explore best practices for creating inclusive apps that cater to users with diverse abilities

MAJOR UNNOTICED Issues with the Photos app
I am an artist (singer-songwriter) and I use the Photos app to manage albums related to my various creative projects. These are some BIG issues that I am SURPRISED have never come to attention, or maybe were overlooked:

Missing search bar when adding photos to albums: why is there no search bar when adding a photo to one of hundreds of albums? (Artists like me like to organise things into different albums and folders.) Since the iOS 18 update I can no longer search for albums by name, which was previously very helpful for quickly locating them.

No way to move albums between folders: albums can be arranged and moved within the same folder, but there is no way to move an album to a DIFFERENT folder. The only way is to create a new album in that folder, select and transfer everything, and delete the old album.
Replies: 1 · Boosts: 0 · Views: 533 · Dec ’24
visionOS - Enhancing Accessibility for Individuals with Visual Impairments
Hello, I am reaching out because I believe your product, the Vision Pro, could significantly improve the quality of life for individuals with visual impairments, and I thought my personal experience might be of interest to you. We could discuss this in more detail, but to respect your time, I’ll get straight to the point: I have retinitis pigmentosa, a rare retinal disease for which there is currently no treatment. This condition causes a progressive narrowing of the visual field (potentially leading to blindness) and a deficit in photoreceptors (let’s just say I’m not exactly a night owl). In my case, it has become impossible to go out alone in the dark or even see in dim light. (Goodbye evening parties—I can’t even find the entrance to a nightclub, let alone navigate the dance floor!)

However, I’ve discovered that sometimes, simply looking through my phone screen and using its brightness helps me see much better. Over the years, I’ve imagined how amazing it would be if a pair of glasses could simply display the image my eyes are supposed to perceive, but with enhanced brightness. It would allow me to live my life as freely as others, whether that’s venturing out at night or finding that elusive pen lost in the depths of my apartment. I initially looked into the Google Glass project, for example, but it pales in comparison to what Apple is now creating, don’t you think?

What amuses me most is that what some see as a tool that isolates users from reality could actually become an inclusion device for people like me, who would use it to go out and engage with the world. (I can’t count how many times I’ve gone home early in winter because of the anxiety caused by the early darkness, or turned down after-work gatherings with my DevOps colleagues.) The Vision Pro could simply restore reality for us by enhancing what has been progressively lost.

And that’s just for nighttime! I can only imagine how helpful it could be during the day—for instance, by detecting obstacles or highlighting dangerous zones in a person’s limited field of vision. One could even use OCR technology to map the results of a visual field test and provide tailored assistance. What incredible potential… I dream of a day when ideas like these become a reality, and I wanted to share them with you. This wouldn’t just help me—it could help many others as well.

Thank you for taking the time to read this message. I would be delighted to contribute in any way, should these development directions resonate with you now or in the future.

Wishing you an excellent evening,
Hugo Bled
Replies: 1 · Boosts: 0 · Views: 460 · Nov ’24
German VoiceOver says "millibars" instead of "megabytes"
In SwiftUI on iOS 18.1.1 with Xcode 16.1, the following control:

```swift
Text(12345678, format: .byteCount(style: .binary))
```

displays text with an MB (megabytes) unit, but German VoiceOver reads it as "millibars". I tried explicitly specifying the units with:

```swift
Text(12345678, format: .byteCount(style: .memory, allowedUnits: .mb))
```

but the result is the same (German VoiceOver still says "millibars"). Aside from creating my own accessibility label, is there any way to work around that?
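For reference, a minimal sketch of the explicit accessibility-label workaround the poster mentions (the type and property names here are illustrative, and the spelled-out German unit wording is an assumption, not Apple's localized string):

```swift
import SwiftUI

struct FileSizeText: View {
    let byteCount: Int64   // hypothetical property

    var body: some View {
        Text(byteCount, format: .byteCount(style: .memory))
            // Hand VoiceOver a spelled-out unit so the "MB" abbreviation
            // cannot be misread as "millibars".
            .accessibilityLabel(String(format: "%.1f Megabyte",
                                       Double(byteCount) / 1_048_576))
    }
}
```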
Replies: 3 · Boosts: 0 · Views: 477 · Nov ’24
Do Rotors add more Complexity to VoiceOver?
This may sound like a bit of an odd question, but this is what I was told this morning by one of our accessibility managers. This past June at WWDC, I scheduled a lab session with Apple's accessibility folks for a review. I had the pleasure of working with Ryan, who helped give the great VoiceOver testing talk from WWDC 2018. I believe I've worked with him before in the labs, but regardless, no matter who I meet with in the Accessibility Labs, they always provide me with some new nugget of information, no matter how well versed I might think I am in accessibility.

After the labs, I made all the changes that Ryan suggested and also told other developers on my team what I was taught. In our app we provide various forms, and each field component that appears in the form has a header text to which we apply a header trait. This allows the Headings rotor to be used to quickly navigate between all the questions in the form, say if a user wants to return to a previous field. I even suggested we take the time to provide a custom rotor that would allow users to navigate to fields in an error state: if the user submits the form and the responses are validated, and one or more fields are in error, we should have a rotor that lets the user navigate directly to those fields. They may not be able to see the red text / red outlines on those fields.

This morning, I was told that I needed to undo that, and that our headerLabel properties should not be marked with the UIAccessibilityTrait.header trait. When I stated that it makes navigation of the form much easier via the Headings rotor, I was told by the accessibility manager this is not the case. I have the MS Teams transcript in front of me, which reads as follows (give or take a few transcript errors):

"So I went ahead and I just double checked with two of my friends, who are blind, and for them on their end, they both said that they would not actually use that, and it could add more complexity, because in addition to being blind, they also have mobility limitations. So they actually can't even use the rotor at all. They can only use the swipes."

Does this make sense to anyone? Because it doesn't to me. Thoughts on this?
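As an aside for readers, here is a minimal sketch of the error-field rotor idea described above, assuming a UIKit form where `errorFields` is a hypothetical array of the views currently failing validation:

```swift
import UIKit

// Build a custom rotor that lets VoiceOver users jump between fields in
// an error state. This sketch wraps around at either end (a design choice).
func makeErrorRotor(errorFields: [UIView]) -> UIAccessibilityCustomRotor {
    UIAccessibilityCustomRotor(name: "Errors") { predicate in
        guard !errorFields.isEmpty else { return nil }

        // Find where we are now, then step forward or back through the list.
        let current = predicate.currentItem.targetElement as? UIView
        let currentIndex = current.flatMap { errorFields.firstIndex(of: $0) }
        let next: Int
        switch predicate.searchDirection {
        case .next:
            next = ((currentIndex ?? -1) + 1) % errorFields.count
        case .previous:
            next = ((currentIndex ?? errorFields.count) - 1 + errorFields.count)
                   % errorFields.count
        @unknown default:
            return nil
        }
        return UIAccessibilityCustomRotorItemResult(targetElement: errorFields[next],
                                                    targetRange: nil)
    }
}

// Usage (hypothetical): on the form's view controller,
// view.accessibilityCustomRotors = [makeErrorRotor(errorFields: invalidFields)]
```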
Replies: 2 · Boosts: 0 · Views: 363 · Nov ’24
Mongolian Spellchecking Support Across Apple Ecosystem
As a Mongolian user, I’ve observed that the Apple ecosystem (macOS, iOS, iPadOS) currently lacks native spellchecking support for the Mongolian language in Cyrillic script. This absence poses significant challenges for users who rely on Apple devices for communication, education, and professional work in Mongolian. Could you share if there are any plans or roadmaps to address this gap? Additionally, I’m eager to contribute ideas, resources, or insights to help make Mongolian language support more accessible within the Apple ecosystem. If there are any guidelines or steps I could take to advocate for or help implement this feature, I’d greatly appreciate your guidance.
Replies: 1 · Boosts: 0 · Views: 307 · Nov ’24
Find My
My husband and I have the same iPhones. We both have location sharing on. When he uses Find My, he can see my location. He has shared his location with me, but my phone always says “No location found.” We have the exact same settings on our phones and have followed the instructions to use Find My. Is there something wrong with my phone, since I cannot see his location? I have no trouble seeing the location of another family member. Or is something wrong with my husband’s phone? This is so frustrating.
Replies: 1 · Boosts: 0 · Views: 426 · Nov ’24
tvOS Accessibility: How to enable accessibility focus on static text and custom views
Hi guys, I'm trying to add accessibility labels to static text and custom SwiftUI views. Example:

```swift
MyView { ... }
    //.accessibilityElement()
    .accessibilityElement(children: .combine)
    //.accessibilityRemoveTraits(.isStaticText)
    //.accessibilityAddTraits(.isButton)
    .accessibilityLabel("ACCESSIBILITY LABEL")
    .accessibilityHint("ACCESSIBILITY HINT")
```

When using the VoiceOver or Hover Text accessibility features, focus moves only between active elements, not static elements. When I add .focusable() it works, but I don't want to make those elements focusable when all accessibility features are off. I suppose I could do something like this:

```swift
.focusable(UIApplication.shared.accessibility.voiceOver.isOn || UIApplication.shared.accessibility.hoverText.isOn)
```

Note: this is just pseudocode, because I don't remember exactly how to detect the current accessibility settings. However, using focusable() with conditions on hundreds of static texts in an app seems like overkill. The accessibility focus is also needed on some control containers where we already have somewhat complex focus handling, with conditions in focusable(...) on parent and child elements, so extending it for accessibility seems too complicated.

Is there a simple way to tell accessibility that an element is focusable specifically for Hover Text and for VoiceOver? An example of what I want to accomplish for TV content (again pseudocode):

```swift
VStack {
    HStack {
        Text("Terminator")
        if parentalLock {
            Image(named: .lock)
        }
    }
    .accessibilityLabel(for: hover, "Terminator - parental lock")
    Text("Sci-Fi * 8pm - 10pm * Remaining 40 min. * Live")
        .accessibilityLabel(for: hover, "Sci-Fi, 8 to 10pm, Remaining 40 min. Broadcasting Live")
}
.accessibilityLabel(for: voiceover, "Terminator, Sci-Fi, 8 to 10pm, Remaining 40 min. Broadcasting Live, parental lock")
```

I watched all the accessibility WWDC videos from 2016, 2022, and 2024 and googled for several hours, but I couldn't find any solution for static texts and custom views. From those videos it appears .accessibilityLabel() should be enough, but it clearly works only on active elements and does not work for other SwiftUI views on tvOS without focusable(). Can this be done without using focusable() with conditions to detect which accessibility feature is on? The problem with focusable() is that for accessibility I may need to read a text for a parent view, but focus needs to be placed on a child element. I remember problems when focusable() was set on a parent view such that the child was not focusable - simply put: complications in focus logic. Thanks.
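For what it's worth, a minimal sketch of the conditional-focus idea using a real SwiftUI environment key (this covers VoiceOver only; I'm not aware of a public key for Hover Text, so that part remains an open question):

```swift
import SwiftUI

// Make a static row a focus target only while VoiceOver is running, so
// normal tvOS focus behavior is untouched otherwise. The row content
// and combined label below are illustrative.
struct ProgramRow: View {
    @Environment(\.accessibilityVoiceOverEnabled) private var voiceOverOn

    var body: some View {
        HStack {
            Text("Terminator")
            Text("Sci-Fi * 8pm - 10pm * Remaining 40 min. * Live")
        }
        .accessibilityElement(children: .combine)
        .accessibilityLabel("Terminator, Sci-Fi, 8 to 10pm, remaining 40 minutes, broadcasting live")
        // Focusable only when VoiceOver is on.
        .focusable(voiceOverOn)
    }
}
```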
Replies: 1 · Boosts: 0 · Views: 638 · Nov ’24
Using Voice Control to dictate into an application without fields
Hi, I'm a new Mac user, having been a long-time PC user and software developer. I also have a mobility impairment that has led me to try to use Voice Control as a replacement for Dragon NaturallySpeaking on my PC. I have been trying to use Parallels with a Windows 11 VM and Dragon for my remote work, but that seems to have broken when I downloaded the latest macOS beta. Ideally I'd like to use Voice Control over a VPN/Remote Desktop connection or, in a pinch, Chrome Remote Desktop.

The problem I'm running into is that macOS does not seem to recognize that I am in a text field or other control when I am in the remote application. I have a utility in Windows that allows me to voice-type into an application window even if the cursor is not over a control, but I can't seem to figure out a way to do that in macOS. Is there a way to do what I want to do? Is there a more capable voice recognition package for macOS? I am running Sequoia 15.2 beta 3 at the moment.
Replies: 0 · Boosts: 0 · Views: 384 · Nov ’24
AVSpeechSynthesisMarker - iOS 18 - Sync lost
Hello, in an AVSpeechSynthesisProviderAudioUnit, sending word positions to the host using AVSpeechSynthesisMarker / AVSpeechSynthesisMarker.Mark.word seems to be broken on iOS 18. On the app/client side all the events are received immediately, whereas they should be received in sync with the audio. The exact same code works perfectly on iOS 17.

In the AVSpeechSynthesisProviderAudioUnit, the AVSpeechSynthesisMarkers are appended with the correct position/sample offset:

```swift
let wordPos = NSMakeRange(characterRange.location, characterRange.length)
let marker = AVSpeechSynthesisMarker(markerType: AVSpeechSynthesisMarker.Mark.word,
                                     forTextRange: wordPos,
                                     atByteSampleOffset: byteSampleOffset)
// also tried with
// let marker = AVSpeechSynthesisMarker(wordRange: wordPos, atByteSampleOffset: byteSampleOffset)
markerArray.append(marker)
print("word : pos \(characterRange) - offset \(byteSampleOffset)")

// send events to host
speechSynthesisOutputMetadataBlock?(markerArray, self.request!)
```

```
word : pos {7, 7} - offset 2208
word : pos {15, 8} - offset 37612
word : pos {24, 6} - offset 80368
word : pos {31, 3} - offset 118652
word : pos {35, 2} - offset 128796
...
```

But on the client side they are all received at the same time (at the beginning of speech), whereas on iOS 17 they arrive in sync with the audio:

```swift
func speechSynthesizer(_ synthesizer: AVSpeechSynthesizer,
                       willSpeakRangeOfSpeechString characterRange: NSRange,
                       utterance: AVSpeechUtterance) {
    print("characterRange : \(characterRange)")
}
```

Using an Apple voice/engine works, so there is obviously something to change, but the documentation of AVSpeechSynthesisProviderAudioUnit / AVSpeechSynthesisMarker seems unchanged. Thanks in advance.
Replies: 2 · Boosts: 0 · Views: 456 · Nov ’24
Select input language with ctrl-space stopped working
I have two iPad Pros:

iPad Pro (12.9-inch) (3rd generation)
iPad Pro 13-inch (M4)

both with their Apple Magic Keyboard and trackpad. With iPadOS 17 I could press Ctrl-Space to jump between input languages. Now, with 18.1.1 and 18.2 beta, this is broken.

On the old iPad, the languages used to show EN, then DE. Now it shows EN DE, EN DE. I went to Settings, and the keyboards were shown as having English and German input for two physical keyboards. I deleted and recreated them; now each keyboard has only one language and Ctrl-Space alternates between EN and DE.

On the new iPad Pro, two keyboards are already set with a single input language each, but Ctrl-Space does not change the input; it types a space instead. There does not seem to be any way to change the input language using the keyboard.
Replies: 1 · Boosts: 1 · Views: 564 · Nov ’24
Accessibility issue verification
I am writing an email to a software engineer at Starbucks. In it, I want to make him aware of a VoiceOver accessibility issue whose cause I think I know, but I want to verify it here. The issue is that nothing happens as it should after entering text into an edit field and then pressing Enter. What should happen on pressing Enter is that the next page is displayed; however, it is not. Am I right in my guess that the developer has the page hidden? If not, what could it be? Please provide code with comments to fix this issue.
Replies: 0 · Boosts: 0 · Views: 253 · Nov ’24
Swift UI Chart Accessibility issue
Hi team, we are integrating SwiftUI Charts' BarMark. The UI looks good, but when we try setting up custom ADA (accessibility) labels, it doesn't reflect/override the accessibility label/value we set manually. Is this an iOS defect, or is there a workaround? Thanks in advance. Sample:

```swift
Chart(data) {
    BarMark(
        x: .value("Category", $0.department),
        y: .value("Profit", $0.profit)
    )
    .foregroundStyle(by: .value("Product Category", $0.productCategory))
    .accessibilityIdentifier("BarMark")
    .accessibilityLabel("Dep: \($0.department)")
    .accessibilityValue("Profit: \($0.profit) Category: \($0.productCategory)")
}
```
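One possible direction, sketched under assumptions (the `Item` model is hypothetical, and replacing the chart's auto-generated accessibility tree with `accessibilityChildren(children:)` is a workaround idea, not a confirmed fix):

```swift
import SwiftUI
import Charts

struct Item: Identifiable {
    let id = UUID()
    let department: String
    let profit: Double
}

struct ProfitChart: View {
    let data: [Item]

    var body: some View {
        Chart(data) { item in
            BarMark(x: .value("Category", item.department),
                    y: .value("Profit", item.profit))
        }
        // Replace the chart's generated accessibility elements with explicit
        // ones, so the custom labels/values are what VoiceOver reads.
        .accessibilityChildren {
            HStack {
                ForEach(data) { item in
                    Rectangle()
                        .accessibilityLabel("Dep: \(item.department)")
                        .accessibilityValue("Profit: \(item.profit)")
                }
            }
        }
    }
}
```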
Replies: 1 · Boosts: 2 · Views: 547 · Nov ’24
Could not use 'Keyboard Navigation' to navigate to inline navigation title on iOS 18 & iOS 17
Hello everyone, I'm experiencing an issue with the accessibility feature "Keyboard Navigation" in iOS 18.0. Specifically, when enabling the "Allow Full Keyboard Access" option and attempting to navigate to inline navigation titles within stock iOS apps, the navigation doesn't work as expected. Here's how to replicate the issue:

1. Enable the accessibility option Allow Full Keyboard Access (Settings > Accessibility > Keyboards > Full Keyboard Access).
2. Open any stock iOS app that uses an inline navigation style (for example, the Mail or Settings app).
3. Press the Tab key to cycle through items on the inline navigation bar.

In iOS 18.0, pressing the Tab key does not allow navigation to the inline navigation title, which was previously possible in iOS 16. The issue is specific to iOS 18 and iOS 17; it worked fine on the earlier version (iOS 16). Has anyone else encountered this issue, or does anyone have suggestions for a workaround? Would love to hear your thoughts.
Replies: 1 · Boosts: 1 · Views: 405 · Nov ’24
Reply to 4.3 Design: Spam????
Hello, we are a software development company in China. The underlying framework of our application is based on the XMPP protocol framework from the American open-source community. It is an international open-source project, and many Chinese software companies build their UI and features on top of this open-source framework. When we upload our app to the App Store, however, we receive a rejection (4.3(a) application spam), because other companies uploading to the App Store build on the same underlying open-source framework. China currently does not have the capability to develop this framework independently, so we have to acknowledge that our app uses the American open-source community's framework, as many other Chinese companies do. Our company has spent a great deal of effort and money on this app, and if apps built on this open-source framework are rejected, many Chinese companies will be unable to use it as a basis for development in the future. We kindly ask your company to provide guidance for small businesses in China and advice on our future development. Thank you.
Replies: 0 · Boosts: 0 · Views: 295 · Nov ’24
iOS 18.1 Deeplink to Wallpaper settings
Prior to iOS 18.1, App-prefs:Wallpaper deep-linked to the Wallpaper settings. On iOS 18.1, it now deep-links to the Settings app. Is there a new URL to deep-link to the Wallpaper settings? The following URLs have been tested and do not work on iOS 18.1:

App-prefs:Wallpaper
App-prefs:wallpaper
App-prefs:root=wallpaper
App-prefs:root=Wallpaper
App-prefs:root=WALLPAPER
App-prefs:root=General&path=Wallpaper
prefs:root=Wallpaper
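For context, a sketch of how such candidates are typically probed (with the caveat that the App-prefs scheme is private and undocumented, so its behavior can change between releases and its use may be rejected in App Review):

```swift
import UIKit

// Try each candidate in order. Note that canOpenURL for a custom scheme
// also requires listing it under LSApplicationQueriesSchemes in Info.plist.
func openWallpaperSettings() {
    let candidates = [
        "App-prefs:Wallpaper",
        "prefs:root=Wallpaper",
    ]
    for raw in candidates {
        if let url = URL(string: raw), UIApplication.shared.canOpenURL(url) {
            UIApplication.shared.open(url)
            return
        }
    }
}
```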
Replies: 1 · Boosts: 6 · Views: 597 · Nov ’24
The screen of the watchOS app automatically turns off and pauses operation.
I developed a watchOS app to capture gyro data, save it in real time as a CSV file, and send it to an iOS app. However, when I start writing data with the watch on, the screen dims and the app stops working; it only resumes when I tap the screen again. Is there a way to let it run in the background and transmit files in real time, even when the screen is off?
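A possible direction, sketched under assumptions: background sensor collection on watchOS generally requires either an HKWorkoutSession or a WKExtendedRuntimeSession, and eligibility depends on the app's declared session type. A minimal extended-runtime sketch:

```swift
import WatchKit

// Keep the app running past screen dimming with an extended runtime session.
// Whether this applies depends on the app's session type (e.g. a
// self-care or physical-therapy session declared in the target settings).
final class SessionKeeper: NSObject, WKExtendedRuntimeSessionDelegate {
    private var session: WKExtendedRuntimeSession?

    func start() {
        let s = WKExtendedRuntimeSession()
        s.delegate = self
        s.start()   // keeps the app alive while the session is valid
        session = s
    }

    func extendedRuntimeSessionDidStart(_ extendedRuntimeSession: WKExtendedRuntimeSession) {}
    func extendedRuntimeSessionWillExpire(_ extendedRuntimeSession: WKExtendedRuntimeSession) {}
    func extendedRuntimeSession(_ extendedRuntimeSession: WKExtendedRuntimeSession,
                                didInvalidateWith reason: WKExtendedRuntimeSessionInvalidationReason,
                                error: Error?) {}
}
```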
Replies: 0 · Boosts: 1 · Views: 560 · Nov ’24
VoiceOver: Detect Languages
My app does not automatically switch languages (voices) in VoiceOver when VoiceOver is on and the screen includes both English and Spanish content. Instead of switching to the correctly accented voice, the content is announced in whatever my manual Voices rotor setting is. I can manually switch the voice in the rotor to make words sound intelligible, but my main concern is that language changes are not auto-detected even though that feature is on in my Settings. VoiceOver does detect language changes in other apps, so I think there must be either misplaced or missing accessibilityLanguage strings somewhere in my app. Or is it more than that, for localization considerations?

I reached out to the Apple Accessibility team and was directed to open a ticket here, as my question is about the underlying code. I am a novice developer and primarily an accessibility SME; I expect that when "Detect Languages" is on in the user's VoiceOver settings, the screen reader's speech output will automatically switch to the correct language/accent. I recognize there is a problem but am not sure where the breakdown is. I would like guidance on how to fix it, to relay to my teams.

https://developer.apple.com/documentation/objectivec/nsobject/1615192-accessibilitylanguage
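For readers, a minimal sketch of the property the linked documentation describes (assuming UIKit; the labels and language codes here are illustrative): each element whose text is not in the app's base language gets an explicit accessibilityLanguage hint.

```swift
import UIKit

// Tag mixed-language elements with BCP 47 language codes so VoiceOver
// knows which voice to use for each one.
let spanishLabel = UILabel()
spanishLabel.text = "¿Cómo estás?"
spanishLabel.accessibilityLanguage = "es"   // read with a Spanish voice

let englishLabel = UILabel()
englishLabel.text = "How are you?"
englishLabel.accessibilityLanguage = "en"   // read with an English voice
```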
Replies: 1 · Boosts: 0 · Views: 658 · Nov ’24