Explore best practices for creating inclusive apps that cater to users with diverse abilities


Posts under General subtopic

Post

Replies

Boosts

Views

Activity

Will iCloud lose data access when users are in different regions?
My app uses CoreData on iOS 13.0 combined with iCloud to store data; this automatically manages synchronization between CoreData and iCloud. Some users have reported that after going abroad their original data disappeared, and when they returned to China the data was displayed normally again. I'm located in Mainland China, and I've learned that iCloud data for China is all stored in Guizhou-Cloud Big Data (the data center in Guizhou). Could this problem be caused by display abnormalities resulting from switching between the iCloud data centers accessed in different regions?
0
0
347
Dec ’24
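A minimal sketch of the kind of CoreData + iCloud stack the post describes, assuming the sync is driven by NSPersistentCloudKitContainer (the container name "Model" is illustrative). Data that has already been mirrored into the local store should remain readable even if the CloudKit endpoint changes region, so logging store loading and merge activity can help confirm whether the records are actually gone or merely not being synchronized:

    import CoreData

    final class PersistenceController {
        static let shared = PersistenceController()

        let container: NSPersistentCloudKitContainer

        private init() {
            container = NSPersistentCloudKitContainer(name: "Model")
            // Enable history tracking and remote change notifications so the local
            // store stays consistent when CloudKit pushes changes from other devices.
            if let description = container.persistentStoreDescriptions.first {
                description.setOption(true as NSNumber,
                                      forKey: NSPersistentHistoryTrackingKey)
                description.setOption(true as NSNumber,
                                      forKey: NSPersistentStoreRemoteChangeNotificationPostOptionKey)
            }
            container.loadPersistentStores { _, error in
                if let error = error {
                    // Sync silently stops if the store fails to load; surface the error.
                    fatalError("Failed to load persistent store: \(error)")
                }
            }
            container.viewContext.automaticallyMergesChangesFromParent = true
        }
    }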
How can we make grouped elements accessible for automation?
I have a stack view which has two labels:

    class TextView: UIView {
        @IBOutlet private weak var stackView: UIStackView! {
            didSet {
                stackView.isAccessibilityElement = true
                stackView.accessibilityLabel = (label1.text ?? "") + " " + (label2.text ?? "")
            }
        }
        @IBOutlet private weak var label1: UILabel! {
            didSet {
                label1.accessibilityIdentifier = "label1"
            }
        }
        @IBOutlet private weak var label2: UILabel! {
            didSet {
                label2.accessibilityIdentifier = "label2"
            }
        }
    }

My goal here is to have a combined accessibility label for the stack view and yet be able to access the accessibilityIdentifier of the child elements for automation.
0
0
1.1k
Nov ’24
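A hedged follow-up sketch (an assumption, not a confirmed answer): once the container becomes an accessibility element, its children are typically hidden from the accessibility tree, so UI tests often cannot query them by identifier any more. One alternative is to give the container itself a stable identifier and let tests assert on its combined label. The identifier string and method name are illustrative, and the extension is assumed to live in the same file as TextView so the private outlets are visible:

    extension TextView {
        // Expose the stack view as a single element with its own identifier so UI
        // tests can find it and read the combined label, even though the child
        // labels are grouped away from the accessibility hierarchy.
        func configureGroupedAccessibility() {
            stackView.isAccessibilityElement = true
            stackView.accessibilityIdentifier = "textView.combined"   // illustrative identifier
            stackView.accessibilityLabel = [label1.text, label2.text]
                .compactMap { $0 }
                .joined(separator: ", ")
        }
    }

In an XCUITest the stack view then usually surfaces as app.otherElements["textView.combined"], whose label property carries the combined text.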
Registering a macOS app for dynamic text sizing in macOS 15
macOS 15 includes a neat section in System Settings to change the dynamic text size, as outlined here: https://support.apple.com/guide/mac-help/make-text-and-icons-bigger-mchld786f2cd/mac However, it's not immediately clear a) how to get one's app into this list, and b) whether the usual methods from iOS for reacting to text size changes even work on macOS. Does anyone have any experience here? Or should I implement my own controls in my app's settings and call it a day? For context, my app is a macOS-native SwiftUI app.
1
0
560
Jan ’25
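For the "do the iOS methods even work" half of the question, here is a hedged SwiftUI sketch: using system text styles and reading the dynamicTypeSize environment value is the standard cross-platform way to scale with the user's preferred size. Whether that alone is enough to make an app appear in the macOS 15 per-app text-size list is exactly the open question above, so treat this as a sketch under that assumption rather than a confirmed registration step; the view and strings are illustrative:

    import SwiftUI

    struct ArticleView: View {
        @Environment(\.dynamicTypeSize) private var dynamicTypeSize

        var body: some View {
            VStack(alignment: .leading, spacing: 8) {
                Text("Title").font(.title)    // system text styles scale automatically
                Text("Body copy").font(.body)
                if dynamicTypeSize.isAccessibilitySize {
                    // Simplify the layout when a very large size is selected.
                    Text("Large-text layout active").font(.caption)
                }
            }
            .padding()
        }
    }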
macOS app using VoiceOver hotspots fails to jump to correct hotspot
I'm trying to validate my app's handling of VoiceOver accessibility when using VoiceOver hotspots. What I'm doing:
- Set a hotspot.
- Validate that the hotspot references the correct control within the hotspot chooser.
- Set another hotspot.
- Validate that both hotspots reference the correct controls within the hotspot chooser.
- Try to use one of the hotspots.
Result: the hotspot has changed to some other control in the app, usually one of the window buttons (close, minimize, full screen), but at other times one of the menus or a completely different control in the window. My question is how I can debug what might be going on in the app when the hotspots are resolved and invoked. I'm assuming I have some accessibility property set improperly for these controls, which is causing incorrect resolution of the hotspots.
1
0
401
Nov ’24
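A hedged sketch of the kind of audit that might help before digging into the hotspot machinery itself: confirm that each custom control exposes a stable element flag, role, label, and item identifier to the accessibility hierarchy. The class name and values below are illustrative; Accessibility Inspector's hierarchy view is useful for verifying what VoiceOver actually sees for each control:

    import AppKit

    final class RatingControl: NSControl {
        override func viewDidMoveToWindow() {
            super.viewDidMoveToWindow()
            setAccessibilityElement(true)
            setAccessibilityRole(.slider)              // pick the closest semantic role
            setAccessibilityLabel("Rating")            // what VoiceOver speaks
            identifier = NSUserInterfaceItemIdentifier("rating-control") // stable handle
        }
    }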
Clarification on Entitlements, Privacy Manifest, and Info.plist for System-Wide Mouse Click Monitoring and Typing Simulation in macOS App
I am currently developing a macOS application that listens for system-wide mouse clicks to simulate typing with user-provided text. The app requires Accessibility permission to function properly, and I want to ensure compliance with Apple's latest privacy and security guidelines. The app listens for global mouse clicks and simulates keyboard input with user-provided text. I would like detailed guidance on the following aspects:
- What specific entitlements are required to allow system-wide mouse click monitoring and simulated user input?
- Should App Sandbox be enabled or disabled?
- Which Info.plist keys are required to explain global mouse click monitoring and keyboard input simulation?
- What should the Privacy Manifest configuration be?
0
0
410
Jan ’25
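A hedged sketch of the two pieces the post describes, under the assumption that the app runs outside the App Sandbox (event posting and low-level monitoring generally do not work in a sandboxed app) and that the user has granted Accessibility permission in System Settings > Privacy & Security. The typeKey helper and key code are illustrative:

    import AppKit
    import ApplicationServices

    // 1. Prompt for Accessibility trust if it has not been granted yet.
    let options = [kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String: true] as CFDictionary
    let trusted = AXIsProcessTrustedWithOptions(options)
    print("Accessibility trusted: \(trusted)")

    // 2. Observe clicks anywhere on screen (read-only; the event cannot be modified).
    //    Keep this reference alive for as long as monitoring is needed.
    let clickMonitor = NSEvent.addGlobalMonitorForEvents(matching: .leftMouseDown) { _ in
        print("Global click at \(NSEvent.mouseLocation)")
    }

    // 3. Simulate a key press (example: virtual key code 0, the 'a' key on US layouts).
    func typeKey(_ keyCode: CGKeyCode) {
        let source = CGEventSource(stateID: .combinedSessionState)
        CGEvent(keyboardEventSource: source, virtualKey: keyCode, keyDown: true)?.post(tap: .cghidEventTap)
        CGEvent(keyboardEventSource: source, virtualKey: keyCode, keyDown: false)?.post(tap: .cghidEventTap)
    }

Note that Accessibility permission has no Info.plist usage-description key; it is granted by the user in System Settings. Global monitoring of keyboard events generally also involves the separate Input Monitoring permission, while mouse-only monitors usually do not.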
Separate Bluetooth keyboard and Bluetooth scanner inputs
Hi, I have an iOS app where a Bluetooth scanner and a Bluetooth keyboard both need to work simultaneously. I have used the 'pressesBegan' method to fetch the characters coming from the Bluetooth keyboard, and it fetches all of the keyboard input correctly. However, this method is also called when characters come from the Bluetooth scanner. In 'pressesBegan' I need to separate the input coming from the keyboard from the input coming from the scanner, since each has a different use in the app. I have already tried using the speed at which characters arrive, but that breaks down with fast typing, so timing alone will not work in our case. Is there any way to separate the inputs based on some other factor, or any other information in 'pressesBegan' that indicates whether the input comes from the scanner or the keyboard? Any suggestion would be helpful. Thanks in advance.
0
0
359
Jan ’25
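A hedged sketch of one workaround that is often suggested (an assumption, not a guaranteed fix): many Bluetooth scanners can be configured to append a terminator such as a carriage return, so a burst of characters ending in that terminator can be routed to a scanner path while everything else is treated as keyboard typing. The class name and handler are illustrative, and a real implementation would also need to distinguish an ordinary Return key press, for example by requiring a minimum buffer length or a configured scanner prefix:

    import UIKit

    final class ScanAwareViewController: UIViewController {
        private var buffer = ""

        override var canBecomeFirstResponder: Bool { true }

        override func pressesBegan(_ presses: Set<UIPress>, with event: UIPressesEvent?) {
            for press in presses {
                guard let characters = press.key?.characters, !characters.isEmpty else { continue }
                if characters == "\r" {
                    handleScannedCode(buffer)   // terminator seen: hand the buffer to the scan path
                    buffer = ""
                } else {
                    buffer.append(characters)   // otherwise accumulate the incoming characters
                }
            }
            super.pressesBegan(presses, with: event)
        }

        private func handleScannedCode(_ code: String) {
            print("Scanned: \(code)")
        }
    }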
Adding Central Kurdish language to VoiceOver functionality
We would like to request the addition of Central Kurdish language support for Text-to-Speech (TTS) and VoiceOver functionality on Apple products. Our TTS model boasts an impressive 99.9% accuracy, making it a highly reliable tool for this purpose. This initiative would bring meaningful benefits to over 10,000 visually impaired and more than 40,000 illiterate individuals in the Kurdistan Region of Iraq, empowering them to access digital information, navigate devices, and perform tasks more independently. The integration of Central Kurdish VoiceOver support would make a significant difference in improving accessibility and quality of life for these individuals, promoting inclusivity and digital literacy in the region.
1
0
531
Nov ’24
Getting precise text position with Swift for macOS
Hey there! Hope you are starting the year with great joy. My situation: I'm building a new product that is based on detecting certain text on screen in real time. The product targets only the Mac and is built with Swift. My problem: I need to get the exact position of a text element with the Apple Accessibility API, but I can't figure it out. I managed to get the AXUIElement where the text is placed, but its position is too broad and off target. My discoveries so far: I've tried OCR, but it is too slow for what I'm building, so the only possible way I can think of is the Accessibility API. Thank you in advance.
2
0
526
Jan ’25
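A hedged sketch of the usual way to tighten this up, assuming the target app implements parameterized accessibility attributes: query kAXBoundsForRangeParameterizedAttribute on the text element to get the on-screen rectangle of a specific character range, which is typically much tighter than the element's overall frame. The helper name is illustrative:

    import ApplicationServices

    // Returns the screen rectangle occupied by `range` within a text-bearing element,
    // or nil if the hosting app does not support the parameterized attribute.
    func boundsOfText(in element: AXUIElement, range: CFRange) -> CGRect? {
        var mutableRange = range
        guard let rangeValue = AXValueCreate(.cfRange, &mutableRange) else { return nil }

        var result: CFTypeRef?
        let status = AXUIElementCopyParameterizedAttributeValue(
            element,
            kAXBoundsForRangeParameterizedAttribute as CFString,
            rangeValue,
            &result
        )
        guard status == .success,
              let raw = result,
              CFGetTypeID(raw) == AXValueGetTypeID() else { return nil }

        var rect = CGRect.zero
        guard AXValueGetValue(raw as! AXValue, .cgRect, &rect) else { return nil }
        return rect   // top-left-origin screen coordinates
    }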
WhatsApp not showing in Apple CarPlay in iOS 18.x
I recently bought a new iPhone 16 and connected it to my car (Hyundai Venue), but I can't see WhatsApp in CarPlay. I researched and found some forum threads, but the suggested steps either don't work or don't apply to the latest iOS version. I have updated both iOS and WhatsApp, and nothing has resolved it. Note: with my previous Pixel phone I could see WhatsApp and make calls.
2
0
482
Jan ’25
Help Needed: Repeated Local Notifications for Alarm App When App is Closed
I am currently developing an alarm app, and I've noticed that apps like Super Alarm and Alarmy are able to deliver local notifications every 3 seconds after a specified time, even when the app is completely closed and in Airplane Mode. The notifications continue until the user opens the app. I'm trying to implement this functionality, but I haven't been able to figure out how. Could anyone provide guidance on how to achieve this?
1
0
497
Nov ’24
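A hedged sketch of the commonly described approach (an assumption about how such apps behave, not confirmed): pre-schedule a burst of individual one-shot notifications a few seconds apart at alarm time. Repeating triggers cannot fire more often than every 60 seconds and an app is limited to 64 pending local notifications, so the "every 3 seconds" effect comes from many separate requests; local notifications also need no network, which is why Airplane Mode does not stop them. The function name and text are illustrative:

    import UserNotifications

    func scheduleAlarmBurst(startingIn startInterval: TimeInterval, count: Int = 20) {
        let center = UNUserNotificationCenter.current()
        for i in 0..<count {
            let content = UNMutableNotificationContent()
            content.title = "Alarm"
            content.body = "Time to wake up"
            content.sound = .default

            // One-shot triggers spaced 3 seconds apart, starting at the alarm time.
            let trigger = UNTimeIntervalNotificationTrigger(
                timeInterval: startInterval + Double(i) * 3.0,
                repeats: false
            )
            let request = UNNotificationRequest(
                identifier: "alarm-burst-\(i)",
                content: content,
                trigger: trigger
            )
            center.add(request)
        }
    }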
Apple Vision Pro - Homonymous Hemianopia
Individuals who have had a stroke can end up with vision impairments, specifically homonymous hemianopia, which basically means the individual has lost sight in (for example) the left half of both eyes. I'm interested in understanding whether it would be possible to help individuals with this vision impairment by providing an accessibility configuration within Apple Vision Pro that would first determine an individual's field of view (possibly by showing a field of dots across the entire "screen" and having the individual look at each dot and click). Based on the results, this would determine how the screen is presented to the user going forward. My mom (82 years old) had a stroke recently and was diagnosed with homonymous hemianopia. She lived on her iPhone and would love to get back the ability to text message, use Facebook, and order items from Amazon. Please advise if you believe Apple Vision Pro would be capable of helping in this area with the suggested development, or share other thoughts.
1
0
467
Jan ’25
iOS 18 Video Playback Experience
It's impossible to enjoy video playback with this update. 1) Starting with the play/pause button location: it is not comfortable to reach. 2) How are we supposed to step back and forth just a few frames? I was hoping there was a hidden setting somewhere to bring back the old playback controls, because this is absurd. 3) Showing the playback buttons makes the video itself smaller, which is ridiculous.
1
0
441
Oct ’24
[tvOS] VoiceOver Skips Description Text When Info Panel Opens in AVPlayerViewController
When the native info panel (which displays the title, subtitle, description, and custom buttons) opens, the focus immediately shifts to the first button. As a result, VoiceOver skips the description, which is crucial for users relying on accessibility features. I haven’t found a way to detect when it opens. Knowing this would allow me to trigger custom VoiceOver announcements or adjust the focus order dynamically. Are any other people experiencing this issue, and how do we solve it?
1
0
669
Oct ’24
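A hedged sketch of one workaround, which assumes the app supplies its own panel via AVPlayerViewController.customInfoViewControllers; there may be no supported hook for the native description tab, which is really what the post is asking. When the custom panel appears, a VoiceOver announcement can speak the description even though focus jumps straight to the first button. The class name and strings are illustrative:

    import UIKit
    import AVKit

    final class InfoPanelViewController: UIViewController {
        var itemDescription: String = ""

        override func viewDidAppear(_ animated: Bool) {
            super.viewDidAppear(animated)
            // Announce the description when the panel becomes visible.
            UIAccessibility.post(notification: .announcement, argument: itemDescription)
        }
    }

    // Attaching the custom panel (illustrative usage):
    // let info = InfoPanelViewController()
    // info.itemDescription = "Full synopsis of the current title"
    // playerViewController.customInfoViewControllers = [info]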