I'm facing a bizarre issue with Apple's Accessibility APIs. I am registering an AXObserver that listens for, among other things, the kAXSelectedTextChangedNotification. For many new users, the kAXSelectedTextChangedNotification is not triggered, even though they have enabled Accessibility permission for the app. Other notifications are getting through (kAXWindowMovedNotification, kAXWindowResizedNotification, kAXValueChangedNotification, etc. - full list here), just not the kAXSelectedTextChangedNotification!
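For reference, the setup looks roughly like this (a simplified sketch of our code; error handling and observer lifetime management omitted):

import ApplicationServices

// Simplified sketch: one observer per target app (pid), with notifications
// registered on the application-level AXUIElement.
func startObserving(pid: pid_t) {
    var observerRef: AXObserver?
    let callback: AXObserverCallback = { _, element, notification, _ in
        print("Received \(notification) from \(element)")
    }
    guard AXObserverCreate(pid, callback, &observerRef) == .success,
          let observer = observerRef else { return }

    let appElement = AXUIElementCreateApplication(pid)
    // These arrive fine:
    AXObserverAddNotification(observer, appElement, kAXWindowMovedNotification as CFString, nil)
    AXObserverAddNotification(observer, appElement, kAXValueChangedNotification as CFString, nil)
    // This one never fires on affected machines until Accessibility Inspector launches:
    AXObserverAddNotification(observer, appElement, kAXSelectedTextChangedNotification as CFString, nil)

    CFRunLoopAddSource(CFRunLoopGetCurrent(), AXObserverGetRunLoopSource(observer), .commonModes)
}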
We've found that we can reproduce the error by removing accessibility permission for the app and rebooting our computers. After restarting and reenabling accessibility permissions, the kAXSelectedTextChangedNotification was not received, even though other notifications were fine.
Strangely, the issue can be resolved by launching Apple's Accessibility Inspector app on an impacted computer. Once the Accessibility Inspector is loaded, the kAXSelectedTextChangedNotifications start coming through as expected. This implies to me that either:
We are missing some needed setup when starting the observers. Accessibility Inspector gets it right, thus ‘starting’ the system properly.
Accessibility Inspector is using some Apple private APIs that we don’t have access to.
Things I’ve tried:
I've tried subscribing the kAXSelectedTextChangedNotification on different AXUIElements, including the SystemWide element, the Application element, and child elements of the AXApplication. None of these received the kAXSelectedTextChangedNotification until Accessibility Inspector is booted up. No surprises here, as Apple's documentation confirms that you should add the notification to the root Application AXUIElement if you want to receive notifications for all its children.
I had a theory that the issue might be due to my code calling AXUIElementCreateApplication multiple times, possibly creating multiple "Applications" in Apple's Accessibility implementation. If that’s the case, the notifications might be sent to the wrong application AXUIElement. However, refactoring my code to only call AXUIElementCreateApplication once didn't resolve the issue.
I thought the issue may be caused by subscribing the kAXSelectedTextChangedNotification on the high-level application element (at odds with Apple's documentation). I've tried traversing the child AXUIElements until we find one with the kAXSelectedTextAttribute and then subscribing to that. This did not resolve the issue. I don't think it's the correct path to continue exploring, given that the notifications are received correctly after Accessibility Inspector is launched.
There is one exception to the above: if I add the kAXSelectedTextChangedNotification listener to a specific text field AXUIElement, I do receive the notification on that text field. However, this is not practical; I need a solution that works for all text fields within an app. The Accessibility Inspector appears to be doing something that causes the selected-text-changed notifications to be correctly passed up to the high-level application AXUIElement.
Another thought is that I could traverse the entire Accessibility hierarchy and add listeners to every subview that has the kAXSelectedTextAttribute (sketched below). However, I don't like this as a long-term solution. It will be slow and incomplete: new elements get added and removed frequently. I just want the kAXSelectedTextChangedNotification to be received by the high-level Application AXUIElement, which the documentation suggests it should be. I also have evidence that this can work, since notifications start coming through after Accessibility Inspector is launched. It's just a matter of discovering how to replicate whatever Accessibility Inspector is doing.
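For completeness, the traversal I tried looks roughly like this (sketch only; no caching or error handling):

import ApplicationServices

// Recursively walk kAXChildrenAttribute, collecting every element that
// exposes kAXSelectedTextAttribute.
func elementsWithSelectedText(under element: AXUIElement) -> [AXUIElement] {
    var found: [AXUIElement] = []
    var attributeNames: CFArray?
    if AXUIElementCopyAttributeNames(element, &attributeNames) == .success,
       let attributes = attributeNames as? [String],
       attributes.contains(kAXSelectedTextAttribute) {
        found.append(element)
    }
    var children: CFTypeRef?
    if AXUIElementCopyAttributeValue(element, kAXChildrenAttribute as CFString, &children) == .success,
       let childElements = children as? [AXUIElement] {
        for child in childElements {
            found.append(contentsOf: elementsWithSelectedText(under: child))
        }
    }
    return found
}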
An interesting wrinkle: I implemented the 'traverse' strategy above, but was surprised by how few elements were in the hierarchy. Most apps only go down ~2-3 levels, which didn't seem right to me. Perhaps the Accessibility tree isn't fully initialized? I tried adding a 5-second delay to allow more initialization time, but it didn't change anything.
Does anyone have any ideas? Here's our file.
Hey folks, I would like to ask for help on this topic:
I think this is exactly the same problem as "Combobox not working with VoiceOver after…" on Apple Community.
VoiceOver also breaks the combobox from the official ARIA W3C website https://www.w3.org/WAI/ARIA/apg/patterns/combobox/examples/combobox-autocomplete-list/. When VO is turned off, I can use the up/down arrow to go through the menu items from the dropdown, but when VO is turned on, the up/down arrows cannot access the dropdown menu items.
Is there an official tutorial on how to control it using VoiceOver?
Kind regards,
Jakub
I have a Twitter account that I registered with my Apple ID. I don't know the PIN, and I'm having a problem because of it. I need help.
privaterelay.appleid.com
Hi Community,
I'm excited to share R Helper, a speech practice app I built with accessibility as the core focus from day one.
App Store: https://apps.apple.com/app/speak-r-clearly/id6751442522
WHY I BUILT THIS
I personally struggled with R sound pronunciation growing up. It affected my confidence in school and job interviews. That experience taught me how important accessible practice tools are.
R Helper helps children and adults practice R sounds with full accessibility support.
ACCESSIBILITY FEATURES IMPLEMENTED
VoiceOver - complete navigation and feedback
Voice Control - hands-free operation
Dynamic Type - scales to large accessibility sizes
Reduce Motion - respects user preference
Dark Mode - user controllable
High Contrast compatibility
Differentiate Without Color
THE CHALLENGE
Most speech practice apps ignore accessibility. I wanted to change that and prove that specialized educational apps can be fully accessible.
KEY FEATURES
Works 100% offline, no internet needed
Zero data collection, privacy first
Generous free tier with all accessibility features included
10 story missions with gamification
7 languages supported including RTL for Arabic
LESSONS LEARNED
Accessibility is not hard when you prioritize it from the start. VoiceOver labels and hints make a huge difference. Testing with accessibility features enabled is essential. Standard SwiftUI components handle most accessibility automatically. Reducing motion significantly helps users with vestibular issues.
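To illustrate the pattern (this is not the app's actual source, just the kind of SwiftUI modifiers that did the heavy lifting):

import SwiftUI

// Illustrative sketch only: standard controls give VoiceOver most of this
// for free; the explicit label and hint add the missing context.
struct PracticeButton: View {
    var body: some View {
        Button {
            // start a practice session here
        } label: {
            Label("Practice R Sound", systemImage: "mic.fill")
        }
        .font(.title2) // scales automatically with Dynamic Type
        .accessibilityLabel("Practice the R sound")
        .accessibilityHint("Starts a recording session and plays back feedback")
    }
}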
TECHNICAL DETAILS
Built with SwiftUI, targets iOS 17 and up. Universal app for iPhone and iPad. Fully offline using CoreData and local storage. No third party analytics, privacy focused.
QUESTIONS FOR THE COMMUNITY
What accessibility features do you find users request most? How do you test accessibility features efficiently?
WHAT'S NEXT
I'm currently working on expanding the word library, adding more story content, and improving haptic feedback.
Thanks for reading.
Nour
Hi, I'm planning to make a macOS app and distribute it on the Mac App Store.
The question is: should I force an update when one is needed?
The reason I want this feature is that I don't want users to keep using a previous version of the app.
My plan is like this:
When the app needs an update, send the user to a special page that describes why the update is needed, with a button that downloads the new version of the app.
The download would happen automatically in the background, with no need to visit the App Store.
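To be concrete, the gating part could look like the sketch below. My assumptions: the minimum required version comes from my own server, and (as far as I know) a Mac App Store app cannot download and install an update by itself in the background, so the button can only send the user to the store page.

import AppKit

// Sketch: compare the installed version against a server-provided minimum.
func isUpdateRequired(minimumVersion: String) -> Bool {
    let installed = Bundle.main.object(forInfoDictionaryKey: "CFBundleShortVersionString") as? String ?? "0"
    return installed.compare(minimumVersion, options: .numeric) == .orderedAscending
}

// Sketch: the "update" button opens the App Store product page
// (the app ID below is a placeholder), it cannot install the update silently.
func openAppStorePage() {
    if let url = URL(string: "macappstore://apps.apple.com/app/id0000000000") {
        NSWorkspace.shared.open(url)
    }
}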
I searched several forums and asked GPT, but found no positive reply about this,
so finally I'm making a post to find out whether there is any way to do it.
Thank you!
Hi everyone,
I’ve been analyzing the current state of Sign Language accessibility tools, and I noticed a significant gap in learning tools: we lack real-time feedback for students (e.g., "Is my hand position correct?").
Most current solutions rely on 2D video processing, which struggles with depth perception and occlusion (hand-over-hand or hand-over-face gestures), which are critical in Sign Language grammar.
I'd like to propose/discuss an architecture leveraging the current LiDAR + Neural Engine capabilities found in iPhone devices to solve this.
The Concept: Skeleton-based Normalization
Instead of training ML models on raw video frames (which introduces noise from lighting, skin tone, and clothing), we could use ARKit's Body Tracking to abstract the input.
Capture: Use ARKit/LiDAR to track the user's upper body and hand joints in 3D space.
Data Normalization: Extract only the vector coordinates (X, Y, Z of joints). This creates a "clean" dataset, effectively normalizing the user regardless of physical appearance.
Comparison: Feed these vectors into a CoreML model trained on "Reference Skeletons" (recorded by native signers).
Feedback Loop: The app calculates the geometric distance between the user's pose and the reference pose to provide specific correction (e.g., "Raise your elbow 10 degrees").
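A minimal sketch of the comparison step (assuming joints come from ARKit body anchors and both poses are joint arrays in the same order and coordinate space):

import ARKit
import simd

// Extract joint positions (the translation column of each joint's model
// transform) from an ARKit body anchor.
func jointPositions(from bodyAnchor: ARBodyAnchor) -> [simd_float3] {
    bodyAnchor.skeleton.jointModelTransforms.map {
        SIMD3($0.columns.3.x, $0.columns.3.y, $0.columns.3.z)
    }
}

// Average Euclidean distance between corresponding joints of the user's pose
// and a reference pose; a threshold on this score could drive the feedback.
func poseDistance(user: [simd_float3], reference: [simd_float3]) -> Float? {
    guard user.count == reference.count, !user.isEmpty else { return nil }
    let total = zip(user, reference).map { simd_distance($0.0, $0.1) }.reduce(0, +)
    return total / Float(user.count)
}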
Why this approach?
Solves Occlusion: LiDAR handles depth much better than standard RGB cameras when hands cross the body.
Privacy: We are processing coordinates, not video streams.
Efficiency: Comparing vector sequences is computationally cheaper than video analysis, preserving battery life.
Has anyone experimented with using ARKit Body Anchors specifically for comparing complex gesture sequences against a stored "correct" database? I believe this "Skeleton First" approach is the key to scalable Sign Language education apps.
Looking forward to hearing your thoughts.
Since the last beta upgrade of my iPad to 26.3, labels have disappeared.
Going into Settings > Accessibility, the toggle setting for labels makes no difference whether on or off.
Labels are permanently missing.
AR Quick Look is suddenly grayed out on my iPhone 15 Pro. I bought the phone new recently and it was working great. Two days ago I updated to iOS 18.1.4; AR mode kept opening, but I started getting a "move iPhone over surface" message and objects wouldn't detect surfaces correctly. After updating to iOS 18.5, AR is completely grayed out when I open Quick Look models.
Can someone help me fix or identify the issue?
Thank you.
Should I allow the CIJSULAgent to find devices on the local network?
CallKit and WebRTC are used to implement the call functionality.
You can select video, voice, or text calling as your calling method.
When making a text call, the voice input is grayed out and cannot be used. Is there a solution?
Hello,
When I listen to a title in my app with VoiceOver, it makes a strange sound.
The title is composed of Korean characters, numbers, and Latin letters.
Does this combination cause VoiceOver to produce a strange sound?
I would like to ask if Apple can fix this issue.
Thank you.
I have more than 1000 notes organized in parent/child folders up to 5 levels deep. From the 5th level of folders, I can no longer share a note: the note itself is not shared; the one from the parent folder is shared instead.
Thank you very much
Best regards,
Christophe
I get it, "Why don't you just get an Apple Watch?" Regardless, my MacBook Pro M4 doesn't recognize my charging cable in any of my USB-C ports.
The cable works with any other power supply I plug it into, but the MacBook doesn't even register that a cable is connected.
-- Running Sequoia Beta 15.4, thinking it may have been software related. No change.
-- Settings > Privacy & Security > Accessories > changed between all available options. No change.
-- Option + Apple Logo > System Info > Thunderbolt/USB4 > none of the ports show that the cable is connected.
-- Any other USB-C cable is recognized in any of the ports on the MacBook. Just not the cable for the Galaxy Watch.
Again, the cable charges in ANY other USB-C port, on ANY other device I connect it to.
Am I missing something? Or is this an intended jab at Samsung from Apple? lol
Hello everyone,
I am currently evaluating my app's accessibility features to accurately display the "Accessibility" information on the App Store. I have encountered two specific issues regarding Voice Control testing and would appreciate any guidance.
Voice Command for "Stop Recording"
According to the evaluation criteria, if an app supports audio recording or dictation, users must be able to start and stop recording using only their voice.
Behavior: I can successfully trigger the recording using the command "Start Recording". However, I cannot find a command to stop it. Commands like "Stop Recording" or "Stop" are not recognized by the system.
Question: Is there a specific standard voice command intended for stopping a recording?
Item Number Overlays on Non-Interactive Web Elements (WKWebView)
I noticed an inconsistency between native views and web content regarding Voice Control item numbering.
Behavior: When testing web content within the app (WKWebView) or in Safari, Voice Control displays item number overlays on non-interactive text elements (such as standard HTML text tags). In native views, static labels do not receive item numbers.
Question: Is this expected behavior for web content? Since these elements are not interactive, I am unsure if this should be considered a bug (fail) or an acceptable exception for the accessibility evaluation.
Has anyone experienced similar issues or know the correct criteria for these cases?
Thank you.
Hi guys, can anyone help me? I can't pair my Apple Watch Series 1 with my iPhone 15.
In the app I'm working on, I have a SwiftUI View embedded in a UIKit Storyboard. The SwiftUI View holds a menu with a list of payment tools, and the ForEach loop looks like this:
ForEach(self.paymentToolsVM.paymentToolsItems, id: \.self) { paymentTool in
    Button {
        navigationCallback(paymentTool.segueID)
    } label: {
        PaymentToolsRow(paymentToolName: paymentTool.title, imageName: paymentTool.imageName)
            .accessibilityElement()
            .accessibilityIdentifier("Billing_\(paymentTool.title.replacingOccurrences(of: " ", with: ""))")
    }

    if paymentTool != self.paymentToolsVM.paymentToolsItems.last {
        Divider()
    }
}
So you can see the accessibility ID is there, and it shows up properly when I open up Accessibility Inspector with the simulator, but the testing script isn't picking up on it, and it doesn't show up when the view is inspected in Appium. I have other SwiftUI views embedded in the UIKit view, and the script picks up the buttons on those, so I'm not sure what's different about this one.
If it helps, the script is written in Java with the BDD framework. I can try to get the relevant part of the script if anyone thinks that would be helpful. Otherwise, is there anything else I can try?
When I try to get the frame of an AXUIElementRef using AXUIElementCopyAttributeValue(element, (CFStringRef)attribute, &result), the frames are shifted and rotated on the iOS simulator.
I get the same frames using the Accessibility Inspector when the Mac is selected as the host.
When I switch the host to the iOS simulator, the frames are correct.
How is the Accessibility Inspector getting the correct frames? And how can I do the same in my app?
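For reference, this is roughly how the frame is assembled (a sketch using the public kAXPositionAttribute / kAXSizeAttribute pair and AXValueGetValue):

import ApplicationServices

// Copy the position and size attributes from an element and combine them
// into a CGRect. Both come back wrapped in AXValue objects.
func frame(of element: AXUIElement) -> CGRect? {
    var posValue: CFTypeRef?
    var sizeValue: CFTypeRef?
    guard AXUIElementCopyAttributeValue(element, kAXPositionAttribute as CFString, &posValue) == .success,
          AXUIElementCopyAttributeValue(element, kAXSizeAttribute as CFString, &sizeValue) == .success,
          let posRef = posValue, let sizeRef = sizeValue,
          CFGetTypeID(posRef) == AXValueGetTypeID(),
          CFGetTypeID(sizeRef) == AXValueGetTypeID() else { return nil }
    var origin = CGPoint.zero
    var size = CGSize.zero
    guard AXValueGetValue(posRef as! AXValue, .cgPoint, &origin),
          AXValueGetValue(sizeRef as! AXValue, .cgSize, &size) else { return nil }
    return CGRect(origin: origin, size: size)
}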
I added a view controller in the storyboard, added a table view in this view, and added a cell under the table. When I run the app, navigate to that page, and use VoiceOver, swiping up or down with three fingers announces a sentence in English. Without changing the accessibility of the cell, how can I make the announcement for the three-finger swipe up/down come out in Chinese?
While editing the search text using the external keyboard (with VoiceOver on), if I try to navigate to the List using the keyboard, the focus jumps back to the search field immediately, preventing selection of list items. It's important to note that VoiceOver navigation alone, without a keyboard, works as expected.
It’s as if the List never gains focus—every attempt to move focus lands back on the search field.
The code:
import SwiftUI

struct ContentView: View {
    @State var searchText = ""
    let items = ["Apple", "Banana", "Cherry", "Date", "Elderberry", "Fig", "Grape"]

    var filteredItems: [String] {
        if searchText.isEmpty {
            return items
        } else {
            return items.filter { $0.localizedCaseInsensitiveContains(searchText) }
        }
    }

    var body: some View {
        if #available(iOS 16.0, *) {
            NavigationStack {
                List(filteredItems, id: \.self) { item in
                    Text(item)
                }
                .navigationTitle("Fruits")
                .searchable(text: $searchText)
            }
        } else {
            NavigationView {
                List(filteredItems, id: \.self) { item in
                    Text(item)
                }
                .navigationTitle("Fruits")
                .searchable(text: $searchText)
            }
        }
    }
}
Hey,
We've run into an issue where WKWebView contents are not always available for VoiceOver users. It seems to occur when WKWebView contents are loaded asynchronously.
I have a sample project where this can be reproduced and a video showing the issue. See FB21257352
The only solution we currently see is forcing an update continuously using UIAccessibility.post(notification: .layoutChanged, argument: nil), but this is of course a last resort, as it may have other unintended side effects.
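For anyone else hitting this, a one-shot version of that workaround looks like the sketch below (content injected after didFinish may still need another nudge):

import UIKit
import WebKit

// Sketch: post a single layoutChanged notification once navigation finishes,
// instead of posting continuously.
final class WebContentViewController: UIViewController, WKNavigationDelegate {
    private let webView = WKWebView()

    override func viewDidLoad() {
        super.viewDidLoad()
        webView.navigationDelegate = self
        webView.frame = view.bounds
        view.addSubview(webView)
    }

    func webView(_ webView: WKWebView, didFinish navigation: WKNavigation!) {
        // Nudge VoiceOver to re-read the layout now that the DOM is in place.
        UIAccessibility.post(notification: .layoutChanged, argument: webView)
    }
}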