Explore best practices for creating inclusive apps that cater to users with diverse abilities

Posts under General subtopic

pairedUUIDsDidChangeNotification never fires, even with MFi hearing aids paired
Hi everyone — I’m implementing the new Hearing Device Support API described here: https://developer.apple.com/documentation/accessibility/hearing-device-support

I have MFi hearing aids paired and visible under Settings → Accessibility → Hearing Devices, and I’ve added the com.apple.developer.hearing.aid.app entitlement (and also tested with Wireless Accessory Configuration: https://developer.apple.com/documentation/bundleresources/entitlements/com.apple.external-accessory.wireless-configuration).

    com.apple.developer.hearing.aid.app
    xxxxx

With this entitlement present, the app won't even compile.

Problem: NotificationCenter.default.addObserver(...) for pairedUUIDsDidChangeNotification never fires — not on app launch, not after pairing/unpairing, and not after reconnecting the hearing aids. Because the notification never triggers, calls like HearingDeviceSession.shared.pairedDevices always return an empty list.

What I expected: According to the docs, the notification should be posted whenever paired device UUIDs change, and the session should expose those devices — but nothing happens.

Questions:
1. Does the hearing.aid.app entitlement require special approval from Apple beyond adding it to the entitlements file?
2. Is there a way to verify that iOS is actually honoring this entitlement?
3. Has anyone successfully received this notification on a real device?

Any help or confirmation would be greatly appreciated.
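For reference, this is the registration pattern in question, as a minimal sketch only. The type and notification names (HearingDeviceSession, pairedUUIDsDidChangeNotification, pairedDevices) are taken from the post above rather than verified against the shipping SDK, so treat them as placeholders:

    import Foundation

    final class HearingDeviceObserver {
        private var token: NSObjectProtocol?

        func start() {
            // Names follow the post; check them against the Hearing Device
            // Support documentation for your SDK version.
            token = NotificationCenter.default.addObserver(
                forName: HearingDeviceSession.pairedUUIDsDidChangeNotification,
                object: nil,
                queue: .main
            ) { _ in
                let devices = HearingDeviceSession.shared.pairedDevices
                print("Paired hearing devices changed: \(devices)")
            }

            // Read the current value once at launch as well; if iOS is not
            // honoring the entitlement, this list is expected to stay empty.
            print("Initial paired devices: \(HearingDeviceSession.shared.pairedDevices)")
        }

        deinit {
            if let token { NotificationCenter.default.removeObserver(token) }
        }
    }

If even a minimal setup like this never fires, that would point at the entitlement not being granted rather than at the observer code.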
Replies: 1 · Boosts: 0 · Views: 147 · Activity: 3h

How to Implement Dynamic Type for UITextFields Without Resetting Data
Hello! I was doing some accessibility testing for my app and found out that when the user switches the text size, all of the data in the text fields is reset, which causes major disruption. I've tried looking for documentation, but all I've found is information on how to dynamically scale the UI for different text sizes, which I've already implemented. My guess is that every time Dynamic Type registers a change, it redraws my UI instead of just updating it. How can I make sure the data is not reset when the text size changes?
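A minimal UIKit sketch of the usual fix, assuming the fields are currently being recreated when the size category changes: keep the same UITextField instances alive and let them rescale in place, so their .text is never discarded. The field and view controller names are hypothetical.

    import UIKit

    final class FormViewController: UIViewController {
        // The same instance is reused across content-size changes,
        // so whatever the user typed is preserved.
        private let nameField = UITextField()

        override func viewDidLoad() {
            super.viewDidLoad()
            nameField.font = UIFont.preferredFont(forTextStyle: .body)
            nameField.adjustsFontForContentSizeCategory = true   // rescales in place
            nameField.borderStyle = .roundedRect
            view.addSubview(nameField)
            // Auto Layout constraints omitted for brevity.
        }

        // Only needed for fonts that adjustsFontForContentSizeCategory cannot
        // handle on its own (e.g. custom fonts scaled through UIFontMetrics).
        override func traitCollectionDidChange(_ previousTraitCollection: UITraitCollection?) {
            super.traitCollectionDidChange(previousTraitCollection)
            if traitCollection.preferredContentSizeCategory != previousTraitCollection?.preferredContentSizeCategory {
                nameField.font = UIFont.preferredFont(forTextStyle: .body)   // update the font only; .text is untouched
            }
        }
    }

If the screen is instead being rebuilt on UIContentSizeCategory.didChangeNotification (for example by reloading a table or recreating the form), the data loss comes from that reload: either stop rebuilding, or save the field contents before the reload and restore them afterwards.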
Replies: 1 · Boosts: 1 · Views: 455 · Activity: 2d

Proposal: Using ARKit Body Tracking & LiDAR for Sign Language Education (Real-time Feedback)
Hi everyone, I’ve been analyzing the current state of Sign Language accessibility tools, and I noticed a significant gap in learning tools: we lack real-time feedback for students (e.g., "Is my hand position correct?"). Most current solutions rely on 2D video processing, which struggles with depth perception and occlusion (hand-over-hand or hand-over-face gestures), both of which are critical in Sign Language grammar. I'd like to propose/discuss an architecture leveraging the LiDAR + Neural Engine capabilities of current iPhones to solve this.

The concept: skeleton-based normalization. Instead of training ML models on raw video frames (which introduces noise from lighting, skin tone, and clothing), we could use ARKit's Body Tracking to abstract the input:

1. Capture: Use ARKit/LiDAR to track the user's upper body and hand joints in 3D space.
2. Data normalization: Extract only the vector coordinates (X, Y, Z of each joint). This creates a "clean" dataset, effectively normalizing the user regardless of physical appearance.
3. Comparison: Feed these vectors into a Core ML model trained on "reference skeletons" (recorded by native signers).
4. Feedback loop: The app calculates the geometric distance between the user's pose and the reference pose to provide specific corrections (e.g., "Raise your elbow 10 degrees"). A rough sketch of this comparison step follows below.

Why this approach?

- Solves occlusion: LiDAR handles depth much better than standard RGB cameras when hands cross the body.
- Privacy: We are processing coordinates, not video streams.
- Efficiency: Comparing vector sequences is computationally cheaper than video analysis, preserving battery life.

Has anyone experimented with using ARKit body anchors specifically for comparing complex gesture sequences against a stored "correct" database? I believe this "skeleton first" approach is the key to scalable Sign Language education apps. Looking forward to hearing your thoughts.
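To make the comparison step concrete, here is a hedged sketch using ARKit's body-tracking types (ARBodyAnchor and ARSkeleton3D are real API; the particular joint list, normalization, and error metric are illustrative assumptions, and ARKit's body skeleton does not expose individual finger joints):

    import ARKit
    import simd

    // Joints compared in this sketch; a real signing model would need
    // finer-grained hand data than the body skeleton provides.
    let trackedJoints: [ARSkeleton.JointName] = [.head, .leftShoulder, .rightShoulder, .leftHand, .rightHand]

    // Extracts joint positions relative to the body root, so the comparison
    // is independent of where the user stands.
    func normalizedPose(from bodyAnchor: ARBodyAnchor) -> [simd_float3] {
        let skeleton = bodyAnchor.skeleton
        return trackedJoints.compactMap { joint in
            guard let transform = skeleton.modelTransform(for: joint) else { return nil }
            // modelTransform is already expressed relative to the root joint.
            return simd_make_float3(transform.columns.3)
        }
    }

    // Mean per-joint distance between the live pose and a stored reference pose
    // (the reference would come from recordings of native signers).
    func poseError(live: [simd_float3], reference: [simd_float3]) -> Float {
        guard live.count == reference.count, !live.isEmpty else { return .infinity }
        let total = zip(live, reference).reduce(Float(0)) { $0 + simd_distance($1.0, $1.1) }
        return total / Float(live.count)
    }

A production system would compare whole sequences (for example with dynamic time warping) rather than single frames, but a per-frame distance like this is the basic building block.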
Replies: 1 · Boosts: 0 · Views: 505 · Activity: 3d

Unable to Accept Invite
I am getting this issue when trying to accept an invite to a new test version of our app:

"Unable to Accept Invite
This invitation cannot be accepted because your Apple Account, xxxxxxxx.me.com, has already been associated to this app."

Can you help please?
Replies: 11 · Boosts: 7 · Views: 2.7k · Activity: 1w

Too many verification codes have been sent.
Hello, I have the following problem. I’m developing a no-code app using the FlutterFlow platform and have been working on it for over a year. This time, after publishing a new version of the app through FlutterFlow, I tried logging into App Store Connect, but I got an error saying that I had made too many login attempts and needed to try again later. However, I hadn’t attempted to log in before that at all.

No matter how long I wait—24 hours, 48 hours—the same error keeps appearing, meaning I still can’t access my account. App Store Connect refuses to send me an SMS with the login code. Apple Support hasn’t responded for 4 days, and in total, I’ve been locked out of my account for over 9 days. Please help me understand what might be causing this issue.
Replies: 3 · Boosts: 1 · Views: 1.1k · Activity: 1w

iOS 26 regression: `DeviceActivityEvent`: `eventDidReachThreshold` called immediately (instead of waiting till threshold is reached)
Hello Albert! I am experiencing some strange bugs around DeviceActivityEvents (part of the DeviceActivity framework) on iOS 26 / iOS 26.1 / iOS 26.2 beta.

When creating a DeviceActivityEvent, we can assign a threshold and applicationTokens. The idea is that after the user has spent said threshold on said apps, eventDidReachThreshold() is called. The property includesPastActivity is set to false.

On iOS 26, however, it happens quite often (quite reliably after updating to a new beta seed) that eventDidReachThreshold() is called immediately (after a couple of seconds) instead of waiting for the threshold to be met. Is anyone else seeing similar issues on iOS 26 / iOS 26.1 / iOS 26.2 beta?

The only workaround I have found is to ask users to revoke and re-grant Screen Time permissions. This only holds for about two weeks, though, or at most until the next iOS 26 beta update is installed, so it is not a permanent solution unfortunately.

Feedback (incl. sysdiagnoses and a sample project) is filed under FB18061981 and FB18927456. One of our users has filed their own feedback as well: FB20817853.

Thanks a lot for any help on this!
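For context, a minimal sketch of the setup being described, using the public DeviceActivity API (the activity/event names, the 30-minute threshold, and the selectedApps token set are illustrative assumptions; the includesPastActivity initializer requires iOS 17.4 or later):

    import DeviceActivity
    import ManagedSettings

    extension DeviceActivityName { static let daily = Self("daily") }
    extension DeviceActivityEvent.Name { static let thirtyMinutes = Self("thirtyMinutes") }

    func startMonitoring(selectedApps: Set<ApplicationToken>) throws {
        let schedule = DeviceActivitySchedule(
            intervalStart: DateComponents(hour: 0, minute: 0),
            intervalEnd: DateComponents(hour: 23, minute: 59),
            repeats: true)

        // With includesPastActivity set to false, the callback should only
        // fire after 30 minutes of new usage within the interval.
        let event = DeviceActivityEvent(
            applications: selectedApps,
            threshold: DateComponents(minute: 30),
            includesPastActivity: false)

        try DeviceActivityCenter().startMonitoring(.daily, during: schedule, events: [.thirtyMinutes: event])
    }

    // In the Device Activity Monitor extension; on the iOS 26 betas described
    // above, this is reportedly firing seconds after monitoring starts.
    final class Monitor: DeviceActivityMonitor {
        override func eventDidReachThreshold(_ event: DeviceActivityEvent.Name,
                                             activity: DeviceActivityName) {
            super.eventDidReachThreshold(event, activity: activity)
            // React to the threshold being reached.
        }
    }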
Replies: 0 · Boosts: 0 · Views: 269 · Activity: 1w

Potential Structural Swift Concurrency Issue: unsafeForcedSync called from Swift Concurrent context
I occasionally get this error in Xcode’s console: Potential Structural Swift Concurrency Issue: unsafeForcedSync called from Swift Concurrent context. What does this mean, and how can I resolve it? Googling it doesn’t turn up any results. This doesn't crash the app - it’s just an error diagnostic that I see in the Xcode console. The app keeps running before and after the issue. Is there a way I can set a breakpoint to catch this where it happens?
Replies: 8 · Boosts: 2 · Views: 1.9k · Activity: 1w

VoiceOver accessibility issue in UIKit for line granularity
Context: We are using UIKit to provide accessibility in our app for our iOS users. Our app mainly contains documents/books that users can read.

Issue: VoiceOver is skipping lines when they contain leading spaces. We have observed this issue in different languages. It only happens for line granularity; other granularities seem to work as expected.

Implementation: We use the APIs below to provide line content to VoiceOver:

- UIAccessibilityReadingContent
- accessibilityPageContent
- accessibilityFrameForLineNumber
- accessibilityContentForLineNumber

We create UIAccessibilityElement objects to pass to VoiceOver, and each UIAccessibilityElement implements UIAccessibilityReadingContent to provide readable content. We also use the APIs below to cross element boundaries for all granular navigation:

- accessibilityNextTextNavigationElement
- accessibilityPreviousTextNavigationElement

We want to know whether skipping a line that has leading spaces is expected behavior or a bug in UIKit.
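As a data point, here is a minimal sketch of one line-provider element implementing UIAccessibilityReadingContent, with a hedged workaround of stripping leading spaces from the string handed to VoiceOver while keeping the on-screen frame unchanged (the lines array and frame data are placeholder assumptions standing in for the document model):

    import UIKit

    final class PageAccessibilityElement: UIAccessibilityElement, UIAccessibilityReadingContent {
        // Placeholder content; in the real app these come from the document model.
        private let lines: [String]
        private let lineFrames: [CGRect]

        init(container: Any, lines: [String], lineFrames: [CGRect]) {
            self.lines = lines
            self.lineFrames = lineFrames
            super.init(accessibilityContainer: container)
        }

        func accessibilityLineNumber(for point: CGPoint) -> Int {
            lineFrames.firstIndex { $0.contains(point) } ?? NSNotFound
        }

        func accessibilityContent(forLineNumber lineNumber: Int) -> String? {
            guard lines.indices.contains(lineNumber) else { return nil }
            // Workaround under test: drop leading spaces so VoiceOver does not
            // skip the line; the visual frame below stays untouched.
            return String(lines[lineNumber].drop(while: { $0 == " " }))
        }

        func accessibilityFrame(forLineNumber lineNumber: Int) -> CGRect {
            lineFrames.indices.contains(lineNumber) ? lineFrames[lineNumber] : .zero
        }

        func accessibilityPageContent() -> String? {
            lines.joined(separator: "\n")
        }
    }

If lines that are returned without leading spaces are read correctly while otherwise identical lines with leading spaces are skipped, that would be useful evidence to attach to a Feedback report.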
Replies: 1 · Boosts: 0 · Views: 259 · Activity: 2w

Developer Program enrollment still pending after payment
Hi everyone, I subscribed to the Apple Developer Program on Tuesday evening, November 4th, 2025. The payment has already been charged to my bank account, but my account still shows the status “Pending” with the message “Subscribe your membership”. It’s now been several days, and I haven’t received any confirmation email or any request for additional information. I already contacted Apple Support by email, but I’d like to know if other developers have experienced the same situation and how long it took before their account was activated. Thanks in advance for your help and feedback! — Martin
Replies: 16 · Boosts: 7 · Views: 1.1k · Activity: 2w

Double Payment Issue — Apple Developer Account Still Not Activated
Hi everyone, I completed my Apple Developer registration on November 8, 2025, and paid the fee. Afterwards, I was charged another €99, so I basically paid twice. But nothing has changed on my account so far — it doesn’t even show as “pending” or “waiting” anymore. I also can’t reach Apple Support, since I’m unable to get a call or send an email. Has anyone else experienced something like this or knows what I can do?
Replies: 1 · Boosts: 1 · Views: 232 · Activity: 3w

Download Voices screen
System Settings => Accessibility => System Voice => the little (i) beside the pulldown => Voices => this screen lets you download Premium voices.

Is there a way to trigger this screen programmatically, or at least a link to get my users there without having to dig through that swamp of screens?
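As far as I'm aware there is no public URL scheme that deep-links into that particular Voices screen, so users do have to navigate there themselves. What can be done in code is detecting whether an enhanced or premium voice is already installed, so the app only sends people hunting through Settings when necessary. A sketch using AVFoundation (the language prefix is just an example):

    import AVFoundation

    // Lists the voices currently installed for a language, including their
    // quality tier, so the app can tell whether a download is still needed.
    func logInstalledVoices(languagePrefix: String = "en") {
        for voice in AVSpeechSynthesisVoice.speechVoices()
            where voice.language.hasPrefix(languagePrefix) {
            print(voice.name, voice.language, voice.quality.rawValue, voice.identifier)
        }
    }

    // True when at least one premium-quality voice for the language is
    // already downloaded (AVSpeechSynthesisVoiceQuality.premium, iOS 16+).
    func hasPremiumVoice(languagePrefix: String = "en") -> Bool {
        AVSpeechSynthesisVoice.speechVoices().contains {
            $0.language.hasPrefix(languagePrefix) && $0.quality == .premium
        }
    }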
Replies: 0 · Boosts: 1 · Views: 376 · Activity: 3w

Custom Keyboard Extension Not Showing in Settings for Activation
Hi everyone, I’m developing a React Native iOS app that includes a custom keyboard extension for sending stickers across apps. The project builds successfully, and the main app installs fine on my test device. However, I’m not seeing the keyboard extension appear under Settings → General → Keyboard → Keyboards → Add New Keyboard, which means I can’t activate it or grant access. At this point, I’m not even sure if the extension is actually being installed on the device along with the main app. Here’s what I’ve done so far. I created a Keyboard Extension target in Xcode, set the correct bundle identifiers and provisioning profiles, and enabled “Requests Open Access” in the extension’s Info.plist. I built and installed the app on a physical device rather than the simulator to ensure proper testing. My main questions are: how can I confirm that the extension is being installed on the device, and if it isn’t, what might prevent it from installing even though the build completes successfully? Any insights, troubleshooting steps, or guidance would be greatly appreciated.
Replies: 0 · Boosts: 0 · Views: 655 · Activity: Nov ’25

Wi-Fi Aware demo pairing issue
I am developing a program on my chip and attempting to establish a connection with the Wi-Fi Aware demo app introduced with iOS 26. Currently, I am encountering an issue during the pairing phase.

If I am the subscriber of the service and successfully complete the follow-up frame exchange of pairing bootstrapping, I see the PIN code displayed by iOS.
Question 1: How should I use this PIN code?
Question 2: Subsequently, I need to negotiate keys with iOS through PASN. What should I use as the password for the PASN SAE process?

If I am the subscriber of the service and successfully complete the follow-up frame exchange of pairing bootstrapping, I should display the PIN code.
Question 3: How do I generate this PIN code?
Question 4: Subsequently, I need to negotiate keys with iOS through PASN. What should I use as the password for the PASN SAE process?
Replies: 1 · Boosts: 0 · Views: 556 · Activity: Nov ’25

iOS 26 Full Keyboard Access (navigation) and WKWebView
We use an embedded WKWebView for several screens in our app. Recently, we have been testing keyboard navigation via Full Keyboard Access in our apps. On iOS 18, everything works pretty much as expected. On iOS 26, it does not: you can "tab" away from the web view but then can never tab back to it for keyboard navigation. Is this a known issue? Are there workarounds that anyone is aware of?
Replies: 2 · Boosts: 0 · Views: 334 · Activity: Nov ’25

Nearby Interaction with camera assistance
I have an app that uses Nearby Interaction with a custom accessory. It works great on iPhone 11-13; starting with iPhone 14, one must use ARKit to get angles. We have two problems:

- ARKit is light sensitive, and we do not control the lighting where this app runs. The iPhone 11-13 behavior works great even in the dark. (Our users are blind; this is an accessibility app.)
- ARKit wants to be in the foreground, but our users cannot see it, and we have a voice-oriented UI that provides navigation instructions. If ARKit is in the foreground, our app doesn't work.

With an iPhone 15 Pro Max on iOS 18, I got an error: access denied (not permission denied). Now that I am on iOS 26, the Bluetooth scan doesn't happen. It also fails the same way on iPhone 17 on iOS 26, and I can't get the callback now as release signing is no longer done. This same code works fine on iOS 17.1 on an iPhone 12.

Info.plist here: info.txt

The scanning code:

    if SearchedServices == [] {
        services = [TransferService.serviceUUID, QorvoNIService.serviceUUID]
    }
    logger.info("scannerready, starting scan for peripherals \(services) and devices \(IDs)")
    filteredIDs = IDs
    scanning = true
    centralManager.scanForPeripherals(withServices: services,
                                      options: [CBCentralManagerScanOptionAllowDuplicatesKey: true])

The calling code:

    dataChannel.autoConnect = autoConnect
    dataChannel.start(x, ids)   // dataChannel.start is the code above
    self.scanning = true
    return "scanning started"

Log output:

    services from js = and devices= 5FE04CBB
    services in implementation =
    bluetooth ready, starting scan for peripherals [] and devices ["5FE04CBB"]
    scannerready, starting scan for peripherals [6E400001-B5A3-F393-E0A9-E50E24DCCA9E, 2E938FD0-6A61-11ED-A1EB-0242AC120002] and devices ["5FE04CBB"]
    ⚡️ TO JS {"value":"scanning started"}
Replies: 2 · Boosts: 2 · Views: 1.3k · Activity: Oct ’25