I occasionally get this error in Xcode’s console:
Potential Structural Swift Concurrency Issue: unsafeForcedSync called from Swift Concurrent context.
What does this mean, and how can I resolve it? Googling it doesn’t turn up any results.
This doesn't crash the app - it’s just an error diagnostic that I see in the Xcode console. The app keeps running before and after the issue.
Is there a way I can set a breakpoint to catch this where it happens?
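The only idea I have so far (purely a guess on my part; I don't know which runtime function actually emits this diagnostic) is a symbolic breakpoint on a matching symbol name, e.g. from the LLDB console:

    (lldb) breakpoint set -r unsafeForcedSync

but I haven't confirmed that such a symbol exists.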
There is an issue with Help Books that started with the release of macOS 14.4. The issue is that when an app attempts to go directly to a Help Book page, the help viewer opens to the Help Book's main index page, rather than the specific page requested. As I investigated the issue I found that the requested page was actually part of help viewer's navigation history, and all I had to do was to click the Back navigation arrow and the requested page would be displayed. So it seems like the requested page is momentarily visited but is then (for whatever reason) quickly replaced by the main index page.
Our app uses the AHGotoPage() API for directly accessing our Help Book's pages. This is the same mechanism/code that our app has used for more than a decade and has never caused us any issues. Everything works fine on macOS 14.3.0 and earlier. I've scoured the documentation and can't find any newer APIs for accessing Help pages. I've also tried various other things (e.g. reworking the code, creating new indexes for the app's Help, etc.), but none of it seems to make a difference. As far as I can tell, the issue seems to stem from some change made to the OS.
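For reference, the call looks roughly like this (a minimal sketch, not our exact code; the Info.plist lookup and page name are placeholders, and error handling is simplified):

    import Foundation
    import Carbon  // AHGotoPage() is declared in the Apple Help (Carbon) headers

    // Sketch: jump directly to a named page in the app's registered Help Book.
    func openHelpPage(_ page: String) {
        guard let bookName = Bundle.main.object(forInfoDictionaryKey: "CFBundleHelpBookName") as? String else { return }
        let status = AHGotoPage(bookName as CFString, page as CFString, nil)
        if status != noErr {
            NSLog("AHGotoPage failed with status %d", status)
        }
    }

    // e.g. openHelpPage("preferences.html")  // placeholder page name

On macOS 14.4 this pattern opens the main index page instead of the requested page, as described above.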
So my questions are:
Is this a known bug? And if so, is there any ETA on a fix?
Is there something different we should be doing for newer versions of the OS (create indexes differently, use a different API, etc.)?
Topic:
Accessibility & Inclusion
SubTopic:
General
Feedback number: FB20451665
When building with Xcode 26, VoiceOver reports an extra tab when swiping through tabs. Please see the sample project below:
/*
This is a sample project to show what I believe is a VoiceOver bug in iOS 26.
When swiping through tabs with VoiceOver active, there always appears to be an extra tab.
Here I have 5 tabs. When on tab one, VoiceOver reads out "tab 1 of 6", then "tab 2 of 6", all the way to the last tab, where VoiceOver reads out "tab 5 of 6", never "tab 6 of 6".
Is there a possibility that VoiceOver is picking up the underlying `More` tab and reading that out?
This has also been reported in the Files app here:
https://www.applevis.com/comment/195441#comment-195441
*/
import SwiftUI

struct ContentView: View {
    var body: some View {
        TabView {
            /// Activating this has VoiceOver telling us there are 6 tabs.
            Tab(RootTab.home.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.home.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.home.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.home.title.capitalized) tab")

            Tab(RootTab.diary.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.diary.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.diary.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.diary.title.capitalized) tab")

            Tab(RootTab.meals.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.meals.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.meals.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.meals.title.capitalized) tab")

            Tab(RootTab.knowledge.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.knowledge.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.knowledge.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.knowledge.title.capitalized) tab")

            Tab(RootTab.profile.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.profile.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.profile.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.profile.title.capitalized) tab")

            /// Activating this instead also has VoiceOver telling us there are 6 tabs.
            // ForEach(RootTab.allCases, id: \.self) { tab in
            //     Text("This is the \(tab.title.capitalized) screen")
            //         .tabItem {
            //             Label(tab.title.capitalized, systemImage: "circle.fill")
            //         }
            //         .accessibilityLabel("\(tab.title.capitalized) tab")
            //         .accessibilityHint("Double tap to open the \(tab.title.capitalized) tab")
            // }
        }
    }

    enum RootTab: CaseIterable {
        case home
        case diary
        case meals
        case knowledge
        case profile

        var title: String {
            switch self {
            case .home:
                "home"
            case .diary:
                "diary"
            case .meals:
                "meals"
            case .knowledge:
                "knowledge"
            case .profile:
                "profile"
            }
        }
    }
}
I'm curious if anyone else can see this issue, or if anyone knows of a workaround for it.
I am getting this issue when trying to accept an invite to a new test version of our app.
Unable to Accept invite
This invitation cannot be accepted because your Apple Account, xxxxxxxx.me.com, has already been associated to this app.
Can you help please?
Topic:
Accessibility & Inclusion
SubTopic:
General
Hello,
In our app we provide a button that initiates a phone call using tel://.
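For context, the button's action is essentially the following (a simplified sketch; the helper name is illustrative):

    import UIKit

    // Sketch of the dial button's action: hand a tel:// URL to the system,
    // which normally presents the Phone confirmation sheet.
    func dial(_ number: String) {
        guard let url = URL(string: "tel://\(number)") else { return }
        UIApplication.shared.open(url, options: [:], completionHandler: nil)
    }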
For normal numbers, tapping the button presents the standard iOS confirmation sheet with Call and Cancel.
If RTT is enabled on the device, the sheet instead shows three options: Call, Cancel, and RTT Call.
However, when dialing a national emergency number, this confirmation dialog does not appear at all — the call is placed immediately, without giving the user the choice between voice or RTT.
Is this the expected system behavior for emergency numbers on iOS?
And if so, how does RTT get applied in the emergency-call flow — is it managed entirely by the OS rather than exposed as a user-facing option?
Thanks in advance for clarifying.
I have an app that uses Nearby Interaction with a custom accessory.
It works great on iPhone 11 through 13.
Starting with the iPhone 14, one must use ARKit to get angles.
We have two problems:
ARKit is light sensitive, and we do not control the lighting where this app runs. The iPhone 11-13 approach works great even in the dark (our users are blind; this is an accessibility app).
ARKit wants to be in the foreground, but our users cannot see it, and we have a voice-oriented UI that provides navigation instructions.
If ARKit is in the foreground, our app doesn't work.
With an iPhone 15 Pro Max on iOS 18, I got an "access denied" error (not "permission denied"). Now that I am on iOS 26, the Bluetooth scan doesn't happen at all.
It also fails the same way on an iPhone 17 on iOS 26; I can't get a callback now, as release signing is no longer done.
This same code works fine on iOS 17.1 on an iPhone 12.
Info.plist here
info.txt
if SearchedServices.isEmpty {
    services = [TransferService.serviceUUID, QorvoNIService.serviceUUID]
}
logger.info(
    "scannerready, starting scan for peripherals \(services) and devices \(IDs)")
filteredIDs = IDs
scanning = true
centralManager.scanForPeripherals(withServices: services,
                                  options: [CBCentralManagerScanOptionAllowDuplicatesKey: true])
The calling code:
dataChannel.autoConnect = autoConnect
dataChannel.start(x, ids)  // dataChannel.start is shown above
self.scanning = true
return "scanning started"
... log output
services from js = and devices= 5FE04CBB
services in implementation =
bluetooth ready, starting scan for peripherals [] and devices ["5FE04CBB"]
scannerready, starting scan for peripherals [6E400001-B5A3-F393-E0A9-E50E24DCCA9E, 2E938FD0-6A61-11ED-A1EB-0242AC120002] and devices ["5FE04CBB"]
⚡️ TO JS {"value":"scanning started"}
I want to open a developer account, not a personal one but an organization account. I have an existing company, a DUNS number, a finished website, and an official email address, so everything is ready. But when I apply with Apple, they email me saying they want a public-facing website in the organization's name, even though all of these matters have already been resolved. Why are they not responding to us?
Topic:
Accessibility & Inclusion
SubTopic:
General
I am working on capturing 48MP images using the iPhone 16 Pro Max with the Ultra-wide camera. I’ve updated the code to capture the maximum supported dimensions with the following snippet:
if #available(iOS 16.0, *) {
    photoOutput.maxPhotoDimensions = device.activeFormat.supportedMaxPhotoDimensions.last!
    photoSettings.maxPhotoDimensions = .init(width: 5712, height: 4284)
}
However, I’m still not getting the expected results. My goal is to capture 48MP images, and I want to confirm if the Ultra-wide camera supports this resolution or if I’m missing any other configuration.
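For what it's worth, here is the check I plan to run to see what the ultra-wide device actually reports (a sketch; `device` is assumed to be the configured ultra-wide AVCaptureDevice):

    import AVFoundation

    // Sketch: print each format's supported max photo dimensions,
    // to confirm whether any ultra-wide format actually reaches 5712 x 4284.
    func logMaxPhotoDimensions(for device: AVCaptureDevice) {
        if #available(iOS 16.0, *) {
            for format in device.formats {
                let dims = format.supportedMaxPhotoDimensions
                    .map { "\($0.width)x\($0.height)" }
                    .joined(separator: ", ")
                print("\(format): \(dims)")
            }
        }
    }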
Any guidance would be appreciated!
Accessibility: VoiceOver is not treating the navigation bar's left button as the first focused element.
If we navigate from A to B, focus goes to the first element inside view B, not to the back button or B's navigation title.
If we post an accessibility notification in B's onAppear, focus does not shift; VoiceOver reads the back button first and then reads B's content, but focus does not actually move to the back button in SwiftUI.
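Roughly what we are doing today (a sketch; the view and its contents are illustrative):

    import SwiftUI
    import UIKit

    // Sketch of view B; names and rows are illustrative.
    struct ViewB: View {
        var body: some View {
            List {
                Text("First row")   // VoiceOver focus lands here
                Text("Second row")
            }
            .navigationTitle("B")
            .onAppear {
                // Attempt to move VoiceOver focus when B appears;
                // focus still does not go to the system back button or the title.
                UIAccessibility.post(notification: .screenChanged, argument: nil)
            }
        }
    }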
What should I do if I want focus to land on the navigation bar's back button or navigation title?
My understanding is that the system prioritizes the first focusable element in the view hierarchy, while the navigation bar (including the back button and title) is managed separately by the system: it is not part of the main view hierarchy, so it does not automatically receive focus unless explicitly set. If that's right, it seems a little strange.
Why was it designed this way? Can you share the reasoning?
Thanks
Request: Name Recognition → Shortcut for SOS Flashlight + Vibration
Right now, iOS Name Recognition works, but all I can do is flash the tiny notification light. It would be much more useful if Name Recognition could trigger a Shortcut. That way, I could set it to flash the flashlight in an SOS pattern and vibrate, making the alert impossible to miss.
I tried using Custom Alarm, but it won’t let me record my spoken name, so it doesn’t really solve the problem. If Apple allowed Name Recognition to trigger Shortcuts — or expanded “Custom” to support names/words — this would open up far more practical, real-world alerts.
Topic:
Accessibility & Inclusion
SubTopic:
General
I wrote this in the regular forums and they deleted it and told me to write it here because it was dealing with unreleased software. I read that Launchpad is disappearing in Tahoe and I have real concerns about that. For me, that is an accessibility issue. I have both memory problems and scanning problems. So having my apps organized into categories is extremely important to me. Just today I needed to find an app that I didn't remember the name of and I rarely use, but when I need it, it is important to me. Just to see if I could find it without launchpad, I scanned my applications folder and I couldn't find it. I went to launchpad and to the category I knew it was in and it was right there, easy for me to find. Please don't take away our organization options.
It appears iOS only comes with low-quality voices installed.
iOS requires the user to go into Settings to download higher-quality voices for use with AVSpeechUtterance.
There doesn't seem to be any API that can be used to make this process easier for the app user.
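The closest we have come is checking which voices the user has already installed and preferring an enhanced one when present (a sketch using the public AVSpeechSynthesisVoice API):

    import AVFoundation

    // Sketch: prefer an enhanced voice for a language if the user has already
    // downloaded one in Settings; otherwise fall back to whatever is installed.
    func bestInstalledVoice(for language: String) -> AVSpeechSynthesisVoice? {
        let candidates = AVSpeechSynthesisVoice.speechVoices().filter { $0.language == language }
        return candidates.first { $0.quality == .enhanced } ?? candidates.first
    }

    let utterance = AVSpeechUtterance(string: "Hello")
    utterance.voice = bestInstalledVoice(for: "en-US")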
Is there a way / an API that would allow an app to download and use a higher-quality voice?
Will Apple ever install higher-quality voices by default?
We really want to use the text-to-speech API in iOS, but the very high amount of user friction required to get high-quality voices is stopping us. I would appreciate a response.
Thanks
I use the floating keyboard extensively. Often it starts to jump up and down. I have to pinch out to see the large version and pinch in again to restore the floating version. Sometimes just touching a key sets it off; sometimes returning to a window from which the keyboard is displayed starts the issue. This was never a problem in iPadOS 18.
I would like to enable the option to resize windows with the Apple Pencil Pro. I tried, but I see that this feature is not enabled.
Topic:
Accessibility & Inclusion
SubTopic:
General
Japanese “Hattori” TTS voice missing from Settings > General > Read & Speak > Voices > Japanese on iOS 26
Steps: Open the path above → “Hattori” is not listed and cannot be downloaded
Expected: Hattori is available to download and select
Actual: Hattori is absent from the catalog
Regression: Was available on iOS 18.x on the same device
Hello,
I am working on a Braille keyboard using the HID approach.
Currently the device works with iPhone 11 and SE 3.
However, when tested on an iPhone 6s with iOS 15, although the device can be connected and is recognized as a Braille device in the VoiceOver screen, the phone shows no response to key-press reports.
Is there any requirement, such as in the HID descriptor, for iPhone 6s support of Braille devices? If the iPhone 6s does not support such devices, what are the minimum system requirements?
Thank you!
On recent versions of macOS, when a window is being shared (via the system screen-capture APIs), the OS sometimes shows a small "shared window" badge in the title bar.
I’ve noticed that this indicator is not consistent:
For some windows, the badge reliably appears when they are being shared.
For other windows, the badge never appears, even though the window is actively shared.
In particular, windows that use a standard system title bar seem to show the indicator more often, while windows with custom-drawn or non-standard chrome do not.
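To make the comparison concrete, these are the two kinds of window setups being compared (an illustrative sketch, not the actual app code):

    import AppKit

    // Case 1: standard system title bar; the shared-window badge usually appears here.
    let standardWindow = NSWindow(
        contentRect: NSRect(x: 0, y: 0, width: 400, height: 300),
        styleMask: [.titled, .closable, .resizable],
        backing: .buffered,
        defer: false)

    // Case 2: custom chrome (borderless, content drawn edge to edge);
    // here the indicator never seems to appear while the window is shared.
    let customWindow = NSWindow(
        contentRect: NSRect(x: 0, y: 0, width: 400, height: 300),
        styleMask: [.borderless],
        backing: .buffered,
        defer: false)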
My questions are:
What are the exact conditions under which macOS decides to draw the “shared window” indicator in a window’s title bar?
Is this strictly tied to certain NSWindow styles or masks (e.g. titled vs borderless)?
Is there any API or flag I can use to detect programmatically whether a given window will display this system indicator when shared?
Topic:
Accessibility & Inclusion
SubTopic:
General
Why did the screen recorder button disappear? It cannot be found anywhere.
Topic:
Accessibility & Inclusion
SubTopic:
General
I created a desktop app for Mac using Xojo. The app has a controller in the main window and displays advertisements and notices on a connected external display.
I'm currently connecting my 24-inch iMac to a REGZA-55M550M via AirPlay and displaying video from the iMac on the REGZA, but the connection occasionally drops out. Yesterday, the connection dropped about 3.5 hours after connecting. I do have other apps running on the iMac, but nothing that would put a strain on the network or memory.
Does AirPlay connection to non-Apple products become unstable over long periods of time?
Hey folks, I would like to ask for help on this topic:
I think this is exactly the same problem as "Combobox not working with VoiceOver after…" on Apple Community.
VoiceOver also breaks the combobox from the official W3C ARIA Authoring Practices page: https://www.w3.org/WAI/ARIA/apg/patterns/combobox/examples/combobox-autocomplete-list/. When VoiceOver is turned off, I can use the up/down arrows to go through the menu items in the dropdown, but when VoiceOver is turned on, the up/down arrows cannot access the dropdown menu items.
Is there an official tutorial on how to control this using VoiceOver?
Kind regards,
Jakub
Topic:
Accessibility & Inclusion
SubTopic:
General