accessibilityUserInputLabels works fine on every view I've tried it on, meaning the control can be toggled with the provided alternative names when using Voice Control.
When setting this property on a UIBarButtonItem, though, Voice Control seems to ignore the alternative names provided via accessibilityUserInputLabels. For comparison, accessibilityLabel works perfectly when set on a UIBarButtonItem.
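A minimal repro of what I'm seeing (sketch; target/action omitted for brevity):

```swift
import UIKit

// Sketch, inside a view controller: the label is honored by Voice Control,
// but the input labels appear to be ignored.
let item = UIBarButtonItem(title: "Add", style: .plain, target: nil, action: nil)
item.accessibilityLabel = "Add item"                 // works with Voice Control
item.accessibilityUserInputLabels = ["Add", "Plus"]  // seemingly ignored
navigationItem.rightBarButtonItem = item
```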
Is anyone facing the same issue?
Using Xcode 16.0 (16A242) on iOS 18
Please refer to Feedback report: FB19701007
The Personal Voice file created in English changes to either Spanish or Chinese and no longer works properly. This has been happening since Beta 1 of iOS/iPadOS 26.
I have been unable to pinpoint what causes this to occur; possibly downloading foreign-language voices to the device, or using different voices via AVSpeechSynthesizer.
I run an app from Xcode on my device that prints info about the installed voices to the console. Initially, after creating the voice, here is the output:
Voice Identifier: com.apple.speech.personalvoice.16173F8D-DFB0-4024-98CC-69D965FD96A4
Language: en-US
Then I hear a Spanish accent and find this:
Voice Identifier: com.apple.speech.personalvoice.16173F8D-DFB0-4024-98CC-69D965FD96A4
Language: es-MX
Currently it isn't working and here is the output:
Voice Identifier: com.apple.speech.personalvoice.16173F8D-DFB0-4024-98CC-69D965FD96A4
Language: zh-CN
Note that the voice file on all three above is the same. No matter what the user does, the created voice file should never be able to change languages. On my test devices I reset them all by erasing all content and settings and creating a new English Personal Voice and the issue persists.
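For reference, the diagnostic loop that produced the outputs above is roughly this (a sketch using AVSpeechSynthesisVoice.speechVoices(); filtering by voiceTraits is my assumption of the simplest form):

```swift
import AVFoundation

// Sketch: print identifier and language for every installed Personal Voice.
for voice in AVSpeechSynthesisVoice.speechVoices()
    where voice.voiceTraits.contains(.isPersonalVoice) {
    print("Voice Identifier: \(voice.identifier)")
    print("Language: \(voice.language)")
}
```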
A side issue: the "Share Across Devices" toggle doesn't stay off when turned off. I tried disabling sharing to see if that could be the cause, but the toggle turns back on automatically; it won't remain off.
Hi Apple Developer Community,
I'm experiencing persistent issues with the Apple Search Ads API since this morning (August 16, 2025). My application keeps getting "Service Unavailable" errors when trying to connect to the API endpoints.
Error Details:
Error Message: "Service Unavailable"
HTTP Status: 503/500
API Endpoint: https://api.searchads.apple.com/api/v5/*
Frequency: Consistent failures since August 16, 2025
What I've Tried:
Verified API credentials and certificates are valid
Tested multiple API endpoints
Checked network connectivity
The API was working fine until yesterday, and no changes were made to our implementation. Any insights or updates from the community would be greatly appreciated.
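In the meantime, since 503s are often transient, I've wrapped our requests in a simple retry with exponential backoff (sketch; request construction and status handling are simplified). So far retries haven't helped, which is consistent with a service-side problem:

```swift
import Foundation

// Sketch: retry a request on 500/503 with exponential backoff.
func fetchWithRetry(_ request: URLRequest, maxAttempts: Int = 5) async throws -> Data {
    var delay: UInt64 = 1_000_000_000 // 1 second, in nanoseconds
    for attempt in 1...maxAttempts {
        let (data, response) = try await URLSession.shared.data(for: request)
        let status = (response as? HTTPURLResponse)?.statusCode ?? 0
        if (200..<300).contains(status) { return data }
        // Give up on non-transient errors or once attempts are exhausted.
        guard status == 503 || status == 500, attempt < maxAttempts else {
            throw URLError(.badServerResponse)
        }
        try await Task.sleep(nanoseconds: delay)
        delay *= 2
    }
    throw URLError(.badServerResponse)
}
```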
Thanks in advance for your help!
In our iOS app we use NEAppPushSession and, upon receiving a notification through NEAppPushDelegate, display the CallKit incoming-call screen. We are facing the following problem.
8/13
The error below appeared frequently in the logs.
We ran our notification-reception test about 120 times, and this error occurred the entire time.
error 2025-08-14 11:27:06.793073 +0900 nesessionmanager NESMAppPushSession[SimplePushDefaultConfiguration:7B7218F3-94B5-4AE5-9B9E-94E176694D02] failed to report incoming call to CallKit, error: Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service named com.apple.callkit.networkextension.messagecontrollerhost was invalidated from this process." UserInfo={NSDebugDescription=The connection to service named com.apple.callkit.networkextension.messagecontrollerhost was invalidated from this process.}
After this error log appeared repeatedly, the CallKit incoming-call screen stopped being displayed.
However, notification monitoring still appears to have started.
8/14, 15 hours later
Perhaps because time had passed, notifications started arriving again.
However, when we reran the reception test, the same error log appeared.
After about 120 more test notifications, the error log again appeared repeatedly and the CallKit incoming-call screen stopped being displayed.
But once again, notification monitoring appears to have started.
8/15, 15 hours later
Today we could not receive notifications even after 15 hours.
Yet, as before, notification monitoring appears to have started.
Restarting the iPhone, cleaning the Xcode build, and uninstalling and reinstalling the app did not bring notifications back.
Also, strangely, devices other than the one where the no-notification issue occurred are likewise unable to receive notifications.
Other notifications are received; only our own notifications delivered via NEAppPushManager cannot be received.
Questions:
What should we do to make notifications come through again?
Is this failure caused by emitting too many 4099 errors?
What is causing the 4099 errors?
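For reference, the delegate path where the failure occurs looks roughly like this in our app (a simplified sketch; PushHandler and the provider setup are illustrative):

```swift
import NetworkExtension
import CallKit

// Simplified sketch: report an incoming call to CallKit when the
// local push session delivers a call payload. Names are illustrative.
final class PushHandler: NSObject, NEAppPushDelegate {
    private let provider: CXProvider // configured with CXProviderConfiguration elsewhere

    init(provider: CXProvider) {
        self.provider = provider
        super.init()
    }

    func appPushManager(_ manager: NEAppPushManager,
                        didReceiveIncomingCallWithUserInfo userInfo: [AnyHashable: Any]) {
        let update = CXCallUpdate()
        update.remoteHandle = CXHandle(type: .generic,
                                       value: userInfo["caller"] as? String ?? "Unknown")
        // This is the report that appears to fail with the 4099 XPC error.
        provider.reportNewIncomingCall(with: UUID(), update: update) { error in
            if let error { print("reportNewIncomingCall failed: \(error)") }
        }
    }
}
```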
I’m developing an iOS app, and I’ve noticed that when the user enables Accessibility → Display & Text Size → Color Filters → Grayscale, my app icon loses a lot of visual contrast. The original colored version looks fine, but in grayscale it appears “flat” and harder to distinguish, unlike a pure black-and-white design.
What I want to achieve:
Ensure the app icon remains visually clear and high-contrast even when iOS renders it in grayscale.
Ideally, provide an alternate “high-contrast” app icon version when grayscale mode is enabled.
What I’ve tried:
Increased color contrast in the original icon design.
Added outlines and stronger shapes.
Tested with grayscale filters in design tools.
Researched Asset Catalog and alternate icons, but found no documented API to detect or respond to grayscale mode.
Questions:
Is there any API in iOS that allows detecting when the system is in grayscale mode so that I can programmatically switch to an alternate app icon?
If not, are there Apple-recommended best practices for designing app icons that still look clear in grayscale?
Are there any accessibility guidelines specifically addressing icon design for grayscale or color-blind modes?
Additional info:
iOS version tested: iOS 17.5
Development in Swift + SwiftUI, using Asset Catalog for icons.
I am aware that iOS supports alternate icons via setAlternateIconName, but I haven’t found a trigger for grayscale mode.
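For context, the switch I have in mind would look roughly like this. UIAccessibility.isGrayscaleEnabled looks related, though I haven't confirmed it reflects the Color Filters setting specifically, and "AppIcon-HighContrast" is a hypothetical alternate icon name from my asset catalog:

```swift
import UIKit

// Sketch only: switch to a high-contrast alternate icon when grayscale is on.
// Whether isGrayscaleEnabled tracks Color Filters > Grayscale is unverified.
func updateIconForGrayscale() {
    let wantsHighContrast = UIAccessibility.isGrayscaleEnabled
    UIApplication.shared.setAlternateIconName(wantsHighContrast ? "AppIcon-HighContrast" : nil) { error in
        if let error { print("Icon switch failed: \(error)") }
    }
}
```

Presumably one would call this from an observer of UIAccessibility.grayscaleStatusDidChangeNotification, if that notification indeed covers this case.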
My team has a robust digital accessibility program and processes for WCAG conformance in our apps. Because of this, there are definitely accessibility defects that get caught and addressed in order of impact and business priority like any other bug. Obviously we want to aim for 100% accessibility for our users, but it's a continual work in progress as new enhancements or changes are released.
I'm stuck on the appropriate measurement to indicate support. If we have 50 common tasks and the 10 most central tasks are solid, but some supporting (yet also common) tasks have a contrast failure or a missing accessibilityLabel, does that make the whole app unsupportive of the feature? If "completing the task" is the rubric, there is a whole range of interpretations of that.
In a complex app, I anticipate that a group like ours will have strong support for many of the Accessibility Nutrition Labels accessibility features across tasks and devices, but realistically never be 100% free of defects for a given Apple Accessibility feature, even among core tasks.
As I consider the next steps for Nutrition Labels, I do not see anything in the documentation that gives a sort of baseline or measurement for inclusion. We plan to test all steps to complete a task, and log defects accordingly with an assigned timeline for fixing them (as would be true for functional defects).
The Text view seems to automatically prevent orphaned words on screen, barring exceptional circumstances such as the font size being too large.
I couldn't find any documentation on this behaviour or how to configure it, and I would also be very interested in how it's implemented.
Thanks!
I have subscribed to the Apple Developer Program, but it's already been a day and my account still shows "is not enrolled in the Apple Developer Program."
I made a (very simple) custom tab bar in SwiftUI. It's simply an HStack containing two buttons. These buttons control the selection of a paged TabView. This works well, but in VoiceOver they don't behave like the bottom tab bar or e.g. a segmented picker. Specifically, VoiceOver does not say something like "tab one of two" when the first button is focused.
According to my research, in UIKit this can be accomplished by giving the container view the accessibility trait tabBar, hiding it as an accessibility element, and giving it the accessibility container type semanticGroup.
In SwiftUI, there is also the trait isTabBar, but that does not seem to have any impact for VoiceOver. I don't see an equivalent of semanticGroup in SwiftUI. I tried accessibilityElement(children: .contain) but that also does not seem to have any impact.
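Here's a trimmed version of what I tried (names illustrative):

```swift
import SwiftUI

// Trimmed repro: two buttons driving a paged TabView's selection.
// Neither the trait nor the container modifier changes what VoiceOver announces.
struct CustomTabBar: View {
    @Binding var selection: Int

    var body: some View {
        HStack {
            Button("First") { selection = 0 }
            Button("Second") { selection = 1 }
        }
        .accessibilityElement(children: .contain)
        .accessibilityAddTraits(.isTabBar) // no audible effect in VoiceOver
    }
}
```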
So, is there any way in SwiftUI to make a button behave like a tab-button in VoiceOver? And how is SwiftUI's isTabBar accessibility trait supposed to be used?
Hi,
I've wrapped AVRoutePickerView in SwiftUI using pretty much the code given here, with a few changes:
```swift
func makeUIView(context: Context) -> UIView {
    let routePickerView = AVRoutePickerView()

    // Configure the button's color.
    //routePickerView.delegate = context.coordinator
    //routePickerView.backgroundColor = .secondarySystemBackground
    routePickerView.tintColor = .accent
    routePickerView.activeTintColor = .accent

    // Indicate whether your app prefers video content.
    routePickerView.prioritizesVideoDevices = false

    return routePickerView
}
```
I commented out routePickerView.delegate = context.coordinator because it doesn't compile; context.coordinator is of type Void and I'm not sure how to fix that. I'm not sure if that has anything to do with the issue.
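(In case it's related: I suspect context.coordinator is Void because I haven't implemented makeCoordinator(). I believe adding something like the sketch below would fix the compile error, though I haven't verified it changes the VoiceOver behavior.)

```swift
// Sketch: provide a coordinator so context.coordinator is no longer Void.
func makeCoordinator() -> Coordinator {
    Coordinator()
}

final class Coordinator: NSObject, AVRoutePickerViewDelegate {
    // AVRoutePickerViewDelegate's methods are optional; add them as needed,
    // e.g. routePickerViewWillBeginPresentingRoutes(_:).
}
```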
Anyway, this works fine without VoiceOver; if I tap the button, I get the AirPlay popover. But in VoiceOver, if I select the button and double-tap, nothing happens… it just reads the button's accessibilityLabel again. How can I get the AirPlay popover to show in VoiceOver?
I'm looking into how to programmatically control color filters in the Accessibility settings under "System Settings" -> "Accessibility" -> "Color Filters"--in particular the "Intensity" and "Filter type" settings.
As far as I have gathered, changing this setting can only be accomplished using the CoreGraphics APIs or the Accessibility APIs (I've poked around GitHub and Stack Overflow, and queried some LLMs), but there doesn't seem to be a clear-cut example of doing this with public-facing APIs, without lifting source code wholesale from another project or using private APIs.
My goal is to overlay a color filter at either a per-application or system level to help with accessibility. If there's a way to overlay this capability on an application-by-application basis as a third-party developer, that would be the most ideal scenario. For example, modifying the look and feel/UX for Launchpad, Photos, etc, as a third-party developer without accessing the source code of the application that I'm modifying the look/feel for (with appropriate user consent of course).
Hello,
I’m in the process of enrolling my business (Carzo Rent A Car, Prishtine, Kosovo) in the Apple Developer Program, but I have been waiting for my D-U-N-S number to be issued.
I submitted the request to Dun & Bradstreet on July 28, 2025 (Case #9142648) and have only received a system-generated email with a tracking ID (#9086421). There has been no further update.
My questions are:
Is there a way for Apple to expedite or provisionally approve my enrollment while the D-U-N-S number is pending?
How long does Apple typically wait for D&B updates before the enrollment is affected?
Are there any alternative steps I can take to avoid further delays?
Thank you for your guidance.
I'm encountering an issue related to BLE device discovery on iOS.
I have a BLE peripheral device that I initially connected to using an iOS device. After this connection, the BLE device's advertised name was programmatically changed by the peripheral. Now, when I try to scan for this device using other iOS devices, it does not appear in the scan results in most apps — including nRF Connect and our own custom BLE app that uses CoreBluetooth.
A few observations:
The device is definitely powered on and advertising (confirmed via Android).
The name change is reflected correctly on Android and on the iOS device that originally connected to it.
Other iOS devices no longer see the device in their scan list.
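For reference, the core of our scan code looks like this (simplified sketch). We log both peripheral.name, which I understand iOS may cache, and the advertised local name from the advertisement packet:

```swift
import CoreBluetooth

// Simplified sketch of our scanner: logs cached vs. advertised names.
final class Scanner: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: nil)
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        let advertised = advertisementData[CBAdvertisementDataLocalNameKey] as? String
        print("name: \(peripheral.name ?? "-"), advertised: \(advertised ?? "-")")
    }
}
```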
I have been working to remediate PDFs for a client. The documents/forms have many tables. When I correctly tag a table, using Foxit Editor Pro, it works beautifully on a PC reading it with NVDA. On Mac using VoiceOver the table isn't accessible. It doesn't matter if I try to read it in Adobe Acrobat, Foxit, or Preview. The reader often says the document is empty, omits column headers, and/or associates the wrong header with the column data.
The documents have essentially the same coding behind them as for the web. Why is it they perform so well on a PC with NVDA, but so poorly with Mac VoiceOver? I am a Quality Assurance Specialist. I review websites, apps, and documents for accessibility. Why can't I do my job using only my Mac system?
As a Mac user, it frustrates me that I can't use my preferred system for checking documents to see if they are accessible because VoiceOver doesn't work well. I actually have to recommend to my clients and their customers that they need to use a PC with NVDA or Jaws for these documents to be able to get all the information. Unfortunately, most people aren't able to have, or maintain, both systems. Overall, Mac products are very high quality. This, and other issues with VoiceOver, seems to be a large gap in Apple's offerings and functionality.
I would appreciate a human response to the original email I sent about this on 7/30/2025.
Hello
I tried implementing ASAM (Autonomous Single App Mode) for macOS per Apple's guidelines, with the configuration profile mentioned here, but didn't have any success.
Apple then suggested using requestGuidedAccessSession on macOS, but that is only supported through Mac Catalyst, and it didn't work either, even with valid configuration profiles.
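For reference, the call I tried in the Catalyst build (sketch):

```swift
import UIKit

// Sketch: request an Autonomous Single App Mode (Guided Access) session.
// As I understand it, this should succeed only when the configuration
// profile on the machine allows it.
UIAccessibility.requestGuidedAccessSession(enabled: true) { success in
    print("Guided Access session started: \(success)")
}
```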
Did anyone get ASAM mode to work without the assessment entitlement?
I am testing the accessibility feature available in the Settings app called "Speak Screen". The help text in the Setting app states that swiping down with two fingers will cause the screen content to be spoken. However, I've been unable to get this feature to work. Every time I try the double finger swipe down, it behaves the same as the single finger swipe down gesture. Usually this manifests as making scroll views bounce.
I've tried toggling the feature on and off, turning off Reachability, and rebooting my phone, but I can't get the Speak Screen gesture to work. If I access the feature from the Speech Controller button, the screen's content is spoken as expected, so I know the feature is enabled. It's just the gesture that doesn't work.
Is there something else I need to do to get this gesture to work? I don't want to tell my users to turn this feature on if I can't verify that the gesture will work with my app.
I have users who need to be able to hear the content of SwiftUI Text views. I have specified the .textSelection(.enabled) modifier for the text views. Adding this modifier causes a "copy" option to appear on long press, but it doesn't enable the visible selection of text, nor does it provide the "Speak" menu item that UIKit allows on text selection.
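For concreteness, the affected view looks roughly like this (sketch; ArticleView and bodyText are placeholders):

```swift
import SwiftUI

// Sketch: selection is "enabled", but long press only offers Copy,
// with no visible selection and no Speak menu item.
struct ArticleView: View {
    let bodyText: String

    var body: some View {
        Text(bodyText)
            .textSelection(.enabled)
    }
}
```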
Is the "Speak Selection" accessibility feature broken for SwiftUI Text views? I've found that there's another accessibility feature that does work (enabling the Speech Controller button for "Speak Screen"). Do I need to tell my users that Apple is deprecating the "Speak Selection" accessibility feature, and that they need to use the Speech Controller instead? Or is there something else I can do to my SwiftUI to get that feature to work?
Hi,
I want to detect if there is a fullscreen window on each screen.
_AXUIElementGetWindow and kAXFullscreenAttribute methods work, but I have to be in a non-sandbox environment to use them.
Is there any other way that also works? I don't think it's enough to judge fullscreen status by comparing the window size to the screen size, since that fails on a MacBook with a notch, or when the menu bar is set to auto-hide.
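For reference, the non-sandboxed check I'm using looks roughly like this (sketch; as far as I can tell, "AXFullScreen" isn't declared as a constant in the public headers):

```swift
import ApplicationServices

// Sketch: read the "AXFullScreen" attribute from a window element
// obtained elsewhere (e.g. via kAXWindowsAttribute on an app element).
func isFullscreen(_ window: AXUIElement) -> Bool {
    var value: CFTypeRef?
    let status = AXUIElementCopyAttributeValue(window, "AXFullScreen" as CFString, &value)
    return status == .success && ((value as? Bool) ?? false)
}
```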
Thanks.
This was working a few days ago, but it has since stopped and I can't figure out why. I've tried resetting TCC, double-checking my entitlements, restarting, deleting and rebuilding, and nothing works.
My app is a sandboxed macOS SwiftUI LSUIElement app that, when invoked, checks to see if the frontmost process is Terminal, then tries to get the frontmost window’s title.
```swift
func getFrontmostWindowTitle() throws -> String? {
    let trusted = AXIsProcessTrusted()
    print("getFrontmostWindowTitle AX trusted: \(trusted)")

    guard let app = NSWorkspace.shared.frontmostApplication else { return nil }
    let appElement = AXUIElementCreateApplication(app.processIdentifier)

    var focusedWindow: AnyObject?
    let status = AXUIElementCopyAttributeValue(appElement, kAXFocusedWindowAttribute as CFString, &focusedWindow)
    guard status == .success, let window = focusedWindow else {
        if status == .cannotComplete {
            throw Errors.needAccessibilityPermission
        }
        return nil
    }

    var title: AnyObject?
    let titleStatus = AXUIElementCopyAttributeValue(window as! AXUIElement, kAXTitleAttribute as CFString, &title)
    guard titleStatus == .success else { return nil }
    return title as? String
}
```
I recently renamed the app, but the Bundle ID has not yet changed. I have com.apple.security.accessibility set to YES in the Entitlements file (although I had to add it manually), and an NSAccessibilityUsageDescription string set in Info.plist.
The first time I ran this, macOS nicely prompted for permission. Now it won't do that, even when I use AXIsProcessTrustedWithOptions() to try to force it.
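For reference, this is how I'm trying to force the prompt (sketch):

```swift
// Sketch: explicitly request the Accessibility permission prompt.
let options = [kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String: true] as CFDictionary
let trusted = AXIsProcessTrustedWithOptions(options)
print("AXIsProcessTrustedWithOptions: \(trusted)")
```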
If I use tccutil to reset accessibility and apple events, it still doesn't prompt. If I drag my app from the build products folder to System Settings, it gets added to the system TCC DB (not the user DB). It shows an auth value of 2 for my app:
% sudo sqlite3 "/Library/Application Support/com.apple.TCC/TCC.db" "SELECT client,auth_value FROM access WHERE service='kTCCServiceAccessibility' OR service='kTCCServiceAppleEvents';"
com.latencyzero.<redacted>|2
<redacted>
I'm at a loss as to what went wrong. I proved out the concept earlier and it worked, and have since spent a lot of time enhancing and polishing the app, and now things aren't working and I'm starting to worry.
Hi, I applied for the CarPlay communication capability, but received a message that I already have the driving task app entitlement.
After that, I applied one more time, but there has been no reply.
I do not have the com.apple.developer.carplay-communication capability; does that mean I cannot apply for it?
What should I do next to get this capability?
Thanks