My game app is text-based interactive fiction, containing no audio/video content, making captions unnecessary. Our game is completely accessible to deaf users.
Despite this, in the Accessibility Nutrition Label, I'm only able to leave the "Captions" box checked or unchecked. Leaving it unchecked would leave deaf players with the wrong impression that they can't enjoy our game. Leaving it checked would imply that we do have A/V content with captions included.
The WWDC video on this topic, https://developer.apple.com/videos/play/wwdc2025/224/, says:
After we completed common tasks, we realized our app doesn’t have any video or audio only content. In this case, we aren’t going to indicate that Landmarks supports Captions. That's okay. This accurately describes the features that people will expect to be available while using the app.
Maybe that's "OK," but I wish the form allowed me to say "This app doesn't contain audio/video content."
Our company enrolled in the Apple Developer Program as an organization in July 2024. Everything was fine for several months, but in early January 2025, our developer noticed that the certificates were missing. When we logged into our developer account, we were shocked to see a page prompting us to “Enroll Today”—as if we had never joined in the first place.
Clicking the enrollment button led us to an error page stating we cannot enroll.
We immediately reached out to Apple Developer Support via email, but despite multiple attempts, we received no response. Strangely, our apps remain live on the App Store, App Store Connect functions as usual, and we continue receiving payments every month. However, we are completely blocked from developing and releasing updates.
Today, I managed to reach Apple by phone. After being transferred to a senior representative, I was told they couldn’t tell me why this was happening. They only confirmed that a request had been made and that I should “wait.” That’s it—no explanation, no timeline, nothing. While it’s somewhat reassuring that they acknowledge the issue, I’ve already seen other developers with the same problem go unanswered for months.
My suspicion? This account might be linked to an individual developer account from way back in 2015 when Apple’s registration process was far less strict. Could that be the issue? No idea—because Apple won’t say a word.
Meanwhile, several bugs have surfaced in both of our apps, and customers are waiting for updates. If there's still no response from Apple, I have no choice but to register a new account, purely to continue supporting our users.
CASE ID: 102508598957
Hi guys, can anyone help me? I can't pair my Apple Watch Series 1 with my iPhone 15.
Updated to iOS 26 beta and now the TV Remote app in Control Center won't open. I've tried the following:
Restart phone
Remove shortcut and re-add
Can't find any other troubleshooting methods for this issue online, so I'm guessing it's a new problem.
We have an app under development which allows musicians to unlock the contact details of people who posted about an upcoming event. The musician pays a fee to unlock these contact details.
Both the musician & the post owner are registered users. We will reveal the same contact info that the post owner used for account signup verification.
Questions:
Is this allowed, given that we obtain consent to share contact info with other users and clearly state this in our privacy policy?
If yes, will we have to use App Store in-app purchase to facilitate this transaction, or are we free to use a payment processor such as Stripe?
After the iOS 26 update, the colors on my new iPad Pro M4 have become extremely dull, almost like those on a very old device. The screen brightness is significantly reduced, and it's now difficult to see UI elements clearly. This is very disappointing considering the device's high display quality before the update. Please advise if this is a known issue or if there's a fix.
I have the following method to insert @mentions into a text view:
func insertMention(user: Token, at range: NSRange) {
    // Render the token as an image and wrap it in a text attachment.
    let tokenImage = renderMentionToken(text: "@\(user.username)")
    let attachment = NSTextAttachment()
    attachment.image = tokenImage
    attachment.bounds = CGRect(x: 0, y: -3, width: tokenImage.size.width, height: tokenImage.size.height)
    attachment.accessibilityLabel = user.username
    attachment.accessibilityHint = "Mention of \(user.username)"

    // Tag the single attachment character with custom attribute keys (defined elsewhere).
    let attachmentString = NSMutableAttributedString(attributedString: NSAttributedString(attachment: attachment))
    attachmentString.addAttribute(.TokenID, value: user.id, range: NSRange(location: 0, length: 1))
    attachmentString.addAttribute(.Tokenname, value: user.username, range: NSRange(location: 0, length: 1))

    // Replace the typed "@query" range with the token, then insert a space
    // right after it (appending at the very end would misplace the space for
    // mid-text insertions and break the caret math below).
    let mutableText = NSMutableAttributedString(attributedString: textView.attributedText)
    mutableText.replaceCharacters(in: range, with: attachmentString)
    mutableText.insert(NSAttributedString(string: " "), at: range.location + 1)
    textView.attributedText = mutableText

    // Move the caret past the token and the trailing space.
    textView.selectedRange = NSRange(location: range.location + 2, length: 0)
    mentionRange = nil
    tableView.isHidden = true
}
When I use Xcode's Accessibility Inspector to inspect the text input, the inserted token is not read by the inspector; instead, a whitespace is shown for the token. I want to set the accessibility label to the string content of the NSTextAttachment. How?
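One possible direction (a sketch, not a confirmed fix; the subclass name is mine): VoiceOver speaks a text view's accessibilityValue, so a UITextView subclass can substitute each attachment placeholder character with the attachment's accessibilityLabel:

import UIKit

// Sketch of a workaround: build the spoken value by replacing each
// NSTextAttachment run with its accessibilityLabel.
class MentionTextView: UITextView {
    override var accessibilityValue: String? {
        get {
            guard let attributed = attributedText else { return super.accessibilityValue }
            let spoken = NSMutableString()
            let fullRange = NSRange(location: 0, length: attributed.length)
            attributed.enumerateAttributes(in: fullRange, options: []) { attrs, range, _ in
                if let attachment = attrs[.attachment] as? NSTextAttachment,
                   let label = attachment.accessibilityLabel {
                    spoken.append(label) // speak the token text, not the placeholder
                } else {
                    spoken.append(attributed.attributedSubstring(from: range).string)
                }
            }
            return spoken as String
        }
        set { super.accessibilityValue = newValue }
    }
}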
Could you make Visual Intelligence available for the Action button on the iPhone 16e? It could be limited to iPhones with the A18 and future-generation Apple chips.
The AVSpeechSynthesizer on some iOS 18 devices has a bug: it always reads Chinese content, such as:
AVSpeechUtterance(string: "中文") // Any Chinese Content
in the dialect specified by:
Settings > Accessibility > Spoken Content > Voices > Chinese > Spoken Language
instead of the dialect that I specified in AVSpeechUtterance.voice:
AVSpeechSynthesisVoice(language: "zh-HK") // Cantonese
AVSpeechSynthesisVoice(language: "zh-TW") // Mandarin
However, setting the Chinese dialect of AVSpeechSynthesisVoice via "zh-HK" or "zh-TW" worked on iOS 17 and below.
My app has a feature that requires reading sentences in Mandarin followed by Cantonese, i.e., both dialects are needed every time. Therefore, setting the dialect in Spoken Language in Settings is not a workaround that makes my app function correctly on iOS 18.
Further to the above, I've also discovered that if iOS 18 (in my case, 18.5 was tested) is freshly installed (not upgraded from iOS 17 or below, nor restored from a backup after the fresh installation), the bug above does not happen.
However, if the device was upgraded from iOS 17 or below, or a backup was restored (in my case, I freshly installed iOS 18.5 on a new iPhone and then restored a backup from another iPhone on iOS 16.2), the bug above happens.
This bug puzzled me because I need both dialects of Chinese to be read aloud one after the other, but as reported by many users on most iOS 18 devices (since a fresh installation of the latest iOS without upgrading or restoring is uncommon nowadays), my app reads Cantonese twice or Mandarin twice (depending on Spoken Language in Settings). It is this iOS 18 bug that makes my app unable to perform the expected behavior.
Would Apple developers look into this and advise whether there is any possible workaround in app code to overcome this bug, or please fix it in an iOS 18 update. Thank you.
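Not a confirmed fix, but one workaround that might be worth trying in app code: pick a concrete installed voice from AVSpeechSynthesisVoice.speechVoices() for each dialect, rather than constructing the voice from a language code alone. A hedged sketch:

import AVFoundation

// Sketch: select an installed voice object matching the wanted dialect and
// assign it explicitly to each utterance.
func speak(_ text: String, dialect: String, with synthesizer: AVSpeechSynthesizer) {
    let voice = AVSpeechSynthesisVoice.speechVoices()
        .first { $0.language == dialect }              // e.g. "zh-HK" or "zh-TW"
        ?? AVSpeechSynthesisVoice(language: dialect)   // fall back to the old path
    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = voice
    synthesizer.speak(utterance)
}

// Usage: read the same sentence in Mandarin, then Cantonese.
// let synthesizer = AVSpeechSynthesizer()
// speak("中文", dialect: "zh-TW", with: synthesizer)
// speak("中文", dialect: "zh-HK", with: synthesizer)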
Hey everyone,
I've been thinking about a truly innovative way to enhance iPhone battery life and user convenience, drawing inspiration from kinetic energy harvesting. What if we could have a clock display on the main iPhone screen that's powered purely by user motion, and activates only when you look at it, without touching your main battery?
The Core Idea
Imagine this:
Kinetic Energy Harvesting: Your iPhone would have a tiny, integrated kinetic energy generator. This generator would capture the energy from your everyday movements – walking, picking up the phone, putting it in your pocket.
Independent Power Source: This harvested energy would be stored in a small, dedicated capacitor or micro-battery, completely separate from your iPhone's main battery.
Accelerometer-Activated Display: Instead of relying on power-hungry facial recognition, the phone's accelerometer (a very low-power sensor) would detect specific "raise to wake" or "tap to look" gestures.
On-Demand, Ultra-Low Power Clock: Only when the accelerometer detects one of these specific gestures would the stored kinetic energy be used to illuminate just the necessary pixels on the main OLED/AMOLED screen to display the time. The rest of the screen stays completely black (consuming no power on OLED).
Automatic Shut-Off: As soon as the gesture ends or the phone is put down, the clock display would turn off, conserving the limited harvested energy.
Why This Matters
This isn't just a cool gimmick; it offers significant benefits:
True Battery Independence: Get the time at a glance, anytime, without touching your main battery or even the power button. This means more main battery life for apps, calls, and everything else.
Ultimate Convenience: A "magical" interaction – just pick up your phone, and the time instantly appears. No taps, no button presses.
Sustainable & Innovative: Showcases practical "energy harvesting" in a consumer device, pushing boundaries for self-sufficient tech.
Extreme Energy Efficiency: By using a low-power accelerometer as the trigger and only lighting a few pixels on demand, the system is designed for minimal power draw, making kinetic power a viable source.
This concept combines existing low-power sensing (accelerometer), efficient display technology (OLED/AMOLED's true blacks), and cutting-edge energy harvesting, creating a genuinely innovative user experience.
I'm currently testing the announce notifications feature, and I can't find out how to make Siri read aloud the local currency instead of dollars.
My locale is es-CL (Chile). It uses the currency symbol $, which is read locally as pesos (or Chilean pesos), and the number 5000.1 is written as 5.000,1.
This is the notification content
let content = UNMutableNotificationContent()
content.body = "¡Has recibido un pago por $5.000!"
Siri reads it aloud as "¡Has recibido un pago por 5.000 Dolares!" which translates to "You have received a payment for 5,000 Dollars", instead of the expected "¡Has recibido un pago por 5.000 Pesos!" -> "You have received a payment for 5,000 Pesos"
I've tried changing the development region of the app, interpolating the string with NumberFormatter.localizedString(from: 5000, number: .currency), and with other styles (.currencyAccounting, .currencyISOCode, and .currencyPlural), without good results. The last one seems to work, but it's not ideal: it outputs "5.000 pesos chilenos", which gets read as "5 pesos chilenos", which is not the correct amount (a bug), as if the device weren't in Chile; and I personally prefer a symbol instead of words.
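For reference, a minimal sketch of those formatting attempts (values hardcoded; run under the es-CL locale):

import Foundation

let symbol = NumberFormatter.localizedString(from: 5000, number: .currency)        // "$5.000"
let iso    = NumberFormatter.localizedString(from: 5000, number: .currencyISOCode) // something like "CLP 5.000"
let plural = NumberFormatter.localizedString(from: 5000, number: .currencyPlural)  // "5.000 pesos chilenos"
// Only .currencyPlural avoids "Dólares", but Siri then reads "5 pesos chilenos",
// dropping the thousands and announcing the wrong amount.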
I'm testing with my device, which is set up with the region "Chile".
Could someone help me find a solution?
Tags: Localization, User Notifications, Siri and Voice
I have an issue in my app when it is used together with the Assistive Access feature.
For authentication, we are using the Capacitor Firebase Authentication plugin (https://www.npmjs.com/package/@capacitor-firebase/authentication), which lets users log in via Apple (FirebaseAuthentication.signInWithApple(...)), Google (FirebaseAuthentication.signInWithGoogle(...)), or email. This works just fine. However, when Assistive Access is enabled, the login fails for Apple ("The operation couldn't be completed. com.apple.AuthenticationServices.AuthorizationError error 1000") and Google ("The user canceled the sign-in flow").
It seems the sign-in popups are blocked, likely by Assistive Access, so an error is returned immediately and the Capacitor plugin cannot authenticate.
I have tested this on my iPhone 12 Pro using iOS 17.7
I would appreciate any suggestions to handle this issue!
I've noticed that VoiceOver reads currency amounts correctly when they are below a thousand.
For higher amounts, for example 12.225,34 €, VoiceOver reads "twelve point two two five thirty four euros".
If the amount is formatted without the thousands separator (12225,34 €), this problem doesn't exist: VoiceOver reads "twelve thousand two hundred and twenty five euros and thirty four cents".
Why is the thousands separator a problem for VoiceOver if this formatting comes from the currency and locale?
This issue exists in English. I changed my device language to Italian and German and in both cases the number was read correctly even with the separator.
Is there a way to make it work in English?
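In case it helps, one possible workaround (a sketch, assuming a UIKit label and the iOS 15+ format styles): keep the localized visual string, but give VoiceOver an explicit spoken form with the currency spelled out, so it reads a number plus a word:

import UIKit

// Sketch: visual text keeps the locale's separators ("12.225,34 €"), while the
// accessibility label spells out the unit ("12.225,34 euros").
func configure(_ amountLabel: UILabel, amount: Decimal, locale: Locale) {
    amountLabel.text = amount.formatted(.currency(code: "EUR").locale(locale))
    amountLabel.accessibilityLabel = amount.formatted(
        .currency(code: "EUR").presentation(.fullName).locale(locale)
    )
}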
Hey,
We've run into an issue where WKWebView contents are not always available for VoiceOver users. It seems to occur when WKWebView contents are loaded asynchronously.
I have a sample project where this can be reproduced and a video showing the issue. See FB21257352
The only solution we currently see is forcing an update continuously using UIAccessibility.post(notification: .layoutChanged, argument: nil), but this is of course a last resort, as it may have other unintended side effects.
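For anyone needing a narrower stopgap, a sketch (assuming the async content arrives via a navigation the delegate can observe; in-page updates may still need extra nudges):

import UIKit
import WebKit

// Sketch: nudge VoiceOver once when navigation finishes, instead of posting
// .layoutChanged continuously.
class WebViewController: UIViewController, WKNavigationDelegate {
    let webView = WKWebView()

    override func viewDidLoad() {
        super.viewDidLoad()
        webView.frame = view.bounds
        webView.navigationDelegate = self
        view.addSubview(webView)
    }

    func webView(_ webView: WKWebView, didFinish navigation: WKNavigation!) {
        UIAccessibility.post(notification: .layoutChanged, argument: webView)
    }
}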
Hi,
I want to detect if there is a fullscreen window on each screen.
_AXUIElementGetWindow and kAXFullscreenAttribute methods work, but I have to be in a non-sandbox environment to use them.
Is there any other way that also works? I don't think it's enough to judge whether a window is fullscreen by comparing the window size to the screen size, since that doesn't work on a MacBook with a notch or when the menu bar is set to auto-hide.
Thanks.
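For reference, a sketch of the AX route mentioned above using only public API (it still requires a non-sandboxed app with the Accessibility permission; the "AXFullScreen" attribute string is an assumption based on common usage, and this only checks the frontmost app):

import AppKit
import ApplicationServices

// Sketch: ask each window of the frontmost app for its AXFullScreen attribute.
func frontmostAppHasFullscreenWindow() -> Bool {
    guard let app = NSWorkspace.shared.frontmostApplication else { return false }
    let axApp = AXUIElementCreateApplication(app.processIdentifier)

    var windowsRef: CFTypeRef?
    guard AXUIElementCopyAttributeValue(axApp, kAXWindowsAttribute as CFString, &windowsRef) == .success,
          let windows = windowsRef as? [AXUIElement] else { return false }

    for window in windows {
        var valueRef: CFTypeRef?
        if AXUIElementCopyAttributeValue(window, "AXFullScreen" as CFString, &valueRef) == .success,
           let isFullscreen = valueRef as? Bool, isFullscreen {
            return true
        }
    }
    return false
}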
Tags: Accessibility, Mac App Store, Core Graphics, App Sandbox
I’d love to see Apple implement a Bionic Reading feature as a system-wide accessibility option. This type of reading aid highlights the first part of each word in bold to help guide the eyes and improve comprehension.
It’s been shown to be especially helpful for people with ADHD, dyslexia, and other neurodivergent needs. Having a toggle in Settings > Accessibility would be life-changing.
Ideally, it would be:
• Enabled system-wide, or per-app
• Customizable in how much of each word is bolded
• Available in Safari, Messages, Books, News, etc.
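To illustrate the request, here is a rough sketch of the effect in app code (purely illustrative; the function and parameter names are mine, and a real system feature would live in the text rendering stack, not app code):

import UIKit

// Sketch: bold roughly the first half of each word in a string.
func bionicStyle(_ text: String, font: UIFont) -> NSAttributedString {
    let bold = UIFont.boldSystemFont(ofSize: font.pointSize)
    let result = NSMutableAttributedString(string: text, attributes: [.font: font])
    let ns = text as NSString
    ns.enumerateSubstrings(in: NSRange(location: 0, length: ns.length),
                           options: .byWords) { _, wordRange, _, _ in
        let boldLength = max(1, wordRange.length / 2) // the bolded fraction would be customizable
        result.addAttribute(.font, value: bold,
                            range: NSRange(location: wordRange.location, length: boldLength))
    }
    return result
}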
Hi,
Our app has a section where we show users how to activate "Silence Unknown Callers", because it is a crucial feature for our app. But we've seen that 30% of users drop out of the process here, because we can't open that setting in the Phone app directly.
We are using this URL scheme to open the Phone settings on iOS 18:
if let url = URL(string: "App-prefs:com.apple.mobilephone") {
    UIApplication.shared.open(url)
}
But we don't see any other way to open the "silence" path directly, like in iOS 17 with this URL scheme: prefs:root=Phone&path=SILENCE_CALLS
So, do you know if it is possible to open that option directly? We want to improve our accessibility.
Thank you!
I've just received an email from Apple regarding the Global Accessibility Awareness Day and some forthcoming sessions to promote their accessibility features.
What a joke.
For many years, Apple refuses to provide the most basic accessibility requirement on macOS:
LET USERS DISABLE ALL NON-CONSENSUAL UNSOLICITED ANIMATIONS AND OTHER UI CONVULSIONS.
The scourge of animations started from macOS Lion.
Yes, many of them can be, fortunately, disabled through some obscure Terminal commands (that is, if the user is lucky enough to discover them on some obscure internet resources).
The "Reduce motion" control in System Settings is a fake option that doesn't do anything.
And there are two most glaring accessibility violations that cannot be disabled:
Scroll bar rollover highlight effect introduced on macOS 10.7.3. Every time you move the cursor over a scroll bar, the bar gets highlighted. It results in bringing the user's attention to random scroll bars for no reason whatsoever just because the cursor happens to pass over the bar at some point. HUNDREDS of unnecessary, annoying events of distraction daily!
Expand/collapse animation of NSOutlineView (such as when we open/close a folder in the list view in the Finder, as well as any other app that's using outline views). It's extremely annoying, distracting, and time-wasting.
All feedback submitted about this through the years remains mostly ignored (except for a few cases where I received some ridiculous replies from employees who, apparently, are barely familiar with Macs in general).
Apple does NOT care about accessibility. Not only this, but it's obvious that Apple is, in fact, intentionally abusing those users who can't tolerate distracting, time-wasting animations and UI convulsions.
I'm working on a macOS Accessibility setup for a French-speaking user and I've hit a wall. (I'm not a developer, and I'm trying to help my kid with dyslexia.)
I successfully built a custom word prediction panel using the Panel Editor (Keyboard) in macOS Accessibility > Keyboard > Accessibility Keyboard.
Here’s what I have so far:
• The prediction panel works system-wide: I can use it to type in Finder, Safari, Notes, TextEdit, and even browser search bars.
• The panel appears above all applications and suggestions show up correctly.
• However, it does not work inside Google Docs (tested in Chrome, Safari, and Firefox). Selecting a word from the panel does nothing in the Docs editor.
I suspect this is because:
• Google Docs does not use a standard macOS text input field.
• Docs is a web app that relies on custom JavaScript editors, contentEditable elements, and canvas rendering, so macOS Accessibility APIs (AXTextField, AXInsertText, etc.) don’t register or inject text events.
• Accessibility tools like the Accessibility Keyboard rely on native macOS text input methods, which don’t hook into Google Docs’ custom editor.
Important:
I’m not a programmer. I’d like to know if there is an easy fix or option in macOS, Google Chrome, or Google Docs that would make my custom prediction panel work, before going into custom development.
Technical setup:
• MacBook Air (M2, 2022)
• RAM: 8 GB
• macOS: Sequoia 15.3.1
• Language: French (system and keyboard)
• Accessibility Keyboard: Enabled via Settings > Accessibility > Keyboard
• Custom panel: Built using Panel Editor (Keyboard), named “Philemon Prédiction”
• Browsers tested: Chrome, Safari, Firefox (same issue)
• Behavior: Panel is visible, suggestions appear, but inserting text does nothing in Google Docs
Has anyone worked around this limitation? Is there a simple setting, workaround, or accessibility option to bridge macOS Accessibility input with Google Docs’ editor?
Thanks a lot!
Could there be a haptic or sound cue, for the accessibility of blind users (sound) and deaf users (haptic), to know when Location Services and the camera were last used?
Also, the grey Location Services indicator, rather than the purple one, should appear for the full 24 hours after an application has used location, if the description in the Settings copy is to be accurate.
The green light that lets users know an application has switched to the camera, and the fading orange light, could each have a subtle click sound, like a shutter: a bigger haptic with a softer sound, editable in Settings, of course.