System Settings => Accessibility => System Voice => the little (i) beside the pulldown => Voices => THIS SCREEN will allow you to download Premium Voices.
Is there a way to trigger this screen programmatically, or at least a link to get my users there without having to dig through that swamp of screens?
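Assuming this is macOS's System Settings, a hedged sketch: I don't know of a documented deep link to the Voices sheet itself, but the Accessibility pane can be opened via the x-apple.systempreferences scheme (the pane identifier below is commonly cited, not documented API, so it may change):

import AppKit

// Opens the Accessibility pane in System Settings; users still have to
// navigate the last step to Voices themselves. Undocumented identifier.
if let url = URL(string: "x-apple.systempreferences:com.apple.preference.universalaccess") {
    NSWorkspace.shared.open(url)
}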
I am facing an issue with the back camera on my iPhone 14 Plus: it shows a black screen. My iPhone was manufactured between April 2023 and April 2024, but it is still not eligible for the Apple service program, even though it has this same issue. Why is it not eligible?
Topic: Accessibility & Inclusion
SubTopic: General
I’m experiencing an issue where Siri incorrectly announces currency values in notifications. Instead of reading the local currency correctly, it always reads amounts as US dollars.
Issue details:
My iPhone is set to Region: Chile and Language: Spanish (Chile).
In Chile, the currency symbol $ represents Chilean Pesos (CLP), not US dollars.
A notification with the text:
let content = UNMutableNotificationContent()
content.body = "¡Has recibido un pago por $5.000!"
is read aloud by Siri as:
”¡Has recibido un pago por 5.000 dólares!”
(English: “You have received a payment of five thousand dollars!”)
instead of the correct:
”¡Has recibido un pago por 5.000 pesos!”
(English: “You have received a payment of five thousand pesos!”)
Another developer already reported the same issue back in 2023, and it remains unresolved: https://developer.apple.com/forums/thread/723177
This incorrect behavior is not limited to iOS notifications; it also occurs in other Apple services:
watchOS, iPadOS, and macOS (Siri misreads currency values in various system interactions).
Siri’s currency conversion feature misinterprets $ as USD even when the device is set to a region where $ represents a different currency.
Announce Notifications on AirPods also exhibits this issue, making it confusing when Siri announces transaction amounts incorrectly.
Apple Intelligence interactions are also affected—for example, asking Siri to “read my latest emails” when one of them contains a monetary value results in Siri misreading the currency.
I have submitted a bug report via Feedback Assistant, and the Feedback ID is FB16561348.
This issue significantly impacts accessibility and localization for users in regions where the currency symbol $ is not associated with US dollars.
Has anyone found a workaround, or is there any update from Apple on this?
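In the meantime, the mitigation we're testing is to avoid the ambiguous symbol entirely. A minimal sketch, assuming the notification text is fully under our control; the paymentBody helper is hypothetical:

import Foundation
import UserNotifications

// Hypothetical helper: spell out "pesos" instead of "$" so the TTS engine
// has no currency symbol to misinterpret as US dollars.
func paymentBody(amount: Decimal, locale: Locale = Locale(identifier: "es_CL")) -> String {
    let formatter = NumberFormatter()
    formatter.locale = locale          // es_CL groups thousands with "."
    formatter.numberStyle = .decimal
    let number = formatter.string(from: amount as NSDecimalNumber) ?? "\(amount)"
    return "¡Has recibido un pago por \(number) pesos!"
}

let content = UNMutableNotificationContent()
content.body = paymentBody(amount: 5000) // "¡Has recibido un pago por 5.000 pesos!"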
Topic: Accessibility & Inclusion
SubTopic: General
Tags: Siri and Voice, User Notifications, Localization, Apple Intelligence
Hello,
Whenever I put accessibility focus on an image, and the image has some text in it, VoiceOver reads that text along with the image's accessibility label.
Is there a way to programmatically turn off text recognition on images for accessibility?
I couldn't find any relevant accessibility APIs that could help here.
Thanks!
I want to understand which component types are intended to have an associated hint text, haptic feedback, or earcon for VoiceOver screen reader users. Is there a list somewhere, or an HIG guideline, for which transition types should have a sound?
Some transitions in Apple apps include distinct beep sounds, such as:
opening a new screen
screen dimming
swiping from the header/navbar into the body
a scraping sound when scrolling up or down a page
reaching the beginning or end of the body section
moving from one row to the next in Calculator
opening a pop-up menu
I would also appreciate any direction on which APIs are associated with these sounds, and how custom components can adopt these sounds, haptics, or hints where they are expected; a sketch follows below. On the other hand, I don't want to take that information and dictate that every component needs a specific beep type, since these sounds appear to be reserved for specific purposes.
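For the adoption question, a minimal UIKit sketch covering the two standard notifications; the mapping of notifications to earcons is observed behavior, not a documented contract:

import UIKit

// Posting the standard accessibility notifications makes VoiceOver play its
// built-in sounds, so custom components can reuse the system earcons rather
// than inventing their own.
func announceContentChange(major: Bool, focusTarget: Any?) {
    if major {
        // Plays the "new screen" earcon and moves VoiceOver focus.
        UIAccessibility.post(notification: .screenChanged, argument: focusTarget)
    } else {
        // Plays the subtler layout-change sound.
        UIAccessibility.post(notification: .layoutChanged, argument: focusTarget)
    }
}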
Yesterday I installed iOS 26 on my iPhone as a beta tester. At first there was no problem, but during the afternoon I noticed that neither FaceTime nor iMessage worked. I tried going through the settings as described by Apple Support, but my phone number would not activate; sometimes I was even asked to activate iCloud. I always get a REG-RESP message.
Does anyone have any ideas what the problem could be?
Topic: Accessibility & Inclusion
SubTopic: General
My game app is text-based interactive fiction, containing no audio/video content, making captions unnecessary. Our game is completely accessible to deaf users.
Despite this, in the Accessibility Nutrition Label, I'm only able to leave the "Captions" box checked or unchecked. Leaving it unchecked would leave deaf players with the wrong impression that they can't enjoy our game. Leaving it checked would imply that we do have A/V content with captions included.
In the WWDC video on this, https://developer.apple.com/videos/play/wwdc2025/224/ the video says:
After we completed common tasks, we realized our app doesn’t have any video or audio only content. In this case, we aren’t going to indicate that Landmarks supports Captions. That's okay. This accurately describes the features that people will expect to be available while using the app.
Maybe that's "OK," but I wish the form allowed me to say "This app doesn't contain audio/video content."
Hi,
Our app has a section where we show users how to activate "Silence Unknown Callers," because it is a crucial feature for our app. But we've seen that 30% of users drop out of the process at this point, because we can't open that settings screen in the Phone app directly.
We are using this url scheme to open phone settings in iOS 18:
if let url = URL(string: "App-prefs:com.apple.mobilephone") {
    UIApplication.shared.open(url)
}
But we don't see any other way to deep-link directly to the "Silence" setting, as was possible in iOS 17 with this URL scheme: prefs:root=Phone&path=SILENCE_CALLS
So, do you know if it is possible to open that option directly? We want to improve our accessibility.
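For reference, a sketch of the best we've found, assuming we keep the undocumented scheme: "App-prefs:" URLs are not public API (they may break between releases or draw App Review scrutiny), so we fall back to the documented openSettingsURLString, which at least opens our app's own page in Settings:

import UIKit

func openPhoneSettings() {
    guard let url = URL(string: "App-prefs:com.apple.mobilephone") else { return }
    UIApplication.shared.open(url, options: [:]) { success in
        if !success, let fallback = URL(string: UIApplication.openSettingsURLString) {
            // Documented fallback: opens this app's own page in Settings.
            UIApplication.shared.open(fallback)
        }
    }
}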
Thank you!
Hi guys, can anyone help me? I can't pair my Apple Watch Series 1 with my iPhone 15.
Topic: Accessibility & Inclusion
SubTopic: General
A common UI idiom in Apple's first-party iOS apps is a circle icon with three dots in the upper right of the screen, which serves as a pop-up menu of more options. Some examples include:
Apple Music, Library tab
Photos, Album view
Reminders
In all these cases, VoiceOver reads this element as "More, Button".
In my SwiftUI app, I've implemented a visually identical button.
Menu {
    // Button for Menu Item 1
    // Button for Menu Item 2
    // ...
} label: {
    Image(systemName: "ellipsis.circle")
        .accessibilityHidden(true)
}
.accessibilityLabel("More")
However, the VoiceOver output in my app is much more verbose. It speaks "More, Button, Pop Up Button, Double Tap To Activate The Picker". Any guidance on how to make this more concise in line with the apps mentioned above?
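One unverified workaround to try; both modifiers are real SwiftUI APIs, but whether they actually silence the Menu's built-in announcements can vary by iOS version, so test with VoiceOver before relying on it:

Menu {
    // Button for Menu Item 1
    // ...
} label: {
    Image(systemName: "ellipsis.circle")
        .accessibilityHidden(true)
}
.accessibilityLabel("More")
.accessibilityHint("")                 // may suppress "Double Tap To Activate The Picker"
.accessibilityRemoveTraits(.isButton)  // may drop the redundant "Button" announcement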
Updated to iOS 26 beta and now the TV Remote app in Control Center won't open. I've tried the following:
Restart the phone
Remove the shortcut and re-add it
Can't find any other troubleshooting methods for this issue online, so I'm guessing it's a new problem.
Topic: Accessibility & Inclusion
SubTopic: General
Hey,
We've run into an issue where WKWebView contents are not always available for VoiceOver users. It seems to occur when WKWebView contents are loaded asynchronously.
I have a sample project where this can be reproduced and a video showing the issue. See FB21257352
The only solution we currently see is forcing an update continuously with UIAccessibility.post(notification: .layoutChanged, argument: nil), but this is of course a last resort, as it may have unintended side effects.
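A more targeted variant of that workaround, sketched under the assumption that the content is ready once navigation finishes (plain UIKit host shown; names and URL are illustrative):

import UIKit
import WebKit

// Post .layoutChanged once in didFinish instead of continuously, so
// VoiceOver re-reads the page only after the asynchronous load completes.
final class AccessibleWebViewController: UIViewController, WKNavigationDelegate {
    private let webView = WKWebView()

    override func viewDidLoad() {
        super.viewDidLoad()
        webView.frame = view.bounds
        webView.navigationDelegate = self
        view.addSubview(webView)
        webView.load(URLRequest(url: URL(string: "https://example.com")!))
    }

    func webView(_ webView: WKWebView, didFinish navigation: WKNavigation!) {
        // One targeted refresh after the load, rather than a continuous loop.
        UIAccessibility.post(notification: .layoutChanged, argument: webView)
    }
}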
Hello,
I have the following problem. I'm developing a no-code app using the FlutterFlow platform and have been working on it for over a year.
This time, after publishing a new version of the app through FlutterFlow, I tried logging into App Store Connect, but I got an error saying that I had made too many login attempts and needed to try again later. However, I hadn't attempted to log in before that at all.
No matter how long I wait (24 hours, 48 hours), the same error keeps appearing, meaning I still can't access my account. Apple Support hasn't responded for 4 days, and in total I've been locked out of my account for over 9 days.
Please help me understand what might be causing this issue. App Store Connect refuses to send me an SMS with the login code.
Topic: Accessibility & Inclusion
SubTopic: General
When using an app via an external keyboard, FocusState and .focused used to work just fine until iOS 17. Vertical-axis text fields were also accessible without any issues. But after the iOS 18 update, adding the focused modifier removes elements from the focus order of the external keyboard.
One such example: when a button using the focused modifier and @FocusState is inside a ScrollView, and that view is opened via a NavigationLink, the button is not reachable via a Bluetooth (external) keyboard.
TextEditor / vertical-axis TextFields also seem to be affected in the external-keyboard focus order when placed inside a ScrollView.
Is this a known iOS 18 issue with ScrollView, or is there any tip to get this fixed?
Sample code that can reproduce this issue:
struct ContentView: View {
    @State private var showBottomSheet: Bool = false
    @State private var goToNextView: Bool = false
    @FocusState private var focused: Bool
    @AccessibilityFocusState private var voFocused: Bool

    var body: some View {
        NavigationView {
            VStack {
                Text("Hello, world!")
                // This button works fine with a Bluetooth keyboard in all versions
                Button("Trigger a bottomsheet") {
                    showBottomSheet = true
                }
                .focused($focused)
                .accessibilityFocused($voFocused)
                Button("Goto another view") {
                    goToNextView = true
                }
                NavigationLink(
                    destination: View2(),
                    isActive: $goToNextView
                ) { EmptyView() }
                .accessibility(hidden: true)
            }
            .sheet(isPresented: $showBottomSheet,
                   onDismiss: {
                       focused = true
                       voFocused = true
                   }, content: {
                       VStack {
                           Text("Hello World! I'm in a bottomsheet")
                           Button("Close me") {
                               showBottomSheet = false
                           }
                       }
                   })
            .padding()
        }
    }
}

#Preview {
    ContentView()
}

struct View2: View {
    @FocusState private var focused: Bool
    @AccessibilityFocusState private var voFocused: Bool
    @State private var showBottomSheet: Bool = false

    var body: some View {
        ScrollView {
            VStack {
                Text("check")
                // In iOS 18, this button doesn't get focused with a Bluetooth / external keyboard.
                // The issue occurs when these three combine in iOS 18: a button using FocusState,
                // inside a view that has a ScrollView, opened via NavigationLink.
                Button("Trigger a bottomsheet") {
                    showBottomSheet = true
                }
                .focused($focused)
                .accessibilityFocused($voFocused)
                Button("Test button") { }
            }
            .sheet(isPresented: $showBottomSheet,
                   onDismiss: {
                       focused = true
                       voFocused = true
                   }, content: {
                       VStack {
                           Text("Hello World! I'm in a bottomsheet")
                           Button("Close me") {
                               showBottomSheet = false
                           }
                       }
                   })
            .padding()
        }
    }
}
Before I start: this could just be me and a handful of people, but I like to reorganize my phone screen to my needs based on what's going on in life. I was just thinking it would be easier if you could get rid of all the folders at once and then reorganize, or something simpler than the long, extensive process it is now.
Hope it's okay to post here; I haven't gotten resolution anywhere else. Apple's iOS Live Captions is supposed to transcribe speech into written text either on the phone (works like a charm!) or via microphone (think meeting in a conference room). Microphone doesn't work anywhere, anytime on a new iPhone 14 purchased November 2024. Anyone out there want to fix this and help a lot of people who have trouble hearing? I'm part of an entire generation that didn't know we were supposed to protect our hearing at concerts and clubs and, worse, thought it was cool to snag a spot by the speakers...
I want to insert medication data (available from iOS 26) from my app into Apple HealthKit. I tried to request permission to read and write the data, but the app crashed when I requested it. Does Apple allow inserting medication data into HealthKit the way we can add other health and fitness data?
let healthStore = HKHealthStore()

@available(iOS 26.0, *)
@objc func requestAuthorization(_ resolve: @escaping RCTPromiseResolveBlock,
                                rejecter reject: @escaping RCTPromiseRejectBlock) {
    guard HKHealthStore.isHealthDataAvailable() else {
        // Reject instead of returning silently so the JS promise settles.
        reject("unavailable", "Health data is not available on this device", nil)
        return
    }
    let doseType = HKObjectType.medicationDoseEventType()
    let medType = HKObjectType.userAnnotatedMedicationType()
    healthStore.requestAuthorization(toShare: [doseType], read: [doseType]) { success, error in
        if let err = error { reject("auth_error", err.localizedDescription, err); return }
        self.healthStore.requestPerObjectReadAuthorization(for: medType, predicate: nil) { s, e in
            if let err2 = e { reject("per_obj_auth", err2.localizedDescription, err2); return }
            resolve(["ok": success && s])
        }
    }
}
We've identified a regression in iOS 26.0 and 26.1 Beta 4 where AVSpeechSynthesisVoice(language:) no longer respects user-selected voices from Accessibility settings.
Issue: When users select a specific voice in Settings → Accessibility → Spoken Content → Voices, calling AVSpeechSynthesisVoice(language:) returns the system default voice instead of the user's selection. This worked correctly in iOS 18.6.2.
Particularly affects:
Third-party speech synthesis voices (CereProc, Grammatek, etc.)
Apps relying on automatic voice selection based on user preferences
Example:
// User selected CereProc Heather for en-GB in Accessibility settings
let voice = AVSpeechSynthesisVoice(language: "en-GB")
print(voice?.name) // iOS 18.6.2: "HEATHER", iOS 26: "Daniel" (system default)
Interesting observation: The new Accessibility Reader feature in iOS 26 correctly uses the user-selected voice, but Tap to Speak and the API both ignore the setting.
Tested methods:
AVSpeechSynthesisVoice(language:)
AVSpeechUtterance auto-selection
Reflection for new APIs
All return the system default voice, not the user's preference.
Filed: FB20271264
Has anyone else encountered this? Any known workarounds to programmatically access the user's preferred voice selection?
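One stopgap we're evaluating, sketched under the assumption that the app persists its own voice preference (savedIdentifier is hypothetical); the fallback picks the first installed voice for the language, which still surfaces downloaded third-party voices:

import AVFoundation

// Let the app manage its own voice choice instead of relying on the
// system default that the iOS 26 regression returns.
func preferredVoice(for language: String, savedIdentifier: String?) -> AVSpeechSynthesisVoice? {
    if let id = savedIdentifier, let voice = AVSpeechSynthesisVoice(identifier: id) {
        return voice
    }
    // Fallback: first installed voice matching the language code.
    return AVSpeechSynthesisVoice.speechVoices().first { $0.language == language }
}

let utterance = AVSpeechUtterance(string: "Hello")
utterance.voice = preferredVoice(for: "en-GB", savedIdentifier: nil)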
Feedback number: FB20451665
When building with Xcode 26, VoiceOver reports an extra tab when swiping through tabs. Please see the sample project below:
/*
This is a sample project to show that I believe there is a VoiceOver bug in iOS 26.
When swiping through tabs with VoiceOver active, there always appears to be an extra tab.
Here I have 5 tabs: when on tab one, VO reads out "tab 1 of 6", then "tab 2 of 6", all the way to the last tab, where it reads out "tab 5 of 6". Never "tab 6 of 6".
Is there a possibility that VoiceOver is picking up the underlying `more` tab and reading that out?
This has also been reportedly found in the Files app here:
https://www.applevis.com/comment/195441#comment-195441
*/
struct ContentView: View {
    var body: some View {
        TabView {
            /// Activating this has VoiceOver telling us there are 6 tabs.
            Tab(RootTab.home.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.home.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.home.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.home.title.capitalized) tab")

            Tab(RootTab.diary.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.diary.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.diary.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.diary.title.capitalized) tab")

            Tab(RootTab.meals.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.meals.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.meals.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.meals.title.capitalized) tab")

            Tab(RootTab.knowledge.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.knowledge.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.knowledge.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.knowledge.title.capitalized) tab")

            Tab(RootTab.profile.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.profile.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.profile.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.profile.title.capitalized) tab")

            /// Activating this also has VoiceOver telling us there are 6 tabs.
            // ForEach(RootTab.allCases, id: \.self) { tab in
            //     Text("This is the \(tab.title.capitalized) screen")
            //         .tabItem {
            //             Label(tab.title.capitalized, systemImage: "circle.fill")
            //         }
            //         .accessibilityLabel("\(tab.title.capitalized) tab")
            //         .accessibilityHint("Double tap to open the \(tab.title.capitalized) tab")
            // }
        }
    }

    enum RootTab: CaseIterable {
        case home
        case diary
        case meals
        case knowledge
        case profile

        var title: String {
            switch self {
            case .home: "home"
            case .diary: "diary"
            case .meals: "meals"
            case .knowledge: "knowledge"
            case .profile: "profile"
            }
        }
    }
}
I'm curious if anyone else can see this issue, or if anyone knows of a workaround for it.
My team has a robust digital accessibility program and processes for WCAG conformance in our apps. Because of this, there are definitely accessibility defects that get caught and addressed in order of impact and business priority like any other bug. Obviously we want to aim for 100% accessibility for our users, but it's a continual work in progress as new enhancements or changes are released.
I'm stuck on the appropriate measurement to indicate support. If we have 50 common tasks, and the most central 10 tasks are solid, but some supporting (yet also common) tasks have a contrast failure or a missing accessibilityLabel, does that make the whole app not support the feature? If "completing the task" is the rubric, there is a whole range of interpretations of that.
In a complex app, I anticipate that a group like ours will have strong support for many of the Accessibility Nutrition Labels accessibility features across tasks and devices, but realistically never be 100% free of defects for a given Apple Accessibility feature, even among core tasks.
As I consider the next steps for Nutrition Labels, I do not see anything in the documentation that gives a sort of baseline or measurement for inclusion. We plan to test all steps to complete a task, and log defects accordingly with an assigned timeline for fixing them (as would be true for functional defects).