Before I start: this could just be me and a handful of people, but I like to reorganize my phone screen to my needs based on what's going on in life. I was just thinking it would be easier if you could get rid of all the folders at once and then reorganize, or something easier than the long, extensive process it is now.
                    
                  
Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.
                    
                      I’m experiencing an issue where Siri incorrectly announces currency values in notifications. Instead of reading the local currency correctly, it always reads amounts as US dollars.
Issue details:
My iPhone is set to Region: Chile and Language: Spanish (Chile).
In Chile, the currency symbol $ represents Chilean Pesos (CLP), not US dollars.
A notification with the text:
let content = UNMutableNotificationContent()
content.body = "¡Has recibido un pago por $5.000!"
is read aloud by Siri as:
"¡Has recibido un pago por 5.000 dólares!"
(English: "You have received a payment of five thousand dollars!")
instead of the correct:
"¡Has recibido un pago por 5.000 pesos!"
(English: "You have received a payment of five thousand pesos!")
Another developer already reported the same issue back in 2023, and it remains unresolved: https://developer.apple.com/forums/thread/723177
This incorrect behavior is not limited to iOS notifications; it also occurs in other Apple services:
watchOS, iPadOS, and macOS (Siri misreads currency values in various system interactions).
Siri’s currency conversion feature misinterprets $ as USD even when the device is set to a region where $ represents a different currency.
Announce Notifications on AirPods also exhibits this issue, making it confusing when Siri announces transaction amounts incorrectly.
Apple Intelligence interactions are also affected—for example, asking Siri to “read my latest emails” when one of them contains a monetary value results in Siri misreading the currency.
I have submitted a bug report via Feedback Assistant, and the Feedback ID is FB16561348.
This issue significantly impacts accessibility and localization for users in regions where the currency symbol $ is not associated with US dollars.
Has anyone found a workaround, or is there any update from Apple on this?
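One workaround I have seen suggested (a sketch, not a fix from Apple) is to avoid putting the bare "$" symbol in notification text at all, and instead spell the currency unit out so the TTS engine has nothing to misinterpret. The helper name and the hardcoded "pesos" suffix below are illustrations, not an Apple API:

```swift
import Foundation
import UserNotifications

// Hedged workaround sketch: format the number with an explicit currency code,
// then append the spelled-out unit instead of relying on the "$" symbol.
func localizedAmountString(_ value: Double, currencyCode: String, locale: Locale) -> String {
    let formatter = NumberFormatter()
    formatter.numberStyle = .currency
    formatter.locale = locale
    formatter.currencyCode = currencyCode   // e.g. "CLP"
    formatter.currencySymbol = ""           // drop the ambiguous "$"
    let number = formatter.string(from: NSNumber(value: value)) ?? "\(value)"
    // "pesos" is hardcoded here purely for illustration; localize per region.
    return "\(number.trimmingCharacters(in: .whitespaces)) pesos"
}

let content = UNMutableNotificationContent()
content.body = "¡Has recibido un pago por \(localizedAmountString(5000, currencyCode: "CLP", locale: Locale(identifier: "es_CL")))!"
```

This sidesteps the symbol-to-USD mapping rather than fixing it, so it only helps for notifications whose text you control.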
                    
                  
                
              
                
              
              
                
Topic: Accessibility & Inclusion
SubTopic: General
Tags: Siri and Voice, User Notifications, Localization, Apple Intelligence
                    
I am an artist (singer-songwriter) and I use the Photos app to manage albums related to my various creative projects. These are some big issues that I am surprised were never taken into account, or were perhaps overlooked:
Missing search bar when adding photos to albums: why is there no search bar when adding a photo to one of hundreds of albums? (Artists like me like to organize things into different albums and folders.)
I can no longer search for albums by name after the iOS 18 update, which was previously very helpful in quickly locating them.
Albums can be arranged and moved within the same folder, but there is no way to move albums between different folders; the only way is to create a new album in the target folder, select and transfer everything, and delete the old album.
                    
                  
                
                    
                      Hi! I have noticed a few glitches as well as some overall unfortunate cons with the assistive access mode.
Alarms, timers, stopwatches, etc. do not sound or alert. However, I have an infant monitor app and I do get that sound alert, so I know it is possible. Do I need to download a separate alarm app for it to work?
Cannot make FaceTime calls with favorite contacts.
Find My iPhone cannot jump to the maps app.
Camera cannot zoom in or out.
Photos cannot be deleted, edited, or shared in a shared album in the photos app.
Photos/videos cannot be sent in messages.
Spotify cannot be accessed from the lock screen.
Apps do not stay open if you lock the phone screen or leave it on too long without touching the screen (auto locks).
There is no flashlight option. I downloaded an app to add this feature, but if the screen isn't touched the phone auto-locks, which shuts off the flashlight in the app until I unlock the phone again.
                    
                  
                
                    
The only way I found to make accessibility focus work correctly in the detent of a fullscreen cover is to apply the focus manually. The issue: in the ContentView the grabber works, while in the fullscreen cover it does not. Is there something I am missing, or is this a bug? I also don't understand why I need to apply focus in the fullscreen cover when in the ContentView I do not.
struct ContentView: View {
    @State private var buttonClicked = false
    @State private var bottomSheetShowing = false
    var body: some View {
        NavigationView {
            VStack {
                Button(action: {
                    buttonClicked = true
                }, label: {
                    Text("First Page Button")
                        .padding()
                        .background(Color.blue)
                        .foregroundColor(.white)
                        .cornerRadius(8)
                })
                .accessibilityLabel("First Page Button")
                FullscreenView2()
            }
            .navigationTitle("Welcome")
            .fullScreenCover(isPresented: $buttonClicked) {
                FullscreenView(buttonClicked: $buttonClicked, bottomSheetShowing: $bottomSheetShowing)
            }
        }
    }
}
struct FullscreenView: View {
    @Binding var buttonClicked: Bool
    @Binding var bottomSheetShowing: Bool
    var body: some View {
        NavigationView {
            VStack {
                Button(action: {
                    bottomSheetShowing = true
                }, label: {
                    Text("Show Bottom Sheet")
                        .padding()
                        .background(Color.green)
                        .foregroundColor(.white)
                        .cornerRadius(8)
                })
            }
            .accessibilityHidden(bottomSheetShowing)
            .navigationTitle("Fullscreen View")
            .toolbar {
                ToolbarItem(placement: .navigationBarLeading) {
                    Button(action: {
                        buttonClicked = false
                    }, label: {
                        Text("Close")
                    })
                    .accessibilityLabel("Close Fullscreen View Button")
                }
            }
            .accessibilityHidden(bottomSheetShowing)
            .onChange(of: bottomSheetShowing, perform: { _ in })
            .sheet(isPresented: $bottomSheetShowing) {
                if #available(iOS 16.0, *) {
                    BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
                        .presentationDetents([.medium, .large])
                } else {
                    BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
                }
            }
        }
    }
}
struct FullscreenView2: View {
    @State var bottomSheetShowing = false
    var body: some View {
        VStack {
            Button(action: {
                bottomSheetShowing = true
            }, label: {
                Text("Show Bottom Sheet")
                    .padding()
                    .background(Color.green)
                    .foregroundColor(.white)
                    .cornerRadius(8)
            })
        }
        .accessibilityHidden(bottomSheetShowing)
        .navigationTitle("Fullscreen View")
        //.accessibilityHidden(bottomSheetShowing)
        .onChange(of: bottomSheetShowing, perform: { _ in })
        .sheet(isPresented: $bottomSheetShowing) {
            if #available(iOS 16.0, *) {
                BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
                    .presentationDetents([.medium, .large])
            } else {
                BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
            }
        }
    }
}
struct BottomSheetView: View {
    @Binding var bottomSheetShowing: Bool
//    @AccessibilityFocusState var isFocused: Bool
    var body: some View {
        VStack(spacing: 20) {
            Text("Bottom Sheet")
                .font(.headline)
                .accessibilityAddTraits(.isHeader)
            Button(action: {
                bottomSheetShowing = false
            }, label: {
                Text("Dismiss")
                    .padding()
                    .background(Color.red)
                    .foregroundColor(.white)
                    .cornerRadius(8)
            })
            .accessibilityLabel("Dismiss Bottom Sheet Button")
        }
        .padding()
        .frame(maxWidth: .infinity, maxHeight: .infinity)
        .background(
            Color(UIColor.systemBackground)
                .edgesIgnoringSafeArea(.all)
        )
        .accessibilityAddTraits(.isModal) // Indicates that this view is a modal
//        .onAppear {
//            // Set initial accessibility focus when the sheet appears
//            DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
//                isFocused = true
//            }
//        }
//        .accessibilityFocused($isFocused)
    }
}
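For reference, a minimal sketch of the manual-focus workaround that the commented-out code above hints at, assuming iOS 15+ for @AccessibilityFocusState (view and binding names are placeholders):

```swift
import SwiftUI

// Sketch: manually move VoiceOver focus into the sheet when it appears.
struct SheetContent: View {
    @Binding var isShowing: Bool
    @AccessibilityFocusState private var headerFocused: Bool

    var body: some View {
        VStack(spacing: 20) {
            Text("Bottom Sheet")
                .accessibilityAddTraits(.isHeader)
                .accessibilityFocused($headerFocused)
            Button("Dismiss") { isShowing = false }
        }
        .onAppear {
            // A short delay gives the presentation animation time to finish;
            // without it the focus request can be swallowed.
            DispatchQueue.main.asyncAfter(deadline: .now() + 0.5) {
                headerFocused = true
            }
        }
    }
}
```

This does not explain why the fullscreen cover behaves differently from ContentView; it only makes the workaround explicit, which suggests the underlying behavior may indeed be a bug worth filing.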
                    
                  
                
              
                
              
              
                
Topic: Accessibility & Inclusion
SubTopic: General
                    
                      Hello! I'm adding VoiceOver support for my app, but I'm having an issue where my accessibility value is not being spoken. I have made a helper class that creates an NSString from a double and converts it to the user's region currency.
CurrencyFormatter.m
+ (NSString *) localizedCurrencyStringFromDouble: (double) value {
    NSNumberFormatter *formatter = [[NSNumberFormatter alloc] init];
    formatter.numberStyle = NSNumberFormatterCurrencyStyle;
    formatter.locale = [NSLocale currentLocale];
    NSString *currencyString = [formatter stringFromNumber: @(value)];
    [formatter release]; // manual retain-release only; remove this line under ARC
    return currencyString;
}
View Controller
self.checkTotalLabel.accessibilityLabel = NSLocalizedString(@"Total Amount", @"Accessibility Label for Total");
self.checkTotalLabel.accessibilityValue = [CurrencyFormatter localizedCurrencyStringFromDouble: total];
I'm confused about whether the value should go into the accessibility label or not. When the currency is just USD and the language is English, it's a simple fix. But when the currency needs to be converted, I'm not sure where to go from here.
If anyone has any guidance, it would help me a lot!
Thank you!
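The usual convention is that the label says what the element is and the value says what it contains, so the formatted currency string belongs in accessibilityValue, as in the snippet above. A Swift sketch of that split (the function and label names are placeholders):

```swift
import UIKit

// Sketch: label = what the element is, value = its content.
// VoiceOver then reads them together, e.g. "Total Amount: $5,000".
func configureTotalLabel(_ label: UILabel, total: Double) {
    let formatter = NumberFormatter()
    formatter.numberStyle = .currency
    formatter.locale = .current   // picks up the user's region currency
    label.accessibilityLabel = NSLocalizedString("Total Amount",
                                                 comment: "Accessibility label for total")
    label.accessibilityValue = formatter.string(from: NSNumber(value: total))
}
```

Keeping the number out of the label also means the localized "Total Amount" string can be reused regardless of currency.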
                    
                  
                
                    
                      I have the following method to insert @mentions to a text field:
    func insertMention(user: Token, at range: NSRange) -> Void {
        let tokenImage: UIImage = renderMentionToken(text: "@\(user.username)")
        
        let attachment: NSTextAttachment = NSTextAttachment()
        attachment.image = tokenImage
        attachment.bounds = CGRect(x: 0, y: -3, width: tokenImage.size.width, height: tokenImage.size.height)
        attachment.accessibilityLabel = user.username
        attachment.accessibilityHint = "Mention of \(user.username)"
        
        let attachmentString: NSMutableAttributedString = NSMutableAttributedString(attributedString: NSAttributedString(attachment: attachment))
        attachmentString.addAttribute(.TokenID, value: user.id, range: NSRange(location: 0, length: 1))
        attachmentString.addAttribute(.Tokenname, value: user.username, range: NSRange(location: 0, length: 1))
        
        let mutableText: NSMutableAttributedString = NSMutableAttributedString(attributedString: textView.attributedText)
        mutableText.replaceCharacters(in: range, with: attachmentString)
        mutableText.append(NSAttributedString(string: " "))
        textView.attributedText = mutableText
        textView.selectedRange = NSRange(location: range.location + 2, length: 0)
        mentionRange = nil
        tableView.isHidden = true
    }
When I use Xcode's Accessibility Inspector to inspect the text input, the inserted token is not read by the inspector; instead, whitespace is shown for the token. I want to set the accessibility label to the string content of the NSTextAttachment. How?
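One hedged workaround: a UITextView reads its own attributed text, and image attachments surface as the object-replacement character (U+FFFC), which VoiceOver renders as silence. Overriding the text view's accessibilityValue and substituting each attachment's label back in is one way around that (a sketch, not a documented fix):

```swift
import UIKit

// Sketch: replace each attachment's U+FFFC placeholder with its
// accessibilityLabel when VoiceOver asks for the text view's value.
final class MentionTextView: UITextView {
    override var accessibilityValue: String? {
        get {
            guard let attributed = attributedText else { return text }
            var replacements: [(NSRange, String)] = []
            attributed.enumerateAttribute(.attachment,
                                          in: NSRange(location: 0, length: attributed.length)) { value, range, _ in
                if let attachment = value as? NSTextAttachment,
                   let label = attachment.accessibilityLabel {
                    replacements.append((range, "@\(label)"))
                }
            }
            let result = NSMutableAttributedString(attributedString: attributed)
            // Apply back-to-front so earlier ranges stay valid while replacing.
            for (range, label) in replacements.reversed() {
                result.replaceCharacters(in: range, with: label)
            }
            return result.string
        }
        set { super.accessibilityValue = newValue }
    }
}
```

This keeps the visual token untouched and only changes what VoiceOver speaks for the field as a whole.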
                    
                  
                
                    
                      In the app I'm working on, I have a SwiftUI View embedded in a UIKit Storyboard. The SwiftUI View holds a menu with a list of payment tools, and the ForEach loop looks like this:
    ForEach(self.paymentToolsVM.paymentToolsItems, id: \.self) { paymentTool in
        Button {
            navigationCallback(paymentTool.segueID)
        } label: {
            PaymentToolsRow(paymentToolName: paymentTool.title, imageName: paymentTool.imageName)
                .accessibilityElement()
                .accessibilityIdentifier("Billing_\(paymentTool.title.replacingOccurrences(of: " ", with: ""))")
        }
        if paymentTool != self.paymentToolsVM.paymentToolsItems.last {
            Divider()
        }
    }
So you can see the accessibility ID is there, and it shows up properly when I open up Accessibility Inspector with the simulator, but the testing script isn't picking up on it, and it doesn't show up when the view is inspected in Appium. I have other SwiftUI views embedded in the UIKit view, and the script picks up the buttons on those, so I'm not sure what's different about this one.
If it helps, the script is written in Java with the BDD framework. I can try to get the relevant part of the script if anyone thinks that would be helpful. Otherwise, is there anything else I can try?
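One thing worth trying (a guess, reusing the names from the snippet above): XCUITest-based tools such as Appium query the accessibility tree, and an identifier placed on a label inside a Button can be dropped when the row's children are flattened. Moving the identifier onto the Button itself, with children combined, often surfaces it:

```swift
// Sketch: identifier on the Button, children combined into one element.
Button {
    navigationCallback(paymentTool.segueID)
} label: {
    PaymentToolsRow(paymentToolName: paymentTool.title, imageName: paymentTool.imageName)
}
.accessibilityElement(children: .combine)
.accessibilityIdentifier("Billing_\(paymentTool.title.replacingOccurrences(of: " ", with: ""))")
```

If the other embedded SwiftUI views that do work put their identifiers directly on the Button, that would be consistent with this explanation.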
                    
                  
                
                    
I’ve developed the Pro Talkie app, a walkie-talkie solution designed to keep you connected with family and friends.
App Store: https://apps.apple.com/in/app/pro-talkie/id6742051063
Play Store: https://play.google.com/store/apps/details?id=com.protalkie.app
While the app works flawlessly on Android and in the foreground on iOS, I’m facing issues with establishing connections when the app is in the background or terminated on iOS.
Specifically, I’ve attempted the following:
Silent pushes and alert payloads: These are intended to wake the app in the background, but they often fail—notifications may not be received or can be delayed by 20–30 minutes, leading to a poor user experience.
VoIP pushes: These reliably wake the app, but they trigger the incoming call UI, which isn’t suitable for a walkie-talkie app that should connect directly without displaying a call screen.
I’ve enabled all the necessary background modes (audio, remote notifications, VoIP, background fetch, processing), but the challenge remains.
How can I ensure a consistent background connection on iOS without triggering the call UI?
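Given the requirements above, the PushToTalk framework (iOS 16+) may be worth evaluating: it exists for exactly this use case, keeping a channel reachable from the background via ephemeral push tokens and showing the compact system PTT UI rather than the full-screen call UI that VoIP pushes trigger. A minimal sketch (channel name and stubs are placeholders):

```swift
import PushToTalk
import UIKit

// Sketch (iOS 16+): join a PTT channel; the system delivers pushes to it in
// the background without presenting the incoming-call screen.
class WalkieTalkieManager: NSObject, PTChannelManagerDelegate, PTChannelRestorationDelegate {
    var channelManager: PTChannelManager?

    func setUp() async throws {
        channelManager = try await PTChannelManager.channelManager(delegate: self,
                                                                   restorationDelegate: self)
        let descriptor = PTChannelDescriptor(name: "Family", image: nil)
        channelManager?.requestJoinChannel(channelUUID: UUID(), descriptor: descriptor)
    }

    // Minimal delegate stubs.
    func channelManager(_ m: PTChannelManager, didJoinChannel channelUUID: UUID, reason: PTChannelJoinReason) {}
    func channelManager(_ m: PTChannelManager, didLeaveChannel channelUUID: UUID, reason: PTChannelLeaveReason) {}
    func channelManager(_ m: PTChannelManager, channelUUID: UUID, didBeginTransmittingFrom source: PTChannelTransmitRequestSource) {}
    func channelManager(_ m: PTChannelManager, channelUUID: UUID, didEndTransmittingFrom source: PTChannelTransmitRequestSource) {}
    func channelManager(_ m: PTChannelManager, receivedEphemeralPushToken pushToken: Data) {
        // Send this token to your server and target PTT pushes at it.
    }
    func incomingPushResult(channelManager: PTChannelManager, channelUUID: UUID,
                            pushPayload: [String: Any]) -> PTPushResult {
        // Reporting the active speaker lets the system start audio without call UI.
        return .activeRemoteParticipant(PTParticipant(name: "Remote", image: nil))
    }
    func channelDescriptor(restoredChannelUUID channelUUID: UUID) -> PTChannelDescriptor {
        PTChannelDescriptor(name: "Family", image: nil)
    }
}
```

This would replace the silent-push/VoIP-push approach rather than patch it; whether its latency meets the app's needs is something to verify.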
                    
                  
                
              
                
              
              
                
Topic: Accessibility & Inclusion
SubTopic: General
Tags: APNS, User Notifications, PushKit, Push To Talk
                    
                      SwiftUI provides the accessibilityCustomContent(_:_:) modifier to add additional accessibility information for an element. However, I couldn’t find a similar approach in UIKit.
Is there a way to achieve this in UIKit?
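There is a UIKit counterpart: since iOS 14, the Accessibility framework's AXCustomContentProvider protocol lets an element vend AXCustomContent values, which VoiceOver reads under "more content". A sketch (the cell class and the "Role"/"Engineer" strings are placeholders):

```swift
import UIKit
import Accessibility

// Sketch (iOS 14+): conform to AXCustomContentProvider to attach extra
// accessibility information, the UIKit analogue of accessibilityCustomContent.
class ContactCell: UITableViewCell, AXCustomContentProvider {
    var accessibilityCustomContent: [AXCustomContent]! {
        get {
            let role = AXCustomContent(label: "Role", value: "Engineer")
            role.importance = .high   // spoken immediately, not only via the rotor
            return [role]
        }
        set { }
    }
}
```

Content with default importance stays out of the main announcement and is available on demand, mirroring the SwiftUI modifier's behavior.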
                    
                  
                
                    
AR Quick Look is suddenly grayed out on my iPhone 15 Pro. I bought the phone new recently and it was working great. Two days ago I updated to iOS 18.1.4; AR mode kept opening, but I started getting a "move iPhone over surface" message and objects wouldn't detect surfaces correctly. I then updated to iOS 18.5, and now when I open Quick Look models, AR is completely grayed out.
Can someone help me fix or diagnose the issue?
Thank you.
                    
                  
                
                    
Could you make Visual Intelligence available for the Action button on the iPhone 16e? It could be limited to iPhones with the A18 and future-generation Apple chips.
                    
                  
                
              
                
              
              
                
Topic: Accessibility & Inclusion
SubTopic: General
                    
                      Is it possible to play certain sounds from a macOS app that remain at full volume when VoiceOver is speaking?
Here's some background:
I want to play sounds from my app (even when it's not in focus) to notify a VoiceOver user that an action is available (this action can be triggered even when the app is not in focus, and is configurable by the user).
I tried using an NSSound for this, but VoiceOver ducks my sound's audio while it is speaking.
Is there some way to avoid audio ducking for certain sounds? Or is there another, perhaps lower-level, audio API that I can use to achieve this?
                    
                  
                
                    
                      Please excuse me if this is obvious. I'm new to Apple development.
Is there a SwiftUI Accessibility Inspector?  I run the standard one, in Xcode 26b3, and it shows me warnings for things that I didn't create in SwiftUI.  I presume that "SwiftUI" is primarily implemented using macros and that these things are either generated or boilerplate lower-level things.  But if so, then why would they trip Accessibility Inspector warnings?  Is there something I can do from SwiftUI to clear them?
Or... is there a demangler somewhere that will translate from these names into something this human might recognize?
I'm targeting macOS, by the way, if that makes any difference.
                    
                  
                
              
                
              
              
                
Topic: Accessibility & Inclusion
SubTopic: General
                    
I added a view controller in the storyboard, added a table view to it, and added a cell under the table. When I run the app, jump to that page, and turn on VoiceOver, I find that when I swipe up or down with three fingers, an announcement is spoken in English. Without changing the cells' accessibility, how can I make the announcement for the three-finger swipe be spoken in Chinese?
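One hedged approach: the three-finger-scroll announcement can be supplied by the container. Overriding accessibilityScroll(_:) and posting a .pageScrolled notification with your own localized string replaces the default announcement, at the cost of handling the scroll yourself (the class, outlet, and page text below are placeholders):

```swift
import UIKit

// Sketch: supply a custom (Chinese) announcement for the VoiceOver
// three-finger scroll gesture by handling the scroll in the container.
class TableContainerViewController: UIViewController {
    @IBOutlet var tableView: UITableView!
    private var page = 1

    override func accessibilityScroll(_ direction: UIAccessibilityScrollDirection) -> Bool {
        switch direction {
        case .down: page += 1
        case .up: page = max(1, page - 1)
        default: return false
        }
        // Scroll the table to the new page here, then announce it.
        UIAccessibility.post(notification: .pageScrolled, argument: "第\(page)页")
        return true
    }
}
```

Returning true tells VoiceOver the scroll was handled, so it does not add its own English announcement on top.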
                    
                  
                
                    
                      Hi,
I am writing in the hope of receiving some clarification about the rationale of the audit type sufficientElementDescription, in the context of the Accessibility Audit API.
Please see my test below:
And here is another example, in the context of Xcode, where the strings visible in the UI are also set as the accessibility labels of their respective elements.
Thanks for your help!
                    
                  
                
                    
                      I have a view dynamically overlaid on a UITableView with proper padding (added when certain conditions are met). When VoiceOver focuses on a cell beneath this overlay, the focused element does not scroll into view. I’ve noticed similar behavior in Apple’s first-party Podcasts app.
Please find the attached image for reference. How can I resolve this issue and ensure VoiceOver scrolls the focused cell into view?
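One possible mitigation (a sketch under assumptions; overlayHeight and the class names are placeholders): reserve inset for the overlay and listen for VoiceOver focus changes, scrolling the focused cell clear yourself.

```swift
import UIKit

// Sketch: when VoiceOver focuses a cell, scroll it out from under the overlay.
class ListViewController: UIViewController {
    @IBOutlet var tableView: UITableView!
    let overlayHeight: CGFloat = 120   // assumed height of the overlay view

    override func viewDidLoad() {
        super.viewDidLoad()
        // Extra inset lets the last rows scroll above the overlay at all.
        tableView.contentInset.bottom = overlayHeight
        NotificationCenter.default.addObserver(self,
            selector: #selector(voFocusChanged(_:)),
            name: UIAccessibility.elementFocusedNotification, object: nil)
    }

    @objc private func voFocusChanged(_ note: Notification) {
        guard let cell = note.userInfo?[UIAccessibility.focusedElementUserInfoKey] as? UITableViewCell,
              let indexPath = tableView.indexPath(for: cell) else { return }
        tableView.scrollToRow(at: indexPath, at: .none, animated: false)
    }
}
```

Given that the Podcasts app shows the same behavior, this looks like a system-level gap worth reporting via Feedback Assistant as well.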
                    
                  
                
                    
Hello. Basically, when I play a game and I get a notification, the game lags badly, dropping to what looks like 20-30 fps, and stutters; when the notification is gone it works normally. Also, sometimes when I open Control Center its animation is slow and laggy.
I bought this phone about a week ago and this makes me sad :(.
                    
                  
                
              
                
              
              
                
Topic: Accessibility & Inclusion
SubTopic: General
                    
                      I am developing a vision os app for controlling an underwater ROV. I have ornaments with telemetry and buttons around a central video view feed. I have custom buttons mappings, such as "A" for locking the depth of the drone. However, when I look at buttons or certain ornaments, my custom gamepad logic is kept from running. This means that when a SwiftUI Button gains focus on visionOS, pressing the controller’s A button triggers the system’s default “click” on that Button rather than my custom buttonA handler. Essentially, focus interception by the system is stealing my A-press events and preventing my custom gamepad logic from running.
Is there a way to disable the built in gamepad interaction and only allow my custom gamepad mappings?
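On tvOS, the documented way to keep the system from consuming controller input for focus is GCEventViewController with controllerUserInteractionEnabled set to false; whether visionOS honors the same API is an assumption to verify, but it may be worth trying, wrapped for SwiftUI:

```swift
import GameController
import SwiftUI

// Sketch: host content in a GCEventViewController so A/B/X/Y and d-pad
// presses reach GCController handlers instead of the focus system.
struct ControllerPassthrough: UIViewControllerRepresentable {
    func makeUIViewController(context: Context) -> GCEventViewController {
        let vc = GCEventViewController()
        vc.controllerUserInteractionEnabled = false   // bypass focus "click"
        return vc
    }
    func updateUIViewController(_ vc: GCEventViewController, context: Context) {}
}
```

Note this disables controller-driven focus navigation entirely while active, so you would toggle it depending on whether the user is piloting the ROV or navigating UI.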
                    
                  
                
              
                
              
              
                
Topic: Accessibility & Inclusion
SubTopic: General
Tags: Game Controller, Accessibility, Focus, visionOS
                    
                      Hi,
I'm trying to fix a tvOS view for the VoiceOver accessibility feature:
TabView { // 5 tabs
    Text(title)
    Button(play)
    ScrollView { // Live
        LazyHStack { 200 items }
    }
    ScrollView { // Continue watching
        LazyHStack { 500 items }
    }
}
When the view shows up VoiceOver reads:
"Home tab 1 of 5, Item 2" - not sure why it reads "Item 2" from the first cell in the scroll view; maybe because it just got loaded by the LazyHStack.
VoiceOver should only read "Home tab 1 of 5".
When moving focus to scroll view it reads:
"Live, Item 1" and after slight delay "Item 1, Item 2, Item 3, Item 4"
When moving focus to second item it reads:
"Item 2" and after slight delay "Item 1, Item 2, Item 3, Item 4"
When moving focus to third item it reads:
"Item 3" and after slight delay "Item 1, Item 2, Item 3, Item 4"
It should just read what is focused, ideally just
"Live, Item 1, 1 of 200"
then after moving focus on item 2
"Item 2, 2 of 200"
this time without the word "Live" because we are on the same scroll view (the same horizontal list)
Currently the app is unusable. We have visually impaired testers, and this rotor reading everything on the screen is totally confusing, because users don't know where they are or what is actually focused.
This is a video streaming app and we are streaming all the time, even on the home page in the background; it binge-plays one item after another, there is usually a never-ending live stream playing, and the user can switch TV channels while we continue to play. VoiceOver should only read what's focused, after user interaction.
The original Apple TV app does not do this, so it cannot be caused by some verbose accessibility setting. It correctly reads only the focused item in scrolling lists.
How do I disable reading content that is not focused?
I tried:
.accessibilityLabel(isFocused ? title : "")
.accessibilityHidden(!isFocused)
.accessibilityHidden(true) - tried on various levels in the view hierarchy
.accessibilityElement(children: .ignore) - even the focused item is not read back by VoiceOver
.accessibilityElement(children: .contain) - tried on various levels in the view hierarchy
.accessibilityElement(children: .combine) - tried on various levels in the view hierarchy
.accessibilityAddTraits(.isHeader) - tried on various levels in the view hierarchy
.accessibilityRemoveTraits(.isHeader) - tried on various levels in the view hierarchy
// the last 2 were basically an attempt to hack it
.accessibilityRotor("", ranges: []) - another hack that I tried on the ScrollView, the LazyHStack, and the top-level view.
50+ other attempts at configuring accessibility tags attached to views.
I have seen all the accessibility videos, tried all sample code projects, I haven't found a solution anywhere, internet search didn't find anything, AI didn't help as it can only provide code that someone else wrote before.
Any idea how to fix this?
Thanks.