In VoiceOver, when the Group Navigation style is enabled, the cursor first focuses on the semantic group; a two-finger swipe left or right then navigates inside the group. This behavior works for the default containers such as the navigation bar, tab bar, and toolbar.
How can I achieve the same behavior for a custom view?
I tried setting accessibilityContainerType = .semanticGroup, but it only takes effect on Mac Catalyst. Is there an equivalent approach for iOS?
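For reference, a minimal sketch of the attempt described above (CardView is a hypothetical name; as noted, the .semanticGroup container type only takes effect under Mac Catalyst in my testing):
import UIKit

// Hypothetical custom container: on Mac Catalyst this groups its children for
// Group Navigation, but on iOS the same property appears to have no effect.
final class CardView: UIView {
    override init(frame: CGRect) {
        super.init(frame: frame)
        isAccessibilityElement = false                  // children remain individual elements
        accessibilityContainerType = .semanticGroup     // works for Mac Catalyst only, in my testing
        accessibilityLabel = "Card"                     // intended group name for the container
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}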
                    
                  
                Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.
  
    
  
  
  
  
    
  
  
                    
                      When VoiceOver reads decimal numbers with six or more digits after the decimal, it stops announcing the decimal separator and also adds pauses between each digit.
Text("0.12345") // VoiceOver: "zero **point** one two three four five"
Text("0.123456") // VoiceOver: "zero one, two, three, four, five, six"
How can I force VoiceOver to announce the decimal separator ("point") and not insert pauses regardless of the number of decimal digits?
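One workaround I'm considering (an assumption on my part, not verified to remove the per-digit pauses) is to supply an explicit accessibility label that spells out the separator:
import SwiftUI

struct DecimalValueText: View {
    let value = "0.123456"   // hypothetical value for illustration

    var body: some View {
        Text(value)
            // Replace the separator with the word "point" so VoiceOver reads it explicitly.
            .accessibilityLabel(value.replacingOccurrences(of: ".", with: " point "))
    }
}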
                    
                  
                
                    
I have implemented a SwiftUI view containing a grid of TextField elements, where focus moves automatically to the next field upon input. This works well on iOS 16 and 17, and the focus highlight stays correct when Full Keyboard Access is enabled.
However, on iOS 18 and above, the Full Keyboard Access focus behaves differently: it always lags behind the actual focus state, causing a mismatch between the visually highlighted field and the field that actually receives text input. This leads to usability issues, especially for users navigating with an external keyboard.
Below is the SwiftUI code for reference:
struct AutoFocusGridTextFieldsView: View {
    private let fieldCount: Int
    private let columns: Int
    @State private var textFields: [String]
    @FocusState private var focusedField: Int?
    init(fieldCount: Int = 17, columns: Int = 5) {
        self.fieldCount = fieldCount
        self.columns = columns
        _textFields = State(initialValue: Array(repeating: "", count: fieldCount))
    }
    var body: some View {
        let rows = (fieldCount / columns) + (fieldCount % columns == 0 ? 0 : 1)
        VStack(spacing: 10) {
            ForEach(0..<rows, id: \.self) { row in
                HStack(spacing: 10) {
                    ForEach(0..<columns, id: \.self) { col in
                        let index = row * columns + col
                        if index < fieldCount {
                            TextField("", text: $textFields[index])
                                .frame(width: 40, height: 40)
                                .multilineTextAlignment(.center)
                                .textFieldStyle(RoundedBorderTextFieldStyle())
                                .focused($focusedField, equals: index)
                                .onChange(of: textFields[index]) { newValue in
                                    if newValue.count > 1 {
                                        textFields[index] = String(newValue.prefix(1))
                                    }
                                    if !textFields[index].isEmpty {
                                        moveToNextField(from: index)
                                    }
                                }
                        }
                    }
                }
            }
        }
        .padding()
        .onAppear {
            focusedField = 0
        }
    }
    private func moveToNextField(from index: Int) {
        if index + 1 < fieldCount {
            focusedField = index + 1
        }
    }
}
struct AutoFocusGridTextFieldsView_Previews: PreviewProvider {
    static var previews: some View {
        AutoFocusGridTextFieldsView(fieldCount: 10, columns: 5)
    }
}
Has anyone else encountered this issue with FocusState in iOS 18?
I believe this is a bug specifically tied to keyboard navigation, since I ran into a similar problem with the UIKit equivalent of this view.
Any insights or suggestions would be greatly appreciated!
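One mitigation worth trying (an assumption, not confirmed to fix the iOS 18 behavior) is to defer the programmatic focus change by one runloop so Full Keyboard Access can catch up:
// Variant of moveToNextField(from:) shown above; the only change is the async dispatch.
private func moveToNextField(from index: Int) {
    guard index + 1 < fieldCount else { return }
    DispatchQueue.main.async {
        focusedField = index + 1
    }
}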
                    
                  
                
                    
                      I’m trying to customize the keyboard focus appearance in SwiftUI.
In UIKit (see WWDC 2021 session Focus on iPad keyboard navigation), it’s possible to remove the default UIFocusHaloEffect and change a view’s appearance depending on whether it has focus or not.
In SwiftUI I’ve tried the following:
.focusable() // .focusable(true, interactions: .activate)
.focusEffectDisabled()
.focused($isFocused)
However, I’m running into several issues:
.focusable(true, interactions: .activate) causes an infinite loop, so keyboard navigation stops responding
.focusEffectDisabled() doesn’t seem to remove the default focus effect on iOS
Using @FocusState prevents Space from triggering the action when the view has keyboard focus
My main questions:
How can I reliably detect whether a SwiftUI view has keyboard focus? (Is there an alternative to FocusState that integrates better with keyboard navigation on iOS?)
What’s the recommended way in SwiftUI to disable the default focus effect (the blue overlay) and replace it with a custom border?
Any guidance or best practices would be greatly appreciated!
Here's my sample code:
import SwiftUI
struct KeyboardFocusExample: View {
    var body: some View {
        // The ScrollView is required, otherwise the custom focus value resets to false after a few seconds. I also need it for my actual use case
        ScrollView {
            VStack {
                Text("First button")
                    .keyboardFocus()
                    .button {
                        print("First button tapped")
                    }
                
                Text("Second button")
                    .keyboardFocus()
                    .button {
                        print("Second button tapped")
                    }
            }
        }
    }
}
// MARK: - Focus Modifier
struct KeyboardFocusModifier: ViewModifier {
    @FocusState private var isFocused: Bool
    
    func body(content: Content) -> some View {
        content
            .focusable() // ⚠️ Must come before .focused(), otherwise the FocusState won’t be recognized
//            .focusable(true, interactions: .activate) // ⚠️ This causes an infinite loop, so keyboard navigation no longer responds
            .focusEffectDisabled() // ⚠️ Has no effect on iOS
            .focused($isFocused)
        
            // Custom Halo effect
            .padding(4)
            .overlay(
                RoundedRectangle(cornerRadius: 18)
                    .strokeBorder(
                        isFocused ? .red : .clear,
                        lineWidth: 2
                    )
            )
            .padding(-4)
    }
}
extension View {
    public func keyboardFocus() -> some View {
        modifier(KeyboardFocusModifier())
    }
}
// MARK: - Button Modifier
/// ⚠️ Using a Button view makes no difference
struct ButtonModifier: ViewModifier {
    let action: () -> Void
    
    func body(content: Content) -> some View {
        content
            .contentShape(Rectangle())
            .onTapGesture {
                action()
            }
            .accessibilityAction {
                action()
            }
            .accessibilityAddTraits(.isButton)
            .accessibilityElement(children: .combine)
            .accessibilityRespondsToUserInteraction()
    }
}
extension View {
    public func button(action: @escaping () -> Void) -> some View {
        modifier(ButtonModifier(action: action))
    }
}
                    
                  
                
                    
                      I’m trying to add the .header accessibility trait to a UISegmentedControl so that VoiceOver recognizes it accordingly. However, setting the trait using the following code doesn’t seem to have any effect:
segmentControl.accessibilityTraits = segmentControl.accessibilityTraits.union(.header)
Even after applying this, VoiceOver doesn’t announce it as a header. Is there any workaround or recommended approach to achieve this?
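A fragile workaround sketch (it assumes the segments are exposed through the control's subviews, which UIKit does not guarantee) is to add the trait to each segment rather than to the control itself:
// Fragile: relies on UISegmentedControl's private subview structure.
for segment in segmentControl.subviews {
    segment.accessibilityTraits = segment.accessibilityTraits.union(.header)
}
A separate, visually hidden header element placed before the control might be a sturdier alternative.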
                    
                  
                
                    
Who else has this error?
                    
                  
                
              
                
              
              
                
                  
                
              
              
              
  
  
    
    
  
  
              
                
                
              
            
          
                    
I'm trying to validate my app's handling of VoiceOver accessibility when using VoiceOver hotspots.
What I'm doing:
Set a hotspot
Validate that the hotspot references the correct control within the hotspot chooser
Set another hotspot
Validate that both hotspots reference the correct controls within the hotspot chooser
Try to use one of the hotspots
Result: the hotspot has changed to some other control in the app, usually one of the window buttons (close, minimize, full screen), but at other times it is one of the menus or a completely different control in the window.
My question is how I can debug what is going on in the app when the hotspots are resolved and invoked. I'm assuming I have some accessibility property set improperly for these controls, and that this is causing the incorrect resolution of the hotspots.
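For example, is something like an explicit identifier and label needed on each control (placeholder names below)? I'm not sure whether this factors into hotspot resolution at all, so this is only a sketch of the kind of property I mean:
import AppKit

// Give each control a stable identifier and label so it can be told apart
// when VoiceOver resolves a hotspot back to a control.
let exportButton = NSButton(title: "Export", target: nil, action: nil)
exportButton.setAccessibilityIdentifier("export-button")
exportButton.setAccessibilityLabel("Export")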
                    
                  
                
              
                
              
              
                
                  
                
              
              
              
  
  
    
    
  
  
              
                
                
              
            
          
                    
                      I have a couple follow up questions after the "Accessibility technologies group lab".
I know it was briefly mentioned that user feedback is an excellent way to grow inclusivity in the design of an app, and that these forums are one avenue for gathering it.
Is inviting folks here on the forums via TestFlight a reasonable approach to this for a solo developer?
Are there other strategies, avenues, or examples to promote user feedback?
                    
                  
                
                    
I use AttributedString to create a string containing a link, and I set that AttributedString on a UILabel. How should I set up accessibility to make sure that:
keyboard focus can land on the linked substring, and the link can be opened from the keyboard
VoiceOver can read the whole string, and the linked substring can be activated to open the link
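For illustration, a sketch of an alternative I'm weighing (an assumption on my part: a non-editable UITextView instead of UILabel, since I'm not sure UILabel can expose the link for focus or activation; the text and URL are placeholders):
import UIKit

func makeLinkTextView() -> UITextView {
    let textView = UITextView()
    textView.isEditable = false
    textView.isScrollEnabled = false
    textView.isSelectable = true          // needed so the link can be focused and activated

    var text = AttributedString("Read our terms of service.")
    if let range = text.range(of: "terms of service") {
        text[range].link = URL(string: "https://example.com/terms")
    }
    textView.attributedText = NSAttributedString(text)
    return textView
}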
Thanks a lot.
                    
                  
                
              
                
              
              
                
                  
                
              
              
              
  
  
    
    
  
  
              
                
                
              
            
          
                    
                      Allow the user to add their own tags to the default emoji tags.
For instance, this emoji, for me, is nonna: 🤌🏻. My efficiency would improve immensely if I could search for it as the “Nonna” emoji, rather than searching for nonna, remembering it doesn’t exist, trying to search for other things it might be called, realising I don’t know what it is, then having to scroll through all the hand emojis twice to find it.
🤌🏻🤞🏼👌
                    
                  
                
                    
I'm unable to connect CarPlay to a Mercedes GLC 2024 with my iPhone 15 Pro Max. I followed all of Apple's recommendations and updated the car's software too, but it didn't work despite many attempts. Funny that my iPhone 11 connects flawlessly, both wired and wireless.
However, my iPhone 15 Pro Max does connect to other cars, either with a cable or wirelessly.
Is this a bug, or a real incompatibility?
                    
                  
                
                    
I have been working on a feature with a SwiftUI List that loads both previous and next pages: the user can scroll up or down to load the previous or next page of data.
Recently, I hit an accessibility issue while testing VoiceOver: when the user lands on the list screen and swipes across the screen from the navigation, focus should move into the list and highlight the first visible item.
But when user swipes back:
Should it load the previous data and announce the previous item or it should go back to the navigation items?
If it loads the previous item, what if the user wants to go to the navigation to switch to other actions and vice-versa?
Did anyone come across this kind of issue? What can be the standard expected behavior in this case if list has both previous and next page scroll?
I tried different gestures (https://support.apple.com/en-in/guide/iphone/iph3e2e2281/ios), but it isn't working.
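For reference, a minimal sketch of one direction (the AccessibilityFocusState usage and the refreshable hook are my own assumptions, not an established answer to which behavior is standard):
import SwiftUI

struct PagedListView: View {
    @State private var items: [String] = (0..<20).map { "Item \($0)" }
    @AccessibilityFocusState private var focusedItem: String?

    var body: some View {
        List(items, id: \.self) { item in
            Text(item)
                .accessibilityFocused($focusedItem, equals: item)
        }
        .refreshable {
            // Hypothetical "load previous page" hook.
            let previousPage = (-10 ..< 0).map { "Item \($0)" }
            items.insert(contentsOf: previousPage, at: 0)
            // Keep VoiceOver focus near where the user was instead of jumping back to navigation.
            focusedItem = previousPage.last
        }
    }
}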
                    
                  
                
                    
                      I am seeing a strange issue where NSObject accessibilityRespondsToUserInteraction returns true on Simulator but false on device.
Checking the same object in the Simulator with Accessibility Inspector, I see its traits reported as image, so why would it return true in that case?
Is there any other way to check whether an item responds to user interaction (i.e., is clickable) besides that property and the traits?
(Or is it just another bug?)
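A heuristic fallback sketch (an assumption: trait inspection only catches the obviously interactive cases and is not equivalent to accessibilityRespondsToUserInteraction):
import UIKit

func isLikelyInteractive(_ element: NSObject) -> Bool {
    let interactiveTraits: UIAccessibilityTraits = [.button, .link, .adjustable]
    return element.accessibilityRespondsToUserInteraction
        || !element.accessibilityTraits.intersection(interactiveTraits).isEmpty
}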
                    
                  
                
                    
I have an app that uses Nearby Interaction with a custom accessory.
It works great on iPhone 11-13.
Starting with iPhone 14, one must use ARKit to get angles.
We have two problems:
ARKit is light sensitive, and we do not control the lighting where this app runs. The iPhone 11-13 approach works great even in the dark. (Our users are blind; this is an accessibility app.)
ARKit wants to be in the foreground, but our users cannot see it, and we have a voice-oriented UI that provides navigation instructions.
If ARKit is in the foreground, our app doesn't work.
With an iPhone 15 Pro Max on iOS 18, I got an "access denied" error (not "permission denied"). Now that I am on iOS 26, the Bluetooth scan doesn't happen at all.
It also fails the same way on an iPhone 17 on iOS 26; I can't get a callback now that release signing is no longer done.
This same code works fine on iOS 17.1 on an iPhone 12.
Info.plist here
info.txt
        if SearchedServices.isEmpty {
            services = [TransferService.serviceUUID, QorvoNIService.serviceUUID]
        }
        logger.info(
            "scannerready, starting scan for peripherals \(services) and devices \(IDs)")
        filteredIDs = IDs
        scanning = true
        centralManager.scanForPeripherals(withServices: services,
                                          options: [CBCentralManagerScanOptionAllowDuplicatesKey: true])
The calling code:
dataChannel.autoConnect = autoConnect
dataChannel.start(x, ids)   // dataChannel.start is the function above
self.scanning = true
return "scanning started"
... log output
services from js =  and devices= 5FE04CBB
services in implementation =
bluetooth ready, starting scan for peripherals [] and devices ["5FE04CBB"]
scannerready, starting scan for peripherals [6E400001-B5A3-F393-E0A9-E50E24DCCA9E, 2E938FD0-6A61-11ED-A1EB-0242AC120002] and devices ["5FE04CBB"]
⚡️  TO JS {"value":"scanning started"}
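Separately, a self-contained sketch of the first thing worth logging when the scan never starts (the class name is a placeholder; this is an assumption on my part, not a diagnosis of the iOS 26 behavior):
import CoreBluetooth

// Confirm the manager's power state and the app's Bluetooth authorization before scanning;
// anything other than .poweredOn / an allowed authorization would explain a scan that never happens.
final class ScannerDebug: NSObject, CBCentralManagerDelegate {
    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        print("central state: \(central.state.rawValue), auth: \(CBManager.authorization.rawValue)")
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: nil,
                                   options: [CBCentralManagerScanOptionAllowDuplicatesKey: true])
    }
}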
                    
                  
                
                    
This issue keeps coming up again and again. I restarted my laptop and have enough storage, but I don't know why it keeps happening!!
                    
                  
                
              
                
              
              
                
                  
                
              
              
              
  
  
    
    
  
  
              
                
                
              
            
          
                    
I am developing a program on my chip and attempting to establish a connection with the Wi-Fi Aware demo app introduced with iOS 26. Currently, I am encountering an issue during the pairing phase.
If I am the subscriber of the service and successfully complete the follow-up frame exchange of pairing bootstrapping, I see the PIN code displayed by iOS.
Question 1: How should I use this PIN code?
Question 2: Subsequently, I need to negotiate keys with iOS through PASN. What should I use as the password for the PASN SAE process?
If I am the subscriber of the service and successfully complete the follow-up frame exchange of pairing bootstrapping, I should display the PIN code.
Question 3: How do I generate this PIN code?
Question 4: Subsequently, I need to negotiate keys with iOS through PASN. What should I use as the password for the PASN SAE process?
                    
                  
                
              
                
              
              
                
                  
                
              
              
              
  
  
    
    
  
  
              
                
                
              
            
          
                    
                      Hello,
I’m reaching out regarding an issue with our organization’s Apple Developer Program enrollment. We’ve successfully created a developer account and our organization is verified through the D-U-N-S system. The D-U-N-S ID is correctly displayed in our Apple Developer account.
However, the enrollment status still shows: “Your enrollment is being processed.” It’s been 3 months, and we haven’t received any further communication or updates.
Has anyone experienced a similar delay? Is there anything else we should do to expedite the process?
Any guidance or insight would be greatly appreciated.
Thanks,
                    
                  
                
              
                
              
              
                
                  
                
              
              
              
  
  
    
    
  
  
              
                
                
              
            
          
                    
I upgraded to iPadOS 26.1 beta 2 yesterday, and suddenly I can’t use screen sharing in apps like GNeet, Discord, Zoom, etc.
                    
                  
                
              
                
              
              
                
                  
                
              
              
              
  
  
    
    
  
  
              
                
                
              
            
          
                    
While editing the search text with an external keyboard (with VoiceOver on), if I try to navigate to the List using the keyboard, the focus jumps back to the search field immediately, preventing selection of list items. It's important to note that VoiceOver navigation alone, without a keyboard, works as expected.
It’s as if the List never gains focus—every attempt to move focus lands back on the search field.
The code:
struct ContentView: View {
    @State var searchText = ""
    let items = ["Apple", "Banana", "Cherry", "Date", "Elderberry", "Fig", "Grape"]

    var filteredItems: [String] {
        if searchText.isEmpty {
            return items
        } else {
            return items.filter { $0.localizedCaseInsensitiveContains(searchText) }
        }
    }

    var body: some View {
        if #available(iOS 16.0, *) {
            NavigationStack {
                List(filteredItems, id: \.self) { item in
                    Text(item)
                }
                .navigationTitle("Fruits")
                .searchable(text: $searchText)
            }
        } else {
            NavigationView {
                List(filteredItems, id: \.self) { item in
                    Text(item)
                }
                .navigationTitle("Fruits")
                .searchable(text: $searchText)
            }
        }
    }
}
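A sketch of one direction (an assumption on my part; it only steers VoiceOver focus and does not address the hardware keyboard focus that is misbehaving):
import SwiftUI

struct FocusWorkaroundView: View {
    @State private var searchText = ""
    @AccessibilityFocusState private var focusedItem: String?
    let items = ["Apple", "Banana", "Cherry"]

    var body: some View {
        NavigationStack {
            List(items, id: \.self) { item in
                Text(item)
                    .accessibilityFocused($focusedItem, equals: item)
            }
            .searchable(text: $searchText)
            .onSubmit(of: .search) {
                // Move VoiceOver focus to the first result once the search is submitted.
                focusedItem = items.first
            }
        }
    }
}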
                    
                  
                
                    
Why did the screen recording button disappear? It cannot be found anywhere.
                    
                  
                
              
                
              
              
                