I’m trying to add the .header accessibility trait to a UISegmentedControl so that VoiceOver recognizes it as a header. However, setting the trait with the following code doesn’t seem to have any effect:
segmentControl.accessibilityTraits = segmentControl.accessibilityTraits.union(.header)
Even after applying this, VoiceOver doesn’t announce it as a header. Is there any workaround or recommended approach to achieve this?
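For anyone experimenting with this, one workaround that might be worth trying (an untested assumption on my part, since UIKit controls can reset their own traits internally) is to subclass the control and force the trait in the getter:
import UIKit

// Hypothetical workaround sketch: force the .header trait in the getter so it
// can't be dropped by the control's internal trait updates.
final class HeaderSegmentedControl: UISegmentedControl {
    override var accessibilityTraits: UIAccessibilityTraits {
        get { super.accessibilityTraits.union(.header) }
        set { super.accessibilityTraits = newValue }
    }
}
Note that VoiceOver typically focuses the individual segments rather than the control itself, which may be why a trait set on the control is never announced.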
While editing the search text with an external keyboard (VoiceOver on), if I try to navigate to the List using the keyboard, the focus jumps back to the search field immediately, preventing selection of list items. It's important to note that VoiceOver navigation alone, without a keyboard, works as expected.
It's as if the List never gains focus: every attempt to move focus lands back on the search field.
The code:
import SwiftUI

struct ContentView: View {
    @State var searchText = ""

    let items = ["Apple", "Banana", "Cherry", "Date", "Elderberry", "Fig", "Grape"]

    var filteredItems: [String] {
        if searchText.isEmpty {
            return items
        } else {
            return items.filter { $0.localizedCaseInsensitiveContains(searchText) }
        }
    }

    var body: some View {
        if #available(iOS 16.0, *) {
            NavigationStack {
                List(filteredItems, id: \.self) { item in
                    Text(item)
                }
                .navigationTitle("Fruits")
                .searchable(text: $searchText)
            }
        } else {
            NavigationView {
                List(filteredItems, id: \.self) { item in
                    Text(item)
                }
                .navigationTitle("Fruits")
                .searchable(text: $searchText)
            }
        }
    }
}
Context:
We are using UIKit to provide accessibility in our iOS app. The app mainly contains documents/books that users can read.
Issue: VoiceOver skips lines that contain leading spaces. We have observed this in multiple languages. It only happens with line granularity; other granularities work as expected.
Implementation:
We use the APIs below to provide line content to VoiceOver:
UIAccessibilityReadingContent
- accessibilityPageContent
- accessibilityFrameForLineNumber
- accessibilityContentForLineNumber
We create UIAccessibilityElement objects to pass to VoiceOver, and each one implements UIAccessibilityReadingContent to provide readable content.
We also use the APIs below to cross element boundaries for all granular navigations:
accessibilityNextTextNavigationElement
accessibilityPreviousTextNavigationElement
We want to know whether skipping lines with leading spaces is expected behavior or a bug in UIKit.
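For reference, a simplified sketch of the shape of our elements (the `lines` and `lineFrames` stores are placeholders for our real layout data):
import UIKit

// Simplified sketch of a page element that vends line content to VoiceOver.
// `lines` and `lineFrames` are hypothetical stores for the laid-out text.
final class PageElement: UIAccessibilityElement, UIAccessibilityReadingContent {
    var lines: [String] = []
    var lineFrames: [CGRect] = []

    func accessibilityLineNumber(for point: CGPoint) -> Int {
        lineFrames.firstIndex { $0.contains(point) } ?? NSNotFound
    }

    func accessibilityContent(forLineNumber lineNumber: Int) -> String? {
        // Lines with leading spaces are returned verbatim; these are the ones
        // VoiceOver skips in line granularity.
        lines.indices.contains(lineNumber) ? lines[lineNumber] : nil
    }

    func accessibilityFrame(forLineNumber lineNumber: Int) -> CGRect {
        lineFrames.indices.contains(lineNumber) ? lineFrames[lineNumber] : .zero
    }

    func accessibilityPageContent() -> String? {
        lines.joined(separator: "\n")
    }
}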
In SwiftUI, the date picker component fails color contrast accessibility checks. The code below was used to create the date picker:
import SwiftUI

struct ContentView: View {
    @State private var date = Date()
    @State private var selectedDate: Date = .init()

    var body: some View {
        let min = Calendar.current.date(byAdding: .day, value: 14, to: Date()) ?? Date()
        let max = Calendar.current.date(byAdding: .year, value: 4, to: Date()) ?? Date()

        DatePicker(
            "Start Date",
            selection: $date,
            in: min ... max,
            displayedComponents: [.date]
        )
        .datePickerStyle(.graphical)
        .frame(alignment: .topLeading)
        .onAppear {
            selectedDate = Calendar.current.date(byAdding: .day, value: 14, to: Date()) ?? Date()
        }
    }
}

#Preview {
    ContentView()
}
Attaching a screenshot of the accessibility failure.
I’m trying to customize the keyboard focus appearance in SwiftUI.
In UIKit (see WWDC 2021 session Focus on iPad keyboard navigation), it’s possible to remove the default UIFocusHaloEffect and change a view’s appearance depending on whether it has focus or not.
In SwiftUI I’ve tried the following:
.focusable() // .focusable(true, interactions: .activate)
.focusEffectDisabled()
.focused($isFocused)
However, I’m running into several issues:
.focusable(true, interactions: .activate) causes an infinite loop, so keyboard navigation stops responding
.focusEffectDisabled() doesn’t seem to remove the default focus effect on iOS
Using @FocusState prevents Space from triggering the action when the view has keyboard focus
My main questions:
How can I reliably detect whether a SwiftUI view has keyboard focus? (Is there an alternative to FocusState that integrates better with keyboard navigation on iOS?)
What’s the recommended way in SwiftUI to disable the default focus effect (the blue overlay) and replace it with a custom border?
Any guidance or best practices would be greatly appreciated!
Here's my sample code:
import SwiftUI

struct KeyboardFocusExample: View {
    var body: some View {
        // The ScrollView is required, otherwise the custom focus value resets
        // to false after a few seconds. I also need it for my actual use case.
        ScrollView {
            VStack {
                Text("First button")
                    .keyboardFocus()
                    .button {
                        print("First button tapped")
                    }
                Text("Second button")
                    .keyboardFocus()
                    .button {
                        print("Second button tapped")
                    }
            }
        }
    }
}

// MARK: - Focus Modifier

struct KeyboardFocusModifier: ViewModifier {
    @FocusState private var isFocused: Bool

    func body(content: Content) -> some View {
        content
            .focusable() // ⚠️ Must come before .focused(), otherwise the FocusState won’t be recognized
            // .focusable(true, interactions: .activate) // ⚠️ This causes an infinite loop, so keyboard navigation no longer responds
            .focusEffectDisabled() // ⚠️ Has no effect on iOS
            .focused($isFocused)
            // Custom Halo effect
            .padding(4)
            .overlay(
                RoundedRectangle(cornerRadius: 18)
                    .strokeBorder(
                        isFocused ? .red : .clear,
                        lineWidth: 2
                    )
            )
            .padding(-4)
    }
}

extension View {
    public func keyboardFocus() -> some View {
        modifier(KeyboardFocusModifier())
    }
}

// MARK: - Button Modifier

/// ⚠️ Using a Button view makes no difference
struct ButtonModifier: ViewModifier {
    let action: () -> Void

    func body(content: Content) -> some View {
        content
            .contentShape(Rectangle())
            .onTapGesture {
                action()
            }
            .accessibilityAction {
                action()
            }
            .accessibilityAddTraits(.isButton)
            .accessibilityElement(children: .combine)
            .accessibilityRespondsToUserInteraction()
    }
}

extension View {
    public func button(action: @escaping () -> Void) -> some View {
        modifier(ButtonModifier(action: action))
    }
}
Hey folks, I'd like to ask for help on this topic:
I think this is exactly the same problem as "Combobox not working with VoiceOver after…" on Apple Community.
VoiceOver also breaks the combobox from the official W3C ARIA website: https://www.w3.org/WAI/ARIA/apg/patterns/combobox/examples/combobox-autocomplete-list/. With VoiceOver off, I can use the up/down arrows to move through the dropdown menu items, but with VoiceOver on, the up/down arrows cannot reach them.
Is there an official tutorial on how to control a combobox with VoiceOver?
Kind regards,
Jakub
I'm facing a bizarre issue with Apple's Accessibility APIs. I am registering an AXObserver that listens for, among other things, the kAXSelectedTextChangedNotification. For many new users, the kAXSelectedTextChangedNotification is not triggered, even though they have enabled Accessibility permission for the app. Other notifications are getting through (kAXWindowMovedNotification, kAXWindowResizedNotification, kAXValueChangedNotification, etc.; full list here), just not the kAXSelectedTextChangedNotification!
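For context, the registration looks roughly like this (a simplified sketch, not our exact code; `pid` stands in for the target app's process identifier, and error handling is omitted):
import ApplicationServices

func startObserving(pid: pid_t) {
    let appElement = AXUIElementCreateApplication(pid)
    var observer: AXObserver?
    let callback: AXObserverCallback = { _, _, notification, _ in
        print("Received \(notification)")
    }
    guard AXObserverCreate(pid, callback, &observer) == .success,
          let observer else { return }
    // Subscribe on the root application element so child text fields report too.
    AXObserverAddNotification(observer, appElement,
                              kAXSelectedTextChangedNotification as CFString, nil)
    CFRunLoopAddSource(CFRunLoopGetCurrent(),
                       AXObserverGetRunLoopSource(observer), .defaultMode)
}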
We've found that we can reproduce the error by removing accessibility permission for the app and rebooting our computers. After restarting and reenabling accessibility permissions, the kAXSelectedTextChangedNotification was not received, even though other notifications were fine.
Strangely, the issue can be resolved by launching Apple's Accessibility Inspector app on an impacted computer. Once the Accessibility Inspector is loaded, the kAXSelectedTextChangedNotifications start coming through as expected. This implies to me that either:
We are missing some needed setup when starting the observers. Accessibility Inspector gets it right, thus ‘starting’ the system properly.
Accessibility Inspector is using some Apple private APIs that we don’t have access to.
Things I’ve tried:
I've tried subscribing to the kAXSelectedTextChangedNotification on different AXUIElements, including the system-wide element, the application element, and child elements of the AXApplication. None of these received the kAXSelectedTextChangedNotification until Accessibility Inspector was booted up. No surprises here, as Apple's documentation confirms that you should add the notification to the root application AXUIElement if you want to receive notifications for all its children.
I had a theory that the issue might be due to my code calling AXUIElementCreateApplication multiple times, possibly creating multiple "Applications" in Apple's Accessibility implementation. If that’s the case, the notifications might be sent to the wrong application AXUIElement. However, refactoring my code to only call AXUIElementCreateApplication once didn't resolve the issue.
I thought the issue might be caused by subscribing to the kAXSelectedTextChangedNotification on the high-level application element (at odds with Apple's documentation). I've tried traversing the child AXUIElements until we find one with the kAXSelectedTextAttribute and then subscribing to that. This did not resolve the issue. I don't think it's the correct path to continue exploring, given that the notifications are received correctly after Accessibility Inspector is launched.
There is one exception to the above: if I add the kAXSelectedTextChangedNotification listener to a specific text field AXUIElement, I do receive the notification on that text field. However, this is not practical; I need a solution that works for all text fields within an app. Accessibility Inspector appears to be doing something that causes the selected-text-changed notifications to be correctly passed up to the high-level application AXUIElement.
Another thought is that I could traverse the entire Accessibility hierarchy and add listeners to every subview that has the kAXSelectedTextAttribute. However, I don’t like this long-term solution. It will be slow and incomplete: new elements get added and removed frequently. I just want the kAXSelectedTextChangedNotification to be received by the high-level Application AXUIElement, which the documentation suggests it should be. I also have evidence that this can work, since notifications start coming through after Accessibility Inspector is launched. It’s just a matter of discovering how to replicate whatever Accessibility Inspector is doing.
An interesting wrinkle: I implemented the 'traverse' strategy above, but was surprised by how few elements were in the hierarchy. Most apps only go down ~2-3 levels, which didn't seem right to me. Perhaps the Accessibility tree isn't fully initialized? I tried adding a 5-second delay to allow more initialization time, but it didn't change anything.
Does anyone have any ideas? Here's our file.
Why did the screen recorder button disappear? It cannot be found anywhere.
I am invoking a UIImagePickerController with source type UIImagePickerControllerSourceTypePhotoLibrary from my view controller. I want to shift keyboard focus to the Cancel button, which is the first interactive element on the gallery picker. When a user has Full Keyboard Access turned on, they should be able to press Tab and interact with the gallery picker modal. How do I achieve this?
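In case it helps, here is a minimal sketch of the presentation; posting a screenChanged notification afterward is an assumption on my part, and whether Full Keyboard Access honors it is unconfirmed:
import UIKit

// Sketch: present the photo-library picker, then hint assistive technologies
// at a focus change. Whether Full Keyboard Access follows this is unconfirmed.
func presentPicker(from viewController: UIViewController) {
    let picker = UIImagePickerController()
    picker.sourceType = .photoLibrary
    viewController.present(picker, animated: true) {
        UIAccessibility.post(notification: .screenChanged, argument: nil)
    }
}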
Hi everyone,
I’ve been analyzing the current state of Sign Language accessibility tools, and I noticed a significant gap in learning tools: we lack real-time feedback for students (e.g., "Is my hand position correct?").
Most current solutions rely on 2D video processing, which struggles with depth perception and occlusion (hand-over-hand or hand-over-face gestures), which are critical in Sign Language grammar.
I'd like to propose/discuss an architecture leveraging the current LiDAR + Neural Engine capabilities found in iPhone devices to solve this.
The Concept: Skeleton-based Normalization
Instead of training ML models on raw video frames (which introduces noise from lighting, skin tone, and clothing), we could use ARKit's Body Tracking to abstract the input.
Capture: Use ARKit/LiDAR to track the user's upper body and hand joints in 3D space.
Data Normalization: Extract only the vector coordinates (X, Y, Z of joints). This creates a "clean" dataset, effectively normalizing the user regardless of physical appearance.
Comparison: Feed these vectors into a CoreML model trained on "Reference Skeletons" (recorded by native signers).
Feedback Loop: The app calculates the geometric distance between the user's pose and the reference pose to provide specific corrections (e.g., "Raise your elbow 10 degrees"); a rough sketch of this step follows this list.
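A minimal sketch of that comparison step, under stated assumptions: ARKit body tracking exposes coarse upper-body joints only (detailed fingers would need Vision's hand-pose APIs), and `reference` is a hypothetical stored pose recorded from a native signer, keyed by joint name.
import ARKit
import simd

// Sum of per-joint distances between the live pose and a reference pose.
// `reference` is a hypothetical stored pose keyed by ARSkeleton joint name.
func poseDistance(anchor: ARBodyAnchor, reference: [String: SIMD3<Float>]) -> Float {
    var total: Float = 0
    for (name, referencePosition) in reference {
        let joint = ARSkeleton.JointName(rawValue: name)
        guard let transform = anchor.skeleton.modelTransform(for: joint) else { continue }
        // Joint position in body-anchor space: the translation column.
        let position = SIMD3<Float>(transform.columns.3.x,
                                    transform.columns.3.y,
                                    transform.columns.3.z)
        total += simd_distance(position, referencePosition)
    }
    return total
}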
Why this approach?
Solves Occlusion: LiDAR handles depth much better than standard RGB cameras when hands cross the body.
Privacy: We are processing coordinates, not video streams.
Efficiency: Comparing vector sequences is computationally cheaper than video analysis, preserving battery life.
Has anyone experimented with using ARKit Body Anchors specifically for comparing complex gesture sequences against a stored "correct" database? I believe this "Skeleton First" approach is the key to scalable Sign Language education apps.
Looking forward to hearing your thoughts.
Is there any way to get history?
The only way I found to make accessibility focus work correctly in the detent of a fullScreenCover is to apply the focus manually. The issue: in ContentView the grabber works, while in the fullscreen cover it does not. Is there something I'm missing, or is this a bug? I also don't understand why I need to apply focus manually in the fullscreen cover but not in ContentView.
import SwiftUI

struct ContentView: View {
    @State private var buttonClicked = false
    @State private var bottomSheetShowing = false

    var body: some View {
        NavigationView {
            VStack {
                Button(action: {
                    buttonClicked = true
                }, label: {
                    Text("First Page Button")
                        .padding()
                        .background(Color.blue)
                        .foregroundColor(.white)
                        .cornerRadius(8)
                })
                .accessibilityLabel("First Page Button")

                FullscreenView2()
            }
            .navigationTitle("Welcome")
            .fullScreenCover(isPresented: $buttonClicked) {
                FullscreenView(buttonClicked: $buttonClicked, bottomSheetShowing: $bottomSheetShowing)
            }
        }
    }
}

struct FullscreenView: View {
    @Binding var buttonClicked: Bool
    @Binding var bottomSheetShowing: Bool

    var body: some View {
        NavigationView {
            VStack {
                Button(action: {
                    bottomSheetShowing = true
                }, label: {
                    Text("Show Bottom Sheet")
                        .padding()
                        .background(Color.green)
                        .foregroundColor(.white)
                        .cornerRadius(8)
                })
            }
            .accessibilityHidden(bottomSheetShowing)
            .navigationTitle("Fullscreen View")
            .toolbar {
                ToolbarItem(placement: .navigationBarLeading) {
                    Button(action: {
                        buttonClicked = false
                    }, label: {
                        Text("Close")
                    })
                    .accessibilityLabel("Close Fullscreen View Button")
                }
            }
            .accessibilityHidden(bottomSheetShowing)
            .onChange(of: bottomSheetShowing, perform: { _ in })
            .sheet(isPresented: $bottomSheetShowing) {
                if #available(iOS 16.0, *) {
                    BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
                        .presentationDetents([.medium, .large])
                } else {
                    BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
                }
            }
        }
    }
}

struct FullscreenView2: View {
    @State var bottomSheetShowing = false

    var body: some View {
        VStack {
            Button(action: {
                bottomSheetShowing = true
            }, label: {
                Text("Show Bottom Sheet")
                    .padding()
                    .background(Color.green)
                    .foregroundColor(.white)
                    .cornerRadius(8)
            })
        }
        .accessibilityHidden(bottomSheetShowing)
        .navigationTitle("Fullscreen View")
        //.accessibilityHidden(bottomSheetShowing)
        .onChange(of: bottomSheetShowing, perform: { _ in })
        .sheet(isPresented: $bottomSheetShowing) {
            if #available(iOS 16.0, *) {
                BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
                    .presentationDetents([.medium, .large])
            } else {
                BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
            }
        }
    }
}

struct BottomSheetView: View {
    @Binding var bottomSheetShowing: Bool
    // @AccessibilityFocusState var isFocused: Bool

    var body: some View {
        VStack(spacing: 20) {
            Text("Bottom Sheet")
                .font(.headline)
                .accessibilityAddTraits(.isHeader)

            Button(action: {
                bottomSheetShowing = false
            }, label: {
                Text("Dismiss")
                    .padding()
                    .background(Color.red)
                    .foregroundColor(.white)
                    .cornerRadius(8)
            })
            .accessibilityLabel("Dismiss Bottom Sheet Button")
        }
        .padding()
        .frame(maxWidth: .infinity, maxHeight: .infinity)
        .background(
            Color(UIColor.systemBackground)
                .edgesIgnoringSafeArea(.all)
        )
        .accessibilityAddTraits(.isModal) // Indicates that this view is a modal
        // .onAppear {
        //     // Set initial accessibility focus when the sheet appears
        //     DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
        //         isFocused = true
        //     }
        // }
        // .accessibilityFocused($isFocused)
    }
}
AR Quick Look is suddenly grayed out on my iPhone 15 Pro. I bought the phone new recently and it was working great. Two days ago I updated to iOS 18.1.4; AR mode kept opening, but I started getting a "move iPhone over surface" message and the object wouldn't detect surfaces correctly. I then updated to iOS 18.5, and now when I open Quick Look models, AR is completely grayed out.
Can someone help me fix or diagnose the issue?
Thank you.
We use CallKit and WebRTC to implement call functionality.
Users can select video, voice, or text calling as their calling method.
When making a text call, voice input is grayed out and cannot be used. Is there a solution?
Hello!
I was doing some accessibility testing for my app and found out that when the user switches the text size, all of the data in the text fields is reset, which causes major disruption.
I've tried looking for documentation, but all I've found is information on how to dynamically scale the UI for different text sizes, which I've already implemented.
My guess is that every time Dynamic Type registers a change, it redraws my UI instead of just updating it.
How can I make sure the data is not reset when the text size changes?
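If it's a UIKit app, one thing worth checking (an assumption about your setup, since the post doesn't say which framework you use): let the fields rescale themselves in place instead of rebuilding the view hierarchy when the trait change fires, for example:
import UIKit

// Sketch, assuming UIKit: keep the same UITextField instances alive and let
// them rescale in place when Dynamic Type changes; recreating the hierarchy
// on traitCollectionDidChange is what would wipe their text.
final class FormViewController: UIViewController {
    private let nameField = UITextField()

    override func viewDidLoad() {
        super.viewDidLoad()
        nameField.font = UIFont.preferredFont(forTextStyle: .body)
        nameField.adjustsFontForContentSizeCategory = true // updates font in place
        view.addSubview(nameField)
    }
}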
I want to open a developer account, not as an individual but as a company. I have an existing company, a DUNS number, a finished website, and an official email. But when I apply, Apple emails me saying it wants a public-facing website in the organization's name, even though all of these matters have already been resolved. Why don't they respond to us?
I have more than 1,000 notes organized in parent/child folders up to 5 levels deep. From the 5th level of folders onward, I can no longer share a note. The note itself is not shared; what gets shared is that of the parent folder.
Thank you very much.
Best regards,
Christophe
Should I allow the CIJSULAgent to find devices on the local network?
I get it, "Why don't you just get an apple watch?" regardless, My Macbok Pro M4 doesn't recognize my charging cable in any of my USBC ports.
The Cable works with any other power supply I plug it into, but the Macbook doesn't even register that a cable is connected.
-- Running Sequioa Beta 15.4 thinking it may have been software related. No change.
-- Settings> Privacy and Security> accessories> Changed between all available options. No change.
-- Option + Apple Logo> system info> Thunderbolt/USB4> none of the ports show that the cable is connected.
-- Any other USB-C Cable is recognized in any of the ports on the Macbook. Just not the cable for the Galaxy Watch.
Again, the cable charges in ANY other USBc ports from ANY other device I connect it to.
Am I missing something? Or is this an intended jab at Samsung from Apple? lol
Hello,
I am a student studying accessibility.
I aim to analyze the smartphone usage patterns of visually impaired individuals.
Therefore, I would like to log the VoiceOver usage records of visually impaired iPhone users.
Is there a way to output VoiceOver logs, similar to the AccessibilityService API on Android?
Thank you in advance for your responses.