Accessibility
Make your apps function for a broad range of users using Accessibility APIs across all Apple platforms.
Posts under Accessibility tag

In our application we are using UIAlertController. When Accessibility Full Keyboard Access is enabled and we try to dismiss the alert with the Esc key on an external keyboard, nothing happens. We are presenting the UIAlertController as a popover. We need to dismiss it with an Esc key press from the external keyboard.
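One direction worth trying, as a minimal sketch rather than a confirmed fix: attach a UIKeyCommand for the Escape key so a hardware-keyboard Esc press dismisses the presented alert. The class and selector names below are illustrative, and depending on the responder chain the command may need to live on the alert controller itself rather than the presenter.

import UIKit

class HostViewController: UIViewController {
    // Expose an Escape key command while this controller is in the
    // responder chain.
    override var keyCommands: [UIKeyCommand]? {
        [UIKeyCommand(input: UIKeyCommand.inputEscape,
                      modifierFlags: [],
                      action: #selector(dismissPresentedAlert))]
    }

    @objc private func dismissPresentedAlert() {
        presentedViewController?.dismiss(animated: true)
    }
}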
In our application we use OTP login. When Accessibility Full Keyboard Access is enabled and we enter a digit in the OTP field, on iOS 17 focus moves to the next text field as expected, but on iOS 18 focus stays on the first OTP field and never moves to the next text field.
In our application we use a search bar in a popover view, with Accessibility Full Keyboard Access enabled and an external keyboard. When focus is on the search bar, the next Tab key press dismisses the search bar, and focus then needs to shift to the next UI element.
In our application we use a UITableView for data population, and each table view cell contains a button. When Full Keyboard Access is enabled, only the table view cell receives focus, not the button. We need the cell and the button to be focusable separately.
In some places of our app we make use of NSAccessibilityElement subclasses to vend some extra items to accessibility clients.
We need to know which item has the VoiceOver focus so we can keep track of it.
setAccessibilityFocused: does not get called when accessibility clients focus NSAccessibilityElements. This method is only called when accessibility clients focus view-based accessibility elements (i.e. when a NSView subclass gets focused).
At the same time we need to programmatically move VoiceOver focus to those items when something happens. Those accessibility elements inherit from NSObject so we can't make them first responder.
Is this the expected behavior? What are our options in terms of reacting to VoiceOver cursor moving around? What are our options in terms of programmatically moving the VoiceOver cursor to a different element?
Here's a sample project that demonstrates the first part of the issue: https://github.com/vendruscolo/apple-rdars/tree/master/DTS12368714%20-%20NSAccessibilityElement%20focus%20tracking
If you run the app, a window will show up. It contains a button and a red square. If you enable VoiceOver, you'll be able to move the cursor over the red square, and a message will be logged. You'll also notice there's an extra element after the red square. That element is available to VoiceOver; however, when it gets focused, no message gets logged.
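For the second part, moving the VoiceOver cursor programmatically, one option is to post a focus-changed notification for the element. A minimal sketch, assuming the element is already vended through the parent's accessibilityChildren:

import AppKit

// Ask VoiceOver to move its cursor to a non-view accessibility element.
func moveVoiceOverCursor(to element: NSAccessibilityElement) {
    NSAccessibility.post(element: element,
                         notification: .focusedUIElementChanged)
}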
In SwiftUI, the date picker component fails colour-contrast accessibility checks. The code below was used to create the date picker:
struct ContentView: View {
    @State private var date = Date()
    @State private var selectedDate: Date = .init()

    var body: some View {
        let min = Calendar.current.date(byAdding: .day, value: 14, to: Date()) ?? Date()
        let max = Calendar.current.date(byAdding: .year, value: 4, to: Date()) ?? Date()
        DatePicker(
            "Start Date",
            selection: $date,
            in: min ... max,
            displayedComponents: [.date]
        )
        .datePickerStyle(.graphical)
        .frame(alignment: .topLeading)
        .onAppear {
            selectedDate = Calendar.current.date(byAdding: .day, value: 14, to: Date()) ?? Date()
        }
    }
}

#Preview {
    ContentView()
}
A screenshot of the failing accessibility check is attached.
I’m trying to add the .header accessibility trait to a UISegmentedControl so that VoiceOver recognizes it accordingly. However, setting the trait using the following code doesn’t seem to have any effect:
segmentControl.accessibilityTraits = segmentControl.accessibilityTraits.union(.header)
Even after applying this, VoiceOver doesn’t announce it as a header. Is there any workaround or recommended approach to achieve this?
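One workaround that may help, sketched here rather than confirmed: VoiceOver exposes the individual segments rather than the control itself, which would explain why a trait set on the control is never announced. A separate heading label placed above the control does carry the .header trait reliably. The title strings below are illustrative.

import UIKit

// A dedicated heading label that VoiceOver announces as a header.
let heading = UILabel()
heading.text = "View Mode" // hypothetical title
heading.accessibilityTraits = .header

let segmentControl = UISegmentedControl(items: ["List", "Grid"])
// Lay out `heading` directly above `segmentControl`.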
I have implemented a SwiftUI view containing a grid of TextField elements, where focus moves automatically to the next field upon input. This behavior works well on iOS 16 and 17, maintaining proper focus highlighting when Full Keyboard Access is enabled.
However, on iOS 18 and above, the Full Keyboard Access focus behaves differently: it always lags behind the actual focus state, causing a mismatch between the visually highlighted field and the active text input. This leads to usability issues, especially for users navigating with an external keyboard.
Below is the SwiftUI code for reference:
struct AutoFocusGridTextFieldsView: View {
    private let fieldCount: Int
    private let columns: Int

    @State private var textFields: [String]
    @FocusState private var focusedField: Int?

    init(fieldCount: Int = 17, columns: Int = 5) {
        self.fieldCount = fieldCount
        self.columns = columns
        _textFields = State(initialValue: Array(repeating: "", count: fieldCount))
    }

    var body: some View {
        let rows = (fieldCount / columns) + (fieldCount % columns == 0 ? 0 : 1)
        VStack(spacing: 10) {
            ForEach(0..<rows, id: \.self) { row in
                HStack(spacing: 10) {
                    ForEach(0..<columns, id: \.self) { col in
                        let index = row * columns + col
                        if index < fieldCount {
                            TextField("", text: $textFields[index])
                                .frame(width: 40, height: 40)
                                .multilineTextAlignment(.center)
                                .textFieldStyle(RoundedBorderTextFieldStyle())
                                .focused($focusedField, equals: index)
                                .onChange(of: textFields[index]) { newValue in
                                    // Limit each field to a single character.
                                    if newValue.count > 1 {
                                        textFields[index] = String(newValue.prefix(1))
                                    }
                                    // Advance focus once a character is entered.
                                    if !textFields[index].isEmpty {
                                        moveToNextField(from: index)
                                    }
                                }
                        }
                    }
                }
            }
        }
        .padding()
        .onAppear {
            focusedField = 0
        }
    }

    private func moveToNextField(from index: Int) {
        if index + 1 < fieldCount {
            focusedField = index + 1
        }
    }
}

struct AutoFocusGridTextFieldsView_Previews: PreviewProvider {
    static var previews: some View {
        AutoFocusGridTextFieldsView(fieldCount: 10, columns: 5)
    }
}
Has anyone else encountered this issue with FocusState in iOS 18?
I really do believe this is a bug strictly connected to keyboard navigation, since I experienced a similar problem in the UIKit equivalent of the view.
Any insights or suggestions would be greatly appreciated!
VoiceOver reads out all visible content on the screen, which is essential for visually challenged users. However, this raises a privacy concern: what if a user accidentally focuses on sensitive information, like a bank account password, and it gets read aloud?
How can developers prevent VoiceOver from exposing confidential data while still maintaining accessibility? Are there best practices or recommended approaches to handle such scenarios effectively?
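Two standard techniques, sketched assuming UIKit (the strings below are illustrative): secure text fields are never spoken character by character, and other sensitive values can be given a redacted accessibilityLabel.

import UIKit

// Passwords: VoiceOver announces a secure field as a secure text field
// and does not read its contents aloud.
let passwordField = UITextField()
passwordField.isSecureTextEntry = true

// Other sensitive values: override what VoiceOver speaks so the raw
// data is never read out.
let accountLabel = UILabel()
accountLabel.text = "1234 5678 9012"           // visible on screen
accountLabel.accessibilityLabel = "Account number hidden"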
In VoiceOver, when using Group Navigation style, the cursor first focuses on the semantic group. To navigate inside the group, a two-finger swipe (left or right) can be used. This behavior works for default containers like the Navigation Bar, Tab Bar, and Tool Bar.
How can I achieve the same behavior for a custom view?
I tried setting accessibilityContainerType = .semanticGroup, but it only works for Mac Catalyst. Is there an equivalent approach for iOS?
I’m currently focused on an element at the bottom of the screen. What is the proper way to quickly navigate to the top element? By default, there’s a four-finger single tap to move to the first element, but should I use the Rotor action instead to focus on the element I need?
For example, in the Contacts app while adding a new contact, if I enter a value in a field at the bottom, there’s no quick way to directly save the contact. I have to manually navigate all the way to the top to tap the Done button, which feels a bit inconvenient.
Is there a better way to handle this using VoiceOver?
I’ve tried implementing the accessibilityPerformMagicTap() method in a specific UIViewController, its view, and even in AppDelegate, but I am not receiving any callbacks.
I directly overrode this method in the mentioned areas, but it never gets triggered when performing a magic tap.
How can I properly observe and handle the accessibilityPerformMagicTap() action?
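For reference, a minimal sketch of the usual pattern, assuming a playback-style action; togglePlayback() is a hypothetical app-specific method. The Magic Tap is delivered to the element with VoiceOver focus first and then escalated toward the app delegate, so the override must sit somewhere along that path and return true once handled.

import UIKit

class PlayerViewController: UIViewController {
    override func accessibilityPerformMagicTap() -> Bool {
        togglePlayback()   // hypothetical app-specific action
        return true        // true = handled, stop escalating
    }

    private func togglePlayback() { /* ... */ }
}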
I have a UITextField in my application for entering a state. If I tap on it, a UIPickerView pops up and lets the user select a state (but they can still type too).
The issue relates to Full Keyboard Access. If we select the UITextField using an external keyboard, the UIPickerView appears, but to reach it the user has to tab through the whole view controller to the UIPickerView at the end.
What would be nice is to a) move focus directly to the UIPickerView (have it highlighted in blue and scrollable right away with keyboard) or b) make the UIPickerView the next view that's accessible when tabbing over or using the arrow keys.
I've tried using:
UIAccessibility notifications (both .screenChanged and .layoutChanged, with and without a delay). This ended up only announcing the view, but didn't help with full keyboard access.
Making the UIPickerView a first responder when it appears.
Attempting to change the accessibilityElements order (but with so many views and views within views, this isn't really a viable option either).
Pressing tab + -> (tab and right arrow button) will quickly take the user to the end of the chain of accessibility elements, in other words, to the UIPickerView. But there has to be a cleaner way of just automatically setting the focus to the UIPickerView or making it the next element by pressing the arrow key.
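One more avenue that may be worth testing, as a sketch under the assumption that the keyboard focus system (iOS 15+) drives Full Keyboard Access: explicitly request a focus update to the picker when it appears. statePicker and pickerDidAppear() are illustrative names.

import UIKit

class StateFieldViewController: UIViewController {
    let statePicker = UIPickerView()

    func pickerDidAppear() {
        // Ask the focus system that owns the picker to move focus to it.
        UIFocusSystem.focusSystem(for: statePicker)?
            .requestFocusUpdate(to: statePicker)
    }
}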
Hello,
I'm struggling with UI testing against accessibility identifiers when I wrap a SwiftUI view in UIHostingController (our app is mainly coded in UIKit).
Considering this SwiftUI view (simplified for this post):
HStack {
    Text(self.title.uppercased())
        .albusTheme(.header)
        .lineLimit(self.isMultiline ? nil : 1)
        .multilineTextAlignment(.leading)
        .accessibilityAddTraits(.isStaticText)
        .accessibilityIdentifier("section_title")
}
This view and its controller are embedded as a UITableViewHeaderFooterView in a UITableView.
This is an extract of recursiveDescription output:
| | | | | | <_UITableViewHeaderFooterContentView: 0x1076ad720; frame = (0 0; 393 40); layer = <CALayer: 0x6000006b1720>>
| | | | | | | <_TtGC13ListComponent19SwiftUIFieldContentV20ListComponentLibrary17FormSectionHeader_: 0x1076ab980; baseClass = UIControl; frame = (0 0; 393 40); layer = <CALayer: 0x6000006b1da0>>
| | | | | | | | <_TtGC7SwiftUI14_UIHostingViewV20ListComponentLibrary17FormSectionHeader_: 0x1078f9600; frame = (0 0; 393 40); gestureRecognizers = <NSArray: 0x600000e25d70>; backgroundColor = UIExtendedSRGBColorSpace 0.0666667 0.133333 0.227451 1; layer = <SwiftUI.UIHostingViewDebugLayer: 0x6000006b19a0>>
| | | | | | | | | <_TtCOCV7SwiftUI11DisplayList11ViewUpdater8Platform13CGDrawingView: 0x106985550; frame = (16 12.6667; 147.667 14.6667); anchorPoint = (0, 0); opaque = NO; autoresizesSubviews = NO; layer = <_TtCOCV7SwiftUI11DisplayList11ViewUpdater8PlatformP33_65A81BD07F0108B0485D2E15DE104A7514CGDrawingLayer: 0x6000026b8240>>
CGDrawingView seems to hide the underlying view hierarchy. Is there a way to reach these accessibility identifiers when SwiftUI is integrated into UIKit this way?
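For what it's worth, a hedged test-side sketch: when a SwiftUI accessibilityIdentifier doesn't surface as a staticText through UIHostingController, it sometimes lands on a different element type, so querying across all element types can still locate it. The test name is illustrative.

import XCTest

final class SectionHeaderTests: XCTestCase {
    func testSectionHeaderExists() {
        let app = XCUIApplication()
        app.launch()

        // Match the identifier regardless of which element type it lands on.
        let header = app.descendants(matching: .any)["section_title"]
        XCTAssertTrue(header.waitForExistence(timeout: 2))
    }
}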
iOS 18.3.1, iPhone 16 Pro.
I pick photos from the user's photo library using a connected physical keyboard, via:
.photosPicker(isPresented: $viewModel.isImagePickerPresented, selection: $viewModel.selectedImageItem, matching: .images)
When the picker appears, accessibility focus moves to the Dynamic Island instead of the Cancel button. There is no way to navigate the photos picker by keyboard without tapping the view to move focus into it manually. I noticed the same behavior in the Notes app.
I’m currently exploring VoiceOver accessibility in iOS and looking for the best way to reduce the number of swipes required to navigate a UITableView. I’ve come across a couple of potential solutions but am unsure which is preferred.
Solution 1: Grouping Subviews in Each Cell
Combine all subviews inside a UITableViewCell into a single accessibility element.
Provide a concise and meaningful accessibilityLabel.
Use custom actions (UIAccessibilityCustomAction) or accessibilityActivationPoint to handle interactions on specific elements within the cell (see the sketch at the end of this post).
Solution 2: Using UIAccessibilityContainerDataTableCell & UIAccessibilityContainerDataTable
Implement UIAccessibilityContainerDataTable for structured table navigation.
Make each cell conform to UIAccessibilityContainerDataTableCell, defining its row and column positions.
However, I’m finding this approach a bit complex, and I need guidance on properly implementing these protocols.
Additionally, in my case, VoiceOver is not navigating to Section 2, and I'm not sure why.
Questions:
Which of these approaches is generally preferred for better VoiceOver navigation?
How do I properly implement UIAccessibilityContainerDataTable so that all sections and rows are navigable?
Any best practices or alternative recommendations?
Would really appreciate any insights or guidance!
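A minimal sketch of Solution 1, with illustrative names (ContactCell, callButton): the cell becomes one VoiceOver element with a single label, and the embedded button is reached through a custom action instead of a separate swipe stop.

import UIKit

final class ContactCell: UITableViewCell {
    let callButton = UIButton(type: .system)

    func configure(name: String, detail: String) {
        // Collapse the cell's subviews into one element with one label.
        isAccessibilityElement = true
        accessibilityLabel = "\(name), \(detail)"

        // Expose the button through the custom-actions rotor instead.
        accessibilityCustomActions = [
            UIAccessibilityCustomAction(name: "Call") { [weak self] _ in
                self?.callButton.sendActions(for: .touchUpInside)
                return true
            }
        ]
    }
}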
Using the new floating tab bar in iOS 18 on iPad with an RTL language: when the tab bar is correctly flipped and the right-side buttons become left-side buttons, the buttons are pushed off the screen on initial load, but they do come back after some navigating/reloading. I have recreated this in a fresh project.
As of the iOS 18.1 release, our users are experiencing problems with our app, which relies on strobing the device torch.
We have narrowed this down to being caused on devices with adaptive true-tone flash and have submitted a radar: FB15787160.
The issue seems to be caused by ambient light levels. If run in a dark room, the torch strobes exactly as effectively as in previous iOS versions. If run in a bright room, outdoors, or near a window, the strobe will run for ~1s, then the torch gets stuck on for half a second or so (less frequently it gets stuck off), then it strobes again for ~1s, and this behaviour repeats indefinitely.
If we move to a darker environment and background and then foreground the app (this is required), the issue is resolved until we move to an area with higher ambient light levels again. We have done a lot of debugging, and also discovered that turning off "Auto-Brightness" in Settings -> Accessibility -> Display & Text Size resolves the issue.
We have also viewed Console.app logs from when the issue occurs, and the ambient light level readings are quite sporadic at that moment: they transition from ~100 Lux to ~8000 Lux at the point the issue starts (seemingly caused by the rear sensor being affected by the torch). With "Auto-Brightness" turned off, these readings stay at lower levels.
This is rendering the primary use case of our app essentially useless, so it would be great to get to the bottom of it! We can't even really detect the condition in-app, as I believe SensorKit is restricted to research applications and requires a review process with Apple before access is granted.
Edit: It's worth noting this is also affecting other apps with strobe functionality in the exact same way
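For context, a minimal sketch of the kind of timer-driven strobe loop affected; the class and property names are illustrative, not our actual implementation.

import AVFoundation

final class TorchStrobe {
    private let device = AVCaptureDevice.default(for: .video)
    private var timer: Timer?
    private var torchIsOn = false

    func start(interval: TimeInterval = 0.1) {
        timer = Timer.scheduledTimer(withTimeInterval: interval, repeats: true) { [weak self] _ in
            self?.toggleTorch()
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }

    private func toggleTorch() {
        guard let device, device.hasTorch else { return }
        do {
            try device.lockForConfiguration()
            if torchIsOn {
                device.torchMode = .off
            } else {
                try device.setTorchModeOn(level: AVCaptureDevice.maxAvailableTorchLevel)
            }
            torchIsOn.toggle()
            device.unlockForConfiguration()
        } catch {
            // Torch configuration failed; leave the state unchanged.
        }
    }
}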
I'm looking to integrate call / text / FaceTime history into my app while maintaining the necessary security for the end user. I only need the date stamp and a contact link or the name of the person communicated with; no access to message content, etc.
How would this be accomplished?
Topic: App & System Services
SubTopic: General
Tags: Developer Tools, Security, Accessibility, Contacts
Hi,
I am setting the accessibilityLabel and accessibilityHint properties of a UIAlertAction. However, VoiceOver is only reading the label out. Usually, the label is read out, followed by a short pause and then the hint. Is this a known issue where hints do not work for this element? I can append the hint to the label, but I'm interested to know if there's something I'm doing wrong.
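In case it helps, a sketch of the append workaround mentioned above; the strings are illustrative. The comma produces a short VoiceOver pause, roughly approximating the label-then-hint cadence.

import UIKit

let action = UIAlertAction(title: "Delete", style: .destructive, handler: nil)
action.accessibilityLabel = "Delete, removes the item permanently"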
Regards.