I’m currently exploring VoiceOver accessibility in iOS and looking for the best way to reduce the number of swipes required to navigate a UITableView. I’ve come across a couple of potential solutions but am unsure which is preferred.
Solution 1: Grouping Subviews in Each Cell
Combine all subviews inside a UITableViewCell into a single accessibility element.
Provide a concise and meaningful accessibilityLabel.
Use custom actions (UIAccessibilityCustomAction) or accessibilityActivationPoint to handle interactions on specific elements within the cell.
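Roughly what I have in mind for this (a minimal sketch with placeholder subviews and a placeholder "Delete" action, not our real cell):

import UIKit

final class ContactCell: UITableViewCell {
    // Hypothetical subviews standing in for the real cell content.
    let nameLabel = UILabel()
    let detailLabel = UILabel()
    let deleteButton = UIButton(type: .system)

    override func layoutSubviews() {
        super.layoutSubviews()

        // Collapse the whole cell into a single VoiceOver stop.
        isAccessibilityElement = true
        accessibilityLabel = [nameLabel.text, detailLabel.text]
            .compactMap { $0 }
            .joined(separator: ", ")

        // Expose the button as a custom action instead of a separate swipe stop.
        accessibilityCustomActions = [
            UIAccessibilityCustomAction(name: "Delete") { [weak self] _ in
                self?.deleteButton.sendActions(for: .touchUpInside)
                return true
            }
        ]
    }
}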
Solution 2: Using UIAccessibilityContainerDataTableCell & UIAccessibilityContainerDataTable
Implement UIAccessibilityContainerDataTable for structured table navigation.
Make each cell conform to UIAccessibilityContainerDataTableCell, defining its row and column positions.
However, I’m finding this approach a bit complex, and I need guidance on properly implementing these protocols.
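Roughly where I've got to with this so far (a sketch only; it assumes a single-column, single-section table, which may itself be part of my problem):

import UIKit

// Each cell reports which row/column it occupies.
final class DataCell: UITableViewCell, UIAccessibilityContainerDataTableCell {
    var rowIndex = 0   // set by the data source in cellForRowAt

    func accessibilityRowRange() -> NSRange {
        NSRange(location: rowIndex, length: 1)
    }

    func accessibilityColumnRange() -> NSRange {
        NSRange(location: 0, length: 1)   // single column
    }
}

// The table view itself acts as the data-table container.
// (I also set accessibilityContainerType = .dataTable when configuring it.)
final class DataTableView: UITableView, UIAccessibilityContainerDataTable {
    func accessibilityDataTableCellElement(forRow row: UInt, column: UInt) -> UIAccessibilityContainerDataTableCell? {
        cellForRow(at: IndexPath(row: Int(row), section: 0)) as? DataCell
    }

    func accessibilityRowCount() -> UInt {
        UInt(numberOfRows(inSection: 0))
    }

    func accessibilityColumnCount() -> UInt {
        1
    }
}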
Additionally, in my case, VoiceOver is not navigating to Section 2—I’m not sure why.
Questions:
Which of these approaches is generally preferred for better VoiceOver navigation?
How do I properly implement UIAccessibilityContainerDataTable so that all sections and rows are navigable?
Any best practices or alternative recommendations?
Would really appreciate any insights or guidance!
It is outrageous that Apple continue to fail to implement the Fullscreen API web standard for web apps on iPhone only, which is so important to accessibility and web app functionality.
The only possible reason for this block is commercial: to promote iOS apps instead of browser based web apps.
To quote a client from a major agency just now - a typical enquiry:
We value accessibility greatly, and we noticed that the embedded player is missing a full screen button on iPhone.
Everything else works perfectly fine, including a full screen button that appears on the mobile webpage on android devices.
Is there any way we can include a button to enable full screen view for our viewers in your player that are going to watch it on iOS devices?
To which, as usual, we have to reply:
Apple unfortunately block fullscreen mode from being used with all web applications on iPhone.
Apple will allow this to be displayed fullscreen on MacBooks and iPads, but currently not on iPhone - so we have to hide the fullscreen button there.
So fullscreen works on all devices and browsers apart from on iPhone.
As you've seen with Android, all other devices and browsers follow the universal 'Fullscreen API' web standard to allow full screen.
You're probably familiar with seeing the fullscreen button on normal linear videos on iPhone.
These use Apple's native video player, which doesn't let buttons and scripts be used on top of it - just a single video, not an interactive web application.
Our player looks like a video player but it is actually a web app combining multiple different video clips connected together by code and styling.
They block it on iPhones for reasons known only to them, but the assumption is that it is to incentivise people to make iOS apps instead of web apps.
The web development community is hopeful that Apple will change this unfortunate restriction soon, but we have been waiting a long time in vain.
We have to send this to a lot of people. It's a very bad look for Apple.
In less than a month it will be 2025. We have been waiting years for this.
The web standard documentation showing universal support on other devices and browsers is here:
https://developer.mozilla.org/en-US/docs/Web/API/Fullscreen_API
This is not acceptable. It is time for Apple to stop blocking this important accessibility web standard for commercial reasons - only on iPhone. To whoever is in charge of these decisions in the Safari/Webkit team: Please just enable Fullscreen API for web apps on iPhone as soon as possible.
Hello!
I'm trying to improve the accessibility of a UIKit login form in our iOS app. If an error occurs, an error message is shown in a label that is hidden by default. For our VoiceOver users, I want to move the focus to the error message label so that VoiceOver reads out the error message.
I'm trying to achieve this using UIAccessibility.post, but try as I might, it does not work. To better understand the problem, I created a very simple App which shows a button and a label (always visible), and on pressing the button, I post an accessibility notification:
UIAccessibility.post(notification: .layoutChanged, argument: label)
What I expect to happen is for the focus to move from the button to the label. What happens instead is the focus stays with the button and VoiceOver reads out the button's label again. So it seems to process the notification, but ignore the argument.
Am I misunderstanding how accessibility notifications work, or is this simply broken at the moment? I am testing this on my iPhone running the current iOS version, 18.2.1.
By the way, using the more modern variant leads to the same result:
AccessibilityNotification.LayoutChanged(label).post()
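For completeness, the whole test is essentially this (a stripped-down sketch of my sample app, not the real login form):

import UIKit

final class TestViewController: UIViewController {
    private let button = UIButton(type: .system)
    private let label = UILabel()

    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .systemBackground

        button.setTitle("Show error", for: .normal)
        button.addTarget(self, action: #selector(buttonTapped), for: .touchUpInside)
        label.text = "Something went wrong"

        let stack = UIStackView(arrangedSubviews: [button, label])
        stack.axis = .vertical
        stack.spacing = 20
        stack.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(stack)
        NSLayoutConstraint.activate([
            stack.centerXAnchor.constraint(equalTo: view.centerXAnchor),
            stack.centerYAnchor.constraint(equalTo: view.centerYAnchor)
        ])
    }

    @objc private func buttonTapped() {
        // Expectation: VoiceOver focus jumps to the label and reads it out.
        UIAccessibility.post(notification: .layoutChanged, argument: label)
    }
}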
When I am doing a file search, in TextEdit, and on certain websites, the space bar will quit functioning as soon as I start typing. If I hold down the Option key, the space bar works as normal. I have checked every setting I can think of and nothing has helped.
My game app is text-based interactive fiction, containing no audio/video content, making captions unnecessary. Our game is completely accessible to deaf users.
Despite this, in the Accessibility Nutrition Label, I'm only able to leave the "Captions" box checked or unchecked. Leaving it unchecked would leave deaf players with the wrong impression that they can't enjoy our game. Leaving it checked would imply that we do have A/V content with captions included.
In the WWDC video on this, https://developer.apple.com/videos/play/wwdc2025/224/ the video says:
After we completed common tasks, we realized our app doesn’t have any video or audio only content. In this case, we aren’t going to indicate that Landmarks supports Captions. That's okay. This accurately describes the features that people will expect to be available while using the app.
Maybe that's "OK," but I wish the form allowed me to say "This app doesn't contain audio/video content."
When my macOS Cocoa app displays a modal alert with beginSheetModal(for:completionHandler:), VoiceOver sometimes seems to focus on an "illegal" upper level, where any attempts at navigation will give the unhelpful response "Alert, dialog", until you "drill down" with VO + shift + down or switch apps. After that, things will work as expected.
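The presentation itself is nothing unusual - roughly this (a simplified sketch, not the exact production code):

import AppKit

func presentErrorSheet(on window: NSWindow) {
    let alert = NSAlert()
    alert.messageText = "Something went wrong"
    alert.informativeText = "Please try again."
    alert.addButton(withTitle: "OK")
    alert.beginSheetModal(for: window) { response in
        // handle the response
    }
}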
Is this a known bug? Does it happen to anybody else, or am I doing something wrong?
Hi everyone,
After installing the macOS beta (Tahoe 26.0) on my MacBook Pro M3, I’ve noticed two issues:
Significant increase in system temperature
The laptop feels hot even with light usage like Safari and Figma
Rapid battery drain
Battery is dropping unusually fast compared to macOS Sonoma.
I've tried restarting the device.
I'm aware this is a beta, but I'm just wondering:
Is anyone else experiencing this?
Is this a known issue?
Would love to hear if others are facing similar problems or if it might be something specific to my setup.
Thanks in advance!
Feedback number: FB20451665
When building with Xcode 26, VoiceOver is reporting an extra tab when swiping through tabs. Please see the sample project below:
/*
This is a Sample project to show that I believe there is a Voice Over bug in iOS 26.
When swiping through tabs with Voice Over active, there always appears to be an extra tab.
Here I have 5 tabs, when on tab one VO reads out tab 1 of 6, then tab 2 of 6, all the way to the last tab, when voice over reads out tab 5 of 6. Never tab 6 of 6.
Is there a possibility that voice over is picking up the underlying `more` tab and reading that out?
This has also been reportedly found in the Files app here:
https://www.applevis.com/comment/195441#comment-195441
*/
import SwiftUI

struct ContentView: View {
var body: some View {
TabView {
/// Activating this has Voice over telling us there are 6 Tabs.
Tab(RootTab.home.title, systemImage: "circle.fill") {
Text("This is the \(RootTab.home.title.capitalized) screen")
}
.accessibilityLabel("\(RootTab.home.title.capitalized) tab")
.accessibilityHint("Double tap to open the \(RootTab.home.title.capitalized) tab")
Tab(RootTab.diary.title, systemImage: "circle.fill") {
Text("This is the \(RootTab.diary.title.capitalized) screen")
}
.accessibilityLabel("\(RootTab.diary.title.capitalized) tab")
.accessibilityHint("Double tap to open the \(RootTab.diary.title.capitalized) tab")
Tab(RootTab.meals.title, systemImage: "circle.fill") {
Text("This is the \(RootTab.meals.title.capitalized) screen")
}
.accessibilityLabel("\(RootTab.meals.title.capitalized) tab")
.accessibilityHint("Double tap to open the \(RootTab.meals.title.capitalized) tab")
Tab(RootTab.knowledge.title, systemImage: "circle.fill") {
Text("This is the \(RootTab.knowledge.title.capitalized) screen")
}
.accessibilityLabel("\(RootTab.knowledge.title.capitalized) tab")
.accessibilityHint("Double tap to open the \(RootTab.knowledge.title.capitalized) tab")
Tab(RootTab.profile.title, systemImage: "circle.fill") {
Text("This is the \(RootTab.profile.title.capitalized) screen")
}
.accessibilityLabel("\(RootTab.profile.title.capitalized) tab")
.accessibilityHint("Double tap to open the \(RootTab.profile.title.capitalized) tab")
/// Activating this also has Voice over telling us there are 6 Tabs.
// ForEach(RootTab.allCases, id: \.self) { tab in
//
// Text("This is the \(tab.title.capitalized) screen")
// .tabItem {
// Label(tab.title.capitalized, systemImage: "circle.fill")
// }
// .accessibilityLabel("\(tab.title.capitalized) tab")
// .accessibilityHint("Double tap to open the \(tab.title.capitalized) tab")
// }
}
}
enum RootTab: CaseIterable {
case home
case diary
case meals
case knowledge
case profile
var title: String {
switch self {
case .home:
"home"
case .diary:
"diary"
case .meals:
"meals"
case .knowledge:
"knowledge"
case .profile:
"profile"
}
}
}
}
I'm curious if anyone else can see this issue, or if anyone knows of a workaround for it.
Following the official documentation, I'm trying to create a set of three localised Help Books.
The Help Books should be available in Spanish, English and Polish. Presently, I'm trying to complete the English version.
App Structure
This is a plugin-based application consisting of the main app and the plugin. The main app structure looks as follows:
Files
. <XcodeProject Top>
├── Localizable.xcstrings
├── MyAppExtension
│ ├── MyAppExtension.swift
│ └── <other swift files>.swift
├── MyApp
│ ├── Info.plist
│ ├── +Array.swift
│ ├── +ButtonStyle.swift
│ ├── <other app swift files>.swift
├── Resources
└── MyApp.help
└── MyApp.help
└── Contents
├── Info.plist
└── Resources
├── English.lproj
│ ├── ExactMatch.plist
│ ├── InfoPlist.strings
│ ├── MyApp.helpindex
│ ├── MyApp.html
│ └── pgs
└── shrd
MyApp / MyApp.help / Info.plist file
It contains the following values:
Bundle name: MyApp
HPDBookAccessPath: MyApp.html
HPDBookTitle: My App Help
Default localization: en_gb
MyApp / Info.plist file
Contains the following entries:
Help Book directory name: MyApp.help
Help Book Identifier: MyApp Help
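For reference, I believe these two Xcode display names map to the raw keys CFBundleHelpBookFolder and CFBundleHelpBookName, i.e. roughly:

<key>CFBundleHelpBookFolder</key>
<string>MyApp.help</string>
<key>CFBundleHelpBookName</key>
<string>MyApp Help</string>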
Build phase
The Copy Bundle Resources phase copies the MyApp.help bundle from MyApp/Resources.
Questions
Is the provided folder structure valid for creating localised Help Books?
Is anything missing from the Info.plist files, or placed in the wrong location?
Why does MyApp -> Help open the main Help menu rather than the app's Help Book?
There is an issue with Help Books that started with the release of macOS 14.4. The issue is that when an app attempts to go directly to a Help Book page, the help viewer opens to the Help Book's main index page, rather than the specific page requested. As I investigated the issue I found that the requested page was actually part of help viewer's navigation history, and all I had to do was to click the Back navigation arrow and the requested page would be displayed. So it seems like the requested page is momentarily visited but is then (for whatever reason) quickly replaced by the main index page.
Our app uses the AHGotoPage() API for directly accessing our Help Book's pages. This is the same mechanism/code that our app has used for more than a decade and has never caused us any issues. Everything works fine on macOS 14.3.0 and earlier. I've scoured the documentation and can't find any newer APIs for accessing Help pages. I've also tried various other things (e.g. reworking the code, creating new indexes for the app's Help, etc.), but none of it seems to make a difference. As far as I can tell, the issue seems to stem from some change made to the OS.
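For reference, the call in question is essentially the classic Apple Help API, along these lines (a simplified sketch with a placeholder page name, not our actual code):

import Carbon   // AHGotoPage comes from the Apple Help (Carbon) API

func showHelpPage(_ page: String) {
    // The book name must match CFBundleHelpBookName in the app's Info.plist.
    let status = AHGotoPage("MyApp Help" as CFString, page as CFString, nil)
    if status != noErr {
        print("AHGotoPage failed with status \(status)")
    }
}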
So my questions are:
Is this a known bug? And if so, is there any ETA on a fix?
Is there something different we should be doing for newer versions of the OS (create indexes differently, use a different API, etc.)?
I wrote this in the regular forums and they deleted it and told me to write it here because it was dealing with unreleased software. I read that Launchpad is disappearing in Tahoe and I have real concerns about that. For me, that is an accessibility issue. I have both memory problems and scanning problems. So having my apps organized into categories is extremely important to me. Just today I needed to find an app that I didn't remember the name of and I rarely use, but when I need it, it is important to me. Just to see if I could find it without launchpad, I scanned my applications folder and I couldn't find it. I went to launchpad and to the category I knew it was in and it was right there, easy for me to find. Please don't take away our organization options.
I want to understand which component types are intended to have hint text, haptic feedback, or an earcon associated with them for VoiceOver screen reader users. Is there a list somewhere, or a HIG guideline, for which transition types should have a sound?
Some transitions in Apple apps generally include different beep sounds, such as
opening a new screen
screen dimming
when a VoiceOver user swipes from the header / navbar to the body
a scraping sound when swiping up or down a page.
the beginning or end of the body section
in Calculator when swiping from one row to the next.
opening a pop up menu
I would also appreciate any direction on which APIs or identifiers these sounds are tied to, and how custom components can adopt these sounds, haptics, or hints where they are expected. On the other hand, I don't want to take that information and then dictate that every component needs a specific beep type, since these sounds appear to be reserved for specific purposes.
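For what it's worth, the closest hooks I've found so far are the UIAccessibility notification posts; posting these from a custom component produces the corresponding system feedback while VoiceOver is running (a sketch; the view and message are placeholders):

import UIKit

func announceNewScreen(focusing view: UIView) {
    // Plays the screen-change earcon and moves VoiceOver focus to the given element.
    UIAccessibility.post(notification: .screenChanged, argument: view)
}

func announce(_ message: String) {
    // Speaks the string without moving focus; no screen-change sound.
    UIAccessibility.post(notification: .announcement, argument: message)
}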
Hi,
I've wrapped AVRoutePickerView in SwiftUI using pretty much the code given here, with a few changes:
func makeUIView(context: Context) -> UIView {
let routePickerView = AVRoutePickerView()
// Configure the button's color.
//routePickerView.delegate = context.coordinator
//routePickerView.backgroundColor = .secondarySystemBackground
routePickerView.tintColor = .accent
routePickerView.activeTintColor = .accent
// Indicate whether your app prefers video content.
routePickerView.prioritizesVideoDevices = false
return routePickerView
}
I commented out routePickerView.delegate = context.coordinator because it doesn't compile; context.coordinator is of type Void and I'm not sure how to fix that. I'm not sure if that has anything to do with the issue.
Anyway, this works fine without VoiceOver; if I tap the button, I get the AirPlay popover. But in VoiceOver, if I select the button and double-tap, nothing happens… it just reads the button's accessibilityLabel again. How can I get the AirPlay popover to show in VoiceOver?
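In case it's relevant, I suspect the delegate line only compiles once the representable defines its own Coordinator, something like this (a sketch, not what's currently in my code):

import SwiftUI
import AVKit

struct RoutePickerView: UIViewRepresentable {
    func makeCoordinator() -> Coordinator {
        Coordinator()
    }

    func makeUIView(context: Context) -> AVRoutePickerView {
        let routePickerView = AVRoutePickerView()
        routePickerView.delegate = context.coordinator   // now compiles
        routePickerView.prioritizesVideoDevices = false
        return routePickerView
    }

    func updateUIView(_ uiView: AVRoutePickerView, context: Context) {}

    final class Coordinator: NSObject, AVRoutePickerViewDelegate {
        func routePickerViewWillBeginPresentingRoutes(_ routePickerView: AVRoutePickerView) {
            // Called when the route picker popover is about to appear.
        }
    }
}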
I have a parent view containing 10 subviews. To control the VoiceOver navigation order, I set only a few elements in accessibilityElements. However, the remaining elements are not being focused or are completely inaccessible.
Is this the expected behavior? If I only specify a subset of elements in accessibilityElements, does it exclude the rest? What’s the best way to ensure all elements remain accessible while customising the order?
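My working assumption is that once accessibilityElements is set, only the listed views are exposed, so every element that should stay reachable has to appear in the array in the desired order - roughly (a sketch with placeholder view names):

import UIKit

// The view and subview names are placeholders for the real hierarchy.
func configureAccessibilityOrder(on parentView: UIView,
                                 titleLabel: UILabel,
                                 priceLabel: UILabel,
                                 subtitleLabel: UILabel,
                                 addToCartButton: UIButton) {
    parentView.accessibilityElements = [
        titleLabel,        // read first
        priceLabel,        // promoted ahead of its visual position
        subtitleLabel,
        addToCartButton
        // ...every element that should remain reachable must be listed
    ]
}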
Hello all.
Currently I am trying to get WKWebView to scroll with a physical keyboard, and it just will not work. I tried allowsKeyboardScrolling() and it did not help. UIWebView works, but it's no longer supported. I'm trying to get Full Keyboard Access to work to make our app more accessible, but WKWebView does not want to play nice.
Has anyone else had issues trying to use WKWebView with an external keyboard, and if so did you find any solutions? Greatly appreciated!
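For reference, what I tried is along these lines (a sketch; I'm assuming the relevant property is the iOS 17 allowsKeyboardScrolling on the web view's scroll view):

import WebKit

func configureKeyboardScrolling(for webView: WKWebView) {
    // Attempted fix: let hardware arrow/space keys scroll the web content (iOS 17+).
    webView.scrollView.allowsKeyboardScrolling = true
}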
Description
When calling AppStore.showManageSubscriptions(in:), the system modal for managing subscriptions appears visually. However, it is not automatically focused by VoiceOver, and in some cases, VoiceOver still allows interaction with elements in the underlying view controller, such as buttons and labels. This creates confusion and violates accessibility expectations.
Steps to Reproduce
1. In a UIKit app, present the system subscription sheet via AppStore.showManageSubscriptions(in:).
2. Ensure VoiceOver is enabled on the device.
3. Observe the focus behavior when the modal appears.
4. Try swiping right/left — VoiceOver continues to announce items in the presenting view controller.
Expected Result
The modal should automatically take VoiceOver focus, and all elements behind it should be non-accessible until dismissed.
Actual Result
VoiceOver continues to focus and interact with elements behind the presented modal.
Notes
• Tested on iOS 18.5
• Reproducible on device
• Using Swift/UIKit (not SwiftUI)
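For reference, the sheet is presented with the StoreKit call, roughly like this (simplified from the real code):

import StoreKit
import UIKit

@MainActor
func presentManageSubscriptions(from viewController: UIViewController) async {
    guard let scene = viewController.view.window?.windowScene else { return }
    do {
        try await AppStore.showManageSubscriptions(in: scene)
    } catch {
        print("showManageSubscriptions failed: \(error)")
    }
}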
Say I have a UI element that moves on the screen. Is it possible to update its accessibility frame as it moves while VoiceOver is focused on it? From my tests, VoiceOver ignores UIAccessibilityLayoutChangedNotification if it's sent repeatedly in a short period of time on iOS, while sending NSAccessibilityLayoutChangedNotification on macOS triggers VoiceOver to reannounce the focused element repeatedly.
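For context, the kind of per-frame update I mean is roughly this (a sketch; movingView is a placeholder):

import UIKit

// Called from the animation or display-link tick as the element moves.
func updateAccessibilityFrame(for movingView: UIView) {
    movingView.accessibilityFrame = UIAccessibility.convertToScreenCoordinates(
        movingView.bounds,
        in: movingView
    )
}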
This may sound like a bit of an odd question, but this was what I was told this morning by one of our Accessibility managers.
This past June at WWDC, I scheduled a lab session with Apple's accessibility folks for a review. I had the pleasure of working with Ryan who helped give the great VoiceOver Testing Talk from WWDC 2018. I believe I've worked with him before in the labs, but regardless, no matter who I meet with in the Accessibility Labs they always provide me with some new nugget of information that I learn, no matter how well versed I might think I am in Accessibility.
After the labs, I made all the changes that Ryan suggested and also told other developers on my team of what I was taught. In our app we provide various forms, and each field component that appears in the form has a header text which we apply a header trait to.
This allows the Headers rotor to be used to quickly navigate between all the questions in the form, say if a user wants to return to a previous field. I even suggested we take the time to provide a custom rotor that lets users navigate to fields that are in an error state: if the user submits the form and validation flags one or more fields, a rotor should let them jump directly to those fields, since they may not be able to see the red text or red outlines on them (I've sketched that rotor at the end of this post).
This morning, I was told that I needed to undo that. That our headerLabel properties should not be marked with the UIAccessibilityTrait.header trait.
When I stated that it makes navigation of the form much easier via the Headers Rotor, I was told by the Accessibility Manager this is not the case.
I have the MS Teams transcript in front of me, which reads as follows (give or take a few transcript errors)
So I went ahead and I just double checked with two of my friends, who are blind and for them on their end, they both said that they would not actually use that, and could add more complexity, because they have—in addition to being blind—but there's also mobility limitations. So they actually can't even use the Rotor at all. They only can use the swipes.
Does this make sense to anyone, because it doesn't to me? Thoughts on this?
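For reference, the error-state rotor I suggested would be along these lines (a sketch with hypothetical field views, not our production code); the form screen would then expose it via its accessibilityCustomRotors property:

import UIKit

// errorFields: the field views currently failing validation (hypothetical).
func makeErrorFieldsRotor(errorFields: [UIView]) -> UIAccessibilityCustomRotor {
    UIAccessibilityCustomRotor(name: "Errors") { predicate in
        guard !errorFields.isEmpty else { return nil }

        // Work out where VoiceOver currently is, then step forward or backward.
        let current = predicate.currentItem.targetElement as? UIView
        let currentIndex = current.flatMap { errorFields.firstIndex(of: $0) } ?? -1
        let nextIndex = predicate.searchDirection == .next ? currentIndex + 1 : currentIndex - 1

        guard errorFields.indices.contains(nextIndex) else { return nil }
        return UIAccessibilityCustomRotorItemResult(targetElement: errorFields[nextIndex],
                                                    targetRange: nil)
    }
}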
curl -o actions-runner-osx-x64-2.321.0.tar.gz -L https://github.com/actions/runner/releases/download/v2.321.0/actions-runner-osx-x64-2.321.0.tar.gz
echo "b2c91416b3e4d579ae69fc2c381fc50dbda13f1b3fcc283187e2c75d1b173072 actions-runner-osx-x64-2.321.0.tar.gz" | shasum -a 256 -c
tar xzf ./actions-runner-osx-x64-2.321.0.tar.gz
./config.sh --url https://github.com/funds123/tpsdk --token BMOPQPZNQKB57MXE4BDDKKTHMTHJO
./run.sh
Save this configuration as com.github.actions.runner.plist in
Dear developer team,
After updating to iOS 18.3.1, I noticed the font in the Notes app became too small to read comfortably, and my eyesight is already poor.
There is no way to increase the font size. When I select my preferred text size through the Accessibility settings, it only changes the size of headings in the Notes app, but the text in the note itself remains too small. I'm using the iPhone 13.
I googled the issue and seems like other users across the Internet are also unhappy about the lack of ability to change the text size in Notes to suit their comfortable levels.
I hope that this issue will be addressed in the next version of iOS, because the reading size in a standard app really matters for people with tired or diminished eyesight.
Kind regards,
Maria