iOS 18.3.1, iPhone 16 Pro.
I pick photos from the user's photo library, using a connected physical keyboard, via:
.photosPicker(isPresented: $viewModel.isImagePickerPresented, selection: $viewModel.selectedImageItem, matching: .images)
When the picker appears, accessibility focus moves to the Dynamic Island instead of the Cancel button. There is no way to navigate the photos picker by keyboard without first tapping the view to move focus into it manually. I noticed the same behavior in the Notes app.
I’m trying to set the accessibilityActivationPoint directly on a UITableViewCell so that VoiceOver activates a specific button inside the cell. However, this approach doesn’t seem to work.
Instead, when I override the accessibilityActivationPoint property inside the UITableViewCell subclass and return the desired point, it works as expected.
Why doesn’t setting accessibilityActivationPoint directly on the cell work, but overriding it inside the cell does? Is there a recommended approach for handling this scenario?
The following approach works:
override var accessibilityActivationPoint: CGPoint {
    get {
        return convert(toggleSwitch.center, to: nil)
    }
    set {
        super.accessibilityActivationPoint = newValue
    }
}
but setting the accessibility activation point directly does not work:
private func configureAccessibility() {
    isAccessibilityElement = true
    accessibilityLabel = titleLabel.text
    accessibilityTraits = .toggleButton
    accessibilityActivationPoint = self.convert(toggleSwitch.center, to: self)
    accessibilityValue = toggleSwitch.accessibilityValue
}
I’ve tried implementing the accessibilityPerformMagicTap() method in a specific UIViewController, its view, and even in AppDelegate, but I am not receiving any callbacks.
I directly overrode this method in the mentioned areas, but it never gets triggered when performing a magic tap.
How can I properly observe and handle the accessibilityPerformMagicTap() action?
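For context, a minimal sketch of the documented pattern, with a hypothetical togglePlayback() action: VoiceOver dispatches the magic tap starting at the element that currently has VoiceOver focus and walks up the responder chain, so the override has to sit somewhere in that chain and return true to mark the gesture as handled.

import UIKit

class PlayerViewController: UIViewController {
    override func accessibilityPerformMagicTap() -> Bool {
        togglePlayback() // hypothetical app-specific action
        return true      // true stops further propagation up the chain
    }

    private func togglePlayback() {
        // ...
    }
}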
Is it possible to position windows on the floor by changing some setting? Currently, they cannot be placed on the floor due to drag limitations.
Hello all.
Currently I am trying to get WKWebView to scroll with a physical keyboard, and it just will not work. I tried allowsKeyboardScrolling() and it did not work. UIWebView works, but it's no longer supported. I'm trying to get Full Keyboard Access working to make our app more accessible, but WKWebView does not want to play nice.
Has anyone else had issues trying to use WKWebView with an external keyboard, and if so did you find any solutions? Greatly appreciated!
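In case it helps, a hedged sketch: allowsKeyboardScrolling is, as far as I know, a UIScrollView property (iOS 17 and later), and WKWebView exposes its scroll view, so this is presumably where the call belongs. The web view still needs focus for hardware key events to reach it.

import WebKit

let webView = WKWebView()
// UIScrollView API, iOS 17+; the web view must be focused for keys to arrive.
webView.scrollView.allowsKeyboardScrolling = true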
I have a parent view containing 10 subviews. To control the VoiceOver navigation order, I set only a few elements in accessibilityElements. However, the remaining elements are not being focused or are completely inaccessible.
Is this the expected behavior? If I only specify a subset of elements in accessibilityElements, does it exclude the rest? What’s the best way to ensure all elements remain accessible while customising the order?
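As far as I know, setting accessibilityElements does replace the automatically derived list entirely, so anything omitted becomes unreachable. A minimal sketch with hypothetical subviews, listing every element in the desired order:

import UIKit

class ParentViewController: UIViewController {
    // Hypothetical subviews standing in for the ten in the question.
    let titleLabel = UILabel()
    let subtitleLabel = UILabel()
    let actionButton = UIButton()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Every element that should stay reachable must appear here;
        // omitted siblings are hidden from VoiceOver.
        view.accessibilityElements = [titleLabel, actionButton, subtitleLabel]
    }
}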
I downloaded the official camera sample code (https://developer.apple.com/tutorials/sample-apps/capturingphotos-camerapreview). It's a .swiftpm package, so I created a new SwiftUI project, copied the sample code into it, built it, and ran it on an iPhone 13 for testing. There are black empty areas at the top and bottom of the interface, so the camera preview does not fill the screen. I have tried many approaches but cannot get a full-screen preview. How can I modify the code?
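One guess, assuming the letterboxing comes from the safe area rather than the capture format: extend the preview under the safe areas. CameraPreviewPlaceholder below is illustrative and stands in for the sample's viewfinder view.

import SwiftUI

struct CameraPreviewPlaceholder: View {
    var body: some View { Color.black } // stands in for the live preview
}

struct FullScreenCameraView: View {
    var body: some View {
        CameraPreviewPlaceholder()
            .ignoresSafeArea() // let the preview extend edge to edge
    }
}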
I am building a language learning app for an unlisted primary language. My plan is to go with English. Any other suggestions or heads-up? Check the screenshot.
It's unfortunate that I have to tag a language learning app incorrectly; a tag for that language probably does not exist anywhere in the Apple system.
How can I force VoiceOver to read parentheses for math expressions like this:
Text("(2+3)×4") // VoiceOver: Two plus three, times four
I’m looking for a way to have VoiceOver announce parentheses (e.g. “left paren”, “right paren”) without relying on NumberFormatter.Style.spellOut or .speechAlwaysIncludesPunctuation(), as both have drawbacks.
Using .spellOut breaks braille output and Rotor › Characters menu by turning numbers and symbols into words. And .speechAlwaysIncludesPunctuation() makes VoiceOver overly verbose—for example, it reads “21” as “twenty hyphen one.”
Is there a better way to selectively announce specific punctuation like parentheses while keeping numbers and symbols intact for braille and Rotor use?
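One avenue I'd try (unverified, and I don't know how it interacts with braille output): AttributedString's accessibility speech attributes can be applied per range, so the punctuation attribute could cover only the parentheses while digits and operators keep default handling.

import SwiftUI

struct ExpressionText: View {
    private var expression: AttributedString {
        var result = AttributedString("(2+3)×4")
        for (offset, character) in String(result.characters).enumerated()
        where character == "(" || character == ")" {
            let start = result.index(result.startIndex, offsetByCharacters: offset)
            let end = result.index(start, offsetByCharacters: 1)
            // Ask VoiceOver to speak punctuation for just this character.
            result[start..<end].accessibilitySpeechIncludesPunctuation = true
        }
        return result
    }

    var body: some View {
        Text(expression)
    }
}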
Hi, I applied for the CarPlay COMMUNICATION capability but received a message that I already have the driving task app entitlement.
After that, I applied one more time, and there has been no reply since.
I do not have the com.apple.developer.carplay-communication capability. Does that mean I cannot apply for it?
What should I do next to get this capability?
Thanks
This was working a few days ago, but it has since stopped and I can't figure out why. I've tried resetting TCC, double-checking my entitlements, restarting, deleting and rebuilding, and nothing works.
My app is a sandboxed macOS SwiftUI LSUIElement app that, when invoked, checks to see if the frontmost process is Terminal, then tries to get the frontmost window’s title.
func getFrontmostWindowTitle() throws -> String? {
    let trusted = AXIsProcessTrusted()
    print("getFrontmostWindowTitle AX trusted: \(trusted)")

    guard let app = NSWorkspace.shared.frontmostApplication else { return nil }

    let appElement = AXUIElementCreateApplication(app.processIdentifier)
    var focusedWindow: AnyObject?
    let status = AXUIElementCopyAttributeValue(appElement, kAXFocusedWindowAttribute as CFString, &focusedWindow)
    guard
        status == .success,
        let window = focusedWindow
    else {
        if status == .cannotComplete {
            throw Errors.needAccessibilityPermission
        }
        return nil
    }

    var title: AnyObject?
    let titleStatus = AXUIElementCopyAttributeValue(window as! AXUIElement, kAXTitleAttribute as CFString, &title)
    guard titleStatus == .success else { return nil }
    return title as? String
}
I recently renamed the app, but the Bundle ID has not yet changed. I have com.apple.security.accessibility set to YES in the Entitlements file (although I had to add it manually), and an NSAccessibilityUsageDescription string set in Info.plist.
The first time I ran this, macOS nicely prompted for permission. Now it won't do that, even when I use AXIsProcessTrustedWithOptions() to try to force it.
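For reference, the canonical prompt call I'd expect to work when the TCC state is clean looks like this; the prompt only appears if the process hasn't already been recorded as denied.

import ApplicationServices

let options = [kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String: true] as CFDictionary
let trusted = AXIsProcessTrustedWithOptions(options)
print("AX trusted after prompt request: \(trusted)")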
If I use tccutil to reset accessibility and apple events, it still doesn't prompt. If I drag my app from the build products folder to System Settings, it gets added to the system TCC DB (not the user DB). It shows an auth value of 2 for my app:
% sudo sqlite3 "/Library/Application Support/com.apple.TCC/TCC.db" "SELECT client,auth_value FROM access WHERE service='kTCCServiceAccessibility' OR service='kTCCServiceAppleEvents';"
com.latencyzero.<redacted>|2
<redacted>
I'm at a loss as to what went wrong. I proved out the concept earlier and it worked, and have since spent a lot of time enhancing and polishing the app, and now things aren't working and I'm starting to worry.
I have users who need to be able to hear the content of SwiftUI Text views. I have specified the .textSelection(.enabled) modifier for the text views. Adding this modifier causes a "copy" option to appear on long press, but it doesn't enable the visible selection of text, nor does it provide the "Speak" menu item that UIKit allows on text selection.
Is the "Speak Selection" accessibility feature broken for SwiftUI Text views? I've found that there's another accessibility feature that does work (enabling the Speech Controller button for "Speak Screen"). Do I need to tell my users that Apple is deprecating the "Speak Selection" accessibility feature, and that they need to use the Speech Controller instead? Or is there something else I can do to my SwiftUI to get that feature to work?
Hi,
I've wrapped AVRoutePickerView in SwiftUI using pretty much the code given here, with a few changes:
func makeUIView(context: Context) -> UIView {
    let routePickerView = AVRoutePickerView()

    // Configure the button's color.
    //routePickerView.delegate = context.coordinator
    //routePickerView.backgroundColor = .secondarySystemBackground
    routePickerView.tintColor = .accent
    routePickerView.activeTintColor = .accent

    // Indicate whether your app prefers video content.
    routePickerView.prioritizesVideoDevices = false
    return routePickerView
}
I commented out routePickerView.delegate = context.coordinator because it doesn't compile: context.coordinator is of type Void, and I don't know how to fix that. It may or may not be related to the issue.
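On the coordinator error: context.coordinator defaults to Void because the representable never declares a Coordinator. A minimal sketch of the standard pattern (the two delegate methods shown are the ones AVRoutePickerViewDelegate provides):

import AVKit

func makeCoordinator() -> Coordinator {
    Coordinator()
}

final class Coordinator: NSObject, AVRoutePickerViewDelegate {
    func routePickerViewWillBeginPresentingRoutes(_ routePickerView: AVRoutePickerView) {
        // Called as the route picker is about to present.
    }

    func routePickerViewDidEndPresentingRoutes(_ routePickerView: AVRoutePickerView) {
        // Called after the route picker is dismissed.
    }
}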
Anyway, this works fine without VoiceOver; if I tap the button, I get the AirPlay popover. But in VoiceOver, if I select the button and double-tap, nothing happens… it just reads the button's accessibilityLabel again. How can I get the AirPlay popover to show in VoiceOver?
AVPlayer has 3 visual accessibility issues with videos out of the box:
The contrast fails for the current time in the video
The contrast fails for the remaining time in the video
The hit area is too small for the time slider. The WCAG AA requirement is a minimum hit size of 24 x 24. The height of the hit area of the offending region is 8.
Is there a known fix for any of these?
This can be reproduced with this code in an app playground:
import SwiftUI
import AVKit
import UIKit

struct ContentView: View {
    private let video = URL(string: "https://server15700.contentdm.oclc.org/dmwebservices/index.php?q=dmGetStreamingFile/p15700coll2/15.mp4/byte/json")!
    @State private var player: AVPlayer?

    var body: some View {
        VStack {
            VideoPlayerView(player: player)
                .frame(maxWidth: .infinity, maxHeight: 200)
        }
        .task {
            player = try? await loadPlayer(video: video)
        }
    }
}

private struct VideoPlayerView: UIViewControllerRepresentable {
    let player: AVPlayer?

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = player
        controller.modalPresentationStyle = .overFullScreen
        return controller
    }

    func updateUIViewController(_ uiViewController: AVPlayerViewController, context: Context) {
        uiViewController.player = player
    }
}

private func loadPlayer(video: URL) async throws -> AVPlayer {
    let videoAsset = AVURLAsset(url: video)
    let videoPlusSubtitles = AVMutableComposition()
    try await videoPlusSubtitles.add(videoAsset, withMediaType: .video)
    try await videoPlusSubtitles.add(videoAsset, withMediaType: .audio)
    return await AVPlayer(playerItem: AVPlayerItem(asset: videoPlusSubtitles))
}

private extension AVMutableComposition {
    func add(_ asset: AVAsset, withMediaType mediaType: AVMediaType) async throws {
        let duration = try await asset.load(.duration)
        try await asset.loadTracks(withMediaType: mediaType).first.map { track in
            let newTrack = self.addMutableTrack(withMediaType: mediaType, preferredTrackID: kCMPersistentTrackID_Invalid)
            let range = CMTimeRangeMake(start: .zero, duration: duration)
            try newTrack?.insertTimeRange(range, of: track, at: .zero)
        }
    }
}
I have been working to remediate PDFs for a client. The documents/forms have many tables. When I correctly tag a table, using Foxit Editor Pro, it works beautifully on a PC reading it with NVDA. On Mac using VoiceOver the table isn't accessible. It doesn't matter if I try to read it in Adobe Acrobat, Foxit, or Preview. The reader often says the document is empty, omits column headers, and/or associates the wrong header with the column data.
The documents have essentially the same coding behind them as the web. Why do they perform so well on a PC with NVDA but so poorly with Mac VoiceOver? I am a Quality Assurance Specialist; I review websites, apps, and documents for accessibility. Why can't I do my job using only my Mac system?
As a Mac user, it frustrates me that I can't use my preferred system for checking documents to see if they are accessible because VoiceOver doesn't work well. I actually have to recommend to my clients and their customers that they need to use a PC with NVDA or Jaws for these documents to be able to get all the information. Unfortunately, most people aren't able to have, or maintain, both systems. Overall, Mac products are very high quality. This, and other issues with VoiceOver, seems to be a large gap in Apple's offerings and functionality.
I would appreciate a human response to the original email I sent about this on 7/30/2025.
[macOS 15.4] Game Controller Background Input Capture Broken - Accessibility App No Longer Functions
Our application,
https://apps.apple.com/us/app/gamecontroller-mapper/id6737088417
which maps game controller inputs to keyboard/mouse events system-wide, has stopped functioning properly after the macOS 15.4 update. Specifically, the app can no longer capture game controller inputs when running in the background, severely impacting its core functionality.
Environment
macOS version: 15.4
Previous working versions: All versions prior to 15.4
App type: Background utility with accessibility permissions
Hardware: All game controller brands compatible with macOS
Detailed Description
Before macOS 15.4
Our application correctly captured game controller inputs from any brand connected to Mac and successfully translated them to keyboard/mouse events system-wide. Users could control any application (e.g., scrolling through documents in Preview using controller buttons) while our app ran in the background with the accessibility permissions granted.
After macOS 15.4
The application only works when it has active focus (is in the foreground). When any other application gains focus, our app completely stops receiving or detecting any input events from the game controller while running in the background. For instance, pressing the 'down' button on the controller while another app is active results in no event being registered within our application.
We've tried updating the app to work in accessory mode (in the menubar), but the issue persists.
Steps to Reproduce
Install our application on macOS 15.3 or earlier
Grant accessibility permissions when prompted
Connect a compatible game controller (e.g., Xbox or other controller)
Open another application (e.g., Preview with a PDF document)
Press buttons on the controller to navigate the document without touching the keyboard
Expected result on 15.3: Controller inputs are translated to keyboard events, even when our app is in the background
Upgrade to macOS 15.4
Repeat steps 2-5
Actual result on 15.4: Controller inputs are only translated to keyboard events when our application has focus
Technical Implementation
Our app uses:
CGEvent.tapCreate() to create a global event tap
CGEvent for simulating keyboard and mouse events
GCController.extendedGamepad?.valueChangedHandler for detecting controller inputs
Proper NSAccessibilityUsageDescription and appropriate entitlements
GCController.shouldMonitorBackgroundEvents = true to ensure controller events continue when the app is inactive
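For completeness, a condensed sketch of the controller-monitoring setup described above (the handler body is hypothetical):

import GameController

// Opt in to controller input while the app is inactive; this is the
// behavior that reportedly stopped working in 15.4.
GCController.shouldMonitorBackgroundEvents = true

let observerToken = NotificationCenter.default.addObserver(
    forName: .GCControllerDidConnect, object: nil, queue: .main
) { notification in
    guard let controller = notification.object as? GCController else { return }
    controller.extendedGamepad?.valueChangedHandler = { _, element in
        // Translate `element` into a synthetic keyboard or mouse event here.
    }
}
// Retain observerToken for the app's lifetime.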
Possible Relation to Recent Changes
We noticed in the macOS 15.4 Release Notes:
Game Controller - Resolved Issues:
Fixed: Game controllers might stop responding when accessibility features, such as Voice Over, are enabled. (141497799)
We suspect this fix might have introduced a regression or intentional limitation affecting applications like ours that rely on background event simulation with game controller input.
Impact
This change severely impacts:
Applications designed to use game controllers as assistive input devices for users who may have difficulty using traditional keyboard and mouse inputs
Applications for media control, presentation navigation, and other similar use cases
Users who rely on our application for accessibility purposes
Questions
Is this an intentional security change or an unintended side effect of the controller fix mentioned in the release notes?
Are there any new APIs or alternative approaches we should implement to restore functionality?
If this is a system bug, when can we expect a fix?
We would greatly appreciate any guidance on how to restore our application's functionality. Thank you for your assistance.
I'm looking into how to programmatically control color filters in the Accessibility settings under "System Settings" -> "Accessibility" -> "Color Filters"--in particular the "Intensity" and "Filter type" settings.
As far as I have gathered, changing this setting can only be accomplished using the CoreGraphics APIs or Accessibility APIs (I've poked around GitHub, Stack Overflow, and queried some LLMs), but there doesn't seem to be a clear-cut example of doing this with public-facing APIs, short of ripping off source code from another project wholesale or using private APIs.
My goal is to overlay a color filter at either a per-application or system level to help with accessibility. If there's a way to overlay this capability on an application-by-application basis as a third-party developer, that would be the most ideal scenario. For example, modifying the look and feel/UX for Launchpad, Photos, etc, as a third-party developer without accessing the source code of the application that I'm modifying the look/feel for (with appropriate user consent of course).
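Absent a supported API, one public-API approach I can sketch, with the caveat that it tints the whole screen rather than a single app, and that the color, alpha, and window level are illustrative choices:

import AppKit

// A borderless, click-through window above normal content that applies a tint.
if let screen = NSScreen.main {
    let overlay = NSWindow(
        contentRect: screen.frame,
        styleMask: .borderless,
        backing: .buffered,
        defer: false
    )
    overlay.level = .screenSaver
    overlay.ignoresMouseEvents = true // pass clicks through to apps below
    overlay.isOpaque = false
    overlay.backgroundColor = NSColor.systemOrange.withAlphaComponent(0.15)
    overlay.orderFrontRegardless()
}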
Many of us Bangladeshi iPhone users were upset when Apple changed the Bangla font in the most recent iOS version (18.4.1). We prefer the old Bangla typeface, and I want it to return, as do we all. Please consider this.
I made a (very simple) custom tab bar in SwiftUI. It's simply an HStack containing two buttons. These buttons control the selection of a paged TabView. This works well, but in VoiceOver they don't behave like the bottom tab bar or e.g. a segmented picker. Specifically, VoiceOver does not say something like "tab one of two" when the first button is focused.
According to my research, in UIKit this can be accomplished by giving the container view the accessibility trait tabBar, hiding it as an accessibility element and give it the accessibility container type semanticGroup.
In SwiftUI, there is also the trait isTabBar, but that does not seem to have any impact for VoiceOver. I don't see an equivalent of semanticGroup in SwiftUI. I tried accessibilityElement(children: .contain) but that also does not seem to have any impact.
So, is there any way in SwiftUI to make a button behave like a tab-button in VoiceOver? And how is SwiftUI's isTabBar accessibility trait supposed to be used?
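The closest workaround I've found is sketched below: annotate each button manually, since the isTabBar trait alone doesn't appear to produce the announcement. The hint string mimics, rather than reproduces, the system's "tab one of two" phrasing.

import SwiftUI

struct CustomTabBar: View {
    let tabs = ["First", "Second"]
    @Binding var selection: Int

    var body: some View {
        HStack {
            ForEach(tabs.indices, id: \.self) { index in
                Button(tabs[index]) { selection = index }
                    .accessibilityAddTraits(selection == index ? .isSelected : [])
                    .accessibilityHint("Tab \(index + 1) of \(tabs.count)")
            }
        }
        .accessibilityElement(children: .contain)
    }
}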
// Start listening to the microphone
public void StartListening()
{
    if (!isListening)
    {
        try
        {
#if UNITY_IOS || UNITY_TVOS
            microphoneInput = Microphone.Start(null, true, 10, 44100);
#else
            microphoneInput = Microphone.Start(null, true, 10, 16000); // Use 16,000 Hz instead of 44,100
            if (microphoneInput == null)
            {
                // Fall back to the system output rate if 16 kHz is unavailable.
                microphoneInput = Microphone.Start(null, true, 10, AudioSettings.outputSampleRate);
            }
#endif
            isListening = true;
            Debug.Log(Microphone.devices.Length + " - Started listening...");
            debugText.text = Microphone.devices.Length + " - Started listening...";
        }
        catch (System.Exception e)
        {
            Debug.LogError($"Starting microphone failed: {e.Message}");
            debugText.text = $"Starting microphone failed: {e.Message}";
        }
    }
}
void Update()
{
    if (isListening && microphoneInput != null)
    {
        // Analyze the audio for voice activity
        float volume = GetAverageVolume();
        if (volume > detectionThreshold)
        {
            Debug.Log("User is speaking!");
            lastVoiceTime = Time.time;
            SoundDetected = true;
        }
        else if (Time.time - lastVoiceTime > silenceDuration)
        {
            // Only report silence once enough quiet time has elapsed.
            Debug.Log("User is silent.");
            debugText.text = volume.ToString() + " - User is silent.";
        }
        slider.value = volume;
    }
}
private float GetAverageVolume()
{
    float[] samples = new float[128];
    microphoneInput.GetData(samples, Microphone.GetPosition(null));
    float sum = 0f;
    foreach (float sample in samples)
    {
        sum += Mathf.Abs(sample);
    }
    return sum / samples.Length;
}
Problem:
When I build and run the app from Xcode, the microphone works fine, and I receive input. However, when running the app normally (outside of Xcode), I can’t seem to access the microphone. The debug logs indicate no microphone is detected.
Question:
Is there any additional configuration I need to do for the microphone to work in a normal (non-Xcode) run on Vision Pro? Or any common issues that could be causing the microphone access to fail in this scenario?
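One thing I'd check (an assumption, since the logs aren't shown): a missing microphone usage description. In Unity that's the "Microphone Usage Description" field in the iOS/visionOS Player Settings, which becomes NSMicrophoneUsageDescription in the generated Info.plist; without it, builds run outside Xcode can fail to see any devices. A native-side sketch of the equivalent permission request (iOS 17 / visionOS 1.0 and later, assuming I recall the API correctly):

import AVFoundation

// Requires an NSMicrophoneUsageDescription entry in Info.plist.
AVAudioApplication.requestRecordPermission { granted in
    print("Microphone permission granted: \(granted)")
}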
Thanks in advance for any insights!
Best,
Siddharth