Voice Control Disabling System Services After Reboot
I recently learned from Apple Accessibility Support that the issue I’m experiencing with Voice Control is now affecting multiple users. When I first reported the problem, I appeared to be the first case—what you might call “patient zero.” I have provided extensive feedback and system logs, but now that the issue is more widespread, I have been told that I will not be informed of the cause or notified directly when a fix is found. Instead, updates will be released as solutions are identified, and support staff will not necessarily know the details of the underlying problem.
To summarize my experience: after enabling Voice Control and rebooting my MacBook Pro (14.2-inch, M4 chip), critical Apple system services—including FaceTime, Apple Music, and News—stop functioning. Dictation remains available, but it is not as accurate or effective for my needs as Voice Control. I rely on these accessibility features daily because of my disability, cerebral palsy, and this issue has persisted for over five months.
I have always valued contributing to the developer program and supporting Apple’s efforts to improve accessibility. However, I find it discouraging that there is no clear communication about the status of this issue or its resolution. My theory is that there may be a hardware interaction—perhaps between the neural engine and the new Wi-Fi chip—rather than a purely software problem.
I understand that some information may not be immediately available, but I believe that users who rely on accessibility features should be kept informed about major issues and their progress toward resolution. I appreciate the dedication of the accessibility and development teams, and I want to continue supporting Apple’s mission of inclusion. Thank you for your attention to this matter.
Sincerely,
Donald Spencer Kirby
Dayton, Ohio
I have been working on a feature where a SwiftUI List loads both previous- and next-page data: the user can scroll up or down to load the previous or next page.
While testing with VoiceOver, I ran into an accessibility issue: when the user lands on the listing screen and swipes across the screen from the navigation, focus should highlight the first visible item once it reaches the list.
But when the user swipes back:
Should it load the previous page and announce the previous item, or should focus return to the navigation items?
If it loads the previous item, what happens when the user wants to move back to the navigation to perform other actions, and vice versa?
Has anyone come across this kind of issue? What would the expected standard behavior be when a list supports both previous- and next-page scrolling?
I tried different gestures (https://support.apple.com/en-in/guide/iphone/iph3e2e2281/ios), but none of them worked.
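One pattern that might help, as a minimal sketch (my own suggestion; Item, loadPreviousPage(), and loadNextPage() are hypothetical): map VoiceOver's three-finger scroll onto paging with accessibilityScrollAction, so the element-by-element swipe order stays stable and users can still swipe back out to the navigation.

import SwiftUI
import UIKit

struct Item: Identifiable { let id: Int; let title: String }

struct PagedList: View {
    @State private var items: [Item] = []

    var body: some View {
        List(items) { item in
            Text(item.title)
        }
        // Three-finger VoiceOver scroll triggers paging; single swipes keep
        // their normal order within the list and back to the navigation.
        .accessibilityScrollAction { edge in
            switch edge {
            case .top:
                loadPreviousPage()
                UIAccessibility.post(notification: .pageScrolled, argument: "Previous page loaded")
            case .bottom:
                loadNextPage()
                UIAccessibility.post(notification: .pageScrolled, argument: "Next page loaded")
            default:
                break
            }
        }
    }

    private func loadPreviousPage() { /* hypothetical: prepend earlier items */ }
    private func loadNextPage() { /* hypothetical: append later items */ }
}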
I am working on capturing 48MP images using the iPhone 16 Pro Max with the Ultra-wide camera. I’ve updated the code to capture the maximum supported dimensions with the following snippet:
if #available(iOS 16.0, *) {
    // Use the largest dimensions the active format reports; the forced
    // unwrap assumes supportedMaxPhotoDimensions is non-empty.
    photoOutput.maxPhotoDimensions = device.activeFormat.supportedMaxPhotoDimensions.last!
    photoSettings.maxPhotoDimensions = .init(width: 5712, height: 4284)
}
However, I’m still not getting the expected results. My goal is to capture 48MP images, and I want to confirm if the Ultra-wide camera supports this resolution or if I’m missing any other configuration.
Any guidance would be appreciated!
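For what it's worth, 5712 x 4284 works out to roughly 24MP, so even when that value is accepted it would not be a 48MP capture; 48MP output is typically on the order of 8064 x 6048. A minimal diagnostic sketch (my assumption: you want to confirm what the ultra-wide hardware actually reports before configuring the session):

import AVFoundation

@available(iOS 16.0, *)
func logUltraWidePhotoDimensions() {
    // Hypothetical helper: prints the maximum photo dimensions every
    // ultra-wide format supports, so you can see whether 48MP appears.
    guard let device = AVCaptureDevice.default(.builtInUltraWideCamera,
                                               for: .video,
                                               position: .back) else {
        print("No ultra-wide camera found")
        return
    }
    for format in device.formats {
        let video = CMVideoFormatDescriptionGetDimensions(format.formatDescription)
        for dims in format.supportedMaxPhotoDimensions {
            print("Video \(video.width)x\(video.height): photos up to \(dims.width)x\(dims.height)")
        }
    }
}

As far as I understand, photoSettings.maxPhotoDimensions must not exceed what photoOutput.maxPhotoDimensions was set to, and the output's value should be set after the session's device and format are configured.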
I’m trying to set the accessibilityActivationPoint directly on a UITableViewCell so that VoiceOver activates a specific button inside the cell. However, this approach doesn’t seem to work.
Instead, when I override the accessibilityActivationPoint property inside the UITableViewCell subclass and return the desired point, it works as expected.
Why doesn’t setting accessibilityActivationPoint directly on the cell work, but overriding it inside the cell does? Is there a recommended approach for handling this scenario?
The following approach works:
override var accessibilityActivationPoint: CGPoint {
    get {
        // Recomputed on every access, in window/screen coordinates (to: nil).
        return convert(toggleSwitch.center, to: nil)
    }
    set {
        super.accessibilityActivationPoint = newValue
    }
}
but setting the activation point directly does not work:
private func configureAccessibility() {
    isAccessibilityElement = true
    accessibilityLabel = titleLabel.text
    accessibilityTraits = .toggleButton
    // Note: this converts to self (the cell's own coordinate space), while
    // the working override converts to nil (window/screen coordinates), and
    // the value is captured once rather than recomputed after layout.
    accessibilityActivationPoint = self.convert(toggleSwitch.center, to: self)
    accessibilityValue = toggleSwitch.accessibilityValue
}
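My reading (not confirmed by documentation): two things differ between the approaches. The override converts to nil, i.e. to window/screen coordinates, which is the space the activation point is hit-tested in, and it is recomputed on every access; the direct assignment converts to self, yielding a point in the cell's own coordinates, and captures it once, possibly before layout. A minimal sketch of a stored-property variant under that assumption:

override func layoutSubviews() {
    super.layoutSubviews()
    // Recompute after each layout pass, converting to screen coordinates the
    // same way the working override does, so the stored point never goes stale.
    accessibilityActivationPoint = convert(toggleSwitch.center, to: nil)
}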
The issue is that I cannot automatically acquire Bluetooth keyboard focus in PHPickerViewController after enabling Full Keyboard Access on my iPhone 14 running iOS 18.3.1. The keyboard focus in PHPickerViewController does appear, but only after I tap on a blank space in the PHPickerViewController. How can I make the focus appear in the first place?
I'm using UINavigationController and calling setNavigationBarHidden(true, animated: false). Then I use this controller to present PHPickerViewController with the configuration setup below.
self.configuration = PHPickerConfiguration()
configuration.filter = .any(of: filters)
configuration.selectionLimit = selectionLimit
if #available(iOS 15.0, *), allowOrdering {
    configuration.selection = .ordered
}
configuration.preferredAssetRepresentationMode = .current
Finally I set the delegate on the PHPickerViewController and call UINavigationController.present(PHPickerViewController, animated: true) to present it.
I also notice that the focus animation shows at first and then disappears (visible in the first video).
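One thing that might be worth trying (purely an assumption on my part, and PHPickerViewController's UI runs out of process, so it may not take): explicitly request a focus update once the presentation completes.

navigationController.present(picker, animated: true) {
    // Hypothetical nudge: ask the focus system responsible for the picker
    // to move keyboard focus there once the sheet is on screen.
    if let focusSystem = UIFocusSystem.focusSystem(for: picker) {
        focusSystem.requestFocusUpdate(to: picker)
        focusSystem.updateFocusIfNeeded()
    }
}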
I would like to enable the option to resize windows with the Apple Pencil Pro. I tried, but it appears this feature is not enabled.
While it is possible to scroll content using VoiceOver on macOS, I was not able to find any NSAccessibility APIs related to it (such as accessibilityScroll: on iOS).
Say I have a UI element that moves on the screen. Is it possible to update its accessibility frame as it moves while VoiceOver is focused on it? From my tests, VoiceOver ignores UIAccessibilityLayoutChangedNotification if it's sent repeatedly in a short period of time on iOS, while sending NSAccessibilityLayoutChangedNotification on macOS triggers VoiceOver to reannounce the focused element repeatedly.
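To make the question concrete, here is the pattern I mean on iOS, as a sketch (the throttling is my own idea, not a documented recommendation): keep accessibilityFrame in sync while the element moves, but post the layout-change notification only once the movement settles.

// Called from whatever drives the movement (e.g. a CADisplayLink).
func elementDidMove(_ view: UIView) {
    // accessibilityFrame is expressed in screen coordinates.
    view.accessibilityFrame = UIAccessibility.convertToScreenCoordinates(view.bounds, in: view)
}

func elementDidStopMoving(_ view: UIView) {
    // Notify once at the end rather than per frame, since rapid repeated
    // posts are ignored on iOS and re-announce the element on macOS.
    UIAccessibility.post(notification: .layoutChanged, argument: view)
}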
Is it possible to position windows on the floor by changing some setting? Currently, they cannot be placed on the floor due to drag limitations.
The issue described in this Stack Overflow conversation is still present today: the hyphen before the last four digits of North American phone numbers is read back as "minus".
Is there a solution other than overriding the accessibilityLabel property?
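One option, sketched under the assumption that an attributed label is acceptable (it uses accessibilityAttributedLabel rather than replacing the label text, and label is a hypothetical UILabel): apply the spell-out speech attribute so the digits are spoken individually and the hyphen is not read as "minus".

let phone = "617-555-0123"
let attributed = NSMutableAttributedString(string: phone)
// Speak the string character by character; the visible text is unchanged.
attributed.addAttribute(.accessibilitySpeechSpellOut,
                        value: true,
                        range: NSRange(location: 0, length: attributed.length))
label.accessibilityAttributedLabel = attributed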
Description
When calling AppStore.showManageSubscriptions(in:), the system modal for managing subscriptions appears visually. However, it is not automatically focused by VoiceOver, and in some cases, VoiceOver still allows interaction with elements in the underlying view controller, such as buttons and labels. This creates confusion and violates accessibility expectations.
Steps to Reproduce
1. In a UIKit app, present the system subscription sheet via AppStore.showManageSubscriptions(in:).
2. Ensure VoiceOver is enabled on the device.
3. Observe the focus behavior when the modal appears.
4. Try swiping right/left — VoiceOver continues to announce items in the presenting view controller.
Expected Result
The modal should automatically take VoiceOver focus, and all elements behind it should be non-accessible until dismissed.
Actual Result
VoiceOver continues to focus and interact with elements behind the presented modal.
Notes
• Tested on iOS 18.5
• Reproducible on device
• Using Swift/UIKit (not SwiftUI)
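A workaround sketch until the system behavior changes (my own approach, relying on my understanding that showManageSubscriptions(in:) returns only after the sheet is dismissed; if it returns earlier on your OS version, the un-hiding would need to move to a dismissal callback; windowScene is assumed to be the current UIWindowScene, inside an async throwing context with StoreKit imported):

if #available(iOS 15.0, *) {
    // Hide the presenting hierarchy from VoiceOver while the sheet is up.
    view.accessibilityElementsHidden = true
    defer { view.accessibilityElementsHidden = false }
    try await AppStore.showManageSubscriptions(in: windowScene)
    // Nudge VoiceOver to re-evaluate focus after dismissal.
    UIAccessibility.post(notification: .screenChanged, argument: nil)
}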
I’m currently focused on an element at the bottom of the screen. What is the proper way to quickly navigate to the top element? By default, there’s a four-finger single tap to move to the first element, but should I use the Rotor action instead to focus on the element I need?
For example, in the Contacts app while adding a new contact, if I enter a value in a field at the bottom, there’s no quick way to directly save the contact. I have to manually navigate all the way to the top to tap the Done button, which feels a bit inconvenient.
Is there a better way to handle this using VoiceOver?
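On the developer side of this pattern (this won't change the built-in Contacts app), one mitigation is to expose the commit action as a custom accessibility action on the bottom fields, so it shows up in the Actions rotor without travelling to the top. A sketch with a hypothetical saveContact() handler:

lastField.accessibilityCustomActions = [
    UIAccessibilityCustomAction(name: "Save contact") { _ in
        saveContact() // hypothetical commit handler
        return true
    }
]

Alternatively, implementing accessibilityPerformMagicTap() on the view controller is sometimes used to trigger a screen's primary action with a two-finger double-tap.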
How can I force VoiceOver to read parentheses for math expressions like this:
Text("(2+3)×4") // VoiceOver: Two plus three, times four
I’m looking for a way to have VoiceOver announce parentheses (e.g. “left paren”, “right paren”) without relying on NumberFormatter.Style.spellOut or .speechAlwaysIncludesPunctuation(), as both have drawbacks.
Using .spellOut breaks braille output and Rotor › Characters menu by turning numbers and symbols into words. And .speechAlwaysIncludesPunctuation() makes VoiceOver overly verbose—for example, it reads “21” as “twenty hyphen one.”
Is there a better way to selectively announce specific punctuation like parentheses while keeping numbers and symbols intact for braille and Rotor use?
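One avenue that may work, sketched in UIKit (my assumption, with label as a hypothetical UILabel): apply the punctuation speech attribute to only the parenthesis characters, leaving digits and operators untouched for braille and the Characters rotor.

let expr = "(2+3)×4"
let attributed = NSMutableAttributedString(string: expr)
// Offsets equal UTF-16 locations here because every character in this
// string is a single UTF-16 unit.
for (offset, character) in expr.enumerated() where character == "(" || character == ")" {
    attributed.addAttribute(.accessibilitySpeechPunctuation,
                            value: true,
                            range: NSRange(location: offset, length: 1))
}
label.accessibilityAttributedLabel = attributed

I don't know of a SwiftUI-native way to scope this per character, so it may require bridging via UIViewRepresentable.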
Hi, I applied for the COMMUNICATION capability, but received a message that I already have the driving task app entitlement.
After that, I applied one more time, and there has been no reply.
I do not have the com.apple.developer.carplay-communication capability; does that mean I cannot apply for it?
What should I do next to get this capability?
Thanks
AVPlayer has 3 visual accessibility issues with videos out of the box:
The contrast fails for the current time in the video
The contrast fails for the remaining time in the video
The hit area for the time slider is too small. WCAG 2.2 AA requires a minimum target size of 24 x 24; the hit area of the offending region is only 8 points tall.
Is there a known fix for any of these?
This can be reproduced with this code in an app playground:
import SwiftUI
import AVKit
import UIKit

struct ContentView: View {
    private let video = URL(string: "https://server15700.contentdm.oclc.org/dmwebservices/index.php?q=dmGetStreamingFile/p15700coll2/15.mp4/byte/json")!
    @State private var player: AVPlayer?

    var body: some View {
        VStack {
            VideoPlayerView(player: player)
                .frame(maxWidth: .infinity, maxHeight: 200)
        }
        .task {
            player = try? await loadPlayer(video: video)
        }
    }
}

private struct VideoPlayerView: UIViewControllerRepresentable {
    let player: AVPlayer?

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = player
        controller.modalPresentationStyle = .overFullScreen
        return controller
    }

    func updateUIViewController(_ uiViewController: AVPlayerViewController, context: Context) {
        uiViewController.player = player
    }
}

private func loadPlayer(video: URL) async throws -> AVPlayer {
    let videoAsset = AVURLAsset(url: video)
    let videoPlusSubtitles = AVMutableComposition()
    try await videoPlusSubtitles.add(videoAsset, withMediaType: .video)
    try await videoPlusSubtitles.add(videoAsset, withMediaType: .audio)
    return AVPlayer(playerItem: AVPlayerItem(asset: videoPlusSubtitles))
}

private extension AVMutableComposition {
    func add(_ asset: AVAsset, withMediaType mediaType: AVMediaType) async throws {
        let duration = try await asset.load(.duration)
        try await asset.loadTracks(withMediaType: mediaType).first.map { track in
            let newTrack = self.addMutableTrack(withMediaType: mediaType, preferredTrackID: kCMPersistentTrackID_Invalid)
            let range = CMTimeRangeMake(start: .zero, duration: duration)
            try newTrack?.insertTimeRange(range, of: track, at: .zero)
        }
    }
}
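The only mitigation I know of is replacement rather than repair (an assumption that custom chrome is acceptable for your app): hide the system controls and overlay your own transport UI with AA-compliant contrast and hit targets of at least 24 x 24 points.

// In makeUIViewController above: suppress the system chrome, then overlay
// custom SwiftUI controls (a taller slider, higher-contrast time labels)
// bound to the same AVPlayer instance.
controller.showsPlaybackControls = false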
Is the voice command recording accessibility feature available on the Apple Vision Pro? It does not start on my device.
The Apple Vision Pro is on visionOS 26.1.
Regular single voice commands work on the Apple Vision Pro.
Recording commands worked on other devices (iPad and iPhone).
I'd like to add borders to all buttons in the iOS Simulator from my Mac app. First I get the simulator window. Then I walk the children of every AXGroup, and if a child is a button or a static text, I add a border.
But for some buttons this does not work. In the example image the NavigationBarButtons are not found. I guess the problem is that for some AXGroups the children array accessed with AXChildren is empty.
Here is some relevant code:
- (NSArray<DDHOverlayElement *> *)overlayChildrenOfUIElement:(AXUIElementRef)element index:(NSInteger)index {
    NSMutableArray<DDHOverlayElement *> *tempOverlayElements = [[NSMutableArray alloc] init];

    NSLog(@">>> -----------------------------------------------------");
    NSString *role = [UIElementUtilities roleOfUIElement:element];
    NSRect frame = [UIElementUtilities frameOfUIElement:element];
    NSLog(@"%@, role: %@, %@", element, role, [NSValue valueWithRect:frame]);
    NSArray *lineage = [UIElementUtilities lineageOfUIElement:element];
    NSLog(@"lineage: %@", lineage);

    NSArray<NSValue *> *children = [UIElementUtilities childrenOfUIElement:element];
    if (children.count < 1) {
        NSLog(@"NO CHILDREN");
    }

    // First pass: log every child.
    for (NSInteger i = 0; i < [children count]; i++) {
        NSValue *child = children[i];
        AXUIElementRef uiElement = (__bridge AXUIElementRef)child;
        NSString *role = [UIElementUtilities roleOfUIElement:uiElement];
        NSRect frame = [UIElementUtilities frameOfUIElement:uiElement];
        NSLog(@"----%@, role: %@, %@", child, role, [NSValue valueWithRect:frame]);
    }
    NSLog(@"<<< -----------------------------------------------------");

    // Second pass: collect overlay elements and recurse into containers.
    for (NSInteger i = 0; i < [children count]; i++) {
        NSValue *child = children[i];
        AXUIElementRef uiElement = (__bridge AXUIElementRef)child;
        NSString *role = [UIElementUtilities roleOfUIElement:uiElement];
        NSRect frame = [UIElementUtilities frameOfUIElement:uiElement];
        NSLog(@"%@, role: %@, %@", child, role, [NSValue valueWithRect:frame]);
        if ([role isEqualToString:@"AXButton"] ||
            [role isEqualToString:@"AXTextField"] ||
            [role isEqualToString:@"AXStaticText"]) {
            NSString *tag = [NSString stringWithFormat:@"%ld%ld", (long)index, (long)i];
            NSLog(@"tag: %@", tag);
            DDHOverlayElement *overlayElement = [[DDHOverlayElement alloc] initWithUIElementValue:child tag:tag];
            [tempOverlayElements addObject:overlayElement];
        } else if ([role isEqualToString:@"AXGroup"] ||
                   [role isEqualToString:@"AXToolbar"]) {
            // Note: ++index also mutates the index used for later siblings.
            [tempOverlayElements addObjectsFromArray:[self overlayChildrenOfUIElement:uiElement index:++index]];
        } else if ([role isEqualToString:@"AXWindow"]) {
            [self.overlayWindowController setFrame:[UIElementUtilities frameOfUIElement:uiElement]];
            [tempOverlayElements addObjectsFromArray:[self overlayChildrenOfUIElement:uiElement index:index]];
        }
    }
    return [tempOverlayElements copy];
}
For some AXGroups the children are found; for others the array is empty. I cannot figure out why.
Does anyone have an idea what I'm doing wrong?
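One diagnostic that may narrow it down, sketched in Swift for brevity (assumption: the childrenOfUIElement: helper discards the AXError): fetch AXChildren directly and inspect the error, since an empty result can be either a genuine empty group or a failed fetch (e.g. .cannotComplete while the hierarchy is still loading).

import ApplicationServices

func childrenWithError(of element: AXUIElement) -> [AXUIElement] {
    var value: CFTypeRef?
    let error = AXUIElementCopyAttributeValue(element, kAXChildrenAttribute as CFString, &value)
    guard error == .success, let children = value as? [AXUIElement] else {
        // Distinguish "no children" from "fetch failed".
        print("AXChildren fetch failed with AXError \(error.rawValue)")
        return []
    }
    return children
}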
Hello,
I’m in the process of enrolling my business (Carzo Rent A Car, Prishtine, Kosovo) in the Apple Developer Program, but I have been waiting for my D-U-N-S number to be issued.
I submitted the request to Dun & Bradstreet on July 28, 2025 (Case #9142648) and have only received a system-generated email with a tracking ID (#9086421). There has been no further update.
My questions are:
Is there a way for Apple to expedite or provisionally approve my enrollment while the D-U-N-S number is pending?
How long does Apple typically wait for D&B updates before the enrollment is affected?
Are there any alternative steps I can take to avoid further delays?
Thank you for your guidance.
Dear developer team,
After updating to iOS 18.3.1, I noticed the font in the Notes app became too small to read comfortably, and I already have poor eyesight.
There is no way to increase the font size. When I select my preferred text size through Accessibility settings, it only changes the size of headings in the Notes app; the body text of the note itself remains too small. I'm using the iPhone 13.
I searched the issue, and it seems other users across the Internet are also unhappy about the inability to change the text size in Notes to a comfortable level.
I hope this issue will be addressed in the next version of iOS, because the reading size in a standard app matters greatly for those of us with tired or diminished eyesight.
Kind regards,
Maria
I made a (very simple) custom tab bar in SwiftUI. It's simply an HStack containing two buttons. These buttons control the selection of a paged TabView. This works well, but in VoiceOver they don't behave like the bottom tab bar or e.g. a segmented picker. Specifically, VoiceOver does not say something like "tab one of two" when the first button is focused.
According to my research, in UIKit this can be accomplished by giving the container view the accessibility trait tabBar, hiding it as an accessibility element, and giving it the accessibility container type semanticGroup.
In SwiftUI, there is also the trait isTabBar, but that does not seem to have any impact for VoiceOver. I don't see an equivalent of semanticGroup in SwiftUI. I tried accessibilityElement(children: .contain) but that also does not seem to have any impact.
So, is there any way in SwiftUI to make a button behave like a tab-button in VoiceOver? And how is SwiftUI's isTabBar accessibility trait supposed to be used?
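For comparison, here is the workaround I'm currently leaning toward, as a sketch (my own approach, not an official pattern: announcing the position through accessibilityValue and marking selection manually, since .isTabBar alone doesn't seem to produce the "tab, N of M" announcement):

import SwiftUI

struct CustomTabBar: View {
    @Binding var selection: Int
    let titles = ["First", "Second"] // hypothetical tab titles

    var body: some View {
        HStack {
            ForEach(titles.indices, id: \.self) { index in
                Button(titles[index]) { selection = index }
                    // Mark the active tab and announce the position manually.
                    .accessibilityAddTraits(selection == index ? .isSelected : [])
                    .accessibilityValue("Tab \(index + 1) of \(titles.count)")
            }
        }
        .accessibilityElement(children: .contain)
    }
}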