My team is designing an app for retail associates who need to share managed iPads. We keep the iPad in Guided Access mode on our login app until an auth token is obtained; then the iPad is opened for general use. Upon sign-out we need to re-enter Guided Access mode, and we can do this easily via manual sign-out. But with idle sign-out, i.e., after 60 minutes of inactivity, we need to be able to make a call from the background (even in a locked state), sign out the user, clear the passcode, and enter Single App Mode before restarting, so that once the device restarts, the app is in a locked state again until the next user provides credentials that can obtain a new auth token.
We are struggling to see if this is even possible. Our bosses will be displeased if we tell them it isn't, so any tips would be very much appreciated.
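A minimal sketch of the piece that may help here, assuming supervised devices with this app allowed for Autonomous Single App Mode via MDM (the request fails otherwise). Whether iOS honors the call from a backgrounded or locked state is exactly the open question and would need testing:

import UIKit

// Sketch: re-lock the device to this app after sign-out. Requires a
// supervised device and an MDM profile permitting Autonomous Single
// App Mode for this bundle ID; otherwise `succeeded` is false.
func lockToLoginScreen(completion: @escaping (Bool) -> Void) {
    UIAccessibility.requestGuidedAccessSession(enabled: true) { succeeded in
        // succeeded == true: the device is now locked to this app, and
        // the login UI can be shown until the next auth token arrives.
        completion(succeeded)
    }
}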
When my app is in the background, I create a Live Activity through a push notification with a token obtained from pushToStartTokenUpdates, and this process works fine. However, without opening the app, how can I retrieve the new push token for this Live Activity and use it for subsequent updates to the Live Activity content?
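For reference, a sketch of the in-process pattern, assuming an ActivityAttributes type named MyAttributes (illustrative): each activity vends its own update-token stream. Note that these loops only run while the process is alive, which is the crux of the "without opening the app" question:

import ActivityKit

// Sketch: watch for activities started via push-to-start and forward
// each activity's per-activity update token to the server.
func observeActivityPushTokens() {
    Task {
        for await activity in Activity<MyAttributes>.activityUpdates {
            Task {
                for await tokenData in activity.pushTokenUpdates {
                    let token = tokenData.map { String(format: "%02x", $0) }.joined()
                    // Forward `token` to the server that sends content updates.
                    print("update token for \(activity.id): \(token)")
                }
            }
        }
    }
}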
In our application we use a UITableView for data population, and each table view cell contains a button. When Full Keyboard Access is enabled, only the table view cell receives focus, not the button. We need the cell and the button to be focusable separately.
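A minimal sketch of one thing to try, assuming the cell is currently exposed as a single accessibility element: list the label and the button as separate elements so Full Keyboard Access can move focus to each in turn. ActionCell and actionButton are illustrative names, not the poster's code:

import UIKit

final class ActionCell: UITableViewCell {
    let cellTitleLabel = UILabel()
    let actionButton = UIButton(type: .system)

    override init(style: UITableViewCell.CellStyle, reuseIdentifier: String?) {
        super.init(style: style, reuseIdentifier: reuseIdentifier)
        contentView.addSubview(cellTitleLabel)
        contentView.addSubview(actionButton)
        // Don't merge the cell into one element; the order here is the
        // order in which Tab visits the parts.
        isAccessibilityElement = false
        accessibilityElements = [cellTitleLabel, actionButton]
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}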
Say I have a UI element that moves on the screen. Is it possible to update its accessibility frame as it moves while VoiceOver is focused on it? From my tests, VoiceOver ignores UIAccessibilityLayoutChangedNotification if it's sent repeatedly in a short period of time on iOS, while sending NSAccessibilityLayoutChangedNotification on macOS triggers VoiceOver to reannounce the focused element repeatedly.
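A sketch of an alternative to re-posting layout-changed notifications: let the element report its current on-screen frame whenever VoiceOver asks for it, so the frame never goes stale. Whether the VoiceOver cursor visually tracks the element between announcements would still need testing:

import UIKit

final class MovingView: UIView {
    override var accessibilityFrame: CGRect {
        // Computed at query time, in screen coordinates.
        get { UIAccessibility.convertToScreenCoordinates(bounds, in: self) }
        set { super.accessibilityFrame = newValue }
    }
}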
Hello,
AVSpeechSynthesisVoice has an audioFileSettings attribute:
let utterance = AVSpeechUtterance(string: text)
utterance.voice = AVSpeechSynthesisVoice(identifier: voiceSelected!)
print("- voice \(utterance.voice!.audioFileSettings)")
["AVLinearPCMIsBigEndianKey": 0, "AVLinearPCMIsFloatKey": 1, "AVLinearPCMIsNonInterleaved": 1, "AVNumberOfChannelsKey": 1, "AVSampleRateKey": 22050, "AVFormatIDKey": 1819304813, "AVLinearPCMBitDepthKey": 32]
This is declared in
AVSpeechSynthesisVoice {
    ...
    @available(iOS 13.0, *)
    open var audioFileSettings: [String : Any] { get }

    @available(iOS 17.0, *)
    open var voiceTraits: AVSpeechSynthesisVoice.Traits { get }
}
How can we specify the audioFileSettings attributes for an AVSpeechSynthesisProviderVoice? AVSpeechSynthesisProviderVoice has no such field:
AVSpeechSynthesisProviderVoice {
    open var name: String { get }
    open var identifier: String { get }
    open var primaryLanguages: [String] { get }
    open var supportedLanguages: [String] { get }
    open var voiceSize: Int64
    open var version: String
    open var gender: AVSpeechSynthesisVoiceGender
    open var age: Int
}
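If this concerns a speech synthesis provider extension, my understanding (an assumption to verify against the audio unit documentation) is that the output format, i.e. what audioFileSettings describes, is declared by the provider's audio unit rather than by the voice object. A sketch, with MyProviderAudioUnit as an illustrative name:

import AVFAudio

final class MyProviderAudioUnit: AVSpeechSynthesisProviderAudioUnit {
    private var outputBus: AUAudioUnitBus
    private var _outputBusses: AUAudioUnitBusArray!

    override init(componentDescription: AudioComponentDescription,
                  options: AudioComponentInstantiationOptions = []) throws {
        // Mono float PCM at 22,050 Hz, mirroring the settings printed above.
        let format = AVAudioFormat(standardFormatWithSampleRate: 22_050, channels: 1)!
        outputBus = try AUAudioUnitBus(format: format)
        try super.init(componentDescription: componentDescription, options: options)
        _outputBusses = AUAudioUnitBusArray(audioUnit: self, busType: .output, busses: [outputBus])
    }

    override var outputBusses: AUAudioUnitBusArray { _outputBusses }
}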
Regards
While it is possible to scroll content using VoiceOver on macOS, I was not able to find any NSAccessibility APIs related to it (such as accessibilityScroll: on iOS).
In our application we use a search bar in a popover view, with Full Keyboard Access enabled and an external keyboard connected. When focus is on the search bar and the Tab key is pressed, the search bar dismisses, and focus needs to shift to the next UI element.
I’m trying to set the accessibilityActivationPoint directly on a UITableViewCell so that VoiceOver activates a specific button inside the cell. However, this approach doesn’t seem to work.
Instead, when I override the accessibilityActivationPoint property inside the UITableViewCell subclass and return the desired point, it works as expected.
Why doesn’t setting accessibilityActivationPoint directly on the cell work, but overriding it inside the cell does? Is there a recommended approach for handling this scenario?
The following approach works,
override var accessibilityActivationPoint: CGPoint {
    get {
        return convert(toggleSwitch.center, to: nil)
    }
    set {
        super.accessibilityActivationPoint = newValue
    }
}
but setting the activation point directly does not work:
private func configureAccessibility() {
    isAccessibilityElement = true
    accessibilityLabel = titleLabel.text
    accessibilityTraits = .toggleButton
    accessibilityActivationPoint = self.convert(toggleSwitch.center, to: self)
    accessibilityValue = toggleSwitch.accessibilityValue
}
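A plausible explanation: the stored property is read once, at configure time before layout has run, and the value above is in the cell's own coordinate space rather than the screen coordinates accessibilityActivationPoint expects; the override works because it is evaluated at query time. A sketch of a method that could be added to the same cell subclass to keep a stored point valid:

override func layoutSubviews() {
    super.layoutSubviews()
    // accessibilityActivationPoint is expected in screen coordinates;
    // recompute it after every layout pass so it never goes stale.
    let screenRect = UIAccessibility.convertToScreenCoordinates(toggleSwitch.bounds,
                                                                in: toggleSwitch)
    accessibilityActivationPoint = CGPoint(x: screenRect.midX, y: screenRect.midY)
}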
How can I force VoiceOver to read parentheses for math expressions like this:
Text("(2+3)×4") // VoiceOver: Two plus three, times four
I’m looking for a way to have VoiceOver announce parentheses (e.g. “left paren”, “right paren”) without relying on NumberFormatter.Style.spellOut or .speechAlwaysIncludesPunctuation(), as both have drawbacks.
Using .spellOut breaks braille output and Rotor › Characters menu by turning numbers and symbols into words. And .speechAlwaysIncludesPunctuation() makes VoiceOver overly verbose—for example, it reads “21” as “twenty hyphen one.”
Is there a better way to selectively announce specific punctuation like parentheses while keeping numbers and symbols intact for braille and Rotor use?
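A UIKit sketch of one more option to try: scope the punctuation attribute to just the parenthesis characters of an attributed accessibility label, leaving digits and other symbols untouched for braille and the Rotor. Whether SwiftUI's Text can be driven the same way would need verifying:

import UIKit

let expression = "(2+3)×4"
let label = UILabel()
label.text = expression

// Mark only "(" and ")" as punctuation-spoken; all other characters
// keep the default speech, braille, and Rotor behavior.
let attributed = NSMutableAttributedString(string: expression)
let ns = expression as NSString
for i in 0..<ns.length {
    let character = ns.substring(with: NSRange(location: i, length: 1))
    if character == "(" || character == ")" {
        attributed.addAttribute(.accessibilitySpeechPunctuation,
                                value: true,
                                range: NSRange(location: i, length: 1))
    }
}
label.accessibilityAttributedLabel = attributed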
Hi guys, I'm facing an issue with the native interface for adding a card to Wallet. Does anyone have ideas on how to fix or work around it?
STEPS TO REPRODUCE:
Disable VoiceOver (Settings → Accessibility → VoiceOver → Off).
Connect an external keyboard and confirm that you can navigate other iOS interfaces with it.
In any app, present a PKAddPassesViewController with a valid .pkpass file (see the snippet after these steps).
When the Wallet “Add Pass” sheet appears, attempt to navigate using only the external keyboard (Tab/Arrow/Enter).
Observe that focus does not move to the Cancel or Add buttons, and no elements receive keyboard focus.
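A minimal snippet for step 3, assuming a bundled sample.pkpass file (file name and presenter are illustrative):

import PassKit
import UIKit

// Load a pass from the bundle and present the Wallet "Add Pass" sheet.
func presentAddPassSheet(from presenter: UIViewController) {
    guard let url = Bundle.main.url(forResource: "sample", withExtension: "pkpass"),
          let data = try? Data(contentsOf: url),
          let pass = try? PKPass(data: data),
          let sheet = PKAddPassesViewController(pass: pass) else { return }
    presenter.present(sheet, animated: true)
}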
EXPECTED RESULT:
All interactive elements in PKAddPassesViewController (e.g., Cancel and Add) should be fully keyboard accessible without requiring VoiceOver. Users should be able to navigate, select, and complete actions using only a hardware keyboard.
ACTUAL RESULT:
Keyboard navigation is not possible.
No elements receive focus.
Users cannot activate Cancel or Add buttons using keyboard input.
The only way to interact is by touch or enabling VoiceOver, which does not satisfy keyboard accessibility requirements.
IMPACT:
Violates WCAG 2.1 Success Criterion 2.1.1 (Keyboard Accessible).
Prevents keyboard-only users (including users with motor disabilities) from adding passes to Wallet.
Affects users of external keyboards who rely on tab/arrow navigation.
Creates an inconsistent accessibility experience compared to other iOS system modals.
Hello,
I’m in the process of enrolling my business (Carzo Rent A Car, Prishtine, Kosovo) in the Apple Developer Program, but I have been waiting for my D-U-N-S number to be issued.
I submitted the request to Dun & Bradstreet on July 28, 2025 (Case #9142648) and have only received a system-generated email with a tracking ID (#9086421). There has been no further update.
My questions are:
Is there a way for Apple to expedite or provisionally approve my enrollment while the D-U-N-S number is pending?
How long does Apple typically wait for D&B updates before the enrollment is affected?
Are there any alternative steps I can take to avoid further delays?
Thank you for your guidance.
In our application we use OTP login. When Full Keyboard Access is enabled and we enter the OTP in the OTP field, on iOS 17 focus moves to the next text field as expected, but on iOS 18 focus stays on the first OTP field and does not move to the next text field.
I have a UIImageView as the background of a custom UIView subclass. The image itself does not contain any text. On top of this image view, I have added two UILabels.
To improve accessibility, I converted the entire view into a single accessibility element and set a proper accessibilityLabel. Additionally, I disabled accessibility for the UIImageView and the labels by setting isAccessibilityElement = false.
However, when VoiceOver Recognition's Text Recognition feature is enabled, VoiceOver still detects and announces the text inside the UILabels after reading my custom accessibility properties. This text should not be announced.
It seems that VoiceOver treats the UILabel content as part of the UIImageView. Additionally, when using the Explore Image rotor action, the entire subview is recognized as a single image.
Is this the expected behavior? If so, is there a way to disable VoiceOver’s text recognition for this view while keeping custom accessibility intact?
class BackgroundLabelView: UIView {
    private let backgroundImageView = UIImageView()
    private let backgroundImageView2 = UIImageView()
    private let titleLabel = UILabel()
    private let subtitleLabel = UILabel()

    override init(frame: CGRect) {
        super.init(frame: frame)
        setupView()
        configureAccessibility() // called from both initializers
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        setupView()
        configureAccessibility()
    }

    private func configureAccessibility() {
        backgroundImageView.isAccessibilityElement = false
        backgroundImageView2.isAccessibilityElement = false
        titleLabel.isAccessibilityElement = false
        subtitleLabel.isAccessibilityElement = false
        isAccessibilityElement = true
        accessibilityTraits = .button
    }

    func configure(backgroundImage: UIImage?, title: String, subtitle: String) {
        backgroundImageView.image = backgroundImage
        titleLabel.text = title
        subtitleLabel.text = subtitle
        accessibilityLabel = "Holiday Offer, \(title), \(subtitle)"
    }

    private func setupView() {
        backgroundImageView2.contentMode = .scaleAspectFill
        backgroundImageView2.clipsToBounds = true
        backgroundImageView2.translatesAutoresizingMaskIntoConstraints = false
        backgroundImageView2.image = UIImage(resource: .bannerfestival)
        addSubview(backgroundImageView2)

        backgroundImageView.contentMode = .scaleAspectFit
        backgroundImageView.clipsToBounds = true
        backgroundImageView.translatesAutoresizingMaskIntoConstraints = false
        addSubview(backgroundImageView)

        titleLabel.font = UIFont.systemFont(ofSize: 18, weight: .bold)
        titleLabel.textColor = .white
        titleLabel.translatesAutoresizingMaskIntoConstraints = false
        titleLabel.numberOfLines = 0
        addSubview(titleLabel)

        subtitleLabel.font = UIFont.systemFont(ofSize: 14, weight: .regular)
        subtitleLabel.textColor = .white.withAlphaComponent(0.8)
        subtitleLabel.translatesAutoresizingMaskIntoConstraints = false
        subtitleLabel.numberOfLines = 0
        addSubview(subtitleLabel)

        NSLayoutConstraint.activate([
            backgroundImageView2.leadingAnchor.constraint(equalTo: leadingAnchor),
            backgroundImageView2.trailingAnchor.constraint(equalTo: trailingAnchor),
            backgroundImageView2.heightAnchor.constraint(equalToConstant: 200),

            backgroundImageView.centerYAnchor.constraint(equalTo: centerYAnchor),
            backgroundImageView.topAnchor.constraint(equalTo: topAnchor),
            backgroundImageView.leadingAnchor.constraint(greaterThanOrEqualTo: leadingAnchor),
            backgroundImageView.trailingAnchor.constraint(equalTo: trailingAnchor),
            backgroundImageView.bottomAnchor.constraint(equalTo: bottomAnchor),

            titleLabel.leadingAnchor.constraint(equalTo: leadingAnchor, constant: 16),
            titleLabel.trailingAnchor.constraint(lessThanOrEqualTo: centerXAnchor),
            titleLabel.bottomAnchor.constraint(equalTo: centerYAnchor, constant: -4),

            subtitleLabel.leadingAnchor.constraint(equalTo: leadingAnchor, constant: 16),
            subtitleLabel.trailingAnchor.constraint(lessThanOrEqualTo: centerXAnchor),
            subtitleLabel.topAnchor.constraint(equalTo: centerYAnchor, constant: 4)
        ])
    }

    override func layoutSubviews() {
        super.layoutSubviews()
        backgroundImageView.layer.cornerRadius = layer.cornerRadius
    }
}
Description
When calling AppStore.showManageSubscriptions(in:), the system modal for managing subscriptions appears visually. However, it is not automatically focused by VoiceOver, and in some cases, VoiceOver still allows interaction with elements in the underlying view controller, such as buttons and labels. This creates confusion and violates accessibility expectations.
Steps to Reproduce
1. In a UIKit app, present the system subscription sheet via AppStore.showManageSubscriptions(in:).
2. Ensure VoiceOver is enabled on the device.
3. Observe the focus behavior when the modal appears.
4. Try swiping right/left — VoiceOver continues to announce items in the presenting view controller.
Expected Result
The modal should automatically take VoiceOver focus, and all elements behind it should be non-accessible until dismissed.
Actual Result
VoiceOver continues to focus and interact with elements behind the presented modal.
Notes
• Tested on iOS 18.5
• Reproducible on device
• Using Swift/UIKit (not SwiftUI)
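A sketch of a client-side workaround, not a fix for the underlying bug: after asking StoreKit to present the sheet, nudge VoiceOver to re-evaluate the screen. The one-second delay is a heuristic, not a documented contract:

import StoreKit
import UIKit

func showManageSubscriptions(in scene: UIWindowScene) {
    Task { try? await AppStore.showManageSubscriptions(in: scene) }
    // Give the system sheet time to appear, then ask VoiceOver to
    // re-scan the screen so focus lands on the new modal.
    DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
        UIAccessibility.post(notification: .screenChanged, argument: nil)
    }
}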
// Start listening to the microphone
public void StartListening()
{
    if (!isListening)
    {
        try
        {
#if UNITY_IOS || UNITY_TVOS
            microphoneInput = Microphone.Start(null, true, 10, 44100);
#else
            microphoneInput = Microphone.Start(null, true, 10, 16000); // Use 16,000 Hz instead of 44,100
            if (microphoneInput == null)
            {
                microphoneInput = Microphone.Start(null, true, 10, AudioSettings.outputSampleRate);
            }
#endif
            isListening = true;
            Debug.Log(Microphone.devices.Length + " Started listening...");
            debugText.text = Microphone.devices.Length + " - Started listening...";
        }
        catch (System.Exception e)
        {
            Debug.LogError($"Starting microphone failed: {e.Message}");
            debugText.text = $"Starting microphone failed: {e.Message}";
        }
    }
}

void Update()
{
    if (isListening && microphoneInput != null)
    {
        // Analyze the audio for voice activity
        float volume = GetAverageVolume();
        if (volume > detectionThreshold)
        {
            Debug.Log("User is speaking!");
            lastVoiceTime = Time.time;
            SoundDetected = true;
        }
        else if (Time.time - lastVoiceTime > silenceDuration)
        {
            // Silence can only be detected while the volume is below the
            // threshold, so this check lives in the else branch.
            Debug.Log("User is silent.");
            debugText.text = volume.ToString() + " - User is silent.";
        }
        slider.value = volume;
    }
}

private float GetAverageVolume()
{
    float[] samples = new float[128];
    microphoneInput.GetData(samples, Microphone.GetPosition(null));
    float sum = 0f;
    foreach (float sample in samples)
    {
        sum += Mathf.Abs(sample);
    }
    return sum / samples.Length;
}
Problem:
When I build and run the app from Xcode, the microphone works fine, and I receive input. However, when running the app normally (outside of Xcode), I can’t seem to access the microphone. The debug logs indicate no microphone is detected.
Question:
Is there any additional configuration I need to do for the microphone to work in a normal (non-Xcode) run on Vision Pro? Or any common issues that could be causing the microphone access to fail in this scenario?
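One thing worth checking, offered as an assumption rather than a confirmed diagnosis: the exported Xcode project needs an NSMicrophoneUsageDescription entry in its Info.plist (Unity's "Microphone Usage Description" player setting); without it, non-debug builds can end up with no microphone access. A quick Swift-side check of the authorization state:

import AVFoundation

func checkMicrophoneAccess() {
    switch AVCaptureDevice.authorizationStatus(for: .audio) {
    case .notDetermined:
        // Triggers the system prompt; requires the usage-description key.
        AVCaptureDevice.requestAccess(for: .audio) { granted in
            print("Microphone access granted: \(granted)")
        }
    case .denied, .restricted:
        print("Microphone access denied; check Settings and Info.plist")
    case .authorized:
        print("Microphone access authorized")
    @unknown default:
        break
    }
}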
Thanks in advance for any insights!
Best,
Siddharth
I have an app that uses Nearby Interaction with a custom accessory.
It works great on iPhone 11-13; starting with iPhone 14, one must use ARKit to get angles.
We have two problems:
ARKit is light sensitive, and we do not control the lighting where this app would run. The iPhone 11-13 behavior works great even in the dark (our users are blind; this is an accessibility app).
ARKit wants to be in the foreground, but our users cannot see it, and we have a voice-oriented UI that provides navigation instructions. If ARKit is in the foreground, our app doesn't work.
With an iPhone 15 Pro Max on iOS 18, I got an error: access denied (not permission denied). Now that I am on iOS 26, the Bluetooth scan doesn't happen.
It also fails the same way on an iPhone 17 on iOS 26; I can't get the callback now that release signing is no longer done.
This same code works fine on iOS 17.1 on an iPhone 12.
Info.plist here
info.txt
if SearchedServices.isEmpty {
    services = [TransferService.serviceUUID, QorvoNIService.serviceUUID]
}
logger.info(
    "scannerready, starting scan for peripherals \(services) and devices \(IDs)")
filteredIDs = IDs
scanning = true
centralManager.scanForPeripherals(withServices: services,
                                  options: [CBCentralManagerScanOptionAllowDuplicatesKey: true])
the calling code
dataChannel.autoConnect = autoConnect
dataChannel.start(x, ids) // dataChannel.start is above
self.scanning = true
return "scanning started"
... log output
services from js = and devices= 5FE04CBB
services in implementation =
bluetooth ready, starting scan for peripherals [] and devices ["5FE04CBB"]
scannerready, starting scan for peripherals [6E400001-B5A3-F393-E0A9-E50E24DCCA9E, 2E938FD0-6A61-11ED-A1EB-0242AC120002] and devices ["5FE04CBB"]
⚡️ TO JS {"value":"scanning started"}
Hello,
In our app we provide a button that initiates a phone call using tel://.
For normal numbers, tapping the button presents the standard iOS confirmation sheet with Call and Cancel.
If RTT is enabled on the device, the sheet instead shows three options: Call, Cancel, and RTT Call.
However, when dialing a national emergency number, this confirmation dialog does not appear at all — the call is placed immediately, without giving the user the choice between voice or RTT.
Is this the expected system behavior for emergency numbers on iOS?
And if so, how does RTT get applied in the emergency-call flow — is it managed entirely by the OS rather than exposed as a user-facing option?
Thanks in advance for clarifying.
I would like to enable the option to resize windows with the Apple Pencil Pro. I tried, but I see that this feature is not enabled.
I made a (very simple) custom tab bar in SwiftUI. It's simply an HStack containing two buttons. These buttons control the selection of a paged TabView. This works well, but in VoiceOver they don't behave like the bottom tab bar or e.g. a segmented picker. Specifically, VoiceOver does not say something like "tab one of two" when the first button is focused.
According to my research, in UIKit this can be accomplished by giving the container view the accessibility trait tabBar, hiding it as an accessibility element and give it the accessibility container type semanticGroup.
In SwiftUI, there is also the trait isTabBar, but that does not seem to have any impact for VoiceOver. I don't see an equivalent of semanticGroup in SwiftUI. I tried accessibilityElement(children: .contain) but that also does not seem to have any impact.
So, is there any way in SwiftUI to make a button behave like a tab-button in VoiceOver? And how is SwiftUI's isTabBar accessibility trait supposed to be used?
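A sketch of a workaround, under the assumption that .isTabBar alone won't produce the UIKit-style "tab, one of two" announcement: spell the position out with explicit traits and values. CustomTabBar and titles are illustrative:

import SwiftUI

struct CustomTabBar: View {
    @Binding var selection: Int
    let titles = ["First", "Second"]

    var body: some View {
        HStack {
            ForEach(titles.indices, id: \.self) { index in
                Button(titles[index]) { selection = index }
                    // Mirror what a real tab reports: selection state
                    // plus an explicit "n of m" position.
                    .accessibilityAddTraits(selection == index ? [.isSelected] : [])
                    .accessibilityValue("tab \(index + 1) of \(titles.count)")
            }
        }
        .accessibilityElement(children: .contain)
        .accessibilityLabel("Tab bar")
    }
}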
After replacing macOS Big Sur 11.0 with the latest 11.5, my app's AXObserverAddNotification method fails. Here is sample code I tested from Stack Overflow: https://stackoverflow.com/questions/853833/how-can-my-app-detect-a-change-to-another-apps-window
AXUIElementRef app = AXUIElementCreateApplication(82695); // the pid for the front-running Xcode 12.5.1
CFTypeRef frontWindow = NULL;
AXError err = AXUIElementCopyAttributeValue(app, kAXFocusedWindowAttribute, &frontWindow);
if (err != kAXErrorSuccess) {
    NSLog(@"failed with error: %i", err);
}
NSLog(@"app: %@ frontWindow: %@", app, frontWindow);
'frontWindow' reference is never created and I get the error number -25204. It seems like the latest Big Sur 11.5 has revised the Accessibility API or perhaps there is some permission switch I am unaware of that would make things work. What am I doing wrong?
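For what it's worth, -25204 is kAXErrorCannotComplete, which often points at a lost Accessibility (TCC) grant rather than an API change; updates and re-signed builds can invalidate a previous grant. A sketch that checks the trust state and re-prompts:

import ApplicationServices

// Ask whether this process is trusted for Accessibility, prompting the
// user if it isn't.
let promptKey = kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String
let options = [promptKey: true] as CFDictionary
if AXIsProcessTrustedWithOptions(options) {
    print("Accessibility access granted")
} else {
    print("Grant access under System Preferences > Security & Privacy > Privacy > Accessibility, then relaunch")
}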