In some places in our app we use NSAccessibilityElement subclasses to vend extra items to accessibility clients.
We need to know which item has the VoiceOver focus so we can keep track of it.
setAccessibilityFocused: does not get called when accessibility clients focus NSAccessibilityElements. This method is only called when accessibility clients focus view-based accessibility elements (i.e. when an NSView subclass gets focused).
At the same time we need to programmatically move VoiceOver focus to those items when something happens. Those accessibility elements inherit from NSObject so we can't make them first responder.
Is this the expected behavior? What are our options in terms of reacting to VoiceOver cursor moving around? What are our options in terms of programmatically moving the VoiceOver cursor to a different element?
Here's a sample project that demonstrates the first part of the issue: https://github.com/vendruscolo/apple-rdars/tree/master/DTS12368714%20-%20NSAccessibilityElement%20focus%20tracking
If you run the app, a window will show up. It contains a button and a red square. If you enable VoiceOver you'll be able to move the cursor over the red square, and a message will be logged. You'll also notice there's an extra element after the red square. That element is available to VoiceOver; however, when it gets focused, no message is logged.
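For reference, a minimal sketch of the two pieces in question (the names are illustrative, not from the sample project): the override we expected to be called when the element gets focused, and the notification post we use to try to move the VoiceOver cursor programmatically.

import AppKit

final class SquareChildElement: NSAccessibilityElement {
    // Observed to fire for view-based elements, but (as described above) not
    // for NSAccessibilityElement subclasses.
    override func setAccessibilityFocused(_ accessibilityFocused: Bool) {
        super.setAccessibilityFocused(accessibilityFocused)
        NSLog("child element focused: \(accessibilityFocused)")
    }
}

// Attempt to move the VoiceOver cursor to a given element programmatically.
func moveVoiceOverCursor(to element: NSAccessibilityElement) {
    NSAccessibility.post(element: element, notification: .focusedUIElementChanged)
}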
Do we need approval for any entitlement for the Unwanted Communication extension? Currently I do not see my app in Settings under the SMS/Call Reporting extensions.
I have reported this bug on multiple macOS versions and it never gets fixed, so I am posting it here in hopes someone at Apple will see this and fix the issue. I use Voice Control extensively (in fact, it is THE reason I use Macs. I am an amputee, and Voice Control makes life much easier, and there is nothing even close on Windows (or Linux, but Linux is barely usable even for people with two arms)).
Voice Control has a setting to overlay numbers, names, or a grid on the screen to help indicate which item you're referring to. There is an option to have no overlay. However, even when the overlay is set to None, the numbers overlay still appears on screen, even when I haven't triggered anything by voice. If I right-click on the desktop, for example, the numbers appear on the menu.
This bug has been in macOS for as long as I can remember. I really hope someone at Apple can fix this. There are quite a few other bugs I've reported with Feedback Assistant over the years that go unfixed; this is one of the more annoying ones.
Topic: Accessibility & Inclusion
SubTopic: General
We have a requirement to manage the shortcuts and hotkeys in our application, to make them intuitive, and to fully support multiple languages. Our current understanding is that most universal shortcuts and hotkeys on macOS/iOS are expressed using English/Latin characters; when a purely foreign-language physical or virtual keyboard is the input device, we are unclear how the user would invoke such a hotkey.
Where a keyboard has no Latin characters at all, managing shortcuts and hotkeys becomes a rather difficult task. To take a very simple example, the shortcut for printing a page is Command/Control + P. This is a problem on non-English keyboards such as Arabic, where there is no letter P and no equivalent phonetic character either, since the language itself does not have that sound.
Also, when we want a hotkey to be customizable by the user, how would the user express which key combination they want for a given action?
So, based on these conditions, and in order to provide the most comprehensive experience for users in their own language, what does Apple recommend we do here for hotkey/shortcut support in purely non-Latin languages?
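For context, a minimal sketch of the two ways a Command-P shortcut can be matched in an AppKit event handler, which is where the layout question above shows up (the function name is ours, purely illustrative):

import AppKit
import Carbon.HIToolbox // for the kVK_ANSI_* virtual key codes

func handlesPrintShortcut(_ event: NSEvent) -> Bool {
    guard event.modifierFlags.contains(.command) else { return false }

    // 1. Character-based matching depends on the active keyboard layout; on a
    //    layout with no "p" key (e.g. Arabic) this may never match.
    if event.charactersIgnoringModifiers == "p" { return true }

    // 2. Key-code matching follows the physical key position regardless of the
    //    active layout, but the label printed on that key differs per layout.
    return Int(event.keyCode) == kVK_ANSI_P
}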
Topic: Accessibility & Inclusion
SubTopic: General
Tags: InputMethodKit, Internationalization, Shortcuts, Localization
When my app is in the background, I create a Live Activity through a push notification with a token obtained from pushToStartTokenUpdates, and this process works fine. However, without opening the app, how can I retrieve the new push token for this Live Activity and use it for subsequent updates to the Live Activity content?
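For reference, a sketch of how the update token can be read back once the process is running again (the attributes type below is a placeholder, not from the app):

import ActivityKit

// Hypothetical attributes type standing in for the app's own Live Activity attributes.
struct DemoAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable { var message: String }
}

// The next time the app is launched or woken for any reason, the update token
// of an already-running Live Activity can be read and re-sent to the server.
func reportLiveActivityTokens() {
    for activity in Activity<DemoAttributes>.activities {
        if let tokenData = activity.pushToken {
            let token = tokenData.map { String(format: "%02x", $0) }.joined()
            print("Update token for activity \(activity.id): \(token)")
        }
    }
}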
In our application we use a UITableView to populate data, and the table view cell contains a button. When Full Keyboard Access is enabled, only the table view cell receives focus, not the button. We need the cell and the button to be focusable separately.
In our application we use a search bar in a popover view, Full Keyboard Access is enabled, and we are using an external keyboard. When focus is on the search bar and the Tab key is pressed, the search bar is dismissed, and focus then needs to shift to the next UI element.
I get it: "Why don't you just get an Apple Watch?" Regardless, my MacBook Pro M4 doesn't recognize my charging cable in any of its USB-C ports.
The cable works with any other power supply I plug it into, but the MacBook doesn't even register that a cable is connected.
-- Running the Sequoia 15.4 beta, thinking it may have been software related. No change.
-- Settings > Privacy & Security > Accessories > changed between all available options. No change.
-- Option + Apple logo > System Information > Thunderbolt/USB4 > none of the ports show that the cable is connected.
-- Any other USB-C cable is recognized in any of the ports on the MacBook. Just not the cable for the Galaxy Watch.
Again, the cable charges from ANY other USB-C port on ANY other device I connect it to.
Am I missing something? Or is this an intended jab at Samsung from Apple? lol
Topic: Accessibility & Inclusion
SubTopic: General
In our application we use OTP login. With Full Keyboard Access enabled, when entering the OTP into the OTP fields, on iOS 17 focus moves to the next text field as expected, but on iOS 18 focus stays on the first OTP field and does not move to the next text field.
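For reference, a minimal sketch (class and field names are illustrative, not from our app) of manually advancing focus between single-digit OTP fields, since on iOS 18 the system no longer appears to move it for us:

import UIKit

final class OTPViewController: UIViewController, UITextFieldDelegate {
    // Ordered array of the single-digit OTP text fields.
    var otpFields: [UITextField] = []

    func textField(_ textField: UITextField,
                   shouldChangeCharactersIn range: NSRange,
                   replacementString string: String) -> Bool {
        guard string.count == 1 else { return true }
        textField.text = string
        // Move first responder to the next digit field, if any.
        if let index = otpFields.firstIndex(of: textField), index + 1 < otpFields.count {
            otpFields[index + 1].becomeFirstResponder()
        }
        return false
    }
}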
This has been an ongoing issue and continues in Tahoe. When dictating into Gmail in Safari, whole portions of sentences are copied and pasted again, making the text a mess. I have reported this in Feedback for a couple of years, and it has never been resolved.
Topic: Accessibility & Inclusion
SubTopic: General
In our application we use UIAlertController. With Full Keyboard Access enabled, trying to dismiss the alert controller with the Esc key on an external keyboard does not work. We present the alert controller as a popover. We need to dismiss the alert controller with an Esc key press from the external keyboard.
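A sketch of one approach we are trying (the names are ours, and it is not confirmed to work with Full Keyboard Access): attach an Escape key command to the alert controller so an external-keyboard Esc press dismisses it.

import UIKit

extension UIViewController {
    func presentDismissableAlert() {
        let alert = UIAlertController(title: "Title", message: "Message", preferredStyle: .alert)
        alert.addAction(UIAlertAction(title: "OK", style: .default))
        // The alert controller itself is in the responder chain, so the key
        // command's action resolves to dismissPresentedAlert() below.
        alert.addKeyCommand(UIKeyCommand(input: UIKeyCommand.inputEscape,
                                         modifierFlags: [],
                                         action: #selector(dismissPresentedAlert)))
        present(alert, animated: true)
    }

    @objc func dismissPresentedAlert() {
        dismiss(animated: true)
    }
}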
There is an issue with Help Books that started with the release of macOS 14.4. When an app attempts to go directly to a Help Book page, the help viewer opens to the Help Book's main index page rather than the specific page requested. As I investigated the issue, I found that the requested page was actually part of the help viewer's navigation history, and all I had to do was click the Back navigation arrow for the requested page to be displayed. So it seems the requested page is momentarily visited but is then (for whatever reason) quickly replaced by the main index page.
Our app uses the AHGotoPage() API for directly accessing our Help Book's pages. This is the same mechanism/code that our app has used for more than a decade and has never caused us any issues. Everything works fine on macOS 14.3.0 and earlier. I've scoured the documentation and can't find any newer APIs for accessing Help pages. I've also tried various other things (e.g. reworking the code, creating new indexes for the app's Help, etc.), but none of it seems to make a difference. As far as I can tell, the issue seems to stem from some change made to the OS.
So my questions are:
Is this a known bug? And if so, is there any ETA on a fix?
Is there something different we should be doing for newer versions of the OS (create indexes differently, use a different API, etc.)?
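For the second question, one AppKit-level call we could compare against AHGotoPage() is sketched below; the anchor name is a placeholder for one of our Help Book anchors, and passing nil for the book searches the app's registered Help Book:

import AppKit

// Jump to a specific anchor in the app's Help Book via AppKit instead of the
// older Apple Help function.
NSHelpManager.shared.openHelpAnchor("requested-page-anchor", inBook: nil)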
Topic: Accessibility & Inclusion
SubTopic: General
Hello everyone,
I'd like to report an issue I've encountered when using a Bluetooth mouse together with AssistiveTouch on an iPhone running iOS 16.5.
This has also been reported via Feedback Assistant (Feedback ID: FB17806167).
Description:
When using a Bluetooth mouse together with AssistiveTouch on iPhone (iOS), the pointer behaves incorrectly in landscape orientation.
Specifically:
The pointer cannot move past the center of the screen
Horizontal and vertical (X/Y) movements appear to be swapped or misaligned
Natural movement of the pointer is not possible
It seems as if the internal coordinate mapping remains locked in portrait orientation, even when the device is physically rotated to landscape.
This issue occurs system-wide, regardless of the current app. It is observable in Settings, on the Home screen, and in third-party apps.
Steps to Reproduce:
Enable AssistiveTouch
Connect a Bluetooth mouse to the iPhone
Rotate the device to landscape orientation
Try moving the mouse pointer across the screen
→ Notice that:
Pointer cannot move past the center
Horizontal/vertical input is interpreted incorrectly (as if still in portrait)
Expected Behavior:
The mouse pointer should move across the entire screen correctly, regardless of device orientation.
Actual Behavior:
In landscape orientation, the pointer is either restricted to part of the screen or misaligned.
It behaves as if the device is still in portrait.
Horizontal mouse movement causes vertical pointer movement, and vice versa
User experience feels broken and unintuitive
Feature Suggestion:
Please improve the synchronization between physical device orientation and AssistiveTouch pointer mapping on iOS.
I also suggest exposing AssistiveTouch orientation control via a public API, so developers can help maintain consistent pointer behavior.
Thanks in advance for any insights or suggestions.
Best regards,
Jannis
I've developed the Pro Talkie app, a walkie-talkie solution designed to keep you connected with family and friends.
App Store: https://apps.apple.com/in/app/pro-talkie/id6742051063
Play Store: https://play.google.com/store/apps/details?id=com.protalkie.app
While the app works flawlessly on Android and in the foreground on iOS, I’m facing issues with establishing connections when the app is in the background or terminated on iOS.
Specifically, I’ve attempted the following:
Silent pushes and alert payloads: These are intended to wake the app in the background, but they often fail—notifications may not be received or can be delayed by 20–30 minutes, leading to a poor user experience.
VoIP pushes: These reliably wake the app, but they trigger the incoming call UI, which isn’t suitable for a walkie-talkie app that should connect directly without displaying a call screen.
I’ve enabled all the necessary background modes (audio, remote notifications, VoIP, background fetch, processing), but the challenge remains.
How can I ensure a consistent background connection on iOS without triggering the call UI?
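For context, a sketch of the silent-push path mentioned above, assuming the payload carries "content-available": 1 and no alert or sound keys (the connection helper is a hypothetical stand-in for our own logic, and delivery timing remains at the system's discretion, which matches the delays we are seeing):

import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {
    // Silent pushes land here when the system chooses to wake the app.
    func application(_ application: UIApplication,
                     didReceiveRemoteNotification userInfo: [AnyHashable: Any],
                     fetchCompletionHandler completionHandler: @escaping (UIBackgroundFetchResult) -> Void) {
        connectToChannelIfNeeded { succeeded in
            completionHandler(succeeded ? .newData : .failed)
        }
    }

    // Hypothetical helper standing in for the app's own connection logic.
    private func connectToChannelIfNeeded(_ completion: @escaping (Bool) -> Void) {
        completion(true)
    }
}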
Topic: Accessibility & Inclusion
SubTopic: General
Tags: APNS, User Notifications, PushKit, Push To Talk
Hello,
AVSpeechSynthesisVoice has an audioFileSettings attribute:
let utterance = AVSpeechUtterance(string: text)
utterance.voice = AVSpeechSynthesisVoice(identifier: voiceSelected!)
print("- voice \(utterance.voice!.audioFileSettings)")
["AVLinearPCMIsBigEndianKey": 0, "AVLinearPCMIsFloatKey": 1, "AVLinearPCMIsNonInterleaved": 1, "AVNumberOfChannelsKey": 1, "AVSampleRateKey": 22050, "AVFormatIDKey": 1819304813, "AVLinearPCMBitDepthKey": 32]
This is declared in:
AVSpeechSynthesisVoice {
    ...
    @available(iOS 13.0, *)
    open var audioFileSettings: [String : Any] { get }

    @available(iOS 17.0, *)
    open var voiceTraits: AVSpeechSynthesisVoice.Traits { get }
}
How can we specify the audioFileSettings attributes for an AVSpeechSynthesisProviderVoice?
There is no such field in AVSpeechSynthesisProviderVoice:
AVSpeechSynthesisProviderVoice {
    open var name: String { get }
    open var identifier: String { get }
    open var primaryLanguages: [String] { get }
    open var supportedLanguages: [String] { get }
    open var voiceSize: Int64
    open var version: String
    open var gender: AVSpeechSynthesisVoiceGender
    open var age: Int
}
Regards
Topic: Accessibility & Inclusion
SubTopic: General
My team is designing an app for retail associates who need to share managed iPads. We keep the device locked into our login app with Guided Access until an auth token is obtained; then the iPad is opened for general use. Upon sign-out we need to re-enter Guided Access mode, and we can do this easily via manual sign-out. But with idle sign-out, i.e. after 60 minutes of inactivity, we need to be able to make a call from the background (even in a locked state), sign out the user, clear the PIN code, and enter Single App Mode before restarting. That way, hopefully, once the device restarts, the app is in a locked state again until the next user provides credentials that can obtain a new auth token.
We are struggling to see if this is even possible. Our bosses will be displeased if we tell them it isn't. So anybody with any tips would be very appreciated.
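For reference, a sketch of the call we would need to make from our idle-timeout handler. It only succeeds on supervised devices where MDM has authorized the app for autonomous Single App Mode, and it is unclear to us whether it can run from the background or while the device is locked:

import UIKit

func reenterSingleAppMode() {
    // Request autonomous Single App Mode (the programmatic form of Guided Access).
    UIAccessibility.requestGuidedAccessSession(enabled: true) { succeeded in
        print("Re-entered Single App Mode: \(succeeded)")
    }
}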
I'm working on a BLE-connected device that uses ANCS and the system Clock app to receive alarm notification events for hearing-impaired people. It works up to the iPhone 13 with the latest iOS 18.x. From the iPhone 14 onward (iOS 18.x), the system Clock alarm notification is not sent anymore.
Is there any reason for this to happen?
Is anyone aware of this behaviour?
Any suggestion would be really appreciated.
Cheers
}

// Start listening to the microphone
public void StartListening()
{
    if (!isListening)
    {
        try
        {
#if UNITY_IOS || UNITY_TVOS
            microphoneInput = Microphone.Start(null, true, 10, 44100);
#else
            microphoneInput = Microphone.Start(null, true, 10, 16000); // Use 16,000 Hz instead of 44,100
            if (microphoneInput == null)
            {
                microphoneInput = Microphone.Start(null, true, 10, AudioSettings.outputSampleRate);
            }
#endif
            isListening = true;
            Debug.Log(Microphone.devices.Length + " - Started listening...");
            debugText.text = Microphone.devices.Length + " - Started listening...";
        }
        catch (System.Exception e)
        {
            Debug.LogError($"Starting microphone failed: {e.Message}");
            debugText.text = $"Starting microphone failed: {e.Message}";
        }
    }
}
void Update()
{
    if (isListening && microphoneInput != null)
    {
        // Analyze the audio for voice activity
        float volume = GetAverageVolume();
        if (volume > detectionThreshold)
        {
            Debug.Log("User is speaking!");
            lastVoiceTime = Time.time;
            SoundDetected = true;
        }
        else if (Time.time - lastVoiceTime > silenceDuration)
        {
            // No audio above the threshold for longer than silenceDuration
            Debug.Log("User is silent.");
            debugText.text = volume.ToString() + " - User is silent.";
        }
        slider.value = volume;
    }
}
private float GetAverageVolume()
{
    float[] samples = new float[128];
    // Read the most recent samples; clamp the offset so it never runs past the clip.
    int position = Microphone.GetPosition(null) - samples.Length;
    if (position < 0) position = 0;
    microphoneInput.GetData(samples, position);

    float sum = 0f;
    foreach (float sample in samples)
    {
        sum += Mathf.Abs(sample);
    }
    return sum / samples.Length;
}
Problem:
When I build and run the app from Xcode, the microphone works fine, and I receive input. However, when running the app normally (outside of Xcode), I can’t seem to access the microphone. The debug logs indicate no microphone is detected.
Question:
Is there any additional configuration I need to do for the microphone to work in a normal (non-Xcode) run on Vision Pro? Or any common issues that could be causing the microphone access to fail in this scenario?
Thanks in advance for any insights!
Best,
Siddharth
SwiftUI provides the accessibilityCustomContent(_:_:) modifier to add additional accessibility information for an element. However, I couldn’t find a similar approach in UIKit.
Is there a way to achieve this in UIKit?
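For reference, a sketch of the Accessibility-framework counterpart (iOS 14+) that appears to cover this on the UIKit side: conform a view or cell to AXCustomContentProvider and vend AXCustomContent values, which play a role similar to accessibilityCustomContent(_:_:) (the cell and values below are illustrative):

import UIKit
import Accessibility

final class ProfileCell: UITableViewCell, AXCustomContentProvider {
    // Additional accessibility details VoiceOver can read on demand.
    var accessibilityCustomContent: [AXCustomContent]! {
        get { [AXCustomContent(label: "Age", value: "23")] }
        set { }
    }
}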
Hello
So if you use the Bulgarian keyboard, you get these characters:
явертъуиопюасдфгхйклшщзьцжбнмч
This isn’t really right for Bulgaria, because т should look like m, and д should look like g, and other characters should look like rotated or mirrored Latin characters. E.g., г should look like a backwards s.
Compare the Bulgaria Wikipedia page in Bulgarian: https://bg.m.wikipedia.org/wiki/%D0%91%D1%8A%D0%BB%D0%B3%D0%B0%D1%80%D0%B8%D1%8F
with the Bulgaria Wikipedia page in Russian: https://ru.m.wikipedia.org/wiki/%D0%91%D0%BE%D0%BB%D0%B3%D0%B0%D1%80%D0%B8%D1%8F
Notice that the letters are different.
Anyhow, the iOS Bulgarian font is just Russian Cyrillic, and that seems like an unintended bug rather than an intentional stylistic choice.
Topic: Accessibility & Inclusion
SubTopic: General