Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.

Posts under the Accessibility & Inclusion topic

Crash when setting accessibilityElements to a custom UIAccessibilityElement
I have a UIControl, and I want to make it behave like a custom UIAccessibilityElement:

```objc
UIControl *control = [[UIControl alloc] init];
control.isAccessibilityElement = NO;

CustomAccessibilityElement *elem = [[CustomAccessibilityElement alloc] initWithAccessibilityContainer:control];
elem.isAccessibilityElement = YES;
// some custom setting here

control.accessibilityElements = @[elem];
```

This worked well on an iPhone 13 (iOS 15.5.1), but on an iPhone SE (iOS 15.4.1) it crashed with this message:

```
-[UIAccessibilityElement _addAccessibilityElementsAndOrderedContainersWithOptions:toCollection:]: unrecognized selector sent to instance 0x283b7c680
```

Can you tell me the reason? Thanks a lot.
1 reply · 0 boosts · 459 views · Oct ’24
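For comparison, here is a minimal Swift sketch of the custom-element container pattern, recomputing the element's frame each time the elements are queried. The class name and label are hypothetical, and this shows the general pattern rather than a confirmed fix for the iOS 15.4 crash:

```swift
import UIKit

final class ControlWithCustomElement: UIControl {
    // Vend one custom element lazily instead of assigning the array eagerly.
    private lazy var element: UIAccessibilityElement = {
        let e = UIAccessibilityElement(accessibilityContainer: self)
        e.isAccessibilityElement = true
        e.accessibilityLabel = "Custom action" // hypothetical label
        e.accessibilityTraits = .button
        return e
    }()

    override var accessibilityElements: [Any]? {
        get {
            // Keep the element's frame in sync with the container.
            element.accessibilityFrameInContainerSpace = bounds
            return [element]
        }
        set { /* vended lazily; ignore external assignment */ }
    }
}
```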
How accessible is enough for Accessibility Nutrition Labels?
My team has a robust digital accessibility program and processes for WCAG conformance in our apps. Because of this, accessibility defects do get caught and addressed in order of impact and business priority, like any other bug. Obviously we want to aim for 100% accessibility for our users, but it's a continual work in progress as new enhancements or changes are released.

I'm stuck on the appropriate measurement to indicate support. If we have 50 common tasks and the most central 10 tasks are solid, but some supporting (yet also common) tasks have a contrast failure or a missing accessibility label, does that make the whole app not support the feature? If "completing the task" is the rubric, there is a whole range of interpretations for that.

In a complex app, I anticipate that a group like ours will have strong support for many of the Accessibility Nutrition Labels features across tasks and devices, but realistically never be 100% free of defects for a given Apple accessibility feature, even among core tasks. As I consider the next steps for Nutrition Labels, I do not see anything in the documentation that gives a baseline or measurement for inclusion. We plan to test all steps needed to complete a task, and log defects accordingly with an assigned timeline for fixing them (as would be true for functional defects).
2 replies · 0 boosts · 83 views · Aug ’25
Microphone Not Working When Running Unity Vision Pro App Normally
```csharp
// Start listening to the microphone
public void StartListening()
{
    if (!isListening)
    {
        try
        {
#if UNITY_IOS || UNITY_TVOS
            microphoneInput = Microphone.Start(null, true, 10, 44100);
#else
            microphoneInput = Microphone.Start(null, true, 10, 16000); // use 16,000 Hz instead of 44,100
            if (microphoneInput == null)
            {
                microphoneInput = Microphone.Start(null, true, 10, AudioSettings.outputSampleRate);
            }
#endif
            isListening = true;
            Debug.Log(Microphone.devices.Length + " Started listening...");
            debugText.text = Microphone.devices.Length + "- Started listening...";
        }
        catch (System.Exception e)
        {
            Debug.LogError($"Starting microphone failed: {e.Message}");
            debugText.text = $"Starting microphone failed: {e.Message}";
        }
    }
}

void Update()
{
    if (isListening && microphoneInput != null)
    {
        // Analyze the audio for voice activity
        float volume = GetAverageVolume();
        if (volume > detectionThreshold)
        {
            Debug.Log("User is speaking!");
            lastVoiceTime = Time.time;
            SoundDetected = true;
        }
        else if (Time.time - lastVoiceTime > silenceDuration)
        {
            Debug.Log("User is silent.");
            debugText.text = volume.ToString() + " - User is silent.";
        }
        slider.value = volume;
    }
}

private float GetAverageVolume()
{
    float[] samples = new float[128];
    microphoneInput.GetData(samples, Microphone.GetPosition(null));
    float sum = 0f;
    foreach (float sample in samples)
    {
        sum += Mathf.Abs(sample);
    }
    return sum / samples.Length;
}
```

Problem: When I build and run the app from Xcode, the microphone works fine and I receive input. However, when running the app normally (outside of Xcode), I can't seem to access the microphone. The debug logs indicate no microphone is detected.

Question: Is there any additional configuration I need for the microphone to work in a normal (non-Xcode) run on Vision Pro? Or any common issues that could be causing microphone access to fail in this scenario?

Thanks in advance for any insights! Best, Siddharth
2 replies · 0 boosts · 389 views · Feb ’25
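One thing worth ruling out (an assumption about the cause, not a confirmed diagnosis): outside of Xcode the app must hold microphone permission, which requires an NSMicrophoneUsageDescription entry in Info.plist plus a granted record permission. A minimal Swift sketch of checking this on the native side:

```swift
import Foundation
import AVFAudio

// Verify microphone permission before Unity starts capturing.
// Without NSMicrophoneUsageDescription in Info.plist the permission
// prompt cannot be shown, and capture silently fails.
func ensureMicrophoneAccess(then start: @escaping () -> Void) {
    let session = AVAudioSession.sharedInstance()
    switch session.recordPermission {
    case .granted:
        start()
    case .undetermined:
        session.requestRecordPermission { granted in
            if granted { DispatchQueue.main.async(execute: start) }
        }
    case .denied:
        print("Microphone access denied; direct the user to Settings.")
    @unknown default:
        break
    }
}
```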
AVSpeechSynthesisProviderVoice audioFileSettings field
Hello, AVSpeechSynthesisVoice has an audioFileSettings attribute:

```swift
let utterance = AVSpeechUtterance(string: text)
utterance.voice = AVSpeechSynthesisVoice(identifier: voiceSelected!)
print("- voice \(utterance.voice!.audioFileSettings)")
// ["AVLinearPCMIsBigEndianKey": 0, "AVLinearPCMIsFloatKey": 1,
//  "AVLinearPCMIsNonInterleaved": 1, "AVNumberOfChannelsKey": 1,
//  "AVSampleRateKey": 22050, "AVFormatIDKey": 1819304813,
//  "AVLinearPCMBitDepthKey": 32]
```

It is declared in AVSpeechSynthesisVoice:

```swift
@available(iOS 13.0, *)
open var audioFileSettings: [String : Any] { get }

@available(iOS 17.0, *)
open var voiceTraits: AVSpeechSynthesisVoice.Traits { get }
```

How can we specify the audioFileSettings attributes for an AVSpeechSynthesisProviderVoice? AVSpeechSynthesisProviderVoice has no such field:

```swift
open class AVSpeechSynthesisProviderVoice {
    open var name: String { get }
    open var identifier: String { get }
    open var primaryLanguages: [String] { get }
    open var supportedLanguages: [String] { get }
    open var voiceSize: Int64
    open var version: String
    open var gender: AVSpeechSynthesisVoiceGender
    open var age: Int
}
```

Regards
2 replies · 0 boosts · 109 views · Mar ’25
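As far as I can tell (an assumption based on AUAudioUnit conventions, not documented behavior for this exact case), a provider voice's output format is governed by the provider audio unit's output bus rather than by AVSpeechSynthesisProviderVoice. A sketch of advertising a 22,050 Hz mono format from an AVSpeechSynthesisProviderAudioUnit subclass:

```swift
import AudioToolbox
import AVFAudio

// Sketch: the audio format comes from the audio unit's output bus,
// mirroring the audioFileSettings values printed above.
final class MyProviderAudioUnit: AVSpeechSynthesisProviderAudioUnit {
    private var outputBusArray: AUAudioUnitBusArray!

    override init(componentDescription: AudioComponentDescription,
                  options: AudioComponentInstantiationOptions = []) throws {
        try super.init(componentDescription: componentDescription, options: options)
        // 22,050 Hz, mono, float PCM.
        let format = AVAudioFormat(standardFormatWithSampleRate: 22_050, channels: 1)!
        let bus = try AUAudioUnitBus(format: format)
        outputBusArray = AUAudioUnitBusArray(audioUnit: self, busType: .output, busses: [bus])
    }

    override var outputBusses: AUAudioUnitBusArray { outputBusArray }
}
```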
Bulgarian Cyrillic alphabet appears to be Russian, but should look Bulgarian
Hello! If you use the Bulgarian keyboard, you get these characters: явертъуиопюасдфгхйклшщзьцжбнмч

This isn't really right for Bulgarian, because т should look like m, д should look like g, and other characters should look like rotated or mirrored Latin characters; e.g., г should look like a backwards s. Compare the Bulgaria Wikipedia page in Bulgarian:

https://bg.m.wikipedia.org/wiki/%D0%91%D1%8A%D0%BB%D0%B3%D0%B0%D1%80%D0%B8%D1%8F

with the Bulgaria Wikipedia page in Russian:

https://ru.m.wikipedia.org/wiki/%D0%91%D0%BE%D0%BB%D0%B3%D0%B0%D1%80%D0%B8%D1%8F

Notice that the letters are different. The iOS Bulgarian font is just Russian Cyrillic, which seems like an unintended bug rather than an intentional stylistic choice.
1 reply · 0 boosts · 307 views · Feb ’25
My application for the COMMUNICATION capability failed. What should I do next?
Hi, I applied for the COMMUNICATION capability, but got a message that I already have the driving task app entitlement. After that I applied one more time, and there has been no reply since. I do not have the com.apple.developer.carplay-communication capability. Does that mean I cannot apply for it? What should I do next to get this capability? Thanks
2 replies · 0 boosts · 1.1k views · Jul ’25
Guided Access Mode From Background
My team is designing an app for retail associates who need to share managed iPads. We keep the device in Guided Access mode on our login app until an auth token is obtained; then the iPad is opened for general use. Upon sign-out we need to re-enter Guided Access mode, and we can do this easily via manual sign-out.

But with idle sign-out, i.e. after 60 minutes of inactivity, we need to be able to make a call from the background (even in a locked state), sign out the user, clear the PIN code, and enter Single App Mode before restarting, so that once the device restarts, the app is in a locked state again until the next user provides credentials that can obtain a new auth token.

We are struggling to see whether this is even possible. Our bosses will be displeased if we tell them it isn't, so anybody with any tips would be very appreciated.
2 replies · 0 boosts · 205 views · Mar ’25
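For reference, the API for entering Single App Mode programmatically is UIAccessibility.requestGuidedAccessSession, which works only on a supervised device where MDM has authorized the app for autonomous Single App Mode. A minimal sketch; whether it can be invoked from a backgrounded or locked state is exactly the open question here:

```swift
import UIKit

// Requires a supervised device with this app allowed for
// autonomous Single App Mode via MDM configuration.
func lockIntoSingleAppMode(completion: @escaping (Bool) -> Void) {
    UIAccessibility.requestGuidedAccessSession(enabled: true) { succeeded in
        completion(succeeded)
    }
}
```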
Accessibility Full Keyboard Access issue
In our application we use a UIAlertController presented as a popover. When accessibility Full Keyboard Access is enabled and we try to dismiss the alert with the Esc key on an external keyboard, nothing happens. We need the UIAlertController to be dismissed when Esc is pressed on an external keyboard.
2 replies · 0 boosts · 523 views · Mar ’25
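A workaround sketch, assuming it is acceptable to handle the key yourself: attach an Esc key command to the presenting controller (HostViewController and the selector name are hypothetical):

```swift
import UIKit

final class HostViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Let Esc on an external keyboard dismiss whatever we presented.
        let esc = UIKeyCommand(input: UIKeyCommand.inputEscape,
                               modifierFlags: [],
                               action: #selector(dismissPresentedAlert))
        addKeyCommand(esc)
    }

    @objc private func dismissPresentedAlert() {
        presentedViewController?.dismiss(animated: true)
    }
}
```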
iOS 18 Siri Shortcut with a phrase does not appear in Shortcuts App any more
I was able to add shortcuts with parameters and use them from the Shortcuts app in iOS 17, although the Siri intent never worked. I upgraded my app and my phone to iOS 18. Now the shortcut only appears in the Shortcuts app if no parameter is added to it. When I try to set a parameter, the shortcut no longer appears in the Shortcuts app.

```swift
struct ShortcutsProvider: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenAppIntent(),
            phrases: [
                "Show \(\.$screen) in \(.applicationName)"
            ],
            shortTitle: "Open",
            systemImageName: "iphone.badge.play"
        )
    }
}

struct OpenAppIntent: AppIntent {
    static var title: LocalizedStringResource = "Show"
    static let description = IntentDescription("Shows a screen.")
    static var openAppWhenRun: Bool = true
    static var authenticationPolicy = IntentAuthenticationPolicy.alwaysAllowed

    @Parameter(title: "screen")
    var screen: String

    @MainActor
    func perform() async throws -> some IntentResult {
        return .result()
    }
}

extension ScreenOption: AppEntity {
    struct OpenAppQuery: EntityQuery {
        @IntentParameterDependency<OpenAppIntent>(\.$screen)
        var openAppIntent

        func entities(for: [ScreenOption.ID]) async throws -> [ScreenOption] {
            return []
        }

        func suggestedEntities() async throws -> [ScreenOption] {
            return []
        }
    }

    var displayRepresentation: DisplayRepresentation {
        .init(stringLiteral: "\(title)")
    }

    static var defaultQuery: OpenAppQuery = OpenAppQuery()
    static var typeDisplayRepresentation: TypeDisplayRepresentation = .init(name: "Screen")
}

extension ScreenOption: EntityIdentifierConvertible {
    static func entityIdentifier(for entityIdentifierString: String) -> ScreenOption? {
        allCases.filter { $0.rawValue == entityIdentifierString }.first
    }

    public var entityIdentifierString: String { rawValue }

    public init?(entityIdentifierString: String) {
        guard let screenOption = ScreenOption.entityIdentifier(for: entityIdentifierString) else {
            return nil
        }
        self = screenOption
    }
}
```
1 reply · 0 boosts · 723 views · Nov ’24
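One thing that may matter here (an assumption based on how parameterized App Shortcut phrases are resolved, not a confirmed diagnosis): Siri and the Shortcuts app need the possible parameter values ahead of time, and both query methods above return empty arrays. A sketch of vending values from the query, assuming ScreenOption is CaseIterable with a String id matching its rawValue:

```swift
// Replacement bodies for the two methods in OpenAppQuery above.
func entities(for identifiers: [ScreenOption.ID]) async throws -> [ScreenOption] {
    ScreenOption.allCases.filter { identifiers.contains($0.id) }
}

func suggestedEntities() async throws -> [ScreenOption] {
    ScreenOption.allCases
}
```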
Does iPhone 15 Pro Use a Single Microphone or Multiple Microphones for Voice and Sound Recognition?
Hello, I have a question regarding the voice and sound recognition features on the iPhone 15 Pro. The iPhone 15 Pro is equipped with four microphones, and I understand that for features like Apple’s sound recognition and when invoking Siri, the microphone(s) must always be active. My question is whether the device uses a single microphone (mono channel) for these functions or if multiple microphones are activated simultaneously. I would appreciate clarification on how the microphones are utilized in sound and voice recognition features. Thank you for your assistance. Best regards.
1 reply · 0 boosts · 686 views · Oct ’24
VoiceOver cursor focus tracking
In some places of our app we use NSAccessibilityElement subclasses to vend extra items to accessibility clients, and we need to know which item has the VoiceOver focus so we can keep track of it. setAccessibilityFocused: does not get called when accessibility clients focus NSAccessibilityElements; it is only called when accessibility clients focus view-based accessibility elements (i.e., when an NSView subclass gets focused).

At the same time, we need to programmatically move VoiceOver focus to those items when something happens. Those accessibility elements inherit from NSObject, so we can't make them first responder.

Is this the expected behavior? What are our options for reacting to the VoiceOver cursor moving around, and for programmatically moving the VoiceOver cursor to a different element?

Here's a sample project that demonstrates the first part of the issue: https://github.com/vendruscolo/apple-rdars/tree/master/DTS12368714%20-%20NSAccessibilityElement%20focus%20tracking

If you run the app, a window shows up containing a button and a red square. If you enable VoiceOver you'll be able to move the cursor over the red square, and a message will be logged. You'll also notice there's an extra element after the red square. That element is available to VoiceOver; however, when it gets focused, no message is logged.
4 replies · 0 boosts · 451 views · Mar ’25
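For the programmatic half, one commonly suggested approach (an assumption, not a confirmed fix for this rdar) is to post a layout-change accessibility notification that names the element which should receive the VoiceOver cursor:

```swift
import AppKit

// Ask VoiceOver to move its cursor to a vended custom element.
// `element` must be one of the NSAccessibilityElement instances
// the app already exposes through its container.
func moveVoiceOverCursor(to element: NSAccessibilityElement, in window: NSWindow) {
    NSAccessibility.post(element: window,
                         notification: .layoutChanged,
                         userInfo: [.uiElements: [element]])
}
```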
cd in Terminal
When I'm in Terminal and I issue the cd command it says no such file or directory. I cut this from the terminal session:

```
Last login: Tue Nov 12 20:10:57 on ttys000

The default interactive shell is now zsh.
To update your account to use zsh, please run `chsh -s /bin/zsh`.
For more details, please visit https://support.apple.com/kb/HT208050.

iMac:~ robertsantovasco$ cd desktop
iMac:desktop robertsantovasco$ cd L1 demo
-bash: cd: L1: No such file or directory
iMac:desktop robertsantovasco$ cd L1
-bash: cd: L1: No such file or directory
iMac:desktop robertsantovasco$ cd L1 demo
-bash: cd: L1: No such file or directory
iMac:desktop robertsantovasco$ cd /
iMac:/ robertsantovasco$ cd desktop
-bash: cd: desktop: No such file or directory
iMac:/ robertsantovasco$ cd L1 demo
-bash: cd: L1: No such file or directory
iMac:/ robertsantovasco$ cd L1 demo
-bash: cd: L1: No such file or directory
iMac:/ robertsantovasco$ cd desktop
-bash: cd: desktop: No such file or directory
iMac:/ robertsantovasco$ cd desktop
-bash: cd: desktop: No such file or directory
iMac:/ robertsantovasco$ CD desktop
/usr/bin/CD: line 4: cd: desktop: No such file
```
5 replies · 0 boosts · 669 views · Nov ’24
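The failing commands suggest the folder name contains a space ("L1 demo"), which the shell splits into two arguments; quoting or escaping fixes that. A small sketch, assuming the folder lives at ~/Desktop/L1 demo:

```
cd ~/Desktop/"L1 demo"   # quote the name so the space isn't treated as a separator
cd ~/Desktop/L1\ demo    # or escape the space instead
```

The `cd desktop` failures from `/` are expected too: the Desktop folder lives under the home directory (~/Desktop), not at the root of the filesystem.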