Explore best practices for creating inclusive apps that cater to users with diverse abilities

Posts under General subtopic

Post | Replies | Boosts | Views | Activity

cd on terminal
When I'm in Terminal and I issue the cd command, it says "no such file or directory." I cut this from the terminal session:

Last login: Tue Nov 12 20:10:57 on ttys000
The default interactive shell is now zsh.
To update your account to use zsh, please run chsh -s /bin/zsh.
For more details, please visit https://support.apple.com/kb/HT208050.
iMac:~ robertsantovasco$ cd desktop
iMac:desktop robertsantovasco$ cd L1 demo
-bash: cd: L1: No such file or directory
iMac:desktop robertsantovasco$ cd L1
-bash: cd: L1: No such file or directory
iMac:desktop robertsantovasco$ cd L1 demo
-bash: cd: L1: No such file or directory
iMac:desktop robertsantovasco$ cd /
iMac:/ robertsantovasco$ cd desktop
-bash: cd: desktop: No such file or directory
iMac:/ robertsantovasco$ cd L1 demo
-bash: cd: L1: No such file or directory
iMac:/ robertsantovasco$ cd L1 demo
-bash: cd: L1: No such file or directory
iMac:/ robertsantovasco$ cd desktop
-bash: cd: desktop: No such file or directory
iMac:/ robertsantovasco$ cd desktop
-bash: cd: desktop: No such file or directory
iMac:/ robertsantovasco$ CD desktop
/usr/bin/CD: line 4: cd: desktop: No such file
5
0
675
Nov ’24
PerformAccessibilityAudit and sufficientElementDescription clarification
Hi, I am writing in the hope of receiving some clarification about the rationale of the audit type sufficientElementDescription, in the context of the Accessibility Audit API. Please see my test below: And another example in the context of Xcode, where the strings visible in the UI are also set as the accessible labels of their respective elements. Thanks for your help! [A minimal audit-test sketch follows this entry.]
1
0
476
Jan ’25
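For readers landing on this entry, a minimal sketch of how this audit type is invoked from a UI test may help frame the question. The test class and the issue handling below are illustrative assumptions; only performAccessibilityAudit(for:) and the sufficientElementDescription audit type come from the post.

import XCTest

final class ElementDescriptionAuditTests: XCTestCase {
    func testSufficientElementDescriptionAudit() throws {
        let app = XCUIApplication()
        app.launch()

        // Run only the sufficientElementDescription audit and log each issue
        // instead of failing the test; returning true marks an issue as handled,
        // which is useful while investigating the audit's rationale.
        try app.performAccessibilityAudit(for: .sufficientElementDescription) { issue in
            print("Audit issue: \(issue.compactDescription)")
            return true
        }
    }
}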
Speak Selection broken with SwiftUI text
I have users who need to be able to hear the content of SwiftUI Text views. I have specified the .textSelection(.enabled) modifier for the text views. Adding this modifier causes a "Copy" option to appear on long press, but it doesn't enable visible selection of text, nor does it provide the "Speak" menu item that UIKit offers on text selection. Is the Speak Selection accessibility feature broken for SwiftUI Text views? I've found that another accessibility feature does work (enabling the Speech Controller button for Speak Screen). Do I need to tell my users that Apple is deprecating the Speak Selection accessibility feature and that they need to use the Speech Controller instead? Or is there something else I can do in my SwiftUI code to get that feature to work? [A sketch of one possible stopgap follows this entry.]
1
0
216
Jul ’25
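Not an answer to whether Speak Selection is broken, but a sketch of the setup described plus one possible stopgap: exposing an explicit "Speak" accessibility action that feeds the string to AVSpeechSynthesizer. The fallback is an assumption of mine, not the system Speak Selection feature.

import SwiftUI
import AVFoundation

struct SpeakableText: View {
    let content: String
    // Keep the synthesizer alive across renders; a real app would own it centrally.
    @State private var synthesizer = AVSpeechSynthesizer()

    var body: some View {
        Text(content)
            .textSelection(.enabled) // yields long-press Copy, as described in the post
            .accessibilityAction(named: "Speak") {
                // Stopgap: speak the string directly instead of relying
                // on the Speak Selection menu item.
                synthesizer.speak(AVSpeechUtterance(string: content))
            }
    }
}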
I cannot pair my PHONAK hearing aids
I cannot pair my PHONAK hearing aids after I upgraded to iOS 18.3. Under Hearing Devices it just keeps on searching and will not find anything. I've deleted them and re-added them four times, and I've uninstalled the app. Under Bluetooth it finds them and lists them, but under Hearing Devices, where I can control everything, it will not. Once I add the hearing aids, my phone goes completely silent, and I do not know if anyone calls or texts me.
4
0
479
Jan ’25
iOS 18 Siri Shortcut with a phrase does not appear in Shortcuts App any more
I was able to add shortcuts with parameters and use them from the Shortcuts app in iOS 17, although the Siri intent never worked. I upgraded my app and my phone to iOS 18. Now the shortcut only appears in the Shortcuts app if no parameter is added to it. When I try to set a parameter, the shortcut no longer appears in the Shortcuts app.

struct ShortcutsProvider: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenAppIntent(),
            phrases: [
                "Show \(\.$screen) in \(.applicationName)"
            ],
            shortTitle: "Open",
            systemImageName: "iphone.badge.play"
        )
    }
}

struct OpenAppIntent: AppIntent {
    static var title: LocalizedStringResource = "Show"
    static let description = IntentDescription("Shows a screen.")
    static var openAppWhenRun: Bool = true
    static var authenticationPolicy = IntentAuthenticationPolicy.alwaysAllowed

    @Parameter(title: "screen")
    var screen: String

    @MainActor
    func perform() async throws -> some IntentResult {
        return .result()
    }
}

extension ScreenOption: AppEntity {
    struct OpenAppQuery: EntityQuery {
        @IntentParameterDependency<OpenAppIntent>(\.$screen)
        var openAppIntent

        func entities(for: [ScreenOption.ID]) async throws -> [ScreenOption] {
            return []
        }

        func suggestedEntities() async throws -> [ScreenOption] {
            return []
        }
    }

    var displayRepresentation: DisplayRepresentation {
        .init(stringLiteral: "\(title)")
    }

    static var defaultQuery: OpenAppQuery = OpenAppQuery()
    static var typeDisplayRepresentation: TypeDisplayRepresentation = .init(name: "Screen")
}

extension ScreenOption: EntityIdentifierConvertible {
    static func entityIdentifier(for entityIdentifierString: String) -> ScreenOption? {
        allCases.filter { $0.rawValue == entityIdentifierString }.first
    }

    public var entityIdentifierString: String { rawValue }

    public init?(entityIdentifierString: String) {
        guard let screenOption = ScreenOption.entityIdentifier(for: entityIdentifierString) else {
            return nil
        }
        self = screenOption
    }
}
1
0
735
Nov ’24
Microphone Not Working When Running Unity Vision Pro App Normally
// Start listening to the microphone
public void StartListening()
{
    if (!isListening)
    {
        try
        {
#if UNITY_IOS || UNITY_TVOS
            microphoneInput = Microphone.Start(null, true, 10, 44100);
#else
            microphoneInput = Microphone.Start(null, true, 10, 16000); // Use 16,000 Hz instead of 44,100
            if (microphoneInput == null)
            {
                microphoneInput = Microphone.Start(null, true, 10, AudioSettings.outputSampleRate);
            }
#endif
            isListening = true;
            Debug.Log(Microphone.devices.Length + " Started listening...");
            debugText.text = Microphone.devices.Length + "- Started listening...";
        }
        catch (System.Exception e)
        {
            Debug.LogError($"Starting microphone failed: {e.Message}");
            debugText.text = $"Starting microphone failed: {e.Message}";
        }
    }
}

void Update()
{
    if (isListening && microphoneInput != null)
    {
        // Analyze the audio for voice activity
        float volume = GetAverageVolume();
        if (volume > detectionThreshold)
        {
            Debug.Log("User is speaking!");
            lastVoiceTime = Time.time;
            SoundDetected = true;
            if (Time.time - lastVoiceTime > silenceDuration)
            {
                Debug.Log("User is silent.");
                debugText.text = volume.ToString() + " - User is silent.";
            }
            slider.value = volume;
        }
    }
}

private float GetAverageVolume()
{
    float[] samples = new float[128];
    microphoneInput.GetData(samples, Microphone.GetPosition(null));
    float sum = 0f;
    foreach (float sample in samples)
    {
        sum += Mathf.Abs(sample);
    }
    return sum / samples.Length;
}

Problem: When I build and run the app from Xcode, the microphone works fine and I receive input. However, when running the app normally (outside of Xcode), I can't seem to access the microphone. The debug logs indicate no microphone is detected.

Question: Is there any additional configuration I need for the microphone to work in a normal (non-Xcode) run on Vision Pro? Or are there any common issues that could cause microphone access to fail in this scenario? [A hedged configuration sketch follows this entry.]

Thanks in advance for any insights! Best, Siddharth
2
0
397
Feb ’25
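One assumption worth ruling out (a guess, not a confirmed cause): microphone access on visionOS requires an NSMicrophoneUsageDescription entry in the app's Info.plist, and it is worth verifying that the key survives the Unity build pipeline when the app is not launched through Xcode. A minimal sketch of the entry, with hypothetical wording:

<!-- Info.plist (sketch): usage description required for microphone access -->
<key>NSMicrophoneUsageDescription</key>
<string>This app listens for voice activity while you speak.</string>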
MacOS Sequoia support for VoiceOver AppleScript automation
We are unable to programmatically enable AppleScript automation for VoiceOver on macOS 15 (Sequoia). In macOS 15, Apple moved the VoiceOver configuration from:

~/Library/Preferences/com.apple.VoiceOver4/default.plist

to a sandboxed path:

~/Library/Group Containers/group.com.apple.VoiceOver/Library/Preferences/com.apple.VoiceOver4/default.plist

Steps to Reproduce:

1. Use a macOS 15 (ARM64) machine (or a GitHub Actions runner image with macOS 15 ARM).
2. Open VoiceOver: open /System/Library/CoreServices/VoiceOver.app
3. Set the SCREnableAppleScript flag to true in the new sandboxed .plist: plutil -replace SCREnableAppleScript -bool true ~/Library/Group\ Containers/group.com.apple.VoiceOver/Library/Preferences/com.apple.VoiceOver4/default.plist
4. Confirm csrutil status is either disabled or not enforced.
5. Attempt to control VoiceOver via AppleScript (e.g., using osascript voiceOverPerform.applescript).
6. Observe that the AppleScript command fails with no useful output (exit code 1), and VoiceOver does not respond to automation.
3
0
157
Jun ’25
iOS Eye Tracking Accessibility Feature API
Hi, is there any way to interact with the eye-tracking accessibility feature in iOS and iPadOS? I want to be able to trigger an eye-tracker recalibration programmatically. If possible, I would also like to be able to retrieve gaze-location data. These are not intended to be features of an app published to the App Store, but rather of a custom-made accessibility app.
2
0
867
Oct ’24
NSAlert button background/contrast
If I use NSAlert the buttons look like this: The Cancel button has a gray background. We got complaints about the poor contrast, and people pointed out that the alerts from System Settings look like this: Here the Cancel button has a white background. Unfortunately I have not found out how to make the buttons in my own alerts look like those in System Settings. Setting the button's bezel color to white did not work. Any help would be highly appreciated. Thanks. Best regards, Marc [A sketch of the setup follows this entry.]
4
0
503
Feb ’25
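For context, here is a minimal sketch of the situation described, including the bezel-color attempt the poster reports as ineffective. The alert text is hypothetical, and nothing below is a confirmed way to reproduce the System Settings styling.

import AppKit

let alert = NSAlert()
alert.messageText = "Delete item?"              // hypothetical content
alert.informativeText = "This cannot be undone."
alert.addButton(withTitle: "Delete")            // first button responds to Return
let cancelButton = alert.addButton(withTitle: "Cancel")

// The attempt mentioned in the post; per the poster, this does not
// produce the white Cancel background seen in System Settings alerts.
cancelButton.bezelColor = .white

if alert.runModal() == .alertFirstButtonReturn {
    // handle the Delete action
}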
URL Schemes for Direct Access to Software Update Settings on iOS Devices (18.0.1)
Hello everyone :) For a project application, we are trying to create a QR code that, after scanning, leads the user to the Software Update settings on iPhone, to improve accessibility for people who are not experienced with smartphones. On iOS 18.0.1 I see that using App-prefs: opens the Settings app, which is a good starting point. Furthermore, using e.g. App-prefs:com.apple.MobileSMS directs the user to the settings of the Messages app. But things like App-prefs:root=SoftwareUpdate don't work. That's why I wanted to kindly ask if someone has other ideas or knows what the exact URL might be for opening the Software Update settings, or at least the General settings (App-prefs:root=General doesn't work either). If anyone has insights or can share the exact URL scheme for this purpose, I would greatly appreciate it! Thank you so much! [A sketch of the probing approach follows this entry.]
2
0
1k
Oct ’24
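For reference, the probing described in the post, expressed in code. Note that App-prefs: is not a documented public scheme, its behavior can change between iOS releases, and the working path for Software Update (if one exists) remains unknown; the helper function name is hypothetical.

import UIKit

// Sketch: attempt to open a candidate Settings URL and report the outcome.
func openSettingsCandidate(_ candidate: String) {
    guard let url = URL(string: candidate) else { return }
    UIApplication.shared.open(url, options: [:]) { success in
        print("\(candidate) -> \(success ? "opened" : "failed")")
    }
}

// Probes mentioned in the post:
openSettingsCandidate("App-prefs:")                     // opens the Settings app
openSettingsCandidate("App-prefs:com.apple.MobileSMS")  // Messages settings
openSettingsCandidate("App-prefs:root=General")         // reportedly does not work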
To say I'm extremely hurt is an understatement! 😔
Voice Control Disabling System Services After Reboot

I recently learned from Apple Accessibility Support that the issue I'm experiencing with Voice Control is now affecting multiple users. When I first reported the problem, I appeared to be the first case, what you might call "patient zero." I have provided extensive feedback and system logs, but now that the issue is more widespread, I have been told that I will not be informed of the cause or notified directly when a fix is found. Instead, updates will be released as solutions are identified, and support staff will not necessarily know the details of the underlying problem.

To summarize my experience: after enabling Voice Control and rebooting my MacBook Pro (14.2-inch, M4 chip), critical Apple system services, including FaceTime, Apple Music, and News, stop functioning. Dictation remains available, but it is not as accurate or effective for my needs as Voice Control. I rely on these accessibility features daily due to my disability and cerebral palsy, and this issue has persisted for over five months.

I have always valued contributing to the developer program and supporting Apple's efforts to improve accessibility. However, I find it discouraging that there is no clear communication about the status of this issue or its resolution. My theory is that there may be a hardware interaction, perhaps between the neural engine and the new Wi-Fi chip, rather than a purely software problem. I understand that some information may not be immediately available, but I believe that users who rely on accessibility features should be kept informed about major issues and their progress toward resolution.

I appreciate the dedication of the accessibility and development teams, and I want to continue supporting Apple's mission of inclusion. Thank you for your attention to this matter.

Sincerely,
Donald Spencer Kirby
Dayton, Ohio
2
0
152
Jun ’25
how accessible is enough for Accessibility Nutrition Labels?
My team has a robust digital accessibility program and processes for WCAG conformance in our apps. Because of this, there are definitely accessibility defects that get caught and addressed in order of impact and business priority, like any other bug. Obviously we want to aim for 100% accessibility for our users, but it's a continual work in progress as new enhancements or changes are released. I'm stuck on the appropriate measurement to indicate support. If we have 50 common tasks and the most central 10 tasks are solid, but some supporting (yet also common) tasks have a contrast failure or a missing accessible label, does that make the whole app not support the feature? If "completing the task" is the rubric, there is a whole range of interpretations of that. In a complex app, I anticipate that a group like ours will have strong support for many of the Accessibility Nutrition Labels accessibility features across tasks and devices, but will realistically never be 100% free of defects for a given Apple accessibility feature, even among core tasks. As I consider the next steps for Nutrition Labels, I do not see anything in the documentation that gives a baseline or measurement for inclusion. We plan to test all steps required to complete a task and log defects accordingly, with an assigned timeline for fixing them (as would be true for functional defects).
2
0
87
Aug ’25
WKWebView not scrollable with a physical keyboard
Hello all. Currently I am trying to get WKWebView to scroll with a physical keyboard, and it just will not work. I tried allowsKeyboardScrolling() and it did not work. UIWebView works, but it's no longer supported. I'm trying to get Full Keyboard Access to work to make our app more accessible, but WKWebView does not want to play nice. Has anyone else had issues trying to use WKWebView with an external keyboard, and if so, did you find any solutions? Greatly appreciated! [A configuration sketch follows this entry.]
3
0
433
Feb ’25
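For readers trying to reproduce this, a sketch of the property the post appears to reference: since iOS 17, UIScrollView exposes allowsKeyboardScrolling, and a WKWebView's underlying scroll view can be configured through it. Whether this interacts correctly with Full Keyboard Access is exactly the open question; the web view also needs keyboard focus for hardware-key scrolling. The view controller below is illustrative.

import UIKit
import WebKit

final class WebViewController: UIViewController {
    private let webView = WKWebView()

    override func viewDidLoad() {
        super.viewDidLoad()
        webView.frame = view.bounds
        webView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(webView)

        if #available(iOS 17.0, *) {
            // Opt the web view's scroll view into hardware-keyboard scrolling.
            webView.scrollView.allowsKeyboardScrolling = true
        }
        webView.load(URLRequest(url: URL(string: "https://example.com")!))
    }
}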
macOS (Tahoe) Beta 26.0 — Overheating & Rapid Battery Drain on MacBook Pro M3
Hi everyone, after installing the macOS beta (Tahoe 26.0) on my MacBook Pro M3, I've noticed two issues:

1. A significant increase in system temperature; the laptop feels hot even with light usage like Safari and Figma.
2. Rapid battery drain; the battery is dropping unusually fast compared to macOS Sonoma.

I've tried restarting the device. I'm aware this is a beta, but just wondering: is anyone else experiencing this? Is this a known issue? I would love to hear if others are facing similar problems or if it might be something specific to my setup. Thanks in advance!
3
0
395
Jun ’25
Bulgarian cyrillic alphabet appears to be Russian, but should look Bulgarian
Hello. If you use the Bulgarian keyboard, you get these characters: явертъуиопюасдфгхйклшщзьцжбнмч. This isn't really right for Bulgarian, because т should look like m, д should look like g, and other characters should look like rotated or mirrored Latin characters; e.g., г should look like a backwards s. Compare the Bulgaria Wikipedia page in Bulgarian: https://bg.m.wikipedia.org/wiki/%D0%91%D1%8A%D0%BB%D0%B3%D0%B0%D1%80%D0%B8%D1%8F with the Bulgaria Wikipedia page in Russian: https://ru.m.wikipedia.org/wiki/%D0%91%D0%BE%D0%BB%D0%B3%D0%B0%D1%80%D0%B8%D1%8F Notice that the letters are different. Anyhow, the iOS Bulgarian font is just Russian Cyrillic, and that seems like an unintended bug rather than an intentional stylistic choice. [A glyph-selection sketch follows this entry.]
1
0
311
Feb ’25
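If the issue is glyph selection rather than a missing font, language tagging may be relevant: since iOS 17, SwiftUI can request language-specific letterforms via typesettingLanguage. Whether the system fonts actually ship Bulgarian variants of these glyphs is an assumption here, not something the post confirms.

import SwiftUI

struct BulgarianSample: View {
    var body: some View {
        // Request Bulgarian letterforms for this run of text (iOS 17+).
        // Untagged Cyrillic may fall back to default (Russian-style) glyphs.
        Text("явертъуиопюасдфгхйклшщзьцжбнмч")
            .typesettingLanguage(.init(identifier: "bg"))
    }
}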
Accessibility full keyboard access issue.
In our application we are using UIAlertController. When the Full Keyboard Access accessibility setting is enabled and we try to dismiss that alert controller with the Esc key on an external keyboard, it does not work. We are presenting the alert controller as a popover. We need to dismiss the alert controller with an Esc key press from the external keyboard. [A workaround sketch follows this entry.]
2
0
526
Mar ’25
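A possible experiment, with two caveats: it is an untested assumption, and Apple documents UIAlertController as not supporting subclassing, so treat it as fragile. The idea is to give the alert its own Escape key command so the key is handled even in a popover presentation.

import UIKit

// Hypothetical subclass; UIAlertController officially does not support subclassing.
final class EscapableAlertController: UIAlertController {
    override var keyCommands: [UIKeyCommand]? {
        // Map the hardware Escape key to an explicit dismissal handler.
        [UIKeyCommand(input: UIKeyCommand.inputEscape,
                      modifierFlags: [],
                      action: #selector(handleEscape))]
    }

    @objc private func handleEscape() {
        dismiss(animated: true)
    }
}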
System clock ancs notification is not sent on iPhone >= 14
I'm working on a BLE-connected device that uses ANCS and system clock notifications to deliver alarm events to hearing-impaired people. It used to work up through iPhone 13 on the latest iOS 18.x. From iPhone 14 onward (iOS 18.x), the system clock alarm notification is no longer sent. Is there any reason for this to happen? Is anyone aware of this behaviour? Any suggestion would be really appreciated. Cheers
2
0
135
Jun ’25
AVSpeechSynthesisMarker - iOS 18 - Sync lost
Hello, in an AVSpeechSynthesisProviderAudioUnit, sending word positions to the host using AVSpeechSynthesisMarker / AVSpeechSynthesisMarker.Mark.word seems to be broken on iOS 18. On the app/client side, all the events are received immediately, whereas they should be received in sync with the audio. The exact same code works perfectly on iOS 17.

On the AVSpeechSynthesisProviderAudioUnit side, the AVSpeechSynthesisMarkers are appended with the correct position/sample offset:

let wordPos = NSMakeRange(characterRange.location, characterRange.length)
let marker = AVSpeechSynthesisMarker(markerType: AVSpeechSynthesisMarker.Mark.word, forTextRange: wordPos, atByteSampleOffset: byteSampleOffset)
// also tried with
// let marker = AVSpeechSynthesisMarker(wordRange: wordPos, atByteSampleOffset: byteSampleOffset)
markerArray.append(marker)
print("word : pos \(characterRange) - offset \(byteSampleOffset)")

// send events to host
speechSynthesisOutputMetadataBlock?(markerArray, self.request!)

word : pos {7, 7} - offset 2208
word : pos {15, 8} - offset 37612
word : pos {24, 6} - offset 80368
word : pos {31, 3} - offset 118652
word : pos {35, 2} - offset 128796
...

But on the client side they are all received at the same time (at the beginning of speech), whereas on iOS 17 they arrive in sync with the audio:

func speechSynthesizer(_ synthesizer: AVSpeechSynthesizer, willSpeakRangeOfSpeechString characterRange: NSRange, utterance: AVSpeechUtterance) {
    print("characterRange : \(characterRange)")
}

Using an Apple voice/engine works, so there is obviously something to change, but the documentation of AVSpeechSynthesisProviderAudioUnit / AVSpeechSynthesisMarker seems unchanged. Thanks in advance.
4
0
554
Dec ’24