Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.

Posts under the Accessibility & Inclusion topic

USB-C cable for Galaxy Watch 7 not recognized by MacBook Pro M4
I get it: "Why don't you just get an Apple Watch?" Regardless, my MacBook Pro M4 doesn't recognize my charging cable in any of its USB-C ports. The cable works with any other power supply I plug it into, but the MacBook doesn't even register that a cable is connected. What I've tried:
- Running the Sequoia 15.4 beta, thinking it may have been software related. No change.
- Settings > Privacy & Security > Accessories: switched between all available options. No change.
- Option + Apple logo > System Information > Thunderbolt/USB4: none of the ports show that the cable is connected.
- Any other USB-C cable is recognized in any of the ports on the MacBook. Just not the cable for the Galaxy Watch.
Again, the cable charges from ANY other USB-C port on ANY other device I connect it to. Am I missing something? Or is this an intended jab at Samsung from Apple? lol
1 reply · 0 boosts · 745 views · Mar ’25
Accessibility Full Keyboard Access issue
In our application we are using UIAlertController. When accessibility Full Keyboard Access is enabled and we try to dismiss that alert controller with the Esc key on an external keyboard, nothing happens. We are presenting the alert controller as a popover, and we need to be able to dismiss it with an Esc key press from the external keyboard.
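A minimal sketch of one possible workaround, assuming the presenting view controller can become first responder; `HostViewController` and `dismissPresentedAlert` are illustrative names, and this is an untested sketch rather than a confirmed fix:

```swift
import UIKit

// Hypothetical presenting view controller: exposes a key command for Esc
// and dismisses whatever is currently presented (the alert popover).
final class HostViewController: UIViewController {

    // The controller must be in the responder chain for key commands to fire.
    override var canBecomeFirstResponder: Bool { true }

    override var keyCommands: [UIKeyCommand]? {
        [UIKeyCommand(input: UIKeyCommand.inputEscape,
                      modifierFlags: [],
                      action: #selector(dismissPresentedAlert))]
    }

    @objc private func dismissPresentedAlert() {
        // Dismiss the popover-presented alert controller, if any.
        presentedViewController?.dismiss(animated: true)
    }
}
```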
2 replies · 0 boosts · 566 views · Mar ’25
Not able to receive silent pushes in background
I’ve developed the Pro Talkie app, a walkie-talkie solution designed to keep you connected with family and friends.
App Store: https://apps.apple.com/in/app/pro-talkie/id6742051063
Play Store: https://play.google.com/store/apps/details?id=com.protalkie.app
While the app works flawlessly on Android and in the foreground on iOS, I’m facing issues establishing connections when the app is in the background or terminated on iOS. Specifically, I’ve attempted the following:
- Silent pushes and alert payloads: these are intended to wake the app in the background, but they often fail. Notifications may not be received or can be delayed by 20–30 minutes, leading to a poor user experience.
- VoIP pushes: these reliably wake the app, but they trigger the incoming-call UI, which isn’t suitable for a walkie-talkie app that should connect directly without displaying a call screen.
I’ve enabled all the necessary background modes (audio, remote notifications, VoIP, background fetch, processing), but the challenge remains. How can I ensure a consistent background connection on iOS without triggering the call UI?
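For reference, a minimal sketch of the silent-push entry point being described, assuming a payload carrying "content-available": 1; `connectToChannel` is a hypothetical stand-in for the app's real connection logic:

```swift
import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {

    func application(_ application: UIApplication,
                     didReceiveRemoteNotification userInfo: [AnyHashable: Any],
                     fetchCompletionHandler completionHandler: @escaping (UIBackgroundFetchResult) -> Void) {
        // Attempt the walkie-talkie connection, then report the outcome
        // promptly; the system grants only a short window of runtime here,
        // and delivery of these pushes is discretionary (hence the delays).
        connectToChannel(userInfo: userInfo) { connected in
            completionHandler(connected ? .newData : .noData)
        }
    }

    // Hypothetical helper standing in for the app's real connection logic.
    private func connectToChannel(userInfo: [AnyHashable: Any],
                                  completion: @escaping (Bool) -> Void) {
        completion(false)
    }
}
```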
1 reply · 0 boosts · 459 views · Mar ’25
AVSpeechSynthesisProviderVoice audioFileSettings field
Hello, AVSpeechSynthesisVoice has an audioFileSettings attribute:

```swift
let utterance = AVSpeechUtterance(string: text)
utterance.voice = AVSpeechSynthesisVoice(identifier: voiceSelected!)
print("- voice \(utterance.voice!.audioFileSettings)")
// ["AVLinearPCMIsBigEndianKey": 0, "AVLinearPCMIsFloatKey": 1,
//  "AVLinearPCMIsNonInterleaved": 1, "AVNumberOfChannelsKey": 1,
//  "AVSampleRateKey": 22050, "AVFormatIDKey": 1819304813,
//  "AVLinearPCMBitDepthKey": 32]
```

It is declared in AVSpeechSynthesisVoice:

```swift
@available(iOS 13.0, *)
open var audioFileSettings: [String : Any] { get }

@available(iOS 17.0, *)
open var voiceTraits: AVSpeechSynthesisVoice.Traits { get }
```

How can we specify the audioFileSettings attributes in an AVSpeechSynthesisProviderVoice? AVSpeechSynthesisProviderVoice has no such field:

```swift
open class AVSpeechSynthesisProviderVoice {
    open var name: String { get }
    open var identifier: String { get }
    open var primaryLanguages: [String] { get }
    open var supportedLanguages: [String] { get }
    open var voiceSize: Int64
    open var version: String
    open var gender: AVSpeechSynthesisVoiceGender
    open var age: Int
}
```

Regards
2 replies · 0 boosts · 144 views · Mar ’25
AssistiveTouch pointer cannot move past center of screen in landscape orientation on iPhone
Hello everyone, I’d like to report an issue I’ve encountered when using a Bluetooth mouse together with AssistiveTouch on an iPhone running iOS 16.5. This has also been reported via Feedback Assistant with Feedback ID FB17806167.

Description: When using a Bluetooth mouse together with AssistiveTouch on iPhone (iOS), the pointer behaves incorrectly in landscape orientation. Specifically:
- The pointer cannot move past the center of the screen.
- Horizontal and vertical (X/Y) movements appear to be swapped or misaligned.
- Natural movement of the pointer is not possible.
It seems as if the internal coordinate mapping remains locked in portrait orientation, even when the device is physically rotated to landscape. This issue occurs system-wide, regardless of the current app: it is observable in Settings, on the Home Screen, and in third-party apps.

Steps to Reproduce:
1. Enable AssistiveTouch.
2. Connect a Bluetooth mouse to the iPhone.
3. Rotate the device to landscape orientation.
4. Try moving the mouse pointer across the screen. Notice that the pointer cannot move past the center and that horizontal/vertical input is interpreted incorrectly (as if still in portrait).

Expected Behavior: The mouse pointer should move across the entire screen correctly, regardless of device orientation.

Actual Behavior: In landscape orientation, the pointer is either restricted to part of the screen or misaligned. It behaves as if the device were still in portrait: horizontal mouse movement causes vertical pointer movement, and vice versa. The user experience feels broken and unintuitive.

Feature Suggestion: Please improve the synchronization between physical device orientation and AssistiveTouch pointer mapping on iOS. I also suggest exposing AssistiveTouch orientation control via a public API, so developers can help maintain consistent pointer behavior.

Thanks in advance for any insights or suggestions. Best regards, Jannis
2 replies · 0 boosts · 330 views · Jun ’25
Guided Access Mode From Background
My team is designing an app for retail associates who need to share managed iPads. We keep the device in Guided Access mode on our login app until an auth token is obtained; the iPad is then opened for general use. On sign-out we need to re-enter Guided Access mode, and we can do this easily for a manual sign-out. But for idle sign-out, i.e. after 60 minutes of inactivity, we need to make a call from the background (even in a locked state), sign out the user, clear the passcode, and enter Single App Mode before restarting, so that once the device restarts the app is locked again until the next user provides credentials that can obtain a new auth token. We are struggling to see whether this is even possible. Our bosses will be displeased if we tell them it isn't, so any tips would be much appreciated.
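A minimal sketch of the usual programmatic route into Single App Mode, assuming a supervised device whose MDM configuration authorizes the app for autonomous Single App Mode; whether this call can succeed from a backgrounded, locked state is exactly the open question here:

```swift
import UIKit

// Request an autonomous Single App Mode (Guided Access) session.
// Requires a supervised device and an MDM whitelist entry for this app;
// otherwise the completion handler reports failure.
func reenterSingleAppMode() {
    UIAccessibility.requestGuidedAccessSession(enabled: true) { succeeded in
        print("Single App Mode request succeeded: \(succeeded)")
    }
}
```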
2 replies · 0 boosts · 278 views · Mar ’25
System clock ANCS notification is not sent on iPhone 14 and later
I'm working on a BLE-connected device that uses ANCS and the system Clock app to deliver alarm notification events to hearing-impaired people. This worked up through iPhone 13 on the latest iOS 18.x. From iPhone 14 onward (iOS 18.x), the system clock alarm notification is no longer sent. Is there any reason for this to happen? Is anyone aware of this behaviour? Any suggestion would be really appreciated. Cheers
2 replies · 0 boosts · 188 views · Jun ’25
Microphone Not Working When Running Unity Vision Pro App Normally
```csharp
using UnityEngine;
using UnityEngine.UI;

// Reconstructed wrapper for the posted methods; the class name and the
// field defaults are illustrative, since the original snippet was truncated.
public class VoiceActivityDetector : MonoBehaviour
{
    public Text debugText;
    public Slider slider;
    public float detectionThreshold = 0.02f;
    public float silenceDuration = 1f;
    public bool SoundDetected;

    private AudioClip microphoneInput;
    private bool isListening;
    private float lastVoiceTime;

    // Start listening to the microphone.
    public void StartListening()
    {
        if (isListening) return;

        try
        {
#if UNITY_IOS || UNITY_TVOS
            microphoneInput = Microphone.Start(null, true, 10, 44100);
#else
            // Use 16,000 Hz instead of 44,100 on other platforms.
            microphoneInput = Microphone.Start(null, true, 10, 16000);
            if (microphoneInput == null)
            {
                microphoneInput = Microphone.Start(null, true, 10, AudioSettings.outputSampleRate);
            }
#endif
            isListening = true;
            Debug.Log(Microphone.devices.Length + " - Started listening...");
            debugText.text = Microphone.devices.Length + " - Started listening...";
        }
        catch (System.Exception e)
        {
            Debug.LogError($"Starting microphone failed: {e.Message}");
            debugText.text = $"Starting microphone failed: {e.Message}";
        }
    }

    void Update()
    {
        if (isListening && microphoneInput != null)
        {
            // Analyze the audio for voice activity.
            float volume = GetAverageVolume();
            if (volume > detectionThreshold)
            {
                Debug.Log("User is speaking!");
                lastVoiceTime = Time.time;
                SoundDetected = true;
            }
            else if (Time.time - lastVoiceTime > silenceDuration)
            {
                // Silence check moved out of the speaking branch so it can
                // actually fire once the user stops talking.
                Debug.Log("User is silent.");
                debugText.text = volume.ToString() + " - User is silent.";
            }
            slider.value = volume;
        }
    }

    private float GetAverageVolume()
    {
        float[] samples = new float[128];
        // Read the most recent samples, clamped so we never read past
        // the microphone's current write position.
        int position = Mathf.Max(0, Microphone.GetPosition(null) - samples.Length);
        microphoneInput.GetData(samples, position);
        float sum = 0f;
        foreach (float sample in samples)
        {
            sum += Mathf.Abs(sample);
        }
        return sum / samples.Length;
    }
}
```

Problem: When I build and run the app from Xcode, the microphone works fine and I receive input. However, when running the app normally (outside of Xcode), I can’t seem to access the microphone; the debug logs indicate no microphone is detected.

Question: Is there any additional configuration needed for the microphone to work in a normal (non-Xcode) run on Vision Pro, or are there common issues that could cause microphone access to fail in this scenario?

Thanks in advance for any insights! Best, Siddharth
2 replies · 0 boosts · 502 views · Feb ’25
macOS Tahoe 26.0 beta: Overheating & Rapid Battery Drain on MacBook Pro M3
Hi everyone, after installing the macOS Tahoe 26.0 beta on my MacBook Pro M3, I’ve noticed two issues:
- Significant increase in system temperature: the laptop feels hot even with light usage like Safari and Figma.
- Rapid battery drain: the battery is dropping unusually fast compared to macOS Sonoma.
I’ve tried restarting the device. I’m aware this is a beta, but I'm wondering: is anyone else experiencing this? Is this a known issue? I'd love to hear if others are facing similar problems or if it might be something specific to my setup. Thanks in advance!
3 replies · 0 boosts · 505 views · Jun ’25
Bulgarian Cyrillic alphabet appears to be Russian, but should look Bulgarian
Hello. If you use the Bulgarian keyboard, you get these characters: явертъуиопюасдфгхйклшщзьцжбнмч. This isn’t really right for Bulgaria: т should look like m, д should look like g, and other characters should look like rotated or mirrored Latin characters (e.g., г should look like a backwards s). Compare the Bulgaria Wikipedia page in Bulgarian, https://bg.m.wikipedia.org/wiki/%D0%91%D1%8A%D0%BB%D0%B3%D0%B0%D1%80%D0%B8%D1%8F, with the Bulgaria Wikipedia page in Russian, https://ru.m.wikipedia.org/wiki/%D0%91%D0%BE%D0%BB%D0%B3%D0%B0%D1%80%D0%B8%D1%8F, and notice that the letters are different. Anyhow, the iOS Bulgarian font is just Russian Cyrillic, and that seems like an unintended bug rather than an intentional stylistic choice.
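As an aside for app developers reading this: where a font does carry Bulgarian glyph alternates, text can opt into them by tagging its language. A hedged sketch, not a fix for the system keyboard or fonts:

```swift
import Foundation

// Tag the text with the "bg" language identifier so the text system can
// select Bulgarian glyph variants when the rendering font provides them.
// Whether Apple's system Cyrillic fonts ship those variants is precisely
// the issue reported in this post.
var text = AttributedString("България")
text.languageIdentifier = "bg"
```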
1 reply · 0 boosts · 360 views · Feb ’25
Developer Mode restart without Home button
After enabling Developer Mode on my iPhone and restarting it, the device asks me to press the Home button to confirm. Unfortunately, my Home button is broken, so I can’t access Developer Mode. The iPhone itself still works, but I can’t enable the mode. Is there any way to bypass this without the Home button?
3 replies · 0 boosts · 105 views · Mar ’25
"Captions" in the Accessibility Nutrition Label for text-based apps
My game app is text-based interactive fiction, containing no audio/video content, making captions unnecessary. Our game is completely accessible to deaf users. Despite this, in the Accessibility Nutrition Label I'm only able to leave the "Captions" box checked or unchecked. Leaving it unchecked would give deaf players the wrong impression that they can't enjoy our game; leaving it checked would imply that we do have A/V content with captions included. In the WWDC video on this topic (https://developer.apple.com/videos/play/wwdc2025/224/), the presenter says: "After we completed common tasks, we realized our app doesn’t have any video or audio only content. In this case, we aren’t going to indicate that Landmarks supports Captions. That's okay. This accurately describes the features that people will expect to be available while using the app." Maybe that's "OK," but I wish the form allowed me to say "This app doesn't contain audio/video content."
3 replies · 1 boost · 131 views · Jun ’25
Unable to set Chinese dialect of AVSpeechSynthesisVoice in iOS 18
The AVSpeechSynthesizer on some iOS 18 devices has a bug: it always reads Chinese content, e.g. AVSpeechUtterance(string: "中文"), in the dialect specified under Settings > Accessibility > Spoken Content > Voices > Chinese > Spoken Language, instead of the dialect I specify in AVSpeechUtterance.voice via AVSpeechSynthesisVoice(language: "zh-HK") (Cantonese) or AVSpeechSynthesisVoice(language: "zh-TW") (Mandarin). Setting the Chinese dialect of AVSpeechSynthesisVoice with "zh-HK" or "zh-TW" worked on iOS 17 and below.

My app has a feature that requires reading sentences in Mandarin followed by Cantonese, i.e. both dialects are needed every time, so setting the dialect in Spoken Language in Settings is not a workaround that makes my app function correctly on iOS 18.

Further to the above, I've also discovered that if iOS 18 (in my case, 18.5 was tested) is freshly installed (not upgraded from iOS 17 or below, and no backup restored after the fresh installation), the bug does not occur. However, if the device was upgraded from iOS 17 or below, or a backup was restored (in my case, I freshly installed iOS 18.5 on a new iPhone and then restored a backup from another iPhone on iOS 16.2), the bug does occur.

This bug puzzles me because I need both Chinese dialects to be read aloud one after the other, but as reported by many users, on most iOS 18 devices (a fresh installation of the latest iOS without upgrading or restoring is uncommon nowadays), my app reads Cantonese twice or Mandarin twice, depending on the Spoken Language setting. This iOS 18 bug prevents my app from performing its expected behavior. Would Apple developers look into this and advise whether there is any possible workaround within app code, or please fix this bug in an iOS 18 update. Thank you.
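For reference, a minimal restatement of the described setup, which behaves correctly on iOS 17 and below but is overridden by the Spoken Language setting on affected iOS 18 installs:

```swift
import AVFoundation

let synthesizer = AVSpeechSynthesizer()

// The same sentence, once per dialect; each utterance carries its own voice.
let mandarin = AVSpeechUtterance(string: "中文")
mandarin.voice = AVSpeechSynthesisVoice(language: "zh-TW") // expected: Mandarin

let cantonese = AVSpeechUtterance(string: "中文")
cantonese.voice = AVSpeechSynthesisVoice(language: "zh-HK") // expected: Cantonese

synthesizer.speak(mandarin)
synthesizer.speak(cantonese)
```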
1 reply · 1 boost · 118 views · Jun ’25
Feature Idea: Autonomous, Motion-Powered Clock Display on iPhone
Hey everyone, I've been thinking about a truly innovative way to enhance iPhone battery life and user convenience, drawing inspiration from kinetic energy harvesting. What if we could have a clock display on the main iPhone screen that's powered purely by user motion and activates only when you look at it, without touching your main battery?

The Core Idea. Imagine this:
- Kinetic energy harvesting: your iPhone would have a tiny, integrated kinetic energy generator that captures the energy from your everyday movements: walking, picking up the phone, putting it in your pocket.
- Independent power source: this harvested energy would be stored in a small, dedicated capacitor or micro-battery, completely separate from your iPhone's main battery.
- Accelerometer-activated display: instead of relying on power-hungry facial recognition, the phone's accelerometer (a very low-power sensor) would detect specific "raise to wake" or "tap to look" gestures.
- On-demand, ultra-low-power clock: only when the accelerometer detects one of these gestures would the stored kinetic energy be used to illuminate just the pixels needed to display the time on the main OLED/AMOLED screen. The rest of the screen stays completely black (consuming no power on OLED).
- Automatic shut-off: as soon as the gesture ends or the phone is put down, the clock display turns off, conserving the limited harvested energy.

Why This Matters. This isn't just a cool gimmick; it offers significant benefits:
- True battery independence: get the time at a glance, anytime, without touching your main battery or even the power button. That means more main-battery life for apps, calls, and everything else.
- Ultimate convenience: a "magical" interaction; just pick up your phone and the time instantly appears. No taps, no button presses.
- Sustainable and innovative: showcases practical energy harvesting in a consumer device, pushing boundaries for self-sufficient tech.
- Extreme energy efficiency: by using a low-power accelerometer as the trigger and lighting only a few pixels on demand, the system is designed for minimal power draw, making kinetic power a viable source.

This concept combines existing low-power sensing (the accelerometer), efficient display technology (OLED/AMOLED's true blacks), and cutting-edge energy harvesting, creating a genuinely innovative user experience.
1 reply · 1 boost · 124 views · Jun ’25