Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.

Posts under Accessibility & Inclusion topic

Why does the macOS window sharing indicator appear for some windows but not others?
On recent versions of macOS, when a window is being shared (via the system screen-capture APIs), the OS sometimes shows a small "shared window" badge in the title bar. I’ve noticed that this indicator is not consistent:

For some windows, the badge reliably appears when they are being shared.
For other windows, the badge never appears, even though the window is actively shared.

In particular, windows that use a standard system title bar seem to show the indicator more often, while windows with custom-drawn or non-standard chrome do not.

My questions are:

What are the exact conditions under which macOS decides to draw the “shared window” indicator in a window’s title bar? Is this strictly tied to certain NSWindow styles or masks (e.g. titled vs. borderless)?
Is there any API or flag I can use to detect programmatically whether a given window will display this system indicator when shared?
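
For reference, a minimal sketch of the two window configurations being compared, assuming the difference comes down to the style mask (titles, sizes, and the runtime check are illustrative, not a confirmed answer):

    import AppKit

    // A standard titled window: has a system title bar,
    // which is where the sharing badge is drawn when it appears.
    let titledWindow = NSWindow(
        contentRect: NSRect(x: 0, y: 0, width: 400, height: 300),
        styleMask: [.titled, .closable, .resizable],
        backing: .buffered,
        defer: false
    )

    // A borderless window with custom-drawn chrome: no system
    // title bar, so there is nowhere for the badge to be drawn.
    let borderlessWindow = NSWindow(
        contentRect: NSRect(x: 0, y: 0, width: 400, height: 300),
        styleMask: [.borderless],
        backing: .buffered,
        defer: false
    )

    // A cheap runtime check for "does this window have a system title bar?"
    let hasSystemTitleBar = titledWindow.styleMask.contains(.titled)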
0 replies · 0 boosts · 1.1k views · Sep ’25
Need app blocking permission for Screen Time Limit app - CAN'T GET ANSWER FROM SUPPORT FOR 3 WEEKS. APP HAS 100K FOLLOWERS ON SOCIAL MEDIA ALREADY
Hey everyone! I am developing a screen time limit app to help people spend less time in distracting apps. It works this way: users pick the apps that are unhealthy for them and, on the other side, their productivity apps. In the app, you can exchange time spent on healthy habits for time to scroll or use the distracting apps. Social media loved this idea, and the app already has 100k followers without even being launched yet. So I am waiting for just one feature permission from Apple, and they have not given me any answer since I applied 3 weeks ago. There are a lot of similar apps on the market, and this feature exists in other screen time limit apps.

Why is app blocking permission needed?

Time exchange functionality: users independently select which apps are productive and which are distracting for them. The system blocks the "negative" apps until the user accumulates enough time in the "positive" ones. This encourages healthy device usage.
Full user control: all apps to be blocked are manually selected by the user in the settings. The extension does not impose any restrictions without explicit permission.
Transparency and security: blocking happens locally, with no data collected about app usage. We adhere to Apple’s privacy policy.
Compliance with App Store Guidelines: we understand that app blocking is a sensitive feature, but in our case it is used for the benefit of the user (digital detox, productivity improvement), does not interfere with system processes or other developers’ apps, and does not misuse access to APIs.

My questions to the forum: Did you have similar problems, and how did you resolve them? Are there any ways to speed up the process or contact someone from the approval team directly? Should I give up and release it on Android? I am very disappointed and frustrated. Hope to get some useful tips. Thank you very much!
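
For context, the permission in question is presumably the Family Controls entitlement, which gates the Screen Time APIs. A minimal sketch of the kind of on-device blocking the post describes, assuming the user's choices arrive as a FamilyActivitySelection (the `selection` value is a placeholder; in a real app it would come from FamilyActivityPicker):

    import FamilyControls
    import ManagedSettings

    // The user picks their "distracting" apps with the system picker;
    // the result is an opaque selection of app tokens.
    let selection = FamilyActivitySelection()

    // Shield (block) exactly the apps the user chose, locally on device.
    let store = ManagedSettingsStore()
    store.shield.applications = selection.applicationTokens

    // Later, once enough "healthy" time has been earned, lift the shield.
    store.shield.applications = nil

Requesting authorization (AuthorizationCenter.shared.requestAuthorization) and the distribution entitlement itself are separate steps; the entitlement must be granted by Apple before any of this works in a distributed build.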
0 replies · 1 boost · 136 views · May ’25
Attaching procedural audio to an ARKit SCNNode
I’m developing an ARKit application where I aim to attach procedurally generated audio to detected planes in the environment. While using a static audio file with SCNAudioSource and SCNAudioPlayer works as expected, integrating procedural audio via AVAudioSourceNode does not produce any sound, nor does it generate any error messages: Stack Overflow Post

Working implementation with a static audio file:

    let audioPlayer = SCNAudioPlayer(source: audioSource)
    node.addAudioPlayer(audioPlayer)

Attempted implementation with procedural audio:

    let audioNode = AVAudioSourceNode { _, _, _, _ in
        // Audio generation code
        return noErr
    }
    let audioPlayer = SCNAudioPlayer(avAudioNode: audioNode)
    node.addAudioPlayer(audioPlayer)

In this setup, the AVAudioSourceNode successfully generates audio when connected directly to an AVAudioEngine. However, when used with SCNAudioPlayer and attached to an SCNNode, it fails to produce sound. What doesn’t work is creating procedural audio with an AVAudioNode, as documented here: Apple docs

Additionally, I explored the WWDC18 AR game project, SwiftShot, which uses SCNAudioPlayer(avAudioNode:). After updating it for the latest Xcode, the graphics work correctly, but the audio does not play.

I also noted that the Apple documentation for the initWithAVAudioNode: initializer states: "Using this initializer is typically not necessary. Instead, call the audioPlayerWithAVAudioNode: method, which returns a cached audio player object if one for the specified AVAudioNode object has already been created and is available for use." However, this method does not appear to be available in Swift.

Any insights or guidance on this matter would be greatly appreciated.
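
As a point of comparison, a minimal sketch of the direct AVAudioEngine hookup the post says does work (a 440 Hz sine generator stands in for the elided audio generation code, purely as an illustration):

    import AVFoundation

    let engine = AVAudioEngine()
    var phase: Float = 0

    let sourceNode = AVAudioSourceNode { _, _, frameCount, audioBufferList in
        let ablPointer = UnsafeMutableAudioBufferListPointer(audioBufferList)
        let sampleRate: Float = 44_100
        let increment = 2 * Float.pi * 440 / sampleRate
        for frame in 0..<Int(frameCount) {
            let sample = sin(phase) * 0.25
            phase += increment
            if phase > 2 * .pi { phase -= 2 * .pi }
            // Write the same sample to every channel buffer.
            for buffer in ablPointer {
                let buf: UnsafeMutableBufferPointer<Float> = UnsafeMutableBufferPointer(buffer)
                buf[frame] = sample
            }
        }
        return noErr
    }

    engine.attach(sourceNode)
    engine.connect(sourceNode, to: engine.mainMixerNode, format: nil)
    try engine.start()  // audible here — unlike the SCNAudioPlayer path above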
0 replies · 0 boosts · 164 views · Apr ’25
Frequent Error 4099 in CallKit integration: NEAppPushDelegate notifications stop arriving and the incoming call screen is not displayed
In my iOS app I use NEAppPushSession and display the CallKit incoming call screen when a notification arrives via NEAppPushDelegate, but I am facing the following problem.

On 8/13, the error below appeared repeatedly in the logs. I ran roughly 120 notification delivery tests, and the error occurred the whole time.

Error:
2025-08-14 11:27:06.793073 +0900 nesessionmanager NESMAppPushSession[SimplePushDefaultConfiguration:7B7218F3-94B5-4AE5-9B9E-94E176694D02] failed to report incoming call to CallKit, error: Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service named com.apple.callkit.networkextension.messagecontrollerhost was invalidated from this process." UserInfo={NSDebugDescription=The connection to service named com.apple.callkit.networkextension.messagecontrollerhost was invalidated from this process.}

After this error log flooded in, the CallKit incoming call screen stopped being displayed. However, notification monitoring still appears to start.

Fifteen hours later, on 8/14, perhaps because some time had passed, notifications started arriving again. But when I reran the delivery test, the same error log appeared. After roughly 120 more test deliveries, the error flooded the log again and the CallKit incoming call screen stopped being displayed. Once again, notification monitoring still appears to start.

On 8/15, notifications could not be received even after 15 hours, although monitoring still appears to start. Restarting the iPhone, cleaning the Xcode build, and uninstalling and reinstalling the app did not bring notifications back. Strangely, devices other than the one where the problem first occurred can no longer receive these notifications either. Other notifications are received normally; only our own NEAppPushManager notifications fail to arrive.

My questions:
What should I do to get notifications delivered again?
Is this failure caused by emitting too many 4099 errors?
What is causing the 4099 error?
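
For reference, a minimal sketch of the delivery path the log describes, assuming the standard NEAppPushDelegate callback feeding a CXProvider (the provider configuration and handle value are placeholders):

    import NetworkExtension
    import CallKit

    final class PushHandler: NSObject, NEAppPushDelegate {
        private let provider = CXProvider(configuration: CXProviderConfiguration())

        func appPushManager(_ manager: NEAppPushManager,
                            didReceiveIncomingCallWithUserInfo userInfo: [AnyHashable: Any]) {
            let update = CXCallUpdate()
            update.remoteHandle = CXHandle(type: .generic, value: "caller")  // placeholder
            // The 4099 (connection invalidated) error in the log surfaces when the
            // XPC hop between the NE session manager and CallKit fails around this step.
            provider.reportNewIncomingCall(with: UUID(), update: update) { error in
                if let error { print("reportNewIncomingCall failed: \(error)") }
            }
        }
    }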
0 replies · 0 boosts · 416 views · Aug ’25
VoiceOver Headings Accessibility Rotor with SwiftUI on iOS
Hi,

On iOS, I'd like to mark views that are inside a LazyVStack as headers for VoiceOver (make them appear in the headings rotor). In a VStack, you just add .accessibilityAddTraits(.isHeader) to your header view. However, if your view is in a LazyVStack, that won't work if the view is not visible. As its name implies, LazyVStack is lazy, so that makes sense.

There is very little information online about system rotors, but it seems you are supposed to use .accessibilityRotor() with the headings system rotor (.accessibilityRotor(.headings)) outside of the LazyVStack. Something like the following:

    .accessibilityRotor(.headings) {
        ForEach(entries) { entry in
            // entry.id must be the same as the id of the SwiftUI view it is about
            AccessibilityRotorEntry(entry.name, id: entry.id)
        }
    }

It kind of works, but only kind of. When using .accessibilityAddTraits(.isHeader) in a VStack, the view is in the headings rotor as soon as you change screens. However, when using .accessibilityRotor(.headings), the headers (headings?) are not in the headings rotor at the time the screen appears. You have to move the accessibility focus inside the screen before your headers show up. I'm a beginner in regards to VoiceOver, so I don't know how a blind user used to VoiceOver would perceive this, but it feels to me that having to move the focus before the headers are in the headings rotor means some users would miss them.

So my question is: is there a way to have headers inside a LazyVStack (which are not necessarily visible at first) appear in the headings rotor as soon as the screen appears? (be it using .accessibilityRotor(.headings) or anything else)

The "SwiftUI Accessibility: Beyond the basics" talk from WWDC 2021 mentions custom rotors, not system rotors, but that should be close enough. It mentions that for accessibilityRotor to work properly it has to be applied on an accessibility container, so just in case I tried to move my .accessibilityRotor(.headings) to multiple places, with and without the accessibilityElement(children: .contain) modifier, but that did not seem to change the behavior (and I could not understand why accessibilityRotor could not automatically make the view it is applied on an accessibility container if needed).

Also, a related question: in a single screen, is it fine to mix uses of .accessibilityAddTraits(.isHeader) and .accessibilityRotor(.headings)? In a screen with multiple types of content (something like ScrollView { VStack { MyHeader(); LazyVStack { /* some content */ }; LazyVStack { /* something else */ } } }), having to declare all headers in one place would make code reusability harder.

Thanks
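
For concreteness, a minimal sketch of the layout being described, assuming a simple Entry model (all names are placeholders, and this reproduces the question, not a confirmed fix):

    import SwiftUI

    struct Entry: Identifiable {
        let id = UUID()
        let name: String
    }

    struct ContentView: View {
        let entries = [Entry(name: "Section A"), Entry(name: "Section B")]

        var body: some View {
            ScrollView {
                LazyVStack {
                    ForEach(entries) { entry in
                        Text(entry.name)
                            .id(entry.id)  // per the post: must match the rotor entry's id
                            .accessibilityAddTraits(.isHeader)  // only effective once the row is realized
                    }
                }
            }
            // Declared outside the LazyVStack so off-screen headers are still listed.
            .accessibilityRotor(.headings) {
                ForEach(entries) { entry in
                    AccessibilityRotorEntry(entry.name, id: entry.id)
                }
            }
        }
    }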
0 replies · 0 boosts · 84 views · Jun ’25
iPadOS 26 Floating Keyboard jumping up and down
I use the floating keyboard extensively. Often it starts to jump up and down. I have to pinch out to get the large version and pinch in again to restore the floating version. Sometimes just touching a key sets it off. Sometimes returning to a window from which the keyboard is displayed starts the issue. This was never a problem in iPadOS 18.
0 replies · 0 boosts · 423 views · Sep ’25
Custom prediction panel not working in Google Docs
I’m working on a macOS Accessibility setup for a French-speaking user and I’ve hit a wall. (I'm not a developer and I'm trying to help my kid with dyslexia.)

I successfully built a custom word prediction panel using the Panel Editor (Keyboard) in macOS Accessibility > Keyboard > Accessibility Keyboard. Here’s what I have so far:
• The prediction panel works system-wide: I can use it to type in Finder, Safari, Notes, TextEdit, and even browser search bars.
• The panel appears above all applications and suggestions show up correctly.
• However, it does not work inside Google Docs (tested in Chrome, Safari, and Firefox). Selecting a word from the panel does nothing in the Docs editor.

I suspect this is because:
• Google Docs does not use a standard macOS text input field.
• Docs is a web app that relies on custom JavaScript editors, contentEditable elements, and canvas rendering, so macOS Accessibility APIs (AXTextField, AXInsertText, etc.) don’t register or inject text events.
• Accessibility tools like the Accessibility Keyboard rely on native macOS text input methods, which don’t hook into Google Docs’ custom editor.

Important: I’m not a programmer. I’d like to know if there is an easy fix or option in macOS, Google Chrome, or Google Docs that would make my custom prediction panel work, before going into custom development.

Technical setup:
• MacBook Air (M2, 2022)
• RAM: 8 GB
• macOS: Sequoia 15.3.1
• Language: French (system and keyboard)
• Accessibility Keyboard: enabled via Settings > Accessibility > Keyboard
• Custom panel: built using Panel Editor (Keyboard), named “Philemon Prédiction”
• Browsers tested: Chrome, Safari, Firefox (same issue)
• Behavior: panel is visible, suggestions appear, but inserting text does nothing in Google Docs

Has anyone worked around this limitation? Is there a simple setting, workaround, or accessibility option to bridge macOS Accessibility input with Google Docs’ editor?

Thanks a lot!
0 replies · 1 boost · 921 views · Aug ’25
App Store Connect – “Unable to Handle This Request” Error
Hello, I'm currently unable to access App Store Connect. When I try to open https://appstoreconnect.apple.com, I receive the following error message: “appstoreconnect.apple.com is currently unable to handle this request.”

I’ve tried the following steps, but the issue persists:
Cleared browser cache and cookies
Tried different browsers (Safari, Chrome)
Attempted from multiple devices and networks

Is this a known issue, or is there any workaround available? Would appreciate any help or update on the current status. Thank you,
0 replies · 0 boosts · 80 views · Jun ’25
Don't get the complete shipping contact after authorization
We get the response from Apple Pay after the customer completes the Face ID / Touch ID authorization, but the shipping contact is not complete. For example:

    {
      "addressLines": [
        "1************ kwy"
      ],
      "administrativeArea": "FL",
      "country": "",
      "countryCode": "",
      "emailAddress": "S*********le.com",
      "familyName": "******i",
      "givenName": "******m",
      "locality": "*******s",
      "phoneNumber": "+*******79",
      "phoneticFamilyName": "",
      "phoneticGivenName": "",
      "postalCode": "*****3",
      "subAdministrativeArea": "",
      "subLocality": ""
    }

As the documentation says, it should be the complete shipping contact, but country and countryCode are empty: https://developer.apple.com/documentation/apple_pay_on_the_web/applepaypayment/1916097-shippingcontact
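
One thing worth checking (an assumption, not a confirmed cause): the full postal address is only guaranteed for the contact fields the payment request explicitly asks for. On the web that is the ApplePayPaymentRequest's requiredShippingContactFields; the native PassKit equivalent, shown here as a sketch, is:

    import PassKit

    let request = PKPaymentRequest()
    // Only fields listed here are returned unredacted after authorization;
    // before authorization the contact is intentionally partial.
    request.requiredShippingContactFields = [.postalAddress, .name, .emailAddress, .phoneNumber]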
1 reply · 0 boosts · 418 views · Dec ’24
VoiceOver: Detect Languages
My app does not automatically switch languages (voices) in VoiceOver when VoiceOver is on and the screen includes both English and Spanish content. Instead of switching to the correctly accented voice, the content is announced in whatever my manual Voices rotor setting is. I can manually switch the voice in the rotor to make words sound intelligible, but my main concern is that language changes are not auto-detected even though that feature is on in my Settings.

VoiceOver does detect language changes in other apps, so I think there must be either misplaced or missing accessibilityLanguage strings somewhere in my app. Or is it more than that, involving localization considerations? I reached out to the Apple Accessibility team and was directed to open a ticket here, as my question is about the underlying code.

I am a novice developer and primarily an accessibility SME; I expect that when "Detect Languages" is on in the user's VoiceOver settings, the voice for the screen reader speech output will automatically switch to the correct language and accent. I recognize there is a problem but am not sure where the breakdown is. I would like guidance on how to fix it to relay to my teams.

https://developer.apple.com/documentation/objectivec/nsobject/1615192-accessibilitylanguage
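
For anyone landing here, a minimal sketch of the property the post links to, set per element in UIKit (the label text is a placeholder):

    import UIKit

    let spanishLabel = UILabel()
    spanishLabel.text = "Hola, ¿cómo estás?"
    // Tells VoiceOver which voice/accent to use for this element,
    // independent of the app's overall localization.
    spanishLabel.accessibilityLanguage = "es-ES"

If English and Spanish are mixed within a single element, a per-element accessibilityLanguage can't split them; the content would need to be separate elements, or an attributed accessibility label using the accessibilitySpeechLanguage attribute.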
1 reply · 0 boosts · 859 views · Nov ’24
Mongolian Spellchecking Support Across Apple Ecosystem
As a Mongolian user, I’ve observed that the Apple ecosystem (macOS, iOS, iPadOS) currently lacks native spellchecking support for the Mongolian language in Cyrillic script. This absence poses significant challenges for users who rely on Apple devices for communication, education, and professional work in Mongolian. Could you share if there are any plans or roadmaps to address this gap? Additionally, I’m eager to contribute ideas, resources, or insights to help make Mongolian language support more accessible within the Apple ecosystem. If there are any guidelines or steps I could take to advocate for or help implement this feature, I’d greatly appreciate your guidance.
1 reply · 0 boosts · 338 views · Dec ’24
Registering a macOS app for dynamic text sizing in macOS 15
macOS 15 includes a neat section in System Settings to change the dynamic text size, as outlined here: https://support.apple.com/guide/mac-help/make-text-and-icons-bigger-mchld786f2cd/mac However, it's not immediately clear (a) how to get one's app into this list, and (b) whether the usual methods from iOS for reacting to text size even work on macOS. Does anyone have any experience here? Or should I implement my own controls in my app's settings and call it a day? For context, my app is a macOS-native SwiftUI app.
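
Not an answer to (a), but for (b): SwiftUI's Dynamic Type environment value exists on macOS as well; a minimal sketch of reacting to it (purely illustrative, and whether the System Settings slider drives it for a given third-party app is exactly the open question):

    import SwiftUI

    struct SizeAwareView: View {
        // Tracks the user's preferred text size where the app supports it.
        @Environment(\.dynamicTypeSize) private var typeSize

        var body: some View {
            // Switch layout when text grows past a threshold.
            if typeSize >= .xLarge {
                VStack { Text("Large layout") }
            } else {
                HStack { Text("Compact layout") }
            }
        }
    }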
1 reply · 0 boosts · 570 views · Jan ’25
USB-C Cable for Galaxy Watch 7 not recognized by MacBook Pro M4
I get it, "Why don't you just get an Apple Watch?" Regardless, my MacBook Pro M4 doesn't recognize my charging cable in any of its USB-C ports. The cable works with any other power supply I plug it into, but the MacBook doesn't even register that a cable is connected.
-- Running Sequoia Beta 15.4, thinking it may have been software related. No change.
-- Settings > Privacy & Security > Accessories: changed between all available options. No change.
-- Option + Apple logo > System Information > Thunderbolt/USB4: none of the ports show that the cable is connected.
-- Any other USB-C cable is recognized in any of the ports on the MacBook. Just not the cable for the Galaxy Watch.
Again, the cable charges from ANY other USB-C port on ANY other device I connect it to. Am I missing something? Or is this an intended jab at Samsung from Apple? lol
1 reply · 0 boosts · 711 views · Mar ’25
AccessibilityHint for UIAlertAction
Hi, I am setting the accessibilityLabel and accessibilityHint properties of a UIAlertAction. However, VoiceOver only reads the label out. Usually, the label is read out, followed by a short pause and then the hint. Is this a known issue where hints do not work for this element? I can append the hint to the label, but I'm interested to know if there's something I'm doing wrong. Regards.
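
For reference, a minimal sketch of the setup being described (titles and strings are placeholders):

    import UIKit

    let alert = UIAlertController(title: "Delete item?", message: nil, preferredStyle: .alert)
    let delete = UIAlertAction(title: "Delete", style: .destructive) { _ in /* ... */ }

    // UIAlertAction is an NSObject, so the UIAccessibility properties compile,
    // but per the post, VoiceOver announces only the label here, never the hint.
    delete.accessibilityLabel = "Delete item"
    delete.accessibilityHint = "Removes the item permanently"

    alert.addAction(delete)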
1 reply · 0 boosts · 322 views · Mar ’25
Screen reader not reading the month July when we use the shorter version "Jul" in app
When the iOS screen reader reads the month "July" in its abbreviated form "Jul", it does not read it correctly as a month, whereas it reads all other months' abbreviated names correctly. As a result, when we display dates under that month in the front end and a screen reader reads them, it reads them out as a number, not a date. I have tried the longer version with the screen reader, and then it reads "July" correctly as well.
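
A common workaround (a sketch, not a fix for the underlying bug): show the abbreviated month visually but hand VoiceOver the spelled-out form via the accessibility label:

    import UIKit

    let date = Date()

    // Visible, abbreviated form: e.g. "Jul 15"
    let shortForm = date.formatted(.dateTime.month(.abbreviated).day())

    // Spoken, unambiguous form: e.g. "July 15"
    let longForm = date.formatted(.dateTime.month(.wide).day())

    let label = UILabel()
    label.text = shortForm
    label.accessibilityLabel = longForm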
1 reply · 0 boosts · 7.3k views · Mar ’25
Is there any way to force users to update?
Hi, I'm planning to build a macOS app and distribute it through the Mac App Store. The question is whether I can force an update when one is needed. The reason I want this feature is that I don't want users to keep using a previous version of the app. My plan is this: when the app needs an update, send the user to a special page that describes why the update is needed, with a button that downloads the new version. The download would happen automatically in the background, with no need to visit the App Store. I've searched several forums and asked GPT, but there was no positive reply, so I'm finally posting to find out whether there is really no way to do this. Thank you!
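
A Mac App Store build can't update itself in the background, but a common pattern (a sketch under assumptions: the public iTunes Lookup endpoint and a naive version comparison stand in for a real check) is to detect an outdated version at launch and show a blocking "please update" screen that links to the App Store page:

    import Foundation

    func fetchLatestVersion(bundleId: String) async throws -> String? {
        // Public lookup endpoint; returns the App Store metadata as JSON.
        let url = URL(string: "https://itunes.apple.com/lookup?bundleId=\(bundleId)")!
        let (data, _) = try await URLSession.shared.data(from: url)
        let json = try JSONSerialization.jsonObject(with: data) as? [String: Any]
        let results = json?["results"] as? [[String: Any]]
        return results?.first?["version"] as? String
    }

    // Usage sketch: compare against the running version and gate the UI.
    let current = Bundle.main.infoDictionary?["CFBundleShortVersionString"] as? String
    // if latest != current { /* present the blocking "update required" screen */ }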
1 reply · 0 boosts · 461 views · Mar ’25