Allow the user to add their own tags to the default emoji tags.
For instance, this emoji, for me, is nonna: 🤌🏻. My efficiency would improve immensely if I could search for it as the “Nonna” emoji, rather than searching for nonna, remembering it doesn’t exist, trying to search for other things it might be called, realising I don’t know what it is, and then having to scroll through all the hand emojis twice to find it.
🤌🏻🤞🏼👌
“Double-tap three fingers and drag to change zoom” should suppress “Three Finger to Drag”. Currently these gestures are triggered simultaneously, for no apparent reason. I saw different behaviors in different environments, but none of them is desirable.
Current and desired behavior:
This looks like a bug, so I filed a Feedback report.
I went to Privacy & Security on an iPhone 11 and Developer Mode is not there.
I’m trying to customize the keyboard focus appearance in SwiftUI.
In UIKit (see WWDC 2021 session Focus on iPad keyboard navigation), it’s possible to remove the default UIFocusHaloEffect and change a view’s appearance depending on whether it has focus or not.
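For comparison, the UIKit version of this looks roughly like the following (a minimal sketch of the approach from that session; the view class and styling are my own):

import UIKit

// Opt out of the default halo and draw a custom border on focus.
final class FocusableCardView: UIView {
    override var canBecomeFocused: Bool { true }

    override func didMoveToWindow() {
        super.didMoveToWindow()
        focusEffect = nil // remove the default UIFocusHaloEffect
        layer.cornerRadius = 18
        layer.borderWidth = 2
    }

    override func didUpdateFocus(in context: UIFocusUpdateContext,
                                 with coordinator: UIFocusAnimationCoordinator) {
        super.didUpdateFocus(in: context, with: coordinator)
        coordinator.addCoordinatedAnimations({
            self.layer.borderColor = self.isFocused
                ? UIColor.systemRed.cgColor
                : UIColor.clear.cgColor
        })
    }
}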
In SwiftUI I’ve tried the following:
.focusable() // .focusable(true, interactions: .activate)
.focusEffectDisabled()
.focused($isFocused)
However, I’m running into several issues:
.focusable(true, interactions: .activate) causes an infinite loop, so keyboard navigation stops responding
.focusEffectDisabled() doesn’t seem to remove the default focus effect on iOS
Using @FocusState prevents Space from triggering the action when the view has keyboard focus
My main questions:
How can I reliably detect whether a SwiftUI view has keyboard focus? (Is there an alternative to FocusState that integrates better with keyboard navigation on iOS?)
What’s the recommended way in SwiftUI to disable the default focus effect (the blue overlay) and replace it with a custom border?
Any guidance or best practices would be greatly appreciated!
Here's my sample code:
import SwiftUI

struct KeyboardFocusExample: View {
    var body: some View {
        // The ScrollView is required, otherwise the custom focus value resets
        // to false after a few seconds. I also need it for my actual use case.
        ScrollView {
            VStack {
                Text("First button")
                    .keyboardFocus()
                    .button {
                        print("First button tapped")
                    }
                Text("Second button")
                    .keyboardFocus()
                    .button {
                        print("Second button tapped")
                    }
            }
        }
    }
}

// MARK: - Focus Modifier

struct KeyboardFocusModifier: ViewModifier {
    @FocusState private var isFocused: Bool

    func body(content: Content) -> some View {
        content
            .focusable() // ⚠️ Must come before .focused(), otherwise the FocusState won’t be recognized
            // .focusable(true, interactions: .activate) // ⚠️ This causes an infinite loop, so keyboard navigation no longer responds
            .focusEffectDisabled() // ⚠️ Has no effect on iOS
            .focused($isFocused)
            // Custom halo effect
            .padding(4)
            .overlay(
                RoundedRectangle(cornerRadius: 18)
                    .strokeBorder(
                        isFocused ? .red : .clear,
                        lineWidth: 2
                    )
            )
            .padding(-4)
    }
}

extension View {
    public func keyboardFocus() -> some View {
        modifier(KeyboardFocusModifier())
    }
}

// MARK: - Button Modifier

/// ⚠️ Using a Button view makes no difference
struct ButtonModifier: ViewModifier {
    let action: () -> Void

    func body(content: Content) -> some View {
        content
            .contentShape(Rectangle())
            .onTapGesture {
                action()
            }
            .accessibilityAction {
                action()
            }
            .accessibilityAddTraits(.isButton)
            .accessibilityElement(children: .combine)
            .accessibilityRespondsToUserInteraction()
    }
}

extension View {
    public func button(action: @escaping () -> Void) -> some View {
        modifier(ButtonModifier(action: action))
    }
}
Hello,
after installing iOS 18 on our smartphones, we are seeing a dramatic CPU load from the "AccessibilityUIServer" process after installing one of our apps. This results in drastic battery drain and heats up the iPhone very quickly. Under previous iOS versions this was not a problem.
Do you have any idea how this could happen?
Topic:
Accessibility & Inclusion
SubTopic:
General
When is it a good idea to use numbered lists with VoiceOver? As an example, for tab bars it reads something like "2 of 5".
Speaking with an Accessibility engineer at WWDC this year, they said that it's good practice, but we ran out of time in the call to dig deeper. When, or how, do you know that you should read out the item count, such as "2 of 5"?
It makes sense to me in, say, a tab group or a chip group, but I don't see it being useful in, say, a UITableView with a variable number of sections, each of which can contain multiple cells.
I've had trouble finding additional guidance on this topic; if anyone can share recommendations or their thoughts, that would be great.
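For a custom control, one way to hand-roll the announcement is to put the position into accessibilityValue (a sketch; ChipView and its parameters are hypothetical, not an Apple API):

import UIKit

// A chip that announces its position in the group, mirroring the
// "2 of 5" phrasing that tab bars get for free.
final class ChipView: UIControl {
    init(title: String, index: Int, count: Int) {
        super.init(frame: .zero)
        isAccessibilityElement = true
        accessibilityLabel = title
        accessibilityTraits = .button
        accessibilityValue = "\(index) of \(count)" // only sensible for small, fixed groups
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}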
Topic:
Accessibility & Inclusion
SubTopic:
General
I am trying to add VoiceOver support to my game and have run into an issue where a static text element takes focus away from all other interactions. I have a tutorial scene with one short "static text" accessibility node, but the rest of the gameplay has none. This static text field occupies a small part of the screen and works fine, but I am unable to click on anything else, including my gameplay elements; wherever I tap on the screen, it just re-highlights that static text.
Is there a requirement that all elements use accessibility nodes, i.e. can I not have a mixed setup where some elements lack them?
How can I work around this?
Question 2: What decides which accessibility node gets selected when entering a new UI screen? I have multiple buttons and am observing rather random behavior; a different button is highlighted first each time.
Question 3: The plugin documentation mentions runtime support in Play mode. Are there specific steps to make this work? I can't seem to get it to. I have VoiceOver enabled on macOS, and Unity is set to the macOS (I also tried iOS) platform, but it doesn't seem to do anything. Note that my button and label accessibility nodes work correctly in an iOS build.
Thanks in advance for any help.
In an iOS app, I am using NEAppPushSession and displaying the CallKit incoming-call screen when notifications arrive through NEAppPushDelegate, but I am running into the following problems.
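For context, the reporting path looks roughly like this (a sketch; the provider setup and the payload key are placeholders, not taken from my actual code):

import NetworkExtension
import CallKit

final class PushHandler: NSObject, NEAppPushDelegate {
    private let provider = CXProvider(configuration: CXProviderConfiguration())

    func appPushManager(_ manager: NEAppPushManager,
                        didReceiveIncomingCallWithUserInfo userInfo: [AnyHashable: Any]) {
        let update = CXCallUpdate()
        update.remoteHandle = CXHandle(type: .generic,
                                       value: userInfo["caller"] as? String ?? "Unknown")
        // This is the call that fails with the 4099 error below.
        provider.reportNewIncomingCall(with: UUID(), update: update) { error in
            if let error { print("failed to report call: \(error)") }
        }
    }
}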
8/13
The error below appeared repeatedly in the logs.
I ran the notification-delivery test about 120 times, and this error was logged the whole time.
エラー 2025-08-14 11:27:06.793073 +0900 nesessionmanager NESMAppPushSession[SimplePushDefaultConfiguration:7B7218F3-94B5-4AE5-9B9E-94E176694D02] failed to report incoming call to CallKit, error: Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service named com.apple.callkit.networkextension.messagecontrollerhost was invalidated from this process." UserInfo={NSDebugDescription=The connection to service named com.apple.callkit.networkextension.messagecontrollerhost was invalidated from this process.}
After this error was logged repeatedly, the CallKit incoming-call screen stopped appearing.
However, notification monitoring still appears to be running.
15 hours later, on 8/14:
Perhaps because some time had passed, notifications started arriving again.
But when I ran the notification-delivery test again, the same error was logged.
After about 120 more test notifications, the error was again logged repeatedly and the CallKit incoming-call screen stopped appearing.
However, notification monitoring again appears to still be running.
15 hours later, on 8/15:
Today, no notifications could be received even after 15 hours.
Notification monitoring nevertheless appears to be running.
Restarting the iPhone, cleaning the Xcode build, and uninstalling and reinstalling the app did not bring notifications back.
Strangely, the same failure occurs on devices other than the one where the problem first appeared: they also cannot receive these notifications.
Other notifications come through fine; only our own NEAppPushManager notifications fail.
My questions:
What can I do to get notifications delivered again?
Is this failure caused by producing too many 4099 errors?
What is causing the 4099 error?
Topic:
Accessibility & Inclusion
SubTopic:
General
Tags:
User Notifications
PushKit
CallKit
Push To Talk
I am developing a macOS application that runs exclusively in the menu bar, and I am using a ColorPicker to let the user choose a color. However, the ColorPicker does not work properly from the menu bar. What should I do to resolve this issue?
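For reference, the setup is essentially this (a minimal sketch; the scene and property names are mine):

import SwiftUI

@main
struct MenuBarColorApp: App {
    @State private var color: Color = .red

    var body: some Scene {
        MenuBarExtra("Colors", systemImage: "paintpalette") {
            // The picker renders, but its panel misbehaves from the menu bar.
            ColorPicker("Accent color", selection: $color)
                .padding()
        }
        .menuBarExtraStyle(.window)
    }
}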
Please update Accessibility OS Settings for VoiceOver in iPhone iOS and iPadOS to include frames on the Rotor, and to make web navigation and component gestures easier to find and assign. Please add content to the iPhone and iPad Apple User Guide to use VoiceOver in web navigation with touch gestures.
Specifically... iframes.
There is no clear guidance in Apple documentation for VoiceOver users on iPhone or iPadOS about accessing iframes with touch gestures. A common belief, as written on AppleVis, other blogs, and internet searches, is that iframes in Safari, or a webView in an app, are only reachable with explore by touch.
If explore by touch is the only option for some interactions, that needs to be included in the Apple User Guides. If not, the equivalent VoiceOver touch gestures for interactions that have keyboard commands on the Mac need to be made clear to users.
VoiceOver for Mac includes a default keyboard interaction of VO-Command-F in its extensive User Guide (https://support.apple.com/guide/voiceover/by-images-or-frames-mchlp2740/mac). A user can include a rotor option for web navigation for iframes.
VoiceOver for iPhone and iPad does not include a default swipe gesture assigned to frames. An option is not available for the Rotor.
While the iPhone User Guide notes that gestures can be customized (https://support.apple.com/guide/iphone/customize-gestures-and-keyboard-shortcuts-iph59a8e6fd2/18.0/ios/18.0), it is not obvious that, to add this gesture, "Move to the next frame" is tucked into the advanced navigation commands in the VoiceOver Accessibility settings. At least on my phone, the word "frame" was not searchable, despite the All Commands screen offering a search bar.
I am developing an app using DeviceActivity.
let schedule = DeviceActivitySchedule(
    intervalStart: DateComponents(hour: 0, minute: 0, second: 1),
    intervalEnd: DateComponents(hour: 23, minute: 59, second: 59),
    repeats: true
)
I found that on the second day, intervalDidStart(for: DeviceActivityName) gets called multiple times.
I also tried
let schedule = DeviceActivitySchedule(
    intervalStart: DateComponents(hour: 0, minute: 0, second: 1),
    intervalEnd: DateComponents(hour: 23, minute: 59, second: 59),
    repeats: false
)
and started monitoring for the next day in the intervalDidEnd(for: DeviceActivityName) method, but intervalDidStart(for: DeviceActivityName) still gets called multiple times.
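For reference, the re-arming described above looks roughly like this (a sketch; the monitor subclass name is mine):

import DeviceActivity

class DailyMonitor: DeviceActivityMonitor {
    override func intervalDidEnd(for activity: DeviceActivityName) {
        super.intervalDidEnd(for: activity)
        // Start a fresh non-repeating interval for the next day.
        let schedule = DeviceActivitySchedule(
            intervalStart: DateComponents(hour: 0, minute: 0, second: 1),
            intervalEnd: DateComponents(hour: 23, minute: 59, second: 59),
            repeats: false
        )
        try? DeviceActivityCenter().startMonitoring(activity, during: schedule)
    }
}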
How should I resolve this issue?
I would like to enable the option to resize windows with the Apple Pencil Pro. I tried, but it appears this feature is not enabled.
Topic:
Accessibility & Inclusion
SubTopic:
General
Environment: Xcode 16.2
WidgetKit:
Image(uiImage: UIImage(named: "jp_jump")!)
    .resizable()
    .scaledToFit()
    .frame(width: 58, height: 16)
    .padding(EdgeInsets(top: 0, leading: 16, bottom: 0, trailing: 0))
Loading the local color image "jp_jump" crashes the widget.
Crash info:
Thread 4: EXC_RESOURCE (RESOURCE_TYPE_MEMORY: high watermark memory limit exceeded) (limit=30 MB)
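One common way to stay under the widget memory limit is to downsample with ImageIO instead of decoding the full-size image through UIImage(named:); this is a sketch of that technique, not code from the original post:

import ImageIO
import UIKit

// Decode the image at thumbnail size so the widget never holds the
// full-resolution bitmap in memory.
func downsampledImage(at url: URL, maxPixelSize: CGFloat) -> UIImage? {
    let options: [CFString: Any] = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceShouldCacheImmediately: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
    ]
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, options as CFDictionary)
    else { return nil }
    return UIImage(cgImage: cgImage)
}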
I have a view dynamically overlaid on a UITableView with proper padding (added when certain conditions are met). When VoiceOver focuses on a cell beneath this overlay, the focused element does not scroll into view. I’ve noticed similar behavior in Apple’s first-party Podcasts app.
Please find the attached image for reference. How can I resolve this issue and ensure VoiceOver scrolls the focused cell into view?
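One workaround I can think of is to give the table view matching insets while the overlay is visible, so VoiceOver's scroll-into-view has room to place the focused cell (a sketch, assuming the overlay is pinned to the bottom; the function name is mine):

import UIKit

func overlayDidShow(overlayHeight: CGFloat, tableView: UITableView) {
    // Inset the scrollable area by the overlay's height so focused
    // cells can scroll clear of it.
    tableView.contentInset.bottom = overlayHeight
    tableView.verticalScrollIndicatorInsets.bottom = overlayHeight
}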
My app does not automatically switch languages (voices) in VoiceOver when VoiceOver is on and the screen includes both English and Spanish content. Instead of switching to the correctly accented voice, the content is announced in whatever my manual Voices rotor setting is. I can manually switch the voice in the rotor to make the words intelligible, but my main concern is that language changes are not auto-detected, even though that feature is on in my Settings.
VoiceOver does detect language changes in other apps, so I suspect there are either misplaced or missing accessibilityLanguage strings somewhere in my app. Or is there more to it for localization?
I reached out to the Apple Accessibility team and was directed to open a ticket here, as my question is about the underlying code.
I am a novice developer and primarily an accessibility SME; I expect that when "Detect Languages" is on in the user settings for VoiceOver, the screen reader voice will automatically switch to the correct language and accent. I recognize there is a problem but am not sure where the breakdown is. I would like guidance on how to fix it to relay to my teams.
https://developer.apple.com/documentation/objectivec/nsobject/1615192-accessibilitylanguage
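For what it's worth, the per-element hint from that documentation looks like this in UIKit (a minimal sketch):

import UIKit

// Tag an element so VoiceOver speaks it with a Spanish voice.
let spanishLabel = UILabel()
spanishLabel.text = "Bienvenido a la aplicación"
spanishLabel.accessibilityLanguage = "es-ES" // BCP 47 language identifier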
Topic:
Accessibility & Inclusion
SubTopic:
General
Tags:
Swift
Accessibility
Localization
Internationalization
Hi,
I'm trying to fix a tvOS view for the VoiceOver accessibility feature:
TabView { // 5 tabs
    Text(title)
    Button(play)
    ScrollView { // Live
        LazyHStack { 200 items }
    }
    ScrollView { // Continue watching
        LazyHStack { 500 items }
    }
}
When the view shows up, VoiceOver reads:
"Home tab 1 of 5, Item 2". I'm not sure why it reads "Item 2" from the first cell in the scroll view; maybe because it was just loaded by the LazyHStack.
VoiceOver should only read "Home tab 1 of 5".
When moving focus to the scroll view, it reads:
"Live, Item 1" and, after a slight delay, "Item 1, Item 2, Item 3, Item 4".
When moving focus to the second item, it reads:
"Item 2" and, after a slight delay, "Item 1, Item 2, Item 3, Item 4".
When moving focus to the third item, it reads:
"Item 3" and, after a slight delay, "Item 1, Item 2, Item 3, Item 4".
It should just read what is focused, ideally only
"Live, Item 1, 1 of 200"
and then, after moving focus to item 2,
"Item 2, 2 of 200"
this time without the word "Live", because we are in the same scroll view (the same horizontal list).
Currently the app is unusable. We have visually impaired testers, and this reading of everything on the screen is totally confusing, because users don't know where they are and what is actually focused.
This is a video streaming app, and we are streaming all the time, even on the home page in the background: it binge-plays one item after another, there is usually a never-ending live stream playing, and the user can switch TV channels while we continue to play. VoiceOver should only read what's focused after user interaction.
The original Apple TV app does not do this, so it cannot be caused by some verbose accessibility setting. It correctly reads back only the focused item in scrolling lists.
How do I disable reading content that is not focused?
I tried:
.accessibilityLabel(isFocused ? title : "")
.accessibilityHidden(!isFocused)
.accessibilityHidden(true) - tried at various levels of the view hierarchy
.accessibilityElement(children: .ignore) - even the focused item is not read back by VoiceOver
.accessibilityElement(children: .contain) - tried at various levels of the view hierarchy
.accessibilityElement(children: .combine) - tried at various levels of the view hierarchy
.accessibilityAddTraits(.isHeader) - tried at various levels of the view hierarchy
.accessibilityRemoveTraits(.isHeader) - tried at various levels of the view hierarchy
// the last two were basically an attempt to hack it
.accessibilityRotor("", ranges: []) - another hack I tried on the ScrollView, the LazyHStack, and the top-level view
...and 50+ other attempts at configuring the accessibility modifiers attached to views.
I have watched all the accessibility videos, tried all the sample code projects, and haven't found a solution anywhere; internet searches turned up nothing, and AI didn't help, as it can only provide code that someone else wrote before.
Any idea how to fix this?
Thanks.
Has the NEHotspotNetwork framework removed the fetchAvailableNetworksWithCompletionHandler API? Xcode version: 16.2; iOS SDK: 18.2.
I want to get the list of Wi-Fi networks via the NEHotspotNetwork framework, but I couldn't find any useful information about it.
I am testing the accessibility feature in the Settings app called "Speak Screen". The help text in the Settings app states that swiping down with two fingers will cause the screen content to be spoken. However, I've been unable to get this feature to work. Every time I try the two-finger swipe down, it behaves the same as the single-finger swipe-down gesture; usually this manifests as making scroll views bounce.
I've tried toggling the feature on and off, turning off Reachability, and rebooting my phone, but I can't get the Speak Screen gesture to work. If I access the Speak Screen feature from the "Speech Controller" button, the screen's content is spoken as expected, so I know the feature is enabled; it's just the gesture that doesn't work.
Is there something else I need to do to get this gesture to work? I don't want to tell my users to turn this feature on if I can't verify that the gesture will work with my app.
Hi everybody,
I'm trying to build a QR code scanner and generator app for iOS.
Whenever I try to implement the camera, the app crashes with this message:
This app has crashed because it attempted to access privacy-sensitive data without a usage description. The app's Info.plist must contain an NSCameraUsageDescription key with a string value explaining to the user how the app uses this data.
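For reference, the message is asking for an entry along these lines in the target's Info.plist (the description string here is just an example):

<key>NSCameraUsageDescription</key>
<string>The camera is used to scan QR codes.</string>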
I tried reducing the app to the bare minimum, nothing but the camera, with the same result.
Any ideas?
Thank you and best regards,
Horst Schippers
Topic:
Accessibility & Inclusion
SubTopic:
General
Hello,
I’m in the process of enrolling my business (Carzo Rent A Car, Prishtine, Kosovo) in the Apple Developer Program, but I have been waiting for my D-U-N-S number to be issued.
I submitted the request to Dun & Bradstreet on July 28, 2025 (Case #9142648) and have only received a system-generated email with a tracking ID (#9086421). There has been no further update.
My questions are:
Is there a way for Apple to expedite or provisionally approve my enrollment while the D-U-N-S number is pending?
How long does Apple typically wait for D&B updates before the enrollment is affected?
Are there any alternative steps I can take to avoid further delays?
Thank you for your guidance.
Topic:
Accessibility & Inclusion
SubTopic:
General