Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.

Posts under Accessibility & Inclusion topic

Post · Replies · Boosts · Views · Activity

iOS 18 Local network prohibited
The usage description (NSLocalNetworkUsageDescription) is configured in Info.plist, and the app's Local Network permission is turned on, but when the app requests a LAN device endpoint it still fails with: Error Domain=NSURLErrorDomain Code=-1009 "The Internet connection appears to be offline." UserInfo={_kCFStreamErrorCodeKey=50, NSUnderlyingError=0x302d79d40 {Error Domain=kCFErrorDomainCFNetwork Code=-1009 "(null)" UserInfo={_NSURLErrorNWPathKey=unsatisfied (Local network prohibited), interface: en0[802.11], ipv4, uses wifi, _kCFStreamErrorCodeKey=50, _kCFStreamErrorDomainKey=1}}, _NSURLErrorFailingURLSessionTaskErrorKey=LocalDataTask .<1>, _NSURLErrorRelatedURLSessionTaskErrorKey=( "LocalDataTask .<1>" ), NSLocalizedDescription=The Internet connection appears to be offline., NSErrorFailingURLStringKey=http://192.168.1.1:80/***, NSErrorFailingURLKey=http://192.168.1.1:80/***, _kCFStreamErrorDomainKey=1}
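A minimal sketch of the failing call, for anyone wanting to reproduce it (the endpoint path is a placeholder, not the poster's redacted URL): checking for NSURLErrorNotConnectedToInternet and printing the underlying error reveals whether the NWPath reports "Local network prohibited", which points at the per-app Local Network privacy grant rather than connectivity.

```swift
import Foundation

// Repro sketch. Assumes Info.plist already contains NSLocalNetworkUsageDescription
// (and NSBonjourServices if Bonjour discovery is involved). The path below is a
// placeholder for the poster's redacted endpoint.
let url = URL(string: "http://192.168.1.1:80/status")!
let task = URLSession.shared.dataTask(with: url) { _, _, error in
    guard let error = error as NSError? else {
        print("Request succeeded; local network access is granted.")
        return
    }
    if error.domain == NSURLErrorDomain,
       error.code == NSURLErrorNotConnectedToInternet {
        // The underlying error carries the NWPath status. "Local network
        // prohibited" there means the request was blocked by the per-app
        // Local Network setting; toggling it off and on in Settings (or
        // reinstalling the app) typically forces a fresh TCC prompt.
        print("Blocked: \(error.userInfo[NSUnderlyingErrorKey] ?? error)")
    }
}
task.resume()
```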
2
2
865
Jan ’25
Getting precise text position with Swift for macOS
Hey there! Hope you are starting the year with great joy. My situation: I'm building a new product that is based on detecting certain text on screen in real time. The product targets only the Mac and is built with Swift. My problem: I need to get the exact position of a text element with the Apple Accessibility API, but I can't figure it out. I managed to get the AXUIElement where the text is placed, but its position is too broad and off target. My discoveries so far: I've tried OCR, but it is too slow for what I'm building, so the only possible way I can think of is the Accessibility API. Thank you in advance.
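For reference, the Accessibility API can report bounds for a specific character range rather than the whole element, via the parameterized attribute kAXBoundsForRangeParameterizedAttribute. A minimal sketch, assuming `textElement` is the AXUIElement already obtained and the app has been granted Accessibility permission:

```swift
import ApplicationServices

// Sketch: screen bounds of a character range inside a text element.
// `textElement` is assumed to be the AXUIElement found earlier.
func bounds(ofCharacters range: CFRange, in textElement: AXUIElement) -> CGRect? {
    var range = range
    guard let rangeValue = AXValueCreate(.cfRange, &range) else { return nil }
    var result: CFTypeRef?
    let err = AXUIElementCopyParameterizedAttributeValue(
        textElement,
        kAXBoundsForRangeParameterizedAttribute as CFString,
        rangeValue,
        &result
    )
    guard err == .success, let value = result else { return nil }
    var rect = CGRect.zero
    guard AXValueGetValue(value as! AXValue, .cgRect, &rect) else { return nil }
    return rect // screen coordinates, origin at the top-left of the display
}
```

Not every app exposes AXBoundsForRange on its text elements, so a nil result here may simply mean the target app doesn't implement it.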
2
0
526
Jan ’25
Apple Vision Pro - Homonymous Hemianopia
Individuals who have had a stroke can end up with vision impairments, specifically homonymous hemianopia, which basically means the individual has lost sight in (for example) the left half of both eyes. I'm interested in understanding whether it would be possible to help individuals with this vision impairment by providing an accessibility configuration within Apple Vision Pro that would first determine an individual's field of view (possibly by showing a field of dots across the entire "screen" and having the individual look at each dot and click). Based on the results of this field-of-view test, the system would determine how the screen is presented to the user from then on. My mom (82 years old) had a stroke recently and was diagnosed with homonymous hemianopia. She lived on her iPhone and would love to get back the ability to text message, use Facebook, and order items from Amazon. Please advise if you believe Apple Vision Pro would be capable of helping in this area with the suggested development, or share other thoughts.
1
0
464
Jan ’25
New iPadOS 18.3 beta troubleshooting
Hi Apple, I've been having a problem with my iPad Pro (5th generation). Since updating my iPad, it has been acting strangely. To be specific: it keeps shutting down at random, the widgets turn white, my screen keeps going black when I go into apps, and it keeps exiting out of apps. Also, the new Siri is very slow and won't answer when I say "Hey Siri" except on random occasions. Please help me fix this problem, because I need my iPad for studying. Thank you.
1
0
704
Jan ’25
iPhone 14 Plus
I am facing an issue with the back camera on my iPhone 14 Plus: it shows a black screen. My iPhone was manufactured between April 2023 and April 2024, but it is still not eligible for the Apple service program, even though my phone has the same issue. Why is it not eligible?
1
1
321
Jan ’25
Issue with “Vocal Shortcuts” disabling after one day of use
Hello Apple Community, I'm experiencing an issue with "Vocal Shortcuts" on iOS. I created a trigger in Vocal Shortcuts to run a specific shortcut, and it works perfectly on the first day. However, by the next day, the voice command stops functioning entirely. To make it work again, I have to disable and then re-enable Vocal Shortcuts in the settings. I've tested this on multiple devices (iPhone 11, iPhone 13, and iPhone X), all running the latest iOS version, and the same problem occurs on each one. Is there any additional configuration needed, or could this be a bug? Any advice or insights would be greatly appreciated! Thank you in advance,
2
0
592
Jan ’25
Add custom emoji tags
Allow the user to add their own tags to the default emoji tags. For instance, this emoji, for me, is nonna: 🤌🏻. My efficiency would improve immensely if I could search for it as the “Nonna” emoji, rather than searching for nonna, remembering it doesn’t exist, trying the search for other things it might be called, realising I don’t know what it is, then having to scroll through all the hand emojis twice to find it. 🤌🏻🤞🏼👌
1
0
476
Jan ’25
AudioDriverKit read operation sync error
When my USB interface is recording, synchronization does not work well. I found that in_io_buffer_frame_size is the same for every IO cycle; it is not synchronized to UpdateCurrentZeroTimestamp. So does the AudioDriverKit read operation not work the same way as the write operation? What is the correct way to sync with incoming USB data? If I only play audio, it works fine with UpdateCurrentZeroTimestamp. Thanks!
5
0
614
Jan ’25
Making VoiceOver more concise on a SwiftUI Menu
A common UI idiom in Apple's first-party iOS apps is a circle icon with three dots in the upper right of the screen, which serves as a pop-up menu of more options. Some examples include: Apple Music (Library tab), Photos (Album view), and Reminders. In all these cases, VoiceOver reads this element as "More, Button". In my SwiftUI app, I've implemented a visually identical button:

```swift
Menu {
    // Button for Menu Item 1
    // Button for Menu Item 2
    // ...
} label: {
    Image(systemName: "ellipsis.circle")
        .accessibilityHidden(true)
}
.accessibilityLabel("More")
```

However, the VoiceOver output in my app is much more verbose. It speaks "More, Button, Pop Up Button, Double Tap To Activate The Picker". Any guidance on how to make this more concise, in line with the apps mentioned above?
2
1
478
Jan ’25
How to lookup keybinding translation across input sources
I have an application that binds a menu item to trigger on ⌘]. When the US input source is set, I press ⌘] to trigger that item. However, when I switch the input source to QWERTZ (German), the OS automatically changes the trigger to ⌘Ä; it seems to translate keystrokes for different input sources. The problem is that I also render the key bindings in a window in my application, and my application is not aware of this translation. Furthermore, I have other keyboard shortcuts in my application that are not bound to menu items, and I want to make sure those get translated too. Does AppKit expose a way to look up what a keystroke will be when macOS translates it, i.e. to look up ⌘Ä from ⌘] when the current layout is QWERTZ? I can't find anything in Apple's docs. I tried converting a character to a virtual key code based on the US layout and then mapping it back to a character based on the QWERTZ layout. That doesn't give the same result, because it converts ] to + instead; it seems to be based on physical key location, which differs from how the key bindings are handled. Update: I notice similar behavior in VS Code's menu bar, e.g. in its "Terminal" menu; switching to German changes some bindings. This does not occur at all in iTerm's menu bar, I suspect because its menu items are specified in a different way: xib files with hard-coded key equivalents.
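For reference, the position-based mapping the poster describes (the one that turns ] into + rather than Ä) can be reproduced with Carbon's UCKeyTranslate. This is a sketch of that approach only; the menu-item remapping AppKit performs internally does not appear to be exposed as public API.

```swift
import Carbon.HIToolbox

// Sketch: the character a virtual key code produces under the current
// keyboard layout. This is the physical-position translation the poster
// describes, not AppKit's layout-aware key-equivalent remapping.
func character(forKeyCode keyCode: UInt16) -> String? {
    guard let source = TISCopyCurrentKeyboardLayoutInputSource()?.takeRetainedValue(),
          let rawLayoutData = TISGetInputSourceProperty(source, kTISPropertyUnicodeKeyLayoutData)
    else { return nil }
    let layoutData = Unmanaged<CFData>.fromOpaque(rawLayoutData).takeUnretainedValue() as Data

    var deadKeyState: UInt32 = 0
    var chars = [UniChar](repeating: 0, count: 4)
    var length = 0
    let status = layoutData.withUnsafeBytes { buffer -> OSStatus in
        let layout = buffer.bindMemory(to: UCKeyboardLayout.self).baseAddress!
        return UCKeyTranslate(layout, keyCode, UInt16(kUCKeyActionDisplay),
                              0, // no modifiers; pass (shiftKey >> 8) & 0xFF etc. if needed
                              UInt32(LMGetKbdType()),
                              OptionBits(kUCKeyTranslateNoDeadKeysMask),
                              &deadKeyState, chars.count, &length, &chars)
    }
    guard status == noErr, length > 0 else { return nil }
    return String(utf16CodeUnits: chars, count: length)
}
```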
0
0
366
Jan ’25
Programmatically setting accessibility focus broken?
Hello! I'm trying to improve the accessibility of a UIKit login form in our iOS app. If an error occurs, an error message is shown in a label that is hidden by default. For our VoiceOver users, I want to move the focus to the error message label so that VoiceOver reads out the error message. I'm trying to achieve this using UIAccessibility.post, but try as I might, it does not work. To better understand the problem, I created a very simple app which shows a button and a label (always visible), and on pressing the button, I post an accessibility notification: UIAccessibility.post(notification: .layoutChanged, argument: label) What I expect to happen is for the focus to move from the button to the label. What happens instead is that the focus stays on the button and VoiceOver reads out the button's label again. So it seems to process the notification but ignore the argument. Am I misunderstanding how accessibility notifications work, or is this simply broken at the moment? I am testing this with my iPhone on the current iOS version, 18.2.1. By the way, the more modern variant leads to the same result: AccessibilityNotification.LayoutChanged(label).post()
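A couple of details that commonly matter here, offered as a sketch rather than a confirmed fix (the class and property names are stand-ins for the poster's code): the target must itself be an accessibility element, and posting in the same run-loop pass as a layout change can cause the argument to be dropped, so deferring the post is a frequent workaround.

```swift
import UIKit

// Sketch of a common workaround, not a confirmed fix for iOS 18.2.1.
// `LoginViewController` and `errorLabel` are hypothetical stand-ins.
final class LoginViewController: UIViewController {
    let errorLabel = UILabel()

    func showError(_ message: String) {
        errorLabel.text = message
        errorLabel.isHidden = false
        errorLabel.isAccessibilityElement = true // UILabel defaults to true; set explicitly
        // Defer so the post lands after the current layout pass has settled.
        DispatchQueue.main.async { [weak self] in
            guard let self else { return }
            UIAccessibility.post(notification: .layoutChanged, argument: self.errorLabel)
        }
    }
}
```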
3
0
488
Jan ’25
VoiceOver spells word letter by letter
We currently have an odd issue with VoiceOver spelling a word letter by letter, while the same word is spoken as a whole for other items. The app is in German. I have a view in SwiftUI whose button traits are removed, then a label "Start Tab 1 von 5" is added. "Tab" is spoken as a whole word here; all fine. If I change the label to "Tab-Schaltfläche" or, for example, "SimplyGo Tab 3 von 5", then "Tab" is spoken as "T A B", letter by letter. Is there a way to force VoiceOver to speak it as a whole word?
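One API lever that exists in this area, though it is not confirmed to cover this particular case (VoiceOver's heuristics for spelling out words are not publicly documented): SwiftUI's speechSpellsOutCharacters modifier (iOS 15+), set explicitly to false on the labeled element. The view and label text below are stand-ins from the post.

```swift
import SwiftUI

// Sketch, not a confirmed fix: explicitly opt the element's spoken
// content out of spell-out behavior. `TabItemView` is hypothetical.
struct TabItemView: View {
    var body: some View {
        Text("SimplyGo")
            .accessibilityElement(children: .ignore)
            .accessibilityLabel("SimplyGo Tab 3 von 5")
            .speechSpellsOutCharacters(false) // iOS 15+; defaults to false, stated explicitly
    }
}
```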
4
0
1.2k
Jan ’25
PerformAccessibilityAudit and sufficientElementDescription clarification
Hi, I am writing in the hope of receiving some clarification about the rationale of the audit type sufficientElementDescription, in the context of the Accessibility Audit API. Please see my test below, and another example in the context of Xcode, where the strings visible in the UI are also set as the accessibility labels of their respective elements. Thanks for your help!
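For context, a minimal version of the kind of test the post refers to (the original screenshots did not carry over into this listing): restricting performAccessibilityAudit to the sufficientElementDescription audit type.

```swift
import XCTest

// Minimal sketch of an audit limited to element-description issues.
// The test fails if the audit flags any element.
final class DescriptionAuditTests: XCTestCase {
    func testSufficientElementDescriptions() throws {
        let app = XCUIApplication()
        app.launch()
        // Xcode 15+ API; other audit types (.contrast, .hitRegion, ...) can be combined.
        try app.performAccessibilityAudit(for: .sufficientElementDescription)
    }
}
```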
1
0
471
Jan ’25
Unity Apple Accessibility Plugin: VoiceOver focus issue
I am trying to implement VoiceOver in my game and have encountered an issue where a static text element takes focus away from all other interactions. I have a tutorial scene with one short static-text accessibility node, but the rest of the gameplay has none. This static text field occupies a small part of the screen and works fine, but I am not able to click on anything else, including any of my gameplay elements; wherever I click on the screen, it just re-highlights that static text. Is there a requirement for all elements to use accessibility nodes, so that a mixed setup where some lack them is not allowed? How can I get around this? Question 2: What decides which accessibility node gets selected when entering a new UI screen? I have multiple buttons and am observing rather random behaviour; each time, a different button is highlighted first. Question 3: The plugin documentation mentions runtime support in Play mode. Are there any specific steps needed for this to work? I can't seem to get it working. I have VoiceOver enabled on macOS, and Unity is on the macOS platform (I also tried iOS), but it doesn't seem to do anything. Note that my button and label accessibility nodes work correctly in the iOS build. Thanks in advance for any help.
1
0
420
Jan ’25
Unable to Accept Invite
I am getting this issue when trying to accept an invite to a new test version of our app: "Unable to Accept Invite. This invitation cannot be accepted because your Apple Account, xxxxxxxx.me.com, has already been associated to this app." Can you help please?
4
5
644
Jan ’25
tvOS: GCController does not send button press events for "Button A" and "Button Center" when VoiceOver is On
When turning VoiceOver ON, GCController does not send button press events for "Button A" and "Button Center". This happens when using the Siri Remote 2nd generation (with dedicated arrow buttons on the ring around the center button) and also when using the iOS remote. I didn't test it on the old Siri Remote 1st generation, which has a touchpad and no arrow buttons. Example:

```swift
gameController.microGamepad?.allButtons.forEach { button in
    button.valueChangedHandler = { [weak self] _, _, _ in
        self?.buttonHandler(gameController: gameController, button: button)
    }
}

private func buttonHandler(gameController: GCController, button: GCControllerButtonInput) {
    print("BUTTON: Pressed \(button.description) isPressed=\(button.isPressed) isTouched=\(button.isTouched)")
}
```

VoiceOver ON (incorrect behavior):

```
BUTTON: Pressed Direction Pad Left (value: 0.030, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Direction Pad Down (value: 0.079, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Direction Pad Left (value: 0.000, pressed: 0) isPressed=false isTouched=false
BUTTON: Pressed Direction Pad Down (value: 0.000, pressed: 0) isPressed=false isTouched=false
```

VoiceOver OFF (correct behavior):

```
BUTTON: Pressed Direction Pad Left (value: 0.137, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Direction Pad Up (value: 0.078, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Button A (value: 1.000, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Button Center (value: 1.000, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Button A (value: 0.000, pressed: 0) isPressed=false isTouched=false
BUTTON: Pressed Button Center (value: 0.000, pressed: 0) isPressed=false isTouched=false
BUTTON: Pressed Direction Pad Left (value: 0.000, pressed: 0) isPressed=false isTouched=false
BUTTON: Pressed Direction Pad Up (value: 0.000, pressed: 0) isPressed=false isTouched=false
```

For detection I could use Direction Pad Left/Right/Up/Down, treating a value between -0.7 and +0.7 as a center-button press, as I already do on the old Siri Remote where I need to distinguish the center button from the arrows (switching TV channels with Up/Down and skipping forward/back with Left/Right), but for the new Siri Remote that would be an unnecessary workaround. Does anybody know why the center/select button is not detected when VoiceOver is ON? Is there another way of detecting it using GCController? I don't want to use SwiftUI's onTapGesture for this one particular case. Is this an unexpected bug in the tvOS APIs, or is there some specific reason why the center button is not handled by GCController when VoiceOver is ON? Thanks.
0
0
604
Jan ’25
Haptics seem to stop working in iOS 18.3 (22D5055b)
I checked the release notes for the latest beta, and there doesn't seem to be a fix for this. Basically, the vibrations you receive when you long-press a message to react, or hold down on an app on the Home Screen, seem to stop working after a while. This issue recurs randomly. Steps to reproduce: I'm not fully sure, but I'm on an iPhone 16 Pro Max running the iOS 18.3 developer beta described in the title. I have the default haptics enabled, whereby you receive a vibration when you long-press a message in iMessage or Messenger, and also when you long-press an app on the Home Screen. These seem to stop working after a while, along with all other vibrations apart from calls and notifications. The only workaround is to restart the iPhone entirely. Anyone else facing the same?
2
1
1.3k
Jan ’25