Accessibility


Make your apps function for a broad range of users using Accessibility APIs across all Apple platforms.


Posts under Accessibility tag

134 Posts
Post not yet marked as solved
2 Replies
1k Views
I'm not sure how a phone number should be read by VoiceOver. Is there a particular format or API that I can use to read phone numbers in my app? At the moment I'm just assigning the phone number as a String to a UILabel and VoiceOver reads the accessibilityLabel.
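For anyone exploring this: a common approach (a sketch below, not an official recipe — the helper name and formatting are my own) is either to separate the digits so VoiceOver pauses between them, or, on iOS 13+, to use the accessibilitySpeechSpellOut attributed-string key so VoiceOver reads the number character by character without changing what is displayed:

```swift
import UIKit

// Hypothetical helper: space-separate the characters so VoiceOver
// reads "5 5 5" instead of "five hundred fifty-five".
func spokenPhoneNumber(_ raw: String) -> String {
    raw.map { String($0) }.joined(separator: " ")
}

let label = UILabel()
label.text = "(555) 123-4567"
// iOS 13+: ask VoiceOver to spell the label out character by character.
label.accessibilityAttributedLabel = NSAttributedString(
    string: label.text ?? "",
    attributes: [.accessibilitySpeechSpellOut: true])
```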
Posted by
Post not yet marked as solved
2 Replies
2.9k Views
In our app, we display contacts in a UITableView. Let's say I have 300 contacts in my Address Book, and all of them are displayed in this table. Below the table, I have a UIButton to perform some action, such as inviting the selected contacts. With VoiceOver enabled, when I get to the UITableView, it doesn't let me proceed to the UIButton unless I swipe through all 300 contacts. Is there a way to override this and make it friendlier to visually impaired users?
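One workaround that is often suggested for this pattern (sketched here with placeholder views, not the poster's actual code) is to set the container's accessibilityElements explicitly, so the button is reachable without traversing every row:

```swift
import UIKit

class ContactsViewController: UIViewController {
    let tableView = UITableView()
    let inviteButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(tableView)
        view.addSubview(inviteButton)
        // Listing the button before the table means VoiceOver users
        // encounter it immediately, instead of only after all 300 rows.
        view.accessibilityElements = [inviteButton, tableView]
    }
}
```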
Posted by
Post not yet marked as solved
11 Replies
27k Views
Hi, I am developing an app in React Native, for which I need Xcode. However, when I start my Expo developer tools (Metro Bundler) and click "Run iOS Simulator", I always get the same message in the VS Code terminal: Xcode needs to be installed (don't worry, you won't have to use it), would you like to continue to the App Store? But I already have Xcode installed. When I type Y, it takes me to the App Store, where I see the option to open Xcode because it is already installed. Please guide me through this, because I am not able to see my app on iOS devices.
Posted by
Post not yet marked as solved
5 Replies
3.2k Views
We are currently experiencing a usability issue in our app, and we discovered the same issue for sites in Safari. While using VoiceOver in iOS 13.3+, we found that VoiceOver skips all tables that use a caption. This occurs when a user swipes to read the contents of the page. We also observed that when using the rotor and choosing tables, it will not recognize that there is a table on the page. This has been reproduced by multiple users on different devices. Our testing also covered VoiceOver on macOS Catalina, where it worked as expected for all tables we tested. Has anyone else come across this issue?
Posted by
Post not yet marked as solved
1 Reply
1.2k Views
VoiceOver announces the accessibility information for an element twice when I shift focus to that element. I set the focus using: UIAccessibility.post(notification: .screenChanged, argument: accessibilityElement) This is invoked at the dismissal of a modal view, after the accessibility value of the element has been changed. When the element receives focus, VoiceOver announces the element and its old value, then announces the element and its new value. Is there a way to prevent VoiceOver from announcing the element the first time? I only want the announcement to happen after the new value has been set.
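One possible workaround (an untested sketch; the helper name and the 0.1-second delay are my own guesses, not documented values) is to update the value first and delay the focus post slightly, giving VoiceOver time to pick up the new value so the element is only announced once:

```swift
import UIKit

// Hypothetical helper called at modal dismissal. The value is changed
// before focus moves, and the focus notification is deferred so the
// readout reflects the new value rather than the stale one.
func refocus(_ element: UIView, settingValueTo newValue: String) {
    element.accessibilityValue = newValue
    DispatchQueue.main.asyncAfter(deadline: .now() + 0.1) {
        UIAccessibility.post(notification: .screenChanged, argument: element)
    }
}
```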
Posted by
Post not yet marked as solved
1 Reply
893 Views
Hey guys, I have an NSAttributedString in my app (created from HTML). I assign this string to a UITextView. I would like certain parts of that text (all the headlines) to be marked with a 'header' accessibility trait so that VoiceOver can identify them properly. I was under the impression that I could just use accessibilityTextHeadingLevel to do so, but the text in the given range is still exposed with the 'text' accessibility trait:

var myString = NSMutableAttributedString(...)
let range = NSRange(location: 0, length: 44)
myString.addAttribute(NSAttributedString.Key.accessibilityTextHeadingLevel, value: 1, range: range)

How is accessibilityTextHeadingLevel supposed to work?
Posted by
Post not yet marked as solved
1 Reply
977 Views
System Version: macOS Monterey 12.3.1. My app has a button which, when clicked, shows a popover and makes a button inside the popover the popover window's firstResponder. When VoiceOver is not on, I am able to close the popover with the Escape key (handled by the NSPopover's contentView). But when I turn VoiceOver on, pressing Escape closes both the popover and its parent window, and debugging showed that the popover's sender received the key event rather than the popover (in the sender's window, pressing Escape closes the window). The problem disappears when I change the popover's behavior to .applicationDefined (it still occurs for .transient and .semitransient), but I would still like to know why VoiceOver affects the behavior of NSPopover.
Posted by
Post marked as solved
5 Replies
899 Views
I'm developing a desktop application and need the equivalent of a live region to notify visually impaired users of changes on the screen. The documentation only talks about live regions in the context of a web page - does this feature work for desktop applications as well?
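For a Cocoa app, the closest equivalent to a web live region appears to be posting an announcement through NSAccessibility (a sketch; the element you post from and the priority level are choices you would tune for your app):

```swift
import AppKit

// Ask VoiceOver to speak `message` without moving focus — roughly
// what an aria-live region does on the web.
func announce(_ message: String, from element: Any) {
    NSAccessibility.post(
        element: element,
        notification: .announcementRequested,
        userInfo: [
            .announcement: message,
            .priority: NSAccessibilityPriorityLevel.high.rawValue
        ])
}
```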
Posted by
Post not yet marked as solved
1 Reply
1k Views
I have been setting up macOS color filters via "System Preferences" -> "Accessibility" -> "Display" -> "Color Filters". When I do, the filter effect only shows up on two of the four monitors attached to my Mac Pro. (The two that work are a Samsung LC49G95T and a Dell S3221QS; the two that do not are both HP2509s.) I have filed a bug report with Apple on this, but it occurs to me that it might be helpful to add other users' experiences with monitors that do and do not "work" with color filters. Has anyone else seen this phenomenon? Alternatively, does anyone have any suggestions for getting the 2509s to show the filtering?
Posted by
Post not yet marked as solved
1 Reply
543 Views
Hi, in my project I try to use UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, A) to focus on A, but it ends up calling accessibilityApplyScrollContent:sendScrollStatus:animateWithDuration:animationCurve:, which triggers the scrolling method of A's superview (a scroll view). I don't understand what is going on and would like to know how this works. I would be very pleased if you could reply.
Posted by
Post marked as solved
1 Reply
1.3k Views
I'm working on a small text snippet / lorem ipsum app as a side project, and the idea is that whenever and wherever the user types "lorem10", I'd like to print/paste 10 random lorem ipsum words. E.g. "lorem10 " -> "Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do". For that to be possible I need to: 1. Programmatically press the Delete key to remove the trigger string ("lorem10"). 2. Programmatically press Cmd+V to paste the result string. This is possible, even in the sandbox! But it requires accessibility permission. For instance, I can simulate a Delete key press like this:

func delete() {
    let eventSource = CGEventSource(stateID: .combinedSessionState)
    let keyDownEvent = CGEvent(
        keyboardEventSource: eventSource,
        virtualKey: CGKeyCode(51),
        keyDown: true)
    let keyUpEvent = CGEvent(
        keyboardEventSource: eventSource,
        virtualKey: CGKeyCode(51),
        keyDown: false)
    let loc = CGEventTapLocation.cghidEventTap
    // Triggers the system's default accessibility access pop-up
    keyDownEvent?.post(tap: loc)
    keyUpEvent?.post(tap: loc)
}

My question is essentially whether this is allowed in the Mac App Store, because requesting accessibility permission like this is not allowed in the sandbox:

func getPermission() {
    AXIsProcessTrustedWithOptions([kAXTrustedCheckOptionPrompt.takeUnretainedValue(): true] as CFDictionary)
}

But it seems I can simulate one short Shift or Cmd key press, trigger the pop-up inside a sandboxed app, and get around this. Is this a bug? I really hope I can release my app in the Mac App Store, but I want to be sure I'm not relying on a bug that might get fixed in the near future.
Posted by
Post not yet marked as solved
3 Replies
1.5k Views
Hi! I am trying to use the Apple Unity Plugins to add VoiceOver support to my game. I have built the plugins, and followed along with the video shown at https://developer.apple.com/videos/play/wwdc2022/10151/ However, I am seeing odd behaviour that I hope someone could help me with. For what it's worth, I am using Unity 2021.3.10f1 to build my game. I used the recommended version of Unity (2020.3.33f1) to build the plugins. Odd behaviour #1: I cannot work out what determines the order of AccessibilityNodes. If I have three GameObjects at the same level in the hierarchy, all marked with the trait button, using VoiceOver's single swipe right to go to each element in turn, seems to go to the last element first, then the first, then the second. At first I wondered if it was related to alphabetical order (GameObjects are called 'Music', 'SFX', 'Delete'), but renaming them to prefix numbers to the start of the names doesn't change the order. I have also tried removing the Label from the AccessibilityNode, but this did not help. Odd behaviour #1.5: The video says that Buttons don't need labels, and that if you use standard Unity UI controls, the label should be picked up automatically. I am not seeing this behaviour. I have a GameObject with a Button component, with a child GameObject with a TextMeshPro component. The label is not being picked up automatically. Odd behaviour #2: Once I have a button selected, I am unable to trigger it, with a VoiceOver 1-finger double tap. It just seems to repeat the label of the button. The GameObject I am trying to trigger just has a Button component, and the configuration looks the same as in the video. Does anyone have any successful experience using this plugin and could give me any insight? Thanks, Stephen
Posted by
Post not yet marked as solved
1 Reply
578 Views
I have a question about using swipe functionality to confirm a booking, similar to sliding to answer a phone call (or the old slide-to-unlock). I am curious whether this harms accessibility, say, if the user has VoiceOver activated. I would like to know if anybody has experience developing this specific UI pattern — whether it proved too cumbersome for users with a disability, or whether it is fine to use.
Posted by
Post not yet marked as solved
2 Replies
343 Views
Hi, When a user scrolls on a page, VoiceOver announces "Page [current page] of [total pages]". Sometimes, when scrolling in certain apps (such as Photos), VoiceOver announces one more page than there actually is. For example, it will announce "Page 3 of 3" on the last page when there are only 2 pages in total.
Posted by
Post not yet marked as solved
1 Reply
455 Views
I’m trying to find documentation on how to properly support a Bluetooth keyboard in an iOS app. I have a custom UIView which consists of several subviews, but the Bluetooth keyboard is unable to target some of the subviews (using the arrow keys on the keyboard), and I want to know what I’m doing wrong. I’ve tried to find official documentation from Apple, to no avail.
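One thing worth checking (a sketch under the assumption that the app targets iOS 15+ and Full Keyboard Access or the UIKit focus system is driving the arrow keys): the focus system only visits views that report themselves as focusable, and custom UIView subclasses don't by default:

```swift
import UIKit

// Custom subviews don't participate in keyboard focus out of the box;
// overriding canBecomeFocused opts them in so arrow-key navigation
// with a hardware keyboard can reach them.
class FocusableTileView: UIView {
    override var canBecomeFocused: Bool { true }
}
```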
Posted by
Post not yet marked as solved
6 Replies
2.6k Views
I've found multiple leaks in AVSpeechSynthesizer which are plaguing my users, who are complaining of crashes because of them. I've created a feedback item (FB12212129) with a sample project attached which demonstrates one of the leaks. I'm hoping an engineer notices this. The only way I've had my feedback noticed in the past is by both creating a feedback item AND posting on the forums, so here's my forum post. Help is much appreciated!
Posted by
Post not yet marked as solved
0 Replies
507 Views
I can't seem to execute a VoiceOver command with JXA. The following AppleScript works:

tell application "VoiceOver"
    tell commander to perform command "item chooser"
end tell

However, the equivalent script in JXA throws a mystery error:

vo = Application("VoiceOver")
vo.commander.performCommand("item chooser")
// Error 6: An error occurred.

I'd really appreciate any help with this! Thanks so much!
Posted by
Post not yet marked as solved
0 Replies
241 Views
My app has raw MathML expressions that I'd like to pass to the iOS accessibility framework, so that ideally the user can use the rotor to change granularity when using VoiceOver (such as small/medium/large expressions) to navigate within an equation. This example shows the experience I am aiming for, in Apple Books: https://www.perkins.org/resource/how-read-math-expressions-voiceover-ios-device/. However, I couldn't find the best way to expose a raw MathML expression to the accessibility framework. AFAIK, the only way to pass accessibility information to the iOS accessibility framework is by setting the attributes, including label, value, traits, hint, etc. But by experimenting with them, I couldn't get VoiceOver to offer the small/medium/large expressions in the rotor. Is there a way to pass a raw MathML expression to the iOS accessibility framework? Thanks.
Posted by