Accessibility


Make your apps function for a broad range of users using Accessibility APIs across all Apple platforms.

Posts under Accessibility tag

140 Posts

Post | Replies | Boosts | Views | Activity

Apple is lying about its commitment to accessibility on macOS
I've just received an email from Apple regarding Global Accessibility Awareness Day and some forthcoming sessions promoting their accessibility features. What a joke. For many years, Apple has refused to provide the most basic accessibility requirement on macOS: LET USERS DISABLE ALL NON-CONSENSUAL, UNSOLICITED ANIMATIONS AND OTHER UI CONVULSIONS. The scourge of animations started with macOS Lion. Fortunately, many of them can be disabled through some obscure Terminal commands (that is, if the user is lucky enough to discover them on some obscure internet resource). The "Reduce motion" control in System Settings is a fake option that doesn't do anything. And there are two especially glaring accessibility violations that cannot be disabled:

1. The scroll bar rollover highlight effect introduced in macOS 10.7.3. Every time you move the cursor over a scroll bar, the bar gets highlighted. The result is that the user's attention is drawn to random scroll bars for no reason whatsoever, just because the cursor happens to pass over a bar at some point. HUNDREDS of unnecessary, annoying distractions daily!

2. The expand/collapse animation of NSOutlineView (such as when opening or closing a folder in list view in the Finder, as well as in any other app that uses outline views). It's extremely annoying, distracting, and time-wasting.

All feedback submitted about this through the years remains mostly ignored (except for a few cases where I received ridiculous replies from employees who, apparently, are barely familiar with Macs in general). Apple does NOT care about accessibility. Not only that, but it's obvious that Apple is, in fact, intentionally abusing those users who can't tolerate distracting, time-wasting animations and UI convulsions.
0
0
92
1d
VoiceOver incorrect focus on modal alert
When my macOS Cocoa app displays a modal alert with beginSheetModal(for:completionHandler:), VoiceOver sometimes seems to focus on an "illegal" upper level, where any attempts at navigation will give the unhelpful response "Alert, dialog", until you "drill down" with VO + shift + down or switch apps. After that, things will work as expected. Is this a known bug? Does it happen to anybody else, or am I doing something wrong?
3
1
33
1d
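A minimal sketch of one workaround worth trying for the modal-alert focus issue above, assuming an NSAlert presented as a sheet; posting a layoutChanged notification to nudge VoiceOver is an assumption on my part, not a documented fix.

import AppKit

// Hypothetical helper: present an NSAlert as a sheet, then suggest a focus
// target to VoiceOver. This is a workaround sketch, not a confirmed fix.
func presentAlert(on window: NSWindow) {
    let alert = NSAlert()
    alert.messageText = "Delete item?"
    alert.addButton(withTitle: "Delete")
    alert.addButton(withTitle: "Cancel")

    alert.beginSheetModal(for: window) { response in
        // Handle NSApplication.ModalResponse here.
    }

    // After the sheet is on screen, point VoiceOver at the default button.
    DispatchQueue.main.async {
        if let defaultButton = alert.buttons.first {
            NSAccessibility.post(element: defaultButton, notification: .layoutChanged)
        }
    }
}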
VoiceOver navigation in carousels
Hi all, I’ve got a usability question about accessibility navigation. My app has a lot of carousels (horizontally scrolling lists of content with far more elements than can fit on the screen). Often, these are just images, but sometimes they’re cards with multiple subelements. In our previous implementation, each card was a single accessibility element, and we exposed the subelements as accessibility custom actions. Despite this, users frequently mentioned navigating with VoiceOver as a pain point: it takes a long time to navigate through and past these carousels.

To solve this, I converted my carousels into a single adjustable element, so users can navigate past each one with a single swipe, and they can still reach the individual elements by adjusting the value up and down. I got this advice from this 2018 WWDC talk. Is this still the recommended approach, or is there a newer, preferred way to do this?

Additionally, I had to get a little creative with the second carousel, the one with multiple subelements, some of which are interactive (imagine a card with a description, an upvote button, and a downvote button). Adjustable elements override the VoiceOver gesture for accessibility custom actions, so I can’t expose the individual buttons as actions. Instead, I made each subelement in each card one of the adjustable values. Swiping up goes from description 1 to upvote button 1 to downvote button 1 to description 2, and so on. Double-tapping with VoiceOver performs whatever action the carousel is currently on; so if I adjust the value to the element at index 2 (say, downvote 1), double-tapping triggers the downvote button’s action. Does this make sense? Is there a better way to do this? This seemed to be the best compromise between screen-reader navigation speed, exposing all actions, and the existing UI.
0
1
64
5d
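For reference, a minimal UIKit sketch of the single-adjustable-element pattern the carousel post above describes; the class, the items array, and the flattened label/action pairs are hypothetical.

import UIKit

// Hypothetical carousel container exposed to VoiceOver as one adjustable element.
final class CarouselAccessibilityView: UIView {
    // Flattened items, e.g. "Description 1", "Upvote 1", "Downvote 1", "Description 2", ...
    var items: [(label: String, action: () -> Void)] = []
    private var currentIndex = 0

    override var isAccessibilityElement: Bool {
        get { true }
        set { }
    }

    override var accessibilityTraits: UIAccessibilityTraits {
        get { .adjustable }
        set { }
    }

    override var accessibilityValue: String? {
        get { items.isEmpty ? nil : items[currentIndex].label }
        set { }
    }

    // Swipe up with VoiceOver.
    override func accessibilityIncrement() {
        currentIndex = min(currentIndex + 1, items.count - 1)
        // Scroll the matching card into view here if needed.
    }

    // Swipe down with VoiceOver.
    override func accessibilityDecrement() {
        currentIndex = max(currentIndex - 1, 0)
    }

    // Double-tap activates whichever sub-element is currently selected.
    override func accessibilityActivate() -> Bool {
        guard items.indices.contains(currentIndex) else { return false }
        items[currentIndex].action()
        return true
    }
}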
What is the appropriate accessibility trait for selectable UITableViewCell?
I’m trying to understand the best practice for assigning accessibilityTraits to a UITableViewCell that users can select from a list of options. In Apple’s first-party apps like Settings, I’ve noticed an inconsistent approach—some cells use the Button trait, while others simply announce the label along with the Selected trait when applicable, without any additional role like Button or Adjustable. So my question is: What is the most appropriate accessibility trait to use for a selectable table view cell that updates a selection (like a settings option)? Is using .button the right approach, or should we rely solely on .selected? Is there any user experience guideline from Apple that recommends one over the other? Would love to hear how others handle this for clarity and consistency in VoiceOver behavior.
1
0
45
6d
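A small sketch of one common convention for the selectable-cell question above (a judgment call, not Apple guidance): keep the row a button and toggle the .selected trait to mirror the checkmark. All names below are hypothetical.

import UIKit

// Hypothetical single-selection settings list; `options` and `selectedRow` are assumed model state.
final class OptionsViewController: UITableViewController {
    private let options = ["Small", "Medium", "Large"]
    private var selectedRow = 0

    override func viewDidLoad() {
        super.viewDidLoad()
        tableView.register(UITableViewCell.self, forCellReuseIdentifier: "OptionCell")
    }

    override func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
        options.count
    }

    override func tableView(_ tableView: UITableView,
                            cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        let cell = tableView.dequeueReusableCell(withIdentifier: "OptionCell", for: indexPath)
        let isChosen = indexPath.row == selectedRow

        cell.textLabel?.text = options[indexPath.row]
        cell.accessoryType = isChosen ? .checkmark : .none

        // VoiceOver announces the label plus "Selected" for the chosen option.
        // Whether to include .button as well is a judgment call; Apple's own apps are inconsistent.
        cell.accessibilityTraits = isChosen ? [.button, .selected] : [.button]
        return cell
    }

    override func tableView(_ tableView: UITableView, didSelectRowAt indexPath: IndexPath) {
        selectedRow = indexPath.row
        tableView.reloadData()
    }
}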
Markdown Link in AttributedString Not Focusable with Full Keyboard Access in SwiftUI
I’m encountering an accessibility issue in SwiftUI related to keyboard navigation.

🐞 Problem
When using an AttributedString to display Markdown content in a SwiftUI view (such as a Text view), any links included in the Markdown are not keyboard focusable when Full Keyboard Access is enabled. This means users can’t navigate to or activate the links using the Tab key or other keyboard-only methods.

💻 Platform
iOS version: 16+
Framework: SwiftUI
Device: All tested iPhones and iPads

🧪 Steps to Reproduce
1. Enable Full Keyboard Access in iOS Settings.
2. Run the included SwiftUI Playground or equivalent app using the code below.
3. Try to navigate to the link using Tab or the keyboard arrow keys.
4. Observe that the Markdown link is not reachable via keyboard focus.

🧩 Expected Behavior
The Markdown link should be reachable via keyboard focus. It should be possible to activate the link using Space or Return.

📚 Example code

struct ContentView: View {
    let attributedString: AttributedString

    init() {
        self.attributedString = try! AttributedString(
            markdown: "This is a [test link](https://apple.com) inside an attributed string."
        )
    }

    var body: some View {
        VStack {
            Text("Issue: Attributed Markdown Link Is Not Focusable with full keyboard access")
                .font(.headline)
                .padding()

            Text(attributedString) // The link is not focusable with Full Keyboard Access
                .padding()
                .border(Color.gray, width: 1)

            Text("Expected: The link should be focusable with Full Keyboard Access.")
                .foregroundColor(.red)
                .padding()
        }
    }
}
1
1
34
1w
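One workaround worth testing for the Markdown-link post above, on the assumption (not verified) that a standalone SwiftUI Link participates in Full Keyboard Access focus even when an inline Markdown link inside Text does not.

import SwiftUI

// Hypothetical fallback layout: plain text plus an explicit Link control,
// assuming Link is reachable with Full Keyboard Access.
struct LinkFallbackView: View {
    var body: some View {
        VStack(alignment: .leading, spacing: 8) {
            Text("This is a test link inside an attributed string.")
            Link("Open apple.com", destination: URL(string: "https://apple.com")!)
        }
        .padding()
    }
}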
Unexpected behaviour of hardware keyboard focus in UITests
Hello! I was faced with unexpected behavior of hardware keyboard focus in UITests.

A clear description of the problem
When running UITests on the iOS Simulator with both "Full Keyboard Access" and "Connect Hardware Keyboard" options enabled, there is a noticeable delay between keyboard actions for focus management (like pressing Tab or arrow keys). The delay seems to increase with repeated input and suggests that events are being queued instead of processed immediately. I will describe why I have such an assumption later.

A step-by-step set of instructions to reproduce the problem
1. Launch the iOS Simulator.
2. Enable both "Full Keyboard Access" and "Connect Hardware Keyboard" in the Simulator settings.
3. Run a UITest on a target application (ideally an endless or long-running test).
4. Once the app is launched, press the Tab key several times.
5. Observe the delay in focus movement.
6. Optionally, press the Tab or arrow keys rapidly, then stop the UITest. After stopping, you’ll see a burst of rapid focus changes.

What results you expected
We expected keyboard actions (like Tab) to be handled immediately and the UI focus to update smoothly during UITests.

What results you saw
There was a 4–10 (and more) second delay between pressing keys and seeing a response. All stacked keyboard events (used for managing focus) are performed all at once after stopping the UITest.

The version of Xcode you are using
Xcode: Version 16.3 (16E140)
Simulator: iPhone 16 Pro (iOS 18.4 and 18.1)
Simulator: iPad Pro 11-inch (M4) (iPadOS 17.5)
1
2
82
1w
Using WebSocket for BCI Click Input in VisionOS - FocusState vs. System-Level Limitations
Hi everyone, My team and I are developing an accessibility-focused visionOS app (MindTap) as part of a university project, aiming to support individuals with Locked-In Syndrome using Brain-Computer Interface (BCI) signals to trigger interactions (e.g., tapping) within the Apple Vision Pro environment.

Problem 1: Simulating Eye Tracking in the Simulator
We are testing onHover with Send pointer to the device under I/O > Input in the simulator, and while it mostly works (a bit laggy), we found that onHover won't function on the actual Vision Pro hardware. From what I understand, we should be using FocusState for proper gaze interaction, but testing this requires the physical device. Is there any workaround or official Apple-recommended way to simulate focus-based gaze detection without a real Vision Pro?

Problem 2: WebSocket-triggered "click" doesn't work outside the app
We successfully use a WebSocket to send a custom signal (a "1" from the brain-signal device) to trigger an action inside our app. However, when the user opens a third-party app like Apple News, the WebSocket-triggered "click" no longer works. We suspect this is due to sandbox restrictions or a lack of system-level permissions. Is it possible in any way to:

- Trigger interaction events outside the app using custom input (like BCI via WebSocket)?
- Access system-wide click/tap simulation APIs from within visionOS apps?
- Integrate this with accessibility services (like Voice Control or AssistiveTouch)?

We'd appreciate any official guidance or tips from others building similar accessibility apps with alternative input methods in visionOS. Thanks in advance for any insight you can provide!
0
0
48
2w
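For the in-app half of the WebSocket question above, a minimal URLSessionWebSocketTask receive loop might look like the sketch below; the endpoint URL and the "1" payload convention come from the post, and everything else (class and callback names) is assumed. Note that this can only trigger actions while the app itself is frontmost; it cannot inject taps into other apps.

import Foundation

// Hypothetical BCI signal listener: connects to the device's WebSocket server
// and calls `onTrigger` whenever the payload "1" arrives.
final class BCISignalClient {
    private var task: URLSessionWebSocketTask?
    private let onTrigger: () -> Void

    init(url: URL, onTrigger: @escaping () -> Void) {
        self.onTrigger = onTrigger
        task = URLSession.shared.webSocketTask(with: url)
        task?.resume()
        receiveNext()
    }

    private func receiveNext() {
        task?.receive { [weak self] result in
            if case .success(.string(let text)) = result, text == "1" {
                DispatchQueue.main.async { self?.onTrigger() }
            }
            self?.receiveNext() // keep listening for the next message
        }
    }
}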
PencilKit on visionOS Doesn’t Support Left-Handed Users? How Can We Customize Hand Roles?
I’m building a visionOS app that uses PencilKit for drawing. Currently, PencilKit defaults to using the right hand for drawing and the left hand for panning, with no apparent way to change this behavior. Some of my users are left-handed, and they naturally want to draw with their left hand and pan with their right. However, PencilKit doesn’t seem to support this interaction pattern. Is there a way to customize which hand does what in PencilKit on visionOS? Or have I missed some API or workaround that would allow support for left-handed users?
1
0
45
2w
Add VoiceOver touch gesture guidance for frames/iframes in web views and Safari
Please update Accessibility OS Settings for VoiceOver in iPhone iOS and iPadOS to include frames on the Rotor, and to make web navigation and component gestures easier to find and assign. Please add content to the iPhone and iPad Apple User Guide on using VoiceOver for web navigation with touch gestures. Specifically: iframes.

There is no clear guidance in Apple documentation for VoiceOver users on iPhone or iPadOS for accessing iframes with touch gestures. A common belief, as written on AppleVis, other blogs, and internet searches, is that iframes in Safari or a webView in an app are only available with explore by touch. If explore by touch is the only option for some interactions, that needs to be included in the Apple User Guides. If not, details on touch gestures for VoiceOver equivalent to the keyboard interactions on Mac need to be made clear for users.

VoiceOver for Mac includes a default keyboard interaction of VO-Command-F in its extensive User Guide (https://support.apple.com/guide/voiceover/by-images-or-frames-mchlp2740/mac), and a user can include a rotor option for web navigation by iframes. VoiceOver for iPhone and iPad does not include a default swipe gesture assigned to frames, and the option is not available for the Rotor. While there is iPhone User Guide guidance that gestures can be customized (https://support.apple.com/guide/iphone/customize-gestures-and-keyboard-shortcuts-iph59a8e6fd2/18.0/ios/18.0), it is not clear that, to add this gesture, "Move to the next frame" is tucked into the advanced navigation commands in the VoiceOver Accessibility Settings. At least on my phone, the word "frame" was not searchable despite the All Commands screen having a search bar.
1
0
70
2w
Handling VoiceOver Focus When Screen Changes (Push, Present, and SplitViewController)
I have some questions about how VoiceOver handles focus when the screen updates. When a new UIViewController is pushed onto a UINavigationController or presented modally, how does VoiceOver decide which element to focus on? Is there a way to control or customize this behavior? In a UISplitViewController, when an item is selected in the primary view controller, the focus should shift to the relevant content in the secondary view controller. How can we ensure that VoiceOver correctly moves focus to the right element in the secondary panel?
0
0
79
2w
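The usual tool for the focus question above is a UIAccessibility notification: post .screenChanged (or .layoutChanged for smaller updates) with the element that should receive focus as the argument. A minimal sketch, with hypothetical view names:

import UIKit

final class DetailViewController: UIViewController {
    private let headerLabel = UILabel()

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // Ask VoiceOver to move focus to the header after a push or present,
        // instead of letting it pick the first element it finds.
        UIAccessibility.post(notification: .screenChanged, argument: headerLabel)
    }
}

The same call, made in the secondary view controller after a selection in the primary, is one reasonable way to steer focus in a UISplitViewController; exactly where to post it is a design choice rather than documented behavior.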
VisionOS - Gamepad steals focus
I am developing a visionOS app for controlling an underwater ROV. I have ornaments with telemetry and buttons around a central video feed view. I have custom button mappings, such as "A" for locking the depth of the drone. However, when I look at buttons or certain ornaments, my custom gamepad logic is prevented from running. This means that when a SwiftUI Button gains focus on visionOS, pressing the controller’s A button triggers the system’s default “click” on that Button rather than my custom buttonA handler. Essentially, the system’s focus interception is stealing my A-press events and preventing my custom gamepad logic from running. Is there a way to disable the built-in gamepad interaction and only allow my custom gamepad mappings?
1
0
83
2w
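For context on the gamepad post above, registering a custom button-A handler via GameController looks roughly like the sketch below (the class and function names are hypothetical). It does not by itself stop the system from also treating the press as a click on the focused SwiftUI Button, which is the behavior the post is asking how to disable.

import GameController

// Hypothetical setup: attach a custom handler for button A on the connected
// gamepad (e.g. "lock ROV depth"). Whether the system still routes the press
// to the focused SwiftUI control is exactly the open question in the post.
final class GamepadInput {
    private var observer: NSObjectProtocol?

    func start(lockDepth: @escaping () -> Void) {
        observer = NotificationCenter.default.addObserver(
            forName: .GCControllerDidConnect, object: nil, queue: .main
        ) { _ in
            Self.bindButtonA(lockDepth: lockDepth)
        }
        Self.bindButtonA(lockDepth: lockDepth) // in case a controller is already connected
    }

    private static func bindButtonA(lockDepth: @escaping () -> Void) {
        guard let gamepad = GCController.controllers().first?.extendedGamepad else { return }
        gamepad.buttonA.pressedChangedHandler = { _, _, pressed in
            if pressed { lockDepth() }
        }
    }
}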
VoiceOver Not Scrolling to Focused TableView Cell
I have a view dynamically overlaid on a UITableView with proper padding (added when certain conditions are met). When VoiceOver focuses on a cell beneath this overlay, the focused element does not scroll into view. I’ve noticed similar behavior in Apple’s first-party Podcasts app. Please find the attached image for reference. How can I resolve this issue and ensure VoiceOver scrolls the focused cell into view?
1
0
51
3w
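One thing worth checking for the overlay post above (a guess, not a confirmed fix): make sure the table view knows about the overlay by giving it a matching content inset, so both manual scrolling and VoiceOver's scroll-into-view have room to clear the overlay. A minimal sketch with hypothetical names:

import UIKit

// Hypothetical: called whenever the overlay is shown or hidden.
// `overlayView` covers the bottom portion of `tableView`.
func updateTableInsets(tableView: UITableView, overlayView: UIView, overlayVisible: Bool) {
    let bottomInset = overlayVisible ? overlayView.bounds.height : 0
    tableView.contentInset.bottom = bottomInset
    tableView.verticalScrollIndicatorInsets.bottom = bottomInset
}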
VoiceOver Text Recognition Announcing Hidden Labels
I have a UIImageView as the background of a custom UIView subclass. The image itself does not contain any text. On top of this image view, I have added two UILabels. To improve accessibility, I converted the entire view into a single accessibility element and set a proper accessibilityLabel. Additionally, I disabled accessibility for the UIImageView and the labels by setting isAccessibilityElement = false. However, when VoiceOver's Accessibility Recognition's Text Recognition feature is enabled, VoiceOver still detects and announces the text inside the UILabels at the end after reading my custom accessibility properties. This text should not be announced. It seems that VoiceOver treats the UILabel content as part of the UIImageView. Additionally, when using the Explore Image rotor action, the entire subview is recognized as a single image. Is this the expected behavior? If so, is there a way to disable VoiceOver’s text recognition for this view while keeping custom accessibility intact?

class BackgroundLabelView: UIView {
    private let backgroundImageView = UIImageView()
    private let backgroundImageView2 = UIImageView()
    private let titleLabel = UILabel()
    private let subtitleLabel = UILabel()

    override init(frame: CGRect) {
        super.init(frame: frame)
        setupView()
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        setupView()
        configureAceesibility()
    }

    private func configureAceesibility() {
        backgroundImageView.isAccessibilityElement = false
        backgroundImageView2.isAccessibilityElement = false
        titleLabel.isAccessibilityElement = false
        subtitleLabel.isAccessibilityElement = false
        isAccessibilityElement = true
        accessibilityTraits = .button
    }

    func configure(backgroundImage: UIImage?, title: String, subtitle: String) {
        backgroundImageView.image = backgroundImage
        titleLabel.text = title
        subtitleLabel.text = subtitle
        accessibilityLabel = "Holiday Offer ," + title + "," + subtitle
    }

    private func setupView() {
        backgroundImageView2.contentMode = .scaleAspectFill
        backgroundImageView2.clipsToBounds = true
        backgroundImageView2.translatesAutoresizingMaskIntoConstraints = false
        backgroundImageView2.image = UIImage(resource: .bannerfestival)
        addSubview(backgroundImageView2)

        backgroundImageView.contentMode = .scaleAspectFit
        backgroundImageView.clipsToBounds = true
        backgroundImageView.translatesAutoresizingMaskIntoConstraints = false
        addSubview(backgroundImageView)

        titleLabel.font = UIFont.systemFont(ofSize: 18, weight: .bold)
        titleLabel.textColor = .white
        titleLabel.translatesAutoresizingMaskIntoConstraints = false
        titleLabel.numberOfLines = 0
        addSubview(titleLabel)

        subtitleLabel.font = UIFont.systemFont(ofSize: 14, weight: .regular)
        subtitleLabel.textColor = .white.withAlphaComponent(0.8)
        subtitleLabel.translatesAutoresizingMaskIntoConstraints = false
        subtitleLabel.numberOfLines = 0
        addSubview(subtitleLabel)

        NSLayoutConstraint.activate([
            backgroundImageView2.leadingAnchor.constraint(equalTo: leadingAnchor),
            backgroundImageView2.trailingAnchor.constraint(equalTo: trailingAnchor),
            backgroundImageView2.heightAnchor.constraint(equalToConstant: 200),
            backgroundImageView.centerYAnchor.constraint(equalTo: centerYAnchor),
            backgroundImageView.topAnchor.constraint(equalTo: topAnchor),
            backgroundImageView.leadingAnchor.constraint(greaterThanOrEqualTo: leadingAnchor),
            backgroundImageView.trailingAnchor.constraint(equalTo: trailingAnchor),
            backgroundImageView.bottomAnchor.constraint(equalTo: bottomAnchor),
            titleLabel.leadingAnchor.constraint(equalTo: leadingAnchor, constant: 16),
            titleLabel.trailingAnchor.constraint(lessThanOrEqualTo: centerXAnchor),
            titleLabel.bottomAnchor.constraint(equalTo: centerYAnchor, constant: -4),
            subtitleLabel.leadingAnchor.constraint(equalTo: leadingAnchor, constant: 16),
            subtitleLabel.trailingAnchor.constraint(lessThanOrEqualTo: centerXAnchor),
            subtitleLabel.topAnchor.constraint(equalTo: centerYAnchor, constant: 4)
        ])
    }

    override func layoutSubviews() {
        super.layoutSubviews()
        backgroundImageView.layer.cornerRadius = layer.cornerRadius
    }
}
2
0
66
3w
accessibilityActivationPoint Not Working When Set Directly on UITableViewCell
I’m trying to set the accessibilityActivationPoint directly on a UITableViewCell so that VoiceOver activates a specific button inside the cell. However, this approach doesn’t seem to work. Instead, when I override the accessibilityActivationPoint property inside the UITableViewCell subclass and return the desired point, it works as expected.

Why doesn’t setting accessibilityActivationPoint directly on the cell work, but overriding it inside the cell does? Is there a recommended approach for handling this scenario?

The following approach works:

override var accessibilityActivationPoint: CGPoint {
    get {
        return convert(toggleSwitch.center, to: nil)
    }
    set {
        super.accessibilityActivationPoint = newValue
    }
}

but setting the activation point directly does not work:

private func configureAccessibility() {
    isAccessibilityElement = true
    accessibilityLabel = titleLabel.text
    accessibilityTraits = .toggleButton
    accessibilityActivationPoint = self.convert(toggleSwitch.center, to: self)
    accessibilityValue = toggleSwitch.accessibilityValue
}
2
0
74
2w
accessibilityElements excludes the unlisted subviews – How to Fix?
I have a parent view containing 10 subviews. To control the VoiceOver navigation order, I set only a few elements in accessibilityElements. However, the remaining elements are not being focused or are completely inaccessible. Is this the expected behavior? If I only specify a subset of elements in accessibilityElements, does it exclude the rest? What’s the best way to ensure all elements remain accessible while customising the order?
3
0
91
1w
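A short sketch for the accessibilityElements question above: once the array is set, it becomes the complete list of elements VoiceOver exposes for that container, so anything omitted is effectively hidden. To reorder without hiding, list every element in the desired order (names below are hypothetical).

import UIKit

// Hypothetical container with subviews laid out elsewhere; the point is that
// every element the user should reach must appear in the array.
final class CardView: UIView {
    let titleLabel = UILabel()
    let priceLabel = UILabel()
    let detailLabel = UILabel()
    let buyButton = UIButton(type: .system)

    func configureAccessibilityOrder() {
        // Custom VoiceOver order: title, price, buy button, then the details.
        accessibilityElements = [titleLabel, priceLabel, buyButton, detailLabel]
    }
}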
Connections application with 4,000 pre-sign ups rejected unfairly - 4.3
For the last 2 years, our team at Panda has had one goal in mind: to change the failing connection-application industry. The business model is severely flawed, evidenced by the decline of users at Match Group etc. (all public info). We are building the only connections app in the market without paid features – "We Don't Play Games”. This in itself revolutionizes a space that currently commodifies human connection; true connections aren’t forged through super-likes, platinum memberships, and similar pay-to-win models, where users who don’t pay are unfairly disadvantaged.

Key Differentiators:
- Never having paid features.
- 50/50 Male-Female Ratio: Our app will ensure a balanced male-to-female ratio, something not found in other apps, especially in countries like India, where dating apps are dominated by men. This helps create a healthier, more equitable user experience for all genders. In a country like India, how can any connections app succeed with 99.9% men and 0.1% women?
- Panda Duos: A first-of-its-kind feature where two sets of best friends can match with each other – an industry first that no other major connections app offers.

These elements, along with the app's core ethos, make Panda unique in an otherwise saturated market. The traditional models used by Match Group/Bumble are failing, as shown in their earnings reports, because they rely on a pay-to-win approach that doesn't deliver real value. Given these unique aspects, having received 4,550+ pre-sign-ups, and our backing by a top VC, we strongly believe that Panda will offer an entirely different experience to users and remake a failing industry. This is not what Apple stands for; it goes against fairness and undermines the trust and respect that Apple should have as the only app store for iOS phones.
3
0
67
Mar ’25
Assistive Access Bugs
Hi! I have noticed a few glitches, as well as some overall unfortunate cons, with Assistive Access mode:

- Alarms, timers, stopwatch, etc. do not sound or alert. However, I have an infant monitor app and I do get that sound alert, so I know it is possible. Do I need to download a separate alarm app for it to work?
- Cannot make FaceTime calls with favorite contacts.
- Find My iPhone cannot jump to the Maps app.
- Camera cannot zoom in or out.
- Photos cannot be deleted, edited, or shared in a shared album in the Photos app.
- Photos/videos cannot be sent in Messages.
- Spotify cannot be accessed from the Lock Screen.
- Apps do not stay open if you lock the phone screen or leave it untouched for too long (it auto-locks).
- There is no flashlight option. I downloaded an app to have this feature, but without being touched the screen will lock, which shuts off the flashlight feature in the app until I unlock the phone again.
1
0
62
Mar ’25