Accessibility


Make your apps function for a broad range of users using Accessibility APIs across all Apple platforms.

Accessibility Documentation

Posts under Accessibility tag

126 Posts
Post not yet marked as solved
5 Replies
589 Views
Accessibility Inspector no longer works with Xcode 13.3. I upgraded to Xcode 13.3.1, but the problem persists: Accessibility Inspector fails to capture elements when pointing at the graphical interface of the simulator.
Post not yet marked as solved
1 Reply
42 Views
Hello! In short: add a way to use the text prediction from the Accessibility Keyboard while typing on the regular keyboard (for example, by remapping the number-row keys). Explanation: macOS's on-screen Accessibility Keyboard has some great features for people with accessibility needs: it stays static on the screen, presents the typed text in a big font, and suggests words. The problem is that the word suggestions can't be used from the physical keyboard. Some of us (accessibility users) can benefit from text prediction while using a keyboard, as in any IDE (for programmers) or on the iPhone keyboard, and we are more comfortable using a keyboard than a mouse. By the way, Microsoft Windows's implementation is lacking, as it requires pressing "up arrow + Enter", which is small, far from the center of the keyboard, and therefore hard to reach while typing fast; but at least there is something :-) In that case a keyboard macro can help remap the number row to "up + up + ... + Enter". I have tried to develop this myself without success: my problem is that I can't find a way to interact with the Accessibility Keyboard; remapping keys to run a script is the easier part. Thank you for helping make Apple more accessible.
Posted by nadcohen.
Post marked as solved
2 Replies
569 Views
Until iOS 15, navigating to the next element worked fine by swiping right. Since iOS 15, mainly just before the bottom of a screen (tested on an iPhone 12 Pro), the navigation is blocked and appears to reverse: you can then suddenly reach the next element by swiping left. For example, at the bottom of one of the screens that shows this behavior, we have a stack view containing 3 buttons. After focusing button 1, navigation is blocked, and I have to swipe left to reach buttons 2 and 3. Before iOS 15 I could swipe right to reach all three buttons. Should I file a bug with Apple, or did I miss something?
Post not yet marked as solved
3 Replies
182 Views
How can we assign unique accessibility identifiers to the URLs in a UITextView? In Accessibility Inspector I can see the URLs as separate elements, but I am not able to set accessibility identifiers for them. The accessibility identifiers are needed for automation testing. In the screenshots added here, I have a UITextView that contains URLs.
Post not yet marked as solved
0 Replies
53 Views
I have a device connected via USB that I want to control remotely. Is there an API or command I can use to control it while VoiceOver is on? Is there any way to do this, perhaps via usbmuxd or some other mechanism?
Posted by MC_UFTM.
Post not yet marked as solved
1 Reply
74 Views
I'm using NSAccessibilityAnnouncementRequestedNotification with NSAccessibilityPriorityMedium to announce incoming, unsolicited text in a text game, and sometimes VoiceOver moves on to the next message without fully reading out the first. How can I get VoiceOver to finish reading the current message before moving on to the next? For context, I'm building this accessibility work as part of Mudlet (mudlet.org), an open-source MUD client.
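One workaround, sketched below under stated assumptions, is to serialize the announcements yourself: queue incoming messages and only hand the next one to VoiceOver once you estimate the previous one has finished (raising the priority to NSAccessibilityPriorityHigh may also help, though I'm not certain of its exact interruption semantics). The queue logic here is pure Swift with the actual posting injected as a closure; in a real app that closure would post the announcement notification, and `announcementFinished()` would be driven by, say, a timer scaled to message length — both of those wirings, and the type name, are my assumptions, not code from the post.

```swift
/// Serializes announcement strings so each one is handed to VoiceOver
/// only after the previous one has (presumably) finished speaking.
/// The `post` closure is injected; in a real app it would post
/// NSAccessibilityAnnouncementRequestedNotification for the message.
final class AnnouncementQueue {
    private var pending: [String] = []
    private var speaking = false
    private let post: (String) -> Void

    init(post: @escaping (String) -> Void) {
        self.post = post
    }

    /// Adds a message; it is posted immediately if nothing is speaking.
    func enqueue(_ message: String) {
        pending.append(message)
        speakNextIfIdle()
    }

    /// Call when you estimate the previous announcement finished
    /// (e.g. from a timer scaled to the message length).
    func announcementFinished() {
        speaking = false
        speakNextIfIdle()
    }

    private func speakNextIfIdle() {
        guard !speaking, !pending.isEmpty else { return }
        speaking = true
        post(pending.removeFirst())
    }
}
```

The key point is that a second message arriving mid-announcement waits in `pending` instead of being posted on top of the first.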
Post not yet marked as solved
1 Reply
327 Views
Is there a good alternative to UILargeContentViewerInteraction - https://developer.apple.com/documentation/uikit/uilargecontentviewerinteraction/ in SwiftUI? I know that TabBar - https://developer.apple.com/documentation/swiftui/tabview has support for the large content viewer, but I'm not sure how to implement it in a custom control. Any ideas? I've tried adding UILargeContentViewerInteraction to a UIKit wrapper, but I must've been doing that wrong because all it did was interrupt long presses. Thanks!
Posted by huwilk.
Post not yet marked as solved
0 Replies
94 Views
As I currently live in React Native Hell, I like to flesh out all my native iOS demos and samples to the max, including things like accessibility. Recently I wrote a very simple demo containing a map, and I stumbled upon some issues I was unable to resolve. I think they present very general use cases, so I would be happy if any of you had ideas. The condensed source code for the issues can be found on GitHub.

Issue 1: The Phantom Overlay
To reproduce: run the app on a device with VoiceOver on, then swipe right to get to the annotations.
Expected result: the title of the annotation is read.
Actual result: the title of the annotation is read twice.
What I know: for every annotation on the map view there is also an overlay, an MKCircle, generated by an MKCircleRenderer. When this overlay is not present, the title is (correctly) only read once.
What I have tried: in ViewController.swift, lines 54 and 92, I have set both the overlay's and the renderer's isAccessibilityElement property to false. This does not fix the issue (probably because neither of them is the actual view). The overlay should never be an accessible element; any information should be encoded in the annotation (e.g. "There is a 10 m region around this marker").

Issue 2: The Unknown Trait
While it is correct that the title of the annotation is read, there is no indication that the annotation can be clicked or interacted with. I have set annotationView.accessibilityTraits = [.button], but this does not change anything. My expectation would be "Cologne Cathedral, Button" or a similar hint that the item is clickable.

Issue 3: The Unreachable Callout
With VoiceOver active, click on an annotation. I have taken some hints from Stack Overflow and tried to disable accessibility on the annotation and enable it on the callout. This makes the callout reachable somehow, but it is absolutely not obvious if you cannot see the screen. How can I indicate to the VoiceOver user that a callout is now being shown?

A Working Extra: The Annotation Rotor
The app also contains a custom rotor to go through the annotations one by one, without also reading the default points of interest on the map. Interestingly (or maybe rather as expected), the title of the annotation is correctly read only once.

I would be extremely happy to get some feedback on these issues; it sounds like most of them could be rather common. Thanks! Alex
Posted by below.
Post marked as solved
2 Replies
93 Views
How do I access the value in Settings → Accessibility → Double-tap Timeout? More details: I'm making a game with a SpriteView in SwiftUI that needs to react immediately to a touch while also detecting and responding to double, triple, and quadruple taps. To do this, I have the following:

```swift
var touch: some Gesture {
    var lastTouchTime = Date.now
    var tapCounter = 1
    return DragGesture(minimumDistance: 0, coordinateSpace: .local)
        .onChanged { touch in
            // Respond immediately to touch.
        }
        .onEnded { touch in
            if lastTouchTime.timeIntervalSinceNow > -0.25 {
                tapCounter += 1
            } else {
                tapCounter = 1
            }
            switch tapCounter {
            case 1:
                break
            case 2:
                break // Respond to double-tap.
            case 3:
                break // Respond to triple-tap.
            default:
                break // Respond to quadruple tap.
                      // Quintuple, sextuple, etc. taps also resolve here, which is fine.
            }
            lastTouchTime = Date.now
        }
}
```

I would like to substitute 0.25 with the user's value from Accessibility settings, but I do not know how to access it. The following does not work:

```swift
.gesture(doubleTap)
.gesture(singleTap)
```

SwiftUI waits for the double-tap to fail before reacting to the single tap, which is not what I need, and stacking triple and quadruple taps on top of this just fails.
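As a side note on the tap-counting part (this does not answer the settings question — as far as I know there is no public API that exposes the user's double-tap timeout, so the interval below remains a hard-coded stand-in): the counting logic can be pulled out of the gesture handler into a pure value type, which makes it testable in isolation. `MultiTapCounter` is my naming, not code from the post.

```swift
import Foundation

/// Pure multi-tap counting logic, extracted from the gesture handler.
/// `timeout` stands in for the user's double-tap interval.
struct MultiTapCounter {
    let timeout: TimeInterval
    private(set) var count = 0
    private var lastTap: Date? = nil

    init(timeout: TimeInterval) {
        self.timeout = timeout
    }

    /// Registers a tap at `time` and returns the running tap count:
    /// taps within `timeout` of the previous tap extend the sequence,
    /// otherwise the count resets to 1.
    mutating func registerTap(at time: Date = Date()) -> Int {
        if let last = lastTap, time.timeIntervalSince(last) <= timeout {
            count += 1
        } else {
            count = 1
        }
        lastTap = time
        return count
    }
}
```

The gesture's `.onEnded` closure would then just call `registerTap(at:)` and switch on the result.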
Post not yet marked as solved
3 Replies
1.5k Views
Hi all, we are using an HTML input of type date in our hybrid application, which launches the native date picker in the app. As we need the app to be accessible, we chose the HTML date type. However, in certain instances we want to restrict date selection to the present date only. We used the max attribute to achieve this, but for some reason the date picker is not respecting the attribute and allows the user to select future dates. We are in a tricky position, since we do not want to use any library and the production date is near. Below is the markup we used to replicate the behavior:

```html
<form>
  <label for="party">Choose your preferred party date:</label>
  <input type="date" name="party" min="2017-04-01" max="2017-04-30" />
</form>
```

Any help would be appreciated. Thanks in advance.
Post not yet marked as solved
0 Replies
93 Views
I'm working on a Mac app with functionality similar to the Rocket macOS app. Whenever the user types ":" outside of my app, I show an NSPopover with an NSTableView near the cursor. The problem I'm facing is that the moment I show the NSPopover, it becomes the first responder and the user can't continue typing. So I tried this: I showed the NSPopover without making it the first responder, and used NSEvent.addGlobalMonitorForEvents(matching: .keyDown, handler:) to listen for up-arrow, down-arrow, and Enter events to navigate and select results. The downside of this approach: say I'm composing a new mail; the cursor still responds to the up/down arrow events and eventually moves to other lines. Attaching GIFs for reference showing the functionality I'm expecting versus what my app does. Note: watch the cursor position to see what goes wrong.
Post not yet marked as solved
0 Replies
94 Views
Hi, I'm developing an iOS phone-call app for enterprise release. The iPhone has an auto-answer option for incoming phone calls. I want to programmatically control the auto-answer option from my Objective-C (or Swift) code. Is there any API to set the iPhone's auto-answer option, excluding jailbreaks? I also wonder whether controlling auto-answer like this would be an invasion of privacy.
Posted by ucap.
Post not yet marked as solved
0 Replies
94 Views
I'm working on a Mac app that uses the Accessibility APIs to get the cursor position from the focused element (which can belong to any app) and shows a window near that position. I checked the window positioning in native apps like Xcode, Notes, and Messages, and tried web apps as well; everything works fine. When I checked the same on the Slack desktop app, I hit an issue: I always get an X origin of 0 and a Y origin of 982 (equal to my screen's height, NSScreen.main?.frame.height). It seems to break on Electron apps. What am I missing? Do we need to account for anything else to handle Electron apps? Attaching my code for reference:

```swift
extension AXUIElement {
    func getCursorRect() -> CGRect? {
        guard let cursorPosition = cursorPosition else { return nil }
        var cfRange = CFRange(location: cursorPosition, length: 1)
        guard let axval = AXValueCreate(.cfRange, &cfRange) else { return nil }
        var bounds: CFTypeRef?
        AXUIElementCopyParameterizedAttributeValue(
            self,
            kAXBoundsForRangeParameterizedAttribute as CFString,
            axval,
            &bounds)
        var cursorRect: CGRect = .zero
        guard let bounds = bounds else { return nil }
        AXValueGetValue(bounds as! AXValue, .cgRect, &cursorRect)
        return cursorRect
    }
}
```
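One general pitfall worth checking here, sketched below (it may or may not be the cause of the Slack/Electron behavior): the Accessibility API reports bounds in a top-left-origin screen coordinate space, while Cocoa positions windows with a bottom-left origin, so a conversion like the following is usually needed before placing an NSWindow at the reported rect. The function name is mine, not from the post.

```swift
import Foundation

/// Converts a rect from the Accessibility API's top-left-origin screen
/// coordinates into Cocoa's bottom-left-origin coordinates for a screen
/// of the given height. The x, width, and height are unchanged; only the
/// y origin is flipped (measuring from the bottom edge of the rect).
func cocoaRect(fromAXRect ax: CGRect, screenHeight: CGFloat) -> CGRect {
    CGRect(x: ax.origin.x,
           y: screenHeight - ax.origin.y - ax.height,
           width: ax.width,
           height: ax.height)
}
```

In an app you would pass `NSScreen.screens[0].frame.height` (the primary screen, which anchors the global coordinate space) as `screenHeight`.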
Post marked as solved
1 Reply
194 Views
I'm working on a small text-snippet / lorem-ipsum app as a side project. The idea is that whenever and wherever the user types "lorem10", I'd like to print/paste 10 random lorem-ipsum words. E.g. "lorem10 " → "Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do". For that to be possible I need to:

1. Programmatically press the Delete key to remove the trigger string ("lorem10").
2. Programmatically press Cmd+V to paste the result string.

This is possible, even in the sandbox, but it requires accessibility permission. For instance, I can simulate a Delete key press like this:

```swift
func delete() {
    let eventSource = CGEventSource(stateID: .combinedSessionState)
    let keyDownEvent = CGEvent(
        keyboardEventSource: eventSource,
        virtualKey: CGKeyCode(51),
        keyDown: true)
    let keyUpEvent = CGEvent(
        keyboardEventSource: eventSource,
        virtualKey: CGKeyCode(51),
        keyDown: false)
    let loc = CGEventTapLocation.cghidEventTap
    // Triggers the system's default accessibility-access pop-up.
    keyDownEvent?.post(tap: loc)
    keyUpEvent?.post(tap: loc)
}
```

My question is essentially whether this is allowed in the Mac App Store, because requesting accessibility permission like this is not allowed in the sandbox:

```swift
func getPermission() {
    AXIsProcessTrustedWithOptions(
        [kAXTrustedCheckOptionPrompt.takeUnretainedValue(): true] as CFDictionary)
}
```

But it seems I can simulate one short Shift or Cmd key press, trigger the pop-up inside a sandboxed app, and get around this. Is this a bug? I really hope I can release my app in the Mac App Store, but I want to be sure I'm not relying on a bug that might get fixed in the near future.
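Separately from the key-simulation question, the "loremN" expansion itself is pure logic that can be sketched and tested in isolation. Everything below (the word list, the parsing scheme, the function name) is an illustrative assumption, not code from the post:

```swift
import Foundation

// A fixed word source for illustration; a real app might randomize.
let loremSource = [
    "Lorem", "ipsum", "dolor", "sit", "amet,", "consectetur",
    "adipiscing", "elit,", "sed", "do", "eiusmod", "tempor"
]

/// Parses a trigger like "lorem10" and returns that many lorem-ipsum
/// words, or nil if the string is not a valid lorem trigger.
func expandLoremTrigger(_ trigger: String) -> String? {
    guard trigger.hasPrefix("lorem"),
          let count = Int(trigger.dropFirst("lorem".count)),
          count > 0
    else { return nil }
    // Cycle through the source list to reach `count` words.
    let words = (0..<count).map { loremSource[$0 % loremSource.count] }
    return words.joined(separator: " ")
}
```

With this split, only the Delete/Cmd+V simulation remains privileged; the expansion is plain string handling.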
Posted by T1Daniel.
Post marked as solved
4 Replies
237 Views
I'm developing a desktop application and need the equivalent of a live region to notify visually impaired users of changes on the screen. The documentation only talks about live regions in the context of a web page. Does this feature work for desktop applications as well?
Post not yet marked as solved
1 Reply
152 Views
Recently I was using TikTok and messing with my phone. I have the iOS 16 update and an iPhone 13. I saw a random bar with a scissors icon and an arrow, as well as other icons I can't remember. I have no idea how I triggered this bar to appear, and I have no idea what it is. I've been searching the internet and found no answer. Please help me find it!
Post not yet marked as solved
1 Reply
211 Views
Hi folks, I'm really happy you have made this, and would love to chat more on the subject if that's your jam. However, I am struggling to integrate your plugin with my project. I have updated to Unity 2020.3.36f1 (Unity 2020.3.33f1 does not open on my laptop; it seems others had similar issues). The plugin compiles correctly in my Unity project, and I have referenced AccessibilitySettings.PreferredContentSizeMultiplier. Unity builds the Xcode project fine, but I cannot build in Xcode because of a series of errors like this:

```
Undefined symbols for architecture arm64:
  "__UnityAX_registerAccessibilityPreferredContentSizeCategoryDidChangeNotification", referenced from:
      _AccessibilitySettings_RegisterCallbacks_m24D66FBB224A583EB09114081A38990B392CC1F4 in Apple.Accessibility.o
      _AccessibilitySettings__UnityAX_registerAccessibilityPreferredContentSizeCategoryDidChangeNotification_mAE514A0A5C03E7E91AC797D9FB1E52650D30AB1E in Apple.Accessibility.o
      (maybe you meant: _AccessibilitySettings__UnityAX_registerAccessibilityPreferredContentSizeCategoryDidChangeNotification_mAE514A0A5C03E7E91AC797D9FB1E52650D30AB1E)
  "__UnityAX_registerAccessibilityIsSwitchControlRunningDidChangeNotification", referenced from:
      _AccessibilitySettings_RegisterCallbacks_m24D66FBB224A583EB09114081A38990B392CC1F4 in Apple.Accessibility.o
```

I suspect I am not linking a library I should be; however, adding the Accessibility framework to Embedded Frameworks or to the UnityFramework has not resolved the issue. Could you please point me in the right direction? All the best, Le-Roy
Posted by le-roy.
Post marked as solved
3 Replies
408 Views
Is it possible to create a sandboxed app that uses accessibility permission? And if so, how do I ask the user for that permission in a way that is allowed in the App Store? I'm creating a small menu-bar app, and my current (rejected) solution is a pop-up with a link to Security & Privacy > Accessibility that asks the user to manually add the app to the list and check the checkbox. This works in the sandbox. Reason for rejection: "Specifically, your app requires to grant accessibility access, but once we opened the accessibility settings, your app was not listed." I know it's not listed there and has to be added manually, but it's the only solution I've found to this issue. Is there perhaps any way to add the app there programmatically? I'm a bit confused, since I've seen other apps in the App Store that work the same way, where you have to add the app to the list manually, e.g. Flycut. I know about the alternative below, which is not allowed in sandboxed apps; it also adds the app to the accessibility list automagically:

```swift
func getPermission() {
    AXIsProcessTrustedWithOptions(
        [kAXTrustedCheckOptionPrompt.takeUnretainedValue(): true] as CFDictionary)
}
```

Does anyone have a solution for this? Best regards, Daniel
Posted by T1Daniel.