Accessibility


Make your apps function for a broad range of users using Accessibility APIs across all Apple platforms.

Posts under Accessibility tag

134 Posts
Sort by: Post | Replies | Boosts | Views | Activity

Voice Over accessibility: UITableView
In our app, we display contacts in a UITableView. Let's say I have 300 contacts in my AddressBook, and all of them will be displayed in this table. Below this table, I have a UIButton to perform some action, like inviting the selected contacts. With VoiceOver enabled, when I get to the UITableView, it doesn't let me proceed to the UIButton unless I swipe through all 300 contacts. Is there a solution to override this and make it friendlier to visually impaired users?
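A minimal sketch of one possible mitigation, not an answer from the thread: reorder the container's accessibility elements so the invite button comes before the table. The view controller and outlet names below are hypothetical.

    import UIKit

    class ContactsViewController: UIViewController {
        // Hypothetical outlets standing in for the contacts table and the invite button.
        @IBOutlet var contactsTableView: UITableView!
        @IBOutlet var inviteButton: UIButton!

        override func viewDidLoad() {
            super.viewDidLoad()
            // Put the invite button before the table in the VoiceOver order so a
            // user can reach it without swiping through every contact row first.
            view.accessibilityElements = [inviteButton as Any, contactsTableView as Any]
        }
    }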
2
0
3.1k
Aug ’23
accessibilityTextHeadingLevel not working in NSAttributedString
Hey guys, I have an NSAttributedString within my app (created from HTML). I assign this string to a UITextView. I would like certain parts of that text to be marked with a 'header' accessibility trait (all the headlines in that text) so that VoiceOver can identify them properly. I was under the impression that I could just use accessibilityTextHeadingLevel to do so, but the text in that given range is still set up with the 'text' accessibility trait:

    let myString = NSMutableAttributedString(...)
    let range = NSRange(location: 0, length: 44)
    myString.addAttribute(NSAttributedString.Key.accessibilityTextHeadingLevel, value: 1, range: range)

How is accessibilityTextHeadingLevel supposed to work?
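A minimal sketch, and an assumption rather than a documented fix: the accessibility string attributes are usually honored when they appear in an element's accessibilityAttributedLabel, so mirroring the attributed string there is one thing to try. The textView parameter and string contents are made up for illustration.

    import UIKit

    // Sketch: mark the first line as a level-1 heading and mirror the attributed
    // string into the accessibility label so VoiceOver can see the attribute.
    func markHeading(in textView: UITextView) {
        let text = NSMutableAttributedString(string: "Section title\nBody text follows here.")
        let headingRange = NSRange(location: 0, length: 13)
        text.addAttribute(.accessibilityTextHeadingLevel, value: 1, range: headingRange)

        textView.attributedText = text
        textView.accessibilityAttributedLabel = text
    }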
3
0
1k
Oct ’23
Programmatically press "delete" or "cmd + v" in sandboxed app
I'm working on a small text snippet / lorem ipsum app as a side project, and the idea is that whenever and wherever the user types "lorem10", I'd like to print/paste 10 random lorem ipsum words. E.g. "lorem10 " -> ("Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do"). For that to be possible I need to: 1. programmatically press the "delete" key to remove the trigger string ("lorem10"), and 2. programmatically press "cmd + v" to paste the result string. This is possible, even in the sandbox! But it requires accessibility permission. For instance, I can simulate a "delete" key press like this:

    func delete() {
        let eventSource = CGEventSource(stateID: .combinedSessionState)
        let keyDownEvent = CGEvent(
            keyboardEventSource: eventSource,
            virtualKey: CGKeyCode(51),
            keyDown: true)
        let keyUpEvent = CGEvent(
            keyboardEventSource: eventSource,
            virtualKey: CGKeyCode(51),
            keyDown: false)
        let loc = CGEventTapLocation.cghidEventTap
        // Triggers the system's default accessibility access pop-up
        keyDownEvent?.post(tap: loc)
        keyUpEvent?.post(tap: loc)
    }

My question is essentially whether this is allowed on the Mac App Store, because requesting accessibility permission like this is not allowed in the sandbox:

    func getPermission() {
        AXIsProcessTrustedWithOptions([kAXTrustedCheckOptionPrompt.takeUnretainedValue(): true] as CFDictionary)
    }

But I can simulate one short "shift" or "cmd" key press, for instance, and trigger the pop-up inside a sandboxed app, which seems to get around this. Is this a bug? I really hope I can release my app on the Mac App Store, but I want to be sure I'm not relying on any bug that might get removed in the near future.
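A minimal sketch of the second step along the same lines as the delete() example above, under the same accessibility-permission caveat; virtual key code 9 is the ANSI "V" key.

    import CoreGraphics

    func pasteWithCommandV() {
        let eventSource = CGEventSource(stateID: .combinedSessionState)
        let keyDown = CGEvent(keyboardEventSource: eventSource, virtualKey: CGKeyCode(9), keyDown: true)
        let keyUp = CGEvent(keyboardEventSource: eventSource, virtualKey: CGKeyCode(9), keyDown: false)
        // Hold the command modifier for both the key-down and key-up events.
        keyDown?.flags = .maskCommand
        keyUp?.flags = .maskCommand
        let loc = CGEventTapLocation.cghidEventTap
        keyDown?.post(tap: loc)
        keyUp?.post(tap: loc)
    }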
1
1
1.5k
Oct ’23
AVSpeechSynthesizer Leaking Like a Sieve
I've found multiple leaks in AVSpeechSynthesizer which are plaguing my users, and they are complaining of crashes due to this. I've created a feedback item (FB12212129) with a sample project attached which demonstrates one of the leaks. I'm hoping an engineer notices this. The only way I've had my feedback noticed in the past is by both creating a feedback item AND posting on the forums. So here's my forum post. Help is much appreciated!
6
5
3.0k
Oct ’23
TTS problem iOS 17 beta
I see a lot of crashes on iOS 17 beta regarding some problem with "Text To Speech". Does anybody have a clue why TTS crashes? Anybody else seeing the same problem?

    Exception Type: EXC_BAD_ACCESS (SIGSEGV)
    Exception Subtype: KERN_INVALID_ADDRESS at 0x000000037f729380
    Exception Codes: 0x0000000000000001, 0x000000037f729380
    VM Region Info: 0x37f729380 is not in any region. Bytes after previous region: 3748828033 Bytes before following region: 52622617728
    REGION TYPE START - END [ VSIZE] PRT/MAX SHRMOD REGION DETAIL
    MALLOC_NANO 280000000-2a0000000 [512.0M] rw-/rwx SM=PRV
    ---> GAP OF 0xd20000000 BYTES
    commpage (reserved) fc0000000-1000000000 [ 1.0G] ---/--- SM=NUL ...(unallocated)
    Termination Reason: SIGNAL 11 Segmentation fault: 11
    Terminating Process: exc handler [36389]
    Triggered by Thread: 9
    .....
    Thread 9 name:
    Thread 9 Crashed:
    0 libobjc.A.dylib 0x000000019eeff248 objc_retain_x8 + 16
    1 AudioToolboxCore 0x00000001b2da9d80 auoop::RenderPipeUser::~RenderPipeUser() + 112 (AUOOPRenderPipePool.mm:400)
    2 AudioToolboxCore 0x00000001b2e110b4 -[AUAudioUnit_XPC internalDeallocateRenderResources] + 92 (AUAudioUnit_XPC.mm:904)
    3 AVFAudio 0x00000001bfa4cc04 AUInterfaceBaseV3::Uninitialize() + 60 (AUInterface.mm:524)
    4 AVFAudio 0x00000001bfa894bc AVAudioEngineGraph::PerformCommand(AUGraphNodeBaseV3&, AVAudioEngineGraph::ENodeCommand, void*, unsigned int) const + 772 (AVAudioEngineGraph.mm:3317)
    5 AVFAudio 0x00000001bfa93550 AVAudioEngineGraph::_Uninitialize(NSError**) + 132 (AVAudioEngineGraph.mm:1469)
    6 AVFAudio 0x00000001bfa4b50c AVAudioEngineImpl::Stop(NSError**) + 396 (AVAudioEngine.mm:1081)
    7 AVFAudio 0x00000001bfa4b094 -[AVAudioEngine stop] + 48 (AVAudioEngine.mm:193)
    8 TextToSpeech 0x00000001c70b3c5c __55-[TTSSynthesisProviderAudioEngine renderSpeechRequest:]_block_invoke + 1756 (TTSSynthesisProviderAudioEngine.m:613)
    9 libdispatch.dylib 0x00000001ae4b0740 _dispatch_call_block_and_release + 32 (init.c:1519)
    10 libdispatch.dylib 0x00000001ae4b2378 _dispatch_client_callout + 20 (object.m:560)
    11 libdispatch.dylib 0x00000001ae4b990c _dispatch_lane_serial_drain + 748 (queue.c:3885)
    12 libdispatch.dylib 0x00000001ae4ba470 _dispatch_lane_invoke + 432 (queue.c:3976)
    13 libdispatch.dylib 0x00000001ae4c5074 _dispatch_root_queue_drain_deferred_wlh + 288 (queue.c:6913)
    14 libdispatch.dylib 0x00000001ae4c48e8 _dispatch_workloop_worker_thread + 404 (queue.c:6507)
    ...
    Thread 9 crashed with ARM Thread State (64-bit):
    x0: 0x0000000283309360 x1: 0x0000000000000000 x2: 0x0000000000000000 x3: 0x00000002833093c0
    x4: 0x00000002833093c0 x5: 0x0000000101737740 x6: 0x0000000000000013 x7: 0x00000000ffffffff
    x8: 0x0000000283309360 x9: 0x3c788942d067009a x10: 0x0000000101547000 x11: 0x0000000000000000
    x12: 0x00000000000007fb x13: 0x00000000000007fd x14: 0x000000001ee24020 x15: 0x0000000000000020
    x16: 0x0000b1037f729360 x17: 0x000000037f729360 x18: 0x0000000000000000 x19: 0x0000000000000000
    x20: 0x00000001016a8de8 x21: 0x0000000283e21d00 x22: 0x0000000283b3f1f8 x23: 0x0000000283098000
    x24: 0x00000001bfb4fc35 x25: 0x00000001bfb4fc43 x26: 0x000000028033a688 x27: 0x0000000280c93090
    x28: 0x0000000000000000 fp: 0x000000016fc86490 lr: 0x00000001b2da9d80 sp: 0x000000016fc863e0
    pc: 0x000000019eeff248 cpsr: 0x1000
    esr: 0x92000006 (Data Abort) byte read Translation fault
20
2
6.1k
Jan ’24
Help customizing the accessibility of a large UICollectionView
Hello, I am turning to this forum because I suspect I am "doing it wrong" when it comes to implementing VoiceOver accessibility in my collection view. I suspect this because the system has resisted everything I have tried to do, fought it tooth and nail, and I can't see any way to get this to work.

The Collection View
I have a collection view that displays a large dataset. It uses a custom collection view layout to create a spreadsheet-like view. It has hundreds of rows, and each row can have hundreds of items. The items in each row do not conform to specific column widths. Their width is defined by the data they display and, for the purposes of this discussion, can be considered arbitrary. To the left of the "table" is a column of sticky headers whose position remains fixed in relation to the content. On top of the "table" is a row of headers whose position also remains fixed.

The Problem
The default accessibility behavior that Apple has baked into UICollectionView is completely impractical for this application. Each row can contain hundreds of items, so a user who is attempting to navigate by swiping right would have to swipe through hundreds of items just to reach the second row (of hundreds).

The Desired Behavior
I want the user to be able to swipe through just the cells that are onscreen. To scroll, they can use the standard three-finger gesture. When scrolling occurs, VoiceOver should announce the range of data that is being displayed.

Attempted Solution 1: Setting the accessibilityElements array
I can set the accessibilityElements array of the UICollectionView to only contain the elements that are onscreen. I can also override the accessibilityScroll method to perform the paging upon a three-finger scroll. This works okay, but has some pretty fatal flaws:

As the user swipes through elements, the collection view insists upon scrolling horizontally to try to fit the element into view. It also insists upon scrolling vertically to keep the focused element in the middle of the view. This not only causes the content offset to jump around wildly in a disorienting way, but it also brings content into view that VoiceOver does not know about, because I have not added it to the accessibilityElements array. A low-vision user, or a user who pans their finger across the screen, would not be able to access those visible elements.

VoiceOver refuses to read my paging announcement. No matter when I post a pageScrolled notification, the system will not read it.

Setting accessibilityFrame
In an attempt to fix the scroll jumping described above, I tried setting the accessibilityFrame of my collection view cells. This did nothing to alter the scroll jumping behavior, and had the added downside that, as the view jumped around, the accessibility frames did not follow it.

A bridge too far? Overriding contentOffset
I was about to override contentOffset on the collection view so that only I could set it. That would probably work, but it would do nothing to fix the paging announcement.

Attempted Solution 2: Ignore the Cells! Use proxy UIAccessibilityElements
I tried setting the accessibilityElements array of my collection view to a collection of UIAccessibilityElement instances whose accessibilityFrame matched the frame of the cells they represent. This worked pretty well! No more scrolling nonsense when swiping through cells, and my paging announcements were being read.

This approach has a different, equally fatal flaw: if the user attempts to three-finger scroll too quickly, the VoiceOver process becomes confused. It acts as though the last selected element is the only element that exists; swiping right or left does nothing, and three-finger scrolling also does nothing. As best as I can tell, it gets stuck with the last selected element as the only one it knows about. I have since replaced all of the elements in the collection view's accessibilityElements array and posted a layoutChanged notification, which VoiceOver ignores completely. The only way out of this state is to tap on a cell, causing VoiceOver to refresh its collection of views that it knows about. I guess? No idea what's happening there.

Now what?
I'm at a complete and total loss. I'm at my wit's end. It feels like this seemingly simple customization is entirely impossible. Does anyone know what I'm doing wrong? Thanks!
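A minimal sketch of the proxy-element approach described in Attempted Solution 2, not a known fix for the stuck state; the subclass name, label wiring, and the paging announcement string are illustrative assumptions.

    import UIKit

    final class SpreadsheetCollectionView: UICollectionView {

        // Rebuild proxy elements for just the cells currently on screen.
        func rebuildAccessibilityProxies() {
            var proxies: [UIAccessibilityElement] = []
            for cell in visibleCells {
                let proxy = UIAccessibilityElement(accessibilityContainer: self)
                proxy.accessibilityLabel = cell.accessibilityLabel
                // Track the real cell's frame in the container's coordinate space.
                proxy.accessibilityFrameInContainerSpace = cell.frame
                proxies.append(proxy)
            }
            accessibilityElements = proxies
            // Tell VoiceOver the set of elements changed.
            UIAccessibility.post(notification: .layoutChanged, argument: nil)
        }

        // Page on a three-finger scroll instead of letting UIKit move per element.
        // (Simplified: treats any non-down direction as "up".)
        override func accessibilityScroll(_ direction: UIAccessibilityScrollDirection) -> Bool {
            let pageHeight = bounds.height
            let delta: CGFloat = (direction == .down) ? pageHeight : -pageHeight
            let maxOffsetY = max(contentSize.height - pageHeight, 0)
            let newY = min(max(contentOffset.y + delta, 0), maxOffsetY)
            setContentOffset(CGPoint(x: contentOffset.x, y: newY), animated: false)
            rebuildAccessibilityProxies()
            UIAccessibility.post(notification: .pageScrolled, argument: "Showing the next page of rows")
            return true
        }
    }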
2
0
1.2k
Jul ’23
Problems combining a .popover and a conditional accessibilityRepresentation
Hello friends/colleagues, I want to create a ViewModifier that applies a conditional accessibilityIdentifier, accepting an optional string input:

    struct AccessibilityModifier: ViewModifier {
        let identifier: String?

        func body(content: Content) -> some View {
            if let identifier = identifier {
                content
                    .accessibilityRepresentation {
                        content
                            .accessibilityIdentifier(identifier)
                    }
            } else {
                content
            }
        }
    }

It mostly works as expected, but .popover appears to be broken. For example, in the following code the popover will work, but if I uncomment the .modifier line, the popover does not get presented:

    struct ContentView: View {
        @State var isPresented: Bool = false

        var body: some View {
            VStack {
                Button("Show Popover") {
                    isPresented = true
                }
            }
            .popover(isPresented: $isPresented) {
                Text("A Popover")
            }
            // .modifier(AccessibilityModifier(identifier: "a11y Modifier"))
        }
    }

The popover also works when I use .modifier(AccessibilityModifier(identifier: nil)). Any suggestions on how I can support both popovers and my conditional accessibilityIdentifier? Thanks in advance, Mike
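A minimal sketch of an alternative, offered as an assumption rather than a confirmed fix: apply the identifier directly instead of wrapping the content in accessibilityRepresentation, which replaces the view's accessibility representation and may interfere with the popover's presentation.

    import SwiftUI

    struct AccessibilityIdentifierModifier: ViewModifier {
        let identifier: String?

        func body(content: Content) -> some View {
            if let identifier = identifier {
                // Attach the identifier without substituting a new accessibility representation.
                content.accessibilityIdentifier(identifier)
            } else {
                content
            }
        }
    }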
0
0
380
Aug ’23
Issue with Accessibility Visible Text Range
I am trying to draw NSPanels over text in other apps using AXObserverAddNotification with the kAXValueChangedNotification attribute on an AXTextArea. I need to detect whether the text range in that text area is visible on the screen, and I use kAXVisibleCharacterRangeAttribute to get the visible text range, but it doesn't account for the app's top bar and gives a range that starts at an index that is not actually visible. This makes the panels get drawn on top of the app's top bar when scrolled to the bottom, like in this gif. How do I exclude the top app bar from the visible text range?
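A minimal sketch of reading the attribute in question, for reference only; textArea is assumed to be an AXUIElement obtained elsewhere (for example, from the observer callback), and the top-bar offset still has to be handled separately.

    import ApplicationServices

    // Read kAXVisibleCharacterRangeAttribute from a text area element and unpack it.
    func visibleCharacterRange(of textArea: AXUIElement) -> CFRange? {
        var value: CFTypeRef?
        let error = AXUIElementCopyAttributeValue(textArea,
                                                  kAXVisibleCharacterRangeAttribute as CFString,
                                                  &value)
        guard error == .success,
              let rawValue = value,
              CFGetTypeID(rawValue) == AXValueGetTypeID() else {
            return nil
        }
        var range = CFRange()
        // Unpack the AXValue wrapper into a CFRange.
        guard AXValueGetValue(rawValue as! AXValue, .cfRange, &range) else { return nil }
        return range
    }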
1
0
312
Aug ’23
macOS application hangs if accessibility changed while using CGEventTap
Hi all: I have a macOS application which captures mouse events:

    CGEventMask eventMask = CGEventMaskBit(kCGEventMouseMoved) |
        CGEventMaskBit(kCGEventLeftMouseUp) | CGEventMaskBit(kCGEventLeftMouseDown) |
        CGEventMaskBit(kCGEventRightMouseUp) | CGEventMaskBit(kCGEventRightMouseDown) |
        CGEventMaskBit(kCGEventOtherMouseUp) | CGEventMaskBit(kCGEventOtherMouseDown) |
        CGEventMaskBit(kCGEventScrollWheel) | CGEventMaskBit(kCGEventLeftMouseDragged) |
        CGEventMaskBit(kCGEventRightMouseDragged) | CGEventMaskBit(kCGEventOtherMouseDragged);

    _eventTap = CGEventTapCreate(kCGHIDEventTap, kCGHeadInsertEventTap, kCGEventTapOptionDefault,
                                 eventMask, &MouseCallback, nil);
    _runLoopRef = CFRunLoopGetMain();
    _runLoopSourceRef = CFMachPortCreateRunLoopSource(NULL, _eventTap, 0);
    CFRunLoopAddSource(_runLoopRef, _runLoopSourceRef, kCFRunLoopCommonModes);
    CGEventTapEnable(_eventTap, true);

    CGEventRef MouseCallback(CGEventTapProxy proxy, CGEventType type, CGEventRef event, void *refcon) {
        NSLog(@"Mouse event: %d", type);
        return event;
    }

This mouse logger needs the accessibility privilege granted in Privacy & Security. But I found that if accessibility is turned off while the CGEventTap is running, left and right clicks are blocked unless I restart macOS. Although replacing kCGEventTapOptionDefault with kCGEventTapOptionListenOnly fixes this issue, I have other features which require kCGEventTapOptionDefault. So I tried to detect that accessibility is disabled and remove the CGEventTap:

    [[NSDistributedNotificationCenter defaultCenter] addObserver:self
                                                        selector:@selector(didToggleAccessStatus:)
                                                            name:@"com.apple.accessibility.api"
                                                          object:nil
                                              suspensionBehavior:NSNotificationSuspensionBehaviorDeliverImmediately];

However, the notification won't be sent if the user didn't turn off accessibility but instead removed the app from the list. Worse, AXIsProcessTrusted() continues to return true. Is there a way to fix the blocked mouse, or to detect that accessibility access was removed? Thanks!
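A minimal sketch (in Swift, for illustration) of the tap teardown the post describes; it does not solve the detection problem, and eventTap / runLoopSource stand in for the _eventTap and _runLoopSourceRef ivars above.

    import CoreGraphics

    func removeEventTap(_ eventTap: CFMachPort, _ runLoopSource: CFRunLoopSource) {
        // Disable the tap first so no further events are routed through the callback.
        CGEvent.tapEnable(tap: eventTap, enable: false)
        // Then detach the source from the main run loop and invalidate the port.
        CFRunLoopRemoveSource(CFRunLoopGetMain(), runLoopSource, .commonModes)
        CFMachPortInvalidate(eventTap)
    }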
9
0
1.1k
Aug ’23
Converting macOS system colors to iOS system colors in NSAttributedString
When syncing an NSAttributedString between devices, it seems that macOS system colors are decoded as "UIExtendedGrayColorSpace" black on iOS. For example, if I have text on the Mac and choose the system red color, this is converted to UIExtendedGrayColorSpace black on iOS. I would like to convert the macOS system red (for example) to the iOS system red, but I'm not sure of the best way to do this. My question is: has anyone discovered an efficient and clean way to convert macOS system colors to iOS system colors when syncing between devices? I have seen that the system colors on both devices have a consistent hex value when reading the color components. A possible solution would be to save the hex value of the colors in the attributed string. We could then enumerate the hex values and exchange the attributed string's foreground color for the appropriate system color of whatever device is being used. I would greatly appreciate thoughts regarding this. Thank you everyone!
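A minimal sketch of the hex-mapping idea from the post: walk the foreground colors in the attributed string and swap recognized hex values for iOS system colors. The specific hex strings and the hexString(from:) helper are assumptions for illustration, not a verified mapping.

    import UIKit

    func remapSystemColors(in text: NSMutableAttributedString) {
        let fullRange = NSRange(location: 0, length: text.length)
        var replacements: [(NSRange, UIColor)] = []
        text.enumerateAttribute(.foregroundColor, in: fullRange, options: []) { value, range, _ in
            guard let color = value as? UIColor else { return }
            switch hexString(from: color) {
            case "FF3B30": replacements.append((range, .systemRed))   // assumed hex for the synced system red
            case "007AFF": replacements.append((range, .systemBlue))  // assumed hex for the synced system blue
            default: break
            }
        }
        // Apply the swaps after enumeration so the string isn't mutated mid-walk.
        for (range, color) in replacements {
            text.addAttribute(.foregroundColor, value: color, range: range)
        }
    }

    // Hypothetical helper: render a color's RGB components as an uppercase hex string.
    func hexString(from color: UIColor) -> String {
        var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0
        guard color.getRed(&r, green: &g, blue: &b, alpha: &a) else { return "" }
        return String(format: "%02X%02X%02X", Int(r * 255), Int(g * 255), Int(b * 255))
    }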
2
0
531
Aug ’23
[Chromium][PWA] VoiceOver doesn't read in many cases of PWA
Recently, I found there are some problems with VoiceOver in a PWA app: VoiceOver cannot read the state changes when users interact with some HTML elements. I am not sure whether this is a Chromium bug or a VoiceOver bug. I have reported the bug to Chromium, which lists the reproduction steps and a video. I am working on the bug, but I don't have much of an idea, so I am sending this here to see if you have some insight on this issue.

How to reproduce the bug
1. Install the PWA test app (https://pwa-a11-test.netlify.app)
2. Open the PWA app
3. Turn on VoiceOver
4. Press Space on the checkbox element (Note: I only use the checkbox element as an example; other HTML elements have similar problems, such as pressing Left/Right on a slider element)
Expected: VoiceOver reads the value change, just like the behavior shown in the browser.
Actual: VoiceOver does nothing.

Chromium code analysis
The bug only exists in the PWA scenario; it's fine in the browser scenario. FYI, a PWA Mac app has its own process (called the app shim process), which spawns a browser process (think of it as the Chrome app) and then communicates with it. For the accessibility implementation of the PWA app, Chromium uses NSAccessibilityRemoteUIElement, a private Apple API, to give the app shim process all the accessibility abilities of the browser process. I suspect the bug has something to do with NSAccessibilityRemoteUIElement (https://source.chromium.org/chromium/chromium/src/+/main:ui/base/cocoa/remote_accessibility_api.h;l=14?q=NSAccessibilityRemoteUIElement&ss=chromium%2Fchromium%2Fsrc), but I am not familiar with the undocumented API. According to the Chromium code, when pressing Space on the checkbox, Chromium will call NSAccessibilityPostNotification with NSAccessibilityValueChangedNotification. VoiceOver can't read the value change for the PWA Mac app, but can read it in the browser scenario. I used the Xcode Accessibility Inspector tool to watch the notifications posted from the PWA app. I found the notification was sent successfully, but VoiceOver does not read it! So I suspect VoiceOver is doing some check that disallows reading it in this case? Do you have any thoughts on this? I'd appreciate any comments or responses!
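For reference only, and an assumption about the equivalence: this is the native AppKit call corresponding to the NSAccessibilityPostNotification / NSAccessibilityValueChangedNotification path described above, with checkboxElement standing in for whatever object represents the toggled control.

    import AppKit

    // Announce that the element's value changed, as Chromium does for the checkbox.
    func announceValueChange(for checkboxElement: Any) {
        NSAccessibility.post(element: checkboxElement, notification: .valueChanged)
    }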
2
1
491
Aug ’23
Complete process of how to add my voice as speech synthesis
Hello, I am deaf and blind, so my Apple studies are done in text via Braille. One question: how do I add my own voice for speech synthesis? Do I have to record it somewhere first? What is the complete process, starting with recording my voice? Do I have to record my voice reading something and then add it as a synthesized voice? What's the whole process for that? There is no text explaining this. I found one about authorizing Personal Voice, but not the whole process, starting with the recording and so on. Thanks!
3
0
1.2k
Aug ’23
Assistive touch crash
While using an HID device that is recognized as a mouse and keyboard via AssistiveTouch, I noticed that sometimes, after some usage, the AssistiveTouch feature called "Perform Touch Gestures" stops working even though it remains enabled, and AssistiveTouch itself also remains on. To make it work again, you need to turn the AssistiveTouch feature off and on. What could cause this issue? Does it relate to the HID descriptor or report map? Or am I sending some commands that can cause this crash? Is there any way to avoid it, be aware of it, or get some callback in Swift when this feature crashes?
1
0
451
Aug ’23
VoiceOver HTML table issue: VO announces content out of order when span element used in th
Wondering if anyone has come across this issue. When a th element has text inside a span element in an HTML table, VoiceOver will announce the text in the second element first, even if it comes after text that visually (and programmatically) comes before it. This also happens with nested span elements. It does not happen with td cells. Replicable on: Safari/iOS 16.6 with VoiceOver; Edge 113.0.17 with VoiceOver. You can see an example and replicate it here: https://codepen.io/ayesha-2303/full/eYQqbXX Is this expected functionality or is it a bug? How can I raise it if it is a bug?
1
0
584
Aug ’23
How to Make Personal Voice Recording in My Language
I've been deaf and blind for 15 years. I'm not good at pronunciation in English, since I don't hear what I say, much less hear it from others. When I went to read the phrases to record my Personal Voice in Accessibility > Personal Voice, the 150 phrases to read were in English. How do I record phrases in Brazilian Portuguese? I speak Portuguese well. My English pronunciation is very bad, and deafness has contributed to that. Help me.
1
0
644
Aug ’23