Post not yet marked as solved
We have been working on accessible games for years and appreciate the initiative to bring iOS core functionality to Unity game development. But to take this project further, we need a short conversation to let you (Jaron Marsau + Eric Lang) know about the problems. To speak frankly, not even the demo files work. So let's connect.
hansjoerg @EuMentalhome
Post not yet marked as solved
I'm testing my app in the Xcode 14 beta (released with WWDC22) on iOS 16, and it seems that AVSpeechSynthesisVoice is not working correctly.
The following code always returns an empty array:
AVSpeechSynthesisVoice.speechVoices()
Additionally, attempting to initialize AVSpeechSynthesisVoice returns nil for all of the following:
AVSpeechSynthesisVoice(language: AVSpeechSynthesisVoice.currentLanguageCode())
AVSpeechSynthesisVoice(language: "en")
AVSpeechSynthesisVoice(language: "en-US")
AVSpeechSynthesisVoice(identifier: AVSpeechSynthesisVoiceIdentifierAlex)
AVSpeechSynthesisVoice.speechVoices().first
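Until the voices load correctly, a defensive fallback can at least keep speech working with whatever speechVoices() does report. Below is a minimal sketch of such selection logic, kept free of AVFoundation so it runs anywhere; pickVoiceLanguage is a hypothetical helper, not an Apple API:

```swift
// Hypothetical fallback: given the language codes that
// AVSpeechSynthesisVoice.speechVoices() reported (possibly empty) and a
// preferred BCP-47 code, pick an exact match first, then any voice that
// shares the base language, else nil so the system default voice applies.
func pickVoiceLanguage(available: [String], preferred: String) -> String? {
    if available.contains(preferred) { return preferred }
    guard let base = preferred.split(separator: "-").first.map(String.init) else {
        return nil
    }
    return available.first { $0.hasPrefix(base) }
}
```

On device you would feed it AVSpeechSynthesisVoice.speechVoices().map { $0.language } and only fall back to the AVSpeechSynthesisVoice(language:) initializer when it returns a value.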
Post not yet marked as solved
We are currently experiencing a usability issue in our app, and we have observed the same issue on sites in Safari.
While using VoiceOver on iOS 13.3+, we've discovered that VoiceOver skips all tables that use a caption. This occurs when a user swipes to read the contents of the page. We also observed that when using the rotor and choosing Tables, it does not recognize that there is a table on the page.
This has been reproduced by multiple users on different devices. Our testing also covered VoiceOver on macOS Catalina, where it worked as expected for all tables we tested.
Has anyone else come across this issue?
Post not yet marked as solved
Are there any issues with Bluetooth connections on iOS 14?
Post not yet marked as solved
I have a UIButton. When focus lands on the button, VoiceOver speaks the button's label; when the user double-taps to activate it, VoiceOver speaks the button's label again. Is there any way I can prevent the label from being spoken again when the button is double-tapped?
Post not yet marked as solved
I'm looking to begin an accessibility vision app, and I'm not having any luck locating properties or APIs that give access to the display zoom. Has anyone worked with these areas before?
Sandbox is set to no in the entitlements file.
Settings → security & privacy → privacy → accessibility is enabled for the app.
Can detect global mouse events.
Can't use accessibility features:
let systemWideElement: AXUIElement = AXUIElementCreateSystemWide()
var focusedElement: AnyObject?
let error = AXUIElementCopyAttributeValue(
    systemWideElement,
    kAXFocusedUIElementAttribute as CFString,
    &focusedElement
)
print(focusedElement) // this prints `nil`
Can't execute AppleScripts (NSAppleScript())
Can't send keypress events (CGEvent().post())
Also, if I compile the executable with swiftc from the terminal and then run it from the terminal, the app is able to access these features.
Are there other Xcode settings I need to change, or are these features always blocked when building from Xcode?
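For reference, a non-sandboxed entitlements file corresponds roughly to the fragment below. This is only a sketch: com.apple.security.app-sandbox and com.apple.security.automation.apple-events are the standard entitlement keys, but whether this resolves the issue also depends on code signing, since the Accessibility grant in System Preferences is tied to the app's signature, while a binary launched from the terminal inherits the terminal's own permissions.

```
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- App Sandbox off: required for AXUIElement access to other apps -->
    <key>com.apple.security.app-sandbox</key>
    <false/>
    <!-- Needed for NSAppleScript to control other applications -->
    <key>com.apple.security.automation.apple-events</key>
    <true/>
</dict>
</plist>
```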
Post not yet marked as solved
I'm trying to implement an app in Hebrew using SwiftUI. For some reason, VoiceOver does not read the accessibility labels in Hebrew. For example:
Text("שלום")
    .accessibility(label: Text("שלום"))
Post not yet marked as solved
I have been setting up macOS color filters via "System Preferences" -> "Accessibility" -> "Display" -> "Color Filters". When I set it up, the filter effect only shows up on two of the four monitors I have attached to my Mac Pro. (The two that work are a Samsung LC49G95T and a Dell S3221QS; the two that do not are both HP2509s.)
I have filed a bug report with Apple on this, but it occurs to me that it might be helpful to add to it any other users' experiences with monitors that do and do not "work" with color filters. Has anyone else seen this phenomenon?
Alternatively, does anyone have any suggestions for getting the 2509s to show the filtering?
Post not yet marked as solved
Hi, in my project I call UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, A) to move focus to A. But this ends up calling accessibilityApplyScrollContent:sendScrollStatus:animateWithDuration:animationCurve:, which triggers the scrolling method of A's superview (a scroll view). I don't understand what is going on and would like to know how this works. I would be very pleased if you could reply.
Post not yet marked as solved
I'm implementing a TableView with 3 cells and every cell fits the whole screen with the following code:
func tableView(_ tableView: UITableView, heightForRowAt indexPath: IndexPath) -> CGFloat {
    return self.view.frame.size.height
}
I'm trying to navigate through the cells using VoiceOver, but it does not jump to the next cell.
Does VoiceOver not recognize elements outside the screen, even when they belong to a table view?
In the example above, VoiceOver reads the "Cell: 0" element correctly, but it does not scroll to the next cell ("Cell: 1").
I couldn't find any documentation on accessibility with full-screen elements such as these cells. Is there any website or documentation I could take a look at?
Thanks.
Post not yet marked as solved
Hi Team,
We are currently facing an issue with VoiceOver and larger text mode on buttons in iOS 15. When VoiceOver is on, a button's accessibility label reads fine at the normal text size, but in larger text mode the button labels are read twice.
For example: in normal mode the "Yes" button title reads as "Yes button", while in larger text mode it reads as "Yes Button yes".
We have only set an accessibility label on the buttons.
Post not yet marked as solved
I have a screen reader accessibility issue with an editable custom combobox with a listbox popup widget in Safari. I have tested it with the Mac's built-in VoiceOver: the screen reader doesn't read the list items from the dropdown.
The combobox widget has a multiline textbox field that displays a popup with a list of suggestions. I have followed all the necessary W3C accessibility guidelines to make it screen reader and keyboard accessible. The component is almost identical to the demo component on the W3C website.
The following ARIA attributes are used in the Combobox:
role="combobox"
aria-expanded="false" (will be updated dynamically)
aria-haspopup="listbox"
aria-autocomplete="list"
aria-controls=<listbox id>
aria-owns=<listbox id>
The listbox/popup is a ul element with following ARIA attribute:
role="listbox"
The listbox/popup list options are the li elements with following ARIA attribute:
role="option"
W3C demo link: https://www.w3.org/WAI/ARIA/apg/example-index/combobox/grid-combo.html
Screen reader accessibility works fine in Chrome, Firefox, and Brave, but not in Safari, and I got the same results with the W3C demo component. I tested on macOS Monterey with Safari 15.3 and on macOS Big Sur with Safari 14.1. I tried different approaches, such as adding the role and ARIA attributes to the parent, but was not successful.
Is this a bug? What would be the best workaround?
Thank you in advance for your help.
Post not yet marked as solved
Let's say I am a VoiceOver user that wants to use the rotor option "Headers" to iterate through supplementary header views of a UICollectionView. This behavior works as expected if the number of cells between each header view is less than the height of the UICollectionView. However, if there are lots of items between the visible header view and the adjacent one that is hidden, the device currently says "header not found" and requires a three-finger swipe to eventually bring the header into view.
I can understand the technical reason behind this; UICollectionView does not actually have everything loaded into memory and reuses cells to give the impression that it does, so the device is technically not finding the view. Does that mean that VoiceOver users are quite used to hearing "Heading not found" and using the three-finger swipe motion as a workaround to this issue? Or is this an actual bug?
I don't see much discussion of this in the forums either (I apologize if this has been answered anywhere else), so I thought of posting this question here. Thanks!
Post not yet marked as solved
Hi there,
I wanted to make a UILabel with a gradient as its text color, and it's fairly easy doing it like this:
let gradientImage = ... // Create a Gradient image using CAGradientLayer
label.textColor = UIColor(patternImage: gradientImage)
Problem is that as soon as I set this pattern image color, the UILabel is not accessible anymore. So isAccessibilityElement is set to false and I have no way of changing this back. The issue doesn't lie in the gradientImage, as using any other image as the patternImage for the textColor results in the same issue. Did anyone else experience this before and knows a way around this?
Thanks,
Klemens
Let's say I have a music library view controller comprising a UICollectionView, where the cells (that are songs) are grouped alphabetically, and a cell can either be a song or a loop in the respective song (e.g. the image attached).
My question is whether there's a way VoiceOver users can swipe through the headings and song cells but skip over the loop cells in the same way the rotor allows jumping from one header to another. If I'm a VoiceOver user and I know that the loop I'm looking for is not in "Sisters", there's no point in having to go through "Lydian lick 2" and "WOW!"; it should just skip to "T'Wonderful".
Is this possible to implement given the accessibility API that is currently out there? I know I can certainly program it to be nested (i.e. pressing "Sisters >" would change to "Sisters /" and the "Lydian lick 2" and "WOW!" cells would appear) but I like that a user only has to do one press to open either the song or loop (not to mention I now don't have a way of loading the song itself).
Does anyone have any suggestions on how I can improve this design so that it minimizes the number of gestures required to open a song or loop, while making it easy for VoiceOver users to skip over loops they know are not in a given song? It'd be highly appreciated!
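On the API side, a custom rotor (UIAccessibilityCustomRotor with an item search block) is the usual mechanism for letting VoiceOver users jump category-wise, e.g. a "Songs" rotor that lands only on song cells. The search at its core can be sketched independently of UIKit; LibraryRow and nextSongIndex are hypothetical names for illustration:

```swift
// Hypothetical flattened model of the collection view: each row is either
// a song cell or a loop cell belonging to the song above it.
enum LibraryRow {
    case song(String)
    case loop(String)
}

// Sketch of the search a custom "Songs" rotor would perform: starting at
// `index`, scan forward or backward for the next song row, skipping every
// loop row in between. Returns nil when no further song exists.
func nextSongIndex(in rows: [LibraryRow], from index: Int, forward: Bool) -> Int? {
    let candidates = forward
        ? Array(stride(from: index + 1, to: rows.count, by: 1))
        : Array(stride(from: index - 1, through: 0, by: -1))
    return candidates.first(where: { i in
        if case .song = rows[i] { return true }
        return false
    })
}
```

In the rotor's actual search block you would map the returned index back to a cell, scrolling an off-screen index path into view first if needed.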
Post not yet marked as solved
Hi Everyone.
I'm trying to use the Accessibility Inspector that's included with Xcode 13.3, and it fails to identify UI elements in target mode... basically the entire iOS Simulator is one green, unreadable element. Besides the fact that it hangs a lot, is this tool completely broken now?
If I try to run an audit it just throws the warning:
Element has no description
This SimulatorKit.SimDisplayRenderableView
is missing useful accessibility information.
Note that no matter what app I try to inspect, it always fails, it also fails when I try to inspect the springboard / home screen.
It worked fine in 13.2.1, and now it's completely broken.
Post not yet marked as solved
I am doing some iPhone automation and want to observe the call states of ongoing calls. I implemented a listener that reads idevicesyslog and fetches the log output of com.apple.accessibility.heard, which prints each state change similar to the following lines (stat=Sending and stat=Active):
May 13 02:14:02 heard(HearingUtilities)[11392] <Notice>: -[HUComfortSoundsController callStatusDidChange:]:415 Phone call holding 0 [pending = 1, active = 0, avc = 0, endpoint = 1] - NSConcreteNotification 0x102f173f0 {name = TUCallCenterCallStatusChangedNotification: object = <TUProxyCall 0x102f47080 p=com.apple.coretelephony aPI=(null) svc=Telephony hdl=<TUHandle 0x103407870 type=PhoneNumber, value=+4912345, normalizedValue=+4912345, isoCountryCode=de> isoCC=de stat=Sending tStat=0 dR=0 fR=0 supportsR=1 uPI=57FCA1D2-D09F-4E23-A08A-3AD4B18B570D grp=(null) lSIUUID=00000000-0000-0000-0000-000000000001 lSIAccountUUID=AA1FC9E6-068E-4D86-B3B9-C1074658AFB2 hosted=1 endpt=1 callerNFN=(null) srcID=(null) aC=(null) aM=(null) iUB=1 vm=0 connStat=00 nMICS=0 sR=0 iSA=0 iSV=0 iSS=0 wHM=0 hSI=0 vST=0 iapST=0 oapST=0 vCA=<TUVideoCallAttributes 0x102f17330 remoteCameraOrientation=0 localVideoContextSlotIdentifier=0 remoteVideoContextSlotIdentifier=0> model=<TUCallModel 0x102f3e380 hold=1 grp=1 ungrp=1 DTMF=1 uMPS=1 aC=1 sTV=0> em=0 iFE=0 sos=0 sSR=1 sSUI=0 mX=0…
May 13 02:14:14 heard(HearingUtilities)[11392] <Notice>: -[HUComfortSoundsController callStatusDidChange:]:415 Phone call holding 0 [pending = 0, active = 1, avc = 1, endpoint = 1] - NSConcreteNotification 0x10321ab80 {name = TUCallCenterCallStatusChangedNotification: object = <TUProxyCall 0x102f47080 p=com.apple.coretelephony aPI=(null) svc=Telephony hdl=<TUHandle 0x103330670 type=PhoneNumber, value=+4912345, normalizedValue=+4912345, isoCountryCode=de> isoCC=de stat=Active tStat=0 dR=0 fR=0 supportsR=1 uPI=57FCA1D2-D09F-4E23-A08A-3AD4B18B570D grp=(null) lSIUUID=00000000-0000-0000-0000-000000000001 lSIAccountUUID=AA1FC9E6-068E-4D86-B3B9-C1074658AFB2 hosted=1 endpt=1 callerNFN=(null) srcID=(null) aC=AVAudioSessionCategoryPhoneCall aM=(null) iUB=1 vm=0 connStat=11 nMICS=0 sR=0 iSA=0 iSV=0 iSS=0 wHM=0 hSI=1 vST=0 iapST=0 oapST=0 vCA=<TUVideoCallAttributes 0x102f17330 remoteCameraOrientation=0 localVideoContextSlotIdentifier=0 remoteVideoContextSlotIdentifier=0> model=<TUCallModel 0x1032596b0 hold=1 grp=1 ungrp=1 DTMF=1 uMPS=1 aC=1 sTV=1> em=0 iFE=0…
Unfortunately, starting with newer iOS versions (maybe 15.4, or some earlier 15.x; I don't know exactly), the heard service kills itself after 3 minutes:
May 13 02:28:03 heard(Accounts)[11501] <Notice>: "The connection to ACDAccountStore was invalidated."
May 13 02:28:03 heard(Accounts)[11501] <Notice>: "The connection to ACDAccountStore was invalidated."
May 13 02:30:21 heard(HearingUtilities)[11501] <Notice>: -[AXHeardController shutdownIfPossible]:355 heard still shouldn't be running. Shutting down.
It restarts after I open Settings → Accessibility on the phone. Does anybody have an idea what I can do about this? I have thought about the following:
Including the heard service / accessibility in my own app, so that it stays online while the app is active
Getting the call states another way? I tried an observer inside an app, but when it runs in the background it no longer reacts.
Some iOS setting to enable it permanently
Will this be fixed in an upcoming version?
com.apple.accessibility.heard seems to be an internal service; does anybody know how to deal with it now?
Post not yet marked as solved
When I inspect a regular Pages document using the Accessibility Inspector, the whole page seems to be some kind of canvas and doesn't give me access to the document's content.
The odd thing is... as soon as I start the Grammarly desktop app (Grammarly seems to have access to the document's content), the document is inspectable all of a sudden.
Do I have to register the app with the system in some special way to make Pages render its content as accessible elements? Do you have any idea how Grammarly changes the way Pages renders its documents?
Thanks a lot!
I've been trying to get VoiceOver to announce the role of the new VC when the user moves to another screen.
For some reason:
UIAccessibility.post(notification: .announcement, argument: "text")
doesn't have time to finish before focus is moved to the first accessibility element.
Is there any way to avoid this behaviour or to disable automatic setting of focus?