Accessibility


Make your apps function for a broad range of users using Accessibility APIs across all Apple platforms.

Posts under Accessibility tag

122 Posts

Application "help" menu does not open main help book page
Following the official documentation, I'm trying to create a set of three localised Help Books, available in Spanish, English and Polish. Presently I'm working on the English version.

App structure

This is a plug-in application consisting of the main app and the plugin. The main app structure looks as follows:

<XcodeProject Top>
├── Localizable.xcstrings
├── MyAppExtension
│   ├── MyAppExtension.swift
│   └── <other swift files>.swift
├── MyApp
│   ├── Info.plist
│   ├── +Array.swift
│   ├── +ButtonStyle.swift
│   └── <other app swift files>.swift
├── Resources
│   └── MyApp.help
└── MyApp.help
    └── Contents
        ├── Info.plist
        └── Resources
            ├── English.lproj
            │   ├── ExactMatch.plist
            │   ├── InfoPlist.strings
            │   ├── MyApp.helpindex
            │   ├── MyApp.html
            │   └── pgs
            └── shrd

MyApp / MyApp.help / Info.plist contains the following values:
Bundle name: MyApp
HPDBookAccessPath: MyApp.html
HPDBookTitle: My App Help
Default localization: en_gb

MyApp / Info.plist contains the following entries:
Help Book directory name: MyApp.help
Help Book Identifier: MyApp Help

Build phase

The Copy Bundle Resources phase copies MyApp.help into MyApp/Resources.

Questions

1. Is the provided folder structure valid for creating localised Help Books?
2. Is anything missing from the Info.plist files, or in the wrong place?
3. Why does MyApp -> Help open the main help menu rather than the app's Help Book?
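For reference, here is a sketch of how those Xcode display names typically map to the raw Info.plist keys (the key names below are the standard Apple Help registration keys; the values simply mirror the entries listed above, so treat this as an assumption to verify rather than a confirmed fix):

App target (MyApp / Info.plist):

<key>CFBundleHelpBookFolder</key>   <!-- shown by Xcode as "Help Book directory name" -->
<string>MyApp.help</string>
<key>CFBundleHelpBookName</key>     <!-- shown by Xcode as "Help Book Identifier" -->
<string>MyApp Help</string>

Help bundle (MyApp.help / Contents / Info.plist):

<key>CFBundleName</key>             <!-- "Bundle name" -->
<string>MyApp</string>
<key>HPDBookAccessPath</key>
<string>MyApp.html</string>
<key>HPDBookTitle</key>
<string>My App Help</string>
<key>CFBundleDevelopmentRegion</key>   <!-- "Default localization" -->
<string>en_gb</string>

If the Help menu item still opens the generic help page, a mismatch between the values in these two files, or the per-.lproj help index not being found, is the usual suspect; that is a guess based on the keys above, not a confirmed diagnosis.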
Replies: 3 · Boosts: 0 · Views: 141 · Activity: 3d
Accessibility Localization Questions
For practice, I have implemented accessibility labels and announcements in a very simple test app (all SwiftUI, all iOS 18). The app is not localized; the default language is English. When running this on a German phone, odd things happen with the localization.

My accessibility labels are read with an accent, but when they contain a URL, the "dots" are read as the German "Punkt" (with an English accent). When I provide the same text as an accessibility announcement, that text (which is in English) is read with a German voice.

I also provide a Button with an "arrow.clockwise" image, and VoiceOver reads it, in an English voice, as "Refresh, Button". This is great and was to be expected. However, when the button is disabled, VoiceOver reads "Refresh, grau dargestellt, Button", all in an English voice.

Is this an error? Am I doing it wrong? The video at the link should show the issue: https://share.icloud.com/photos/0757FJW2Q3fsA_cdhMX6ls46Q
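One thing that may be worth trying (a sketch built on the assumption that the mixed voices come from VoiceOver's language detection rather than from the label text itself): pin the speech language of the label and of the announcement explicitly. The view, strings, and locale below are illustrative.

import SwiftUI
import UIKit

struct RefreshFooter: View {
    // Attach a speech-language hint so VoiceOver reads the URL's dots in
    // English instead of falling back to the device language ("Punkt").
    private var englishLabel: AttributedString {
        var label = AttributedString("Open example.com to refresh")
        label.accessibilitySpeechLanguage = "en-US"
        return label
    }

    var body: some View {
        Text("example.com")
            .accessibilityLabel(Text(englishLabel))
    }
}

// Announcements can carry the same hint via an attributed string.
func announceInEnglish(_ text: String) {
    let attributed = NSAttributedString(
        string: text,
        attributes: [.accessibilitySpeechLanguage: "en-US"]
    )
    UIAccessibility.post(notification: .announcement, argument: attributed)
}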
Replies: 2 · Boosts: 0 · Views: 448 · Activity: 1w
Xcode 16.1 broken accessibilityLabel
Accessibility broke after updating to Xcode 16.1. There is a call to accessibilityLabel that sets an a11y label on a view's title. This used to work (pronounced by VoiceOver) with Xcode 15.4 + iOS 17.5. With Xcode 16.1 + iOS 18.1 on a physical device or the iOS Simulator, Accessibility Inspector shows no a11y label set. I also tried Xcode 16.2 beta 3 with the same result: accessibilityLabel does not work and the a11y label is not set.
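For anyone trying to reproduce this, a minimal sketch of the kind of call described; the post doesn't say whether the title is SwiftUI or UIKit, so both variants below (and all names and strings) are assumptions:

import SwiftUI
import UIKit

// SwiftUI variant: a title whose spoken text is overridden.
struct TitleView: View {
    var body: some View {
        Text("Orders")
            .accessibilityLabel("Orders, main screen")
    }
}

// UIKit variant: the same override on a title label.
func configure(titleLabel: UILabel) {
    titleLabel.text = "Orders"
    titleLabel.accessibilityLabel = "Orders, main screen"
}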
Replies: 1 · Boosts: 0 · Views: 258 · Activity: 2w
Fullscreen API web standard is unacceptably missing on iPhone
It is outrageous that Apple continue to fail to implement the Fullscreen API web standard for web apps on iPhone only, which is so important to accessibility and web app functionality. The only possible reason for this block is commercial: to promote iOS apps instead of browser-based web apps.

To quote a client from a major agency just now - a typical enquiry:

We value accessibility greatly, and we noticed that the embedded player is missing a full screen button on iPhone. Everything else works perfectly fine, including a full screen button that appears on the mobile webpage on Android devices. Is there any way we can include a button to enable full screen view for our viewers in your player that are going to watch it on iOS devices?

To which, as usual, we have to reply:

Apple unfortunately block fullscreen mode from being used with all web applications on iPhone. Apple will allow this to be displayed fullscreen on MacBooks and iPads, but currently not on iPhone - so we have to hide the fullscreen button there. So fullscreen works on all devices and browsers apart from on iPhone. As you've seen with Android, all other devices and browsers follow the universal 'Fullscreen API' web standard to allow full screen. You're probably familiar with seeing the fullscreen button on normal linear videos on iPhone. These use Apple's native video player, which doesn't let buttons and scripts be used on top of it - just a single video, not an interactive web application. Our player looks like a video player but it is actually a web app combining multiple different video clips connected together by code and styling. They block it on iPhones for reasons known only to them, but the assumption is that it is to incentivise people to make iOS apps instead of web apps. The web development community is hopeful that Apple will change this unfortunate restriction soon, but we have been waiting a long time in vain.

We have to send this to a lot of people. It's a very bad look for Apple. In less than a month it will be 2025. We have been waiting years for this.

The web standard documentation showing universal support on other devices and browsers is here: https://developer.mozilla.org/en-US/docs/Web/API/Fullscreen_API

This is not acceptable. It is time for Apple to stop blocking this important accessibility web standard for commercial reasons - only on iPhone. To whoever is in charge of these decisions in the Safari/WebKit team: please just enable the Fullscreen API for web apps on iPhone as soon as possible.
Replies: 2 · Boosts: 0 · Views: 239 · Activity: 2w
German VoiceOver says "millibars" instead of "megabytes"
In SwiftUI, on iOS 18.1.1 with Xcode 16.1, the following control:

Text(12345678, format: .byteCount(style: .binary))

displays text with an MB (megabytes) unit, but German VoiceOver reads it as "millibars". I tried explicitly specifying the units with:

Text(12345678, format: .byteCount(style: .memory, allowedUnits: .mb))

but the result is the same (German VoiceOver still says "millibars"). Aside from creating my own accessibility label, is there any way to work around this?
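A possible workaround sketch, short of writing the label text by hand: keep the compact on-screen formatting, but give the element a label whose unit MeasurementFormatter spells out in the current locale, so VoiceOver has no "MB" abbreviation to misread. The view name is illustrative, and whether this reads naturally in German is an assumption to verify.

import Foundation
import SwiftUI

struct FileSizeText: View {
    let byteCount: Int64

    // Spell the unit out ("megabytes" / "Megabyte") for the spoken label.
    private var spokenSize: String {
        let formatter = MeasurementFormatter()
        formatter.unitStyle = .long            // long-form unit names
        formatter.unitOptions = .naturalScale  // scale bytes up to a sensible unit
        return formatter.string(from: Measurement(value: Double(byteCount),
                                                  unit: UnitInformationStorage.bytes))
    }

    var body: some View {
        // The visible text keeps the compact "MB" formatting.
        Text(byteCount, format: .byteCount(style: .memory))
            .accessibilityLabel(spokenSize)
    }
}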
Replies: 3 · Boosts: 0 · Views: 248 · Activity: 2w
tvOS Accessibility: How to enable accessibility focus on static text and custom views
Hi guys, I'm trying to add accessibility labels to static text and custom SwiftUI views. Example:

MyView { ... }
    //.accessibilityElement()
    .accessibilityElement(children: .combine)
    //.accessibilityRemoveTraits(.isStaticText)
    //.accessibilityAddTraits(.isButton)
    .accessibilityLabel("ACCESSIBILITY LABEL")
    .accessibilityHint("ACCESSIBILITY HINT")

When using the VoiceOver or Hover Text accessibility features, focus moves only between active elements, not static ones. When I add .focusable() it works, but I don't want to make those elements focusable when all accessibility features are off. I suppose I could do something like this:

.focusable(UIApplication.shared.accessibility.voiceOver.isOn || UIApplication.shared.accessibility.hoverText.isOn)

Note: this is just pseudocode, because I don't remember exactly how to detect the current accessibility settings.

However, using focusable() with conditions on hundreds of static texts in an app seems like overkill. Accessibility focus is also needed on some control containers where we already have somewhat complex focus handling with conditions in focusable(...) on parent and child elements, so extending that for accessibility seems too complicated. Is there a simple way to tell accessibility that an element is focusable specifically for Hover Text and for VoiceOver? Example of what I want to accomplish for TV content:

VStack {
    HStack {
        Text("Terminator")
        if parentalLock {
            Image(named: .lock)
        }
    }
    .accessibilityLabel(for: hover, "Terminator - parental lock")
    Text("Sci-Fi * 8pm - 10pm * Remaining 40 min. * Live")
        .accessibilityLabel(for: hover, "Sci-Fi, 8 to 10pm, Remaining 40 min. Broadcasting Live")
}
.accessibilityLabel(for: voiceover, "Terminator, Sci-Fi, 8 to 10pm, Remaining 40 min. Broadcasting Live, parental lock")

I watched all the Accessibility WWDC videos from 2016, 2022 and 2024 and googled for several hours, but I couldn't find any solution for static texts and custom views. From those videos it appears .accessibilityLabel() should be enough, but it clearly works only on active elements and does not work for other SwiftUI views on tvOS without focusable(). Can this be done without using focusable() with conditions detecting which accessibility feature is on? The problem with focusable would be that for accessibility I may need to read a text for the parent view, but focus needs to be placed on a child element. I remember problems where, when focusable() is set on a parent view, the child was not focusable, or something like that - simply put, complications in focus logic. Thanks.
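One way to keep the conditional in a single wrapper instead of repeating it on hundreds of views (a sketch only: it covers the VoiceOver case, because I'm not aware of a public API that reports whether Hover Text is enabled, and the type name is illustrative):

import SwiftUI

// Makes otherwise-static content focusable only while VoiceOver is running,
// so sighted users don't have to step through static labels.
struct AccessibilityFocusableLabel<Content: View>: View {
    @Environment(\.accessibilityVoiceOverEnabled) private var voiceOverOn

    private let label: String
    private let content: Content

    init(_ label: String, @ViewBuilder content: () -> Content) {
        self.label = label
        self.content = content()
    }

    var body: some View {
        content
            .focusable(voiceOverOn)
            .accessibilityElement(children: .combine)
            .accessibilityLabel(label)
    }
}

// Usage:
// AccessibilityFocusableLabel("Sci-Fi, 8 to 10pm, Remaining 40 min.") {
//     Text("Sci-Fi * 8pm - 10pm * Remaining 40 min. * Live")
// }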
Replies: 1 · Boosts: 0 · Views: 391 · Activity: 3w
Increased and Mismatched Audio Buffer Sizes on iOS 18 when Sound Recognition or Vocal Shortcuts Is Enabled
Description

As of iOS 18, AVAudioSession.setPreferredIOBufferDuration ignores the requested buffer size when Sound Recognition or Vocal Shortcuts is enabled. This results in (1) much larger buffer sizes and (2) mismatched buffer sizes between input and output buffers, which causes 'glitchy' audio and increased latency. Additionally, when this issue occurs, AVAudioSession.setPreferredIOBufferDuration continues to return 'true' and no error is produced.

Steps to Reproduce:

1. Enable Vocal Shortcuts on a device running iOS 18. Enable at least one shortcut (e.g. Control Center).
2. Open or clone the example project (https://github.com/cwalo/SoundRecognitionBug).
3. Build and install the example project.
4. Attach a headset, launch the application, and observe console logs showing a requested buffer size of 0.005805 (256 samples @ 48k) and an actual buffer size of 0.023220 (1104 samples @ 48k - this is regularly the resulting buffer size in all of our tests).
5. Quit the app and detach the headset. Enable mutesOutput in AudioSystem.mm (to avoid feedback).
6. Launch the application.
7. Observe: the same result as step 4; a mismatched hardware buffer size of 1104 and recorded frame count of 1024; mismatched playbackCount and recordCount.
8. Quit the app and disable Vocal Shortcuts.
9. Launch the app.
10. Observe IOBufferDuration matching the requested duration and matched buffer sizes (expected behavior).

Expected results:

- The requested IOBufferDuration is respected, or AVAudioSession returns false, or an error is produced.
- Input and output buffer sizes match.

Device(s): iPhone 11 Pro, iPad Pro
OS: iOS 18.0.1
Environment: Xcode 16.1
FB: FB15715421
Related to: https://forums.developer.apple.com/forums/thread/765477
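For anyone reproducing this outside the linked project, a minimal sketch of the session configuration the report describes (category, values, and logging are illustrative; error handling is trimmed):

import AVFoundation

// Request a 256-sample I/O buffer at 48 kHz and log what the session actually
// grants. Per the report, with Sound Recognition or Vocal Shortcuts enabled on
// iOS 18 the actual duration comes back ~0.02322 s even though no error is thrown.
func configureIOBuffer() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .measurement, options: [])
    try session.setPreferredSampleRate(48_000)
    try session.setPreferredIOBufferDuration(256.0 / 48_000.0)   // ≈ 0.005805 s
    try session.setActive(true)
    print("requested:", 256.0 / 48_000.0, "actual:", session.ioBufferDuration)
}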
Replies: 2 · Boosts: 2 · Views: 274 · Activity: 1w
SwiftUI Chart accessibility issue
Hi Team, we are integrating SwiftUI Charts' BarMark. The UI looks good, but when we try setting up custom accessibility (ADA), it doesn't reflect/override the accessibility label/value we set manually. Is this an iOS defect, or is there a workaround? Thanks in advance. Sample:

Chart(data) {
    BarMark(
        x: .value("Category", $0.department),
        y: .value("Profit", $0.profit)
    )
    .foregroundStyle(by: .value("Product Category", $0.productCategory))
    .accessibilityIdentifier("BarMark")
    .accessibilityLabel("Dep: \($0.department)")
    .accessibilityValue("Profile: \($0.profit) Category: \($0.productCategory)")
}
Replies: 0 · Boosts: 0 · Views: 311 · Activity: Nov ’24
Could not use 'Keyboard Navigation' to navigate to inline navigation title on iOS 18 & iOS 17
Hello everyone, I'm experiencing an issue with the accessibility feature "Keyboard Navigation" in iOS 18.0. Specifically, when enabling the "Allow Full Keyboard Access" option and attempting to navigate to inline navigation titles within stock iOS apps, the navigation doesn't seem to work as expected. Here's how to replicate the issue:

1. Enable the accessibility option Allow Full Keyboard Access (Settings > Accessibility > Keyboards > Full Keyboard Access).
2. Open any stock iOS app that uses an inline navigation style (for example, the Mail or Settings app).
3. Press the Tab key to cycle through items on the inline navigation bar.

In iOS 18.0, pressing the Tab key does not allow navigation to the inline navigation title, which was previously possible in iOS 16. This issue is specific to iOS 18 and iOS 17, as it worked fine on the earlier version (iOS 16). Has anyone else encountered this issue or have suggestions for a workaround? Would love to hear your thoughts.
Replies: 0 · Boosts: 0 · Views: 250 · Activity: Nov ’24
iOS 18.1 Deeplink to Wallpaper settings
Prior to iOS 18.1, App-prefs:Wallpaper deeplinked to the Wallpaper settings. On iOS 18.1, it now deeplinks to the Settings app. Is there a new URL to deeplink to the Wallpaper settings? The following URLs have been tested and do not work on iOS 18.1:

App-prefs:Wallpaper
App-prefs:wallpaper
App-prefs:root=wallpaper
App-prefs:root=Wallpaper
App-prefs:root=WALLPAPER
App-prefs:root=General&path=Wallpaper
prefs:root=Wallpaper
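For context, a sketch of the call pattern in question, using the pre-18.1 URL from the list above (App-prefs is a private scheme, so whether a replacement exists is exactly the open question):

import UIKit

// Attempts the old Wallpaper deep link. On iOS 18.1 this reportedly lands on
// the Settings app itself instead of the Wallpaper section. Note that
// canOpenURL for a custom scheme also requires listing the scheme under
// LSApplicationQueriesSchemes.
func openWallpaperSettings() {
    guard let url = URL(string: "App-prefs:Wallpaper"),
          UIApplication.shared.canOpenURL(url) else { return }
    UIApplication.shared.open(url)
}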
Replies: 1 · Boosts: 6 · Views: 390 · Activity: Nov ’24
VoiceOver: Detect Languages
My app does not automatically switch languages (voices) in VoiceOver when VoiceOver is on and the screen includes both English and Spanish content. Instead of switching to the correctly accented voice, the content is announced in whatever my manual Voices rotor setting is. I can manually switch the voice in the rotor to make words sound intelligible, but my main concern is that language changes are not auto-detected even though that feature is on in my Settings. VoiceOver does detect language changes in other apps, so I think there must be either misplaced or missing accessibilityLanguage strings somewhere in my app. Or is it more than that, for localization considerations?

I reached out to the Apple Accessibility team and was directed to open a ticket here, as my question is about the underlying code. I am a novice developer and primarily an accessibility SME; I expect that when "Detect Languages" is on in the user's VoiceOver settings, the screen reader's speech output will automatically switch to the correct language/accent. I recognize there is a problem but am not sure where the breakdown is. I would like guidance on how to fix it to relay to my teams.

https://developer.apple.com/documentation/objectivec/nsobject/1615192-accessibilitylanguage
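If the Spanish strings live in specific views, one thing to check is whether those elements carry a language tag at all. A minimal sketch using the accessibilityLanguage property you linked, assuming a UIKit label whose content is Spanish inside an otherwise English app (names and strings are illustrative):

import UIKit

func configure(spanishGreeting: UILabel) {
    spanishGreeting.text = "Bienvenido a la aplicación"
    // BCP 47 language code; VoiceOver's "Detect Languages" uses it to pick a
    // Spanish voice for this element only.
    spanishGreeting.accessibilityLanguage = "es-ES"
}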
Replies: 1 · Boosts: 0 · Views: 342 · Activity: Nov ’24
Torch Strobe not working in light (ambient light) environments on iOS 18.1
Since iOS 18.1 was released, users of our app, which relies on strobing the device torch, have been experiencing problems. We have narrowed this down to devices with an adaptive True Tone flash and have submitted a radar: FB15787160.

The issue seems to be caused by ambient light levels. If run in a dark room, the torch strobes exactly as effectively as in previous iOS versions. If run in a light room, or outdoors, or near a window, the strobe will run for ~1s, then the torch gets stuck on for half a second or so (less frequently it gets stuck off), then it strobes again for ~1s, and this behaviour repeats indefinitely. If we go to a darker environment and background and then foreground the app (this is required), the issue is resolved, until moving to an area with higher ambient light levels again.

We have done a lot of debugging, and also discovered that turning off "Auto-Brightness" in Settings -> Accessibility -> Display & Text Size resolves the issue. We have also viewed logs from Console.app at the time the issue occurs, and there are quite sporadic ambient light level readings at that moment. The light readings transition from ~100 lux to ~8000 lux at the point the issue starts occurring (seemingly caused by the rear sensor being affected by the torch). With "Auto-Brightness" turned off, these readings stay at lower levels.

This is rendering the primary use case of our app essentially useless, so it would be great to get to the bottom of it! We can't even really detect it in-app, as I believe using SensorKit is restricted to research applications and requires a review process with Apple before access is granted.

Edit: it's worth noting this is also affecting other apps with strobe functionality in exactly the same way.
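For reference, a minimal sketch of the kind of torch toggling involved (the strobe timing, function name, and error handling are illustrative; the report is about iOS 18.1 behaviour, not about this code being wrong):

import AVFoundation

// Turn the torch on or off; a strobe repeatedly alternates these calls on a timer.
func setTorch(_ on: Bool) {
    guard let device = AVCaptureDevice.default(for: .video),
          device.hasTorch else { return }
    do {
        try device.lockForConfiguration()
        device.torchMode = on ? .on : .off
        device.unlockForConfiguration()
    } catch {
        print("Torch configuration failed: \(error)")
    }
}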
Replies: 3 · Boosts: 5 · Views: 293 · Activity: 4w
How can we make elements which are grouped accessible for automation
I have a stack view which has 2 labels:

class TextView: UIView {
    @IBOutlet private weak var stackView: UIStackView! {
        didSet {
            stackView.isAccessibilityElement = true
            stackView.accessibilityLabel = (label1.text ?? "") + (label2.text ?? "")
        }
    }

    @IBOutlet private weak var label1: UILabel! {
        didSet {
            label1.accessibilityIdentifier = "label1"
        }
    }

    @IBOutlet private weak var label2: UILabel! {
        didSet {
            label2.accessibilityIdentifier = "label2"
        }
    }
}

My goal here is to have a combined accessibility label for the stack view and yet be able to access the accessibilityIdentifier of the child elements for automation.
Replies: 0 · Boosts: 0 · Views: 883 · Activity: Nov ’24
VoiceOver needs to support CFBundleSpokenName
VoiceOver does not support the plist property CFBundleSpokenName. This is wrong and should be fixed.

Ultimately the issue I am dealing with is that our app name is UWCU, and instead of VoiceOver pronouncing each letter, it tries to read it as a word and horribly butchers our organization's/app's name. Alternatives such as using U.W.C.U. and U W C U are not acceptable.

@Apple, I know your first response is going to be "no, it is working perfectly," but quite frankly you are wrong. I know you feel strongly about this, given your response in posts like this: https://forums.developer.apple.com/forums/thread/734545?answerId=760084022

HOWEVER, with iOS 18, your argument that "VoiceOver should read what's on the screen" doesn't hold water anymore. With iOS 18, you, Apple, have added a new feature that lets users customize their home screens and completely remove the names of apps. Here's your own guide: https://support.apple.com/guide/iphone/customize-apps-and-widgets-on-the-home-screen-iph385473442/ios

Quoted from your guide: "Make the icons bigger: Tap Large. (In large size, the names of the apps disappear.)"

With large icons and VoiceOver turned on, VoiceOver still reads the app name even though it has disappeared from the screen. So your own argument that "VoiceOver should read the text as it appears on the screen" is invalid, because there is NO text on the screen.

If you can't tell, I'm pretty peeved about all this. There's a reason why screen readers support ARIA attributes to help deliver the right accessible experience. It's a simple ask for VoiceOver to do the same thing.
Replies: 3 · Boosts: 0 · Views: 539 · Activity: Oct ’24
UIApplication.shared.open
In iOS 17, the call UIApplication.shared.open("App-prefs:ACCESSIBILITY&path=HEARING_AID_TITLE") opened the device Settings and went to Accessibility and then Hearing Devices, which was very helpful. In iOS 18, this call only opens the device Settings at the root. I would like to know how to replace the URL so that it works like before. canOpenURL does return true, so I'm wondering if something is broken, or if canOpenURL is kind of lying a bit. I also tried other paths to go to other screens and they don't work either.
Replies: 1 · Boosts: 0 · Views: 400 · Activity: Nov ’24
"AVSpeechSynthesisVoice" choppy at start.
So, I'm trying to create my own text-to-speech setup. The problem I'm having is that whenever I do a test run, the speech gets a bit choppy at the start, kind of skipping over a word or a few characters. A few details:

- I've essentially built a separate class for handling the speech events.
- AVSpeechSynthesizer is set up as a private variable of the class, so I don't expect deallocation to be the issue, especially since the problem is at the start.
- I've got a queue set up, for what it's worth, so that shouldn't be a problem.

I'd appreciate any advice.
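For comparison, a minimal sketch of the setup described, with the synthesizer as a long-lived stored property so deallocation can't cut speech off. The class name, voice, and the preUtteranceDelay mitigation are illustrative assumptions, not a known fix:

import AVFoundation

final class SpeechManager {
    // Kept alive for the lifetime of the manager so utterances aren't cut off
    // by a deallocated synthesizer.
    private let synthesizer = AVSpeechSynthesizer()

    func speak(_ text: String) {
        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
        // A short pre-utterance delay sometimes hides clipping of the first
        // word while the audio session spins up (worth experimenting with).
        utterance.preUtteranceDelay = 0.1
        synthesizer.speak(utterance)
    }
}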
Replies: 2 · Boosts: 0 · Views: 219 · Activity: Oct ’24
Resetting the selected accessibility action on a button
I have a record button that either starts or stops a recording using the default action. When the user is recording, I want to add a custom action to discard the recording instead of saving it. That all works fine with the following code:

if isRecording {
    recordButton.accessibilityCustomActions = [
        .init(name: String(localized: "discard recording"), actionHandler: { [weak delegate] _ in
            delegate?.discardRecording()
            return true
        })
    ]
    recordButton.accessibilityLabel = String(localized: "stop recording", comment: "accessibility label")
} else {
    recordButton.accessibilityCustomActions = []
    recordButton.accessibilityLabel = String(localized: "start recording", comment: "accessibility label")
}

The problem I have is that once a user has chosen "discard recording", it becomes the default selected action again the next time the user records, and instead of stopping and saving the recording, the user might accidentally discard the next one as well. How can I programmatically reset the selected action on this recordButton to the default action?
Replies: 0 · Boosts: 0 · Views: 245 · Activity: Oct ’24
Add words to Voice Control
I want to create a utility to import a list of words into the Voice Control user custom vocabulary. Is there an API to do this? I noticed that if you use the built-in export vocabulary functionality (Settings > Accessibility > Voice Control > ...), the file that gets exported is a plist document type. If there is no API to add words programmatically, should I just create a utility that generates a plist file and import it using the built-in import vocabulary functionality (Settings > Accessibility > Voice Control > ...)?
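I'm not aware of a public API for adding Voice Control vocabulary programmatically, so the fallback you describe (generate a plist and import it by hand) is probably the practical route. A sketch of the file-writing half, with the important caveat that the array-of-strings layout below is a placeholder, not the documented schema; the real structure should be copied from a file exported via Settings > Accessibility > Voice Control:

import Foundation

// Serialize a word list to an XML property list. Inspect an exported vocabulary
// file and mirror its actual structure instead of this placeholder array.
func writeVocabulary(_ words: [String], to url: URL) throws {
    let data = try PropertyListSerialization.data(
        fromPropertyList: words,
        format: .xml,
        options: 0
    )
    try data.write(to: url, options: .atomic)
}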
Replies: 3 · Boosts: 0 · Views: 268 · Activity: Oct ’24