When my parent container is disabled and I use VoiceOver, it reads out all the components in the container as "Dimmed" or "Disabled" (the iOS 16.4 simulator says "Disabled", a 16.6 actual device says "Dimmed", but I digress...).
I was hoping to be able to enable one of the components in the container, but VoiceOver continues to report the item as Disabled/Dimmed (see code below).
Any suggestions on how I might stop VoiceOver from reporting Text("Heading") as Dimmed/Disabled?
struct ContentView: View {
    @State var inputText: String = ""

    var body: some View {
        VStack {
            Text("Heading")
                .disabled(false)
            TextField("Type some text", text: $inputText)
        }
        .disabled(true)
    }
}
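In SwiftUI, an inner .disabled(false) cannot re-enable a child inside a container that has .disabled(true), because the environment's enabled flag is combined with a logical AND down the view tree. A minimal workaround sketch, assuming only the TextField should actually be disabled, is to move the modifier off the container:

```swift
import SwiftUI

// Workaround sketch: disable only the views that should be disabled,
// instead of disabling the whole container. An inner .disabled(false)
// cannot override an outer .disabled(true).
struct ContentView: View {
    @State private var inputText: String = ""

    var body: some View {
        VStack {
            Text("Heading") // never disabled, so VoiceOver won't read it as "Dimmed"
            TextField("Type some text", text: $inputText)
                .disabled(true) // only the field is disabled
        }
    }
}
```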
Accessibility
Make your apps function for a broad range of users using Accessibility APIs across all Apple platforms.
Posts under Accessibility tag
134 Posts
When I run the code below and enable VoiceOver on my iPhone running iOS 16.6, it navigates from "email1" to "email2".
When I run it on an iPhone simulator running iOS 16.4 and use the Accessibility Inspector, it navigates from "email1" to the first textField, and then to "email2".
When I delete the .accessibilityRepresentation, it navigates from "email1" to the first textField, and then to "email2" on both simulator and device.
Am I using .accessibilityRepresentation incorrectly? Or did something change between 16.4 and 16.6? Or is there a better question I should be asking?
Any and all help greatly appreciated.
struct ContentView: View {
    @State var input1: String = ""
    @State var input2: String = ""

    var body: some View {
        VStack {
            TextInput(title: "email1", inputText: $input1)
            TextInput(title: "email2", inputText: $input2)
        }
    }
}

struct TextInput: View {
    let title: String
    @Binding var inputText: String

    init(title: String, inputText: Binding<String>) {
        self.title = title
        self._inputText = inputText
    }

    var body: some View {
        VStack {
            Text(title)
            TextField("enter Text", text: $inputText)
                .accessibilityRepresentation {
                    TextField("enter Text", text: $inputText)
                }
        }
    }
}
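As an aside, a representation that is just another TextField bound to the same text adds no information for assistive technologies. One possible alternative sketch (not from the original post) is to drop the representation and instead give the field an explicit label with the row's title, so VoiceOver announces the title and the field in one stop:

```swift
import SwiftUI

// Alternative sketch: label the field directly instead of wrapping it
// in an identical .accessibilityRepresentation, which can perturb
// the VoiceOver focus order.
struct TextInput: View {
    let title: String
    @Binding var inputText: String

    var body: some View {
        VStack {
            Text(title)
            TextField("enter Text", text: $inputText)
                .accessibilityLabel(title) // VoiceOver reads e.g. "email1, text field"
        }
    }
}
```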
Hello, I am implementing VoiceOver support for a UITextField which contains a currency field. The drawback is that VoiceOver reads that field in dollars, while the currency I need to implement is a different one. I have tried using NumberFormatter, and it shows the type of currency I need, but in the end VoiceOver still pronounces the word "dollar".
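One possible workaround sketch (an assumption, not a confirmed fix): instead of letting VoiceOver interpret the currency symbol, build an accessibilityLabel that spells out the currency name via Locale's localizedString(forCurrencyCode:). The "EUR" code and the locale below are placeholders for your real values:

```swift
import Foundation

// Sketch: produce a VoiceOver-friendly string that spells out the
// currency name ("Euro") so VoiceOver cannot misread the symbol.
func accessibleCurrencyLabel(amount: Double, currencyCode: String, locale: Locale) -> String {
    let formatter = NumberFormatter()
    formatter.numberStyle = .decimal
    formatter.locale = locale
    let number = formatter.string(from: NSNumber(value: amount)) ?? "\(amount)"
    // Resolve the spoken currency name from the code, falling back to the code itself.
    let currencyName = locale.localizedString(forCurrencyCode: currencyCode) ?? currencyCode
    return "\(number) \(currencyName)"
}
```

You would then assign the result to the text field, e.g. `textField.accessibilityLabel = accessibleCurrencyLabel(amount: 1234.56, currencyCode: "EUR", locale: .current)`, while the visible text keeps the symbol-based formatting.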
Good morning,
my company needs to develop a screen reader (like VoiceOver more-or-less) with added custom features for blind people.
I wanted to know if there's any possibility to implement a third-party screen reader on iOS, or if someone could suggest any workaround for it.
Thank you
Hello all,
I have an application which retrieves URL from browser using accessibility.
From time to time I am having this crash and I don't know the reason for it.
Could you please help me?
This is the stacktrace:
It is happening on macOS 13.5, in an application built for x86_64 and arm64.
My iPad (8th generation) running iPadOS 17.0 RC does not list Personal Voice in the Accessibility settings. Is Personal Voice creation supported on iPad? If so, on which iPad models is it supported?
*** Crash Message: -[UIWindow _accessibilityFindSubviewDescendant:]: unrecognized selector sent to instance 0x13118ca70 ***
We have collected more than 10K reports with the same call stack, almost all from the Accessibility frameworks, but we have no idea how to fix it or why it happens.
Could we get some help, please?
4 CoreFoundation 0x00000001d0735900 _CF_forwarding_prep_0 + 96
5 AccessibilitySettingsLoader 0x000000025a97608c 70360165-7515-35AD-9723-C4719EB48D13 + 102540
6 AccessibilityUtilities 0x00000001d9bfa720 AXPerformSafeBlockWithErrorHandler + 112
7 AccessibilityUtilities 0x00000001d9bfb258 AXPerformSafeBlock + 56
8 AccessibilitySettingsLoader 0x000000025a976014 70360165-7515-35AD-9723-C4719EB48D13 + 102420
9 AccessibilitySettingsLoader 0x000000025a975ed8 70360165-7515-35AD-9723-C4719EB48D13 + 102104
10 AccessibilitySettingsLoader 0x000000025a97720c 70360165-7515-35AD-9723-C4719EB48D13 + 107020
11 AccessibilityUtilities 0x00000001d9c7be40 265BEA5E-C36A-3E51-A119-A3FD42F3DB5C + 552512
12 AXCoreUtilities 0x00000001db850530 AXPerformBlockSynchronouslyOnMainThread + 72
13 AXCoreUtilities 0x00000001db84faa0 774B92BE-E295-3FDC-99A4-5EFB55BA70C5 + 6816
14 AccessibilityUtilities 0x00000001d9c7b8d0 265BEA5E-C36A-3E51-A119-A3FD42F3DB5C + 551120
15 AccessibilitySettingsLoader 0x000000025a976ed4 70360165-7515-35AD-9723-C4719EB48D13 + 106196
16 AccessibilityUtilities 0x00000001d9c74178 265BEA5E-C36A-3E51-A119-A3FD42F3DB5C + 520568
17 AccessibilityUtilities 0x00000001d9bfff54 265BEA5E-C36A-3E51-A119-A3FD42F3DB5C + 44884
18 AccessibilityUtilities 0x00000001d9bfca28 265BEA5E-C36A-3E51-A119-A3FD42F3DB5C + 31272
19 libdispatch.dylib 0x00000001d7b7feac 5D16936B-4E4C-3276-BA7A-69C9BC760ABA + 16044
Hi there,
I'm wondering about how certain keyboard keystrokes should work when using an external keyboard to navigate apps.
In particular, I'm wondering if there is a difference between the arrow keys and the Ctrl+Shift key. The iPhone keyboard shortcut documentation states that Ctrl+Tab moves to the next item, but doesn't elaborate on what an 'item' is.
Should you be able to get to every interactive element in a view with both the arrow keys and the Ctrl+Shift key?
Thanks for your help
If you have an iPhone with a Home button, but the Home button is broken, and you factory reset the device, the AssistiveTouch accessibility feature will be disabled, and it's quite complicated to turn it back on. When first powered on after a reset, or after choosing "start over" in the setup process, the screen gives one option, and one option only: "press home button to start".
And if home button is broken, you are STUCK. Completely. There is no Siri, no voice control, no voiceover. Nothing!
The only workaround we found was that, after restarting the device again after resetting, it would immediately show the languages list, and after choosing a language it showed the countries, and after choosing a country it FINALLY showed the "quick start" page, which has the accessibility shortcut button at the top right, from where I was able to enable AssistiveTouch.
BUT: if at any point I chose to start the setup over, I would have to wait for the device to restart, show the "press home to start" screen AGAIN, restart it AGAIN myself, so that it would show the languages, then countries, then the accessibility shortcut again.
Why not preserve accessibility settings across "start setup over"? And why not show the accessibility shortcut on the FIRST page shown to the user, whether iOS shows the Hello screen whose only option is to press Home (which is no good if the button is broken or the user needs accessibility options) or the languages list?
Just show that shortcut right away, everywhere, and preserve accessibility settings across "start over" option of setup.
Thanks.
Why oh why did you have to mess with the login screen in Sonoma??? For a visually impaired person (like my wife) you have made the login/lock screen very unfriendly compared to previous releases. With Ventura and previous versions, I have the Lock Screen settings set to:
List of Users
Show Sleep, Restart, Shut Down buttons
The avatar pics of the three users on our computers (admin, me, wife) show up as big icons in the middle of the screen, with the Sleep, Restart, Shutdown buttons in a row right beneath the icons. My wife can find her avatar, click on it, type in her password, and then get right to her magnified closeview screen. With Sonoma, she will have to deal with small moving avatar pics at the bottom of the screen, not friendly at all. And she will NEVER find the Shutdown button hidden in the upper right menu bar.
Doesn't Apple test new upgrades with the accessibility community??? Sonoma is a big step backward for the visually impaired.
Hello, sorry if this is a repost, but I couldn't find an answer anywhere.
I'm French, thus dates are displayed dd/mm/yy for us.
On the DatePicker, when the .wheels mode is enabled, I can see the d/m/y wheels.
Now, let's say there are 30 days in the current month and we are currently on the 27th of September.
The voiceOver will announce "27... 12 elements on the list"
Switching to the month will announce "9... 30 elements on the list"
The correct result should be "27... 30 elements on the list" and "9... 12 elements on the list"
I don't know if it is related, but for some reason, even though my phone is set to language French, region France,
Locale.current returns en_FR (and my app supports French and English, so I was expecting fr_FR).
So I tried to set the locale and the accessibility language of the picker to French, but the result is the same.
Another problem occurs with the date picker set to .inline:
months are displayed in French, and VoiceOver announces them in French, but as soon as I switch to the days it announces the numbers in English.
I tried on a new project with nothing but the picker and I still get the same results.
Are those bugs, or am I missing something?
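For reference, the attempted workaround described above (forcing the picker's locale explicitly rather than relying on Locale.current) would look roughly like this sketch; fr_FR is the locale the poster expected:

```swift
import SwiftUI

// Sketch: pin the picker's locale to fr_FR via the environment,
// instead of relying on Locale.current (observed resolving to en_FR).
struct LocalizedDatePicker: View {
    @State private var date = Date()

    var body: some View {
        DatePicker("Date", selection: $date, displayedComponents: .date)
            .datePickerStyle(.wheel)
            .environment(\.locale, Locale(identifier: "fr_FR"))
    }
}
```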
I'm wondering if now would be an excellent time for Apple to consider implementing Accessibility Lens with FaceTime Chroma Green. Unlike Apple, other platforms like Zoom and Webex allow using Chroma Green backgrounds. I use Cam Studio, Elgato Camera Hub, and OBS for Chroma Green effects. I'd like to have the option to use FaceTime with a Chroma Green and choose my background setting. For accessibility and professionalism, we need the ability to change the background settings. We want to be creative with our FaceTime and FaceTime Group. We've invested much in our devices, including iPhones, iPads, and MacBook Pros.
The issue stems from Apple's built-in applications using hardened runtimes. These runtimes prevent apps from loading third-party plugins unless explicitly allowed by the developers. This means third-party camera drivers are incompatible with Apple apps. We're trying to find a solution, but currently, there's nothing we can do. This is a barrier for all of us who are Deaf, Deaf-Blind, and Hard of Hearing and rely on FaceTime and FaceTime Group with Chroma Green background settings. Please let me know. Thank you!
Our users are using Apple's native Voice Control feature: https://support.apple.com/en-us/HT210417
We want to enhance our accessibility experience by adding some additional voice controlled dialogs that show up specifically when Voice Control is enabled.
It can be determined whether other Apple accessibility features are turned on via a check like UIAccessibility.isVoiceOverRunning; however, there is no equivalent for Voice Control (note: different from VoiceOver).
How can I detect if a user is running Voice Control or not?
Our accessibility users are using Apple's Voice Control feature: https://support.apple.com/en-us/HT210417
Voice Control does not work with Bluetooth devices. It doesn't seem to be natively supported by Apple, and only the iOS device's microphone works. AirPods, Bose headphones, and Jabra Bluetooth devices do NOT work.
Is there a way to get our application to work with Bluetooth devices? The current experience doesn't work for our accessibility users, as they are physically unable to move closer to the device, so Bluetooth headphones are critical for their user experience to work correctly.
Hi
I'm using a library in my project which creates, modifies, and reads a file in iConf. How can my app get access to read from that file?
Thanks,
Filip
I'm trying to make an accessible custom keyboard extension written in SwiftUI using the latest version of Xcode (15) and the iOS SDK (17). In order to use SwiftUI, I've started with UIInputViewController (UIKit) and have added a UIHostingController to act as a bridge between SwiftUI and UIKit.
The keyboard works well, but when I enable VoiceOver on a device I can't select any elements within my custom keyboard and it becomes unusable. Below is a minimal reproduction of the issue. When VoiceOver is enabled, I am unable to select the single "Test" button. Unfortunately, VoiceOver is only available on devices, so it takes a bit more effort to try.
I look forward to any help or suggestions. Thanks!
import SwiftUI
import UIKit

class KeyboardViewController: UIInputViewController {
    open override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        setup(withRootView: Button("Test") {})
    }

    func setup<Content: View>(withRootView view: Content) {
        self.children.forEach { $0.removeFromParent() }
        self.view!.subviews.forEach { $0.removeFromSuperview() }
        addSubSwiftUIView(view, to: self.view)
    }
}

extension UIViewController {
    /// Add a SwiftUI `View` as a child of the input `UIView`.
    /// - Parameters:
    ///   - swiftUIView: The SwiftUI `View` to add as a child.
    ///   - view: The `UIView` instance to which the view should be added.
    func addSubSwiftUIView<Content>(_ swiftUIView: Content, to view: UIView) where Content: View {
        let hostingController = UIHostingController(rootView: swiftUIView)

        /// Add as a child of the current view controller.
        addChild(hostingController)

        /// Add the SwiftUI view to the view controller view hierarchy.
        view.addSubview(hostingController.view)

        /// Set up the constraints to update the SwiftUI view boundaries.
        hostingController.view.translatesAutoresizingMaskIntoConstraints = false
        let constraints = [
            hostingController.view.topAnchor.constraint(equalTo: view.topAnchor),
            hostingController.view.leftAnchor.constraint(equalTo: view.leftAnchor),
            view.bottomAnchor.constraint(equalTo: hostingController.view.bottomAnchor),
            view.rightAnchor.constraint(equalTo: hostingController.view.rightAnchor)
        ]
        NSLayoutConstraint.activate(constraints)

        /// Notify the hosting controller that it has been moved to the current view controller.
        hostingController.didMove(toParent: self)
    }
}
Some additional background:
I have been using KeyboardKit as a base for my custom keyboard. KeyboardKit uses this same method to bridge UIKit and SwiftUI and seems to have the same general problem. KeyboardKit has an app available where you can easily see this same issue. I will also post this on their issues board, but this seems to be an underlying issue with using UIHostingController.
Hi,
My app supports VoiceOver and VoiceControl, and I find it hard or perhaps even impossible to support both when it comes to grouping accessibility elements.
I have an element containing multiple subviews, and I want to define it as a group of accessibility custom actions to enhance VoiceOver navigation. (So that the user can focus on the entire element, and just swipe up or down if they want to activate one of its subviews as a custom action). To do that, I'm setting isAccessibilityElement to YES and I'm using accessibilityCustomActions where I create a custom action for each of this element's subviews.
This works fine and presents the accessibility custom actions as expected.
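For concreteness, the grouping described above looks roughly like this sketch; CardView, likeButton, and shareButton are hypothetical names standing in for the real container and subviews:

```swift
import UIKit

// Sketch of the described setup: the container becomes a single
// accessibility element whose subviews are exposed as custom actions.
final class CardView: UIView {
    let likeButton = UIButton(type: .system)
    let shareButton = UIButton(type: .system)

    func configureAccessibility() {
        // Group the card for VoiceOver: one focus stop, swipe up/down for actions.
        isAccessibilityElement = true
        accessibilityCustomActions = [
            UIAccessibilityCustomAction(name: "Like") { [weak self] _ in
                self?.likeButton.sendActions(for: .touchUpInside)
                return true
            },
            UIAccessibilityCustomAction(name: "Share") { [weak self] _ in
                self?.shareButton.sendActions(for: .touchUpInside)
                return true
            }
        ]
    }
}
```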
The struggle comes when I try to combine it with Voice Control. Since the element contains multiple subviews, I want each of these subviews to be accessible when using Voice Control with item numbers, for example. Instead, Voice Control is setting a number only on the entire element, ignoring its subviews. This is not functional because each of these subviews performs a different action when activating it.
I found that Voice Control only numbers subviews when the parent's isAccessibilityElement is set to NO. And indeed, when changing it to NO, the subviews get numbered, but the custom actions group breaks, and VoiceOver treats each subview as a separate accessibility element.
So I can't use accessibilityCustomActions AND support Voice Control.
Is this intended behavior defined by Apple? Or is there any way I can support them both?
VoiceOver is not treating the navigation bar's left button as the first focused element.
If we navigate from A to B, focus goes to the first element inside B's view, not to the left bar button item (the back button).
If we post an accessibility notification in viewWillAppear of B, focus does not shift.
If we post it in viewDidAppear, focus first goes to an element inside B's view and then shifts back to the back button.
This behaviour is inconsistent. Can you please help?
Thanks
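For reference, the notification-posting attempt described above would look roughly like this sketch; it assumes the back button is exposed as navigationItem.leftBarButtonItem:

```swift
import UIKit

// Sketch: in viewDidAppear, post a screen-changed notification with the
// back button as the argument, asking VoiceOver to move focus there.
final class DetailViewController: UIViewController {
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        if let backItem = navigationItem.leftBarButtonItem {
            UIAccessibility.post(notification: .screenChanged, argument: backItem)
        }
    }
}
```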
We launched an application we created, with Assistive Access enabled, on iOS 17.
When we tried to open a URL using the openURL method, the following error message was returned and the URL could not be opened:
"Untrusted open application requests are not allowed in Assistive Access."
Is there anything I can do to get them to trust our request and allow it?
We launched an application we created, with Assistive Access enabled, on iOS 17.
We tried to provide standard services such as Mail and Messages to users with a UIActivityViewController.
When selecting the Mail or Messages icon, the following error occurred:
"Not allowing open application request from unallowed client process."
Is there anything I can do to allow the request from our process?