Hello,
I am a student studying accessibility.
I aim to analyze the smartphone usage patterns of visually impaired individuals.
Therefore, I would like to log the VoiceOver usage records of visually impaired iPhone users.
Is there a way to output VoiceOver logs, similar to the AccessibilityService API on Android?
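The closest thing I've found so far is detecting whether VoiceOver is running, which does not log individual gestures or usage events. A minimal sketch of that:

import UIKit

// Observe VoiceOver availability. As far as I can tell there is no
// public iOS API for logging individual VoiceOver gestures or actions.
final class VoiceOverObserver {
    init(onChange: @escaping (Bool) -> Void) {
        NotificationCenter.default.addObserver(
            forName: UIAccessibility.voiceOverStatusDidChangeNotification,
            object: nil,
            queue: .main
        ) { _ in
            onChange(UIAccessibility.isVoiceOverRunning)
        }
        // Report the initial state as well.
        onChange(UIAccessibility.isVoiceOverRunning)
    }
}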
Thank you in advance for your responses.
In our application we use OTP login. With Full Keyboard Access enabled, entering a digit in an OTP field on iOS 17 moved focus to the next text field as expected, but on iOS 18 focus stays on the first OTP field and does not advance.
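As a point of comparison, the fallback we are considering is advancing first responder manually from the text field delegate. I don't know whether this restores the iOS 17 behavior under Full Keyboard Access; a sketch (field setup is assumed, each field's delegate set to the controller):

import UIKit

// Hypothetical sketch: advance focus to the next OTP box as each digit
// is typed, instead of relying on the system moving focus for us.
final class OTPViewController: UIViewController, UITextFieldDelegate {
    let otpFields: [UITextField] = (0..<6).map { _ in UITextField() }

    func textField(_ textField: UITextField,
                   shouldChangeCharactersIn range: NSRange,
                   replacementString string: String) -> Bool {
        guard string.count == 1 else { return true } // let deletions through
        textField.text = string
        if let index = otpFields.firstIndex(of: textField),
           index + 1 < otpFields.count {
            otpFields[index + 1].becomeFirstResponder()
        }
        return false
    }
}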
In our application we use a popover view with VoiceOver enabled. When the user navigates inside the popover and reaches the last element, a right swipe should dismiss the popover.
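A possibly related hook, to help frame the question: accessibilityPerformEscape() is the supported dismissal point for VoiceOver's two-finger scrub (Z) gesture, though it is not the right-swipe-past-last-element behavior we're after. A sketch:

import UIKit

final class PopoverContentViewController: UIViewController {
    // Called for VoiceOver's two-finger scrub (Z) gesture; the standard
    // way to let VoiceOver users dismiss a popover-style view.
    override func accessibilityPerformEscape() -> Bool {
        dismiss(animated: true)
        return true
    }
}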
In our application we have a search bar inside a popover, Full Keyboard Access is enabled, and the user is on an external keyboard. When focus is on the search bar, the next Tab key press should dismiss the search bar and shift focus to the next UI element.
In our application we use a UITableView to populate data, and each table view cell contains a button. With Full Keyboard Access enabled, only the cell receives focus, not the button. We need the cell and the button to be focusable separately.
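One direction we are evaluating (not yet verified to fix this case): Full Keyboard Access follows the UIKit focus system, so giving the cell and its button separate focus groups might make them separately reachable. A sketch with assumed identifiers:

import UIKit

// Sketch: distinct focus groups so Full Keyboard Access can treat
// the cell and the embedded button as separate stops.
final class ActionCell: UITableViewCell {
    let actionButton = UIButton(type: .system)

    override init(style: UITableViewCell.CellStyle, reuseIdentifier: String?) {
        super.init(style: style, reuseIdentifier: reuseIdentifier)
        contentView.addSubview(actionButton)
        focusGroupIdentifier = "com.example.cell"              // assumed identifier
        actionButton.focusGroupIdentifier = "com.example.cell.button" // assumed identifier
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}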
I am trying to grant Input Monitoring permission using MDM (Mobile Device Management), but I am facing issues: while I am able to deny the permission, I am unable to grant it.
In some profile configurator tools, I noticed a note stating:
"Allows the application to use CoreGraphics and HID APIs to listen to (receive) CGEvents and HID events from all processes. Access to these events cannot be given in a profile; it can only be denied."
This seems to suggest that granting Input Monitoring permission via an MDM profile may not be possible.
Has anyone successfully granted Input Monitoring permission using MDM, or is there an alternative way to achieve this on managed macOS devices?
In our application we use UIAlertController, presented as a popover. With Full Keyboard Access enabled, pressing the Esc key on an external keyboard does not dismiss it. We need the alert controller to dismiss on an Esc key press from an external keyboard.
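A workaround we are considering is attaching a UIKeyCommand for the Escape key ourselves; a sketch (class and selector names are ours):

import UIKit

// Sketch: handle Esc from an external keyboard and dismiss the alert manually.
final class AlertHostViewController: UIViewController {
    func presentAlert() {
        let alert = UIAlertController(title: "Title", message: "Message", preferredStyle: .alert)
        alert.addAction(UIAlertAction(title: "OK", style: .default))
        // The action resolves up the responder chain to this controller.
        alert.addKeyCommand(UIKeyCommand(action: #selector(dismissAlert),
                                         input: UIKeyCommand.inputEscape))
        present(alert, animated: true)
    }

    @objc private func dismissAlert() {
        presentedViewController?.dismiss(animated: true)
    }
}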
Hello all.
Currently I am trying to get WKWebView to scroll with a physical keyboard and it just will not work. I tried allowsKeyboardScrolling() and it did not work. UIWebView works, but it's no longer supported. I'm trying to get Full Keyboard Access working to make our app more accessible, but WKWebView does not want to play nice.
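One thing I plan to verify: WKWebView scrolls through its underlying UIScrollView, and on iOS 17+ that scroll view exposes allowsKeyboardScrolling, which the SwiftUI modifier may not reach. A sketch, assuming a plain UIKit setup:

import WebKit

// Sketch: enable hardware-keyboard scrolling on the web view's
// underlying scroll view (property available on iOS 17+).
func makeWebView() -> WKWebView {
    let webView = WKWebView(frame: .zero)
    if #available(iOS 17.0, *) {
        webView.scrollView.allowsKeyboardScrolling = true
    }
    return webView
}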
Has anyone else had issues trying to use WKWebView with an external keyboard, and if so did you find any solutions? Greatly appreciated!
The home button cannot be pressed.
Attempting to enable Developer Mode says "press home to continue".
The usual AssistiveTouch accessibility switch shown on iOS is not available on this screen,
so Developer Mode cannot be enabled.
This is an accessibility issue.
When using an app via an external keyboard, FocusState and .focused used to work just fine until iOS 17. Vertical-axis text fields were also accessible without any issues. But after the iOS 18 update, adding the focused modifier removes elements from the focus order of the external keyboard.
One such example: when a button using the focused modifier and @FocusState is inside a ScrollView, and that view is opened via a NavigationLink, the button is not accessible via a Bluetooth (external) keyboard.
TextEditor and vertical-axis TextFields also seem to be affected in the external-keyboard focus order when placed inside a ScrollView.
Is this a known iOS 18 issue with ScrollView, and is there any tip to get it fixed?
Sample code that can reproduce this issue:
struct ContentView: View {
    @State private var showBottomSheet: Bool = false
    @State private var goToNextView: Bool = false
    @FocusState private var focused: Bool
    @AccessibilityFocusState private var voFocused: Bool

    var body: some View {
        NavigationView {
            VStack {
                Text("Hello, world!")
                // This button works fine with a Bluetooth keyboard in all versions
                Button("Trigger a bottomsheet") {
                    showBottomSheet = true
                }
                .focused($focused)
                .accessibilityFocused($voFocused)
                Button("Goto another view") {
                    goToNextView = true
                }
                NavigationLink(
                    destination: View2(),
                    isActive: $goToNextView
                ) { EmptyView() }
                .accessibility(hidden: true)
            }
            .sheet(isPresented: $showBottomSheet,
                   onDismiss: {
                       focused = true
                       voFocused = true
                   }, content: {
                       VStack {
                           Text("Hello World ! I'm in a bottomsheet")
                           Button("Close me") {
                               showBottomSheet = false
                           }
                       }
                   })
            .padding()
        }
    }
}

#Preview {
    ContentView()
}

struct View2: View {
    @FocusState private var focused: Bool
    @AccessibilityFocusState private var voFocused: Bool
    @State private var showBottomSheet: Bool = false

    var body: some View {
        ScrollView {
            VStack {
                Text("check")
                // In iOS 18, this button doesn't get focused with a Bluetooth / external keyboard.
                // The issue occurs when these three combine in iOS 18: a button using FocusState,
                // inside a view that has a ScrollView, opened via NavigationLink.
                Button("Trigger a bottomsheet") {
                    showBottomSheet = true
                }
                .focused($focused)
                .accessibilityFocused($voFocused)
                Button("Test button") { }
            }
            .sheet(isPresented: $showBottomSheet,
                   onDismiss: {
                       focused = true
                       voFocused = true
                   }, content: {
                       VStack {
                           Text("Hello World ! I'm in a bottomsheet")
                           Button("Close me") {
                               showBottomSheet = false
                           }
                       }
                   })
            .padding()
        }
    }
}
I am trying to simulate a paste command, but it does not paste. It worked at one point with the same code and is now causing issues.
My code looks like this:
func simulatePaste() {
    guard let source = CGEventSource(stateID: .hidSystemState) else {
        print("Failed to create event source")
        return
    }
    // Virtual key code 9 is "V" on an ANSI keyboard layout.
    let keyDown = CGEvent(keyboardEventSource: source, virtualKey: CGKeyCode(9), keyDown: true)
    let keyUp = CGEvent(keyboardEventSource: source, virtualKey: CGKeyCode(9), keyDown: false)
    keyDown?.flags = .maskCommand
    keyUp?.flags = .maskCommand
    keyDown?.post(tap: .cgAnnotatedSessionEventTap)
    keyUp?.post(tap: .cgAnnotatedSessionEventTap)
    print("Simulated Cmd + V")
}
I know there are some issues around permissions, so in my Info.plist I have this:
<key>NSAppleEventsUsageDescription</key>
<string>This app requires permission to send keyboard input for pasting from the clipboard.</string>
I have also disabled the sandbox. It does ask whether I want to give the app permission, but after approving, it still doesn't paste.
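One thing I'm now double-checking: as far as I can tell, NSAppleEventsUsageDescription covers Apple Events, while posting CGEvents into the session is gated by the Accessibility permission (System Settings > Privacy & Security > Accessibility). A minimal status check:

import ApplicationServices

// Posting synthetic keyboard events requires the app to be trusted for
// Accessibility; this returns the current trust status. The user adds
// the app manually under Privacy & Security > Accessibility.
func canPostKeyboardEvents() -> Bool {
    AXIsProcessTrusted()
}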
I'm looking to integrate call / text / FaceTime history into my app while maintaining the necessary security for the end user. I only need the date stamp and a contact link or the name of the person communicated with; no access to the content of messages, etc.
How would this be accomplished?
I want to understand which component types are intended to have associated hint text, haptic feedback, or an earcon for VoiceOver screen reader users. Is there a list somewhere, or an HIG guideline for which transition types should have a sound?
Some transitions in Apple apps include distinct beep sounds, such as:
opening a new screen
the screen dimming
swiping from the header / navbar to the body
a scraping sound when swiping up or down a page
reaching the beginning or end of the body section
in Calculator, swiping from one row to the next
opening a pop-up menu
I would also appreciate any direction on which APIs these sounds are associated with, and how custom components can adopt these sounds, haptics, or hints where they are expected. On the other hand, I don't want to take that information and dictate that every component needs a specific beep type, since these sounds appear to be used for specific purposes.
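For custom components, the only public hooks I'm aware of are accessibility hints and posting UIAccessibility notifications, which play the corresponding standard VoiceOver earcons. A minimal sketch, with names of my own choosing:

import UIKit

final class CardView: UIView {
    func configureAccessibility() {
        isAccessibilityElement = true
        accessibilityLabel = "Order summary"
        // Spoken after a pause; users can disable hints globally.
        accessibilityHint = "Double tap to open details."
    }

    func didShowNewScreen() {
        // Plays the standard "screen changed" earcon and moves VoiceOver focus.
        UIAccessibility.post(notification: .screenChanged, argument: self)
    }

    func didChangeContentInPlace() {
        // Lighter variant for in-place layout changes.
        UIAccessibility.post(notification: .layoutChanged, argument: self)
    }
}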
When turning VoiceOver ON, GCController does not send button press events for "Button A" and "Button Center".
This happens when using the 2nd generation Siri Remote (with dedicated arrow buttons on the circle around the center button) and also when using the iOS remote. I didn't test it on the old 1st generation Siri Remote with a touchpad and no arrow buttons.
Example:
gameController.microGamepad?.allButtons.forEach { button in
    button.valueChangedHandler = { [weak self] _, _, _ in
        self?.buttonHandler(gameController: gameController, button: button)
    }
}

private func buttonHandler(gameController: GCController, button: GCControllerButtonInput) {
    print("BUTTON: Pressed \(button.description) isPressed=\(button.isPressed) isTouched=\(button.isTouched)")
}
VoiceOver ON (incorrect behavior):
BUTTON: Pressed Direction Pad Left (value: 0.030, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Direction Pad Down (value: 0.079, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Direction Pad Left (value: 0.000, pressed: 0) isPressed=false isTouched=false
BUTTON: Pressed Direction Pad Down (value: 0.000, pressed: 0) isPressed=false isTouched=false
VoiceOver OFF (correct behavior):
BUTTON: Pressed Direction Pad Left (value: 0.137, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Direction Pad Up (value: 0.078, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Button A (value: 1.000, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Button Center (value: 1.000, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Button A (value: 0.000, pressed: 0) isPressed=false isTouched=false
BUTTON: Pressed Button Center (value: 0.000, pressed: 0) isPressed=false isTouched=false
BUTTON: Pressed Direction Pad Left (value: 0.000, pressed: 0) isPressed=false isTouched=false
BUTTON: Pressed Direction Pad Up (value: 0.000, pressed: 0) isPressed=false isTouched=false
For detection I could use Direction Pad Left/Right/Up/Down, treat a position between -0.7 and +0.7 as a center button press, and handle it that way. I already do that on the old Siri Remote, where I need to distinguish the center button from the arrows (switching TV channels with Up/Down and skipping forward/back with Left/Right), but for the new Siri Remote it would be an unnecessary workaround.
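For completeness, a sketch of that workaround (the 0.7 threshold is my own figure, and handleSelectPressed() is a placeholder):

// Sketch of the center-press workaround described above; VoiceOver still
// delivers Direction Pad events, so a press near the pad's center can be
// treated as the select button.
if let dpad = gameController.microGamepad?.dpad {
    dpad.valueChangedHandler = { pad, x, y in
        let anyArmPressed = pad.up.isPressed || pad.down.isPressed
            || pad.left.isPressed || pad.right.isPressed
        if anyArmPressed && abs(x) < 0.7 && abs(y) < 0.7 {
            handleSelectPressed() // hypothetical handler
        }
    }
}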
Does anybody know why the center/select button is not detected when VoiceOver is ON? Is there another way of detecting it using GCController?
I don't want to use SwiftUI's onTapGesture for this one particular case.
Is this an unexpected bug in the tvOS APIs, or is there some specific reason why the center button is not handled by GCController when VoiceOver is ON?
Thanks.
I would like to suggest a different microphone dot icon for Voice Control. I had customized Voice Control to turn on the flashlight, which caused confusion because the orange dot was switched on constantly.
I mistakenly reported the orange microphone dot being always on while the iPhone is unlocked to Apple Security as a security vulnerability.
Hi,
I am writing in the hope of receiving some clarification about the rationale of the audit type sufficientElementDescription, in the context of the Accessibility Audit API.
Please see my test below:
And another example in context with Xcode, where the strings visible in the UI are also set as accessible labels of their respective elements.
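For reference, this is how I'm invoking the audit from a UI test (the test name is mine):

import XCTest

final class AuditTests: XCTestCase {
    func testSufficientElementDescription() throws {
        let app = XCUIApplication()
        app.launch()
        // Runs only the audit type in question; throws if issues are found.
        try app.performAccessibilityAudit(for: .sufficientElementDescription)
    }
}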
Thanks for your help!
A common UI idiom in Apple's first-party iOS apps is a circle icon with three dots in the upper right of the screen, which serves as a pop-up menu of more options. Some examples include:
Apple Music, Library tab
Photos, Album view
Reminders
In all these cases, VoiceOver reads this element as "More, Button".
In my SwiftUI app, I've implemented a visually identical button.
Menu {
    // Button for Menu Item 1
    // Button for Menu Item 2
    // ...
} label: {
    Image(systemName: "ellipsis.circle")
        .accessibilityHidden(true)
}
.accessibilityLabel("More")
However, the VoiceOver output in my app is much more verbose: it speaks "More, Button, Pop Up Button, Double Tap To Activate The Picker". Any guidance on how to make this more concise, in line with the apps mentioned above?
Using gesture recognizers, it is easy to implement a long-press gesture that opens a menu, shows a preview, or does something else on iOS, and you can specify the duration the user must hold down a finger before the recognizer fires.
But I have not yet found out how to determine the default duration for a long-press gesture as configured in the system's Accessibility settings under "Haptic Touch" (the available options there are fast, standard, and slow).
Is it possible to read out this setting, so my app can adapt to it as well?
Hello,
I’m developing a sandboxed macOS app using Qt, which will be distributed via the Mac App Store. The app:
Monitors the clipboard to store copied items (a sketch of this part follows the list).
Overrides the operating system's paste function via keyboard shortcuts.
Modifies clipboard content, replacing what the user pastes with stored data.
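For the monitoring part, the approach I'm aware of is polling NSPasteboard.changeCount on a timer, since the pasteboard posts no change notification; a minimal sketch, which I believe works inside the sandbox:

import AppKit

// Minimal clipboard-monitoring sketch: poll changeCount and read new
// string contents. The 0.5 s interval is an arbitrary choice of mine.
final class ClipboardWatcher {
    private var lastChangeCount = NSPasteboard.general.changeCount
    private var timer: Timer?

    func start(onCopy: @escaping (String) -> Void) {
        timer = Timer.scheduledTimer(withTimeInterval: 0.5, repeats: true) { [weak self] _ in
            guard let self else { return }
            let pasteboard = NSPasteboard.general
            guard pasteboard.changeCount != self.lastChangeCount else { return }
            self.lastChangeCount = pasteboard.changeCount
            if let text = pasteboard.string(forType: .string) {
                onCopy(text)
            }
        }
    }

    func stop() { timer?.invalidate() }
}

Intercepting the system-wide paste shortcut is the part I'm much less sure is sandbox-compatible, hence the questions below.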
So, I have some questions:
Can a sandboxed app continuously read and modify clipboard content?
What entitlements are required?
What permissions should I request from the user to ensure that my app works?
Any guidance would be greatly appreciated!
Thanks in advance!
Beril Bayram
Hello Apple Community,
I'm experiencing an issue with "Vocal Shortcuts" on iOS. I created a trigger in Vocal Shortcuts to run a specific shortcut, and it works perfectly on the first day. However, by the next day, the voice command stops functioning entirely. To make it work again, I have to disable and then re-enable Vocal Shortcuts in the settings.
I've tested this on multiple devices (iPhone 11, iPhone 13, and iPhone X), all running the latest iOS version, and the same problem occurs on each one. Is there any additional configuration needed, or could this be a bug?
Any advice or insights would be greatly appreciated!
Thank you in advance,