I have an application that binds a menu item to trigger on ⌘]. With the US input source set, I press ⌘] to trigger that item. However, when I switch the input source to QWERTZ (German), the OS automatically changes the trigger to ⌘Ä. It seems to translate keystrokes for different input sources.
The problem is that I also render the keybindings in a window in my application, and my application is not aware of this translation. Furthermore, I have other key shortcuts in my application which are not bound to menu items, and I want to make sure those get translated too.
Does AppKit expose a way to look up what a keystroke will become when macOS translates it, i.e. to look up ⌘Ä from ⌘] when the current layout is QWERTZ? I can't find anything in Apple's docs.
I tried converting a character to a virtual key code based on the US layout and then mapping it back to a character based on the QWERTZ layout. That doesn't produce the same result, because it ends up converting ] to + instead, which seems to be based on physical key location, different from how the key equivalents are handled.
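For reference, the keycode-to-character half of that round trip looks roughly like this (a minimal sketch using Carbon's UCKeyTranslate; the kUCKeyActionDisplay action and the empty modifier state are assumptions on my part):

import Carbon.HIToolbox

// Map a virtual key code to the character it produces under the
// current keyboard layout. Returns nil on any failure.
func character(for keyCode: UInt16) -> String? {
    guard
        let source = TISCopyCurrentKeyboardLayoutInputSource()?.takeRetainedValue(),
        let layoutDataRef = TISGetInputSourceProperty(source, kTISPropertyUnicodeKeyLayoutData)
    else { return nil }

    let layoutData = Unmanaged<CFData>.fromOpaque(layoutDataRef).takeUnretainedValue() as Data
    var deadKeyState: UInt32 = 0
    let maxLength = 4
    var actualLength = 0
    var characters = [UniChar](repeating: 0, count: maxLength)

    let status = layoutData.withUnsafeBytes { buffer -> OSStatus in
        let layout = buffer.bindMemory(to: UCKeyboardLayout.self).baseAddress!
        return UCKeyTranslate(layout,
                              keyCode,
                              UInt16(kUCKeyActionDisplay),
                              0, // no modifiers
                              UInt32(LMGetKbdType()),
                              OptionBits(kUCKeyTranslateNoDeadKeysBit),
                              &deadKeyState,
                              maxLength,
                              &actualLength,
                              &characters)
    }
    guard status == noErr else { return nil }
    return String(utf16CodeUnits: characters, count: actualLength)
}

This is exactly the path that yields + for the ] key code under QWERTZ, since it resolves by physical key position rather than by whatever rule AppKit applies to key equivalents.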
Update: I notice similar behavior in VS Code's menu bar, e.g. in their "Terminal" menu; switching to German changes some bindings. This does not occur at all in iTerm's menu bar, I suspect because their menu items are specified in a different way: xib files with hard-coded key equivalents.
A common UI idiom in Apple's first party iOS apps is a circle icon with three dots in the upper right of the screen. This serves as a pop-up menu of more options. Some examples include:
Apple Music, Library tab
Photos, Album view
Reminders
In all these cases, VoiceOver reads this element as "More, Button".
In my SwiftUI app, I've implemented a visually identical button.
Menu {
    // Button for Menu Item 1
    // Button for Menu Item 2
    // ...
} label: {
    Image(systemName: "ellipsis.circle")
        .accessibilityHidden(true)
}
.accessibilityLabel("More")
However, the VoiceOver output in my app is much more verbose. It speaks "More, Button, Pop Up Button, Double Tap To Activate The Picker". Any guidance on how to make this more concise in line with the apps mentioned above?
Hello!
I'm trying to improve the accessibility of a UIKit login form in our iOS app. If an error occurs, an error message is shown in a label that is hidden by default. For our VoiceOver users, I want to move the focus to the error message label so that VoiceOver reads out the error message.
I'm trying to achieve this using UIAccessibility.post, but try as I might, it does not work. To better understand the problem, I created a very simple App which shows a button and a label (always visible), and on pressing the button, I post an accessibility notification:
UIAccessibility.post(notification: .layoutChanged, argument: label)
What I expect to happen is for the focus to move from the button to the label. What happens instead is the focus stays with the button and VoiceOver reads out the button's label again. So it seems to process the notification, but ignore the argument.
Am I misunderstanding how accessibility notifications work, or is this simply broken at the moment? I am testing this with my iPhone on the current iOS version, 18.2.1.
By the way, using the more modern variant leads to the same result:
AccessibilityNotification.LayoutChanged(label).post()
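For reference, here is essentially the whole test app (a minimal sketch; the layout code is trimmed down):

import UIKit

class ViewController: UIViewController {
    private let label = UILabel()
    private let button = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .systemBackground
        label.text = "VoiceOver should land here"
        button.setTitle("Post notification", for: .normal)
        button.addTarget(self, action: #selector(didTapButton), for: .touchUpInside)

        let stack = UIStackView(arrangedSubviews: [button, label])
        stack.axis = .vertical
        stack.spacing = 16
        stack.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(stack)
        NSLayoutConstraint.activate([
            stack.centerXAnchor.constraint(equalTo: view.centerXAnchor),
            stack.centerYAnchor.constraint(equalTo: view.centerYAnchor)
        ])
    }

    @objc private func didTapButton() {
        // Expected: VoiceOver focus moves to the label and reads its text.
        // Observed: focus stays on the button, which is announced again.
        UIAccessibility.post(notification: .layoutChanged, argument: label)
    }
}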
I updated to the 18.3 beta, but lost video and audio with that update. I tried to restore with iTunes on Windows 11, but it got stuck partway through. I forcefully turned off the iPad; it took two tries to get it off... off like a blanked-out screen. Since then I've tried all the tricks to turn it on, but it won't power on. Please tell me a solution; I've already tried all the advice on the internet. It was 90% charged and working superbly, with no problems before this. No charging icon, no sign of life. How can I turn it on?
Before I start, this could just be me and a handful of people, but I like to reorganize my phone screen to my needs based on what's going on in life. I was just thinking it would be easier if you could get rid of all the folders at once and then reorganize, or something easier than the long, extensive process it is now.
Allow the user to add their own tags to the default emoji tags.
For instance, this emoji, for me, is nonna: 🤌🏻. My efficiency would improve immensely if I could search for it as the “Nonna” emoji, rather than searching for nonna, remembering it doesn’t exist, trying the search for other things it might be called, realising I don’t know what it is, then having to scroll through all the hand emojis twice to find it.
🤌🏻🤞🏼👌
Hello Apple Community,
I'm experiencing an issue with "Vocal Shortcuts" on iOS. I created a trigger in Vocal Shortcuts to run a specific shortcut, and it works perfectly on the first day. However, by the next day, the voice command stops functioning entirely. To make it work again, I have to disable and then re-enable Vocal Shortcuts in the settings.
I've tested this on multiple devices (iPhone 11, iPhone 13, and iPhone X), all running the latest iOS version, and the same problem occurs on each one. Is there any additional configuration needed, or could this be a bug?
Any advice or insights would be greatly appreciated!
Thank you in advance,
I am facing an issue with the back camera on my iPhone 14 Plus: it shows a black screen. My iPhone was manufactured between April 2023 and April 2024, but it is still not eligible for the Apple service program, even though it has this same issue. Why is it not eligible?
Who else has this error?
Hi Apple, I've been having a problem with my iPad Pro (5th generation).
Since updating my iPad, it has been acting weird lately. To be specific, it has randomly shut down twice, the widgets turn white, and my screen keeps going black.
When I go into apps, it keeps exiting out of them.
Also, the new Siri is so slow and won't answer when I say "Hey Siri," except on random occasions.
Please help me fix this problem, because I need my iPad for studying.
Thank you.
I am registering my startup for an Apple Developer account so that we can put our app on the App Store. I do not want to use my personal Apple ID's payment method to pay the $99 annual fee. How can I change the payment method so that I can continue with my enrollment?
Individuals who have had a stroke can end up with vision impairments, specifically homonymous hemianopia, which basically means the individual has lost sight in (as an example) the left half of both eyes. I'm interested in understanding whether it would be possible to help individuals with this vision impairment via an accessibility configuration within the Apple Vision Pro. It would first determine an individual's field of view, possibly by showing a field of dots across the entire "screen" and having the individual look at each dot and click. The results of this field-of-view test would then determine how the screen is presented to the user going forward.
My mom (82 years old) had a stroke recently and was diagnosed with homonymous hemianopia. She lived on her iPhone and would love to get back the ability to text message, use Facebook, and order items from Amazon.
Please advise whether you believe the Apple Vision Pro would be capable of helping in this area with the suggested development, or share any other thoughts.
I am trying to build an app that will, on certain occasions, use the latest iPhone's gyroscope and accelerometer features.
Am I able to use them if I can provide a valid justification for their use?
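In case it helps make the question concrete, this is the kind of Core Motion usage I have in mind (a minimal sketch; the update interval is an arbitrary choice):

import CoreMotion

let motionManager = CMMotionManager()

func startMotionUpdates() {
    // Device motion fuses the gyroscope and accelerometer readings.
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion else { return }
        let rotation = motion.rotationRate          // gyroscope, radians/second
        let acceleration = motion.userAcceleration  // user acceleration, in g
        print(rotation, acceleration)
    }
}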
Hi All,
I am developing a braille keyboard for iOS. When I test the typing function in Notes, the screen updates very slowly after typing; we expect the response to be instant. I am not sure how to set up the VoiceOver settings.
Is there any documentation covering this issue?
Or is there any guideline for this?
I have configured the permission usage description (NSLocalNetworkUsageDescription) in Info.plist, and the app's Local Network permission is turned on, but when the app calls a LAN device's API, it still returns an error:
Error Domain=NSURLErrorDomain Code=-1009 "The Internet connection appears to be offline." UserInfo={_kCFStreamErrorCodeKey=50, NSUnderlyingError=0x302d79d40 {Error Domain=kCFErrorDomainCFNetwork Code=-1009 "(null)" UserInfo={_NSURLErrorNWPathKey=unsatisfied (Local network prohibited), interface: en0[802.11], ipv4, uses wifi, _kCFStreamErrorCodeKey=50, _kCFStreamErrorDomainKey=1}}, _NSURLErrorFailingURLSessionTaskErrorKey=LocalDataTask .<1>, _NSURLErrorRelatedURLSessionTaskErrorKey=(
"LocalDataTask .<1>"
), NSLocalizedDescription=The Internet connection appears to be offline., NSErrorFailingURLStringKey=http://192.168.1.1:80/***, NSErrorFailingURLKey=http://192.168.1.1:80/***, _kCFStreamErrorDomainKey=1}
I bought a new iPhone 16 recently and connected it to my car (Hyundai Venue), but I couldn't see WhatsApp. I researched and found some forum threads, but the suggested steps don't work or aren't suitable for the latest iOS version.
I have updated iOS and WhatsApp; nothing helped to resolve it.
Note: Earlier I used a Pixel phone, and I could see WhatsApp and make calls.
Hey there! Hope you are starting the year with great joy.
My situation
I'm building a new product that is based on detecting certain text on screen in real time. The product targets only the Mac, and it's built with Swift.
My problem
I need to get the exact position of a text element with the Apple Accessibility API, but I can't figure it out. I managed to get the AXUIElement where the text is placed, but its position is too broad and off target.
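For reference, this is roughly how I'm reading the element's frame at the moment (a minimal sketch; it assumes the app is already trusted for Accessibility, and error handling is elided):

import ApplicationServices

// Read an element's frame via the kAXPosition/kAXSize attributes.
func frame(of element: AXUIElement) -> CGRect? {
    var positionRef: CFTypeRef?
    var sizeRef: CFTypeRef?
    guard AXUIElementCopyAttributeValue(element, kAXPositionAttribute as CFString, &positionRef) == .success,
          AXUIElementCopyAttributeValue(element, kAXSizeAttribute as CFString, &sizeRef) == .success
    else { return nil }

    var position = CGPoint.zero
    var size = CGSize.zero
    AXValueGetValue(positionRef as! AXValue, .cgPoint, &position)
    AXValueGetValue(sizeRef as! AXValue, .cgSize, &size)
    return CGRect(origin: position, size: size)
}

This returns the frame of the whole element, which is why the result is too broad when the element contains more than the text I'm after.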
My discoveries so far
I've tried OCR, but it's too slow for what I'm building, so the only possible way I can think of is the Accessibility API.
Thank you in advance.
Hi,
I have an iOS app where a Bluetooth scanner and a Bluetooth keyboard both need to work simultaneously.
I have used the 'pressesBegan' method to fetch the characters coming from the Bluetooth keyboard. It fetches all the Bluetooth keyboard input correctly, but it is also called when the characters come from the Bluetooth scanner.
In 'pressesBegan', I have to separate the input coming from the Bluetooth keyboard from the input coming from the Bluetooth scanner; they each have a different use in the app. I have already tried distinguishing them by how fast the characters arrive, but that fails with high-speed typing, so character timing will not work in our case.
Is there any way to separate the inputs based on some other factor?
Or is there any other information in the 'pressesBegan' method that can indicate whether the input is coming from the scanner or from the keyboard?
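For reference, this is the hook in question (a minimal sketch): both devices land in the same callback, and I don't see anything on UIPress that identifies the source device.

import UIKit

class InputViewController: UIViewController {
    override var canBecomeFirstResponder: Bool { true }

    override func pressesBegan(_ presses: Set<UIPress>, with event: UIPressesEvent?) {
        for press in presses {
            guard let key = press.key else { continue }
            // Called for both the Bluetooth keyboard and the scanner
            // (which presents itself as a HID keyboard).
            print(key.characters)
        }
        super.pressesBegan(presses, with: event)
    }
}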
Any suggestion regarding this will be helpful.
Thanks in advance.
I am currently developing a macOS application that listens for system-wide mouse clicks to simulate typing with user-provided text. The app requires Accessibility permissions to function properly, and I want to ensure compliance with Apple’s latest privacy and security guidelines.
The app listens to global mouse clicks.
It simulates keyboard input with user-provided text.
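For context, the Accessibility permission the app depends on is checked along these lines (a minimal sketch using AXIsProcessTrustedWithOptions; prompting the user is optional):

import ApplicationServices

// Check whether the app is trusted for Accessibility, prompting
// the user to grant it in System Settings if it is not.
let options = [kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String: true] as CFDictionary
let trusted = AXIsProcessTrustedWithOptions(options)
print("Accessibility permission granted:", trusted)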
I would like detailed guidance on the following aspects:
What specific entitlements are required to allow system-wide mouse click monitoring and simulated user input?
Should App Sandbox be enabled or disabled?
Which Info.plist keys are required to explain global mouse click monitoring and keyboard input simulation?
What should the Privacy Manifest configuration be?
macOS 15 includes a neat section in System Settings to change the dynamic text size, as outlined here: https://support.apple.com/guide/mac-help/make-text-and-icons-bigger-mchld786f2cd/mac
However, it's not immediately clear a) how to get one's app into this list, and b) whether the usual methods from iOS for reacting to text size even work on macOS. Does anyone have any experience here? Or should I implement my own controls in my app's settings and call it a day?
For context, my app is a macOS-native SwiftUI app.
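In case it's useful, this is the only hook I've found on the SwiftUI side so far (a minimal sketch; whether a macOS app picks this up from that System Settings list, or needs to opt in somehow, is exactly what I'm unsure about):

import SwiftUI

struct ContentView: View {
    // Available on macOS 12+; reflects the user's preferred text size.
    @Environment(\.dynamicTypeSize) private var dynamicTypeSize

    var body: some View {
        Text("Current Dynamic Type size: \(String(describing: dynamicTypeSize))")
            .font(.body) // text styles should scale with the setting
    }
}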