Hello everyone,
I’d like to report an issue I’ve encountered when using a Bluetooth mouse together with AssistiveTouch on iPhone running iOS 16.5.
This has also been reported via Feedback Assistant with
Feedback ID: FB17806167
Description:
When using a Bluetooth mouse together with AssistiveTouch on iPhone (iOS), the pointer behaves incorrectly in landscape orientation.
Specifically:
The pointer cannot move past the center of the screen
Horizontal and vertical (X/Y) movements appear to be swapped or misaligned
Natural movement of the pointer is not possible
It seems as if the internal coordinate mapping remains locked in portrait orientation, even when the device is physically rotated to landscape.
This issue occurs system-wide, regardless of the current app. It is observable in Settings, on the Home screen, and in third-party apps.
Steps to Reproduce:
Enable AssistiveTouch
Connect a Bluetooth mouse to the iPhone
Rotate the device to landscape orientation
Try moving the mouse pointer across the screen
→ Notice that:
Pointer cannot move past the center
Horizontal/vertical input is interpreted incorrectly (as if still in portrait)
Expected Behavior:
The mouse pointer should move across the entire screen correctly, regardless of device orientation.
Actual Behavior:
In landscape orientation, the pointer is either restricted to part of the screen or misaligned.
It behaves as if the device is still in portrait.
Horizontal mouse movement causes vertical pointer movement, and vice versa
User experience feels broken and unintuitive
Feature Suggestion:
Please improve the synchronization between physical device orientation and AssistiveTouch pointer mapping on iOS.
I also suggest exposing AssistiveTouch orientation control via a public API, so developers can help maintain consistent pointer behavior.
Thanks in advance for any insights or suggestions.
Best regards,
Jannis
When my app is in the background, I create a Live Activity through a push notification with a token obtained from pushToStartTokenUpdates, and this process works fine. However, without opening the app, how can I retrieve the new push token for this Live Activity and use it for subsequent updates to the Live Activity content?
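As I understand it (an assumption worth verifying), a push-to-start notification briefly wakes the app in the background, giving it a chance to observe the new activity and read its per-activity update token from pushTokenUpdates. A minimal sketch; DeliveryAttributes and the server hand-off are illustrative:

import ActivityKit

// Illustrative attributes type; replace with the app's real one.
struct DeliveryAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable { var progress: Double }
}

// Sketch: iterate activities (including ones started remotely via
// push-to-start) and forward each activity's update token to the server.
func observeActivityPushTokens() {
    Task {
        for await activity in Activity<DeliveryAttributes>.activityUpdates {
            Task {
                for await tokenData in activity.pushTokenUpdates {
                    let token = tokenData.map { String(format: "%02x", $0) }.joined()
                    // TODO: upload `token` for subsequent ActivityKit content updates
                    print("update token for \(activity.id): \(token)")
                }
            }
        }
    }
}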
Dear Team,
We are stuck in the membership enrollment process at the purchase stage. We enter the payment details and get a message that our purchase will be completed within 48 hours, but after some time it reverts to the same "purchase your membership" stage without any error or information. This has happened 3-4 times so far.
Kindly let us know how to resolve this issue and move forward with the enrollment process.
Thanks.
curl -o actions-runner-osx-x64-2.321.0.tar.gz -L https://github.com/actions/runner/releases/download/v2.321.0/actions-runner-osx-x64-2.321.0.tar.gz
echo "b2c91416b3e4d579ae69fc2c381fc50dbda13f1b3fcc283187e2c75d1b173072  actions-runner-osx-x64-2.321.0.tar.gz" | shasum -a 256 -c
tar xzf ./actions-runner-osx-x64-2.321.0.tar.gz
./config.sh --url https://github.com/funds123/tpsdk --token BMOPQPZNQKB57MXE4BDDKKTHMTHJO
./run.sh
Save this configuration as com.github.actions.runner.plist in
I want to expose a few elements for accessibility alone and others for automation alone, but I can only see the accessibility elements in Accessibility Inspector; I can't see the automation elements.
self.view.accessibilityElements = [loginButton as Any,
                                   registerButton as Any,
                                   closeButton as Any]
self.view.automationElements = [claimLabel as Any,
                                loginButton as Any,
                                registerButton as Any,
                                intoductionImage as Any,
                                closeButton as Any]
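For reference, automationElements (new in iOS 17) appear to be surfaced to automation clients such as XCUITest rather than to Accessibility Inspector, which renders only the accessibility tree. A sketch of how they would show up in a UI test; the identifiers are illustrative and assume matching accessibilityIdentifier values are set on the views:

import XCTest

final class AutomationElementsTests: XCTestCase {
    func testAutomationElementsAreExposed() {
        let app = XCUIApplication()
        app.launch()
        // Elements listed in automationElements should be queryable here,
        // even if they are excluded from the accessibility tree.
        XCTAssertTrue(app.buttons["loginButton"].exists)
        XCTAssertTrue(app.staticTexts["claimLabel"].exists)
    }
}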
I’d love to see Apple implement a Bionic Reading feature as a system-wide accessibility option. This type of reading aid highlights the first part of each word in bold to help guide the eyes and improve comprehension.
It’s been shown to be especially helpful for people with ADHD, dyslexia, and other neurodivergent needs. Having a toggle in Settings > Accessibility would be life-changing.
Ideally, it could be:
• Enabled system-wide, or per-app
• Customizable in how much of each word is bolded
• Available in Safari, Messages, Books, News, etc.
Hello!
I ran into unexpected hardware-keyboard focus behavior in UITests.
A clear description of the problem
When running UITests on the iOS Simulator with both "Full Keyboard Access" and "Connect Hardware Keyboard" options enabled, there is a noticeable delay between keyboard actions that manage focus (like pressing Tab or arrow keys). The delay seems to increase with repeated input and suggests that events are being queued instead of processed immediately.
I'll explain below why I make this assumption.
A step-by-step set of instructions to reproduce the problem
Launch the iOS Simulator.
Enable both "Full Keyboard Access" and "Connect Hardware Keyboard" in the Simulator settings.
Run a UITest on a target application (ideally an endless or long-running test).
Once the app is launched, press the Tab key several times.
Observe the delay in focus movement.
Optionally, press the Tab or arrow keys rapidly, then stop the UITest.
After stopping, you’ll see a burst of rapid focus changes.
What results you expected
We expected keyboard actions (like Tab) to be handled immediately and the UI focus to update smoothly during UITests.
What results you saw
There was a 4–10 second (and sometimes longer) delay between pressing keys and seeing a response. All of the queued keyboard events (used for managing focus) are then performed at once after the UITest stops.
The version of Xcode you are using
Xcode: Version 16.3 (16E140)
Simulator: iPhone 16 Pro (iOS 18.4 and 18.1)
Simulator: iPad Pro 11-inch (M4) (iPadOS 17.5)
We receive the response from Apple Pay after the customer completes Face ID or Touch ID authorization.
But the shipping contact is not complete. For example:
{
"addressLines": [
"1************ kwy"
],
"administrativeArea": "FL",
"country": "",
"countryCode": "",
"emailAddress": "S*********le.com",
"familyName": "******i",
"givenName": "******m",
"locality": "*******s",
"phoneNumber": "+*******79",
"phoneticFamilyName": "",
"phoneticGivenName": "",
"postalCode": "*****3",
"subAdministrativeArea": "",
"subLocality": ""
}
As the documentation says, this should be the complete shipping contact,
but country and countryCode are empty:
https://developer.apple.com/documentation/apple_pay_on_the_web/applepaypayment/1916097-shippingcontact
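One thing worth double-checking, offered as a sketch rather than a confirmed diagnosis: an authorized payment only includes the shipping-contact fields that were explicitly requested in the payment request, and the web API exposes the same requiredShippingContactFields option as the native one. The PassKit analog:

import PassKit

// Sketch: fields omitted from requiredShippingContactFields come back
// empty in the authorized payment's shippingContact.
let request = PKPaymentRequest()
request.requiredShippingContactFields = [.postalAddress, .name, .emailAddress, .phoneNumber]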
Please update the VoiceOver accessibility settings in iOS and iPadOS to include frames on the Rotor, and make web-navigation and component gestures easier to find and assign. Please also add content to the iPhone and iPad Apple User Guides on using VoiceOver for web navigation with touch gestures.
Specifically... iframes.
There is no clear guidance in Apple documentation for VoiceOver users on iPhone or iPadOS on how to access iframes with touch gestures. A common belief, as written on AppleVis, other blogs, and internet searches, is that iframes in Safari or a webView in an app are reachable only by explore by touch.
If explore by touch is the only option for some interactions, that needs to be included in Apple User Guides. If not, details on equivalent touch gestures for VO that have keyboard interactions in Mac need to be clear for users.
VoiceOver for Mac includes a default keyboard interaction of VO-Command-F in its extensive User Guide (https://support.apple.com/guide/voiceover/by-images-or-frames-mchlp2740/mac). A user can include a rotor option for web navigation for iframes.
VoiceOver for iPhone and iPad does not include a default swipe gesture assigned to frames. An option is not available for the Rotor.
While the iPhone User Guide explains that gestures can be customized (https://support.apple.com/guide/iphone/customize-gestures-and-keyboard-shortcuts-iph59a8e6fd2/18.0/ios/18.0), it is not clear that, to add this gesture, "Move to the next frame" is tucked into the advanced navigation commands in the VoiceOver accessibility settings. At least on my phone, the word "frame" was not searchable, despite the All Commands screen having a search bar.
Can someone please explain why the Voice Memos Layers feature is exclusive to iPhone 16 Pro models?
I was told by Apple Support that if it were possible on older models, Apple would have implemented it.
This type of functionality has existed in technology for decades and has a proven track record across many different platforms, and even in apps on Apple's own App Store. So that response from Apple Support is, to be honest, just rubbish. Even if the real answer is "so we can sell the iPhone 16 Pro", then so be it. I've noticed Apple locking software features to new devices lately, and it's genuinely making me consider buying my first ever Android phone, because I really don't support these kinds of scummy tactics.
Many thanks,
macOS 15 includes a neat section in System Settings to change the dynamic text size, as outlined here: https://support.apple.com/guide/mac-help/make-text-and-icons-bigger-mchld786f2cd/mac
However, it's not immediately clear (a) how to get one's app into this list, and (b) whether the usual methods from iOS for reacting to text size even work on macOS. Does anyone have any experience here? Or should I implement my own controls in my app's settings and call it a day?
For context, my app is a macOS-native SwiftUI app.
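Not an authoritative answer, but a starting point: SwiftUI's DynamicTypeSize environment value is available on macOS 13 and later, so text using semantic font styles should already track the system setting. A minimal probe view to verify this in a macOS target:

import SwiftUI

struct TextSizeProbe: View {
    @Environment(\.dynamicTypeSize) private var typeSize

    var body: some View {
        VStack(alignment: .leading) {
            Text("Body text").font(.body) // semantic styles scale automatically
            Text("Current DynamicTypeSize: \(String(describing: typeSize))")
        }
        .padding()
    }
}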
I'm experiencing a heating issue with my iPhone 15 Pro Max. The device heats up even during light usage, such as browsing, checking social media, or using simple apps. It becomes noticeably warm, even though I'm not running any demanding tasks like gaming or video editing. I’ve tried restarting the phone, closing background apps, and ensuring that iOS is fully updated, but the problem persists.
Has anyone else encountered this issue with the iPhone 15 Pro Max?
What could be causing my device to heat up during regular use? Are there specific settings or steps I can take to reduce the heating? Is it more likely to be a software-related issue or a hardware problem? Any suggestions or solutions would be appreciated!
I'm trying to validate my app's handling of VoiceOver accessibility when using VoiceOver hotspots.
What I'm doing:
Set a hotspot
Validate that the hotspot references the correct control within the hotspot chooser
Set another hotspot
Validate that the chooser now references both correct controls
Try to use one of the hotspots
Result: The hotspot has changed to some other control in the app, usually one of the window buttons (close, minimize, full screen), but at other times one of the menus or a completely different control in the window.
My question is how I can debug what might be going on in the app when the hotspots are resolved and invoked. I'm assuming I have some accessibility property set improperly on these controls that is causing incorrect resolution of the hotspots.
Topic:
Accessibility & Inclusion
SubTopic:
General
Hi,
I'm trying to fix a tvOS view for the VoiceOver accessibility feature:
TabView { // 5 tabs
    Text(title)
    Button(play)
    ScrollView { // Live
        LazyHStack { /* 200 items */ }
    }
    ScrollView { // Continue watching
        LazyHStack { /* 500 items */ }
    }
}
When the view shows up, VoiceOver reads:
"Home tab 1 of 5, Item 2" (not sure why it reads Item 2 of the first cell in the scroll view; maybe because it just got loaded by the LazyHStack).
VoiceOver should only read "Home tab 1 of 5".
When moving focus to scroll view it reads:
"Live, Item 1" and after slight delay "Item 1, Item 2, Item 3, Item 4"
When moving focus to second item it reads:
"Item 2" and after slight delay "Item 1, Item 2, Item 3, Item 4"
When moving focus to third item it reads:
"Item 3" and after slight delay "Item 1, Item 2, Item 3, Item 4"
It should just read what is focused, ideally:
"Live, Item 1, 1 of 200"
then, after moving focus to item 2,
"Item 2, 2 of 200"
this time without the word "Live", because we are still in the same scroll view (the same horizontal list).
Currently the app is unusable. We have visually impaired testers, and this reading of everything on the screen is totally confusing, because users don't know where they are and what is actually focused.
This is a video streaming app, and we are streaming all the time, even on the home page in the background; it binge-plays one item after another, and there is usually a never-ending live stream playing. The user can switch the TV channel, but we continue to play. VoiceOver should only read what's focused, after user interaction.
The original Apple TV app does not do this, so it cannot be caused by some verbose accessibility setting; it correctly reads only the focused item in scrolling lists.
How do I disable reading content that is not focused?
I tried:
.accessibilityLabel(isFocused ? title : "")
.accessibilityHidden(!isFocused)
.accessibilityHidden(true) - tried at various levels in the view hierarchy
.accessibilityElement(children: .ignore) - even the focused item is not read back by VoiceOver
.accessibilityElement(children: .contain) - tried at various levels in the view hierarchy
.accessibilityElement(children: .combine) - tried at various levels in the view hierarchy
.accessibilityAddTraits(.isHeader) - tried at various levels in the view hierarchy
.accessibilityRemoveTraits(.isHeader) - tried at various levels in the view hierarchy
// the last two were basically an attempt to hack it
.accessibilityRotor("", ranges: []) - another hack that I tried on the ScrollView, the LazyHStack, and the top-level view.
50+ other attempts at configuring accessibility modifiers attached to views.
I have seen all the accessibility videos and tried all the sample code projects. I haven't found a solution anywhere: internet search didn't turn up anything, and AI didn't help, as it can only provide code that someone else wrote before.
Any idea how to fix this?
Thanks.
I have a view dynamically overlaid on a UITableView with proper padding (added when certain conditions are met). When VoiceOver focuses on a cell beneath this overlay, the focused element does not scroll into view. I’ve noticed similar behavior in Apple’s first-party Podcasts app.
Please find the attached image for reference. How can I resolve this issue and ensure VoiceOver scrolls the focused cell into view?
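In case it helps others hitting the same thing, one workaround sketch, assuming the overlay is pinned to the bottom edge (MyViewController, overlayView, and tableView are illustrative names): extend the table view's content inset by the overlay's height, so UIKit's automatic scroll-into-view, including VoiceOver's, has room to place the focused cell above the overlay.

import UIKit

// Sketch: inside the view controller that owns both views.
extension MyViewController {
    func overlayDidShow() {
        // Reserve space equal to the overlay so focused cells can scroll clear of it.
        let overlayHeight = overlayView.frame.height
        tableView.contentInset.bottom = overlayHeight
        tableView.verticalScrollIndicatorInsets.bottom = overlayHeight
    }
}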
I would like to request the addition of Central Kurdish language support for Text-to-Speech (TTS) and VoiceOver functionality on Apple products. Our TTS model boasts an impressive 99.9% accuracy, making it a highly reliable tool for this purpose.
This initiative would bring meaningful benefits to over 10,000 visually impaired and more than 40,000 illiterate individuals in the Kurdistan Region of Iraq, empowering them to access digital information, navigate devices, and perform tasks more independently.
The integration of Central Kurdish VoiceOver support would make a significant difference in improving accessibility and quality of life for these individuals, promoting inclusivity and digital literacy in the region.
I am currently developing an alarm app, and I've noticed that apps like Super Alarm and Alarmy are able to deliver local notifications every 3 seconds after a specified time, even when the app is completely closed and in Airplane Mode. The notifications continue until the user opens the app. I'm trying to implement this functionality, but I haven't been able to figure it out. Could anyone provide guidance on how to achieve this?
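A common technique, and plausibly what those apps do (an assumption, not a confirmed implementation), is pre-scheduling a burst of local notifications a few seconds apart; local notifications need no network, which is why they survive Airplane Mode. iOS caps pending requests at 64 per app, so the burst is finite. A sketch with illustrative names:

import UserNotifications

func scheduleAlarmBurst(at alarmDate: Date) {
    let center = UNUserNotificationCenter.current()
    let content = UNMutableNotificationContent()
    content.title = "Wake up!"
    content.sound = .default

    // Schedule up to 60 notifications, 3 seconds apart, starting at alarmDate.
    for i in 0..<60 {
        let fireDate = alarmDate.addingTimeInterval(Double(i) * 3)
        let trigger = UNTimeIntervalNotificationTrigger(
            timeInterval: max(fireDate.timeIntervalSinceNow, 1),
            repeats: false)
        let request = UNNotificationRequest(identifier: "alarm-\(i)",
                                            content: content,
                                            trigger: trigger)
        center.add(request) // assumes notification authorization was granted earlier
    }
}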
I am trying to build an app that will, in certain situations, use the latest iPhone's gyroscope and accelerometer.
Am I able to use them if I can provide a valid justification for their use?
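For what it's worth, reading the raw gyroscope and accelerometer through Core Motion does not require a special entitlement or a justification to Apple; some motion APIs do require an NSMotionUsageDescription string in Info.plist, so verify that for the specific API used. A minimal sketch:

import CoreMotion

// Keep the manager alive (e.g. as a property) for updates to keep flowing.
let motionManager = CMMotionManager()

func startMotionUpdates() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion else { return }
        // attitude is gyro-derived; userAcceleration comes from the accelerometer
        print(motion.attitude.pitch, motion.userAcceleration.x)
    }
}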
I have a TextField with, for example, "sg?!" entered. On the TextField I set the speechAlwaysIncludesPunctuation() modifier. But when I activate VoiceOver and the content of the TextField is read, the special characters are not spoken.
How can I fix this?
I am developing a visionOS app for controlling an underwater ROV. I have ornaments with telemetry and buttons around a central video feed view, and custom button mappings, such as "A" for locking the depth of the drone. However, when I look at buttons or certain ornaments, my custom gamepad logic is prevented from running: when a SwiftUI Button gains focus on visionOS, pressing the controller's A button triggers the system's default "click" on that Button rather than my custom buttonA handler. Essentially, the system's focus interception is stealing my A-press events.
Is there a way to disable the built in gamepad interaction and only allow my custom gamepad mappings?
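For context, the custom mapping in question presumably looks something like the Game Controller sketch below (lockDepth is an illustrative stub); the unresolved part is keeping the focus engine from consuming the same press:

import GameController

func lockDepth() { /* illustrative stub for the ROV depth-lock action */ }

func configureGamepad(_ controller: GCController) {
    guard let gamepad = controller.extendedGamepad else { return }
    gamepad.buttonA.pressedChangedHandler = { _, _, pressed in
        if pressed {
            lockDepth() // custom action, instead of the focused Button's tap
        }
    }
}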