Hello, my submission is based on haptics; without them the app doesn't make sense, and only a real iPhone can provide them. However, it says that Xcode playgrounds will be tested on the Simulator.
Is that really the case? What can I do?
Thank you in advance!
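For now I am considering a graceful fallback (a minimal sketch using Core Haptics; the fallback behavior is just an illustration):

import CoreHaptics

// Detect haptics support so the app can degrade gracefully
// (e.g. to visual or audio feedback) in the Simulator, which
// reports no haptics hardware.
if CHHapticEngine.capabilitiesForHardware().supportsHaptics {
    let engine = try? CHHapticEngine()
    try? engine?.start()
    // ...play haptic patterns as usual...
} else {
    // Fall back to an on-screen animation or sound so reviewers
    // testing in the Simulator still perceive the effect.
}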
Yesterday I installed iOS 26 on my iPhone as a beta tester. At first there was no problem, but during the afternoon I noticed that neither FaceTime nor iMessage worked. I tried to go through the settings as described by Apple Support, but my phone number would not activate; sometimes I was even asked to activate iCloud. I always get a REG-RESP message.
Does anyone have any ideas what the problem could be?
Topic:
Accessibility & Inclusion
SubTopic:
General
Hey folks, I would like to ask for help on this topic:
I think this is exactly the same problem as "Combobox not working with VoiceOver after…" on Apple Community.
VoiceOver also breaks the combobox on the official W3C ARIA website: https://www.w3.org/WAI/ARIA/apg/patterns/combobox/examples/combobox-autocomplete-list/. When VoiceOver is turned off, I can use the up/down arrows to move through the menu items in the dropdown, but when VoiceOver is turned on, the up/down arrows cannot reach the dropdown menu items.
Is there an official tutorial on how to control a combobox using VoiceOver?
Kind regards,
Jakub
Topic:
Accessibility & Inclusion
SubTopic:
General
On iOS, there is accessibilityLanguage.
I am developing a visionOS app for controlling an underwater ROV. I have ornaments with telemetry and buttons around a central video feed view, and custom button mappings, such as "A" to lock the depth of the drone. However, when I look at buttons or certain ornaments, my custom gamepad logic is prevented from running: when a SwiftUI Button gains focus on visionOS, pressing the controller's A button triggers the system's default "click" on that Button rather than my custom buttonA handler. Essentially, the system's focus handling intercepts my A-press events before my custom gamepad logic can run.
Is there a way to disable the built in gamepad interaction and only allow my custom gamepad mappings?
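For reference, my custom mapping is wired up roughly like this with the GameController framework (a minimal sketch; lockDepth() is an illustrative stand-in for my real handler):

import GameController

// Illustrative stand-in for the real depth-lock logic.
func lockDepth() { /* send a depth-hold command to the ROV */ }

// Register a custom handler for the A button on a connected gamepad.
// On visionOS, system focus on a SwiftUI Button swallows this press
// before the handler fires.
func configureGamepad(_ controller: GCController) {
    controller.extendedGamepad?.buttonA.pressedChangedHandler = { _, _, pressed in
        if pressed {
            lockDepth()
        }
    }
}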
Topic:
Accessibility & Inclusion
SubTopic:
General
Tags:
Game Controller
Accessibility
Focus
visionOS
On some iOS 18 devices, AVSpeechSynthesizer has a bug where it always reads Chinese content, e.g.:
AVSpeechUtterance(string: "中文") // Any Chinese Content
in the dialect specified by:
Settings > Accessibility > Spoken Content > Voices > Chinese > Spoken Language
instead of the dialect that I specified in AVSpeechUtterance.voice:
AVSpeechSynthesisVoice(language: "zh-HK") // Cantonese
AVSpeechSynthesisVoice(language: "zh-TW") // Mandarin
However, setting the Chinese dialect of AVSpeechSynthesisVoice via "zh-HK" or "zh-TW" worked correctly on iOS 17 and below.
My app has a feature that requires reading sentences in Mandarin followed by Cantonese, i.e., both dialects are needed every time. Therefore, setting the dialect under Spoken Language in Settings is not a workaround that makes my app function correctly on iOS 18.
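For context, the feature does roughly this (a minimal sketch; the sample string is illustrative):

import AVFoundation

let synthesizer = AVSpeechSynthesizer()

// Read the same sentence in Mandarin, then in Cantonese.
let mandarin = AVSpeechUtterance(string: "中文")
mandarin.voice = AVSpeechSynthesisVoice(language: "zh-TW") // Mandarin

let cantonese = AVSpeechUtterance(string: "中文")
cantonese.voice = AVSpeechSynthesisVoice(language: "zh-HK") // Cantonese

synthesizer.speak(mandarin)
synthesizer.speak(cantonese) // queued after the first utterance

// On affected iOS 18 devices, both utterances come out in whichever
// dialect is selected under Settings > Accessibility > Spoken Content.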
Further to the above, I've also discovered that if iOS 18 (in my case, 18.5 was tested) is freshly installed (not upgraded from iOS 17 or below, and no backup restored after the fresh installation), the bug does not occur.
However, if the device was upgraded from iOS 17 or below, or a backup was restored (in my case, I freshly installed iOS 18.5 on a new iPhone and then restored a backup from another iPhone on iOS 16.2), the bug occurs.
This bug puzzles me because I need both Chinese dialects to be read aloud one after the other, but as many users report, on most iOS 18 devices (a fresh installation of the latest iOS without upgrading or restoring is uncommon nowadays) my app reads Cantonese twice or Mandarin twice, depending on the Spoken Language setting. This iOS 18 bug prevents my app from performing its expected behavior.
Would Apple developers please look into this and advise whether there is any possible workaround in app code to overcome this bug, or fix it in an iOS 18 update? Thank you.
Topic:
Accessibility & Inclusion
SubTopic:
General
Hi! I have noticed a few glitches, as well as some overall unfortunate cons, in Assistive Access mode.
Alarms, timers, stopwatches, etc. do not sound or alert. However, I have an infant monitor app and I do get that app's sound alerts, so I know it is possible. Do I need to download a separate alarm app for it to work?
Cannot make FaceTime calls with favorite contacts.
Find My iPhone cannot jump to the maps app.
Camera cannot zoom in or out.
Photos cannot be deleted, edited, or shared in a shared album in the photos app.
Photos/videos cannot be sent in messages.
Spotify cannot be accessed from the lock screen.
Apps do not stay open if you lock the phone screen or leave it untouched for too long (auto-lock).
There is no flashlight option. I downloaded an app to add this feature, but the screen auto-locks when untouched, which shuts off the flashlight in the app until I unlock the phone again.
I updated to the 18.3 beta but lost the video and audio options with that update. I tried to restore with iTunes on Windows 11 and it got stuck partway through. I force-restarted the iPad; after two tries it went off, the screen completely blank. I have since tried every trick to turn it on, but it won't turn on. Please tell me a solution; I have followed all the advice on the internet. It was 90% charged and working superbly before this. Now there is no charging icon and no sign of life. How can I turn it on?
Topic:
Accessibility & Inclusion
SubTopic:
General
Hey everyone,
I've been thinking about a truly innovative way to enhance iPhone battery life and user convenience, drawing inspiration from kinetic energy harvesting. What if we could have a clock display on the main iPhone screen that's powered purely by user motion, and activates only when you look at it, without touching your main battery?
The Core Idea
Imagine this:
Kinetic Energy Harvesting: Your iPhone would have a tiny, integrated kinetic energy generator. This generator would capture the energy from your everyday movements – walking, picking up the phone, putting it in your pocket.
Independent Power Source: This harvested energy would be stored in a small, dedicated capacitor or micro-battery, completely separate from your iPhone's main battery.
Accelerometer-Activated Display: Instead of relying on power-hungry facial recognition, the phone's accelerometer (a very low-power sensor) would detect specific "raise to wake" or "tap to look" gestures.
On-Demand, Ultra-Low Power Clock: Only when the accelerometer detects one of these specific gestures would the stored kinetic energy be used to illuminate just the necessary pixels on the main OLED/AMOLED screen to display the time. The rest of the screen stays completely black (consuming no power on OLED).
Automatic Shut-Off: As soon as the gesture ends or the phone is put down, the clock display would turn off, conserving the limited harvested energy.
Why This Matters
This isn't just a cool gimmick; it offers significant benefits:
True Battery Independence: Get the time at a glance, anytime, without touching your main battery or even the power button. This means more main battery life for apps, calls, and everything else.
Ultimate Convenience: A "magical" interaction – just pick up your phone, and the time instantly appears. No taps, no button presses.
Sustainable & Innovative: Showcases practical "energy harvesting" in a consumer device, pushing boundaries for self-sufficient tech.
Extreme Energy Efficiency: By using a low-power accelerometer as the trigger and only lighting a few pixels on demand, the system is designed for minimal power draw, making kinetic power a viable source.
This concept combines existing low-power sensing (accelerometer), efficient display technology (OLED/AMOLED's true blacks), and cutting-edge energy harvesting, creating a genuinely innovative user experience.
Topic:
Accessibility & Inclusion
SubTopic:
General
Hi guys, can anyone help me? I can't pair my Apple Watch Series 1 with my iPhone 15.
Topic:
Accessibility & Inclusion
SubTopic:
General
After the iOS 26 update, the colors on my new iPad Pro M4 have become extremely dull, almost like those on a very old device. The screen brightness is significantly reduced, and it's now difficult to see UI elements clearly. This is very disappointing considering the device's high display quality before the update. Please advise if this is a known issue or if there's a fix.
Topic:
Accessibility & Inclusion
SubTopic:
General
SwiftUI provides the accessibilityCustomContent(_:_:) modifier to add additional accessibility information for an element. However, I couldn’t find a similar approach in UIKit.
Is there a way to achieve this in UIKit?
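From what I can tell (an assumption on my part, based on the Accessibility framework's AXCustomContentProvider protocol on iOS 14+), something like this might be the UIKit counterpart, though I'd like confirmation:

import UIKit
import Accessibility

// A UIKit view that vends extra accessibility details, analogous to
// SwiftUI's accessibilityCustomContent(_:_:). Labels and values here
// are illustrative.
class ProfileCardView: UIView, AXCustomContentProvider {
    var accessibilityCustomContent: [AXCustomContent]! = {
        let role = AXCustomContent(label: "Role", value: "Engineer")
        let status = AXCustomContent(label: "Status", value: "Online")
        status.importance = .high // read by default, not only on demand
        return [role, status]
    }()
}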
I’m experiencing an issue where Siri incorrectly announces currency values in notifications. Instead of reading the local currency correctly, it always reads amounts as US dollars.
Issue details:
My iPhone is set to Region: Chile and Language: Spanish (Chile).
In Chile, the currency symbol $ represents Chilean Pesos (CLP), not US dollars.
A notification with the text:
let content = UNMutableNotificationContent()
content.body = "¡Has recibido un pago por $5.000!"
is read aloud by Siri as:
”¡Has recibido un pago por 5.000 dólares!”
(English: “You have received a payment of five thousand dollars!”)
instead of the correct:
”¡Has recibido un pago por 5.000 pesos!”
(English: “You have received a payment of five thousand pesos!”)
Another developer already reported the same issue back in 2023, and it remains unresolved: https://developer.apple.com/forums/thread/723177
This incorrect behavior is not limited to iOS notifications; it also occurs in other Apple services:
watchOS, iPadOS, and macOS (Siri misreads currency values in various system interactions).
Siri’s currency conversion feature misinterprets $ as USD even when the device is set to a region where $ represents a different currency.
Announce Notifications on AirPods also exhibits this issue, making it confusing when Siri announces transaction amounts incorrectly.
Apple Intelligence interactions are also affected—for example, asking Siri to “read my latest emails” when one of them contains a monetary value results in Siri misreading the currency.
I have submitted a bug report via Feedback Assistant, and the Feedback ID is FB16561348.
This issue significantly impacts accessibility and localization for users in regions where the currency symbol $ is not associated with US dollars.
Has anyone found a workaround, or is there any update from Apple on this?
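The only partial mitigation I've found so far (my own sketch, not Apple guidance) is to avoid the bare $ symbol in announced text and spell the currency out:

import Foundation

// Format an amount with an explicit currency word so announcements
// cannot be misread as US dollars. Illustrative helper, not an API.
func clpAmountForSpeech(_ amount: Int) -> String {
    let formatter = NumberFormatter()
    formatter.numberStyle = .decimal
    formatter.locale = Locale(identifier: "es_CL") // groups as "5.000"
    let digits = formatter.string(from: NSNumber(value: amount)) ?? "\(amount)"
    return "\(digits) pesos"
}

let body = "¡Has recibido un pago por \(clpAmountForSpeech(5000))!"
// "¡Has recibido un pago por 5.000 pesos!"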
Topic:
Accessibility & Inclusion
SubTopic:
General
Tags:
Siri and Voice
User Notifications
Localization
Apple Intelligence
While editing the search text with an external keyboard (with VoiceOver on), if I try to navigate to the List using the keyboard, focus jumps back to the search field immediately, preventing selection of list items. It's important to note that VoiceOver navigation alone, without a keyboard, works as expected.
It's as if the List never gains focus; every attempt to move focus lands back on the search field.
The code:
import SwiftUI

struct ContentView: View {
    @State var searchText = ""
    let items = ["Apple", "Banana", "Cherry", "Date", "Elderberry", "Fig", "Grape"]

    var filteredItems: [String] {
        if searchText.isEmpty {
            return items
        } else {
            return items.filter { $0.localizedCaseInsensitiveContains(searchText) }
        }
    }

    var body: some View {
        if #available(iOS 16.0, *) {
            NavigationStack {
                List(filteredItems, id: \.self) { item in
                    Text(item)
                }
                .navigationTitle("Fruits")
                .searchable(text: $searchText)
            }
        } else {
            NavigationView {
                List(filteredItems, id: \.self) { item in
                    Text(item)
                }
                .navigationTitle("Fruits")
                .searchable(text: $searchText)
            }
        }
    }
}
I am seeing a strange issue where NSObject's accessibilityRespondsToUserInteraction returns true on the Simulator but false on a device.
Checking the same object on the Simulator with Accessibility Inspector, I see the object's traits include image, so why would it return true in that case?
Is there any other way to check whether an item responds to user interaction or is clickable, besides that property and the traits?
(Or is it just another bug?)
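In the meantime, a rough alternative I'm considering (an assumption, not a documented contract) is to combine the property with trait inspection:

import UIKit

// Heuristic check: treat an element as interactive if the system says
// it responds to user interaction, or if it carries interactive traits.
func seemsInteractive(_ element: NSObject) -> Bool {
    let interactive: UIAccessibilityTraits = [.button, .link, .adjustable]
    return element.accessibilityRespondsToUserInteraction
        || !element.accessibilityTraits.intersection(interactive).isEmpty
}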
Hello,
I am working on a Braille keyboard using an HID approach.
Currently the device works with the iPhone 11 and SE 3.
However, when tested on an iPhone 6s with iOS 15, although the device can be connected and is recognized as a Braille device on the VoiceOver screen, the phone shows no response to key-press reports.
Is there any requirement, such as in the HID descriptor, for iPhone 6s support of Braille devices? If the iPhone 6s does not support such devices, what are the minimum system requirements?
Thank you!
Could you make Visual Intelligence available for the Action button on the iPhone 16e? It should only be for iPhones that use the A18 and future-generation Apple chips.
Topic:
Accessibility & Inclusion
SubTopic:
General
After installing iOS 26 beta 2 on my iPhone 13, I can't take a screenshot using AssistiveTouch or Back Tap.
I want to open a developer account, but not a personal one; it is for a company. I have an existing company, a DUNS number, a finished website, and an official email. But when I apply to Apple, they send an email saying they want a public-facing website in the name of the organization, even though all of these matters have already been resolved. Why do they not respond to us?
Topic:
Accessibility & Inclusion
SubTopic:
General
Hi everyone,
I've encountered a rare and strange crash in my app that I can't consistently reproduce. The crash seems to occur deep within Apple's internal frameworks, and I can't pinpoint which line of my own code is causing it. Here's the crash stack trace:
#44 AXSpeech
SIGSEGV
SEGV_ACCERR
0 CoreFoundation ___CFCheckCFInfoPACSignature + 4
1 CoreFoundation _CFRunLoopSourceSignal + 28
2 Foundation _performQueueDequeue + 492
3 Foundation ___NSThreadPerformPerform + 88
4 CoreFoundation ___CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 28
5 CoreFoundation ___CFRunLoopDoSource0 + 176
6 CoreFoundation ___CFRunLoopDoSources0 + 340
7 CoreFoundation ___CFRunLoopRun + 828
8 CoreFoundation _CFRunLoopRunSpecific + 608
9 Foundation -[NSRunLoop(NSRunLoop) runMode:beforeDate:] + 212
10 TextToSpeech _TTSCFAttributedStringCreateStringByBracketingAttributeWithString + 776
11 Foundation ___NSThread__start__ + 732
12 libsystem_pthread.dylib __pthread_start + 136
Sometimes, instead of line 10 referencing _TTSCFAttributedStringCreateStringByBracketingAttributeWithString, it shows:
10 TextToSpeech LogWarning(char const*, ...) + 7288
Has anyone experienced a similar issue or know what might be triggering this crash? Any guidance on how to investigate or resolve this would be greatly appreciated. Thank you!
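In case it's relevant: the TextToSpeech frames suggest speech synthesis is involved. Purely as a hardening sketch under that assumption (not a confirmed fix), I'm experimenting with keeping one long-lived synthesizer and driving it from a single queue:

import AVFoundation

// Keep one long-lived synthesizer and touch it from one queue, so no
// speech run-loop source outlives its owning thread. Illustrative only.
final class SpeechController {
    static let shared = SpeechController()
    private let synthesizer = AVSpeechSynthesizer()

    func speak(_ text: String) {
        DispatchQueue.main.async {
            self.synthesizer.speak(AVSpeechUtterance(string: text))
        }
    }
}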