I want to understand which component types are intended to have associated hint text, haptic feedback, or an earcon for VoiceOver screen reader users. Is there a list somewhere, or a HIG guideline, for which transition types should have a sound?
Some transitions in Apple apps generally include distinct sounds, such as:
- opening a new screen
- the screen dimming
- a VoiceOver user swiping from the header / navbar to the body
- a scraping sound when swiping up or down a page
- reaching the beginning or end of the body section
- swiping from one row to the next in Calculator
- opening a pop-up menu
I would also appreciate any direction on which code strings are associated with these sounds, and how custom components can adopt these sounds, haptics, or hints where they are expected. On the other hand, I don't want to take that information and dictate that every component needs a specific beep type, since these sounds appear to be used for specific purposes.
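For what it's worth, while waiting for an official list: as far as I know there is no public API that plays VoiceOver's built-in earcons directly; the system plays them in response to accessibility notifications and focus changes. Here is a minimal sketch of the public hooks a custom component can use (the component and hint text below are purely illustrative):

import SwiftUI
import UIKit

struct CustomCard: View {
    @State private var showDetails = false

    var body: some View {
        Button("Open details") {
            // Haptic feedback on activation (plays regardless of VoiceOver).
            UIImpactFeedbackGenerator(style: .medium).impactOccurred()
            showDetails = true
            // Announces the layout change; VoiceOver plays its standard
            // screen-change earcon in response to this notification.
            UIAccessibility.post(notification: .screenChanged, argument: nil)
        }
        // Spoken by VoiceOver after a short pause, following the label.
        .accessibilityHint("Opens the detail screen")
    }
}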
When using an app via an external keyboard, @FocusState and .focused used to work just fine until iOS 17. Vertical-axis text fields were also accessible without any issues. But after the iOS 18 update, adding the .focused modifier removes elements from the external keyboard's focus order.
One such example: when a button using the .focused modifier and @FocusState is inside a ScrollView, and that view is opened via a NavigationLink, the button is not reachable via a Bluetooth (external) keyboard.
TextEditor / vertical-axis TextFields also seem to be affected in the external-keyboard focus order when placed inside a ScrollView.
Is this a known iOS 18 issue with ScrollView, and is there any tip for getting it fixed?
Sample code that reproduces the issue:
import SwiftUI

struct ContentView: View {
    @State private var showBottomSheet: Bool = false
    @State private var goToNextView: Bool = false
    @FocusState private var focused: Bool
    @AccessibilityFocusState private var voFocused: Bool

    var body: some View {
        NavigationView {
            VStack {
                Text("Hello, world!")
                // This button works fine with a Bluetooth keyboard in all versions
                Button("Trigger a bottomsheet") {
                    showBottomSheet = true
                }
                .focused($focused)
                .accessibilityFocused($voFocused)
                Button("Goto another view") {
                    goToNextView = true
                }
                NavigationLink(
                    destination: View2(),
                    isActive: $goToNextView
                ) { EmptyView() }
                .accessibility(hidden: true)
            }
            .sheet(isPresented: $showBottomSheet,
                   onDismiss: {
                       focused = true
                       voFocused = true
                   }, content: {
                       VStack {
                           Text("Hello World! I'm in a bottomsheet")
                           Button("Close me") {
                               showBottomSheet = false
                           }
                       }
                   })
            .padding()
        }
    }
}

#Preview {
    ContentView()
}

struct View2: View {
    @FocusState private var focused: Bool
    @AccessibilityFocusState private var voFocused: Bool
    @State private var showBottomSheet: Bool = false

    var body: some View {
        ScrollView {
            VStack {
                Text("check")
                // In iOS 18, this button doesn't get focused with a Bluetooth / external keyboard.
                // The issue occurs when these three combine in iOS 18: a button using @FocusState,
                // inside a view that has a ScrollView, opened via NavigationLink.
                Button("Trigger a bottomsheet") {
                    showBottomSheet = true
                }
                .focused($focused)
                .accessibilityFocused($voFocused)
                Button("Test button") { }
            }
            .sheet(isPresented: $showBottomSheet,
                   onDismiss: {
                       focused = true
                       voFocused = true
                   }, content: {
                       VStack {
                           Text("Hello World! I'm in a bottomsheet")
                           Button("Close me") {
                               showBottomSheet = false
                           }
                       }
                   })
            .padding()
        }
    }
}
Hey everyone,
I've been thinking about a truly innovative way to enhance iPhone battery life and user convenience, drawing inspiration from kinetic energy harvesting. What if we could have a clock display on the main iPhone screen that's powered purely by user motion, and activates only when you look at it, without touching your main battery?
The Core Idea
Imagine this:
Kinetic Energy Harvesting: Your iPhone would have a tiny, integrated kinetic energy generator. This generator would capture the energy from your everyday movements – walking, picking up the phone, putting it in your pocket.
Independent Power Source: This harvested energy would be stored in a small, dedicated capacitor or micro-battery, completely separate from your iPhone's main battery.
Accelerometer-Activated Display: Instead of relying on power-hungry facial recognition, the phone's accelerometer (a very low-power sensor) would detect specific "raise to wake" or "tap to look" gestures (a rough software sketch of this trigger follows this list).
On-Demand, Ultra-Low Power Clock: Only when the accelerometer detects one of these specific gestures would the stored kinetic energy be used to illuminate just the necessary pixels on the main OLED/AMOLED screen to display the time. The rest of the screen stays completely black (consuming no power on OLED).
Automatic Shut-Off: As soon as the gesture ends or the phone is put down, the clock display would turn off, conserving the limited harvested energy.
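To make the trigger concrete, here is a rough sketch of detecting a "raise to look" gesture with today's CoreMotion API; the kinetic generator, the dedicated micro-battery, and the exact threshold values here are of course hypothetical:

import CoreMotion

let motionManager = CMMotionManager()

func startRaiseToLookDetection(onRaise: @escaping () -> Void) {
    guard motionManager.isAccelerometerAvailable else { return }
    motionManager.accelerometerUpdateInterval = 0.1   // 10 Hz is plenty for a gesture
    motionManager.startAccelerometerUpdates(to: .main) { data, _ in
        guard let a = data?.acceleration else { return }
        // Crude heuristic: the device is near-vertical in portrait orientation,
        // as it is when raised toward the face.
        if a.z > -0.4 && abs(a.y) > 0.6 {
            onRaise()
        }
    }
}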
Why This Matters
This isn't just a cool gimmick; it offers significant benefits:
True Battery Independence: Get the time at a glance, anytime, without touching your main battery or even the power button. This means more main battery life for apps, calls, and everything else.
Ultimate Convenience: A "magical" interaction – just pick up your phone, and the time instantly appears. No taps, no button presses.
Sustainable & Innovative: Showcases practical "energy harvesting" in a consumer device, pushing boundaries for self-sufficient tech.
Extreme Energy Efficiency: By using a low-power accelerometer as the trigger and only lighting a few pixels on demand, the system is designed for minimal power draw, making kinetic power a viable source.
This concept combines existing low-power sensing (accelerometer), efficient display technology (OLED/AMOLED's true blacks), and cutting-edge energy harvesting, creating a genuinely innovative user experience.
Topic:
Accessibility & Inclusion
SubTopic:
General
Hello,
I have the following problem. I’m developing a NoCode app using the FlutterFlow platform and have been working on it for over a year.
This time, after publishing a new version of the app through FlutterFlow, I tried logging into App Store Connect, but I got an error saying that I had made too many login attempts and needed to try again later. However, I hadn't attempted to log in before that at all.
No matter how long I wait—24 hours, 48 hours—the same error keeps appearing, meaning I still can’t access my account. Apple Support hasn’t responded for 4 days, and in total, I’ve been locked out of my account for over 9 days.
Please help me understand what might be causing this issue. App Store Connect refuses to send me an SMS with the login code.
Topic:
Accessibility & Inclusion
SubTopic:
General
Our company enrolled in the Apple Developer Program as an organization in July 2024. Everything was fine for several months, but in early January 2025, our developer noticed that the certificates were missing. When we logged into our developer account, we were shocked to see a page prompting us to “Enroll Today”—as if we had never joined in the first place.
Clicking the enrollment button led us to an error page stating we cannot enroll.
We immediately reached out to Apple Developer Support via email, but despite multiple attempts, we received no response. Strangely, our apps remain live on the App Store, App Store Connect functions as usual, and we continue receiving payments every month. However, we are completely blocked from developing and releasing updates.
Today, I managed to reach Apple by phone. After being transferred to a senior representative, I was told they couldn’t tell me why this was happening. They only confirmed that a request had been made and that I should “wait.” That’s it—no explanation, no timeline, nothing. While it’s somewhat reassuring that they acknowledge the issue, I’ve already seen other developers with the same problem go unanswered for months.
My suspicion? This account might be linked to an individual developer account from way back in 2015 when Apple’s registration process was far less strict. Could that be the issue? No idea—because Apple won’t say a word.
Meanwhile, both of our apps have been affected by several bugs, and customers are waiting for updates. If there's still no response from Apple, I will have no choice but to register a new account, purely to continue supporting our users.
CASE ID: 102508598957
Yesterday I installed iOS 26 on my iPhone as a beta tester. At first there was no problem, but during the afternoon I noticed that neither FaceTime nor iMessage worked... I tried going through the settings as described by Apple Support, but my phone number would not activate. Sometimes I was even asked to activate iCloud. I always get a REG-RESP message.
Does anyone have any ideas what the problem could be?
Topic:
Accessibility & Inclusion
SubTopic:
General
Hi,
I want to detect if there is a fullscreen window on each screen.
_AXUIElementGetWindow and kAXFullscreenAttribute methods work, but I have to be in a non-sandbox environment to use them.
Is there any other way that also works? I don't think it's enough to judge whether a window is fullscreen by comparing the window size to the screen size, since that doesn't work on a MacBook with a notch or when the menu bar is set to auto-hide.
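For reference, a minimal sketch of the non-sandboxed AX approach mentioned above; it needs the Accessibility permission, and note that "AXFullScreen" is not a public header constant, so the attribute string here is an assumption based on observed behavior:

import AppKit
import ApplicationServices

// Returns true if any window of the given app reports the AXFullScreen attribute.
func hasFullscreenWindow(pid: pid_t) -> Bool {
    let app = AXUIElementCreateApplication(pid)
    var windowsRef: CFTypeRef?
    guard AXUIElementCopyAttributeValue(app, kAXWindowsAttribute as CFString, &windowsRef) == .success,
          let windows = windowsRef as? [AXUIElement] else { return false }
    for window in windows {
        var valueRef: CFTypeRef?
        // "AXFullScreen" is an undocumented attribute string (assumption, see above).
        if AXUIElementCopyAttributeValue(window, "AXFullScreen" as CFString, &valueRef) == .success,
           let isFullscreen = valueRef as? Bool, isFullscreen {
            return true
        }
    }
    return false
}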
Thanks.
Topic:
Accessibility & Inclusion
SubTopic:
General
Tags:
Accessibility
Mac App Store
Core Graphics
App Sandbox
AVSpeechSynthesizer on some iOS 18 devices has a bug: it will always read Chinese content such as
AVSpeechUtterance(string: "中文") // Any Chinese content
in the dialect specified by
Settings > Accessibility > Spoken Content > Voices > Chinese > Spoken Language
instead of the dialect specified in AVSpeechUtterance.voice:
AVSpeechSynthesisVoice(language: "zh-HK") // Cantonese
AVSpeechSynthesisVoice(language: "zh-TW") // Mandarin
However, setting the Chinese dialect of AVSpeechSynthesisVoice with "zh-HK" or "zh-TW" worked on iOS 17 and below.
My app has a feature that requires reading sentences in Mandarin followed by Cantonese, i.e., both dialects are needed every time. Therefore, setting the dialect in Spoken Language in Settings is not a workaround that makes my app function correctly on iOS 18.
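For context, here is a minimal sketch of the intended setup, with the same text queued in Cantonese and then Mandarin:

import AVFoundation

let synthesizer = AVSpeechSynthesizer()   // keep a strong reference

func speakInBothDialects(_ text: String) {
    for language in ["zh-HK", "zh-TW"] {   // Cantonese, then Mandarin
        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = AVSpeechSynthesisVoice(language: language)
        synthesizer.speak(utterance)       // utterances are queued and spoken in order
    }
}

speakInBothDialects("中文")
// Expected: Cantonese first, then Mandarin.
// On affected iOS 18 devices: both are read in the dialect chosen under
// Settings > Accessibility > Spoken Content, ignoring utterance.voice.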
Further to the above, I've also discovered that if iOS 18 (in my case, 18.5 was tested) is freshly installed (not upgraded from iOS 17 or below, nor a backup restored after the fresh installation), the bug does not occur.
However, if the device was upgraded from iOS 17 or below, or a backup was restored (in my case, I freshly installed iOS 18.5 on a new iPhone and then restored a backup from another iPhone on iOS 16.2), the bug occurs.
This bug puzzles me because I need both dialects of Chinese to be read aloud one after the other, but as many users report, on most iOS 18 devices (since a fresh installation of the latest iOS without upgrading or restoring is uncommon nowadays), my app reads Cantonese twice or Mandarin twice (depending on the Spoken Language in Settings). This iOS 18 bug makes my app unable to perform its expected behavior.
Would Apple developers look into this and advise whether there is any workaround in app code to overcome this bug, or please fix it in an iOS 18 update. Thank you.
Topic:
Accessibility & Inclusion
SubTopic:
General
AR Quick Look is suddenly grayed out on my iPhone 15 Pro. I bought the phone new recently and it was working great. Two days ago I updated to iOS 18.1.4; AR mode kept opening, but I started getting a "move iPhone over surface" message and the object wouldn't detect surfaces correctly. I then updated to iOS 18.5, and now when I open Quick Look models, AR is completely grayed out.
Can someone help me fix or diagnose the issue?
Thank you.
We have an iOS app built in .NET MAUI (Multi-platform App UI).
It is a web-view app.
We wish to integrate App Clips into this app, but we are unable to, due to the limited resources available online on such an implementation.
We do not wish to share code between the .NET MAUI app and the App Clip.
We understand it is not possible to add an App Clip without a parent Swift/Xcode app.
As an alternative, we were thinking of creating a new app in App Store Connect using Xcode/Swift and integrating the App Clip with it.
This parent app, when downloaded by users, would only redirect them to our main .NET MAUI app on the App Store.
We need to know whether such an app would be approved by App Store review.
Please guide us on this.
Also, please let us know if you have any other solution for integrating App Clips with a .NET MAUI app.
Hey everyone!
I am developing a screen time limit app to help people spend less time in distracting apps.
It works this way: people choose the apps that are unhealthy for them and, on the other side, productivity apps. In the app, you can exchange time spent on healthy habits for time to scroll or use other distracting apps.
The idea was loved on social media, and the app already has 100k followers without even being launched yet.
So I am waiting for just one feature permission from Apple, and they have not given me any answer since I applied 3 weeks ago.
There are a lot of similar apps on the market, and this feature exists in other screen-time-limit apps.
Why is app blocking permission needed?
Time Exchange Functionality:
Users independently select which apps are productive and which are distracting for them.
The system blocks the "negative" apps until the user accumulates enough time in the "positive" ones. This encourages healthy device usage (see the sketch after this list).
Full User Control:
All apps to be blocked are manually selected by the user in the settings.
The extension does not impose any restrictions without explicit permission.
Transparency and Security:
Blocking happens locally, with no data collected about app usage.
We adhere to Apple’s privacy policy.
Compliance with App Store Guidelines:
We understand that app blocking is a sensitive feature, but in our case it:
Is used for the benefit of the user (digital detox, productivity improvement).
Does not interfere with system processes or other developers’ apps.
Does not misuse access to APIs.
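For context, the blocking mechanism itself is the standard Screen Time API; here is a minimal sketch, assuming the FamilyControls / ManagedSettings flow and a FamilyActivitySelection the user picked. This is the functionality gated behind the com.apple.developer.family-controls distribution entitlement I'm waiting on:

import FamilyControls
import ManagedSettings

let store = ManagedSettingsStore()

// One-time user consent for Screen Time access.
func requestScreenTimeAuthorization() async throws {
    try await AuthorizationCenter.shared.requestAuthorization(for: .individual)
}

// Shield (block) the distracting apps the user selected until time is earned.
func shield(_ selection: FamilyActivitySelection) {
    store.shield.applications = selection.applicationTokens
}

// Lift the block once enough "positive" time has been accumulated.
func unshieldAll() {
    store.shield.applications = nil
}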
My question to the forum is:
Did you have similar problems, and how did you resolve them?
Are there any ways to speed up the process or contact someone from the approval team directly?
Should I give up and release it on Android?
I am very disappointed and frustrated. Hope to get some useful tips.
Thank you very much!
Topic:
Accessibility & Inclusion
SubTopic:
General
I wrote an app for my own use and am testing it on my own phone; after about a week it can no longer be opened and shows as no longer available.
P.S. I haven't paid for a developer account, and I don't plan to release the app.
Topic:
Accessibility & Inclusion
SubTopic:
General
Hello, so basically when I play a game and get a notification, it lags badly, down to 20-30 fps I think, and stutters; when the notification is gone it works normally. Also, sometimes when I open the control panel it has a slow and laggy animation.
I bought this phone about a week ago, and this makes me sad :(.
Topic:
Accessibility & Inclusion
SubTopic:
General
Hi all,
I am developing a braille keyboard for iOS. When I test the typing function in Notes, the screen updates very slowly after typing; we expect the response to be instant. I am not sure how to set up the VoiceOver settings.
Is there any documentation on this issue, or any guideline for it?
Hello,
I'm observing a persistent and frustrating issue with an accessibility feature called Guided Access that seems to affect many users across different devices and iOS versions.
Problem
The triple-click gesture (side or home button) to activate Guided Access intermittently stops working after the device has been in normal use for a few days (typically 2-7 days) without a restart.
I have done some debugging for Apple in FB16094026 but received no updates after 6 months. So I'm posting here in the hope that this will be solved sooner. A core accessibility feature shouldn't require daily device restarts to function reliably.
Details:
Guided Access is correctly enabled in Settings > Accessibility.
Initially, the triple-click works perfectly.
After a period of normal device use (2-7 days), the triple-click no longer triggers Guided Access in any app.
Restarting the device temporarily resolves the issue, and Guided Access triple-click works again immediately after a reboot. However, the problem recurs after continued use.
Simply toggling the Guided Access setting on/off does NOT fix it.
Additional observation: Even trying to select Guided Access manually via the Accessibility Shortcut menu (if multiple shortcuts are enabled) sometimes fails to launch the feature when in this state.
Affected:
iPhones and iPads
Observed on iOS/iPadOS 16, 17, and now 18, indicating it's a long-standing bug.
Impact:
Guided Access is a crucial accessibility feature for many users (for focus, special needs, parental controls, etc.). Its unreliable activation significantly disrupts daily workflows and reliance on this function. This issue appears to be widespread, with many reports across forums like Apple Support Communities and Reddit.
For example, this post received over 1k upvotes.
To see more examples please refer to FB16094026.
Could Apple please investigate this bug urgently? Thanks.
Topic:
Accessibility & Inclusion
SubTopic:
General
We have an app with a large audience (around 2.1M DAUs), and because of this we build it with accessibility first in mind.
In that app, we link to specific iOS accessibility settings (such as VoiceOver, Display & Text, etc.) from our menu screens, to offer users a shortcut to customize VoiceOver behaviour, text size, etc.
Unfortunately, since iOS 18 these links no longer work: they all open the Settings app but don't navigate to the specific pane.
It appears (through support) that users rely on these links to easily access the settings; mostly older people trained to go this way in computer courses.
We used to open the Settings app through the App-prefs scheme, but it seems broken in iOS 18.
eg. App-prefs:root=ACCESSIBILITY&path=VOICEOVER_TITLE
I know about the AccessibilitySettings API, but it seems to be limited to one specific feature.
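For completeness, the only documented deep link I'm aware of opens your own app's page in Settings, not the system Accessibility panes; a minimal sketch:

import UIKit

// Opens Settings at this app's own settings page. There is no public URL
// scheme for system panes such as Accessibility > VoiceOver; App-prefs:… is
// private, which is presumably why it broke in iOS 18.
if let url = URL(string: UIApplication.openSettingsURLString) {
    UIApplication.shared.open(url)
}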
Is there a way we can get these links to work again?
Could you please make Visual Intelligence available for the Action button on the iPhone 16e? It should be available only on iPhones with an A18 or future-generation Apple chip.
Topic:
Accessibility & Inclusion
SubTopic:
General
I have a Swift app. In the latest version, I changed some class models. The old version used archivedData(withRootObject:) to save the class model. The latest version crashes with -[NSKeyedUnarchiver decodeObjectForKey:]: cannot decode object of class.
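If the crash is caused by a class being renamed (or its module prefix changing) between versions, mapping the old archived name onto the new class before decoding may help; a minimal sketch, where NewModel and "MyApp.OldModel" are hypothetical placeholders for the real types:

import Foundation

final class NewModel: NSObject, NSCoding {
    let name: String
    init(name: String) { self.name = name }
    func encode(with coder: NSCoder) { coder.encode(name, forKey: "name") }
    required init?(coder: NSCoder) {
        guard let name = coder.decodeObject(forKey: "name") as? String else { return nil }
        self.name = name
    }
}

func decodeLegacyArchive(_ data: Data) throws -> NewModel? {
    let unarchiver = try NSKeyedUnarchiver(forReadingFrom: data)
    unarchiver.requiresSecureCoding = false            // archive predates secure coding
    // Map the class name stored in the old archive to the current class.
    unarchiver.setClass(NewModel.self, forClassName: "MyApp.OldModel")
    defer { unarchiver.finishDecoding() }
    return unarchiver.decodeObject(forKey: NSKeyedArchiveRootObjectKey) as? NewModel
}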
Topic:
Accessibility & Inclusion
SubTopic:
General
I have updated to iPadOS 26, and my pointer is still the circle one, not the arrow cursor.
Topic:
Accessibility & Inclusion
SubTopic:
General
I have tried to join the Apple Developer Program, and it still says pending after 3 days.
Does anyone have any idea how long the procedure takes?
Topic:
Accessibility & Inclusion
SubTopic:
General