When using an app with an external keyboard, @FocusState and the .focused modifier worked fine up through iOS 17. Vertical-axis text fields were also accessible without any issues. But after the iOS 18 update, adding the .focused modifier removes elements from the external keyboard's focus order.
One such example: when a button using the .focused modifier and @FocusState is inside a ScrollView, and that view is opened via a NavigationLink, the button is not reachable with a Bluetooth (external) keyboard.
TextEditor and vertical-axis TextFields also seem to drop out of the external keyboard focus order when placed inside a ScrollView.
Is this a known iOS 18 issue with ScrollView? Any tip on how to get this fixed?
Sample code that can reproduce this issue:
struct ContentView: View {
    @State private var showBottomSheet: Bool = false
    @State private var goToNextView: Bool = false
    @FocusState private var focused: Bool
    @AccessibilityFocusState private var voFocused: Bool

    var body: some View {
        NavigationView {
            VStack {
                Text("Hello, world!")

                // This button works fine with a Bluetooth keyboard in all versions
                Button("Trigger a bottomsheet") {
                    showBottomSheet = true
                }
                .focused($focused)
                .accessibilityFocused($voFocused)

                Button("Goto another view") {
                    goToNextView = true
                }

                NavigationLink(
                    destination: View2(),
                    isActive: $goToNextView
                ) { EmptyView() }
                .accessibility(hidden: true)
            }
            .sheet(isPresented: $showBottomSheet,
                   onDismiss: {
                       focused = true
                       voFocused = true
                   }, content: {
                       VStack {
                           Text("Hello World! I'm in a bottomsheet")
                           Button("Close me") {
                               showBottomSheet = false
                           }
                       }
                   })
            .padding()
        }
    }
}

#Preview {
    ContentView()
}

struct View2: View {
    @FocusState private var focused: Bool
    @AccessibilityFocusState private var voFocused: Bool
    @State private var showBottomSheet: Bool = false

    var body: some View {
        ScrollView {
            VStack {
                Text("check")

                // In iOS 18, this button doesn't get focused with a Bluetooth / external keyboard.
                // The issue occurs when these three things combine in iOS 18: a button using @FocusState,
                // inside a view that contains a ScrollView, opened via NavigationLink.
                Button("Trigger a bottomsheet") {
                    showBottomSheet = true
                }
                .focused($focused)
                .accessibilityFocused($voFocused)

                Button("Test button") { }
            }
            .sheet(isPresented: $showBottomSheet,
                   onDismiss: {
                       focused = true
                       voFocused = true
                   }, content: {
                       VStack {
                           Text("Hello World! I'm in a bottomsheet")
                           Button("Close me") {
                               showBottomSheet = false
                           }
                       }
                   })
            .padding()
        }
    }
}
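One direction that might be worth trying (untested, just a sketch) is to re-assert keyboard focus shortly after the pushed view appears, in case the focus engine only settles after the NavigationLink transition. The 0.5-second delay and the view name below are illustrative assumptions, not a documented fix:

import SwiftUI

// Sketch only: re-request focus after the pushed view settles.
// The delay value is an arbitrary assumption, not a documented requirement.
struct View2FocusWorkaround: View {
    @FocusState private var focused: Bool

    var body: some View {
        ScrollView {
            VStack {
                Button("Trigger a bottomsheet") { }
                    .focused($focused)
            }
        }
        .onAppear {
            // Give the ScrollView pushed via NavigationLink a moment to settle
            // before asking the focus system to move to the button.
            DispatchQueue.main.asyncAfter(deadline: .now() + 0.5) {
                focused = true
            }
        }
    }
}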
Hi guys, can anyone help me? I can't pair my Apple Watch Series 1 with my iPhone 15.
I'm working on a macOS Accessibility setup for a French-speaking user and I've hit a wall. (I'm not a developer; I'm trying to help my kid with dyslexia.)
I successfully built a custom word prediction panel using the Panel Editor (Keyboard) in macOS Accessibility > Keyboard > Accessibility Keyboard.
Here’s what I have so far:
• The prediction panel works system-wide: I can use it to type in Finder, Safari, Notes, TextEdit, and even browser search bars.
• The panel appears above all applications and suggestions show up correctly.
• However, it does not work inside Google Docs (tested in Chrome, Safari, and Firefox). Selecting a word from the panel does nothing in the Docs editor.
I suspect this is because:
• Google Docs does not use a standard macOS text input field.
• Docs is a web app that relies on custom JavaScript editors, contentEditable elements, and canvas rendering, so macOS Accessibility APIs (AXTextField, AXInsertText, etc.) don’t register or inject text events.
• Accessibility tools like the Accessibility Keyboard rely on native macOS text input methods, which don’t hook into Google Docs’ custom editor.
Important:
I’m not a programmer. I’d like to know if there is an easy fix or option in macOS, Google Chrome, or Google Docs that would make my custom prediction panel work, before going into custom development.
Technical setup:
• MacBook Air (M2, 2022)
• RAM: 8 GB
• macOS: Sequoia 15.3.1
• Language: French (system and keyboard)
• Accessibility Keyboard: Enabled via Settings > Accessibility > Keyboard
• Custom panel: Built using Panel Editor (Keyboard), named “Philemon Prédiction”
• Browsers tested: Chrome, Safari, Firefox (same issue)
• Behavior: Panel is visible, suggestions appear, but inserting text does nothing in Google Docs
Has anyone worked around this limitation? Is there a simple setting, workaround, or accessibility option to bridge macOS Accessibility input with Google Docs’ editor?
Thanks a lot!
I have the following method to insert @mentions into a text field:
func insertMention(user: Token, at range: NSRange) {
    let tokenImage: UIImage = renderMentionToken(text: "@\(user.username)")

    let attachment = NSTextAttachment()
    attachment.image = tokenImage
    attachment.bounds = CGRect(x: 0, y: -3, width: tokenImage.size.width, height: tokenImage.size.height)
    attachment.accessibilityLabel = user.username
    attachment.accessibilityHint = "Mention of \(user.username)"

    let attachmentString = NSMutableAttributedString(attributedString: NSAttributedString(attachment: attachment))
    attachmentString.addAttribute(.TokenID, value: user.id, range: NSRange(location: 0, length: 1))
    attachmentString.addAttribute(.Tokenname, value: user.username, range: NSRange(location: 0, length: 1))

    let mutableText = NSMutableAttributedString(attributedString: textView.attributedText)
    mutableText.replaceCharacters(in: range, with: attachmentString)
    mutableText.append(NSAttributedString(string: " "))

    textView.attributedText = mutableText
    textView.selectedRange = NSRange(location: range.location + 2, length: 0)

    mentionRange = nil
    tableView.isHidden = true
}
When I use Xcode's Accessibility Inspector to inspect the text input, the inserted token is not read by the inspector; instead, whitespace is shown for the token. I want to set the accessibility label to the string content of the NSTextAttachment. How?
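One approach that might work (a sketch, not a verified fix) is to expose the token text at the text-view level by overriding accessibilityValue in a UITextView subclass, replacing each attachment character with the username stored under the custom attribute. The class name below is illustrative, and .Tokenname is assumed to be the custom NSAttributedString.Key already defined in the project:

import UIKit

// Sketch only: rebuild the accessibility value from the attributed text,
// substituting the stored username wherever a mention token attachment appears.
final class MentionTextView: UITextView {
    override var accessibilityValue: String? {
        get {
            guard let attributed = attributedText else { return super.accessibilityValue }
            let result = NSMutableString()
            attributed.enumerateAttributes(in: NSRange(location: 0, length: attributed.length)) { attrs, range, _ in
                if let username = attrs[.Tokenname] as? String {
                    // Speak the mention instead of the attachment placeholder character.
                    result.append("@\(username)")
                } else {
                    result.append(attributed.attributedSubstring(from: range).string)
                }
            }
            return result as String
        }
        set { super.accessibilityValue = newValue }
    }
}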
I have two iPad Pros
iPad Pro (12.9 inch) (3rd Generation)
iPad Pro 13-inch M4
Both with their Apple Magic Keyboard and trackpad.
With iPadOS 17 I could press Ctrl-Space to jump between input languages. With 18.1.1 and the 18.2 beta this is broken.
On the old iPad, the language indicator used to show EN, then DE. Now it shows EN DE, EN DE. In Settings, the keyboards were shown as having both English and German input for the two physical keyboards. I deleted and recreated them; now each keyboard has only one language and Ctrl-Space alternates between EN and DE again.
On the new iPad Pro, the two keyboards are already set with a single input language each, but Ctrl-Space does not change the input language; it types a space instead. There does not seem to be any way to change the input language from the keyboard.
I did the 18.3 update over the weekend, and every contact and their information (family names, addresses, photos, etc.) that was added to my phone over the last year is completely gone. I've spent hours on the phone with Apple and their "top" senior account employees with no resolution. I am told my case has been escalated to engineering and they will get back to me in one week. I have zero confidence my issue will be resolved. I've gone over every action taken over the weekend, and the only things I did were delete some emails and run the update. There has to be a way they can see every action made on my phone to find the issue.
Hi, my name is Yuki.
I'm developing an application with generative AI for junior high and high school students based on the no-code tool Bubble (BDK Native).
I want to upload this app to the App Store; however, a notch-like interface element appears only in the iPad environment, which is causing my app to fail App Store review.
I've reached out to the BDK Native support team about this issue, but they were unable to identify the cause and only offered a refund as a solution. This is particularly frustrating, as I'm unable to proceed with App Store publication and time is passing without a resolution.
Technical details:
The notch appears only on iPad devices
The issue is not present on iPhone versions
The app was built using bubble/BDK Native
Multiple App Store submissions have been rejected due to this UI issue
Has anyone encountered a similar issue or knows how to resolve this iPad-specific interface problem? Any guidance or suggestions would be greatly appreciated, as this is blocking our app's release.
Thank you in advance for your help!
Hello, my name is Donald Kirby, and I am a registered developer. I have been helping the development team with accessibility issues for years. When I first inquired about becoming a developer, they told me I could pay a hundred dollars a year and still contribute.
However, in the last two betas, I've encountered problems with voice control becoming inactive. I tried to screenshot the microphone feature to show them, but it disappears when I use Command + 3. Is there a trick I'm unaware of other than turning the microphone off and back on to reactivate it? I'm baffled about what to do next.
I've reported this issue numerous times, and I'm genuinely trying to help. Unfortunately, I have to rely on dictation when voice control doesn't work, but dictation doesn't operate the computer effectively. It feels like a glitch because recent changes have affected voice control and dictation, particularly with voice command functionality.
If I could code myself, I would gladly learn, but it's quite challenging due to the amount of typing involved. Thank you for your assistance.
Hello,
I have the following problem. I’m developing a NoCode app using the FlutterFlow platform and have been working on it for over a year.
This time, after publishing a new version of the app through FlutterFlow, I tried logging into App Store Connect, but I got an error saying that I had made too many login attempts and needed to try again later. However, I hadn't attempted to log in before that at all.
No matter how long I wait (24 hours, 48 hours), the same error keeps appearing, meaning I still can't access my account. Apple Support hasn't responded for 4 days, and in total, I've been locked out of my account for over 9 days.
Please help me understand what might be causing this issue. App Store Connect refuses to send me an SMS with the login code.
When my macOS Cocoa app displays a modal alert with beginSheetModal(for:completionHandler:), VoiceOver sometimes seems to focus on an "illegal" upper level, where any attempts at navigation will give the unhelpful response "Alert, dialog", until you "drill down" with VO + shift + down or switch apps. After that, things will work as expected.
Is this a known bug? Does it happen to anybody else, or am I doing something wrong?
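For reference, a minimal sketch of the presentation path (window handling simplified; the message strings are illustrative):

import AppKit

// Sketch of how the sheet-modal alert is shown; "window" is whatever NSWindow the view controller owns.
func showAlert(on window: NSWindow) {
    let alert = NSAlert()
    alert.messageText = "Something happened"
    alert.informativeText = "Details for the user."
    alert.addButton(withTitle: "OK")
    alert.beginSheetModal(for: window) { response in
        // The VoiceOver "Alert, dialog" focus issue appears while this sheet is up.
        print(response)
    }
}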
I've just received an email from Apple regarding the Global Accessibility Awareness Day and some forthcoming sessions to promote their accessibility features.
What a joke.
For many years, Apple refuses to provide the most basic accessibility requirement on macOS:
LET USERS DISABLE ALL NON-CONSENSUAL UNSOLICITED ANIMATIONS AND OTHER UI CONVULSIONS.
The scourge of animations started from macOS Lion.
Yes, many of them can be, fortunately, disabled through some obscure Terminal commands (that is, if the user is lucky enough to discover them on some obscure internet resources).
The "Reduce motion" control in System Settings is a fake option that doesn't do anything.
And there are two especially glaring accessibility violations that cannot be disabled:
Scroll bar rollover highlight effect introduced on macOS 10.7.3. Every time you move the cursor over a scroll bar, the bar gets highlighted. It results in bringing the user's attention to random scroll bars for no reason whatsoever just because the cursor happens to pass over the bar at some point. HUNDREDS of unnecessary, annoying events of distraction daily!
Expand/collapse animation of NSOutlineView (such as when we open/close a folder in the list view in the Finder, as well as any other app that's using outline views). It's extremely annoying, distracting, and time-wasting.
All feedback submitted about this through the years remains mostly ignored (except for a few cases where I received some ridiculous replies from employees who, apparently, are barely familiar with Macs in general).
Apple does NOT care about accessibility. Not only this, but it's obvious that Apple is, in fact, intentionally abusing those users who can't tolerate distracting, time-wasting animations and UI convulsions.
Hey everyone,
I've been thinking about a truly innovative way to enhance iPhone battery life and user convenience, drawing inspiration from kinetic energy harvesting. What if we could have a clock display on the main iPhone screen that's powered purely by user motion, and activates only when you look at it, without touching your main battery?
The Core Idea
Imagine this:
Kinetic Energy Harvesting: Your iPhone would have a tiny, integrated kinetic energy generator. This generator would capture the energy from your everyday movements – walking, picking up the phone, putting it in your pocket.
Independent Power Source: This harvested energy would be stored in a small, dedicated capacitor or micro-battery, completely separate from your iPhone's main battery.
Accelerometer-Activated Display: Instead of relying on power-hungry facial recognition, the phone's accelerometer (a very low-power sensor) would detect specific "raise to wake" or "tap to look" gestures.
On-Demand, Ultra-Low Power Clock: Only when the accelerometer detects one of these specific gestures would the stored kinetic energy be used to illuminate just the necessary pixels on the main OLED/AMOLED screen to display the time. The rest of the screen stays completely black (consuming no power on OLED).
Automatic Shut-Off: As soon as the gesture ends or the phone is put down, the clock display would turn off, conserving the limited harvested energy.
Why This Matters
This isn't just a cool gimmick; it offers significant benefits:
True Battery Independence: Get the time at a glance, anytime, without touching your main battery or even the power button. This means more main battery life for apps, calls, and everything else.
Ultimate Convenience: A "magical" interaction – just pick up your phone, and the time instantly appears. No taps, no button presses.
Sustainable & Innovative: Showcases practical "energy harvesting" in a consumer device, pushing boundaries for self-sufficient tech.
Extreme Energy Efficiency: By using a low-power accelerometer as the trigger and only lighting a few pixels on demand, the system is designed for minimal power draw, making kinetic power a viable source.
This concept combines existing low-power sensing (accelerometer), efficient display technology (OLED/AMOLED's true blacks), and cutting-edge energy harvesting, creating a genuinely innovative user experience.
AVSpeechSynthesizer on some iOS 18 devices has a bug where it always reads Chinese content such as:
AVSpeechUtterance(string: "中文") // Any Chinese Content
in the dialect specified by:
Settings > Accessibility > Spoken Content > Voices > Chinese > Spoken Language
instead of the dialect that I specified in AVSpeechUtterance.voice:
AVSpeechSynthesisVoice(language: "zh-HK") // Cantonese
AVSpeechSynthesisVoice(language: "zh-TW") // Mandarin
However, setting the Chinese dialect of AVSpeechSynthesisVoice to "zh-HK" or "zh-TW" worked on iOS 17 and below.
My app has a feature that requires reading sentences in Mandarin followed by Cantonese, i.e., both dialects are needed every time. Therefore, setting the dialect under Spoken Language in Settings is not a workaround that makes my app function correctly on iOS 18.
Further to the above, I've also discovered that, if iOS 18 (in my case, 18.5 is tested) is freshly installed (not upgrading from iOS 17 or below, nor restoring backup after fresh installation of iOS 18), the bug above will not happen.
However, if it was an upgrade from iOS 17 or below, or backup is restored (in my case, I freshly installed iOS 18.5 on a new iPhone and then restored a backup from another iPhone on iOS 16.2), the bug above happens.
This bug puzzles me because I need both dialects of Chinese to be read aloud one after the other, but as reported by many users, on most iOS 18 devices (since a fresh installation of the latest iOS without upgrading or restoring is uncommon nowadays) my app reads Cantonese twice or Mandarin twice, depending on the Spoken Language setting. This iOS 18 bug makes my app unable to perform the expected behavior.
Would Apple developers look into this and advise whether there is any possible workaround in app code to overcome this bug, or please fix it in an iOS 18 update? Thank you.
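For context, this is roughly how the two dialects are requested one after the other (a simplified sketch; the actual text in the app differs):

import AVFoundation

let synthesizer = AVSpeechSynthesizer()

// Same Chinese text, spoken first in Mandarin (zh-TW), then in Cantonese (zh-HK).
// On affected iOS 18 devices both utterances come out in the dialect chosen in
// Settings > Accessibility > Spoken Content > Voices > Chinese, ignoring the voice set here.
for language in ["zh-TW", "zh-HK"] {
    let utterance = AVSpeechUtterance(string: "中文")
    utterance.voice = AVSpeechSynthesisVoice(language: language)
    synthesizer.speak(utterance)
}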
Hi Guys,
I've been trying for two weeks to sign up for the Apple Developer Program. I tried to pay the $99, but it didn't come off my account. Now my account is in pending mode and there is no response from Apple Support. I've logged about 10 tickets over the past two weeks.
Any advice here? I am getting desperate.
In our system, when a user enables a mobile hotspot and the system connects to it, the system verifies Wi-Fi availability by sending an HTTP GET request to http://captive.apple.com.
Normally, the server returns:
HTTP Status: 200 (OK)
Content-Type: text/html
This has always been used as a sign of normal connectivity.
Issue:
Since last Friday, the server sometimes responds with:
Content-Type: application/octet-stream
When this occurs, our system determines that the network is unavailable and displays a connection warning (a “!” icon).
Question:
Has Apple recently made any backend or CDN configuration changes to captive.apple.com that could affect the response type?
Any advice how can we solve this problem?
Thanks!
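In case it helps, a sketch of a check keyed off the HTTP status code and the well-known "Success" body rather than the Content-Type header, assuming that body is still being returned; the function name is illustrative:

import Foundation

// Sketch only: treat the network as available when captive.apple.com answers 200
// and the body contains "Success", ignoring the Content-Type header entirely.
func checkConnectivity(completion: @escaping (Bool) -> Void) {
    let url = URL(string: "http://captive.apple.com")!
    let task = URLSession.shared.dataTask(with: url) { data, response, error in
        guard error == nil, let http = response as? HTTPURLResponse else {
            completion(false)
            return
        }
        let body = data.flatMap { String(data: $0, encoding: .utf8) } ?? ""
        completion(http.statusCode == 200 && body.contains("Success"))
    }
    task.resume()
}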
We have an iOS app built in .NET MAUI (Multi-platform App UI).
It is a web view app.
We wish to integrate App Clips into this app.
But we are unable to do so, due to the limited resources available online on such an implementation.
We do not wish to share code between the .NET MAUI app and the App Clip.
We understand it is not possible to add an App Clip without a parent Swift/Xcode app.
As an alternative solution, we were thinking of creating a new app in App Store Connect using Xcode/Swift and integrating the App Clip with it.
This parent app, when downloaded by users, would only redirect them to our main .NET MAUI app on the App Store.
We need to know whether such apps will be approved by App Store review.
Please guide us on this.
Also, please let us know if you have any other solution for integrating App Clips with a .NET MAUI app.
I have tried to join the Apple Developer Program, and it says it is still pending after 3 days.
Does anyone have any idea how long the process takes?
Yesterday I installed iOS 26 on my iPhone as a beta tester. At first there was no problem, but during the afternoon I noticed that neither FaceTime nor iMessage worked. I tried going through the settings as described by Apple Support, but my phone number would not activate. Sometimes I was even asked to activate iCloud. I always get a REG-RESP message.
Does anyone have any ideas what the problem could be?
Hey everyone!
I am developing a screen time limit app to help people spend less time in distracting apps.
It works this way: people choose the apps that are unhealthy (distracting) for them and, on the other side, productivity apps. In the app you can exchange time spent on healthy habits for time to scroll or use other distracting apps.
This idea was loved on social media, and the app already has 100k followers without even being launched yet.
So I am waiting for just one feature permission from Apple, and they have not given me any answer since I applied 3 weeks ago.
There are a lot of similar apps on the market, and this feature exists in other screen time limit apps.
Why is app blocking permission needed?
Time Exchange Functionality:
Users independently select which apps are productive and which are distracting for them.
The system blocks the "negative" apps until the user accumulates enough time in the "positive" ones (a rough sketch of this mechanism follows this list). This encourages healthy device usage.
Full User Control:
All apps to be blocked are manually selected by the user in the settings.
The extension does not impose any restrictions without explicit permission.
Transparency and Security:
Blocking happens locally, with no data collected about app usage.
We adhere to Apple’s privacy policy.
Compliance with App Store Guidelines:
We understand that app blocking is a sensitive feature, but in our case it:
Is used for the benefit of the user (digital detox, productivity improvement).
Does not interfere with system processes or other developers’ apps.
Does not misuse access to APIs.
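To illustrate the mechanism described above, here is a minimal sketch of how the blocking side could be wired up with FamilyControls and ManagedSettings, assuming the Family Controls entitlement is granted; the function names are illustrative:

import FamilyControls
import ManagedSettings

// Illustrative sketch: the user picks "negative" apps with FamilyActivityPicker elsewhere;
// here we just apply or lift the shield depending on how much "positive" time was earned.
let store = ManagedSettingsStore()

func requestScreenTimeAuthorization() async throws {
    try await AuthorizationCenter.shared.requestAuthorization(for: .individual)
}

func applyBlock(for selection: FamilyActivitySelection) {
    // Shield the user-selected distracting apps and categories.
    store.shield.applications = selection.applicationTokens.isEmpty ? nil : selection.applicationTokens
    store.shield.applicationCategories = .specific(selection.categoryTokens)
}

func liftBlock() {
    // Remove all shields once enough healthy-habit time has been exchanged.
    store.shield.applications = nil
    store.shield.applicationCategories = nil
}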
My question to the forum is:
Did you have similar problems, and how did you resolve them?
Are there any ways to speed up the process or contact someone from the approval team directly?
Should I give up and release it on Android?
I am very disappointed and frustrated. Hope to get some useful tips.
Thank you very much!
Could there be a haptic or sound cue to make it accessible for the blind (sound) and deaf (haptic) populations to know when location services and the camera were last used?
Also, the grey notification, rather than the purple notification, for location services should appear for the full 24 hours after an application has used it, if the correct description is within the "copy" of Settings.
The green light lets them know that the application has switched to the camera; that indicator and the fading orange light could both have subtle click sounds, like a shutter, or a bigger haptic with a softer sound, editable in Settings, of course.