Hi. I'm a frontend developer and I'm having problems navigating and interacting with any type of carousel using VoiceOver on mobile. The carousel navigation works perfectly using the arrows and touch, but after enabling VoiceOver the navigation freezes and it's impossible to move the carousel with touch gestures; only the arrows work.
Following the documentation, I'm trying to use a three-finger swipe gesture to move the carousel. I also tried other gestures, but none worked as expected. I even tested VoiceOver on other web pages that have carousels, such as the Bootstrap carousel component page, PicPay, Amazon, and even Apple.
I made a StackBlitz page with the sample code I'm using to test carousel accessibility: https://stackblitz.com/edit/stackblitz-starters-gg2t5w?file=src%2Fmain.ts
So the question is: are there any limitations to using VoiceOver in browsers? Am I doing something wrong here? Or is there a different approach when it comes to carousels?
Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.
Hi. I'm using the string catalog to localize the app.
How can I change the language in my app at runtime?
I want to switch the language using a button.
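One common approach (a sketch, not the only way, and not an Apple API): keep the chosen language code yourself and resolve strings against that language's .lproj bundle instead of the system language. The `LanguageManager` name and its fallback behaviour below are my own assumptions:

```swift
import Foundation

// Sketch of an in-app language switch. `LanguageManager` is a
// hypothetical helper, not an Apple API; it assumes the chosen
// language's .lproj folder (e.g. fr.lproj) exists in the app bundle.
enum LanguageManager {
    // Set this from your button's action, e.g. LanguageManager.current = "da"
    static var current = Locale.preferredLanguages.first ?? "en"

    static func localized(_ key: String) -> String {
        // Look the key up in the chosen language's .lproj; fall back to
        // the normal system lookup when that bundle is missing.
        guard let path = Bundle.main.path(forResource: current, ofType: "lproj"),
              let bundle = Bundle(path: path) else {
            return NSLocalizedString(key, comment: "")
        }
        return bundle.localizedString(forKey: key, value: nil, table: nil)
    }
}
```

After changing `current`, you still need to reload the visible screens so they re-read their strings; SwiftUI views can observe the selection and re-render.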
I have the following requirement for my project:
The app localization is currently based on static string JSON files. However, I would like to dynamically update strings upon an API call. Is it possible to update the resource files mentioned above after receiving a response from the API? If not, can I create a custom bundle to store new translations? Additionally, where should I create the custom bundle, specifying the path or location?
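You can't modify the compiled resources inside the signed app bundle, but you can write fetched translations into a bundle-shaped folder in a writable location (typically Application Support) and load strings from there. A minimal sketch; the `Runtime.bundle` name and file layout are my own choice, not a required convention:

```swift
import Foundation

// Serialize fetched translations into old-style .strings syntax.
func stringsFileContent(_ translations: [String: String]) -> String {
    translations
        .sorted { $0.key < $1.key }
        .map { key, value in
            // Escape embedded quotes so the file stays a valid plist.
            let escaped = value.replacingOccurrences(of: "\"", with: "\\\"")
            return "\"\(key)\" = \"\(escaped)\";"
        }
        .joined(separator: "\n")
}

// Write a runtime bundle shaped like
//   <root>/Runtime.bundle/<lang>.lproj/Localizable.strings
// and return the bundle URL, ready for Bundle(url:).
func writeRuntimeBundle(_ translations: [String: String],
                        language: String,
                        into root: URL) throws -> URL {
    let bundleURL = root.appendingPathComponent("Runtime.bundle")
    let lproj = bundleURL.appendingPathComponent("\(language).lproj")
    try FileManager.default.createDirectory(at: lproj, withIntermediateDirectories: true)
    let file = lproj.appendingPathComponent("Localizable.strings")
    try stringsFileContent(translations).write(to: file, atomically: true, encoding: .utf8)
    return bundleURL
}
```

At lookup time, load the runtime bundle with `Bundle(url:)` and call `localizedString(forKey:value:table:)`, falling back to the compiled strings when a key is missing.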
Hello
I want to perform an accessibility audit of a table view in my XCTest.
The test fails with: Error Domain=com.apple.xcode.xctest.accessibilityAudit Code=-56 "Audit failed to complete in time".
The test passes manually with the Accessibility Inspector.
Do you know how to make it pass automatically?
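One thing worth trying, assuming the Xcode 15 audit API: scope the audit to a subset of checks rather than the default `.all`, since running every rule over a long table view can exceed the internal time limit. A sketch (the test name and the choice of audit types are illustrative; this needs to run as a UI test in Xcode):

```swift
import XCTest

final class TableAuditTests: XCTestCase {
    func testTableViewAudit() throws {
        let app = XCUIApplication()
        app.launch()
        // Audit only a subset of checks; a full audit (.all) of a long
        // table can exceed the time limit (the Code=-56 error above).
        try app.performAccessibilityAudit(for: [.dynamicType, .contrast]) { issue in
            // Return true to mark an issue as expected so it
            // doesn't fail the test.
            return false
        }
    }
}
```

Scrolling the table to a known, settled state before auditing can also help, so the audit isn't racing cell loading and reuse.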
Localizable.strings:1:1 validation failed: Couldn't parse property list because the input data was in an invalid format
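A validation failure at line 1, column 1 usually means the file isn't valid old-style plist at all: a UTF-8 BOM, smart quotes pasted from a word processor, a stray character, or a missing semicolon somewhere earlier in the file. Each entry must look like this:

```
/* Optional comment */
"welcome_title" = "Welcome";
"quoted_value" = "A \"quoted\" word"; /* embedded quotes must be escaped */
```

Running `plutil -lint fr.lproj/Localizable.strings` reports the exact offending location.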
I've spent 3 days moving localization from my older project to my new project and converted to Strings Catalog.
When I tried to export, I got the Unable to build project for localization string extraction error.
I read about that, and the workaround seems to be setting Localization Export Supported to NO, which I did.
It worked once, and now it refuses to export and complains about missing targets, which I had to remove from the export process as recommended. WTH?
Or, at some point it complained about a missing Swift Package.
wwdc2023-10155
Why oh why did you have to mess with the login screen in Sonoma??? For a visually impaired person (like my wife) you have made the login/lock screen very unfriendly compared to previous releases. With Ventura and earlier versions, I have the Lock Screen settings set to:
List of Users
Show Sleep, Restart, Shut Down buttons
The avatar pics of the three users on our computers (admin, me, wife) show up as big icons in the middle of the screen, with the Sleep, Restart, Shutdown buttons in a row right beneath the icons. My wife can find her avatar, click on it, type in her password, and then get right to her magnified closeview screen. With Sonoma, she will have to deal with small moving avatar pics at the bottom of the screen, not friendly at all. And she will NEVER find the Shutdown button hidden in the upper right menu bar.
Doesn't Apple test new upgrades with the accessibility community??? Sonoma is a big step backward for the visually impaired.
Hello, we have recently developed a refreshable braille display for visually challenged people and are planning to add support for it in VoiceOver under Braille Display. As far as I know, there needs to be a custom driver that handles communication with the device, but I'm not sure how to build one.
Any idea on how to proceed with it or is any documentation available that I can follow?
Thanks.
App was rejected with:
Guideline 2.3 - Performance - Accurate Metadata
We were unable to locate some of the features described in your metadata. Specifically, “Languages: English and Portuguese (PT and BR).”
My app is indeed localized in these three languages, tested and working. The app automatically adapts its language to match the user's iOS system language settings.
Isn't this enough?
I replied to the rejection saying this, but I find it hard to believe that the reviewer didn't change the iOS system language settings. Is it possible, or is it something else?
Also, do I need to press the "Add for Review" button so the reviewer can see my reply?
Thank you.
We launched an application we created with Assistive Access enabled on iOS 17.
We tried to offer standard services such as Mail and Messages to users via UIActivityViewController.
When selecting the Mail or Messages icon, the following error occurred:
"Not allowing open application request from unallowed client process."
Is there anything I can do to allow the request from our process?
We launched an application we created with Assistive Access enabled on iOS 17.
When we tried to open a URL using the openURL method, the following error message was returned and the URL could not be opened:
"Untrusted open application requests are not allowed in Assistive Access."
Is there anything we can do to have our request trusted and allowed?
VoiceOver is not treating the navigation bar's left button as the first focused element.
If we navigate from A -> B, focus goes to the first element inside B's view, not to the left bar button item (the back button).
If we post an accessibility notification in B's viewWillAppear, focus does not shift.
If we post in viewDidAppear, focus first goes to an element inside B's view and then shifts back to the back button.
The behaviour is inconsistent. Can you please help here?
Thanks
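A pattern that often helps here (a sketch assuming a UIKit navigation flow with a custom left bar button; the timing is the key point, not the exact code): post the notification from viewDidAppear, once the bar button is actually in the accessibility hierarchy, and pass the element you want focused as the argument:

```swift
import UIKit

final class BViewController: UIViewController {
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // viewWillAppear is too early: the bar button isn't in the
        // accessibility tree yet, so the notification argument is ignored.
        if let backButton = navigationItem.leftBarButtonItem {
            UIAccessibility.post(notification: .screenChanged, argument: backButton)
        }
    }
}
```

Note that if you rely on the system-provided back button, there is no public reference to pass as the argument, which is one reason the default focus behaviour varies between screens.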
I've found multiple leaks in AVSpeechSynthesizer which are plaguing my users. My users are complaining of crashes due to this.
I've created a feedback item (FB12212129) with a sample project attached which demonstrates one of the leaks. I'm hoping an engineer notices this. The only way I've had my feedback noticed in the past is by both creating a feedback item AND posting on the forums. So here's my forum post. Help is much appreciated!
Hi all,
My app is available in English and Arabic.
However, when searching for the app in the iOS App Store by typing the app name in Arabic, it does not show up in the suggested apps.
This happens with both English and Arabic phone/app configurations.
On the other hand, on Android I don't have this issue.
Do you know what to do to fix this?
We have already added the app name in the AR app store keywords / description, but it did not help as it's still not showing up.
Thank you
I have multiple targets in my app: targetA and targetB. Whenever I "Export For Localization", the generated .xliff contains translations for:
1) Main.storyboard
2) targetA/InfoPlist.strings
3) targetB/InfoPlist.strings
4) targetA/Localizable.strings
5) targetB/Localizable.strings
I want to avoid multiple Localizable.strings and multiple InfoPlist.strings files; we do not need one localization per target. Is there any way I can force "Export For Localization" to create an .xliff that contains only one Localizable.strings and one InfoPlist.strings?
Steps:
Create a new clean Swift package in Xcode 14.3
Add a defaultLocalization value to the package manifest
Add a platform to the package manifest: platforms: [.iOS(.v16)]
Run Product -> Export Localizations and the following error appears:
Version 14.3 (14E222b)
macOS 13.1 (22C65)
Feedback FB12183400
I'm wondering if now would be an excellent time for Apple to consider implementing Accessibility Lens with FaceTime Chroma Green. Unlike Apple, other platforms like Zoom and Webex allow using Chroma Green backgrounds. I use Cam Studio, Elgato Camera Hub, and OBS for Chroma Green effects. I'd like to have the option to use FaceTime with Chroma Green and choose my background setting. For accessibility and professionalism, we need the ability to change the background settings. We want to be creative with our FaceTime and FaceTime Group. We've invested much in our devices, including iPhones, iPads, and MacBook Pros.
The issue stems from Apple's built-in applications using hardened runtimes. These runtimes prevent apps from loading third-party plugins unless explicitly allowed by the developers. This means third-party camera drivers are incompatible with Apple apps. We're trying to find a solution, but currently, there's nothing we can do. This is a barrier for all of us who are Deaf, Deaf-Blind, and Hard of Hearing and rely on FaceTime and FaceTime Group with Chroma Green background settings. Please let me know. Thank you!
I recently added a new language to my app and I have an issue regarding the localization.
Configuration:
I have the developmentRegion set to "fr" and two localizations set up: "French: Development language" and "Danish".
The goal is to have the application in Danish language if the preferred language is Danish and French otherwise.
Issue:
If French or Danish appear in my preferred languages, I have no problem.
But if neither French nor Danish is set as a preferred language (for example, only English), the app translation is in French as expected, but all strings for app permissions like "NSLocationWhenInUseUsageDescription" in the system alert are in Danish.
It's as if the Localizable.strings file used is the one in fr.lproj while the InfoPlist.strings used is the one in da.lproj.
Debugging:
I verified that all the .strings files are in the right place. I have:
fr.lproj
-- Localizable.strings
-- InfoPlist.strings
da.lproj
-- Localizable.strings
-- InfoPlist.strings
I verified the file layout inside the .app bundle:
find ***.app -name "*.strings"
***.app/fr.lproj/InfoPlist.strings
***.app/da.lproj/InfoPlist.strings
***.app/fr.lproj/Localizable.strings
***.app/da.lproj/Localizable.strings
Finally, I printed Bundle.main.localizedInfoDictionary and I get the values from the French file, but the application shows the Danish ones.
Thanks for the help.
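For what it's worth, the symptom above is consistent with two independent lookups: the app process resolves Localizable.strings, while the system process that shows the permission alert resolves InfoPlist.strings on its own, and the two fallback decisions can disagree when neither language is in the user's preferred list. Roughly, the matching works like this (a simplified sketch of my own, not Apple's actual algorithm):

```swift
import Foundation

// Simplified sketch of localization matching (not Apple's real
// implementation): take the first preferred language with a matching
// .lproj, else fall back to the development region.
func matchLocalization(preferred: [String],
                       available: [String],
                       developmentRegion: String) -> String {
    for lang in preferred {
        if available.contains(lang) { return lang }
        // "da-DK" should still match a bare "da" folder.
        let base = String(lang.prefix(while: { $0 != "-" }))
        if available.contains(base) { return base }
    }
    return developmentRegion
}
```

If the system's pass over InfoPlist.strings ignores the development region or sees a different available list, it can land on da.lproj while the in-process pass lands on fr.lproj. Checking CFBundleDevelopmentRegion in the built Info.plist is a cheap sanity check.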
This simple text:
Text("10'20\"")
when the localization is right-to-left, like Arabic:
on iMac it shows "20'10
on iOS it shows 10'20"
Which one is correct?
Why are they different?
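The difference comes from bidirectional text reordering: a string like 10'20" is just digits plus direction-neutral punctuation, and each layout engine can resolve it differently inside a right-to-left paragraph. Rather than deciding which platform is "correct", you can make rendering deterministic by wrapping the measurement in Unicode directional isolates (U+2066 LEFT-TO-RIGHT ISOLATE … U+2069 POP DIRECTIONAL ISOLATE). A sketch:

```swift
// Wrap a direction-neutral fragment (digits + punctuation) in
// LRI ... PDI so it renders left-to-right even inside RTL text.
func leftToRightIsolated(_ fragment: String) -> String {
    "\u{2066}" + fragment + "\u{2069}"
}

// e.g. Text(leftToRightIsolated("10'20\"")) in SwiftUI
```

The isolate characters are invisible; they only constrain how the bidi algorithm orders the wrapped run.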
Hi there,
I'm wondering how certain keystrokes should work when using an external keyboard to navigate apps.
In particular, I'm wondering if there is a difference between the arrow keys and the Ctrl+Tab shortcut. The iPhone keyboard shortcuts documentation states that Ctrl+Tab moves to the next item, but doesn't elaborate on what an 'item' is.
Should you be able to reach every interactive element in a view with both the arrow keys and Ctrl+Tab?
Thanks for your help