Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.

Posts under Accessibility & Inclusion topic

Disable Inappropriate Content Warnings in Writing Tools
I’ve noticed that the writing tools frequently display content warnings for certain topics, often more than seems necessary, and I would love to see a way to disable them in my app. Specifically, I have encountered many situations where mentioning LGBTQ topics triggers a content warning and temporarily blocks the user, while similar mentions of heterosexual topics do not. For example:

Triggers a content warning:
“I’ve been listening to this song on repeat yesterday and today haha. It’s really nice that it’s about being homosexual ~u~”
“I’ve been listening to this song on repeat yesterday and today haha. It’s really nice that it’s about being *** ~u~”

Does not trigger a content warning:
“I’ve been listening to this song on repeat yesterday and today haha. It’s really nice that it’s about being heterosexual ~u~”
“I’ve been listening to this song on repeat yesterday and today haha. It’s really nice that it’s about being straight ~u~”

This inconsistency not only introduces unnecessary friction when discussing diverse topics but may also inadvertently suggest that LGBTQ topics are inappropriate or immoral. Such an implication could be harmful to many users, especially young adults, by reinforcing negative perceptions of LGBTQ issues. I believe it’s important for the AI writing tools to handle all topics equitably to promote inclusivity and diversity, and I would love to be able to disable these warnings in my app. Here’s a video showcasing the example prompts above: https://share.icloud.com/photos/0bdGPxy2xmGZT2ildAU4ThERQ
0 replies · 0 boosts · 534 views · last activity Sep ’24
iOS 18 review
Over the years, Apple, y'all have continued to improve iOS, adding many new features, and I’m very impressed with the new AI features. However, there’s one thing y'all haven’t added for some reason, and it’s a clear-all-tabs button. It would make things so much easier: instead of swiping up on all those tabs, you could just press a button and everything’s reset. I don’t know if y'all have a reason for not adding the button yet, but I’m just putting it out there.
1 reply · 0 boosts · 577 views · last activity Nov ’24
iOS VoiceOver Does Not Remove :focus-visible from Button When Moving to Non-Button Elements
When using iOS VoiceOver to navigate a webpage, selecting a button element correctly activates its :focus-visible state. However, when VoiceOver moves to a non-button element (such as a link or a focusable div), the previously focused button retains its :focus-visible state. The focus indicator only updates when VoiceOver moves to another button. This behavior can be confusing for screen reader users, as it creates the appearance of multiple elements being focused simultaneously. It also differs from expected keyboard navigation behavior, where focus styles typically update as soon as the user moves to a new interactive element.

Is this intentional VoiceOver behavior, or could it be a bug? If intentional, is there a recommended workaround to ensure correct focus indication when moving between different types of elements?

Steps to Reproduce:
1. Enable VoiceOver on an iOS device.
2. Navigate using swipe gestures or explore-by-touch to focus on a button.
3. Observe that the button correctly receives the :focus-visible styling.
4. Move to a non-button element (e.g., a div with tabindex="0" or a link).
5. Notice that the button still retains its :focus-visible state, even though VoiceOver has moved to a new element.

Expected Behavior: The previously focused button should lose its :focus-visible state when VoiceOver moves to a different interactive element, just as it does when using keyboard navigation.

Actual Behavior: The :focus-visible state remains on the previously focused button unless VoiceOver moves to another button. This can create confusion by displaying multiple focus indicators at once.

Tested On: iOS 17.7 and 18.3.1; iOS Safari; iPhone 11 Pro and iPhone 14 Pro Max.
1 reply · 0 boosts · 633 views · last activity Feb ’25
visionOS - Enhancing Accessibility for Individuals with Visual Impairments
Hello, I am reaching out because I believe your product, the Vision Pro, could significantly improve the quality of life for individuals with visual impairments, and I thought my personal experience might be of interest to you. We could discuss this in more detail, but to respect your time, I’ll get straight to the point: I have retinitis pigmentosa, a rare retinal disease for which there is currently no treatment. This condition causes a progressive narrowing of the visual field (potentially leading to blindness) and a deficit in photoreceptors (let’s just say I’m not exactly a night owl). In my case, it has become impossible to go out alone in the dark or even see in dim light. (Goodbye evening parties—I can’t even find the entrance to a nightclub, let alone navigate the dance floor!).

However, I’ve discovered that sometimes, simply looking through my phone screen and using its brightness helps me see much better. Over the years, I’ve imagined how amazing it would be if a pair of glasses could simply display the image my eyes are supposed to perceive, but with enhanced brightness. It would allow me to live my life as freely as others, whether that’s venturing out at night or finding that elusive pen lost in the depths of my apartment. I initially looked into the Google Glass project, for example, but it pales in comparison to what Apple is now creating, don’t you think?

What amuses me most is that what some see as a tool that isolates users from reality could actually become an inclusion device for people like me, who would use it to go out and engage with the world. (I can’t count how many times I’ve gone home early in winter because of the anxiety caused by the early darkness, or turned down after-work gatherings with my DevOps colleagues.) The Vision Pro could simply restore reality for us by enhancing what has been progressively lost.

And that’s just for nighttime! I can only imagine how helpful it could be during the day—for instance, by detecting obstacles or highlighting dangerous zones in a person’s limited field of vision. One could even use OCR technology to map the results of a visual field test and provide tailored assistance. What incredible potential… I dream of a day when ideas like these become a reality, and I wanted to share them with you. This wouldn’t just help me—it could help many others as well.

Thank you for taking the time to read this message. I would be delighted to contribute in any way, should these development directions resonate with you now or in the future.

Wishing you an excellent evening,
Hugo Bled
1 reply · 0 boosts · 508 views · last activity Nov ’24
iPadOS 18
Since updating my iPad to the update in the subject title, I can no longer type French accents with my Magic Keyboard. The toggle between English and French is there, but the actual accented key input is no longer available.
0 replies · 0 boosts · 428 views · last activity Sep ’24
macOS avoid VoiceOver audio ducking for certain sounds
Is it possible to play certain sounds from a macOS app that remain at full volume while VoiceOver is speaking? Here's some background: I want to play sounds from my app (even when it's not in focus) to notify a VoiceOver user that an action is available (this action can be triggered even when the app is not in focus, and is configurable by the user). I tried using an NSSound for this, but VoiceOver ducks the audio of my sound while it is speaking. Is there some way to exempt certain sounds from audio ducking? Or is there another, perhaps lower-level, audio API that I can use to achieve this?
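One workaround I've been considering (an assumption on my part, not a documented fix for ducking): route the notification through VoiceOver itself as an accessibility announcement, so there is no separate app sound for VoiceOver to duck. A minimal sketch, assuming placeholder announcement text and the main window as the posting element:

import AppKit

// Sketch: ask VoiceOver to speak the notification itself rather than
// playing a separate sound that gets ducked. Untested for this use case.
func announceActionAvailable() {
    guard let element = NSApp.mainWindow else { return }
    NSAccessibility.post(
        element: element,
        notification: .announcementRequested,
        userInfo: [
            .announcement: "An action is available",
            // High priority asks VoiceOver to interrupt current speech.
            .priority: NSAccessibilityPriorityLevel.high.rawValue
        ]
    )
}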
1 reply · 0 boosts · 482 views · last activity Dec ’24
guidedAccessStatusDidChangeNotification does not get called on visionOS
I am trying to get a notification when Guided Access is enabled or disabled on the Vision Pro. To do so you would normally just call:

NotificationCenter.default.addObserver(forName: UIAccessibility.guidedAccessStatusDidChangeNotification, object: nil, queue: .main) { notification in
    print("guided access did change")
}

and this works fine on iOS devices. But running the exact same code on visionOS results in no notification at all, even though Guided Access gets enabled or disabled. For testing I ran a simple default app that works perfectly on both OS types:

import SwiftUI

struct ContentView: View {
    var body: some View {
        VStack {
            Image(systemName: "globe")
                .imageScale(.large)
                .foregroundStyle(.tint)
            Text("Hello, world!")
        }
        .onAppear {
            print("is appearing")
            NotificationCenter.default.addObserver(forName: UIAccessibility.guidedAccessStatusDidChangeNotification, object: nil, queue: .main) { _ in
                print("guided access did change")
            }
        }
        .padding()
    }
}

But as said, it prints "guided access did change" on iOS but not on the Vision Pro.
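Until the notification works there, a possible fallback (untested on visionOS, and assuming UIAccessibility.isGuidedAccessEnabled itself reports correctly on that platform) is to re-check the flag on scene-phase changes instead of relying on the notification:

import SwiftUI

struct GuidedAccessWatcher: View {
    @Environment(\.scenePhase) private var scenePhase
    @State private var guidedAccessOn = UIAccessibility.isGuidedAccessEnabled

    var body: some View {
        Text(guidedAccessOn ? "Guided Access on" : "Guided Access off")
            .onChange(of: scenePhase) { _, _ in
                // Re-read the flag whenever the scene changes phase,
                // since the change notification never fires here.
                let now = UIAccessibility.isGuidedAccessEnabled
                if now != guidedAccessOn {
                    guidedAccessOn = now
                    print("guided access did change (polled): \(now)")
                }
            }
    }
}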
0 replies · 0 boosts · 410 views · last activity Sep ’24
Don't get the completed shipping contact after authorization
We get the response from Apple Pay after the customer completes Face ID / Touch ID authorization, but the shipping contact is not complete. For example:

{
  "addressLines": ["1************ kwy"],
  "administrativeArea": "FL",
  "country": "",
  "countryCode": "",
  "emailAddress": "S*********le.com",
  "familyName": "******i",
  "givenName": "******m",
  "locality": "*******s",
  "phoneNumber": "+*******79",
  "phoneticFamilyName": "",
  "phoneticGivenName": "",
  "postalCode": "*****3",
  "subAdministrativeArea": "",
  "subLocality": ""
}

As the documentation says, it should be the completed shipping contact, but country and countryCode are empty: https://developer.apple.com/documentation/apple_pay_on_the_web/applepaypayment/1916097-shippingcontact
1 reply · 0 boosts · 408 views · last activity Dec ’24
When to use Numbered Lists in VoiceOver Accessibility
When would it be a good idea to utilize numbered lists with VoiceOver accessibility? As an example, for tab bars VoiceOver will read out something like "2 of 5". In speaking with an Accessibility engineer at WWDC this year, they said that it's good practice, but we ran out of time in the call to dig deeper. When or how do you know when you should read out the item count, like "2 of 5"? It makes sense to me in, say, a tab group or a chip group, but I don't see it being good in, say, a UITableView with a varying number of sections, each of which can contain multiple cells. I've had trouble finding additional guidance on this topic; if anyone can provide recommendations or their thoughts, that would be great.
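For a custom chip group, where UIKit does not provide the position announcement that tab bars and segmented controls get for free, one approach is to fold the position into each element's accessibility value manually. A rough sketch, with hypothetical ChipRowView and chipTitles names:

import UIKit

// Sketch: manually mimic the tab-bar style "N of M" readout for a
// custom chip row. Names here are placeholders, not a real API.
final class ChipRowView: UIStackView {
    func configureChipAccessibility(chipTitles: [String]) {
        for (index, chip) in arrangedSubviews.enumerated() {
            chip.isAccessibilityElement = true
            chip.accessibilityTraits = .button
            chip.accessibilityLabel = chipTitles[index]
            // Position readout, announced after the label.
            chip.accessibilityValue = "\(index + 1) of \(chipTitles.count)"
        }
    }
}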
3 replies · 0 boosts · 794 views · last activity Oct ’24
How to use Core Spotlight?
I watched videos and blog posts and downloaded their projects, and there Core Spotlight works accordingly. I copied the code to an empty project and did the same as they did, but it still isn't working. OS: macOS and iOS. On the Core Data object I set an attribute to index for Spotlight, and on the object itself I put the attribute name in the display name for Spotlight.

class PersistenceController {
    static let shared = PersistenceController()
    var spotlightDelegate: NSCoreDataCoreSpotlightDelegate?

    @MainActor
    static let preview: PersistenceController = {
        let result = PersistenceController(inMemory: true)
        let viewContext = result.container.viewContext
        for _ in 0..<10 {
            let newItem = Item(context: viewContext)
            newItem.timestamp = Date()
        }
        do {
            try viewContext.save()
        } catch {
            let nsError = error as NSError
            fatalError("Unresolved error \(nsError), \(nsError.userInfo)")
        }
        return result
    }()

    let container: NSPersistentContainer

    init(inMemory: Bool = false) {
        container = NSPersistentContainer(name: "SpotLightSearchTest")
        if inMemory {
            container.persistentStoreDescriptions.first!.url = URL(fileURLWithPath: "/dev/null")
        }
        container.loadPersistentStores(completionHandler: { [weak self] (storeDescription, error) in
            if let error = error as NSError? {
                fatalError("Unresolved error \(error), \(error.userInfo)")
            }
            if let description = self?.container.persistentStoreDescriptions.first {
                description.setOption(true as NSNumber, forKey: NSPersistentHistoryTrackingKey)
                description.type = NSSQLiteStoreType
                if let coordinator = self?.container.persistentStoreCoordinator {
                    self?.spotlightDelegate = NSCoreDataCoreSpotlightDelegate(
                        forStoreWith: description,
                        coordinator: coordinator
                    )
                    self?.spotlightDelegate?.startSpotlightIndexing()
                }
            }
        })
        container.viewContext.automaticallyMergesChangesFromParent = true
    }
}

In my @main app:

@main
struct SpotLightSearchTestApp: App {
    let persistenceController = PersistenceController.shared

    var body: some Scene {
        WindowGroup {
            ContentView()
                .environment(\.managedObjectContext, persistenceController.container.viewContext)
                .onContinueUserActivity(CSSearchableItemActionType) { _ in
                    print("")
                }
        }
    }
}

The onContinueUserActivity(CSSearchableItemActionType) handler never gets triggered. So what am I missing that they don't explain in the blog posts or videos?
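One thing I would double-check (my own guess, not something the videos call out): persistent store description options generally need to be set before loadPersistentStores is called, and NSCoreDataCoreSpotlightDelegate relies on persistent history tracking being on from the start. In the code above, the history-tracking option is set inside the completion handler, after the store has already loaded. A reworked init sketch, assuming the same class and imports (CoreData plus CoreSpotlight):

init(inMemory: Bool = false) {
    container = NSPersistentContainer(name: "SpotLightSearchTest")
    if inMemory {
        container.persistentStoreDescriptions.first!.url = URL(fileURLWithPath: "/dev/null")
    }
    // Configure the description BEFORE loading the store; options set
    // afterwards have no effect on the already-loaded store.
    if let description = container.persistentStoreDescriptions.first {
        description.setOption(true as NSNumber, forKey: NSPersistentHistoryTrackingKey)
        description.type = NSSQLiteStoreType
    }
    container.loadPersistentStores { _, error in
        if let error = error as NSError? {
            fatalError("Unresolved error \(error), \(error.userInfo)")
        }
    }
    // loadPersistentStores runs synchronously by default, so the store
    // exists by the time the Spotlight delegate is created here.
    if let description = container.persistentStoreDescriptions.first {
        spotlightDelegate = NSCoreDataCoreSpotlightDelegate(
            forStoreWith: description,
            coordinator: container.persistentStoreCoordinator
        )
        spotlightDelegate?.startSpotlightIndexing()
    }
    container.viewContext.automaticallyMergesChangesFromParent = true
}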
3 replies · 0 boosts · 181 views · last activity Mar ’25
VoiceOver Sound
Hello, when I listen to a title in my app with VoiceOver, it makes a strange sound. The title is made of Korean characters plus numbers plus alphabet letters. Does this combination make a strange sound with VoiceOver? I would like to ask if Apple can fix this issue. Thank you.
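While waiting on a fix, one thing that might be worth trying (an assumption on my part, since I cannot reproduce the exact title) is hinting the speech language for the mixed-script text with an attributed accessibility label:

import UIKit

// Sketch: hint VoiceOver's speech language for a mixed-script title.
// "titleLabel" and the sample text are hypothetical placeholders.
func applyAccessibility(to titleLabel: UILabel) {
    let title = "노래 123 ABC"  // stand-in for a Korean+number+alphabet title
    titleLabel.accessibilityAttributedLabel = NSAttributedString(
        string: title,
        attributes: [.accessibilitySpeechLanguage: "ko-KR"]
    )
}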
1 reply · 0 boosts · 189 views · last activity Mar ’25
Unable to Add Accessibility Trait to UISegmentedControl
I’m trying to add the .header accessibility trait to a UISegmentedControl so that VoiceOver recognizes it accordingly. However, setting the trait using the following code doesn’t seem to have any effect:

segmentControl.accessibilityTraits = segmentControl.accessibilityTraits.union(.header)

Even after applying this, VoiceOver doesn’t announce it as a header. Is there any workaround or recommended approach to achieve this?
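One guess (I haven't found this confirmed in Apple's documentation): UISegmentedControl manages its own accessibility elements and may rebuild their traits, discarding what is set from outside. A subclass that pins the trait in the getter might be worth trying, though it is untested:

import UIKit

// Sketch: force the .header trait by overriding the getter, so it
// survives any trait resets UIKit performs internally.
final class HeaderSegmentedControl: UISegmentedControl {
    override var accessibilityTraits: UIAccessibilityTraits {
        get { super.accessibilityTraits.union(.header) }
        set { super.accessibilityTraits = newValue }
    }
}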
1 reply · 0 boosts · 185 views · last activity Mar ’25