Hi! I have noticed a few glitches, as well as some overall drawbacks, with Assistive Access mode.
Alarms, timers, the stopwatch, etc. do not sound or alert. However, I have an infant monitor app and I do get its sound alerts, so I know it is possible. Do I need to download a separate alarm app for alarms to work?
Cannot make FaceTime calls with favorite contacts.
Find My iPhone cannot jump to the maps app.
Camera cannot zoom in or out.
Photos cannot be deleted, edited, or shared in a shared album in the photos app.
Photos/videos cannot be sent in messages.
Spotify cannot be accessed from the lock screen.
Apps do not stay open if you lock the phone screen or leave it untouched long enough for it to auto-lock.
There is no flashlight option. I downloaded an app to get this feature, but if the screen isn't touched it auto-locks, which shuts off the flashlight in the app until I unlock the phone again.
I'm trying to use this feature with VoiceOver, but each time I try, it uses my system voice.
How can I fix this?
I sent a report to Apple.
My Feedback ID is FB15265988.
Individuals who have had a stroke can end up with vision impairments, specifically Homonymous Hemianopia, which means the individual has lost sight in (for example) the left half of both eyes. I'm interested in understanding whether it would be possible to help individuals with this vision impairment through an accessibility configuration in the Apple Vision Pro that would first determine an individual's field of view (possibly by showing a field of dots across the entire "screen" and having the individual look at each dot and click). Based on the resulting field of view, the device would then determine how the screen is presented to the user going forward.
My mom (82 years old) had a stroke recently and was diagnosed with Homonymous Hemianopia. She lived on her iPhone and would love to get back the ability to send text messages, use Facebook, and order items from Amazon.
Please advise whether you believe the Apple Vision Pro would be capable of helping in this area with the suggested development, or share any other thoughts.
Hi Apple, I've been having a problem with my iPad Pro (5th generation).
Since updating my iPad it has been acting weird lately… to be specific, it keeps closing randomly (twice so far), the widgets turn white, and my screen keeps going black.
When I go into apps, it keeps exiting out of the app.
Also, the new Siri is so slow, and it won't answer when I say "Hey Siri" except on random occasions.
Please help me fix this problem, because I need my iPad for studying…
Thank you.
I get no sound at all: no music, no texts, no phone calls. If I unplug it and try to plug it back in, the phone is not recognized at all.
I am facing an issue with the back camera on my iPhone 14 Plus: it shows a black screen. My iPhone was manufactured between April 2023 and April 2024, but it is still not eligible for the Apple service program even though it has the same issue. Why is it not eligible?
SwiftUI provides the accessibilityCustomContent(_:_:) modifier to add additional accessibility information for an element. However, I couldn’t find a similar approach in UIKit.
Is there a way to achieve this in UIKit?
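For reference, UIKit appears to cover this case with the AXCustomContentProvider protocol from the Accessibility framework (iOS 14+), which plays the same role as SwiftUI's accessibilityCustomContent(_:_:). A minimal sketch; the view class and the label/value strings are made-up placeholders:

import UIKit
import Accessibility

// Hypothetical card view exposing extra, lower-priority details to
// assistive technologies without bloating the primary label.
final class ProfileCardView: UIView, AXCustomContentProvider {
    var accessibilityCustomContent: [AXCustomContent]! {
        get {
            let joined = AXCustomContent(label: "Joined", value: "March 2024")
            let followers = AXCustomContent(label: "Followers", value: "512")
            // .high is spoken automatically; default importance is only
            // spoken when the user asks for more content.
            followers.importance = .high
            return [joined, followers]
        }
        set { } // values are derived, so the setter is a no-op
    }
}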
In SwiftUI, the date picker component fails color-contrast accessibility checks. The code below was used to create the date picker:
struct ContentView: View {
    @State private var date = Date()
    @State private var selectedDate: Date = .init()

    var body: some View {
        let min = Calendar.current.date(byAdding: .day, value: 14, to: Date()) ?? Date()
        let max = Calendar.current.date(byAdding: .year, value: 4, to: Date()) ?? Date()
        DatePicker(
            "Start Date",
            selection: $date,
            in: min ... max,
            displayedComponents: [.date]
        )
        .datePickerStyle(.graphical)
        .frame(alignment: .topLeading)
        .onAppear {
            selectedDate = Calendar.current.date(byAdding: .day, value: 14, to: Date()) ?? Date()
        }
    }
}

#Preview {
    ContentView()
}
Attaching a screenshot of the accessibility failure.
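In case it is useful while this is investigated, a possible mitigation sketch, assuming the failing contrast comes from the default accent color used by the graphical style (the color value is an arbitrary darker example, not an Apple-recommended one):

import SwiftUI

struct HighContrastDatePicker: View {
    @State private var date = Date()

    var body: some View {
        DatePicker("Start Date", selection: $date, displayedComponents: [.date])
            .datePickerStyle(.graphical)
            // Hypothetical darker tint; verify the rendered result against
            // the 4.5:1 ratio with Accessibility Inspector before relying on it.
            .tint(Color(red: 0.0, green: 0.25, blue: 0.6))
    }
}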
I was able to add shortcuts with parameters and use them from the Shortcuts app in iOS 17; nevertheless, the Siri intent never worked.
I upgraded my app and my phone to iOS 18.
Now the shortcut only appears in the Shortcuts app if no parameter is added to it. When I try to set a parameter, the shortcut no longer appears in the Shortcuts app.
import AppIntents

// Minimal stand-in for the app's ScreenOption type, which the original
// post does not show; assumed here to be a String-backed enum.
enum ScreenOption: String, CaseIterable {
    case home, settings

    var id: ScreenOption { self }
    var title: String { rawValue.capitalized }
}

struct ShortcutsProvider: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenAppIntent(),
            phrases: [
                "Show \(\.$screen) in \(.applicationName)"
            ],
            shortTitle: "Open",
            systemImageName: "iphone.badge.play"
        )
    }
}

struct OpenAppIntent: AppIntent {
    static var title: LocalizedStringResource = "Show"
    static let description = IntentDescription("Shows a screen.")
    static var openAppWhenRun: Bool = true
    static var authenticationPolicy = IntentAuthenticationPolicy.alwaysAllowed

    // Phrases can only interpolate parameters that are App entities or
    // App enums, so the parameter is typed as ScreenOption rather than a
    // plain String, which cannot appear in a phrase and matches the
    // disappearing-shortcut symptom described above.
    @Parameter(title: "screen")
    var screen: ScreenOption

    @MainActor
    func perform() async throws -> some IntentResult {
        return .result()
    }
}

extension ScreenOption: AppEntity {
    struct OpenAppQuery: EntityQuery {
        @IntentParameterDependency<OpenAppIntent>(\.$screen)
        var openAppIntent

        func entities(for identifiers: [ScreenOption.ID]) async throws -> [ScreenOption] {
            return []
        }

        func suggestedEntities() async throws -> [ScreenOption] {
            return []
        }
    }

    var displayRepresentation: DisplayRepresentation {
        .init(stringLiteral: "\(title)")
    }

    static var defaultQuery: OpenAppQuery = OpenAppQuery()
    static var typeDisplayRepresentation: TypeDisplayRepresentation = .init(name: "Screen")
}

extension ScreenOption: EntityIdentifierConvertible {
    static func entityIdentifier(for entityIdentifierString: String) -> ScreenOption? {
        allCases.first { $0.rawValue == entityIdentifierString }
    }

    public var entityIdentifierString: String {
        rawValue
    }

    public init?(entityIdentifierString: String) {
        guard let screenOption = ScreenOption.entityIdentifier(for: entityIdentifierString)
        else { return nil }
        self = screenOption
    }
}
Who else has this error?
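If it helps anyone reproduce this, a minimal follow-up check, assuming the String-typed parameter was the culprit: after switching the parameter to an App entity (or App enum), ask the system to re-parse the phrases with updateAppShortcutParameters(); where you call it from (e.g. app launch) is up to you.

// Call once at launch, and again whenever entity data changes, so the
// system re-evaluates the App Shortcut phrases.
ShortcutsProvider.updateAppShortcutParameters()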
Allow the user to add their own tags to the default emoji tags.
For instance, this emoji, for me, is nonna: 🤌🏻. My efficiency would improve immensely if I could search for it as the “Nonna” emoji, rather than searching for nonna, remembering the tag doesn’t exist, searching for other things it might be called, realising I don’t know what it is, and then having to scroll through all the hand emojis twice to find it.
🤌🏻🤞🏼👌
Hi everyone,
I wanted to share my thoughts on the recent announcement regarding the Image Playground waitlist. I understand that there has been an increase in wait times, but I find it interesting that a friend of mine signed up yesterday and already gained access.
I’ve seen in various places that the system operates on a first-come, first-served basis. I’ve been on the waitlist since day one, so I’m starting to wonder if this is truly the case.
If anyone has more insights on this or has had a similar experience, I’d love to hear your thoughts!
Thanks!
Before I start: this could just be me and a handful of people, but I like to reorganize my phone screen to my needs based on what’s going on in life. I was just thinking it would be easier if you could get rid of all the folders at once and then reorganize, or something easier than the long, extensive process it is now.
I went to Privacy & Security on my iPhone 11 and Developer Mode is not there.
I have two views in a container view as below
@IBOutlet weak var dataDisclosureView: UIStackView! // Main container view

@IBOutlet private weak var titleLabel: UILabel! {
    didSet {
        titleLabel.text = "Hello"
    }
}

@IBOutlet private weak var descriptionLabel: UILabel! {
    didSet {
        descriptionLabel.text = "World"
    }
}

// Sub-container view containing titleLabel and descriptionLabel.
@IBOutlet weak var descriptionView: UIStackView! {
    didSet {
        descriptionView.isAccessibilityElement = true
        descriptionView.accessibilityLabel = "Hello"
        descriptionView.accessibilityIdentifier = "test_hello"
    }
}

@IBOutlet private weak var requestButton: UIButton! {
    didSet {
        requestButton.isAccessibilityElement = true
        requestButton.accessibilityLabel = "Request Button"
        requestButton.accessibilityIdentifier = "test_button"
    }
}

override func viewDidLoad() {
    super.viewDidLoad()
    dataDisclosureView.isAccessibilityElement = false
    dataDisclosureView.accessibilityElements = [descriptionView ?? ""]
    if #available(iOS 17.0, *) {
        dataDisclosureView.automationElements = [descriptionView ?? "",
                                                 requestButton ?? ""]
    } else {
        // Fallback on earlier versions
    }
    let requestButtonAction = UIAccessibilityCustomAction(name: "start",
                                                          target: self,
                                                          selector: #selector(request))
    dataDisclosureView.accessibilityCustomActions = [requestButtonAction]
}
My issue is that I want accessibilityIdentifiers for descriptionLabel, titleLabel, requestButton, and hintLabel (for automation), and accessibility labels for descriptionView and requestButton (for VoiceOver).
But I am unable to see the accessibilityIdentifier for the button, titleLabel, or descriptionLabel in Accessibility Inspector. What am I doing wrong here?
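For comparison, a minimal sketch of the general rule at play here: a container that is itself an accessibility element (or a superview whose accessibilityElements array lists only some children) hides the remaining descendants from the accessibility tree, which would explain why the labels' identifiers never show up. Names mirror the outlets above; whether this fits the VoiceOver grouping you want is a separate trade-off:

override func viewDidLoad() {
    super.viewDidLoad()

    // Let the individual labels stay visible to the inspector instead of
    // collapsing them into descriptionView.
    descriptionView.isAccessibilityElement = false
    titleLabel.accessibilityIdentifier = "test_title"
    descriptionLabel.accessibilityIdentifier = "test_description"

    // Once a custom accessibilityElements array is set, every element
    // that should remain reachable must be listed in it.
    dataDisclosureView.accessibilityElements = [titleLabel!,
                                                descriptionLabel!,
                                                requestButton!]
}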
Could you please make Visual Intelligence available for the Action button on the iPhone 16e? It could be limited to iPhones with the A18 and future-generation Apple chips.
Hello, I have a question regarding the voice and sound recognition features on the iPhone 15 Pro.
The iPhone 15 Pro is equipped with four microphones, and I understand that for features like Apple’s sound recognition and when invoking Siri, the microphone(s) must always be active. My question is whether the device uses a single microphone (mono channel) for these functions or if multiple microphones are activated simultaneously.
I would appreciate clarification on how the microphones are utilized in sound and voice recognition features.
Thank you for your assistance.
Best regards.
How do I enable the system hand controls within an immersive space?
I have an ImmersiveSpace and would like to enable the new visionOS 2.0 system controls, like the home button and volume slider.
ImmersiveSpace(id: appModel.immersiveSpaceID) {
    ImmersiveView()
}
.immersionStyle(selection: .constant(.full), in: .full)
.upperLimbVisibility(.visible)
While I can see my hands and arms in this view, I cannot get the "New Hand Gestures" to show up when on visionOS 2.0. When I leave the immersive space, they appear.
Hi,
I am writing in the hope of receiving some clarification about the rationale of the audit type sufficientElementDescription, in the context of the Accessibility Audit API.
Please see my test below:
And another example in the context of Xcode, where the strings visible in the UI are also set as the accessibility labels of their respective elements.
Thanks for your help!
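For anyone unfamiliar with the API being discussed, a minimal sketch of how this audit type is typically driven from a UI test (the test class and app setup are placeholders):

import XCTest

final class DescriptionAuditTests: XCTestCase {
    func testSufficientElementDescriptions() throws {
        let app = XCUIApplication()
        app.launch()

        // Runs only the sufficientElementDescription audit; each issue
        // fails the test unless the handler returns true to ignore it.
        try app.performAccessibilityAudit(for: .sufficientElementDescription) { issue in
            return false // report everything
        }
    }
}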
I am working on capturing 48MP images using the iPhone 16 Pro Max with the Ultra-wide camera. I’ve updated the code to capture the maximum supported dimensions with the following snippet:
if #available(iOS 16.0, *) {
    photoOutput.maxPhotoDimensions = device.activeFormat.supportedMaxPhotoDimensions.last!
    photoSettings.maxPhotoDimensions = .init(width: 5712, height: 4284)
}
However, I’m still not getting the expected results. My goal is to capture 48MP images, and I want to confirm whether the Ultra-wide camera supports this resolution or whether I’m missing some other configuration.
Any guidance would be appreciated!
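One way to verify what the hardware actually offers, assuming the capture session and device are already configured: 5712 × 4284 is roughly 24 MP, so the hardcoded request caps the capture below the goal, and a 48 MP capture would be on the order of 8064 × 6048. The sketch below just logs every supported dimension per format so you can confirm whether such a mode exists on the Ultra-wide camera.

import AVFoundation

// Logs each format's supported max photo dimensions for the given device.
func logSupportedPhotoDimensions(for device: AVCaptureDevice) {
    for format in device.formats {
        let dims = format.supportedMaxPhotoDimensions
            .map { "\($0.width)x\($0.height)" }
            .joined(separator: ", ")
        print("format: \(format) -> [\(dims)]")
    }
}

If an 8064 × 6048 entry does appear, setting photoSettings.maxPhotoDimensions = photoOutput.maxPhotoDimensions (instead of the hardcoded value) would request it.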