I have two views in a container view as below
@IBOutlet weak var dataDisclosureView: UIStackView! // main container view

@IBOutlet private weak var titleLabel: UILabel! {
    didSet {
        titleLabel.text = "Hello"
    }
}

@IBOutlet private weak var descriptionLabel: UILabel! {
    didSet {
        descriptionLabel.text = "World"
    }
}

@IBOutlet weak var descriptionView: UIStackView! { // sub-container view holding titleLabel and descriptionLabel
    didSet {
        descriptionView.isAccessibilityElement = true
        descriptionView.accessibilityLabel = "Hello"
        descriptionView.accessibilityIdentifier = "test_hello"
    }
}

@IBOutlet private weak var requestButton: UIButton! {
    didSet {
        requestButton.isAccessibilityElement = true
        requestButton.accessibilityLabel = "Request Button"
        requestButton.accessibilityIdentifier = "test_button"
    }
}

override func viewDidLoad() {
    super.viewDidLoad()

    dataDisclosureView.isAccessibilityElement = false
    dataDisclosureView.accessibilityElements = [descriptionView ?? ""]

    if #available(iOS 17.0, *) {
        dataDisclosureView.automationElements = [descriptionView ?? "",
                                                 requestButton ?? ""]
    } else {
        // Fallback on earlier versions
    }

    let requestButtonAction = UIAccessibilityCustomAction(name: "start",
                                                          target: self,
                                                          selector: #selector(request))
    dataDisclosureView.accessibilityCustomActions = [requestButtonAction]
}
My issue is that I want accessibilityIdentifiers for descriptionLabel, titleLabel, requestButton, and hintLabel (for automation), and accessibility labels for descriptionView and requestButton (for VoiceOver accessibility).
But I am unable to see the accessibilityIdentifier for the button, titleLabel, or descriptionLabel in Accessibility Inspector. What am I doing wrong here?
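For context, a minimal, untested sketch of the direction that seems consistent with the code above (iOS 17+ only, and purely an assumption on my part): once descriptionView is marked as a single accessibility element, its child labels are no longer exposed in the hierarchy, so they would have to be handed to the automation layer explicitly via automationElements while VoiceOver keeps reading the combined label.

// Untested sketch, iOS 17+, reusing the outlets above.
// descriptionView stays one VoiceOver element, but its labels are still
// surfaced to UI tests through automationElements.
if #available(iOS 17.0, *) {
    descriptionView.automationElements = [titleLabel!, descriptionLabel!]
    dataDisclosureView.automationElements = [descriptionView!, requestButton!]
}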
I added a view controller in the storyboard, added a table view to it, and added a cell to the table. When I run the app and navigate to that page with VoiceOver on, swiping up or down with three fingers announces a sentence in English. Without changing the cell's accessibility, how can I make that three-finger-swipe announcement be spoken in Chinese?
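In case it helps frame the question, here is a minimal, untested sketch of one possible direction (the delegate method, the tableView outlet, the row counting, and the Chinese wording are all my assumptions, not a confirmed fix): post the page-scrolled notification with a Chinese string once the three-finger scroll settles, instead of relying on the default English announcement.

// Untested sketch (UIScrollViewDelegate) — announce the scroll position in Chinese.
func scrollViewDidEndDecelerating(_ scrollView: UIScrollView) {
    guard UIAccessibility.isVoiceOverRunning else { return }
    let firstVisibleRow = tableView.indexPathsForVisibleRows?.first?.row ?? 0
    let totalRows = tableView.numberOfRows(inSection: 0)
    UIAccessibility.post(notification: .pageScrolled,
                         argument: "第 \(firstVisibleRow + 1) 行，共 \(totalRows) 行")
}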
Dear Apple Support,
I am reporting a critical issue affecting parental control apps like my app, Choreio, which is live on the App Store.
When Screen Time settings are configured to require a parent’s password for changes, parents must log in on their child’s device to make any adjustments. This restriction is expected to extend to apps using the Screen Time API, such as Choreio.
However, I’ve discovered a significant bug: children can bypass this restriction by simply toggling off Choreio in the Screen Time settings—without needing the parent’s password. This effectively disables the app and defeats its purpose as a parental control tool.
Please address this issue as soon as possible to ensure the intended functionality of parental controls. Let me know if you need any additional information to assist with resolving this.
Thank you for your attention to this matter.
Best regards,
Jeff Houston
STEPS TO REPRODUCE
Here are the steps to reproduce the issue clearly:
Install Choreio from the App Store on the child’s phone.
Enable parental controls in Screen Time and set it to require the parent’s password for any changes to Screen Time settings.
Go to the Screen Time settings on the child’s phone.
Observe that the child can simply toggle off Choreio, effectively deactivating the app, without needing the parent’s password.
Expected behavior: Toggling off Choreio should require the parent’s password, just like it does for other Screen Time settings.
Let me know if additional details are needed!
Hello, my submission is based on haptics; without them the app doesn't make sense, and only a real iPhone can provide them. But it says that Xcode playgrounds will be tested on the Simulator.
Is that really the case? What can I do?
Thank you in advance!
I'm having an issue where the screen will not wake when a notification comes in. It makes a sound and vibrates, but the screen does not wake up. I'm also unable to change the wallpaper on the lock screen. Anyone else experiencing this?
I’m developing an ARKit application where I aim to attach procedurally generated audio to detected planes in the environment. While using a static audio file with SCNAudioSource and SCNAudioPlayer works as expected, integrating procedural audio via AVAudioSourceNode does not produce any sound, nor does it generate any error messages: Stack Overflow Post
Working Implementation with Static Audio File:
let audioPlayer = SCNAudioPlayer(source: audioSource)
node.addAudioPlayer(audioPlayer)
Attempted Implementation with Procedural Audio:
let audioNode = AVAudioSourceNode { _, _, frameCount, audioBufferList in
    // Audio generation code
    return noErr
}
let audioPlayer = SCNAudioPlayer(avAudioNode: audioNode)
node.addAudioPlayer(audioPlayer)
In this setup, the AVAudioSourceNode successfully generates audio when connected directly to an AVAudioEngine. However, when wrapped in an SCNAudioPlayer and attached to an SCNNode, it fails to produce any sound. In short, procedural audio from an AVAudioNode does not work as documented here:
Apple docs
Additionally, I explored the WWDC18 AR game project, SwiftShot, which utilizes SCNAudioPlayer(avAudioNode:). After updating it for the latest Xcode, the graphics function correctly, but the audio does not play. I also noted that the Apple documentation mentions an audioPlayerWithAVAudioNode: method, stating:
Using this initializer is typically not necessary. Instead, call the audioPlayerWithAVAudioNode: method, which returns a cached audio player object if one for the specified AVAudioNode object has already been created and is available for use.
However, this method does not appear to be available in Swift. Any insights or guidance on this matter would be greatly appreciated.
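For completeness, a minimal sketch of an ordering that might be worth trying (untested, and an assumption on my part; sceneView stands for the SCNView rendering the scene): attach the source node to the renderer's own audio engine before wrapping it in the SCNAudioPlayer, so the node already belongs to the engine SceneKit mixes with.

// Untested sketch — attach the node to SceneKit's engine before creating the player.
let engine = sceneView.audioEngine               // the engine SceneKit renders audio with
engine.attach(audioNode)                         // audioNode is the AVAudioSourceNode above
let audioPlayer = SCNAudioPlayer(avAudioNode: audioNode)
node.addAudioPlayer(audioPlayer)                 // node is the SCNNode on the detected plane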
I'm unable to connect CarPlay to a Mercedes GLC 2024 with my iPhone 15 Pro Max. I followed all of Apple's recommendations and updated the car's software too. It didn't work despite many attempts. Funnily enough, my iPhone 11 connects flawlessly, both with a cord and wirelessly.
However, my iPhone 15 Pro Max does connect to any other car, either with a cable or wirelessly.
Is this a bug or a real incompatibility?
Hello everyone,
I am experiencing random and untimely disconnection issues with my iStorage diskAshur external HDD/SSD while using macOS. I have already contacted Apple Support but unfortunately, they could not provide a solution and suggested I reach out to this forum for further assistance.
Has anyone else faced similar issues with the diskAshur on macOS? If so, what steps did you take to resolve the problem? Any help or advice would be greatly appreciated.
Thank you!
I was using a macOS 15.0 beta and tried to update to macOS 15.1. I clicked Update and entered my password, but then it just got stuck loading forever. How can I fix that?
Please update Accessibility OS Settings for VoiceOver in iPhone iOS and iPadOS to include frames on the Rotor, and to make web navigation and component gestures easier to find and assign. Please add content to the iPhone and iPad Apple User Guide to use VoiceOver in web navigation with touch gestures.
Specifically... iframes.
There is no clear guidance in Apple documentation for VoiceOver users in iPhone or iPadOS to access iframes with touch gestures. A common belief as written on AppleVis, other blogs, and internet searches is that iframes in Safari or a webView in an app are only available with explore by touch.
If explore by touch is the only option for some interactions, that needs to be stated in the Apple User Guides. If not, the equivalent touch gestures for VoiceOver features that have keyboard interactions on the Mac need to be made clear for users.
VoiceOver for Mac includes a default keyboard interaction of VO-Command-F in its extensive User Guide (https://support.apple.com/guide/voiceover/by-images-or-frames-mchlp2740/mac). A user can include a rotor option for web navigation for iframes.
VoiceOver for iPhone and iPad does not include a default swipe gesture assigned to frames. An option is not available for the Rotor.
While the iPhone User Guide notes that gestures can be customized (https://support.apple.com/guide/iphone/customize-gestures-and-keyboard-shortcuts-iph59a8e6fd2/18.0/ios/18.0), it is not clear that, to add this gesture, "Move to the next frame" is tucked into the advanced navigation commands in the VoiceOver accessibility settings. At least on my phone, the word "frame" was not searchable, despite the All Commands screen having a search bar.
My app does not automatically switch languages (voices) in VoiceOver when VoiceOver is on and the screen includes both English and Spanish content. Instead of switching to the correctly accented voice, the content is announced with whatever my manual Voices rotor setting is. I can manually switch the voice in the rotor to make the words intelligible, but my main concern is that language changes are not auto-detected even though that feature is turned on in my Settings.
VoiceOver does detect language changes in other apps, so I think there must be either misplaced or missing accessibilityLanguage strings somewhere in my app. Or is it more than that, for localization reasons?
I reached out to the Apple Accessibility team and was directed to open a ticket here, as my question is about the underlying code.
I am a novice developer and primarily an accessibility SME; I expect that when "detect languages" is on in the user's VoiceOver settings, the screen reader's voice will automatically switch to the correct language and accent. I recognize there is a problem but am not sure where the breakdown is. I would like guidance on how to fix it to relay to my teams.
https://developer.apple.com/documentation/objectivec/nsobject/1615192-accessibilitylanguage
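For reference, a minimal sketch of the per-element hint that the linked property provides (the example text is my own; it simply tells VoiceOver which language a given element's content is in, so the matching voice can be used even where auto-detection misses it):

// Untested sketch — mark a UIKit element's content as Spanish for VoiceOver.
let spanishLabel = UILabel()
spanishLabel.text = "Hola, ¿cómo estás?"
spanishLabel.accessibilityLanguage = "es-ES"     // BCP 47 language code for this element's content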
This issue keeps coming up again and again. I restarted my laptop and I have storage available, but I don't know why it keeps happening!!
Hello, we are a software development company in China. The underlying framework of our application is based on the XMPP protocol framework from the US open-source community. It is an international open-source project, and many Chinese software companies build their UI and additional features on top of this open-source framework. But when we upload our app to the App Store, we receive a rejection (4.3(a), app spam), because China does not currently have the capability to develop such a framework independently, so we have to acknowledge that our app uses the US open-source community's framework. Since many other Chinese companies also build on the same underlying open-source framework and upload apps based on it, our submission is flagged as 4.3(a) app spam as well. Our company has spent a great deal of effort and money on this app, and if apps built on US open-source frameworks are rejected, many Chinese companies will be unable to use those frameworks as a basis for development in the future. We kindly ask you to provide guidance for small businesses in China and advice on our future development. Thank you.
I watched the videos and a blog post and downloaded their projects, and there Core Spotlight works as expected.
I copied the code into an empty project and did the same as they did, but it still isn't working.
OS: macOS and iOS
On the Core Data entity I set an attribute to be indexed for Spotlight, and in the entity itself I put the attribute name into the Display Name for Spotlight.
// Note: the enclosing type declaration was missing from the snippet; assuming a class,
// since the completion handler captures self weakly.
final class PersistenceController {
    static let shared = PersistenceController()

    var spotlightDelegate: NSCoreDataCoreSpotlightDelegate?

    @MainActor
    static let preview: PersistenceController = {
        let result = PersistenceController(inMemory: true)
        let viewContext = result.container.viewContext
        for _ in 0..<10 {
            let newItem = Item(context: viewContext)
            newItem.timestamp = Date()
        }
        do {
            try viewContext.save()
        } catch {
            let nsError = error as NSError
            fatalError("Unresolved error \(nsError), \(nsError.userInfo)")
        }
        return result
    }()

    let container: NSPersistentContainer

    init(inMemory: Bool = false) {
        container = NSPersistentContainer(name: "SpotLightSearchTest")
        if inMemory {
            container.persistentStoreDescriptions.first!.url = URL(fileURLWithPath: "/dev/null")
        }
        container.loadPersistentStores(completionHandler: { [weak self] (storeDescription, error) in
            if let error = error as NSError? {
                fatalError("Unresolved error \(error), \(error.userInfo)")
            }
            if let description = self?.container.persistentStoreDescriptions.first {
                description.setOption(true as NSNumber, forKey: NSPersistentHistoryTrackingKey)
                description.type = NSSQLiteStoreType
                if let coordinator = self?.container.persistentStoreCoordinator {
                    self?.spotlightDelegate = NSCoreDataCoreSpotlightDelegate(
                        forStoreWith: description,
                        coordinator: coordinator
                    )
                    self?.spotlightDelegate?.startSpotlightIndexing()
                }
            }
        })
        container.viewContext.automaticallyMergesChangesFromParent = true
    }
}
In my @main app struct:
struct SpotLightSearchTestApp: App {
    let persistenceController = PersistenceController.shared

    var body: some Scene {
        WindowGroup {
            ContentView()
                .environment(\.managedObjectContext, persistenceController.container.viewContext)
                .onContinueUserActivity(CSSearchableItemActionType) { _ in
                    print("")
                }
        }
    }
}
onContinueUserActivity(CSSearchableItemActionType) { _ in
    print("")
}
never gets triggered. So what am I missing that they don't explain in the blog post or videos?
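One thing worth double-checking (a minimal sketch based on my own assumption, not a confirmed fix): the store description options and the Spotlight delegate are normally configured before loadPersistentStores runs, since options set inside the completion handler arrive after the store has already been loaded.

// Untested sketch — same class as above, with the setup moved ahead of store loading.
init(inMemory: Bool = false) {
    container = NSPersistentContainer(name: "SpotLightSearchTest")
    let description = container.persistentStoreDescriptions.first!
    if inMemory {
        description.url = URL(fileURLWithPath: "/dev/null")
    }
    // Spotlight indexing relies on persistent history tracking, and the option
    // has to be set before the store is loaded, not in the completion handler.
    description.setOption(true as NSNumber, forKey: NSPersistentHistoryTrackingKey)

    container.loadPersistentStores { _, error in
        if let error = error as NSError? {
            fatalError("Unresolved error \(error), \(error.userInfo)")
        }
    }

    spotlightDelegate = NSCoreDataCoreSpotlightDelegate(forStoreWith: description,
                                                        coordinator: container.persistentStoreCoordinator)
    spotlightDelegate?.startSpotlightIndexing()
    container.viewContext.automaticallyMergesChangesFromParent = true
}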
Hello, I have a question regarding the voice and sound recognition features on the iPhone 15 Pro.
The iPhone 15 Pro is equipped with four microphones, and I understand that for features like Apple’s sound recognition and when invoking Siri, the microphone(s) must always be active. My question is whether the device uses a single microphone (mono channel) for these functions or if multiple microphones are activated simultaneously.
I would appreciate clarification on how the microphones are utilized in sound and voice recognition features.
Thank you for your assistance.
Best regards.
I have a Twitter account that I registered with my Apple ID, and I still don't know the PIN. I'm having a problem with it because I don't know the PIN. I need help.
privaterelay.appleid.com
I frequently use my iPad to develop remotely in VSCode or prototype designs in Figma. This is currently all done in the browser. Given that many of these experiences rely heavily on the keyboard, I was hoping there would be a solution to make a keyboard persistent on the screen, or at least a few hot keys.
Is it possible for me to develop an accessibility tool that could stay persistent on the screen? Perhaps something that would talk with AssistiveTouch? Or is that in Apple's no-no square?
Hi guys,
I'm trying to add accessibility labels to a static text and custom SwiftUI views. Example:
MyView {
...
}
//.accessibilityElement()
.accessibilityElement(children: .combine)
//.accessibilityRemoveTraits(.isStaticText)
//.accessibilityAddTraits(.isButton)
.accessibilityLabel("ACCESSIBILITY LABEL")
.accessibilityHint("ACCESSIBILITY HINT")
When using the VoiceOver or Hover Text accessibility features, focus moves only between active elements and not to static elements.
When I add .focusable() it works, but I don't want to make those elements focusable when all accessibility features are off.
I suppose I could do something like this:
.focusable(UIApplication.shared.accessibility.voiceOver.isOn || UIApplication.shared.accessibility.hoverText.isOn)
Note: this is just pseudocode, because I don't remember exactly how to detect current accessibility settings.
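For what it's worth, a minimal sketch of what that pseudocode could look like with the real SwiftUI environment values (my own assumption; it only covers VoiceOver and Switch Control, as I am not aware of an environment key for Hover Text):

import SwiftUI

// Untested sketch — focusable only while an assistive feature is running.
struct AccessibleStaticText: View {
    @Environment(\.accessibilityVoiceOverEnabled) private var voiceOverEnabled
    @Environment(\.accessibilitySwitchControlEnabled) private var switchControlEnabled

    var body: some View {
        Text("Sci-Fi * 8pm - 10pm * Remaining 40 min. * Live")
            .focusable(voiceOverEnabled || switchControlEnabled)
            .accessibilityLabel("Sci-Fi, 8 to 10pm, Remaining 40 min. Broadcasting Live")
    }
}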
However, using focusable() with conditions on hundreds of static texts in an app seems like overkill. Accessibility focus is also needed on some control containers where we already have somewhat complex focus handling with conditions in focusable(...) on parent and child elements, so extending that for accessibility seems too complicated.
Is there a simple way to tell accessibility that an element is focusable specifically for Hover Text and for VoiceOver?
An example of what I want to accomplish for TV content:

VStack {
    HStack {
        Text("Terminator")
        if parentalLock {
            Image(named: .lock)
        }
    }
    .accessibilityLabel(for: hover, "Terminator - parental lock")
    Text("Sci-Fi * 8pm - 10pm * Remaining 40 min. * Live")
        .accessibilityLabel(for: hover, "Sci-Fi, 8 to 10pm, Remaining 40 min. Broadcasting Live")
}
.accessibilityLabel(for: voiceover, "Terminator, Sci-Fi, 8 to 10pm, Remaining 40 min. Broadcasting Live, parental lock")
I watched all the Accessibility WWDC videos from 2016, 2022, and 2024, and googled for several hours, but I couldn't find any solution for static texts and custom views. From those videos it appears .accessibilityLabel() should be enough, but it clearly works only on active elements and does not work for other SwiftUI views on tvOS without focusable().
Can this be done without using focusable() with conditions for detection which accessibility feature is on?
The problem with focusable() is that for accessibility I may need text read for the parent view, while focus needs to be placed on a child element. I remember problems where, when focusable() was set on the parent view, the child was no longer focusable, or something like that. Simply put: complications in the focus logic.
Thanks.
In VoiceOver, when using Group Navigation style, the cursor first focuses on the semantic group. To navigate inside the group, a two-finger swipe (left or right) can be used. This behavior works for default containers like the Navigation Bar, Tab Bar, and Tool Bar.
How can I achieve the same behavior for a custom view?
I tried setting accessibilityContainerType = .semanticGroup, but it only works for Mac Catalyst. Is there an equivalent approach for iOS?
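For reference, a minimal sketch of the UIKit property being described, as I understand the attempt (whether it actually drives VoiceOver group navigation on iOS, rather than only Mac Catalyst, is exactly the open question; the view and label are hypothetical):

// What was tried, per the post above — reported to work in Mac Catalyst only.
let customGroup = UIView()
customGroup.isAccessibilityElement = false
customGroup.accessibilityContainerType = .semanticGroup
customGroup.accessibilityLabel = "Playback controls"   // hypothetical group name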
Is it possible to spoof the MAC address on macOS Ventura 13.6.9?