Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.

All subtopics
Posts under Accessibility & Inclusion topic

Post · Replies · Boosts · Views · Activity

pairedUUIDsDidChangeNotification never fires, even with MFi hearing aids paired
Hi everyone — I’m implementing the new Hearing Device Support API described here: https://developer.apple.com/documentation/accessibility/hearing-device-support

I have MFi hearing aids paired and visible under Settings → Accessibility → Hearing Devices, and I’ve added the com.apple.developer.hearing.aid.app entitlement (and also tested with Wireless Accessory Configuration: https://developer.apple.com/documentation/bundleresources/entitlements/com.apple.external-accessory.wireless-configuration ).

com.apple.developer.hearing.aid.app xxxxx

However, the app won't even compile with this entitlement.

Problem
NotificationCenter.default.addObserver(...) for pairedUUIDsDidChangeNotification never fires — not on app launch, not after pairing/unpairing, and not after reconnecting the hearing aids. Because the notification never triggers, calls like HearingDeviceSession.shared.pairedDevices always return an empty list.

What I expected
According to the docs, the notification should be posted whenever paired device UUIDs change, and the session should expose those devices — but nothing happens.

Questions
Does the hearing.aid.app entitlement require special approval from Apple beyond adding it to the entitlements file?
Is there a way to verify that iOS is actually honoring this entitlement?
Has anyone successfully received this notification on a real device?

Any help or confirmation would be greatly appreciated.
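For reference, here is a minimal sketch (editor's addition) of the observer wiring described above. The HearingDeviceSession type and the pairedUUIDsDidChangeNotification name are taken from the post itself and are not otherwise verified against the shipping SDK:

import Foundation

// Type and notification names mirror the post; treat them as assumptions, not confirmed API.
final class HearingDeviceObserver {
    private var token: NSObjectProtocol?

    func start() {
        token = NotificationCenter.default.addObserver(
            forName: HearingDeviceSession.pairedUUIDsDidChangeNotification,
            object: nil,
            queue: .main
        ) { _ in
            // Expected to fire whenever the set of paired hearing device UUIDs changes.
            print("Paired devices:", HearingDeviceSession.shared.pairedDevices)
        }
    }

    deinit {
        if let token { NotificationCenter.default.removeObserver(token) }
    }
}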
Replies: 1 · Boosts: 0 · Views: 583 · Dec ’25
VoiceOver is not respecting lang in HTML option
I have an HTML select that has Spanish text in the options. When VoiceOver reads the selected option (unopened), it switches to Spanish as expected. However, when you open the select box and browse through the options, it uses the English voice to read the Spanish text. I have tried adding lang to both the select tag and the option tag, but neither helps: https://codepen.io/grahamfowles/pen/VYYRxMK
Replies: 0 · Boosts: 0 · Views: 139 · May ’25
Frames rotated and shifted in landscape for iOS simulator
When I try to get the frames of an AXUIElementRef using AXUIElementCopyAttributeValue(element, (CFStringRef)attribute, &result), the frames are shifted and rotated on the iOS simulator. I get the same (incorrect) frames from the Accessibility Inspector when the Mac is selected as the host; when I switch the host to the iOS simulator, the frames are correct. How is the Accessibility Inspector getting the correct frames, and how can I do the same in my app?
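A small sketch (editor's addition) of the position and size reads being discussed, assuming a macOS host talking to the accessibility API via ApplicationServices; it does not resolve the coordinate-space question, but shows one common way to assemble an element's frame:

import ApplicationServices

// Reads kAXPositionAttribute and kAXSizeAttribute from an element and combines them into a CGRect.
func frame(of element: AXUIElement) -> CGRect? {
    var positionRef: CFTypeRef?
    var sizeRef: CFTypeRef?
    guard AXUIElementCopyAttributeValue(element, kAXPositionAttribute as CFString, &positionRef) == .success,
          AXUIElementCopyAttributeValue(element, kAXSizeAttribute as CFString, &sizeRef) == .success else {
        return nil
    }
    var origin = CGPoint.zero
    var size = CGSize.zero
    guard AXValueGetValue(positionRef as! AXValue, .cgPoint, &origin),
          AXValueGetValue(sizeRef as! AXValue, .cgSize, &size) else {
        return nil
    }
    return CGRect(origin: origin, size: size)
}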
Replies: 1 · Boosts: 0 · Views: 133 · Jun ’25
BLE Device Not Appearing in Scan List on iOS After Name Change
I'm encountering an issue related to BLE device discovery on iOS. I have a BLE peripheral device that I initially connected to using an iOS device. After this connection, the BLE device's advertised name was programmatically changed by the peripheral. Now, when I try to scan for this device using other iOS devices, it does not appear in the scan results in most apps — including nRF Connect and our own custom BLE app that uses CoreBluetooth.

A few observations:
The device is definitely powered on and advertising (confirmed via Android).
The name change is reflected correctly on Android and on the iOS device that originally connected to it.
Other iOS devices no longer see the device in their scan list.
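One detail worth checking, shown in a hedged CoreBluetooth sketch (editor's addition): peripheral.name can come from iOS's cache of the old GAP name, while CBAdvertisementDataLocalNameKey reflects what the peripheral is currently advertising. Logging both during an unfiltered scan shows whether the device is being discovered at all:

import CoreBluetooth

final class ScanLogger: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        // No service UUID filter, so any advertising peripheral should be reported.
        central.scanForPeripherals(withServices: nil,
                                   options: [CBCentralManagerScanOptionAllowDuplicatesKey: true])
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        let cachedName = peripheral.name ?? "nil" // may be a stale, cached name
        let advertisedName = advertisementData[CBAdvertisementDataLocalNameKey] as? String ?? "nil"
        print("cached: \(cachedName), advertised: \(advertisedName), RSSI: \(RSSI)")
    }
}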
Replies: 1 · Boosts: 0 · Views: 350 · Jul ’25
Defining boundaries of inline dialogs for VO users
Hello, I had submitted a question to clarify which components have accessibility APIs that trigger haptics for VoiceOver users: https://developer.apple.com/forums/thread/773182. That question stems from a more direct question about specific components: do tablists and disclosures natively include haptics, a screen reader hint, or other state or properties that indicate to screen reader users where the component begins or ends? In some web experiences there is screen reader hint text stating "end of..." or "entering" as a way to define the boundaries of these inline dialogs. I asked about haptics in the prior thread because I do not recall a natively implemented version of this apart from some haptic cues, but I have not experienced them consistently, so I am not sure whether that is an intended native Swift implementation or perhaps something custom.
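As a point of comparison for the custom approach mentioned above, here is a small UIKit sketch (editor's addition, with hypothetical strings) of attaching an "end of" style hint to the element that closes such a region. This is not a claim that tab lists or disclosures do this natively:

import UIKit

// Hypothetical helper: the last element inside an inline disclosure announces a boundary hint.
func markEndOfRegion(_ lastElement: UIView, regionName: String) {
    lastElement.isAccessibilityElement = true
    lastElement.accessibilityHint = "End of \(regionName)" // spoken after the label, like the web pattern described above
}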
Replies: 0 · Boosts: 0 · Views: 131 · May ’25
MAS restrictions on file read-write for desktop Electron apps
We have an Electron app developed for Mac. We would like to restore the user data previously saved in Downloads once the user installs the app from the store and launches it for the first time. But MAS has restrictions around "com.apple.security.files.downloads.read-write". We have enabled the user access in the entitlements file and request user permission before access. What options can be used to auto-restore the data from Downloads?
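On the native side, a common sandbox-friendly pattern is to let the user pick the folder once and persist access with a security-scoped bookmark; the hedged Swift sketch below (editor's addition) shows that pattern, which an Electron app would need to reach through a native module rather than plain Node file APIs:

import AppKit

// Assumes the com.apple.security.files.user-selected.read-write and
// com.apple.security.files.bookmarks.app-scope entitlements are present.
func requestDownloadsAccessAndSaveBookmark() {
    let panel = NSOpenPanel()
    panel.canChooseDirectories = true
    panel.canChooseFiles = false
    panel.directoryURL = FileManager.default.urls(for: .downloadsDirectory, in: .userDomainMask).first
    guard panel.runModal() == .OK, let url = panel.url else { return }

    // A security-scoped bookmark keeps the user's grant valid across launches.
    if let bookmark = try? url.bookmarkData(options: .withSecurityScope,
                                            includingResourceValuesForKeys: nil,
                                            relativeTo: nil) {
        UserDefaults.standard.set(bookmark, forKey: "downloadsBookmark")
    }
}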
Replies: 0 · Boosts: 0 · Views: 96 · Apr ’25
Camera Crashes
Hi everybody, I'm trying to build a QR code scanner and generator app for iOS. Whenever I try to implement the camera, the app crashes with this message: "This app has crashed because it attempted to access privacy-sensitive data without a usage description. The app's Info.plist must contain an NSCameraUsageDescription key with a string value explaining to the user how the app uses this data." I tried to reduce the app to the minimum of nothing but the camera, with the same result. Any ideas? Thank you and best regards, Horst Schippers
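A minimal sketch (editor's addition) of guarding camera setup on the usage description and requesting access up front. The fix itself is adding an NSCameraUsageDescription string to the target's Info.plist; the code only makes a missing key obvious instead of crashing:

import AVFoundation

func startCameraIfAuthorized(configure: @escaping () -> Void) {
    // The crash in the post means this key is absent from Info.plist.
    guard Bundle.main.object(forInfoDictionaryKey: "NSCameraUsageDescription") != nil else {
        print("Add NSCameraUsageDescription to Info.plist before touching the camera")
        return
    }
    AVCaptureDevice.requestAccess(for: .video) { granted in
        guard granted else { return }
        DispatchQueue.main.async {
            configure() // safe to set up the AVCaptureSession for QR scanning here
        }
    }
}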
Replies: 0 · Boosts: 0 · Views: 72 · Apr ’25
iOS 18.3.1+ widget: loading a local color image makes the widget crash
Environment: Xcode 16.2, WidgetKit.

Image(uiImage: UIImage(named: "jp_jump")!).resizable().scaledToFit().frame(width: 58, height: 16).padding(EdgeInsets(top: 0, leading: 16, bottom: 0, trailing: 0))

"jp_jump" is a local color image; loading it makes the widget crash.

Crash info: Thread 4: EXC_RESOURCE (RESOURCE_TYPE_MEMORY: high watermark memory limit exceeded) (limit=30 MB)
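A hedged sketch (editor's addition) of one common mitigation for this crash: downscale the image before handing it to the widget so the decoded bitmap stays well under the 30 MB extension memory limit reported above. The asset name and target size are the ones from the post:

import UIKit

// Renders the asset into a small bitmap so the widget never holds the full-size decode in memory.
func downscaledImage(named name: String, to targetSize: CGSize) -> UIImage? {
    guard let original = UIImage(named: name) else { return nil }
    let renderer = UIGraphicsImageRenderer(size: targetSize)
    return renderer.image { _ in
        original.draw(in: CGRect(origin: .zero, size: targetSize))
    }
}

// In the widget view, roughly:
// Image(uiImage: downscaledImage(named: "jp_jump", to: CGSize(width: 58, height: 16)) ?? UIImage())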
Replies: 3 · Boosts: 0 · Views: 132 · Mar ’25
Verification error: unable to get local issuer certificate
C:\Users\xjc>openssl s_client -connect gateway.push.apple.com:2195 -showcerts Connecting to 17.188.183.32 CONNECTED(000000AC) depth=1 C=US, O=Entrust, Inc., OU=See www/legal-terms, OU=(c) 2012 Entrust, Inc. - for authorized use only, CN=Entrust Certification Authority - L1K verify error:num=20:unable to get local issuer certificate verify return:1 depth=0 C=US, ST=California, L=Cupertino, O=Apple Inc., CN=gateway.push.apple.com verify return:1 B0640000:error:0A000410:SSL routines:ssl3_read_bytes:ssl/tls alert handshake failure:ssl\record\rec_layer_s3.c:908:SSL alert number 40 Certificate chain 0 s:C=US, ST=California, L=Cupertino, O=Apple Inc., CN=gateway.push.apple.com i:C=US, O=Entrust, Inc., OU=See www/legal-terms, OU=(c) 2012 Entrust, Inc. - for authorized use only, CN=Entrust Certification Authority - L1K a:PKEY: rsaEncryption, 2048 (bit); sigalg: RSA-SHA256 v:NotBefore: Aug 16 21:34:09 2024 GMT; NotAfter: Aug 15 21:34:07 2025 GMT -----BEGIN CERTIFICATE----- MIIGqDCCBZCgAwIBAgIQCUjuxVwL1mhSlrjSSk/+BzANBgkqhkiG9w0BAQsFADCB WnKd+td/wZ6Ej6EB mDF8JCSKz/ck+NnLfGM0jFdcTCl8dKuqM9XetP4ls1sVyUuLM7sJiQvMVDzluZ22 LA9EMc5ZcbdV96ZpKS3ETk5n7355fyVX+jZ24ZvfhtdyPvdUGuHzcrK/YfB0AsjY hIhXgkxMfqJDjj7Af1CDPSAv9cylGI5b9v5QX93pM8uGxSRZTGS5m4qJG0Jj4UpV QlzppFg+qE41yDrdy4rLxROW4bp/HPvEjo1YoAle3K208UMffVPBqGfZqbZ01+hP gHCeamBb6QlV2Zq6q/VEKUO6p6oFQnI0phQiAQ== -----END CERTIFICATE----- 1 s:C=US, O=Entrust, Inc., OU=See www/legal-terms, OU=(c) 2012 Entrust, Inc. - for authorized use only, CN=Entrust Certification Authority - L1K i:C=US, O=Entrust, Inc., OU=See www/legal-terms, OU=(c) 2009 Entrust, Inc. - for authorized use only, CN=Entrust Root Certification Authority - G2 a:PKEY: rsaEncryption, 2048 (bit); sigalg: RSA-SHA256 v:NotBefore: Oct 5 19:13:56 2015 GMT; NotAfter: Dec 5 19:43:56 2030 GMT -----BEGIN CERTIFICATE----- MIIFDjCCA/agAwIBAgIMDulMwwAAAABR03eFMA0GCSqGSIb3DQEBCwUAMIG+MQsw CQYDVQQGEwJVUzEWMBQGA1UEChMNRW50cnVzdCwgSW5jLjEoMCYGA1UECxMfU2Vl IHd3dy5lbnRydXN0Lm5ldC9sZWdhbC10ZXJtczE5MDcGA1UECxMwKGMpIDIwMDkg RW50cnVzdCwgSW5jLiAtIGZvciBhdXRob3JpemVkIHVzZSBvbmx5MTIwMAYDVQQD EylFbnRydXN0IFJvb3QgQ2VydGlmaWNhdGlvbiBBdXRob3JpdHkgLSBHMjAeFw0x NTEwMDUxOTEzNTZaFw0zMDEyMDUxOTQzNTZaMIG6MQswCQYDVQQGEwJVUzEWMBQG A1UEChMNRW50cnVzdCwgSW5jLjEoMCYGA1UECxMfU2VlIHd3dy5lbnRydXN0Lm5l dC9sZWdhbC10ZXJtczE5MDcGA1UECxMwKGMpIDIwMTIgRW50cnVzdCwgSW5jLiAt IGZvciBhdXRob3JpemVkIHVzZSBvbmx5MS4wLAYDVQQDEyVFbnRydXN0IENlcnRp ZmljYXRpb24gQXV0aG9yaXR5IC0gTDFLMIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8A MIIBCgKCAQEA2j+W0E25L0Tn2zlem1DuXKVh2kFnUwmqAJqOV38pa9vH4SEkqjrQ jUcj0u1yFvCRIdJdt7hLqIOPt5EyaM/OJZMssn2XyP7BtBe6CZ4DkJN7fEmDImiK m95HwzGYei59QAvS7z7Tsoyqj0ip/wDoKVgG97aTWpRzJiatWA7lQrjV6nN5ZGhT JbiEz5R6rgZFDKNrTdDGvuoYpDbwkrK6HIiPOlJ/915tgxyd8B/lw9bdpXiSPbBt LOrJz5RBGXFEaLpHPATpXbo+8DX3Fbae8i4VHj9HyMg4p3NFXU2wO7GOFyk36t0F ASK7lDYqjVs1/lMZLwhGwSqzGmIdTivZGwIDAQABo4IBDDCCAQgwDgYDVR0PAQH/ BAQDAgEGMBIGA1UdEwEB/wQIMAYBAf8CAQAwMwYIKwYBBQUHAQEEJzAlMCMGCCsG AQUFBzABhhdodHRwOi8vb2NzcC5lbnRydXN0Lm5ldDAwBgNVHR8EKTAnMCWgI6Ah hh9odHRwOi8vY3JsLmVudHJ1c3QubmV0L2cyY2EuY3JsMDsGA1UdIAQ0MDIwMAYE VR0gADAoMCYGCCsGAQUFBwIBFhpodHRwOi8vd3d3LmVudHJ1c3QubmV0L3JwYTAd BgNVHQ4EFgQUgqJwdN28Uz/Pe9T3zX+nYMYKTL8wHwYDVR0jBBgwFoAUanImetAe 733nO2lR1GyNn5ASZqswDQYJKoZIhvcNAQELBQADggEBADnVjpiDYcgsY9NwHRkw y/YJrMxp1cncN0HyMg/vdMNY9ngnCTQIlZIv19+4o/0OgemknNM/TWgrFTEKFcxS BJPok1DD2bHi4Wi3Ogl08TRYCj93mEC45mj/XeTIRsXsgdfJghhcg85x2Ly/rJkC k9uUmITSnKa1/ly78EqvIazCP0kkZ9Yujs+szGQVGHLlbHfTUqi53Y2sAEo1GdRv c6N172tkw+CNgxKhiucOhk3YtCAbvmqljEtoZuMrx1gL+1YQ1JH7HdMxWBCMRON1 
exCdtTix9qrKgWRs6PLigVWXUX/hwidQosk8WwBD9lu51aX8/wdQQGcHsFXwt35u Lcw= -----END CERTIFICATE----- Server certificate subject=C=US, ST=California, L=Cupertino, O=Apple Inc., CN=gateway.push.apple.com issuer=C=US, O=Entrust, Inc., OU=See www/legal-terms, OU=(c) 2012 Entrust, Inc. - for authorized use only, CN=Entrust Certification Authority - L1K Acceptable client certificate CA names C=US, O=Apple Inc., OU=Apple Certification Authority, CN=Apple Root CA CN=Apple Worldwide Developer Relations Certification Authority, OU=G4, O=Apple Inc., C=US CN=Apple Application Integration 2 Certification Authority, OU=Apple Certification Authority, O=Apple Inc., C=US CN=Apple Corporate Authentication CA 1, OU=Certification Authority, O=Apple Inc., C=US C=US, O=Apple Inc., OU=Apple Worldwide Developer Relations, CN=Apple Worldwide Developer Relations Certification Authority CN=Apple Corporate Root CA, OU=Certification Authority, O=Apple Inc., C=US C=US, O=Apple Inc., OU=Apple Certification Authority, CN=Apple Application Integration Certification Authority C=US, ST=California, L=Cupertino, O=Apple Inc., CN=gateway.push.apple.com Client Certificate Types: RSA sign, ECDSA sign Requested Signature Algorithms: ECDSA+SHA256:RSA-PSS+SHA256:RSA+SHA256:ECDSA+SHA384:RSA-PSS+SHA384:RSA+SHA384:RSA-PSS+SHA512:RSA+SHA512:RSA+SHA1 Shared Requested Signature Algorithms: ECDSA+SHA256:RSA-PSS+SHA256:RSA+SHA256:ECDSA+SHA384:RSA-PSS+SHA384:RSA+SHA384:RSA-PSS+SHA512:RSA+SHA512 SSL handshake has read 4138 bytes and written 687 bytes Verification error: unable to get local issuer certificate New, SSLv3, Cipher is AES128-SHA Protocol: TLSv1.2 Server public key is 2048 bit Secure Renegotiation IS supported Compression: NONE Expansion: NONE No ALPN negotiated SSL-Session: Protocol : TLSv1.2 Cipher : AES128-SHA Session-ID: Session-ID-ctx: Master-Key: D504C13BDBC59CDF3B883D1B626FA2B59000754DED57CD77A72F761A52AEED719DA06C100FBA1430BB9D8DECFC7C9307 PSK identity: None PSK identity hint: None SRP username: None Start Time: 1741092949 Timeout : 7200 (sec) Verify return code: 20 (unable to get local issuer certificate) Extended master secret: yes
Replies: 1 · Boosts: 0 · Views: 544 · Mar ’25
Attaching procedural audio to an ARKit SCNNode
I’m developing an ARKit application where I aim to attach procedurally generated audio to detected planes in the environment. While using a static audio file with SCNAudioSource and SCNAudioPlayer works as expected, integrating procedural audio via AVAudioSourceNode does not produce any sound, nor does it generate any error messages: Stack Overflow Post

Working implementation with a static audio file:
let audioPlayer = SCNAudioPlayer(source: audioSource)
node.addAudioPlayer(audioPlayer)

Attempted implementation with procedural audio:
// Audio generation code
let audioPlayer = SCNAudioPlayer(avAudioNode: audioNode)
node.addAudioPlayer(audioPlayer)

In this setup, the AVAudioSourceNode successfully generates audio when connected directly to an AVAudioEngine. However, when used with SCNAudioPlayer and attached to an SCNNode, it fails to produce sound. What doesn’t work is creating some procedural audio with an AVAudioNode, as documented here: Apple docs

Additionally, I explored the WWDC18 AR game project, SwiftShot, which utilizes SCNAudioPlayer(avAudioNode:). After updating it for the latest Xcode, the graphics function correctly, but the audio does not play.

I also noted that the Apple documentation mentions an audioPlayerWithAVAudioNode: method, stating: "Using this initializer is typically not necessary. Instead, call the audioPlayerWithAVAudioNode: method, which returns a cached audio player object if one for the specified AVAudioNode object has already been created and is available for use." However, this method does not appear to be available in Swift.

Any insights or guidance on this matter would be greatly appreciated.
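For reference, a minimal editor's sketch of a procedural AVAudioSourceNode (a plain sine tone) standing in as a hypothetical example of the elided "audio generation code", wired up the way the post describes. Whether SCNAudioPlayer(avAudioNode:) then routes it audibly is exactly the open question:

import AVFoundation
import SceneKit

// Hypothetical sine generator; any AVAudioSourceNode render block would do.
func makeToneNode(frequency: Double = 440, sampleRate: Double = 44_100) -> AVAudioSourceNode {
    var phase: Double = 0
    let increment = 2 * Double.pi * frequency / sampleRate
    return AVAudioSourceNode { _, _, frameCount, audioBufferList -> OSStatus in
        let buffers = UnsafeMutableAudioBufferListPointer(audioBufferList)
        for frame in 0..<Int(frameCount) {
            let sample = Float(sin(phase))
            phase += increment
            for buffer in buffers {
                buffer.mData?.assumingMemoryBound(to: Float.self)[frame] = sample
            }
        }
        return noErr
    }
}

// let audioPlayer = SCNAudioPlayer(avAudioNode: makeToneNode())
// node.addAudioPlayer(audioPlayer)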
Replies: 0 · Boosts: 0 · Views: 212 · Apr ’25
Please consider having Name Recognition in a shortcut automation
Request: Name Recognition → Shortcut for SOS Flashlight + Vibration

Right now, iOS Name Recognition works, but all I can do is flash the tiny notification light. It would be much more useful if Name Recognition could trigger a Shortcut. That way, I could set it to flash the flashlight in an SOS pattern and vibrate, making the alert impossible to miss. I tried using Custom Alarm, but it won’t let me record my spoken name, so it doesn’t really solve the problem. If Apple allowed Name Recognition to trigger Shortcuts — or expanded “Custom” to support names/words — this would open up far more practical, real-world alerts.
Replies: 1 · Boosts: 0 · Views: 611 · Sep ’25
VoiceOver Text Recognition Announcing Hidden Labels
I have a UIImageView as the background of a custom UIView subclass. The image itself does not contain any text. On top of this image view, I have added two UILabels. To improve accessibility, I converted the entire view into a single accessibility element and set a proper accessibilityLabel. Additionally, I disabled accessibility for the UIImageView and the labels by setting isAccessibilityElement = false.

However, when VoiceOver's Accessibility Recognition → Text Recognition feature is enabled, VoiceOver still detects and announces the text inside the UILabels at the end, after reading my custom accessibility properties. This text should not be announced. It seems that VoiceOver treats the UILabel content as part of the UIImageView. Additionally, when using the Explore Image rotor action, the entire subview is recognized as a single image.

Is this the expected behavior? If so, is there a way to disable VoiceOver’s text recognition for this view while keeping custom accessibility intact?

class BackgroundLabelView: UIView {
    private let backgroundImageView = UIImageView()
    private let backgroundImageView2 = UIImageView()
    private let titleLabel = UILabel()
    private let subtitleLabel = UILabel()

    override init(frame: CGRect) {
        super.init(frame: frame)
        setupView()
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        setupView()
        configureAccessibility()
    }

    private func configureAccessibility() {
        backgroundImageView.isAccessibilityElement = false
        backgroundImageView2.isAccessibilityElement = false
        titleLabel.isAccessibilityElement = false
        subtitleLabel.isAccessibilityElement = false
        isAccessibilityElement = true
        accessibilityTraits = .button
    }

    func configure(backgroundImage: UIImage?, title: String, subtitle: String) {
        backgroundImageView.image = backgroundImage
        titleLabel.text = title
        subtitleLabel.text = subtitle
        accessibilityLabel = "Holiday Offer ," + title + "," + subtitle
    }

    private func setupView() {
        backgroundImageView2.contentMode = .scaleAspectFill
        backgroundImageView2.clipsToBounds = true
        backgroundImageView2.translatesAutoresizingMaskIntoConstraints = false
        backgroundImageView2.image = UIImage(resource: .bannerfestival)
        addSubview(backgroundImageView2)

        backgroundImageView.contentMode = .scaleAspectFit
        backgroundImageView.clipsToBounds = true
        backgroundImageView.translatesAutoresizingMaskIntoConstraints = false
        addSubview(backgroundImageView)

        titleLabel.font = UIFont.systemFont(ofSize: 18, weight: .bold)
        titleLabel.textColor = .white
        titleLabel.translatesAutoresizingMaskIntoConstraints = false
        titleLabel.numberOfLines = 0
        addSubview(titleLabel)

        subtitleLabel.font = UIFont.systemFont(ofSize: 14, weight: .regular)
        subtitleLabel.textColor = .white.withAlphaComponent(0.8)
        subtitleLabel.translatesAutoresizingMaskIntoConstraints = false
        subtitleLabel.numberOfLines = 0
        addSubview(subtitleLabel)

        NSLayoutConstraint.activate([
            backgroundImageView2.leadingAnchor.constraint(equalTo: leadingAnchor),
            backgroundImageView2.trailingAnchor.constraint(equalTo: trailingAnchor),
            backgroundImageView2.heightAnchor.constraint(equalToConstant: 200),
            backgroundImageView.centerYAnchor.constraint(equalTo: centerYAnchor),
            backgroundImageView.topAnchor.constraint(equalTo: topAnchor),
            backgroundImageView.leadingAnchor.constraint(greaterThanOrEqualTo: leadingAnchor),
            backgroundImageView.trailingAnchor.constraint(equalTo: trailingAnchor),
            backgroundImageView.bottomAnchor.constraint(equalTo: bottomAnchor),
            titleLabel.leadingAnchor.constraint(equalTo: leadingAnchor, constant: 16),
            titleLabel.trailingAnchor.constraint(lessThanOrEqualTo: centerXAnchor),
            titleLabel.bottomAnchor.constraint(equalTo: centerYAnchor, constant: -4),
            subtitleLabel.leadingAnchor.constraint(equalTo: leadingAnchor, constant: 16),
            subtitleLabel.trailingAnchor.constraint(lessThanOrEqualTo: centerXAnchor),
            subtitleLabel.topAnchor.constraint(equalTo: centerYAnchor, constant: 4)
        ])
    }

    override func layoutSubviews() {
        super.layoutSubviews()
        backgroundImageView.layer.cornerRadius = layer.cornerRadius
    }
}
Replies: 2 · Boosts: 0 · Views: 145 · Apr ’25
AirPlay connection to a large monitor
I created a desktop app for Mac using Xojo. The app has a controller in the main window and displays advertisements and notices on a connected external display. I'm currently connecting my 24-inch iMac to a REGZA 55M550M via AirPlay and displaying video from the iMac on the REGZA, but the connection occasionally drops. Yesterday, the connection dropped about 3.5 hours after connecting. I do have other apps running on the iMac, but nothing that would put a strain on the network or memory. Do AirPlay connections to non-Apple products become unstable over long periods of time?
Replies: 0 · Boosts: 0 · Views: 665 · Sep ’25
VoiceOver for Accessibility Labels with Localization
Hello! I'm adding VoiceOver support for my app, but I'm having an issue where my accessibility value is not being spoken. I have made a helper class that creates an NSString from a double and converts it to the user's region currency.

CurrencyFormatter.m
+ (NSString *) localizedCurrencyStringFromDouble: (double) value {
    NSNumberFormatter *formatter = [[NSNumberFormatter alloc] init];
    formatter.numberStyle = NSNumberFormatterCurrencyStyle;
    formatter.locale = [NSLocale currentLocale];
    NSString *currencyString = [formatter stringFromNumber: @(value)];
    [formatter release];
    return currencyString;
}

View Controller
self.checkTotalLabel.accessibilityLabel = NSLocalizedString(@"Total Amount", @"Accessibility Label for Total");
self.checkTotalLabel.accessibilityValue = [CurrencyFormatter localizedCurrencyStringFromDouble: total];

I'm confused about whether the value should go into the accessibility label or not. When the currency is just USD and the language is English, it's a simple fix. But when the currency needs to be converted, I'm not sure where to go from here. If anyone has any guidance, it would help me a lot! Thank you!
Replies: 1 · Boosts: 0 · Views: 759 · Jul ’25
SwiftUI tvOS Accessibility VoiceOver - prevent reading all items in ScrollView over and over
Hi, I'm trying to fix a tvOS view for the VoiceOver accessibility feature:

TabView { // 5 tabs
    Text(title)
    Button(play)
    ScrollView { // Live
        LazyHStack { 200 items }
    }
    ScrollView { // Continue watching
        LazyHStack { 500 items }
    }
}

When the view shows up, VoiceOver reads: "Home tab 1 of 5, Item 2". Not sure why it reads Item 2 of the first cell in the scroll view; maybe because it just got loaded by LazyHStack. VoiceOver should only read "Home tab 1 of 5".
When moving focus to the scroll view it reads: "Live, Item 1" and after a slight delay "Item 1, Item 2, Item 3, Item 4".
When moving focus to the second item it reads: "Item 2" and after a slight delay "Item 1, Item 2, Item 3, Item 4".
When moving focus to the third item it reads: "Item 3" and after a slight delay "Item 1, Item 2, Item 3, Item 4".

It should just read what is focused, ideally "Live, Item 1, 1 of 200", then after moving focus to item 2 "Item 2, 2 of 200", this time without the word "Live" because we are on the same scroll view (the same horizontal list).

Currently the app is unusable. We have visually impaired testers, and this rotor reading everything on the screen is totally confusing, because users don't know where they are and what is actually focused. This is a video streaming app and we are streaming all the time, even on the home page in the background; binge play runs one item after another, usually there is a never-ending live stream playing, and the user can switch the TV channel, but we continue to play. VoiceOver should only read what's focused after user interaction. The original Apple TV app does not do that, so it cannot be caused by some verbose accessibility setting; it reads only the focused item in scrolling lists.

How do I disable reading content that is not focused? I tried:
.accessibilityLabel(isFocused ? title : "")
.accessibilityHidden(!isFocused)
.accessibilityHidden(true) - tried on various levels in the view hierarchy
.accessibilityElement(children: .ignore) - even the focused item is not read back by VoiceOver
.accessibilityElement(children: .contain) - tried on various levels in the view hierarchy
.accessibilityElement(children: .combine) - tried on various levels in the view hierarchy
.accessibilityAddTraits(.isHeader) - tried on various levels in the view hierarchy
.accessibilityRemoveTraits(.isHeader) - tried on various levels in the view hierarchy (the last two were basically an attempt to hack it)
.accessibilityRotor("", ranges: []) - another hack that I tried on ScrollView, LazyHStack, and also on the top-level view
plus 50+ other attempts at configuring accessibility tags attached to views.

I have seen all the accessibility videos and tried all the sample code projects; I haven't found a solution anywhere, an internet search didn't find anything, and AI didn't help as it can only provide code that someone else wrote before. Any idea how to fix this? Thanks.
Replies: 1 · Boosts: 0 · Views: 141 · Apr ’25
Autonomous Single App Mode (ASAM) in macOS
Hello, I tried implementing ASAM for macOS per the Apple guidelines, with the configuration profile mentioned here, but didn't have any success. Apple then suggested using requestGuidedAccessSession on macOS, but that is only supported via Mac Catalyst, and it also didn't work even with valid configuration profiles. Did anyone get ASAM mode working without the assessment entitlement?
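For what it's worth, a minimal sketch (editor's addition) of how the Mac Catalyst call is usually wired up. Whether it succeeds also depends on the device being managed and the app being allowed for autonomous single app mode in the configuration profile, which the code itself cannot influence:

import UIKit

// Asks the system to enter (or leave) a guided access / single app mode session.
func setSingleAppMode(_ enabled: Bool) {
    UIAccessibility.requestGuidedAccessSession(enabled: enabled) { success in
        print("ASAM request \(enabled ? "start" : "end") succeeded: \(success)")
    }
}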
Replies: 0 · Boosts: 0 · Views: 263 · Jul ’25