Explore best practices for creating inclusive apps that cater to users with diverse abilities


Posts under General subtopic

Post · Replies · Boosts · Views · Activity

Wi-Fi connectivity issue: captive.apple.com returns “application/octet-stream” instead of “text/html”
In our system, when a user enables a mobile hotspot and the system connects to it, we verify Wi-Fi availability by sending an HTTP GET request to http://captive.apple.com. Normally the server returns HTTP status 200 (OK) with Content-Type: text/html, and we have always treated this as a sign of normal connectivity.

Issue: Since last Friday, the server sometimes responds with Content-Type: application/octet-stream. When this occurs, our system decides that the network is unavailable and displays a connection warning (a “!” icon).

Question: Has Apple recently made any backend or CDN configuration changes to captive.apple.com that could affect the response type? Any advice on how we can solve this problem? Thanks!
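A more change-tolerant probe, sketched here as an assumption rather than Apple's documented contract (probeConnectivity is a hypothetical helper), is to key off the HTTP status and the well-known "Success" body that captive.apple.com serves, instead of the Content-Type header that has started to vary:

import Foundation

/// Sketch of a connectivity probe that tolerates Content-Type changes.
/// Assumes captive.apple.com still returns a 200 response whose body
/// contains "Success" when the network is fully available. Note that
/// this plain-HTTP URL needs an App Transport Security exception.
func probeConnectivity(completion: @escaping (Bool) -> Void) {
    let url = URL(string: "http://captive.apple.com")!
    let task = URLSession.shared.dataTask(with: url) { data, response, error in
        guard error == nil,
              let http = response as? HTTPURLResponse,
              http.statusCode == 200,
              let data = data,
              let body = String(data: data, encoding: .utf8)
        else {
            completion(false)  // transport error or unexpected status
            return
        }
        // Ignore Content-Type entirely; match on the body instead.
        completion(body.contains("Success"))
    }
    task.resume()
}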
2 replies · 0 boosts · 465 views · 2d
Unable to Accept Invite
I am getting this issue when trying to accept an invite to a new test version of our app: "Unable to Accept Invite. This invitation cannot be accepted because your Apple Account, xxxxxxxx.me.com, has already been associated to this app." Can you help please?
9 replies · 6 boosts · 1.4k views · 2d
SwiftUI safe area stays offset after keyboard dismissal with “Reduce Motion” + “Prefer Cross-Fade” enabled (iOS 26)
I’m seeing a layout issue in SwiftUI on iOS 26 that only reproduces with specific Accessibility Motion settings.

Steps to reproduce:
1. Open Settings → Accessibility → Motion.
2. Enable Reduce Motion and Prefer Cross-Fade Transitions.
3. Launch an app with a SwiftUI TextField.
4. Tap the field to show the keyboard.
5. Dismiss the keyboard (tap outside, swipe down, etc.).

Expected: After the keyboard is dismissed, the view’s bottom safe area / layout returns to normal.

Actual: The view continues to reserve space equal to the keyboard height, as if the keyboard were still visible. UI anchored to the safe area remains shifted upward until the view is reloaded.
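Until the underlying bug is fixed, one possible mitigation, a sketch rather than a confirmed fix, is to opt the affected container out of the keyboard safe area so a stale inset cannot leave it shifted, at the cost of handling keyboard avoidance yourself:

import SwiftUI

/// Minimal sketch: ignore the keyboard's contribution to the bottom
/// safe area so a stale inset (the bug described above) cannot leave
/// the layout shifted after dismissal. Keyboard avoidance must then
/// be handled manually if the field can be covered.
struct ContentView: View {
    @State private var text = ""

    var body: some View {
        VStack {
            Spacer()
            TextField("Type here", text: $text)
                .textFieldStyle(.roundedBorder)
                .padding()
        }
        .ignoresSafeArea(.keyboard, edges: .bottom)
    }
}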
3 replies · 2 boosts · 595 views · 2d
AVSpeechSynthesisVoice ignores user-selected voices in iOS 26 (Regression)
We've identified a regression in iOS 26.0 and 26.1 Beta 4 where AVSpeechSynthesisVoice(language:) no longer respects user-selected voices from Accessibility settings.

Issue: When users select a specific voice in Settings → Accessibility → Spoken Content → Voices, calling AVSpeechSynthesisVoice(language:) returns the system default voice instead of the user's selection. This worked correctly in iOS 18.6.2.

Particularly affects:
- Third-party speech synthesis voices (CereProc, Grammatek, etc.)
- Apps relying on automatic voice selection based on user preferences

Example:

// User selected CereProc Heather for en-GB in Accessibility settings
let voice = AVSpeechSynthesisVoice(language: "en-GB")
print(voice?.name) // iOS 18.6.2: "HEATHER", iOS 26: "Daniel" (system default)

Interesting observation: the new Accessibility Reader feature in iOS 26 correctly uses the user-selected voice, but Tap to Speak and the API both ignore the setting.

Tested methods:
- AVSpeechSynthesisVoice(language:)
- AVSpeechUtterance auto-selection
- Reflection for new APIs

All return the system default voice, not the user's preference.

Filed: FB20271264

Has anyone else encountered this? Any known workarounds to programmatically access the user's preferred voice selection?
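As far as I know there is no public API that reads the Spoken Content voice selection, so one workaround, sketched here under that assumption (preferredVoiceID is a hypothetical app-level setting), is to enumerate the installed voices for the language and let users pick inside the app:

import AVFoundation

/// Sketch of a fallback voice picker. Assumes the app stores its own
/// preferred voice identifier (here under "preferredVoiceID"), since
/// the system's Spoken Content selection isn't exposed publicly.
func voiceForLanguage(_ language: String) -> AVSpeechSynthesisVoice? {
    let candidates = AVSpeechSynthesisVoice.speechVoices()
        .filter { $0.language == language }

    // Prefer a voice the user picked inside the app, if still installed.
    if let savedID = UserDefaults.standard.string(forKey: "preferredVoiceID"),
       let saved = candidates.first(where: { $0.identifier == savedID }) {
        return saved
    }
    // Otherwise fall back to the system's default for the language.
    return AVSpeechSynthesisVoice(language: language)
}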
4 replies · 1 boost · 289 views · 2d
.sheet and focus elements inside
Hi. I want to present a .sheet with .presentationDetents([.height(320)]) containing a couple of pickers and buttons. The problem is that I can't tab with the keyboard between the elements inside if I apply .presentationDetents([.height(320)]) to the root VStack inside the .sheet. It works perfectly if I remove the modifier, but then the sheet becomes full screen, which I don't want. Is this a bug, or am I using it incorrectly?
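For reference, a minimal sketch of the setup described (names and layout are illustrative) that can be used to reproduce the tabbing problem with a hardware keyboard or Full Keyboard Access:

import SwiftUI

/// Minimal repro sketch: a fixed-height sheet whose controls should be
/// reachable with Tab (hardware keyboard / Full Keyboard Access).
struct SheetFocusDemo: View {
    @State private var showSheet = false
    @State private var choice = 1

    var body: some View {
        Button("Show sheet") { showSheet = true }
            .sheet(isPresented: $showSheet) {
                VStack(spacing: 16) {
                    Picker("Choice", selection: $choice) {
                        ForEach(1...3, id: \.self) { n in
                            Text("Option \(n)").tag(n)
                        }
                    }
                    Button("Done") { showSheet = false }
                }
                // Removing this line restores keyboard tabbing,
                // but the sheet then covers the full screen.
                .presentationDetents([.height(320)])
            }
    }
}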
2 replies · 0 boosts · 299 views · 3d
My enrollment has been processing for a long time
Hello, I’m reaching out regarding an issue with our organization’s Apple Developer Program enrollment. We’ve successfully created a developer account and our organization is verified through the D-U-N-S system. The D-U-N-S ID is correctly displayed in our Apple Developer account. However, the enrollment status still shows: “Your enrollment is being processed.” It’s been 3 months, and we haven’t received any further communication or updates. Has anyone experienced a similar delay? Is there anything else we should do to expedite the process? Any guidance or insight would be greatly appreciated. Thanks,
1 reply · 0 boosts · 814 views · 5d
Inserting medication data from a third-party app
I want to insert the medication data that is available from iOS 26 into Apple HealthKit from my app. I tried to request permission to read and write the data, but the app crashed when I requested that permission. Does Apple allow third-party apps to insert medication data into HealthKit the way we can add other health and fitness data?

let healthStore = HKHealthStore()

@available(iOS 26.0, *)
@objc func requestAuthorization(_ resolve: @escaping RCTPromiseResolveBlock,
                                rejecter reject: @escaping RCTPromiseRejectBlock) {
    guard HKHealthStore.isHealthDataAvailable() else {
        print("not available")
        return
    }
    let doseType = HKObjectType.medicationDoseEventType()
    let medType = HKObjectType.userAnnotatedMedicationType()
    healthStore.requestAuthorization(toShare: [doseType], read: [doseType]) { success, error in
        if let err = error {
            reject("auth_error", err.localizedDescription, err)
            return
        }
        self.healthStore.requestPerObjectReadAuthorization(for: medType, predicate: nil) { s, e in
            if let err2 = e {
                reject("per_obj_auth", err2.localizedDescription, err2)
                return
            }
            resolve(["ok": success && s])
        }
    }
}
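One frequent cause of a crash at exactly this point, offered as a guess since the post doesn't include the crash message, is a missing usage-description string: HealthKit terminates the app if authorization is requested without NSHealthShareUsageDescription (and, for writing, NSHealthUpdateUsageDescription) in Info.plist. A debug-only sanity check:

import Foundation

// Debug-only sketch: HealthKit crashes at authorization time when the
// usage-description keys are missing, so verify them before requesting.
let hasShare = Bundle.main.object(
    forInfoDictionaryKey: "NSHealthShareUsageDescription") != nil
let hasUpdate = Bundle.main.object(
    forInfoDictionaryKey: "NSHealthUpdateUsageDescription") != nil
assert(hasShare && hasUpdate,
       "Add the HealthKit usage descriptions to Info.plist first")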
1 reply · 1 boost · 812 views · 1w
ARKit Eye Tracking Calibration Issues - Word-Level Reading Tracking Feasibility
Hi Apple Developer Community, I'm developing an eye-tracking application using ARKit's ARFaceTrackingConfiguration and ARFaceAnchor.blendShapes for gaze detection in Xcode. I'm experiencing several calibration and accuracy issues and would appreciate insights from the community.

Current implementation:
- Using ARFaceAnchor.blendShapes (.eyeLookUpLeft, .eyeLookDownLeft, .eyeLookInLeft, .eyeLookOutLeft, etc.)
- Implementing custom sensitivity curves and smoothing algorithms
- Applying baseline correction and coordinate mapping
- Using quadratic regression for calibration point mapping

Issues I'm facing:
1. Calibration mismatch: the red dot doesn't align with where I'm actually looking; there is a significant offset between the intended gaze point and the actual cursor position, and calibration seems to drift or become inaccurate over time.
2. Extreme eye movement requirements: I need to make exaggerated eye movements to reach screen edges and corners; natural eye movements don't translate to proportional cursor movement, and some screen regions are difficult to reach even after calibration.
3. Sensitivity and stability issues: the cursor jitters or jumps around when I look at the center; it is too sensitive to micro-movements, and behavior is inconsistent between calibration and normal operation (see the filter sketch after this post).
4. Tracking on both the calibration screen and the reading screen works noticeably better when head movement is involved, but I don't want much head movement; I want tracking with normal eye movement while reading an ebook.

Primary question: is word-level eye tracking (tracking gaze as users read through individual words in an ebook) technically feasible with current iPhone/iPad hardware? I understand that Apple's built-in eye tracking is primarily an accessibility feature for UI navigation. However, I'm wondering whether the TrueDepth camera and ARKit's eye-tracking capabilities are sufficient for:
- Tracking natural reading patterns (left-to-right, line-by-line progression)
- Detecting which specific words a user is looking at
- Maintaining accuracy for sustained reading sessions (15-30 minutes)
- Working reliably across different users and lighting conditions

Questions for the community:
- Hardware limitations: are iPhone/iPad TrueDepth cameras capable of the precision needed for word-level tracking, or is this beyond current hardware capabilities?
- Calibration best practices: what calibration strategies have worked best for accurate gaze mapping? How many calibration points are typically needed?
- Reading-specific challenges: are there particular challenges when tracking reading behavior vs. general gaze tracking?
- Alternative approaches: are there better approaches than ARKit blend shapes for this use case?

Current setup:
- Device: iPhone 14 Pro
- iOS version: iOS 18.3
- ARKit version: latest available

Any insights, experiences, or technical guidance would be greatly appreciated. I'm particularly interested in hearing from developers who have worked on similar eye-tracking applications or have experience with the limitations and capabilities of ARKit's eye tracking features. Thank you for your time and expertise!
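On issue 3, a starting point for taming jitter, shown as an illustrative sketch rather than a tuned solution (GazeSmoother and its thresholds are hypothetical), is an adaptive low-pass filter on the 2-D gaze estimate: heavy smoothing when the gaze is nearly still, light smoothing during fast movements:

import simd

/// Illustrative jitter filter for a 2-D gaze estimate: heavy smoothing
/// when the gaze is nearly still, light smoothing during fast moves.
/// The thresholds are placeholders to be tuned against real capture data.
final class GazeSmoother {
    private var last: SIMD2<Float>?

    func smooth(_ raw: SIMD2<Float>) -> SIMD2<Float> {
        guard let previous = last else {
            last = raw
            return raw
        }
        let speed = simd_length(raw - previous)
        // Map speed to a blend factor: ~0.05 at rest, ~0.8 for saccades.
        let alpha = min(0.8, max(0.05, speed * 10))
        let filtered = previous + (raw - previous) * alpha
        last = filtered
        return filtered
    }
}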
0 replies · 0 boosts · 632 views · 1w
Your app's binary includes references to HealthKit components, but the app still does not appear to include any primary features that require health or fitness data.
Your app's binary includes references to HealthKit components, but the app still does not appear to include any primary features that require health or fitness data.

Next Steps: To resolve this issue, please remove any HealthKit functionality from the app, as well as any references to this app’s interactivity with HealthKit from the app or its metadata. This includes removing any HealthKit-related keys in the app's Info.plist or InfoPlist.strings files, as well as removing any calls to HealthKit APIs, including those from third-party platforms, from the app.
1 reply · 1 boost · 224 views · 1w
iOS 26 VoiceOver is reporting an extra tab
Feedback number: FB20451665

When building with Xcode 26, VoiceOver reports an extra tab when swiping through tabs. Please see the sample project below:

/* This is a sample project to show that I believe there is a VoiceOver bug in iOS 26. When swiping through tabs with VoiceOver active, there always appears to be an extra tab. Here I have 5 tabs; when on tab one, VO reads out "tab 1 of 6", then "tab 2 of 6", all the way to the last tab, where VoiceOver reads out "tab 5 of 6", never "tab 6 of 6". Is there a possibility that VoiceOver is picking up the underlying `more` tab and reading that out? This has also been reportedly found in the Files app here: https://www.applevis.com/comment/195441#comment-195441 */

struct ContentView: View {
    var body: some View {
        TabView {
            /// Activating this has VoiceOver telling us there are 6 tabs.
            Tab(RootTab.home.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.home.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.home.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.home.title.capitalized) tab")

            Tab(RootTab.diary.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.diary.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.diary.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.diary.title.capitalized) tab")

            Tab(RootTab.meals.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.meals.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.meals.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.meals.title.capitalized) tab")

            Tab(RootTab.knowledge.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.knowledge.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.knowledge.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.knowledge.title.capitalized) tab")

            Tab(RootTab.profile.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.profile.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.profile.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.profile.title.capitalized) tab")

            /// Activating this also has VoiceOver telling us there are 6 tabs.
            // ForEach(RootTab.allCases, id: \.self) { tab in
            //     Text("This is the \(tab.title.capitalized) screen")
            //         .tabItem {
            //             Label(tab.title.capitalized, systemImage: "circle.fill")
            //         }
            //         .accessibilityLabel("\(tab.title.capitalized) tab")
            //         .accessibilityHint("Double tap to open the \(tab.title.capitalized) tab")
            // }
        }
    }

    enum RootTab: CaseIterable {
        case home
        case diary
        case meals
        case knowledge
        case profile

        var title: String {
            switch self {
            case .home: "home"
            case .diary: "diary"
            case .meals: "meals"
            case .knowledge: "knowledge"
            case .profile: "profile"
            }
        }
    }
}

I'm curious if anyone else can see this issue, or if anyone knows of a workaround for it.
3 replies · 0 boosts · 2.0k views · 2w
Potential Structural Swift Concurrency Issue: unsafeForcedSync called from Swift Concurrent context
I occasionally get this error in Xcode’s console: "Potential Structural Swift Concurrency Issue: unsafeForcedSync called from Swift Concurrent context". What does this mean, and how can I resolve it? Googling it doesn’t turn up any results. This doesn't crash the app; it’s just a diagnostic that appears in the Xcode console, and the app keeps running before and after the issue. Is there a way I can set a breakpoint to catch this where it happens?
7 replies · 2 boosts · 1.6k views · 3w
Attempting to go directly to a help book page results in the main help book page being displayed instead
There is an issue with Help Books that started with the release of macOS 14.4: when an app attempts to go directly to a Help Book page, the help viewer opens to the Help Book's main index page rather than the specific page requested.

As I investigated, I found that the requested page was actually part of the help viewer's navigation history, and all I had to do was click the Back navigation arrow for the requested page to be displayed. So it seems the requested page is momentarily visited but is then, for whatever reason, quickly replaced by the main index page.

Our app uses the AHGotoPage() API for directly accessing our Help Book's pages. This is the same mechanism/code that our app has used for more than a decade, and it has never caused us any issues. Everything works fine on macOS 14.3.0 and earlier. I've scoured the documentation and can't find any newer APIs for accessing Help pages. I've also tried various other things (e.g. reworking the code, creating new indexes for the app's Help), but none of it seems to make a difference. As far as I can tell, the issue stems from some change made to the OS.

So my questions are: Is this a known bug? If so, is there any ETA on a fix? Is there something different we should be doing for newer versions of the OS (create indexes differently, use a different API, etc.)?
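One avenue that may be worth testing, offered as a sketch rather than a confirmed fix, is AppKit's NSHelpManager.openHelpAnchor(_:inBook:), which jumps to a named anchor instead of a page path; it requires the target pages to define anchors and the help index to include them:

import AppKit

/// Sketch: open a specific help page via a named anchor instead of
/// AHGotoPage(). Assumes the target page defines the anchor (e.g.
/// <a name="printing">) and the book is named in CFBundleHelpBookName.
func showHelpPage(anchor: String) {
    let bookName = Bundle.main.object(
        forInfoDictionaryKey: "CFBundleHelpBookName") as? String
    NSHelpManager.shared.openHelpAnchor(anchor, inBook: bookName)
}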
3 replies · 0 boosts · 1.9k views · 3w
RTT call option and confirmation dialog missing when dialing emergency numbers
Hello, in our app we provide a button that initiates a phone call using tel://. For normal numbers, tapping the button presents the standard iOS confirmation sheet with Call and Cancel; if RTT is enabled on the device, the sheet instead shows three options: Call, Cancel, and RTT Call. However, when dialing a national emergency number, this confirmation dialog does not appear at all: the call is placed immediately, without giving the user the choice between voice and RTT. Is this the expected system behavior for emergency numbers on iOS? And if so, how does RTT get applied in the emergency-call flow? Is it managed entirely by the OS rather than exposed as a user-facing option? Thanks in advance for clarifying.
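For context, a minimal sketch of the call button's action as described (dial is a hypothetical helper and the number is a placeholder):

import UIKit

/// Sketch of the described call flow. For non-emergency numbers iOS
/// presents the Call/Cancel (and, with RTT enabled, RTT Call) sheet
/// before dialing.
func dial(_ number: String) {
    guard let url = URL(string: "tel://\(number)") else { return }
    UIApplication.shared.open(url)
}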
2 replies · 0 boosts · 596 views · 3w
Nearby Interaction with camera assistance
I have an app that uses Nearby Interaction with a custom accessory. It works great on iPhone 11-13; starting with iPhone 14, one must use ARKit to get angles. We have two problems:

1. ARKit is light sensitive, and we do not control the lighting where this app runs. The iPhone 11-13 behavior works great even in the dark. (Our users are blind; this is an accessibility app.)
2. ARKit wants to be in the foreground, but our users cannot see it, and we have a voice-oriented UI that provides navigation instructions. If ARKit is in the foreground, our app doesn't work.

With iPhone 15 Pro Max on iOS 18, I got an error: access denied (not permission denied). Now that I am on iOS 26, the Bluetooth scan doesn't happen; it fails the same way on iPhone 17 on iOS 26, and I can't go back now, as release signing is no longer done. This same code works fine on iOS 17.1 on an iPhone 12. Info.plist attached (info.txt).

if SearchedServices == [] {
    services = [TransferService.serviceUUID, QorvoNIService.serviceUUID]
}
logger.info("scannerready, starting scan for peripherals \(services) and devices \(IDs)")
filteredIDs = IDs
scanning = true
centralManager.scanForPeripherals(withServices: services,
                                  options: [CBCentralManagerScanOptionAllowDuplicatesKey: true])

The calling code:

dataChannel.autoConnect = autoConnect
dataChannel.start(x, ids)  // dataChannel.start is above
self.scanning = true
return "scanning started"

Log output:

services from js = and devices= 5FE04CBB
services in implementation =
bluetooth ready, starting scan for peripherals [] and devices ["5FE04CBB"]
scannerready, starting scan for peripherals [6E400001-B5A3-F393-E0A9-E50E24DCCA9E, 2E938FD0-6A61-11ED-A1EB-0242AC120002] and devices ["5FE04CBB"]
⚡️ TO JS {"value":"scanning started"}
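On the iPhone 14+ angle limitation: direction information on those devices comes from camera assistance, which is what pulls ARKit (and its lighting requirements) into the session. A minimal sketch of the opt-in, with placeholder accessory data, assuming a custom accessory that exchanges shareable configuration data over your own data channel:

import NearbyInteraction

/// Sketch: enabling camera assistance for a UWB accessory session.
/// `configData` is placeholder shareable configuration data received
/// from the accessory (e.g. over Bluetooth).
func startAccessorySession(configData: Data,
                           delegate: NISessionDelegate) throws -> NISession {
    let config = try NINearbyAccessoryConfiguration(data: configData)
    // Camera assistance is what requires ARKit (and adequate lighting)
    // on iPhone 14 and later to resolve direction/angle information.
    config.isCameraAssistanceEnabled = true

    let session = NISession()
    session.delegate = delegate
    session.run(config)
    return session  // the caller must retain the session
}

As far as I know there is no supported way to get angle data on iPhone 14 and later without camera assistance, which unfortunately collides with the dark-environment and background-operation constraints described above.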
1 reply · 0 boosts · 484 views · 3w
I have a problem
I want to open a developer account, not a personal one but a company account. I have an existing company, I have a D-U-N-S number, I have a finished website, and I have an official email; everything is ready. But when the application is made at Apple, they send an email saying they want a public website for people, in the name of the organization, even though all of these matters have already been resolved. Why do they not respond to us?
1 reply · 0 boosts · 545 views · 4w
How to capture 48MP photos with the Ultra Wide lens on iPhone 16 Pro Max
I am working on capturing 48MP images using the iPhone 16 Pro Max with the Ultra Wide camera. I’ve updated the code to capture the maximum supported dimensions with the following snippet:

if #available(iOS 16.0, *) {
    photoOutput.maxPhotoDimensions = device.activeFormat.supportedMaxPhotoDimensions.last!
    photoSettings.maxPhotoDimensions = .init(width: 5712, height: 4284)
}

However, I’m still not getting the expected results. My goal is to capture 48MP images, and I want to confirm whether the Ultra Wide camera supports this resolution or whether I’m missing some other configuration. Any guidance would be appreciated!
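Two things worth checking. First, 5712 x 4284 is roughly 24.5 million pixels, i.e. about 24MP, so even when accepted it would not produce a 48MP image; the 48MP dimensions on recent iPhones are 8064 x 6048. Second, confirm what the Ultra Wide device itself reports. A small diagnostic sketch (dumpUltraWidePhotoDimensions is a hypothetical helper):

import AVFoundation

/// Diagnostic sketch: list the maximum photo dimensions offered by the
/// back ultra-wide camera, to confirm whether a ~48MP format exists.
func dumpUltraWidePhotoDimensions() {
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.builtInUltraWideCamera],
        mediaType: .video,
        position: .back)
    guard let device = discovery.devices.first else {
        print("No ultra-wide camera found")
        return
    }
    for format in device.formats {
        for dims in format.supportedMaxPhotoDimensions {
            let megapixels = Double(dims.width) * Double(dims.height) / 1_000_000
            print("\(dims.width) x \(dims.height) (~\(Int(megapixels)) MP)")
        }
    }
}

Whichever value turns out to be the largest must be applied to both photoOutput.maxPhotoDimensions and photoSettings.maxPhotoDimensions, since the per-capture setting cannot exceed the output's.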
2 replies · 2 boosts · 758 views · 4w