
Vision face landmarks shifted on iOS 26 but correct on iOS 18 with same code and image
I'm using the Vision framework (DetectFaceLandmarksRequest) with the same code and the same test image to detect face landmarks. On iOS 18 everything works as expected: the detected face landmarks align with the face correctly. But when I run the same code on devices with iOS 26, the landmark coordinates fall outside the [0, 1] range, which indicates they are out of the face bounds. Fun fact: the old VNDetectFaceLandmarksRequest API works fine and does not hit this issue.

How I get face landmarks:

```swift
private let faceRectangleRequest = DetectFaceRectanglesRequest(.revision3)
private var faceLandmarksRequest = DetectFaceLandmarksRequest(.revision3)

func detectFaces(in ciImage: CIImage) async throws -> FaceTrackingResult {
    let faces = try await faceRectangleRequest.perform(on: ciImage)
    faceLandmarksRequest.inputFaceObservations = faces
    let landmarksResults = try await faceLandmarksRequest.perform(on: ciImage)
    ...
}
```

How I show face landmarks in a SwiftUI view:

```swift
private func convert(
    point: NormalizedPoint,
    faceBoundingBox: NormalizedRect,
    imageSize: CGSize
) -> CGPoint {
    let point = point.toImageCoordinates(
        from: faceBoundingBox,
        imageSize: imageSize,
        origin: .upperLeft
    )
    return point
}
```

At the same time, this works as expected and gives me the correct results (region is a FaceObservation.Landmarks2D.Region):

```swift
let points: [CGPoint] = region.pointsInImageCoordinates(
    imageSize,
    origin: .upperLeft
)
```

After that, I found that the landmarks are normalized relative to the unalignedBoundingBox. However, I can't access it in code. Still, using those values for the bounding box works correctly.

Things I've already tried:

- Same image input
- Tested multiple devices on iOS 26.2 → always wrong.
- Tested multiple devices on iOS 18.7.1 → always correct.

Environment:

- macOS 26.2
- Xcode 26.2 (17C52)
- Real devices, not the simulator

[Screenshots: Face Landmarks on iOS 18 vs. Face Landmarks on iOS 26]
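For reference, here is a minimal workaround sketch built on the observation above: since pointsInImageCoordinates gives correct results on both OS versions, convert those image-space points to the view's space directly instead of converting each NormalizedPoint manually. The helper name and the view-scaling step are illustrative, not from the original code.

```swift
import Vision
import CoreGraphics

// Workaround sketch: rely on pointsInImageCoordinates(_:origin:), which is
// correct on both iOS 18 and iOS 26, instead of the per-point
// toImageCoordinates(from:imageSize:origin:) conversion.
func landmarkViewPoints(
    for region: FaceObservation.Landmarks2D.Region,
    imageSize: CGSize,
    viewSize: CGSize
) -> [CGPoint] {
    let imagePoints = region.pointsInImageCoordinates(imageSize, origin: .upperLeft)
    // Scale from image space into the displayed view's space.
    let scaleX = viewSize.width / imageSize.width
    let scaleY = viewSize.height / imageSize.height
    return imagePoints.map { CGPoint(x: $0.x * scaleX, y: $0.y * scaleY) }
}
```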
Replies: 0 · Boosts: 0 · Views: 152 · Activity: 4w
How to monitor heart rate in background without affecting Activity Rings?
I'm developing a watchOS nap app that detects when the user falls asleep by monitoring heart rate changes.

== Technical Implementation ==

- HKWorkoutSession (.mindAndBody) for background execution
- HKAnchoredObjectQuery for real-time heart rate data
- CoreMotion for movement detection

== Battery Considerations ==

- Heart rate monitoring is ONLY active when the user explicitly starts a session.
- Monitoring continues until the user is awakened OR the 60-minute limit is reached.
- If no sleep is detected within 60 minutes, the session auto-ends (the user may have abandoned it or forgotten to stop).
- The app displays clear UI indicating that monitoring is active.
- A typical session is 15-30 minutes, keeping battery usage minimal.

== The Problem ==

HKWorkoutSession affects Activity Rings during the session. Users receive "Exercise goal reached" notifications while resting, which is confusing.

== What I've Tried ==

- Not using HKLiveWorkoutBuilder → Activity Rings still affected.
- Using the builder but not calling finishWorkout() (per https://developer.apple.com/forums/thread/780220) → Activity Rings still affected.
- WKExtendedRuntimeSession (self-care type) (per https://developer.apple.com/forums/thread/721077) → only ~10 min of runtime; I need up to 60 min.
- HKObserverQuery + enableBackgroundDelivery (per https://developer.apple.com/forums/thread/779101) → ~4 updates/hour, too slow for real-time detection.
- Audio background session for continuous processing (suggested in https://developer.apple.com/forums/thread/130287) → concerned about App Store rejection for a non-audio app; if Apple officially approves this technical route, I can implement it in this direction.

Some online resources mention a "Health Monitoring Entitlement" from WWDC 2019 Session 251, but I could not find any official documentation for this entitlement. Apple Developer Support also confirmed they cannot locate it.

== My Question ==

Is there any supported way to monitor heart rate in the background for up to 60 minutes WITHOUT affecting Activity Rings or creating workout records? If this requires a special entitlement or API access, please advise on the application process, or allow me to submit a code-level support request.

Any guidance would be greatly appreciated. Thank you!
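For context, here is a minimal sketch of the HKAnchoredObjectQuery part of the setup described above (the class and property names are illustrative, and an authorized HKHealthStore is assumed). It shows the real-time delivery path only; it does not solve the Activity Rings problem.

```swift
import HealthKit

// Sketch: real-time heart rate via HKAnchoredObjectQuery, which receives new
// samples through updateHandler while a workout session keeps the app alive.
final class HeartRateMonitor {
    private let healthStore = HKHealthStore()
    private var anchor: HKQueryAnchor?

    func start() {
        let heartRateType = HKQuantityType.quantityType(forIdentifier: .heartRate)!
        let query = HKAnchoredObjectQuery(
            type: heartRateType,
            predicate: nil,
            anchor: anchor,
            limit: HKObjectQueryNoLimit
        ) { [weak self] _, samples, _, newAnchor, _ in
            self?.anchor = newAnchor
            self?.process(samples)
        }
        // Called again for each batch of new samples as they arrive.
        query.updateHandler = { [weak self] _, samples, _, newAnchor, _ in
            self?.anchor = newAnchor
            self?.process(samples)
        }
        healthStore.execute(query)
    }

    private func process(_ samples: [HKSample]?) {
        let bpmUnit = HKUnit.count().unitDivided(by: .minute())
        for sample in samples as? [HKQuantitySample] ?? [] {
            let bpm = sample.quantity.doubleValue(for: bpmUnit)
            print("Heart rate: \(bpm) bpm")
        }
    }
}
```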
Replies: 0 · Boosts: 0 · Views: 63 · Activity: 3w
ANE Error with Stateful Model: "Unable to compute prediction" when State Tensor width is not 32-aligned
Hi everyone, I believe I've encountered a potential bug or a hardware alignment limitation in the Core ML framework / ANE runtime specifically affecting the new Stateful API (introduced in iOS 18/macOS 15).

The Issue:

A stateful mlprogram fails to run on the Apple Neural Engine (ANE) if the state tensor dimensions (specifically the width) are not a multiple of 32. The model works perfectly on CPU and GPU, but fails on ANE both at runtime and when generating a Performance Report in Xcode.

Error message in the Xcode UI: "There was an error creating the performance report. Unable to compute the prediction using ML Program. It can be an invalid input data or broken/unsupported model."

Observations:

- Case A (fails): state shape = (1, 3, 480, 270). Prediction fails on ANE.
- Case B (succeeds): state shape = (1, 3, 480, 256). Prediction succeeds on ANE.

This suggests an internal memory alignment or tiling issue within the ANE driver when handling stateful buffers that don't meet the 32-pixel/element alignment.

Reproduction code (PyTorch + coremltools):

```python
import torch
import torch.nn as nn
import coremltools as ct
import numpy as np

class RNN_Stateful(nn.Module):
    def __init__(self, hidden_shape):
        super(RNN_Stateful, self).__init__()
        # Simple conv to update state
        self.conv1 = nn.Conv2d(3 + hidden_shape[1], hidden_shape[1], kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(hidden_shape[1], 3, kernel_size=3, padding=1)
        self.register_buffer("hidden_state", torch.ones(hidden_shape, dtype=torch.float16))

    def forward(self, imgs):
        self.hidden_state = self.conv1(torch.cat((imgs, self.hidden_state), dim=1))
        return self.conv2(self.hidden_state)

# h=480, w=255 causes ANE failure. w=256 works.
b, ch, h, w = 1, 3, 480, 255
model = RNN_Stateful((b, ch, h, w)).eval()
traced_model = torch.jit.trace(model, torch.randn(b, 3, h, w))

mlmodel = ct.convert(
    traced_model,
    inputs=[ct.TensorType(name="input_image", shape=(b, 3, h, w), dtype=np.float16)],
    outputs=[ct.TensorType(name="output", dtype=np.float16)],
    states=[ct.StateType(wrapped_type=ct.TensorType(shape=(b, ch, h, w), dtype=np.float16), name="hidden_state")],
    minimum_deployment_target=ct.target.iOS18,
    convert_to="mlprogram"
)
mlmodel.save("rnn_stateful.mlpackage")
```

Steps to see the error:

1. Open the generated .mlpackage in Xcode 16.0+.
2. Go to the Performance tab and run a test on a device with ANE (e.g., iPhone 15/16 or an M-series Mac).
3. The report will fail to generate with the error mentioned above.

Environment:

- OS: macOS 15.2
- Xcode: 16.3
- Hardware: M4

Has anyone else encountered this 32-pixel alignment requirement for StateType tensors on ANE? Is this a known hardware constraint or a bug in the Core ML runtime? Any insights or workarounds (other than manual padding) would be appreciated.
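As an interim mitigation, here is a sketch that restricts the compute units so the ANE path is never taken, based on the observation above that the model runs correctly on CPU and GPU. The resource name follows the converter output; adjust it to your bundle.

```swift
import CoreML

// Workaround sketch: force CPU+GPU execution for the stateful model until the
// ANE alignment issue is resolved.
func loadStatefulModel() throws -> (MLModel, MLState) {
    let config = MLModelConfiguration()
    config.computeUnits = .cpuAndGPU  // skip the ANE entirely

    let url = Bundle.main.url(forResource: "rnn_stateful", withExtension: "mlmodelc")!
    let model = try MLModel(contentsOf: url, configuration: config)

    // Stateful API: create the state once and reuse it across predictions,
    // e.g. let output = try model.prediction(from: inputFeatures, using: state)
    return (model, model.makeState())
}
```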
Replies: 0 · Boosts: 0 · Views: 301 · Activity: 3w
SwiftUI List cell reuse / view lifecycle behavior when scrolling
I’m trying to understand how SwiftUI List handles row lifecycle and reuse during scrolling. I have a list with around 60 card views; on initial load, only about 7 rows are created, but after scrolling to the bottom all rows appear to be created, and when scrolling back to the top I again observe multiple updates and apparent re-creation of rows. I confirmed this behavior using Instruments by profiling my app. Even though each row has a stable identifier, the row views still seem to be destroyed and recreated, which doesn’t resemble UIKit’s cell reuse model. I’d like clarity on how List uses identifiers internally, what actually gets reused versus recreated, and how developers should reason about performance and view lifetime in this case.
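A minimal sketch for observing this behavior, assuming a simple Identifiable model (the type names are illustrative). Note that logging from a view's init shows how often SwiftUI creates the lightweight view *value*, which is not the same thing as the row's persistent identity and storage; onAppear/onDisappear track when the row enters and leaves the visible region.

```swift
import SwiftUI

// Sketch: make List row creation and appearance visible in the console.
struct Item: Identifiable {
    let id: Int
    let title: String
}

struct CardRow: View {
    let item: Item

    init(item: Item) {
        self.item = item
        print("CardRow value created for id \(item.id)")  // view value, not storage
    }

    var body: some View {
        Text(item.title)
            .onAppear { print("row \(item.id) appeared") }
            .onDisappear { print("row \(item.id) disappeared") }
    }
}

struct ContentView: View {
    let items = (0..<60).map { Item(id: $0, title: "Card \($0)") }

    var body: some View {
        List(items) { CardRow(item: $0) }
    }
}
```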
Replies: 0 · Boosts: 1 · Views: 70 · Activity: 1w
AVAudioEngine Voice Processing Fails with Mismatched Input/Output Devices: AggregateDevice Channel Count Mismatch
I'm encountering errors while using AVAudioEngine with voice processing enabled (setVoiceProcessingEnabled(true)) in scenarios where the input and output audio devices are not the same. This issue arises specifically with mismatched devices, preventing the application from functioning as expected.

- Works: paired devices (e.g., MacBook Pro mic → MacBook Pro speakers)
- Fails: mismatched devices (e.g., AirPods mic → MacBook Pro speakers)

When using paired input and output devices, the setup works as expected. Example: MacBook Pro microphone → MacBook Pro speakers.

When using mismatched devices, AVAudioEngine setup fails during aggregate device construction. Example: AirPods microphone → MacBook Pro speakers. The error logs indicate a channel count mismatch.

Here are the partial logs. Due to the content limit, I cannot post the entire logs.

```
AUVPAggregate.cpp:1000  client-side input and output formats do not match (err=-10875)
AUVPAggregate.cpp:1036  err=-10875
AVAEInternal.h:109      [AVAudioEngineGraph.mm:1344:Initialize: (err = PerformCommand(*outputNode, kAUInitialize, NULL, 0)): error -10875
AggregateDevice.mm:329  Failed expectation of constructed aggregate (312): mInput.streamChannelCounts == inputStreamChannelCounts
AggregateDevice.mm:331  Failed expectation of constructed aggregate (312): mInput.totalChannelCount == std::accumulate(inputStreamChannelCounts.begin(), inputStreamChannelCounts.end(), 0U)
AggregateDevice.mm:182  error fetching default pair
AggregateDevice.mm:329  Failed expectation of constructed aggregate (336): mInput.streamChannelCounts == inputStreamChannelCounts
AggregateDevice.mm:331  Failed expectation of constructed aggregate (336): mInput.totalChannelCount == std::accumulate(inputStreamChannelCounts.begin(), inputStreamChannelCounts.end(), 0U)
AUHAL.cpp:1782          ca_verify_noerr: [AudioDeviceSetProperty(mDeviceID, NULL, 0, isInput, kAudioDevicePropertyIOProcStreamUsage, theSize, theStreamUsage), 560227702]
AudioHardware-mac-imp.cpp:3484  AudioDeviceSetProperty: no device with given ID
AUHAL.cpp:1782          ca_verify_noerr: [AudioDeviceSetProperty(mDeviceID, NULL, 0, isInput, kAudioDevicePropertyIOProcStreamUsage, theSize, theStreamUsage), 560227702]
AggregateDevice.mm:182  error fetching default pair
AggregateDevice.mm:329  Failed expectation of constructed aggregate (348): mInput.streamChannelCounts == inputStreamChannelCounts
AggregateDevice.mm:331  Failed expectation of constructed aggregate (348): mInput.totalChannelCount == std::accumulate(inputStreamChannelCounts.begin(), inputStreamChannelCounts.end(), 0U)
```

My questions:

1. Is it possible to use voice processing with different input/output devices? If yes, are there any specific configurations required to handle mismatched devices?
2. How can we resolve channel count mismatch errors during aggregate device construction? Are there settings or API adjustments to enforce compatibility between input/output devices?
3. Are there any workarounds or alternative approaches to achieve voice processing functionality with mismatched devices? For instance, can we force an intermediate channel configuration or downmix the input/output formats?
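For reference, a minimal sketch of the setup that triggers the failure (the tap body and buffer size are illustrative). With paired default devices this starts cleanly; with mismatched devices the -10875 error surfaces from prepare()/start().

```swift
import AVFAudio

// Minimal macOS sketch of a voice-processing engine setup.
let engine = AVAudioEngine()

do {
    // Enabling voice processing on the input node configures the
    // voice-processing I/O unit that spans input and output.
    try engine.inputNode.setVoiceProcessingEnabled(true)

    let format = engine.inputNode.inputFormat(forBus: 0)
    engine.inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
        // Consume the processed microphone signal here.
    }

    engine.prepare()
    try engine.start()
} catch {
    print("AVAudioEngine setup failed: \(error)")  // e.g. error -10875 with mismatched devices
}
```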
Replies: 0 · Boosts: 0 · Views: 209 · Activity: 4w
DeviceCheck framework error
We integrated the DeviceCheck framework into our app about a year ago to prevent fraudulent calls to our app's service. Recently, we received a few cases related to this function over the Christmas Eve period. Based on the logs we have, both of the following functions returned errors. Unfortunately we don't have the exact errors logged, and we cannot replicate the problem now.

- DCAppAttestService.shared.attestKey()
- DCAppAttestService.shared.generateAssertion()

The other finding is that some of the users reporting this issue recently upgraded their devices from iOS 18 to iOS 26. So we suspect it's due either to the OS upgrade or to a degradation of Apple's App Attest service.

Has anyone encountered similar issues before, or have any idea about the root cause? Thanks!
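To capture the concrete error codes next time, a diagnostic sketch along these lines could help (keyId and clientDataHash are assumed to come from your existing attest flow). DCError codes distinguish, for example, a server outage from an invalid key.

```swift
import DeviceCheck

// Sketch: log the NSError domain and code so failures over an outage window
// can be classified after the fact.
func attest(keyId: String, clientDataHash: Data) {
    let service = DCAppAttestService.shared
    guard service.isSupported else {
        print("App Attest is unsupported on this device")
        return
    }
    service.attestKey(keyId, clientDataHash: clientDataHash) { attestation, error in
        if let error = error as NSError? {
            print("attestKey failed: domain=\(error.domain) code=\(error.code) \(error.localizedDescription)")
            return
        }
        // Send `attestation` to the server for verification...
    }
}
```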
Replies: 0 · Boosts: 1 · Views: 128 · Activity: 1w
Cannot edit banking info
Our bank account details are outdated and the status is stuck on processing with the error: "Your banking updates are processing, and you should see the changes in 24 hours. You won't be able to make any additional updates until then."

It has now been stuck for a few years, since we activated a previous Apple developer account. We must change the banking details, as this holds up development of an app with in-app purchases. The finance department has been contacted and they do not answer.

What shall we do? Senior support staff keep referring us to the finance department and are not helping.
Replies: 0 · Boosts: 0 · Views: 49 · Activity: 3w
iOS 26.1, iPhone 15 Pro Max: intermittent cold-launch failure, file system not mounted?
After a cold launch, when we read a file we get: "error_msg": "The file 'FinishTasks.plist' couldn't be opened because you don't have permission to view it."

Is this the same as the issue reported elsewhere ("iOS 26 iPhone 16,2 cold launch file access failure")? The gist of that report: multiple developers have seen iPhone 15 Pro devices on iOS 26.0/26.1 fail with permission-denied errors when reading plist files from the Documents directory on cold launch; the problem clears after backgrounding and foregrounding the app, and an Apple employee suggested deferring file operations until after applicationDidBecomeActive.
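A sketch of the suggested mitigation, assuming the failure is related to Data Protection availability shortly after a cold launch (the deferral logic and function name are illustrative; the file name follows the post):

```swift
import UIKit

// Sketch: defer the read until protected data is available, per the advice of
// waiting for applicationDidBecomeActive before touching files.
func readFinishTasksWhenReady() {
    let read = {
        let url = FileManager.default
            .urls(for: .documentDirectory, in: .userDomainMask)[0]
            .appendingPathComponent("FinishTasks.plist")
        do {
            let data = try Data(contentsOf: url)
            print("read \(data.count) bytes")
        } catch {
            print("read failed: \(error)")
        }
    }

    if UIApplication.shared.isProtectedDataAvailable {
        read()
    } else {
        // Wait for the Data Protection keys to become available.
        NotificationCenter.default.addObserver(
            forName: UIApplication.protectedDataDidBecomeAvailableNotification,
            object: nil,
            queue: .main
        ) { _ in read() }
    }
}
```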
Replies: 0 · Boosts: 0 · Views: 228 · Activity: 2w
Particles rendered in the wrong order: back last instead of back first
I've tried out a ParticleEmitter in Reality Composer Pro to produce a burst of particles that don't move (i.e., speed close to zero). When viewing from different angles, the particles clearly appear to be rendered in exactly the wrong order, that is, front first and back last. In other words, back particles obscure front particles. I would prefer it the correct way around.

I've only tried this interactively in Reality Composer Pro, not programmatically, but I assume I would get the same result.

My Reality Composer Pro file (zipped): https://gert-rieger-edv.de/Posts/Post-1/RealityParticles.zip

[Screenshot]

To reproduce: click on the ParticleEmitter object, then on its Play button, then select the Particles tab and click on "Burst" a few times to get a few random particles.

- Mac Studio 2025, Apple M4 Max
- macOS 15.7.2 (24G325)
- Reality Composer Pro Version 2.0 (494.60.2)
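For a programmatic test of the same setup, a sketch using RealityKit's ParticleEmitterComponent might look like the following. I'm assuming the sortOrder property on the main emitter is the knob that controls draw order; the numeric values are illustrative.

```swift
import RealityKit

// Sketch: near-static burst particles with an explicit particle sort order.
let entity = Entity()
var emitter = ParticleEmitterComponent()
emitter.speed = 0.001                 // nearly static particles, as in the post
emitter.mainEmitter.birthRate = 0     // emit only via bursts
emitter.burstCount = 50

// Assumption: requesting back-to-front drawing via the emitter's sort order.
emitter.mainEmitter.sortOrder = .decreasingDepth

entity.components.set(emitter)

// Trigger a burst, mirroring the "Burst" button in Reality Composer Pro.
entity.components[ParticleEmitterComponent.self]?.burst()
```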
Replies: 0 · Boosts: 0 · Views: 454 · Activity: 3w
Subscription Group Remains as Prepare for Submission
I'm ready to submit a new app with three subscription plans, but the status of the subscription group remains 'Prepare for Submission' and won't change. All the individual subscription plans have the 'Ready to Submit' status. I have triple-checked the individual plans to see if anything is missing. Nothing is. There are no pending business contracts to review, either. I have even deleted the existing group and created a whole new subscription group, and I still end up with this Prepare status.

Am I the only one having this subscription group difficulty? One thing I noticed is that the status appears as 'Ready to Submit' on the individual subscription plans, but their respective localization pairs of display name and description get the 'Prepared' status. The display name and description fields are filled in for all three plans.

What a nice, merry, merry Christmas. I don't know what else I can do to resolve this 'Prepared' madness. I've been stuck for 4 hours.
Replies: 0 · Boosts: 1 · Views: 75 · Activity: 1w
Some variable SF Symbols don't work.
Some SF Symbols (wifi, for example) render fine with a variable value. But many, mostly ones where the circle is the variable part, do not seem to work. The SF Symbols app shows them rendering with a variable value fine, but in code it doesn't work. Am I missing something, or is there a reason?

```swift
var body: some View {
    HStack {
        Image(systemName: "01.circle", variableValue: 0.5)
        Image(systemName: "figure.wave.circle", variableValue: 0.5)
        Image(systemName: "wifi", variableValue: 0.5)
    }
    .font(.largeTitle)
}
```
Topic: Design · SubTopic: General
Replies: 0 · Boosts: 0 · Views: 1.2k · Activity: 3w
Snippet Intents and location
Hello, I'd like to ask about best practices for handling interactive snippet intents when working with the user's location. My use case is:

1. Get the user's location
2. Fetch nearby data
3. Display it

My current flow is: show the snippet view in a "loading" state while waiting for the Core Location manager, then fetch the data and reload() the view.

BUT I'm running into an issue where I sometimes receive Core Location error 1 (not authorized), even though the main app has "While In Use" authorization. It seems that in some cases, especially when the app has been force-closed, App Intents are unable to start location updates, even though I'm using supportedModes = .foreground(.dynamic).

Any guidance would be appreciated.

Cheers, Ondrej
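For reference, a one-shot location sketch of the kind that could run inside the intent's perform(), assuming iOS 17+'s async CLLocationUpdate API. Checking authorization up front lets the snippet show a "grant location access" state instead of surfacing kCLErrorDomain error 1.

```swift
import CoreLocation

// Sketch: fetch a single location fix asynchronously, guarding on
// authorization first.
func currentLocation() async throws -> CLLocation? {
    let status = CLLocationManager().authorizationStatus
    guard status == .authorizedWhenInUse || status == .authorizedAlways else {
        return nil  // show an "authorization needed" snippet state instead
    }
    for try await update in CLLocationUpdate.liveUpdates() {
        if let location = update.location {
            return location  // the first fix is enough for "nearby data"
        }
    }
    return nil
}
```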
Replies: 0 · Boosts: 0 · Views: 74 · Activity: 4w
Promo code for Watch-only IAP failing
Hi, I have an "Apple Watch Only" app that you can download for free and conduct a 7 day trial. After that, there is an IAP to unlock a lifetime license. For the life of me I can't find out how the process of redeeming promo codes for the IAP should work. Once generated, people try to redeem them in the AppStore (on their phone), but then they end up in a hanging process saying it's reinstalling the app to redeem the offer. But then nothing happens. Also running "Restore Purchase" within the app to pickup any maybe by now activated license is not working. Why does it even want to reinstall the (free) app? This doesn't sound right. Does anyone have IAP for a watch-only app and can shed some light on the promo code topic?
Replies: 0 · Boosts: 0 · Views: 52 · Activity: 4w
Error: "CoreImage Metal library does not contain function"
Hey, I'm using the CIDepthBlurEffect Core Image filter in my app. It seems to work OK, but I get these errors in the console when using the filter:

```
CoreImage Metal library does not contain function for name: sparserendering_xhlrb_scan
CoreImage Metal library does not contain function for name: sparserendering_xhlrb_diffuse
CoreImage Metal library does not contain function for name: sparserendering_xhlrb_copy_back
CoreImage Metal library does not contain function for name: plain_or_sRGB_copy
```

Am I missing some sort of import to gain access to these Metal functions? I am using my own custom shaders, but I assume you'd be able to use them alongside the built-in ones.
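For comparison, here is a minimal sketch of how a CIDepthBlurEffect invocation could look; `image` and `disparity` are assumed CIImages from a portrait capture, and parameter keys beyond inputImage/inputDisparityImage may vary by OS version.

```swift
import CoreImage

// Sketch: apply the built-in depth blur filter to an image plus its
// disparity map.
func depthBlur(image: CIImage, disparity: CIImage) -> CIImage? {
    guard let filter = CIFilter(name: "CIDepthBlurEffect") else { return nil }
    filter.setValue(image, forKey: kCIInputImageKey)
    filter.setValue(disparity, forKey: "inputDisparityImage")
    filter.setValue(11.0, forKey: "inputAperture")  // simulated aperture (f-number) controlling blur strength
    return filter.outputImage
}
```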
Replies: 0 · Boosts: 0 · Views: 479 · Activity: 1w
listRowBackground vs swipeActions iOS 26 compatibility
Hi! On iOS 26, Apple’s Mail app shows an effect where a list row gets rounded corners while you’re swiping (so the row visually “matches” the rounded swipe buttons). In my app I’m using SwiftUI List + .swipeActions. I also need a custom row tint (e.g. subtle red/gray highlight based on state). The problem is: If I apply my tint using .background / .clipShape, it moves with the row content during swipe and looks wrong. If I use .listRowBackground(...), I keep the tint, but I don’t get the same rounded-corners “morphing” effect as in Mail (or it looks inconsistent). Question: What’s the correct way in iOS 26 to keep a custom row tint and get the system-style rounded corners / liquid-glass effect while swiping?
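To make the question concrete, here is a sketch of the listRowBackground approach described above (the Item model and tint rule are illustrative). This keeps the tint attached to the row chrome rather than to the content, but it does not reproduce Mail's corner-morphing effect, which is exactly the open question.

```swift
import SwiftUI

// Sketch: custom row tint via listRowBackground alongside swipeActions.
struct Item: Identifiable {
    let id: Int
    let flagged: Bool
}

struct RowList: View {
    let items = (0..<20).map { Item(id: $0, flagged: $0.isMultiple(of: 3)) }

    var body: some View {
        List(items) { item in
            Text("Row \(item.id)")
                .swipeActions(edge: .trailing) {
                    Button(role: .destructive) { } label: {
                        Label("Delete", systemImage: "trash")
                    }
                }
                // Tint stays put during the swipe, unlike .background on content.
                .listRowBackground(
                    (item.flagged ? Color.red : Color.gray).opacity(0.15)
                )
        }
    }
}
```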
Topic: UI Frameworks · SubTopic: SwiftUI
Replies: 0 · Boosts: 0 · Views: 74 · Activity: 2w
Xcode Cloud Signing Error
As mentioned in the linked post, I can archive the project locally but not via Xcode Cloud. I have also created a new project, but the same thing happens there. https://developer.apple.com/forums/thread/746210

Error:

```
ITMS-90035: Invalid Signature. Code failed to satisfy specified code requirement(s). The file at path "{AppName}.app/{AppName}" is not properly signed. Make sure you have signed your application with a distribution certificate, not an ad hoc certificate or a development certificate. Verify that the code signing settings in Xcode are correct at the target level (which override any values at the project level). Additionally, make sure the bundle you are uploading was built using a Release target in Xcode, not a Simulator target. If you are certain your code signing settings are correct, choose "Clean All" in Xcode, delete the "build" directory in the Finder, and rebuild your release target. For more information, please consult https://developer.apple.com/support/code-signing.
```
Replies: 0 · Boosts: 0 · Views: 191 · Activity: 3w
CompileMetalFile Failed With Xcode Cloud
Hello guys, I recently integrated a third-party library (Glur) into my code to handle blur effects. This library leverages Metal's compute capabilities and appears to automatically depend on the Metal toolchain during the build process. When using Xcode Cloud, the archive step consistently fails with a "CompileMetalFile Failed" error. However, when I manually archive the project in Xcode locally, everything works fine without any issues. I've confirmed that the macOS and Xcode versions specified in my Xcode Cloud workflow are correct.

Reply if you know how to fix it, or whether it will be fixed in the future. Thanks.

I'm using Xcode 26.2 (17C52) and macOS 15.7.1 (24G231).
Replies: 0 · Boosts: 0 · Views: 45 · Activity: 2w