Search results for

“iPhone 16 pro”

80,732 results found

Post · Replies · Boosts · Views · Activity

Reply to Core Image for depth maps & segmentation masks: numeric fidelity issues when rendering CIImage to CVPixelBuffer (looking for Architecture suggestions)
The problem might be this: Core Image uses 16-bit float RGBA as the default working format. That means that, whenever it needs an intermediate buffer for rendering, it will create a 4-channel 16-bit float surface to render into. This also means that your 1-channel unsigned integer values will automatically be mapped to float values in 0.0...1.0. That's probably where you lose precision.

There are a few options to circumvent this:

You could set the workingFormat context option to .L8 or .R8. However, this means all intermediate buffers will have that format. If you want to mix processing of the segmentation mask with other images, this won't work. If you only want to process the mask separately, you can set up a separate CIContext with this option. Note, however, that most built-in CIFilters assume a floating-point working format and might not perform well with this format.

You can process your segmentation map with Metal (as you suggested) as part of your CIFilter pipeline using a CIImag
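A minimal sketch of the separate-context approach described above. The option and format names are the actual Core Image API; the pixel buffer details are assumptions for illustration:

```swift
import CoreImage

// A dedicated context for single-channel mask processing. .L8 keeps
// intermediate buffers as 8-bit single-channel, avoiding the default
// float16 RGBA round-trip that quantizes integer label values.
let maskContext = CIContext(options: [
    .workingFormat: CIFormat.L8,
    .workingColorSpace: NSNull()   // disable color management for label data
])

// Render the mask into an existing single-channel pixel buffer
// (assumed to be kCVPixelFormatType_OneComponent8).
func renderMask(_ mask: CIImage, into pixelBuffer: CVPixelBuffer) {
    maskContext.render(mask, to: pixelBuffer)
}
```

Keeping this context separate from your main rendering context means the float-assuming built-in filters elsewhere in the pipeline are unaffected.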
Topic: Machine Learning & AI SubTopic: General Tags:
Feb ’26
Reply to BGContinuedProcessingTask GPU access — no iPhone support?
Good question! I hadn't seen the new GPU background entitlements. Good to know! My guess is that they don't want to support iPhone here because they prioritize battery over expensive processing on mobile devices. The iPads, on the other hand, are more targeted towards pro use cases (and have a bigger battery). What we are currently doing in our apps is to prevent the screen from locking during video export (using UIApplication.shared.isIdleTimerDisabled = true). We also pause processing when the app is backgrounded and resume when it's active again.
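A sketch of the keep-awake plus pause/resume pattern described above. The `PausableJob` protocol and controller names are hypothetical; only `isIdleTimerDisabled` and the notification names are the real API:

```swift
import UIKit

// Hypothetical export job with pause()/resume(), for illustration only.
protocol PausableJob { func pause(); func resume() }

final class ExportController {
    private let job: PausableJob
    private var observers: [NSObjectProtocol] = []
    init(job: PausableJob) { self.job = job }

    func beginExport() {
        // Prevent auto-lock while the export runs in the foreground.
        UIApplication.shared.isIdleTimerDisabled = true
        let nc = NotificationCenter.default
        observers.append(nc.addObserver(forName: UIApplication.didEnterBackgroundNotification,
                                        object: nil, queue: .main) { [job] _ in job.pause() })
        observers.append(nc.addObserver(forName: UIApplication.didBecomeActiveNotification,
                                        object: nil, queue: .main) { [job] _ in job.resume() })
    }

    func endExport() {
        UIApplication.shared.isIdleTimerDisabled = false
        observers.forEach(NotificationCenter.default.removeObserver)
    }
}
```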
Topic: Graphics & Games SubTopic: Metal Tags:
Feb ’26
UITrackingElementWindowController crash when viewDidDisappear on iPadOS26.1
Hello, I have been receiving crash reports on iPadOS 26.1 when UITrackingElementWindowController's viewDidDisappear runs. The feedback associated with this post is: FB20986398.

Exception: 'Cannot remove an observer for the key path frame from because it is not registered as an observer.'

#1 0x0000000183529814 in objc_exception_throw ()
#2 0x00000001845065a4 in -[NSObject(NSKeyValueObserverRegistration) _removeObserver:forProperty:] ()
#3 0x00000001845069c8 in -[NSObject(NSKeyValueObserverRegistration) removeObserver:forKeyPath:] ()
#4 0x00000001845068e0 in -[NSObject(NSKeyValueObserverRegistration) removeObserver:forKeyPath:context:] ()
#5 0x00000001cb22e894 in -[PKTextEffectsWindowObserver dealloc] ()
#6 0x000000018beafb28 in _setInteractionView ()
#7 0x000000018d81e8b8 in -[UIView(Dragging) removeInteraction:] ()
#8 0x00000001cb216448 in -[PKTextInputInteraction willMoveToView:] ()
#9 0x000000018beafb1c in _setInteractionView ()
#10 0x000000018d81e8b8 in -[UIView(Dragging) removeInteraction:] ()
#11
Topic: UI Frameworks SubTopic: UIKit Tags:
Replies: 2 · Boosts: 0 · Views: 293
Feb ’26
Reply to Can NWConnection.receive(minimumIncompleteLength:maximumLength:) return nil data for UDP while connection remains .ready?
I agree with what you stated above, and I apologize for not mentioning that we will prepare the datagram while keeping the maximum MTU in mind. The datagram size will be less than the maximum MTU. I had two queries following that: What are the pros and cons of using receive and receiveMessage, and when should each be used? We understand that in the case of receiveMessage, we will only receive nil data if some kind of error has occurred. However, if we use receive - in what situations can the data be nil?
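For context on the receive vs. receiveMessage question, a minimal read loop using receiveMessage, which is the message-oriented variant generally suggested for UDP (the logging and loop structure here are illustrative, not prescriptive):

```swift
import Network

// receiveMessage delivers exactly one complete datagram per callback, so for
// UDP it preserves message boundaries. receive(minimumIncompleteLength:maximumLength:)
// is byte-range oriented and is better suited to TCP streams.
func readDatagrams(on connection: NWConnection) {
    connection.receiveMessage { data, _, isComplete, error in
        if let error = error {
            print("receive failed: \(error)")
            return
        }
        if let data = data, !data.isEmpty {
            print("datagram: \(data.count) bytes")
        }
        readDatagrams(on: connection)   // queue the next read
    }
}
```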
Feb ’26
Reply to iOS App terminated by Watchdog (Signal 9) in Background state despite reporting call
Is RNVoipPushNotificationManager.addCompletionHandler causing a delay in the background run loop that triggers the Watchdog? I wouldn't think so, but it depends on what the full crash log shows. If you block inside your didReceiveIncomingPushWith delegate long enough then you can trigger this crash, but the time required is long enough that it doesn't really come up all that often. The more common problem is that an issue in your call reporting logic means that you DID in fact do this: Killing app because it never posted an incoming call to the system after receiving a PushKit VoIP push. Should completion() be called immediately in Swift for the Background state, rather than waiting for VoipPushNotification.onVoipNotificationCompleted in JS? To be honest, the completion handler is largely irrelevant. It was added in iOS 11.0 as part of adding PKPushType.fileProvider support, but it isn't actually part of the call handling process. You should call it as part of general correctness, but failing to call it won'
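A sketch of the ordering being described: report the incoming call to CallKit from inside the push delegate, then call completion(). The class name and handle value are placeholders; the delegate signatures and reportNewIncomingCall call are the real API:

```swift
import PushKit
import CallKit

final class VoIPPushHandler: NSObject, PKPushRegistryDelegate {
    private let provider = CXProvider(configuration: CXProviderConfiguration())

    func pushRegistry(_ registry: PKPushRegistry,
                      didUpdate pushCredentials: PKPushCredentials,
                      for type: PKPushType) { /* send token to your server */ }

    func pushRegistry(_ registry: PKPushRegistry,
                      didReceiveIncomingPushWith payload: PKPushPayload,
                      for type: PKPushType,
                      completion: @escaping () -> Void) {
        let update = CXCallUpdate()
        update.remoteHandle = CXHandle(type: .generic, value: "caller")  // placeholder
        // Reporting the call is what satisfies the PushKit requirement and
        // avoids the watchdog termination; the completion handler does not.
        provider.reportNewIncomingCall(with: UUID(), update: update) { _ in
            completion()
        }
    }
}
```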
Topic: App & System Services SubTopic: Core OS Tags:
Feb ’26
CoreText crash on iOS 26.0 Simulator (Xcode 26.2) when rendering string with zero-width non-joiner and combining marks
Environment: Xcode 26.2, Simulator: iOS 26.0 / iPhone 17

Summary: Assigning a specific Unicode string to a UILabel (or any UITextView / text component backed by CoreText) causes an immediate crash. The string contains a visible base character followed by a zero-width non-joiner and two combining marks.

let label = UILabel()
label.text = "\u{274D}\u{200C}\u{1CD7}\u{20DB}" // Crash in CoreText during text layout

Crash stack trace: The crash occurs inside CoreText's glyph layout/shaping pipeline. The combining marks U+1CD7 and U+20DB appear to stack on the ZWNJ (which has no visible glyph), causing CoreText to fail during run shaping or bounding box calculation.

Questions:
Is this a known CoreText regression in the iOS 26.0 simulator?
Is there a recommended fix or a more targeted workaround beyond stripping zero-width Unicode characters?
Will this be addressed in an upcoming update
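For reference, a sketch of the blunt stripping workaround the post mentions wanting to improve on. The set of scalars to strip is an assumption; extend it as needed:

```swift
import Foundation

// Zero-width scalars commonly stripped before handing untrusted text to
// UIKit/CoreText: ZWSP, ZWNJ, ZWJ, word joiner. This list is an assumption,
// not an exhaustive or recommended set.
let zeroWidthScalars: Set<Unicode.Scalar> = ["\u{200B}", "\u{200C}", "\u{200D}", "\u{2060}"]

func strippingZeroWidth(_ s: String) -> String {
    String(String.UnicodeScalarView(s.unicodeScalars.filter { !zeroWidthScalars.contains($0) }))
}

// Applied to the crashing string, this drops the ZWNJ, leaving the base
// character with its combining marks.
```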
Topic: UI Frameworks SubTopic: General
Replies: 3 · Boosts: 0 · Views: 151
Feb ’26
Reply to Can I trigger AudioRecordingIntent from a bluetooth device
I have a BLE device which my app connects to and can detect button presses. On a button press, I want my app to start recording using the AudioRecordingIntent. But my app doesn't work and throws a background error. Is there any reliable way I can get the app to start recording audio in the background? No, at least not within the general audio API. There's a privacy block in place that prevents recording sessions from activating in the background. Our voice communication APIs (CallKit, LiveCommunicationKit, and the PushToTalk framework) do allow this, but that's done as part of the system's larger call management architecture and is not something that's possible outside of those APIs. If you're working on some kind of communication app, then you can certainly use one of those frameworks, but they can't be used for any other purpose. I also wanted to check if my app can trigger audio recording from the background if my Bluetooth device supports HFP and is active in that mode. No, this doesn't really change anything,
Topic: App & System Services SubTopic: Hardware Tags:
Feb ’26
Reply to Can we specify a simulator to be used for our app's evaluation?
No. All apps are expected to function correctly on every device, because customers can choose which device to run their apps on. A user can download an iPhone app and have it run in compatibility mode on an iPad, so you have to ensure that it functions correctly on all devices. By stating that your app looks significantly better on iPhones than iPads you are admitting that you haven't written it to function properly when it's run on an iPad. Apps like that tend to get rejected, so spend some time to fix the issues.
Feb ’26
Can we specify a simulator to be used for our app's evaluation?
I'm making my app in Xcode (as an app playground). I understand that the app will be run in a simulator when being evaluated if I choose the Xcode option on the submission form. However, my app looks significantly better on iPhones than on iPads. Is there a way for me to specify which device to use that will be respected by judges? What about device orientation? Thanks.
Replies: 2 · Boosts: 0 · Views: 133
Feb ’26
Reply to CoreText crash on iOS 26.0 Simulator (Xcode 26.2) when rendering string with zero-width non-joiner and combining marks
Hi, thank you for your reply. It is important to work within the exact same environment. I am currently using Xcode 26.2 with the iOS 26.0 simulator on iPhone 17 or iPhone 17 Pro (it should happen on any of them), and I was able to reproduce the issue on other MacBooks as well with the same environment. During our investigation of crash reports, we found that this issue occurs in production on certain iOS versions. Reproducing it locally in the simulator has been the only reliable way to validate the behavior. I would greatly appreciate your assistance with this matter. The problem does not appear to be related to UIKit; I can reproduce it in SwiftUI using Text as well. Unfortunately, the stack trace is not very informative. The crash also occurs when simply passing this specific character to Safari within the simulator. In most cases, the crash can be reproduced by adding the character to a project and building it with the iOS 26.0 simulator using iPhone 17. When running the same project on iOS
Topic: UI Frameworks SubTopic: General
Feb ’26
AirPods 4 Bluetooth Firmware Bug in L2CAP
Hello, I am a Bluetooth engineer at Google investigating an interoperability bug between an Android device and AirPods 4. When requesting an L2CAP connection (with PSM = AVDTP) to the AirPods during SDP service discovery, the AirPods L2CAP layer incorrectly responds with a "refused - no resources available" status followed by a Pending status and a Success status. This violates the specification, which says that the request has been fully rejected after the refused status and should not receive follow-up responses. I suspect the "no resources available" response is a bug. This prevents A2DP from working with the AirPods. This bug does not exist with AirPods 2 firmware.

Here is a packet capture:

1602 1969-12-31 16:07:04.805261 0.062473 localhost () Apple_6b:db:09 (AirPods) L2CAP 17 Sent Connection Request (AVDTP, SCID: 0x22c6)
1603 1969-12-31 16:07:04.810953 0.005692 controller host HCI_EVT 8 Rcvd Number of Completed Packets
1604 1969-12-31 16:07:04.811078 0.000125 Apple_6b:db:09 (AirPods
Replies: 2 · Boosts: 0 · Views: 166
Feb ’26
Abnormally Long “Waiting for Review” (22+ Days) and No Response from App Review
Hello everyone, I am currently experiencing an unusually long review delay along with no response to multiple inquiries, which is very different from my past experience.

Current Situation
• 22 days have passed since the initial submission.
• After more than 10 days with no progress, I canceled and resubmitted the version twice. It is still stuck in “Waiting for Review.”
• I have sent 3 inquiry emails.
• I also had phone support through another team, and they relayed messages to the App Review team twice. There has still been no response.

At this point, I cannot determine whether this is a simple review delay or some kind of account-level hold. From the Developer site, my account appears to be in good standing with no visible warnings or issues. When I asked in other communities about recent review times, most developers reported approvals within 2–3 days, so this does not appear to be a general system-wide delay. Has anyone experienced something similar recently? Is there any more direct or reliable
Replies: 3 · Boosts: 0 · Views: 268
Feb ’26
Reply to macOS 26.4 Dev Beta 2 Install Fails
I have the same issue on MacBook Pro M4 Pro
Topic: Community SubTopic: Apple Developers Tags:
Feb ’26
Reply to Unusually Long "Waiting for Review" Times This Week - Anyone Else?
My app has been in “Waiting for Review” status for almost a month. I have sent 3 emails with no response, and I talked to a rep on the phone 2 weeks ago who said they would expedite it. Still no change.
Feb ’26
Reply to Developer Program enrollment still pending after payment
I believe the phone number needs to be a US number, and mine is not. I am considering using Twilio to receive calls through a US phone number.
Feb ’26