Search results for

“iPhone 16 pro”

80,726 results found

Post · Replies · Boosts · Views · Activity

Reply to TestFlight build visible on macOS but not installable: “Can only be tested on an iOS device” for Mac (Designed for iPad)
Had the same problem: testing an iOS app on macOS via TestFlight, which previously worked, recently stopped working. But I found a setting in App Store Connect, under each testing group, called "Test iPhone and iPad Apps on Apple Silicon Macs". Enabling this setting made testing on macOS work again.
Activity: 1w
Reply to Bug: Wi-Fi Aware (NAN) Subscriber Mode: nwPath.availableInterfaces Does Not Include nan0 Interface After Successful Peer Connection
Thanks for your reply! We have successfully connected an iOS device with a non-iOS device via Wi-Fi Aware and established multiple TCP connections. However, we still have a throughput issue: when the iOS device acts as a Subscriber and sends an HTTP request over an already-established TCP connection to download resources from the non-iOS device, we observe slow download speeds of only around 20 MB/s. We compared iOS system logs and found that when the non-iOS device connects to iOS, the timeslot bitmap used is ff ff 00 00 ff ff, whereas when iOS connects to iOS, all timeslots are used. Could this be the reason for the large difference in transmission rates? Is this caused by configuration, or what leads to the reduced timeslot allocation?
Log 1: non-iOS device connecting to iOS
default 16:33:33.204715+0800 kernel wlan0:com.apple.p2p.nan0: Availability: map 0, channel 6 (20M), timeBmap: 1111 1111 1111 1111 0000 0000 0000 0000 1111 1111 1111 1111 ( ff ff 00 00 ff ff )
default 16:33:33.204792+0800 kernel w
Activity: 1w
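As a quick back-of-envelope reading of the bitmaps quoted in the reply above, under the assumption (not confirmed by any documentation) that each set bit in the availability map is one usable timeslot:

```swift
// Count available timeslots in the two availability maps from the logs.
// Assumption: one set bit == one usable timeslot; this is an illustrative
// reading of the log format, not documented behavior.
let nonIOSMap: [UInt8] = [0xff, 0xff, 0x00, 0x00, 0xff, 0xff]  // non-iOS -> iOS
let allOnMap: [UInt8] = Array(repeating: 0xff, count: 6)       // iOS -> iOS

let available = nonIOSMap.map { $0.nonzeroBitCount }.reduce(0, +)
let total = allOnMap.map { $0.nonzeroBitCount }.reduce(0, +)
print(available, total)  // 32 of 48 slots
```

If that reading is right, the reduced map alone gives roughly a one-third airtime cut, which would not by itself explain a very large throughput gap, so the bitmap may be only part of the story.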
Custom Reports GET API returning 403 Forbidden since March 16, 2026 — POST still works
Hi Apple Developer Community,
Since March 16, 2026, our integration with the Apple Ads Campaign Management API (v5) has been returning 403 Forbidden on all GET requests to the custom-reports endpoint, while POST requests to create reports continue to work without any issues.
Environment:
API Version: v5
Base URL: https://api.searchads.apple.com/api/v5/
Authentication: OAuth 2.0 (Bearer token; token generation works fine)
API User Role: API Account Manager
What's broken:
GET /api/v5/custom-reports/63638557 → 403 Forbidden
The response is raw HTML from Apple's gateway ("403 Forbidden ... Apple"), not a JSON API error. This indicates the request is being blocked at the reverse proxy / infrastructure level and never reaches the API application layer; a proper API-level authorization error would return JSON with messageCode and message fields.
What still works:
POST creates a report successfully:
POST /api/v5/custom-reports → 200 OK
Response: { data: { id: 63638557, name: Impression_Share_Report_2026-03-22
Replies: 1 · Boosts: 0 · Views: 77 · Activity: 1w
ARSession Error: Required sensor failed
Hi everyone, I’m currently using the RoomPlan API, which has been working reliably until recently. However, I’ve started encountering an intermittent error and I’m trying to understand what might be causing it. The error is triggered in the ARSession observer method:
session(_ session: ARSession, didFailWithError error: Error)
It has occurred on at least two devices: iPhone 14 Pro and iPhone 17 Pro. Here’s the full error message:
ARSession failed domain=com.apple.arkit.error code=102 desc=Required sensor failed. userInfo=[NSLocalizedFailureReason: A sensor failed to deliver the required input., NSUnderlyingError: Error Domain=AVFoundationErrorDomain Code=-11819 Cannot Complete Action UserInfo={NSLocalizedDescription=Cannot Complete Action, NSLocalizedRecoverySuggestion=Try again later.}, NSLocalizedDescription: Required sensor failed.]
This seems to indicate that a required sensor (likely LiDAR or camera) failed to provide input, but I’m not sure what’s causing it or why it happ
Replies: 0 · Boosts: 0 · Views: 90 · Activity: 1w
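For anyone hitting the same error: a minimal sketch, with illustrative names (not an official recipe), of catching the sensorFailed case in the delegate and retrying the session once. The "Try again later" recovery suggestion in the underlying AVFoundation error hints the condition may be transient:

```swift
import ARKit

// Sketch: treat ARError.sensorFailed (code 102) as potentially transient
// and rerun the session after a short delay. `RetryingSessionDelegate`
// and the one-shot retry policy are illustrative, not from the post.
final class RetryingSessionDelegate: NSObject, ARSessionDelegate {
    private var didRetry = false

    func session(_ session: ARSession, didFailWithError error: Error) {
        guard let arError = error as? ARError, arError.code == .sensorFailed,
              !didRetry else { return }
        didRetry = true
        // The camera/LiDAR pipeline may recover (e.g. after another process
        // releases the camera), so retry once after a pause.
        DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
            session.run(ARWorldTrackingConfiguration(),
                        options: [.resetTracking, .removeExistingAnchors])
        }
    }
}
```

Note this is plain ARKit; with RoomPlan specifically you would stop and restart the RoomCaptureSession rather than rerun the ARSession yourself.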
Reply to EASession(accessory:forProtocol:) always returns nil — MFI accessory iAP2
Hi Kevin, Thank you for the troubleshooting steps. To clarify the current state of the environment: Testing Environment: I am testing exclusively on a physical iOS device (iPhone [Model] running iOS [Version]); I am not using the simulator for these tests. Framework: The implementation is already built using UIKit. I am managing the accessory lifecycle within a standard UIViewController and AppDelegate structure. Integration Status: Despite being in a pure UIKit environment, I am still facing the issue where [mention the specific symptom, e.g., the accessory is not appearing in the picker / the session fails to open]. Since I have already ruled out SwiftUI lifecycle interference and simulator limitations, are there specific logging categories in Console.app or internal ExternalAccessory states you recommend I monitor to diagnose why the connection is failing?
Topic: App & System Services SubTopic: Hardware Tags:
Activity: 1w
Help with getting info for a WIFI USER EXPERIENCE APP
Hi, I’m working on an app called Wiux (already on Android), but one of my clients has a company with all iPhones, so I need to develop the app for iOS, and I’m facing a huge wall. It’s a proactive Wi-Fi user experience monitor for distributed networks: the idea is that the app sends, every minute, info about connectivity (RSSI, which network, whether it’s on 2.4 GHz or 5 GHz, channel used) and device usage (CPU, RAM, etc.). But I find there is no way to get RSSI (and I really need that data), yet some apps like iWifi or WiFi Probe have that data and it works; I checked the readings against a physical probe and my app on Android, and the values match. I think maybe with NEHotspotHelper I could get the data, but I don’t know how to ask to use it, or whether there exists a dependency for quality monitoring that allows me to access that info. (And probably in the near future I’ll face the same problem with LTE, which I’m also monitoring with the app on Android, and I think it’s going to be a problem on iOS.)
Replies: 1 · Boosts: 0 · Views: 39 · Activity: 1w
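On the NEHotspotHelper idea mentioned in the post: it only works with the special com.apple.developer.networking.HotspotHelper entitlement, which Apple grants on request, and it reports a normalized signal score rather than raw dBm RSSI. A sketch of what reading it might look like (illustrative; without the entitlement, register simply returns false):

```swift
import NetworkExtension

// Illustrative only: requires the Hotspot Helper entitlement, granted by
// Apple on request. Without it, register(...) returns false and the
// handler never runs.
let ok = NEHotspotHelper.register(
    options: [kNEHotspotHelperOptionDisplayName: "Wiux" as NSObject],
    queue: DispatchQueue(label: "wiux.hotspot")
) { command in
    guard command.commandType == .filterScanList else { return }
    for network in command.networkList ?? [] {
        // signalStrength is a normalized 0.0...1.0 score, not dBm.
        print(network.ssid, network.bssid, network.signalStrength)
    }
}
print("registered:", ok)
```

So even if the entitlement is granted, the values will not be directly comparable to the dBm readings from the Android build or a physical probe.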
Reply to Cannot find devices in RemoteImmersiveSpace
Hi @Vision Pro Engineer, after testing that template a few times, I was able to see my app on the 'My Devices' tab. However, I seem to be stuck in the 'Waiting' state now. I've attached the logs. Thanks!
Warning: -[NSWindow makeKeyWindow] called on windowNumber=7505 which returned NO from -[NSWindow canBecomeKeyWindow].
((processConfiguration != nil && configuration != nil) || (processConfiguration == nil && configuration == nil)) - /AppleInternal/Library/BuildRoots/4~CKMVugC7xryZ7g5JjsaPBH1f25q2BM2pc57nT-M/Library/Caches/com.apple.xbs/TemporaryDirectory.kDsM7Z/Sources/ExtensionKit/ExtensionKit/Source/HostViewController/Internal/EXHostSessionDriver.m:80: `processConfiguration` and `configuration` must be both non-nil or both nil
Unable to obtain a task name port right for pid 638: (os/kern) failure (0x5)
Topic: Spatial Computing SubTopic: General Tags:
Activity: 1w
Reply to Why is the Documentation full of Conundrums?
@SymphOmni It seems you didn't understand the goal of the exercise? The purpose is to let you discover, through practice, the basic Swift concepts; in this case dictionaries, a very important type in Swift. Here you may have noted that interestingNumbers is a dictionary: a series of entries, which are pairs of a key (a string) and a value (here an array), such as "Prime": [2, 3, 5, 7, 11, 13]. The example shows that in the for loop you explore the dictionary but don't need to know what the key is (because you don't use it). As dark-as showed, if you need to use the key, change _ into a name such as dataset, or key, or whatever you want. In another exercise, you could also iterate over the dictionary like this:
let interestingNumbers = [
    "Prime": [2, 3, 5, 7, 11, 13],
    "Fibonacci": [1, 1, 2, 3, 5, 8],
    "Square": [1, 4, 9, 16, 25]
]
var largest = 0
var largestSet = ""
for aLine in interestingNumbers {  // get entries of the dictionary as a whole
    for number in aLine.value {    // value is the array, for each entry
        if number
Activity: 1w
Reply to RealityKit fill the background environment
Thank you so much, Albert, for your explanation. It helped me a lot. I did as you suggested and the performance increased significantly. Right now I have 3 active chunks with a total of ~468,000 triangles (bushes, leaves, trunk, road and terrain). Do you think this is a lot for a scene? When I take a screenshot of the app, the application's frame rate drops by half. What might be the reason for that? My second question: do you know of any tool that converts EntityModels (loaded from Reality Composer Pro) into LowLevelMesh? The trees and bushes that I created using LowLevelMesh don't look professional. (e.g. I would like to see the leaves in a little more detail, or make the branches more edgy.)
Topic: Graphics & Games SubTopic: RealityKit Tags:
Activity: 1w
Reply to Why is the Documentation full of Conundrums?
They're asking you to give the _ variable a name, and it's entirely up to you what you call it; it's just a variable name. The right-hand side was already given a name, numbers, and it represents the right-hand side of the information in the interestingNumbers dictionary, i.e. the arrays of numbers. The underscore represents the left-hand side of the data, i.e. the three String keys: "Prime", "Fibonacci", and "Square". If you do this, you should see how it works:
let interestingNumbers = [
    "Prime": [2, 3, 5, 7, 11, 13],
    "Fibonacci": [1, 1, 2, 3, 5, 8],
    "Square": [1, 4, 9, 16, 25]
]
var largest = 0
var largestSet = ""
for (dataset, numbers) in interestingNumbers {
    for number in numbers {
        if number > largest {
            largest = number
            largestSet = dataset
        }
    }
}
print("largestSet '\(largestSet)' with value: \(largest)")
Activity: 1w
Reply to Tap to Pay entitlement stuck in development for nearly 1.5 months – do I need to resubmit?
Hi @mensrev, @devraj, You'll need to contact the provisioning team where you made your initial request. Please see the post below for more information: Resolving Tap to Pay on iPhone errors when building for App Store, TestFlight, or Enterprise distribution https://developer.apple.com/forums/thread/794192 Cheers, Paris X Pinkney |  WWDR | DTS Engineer
Activity: 1w
Reply to "WilderDEX!" - Wildlife Collector App - NOW TESTING
"I tried posting this a couple times now but it keeps getting taken down - I am not too sure why. I haven't received any explanation and I am pretty certain that I am posting in the correct place for TestFlight Apps."
It gets taken down because this is absolutely and totally NOT the right place for posts like this. These are the Developer Forums, where third-party developers of apps for Apple's platforms ask for hints and tips on coding issues or the Apple Developer Programme. If you have an issue in your code, ask us and we may be able to help you. If you want us to test your apps, you must know that we have our own apps to develop and test, and don't have the time to test yours. Sorry, that's just the way it is. You can go on the various Mac-/iPhone-related news & rumours forums and ask for people to test your apps, because I can tell you right now, we don't have the time. Sorry.
Activity: 1w
Apple Developer Program subscription is active, but my account is still not activated
Hello, My Apple Developer Program subscription is active in Apple subscriptions and valid until March 19, 2027, but my developer account is still not activated and I have not received the confirmation email. The purchase was completed through the Apple Developer app on my wife’s iPhone, but the enrollment was for my own Apple Account. In the app, I see the message: “You’ll receive an email soon.” Has anyone had the same issue? How was it resolved? Thank you.
Replies: 0 · Boosts: 0 · Views: 143 · Activity: 1w
Error 7000 "Team is not yet configured for notarization" — 6 days, no resolution
I enrolled in the Apple Developer Program as an Individual on March 16, 2026 (Team ID: CAZ8X23YWW). I've been trying to notarize a macOS Electron desktop app ever since. Every submission is immediately rejected with:
Status code: 7000
Message: Team is not yet configured for notarization
What I've done:
Accepted all agreements on developer.apple.com
Accepted all agreements on App Store Connect
Created a Developer ID Application certificate (G2 Sub-CA)
App is properly signed with hardened runtime
Submitted a support ticket under Distribution > Other Distribution Questions on March 18; no response after 4 days
Replies: 1 · Boosts: 0 · Views: 83 · Activity: 1w
Reply to SpeechTranscriber not supported
The 16-core Neural Engine theory lines up with what I have seen in practice on Mac hardware as well. Mac mini M4 (16-core NE) runs SpeechTranscriber and SpeechAnalyzer without issues. M1 devices (also 16-core NE) work too. For the Simulator issue — this is expected unfortunately. SpeechTranscriber relies on the Neural Engine for on-device inference, and the Simulator does not emulate the ANE. The isAvailable check returns false because the underlying model cannot run there. Practical workaround for development: use a conditional compilation check and fall back to SFSpeechRecognizer (the older API) in Simulator builds. SFSpeechRecognizer still works on Simulator and gives you a close-enough approximation for UI development and integration testing. You only need a real device for final accuracy testing. Regarding the 8-core vs 16-core cutoff: my guess is that SpeechTranscriber uses a model size that requires the throughput of a 16-core Neural Engine to meet real-time
Topic: Media Technologies SubTopic: Audio Tags:
Activity: 1w
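The conditional-compilation fallback described above can be sketched like this (makeRecognizer is an illustrative name; the device branch is left as a comment since the SpeechTranscriber setup depends on the rest of the pipeline):

```swift
import Speech

// Simulator builds fall back to SFSpeechRecognizer, since the ANE-backed
// SpeechTranscriber model cannot run in the Simulator.
func makeRecognizer() {
    #if targetEnvironment(simulator)
    let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en_US"))
    print("Simulator fallback, SFSpeechRecognizer available:",
          recognizer?.isAvailable ?? false)
    #else
    // On device: set up the SpeechAnalyzer / SpeechTranscriber pipeline
    // here, after checking its availability for the target locale.
    #endif
}
```

This keeps UI and integration work unblocked in the Simulator while reserving real devices for final accuracy testing, as the reply suggests.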