The issue is reproducible with an empty project. When you run it and tap Open Immersive Space, it takes a couple of minutes to respond. The issue is only reproducible on a real device with the debugger attached. It is reproducible by other developers too (not specific to my environment). The issue doesn't exist in Xcode 16. After the initial long delay, subsequent opens work fine. Console logs:

nw_socket_copy_info [C1:2] getsockopt TCP_INFO failed [102: Operation not supported on socket]
nw_socket_copy_info getsockopt TCP_INFO failed [102: Operation not supported on socket]
Failed to set dependencies on asset 9303749952624825765 because NetworkAssetManager does not have an asset entity for that id.
void * _Nullable NSMapGet(NSMapTable * _Nonnull, const void * _Nullable): map table argument is NULL
PSO compilation completed for driver shader copyFromBufferToTexture so=0 sbpr=256 sbpi=16384 ss=(64, 64, 1) p=70 sc=1 ds=0 dl=0 do=(0, 0, 0) in 1997
XPC connection interrupted
<<<< FigAudioSession(AV) >>>> audioSess…
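For reference, the tap drives the standard SwiftUI action; a minimal repro sketch, assuming the default "ImmersiveSpace" scene ID from the visionOS app template:

```swift
import SwiftUI

// Minimal repro sketch of the stock visionOS template flow.
// "ImmersiveSpace" is the template's scene ID; match it to your
// ImmersiveSpace declaration in the App.
struct ContentView: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        Button("Open immersive space") {
            Task {
                // With the Xcode 26 debugger attached on device, this await
                // is where the multi-minute stall reportedly occurs.
                _ = await openImmersiveSpace(id: "ImmersiveSpace")
            }
        }
    }
}
```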
Prerequisite: after the MDM app issues the command, the camera on the phone is no longer visible (unusable). After upgrading to iOS 26.1, the isSourceTypeAvailable: method keeps returning true for UIImagePickerControllerSourceTypeCamera even when the camera is unavailable. On iOS 26.0.1 the same call behaves correctly, returning false when the camera is unavailable and true when it is available.
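For context, the check in question maps to this UIKit call; a minimal Swift sketch of the Objective-C method named above:

```swift
import UIKit

// Returns whether the device camera is currently usable.
// On iOS 26.0.1 this reflects MDM camera restrictions; on iOS 26.1
// it reportedly keeps returning true even when the camera is disabled.
func isCameraAvailable() -> Bool {
    UIImagePickerController.isSourceTypeAvailable(.camera)
}
```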
I am using an iPhone 16 Pro running iOS 26.1 (23B5064e). In Settings > Camera > Formats, under "ProRAW & Resolution Control", I am only getting "ProRAW Format" (JPEG Lossless Most Compatible). Despite numerous attempts, I am unable to find the option for "ProRAW Max (up to 48 MP)" or "HEIF Max (up to 48 MP)". I also cannot find the "RAW Max" option in the Camera app, only the "RAW" option. Please help, I desperately want ProRAW Max. Thanks a lot in advance… 🙏🏻
Topic: App & System Services, SubTopic: General
I am having a rare crash when calling the FileHandle(forWritingTo:) initializer with a file that does not exist. The documentation says the expected behaviour is to return nil, but in my app, in rare cases, I get a crash.

You're passing in a URL object that's already been released. The key clue is here:

Exception Subtype: KERN_INVALID_ADDRESS at 0x00000000deadbeef
Exception Codes: 0x0000000000000001, 0x00000000deadbeef

The address 0xdeadbeef isn't accidental. It's a sentinel value that the system commonly uses when it wants to set a pointer to something that's recognizably invalid. In this particular case, CFURL's* deallocate function sets the pointer that points to the actual string bytes to 0xdeadbeef just before it returns. Your app crashes like this the next time you try to use that URL object. In terms of tracking the issue down, testing with ASan or the Zombies instrument are your best options.

*NSURL and CFURL share their underlying implementation. As an aside, and not directly releva…
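For reference, a minimal sketch of the documented failure path, using a hypothetical file path; when the URL object itself is valid, a missing file surfaces as an error (nil in Objective-C), not a crash:

```swift
import Foundation

// Hypothetical path used purely for illustration.
let url = URL(fileURLWithPath: "/tmp/does-not-exist.txt")

do {
    // The failable Objective-C initializer is imported into Swift as throwing:
    // a missing file produces an NSError, not a crash.
    let handle = try FileHandle(forWritingTo: url)
    handle.closeFile()
} catch {
    // Expected outcome for a nonexistent file.
    print("Failed to open file handle: \(error)")
}
```

If instead this call crashes at 0xdeadbeef, the URL was deallocated before the call, which is what ASan or the Zombies instrument will surface.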
Topic: App & System Services, SubTopic: Core OS
On an iPhone with both a physical SIM (as default) and an eSIM (as secondary), contacts that are associated with the eSIM (secondary line) are being reassociated with the default line when changing phones.
Topic: App & System Services, SubTopic: iCloud & Data
Hi, I am trying to load files from the Apple Vision Pro's storage into a Unity app (using the Apple visionOS XR Plugin, not the PolySpatial package). So far I've tried UnitySimpleFileBrowser and UnityStandaloneFileBrowser (neither is made for the Vision Pro, and neither works there), and then implemented my own naive file browser that at least allows me to view directories visible from the App Sandbox. This is of course very limited: gray folders can't be accessed, and the only three available ones don't contain anything where a user would put files through the Files app. I know that an app can request access to these Files & Folders. So my question is: is there a way to request this access for a Unity-built app at the moment? If yes, what do I need to do? I've looked into the generated Xcode project's Capabilities, but did not find anything related to file access. Any help is appreciated!
I am working on a Gaussian Splatting app using the UnityGaussianSplatting package, building for Apple Vision Pro. I want to be able to load splat (.ply / .splat) files from the Vision Pro's storage into the application.

OK. So the main issue to be aware of here is the difference between "works fine for a developer" and "works well for a user". For development purposes, you can basically just set the two keys (UIFileSharingEnabled / LSSupportsOpeningDocumentsInPlace) and then use your Documents directory as your working storage. There could be issues if you edited or deleted those files while your app was actively running, but you can just avoid those issues... by not doing that. On the other hand, if you're planning to ship this to end users, then you'll need to use things like file coordination to avoid those issues. Similarly, apps that are more viewer oriented (meaning they don't edit the files they're working with) often use an import model where they copy (actually clone, so they don't us…
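For reference, the two keys named above live in the app target's Info.plist of the generated Xcode project (Unity doesn't expose them directly); a minimal sketch:

```xml
<!-- Expose the app's Documents directory in the Files app and Finder file sharing. -->
<key>UIFileSharingEnabled</key>
<true/>
<!-- Allow documents to be opened in place rather than copied on open. -->
<key>LSSupportsOpeningDocumentsInPlace</key>
<true/>
```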
Topic: Spatial Computing, SubTopic: General
Hi everyone, we're developing a Unity project for Apple Vision Pro that connects PSVR2 Sense controllers for advanced interaction and input. We've encountered a major limitation: when the controller is not held close to the designated hand (e.g., resting on a table or held by the non-designated hand), the Sense controller enters a low-power or reduced-update mode. This results in noticeably reduced tracking update frequency and responsiveness until the controller is held again.

For certain use cases, this behavior is undesirable. In our case, it prevents continuous real-time tracking of the controller even when it's stationary or being tracked externally.

Request: please consider exposing an API flag or developer option in ARKit to disable, or optionally delay, the low-power mode when the app requires full-rate updates regardless of proximity or hand-pose detection.
If I trigger the Apple rating modal in an immersive space, it appears on the ground at (0, 0, 0). I need it to appear in front of the user, the way the push notification permission prompt and other permission requests do.
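For context, a minimal sketch of how the rating prompt is typically triggered from SwiftUI (via StoreKit's requestReview environment action); the issue above concerns where the resulting system modal is placed, not the call itself:

```swift
import SwiftUI
import StoreKit

struct RateUsButton: View {
    // System-provided action that asks StoreKit to show the rating prompt.
    @Environment(\.requestReview) private var requestReview

    var body: some View {
        Button("Rate this app") {
            // Inside an immersive space, the resulting modal reportedly
            // appears at the scene origin (0, 0, 0) instead of in front
            // of the user.
            requestReview()
        }
    }
}
```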
When our Bluetooth device is scanned and a connection is initiated through the app on the iPhone 17, the air log shows that the iPhone sends an LL_LENGTH_REQ to execute the Data Length Update Procedure. However, our peripheral does not support the Bluetooth LE Data Length Extension, so it responds with an LL_UNKNOWN_RSP PDU with the UnknownType field set to LL_LENGTH_REQ.

After receiving the LL_UNKNOWN_RSP, the iPhone 17 does not proceed with the subsequent Bluetooth LE service discovery process. The connection is maintained until the peripheral actively disconnects. Once the peripheral disconnects and continues broadcasting Bluetooth signals, the iPhone 17 repeatedly tries to connect to the peripheral and executes the aforementioned process, even if the app has been terminated.

According to the Bluetooth 4.2 core specification ([Vol. 6] Part B, Section 5.1.9), which can be found here: https://www.bluetooth.com/specifications/specs/core-specification-amended-4-2/, the iPhone…
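For context, the app-side sequence that stalls is the standard CoreBluetooth flow; a minimal sketch, with the scan filter left as a placeholder assumption:

```swift
import CoreBluetooth

// Minimal sketch of the app-side flow described above. The stall happens
// below the API surface: after connect(_:), the link-layer length
// negotiation fails and didDiscoverServices never fires.
final class PeripheralConnector: NSObject, CBCentralManagerDelegate, CBPeripheralDelegate {
    private var central: CBCentralManager!
    private var peripheral: CBPeripheral?

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: nil) // placeholder: filter by your service UUID
    }

    func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any], rssi RSSI: NSNumber) {
        self.peripheral = peripheral
        central.stopScan()
        central.connect(peripheral)
    }

    func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
        peripheral.delegate = self
        // With a peripheral that rejects LL_LENGTH_REQ via LL_UNKNOWN_RSP,
        // this discovery reportedly never completes on iPhone 17.
        peripheral.discoverServices(nil)
    }

    func peripheral(_ peripheral: CBPeripheral, didDiscoverServices error: Error?) {
        print("Discovered services: \(peripheral.services ?? [])")
    }
}
```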
Thanks @DTS Engineer for your quick & detailed answer!

"So, my immediate question here is what your larger goal here actually is?"

I am working on a Gaussian Splatting app using the UnityGaussianSplatting package, building for Apple Vision Pro. I want to be able to load splat (.ply / .splat) files from the Vision Pro's storage into the application.

"This won't work. The primary goal of the App Sandbox is to protect user data, which is exactly the kind of data you want access to."

Yeah, that was also what I suspected, hence my question about what the intended way to access files from an app would be.

"apps get access to files through one of two broad mechanisms:"

Thanks for referring to these. I think I stumbled upon them while researching, but did not consider them further yet, because they are solutions that require editing the code in Xcode. Which poses two problems for me:
1 - I don't have a lot of experience with Swift and how Xcode apps are structured.
2 - The Xcode project is generated from…
Topic: Spatial Computing, SubTopic: General
I see in iPhone built-in apps that action sheets are presented as popovers, without arrows, over their originating views. Here is an example in the Messages and Shortcuts apps. In the WWDC 2025 session "Build a UIKit app with the new design", the speaker explains that all you have to do is configure the popover like we do for iPad. Here is the relevant transcript:

14:33 Action sheets on iPad are anchored to their source views. Starting in iOS 26, they behave the same on iPhone, appearing directly over the originating view.

14:46 On the alertController, make sure to set the sourceItem or the sourceView on popoverPresentationController, regardless of which device it's displayed on. Assigning the source view automatically applies the new transitions to action sheets as well! Action sheets presented inline don't have a cancel button, because the cancel action is implicit by tapping anywhere else. If you don't specify a source, the action sheet will be centered, and you will have a cancel button.

iOS 26 p…
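Following the transcript's advice, a minimal sketch, with sortButton as an assumed source view used purely for illustration:

```swift
import UIKit

// Minimal sketch of the transcript's advice: give the action sheet a source,
// and on iOS 26 it presents as an inline popover over that view on iPhone too.
func presentSortOptions(from sortButton: UIButton, in viewController: UIViewController) {
    let alert = UIAlertController(title: nil, message: nil, preferredStyle: .actionSheet)
    alert.addAction(UIAlertAction(title: "Name", style: .default))
    alert.addAction(UIAlertAction(title: "Date", style: .default))
    // No explicit Cancel action needed when presented inline:
    // tapping anywhere else dismisses the sheet.

    // Setting the source opts into the inline popover presentation.
    alert.popoverPresentationController?.sourceView = sortButton

    viewController.present(alert, animated: true)
}
```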
Topic: UI Frameworks, SubTopic: UIKit
I have an iOS app with a QuickLook extension. I also added Apple Vision Pro in the target's General > Supported Destinations section. About one year ago, I was able to run the app on iPhone, iPad and Apple Vision Pro Simulators. Today I tried running it again on Apple Vision Pro with Xcode 26.0.1, but Xcode shows this error:

Try again later. Appex bundle at ~/Library/Developer/CoreSimulator/Devices/F6B3CCA8-82FA-485F-A306-CF85FF589096/data/Library/Caches/com.apple.mobile.installd.staging/temp.PWLT59/extracted/problem.app/PlugIns/problemQuickLook.appex with id org.example.problem.problemQuickLook specifies a value (com.apple.quicklook.preview) for the NSExtensionPointIdentifier key in the NSExtension dictionary in its Info.plist that does not correspond to a known extension point.

I tried again later a couple of times, even after running Clean Build Folder Immediately, without any change. I can reproduce this with a fresh Xcode project to which I add a Quick Look Preview Ext…
I tried some more myself and I was able to solve the issue by adding Apple Vision Pro to the Supported Destinations of the extension. Previously I had only added it to the main app.
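For reference, the NSExtension entry the error complains about looks roughly like this in the appex's Info.plist; the principal class name shown is the Xcode template default, used here as an assumption:

```xml
<key>NSExtension</key>
<dict>
    <!-- The extension point the installer failed to recognize on visionOS
         until the extension target itself listed Apple Vision Pro as a
         supported destination. -->
    <key>NSExtensionPointIdentifier</key>
    <string>com.apple.quicklook.preview</string>
    <key>NSExtensionPrincipalClass</key>
    <string>PreviewViewController</string>
</dict>
```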
Topic: Developer Tools & Services, SubTopic: Xcode
Hello everyone. Can I download iOS 17 on a regular iPhone X, not an XS, but an X? How can I do this using the developer program?
Topic: App & System Services, SubTopic: Core OS