visionOS


Discuss developing for spatial computing and Apple Vision Pro.

Posts under visionOS tag

200 Posts

Post | Replies | Boosts | Views | Activity

Main Camera Access Entitlement Bug
Hello everyone, can you help me? I requested the Enterprise API for main camera access and received the license file, set up Apple's main camera access demo project with my new license, and created an app bundle and identifier for it. But when I try to deploy it to TestFlight I get errors saying "Profile doesn't support Main Camera Access" and "Profile doesn't include the com.apple.developer.arkit.main-camera-access.allow entitlement", even though I have added the additional Main Camera Access capability in Certificates, Identifiers & Profiles. Can you help me fix this so that I can use the Main Camera Access entitlement? (A runtime-check sketch follows this entry.)
5
0
145
Jul ’25
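A minimal runtime-check sketch for the entry above, assuming the provisioning profile eventually carries the entitlement: it asks ARKit whether main camera access is supported and authorized. CameraFrameProvider and the .cameraAccess authorization type are the Enterprise API pieces this entitlement gates; the function name and printed messages are illustrative, not official output.

import ARKit

// Hypothetical runtime check; function name and message strings are illustrative.
func checkMainCameraAccess() async {
    // CameraFrameProvider is the ARKit data provider gated by the
    // com.apple.developer.arkit.main-camera-access.allow entitlement.
    guard CameraFrameProvider.isSupported else {
        print("Main camera access is not supported on this device or OS")
        return
    }
    let session = ARKitSession()
    let statuses = await session.requestAuthorization(for: [.cameraAccess])
    if statuses[.cameraAccess] == .allowed {
        print("Main camera access entitlement is active and authorized")
    } else {
        // A profile without the entitlement typically never reaches .allowed.
        print("Main camera access not authorized: \(String(describing: statuses[.cameraAccess]))")
    }
}

If authorization never reaches .allowed on a TestFlight build, the profile embedded in that build most likely still lacks the com.apple.developer.arkit.main-camera-access.allow entitlement.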
TipKit for visionOS (TabView, etc.)
I am trying to create a user flow that guides the user through navigating my app. I want to add a tip on a TabView that prompts the user to navigate to a specific tab. I have seen this work properly on iOS, but I am a little lost because visionOS is not responding the same way to .popoverTip etc. Any guidance is appreciated! (A minimal setup sketch follows this entry.)
0
0
174
Jul ’25
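A minimal sketch of the setup the post describes, for comparison with the iOS behavior; the tip type, tab names, and labels are hypothetical, and whether .popoverTip presents on a tab label under visionOS is exactly the open question.

import SwiftUI
import TipKit

// Hypothetical tip prompting the user toward a specific tab.
struct GoToLibraryTip: Tip {
    var title: Text { Text("Browse your Library") }
    var message: Text? { Text("Open the Library tab to see your saved items.") }
}

struct RootTabs: View {
    private let goToLibraryTip = GoToLibraryTip()

    var body: some View {
        TabView {
            Text("Home")
                .tabItem { Label("Home", systemImage: "house") }

            Text("Library")
                .tabItem {
                    // On iOS this popover anchors to the tab item;
                    // whether visionOS presents it here is the question above.
                    Label("Library", systemImage: "books.vertical")
                        .popoverTip(goToLibraryTip)
                }
        }
        .task {
            // Tips must be configured once per launch before any tip can show.
            try? Tips.configure()
        }
    }
}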
visionOS Window launch size
Before visionOS Beta 4 it was possible to define the launch size in the Info.plist using PreferredLaunchSize, like so:

<key>UILaunchPlacementParameters</key>
<dict>
    <key>PreferredLaunchSize</key>
    <dict>
        <key>Height</key>
        <integer>750</integer>
        <key>Width</key>
        <integer>750</integer>
    </dict>
</dict>

In visionOS Beta 4 this no longer works: the window opens in a 16:9 format and then scales down to the .defaultSize of the WindowGroup with an animation. Settings, Notes, and Safari still open with a different default size, though, including the launch screen. How are we supposed to do this now? (A scene-modifier sketch follows this entry.)
2
1
1.1k
Jul ’25
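For reference, a minimal sketch of the code-side sizing mentioned in the post; .defaultSize and .windowResizability are standard SwiftUI scene modifiers, but whether they reproduce the old PreferredLaunchSize launch behavior in Beta 4 is the open question.

import SwiftUI

@main
struct LaunchSizeApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        // Requested initial window size in points; visionOS treats this as a hint.
        .defaultSize(width: 750, height: 750)
        // Optionally constrain resizing to the content's ideal size.
        .windowResizability(.contentSize)
    }
}

struct ContentView: View {
    var body: some View {
        Text("Hello, visionOS")
            .frame(minWidth: 400, minHeight: 400)
    }
}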
Personas and Lighting in a visionOS 26 (or earlier) SharePlay Session
Use case: When SharePlaying a fully immersive 3D scene (e.g. a virtual stage), I would like to shine lights on specific Personas so they show up brighter when someone in the scene is recording the feed (think of a camera person in the scene wearing a Vision Pro). Note: this spotlight effect only needs to render in the camera person's headset and does NOT need to be journaled or shared. Before I dive into this, my technical question: can environmental and/or scene lighting affect Persona brightness in a SharePlay session? If not, is there a way to programmatically make Personas "brighter" when recording? My screen recordings always seem to turn out darker than what is rendered in the environment, and manually adjusting the contrast tends to blow out the details in a Persona's face (especially in visionOS 26). (A lighting sketch follows this entry.)
0
0
64
Jun ’25
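A minimal sketch of the kind of local-only spot light the post describes, using RealityKit's SpotLightComponent; the helper name, placement, and intensity are hypothetical, and whether spatial Personas respond to scene lighting at all is precisely the unanswered question above.

import RealityKit

// Hypothetical: a local-only spot light aimed at where a Persona appears.
// This entity lives only in the camera person's own scene and is never shared.
func makePersonaSpotlight(target: SIMD3<Float>) -> Entity {
    let light = Entity()
    light.components.set(SpotLightComponent(
        color: .white,
        intensity: 6000,            // tune for the environment
        innerAngleInDegrees: 30,
        outerAngleInDegrees: 45,
        attenuationRadius: 4.0
    ))
    // Place the light above and in front of the target, aimed at it.
    let lightPosition = target + SIMD3<Float>(0, 1.0, 0.5)
    light.look(at: target, from: lightPosition, relativeTo: nil)
    return light
}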
ShaderGraphMaterial.getParameter(handle:) always returns nil, is this expected behavior?
I've loaded a ShaderGraphMaterial from a RealityKit content bundle and I'm attempting to access the initial values of its parameters using getParameter(handle:), but this method appears to always return nil:

let shaderGraphMaterial = try await ShaderGraphMaterial(named: "MyMaterial", from: "MyFile")

let namedParameterValue = shaderGraphMaterial.getParameter(name: "myParameter")
// This prints the value of the `myParameter` parameter, as expected.
print("namedParameterValue = \(namedParameterValue)")

let handle = ShaderGraphMaterial.parameterHandle(name: "myParameter")
let handleParameterValue = shaderGraphMaterial.getParameter(handle: handle)
// Expected behavior: prints the value of the `myParameter` parameter, as above.
// Observed behavior: prints `nil`.
print("handleParameterValue = \(handleParameterValue)")

Is this expected behavior? Based on the documentation at https://developer.apple.com/documentation/realitykit/shadergraphmaterial/getparameter(handle:) I'd expect getParameter(handle:) to return the value of the parameter, just as getParameter(name:) does. I've tested this on iOS 18.5 and iOS 26.0 beta 2. Assuming this getParameter(handle:) behavior is as designed, is the following ShaderGraphMaterial extension an appropriate workaround, or can you recommend a better approach? Thank you.

public extension ShaderGraphMaterial {
    /// Reassigns the values of all named material parameters using the handle-based API.
    ///
    /// This works around an issue where, at least as of RealityKit 26.0 beta 2 and
    /// earlier, `getParameter(handle:)` will always return `nil` when used to read the
    /// initial value of a shader graph material parameter read using
    /// `ShaderGraphMaterial(named:from:in:)`, whereas `getParameter(name:)` will work
    /// as expected.
    private mutating func copyNamedParametersToHandles() {
        for parameterName in self.parameterNames {
            if let value = self.getParameter(name: parameterName) {
                let handle = ShaderGraphMaterial.parameterHandle(name: parameterName)
                do {
                    try self.setParameter(handle: handle, value: value)
                } catch {
                    assertionFailure("Cannot set parameter value")
                }
            }
        }
    }
}
1
0
313
Jun ’25
visionOS 26 - threadsPerThreadgroup limit causing crash on device (but not in simulator)
Hi all, I'm running into an issue with an app that previously worked fine on device using visionOS 2.0. After updating to visionOS 26, the same code runs fine in the simulator but crashes on the device with the following error:

-[MTLDebugComputeCommandEncoder _validateThreadsPerThreadgroup:]:1330: failed assertion `(threadsPerThreadgroup.width(32) * threadsPerThreadgroup.height(32) * threadsPerThreadgroup.depth(1))(1024) must be <= 832. (kernel threadgroup size limit)`

Is there any documented way to check or increase the allowed threadsPerThreadgroup size on Apple Vision Pro? Or any recommended workaround for this regression? Thanks in advance! (A query sketch follows this entry.)
3
0
131
Jun ’25
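A minimal sketch of the documented way to check the limit at runtime: query the compute pipeline state's maxTotalThreadsPerThreadgroup (the 832 in the assertion) and size the threadgroup from it instead of hard-coding 32 × 32. The function and kernel names are placeholders.

import Metal

// Hypothetical setup; "myKernel" stands in for the app's compute function.
func makeThreadgroupSize(device: MTLDevice, library: MTLLibrary) throws -> (MTLComputePipelineState, MTLSize) {
    guard let function = library.makeFunction(name: "myKernel") else {
        fatalError("Kernel not found")
    }
    let pipeline = try device.makeComputePipelineState(function: function)

    // The per-kernel limit the assertion is enforcing (832 on this device/OS).
    let maxTotal = pipeline.maxTotalThreadsPerThreadgroup
    let width = pipeline.threadExecutionWidth      // e.g. 32
    let height = max(1, maxTotal / width)          // stay under the limit

    let threadsPerThreadgroup = MTLSize(width: width, height: height, depth: 1)
    return (pipeline, threadsPerThreadgroup)
}

The returned size can then be passed to dispatchThreads(_:threadsPerThreadgroup:) or dispatchThreadgroups on the compute command encoder, so the dispatch adapts if the limit differs between simulator and device.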
WebXR and PSVR2
We got very excited when we saw support for the PSVR2 controllers at WWDC! WebXR is particularly interesting to us, so we got the controllers to give it a try. Unfortunately they only register as gamepads in the navigator, but not as XRInputDevices. We went through the experimental flags and didn't find anything directly related. Is there a flag we missed? If not, when is PSVR2 support planned for WebXR?
1
5
197
Jun ’25
Can we access a "Locked in Place" value when a window has been locked without being snapped to a surface?
Starting in visionOS 26, users can snap windows to surfaces. These windows are locked in place and are later restored by visionOS. We can access the snapping data via surfaceSnappingInfo (see the docs). Users can also lock a free-floating (unsnapped) window from a context menu in the window controls. Is there a way to tell when a window has been locked without being snapped to a surface? (An environment-read sketch follows this entry.)
7
3
150
Jun ’25
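For context, a minimal sketch of reading the snapping state the post refers to. The surfaceSnappingInfo environment value comes from the docs mentioned above, but the isSnapped property used here is an assumption, and nothing in this sketch surfaces the separate "locked in place without snapping" state the post is asking about.

import SwiftUI

struct SnapAwareView: View {
    // Assumed environment value from the visionOS 26 surface-snapping API.
    @Environment(\.surfaceSnappingInfo) private var snappingInfo

    var body: some View {
        // isSnapped is assumed; it only reports surface snapping, not the
        // free-floating "Locked in Place" state set from the window controls.
        Text(snappingInfo.isSnapped ? "Snapped to a surface" : "Not snapped")
    }
}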
Apple Vision Pro Developer Strap: Video Out and Recording Time Limit
Is there any way to extend the video recording time in Reality Composer Pro from 3:00 to a longer value, such as by editing preferences in Terminal or some other workaround? Is there any way to use the strap and a USB-C cable as a live video stream input source that would mirror to QuickTime or some other video capture tool? I am assuming there is no online documentation or user manual for the strap, but please correct me if I'm wrong. Thank you.
1
0
97
Jun ’25
How to play blend shape or morph animations exported from Blender in Vision Pro apps using RealityKit
So I am exporting a .usdc file from Blender that already has some morph animations. The animations play well in Blender, but after export I cannot seem to play them in RealityKit or Reality Composer Pro: Entity.availableAnimations is an empty array, and none of the child objects in the entity hierarchy has an animation library component. Maybe I am exporting it wrong; I tried multiple combinations of export settings, but none seem to work. (My Blender export settings were attached as a screenshot.) The original file I purchased is an FBX file that has the animation, but when I convert it directly in Reality Converter it doesn't seem to play the animations either. (A diagnostic sketch follows this entry.)
2
0
128
Jun ’25
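A minimal diagnostic sketch for the situation described: load the exported USD, walk the entity hierarchy, report any animations found, and play the first one. The file name "MorphTest" is a placeholder; if availableAnimations stays empty at every level, the issue is in the export or conversion rather than in playback code.

import RealityKit

// Hypothetical diagnostic; "MorphTest" stands in for the exported USD file name.
func inspectAnimations() async throws {
    let root = try await Entity(named: "MorphTest")
    reportAndPlayAnimations(of: root, indent: "")
}

func reportAndPlayAnimations(of entity: Entity, indent: String) {
    print("\(indent)\(entity.name): \(entity.availableAnimations.count) animation(s)")

    // Play the first animation found so it is visible in the scene.
    if let animation = entity.availableAnimations.first {
        _ = entity.playAnimation(animation, transitionDuration: 0.3, startsPaused: false)
    }
    for child in entity.children {
        reportAndPlayAnimations(of: child, indent: indent + "  ")
    }
}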
How to load a USDZ inside a Reality Composer Pro package as a ModelEntity in RealityKit?
I need a MeshResource from a ModelEntity to generate a box collider, but ModelEntity fails to load USDZ files from the Reality Composer Pro (RCP) bundle. This code works for loading an Entity:

// Successfully loads as generic Entity
previewEntity = try await Entity(named: fileName, in: realityKitContentBundle)

But this fails when trying to load as ModelEntity:

// Fails to load as ModelEntity
modelEntity = try await ModelEntity(named: fileName, in: realityKitContentBundle)

I found this thread mentioning: "You'll likely go from USDZ to Entity which contains a MeshResource when you load/init the USDZ file." But it doesn't explain how to actually extract the MeshResource. Could anyone advise how to properly load USDZ files as ModelEntity from RCP bundles, how to extract a MeshResource from a successfully loaded Entity, and alternative approaches to generate box colliders if direct extraction isn't possible? Sample code for extraction or workarounds would be greatly appreciated! (A sketch follows this entry.)
1
0
134
Jun ’25
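A minimal sketch under the assumption that the USDZ's geometry lives in a ModelComponent somewhere in the loaded hierarchy: search the descendants for a ModelComponent to reach its MeshResource, and fall back to a box collider built from the entity's visual bounds. The helper names are hypothetical, not a confirmed answer.

import RealityKit

// Find the first ModelComponent (and therefore its MeshResource) in the hierarchy.
func firstModelComponent(in entity: Entity) -> ModelComponent? {
    if let model = entity.components[ModelComponent.self] {
        return model
    }
    for child in entity.children {
        if let model = firstModelComponent(in: child) {
            return model
        }
    }
    return nil
}

// Attach a box collider sized from the entity's visual bounds.
func addBoxCollider(to entity: Entity) {
    let bounds = entity.visualBounds(relativeTo: entity)
    let shape = ShapeResource.generateBox(size: bounds.extents)
        .offsetBy(translation: bounds.center)
    entity.components.set(CollisionComponent(shapes: [shape]))
}

Usage, following the loading code in the post: after previewEntity = try await Entity(named: fileName, in: realityKitContentBundle), read firstModelComponent(in: previewEntity)?.mesh, or call addBoxCollider(to: previewEntity) if only the box collider is needed.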
Extracting IP with Swift on visionOS
Hey everyone, I'm developing an app for visionOS where I need to display the Apple Vision Pro's current IP address. I'm using the following code, which works on iOS, macOS, and visionOS in the simulator; only on a real Apple Vision Pro is it unable to extract an IP. Could it be that visionOS currently doesn't allow this? Have any of you had the same experience and found a workaround? (An alternative sketch follows this entry.)

var address: String = "no ip"
var ifaddr: UnsafeMutablePointer<ifaddrs>? = nil
if getifaddrs(&ifaddr) == 0 {
    var ptr = ifaddr
    while ptr != nil {
        defer { ptr = ptr?.pointee.ifa_next }
        let interface = ptr?.pointee
        let addrFamily = interface?.ifa_addr.pointee.sa_family
        if addrFamily == UInt8(AF_INET) {
            if let name: Optional<String> = String(cString: (interface?.ifa_name)!), name == "en0" {
                var hostname = [CChar](repeating: 0, count: Int(NI_MAXHOST))
                getnameinfo(interface?.ifa_addr, socklen_t((interface?.ifa_addr.pointee.sa_len)!),
                            &hostname, socklen_t(hostname.count),
                            nil, socklen_t(0), NI_NUMERICHOST)
                address = String(cString: hostname)
            }
        }
    }
    freeifaddrs(ifaddr)
}
return address
}

Thanks in advance for any insights or tips! Best Regards, David
2
1
126
Jun ’25
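A hedged alternative rather than a confirmed fix: one common workaround is to open a throwaway UDP connection with the Network framework and read the local endpoint once the path is ready, instead of walking getifaddrs. Whether a real Apple Vision Pro reports a usable local endpoint this way is untested here; the helper name and remote address below are only examples.

import Network

// Hypothetical helper: learns the device's local IP by opening a UDP
// "connection" (no packets are sent) and reading the chosen local endpoint.
func fetchLocalAddress(completion: @escaping (String?) -> Void) {
    let connection = NWConnection(host: "1.1.1.1", port: 53, using: .udp)
    connection.stateUpdateHandler = { state in
        switch state {
        case .ready:
            if case let .hostPort(host, _)? = connection.currentPath?.localEndpoint {
                completion("\(host)")
            } else {
                completion(nil)
            }
            connection.cancel()
        case .failed(_), .cancelled:
            completion(nil)
        default:
            break
        }
    }
    connection.start(queue: .global())
}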
Please provide access to face tracking blend shapes on visionOS
Apple, please provide access to face tracking blend shapes on visionOS, just like you do on iOS. You have the best eye and face tracking implementation on the market; please let us use it. There is a sizable audience who will buy the headset just for this. I personally know multiple people who are not buying the headset simply because you locked those features out. No raw camera access is needed, just abstracted blend shape values. You will make the headset so much more useful if you do this simple thing.
1
1
91
Jun ’25
WebXR Consent Dialog
Based on the "Build immersive web experiences with WebXR"-Video for visionOS there is no way to disable the consent prompts for entering an immersive experience or consent hand-tracking. For the microphone it's possible to "greenlight" specific websites for mic input, which works great. I'd welcome it, if it were possible to add specific websites in the settings, in which those consent dialogs aren't shown each time. In my opinion, the user interaction through a button that launches the experience would be sufficient to not disorient.
0
1
66
Jun ’25