visionOS

Discuss developing for spatial computing and Apple Vision Pro.

Posts under visionOS tag

200 Posts

Looking for a visionOS collaborator (brief + feedback for spatial prototype)
Hi everyone, I’m a mixed reality designer currently building visionOS experiences (RealityKit + SwiftUI), and I’m working on a course at Arizona State University where we develop a project in response to a real-world client. I’m looking to collaborate with someone in the visionOS ecosystem who would be open to sharing a simple prompt or brief (nothing confidential), which I can use to create a proposal response.

The outcome would include:
- Spatial concept + storyboard
- Interaction design direction
- Rough scope, timeline, and team structure
- Optional small prototype in visionOS

The goal on my side is to better understand how spatial work is framed and communicated in real-world scenarios, especially within Apple’s ecosystem. The ask is very lightweight: 1–2 short check-ins during April, or async feedback via email.

A bit about my work:
- Built Spatial Synesthesia for Apple Vision Pro: an eye-tracking interaction that isolates instruments in a live orchestral mix based on color perception
- Working with computer vision and spatial anchoring in Unity and visionOS
- Focused on perception-driven interaction and minimal UI systems

If you’re a developer, designer, or part of a small studio working in visionOS and this sounds interesting, I’d love to connect. Thanks in advance. Leo

Replies: 0 · Boosts: 0 · Views: 36 · Activity: 7h

How to renew visionOS Enterprise API entitlement / license?
Hi everyone, I’m currently using the visionOS Enterprise APIs, and I noticed in the official documentation that the entitlement needs to be renewed periodically. However, I couldn’t find any clear instructions on how to actually renew the license.

- What is the correct process to renew a visionOS Enterprise API license?
- Do I need to submit a new entitlement request to renew?
- Is there an official step-by-step guide or documentation for renewal?

Any advice or shared experience would be greatly appreciated 🙏 Thank you!

Replies: 1 · Boosts: 0 · Views: 516 · Activity: 5d

Xcode fails to compile Blender-exported USDZ in .rkassets with misleading "permission" error — Xcode 26.3
The error: when building a RealityKitContent package that contains a USDZ file exported from Blender, Xcode throws the following error:

    error: [xrsimulator] Exception thrown during compile: Cannot get rkassets content for path .../RealityKitContent.rkassets because 'The file "RealityKitContent.rkassets" couldn't be opened because you don't have permission to view it.'
    error: Tool exited with code 1

The error message mentions "permission", but permissions are not the issue. This appears to be a misleading error from realitytool masking a USD validation failure.

What I've ruled out:
- File permissions: all files are -rw-r--r--, and the user has Read & Write on the folder
- Extended attributes / quarantine flag: other files with the same @ flag work fine
- Corrupted archive: unzip -t confirms the USDZ is valid (board.usdc + textures)
- Stale build cache: deleted DerivedData and the com.apple.DeveloperTools cache, no change

Key observations:
- The same file builds successfully on my colleague's machine running identical Xcode 26.3 / macOS 26.3.
- Other USDZ files in the same .rkassets bundle (downloaded from Sketchfab, or created in Reality Composer Pro) compile without any issue. Only USDZ files exported directly from Blender are affected.
- When the file is placed in Bundle.main and loaded via Entity(named:in:) with Bundle.main, it works perfectly, with no errors.
- Reality Converter flags the file with two errors: UsdGeomPointInstancers are not allowed, and the root layer must be .usdc with no external dependencies.

The confusing part: the same file compiles fine on an identical Xcode 26.3 setup with the same importing method. This suggests either a machine-specific difference in Xcode's validation behavior, or a cached .reality bundle on my colleague's machine that isn't being recompiled.

Current workaround: loading from Bundle.main instead of the RealityKitContent package bypasses realitytool entirely and works, but loses Reality Composer Pro integration:

    if let entity = try? await Entity(named: "test", in: Bundle.main)
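For anyone hitting the same wall, here is a minimal sketch combining both load paths described above. It assumes the Xcode template's RealityKitContent module and its generated realityKitContentBundle constant; the asset name "test" is taken from the post:

    import RealityKit
    import RealityKitContent  // Xcode template module; adjust to your package name

    /// Tries the Reality Composer Pro package first, then falls back to the
    /// raw USDZ copied into the app bundle, which bypasses realitytool.
    func loadBoardEntity() async -> Entity? {
        if let entity = try? await Entity(named: "test", in: realityKitContentBundle) {
            return entity
        }
        return try? await Entity(named: "test", in: Bundle.main)
    }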
Replies: 1 · Boosts: 0 · Views: 624 · Activity: 6d

Immersive API
After updating to visionOS 26.4, I went to Safari's Feature Flags page to reenable Website environment, and I found the Website environment switch had been moved from the top and into the list of switches below. In its place is "Immersive API". I could not find any documentation on this. Anyone know what it is for or can point me to documentation?
Replies: 1 · Boosts: 0 · Views: 146 · Activity: 1w

Website environment disappears suddenly
After I updated to visionOS 26.4, I noticed my website environment would suddenly turn off occasionally while I was watching YouTube in Safari. My M2 AVP was still warm after the update. Is turning off a website environment expected behavior when the headset gets warm (e.g., perhaps to reduce load)? If not, does anyone have an idea why this might happen?

Replies: 1 · Boosts: 0 · Views: 243 · Activity: 1w

How do you collect eye gaze data from vision pro
Hello, I know that Apple prevents apps from accessing raw gaze data, such as the gaze vector (x, y, z) or eye position. But for those of you who do research on gaze data, how do you collect it from Vision Pro? Is there any app that solves this problem?
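For context, the usual proxy (not raw gaze) is to log where indirect pinches resolve: visionOS targets an indirect pinch at the user's gaze point, so each pinch yields one discrete gaze sample. A minimal sketch using SwiftUI's SpatialEventGesture; the view and state names are placeholders:

    import SwiftUI
    import Spatial

    struct GazeProxyView: View {
        @State private var samples: [Point3D] = []

        var body: some View {
            Color.clear
                .contentShape(Rectangle())
                .gesture(
                    SpatialEventGesture()
                        .onChanged { events in
                            for event in events where event.kind == .indirectPinch {
                                // location3D is where the pinch resolved, which
                                // follows the user's gaze target at pinch time.
                                samples.append(event.location3D)
                            }
                        }
                )
        }
    }

This only produces samples when the user pinches; continuous gaze streaming has no public API.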
Replies: 0 · Boosts: 0 · Views: 99 · Activity: 1w

RealityView content disappears when selecting Lock In Place on visionOS
Hi, I'm experiencing an issue where all RealityView content disappears when the user selects "Lock In Place" from the window management menu (long press on the close button). "Follow Me" works correctly. This happens only in TestFlight builds; it is not reproducible when I run locally. I have reproduced it with a minimal project containing nothing but a simple red cube: no custom anchors, no app state, no dependencies.

Steps to reproduce:
1. Open an ImmersiveSpace. A red cube is placed 1 m in front of the user via RealityView.
2. Long press the X button on any floating window.
3. Select "Lock In Place".
4. The cube disappears immediately.

Expected: cube remains visible after the window is locked.
Actual: cube disappears.

Minimal reproducible code:

    struct ImmersiveView: View {
        var body: some View {
            RealityView { content in
                let cube = ModelEntity(
                    mesh: .generateBox(size: 0.3),
                    materials: [SimpleMaterial(color: .red, isMetallic: false)]
                )
                cube.setPosition(SIMD3<Float>(0, 1.5, -1), relativeTo: nil)
                content.add(cube)
            }
        }
    }

Device: Apple Vision Pro
visionOS version: 26.2 (23N301)
Xcode version: 26.3 (17C529)

Is this a known issue? Is there a recommended workaround to preserve RealityView content during Lock In Place transitions? Thank you!
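A speculative, untested workaround sketch (an assumption, not a confirmed fix): keep the entity in view state and re-attach it whenever RealityView's update closure finds it detached, nudging an update pass on scene-phase changes:

    import SwiftUI
    import RealityKit

    struct ImmersiveView: View {
        @Environment(\.scenePhase) private var scenePhase
        @State private var relayoutTick = 0
        @State private var cube = ModelEntity(
            mesh: .generateBox(size: 0.3),
            materials: [SimpleMaterial(color: .red, isMetallic: false)]
        )

        var body: some View {
            RealityView { content in
                cube.setPosition(SIMD3<Float>(0, 1.5, -1), relativeTo: nil)
                content.add(cube)
            } update: { content in
                _ = relayoutTick  // tie this closure to the tick so it reruns
                if cube.parent == nil {
                    content.add(cube)  // re-attach if the system dropped it
                }
            }
            .onChange(of: scenePhase) { _, _ in
                relayoutTick += 1
            }
        }
    }

Whether "Lock In Place" actually triggers a scene-phase change is unverified; if it doesn't, filing a feedback with the minimal project above is probably the right move.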
Replies: 3 · Boosts: 0 · Views: 487 · Activity: 1w

RealityView content disappears when selecting Lock In Place on visionOS
I'm experiencing an issue where all RealityView content disappears when the user selects "Lock In Place" from the window management menu (long press on the close button). "Follow Me" works correctly. This happens only in TestFlight builds; it is not reproducible when I run locally. I have reproduced it with a minimal project containing nothing but a simple red cube: no custom anchors, no app state, no dependencies.

Steps to reproduce:
1. Open an ImmersiveSpace. A red cube is placed 1 m in front of the user via RealityView.
2. Long press the X button on any floating window.
3. Select "Lock In Place".
4. The cube disappears immediately.

Expected: cube remains visible after the window is locked.
Actual: cube disappears.

Note: "Follow Me" does NOT reproduce this issue.

Minimal reproducible code:

    struct ImmersiveView: View {
        var body: some View {
            RealityView { content in
                let cube = ModelEntity(
                    mesh: .generateBox(size: 0.3),
                    materials: [SimpleMaterial(color: .red, isMetallic: false)]
                )
                cube.setPosition(SIMD3<Float>(0, 1.5, -1), relativeTo: nil)
                content.add(cube)
            }
        }
    }

Device: Apple Vision Pro
visionOS version: 26.2 (23N301)
Xcode version: 26.3 (17C529)

Is this a known issue? Is there a recommended workaround to preserve RealityView content during Lock In Place transitions? Thank you!

Replies: 0 · Boosts: 0 · Views: 288 · Activity: 1w

Apple Vision Pro Use Case
Use case proposal: immersive mental visualisation and manifestation experiences on visionOS. I'd like to open a discussion around a use case I believe is currently missing from the visionOS ecosystem.

Replies: 0 · Boosts: 0 · Views: 56 · Activity: 1w

Can I use a Metal shader if I use RealityKit to build a visionOS app?
I asked an AI to build a realistic ocean shader for a visionOS project using RealityKit. It gave a bunch of instructions and asked me to connect a large number of nodes in the ShaderGraph in Reality Composer Pro. I'm just wondering whether AI can generate the Metal shader code directly, or build the ShaderGraph nodes for me automatically.
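As context for the question: CustomMaterial (RealityKit's Metal-shader material on iOS and macOS) isn't available on visionOS, so custom surface shading there goes through ShaderGraph materials authored in Reality Composer Pro; hand-written Metal is reserved for fully custom CompositorServices rendering. A generated graph can still be loaded and applied from code. A sketch, where "Scene.usda" and the material path are hypothetical names:

    import RealityKit
    import RealityKitContent  // Reality Composer Pro package module (template name)

    // Load a ShaderGraph material from the RCP package and apply it.
    func applyOceanMaterial(to model: ModelEntity) async throws {
        let material = try await ShaderGraphMaterial(
            named: "/Root/OceanMaterial",   // hypothetical material path
            from: "Scene.usda",             // hypothetical scene file
            in: realityKitContentBundle
        )
        model.model?.materials = [material]
    }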
Replies: 0 · Boosts: 0 · Views: 501 · Activity: 2w

Can video reflections in immersive space work with VideoMaterial, or is AVPlayerViewController with dockingRegion required?
Hi Apple Developer Forums, I'm developing a visionOS video streaming app that uses a custom immersive cinema experience with RealityKit. I have a question about enabling video reflections in an immersive environment.

My current implementation: I'm using VideoMaterial with AVPlayer to display video on a ModelEntity plane in an immersive space:

    // Create screen mesh
    let screenMesh = MeshResource.generatePlane(
        width: VideoTheater.screenWidth,
        height: VideoTheater.screenHeight,
        cornerRadius: 0.0
    )
    let screenEntity = ModelEntity(mesh: screenMesh)

    // Apply VideoMaterial with AVPlayer
    screenEntity.model?.materials = [VideoMaterial(avPlayer: player)]

The video renders correctly in the immersive space, but I don't see any video reflections on surrounding surfaces.

Apple's documentation approach: according to https://developer.apple.com/documentation/visionos/enabling-video-reflections-in-an-immersive-environment, the recommended approach uses:
- AVPlayerViewController for video playback
- The dockingRegion modifier to specify where the video should appear
- The system automatically handles reflections

My question: is using AVPlayerViewController with dockingRegion the only way to get video reflections in an immersive environment? Or is it possible to enable reflections when using VideoMaterial directly with RealityKit's ModelEntity?

My app requires a custom immersive cinema experience with:
- Custom screen positioning and scaling
- Danmaku (bullet comments) overlay
- Custom gesture controls
- HDR/Dolby Vision support

Switching to AVPlayerViewController would require significant architectural changes, so I'd prefer to keep my current VideoMaterial approach if reflections can be enabled somehow. If VideoMaterial cannot produce reflections, are there any alternative approaches to achieve diffuse video reflections with a custom RealityKit setup?

Environment:
- visionOS 2.x
- RealityKit
- AVPlayer with custom resource loader (for DASH streams)

Thank you for any guidance!
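For anyone weighing the documented path, hosting AVPlayerViewController from SwiftUI is a small amount of code. A minimal sketch of the hosting side only (the docking region itself is authored in the environment scene per Apple's article, and DockablePlayerView is a made-up name):

    import SwiftUI
    import AVKit

    struct DockablePlayerView: UIViewControllerRepresentable {
        let player: AVPlayer

        func makeUIViewController(context: Context) -> AVPlayerViewController {
            let controller = AVPlayerViewController()
            controller.player = player
            return controller
        }

        func updateUIViewController(_ controller: AVPlayerViewController, context: Context) {
            controller.player = player
        }
    }

Whether this can coexist with a danmaku overlay and custom gestures is exactly the open architectural question the post raises.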
Replies: 0 · Boosts: 0 · Views: 592 · Activity: 2w

Cannot find devices in RemoteImmersiveSpace
Hi, I'm running the Spatial Rendering App sample on a MacBook Pro running macOS 26.4 beta and a Vision Pro running visionOS 26.3.1. Handoff and SharePlay are on, both devices are on the same Apple ID and network, and SharePlay screen sharing works fine between the two devices. However, when calling openImmersiveSpace, the device picker fails to present and no devices are found.

Errors from the console:

    ((processConfiguration != nil && configuration != nil) || (processConfiguration == nil && configuration == nil)) - .../ExtensionKit/Source/HostViewController/Internal/EXHostSessionDriver.m:80: `processConfiguration` and `configuration` must be both non-nil or both nil
    Unable to obtain a task name port right for pid 638: (os/kern) failure (0x5)
    Unable to present an ImmersiveSpace for Scene id 'Compositor Services'

Is this a known bug, or am I missing something? Thanks!

Replies: 2 · Boosts: 1 · Views: 1.1k · Activity: 3w

Attaching a hand model to your hands
Hi, we have been working on an application that attaches a hand model to the user's hands. Apple's "Animating hand models in visionOS" sample project is a useful starting point: https://developer.apple.com/documentation/visionOS/animating-hand-models-in-visionOS

We have been trying to create our own hand model to attach, but have had some issues with how it attaches to the hand. For our hand model we want to include the forearm all the way up to the user's elbow. I have attached a sample project of what our code currently looks like so you can run it; just select "Show Immersive Space" to attach the models. The left hand model is the space glove that we were trying to mirror. The right hand model is our own model. I have mapped each of the joints to the corresponding joint name on our model.

The first issue seems to be the placement of the forearm: it attaches itself at the wrist. The second issue seems to be rotation. Our team is looking for guidance on what needs to change to map this model correctly. Thanks in advance!
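One detail worth noting for the forearm placement: ARKit's hand skeleton exposes forearm joints (.forearmWrist near the wrist and .forearmArm partway toward the elbow), so a forearm segment can be anchored independently of the wrist. A minimal sketch using plain AnchorEntity targets rather than the sample's joint-by-joint animation; the entity names are placeholders:

    import RealityKit

    // Anchor the hand mesh at the wrist and the forearm mesh at the
    // mid-forearm joint so the forearm doesn't pivot from the wrist.
    func addHandAnchors(to content: RealityViewContent,
                        handEntity: Entity,
                        forearmEntity: Entity) {
        let wristAnchor = AnchorEntity(.hand(.right, location: .wrist))
        wristAnchor.addChild(handEntity)

        let forearmAnchor = AnchorEntity(.hand(.right, location: .joint(for: .forearmArm)))
        forearmAnchor.addChild(forearmEntity)

        content.add(wristAnchor)
        content.add(forearmAnchor)
    }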
Replies: 2 · Boosts: 0 · Views: 252 · Activity: Feb ’26

360° video playback Issue
When rendering an equirectangular video on a sphere using VideoMaterial and MeshResource.generateSphere(), there is a visible black seam line running vertically on the sphere. This appears to be at the UV seam where the texture coordinates wrap from 1.0 back to 0.0. The same video file plays without any visible seam in other 360° video players on Vision Pro, so the issue is not with the video content itself.

Here is the relevant code:

    private func createVideoSphere(content: RealityViewContent, player: AVPlayer) {
        let sphere = MeshResource.generateSphere(radius: 1000)
        let material = VideoMaterial(avPlayer: player)
        let entity = ModelEntity(mesh: sphere, materials: [material])
        entity.scale *= .init(x: -1, y: 1, z: 1) // Flip to render on inside
        content.add(entity)
        player.play()
    }

The setup is straightforward:
- MeshResource.generateSphere(radius: 1000) generates the sphere mesh
- VideoMaterial(avPlayer:) provides the video texture
- X scale is flipped to -1 so the texture renders on the inside of the sphere
- The video is a standard equirectangular 360° MP4 file

What I've tried: I attempted to create a custom sphere mesh using MeshDescriptor with duplicate vertices at the UV seam (longitude 0°/360°) to ensure proper UV continuity. However, VideoMaterial did not render any video on the custom mesh (only audio played), and the app eventually crashed. It seems VideoMaterial may have specific mesh requirements.

Questions:
1. Is the black seam a known limitation of MeshResource.generateSphere() when used with VideoMaterial for 360° video?
2. Is there a recommended way to eliminate this UV seam, for example a texture addressing mode or a specific mesh configuration that works with VideoMaterial?
3. Is there an official sample project or code example for playing 360° equirectangular video in a fully immersive space on visionOS? That would be extremely helpful as a reference.

Any guidance would be greatly appreciated. Thank you!
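Since the custom-mesh route is the one hinted at in the post, here is what a seam-duplicating UV sphere looks like with MeshDescriptor, for reference. It is a sketch of the geometry only, untested against the VideoMaterial crash described above, and the winding may need flipping for inside-out viewing:

    import Foundation
    import RealityKit

    // UV sphere whose seam column is duplicated: longitude 0° and 360° get
    // separate vertices, so texture coordinates never wrap from 1.0 to 0.0.
    func makeSeamlessSphere(radius: Float,
                            slices: Int = 64,
                            stacks: Int = 32) throws -> MeshResource {
        var positions: [SIMD3<Float>] = []
        var uvs: [SIMD2<Float>] = []
        var indices: [UInt32] = []

        for stack in 0...stacks {
            let v = Float(stack) / Float(stacks)
            let phi = v * .pi                  // 0 at top pole, pi at bottom
            for slice in 0...slices {          // slices + 1 columns: duplicated seam
                let u = Float(slice) / Float(slices)
                let theta = u * 2 * .pi
                positions.append(SIMD3(radius * sin(phi) * cos(theta),
                                       radius * cos(phi),
                                       radius * sin(phi) * sin(theta)))
                uvs.append(SIMD2(u, 1 - v))
            }
        }

        let columns = UInt32(slices + 1)
        for stack in 0..<UInt32(stacks) {
            for slice in 0..<UInt32(slices) {
                let a = stack * columns + slice
                let b = a + columns
                // Two triangles per quad; flip the order if faces point outward.
                indices += [a, b, a + 1, a + 1, b, b + 1]
            }
        }

        var descriptor = MeshDescriptor(name: "seamlessSphere")
        descriptor.positions = MeshBuffers.Positions(positions)
        descriptor.textureCoordinates = MeshBuffers.TextureCoordinates(uvs)
        descriptor.primitives = .triangles(indices)
        return try MeshResource.generate(from: [descriptor])
    }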
Replies: 0 · Boosts: 0 · Views: 288 · Activity: Feb ’26

Possible Bug - Hover Effects/Spatial Event Compatibility with PSVR2 Controllers?
Hi, I would like clarification on whether the new hover effects feature introduced in visionOS 26 supports pinch gestures through the PSVR2 controllers. In your sample application, I found that this was not working: pulling the trigger on the controller while looking at the 3D object did not activate the hover effect's spatial event (the object does show the highlight); only pinch-clicking with my fingers seems to register and trigger the spatial event. I am using visionOS 26.3. This is inconsistent with how the PSVR2 controller behaves on SwiftUI views and UI elements, where the trigger press does count as a button click. The sample I used was this one: https://developer.apple.com/documentation/compositorservices/rendering_hover_effects_in_metal_immersive_apps

Replies: 2 · Boosts: 0 · Views: 814 · Activity: Feb ’26

Foveated Streaming - Can I stream Immersive Video from Mac
This is a very exciting feature in the 26.4 beta, but from the documentation it seems it can only integrate with the NVIDIA CloudXR™ SDK. I'm wondering if it's possible to use this tool to stream immersive video from a Mac to Vision Pro?

Replies: 1 · Boosts: 0 · Views: 734 · Activity: Feb ’26

AVP, Developer Strap v2, Ethernet connection and Xcode
Hi, I wonder if there's something that can be configured to force Xcode (and preferably MVD too) to use the Ethernet connection between a Mac mini and Apple Vision Pro (over a USB hub, not a direct USB connection)? If I connect the AVP to the Mac directly via USB, the bridge gets created and both MVD and Xcode default to it, which is great because of the higher speed and lower latency. My problem is that I work with an external camera, so I can have either the camera or the Mac connection, but not both. I tried to solve that by plugging in a small active USB hub, so the strap and camera are connected to it, plus it has an Ethernet adapter, which is plugged into the Mac's port.

What I've tried:
- Internet sharing on the Mac: the AVP has internet access and I can ping it from the Mac, but Xcode and MVD still use Wi-Fi
- Manually configuring the bridge without internet sharing: same effect
- Making the bridge the highest-priority connection: nothing changed
- Forcing routing to the AVP's IP over the bridge: nothing (and it seems my routing entry went missing after some time and was replaced by "use wifi interface")

So, is there something more I can do to make at least Xcode go over the cable? Debugging over Wi-Fi often takes forever.

Replies: 0 · Boosts: 0 · Views: 127 · Activity: Feb ’26

visionOS Development: Seeking Advice on Key Strategic Crossroads
I am a developer working on a space journal application. During development I encountered several crucial strategic and technical decisions, and I would like to hear the experiences of those who have gone through similar situations. Here are simplified versions of my questions:

- Resource allocation: which problem should I address first?
- Design direction: in interaction and UI design, how should I balance "immersion" and "usability"?
- Market selection: is it easier for a business to survive in the early stages as a B2B or B2C entity?
- Cost estimation: how can I reasonably present the development costs of this project to my investors?

To avoid relying solely on intuition in my decisions, I created a short questionnaire, hoping to gather more structured opinions from colleagues. If you are also exploring visionOS, I sincerely hope you can take a few minutes to fill it out. The results are extremely important to me, and I would be happy to share the final summary of the findings with you.

Replies: 1 · Boosts: 0 · Views: 158 · Activity: Feb ’26

Full Body Tracking
Hi, is there a way to track feet in a visionOS app in an immersive space? I want the whole body to be visible in VR, and I want to know if the user touches an object with their foot.
Replies: 1 · Boosts: 0 · Views: 292 · Activity: Feb ’26

VisionOS 2 - Passthrough in screen capture
I have already added the license and entitlement for Passthrough in screen capture, but I still get a black background instead of the real world. Do you have any ideas? I've been stuck on this question for a long time 😢

Replies: 2 · Boosts: 1 · Views: 431 · Activity: Feb ’26