Discuss Spatial Computing on Apple Platforms.

Posts under General subtopic


USDZ Security
I am working on an app that will allow a user to load and share their model files (usdz, usda, usdc). I'm looking at security options to prevent bad actors. Are there security or validation methods built into ARKit/RealityKit/CloudKit when loading models or saving them to the cloud? I want to ensure no one can inject any sort of exploit through these file types.
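For context: as far as I know, neither ARKit nor RealityKit documents any built-in malware scanning for USD files — loading simply throws if the data cannot be parsed. A minimal client-side gate might look like the sketch below; the allowedExtensions set, the size cap, and loadValidatedModel are illustrative assumptions, not a built-in API.

import RealityKit
import Foundation

// Sketch of a client-side gate before loading a user-provided model.
// The extension whitelist and size cap are arbitrary illustrative values.
let allowedExtensions: Set<String> = ["usdz", "usda", "usdc"]
let maxFileSize = 50 * 1024 * 1024  // 50 MB, arbitrary cap

func loadValidatedModel(from url: URL) async throws -> Entity {
    // Reject unexpected file types up front.
    guard allowedExtensions.contains(url.pathExtension.lowercased()) else {
        throw CocoaError(.fileReadCorruptFile)
    }
    // Reject implausibly large uploads before parsing.
    let size = try url.resourceValues(forKeys: [.fileSizeKey]).fileSize ?? 0
    guard size <= maxFileSize else {
        throw CocoaError(.fileReadTooLarge)
    }
    // RealityKit throws if the USD data cannot be parsed; treat any
    // failure as an untrusted file rather than retrying.
    return try await Entity(contentsOf: url)
}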
Replies: 0 · Boosts: 0 · Views: 478 · Activity: Jul ’25

mipmapsMode trade-off?
I am building a 360° photo viewer on visionOS 26 that lets the user choose a 2:1 JPEG, which is then rendered onto a sphere mesh entity. I create the texture with TextureResource(contentsOf: url, options: options), and I noticed two behaviors depending on the mipmaps option.

When setting mipmapsMode: .none:
- The graphics within the "gaze area" look sharp and clear
- The two poles (top and bottom) are perfectly rendered
- Massive shimmer around the "gaze area"

When setting mipmapsMode: .allocateAndGenerateAll:
- The graphics look slightly blurrier than with .none within the "gaze area"
- The two poles are very blurry and the texture is hard to recognize
- Much less shimmer around the "gaze area"

My question: is there a way to get the perfect graphic quality of .none without the massive shimmer? Thank you!

Screenshots: mipmapsMode: .none / mipmapsMode: .allocateAndGenerateAll
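For reference, a sketch of the two configurations being compared (assuming a color texture; the makeTexture helper is an illustrative name):

import RealityKit
import Foundation

// Sketch: creating the sphere texture with each mipmaps mode.
// `url` is the user-selected 2:1 JPEG.
func makeTexture(from url: URL, generateMipmaps: Bool) async throws -> TextureResource {
    var options = TextureResource.CreateOptions(semantic: .color)
    // .none keeps only the full-resolution level (sharp, but aliases/shimmers);
    // .allocateAndGenerateAll builds the full mip chain (smoother, but softer,
    // especially at the poles where the sphere's UVs are most compressed).
    options.mipmapsMode = generateMipmaps ? .allocateAndGenerateAll : .none
    return try await TextureResource(contentsOf: url, options: options)
}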
Replies: 0 · Boosts: 0 · Views: 221 · Activity: Oct ’25

Animation handling on Scene change
I work on a game where I use timeline animations from Reality Composer Pro. The game runs in an immersive space, but it can be paused, at which point I move the whole level root entity from the immersive space to another RealityView in a WindowGroup. When the player continues, I do exactly the reverse to move the level root from the window group back to my immersive space RealityView. It seems all animations get automatically stopped and restarted when the scene changes. The problem is that an animation does not resume where it stopped; it starts over from wherever it was interrupted, and therefore ends up with, for example, a wrong y offset, as visible in the picture.

For example, the yellow sphere in the picture loops the following animation:
- 0 to 100
- 100 to -100
- -100 to 0

If I pause the game (and thereby switch scenes), the previous animation gets stopped and restarted at position y = 100, so it now loops:
- 100 to 200
- 200 to 0
- 0 to 100

I already tried all kinds of setups, such as:
- Setting the animations relative to root, parent, or local
- Using behaviors (On Added to Scene, On Notification)
- Accessing availableAnimations directly and saving the animation's playback controller

With the last approach I found that, if I manually trigger the following code before switching scenes, everything works as expected:

Button("Reset") {
    animationPlaybackController.time = 0
    animationPlaybackController.pause()
    animationPlaybackController.stop(blendOutDuration: 0.00001)
}

But if I use time = 0 with .stop() directly, the time = 0 seems to be ignored and I get the same behavior as before: the animation stops at a wrong y offset. Hence my assumption that animations get stopped and invalidated once the entity changes scenes. I tried calling the code manually in ImmersiveSpace.onDisappear, WindowGroup.onAppear, and various SceneEvents subscriptions, but unfortunately nothing worked. Am I doing something wrong in general, or is there a way to fix this?
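For anyone experimenting with the same problem, here is a sketch of the snapshot/restore pattern implied above: record the controller's time before the move, then replay the clip and restore the time after re-adding the entity. The names savedTime, willMove, and didMove are illustrative, and this is untested against the scene-switch invalidation described in the post.

import RealityKit
import Foundation

// Sketch: snapshot the playback position before moving the entity between
// scenes, then restart the clip and restore the position after re-adding it.
var savedTime: TimeInterval = 0

func willMove(controller: AnimationPlaybackController) {
    savedTime = controller.time        // remember where the clip was
    controller.pause()
}

func didMove(levelRoot: Entity) {
    guard let clip = levelRoot.availableAnimations.first else { return }
    // Re-playing yields a fresh controller; the old controller appears to be
    // invalidated once the entity leaves its scene.
    let controller = levelRoot.playAnimation(clip, startsPaused: true)
    controller.time = savedTime        // jump back to the saved position
    controller.resume()
}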
Replies: 0 · Boosts: 0 · Views: 89 · Activity: Apr ’25

Spaceship sample code does not compile
I'm getting the following error message when compiling Apple's provided sample, the Spaceship game for the Apple Vision Pro. I've already tried deleting the derived data, resetting the package cache, and restarting Xcode, but I'm still getting the following error:

[xrsimulator] Exception thrown during compile: Cannot get rkassets content for path /Users/myoungkang/Downloads/CreatingASpaceshipGame/Packages/Studio/Sources/Studio/Studio.rkassets because 'The file “Studio.rkassets” couldn’t be opened because you don’t have permission to view it.'
error: Tool exited with code 1
Replies: 0 · Boosts: 1 · Views: 67 · Activity: Apr ’25

visionOS SharePlay nearby codes expire
It looks like nearby pairing expires one week after another AVP device is accepted. Since we provide our clients with an always-available app for walking through architecture, it's a shame that non-technical staff have to reconnect five devices every week to work together. Is there any workaround for this issue, or should it go straight to the wishlist? Thanks for the support!
Replies: 0 · Boosts: 0 · Views: 68 · Activity: Sep ’25

[Rendering] Always display a Window in front of any Entity in a mixed ImmersiveView?
Currently I am using a mixed-style immersive space to place both my window view (plain style) and my ImmersiveView content together. The issue is that depth testing during rendering always lets the virtual content occlude my normal window view. Is it possible to force the windowed view to always display in front of my virtual content in mixed-style immersion? (I know about ModelSortGroup, but it doesn't quite fit here.) Alternatively, can I dynamically change the .progressive value while the immersive space is open? (Setting that value to zero means .mixed itself, right?)
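On the second part of the question, a sketch of how immersion styles can be switched at runtime with the immersionStyle(selection:in:) scene modifier — untested against the occlusion issue above, but it shows there is no numeric "progressive value" to set; you swap styles through a binding instead:

import SwiftUI
import RealityKit

// Sketch: a minimal app whose immersive space can change style while open.
struct ImmersionSwitchApp: App {               // hypothetical app name
    @State private var style: ImmersionStyle = .mixed

    var body: some Scene {
        WindowGroup {
            Text("Plain-style window")
        }

        ImmersiveSpace(id: "space") {
            RealityView { _ in }               // your immersive content here
        }
        .immersionStyle(selection: $style, in: .mixed, .progressive)
        // Setting `style = .progressive` elsewhere transitions the open space.
    }
}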
Replies: 0 · Boosts: 0 · Views: 96 · Activity: Mar ’25

MagnifyGesture in RealityView does not work on macOS
I have tested the MagnifyGesture code below on multiple devices:
- Vision Pro: working
- iPhone: working
- iPad: working
- macOS: not working

In Reality Composer Pro, I have also added the following components to the test model entity:
- Input Target
- Collision

For macOS, I tried the trackpad pinch gesture and the mouse scroll wheel, but neither approach works. How can I resolve this issue? Thank you.

import SwiftUI
import RealityKit
import RealityKitContent

struct ContentView: View {
    var body: some View {
        RealityView { content in
            // Add the initial RealityKit content
            if let immersiveContentEntity = try? await Entity(named: "Immersive", in: realityKitContentBundle) {
                content.add(immersiveContentEntity)
            }
        }
        .gesture(
            MagnifyGesture()
                .targetedToAnyEntity()
                .onChanged(onMagnifyChanged)
                .onEnded(onMagnifyEnded)
        )
    }

    func onMagnifyChanged(_ value: EntityTargetValue<MagnifyGesture.Value>) {
        print("onMagnifyChanged")
    }

    func onMagnifyEnded(_ value: EntityTargetValue<MagnifyGesture.Value>) {
        print("onMagnifyEnded")
    }
}
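One way to narrow this down (a diagnostic sketch, not a fix): try a plain, non-entity-targeted MagnifyGesture on macOS. If this version prints while the targeted version stays silent, the trackpad pinch is reaching SwiftUI and the failure lies in the entity targeting/hit-testing path on the Mac. MagnifyProbeView is a hypothetical helper:

import SwiftUI

// Diagnostic sketch: a minimal view with a plain (non-entity-targeted)
// MagnifyGesture, to check whether pinch events reach SwiftUI on macOS.
struct MagnifyProbeView: View {
    var body: some View {
        Color.gray
            .frame(width: 300, height: 300)
            .gesture(
                MagnifyGesture()
                    .onChanged { value in
                        // Fires for trackpad pinch if the view receives it.
                        print("plain magnify:", value.magnification)
                    }
            )
    }
}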
Replies: 0 · Boosts: 0 · Views: 58 · Activity: Mar ’25

How to combine Occlusion nodes with soft edges.
At a recent community meeting, we were wondering how Apple creates the soft-edge effect around its occlusion cutouts. We see this effect on keyboard cutouts, iPhone cutouts, and in progressive spaces. An example: notice the soft edges around the occlusion cutout for the keyboard. One of our members created some Shader Graph materials to explore soft edges. These work by sending data into the opacity channel of the PreviewSurface node. Unfortunately, the Occlusion Surface nodes lack any sort of input. If you know how to blend these concepts with RealityKit occlusion, please let us know!
Replies: 0 · Boosts: 1 · Views: 902 · Activity: Sep ’25

Vision Pro screen streamed to Mac has noticeably low resolution
The resolution of the streamed live feed drops significantly: image detail is lost and sharpness suffers, so key information about 3D furniture products, such as texture and dimensions, cannot be presented accurately, which affects customers' judgment of the products. Request: optimize the resolution-compression strategy during streaming to reduce quality loss in transit and improve the clarity of the live stream received on the Mac, matching the high-precision requirements of 3D product display.
Replies: 0 · Boosts: 0 · Views: 77 · Activity: 2w

How to get the transform of a joint in a skeleton while it is animating
I have an entity that was created using Mixamo, and it has an animation. After the animation completes, the mesh of the robot is not where the entity is positioned. When the animation finishes, I want to set the root entity's transform to the mesh's transform. There are no transformations applied to any of the children of the model's root, which means the transformations are applied to the skeleton by the playing animation. Is there a way to apply the final position of the skeleton's root to the root entity, so the entity is positioned where the animation ended just before the next animation plays?
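A sketch of one possible approach, under several assumptions: the skinned mesh lives on a ModelEntity, index 0 of jointTransforms is the skeleton root (verify against jointNames for your rig), and the model sits at identity under the entity root. bakeRootMotion is an illustrative name, not an API.

import RealityKit

// Sketch: when the clip finishes, fold the root joint's final translation
// into the entity and clear it on the skeleton, so the next clip starts
// from where this one ended.
// Call this from an AnimationEvents.PlaybackCompleted subscription, e.g.
// content.subscribe(to: AnimationEvents.PlaybackCompleted.self, on: skinnedModel) { _ in ... }
func bakeRootMotion(of skinnedModel: ModelEntity, into root: Entity) {
    // jointTransforms parallels jointNames; index 0 is assumed to be
    // the skeleton root here.
    guard !skinnedModel.jointTransforms.isEmpty else { return }
    let rootJoint = skinnedModel.jointTransforms[0]

    // Assuming the model sits at identity under `root`, move the entity
    // by the joint's final offset...
    root.position += rootJoint.translation
    // ...and clear the offset on the skeleton so the mesh lines up again.
    skinnedModel.jointTransforms[0].translation = .zero
}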
Replies: 0 · Boosts: 0 · Views: 116 · Activity: May ’25

I built Apple.PHASE with Unity and targeted visionOS, but reverb does not sound
Environment versions:
- macOS 15.6.1
- visionOS 26.0.1
- Xcode 16.1 or 26.0.1
- Unity 6000.2.9f1
- Apple.Core 3.2.0
- Apple.PHASE 1.2.7
- PolySpatial 2.4.2

With the above environment, after installing Apple.PHASE into Unity and building to a visionOS device, audio plays and distance attenuation works, but Early Reflection and Late Reverb produce no audible change even when enabled and with their parameters adjusted. What is required to make Early Reflection and Late Reverb take effect in a visionOS device build?

Actions taken:
- Created a SoundEvent.
- In the composer, created a Sampler and a SpatialMixer; attached an AudioClip to the Sampler; enabled Direct Path, Early Reflection, and Late Reverb on the SpatialMixer.
- Attached a PHASE Source to the object to be played, attached the created SoundEvent to it, and set non-zero values for Early Reflection and Late Reverb.
- Attached a PHASE Listener to the main camera and set the ReverbPreset to a value other than None.
- In Project Settings > Audio, set the Spatializer plugin to PHASE Spatializer.
- From there, built for visionOS.
Replies: 0 · Boosts: 0 · Views: 723 · Activity: Nov ’25

How to make .blur(radius:) visually affect RealityView content?
According to the official documentation, the .blur(radius:) modifier should apply a Gaussian blur to a RealityView. However, when it is applied directly to a RealityView, nothing inside it (neither 2D attachments nor 3D entities) appears blurred. Here's the test code:

struct ContentView: View {
    var body: some View {
        VStack(spacing: 20) {
            Text("Above the RealityView")
                .font(.title)

            RealityView { content, attachments in
                if let text = attachments.entity(for: "2dView") {
                    text.position.y = 0.1
                    content.add(text)
                }
                let box = ModelEntity(
                    mesh: .generateBox(size: 0.1),
                    materials: [SimpleMaterial(color: .red, isMetallic: true)]
                )
                content.add(box)
            } attachments: {
                Attachment(id: "2dView") {
                    Text("Above the Box")
                        .font(.title)
                }
            }
            .frame(width: 300, height: 300)
            .border(.blue)
            .blur(radius: 99) // Has no visual effect

            Text("Below the RealityView")
                .font(.subheadline)
        }
        .padding()
    }
}

My question: how can I make .blur(radius:) visually affect the content rendered in a RealityView? Can you provide a working example in which .blur() visually affects any part of a RealityView? Thanks!
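A partial workaround sketch for the 2D half of the question, under the untested assumption that attachments render from their own SwiftUI view tree, so modifiers inside the Attachment apply even when modifiers on the RealityView do not. Swap the Attachment in the code above for:

// Sketch: blurring the attachment's content directly rather than the
// RealityView. This would only affect the 2D attachment, not the 3D box.
Attachment(id: "2dView") {
    Text("Above the Box")
        .font(.title)
        .blur(radius: 8)   // applied inside the attachment's own view tree
}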
Replies: 0 · Boosts: 0 · Views: 113 · Activity: May ’25

New Spatial Rendering App on macOS doesn't display on visionOS device
I created a new Spatial Rendering App from the template in Xcode 26.0.1. When I run the app, click 'Show Immersive Space' and select my Vision Pro from the pop-up dialog, the content in the dialog flickers (which seems to indicate something crashed) and nothing appears on my Vision Pro. I'm running the released macOS 26.0 (25A354) and visionOS 26.0 (23M336). Filed as FB20397093.
Replies: 0 · Boosts: 0 · Views: 340 · Activity: Sep ’25

Spatial-backdrop standards process
Apple's WWDC video "What's new for the spatial web" says the spatial-backdrop markup may change as it goes through the standards process (at the 27:26 mark). I have started adding spatial backdrops to web pages, so I want to watch for status updates from Apple and follow the standards progress. Is there anywhere I can track this standards process? Has Apple announced any feature updates or news about spatial-backdrop?
Replies: 0 · Boosts: 0 · Views: 97 · Activity: Nov ’25

VisionOS Unity - VR Mode - Control Render Resolution?
I'm using Unity 2022.3.56f, with the Apple visionOS App Mode set to 'Virtual Reality - Fully Immersive Space'. When I build, the render resolution of my game on the Apple Vision Pro seems well below the native resolution of the AVP displays. I can't find a setting in the XR Plug-in Management Apple visionOS options, or in the Quality settings, to increase the render resolution. Is this possible? I tried, for example, setting:

UnityEngine.XR.XRSettings.eyeTextureResolutionScale = 2.0f

but this doesn't seem to change the render resolution in the build.
Replies: 0 · Boosts: 0 · Views: 353 · Activity: Feb ’25

Apple Vision Pro - hand tracking with gloves
I am considering adding finger-pad haptics (the data flow for haptic feedback is directed from the AVP to the fingers, not vice versa): simple piezos wired to a wrist-worn connection holding the driver and battery. But I'm concerned this will impact hand tracking. Is there any guidance regarding gloves and/or the size of peripherals attached to the fingers? Or, if anyone has another inexpensive, low-profile option on the market, please let me know. Thanks!
Replies: 0 · Boosts: 0 · Views: 231 · Activity: Feb ’25