Discuss spatial computing on Apple platforms and how to design and build an entirely new universe of apps and games for Apple Vision Pro.

Posts under Spatial Computing topic

Post · Replies · Boosts · Views · Activity

Make subdivision surfaces work in Reality Composer Pro
Information is light on the new subdivision support for USD models in RealityKit, and so far I have been unable to get one of my models to actually subdivide within Reality Composer Pro or Quick Look (or when viewing on Vision Pro). I've exported a few test models from Houdini and verified that they contain 'uniform token subdivisionScheme = "catmullClark"'. I've started with some very lightweight, basic meshes, but when viewing, they simply look like polygonal meshes; there's no subdividing occurring at runtime. Is there a trick to getting them to actually smooth out?
5 replies · 0 boosts · 627 views · Oct ’24
Manipulate objects in Reality Composer Pro
The documentation at https://developer.apple.com/documentation/visionos/designing-realitykit-content-with-reality-composer-pro states: "Reality Composer Pro treats your imported assets as read-only." This is a huge obstacle for me, as I need to make multiple adjustments to the scene. I somehow managed to import one of my assets into the scene and can manipulate it directly, but now I can't figure out how I did it. As I have to prepare further assets and would like to do this directly in Reality Composer Pro, I'm looking for a way to actually load them into the scene. Any idea how this can be done?
5 replies · 0 boosts · 719 views · Oct ’24
Deactivated entities in scene not found with entity.findEntity(named:)?
I have an .rkassets package from which I load my scene. In the scene, I'm using entity.findEntity(named: "..") to find entities to activate/deactivate. When entities are deactivated in the *.usda, they are not found with this method. Further inspection shows that the deactivated entities seem not to be compiled into the build. Is there anything I can set that prevents these deactivated entities from being skipped during the build?
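For illustration, a minimal sketch of one workaround implied by the question (the names "Scene" and "Gate" are hypothetical): keep the entity active in the .usda so it is compiled into the .rkassets build, and disable it in code right after loading.

    import RealityKit
    import RealityKitContent

    @MainActor
    func loadScene() async throws -> Entity {
        // Load the scene from the compiled .rkassets in the RealityKitContent bundle.
        let scene = try await Entity(named: "Scene", in: realityKitContentBundle)
        // The entity stays active in the .usda; disable it here instead,
        // so findEntity(named:) can still locate it at runtime.
        if let gate = scene.findEntity(named: "Gate") {
            gate.isEnabled = false
        }
        return scene
    }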
1 reply · 0 boosts · 347 views · Oct ’24
CompositorServices Or RealityKit
I have been concentrating on developing a visionOS application. While I am quite familiar with RealityKit, CompositorServices has also captured my attention, but I have not yet learned it. Could you please clarify whether it is essential for me to learn CompositorServices? Additionally, I would appreciate insights into the respective advantages of RealityKit and CompositorServices.
2 replies · 0 boosts · 666 views · Oct ’24
Anchor Wall
I have created a portal and attached it to a wall using an AnchorEntity. However, I am seeking guidance on how to determine the size of the wall so that the portal can fully occupy it. Initially, I attempted to locate relevant information in the demo code, but I had difficulty understanding certain sections. I would appreciate a step-by-step explanation or a reference to the appropriate code. Thank you for your assistance.
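Not a confirmed answer, but one possible sketch: on visionOS the wall's dimensions are more readily available from ARKit plane detection than from the AnchorEntity itself. This assumes an ImmersiveSpace with world-sensing permission granted, and simply regenerates the portal's plane mesh to match the detected extent.

    import ARKit
    import RealityKit

    let session = ARKitSession()
    let planeDetection = PlaneDetectionProvider(alignments: [.vertical])

    @MainActor
    func matchPortalToWall(_ portal: ModelEntity) async throws {
        try await session.run([planeDetection])
        for await update in planeDetection.anchorUpdates {
            let anchor = update.anchor
            guard anchor.classification == .wall else { continue }
            // The plane anchor's extent gives the detected wall size in meters.
            let extent = anchor.geometry.extent
            portal.model?.mesh = .generatePlane(width: extent.width, height: extent.height)
        }
    }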
2 replies · 0 boosts · 665 views · Oct ’24
How to use CubeMap in Reality Composer Pro?
I followed the documentation at https://developer.apple.com/documentation/shadergraph/realitykit/cube-image-(realitykit). I have a .ktx file and use the CubeImage node to load it, followed by a Convert node, but it shows black. I checked it on my Vision Pro and it's still black, and I don't know why. Is something wrong? P.S.: I also loaded the .ktx file with an Image node and it shows a single image, so I believe the .ktx file itself is fine; I checked that on Vision Pro as well.
2 replies · 0 boosts · 879 views · Oct ’24
Reality Composer Pro timelines management, but using code
Is it possible to manage the behavior of a timeline entirely from code? I am exploring the Compose interactive 3D content in Reality Composer Pro sample project after seeing the related video, but the example only shows the use of Behaviors in RCP to trigger timeline actions. I was wondering whether it is possible to retrieve some kind of timeline controller that gives me access to its state, just like AnimationPlaybackController does for single animations. What I would like to achieve is being able to play, pause, and read the timestamp of a timeline, in order to synchronize it between different users over SharePlay.
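Not a confirmed answer, but a sketch of one avenue to check: if the RCP timeline is exposed through the entity's AnimationLibraryComponent (an assumption, and the timeline name is hypothetical), then playAnimation returns an AnimationPlaybackController, which already provides pause/resume and a seekable time value.

    import RealityKit

    @MainActor
    func playTimeline(named name: String, on entity: Entity) -> AnimationPlaybackController? {
        // Assumption: the timeline appears in the entity's animation library.
        guard let library = entity.components[AnimationLibraryComponent.self],
              let timeline = library.animations[name] else { return nil }
        // The returned controller supports pause(), resume(), stop(),
        // and reading/seeking .time, which could be shared over SharePlay.
        return entity.playAnimation(timeline, transitionDuration: 0, startsPaused: false)
    }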
1 reply · 0 boosts · 578 views · Oct ’24
Creating and applying a shader to change an entity's rendering in RealityKit
Hello, I am looking to create a shader to update an entity's rendering. As a basic example, say I want to recolour an entity but leave its original textures showing through. I understand that on visionOS I need to use Reality Composer Pro to create the shader, but I'm lost as to how to reference the original colour that I'm trying to update in the node graph. All my attempts completely override the textures in the entity (and its sub-entities) that I want to affect. Also, the tutorials and examples I've looked at appear to create materials rather than adding an effect on top of existing materials. Any hints or pointers? Assuming this is possible, I've been trying to load the material in code and apply it to an entity. But do I need to do this for all child entities, or just the topmost one?

    do {
        let entity = MyAssets.createModelEntity(.plane) // Loads from bundle and performs config
        let material = try await ShaderGraphMaterial(named: "/Root/TestMaterial",
                                                     from: "Test",
                                                     in: realityKitContentBundle)
        entity.applyToChildren {
            $0.components[ModelComponent.self]?.materials = [material]
        }
        root.addChild(entity)
    } catch {
        fatalError(error.localizedDescription)
    }
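On the "all child entities or just the topmost" question: materials live on each ModelComponent, so a material assigned to the root does not propagate to descendants. A sketch of a recursive helper (presumably what applyToChildren in the snippet above does) that visits every descendant carrying a ModelComponent:

    import RealityKit

    extension Entity {
        /// Applies the given material to this entity and every descendant
        /// that has a ModelComponent.
        func applyMaterialToHierarchy(_ material: any Material) {
            if var model = components[ModelComponent.self] {
                model.materials = [material]
                components[ModelComponent.self] = model
            }
            for child in children {
                child.applyMaterialToHierarchy(material)
            }
        }
    }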
3 replies · 0 boosts · 903 views · Oct ’24
USDZ with Blend Shapes Workflow Recommendations
Hi, since RealityKit 4 now supports blend shapes, I was wondering if there are any workflow or tooling recommendations for baking/exporting them into a USDZ. Are Blender or Cinema 4D capable of doing that out of the box? Should we look into NVIDIA Omniverse (https://docs.omniverse.nvidia.com/connect/latest/blender/manual.htm)? So far this topic seems very sparsely documented, and I would appreciate any hints. Thank you!
3 replies · 0 boosts · 1.3k views · Oct ’24
visionOS pushWindow being dismissed on app foreground
We seem to have found an issue when using the pushWindow action on visionOS. The issue occurs if the app is backgrounded and then reopened by selecting the app's icon on the home screen. Any window that was opened via the pushWindow action is then dismissed. We've been able to replicate the issue in a small sample project.

Replication steps:
1. Open the app.
2. Open a window via the push action.
3. Press the Digital Crown.
4. On the home screen, select the app's icon again.

The pushed window will now be dismissed. There is a sample project linked here that shows off the issue, including a video of the bug in progress.
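For reference, a minimal sketch of the setup being described (window IDs and view names are hypothetical); the pushed window replaces the presenting one and is expected to restore it when dismissed.

    import SwiftUI

    @main
    struct PushWindowSampleApp: App {
        var body: some Scene {
            WindowGroup(id: "main") { MainView() }
            WindowGroup(id: "pushed") { Text("Pushed window") }
        }
    }

    struct MainView: View {
        @Environment(\.pushWindow) private var pushWindow

        var body: some View {
            Button("Push window") {
                // Replaces the current window with the "pushed" window.
                pushWindow(id: "pushed")
            }
        }
    }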
2 replies · 1 boost · 322 views · Oct ’24
Understanding the Distance for Physical Object Visibility in Vision Pro Immersive System
I understand that the system helps maintain user comfort by automatically adjusting the opacity of content in certain situations, like when someone moves too quickly or gets too close to a physical object. The content in front of them dims briefly to allow a clearer view of their surroundings. I'd like to know the specific distance at which the system begins to show the physical object, or what criteria are used for this adjustment.
0 replies · 0 boosts · 318 views · Oct ’24
Synchronizing Multi-User AR Experiences on Apple Vision Pro
Hello Developers, I am currently in the initial planning stages of my bachelor thesis in computer science, where I will be developing an application in collaboration with a manufacturer of large-scale machinery. One of the core features I aim to implement is the ability for multiple Apple Vision Pro users to view the same object in augmented reality simultaneously, each from their respective positions relative to the object. I am still exploring how best to achieve this feature. My initial approach involves designating one device as the host of a "room" within the application, allowing other users to join. If I can accurately determine the relative positions of all users to the host device, it should be possible to display the AR content correctly in terms of angle, size, and location for each user. Despite my research, I haven't found much information on similar projects, and I would appreciate any insights or suggestions. Specifically, I am curious about common approaches for synchronizing AR experiences across multiple devices. Given that the Apple Vision Pro does not have a GPS sensor, I am also looking for alternative methods to precisely determine the positions of multiple devices relative to each other. Any advice or shared experiences would be greatly appreciated! Best regards, Revin
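One common approach, sketched under assumptions rather than as a definitive answer: use SharePlay (GroupActivities) to form the session and exchange poses over a GroupSessionMessenger. The activity and payload types below are hypothetical, and establishing a shared spatial origin for all participants is a separate problem this sketch does not solve.

    import GroupActivities
    import simd

    // Hypothetical activity describing the shared viewing session.
    struct MachineViewingActivity: GroupActivity {
        var metadata: GroupActivityMetadata {
            var meta = GroupActivityMetadata()
            meta.title = "View Machine Together"
            meta.type = .generic
            return meta
        }
    }

    // Hypothetical payload each participant broadcasts.
    struct PosePayload: Codable {
        var position: SIMD3<Float>
        var rotation: SIMD4<Float> // quaternion components (x, y, z, w)
    }

    func observeSessions() async {
        for await session in MachineViewingActivity.sessions() {
            let messenger = GroupSessionMessenger(session: session)
            session.join()
            Task {
                for await (pose, _) in messenger.messages(of: PosePayload.self) {
                    // Apply `pose` to this participant's shared entity.
                    _ = pose
                }
            }
        }
    }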
5 replies · 0 boosts · 939 views · Oct ’24
Tips for developing shader graphs in Reality Composer Pro
I am new to the graph editor and was able to achieve some results. However, I am noticing that my graphs are getting very tangled, confusing, and hard to debug. I was wondering: is it possible to define variables that store the value of computations and refer to them in other parts of the graph, without having to link them graphically? This would help tidy the tangled mess I created. In the "Explore materials in Reality Composer Pro" video, I saw that it is possible to create "instances", but I am not sure whether that is what I need. For example, does the shader compiler optimize them so that there is no need to recompute each instance? Is there any functionality to debug the graph by trying inputs and seeing what the numeric outputs would be?
1 reply · 0 boosts · 749 views · Oct ’24
RealityView limits on visionOS
I'm wondering what the limits of RealityView are. For example, is it possible to place an entity at position (x = 800 m, y = 0, z = -900 m)? What happens if I walk from my (0, 0, 0) to this point, will I see the entity then? Does someone know where the limits are?
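For what it's worth, placing an entity that far out is just a matter of setting its position; whether it remains visible at that distance is the open question. A trivial sketch:

    import SwiftUI
    import RealityKit

    struct FarEntityView: View {
        var body: some View {
            RealityView { content in
                let distant = ModelEntity(mesh: .generateBox(size: 10),
                                          materials: [SimpleMaterial(color: .red, isMetallic: false)])
                // Roughly 1.2 km from the scene origin.
                distant.position = SIMD3<Float>(800, 0, -900)
                content.add(distant)
            }
        }
    }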
1 reply · 0 boosts · 297 views · Oct ’24
How to set the attractionCenter for a ParticleEmitterComponent in a System with real-time updates
I am trying to achieve an effect in which the particles of a particle system are attracted to my hand entity. The hand entity is essentially an AnchorEntity that is tracking my right hand.

    let particleEmitterEntities = context.entities(matching: particleEmitterQuery, updatingSystemWhen: .rendering)
    for particleEmitterEntity in particleEmitterEntities {
        if var particleEmitter = particleEmitterEntity.components[ParticleEmitterComponent.self] {
            // Trying to get the world-space position of the hand
            // (I also tried relative to particleEmitterEntity).
            particleEmitter.mainEmitter.attractionCenter = rightHandEntity.position(relativeTo: nil)
            particleEmitterEntity.components[ParticleEmitterComponent.self] = particleEmitter
        } else {
            fatalError("Cannot find particle emitter")
        }
    }

The particle attraction center doesn't seem to update. Another issue I'm noticing is that my particle system often doesn't show the particle image and just renders a placeholder square when I do this; when I comment this code out, I get the right particle image. I believe this is due to the number of times this loop runs to update the position of the attraction center. What is the right way to create an effect where the particles are attracted to my hand?
3 replies · 0 boosts · 531 views · Oct ’24