Reality Composer Pro


Leverage the all-new Reality Composer Pro, designed to make it easy to preview and prepare 3D content for your visionOS apps.

Reality Composer Pro Documentation

Posts under Reality Composer Pro tag

203 Posts
Post not yet marked as solved
3 Replies
937 Views
In my project, I want to use the new ShaderGraphMaterial to do stereoscopic rendering, and I noticed there is a node called Camera Index Switch that should be able to do this. But when I tried it, I found two problems: 1. It can only output an Integer value; when I change it to Float, it changes back again. I don't know if this is a bug. 2. When I test this node with an If node, the output is weird. Where the output should be zero it displays as black, but when I route it through the If node it is grey, which is neither 0 nor 1 (my If node returns 1 for TRUE and 0 for FALSE). I want to ask whether this is a bug, and whether this is the correct way to do stereoscopic rendering.
Posted by bYsdTd. Last updated.
Post not yet marked as solved
4 Replies
250 Views
I am trying to implement a way to rotate a 3D model around its y-axis, but this doesn't seem to work. What am I missing? The scene only contains one model entity.

@State private var rotateBy: Double = 0.0

RealityView { content in
    do {
        let entity = try await Entity(named: "VinylScene", in: realityKitContentBundle)
        entity.scale = SIMD3<Float>(repeating: 0.6)
        content.add(entity)
    } catch {
        ProgressView()
    }
}
.gesture(
    DragGesture(minimumDistance: 0.0)
        .targetedToAnyEntity()
        .onChanged { value in
            let location3d = value.convert(value.location3D, from: .local, to: .scene)
            let startLocation = value.convert(value.startLocation3D, from: .local, to: .scene)
            let delta = location3d - startLocation
            rotateBy = Double(atan(delta.x * 200))
        }
)
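A side note for readers with the same symptom: the handler above computes rotateBy but never writes it back to the entity, so nothing rotates. A minimal sketch of one way to close the loop inside the same drag handler (the 0.01 sensitivity factor is a placeholder, not a value from the post):

.onChanged { value in
    // Derive an angle from horizontal drag distance (placeholder sensitivity),
    // then actually apply it to the dragged entity's transform.
    let deltaX = value.location3D.x - value.startLocation3D.x
    rotateBy = deltaX * 0.01
    value.entity.transform.rotation = simd_quatf(angle: Float(rotateBy),
                                                 axis: [0, 1, 0])
}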
Posted by mdkBsenA. Last updated.
Post not yet marked as solved
0 Replies
170 Views
Through testing, I have been able to get 5.1 and 7.1 Dolby Atmos files created in Logic Pro to work in Reality Composer Pro and then in Vision Pro. However, 5.1.4 and 7.1.4 files crash when added. Can someone confirm that these are not supported?
Posted by stevenmc. Last updated.
Post not yet marked as solved
1 Reply
294 Views
Is it possible to use an image sequence, .mov or sprite sheet as a node source for a custom material in Reality Composer Pro? I have noticed that in the particle emitter, the magic preset uses a 4x4 sprite sheet as a particle source. Can this be done within the shader graph for the diffuse or normal slot?
Posted by stevenmc. Last updated.
Post marked as solved
2 Replies
321 Views
I am trying to make a shader for a disco-ball lighting effect for my app. I want the light to reflect on the scene mesh. I was curious whether anyone has pointers on how to do this with Shader Graph in Reality Composer Pro, or by writing a surface shader. The effect rotates the dots as the ball spins. This is the effect in Apple Clips that applies the effect to the scene mesh.
Posted by doomdave. Last updated.
Post not yet marked as solved
0 Replies
222 Views
Hello everyone, I have just started learning visionOS app development. I have a scene called Scene, and inside it an object called Sphere. I want to make just this Sphere draggable, using the code below, but the Sphere cannot actually be dragged in the Apple simulator. What is the reason?

struct ContentView: View {
    @State var enlarge = false
    @State var offset: Point3D = .zero
    @State var sphereEntity: Entity?

    var body: some View {
        RealityView { content in
            if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                content.add(scene)
                sphereEntity = content.entities.first?.findEntity(named: "Sphere")
                sphereEntity?.components.set(InputTargetComponent(allowedInputTypes: .all))
            }
        }
        .gesture(DragGesture().targetedToEntity(sphereEntity ?? Entity()).onChanged { value in
            print(value.location3D)
            sphereEntity?.position = value.convert(value.location3D, from: .local, to: sphereEntity?.parent! ?? Entity())
        })
        .gesture(SpatialTapGesture().targetedToAnyEntity().onEnded { _ in
            print("Ssssssss")
        })
        .onAppear() {
        }
    }
}
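A note for anyone who lands here with the same issue: RealityKit gestures hit-test against collision shapes, so an InputTargetComponent alone is usually not enough. A minimal sketch of the likely fix, using the entity names from the post (the sphere radius is a placeholder):

if let sphere = scene.findEntity(named: "Sphere") {
    // Gestures need a collision shape to hit-test against; without one,
    // targetedToEntity(_:) never matches and the drag never fires.
    sphere.components.set(CollisionComponent(shapes: [.generateSphere(radius: 0.1)]))
    sphere.components.set(InputTargetComponent())
    sphereEntity = sphere
}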
Posted by cjlalala. Last updated.
Post not yet marked as solved
4 Replies
430 Views
Hi, I'm trying to have an entity (and some attachments on it) rotate. If I add the entity to content, add the attachment as a child entity, and set InputTargetComponent on the entity, then when I add a gesture ONLY the entity rotates, NOT the attachments (added as child entities). If I instead create a parent entity with let parentEntity = ModelEntity(), add my entity to the parentEntity, add the attachments to my entity (which is now a child of the ModelEntity), and set InputTargetComponent on the ModelEntity, then the whole thing rotates, including attachments. I'm sure there must be a bug; why would it work only with an added ModelEntity? Anyway, bug or not, the problem I have now is that it rotates around the axes of the ModelEntity, not of my primary entity, which is what I want. Is there a way to align the ModelEntity's axes with those of my primary child entity so it rotates the way I want? What call should I use to move the axes, and where would I find the axes of the first child entity, which should be the focus of my app? Here is my code:

var body: some View {
    RealityView { content, attachments in
        // Add the initial RealityKit content
        if let specimenentity = try? await Entity(named: "Immersive", in: realityKitContentBundle) {
            let parentEntity = ModelEntity()
            parentEntity.addChild(specimenentity)
            content.add(parentEntity)
            let entityBounds = specimenentity.visualBounds(relativeTo: parentEntity)
            parentEntity.collision = CollisionComponent(shapes: [ShapeResource.generateBox(size: entityBounds.extents).offsetBy(translation: entityBounds.center)])
            parentEntity.generateCollisionShapes(recursive: true)
            parentEntity.components.set(InputTargetComponent())
            if let Left_Hemisphere = attachments.entity(for: "Left_Hemisphere") {
                // 4. Position the attachment and add it to the RealityView content
                Left_Hemisphere.position = [-0.5, 1, 0]
                specimenentity.addChild(Left_Hemisphere)
            }
        }
    } attachments: {
        Attachment(id: "Left_Hemisphere") {
            // 2. Define the SwiftUI view
            Text("Left_Hemisphere")
                .font(.extraLargeTitle)
                .padding()
                .glassBackgroundEffect()
        }
    }
    .gesture(
        DragGesture()
            .targetedToAnyEntity()
            .onChanged { value in
                let entity = value.entity
                var orientation = Rotation3D(entity.orientation(relativeTo: nil))
                var newOrientation: Rotation3D

                if value.location.x >= lastGestureValueX {
                    newOrientation = orientation.rotated(by: .init(angle: .degrees(0.5), axis: .y))
                } else {
                    newOrientation = orientation.rotated(by: .init(angle: .degrees(-0.5), axis: .y))
                }
                entity.setOrientation(.init(newOrientation), relativeTo: nil)
                lastGestureValueX = value.location.x

                orientation = Rotation3D(entity.orientation(relativeTo: nil))
                if value.location.y >= lastGestureValueY {
                    newOrientation = orientation.rotated(by: .init(angle: .degrees(0.5), axis: .x))
                } else {
                    newOrientation = orientation.rotated(by: .init(angle: .degrees(-0.5), axis: .x))
                }
                entity.setOrientation(.init(newOrientation), relativeTo: nil)
                lastGestureValueY = value.location.y
            }
    )
}
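On the pivot question, a sketch under the assumption that the desired pivot is the child's visual center: offset the child inside the wrapper so that center lands on the parent's origin, then rotate the parent exactly as the code above already does.

// Hypothetical pivot fix: shift the child so its visual center coincides
// with parentEntity's origin; rotating parentEntity then pivots about it.
let bounds = specimenentity.visualBounds(relativeTo: parentEntity)
specimenentity.position -= bounds.center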
Posted by michelefu. Last updated.
Post not yet marked as solved
1 Reply
207 Views
I currently have an iOS app that transmits an H.264 stream over Wi-Fi, decodes it with VideoToolbox, and displays it with MTKView, and I want to implement similar functionality in visionOS. What should I do? MTKView is not available on visionOS.
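One commonly suggested substitute, sketched here as an assumption rather than an official recipe: back a UIView with a CAMetalLayer and bridge it into SwiftUI. The VideoToolbox decode and the render loop that presents frames to the layer's drawables are left to the caller.

import SwiftUI
import UIKit
import QuartzCore
import Metal

// A UIView whose backing layer is a CAMetalLayer, usable where MTKView isn't.
final class MetalLayerView: UIView {
    override class var layerClass: AnyClass { CAMetalLayer.self }
    var metalLayer: CAMetalLayer { layer as! CAMetalLayer }
}

// Bridge into SwiftUI; drawing and frame presentation are up to the caller.
struct MetalVideoView: UIViewRepresentable {
    func makeUIView(context: Context) -> MetalLayerView {
        let view = MetalLayerView()
        view.metalLayer.device = MTLCreateSystemDefaultDevice()
        view.metalLayer.pixelFormat = .bgra8Unorm
        return view
    }
    func updateUIView(_ uiView: MetalLayerView, context: Context) {}
}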
Posted by yl_pang. Last updated.
Post not yet marked as solved
1 Reply
378 Views
Can anyone point me to an approach for handling drag, rotation, and scale on a 'targetedToAnyEntity' asset coming from a realityKitContentBundle? I've looked through all of the code examples and have cobbled something together using the PlacementGesturesModifier and DragRotationModifier from the Hello World code example, but I can't figure out how to make it work on individual assets; it only works on the root. When I do something simple like this (outside the modifiers mentioned above), I can make an individual drag work, but I can't figure out how to apply the same thing to rotation and scale.

.gesture(DragGesture()
    .targetedToAnyEntity()
    .onChanged { value in
        value.entity.position = value.convert(value.location3D, from: .local, to: value.entity.parent!)
    })

Are there any examples of a solution for drag, rotation, and scale on an individual basis in the code examples? Any advice or hints would be appreciated. :)
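In the same per-entity spirit as the drag snippet above, rotation and scale can be handled with RotateGesture3D and MagnifyGesture targeted at entities. A minimal sketch, assuming each asset carries its own CollisionComponent and InputTargetComponent, and assuming a scene named "Scene" (both placeholders):

import SwiftUI
import RealityKit
import RealityKitContent

struct ManipulatableScene: View {
    // Captured at gesture start so updates are relative, not compounding.
    @State private var initialScale: SIMD3<Float>?
    @State private var initialOrientation: simd_quatf?

    var body: some View {
        RealityView { content in
            if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                content.add(scene)
            }
        }
        .gesture(
            MagnifyGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    // Scale the hit entity relative to its scale at gesture start.
                    if initialScale == nil { initialScale = value.entity.scale }
                    value.entity.scale = initialScale! * Float(value.magnification)
                }
                .onEnded { _ in initialScale = nil }
        )
        .gesture(
            RotateGesture3D()
                .targetedToAnyEntity()
                .onChanged { value in
                    // Compose the gesture's rotation with the starting orientation.
                    if initialOrientation == nil { initialOrientation = value.entity.orientation }
                    let q = simd_quatf(angle: Float(value.rotation.angle.radians),
                                       axis: SIMD3<Float>(Float(value.rotation.axis.x),
                                                          Float(value.rotation.axis.y),
                                                          Float(value.rotation.axis.z)))
                    value.entity.orientation = q * initialOrientation!
                }
                .onEnded { _ in initialOrientation = nil }
        )
    }
}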
Posted. Last updated.
Post marked as solved
7 Replies
2.2k Views
Hi, I would like to learn how to create custom materials using Shader Graph in Reality Composer Pro. I would like to know more about Shader Graph in general, including node descriptions and how a material's appearance changes as nodes are connected. However, I cannot find a manual for Shader Graph in Reality Composer Pro, which leaves me totally clueless about how to create custom materials. Thanks.

Sadao Tokuyama
https://1planet.co.jp/
https://twitter.com/tokufxug
Posted. Last updated.
Post not yet marked as solved
0 Replies
411 Views
I've got a couple of 2D PNG assets that I want to add to a scene made of a couple of other usdz files in RCP (picture adding a couple of 2D videogame characters to a simple 3D diorama). When I try to drag the PNGs to the workspace or the file tree…nothing happens. I found a walkthrough on Medium (called "Importing and Exporting Personalized Objects for Augmented Reality: Reality Composer and SwiftUI", for those curious, as I can't link to Medium posts here) that makes it look like users could do this with simple drag-and-drop. The Medium post is from June 2023, and in the screenshots RCP visually looks a lot more like Reality Composer on iPad, so I'm assuming it's changed a lot since then? Is there still a way to do this? I've tried adding the 2D elements to a scene with Blender's "import images as planes," but I'm getting weird halos around them and was hoping RCP could make the process a bit easier/cleaner.
Posted by Plexofill. Last updated.
Post not yet marked as solved
0 Replies
160 Views
I'm following the Meet Reality Composer Pro walkthrough and ran into something that didn't function as expected. When I got to the step where I add five "Bird_With_Audio.usda" references to the scene, I found they did not play audio. After some trial and error, I found that Preview > Resource in each of their Spatial Audio items was set to "None." If I click the dropdown menu, I see several "Bird_Calls" groups to pick from. I checked the original Bird_With_Audio.usda that I had created, and the "Bird_Calls" audio group was correctly assigned and worked. I tried dragging a sixth Bird_With_Audio into the scene and confirmed that the Spatial Audio item suddenly empties, rendering the bird silent. I was able to go through each of the five birds and set their Spatial Audio Resource to Bird_Calls, and the group worked as the video demonstrates. While this fixed the issue, as a beginner I'd like to know why this happened. It doesn't seem right that I would build an item and then have to re-attach its sounds when I place it in the main scene. So…where did I mess up?
Posted by Plexofill. Last updated.
Post marked as solved
2 Replies
310 Views
I'm trying to make a simple demo of using ShaderGraphMaterial in a USDZ file that I can preview on Mac and visionOS, but I'm having trouble. In Reality Composer Pro, I make a sphere, then assign a ShaderGraphMaterial to it, with a simple diffuse color (green) input. When I save the file as .usda, it displays as a gray sphere on Mac rather than the green sphere shown in Reality Composer Pro. If I then convert to usdz using Reality Converter, I get a warning on import: "Shader nodes must have "id" as the implementationSource, with id values that begin with "Usd". Also, shader inputs with connections must each have a single, valid connection source." And the exported .usdz also shows as a gray sphere. Is there a simple demo of a .usda file using ShaderGraphMaterial that displays on Mac, iOS, and visionOS that I can look at to see how it looks internally? My actual problem is creating usdz/usda files on visionOS for viewing on iOS/Mac/visionOS… but the first step is showing it's possible to even use ShaderGraphMaterial across all platforms. Thanks.
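For the runtime side of such a demo, loading a Shader Graph material from a Reality Composer Pro package in Swift looks roughly like this sketch; the material path "/Root/GreenMaterial" and the scene name "MaterialScene" are placeholders, not names from the post:

import RealityKit
import RealityKitContent

// Hypothetical names: a material at /Root/GreenMaterial inside the
// Reality Composer Pro package is assumed to exist.
let material = try await ShaderGraphMaterial(named: "/Root/GreenMaterial",
                                             from: "MaterialScene",
                                             in: realityKitContentBundle)
let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1),
                         materials: [material])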
Posted by cc4. Last updated.
Post not yet marked as solved
2 Replies
363 Views
objc[27000]: Class XROS1_1SimRuntime is implemented in both /Library/Developer/CoreSimulator/Volumes/xrOS_21O209/Library/Developer/CoreSimulator/Profiles/Runtimes/xrOS 1.1.simruntime/Contents/MacOS/xrOS 1.1 (0x1025f80e0) and /Library/Developer/CoreSimulator/Volumes/xrOS_21O5181e/Library/Developer/CoreSimulator/Profiles/Runtimes/xrOS 1.1.simruntime/Contents/MacOS/xrOS 1.1 (0x1027c00e0). One of the two will be used. Which one is undefined.

error: Tool terminated by signal 'Segmentation fault: 11'

This build failure occurs every time I run a build.
Posted by B2D. Last updated.
Post not yet marked as solved
0 Replies
200 Views
I'm trying to better understand how loading entities works. If I do this:

RealityView { content in
    // Add the initial RealityKit content
    if let scene = try? await Entity(named: "RCP_Scene", in: realityKitContentBundle) {
        content.add(scene)
    }
}

it returns the root with the two objects I have in the scene (sphere_01 and sphere_02). If I add a drag gesture to this entity, it works on the root and gets applied to both sphere_01 and sphere_02 together (they both individually have collision and input components set to allow gestures). How do I get individual control of sphere_01 and sphere_02? Is it possible to load the root scene, as I'm doing above, and still have individual control?
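One answer, sketched under the assumption that the collision and input-target components sit on the spheres themselves (as the post says) rather than only on the root: a gesture targeted to any entity reports the specific child that was hit, which yields per-sphere control even when only the root scene is loaded.

.gesture(
    DragGesture()
        .targetedToAnyEntity()
        .onChanged { value in
            // value.entity is the individual sphere the gesture landed on
            // (sphere_01 or sphere_02), not the scene root.
            value.entity.position = value.convert(value.location3D,
                                                  from: .local,
                                                  to: value.entity.parent!)
        }
)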
Posted. Last updated.
Post not yet marked as solved
2 Replies
497 Views
Hi, I create an entity and add a bunch of attachments (code is based on the Diorama demo). I can rotate the entity with this:

.gesture(
    DragGesture()
        .targetedToAnyEntity()
        .onChanged { value in
            let entity = value.entity
            let orientation = Rotation3D(entity.orientation(relativeTo: nil))
            let newOrientation: Rotation3D
            if value.location.x >= lastGestureValue {
                newOrientation = orientation.rotated(by: .init(angle: .degrees(0.5), axis: .y))
            } else {
                newOrientation = orientation.rotated(by: .init(angle: .degrees(-0.5), axis: .y))
            }
            entity.setOrientation(.init(newOrientation), relativeTo: nil)
            lastGestureValue = value.location.x
        }
)

But the attachments stay still. How can I rotate the entity AND the attachments at the same time?
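The usual remedy, sketched with placeholder names: parent each attachment entity to the entity being rotated, so the attachment inherits every orientation change. The attachment id "Label", the scene name "DioramaScene", and the offset below are assumptions, not names from the post.

RealityView { content, attachments in
    if let entity = try? await Entity(named: "DioramaScene", in: realityKitContentBundle) {
        content.add(entity)
        if let label = attachments.entity(for: "Label") {
            label.position = [0, 0.3, 0] // placeholder offset above the entity
            entity.addChild(label)       // children follow the parent's rotation
        }
    }
} attachments: {
    Attachment(id: "Label") {
        Text("Label").padding().glassBackgroundEffect()
    }
}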
Posted by michelefu. Last updated.
Post not yet marked as solved
0 Replies
180 Views
Hi, I am investigating how to achieve emissive (glowing) materials like the following in my visionOS app. https://www.hiroakit.com/archives/1432 https://blog.terresquall.com/2020/01/getting-your-emission-maps-to-work-in-unity/ Right now I'm trying various things with Shader Graph in Reality Composer Pro, but from the official documentation and the WWDC session videos I can't tell what the individual Shader Graph nodes do or what effects their combinations produce, so I am having a hard time. I have a feeling that such luminous materials and expressions are not possible in visionOS to begin with. If there is a way to achieve this, please let me know. Thanks.
Posted. Last updated.