Reality Composer Pro

Prototype and produce content for AR experiences using Reality Composer Pro.

Rotate an entity with its attachments
Hi, I create an entity and add a bunch of attachments (the code is based on the Diorama demo). I can rotate the entity with this:

```swift
.gesture(
    DragGesture()
        .targetedToAnyEntity()
        .onChanged { value in
            let entity = value.entity
            let orientation = Rotation3D(entity.orientation(relativeTo: nil))
            let newOrientation: Rotation3D
            if value.location.x >= lastGestureValue {
                newOrientation = orientation.rotated(by: .init(angle: .degrees(0.5), axis: .y))
            } else {
                newOrientation = orientation.rotated(by: .init(angle: .degrees(-0.5), axis: .y))
            }
            entity.setOrientation(.init(newOrientation), relativeTo: nil)
            lastGestureValue = value.location.x
        }
)
```

But the attachments stay still. How can I rotate the entity AND the attachments at the same time?
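
A minimal sketch of the usual fix, assuming the attachments come from `attachments.entity(for:)`: add each attachment as a child of the entity you rotate rather than adding it to `content`, so the orientation change carries it along. The asset name "Immersive" and attachment ID "Label" are placeholders.

```swift
import SwiftUI
import RealityKit
import RealityKitContent
import Spatial

struct RotatingModelView: View {
    @State private var lastGestureValue: CGFloat = 0

    var body: some View {
        RealityView { content, attachments in
            // "Immersive" and "Label" are placeholder names.
            guard let model = try? await Entity(named: "Immersive", in: realityKitContentBundle) else { return }

            // Attachments parented to the model rotate with it; attachments
            // added directly to `content` stay fixed in the scene.
            if let label = attachments.entity(for: "Label") {
                label.position = [0, 0.3, 0]
                model.addChild(label)
            }

            // Make the model itself the gesture target.
            let bounds = model.visualBounds(relativeTo: nil)
            model.components.set(CollisionComponent(shapes: [.generateBox(size: bounds.extents)]))
            model.components.set(InputTargetComponent())
            content.add(model)
        } attachments: {
            Attachment(id: "Label") {
                Text("Label").padding().glassBackgroundEffect()
            }
        }
        .gesture(
            DragGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    // Same rotation logic as in the post; the attachment
                    // follows because it is a child of the rotated entity.
                    let entity = value.entity
                    let orientation = Rotation3D(entity.orientation(relativeTo: nil))
                    let delta: Double = value.location.x >= lastGestureValue ? 0.5 : -0.5
                    let newOrientation = orientation.rotated(by: .init(angle: .degrees(delta), axis: .y))
                    entity.setOrientation(.init(newOrientation), relativeTo: nil)
                    lastGestureValue = value.location.x
                }
        )
    }
}
```
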
2 replies · 0 boosts · 629 views · Mar ’24

Custom material getting converted to PhysicallyBasedMaterial
I have a custom material in Reality Composer. When I attach it to a cube and try loading the scene in Xcode, the material cannot be cast to a ShaderGraphMaterial because it has been changed to a PhysicallyBasedMaterial. The material was always a custom material; I did not change the type in Reality Composer. Does anyone know how to fix this?
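
A workaround sketch, assuming the material lives in a Reality Composer Pro package: load the shader-graph material explicitly by its scene path and re-apply it, instead of casting whatever comes back with the loaded entity. The names "Scene.usda", "/Root/CustomMaterial", and "Cube" are hypothetical.

```swift
import RealityKit
import RealityKitContent

/// Loads the shader-graph material directly from the Reality Composer Pro
/// package and re-applies it, bypassing the cast that fails.
/// "Scene.usda", "/Root/CustomMaterial", and "Cube" are placeholder names.
func applyCustomMaterial(to scene: Entity) async throws {
    let material = try await ShaderGraphMaterial(
        named: "/Root/CustomMaterial",
        from: "Scene.usda",
        in: realityKitContentBundle
    )
    if let cube = scene.findEntity(named: "Cube") as? ModelEntity {
        cube.model?.materials = [material]
    }
}
```
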
1 reply · 0 boosts · 655 views · Mar ’24

Converting a Unity model / prefab to USDZ
We are porting an iOS Unity AR app to native visionOS. Ideally, we want to reuse our AR models in both applications. These AR models are rather simple, but converting them manually would still be time-consuming, especially when it comes to the shaders. Is anyone aware of any attempts to write conversion tools for this? Maybe in other ecosystems like Godot or Unreal, where folks also want to convert the proprietary Unity format to something else? I've seen there's an FBX converter, but that wouldn't handle shaders or particles. I am basically looking for something like PolySpatial's internal conversion tools, but without the weight of the rest of Unity. Alternatively, is there a way to export a Unity project to visionOS and then just take the models out of the Xcode project?
0 replies · 0 boosts · 367 views · Mar ’24

Adding 2D PNG to Reality Composer Pro
I've got a couple of 2D PNG assets that I want to add to a scene made of a couple of other usdz files in RCP (picture adding a couple of 2D videogame characters to a simple 3D diorama). When I try to drag the PNGs into the workspace or the file tree…nothing happens. I found a walkthrough on Medium (called "Importing and Exporting Personalized Objects for Augmented Reality: Reality Composer and SwiftUI", for those curious, as I can't link to Medium posts here) that makes it look like users could do this with a simple drag-and-drop. The Medium post is from June 2023, and in the screenshots RCP visually looks a lot more like Reality Composer on iPad, so I'm assuming it's changed a lot since then? Is there still a way to do this? I've tried adding the 2D elements to a scene with Blender's "import images as planes", but I'm getting weird halos around them and was hoping RCP could make the process a bit easier/cleaner.
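
As a code-side workaround (not an RCP answer), a textured plane with an unlit, alpha-blended material tends to avoid the halo problem; a sketch, where "character" is a placeholder for a PNG in the app bundle and the 0.3 m size is arbitrary:

```swift
import RealityKit

/// Builds a flat "sprite" entity from a PNG, using an unlit material with
/// alpha blending so the transparent edges don't pick up halos.
/// "character" is a placeholder asset name.
func makeSpriteEntity() throws -> ModelEntity {
    let texture = try TextureResource.load(named: "character")
    var material = UnlitMaterial()
    material.color = .init(texture: .init(texture))
    material.blending = .transparent(opacity: .init(floatLiteral: 1.0))
    let mesh = MeshResource.generatePlane(width: 0.3, height: 0.3)
    return ModelEntity(mesh: mesh, materials: [material])
}
```
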
1 reply · 0 boosts · 423 views · Mar ’24

USDZ + ShaderGraphMaterial not working?
I'm trying to make a simple demo of using ShaderGraphMaterial in a USDZ file that I can preview on Mac and visionOS, but I'm having trouble. In Reality Composer, I make a sphere, then assign a ShaderGraphMaterial to the material, with a simple diffuse color (green) input. When I save the file as .usda, it displays as a gray sphere on Mac rather than the green sphere shown in Reality Composer.

If I then convert to .usdz using Reality Converter, I get a warning on import: "Shader nodes must have “id” as the implementationSource, with id values that begin with “Usd”. Also, shader inputs with connections must each have a single, valid connection source." And the exported .usdz also shows as a gray sphere.

Is there a simple demo of a .usda file using ShaderGraphMaterial that displays on Mac, iOS, and visionOS that I can look at to see how it looks internally? My actual problem is creating .usdz / .usda files on visionOS for viewing on iOS / Mac / visionOS, but the first step is showing it's possible to even use ShaderGraphMaterial across all platforms. Thanks
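
For what it's worth, that Reality Converter warning is the USDZ interchange spec talking: USDZ only accepts shader nodes whose `info:id` begins with `Usd` (UsdPreviewSurface, UsdUVTexture, and friends), so a ShaderGraphMaterial graph doesn't survive the export. A hand-written sketch (not from the post) of a minimal .usda that should show a green sphere on Mac, iOS, and visionOS using the portable UsdPreviewSurface instead:

```usda
#usda 1.0
(
    defaultPrim = "Root"
)

def Xform "Root"
{
    def Sphere "GreenSphere"
    {
        rel material:binding = </Root/GreenMaterial>
        double radius = 0.1
    }

    def Material "GreenMaterial"
    {
        token outputs:surface.connect = </Root/GreenMaterial/Surface.outputs:surface>

        def Shader "Surface"
        {
            uniform token info:id = "UsdPreviewSurface"
            color3f inputs:diffuseColor = (0, 1, 0)
            token outputs:surface
        }
    }
}
```
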
2 replies · 0 boosts · 561 views · Mar ’24

"Meet Reality Composer Pro" - Spatial Audio Problem
I'm following the Meet Reality Composer Pro walkthrough and ran into something that didn't function as expected. When I got to the step where I add five "Bird_With_Audio.usda" references to the scene, I found they did not play audio. After some trial and error, I found that Preview > Resource in each of their Spatial Audio items was set to "None". If I click the dropdown menu, I see several "Bird_Calls" groups to pick from. I checked the original Bird_With_Audio.usda that I had created, and the "Bird_Calls" audio group was correctly assigned and worked. I tried dragging a sixth Bird_With_Audio into the scene and confirmed that its Spatial Audio item suddenly empties, rendering the bird silent. I was able to go through each of the five birds and set their Spatial Audio resource to Bird_Calls, and the group worked like the video demonstrates. While this fixed the issue, as a beginner I'd like to know why it happened. It doesn't seem right that I would build an item and then have to re-attach its sounds whenever I place it in the main scene. So…where did I mess up?
0 replies · 0 boosts · 290 views · Mar ’24

Change ModelEntity rotation axis to that of the child entity
Hi, I'm trying to have an entity (and some attachments to it) rotate. If I add the entity to content, add the attachment as a child entity, and give the entity an InputTargetComponent, then when I add a gesture ONLY the entity rotates and NOT the attachments (added as child entities). If I instead create a parent entity with let parentEntity = ModelEntity(), add my entity to the parentEntity, then add the attachments to my entity (which is now a child of the ModelEntity) and give the ModelEntity the InputTargetComponent, then the whole thing rotates (including attachments). I'm sure there must be a bug; why would it work only with an added ModelEntity?

Anyway, bug or not, the problem I have now is that it rotates around the axes of the ModelEntity, not my primary entity, which is what I want. Is there a way to set the ModelEntity's axes to be the axes of my primary child entity so it rotates like I want? What call should I use to move the axes, and where would I find the axes of the first child entity, which should be the focus of my app? Here is my code:

```swift
var body: some View {
    RealityView { content, attachments in
        // Add the initial RealityKit content
        if let specimenentity = try? await Entity(named: "Immersive", in: realityKitContentBundle) {
            let parentEntity = ModelEntity()
            parentEntity.addChild(specimenentity)
            content.add(parentEntity)

            let entityBounds = specimenentity.visualBounds(relativeTo: parentEntity)
            parentEntity.collision = CollisionComponent(shapes: [
                ShapeResource.generateBox(size: entityBounds.extents)
                    .offsetBy(translation: entityBounds.center)
            ])
            parentEntity.generateCollisionShapes(recursive: true)
            parentEntity.components.set(InputTargetComponent())

            if let Left_Hemisphere = attachments.entity(for: "Left_Hemisphere") {
                // 4. Position the Attachment and add it to the RealityViewContent
                Left_Hemisphere.position = [-0.5, 1, 0]
                specimenentity.addChild(Left_Hemisphere)
            }
        }
    } attachments: {
        Attachment(id: "Left_Hemisphere") {
            // 2. Define the SwiftUI View
            Text("Left_Hemisphere")
                .font(.extraLargeTitle)
                .padding()
                .glassBackgroundEffect()
        }
    }
    .gesture(
        DragGesture()
            .targetedToAnyEntity()
            .onChanged { value in
                let entity = value.entity
                var orientation = Rotation3D(entity.orientation(relativeTo: nil))
                var newOrientation: Rotation3D

                if value.location.x >= lastGestureValueX {
                    newOrientation = orientation.rotated(by: .init(angle: .degrees(0.5), axis: .y))
                } else {
                    newOrientation = orientation.rotated(by: .init(angle: .degrees(-0.5), axis: .y))
                }
                entity.setOrientation(.init(newOrientation), relativeTo: nil)
                lastGestureValueX = value.location.x

                orientation = Rotation3D(entity.orientation(relativeTo: nil))
                if value.location.y >= lastGestureValueY {
                    newOrientation = orientation.rotated(by: .init(angle: .degrees(0.5), axis: .x))
                } else {
                    newOrientation = orientation.rotated(by: .init(angle: .degrees(-0.5), axis: .x))
                }
                entity.setOrientation(.init(newOrientation), relativeTo: nil)
                lastGestureValueY = value.location.y
            }
    )
}
```
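
One way to get the pivot onto the child, sketched under the assumption that the entity names match the post: place the parent's origin at the child's visual-bounds center and shift the child back by the same offset, so rotating the parent spins the model around its own center. `makePivot` is a hypothetical helper.

```swift
import RealityKit

/// Re-parents `specimen` under a pivot entity whose origin sits at the
/// specimen's visual center, so a gesture that rotates the pivot spins
/// the model (and its attachment children) around itself.
func makePivot(for specimen: Entity) -> ModelEntity {
    let pivot = ModelEntity()
    let bounds = specimen.visualBounds(relativeTo: nil)

    pivot.position = bounds.center        // pivot origin at the model's center...
    specimen.position -= bounds.center    // ...while the model stays put in the scene

    pivot.addChild(specimen)
    pivot.components.set(InputTargetComponent())
    pivot.collision = CollisionComponent(shapes: [
        .generateBox(size: bounds.extents)
    ])
    return pivot
}
```

In the RealityView closure this would replace the manual parentEntity setup: `content.add(makePivot(for: specimenentity))`.
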
4 replies · 0 boosts · 642 views · Mar ’24

How to visualize collision components in Reality Composer Pro?
I set up an entity with a collision component on it, but it was hard to target the object with a tap gesture until I increased the radius quite a bit. Now I am unsure if it is too large. Is there a way to visualize these components somehow, maybe even in a running scene? Also, I find it pretty confusing that the size is given in cm. That made me wonder whether this cm setting is affected by the entity's scale at all. In Unity, it's just (local) "units".
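
I don't know of an RCP toggle for this, but one debugging trick is to attach a translucent child entity with the same shape and size as the collider so the volume is visible in the running scene; a sketch for a sphere collider, where the radius default is a placeholder:

```swift
import UIKit
import RealityKit

/// Attaches a translucent green sphere matching a sphere collider's radius,
/// making the collision volume visible at runtime. Pass the same radius the
/// CollisionComponent uses; 0.1 m here is just a placeholder default.
func addCollisionDebugSphere(to entity: Entity, radius: Float = 0.1) {
    var material = UnlitMaterial(color: .green)
    material.blending = .transparent(opacity: .init(floatLiteral: 0.25))
    let debugSphere = ModelEntity(
        mesh: .generateSphere(radius: radius),
        materials: [material]
    )
    entity.addChild(debugSphere)
}
```
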
1 reply · 0 boosts · 467 views · Apr ’24

How to set the scale unit of an Entity in Reality Composer Pro
How do I set the scale unit of an Entity in Reality Composer Pro? For example, if the scale value is 1 meter, then when the Entity is placed in a RealityView, the displayed size should be 1 meter. If the unit of scale cannot be set in Reality Composer Pro, is there a way to specify it in code so that the Entity is displayed in meters when added to a RealityView? Thank you.
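
For context, RealityKit's world units are meters (1 unit = 1 m), so a common code-side approach is to measure the loaded entity's bounds and scale it to the physical size you want; a sketch with a hypothetical helper:

```swift
import RealityKit

/// Uniformly scales `entity` so its largest visual dimension measures
/// `targetMeters` in the scene. RealityKit units are meters (1 unit = 1 m).
func scale(_ entity: Entity, toLargestDimension targetMeters: Float) {
    let extents = entity.visualBounds(relativeTo: nil).extents
    let largest = max(extents.x, extents.y, extents.z)
    guard largest > 0 else { return }
    entity.scale *= SIMD3<Float>(repeating: targetMeters / largest)
}
```
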
2 replies · 0 boosts · 374 views · May ’24

Using the node RealityKitTexture2DLOD
I'm trying to control the LOD of textures in an app for Vision Pro. With the default image node in Composer Pro the UVs are correct, but the LOD is not what I want; I would like to have control over it. I see there is a node called "RealityKitTexture2DLOD", but as soon as I try to use that one the UVs are all messed up. Am I missing something? Do we need to do something specific to use this node? I tried to use the "Place 2D" and "UsdTransform2d" nodes but could not get the texture to align. Any help appreciated.
2 replies · 0 boosts · 327 views · May ’24

Build errors for iOS for my visionOS app
I'm taking my iOS/iPadOS app and converting it so it runs on visionOS. I'm trying to compile the app, and build it, for both visionOS and iOS. When I try to build for an iPhone or iPad simulator, I get the following error: "Building for 'iphonesimulator', but realitytool only supports [xros, xrsimulator]". I'm thinking I might need an #if conditional-compilation statement so iOS doesn't try to build visionOS-only lines of code, but for this particular error I can't work out which file or code needs the conditional compilation. Does anyone know how to get rid of this error?
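
Two hedged pointers, since the post doesn't show the project setup: realitytool is the tool that compiles Reality Composer Pro (.rkassets) content, so the error usually points at a RealityKitContent-style package left in the iOS target's dependencies; removing it from that target (or restricting the package's platforms) is the typical fix, though that cause is an assumption worth verifying. Source-level visionOS code is then fenced with platform conditionals, for example:

```swift
import RealityKit

#if os(visionOS)
import RealityKitContent   // the Reality Composer Pro package; visionOS-only here
#endif

enum SceneLoader {
    #if os(visionOS)
    // visionOS build: load from the Reality Composer Pro bundle.
    static func loadScene() async throws -> Entity {
        try await Entity(named: "Immersive", in: realityKitContentBundle)
    }
    #else
    // iOS/iPadOS build: fall back to a USDZ in the app bundle.
    // "Immersive" is a placeholder asset name.
    static func loadScene() async throws -> Entity {
        try await Entity(named: "Immersive")
    }
    #endif
}
```
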
2 replies · 0 boosts · 476 views · May ’24