Reality Composer Pro

Prototype and produce content for AR experiences using Reality Composer Pro.

Program finger/hand gesture for snap turn/teleport navigation
Hello! I'm currently watching the "Envision the Future: Build great apps for visionOS" webinar, and lots of questions are coming up. Thanks for offering this online! For those of us with "VR legs", how can we go about setting up custom hand/finger gestures that would let us teleport and navigate within our fully immersive environments? Both smooth and snap turn/teleport options would be great, thanks! This is adjacent to my previous question on how to set up a PS5 controller to do something similar. Think Half-Life: Alyx as the gold standard for VR navigation.
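Not part of the original post: visionOS has no built-in snap-turn or teleport locomotion, so one approach is to read hand joints from ARKit's HandTrackingProvider and map a custom pose (here a simple thumb-to-index pinch, purely as an illustration) to a snap rotation of the immersive scene's root entity. A rough sketch, with the pinch threshold, turn angle, and entity wiring as placeholder assumptions:

import ARKit
import RealityKit
import simd

final class SnapTurnController {
    let session = ARKitSession()
    let handTracking = HandTrackingProvider()
    let worldRoot: Entity          // root of the fully immersive scene; rotating it "turns" the player
    private var wasPinching = false

    init(worldRoot: Entity) { self.worldRoot = worldRoot }

    func run() async throws {
        try await session.run([handTracking])
        for await update in handTracking.anchorUpdates {
            let anchor = update.anchor
            guard anchor.chirality == .right, let skeleton = anchor.handSkeleton else { continue }

            // World-space transforms of the thumb and index fingertips.
            let thumb = anchor.originFromAnchorTransform * skeleton.joint(.thumbTip).anchorFromJointTransform
            let index = anchor.originFromAnchorTransform * skeleton.joint(.indexFingerTip).anchorFromJointTransform
            let thumbPos = SIMD3(thumb.columns.3.x, thumb.columns.3.y, thumb.columns.3.z)
            let indexPos = SIMD3(index.columns.3.x, index.columns.3.y, index.columns.3.z)

            // Treat a ~2 cm thumb-to-index distance as a "pinch" (placeholder gesture and threshold).
            let isPinching = simd_distance(thumbPos, indexPos) < 0.02
            if isPinching && !wasPinching {
                // Snap-turn the world 30 degrees around the vertical axis.
                worldRoot.orientation = simd_quatf(angle: .pi / 6, axis: [0, 1, 0]) * worldRoot.orientation
            }
            wasPinching = isPinching
        }
    }
}

The same loop could drive teleportation by translating worldRoot instead of rotating it, and smooth turning by applying a small rotation every update while the pose is held.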
Replies: 0 · Boosts: 0 · Views: 53 · Activity: 11h
Control Player Camera with PS5 Controller on Vision Pro
I recently completed a freelance project where I was tasked with creating room-scale environments that could be used as AR elements. As a bonus, I suggested that these could be built to scale and repurposed for eventual viewing on Vision Pro. To illustrate, I was able to quickly create a simple immersive project in Xcode, add the USDZ file (authored in Maya, with baked lighting from Arnold) to Reality Composer Pro, and compile it for quick sending to the headset. I then did screen recordings inside the immersive space, which the client loved to see. However, I am unable to walk around due to the boundary limitations. My next obvious thought is: how can I set up the "player" camera so that I can control it with a PS5 controller inside AVP? In addition to Maya, I'm an Unreal Engine artist and have been waiting patiently to get any projects compiled for AVP. With the 5.5 release, I was able to get a VR Template test over to AVP, where I have rudimentary navigation control via the PS5 controller. Ideally, I'd also love to learn how to set this up natively, so I can take simple USDZ scenes created in Maya, import them into RCP, set up a simple camera controller, and then use it to navigate my VR immersive spaces on Vision Pro. How can we go about doing this? Part two of this question/suggestion: how would I go about controlling a rigged, animated character in AR/passthrough mode in a similar fashion? Thanks!
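Not from the original post: a minimal sketch of one way to approach this natively, reading a connected DualSense controller through the GameController framework and moving a "world root" entity that parents the Reality Composer Pro scene. The entity wiring, speeds, and the per-frame update hook are placeholder assumptions:

import GameController
import RealityKit
import simd

final class ControllerLocomotion {
    private var gamepad: GCExtendedGamepad?
    let worldRoot: Entity        // the RCP scene is parented under this; moving it moves the "player" through the world

    init(worldRoot: Entity) {
        self.worldRoot = worldRoot
        NotificationCenter.default.addObserver(forName: .GCControllerDidConnect,
                                               object: nil, queue: .main) { [weak self] note in
            self?.gamepad = (note.object as? GCController)?.extendedGamepad
        }
    }

    // Call once per frame, e.g. from a custom RealityKit System's update().
    func update(deltaTime: Float) {
        guard let pad = gamepad else { return }
        let moveSpeed: Float = 1.5   // metres per second (placeholder)
        let turnSpeed: Float = 1.2   // radians per second (placeholder)

        let strafe  = pad.leftThumbstick.xAxis.value
        let forward = pad.leftThumbstick.yAxis.value
        let yaw     = pad.rightThumbstick.xAxis.value

        // Move the world opposite to the stick so the viewer appears to walk forward.
        worldRoot.position += SIMD3<Float>(-strafe, 0, forward) * (moveSpeed * deltaTime)
        worldRoot.orientation = simd_quatf(angle: -yaw * turnSpeed * deltaTime, axis: [0, 1, 0]) * worldRoot.orientation
    }
}

The second part of the question (driving a rigged character in passthrough) could reuse the same input plumbing, swapping the world-root transform for the character entity's transform and animation state.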
Replies: 0 · Boosts: 0 · Views: 55 · Activity: 12h
Custom component question
I created a custom component for Composer Pro in which I have several variables I need an entity to have. The idea is to add this component to some 3D models, save them as USDZs, then load those USDZs in code and do specific things depending on these variables. The component shows up fine in Composer Pro and I can set the variables there. The problem is that the values I set in Composer Pro are different from what I see in code. Let's say in Composer Pro I set canMove = true; when I read it in code, it is set to false. I don't know if I'm missing something.

public struct MyObjectComponent: Component, Codable {
    public var affectAll: Bool = false
    public var affectFloor: Bool = false
    public var canMove: Bool = false
    public var moveX: Bool = false
    public var moveY: Bool = false
    public var moveZ: Bool = false
    public var canRotate: Bool = false
    public var rotateX: Bool = false
    public var rotateY: Bool = false
    public var rotateZ: Bool = false

    public init() { }
}

Any help appreciated. Guillermo
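Not part of the original post, but one common cause of this symptom is that the component type is never registered with RealityKit before the scene/USDZ is loaded, or that the struct compiled into the app is not the same declaration (same module, same name) that the Reality Composer Pro content package uses. A minimal sketch, assuming the component lives in the shared RealityKitContent package:

import RealityKit
import RealityKitContent   // assumed: the shared Swift package that Reality Composer Pro edits

func loadConfiguredModel() async throws -> Entity {
    // Register the type before loading any scene that uses it (doing this once at app launch is enough).
    MyObjectComponent.registerComponent()

    let entity = try await Entity(named: "MyModel", in: realityKitContentBundle)   // "MyModel" is a placeholder
    if let settings = entity.components[MyObjectComponent.self] {
        print("canMove:", settings.canMove)   // should now match the value set in Composer Pro
    }
    return entity
}

If the component was attached to a child of the loaded root rather than the root itself, look that child up with findEntity(named:) before reading the component.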
Replies: 4 · Boosts: 0 · Views: 192 · Activity: 6d
Rendering bug when layering transparent textures front and back
If I put an alpha image texture on a model created in Blender and run it in RCP or on visionOS, the rendering of front and back surfaces through the alpha produces unintended results. Details are below. I exported a USDC file of a Blender-created cylindrical object with a PNG (with alpha) texture applied to the inside, and then imported it into Reality Composer Pro. When multiple objects that make extensive use of transparent textures are placed in front of and behind each other, the following behaviors were observed in the transparent areas:
・The transparent areas do not become transparent
・The transparent areas become transparent together with the image behind them
・The order of the images becomes incorrect
Best regards.
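Not from the original post: when transparency sorting artifacts like these appear, one workaround that is sometimes viable is to switch the affected materials to alpha-cutout rendering by setting an opacity threshold after loading the model, which avoids back-to-front blending entirely. This is only a sketch, and it assumes hard-edged (cutout) transparency is acceptable for the content:

import RealityKit

// Recursively enable alpha cutout on every PhysicallyBasedMaterial in the model.
func applyOpacityThreshold(to entity: Entity, threshold: Float = 0.5) {
    if var model = entity.components[ModelComponent.self] {
        model.materials = model.materials.map { material -> any Material in
            guard var pbr = material as? PhysicallyBasedMaterial else { return material }
            pbr.opacityThreshold = threshold   // texels below the threshold are discarded instead of blended
            return pbr
        }
        entity.components.set(model)
    }
    for child in entity.children {
        applyOpacityThreshold(to: child, threshold: threshold)
    }
}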
Replies: 1 · Boosts: 0 · Views: 201 · Activity: 1w
Reality Composer Pro Timeline does not seem to work on iPhone 12 and 11
I created a project using Reality Composer Pro. When I export it to a .usdz file, it works well on iPhone 13, 14, 15, and 16, but not on iPhone 12 or 11. In the timeline I use an "On Added to Scene" behavior to trigger the intro animation and loop the background audio, but it does not work on older devices like the iPhone 12 and 11. All interactive taps/touch points/audio don't seem to work either. The iPhone 13, 14, 15, and 16 are on iOS 18.1; the iPhone 11 and 12 are on iOS 17.6.1. Here is a sample .usdz file exported from Reality Composer Pro 2.0 that shows the problem above: https://drive.google.com/file/d/1sHZn9JABTswLq2flYjToTbWDuE5T7eNw/view?usp=sharing
Replies: 0 · Boosts: 0 · Views: 89 · Activity: 1w
Distance calculation: how to implement the same Unity distance-calculation functionality natively with SwiftUI on visionOS
I use this piece of code in Unity to get the distance by which my model penetrates another model. I have set collision markers at the tip and end of the model and performed raycasting, but Unity currently does not support object tracking on visionOS, so I plan to use SwiftUI for native development. In Reality Composer Pro, I haven't seen a collision-editing feature like Unity's; I can only set the size of the collision body but cannot manually adjust or visualize its shape and size. I want to achieve similar functionality natively: calculate and display the distance that my model A (like a needle or ruler) penetrates into another model or into a physical object's interior. Is there similar functionality available, or another coding approach to achieve this?

void CalculateLengthInsideOrgan()
{
    // Direction from the base of the probe to the tip
    Vector3 direction = probeTip.position - probeBase.position;
    float probeLength = direction.magnitude;

    // Raycasting
    RaycastHit[] hits = Physics.RaycastAll(probeBase.position, direction, probeLength, organLayerMask);
    if (hits.Length > 0)
    {
        // Calculate the length entering the organ
        float distanceToFirstHit = hits[0].distance;
        lengthInsideOrgan = probeLength - distanceToFirstHit;
    }
    else
    {
        lengthInsideOrgan = 0f;
    }
}
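Not part of the original post: a rough RealityKit equivalent of the Unity snippet, assuming probeBase and probeTip are entities placed at the ends of the probe and the target model has a CollisionComponent (for example generated with generateCollisionShapes(recursive:)); names and the collision mask are illustrative:

import RealityKit
import simd

// Measure how far the probe has penetrated a collidable model by casting a ray
// from the probe base toward the probe tip.
func lengthInsideOrgan(probeBase: Entity, probeTip: Entity, scene: RealityKit.Scene) -> Float {
    let basePos = probeBase.position(relativeTo: nil)   // world-space positions
    let tipPos  = probeTip.position(relativeTo: nil)
    let offset = tipPos - basePos
    let probeLength = simd_length(offset)
    guard probeLength > 0 else { return 0 }

    let hits = scene.raycast(origin: basePos,
                             direction: offset / probeLength,
                             length: probeLength,
                             query: .nearest,
                             mask: .all)   // narrow this to the organ's CollisionGroup if one is assigned

    guard let firstHit = hits.first else { return 0 }
    return probeLength - firstHit.distance
}

This only measures penetration into models that have collision shapes; measuring into a real physical object would additionally require scene reconstruction or object-tracking data to provide geometry to raycast against.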
Replies: 1 · Boosts: 0 · Views: 254 · Activity: 1w
Custom 3D Window Using RealityView
I have a RealityView displaying a Reality Composer Pro scene in a window. Things are generally working fine, but the content appears in front of and blocks the visionOS window, rather than being contained inside it. Do I need to switch to a volumetric view for this to work? My scene simply contains a flat display that renders 3D content (it has a material that sends different imagery to each eye).
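Not from the original post: if the 3D content should be contained within the window's bounds rather than drawn in front of it, one option is to present the RealityView in a volumetric window. A minimal sketch, with the window id, scene name, and size as placeholder values:

import SwiftUI
import RealityKit
import RealityKitContent   // assumed: the Reality Composer Pro content package

struct ViewerWindow: SwiftUI.Scene {
    var body: some SwiftUI.Scene {
        WindowGroup(id: "viewer") {
            RealityView { content in
                if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                    content.add(scene)
                }
            }
        }
        .windowStyle(.volumetric)                                    // gives the window a 3D bounding box
        .defaultSize(width: 0.5, height: 0.5, depth: 0.5, in: .meters)
    }
}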
Replies: 3 · Boosts: 0 · Views: 209 · Activity: 2w
[Reality Composer Pro] How to Enable Entities Mid-Timeline Without Early Appearance?
Hi! I'm using the Timeline in Reality Composer Pro. I tried to use the 'Enable Entities' action to enable, partway through Timeline playback, entities that were disabled at the start of the scene, but it didn't work the way I imagined: the entities kept appearing before the Timeline started. How do I solve this problem? Are there good solutions for it?
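Not part of the original post: one hedged workaround is to force the entity off in code right after the scene loads, so that the timeline's Enable Entities action is what first reveals it. The entity name is a placeholder, and the snippet is assumed to run in an async context such as a RealityView's make closure:

import RealityKit
import RealityKitContent   // assumed content package name

let scene = try await Entity(named: "Scene", in: realityKitContentBundle)
// Keep the entity hidden until the timeline's Enable Entities action turns it on.
scene.findEntity(named: "RevealLater")?.isEnabled = false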
Replies: 1 · Boosts: 0 · Views: 180 · Activity: 2w
[Reality Composer Pro] Is it possible to play a timeline via Xcode without adding a Behaviors component to entities?
Hi! I'm making a project with Xcode and Reality Composer Pro. I'm trying to play a timeline in Reality Composer Pro from code, without setting Behaviors on entities. I also tried to send a notification from Xcode to entities in Reality Composer Pro to play the timeline (I already set "OnNotification" with a Behaviors component), but it's not working, and I couldn't figure out the problem. Are there solutions for this?
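Not part of the original post: for the OnNotification route, the pattern used in Apple's sample code is to post a "RealityKit.NotificationTrigger" notification whose userInfo carries the scene and the trigger's identifier. A sketch, with "StartTimeline" standing in for whatever identifier was entered in the Behaviors component:

import Foundation
import RealityKit

// Fire an OnNotification trigger defined in Reality Composer Pro.
// rootEntity is the loaded Reality Composer Pro scene that owns the Behaviors component.
func triggerTimeline(on rootEntity: Entity) {
    NotificationCenter.default.post(
        name: Notification.Name("RealityKit.NotificationTrigger"),
        object: nil,
        userInfo: [
            "RealityKit.NotificationTrigger.Scene": rootEntity.scene as Any,   // the RealityKit scene the entity lives in
            "RealityKit.NotificationTrigger.Identifier": "StartTimeline"
        ]
    )
}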
Replies: 1 · Boosts: 0 · Views: 300 · Activity: 3w
Beginner question: Keyframes?
I am new here, my name is Axel - Hi! 👋 I am both a visionOS beginner and an expert in things 3D, now finding my way into Reality Composer Pro. Most things are intuitive and easy to understand, yet there is one thing I cannot seem to figure out, and I feel really stupid because it must be there, and that's keyframing: I would simply make a new Timeline and animate published parameters of a Shader Graph over time. I know how to do this via Xcode and custom components, but that can't be the intended way, because it breaks down even for simple-to-medium animations in terms of previewing and fine-tuning; it's complete overkill for a simple value animation. Since keyframing is the most basic functionality of 3D apps next to moving things around, I am sure it is in RCP somewhere. Anyone got a pointer for me?
Replies: 1 · Boosts: 0 · Views: 229 · Activity: 4w
How to use KTX file Image3D in ShaderGraph?
I've tried importing a 3D KTX file into the Reality Composer Pro Shader Graph, but got no output. Does RealityKit not support Image3D yet? First, I use pyktx to create a 3D KTX texture file, and it looks correct in the Finder preview.

import numpy as np
from pyktx import KtxTexture2, KtxTextureCreateInfo, KtxTextureCreateStorage, VkFormat

size = 32
image = np.zeros((size, size, size, 3), dtype=np.uint8)
for i in range(size):
    image[i, :, :, 0] = np.interp(i, [0, size], [50, 255])
    image[:, i, :, 1] = np.interp(i, [0, size], [100, 255])
    image[:, :, i, 2] = np.interp(i, [0, size], [150, 255])

info = KtxTextureCreateInfo(
    gl_internal_format=None,
    vk_format=VkFormat.VK_FORMAT_R8G8B8_SRGB,
    base_width=image.shape[2],
    base_height=image.shape[1],
    base_depth=image.shape[0],
    num_dimensions=3,
    num_levels=1,
    num_layers=1,
    num_faces=1,
    generate_mipmaps=False,
    is_array=False,
)

texture = KtxTexture2.create(info, KtxTextureCreateStorage.ALLOC)
for _ in range(1):
    texture.set_image_from_memory(0, 0, _, image[_].tobytes())
texture.write_to_named_file(f'{size}.ktx')

Then I import it into the Shader Graph like this, but nothing seems to be read. In addition, I tested a 2D KTX file and it worked. I also tested different vk_format values and KTX1/KTX2, but those did not work either. I also tested Image3DPixel/Image2DArray/Image3DRead, and none of them work as long as the texture is 3D.
Replies: 3 · Boosts: 0 · Views: 226 · Activity: Oct ’24
Protobuf Error Crash
We are developing a mixed reality app for Vision Pro using Reality Composer Pro, but we're consistently encountering a Protobuf-related crash whenever we enter the immersive space. Our Reality Composer Pro package is quite complex, with numerous objects. Could the complexity of the package be contributing to the issue, or could something else be at play? No errors are being flagged in the code or in Reality Composer itself. Here's the error log:

[libprotobuf FATAL /Library/Caches/com.apple.xbs/Sources/REKit/ThirdParty/protobuf/src/google/protobuf/io/zero_copy_stream_impl_lite.cc:276] CHECK failed: (count) >= (0):
libc++abi: terminating due to uncaught exception of type google::protobuf::FatalException: CHECK failed: (count) >= (0):

Any insight into what might be causing this would be appreciated.
Replies: 1 · Boosts: 0 · Views: 305 · Activity: Oct ’24
Tips for developing shader graphs in Reality Composer Pro
I am new to the graph editor and was able to achieve some results. However, I am noticing that my graphs are getting very tangled, confusing, and hard to debug. I was wondering: is it possible to define variables that store the value of computations and refer to them in other parts of the graph, without having to link them graphically? This would help tidy the tangled mess I created. In the "Explore materials in Reality Composer Pro" video, I saw that it is possible to create "instances", but I am not sure if that is what I need. For example, does the shader compiler optimize them so that there is no need to recompute each instance? Is there any functionality to debug the graph, trying inputs and seeing what the numeric outputs would be?
Replies: 1 · Boosts: 0 · Views: 268 · Activity: Oct ’24
Creating and applying a shader to change an entity's rendering in RealityKit
Hello, I am looking to create a shader to update an entity's rendering. As a basic example, say I want to recolour an entity but leave its original textures showing through. I understand that on visionOS I need to use Reality Composer Pro to create the shader, but I'm lost as to how to reference the original colour that I'm trying to update in the node graph. All my attempts appear to completely override the textures in the entity (and its sub-entities) that I want to affect. Also, the tutorials/examples I've looked at appear to create materials rather than add an effect on top of existing materials. Any hints or pointers? Assuming this is possible, I've been trying to load the material in code and apply it to an entity. But do I need to do this for all child entities, or just the topmost?

do {
    let entity = MyAssets.createModelEntity(.plane) // Loads from bundle and performs config
    let material = try await ShaderGraphMaterial(named: "/Root/TestMaterial", from: "Test", in: realityKitContentBundle)
    entity.applyToChildren {
        $0.components[ModelComponent.self]?.materials = [material]
    }
    root.addChild(entity)
} catch {
    fatalError(error.localizedDescription)
}
Replies: 3 · Boosts: 0 · Views: 293 · Activity: Oct ’24
How to use CubeMap in Reality Composer Pro?
I just followed the documentation at https://developer.apple.com/documentation/shadergraph/realitykit/cube-image-(realitykit) . I have a .ktx file and use the Cube Image node to load it, then a Convert node, but it shows black. I checked it on my Vision Pro and it's still black; I don't know why. Is something wrong? PS: I also used the Image node to load the .ktx file, and it shows one image, so I believe the .ktx file is correct. I also checked that on the Vision Pro.
Replies: 2 · Boosts: 0 · Views: 266 · Activity: Oct ’24