"Although Xcode generates loading methods for all Reality Composer files in your Xcode project"
I do not find this to be true, sadly.
Does anyone have any luck or insight on how to build just a simple macOS app that imports a scene from a .reality file?
The documentation suggests that the simple act of bringing a .reality file in (what about .realitycomposerpro?) will generate code, but that doesn't seem to happen.
The sample code (Spaceship) does not compile for macOS.
I'd really love just the most generic template of an Xcode project that compiles, with a button that pops open a scene, like the visionOS default immersive project.
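For what it's worth, here is a minimal sketch of that kind of template, assuming macOS 15 or later and a Reality Composer Pro Swift package named RealityKitContent added to the project; SceneViewerApp, ContentView, and the "Scene" asset name are placeholders, not anything generated by Xcode:

import SwiftUI
import RealityKit
import RealityKitContent // the Reality Composer Pro package, assumed to be named RealityKitContent

@main
struct SceneViewerApp: App {
    var body: some SwiftUI.Scene {
        WindowGroup {
            ContentView()
        }
    }
}

struct ContentView: View {
    @State private var showScene = false

    var body: some View {
        VStack {
            Button("Open Scene") { showScene = true }

            if showScene {
                RealityView { content in
                    // "Scene" is the default scene name in a new Reality Composer Pro
                    // package; replace it with the name of your own scene.
                    if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                        content.add(scene)
                    }
                }
            }
        }
    }
}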
Reality Composer Pro
Leverage the all-new Reality Composer Pro, designed to make it easy to preview and prepare 3D content for your visionOS apps.
Posts under Reality Composer Pro tag
Hey there, I am working on an app that displays environmental data using PNG color channels to represent data ranges, which gets overlaid on a map. The sampled values aren't what I'm expecting, though: for example, an RGB value of 0x7f0000 (R = 0.5, G = 0, B = 0) comes through as (0.21, 0, 0) in the shader. This basically makes it unusable for showing scientific data. I'm half wondering if I'm completely misunderstanding how sampling works in RealityKit / Reality Composer Pro. Does anybody have any idea why it works like this?
[Images attached: the actual result (chart labels added in Photoshop), the expected result, and the "Red > 0.1" shader graph.]
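One thing worth noting (an observation from the numbers, not something confirmed in the post): 0x7f/255 ≈ 0.498, and if the texture is treated as sRGB-encoded color, decoding that value to linear light gives roughly 0.21, which matches what the shader reports. A small sketch of the standard sRGB decode, assuming that is what is going on:

import Foundation

// Standard sRGB decode: converts an sRGB-encoded channel value to linear light.
func srgbToLinear(_ c: Double) -> Double {
    c <= 0.04045 ? c / 12.92 : pow((c + 0.055) / 1.055, 2.4)
}

let encoded = Double(0x7f) / 255.0   // ≈ 0.498, the red channel of 0x7f0000
print(srgbToLinear(encoded))         // ≈ 0.212, close to the 0.21 seen in the shader

If that conversion is indeed the cause, treating the PNG as raw data rather than color (or re-encoding the value in the shader graph) would be the direction to investigate.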
Hi guys, I have set the size of a cube to 20 cm. Then I want to set its transform scale to x: 0.048, y: 1, z: 0.048, but strangely the panel resets all values to x: 1, y: 1, z: 1. I then tried other decimals such as x: 1.2, y: 2, z: 1.2, and the panel again automatically resets them to x: 1, y: 1, z: 1. My question is: how can I set a transform scale that accepts decimal numbers?
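As a possible workaround (a sketch, assuming the cube is also loaded as an entity in code), the scale can be applied programmatically after loading, which sidesteps the inspector entirely:

import RealityKit

// Assumes `cube` is the 20 cm cube entity loaded from the Reality Composer Pro scene.
func applyScale(to cube: Entity) {
    cube.transform.scale = SIMD3<Float>(0.048, 1.0, 0.048)
}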
Hi
Hopefully someone can share some ideas on how to accomplish this.
I know we can load models from realityKitContentBundle like
let model = try? await Entity(named: "testModel", in: realityKitContentBundle)
But this looks in the root of RealityKitContent.rkassets; if I have the models in some subfolder, then I have to add the complete path, like
let model = try? await Entity(named: "/superModels/testModel", in: realityKitContentBundle)
What I want is to be able to search recursively in all folders for that file as I have several subfolders with different models.
Any suggestions?
Thanks in advance.
Guillermo
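A rough sketch of one way around this, assuming the subfolder names can be listed up front; Entity(named:in:) has no built-in recursive lookup, so this simply tries each known folder in turn (loadEntity and the folder names are placeholders):

import RealityKit
import RealityKitContent

// Try each candidate subfolder of RealityKitContent.rkassets until one of them
// contains an entity with the given name.
func loadEntity(named name: String, searching folders: [String]) async -> Entity? {
    for folder in folders {
        let path = folder.isEmpty ? name : "\(folder)/\(name)"
        if let entity = try? await Entity(named: path, in: realityKitContentBundle) {
            return entity
        }
    }
    return nil
}

// Usage:
// let model = await loadEntity(named: "testModel", searching: ["", "superModels"])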
Hey, I am having issues getting MaterialX shaders that I've authored in Houdini to work properly in Reality Composer Pro.
The shader is very simple. It starts with a tiled image node that is written to the diffuse color of the preview surface node. This node is called mtxltileimage2.
When I create a tiled image node in RCP and configure it to have the same parameter values I get the texture to show up correctly. This node is called TiledImage.
One difference I can identify is that the second node has a grey icon whereas the first node has a blue icon. Could this be related to this issue?
Here is the USD viewer output for the two variants of the tiled image node.
Any pointers, corrections of misconceptions, or help would be greatly appreciated. My goal is to be able to author these shaders in Houdini and import them into RCP; I'm trying to figure out the right pipeline for this workflow.
I'm having issues getting Collision Shapes working in Reality Composer on iPadOS, and with Reality Composer Pro via Xcode on macOS.
I’ve posted a video recorded through my Vision Pro showing the issue.
The project I'm working on is a dice-rolling application. The dice don't appear to work when set to Collision Shape = Automatic, which I assumed takes the actual silhouette of the shape into account.
https://youtu.be/upPtQY4QOAk?si=yyx6rbSSmVkLxBLg
They also don’t rest on their face when they land.
Has anyone experienced this type of behavior and found a solution? I'm currently doing this with Reality Composer, but I'll most likely want to get it working properly in Reality Composer Pro as well.
Thx!
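On the Reality Composer Pro / RealityKit side, here is a minimal sketch of setting up a convex collision shape in code (assuming the die is available as a ModelEntity; the mass, friction, and restitution values are guesses). A convex hull follows the die's actual silhouette, which is what lets it settle flat on a face instead of wobbling on a box or sphere approximation:

import RealityKit

// Give a die a convex-hull collision shape and a dynamic physics body.
func configureDicePhysics(for die: ModelEntity) {
    guard let mesh = die.model?.mesh else { return }
    let shape = ShapeResource.generateConvex(from: mesh)
    die.collision = CollisionComponent(shapes: [shape])
    die.physicsBody = PhysicsBodyComponent(
        shapes: [shape],
        mass: 0.02,                                            // ~20 g, a guessed value
        material: .generate(friction: 0.8, restitution: 0.3),  // guessed values
        mode: .dynamic
    )
}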
Hello,
I have a usdz model created in Maya. It's supposed to have a splashing animation associated with it, and this can be viewed in Maya and Blender, but for some reason when I export it to usdz and then import it into my Reality Composer Pro project, the animation is missing. I expect an animation library to be created on the entity when I drag it in like any other usdz with animation data, but that is not the case.
Any help would be appreciated on this issue.
Hi, I added a DockingRegion to my scene from Reality Composer Pro, and I am able to load the scene, but the DockingRegion is ignored and the scene renders with no change to the AVPlayerViewController window. As can be seen in the Reality Composer Pro screenshot below, I set the width of the player to 666 and moved it back by 300 cm, but the actual result does not reflect the position I set in Reality Composer Pro.
Is there anything else I should do other than loading up the Entity and adding to RealityView? Specifically, do I have to get the DockingRegion within the usda file and somehow enable it?
I'm developing an app in which I need to render images of some models contained in a RealityView. I want to set up a camera, capture the virtual content through that camera, and save it as an image.
Hi everyone,
I’m working on a project in Reality Composer Pro, and I’ve encountered an issue with vertex animations. Here’s what I’ve done:
I created a model in Blender, which includes two animations:
One that scales the vertices of the model.
Another that moves the model's position.
I imported both animations into Reality Composer Pro, and the position animation works fine, but the vertex scaling animation does not seem to work.
What I’m Trying to Achieve:
I want the vertex scaling animation to play correctly in Reality Composer Pro alongside the movement animation.
Problem:
The position animation works as expected, but the vertex scaling animation does not work when applied in Reality Composer Pro.
I have checked the vertex scaling animation in other software, and it works fine there. The issue seems to be specific to Reality Composer Pro.
Is vertex animation scaling supported in Reality Composer Pro? If so, what might be causing this issue? Any advice or solutions would be greatly appreciated!
Hi everyone,
I’m having trouble with image anchoring when working on a project in Reality Composer and Reality Composer Pro. Here’s the issue:
1. What I’m Trying to Achieve:
I want to create an AR scene where an object anchors to an image I provide. I don't want to create an app for this, but just use the USDZ file the scene creates. The USDZ file should then be viewable via the various integrations of AR Quick Look across the Apple ecosystem. The image anchoring works perfectly when I preview the scene inside Reality Composer using AR mode.
2. The Problem:
When I export the project (tried both USDZ and Reality formats) and open it on my iPhone using the Files app (which uses AR Quick Look), the image anchoring no longer works. The object doesn’t anchor to the provided image as expected. It just anchors to the first plane it recognizes and not the image.
3. What I’ve Tried:
Exporting the scene in USDZ format.
Exporting the scene in Reality format.
Both formats result in the same issue: no image anchoring outside of the Reality Composer environment.
Tried different images, but all resulted in the same behavior: the image anchoring does not work.
Tried different iOS versions, but the result is the same.
4. Current Setup:
Reality Composer Pro version: 2.0
iPhone model: iPhone 13 Pro
iOS version: 18.1.
5. What I Need Help With:
Is there a way to ensure image anchoring works in exported files when opened via AR Quick Look?
Do I need to configure something specific during the export process?
Are there limitations in AR Quick Look that prevent image anchoring from functioning correctly?
Do I need to create an app to make this work?
I’d appreciate any advice or insights from the community. If anyone has experience with similar issues or knows of a workaround, please let me know!
Thanks in advance, Mav
Hello!
https://forums.developer.apple.com/forums/thread/762763
I read this thread, and this is similar to what I'm trying to do.
I have two entities in the scene, "HandTrackingEntity", "HandScanner".
"HandTrackingEntity": I put Anchoring Component, Collision Component (Trigger) here.
"HandScanner": I put Behaviors Component(OnCollision), and Collision Component here.
Here are the pictures showing how I set the components.
I also set the physicsSimulation property to .none.
I was expecting the timeline to play when I put my hand (with HandTrackingEntity) on the "HandScanner" entity, but it didn't work.
Am I missing some steps? I also need sample code to understand how to apply the physicsSimulation property. I'd appreciate it if you could let me know.
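Since sample code for physicsSimulation was asked for, here is a minimal sketch, assuming "HandTrackingEntity" is loaded from the Reality Composer Pro scene and already has the AnchoringComponent authored there:

import RealityKit

// Fetch the hand entity's anchoring component, turn off its separate physics
// simulation space, and write the component back, so the hand-anchored collision
// shape participates in the same simulation as "HandScanner".
func disablePhysicsSimulation(on handTrackingEntity: Entity) {
    guard var anchoring = handTrackingEntity.components[AnchoringComponent.self] else { return }
    anchoring.physicsSimulation = .none
    handTrackingEntity.components.set(anchoring)
}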
Hi!
I want to ask if it's possible to make a mirror material with Shader Graph in Reality Composer Pro. The mirror should reflect the entities in the Reality Composer Pro scene.
I found that this works with SceneKit, but I'm using RealityKitContent in my project. Are there ways to solve this?
Hi, everyone!
I'm trying to bind a timeline (animation + audio) and behaviors to an entity in Reality Composer Pro.
In Xcode, I need to clone this entity and use the behavior, but I found that the behaviors are not cloned (I send the notification, but it is not received by the Reality Composer Pro behavior and the timeline does not execute).
How can I solve this problem? Thanks!
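For reference, a timeline behavior with a Notification trigger is normally fired from code by posting a notification that names the scene and the trigger identifier. A rough sketch follows; the notification name, the userInfo keys, and the "StartTimeline" identifier are assumptions based on Apple's Reality Composer Pro samples, and whether a cloned entity's behaviors receive it is exactly the open question here:

import Foundation
import RealityKit

// Post the notification that a "Notification" trigger in a Reality Composer Pro
// behavior listens for. `entity` should be the entity (or a descendant of the
// entity) that owns the behavior, so the correct scene is passed along.
func notifyTimeline(on entity: Entity, identifier: String = "StartTimeline") {
    guard let scene = entity.scene else { return }
    NotificationCenter.default.post(
        name: Notification.Name("RealityKit.NotificationTrigger"),
        object: nil,
        userInfo: [
            "RealityKit.NotificationTrigger.Scene": scene,
            "RealityKit.NotificationTrigger.Identifier": identifier
        ]
    )
}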
Hi!
I'm creating an app like this:
It uses image tracking to set a world anchor in the real world first.
The timeline in the Reality Composer Pro scene needs to be played at the same time (for the people in the same place using the app).
People using the app will see the same content in the same position, at the same time, in the same place.
I already got the image tracking feature working, but the big problem is synchronization. I found Group Activities and TabletopKit as possible solutions, but I don't know if these are the right frameworks for this project.
How do I solve this problem technically?
If you have ideas, please let me know. I really need help for this.
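For the synchronization piece specifically, here is a minimal Group Activities sketch of how one participant could tell the others when to start the timeline; SharedTimelineActivity, StartTimelineMessage, and broadcastStart are made-up names, and this covers only the messaging, not the image-tracking alignment:

import GroupActivities
import Foundation

// The shared activity that everyone in the same space joins.
struct SharedTimelineActivity: GroupActivity {
    var metadata: GroupActivityMetadata {
        var metadata = GroupActivityMetadata()
        metadata.title = "Shared Timeline"
        metadata.type = .generic
        return metadata
    }
}

// A tiny payload telling every participant when to start the timeline.
struct StartTimelineMessage: Codable, Sendable {
    let startDate: Date
}

// Once a GroupSession for the activity exists, broadcast the agreed start time
// so each device begins playback together.
func broadcastStart(over session: GroupSession<SharedTimelineActivity>) async throws {
    let messenger = GroupSessionMessenger(session: session)
    try await messenger.send(StartTimelineMessage(startDate: .now))
}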
What is recommended best practice for importing a Blender 3D file into RCP? I assume as a .usdz file? Is there a WWDC24 session or other Apple resource that best explains this. I want to make sure I provide the right format/file to RCP from Blender.
I created a custom component for Reality Composer Pro in which I have several variables I need an entity to have.
The idea is to add this component to some 3D models and save them as USDZs; then I load these USDZs in code and do specific things depending on those variables.
The component shows up fine in the composer and I can set the variables there. The problem is that the values I set in the composer are different from what I see in code. Let's say in the composer I set canMove = true; then, when I read it in code, it is set to false.
I don't know if I'm missing something.
public struct MyObjectComponent: Component, Codable {
    public var affectAll: Bool = false
    public var affectFloor: Bool = false
    public var canMove: Bool = false
    public var moveX: Bool = false
    public var moveY: Bool = false
    public var moveZ: Bool = false
    public var canRotate: Bool = false
    public var rotateX: Bool = false
    public var rotateY: Bool = false
    public var rotateZ: Bool = false

    public init() {}
}
Any help appreciated.
Guillermo
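One thing worth checking (an assumption, since the loading code isn't shown): custom components generally have to be registered before the scene containing them is loaded, otherwise RealityKit decodes the entity with the default values declared in the struct. A minimal sketch of registering the component at app launch; MyApp and ContentView are placeholders:

import SwiftUI
import RealityKitContent // assumes MyObjectComponent lives in the Reality Composer Pro package

@main
struct MyApp: App {
    init() {
        // Register the custom component so RealityKit can decode the values
        // authored in Reality Composer Pro instead of falling back to defaults.
        MyObjectComponent.registerComponent()
    }

    var body: some SwiftUI.Scene {
        WindowGroup { ContentView() }
    }
}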
I'm using a club to strike a ball and let it roll on the turf in RealityKit, but right now I can only make it slide, not roll.
I add collision to the turf (static), the club (kinematic), and the ball (dynamic), and set some parameters: radius and mass.
I use these parameters to calculate linear damping and inertia, and I use the time between frames and the club position to calculate speed. The code looks like this:
let radius: Float = 0.025
let mass: Float = 0.04593 // mass, in kg
var inertia = 2/5 * mass * pow(radius, 2) // moment of inertia of a solid sphere: I = (2/5) m r²
let currentPosition = entity.position(relativeTo: nil)
let distance = distance(currentPosition, rgfc.lastPosition)
let deltaTime = Float(context.deltaTime)
let speed = distance / deltaTime
let C_d: Float = 0.47 // drag coefficient
let linearDamping = 0.5 * 1.2 * pow(speed, 2) * .pi * pow(radius, 2) * C_d // linear damping (1.2 is the air density)
entity.components[PhysicsBodyComponent.self]?.massProperties.inertia = SIMD3<Float>(inertia, inertia, inertia)
entity.components[PhysicsBodyComponent.self]?.linearDamping = linearDamping
// force
let acceleration = speed / deltaTime
let forceDirection = normalize(currentPosition - rgfc.lastPosition)
let forceMultiplier: Float = 1.0
let appliedForce = forceDirection * mass * acceleration * forceMultiplier
entityCollidedWith.addForce(appliedForce, at: rgfc.hitPosition, relativeTo: nil)
I also tried applyImpulse instead of addForce, like:
let linearImpulse = forceDirection * speed * forceMultiplier * mass
No matter how I adjust the friction (static and dynamic) and restitution, and whether I use addForce or applyImpulse, the ball only slides. How can I solve this problem?
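One direction to try (a sketch under the assumption that the sliding comes from the ball never receiving any spin, because the force is applied through its center): give the ball an angular impulse matching rolling without slipping (ω = v / r) in addition to the linear impulse. The strike function and its parameters are placeholders:

import RealityKit
import simd

// Strike the ball with a linear impulse plus the angular impulse that corresponds
// to rolling without slipping, so it spins forward instead of skidding.
// Assumes `ball` has a dynamic PhysicsBodyComponent and `direction` is a unit vector.
func strike(_ ball: Entity, direction: SIMD3<Float>, speed: Float, radius: Float, mass: Float) {
    let inertia = 2.0 / 5.0 * mass * radius * radius               // solid sphere: I = (2/5) m r²
    let rollAxis = normalize(cross(SIMD3<Float>(0, 1, 0), direction))

    ball.applyLinearImpulse(direction * mass * speed, relativeTo: nil)
    ball.applyAngularImpulse(rollAxis * (inertia * speed / radius), relativeTo: nil)
}

With the spin present, sufficient static friction on the turf should then maintain rolling contact rather than letting the ball skid.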
If I put an alpha image texture on a model created in Blender and run it in RCP or on visionOS, the front-to-back rendering of the alpha produces unintended results. Details are below.
I exported a USDC file of a Blender-created cylindrical object with a PNG (with alpha) texture applied to the inside, and then imported it into Reality Composer Pro.
When multiple objects that make extensive use of transparent textures are placed in front of and behind each other, the following behaviors were observed in the transparent areas:
・The transparent areas do not become transparent
・The transparent areas become transparent together with the image behind them
・The order of the images becomes incorrect
Best regards.
I created a project using Reality Composer Pro. When I export to a .usdz file, it works well on iPhone 13, 14, 15, and 16 but not on iPhone 12 and 11.
In the timeline, I use a behavior with an OnAddedToScene trigger to activate the intro animation and loop the background audio, but it does not work on older devices like the iPhone 12 and 11. Also, none of the interactive taps/touch points/audio seem to work.
The iPhone 13, 14, 15, and 16 are on iOS 18.1.
The iPhone 11 and 12 are on iOS 17.6.1.
Here is a sample .usdz file exported from Reality Composer Pro 2.0 that has the problem described above: https://drive.google.com/file/d/1sHZn9JABTswLq2flYjToTbWDuE5T7eNw/view?usp=sharing