I created a simple Timeline animation with only a "Play Audio" action in RCP, plus a Behaviors Component with an "OnTap" trigger that fires this Timeline.
In my code, I simply call Entity.applyTapForBehaviors() when something happens. The audio plays normally on the simulator but cannot be played on the device.
Is there a potential bug that could lead to this behavior?
Env below:
Simulator Version: visionOS 2.0 (22N5286g)
Xcode Version: 16.0 beta 4 (16A5211f)
Device Version: visionOS 2.0 beta (latest)
Reality Composer Pro
This is all about using notifications to trigger actions from RCP's new Timeline system. After watching Compose interactive 3D content in Reality Composer Pro, I am starting to wonder why we need to call Entity.applyTapForBehaviors in code at all to trigger content in the Behaviors Component, given that in the Behaviors Component we have already chosen OnTap to let a "Tap Notification" trigger our action (on a selected target object).
By that logic, if I select the OnCollision trigger, I would expect to write something like CollisionEvent.entityA.applyCollisionForBehaviors, which doesn't exist. And of course a collision on my object doesn't trigger the action (because I only set things up in RCP, not in code).
(Setting aside that this post points out we could use the Behaviors Component's OnNotification trigger as a workaround for now.)
I found that I could keep the OnTap trigger but call Entity.applyTapForBehaviors from my subscribed collision begin event. That actually works better than OnCollision, as shown in the sketch below.
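For reference, a minimal sketch of that workaround (the scene and entity names are illustrative, not from the original post): subscribe to the collision begin event inside the RealityView and reuse the OnTap behavior from there.

@State private var subscription: EventSubscription?

var body: some View {
    RealityView { content in
        if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle),
           let tappable = scene.findEntity(named: "TappableObject") {
            content.add(scene)
            // Keep the subscription alive for as long as the view exists.
            subscription = content.subscribe(to: CollisionEvents.Began.self, on: tappable) { event in
                // Fire the OnTap behavior as a stand-in for OnCollision.
                _ = event.entityA.applyTapForBehaviors()
            }
        }
    }
}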
So what are the design principles here? And how can I trigger a collision notification so that my Behaviors Component's OnCollision trigger actually works?
The entities in my RealityView contain tracking components that should let them track different parts of the hand. However, I found that apart from the index fingertip, the thumb fingertip, the palm, and the wrist, no other locations can be tracked normally (for example the middle fingertip). How can I solve this? (I suspect it may be a bug in the beta version.)
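For context, a sketch of the kind of setup in question (my own example, assuming the visionOS 2 hand-anchoring API): the predefined hand locations track fine, while .joint(for:) locations such as the middle fingertip are the ones that reportedly fail.

// Works: one of the predefined hand locations.
let indexAnchor = AnchorEntity(.hand(.right, location: .indexFingerTip))
content.add(indexAnchor)

// Reportedly does not update: an arbitrary joint such as the middle fingertip.
let middleAnchor = AnchorEntity(.hand(.right, location: .joint(for: .middleFingerTip)))
content.add(middleAnchor)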
I am attempting to execute actions after tapping an entity in a RealityView using the Behaviors component. I have added the Input Target component and the tap gesture as follows:
.gesture(
    TapGesture()
        .targetedToAnyEntity()
        .onEnded { value in
            _ = value.entity.applyTapForBehaviors()
        }
)
However, during testing, I have observed that the entity does not appear to recognize the tap gesture. Could you kindly provide any relevant documentation or guidance on this matter?
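In case it is useful, a hedged sketch of the usual prerequisites (the entity name is illustrative): for the tap to reach the entity, it (or an ancestor) needs both an InputTargetComponent and a CollisionComponent whose shapes cover the model.

tappedEntity.components.set(InputTargetComponent())
// Either generate collision shapes from the meshes in the hierarchy...
tappedEntity.generateCollisionShapes(recursive: true)
// ...or set an explicit shape sized to the model.
tappedEntity.components.set(CollisionComponent(shapes: [.generateBox(size: [0.2, 0.2, 0.2])]))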
In the RealityView, I found that the entity cannot cast a shadow onto its surroundings. What configuration should I add to achieve this?
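A minimal sketch of one option, assuming a grounding shadow cast by the entity onto surfaces below it is what you are after:

entity.components.set(GroundingShadowComponent(castsShadow: true))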
How do I set the scale unit of an Entity in Reality Composer Pro? For example, if the scale value is 1 meter, then when this Entity is placed in a RealityView, the displayed size should be 1 meter.
If the unit of scale cannot be set in Reality Composer Pro, is there a way to specify the unit of scale in code so that the Entity is displayed in meters when added to a RealityView?
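For what it's worth, RealityKit's world unit is the meter, and scale is a unitless multiplier rather than a unit, so the relationship looks roughly like this (my own example):

// A box authored at 1.0 in RealityKit is 1 meter wide in the RealityView.
let cube = ModelEntity(mesh: .generateBox(size: 1.0))
// Leaving scale at 1 keeps the authored size; 0.5 would display it at 0.5 m.
cube.scale = SIMD3<Float>(repeating: 1.0)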
Thank you
visionOS 2 beta 5, Unity TextMesh shader errors
I tested all variations. The checkboxes in the Physics Body component in Reality Composer Pro 2 (beta 4):
are absolute and not parent-relative. Also, regardless of what I set the center of mass to:
it always rotates around the center of mass, despite the local rotation being correctly at the origin (imported from Blender). Thus I can get the door to turn but never to swing, because it always rotates around its center of mass.
Tell me if this is expected behaviour or if there is a simple way to make this work.
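A heavily hedged sketch of a possible code-side workaround (not a confirmed RCP workflow; sizes and offsets are illustrative): override the door body's mass properties so the center of mass sits on the hinge edge.

var body = doorEntity.components[PhysicsBodyComponent.self] ?? PhysicsBodyComponent()
var mass = PhysicsMassProperties(shape: .generateBox(size: [0.05, 2.0, 1.0]), mass: 10)
// Shift the center of mass toward the hinge edge (in the door's local coordinates).
mass.centerOfMass = (position: SIMD3<Float>(-0.5, 0, 0), orientation: simd_quatf(angle: 0, axis: [0, 1, 0]))
body.massProperties = mass
doorEntity.components.set(body)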
How should I set up a WindowGroup window to resemble a curved screen?
I have a plane with a texture that was made in Blender & then exported using the Reality Converter from a USDC to a USDZ.
The translucency looks 100% translucent in RCP, but it looks a bit like glass with some reflection in a RealityKit scene on visionOS.
Is there a material setting that I need to change?
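One thing that might be worth trying in code (a hedged sketch, assuming the plane ends up with a physically based material and the transparency is meant to be a hard cutout): set an opacity threshold so transparent texels are cut out entirely instead of blended, which avoids the glassy, reflective look in those areas.

if var material = planeEntity.components[ModelComponent.self]?.materials.first as? PhysicallyBasedMaterial {
    // Texels below the threshold are discarded rather than rendered as translucent glass.
    material.opacityThreshold = 0.5
    planeEntity.components[ModelComponent.self]?.materials = [material]
}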
Hi all,
I am fairly new to Swift development so go easy on me!
I am working through a few examples of using Reality Kit content within my projects and whilst trying to work on adding gestures to RealityKit entities, I have come across a weird issue.
Downloading and running the example here
This works fine for me.
When adding the same things to my own code (in this case, a class called EntityGestureState in my GestureComponent file, within the RealityKit content project), I constantly get this error:
"Static property 'shared' is not concurrency-safe because it is non-isolated global shared mutable state"
Even just troubleshooting with something as simple as:
public class EntityGestureState {

    // The entity currently being dragged if a gesture is in progress.

    // Singleton shared instance
    static let shared: EntityGestureState = EntityGestureState()
}
I immediately get the error, and after a bunch of trial and error and reading different sources, I can't seem to get around it.
Could anyone help here? I am running Xcode 16 beta 3, so I am wondering if it's a bug, but it's more than likely user error.
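In case it helps: one way to silence that strict-concurrency diagnostic (a sketch, not necessarily the pattern Apple's sample intends) is to isolate the singleton to the main actor, which is where the gesture callbacks run anyway.

import RealityKit

@MainActor
public class EntityGestureState {
    // The entity currently being dragged if a gesture is in progress.
    var targetedEntity: Entity?

    // Singleton shared instance, now isolated to the main actor.
    static let shared = EntityGestureState()
}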
I’m encountering an issue with recording in my Unity game through Reality Composer Pro. When I attempt to record video or take screenshots, it results in a black screen once my game launches. Screenshots and videos outside my game record fine, but within the game, the recordings are just black.
Additionally, when using my headset, the display is distorted and only my right eye shows anything, while the left eye remains black.
Here are some specifics:
My game is developed in Unity.
I’m using all the betas: Xcode 16 beta, the new macOS beta, and visionOS 2 beta.
In the attached screenshot, you can see an Apple UI overlay with a black screen behind it. However, when I’m in the headset, I actually see my game along with that UI overlay, so it seems like the game itself isn’t getting recorded.
Also, I noticed on the Apple webpage that they recommend using the Developer Capture feature in Reality Composer Pro for high-quality screenshots and app previews. However, I find that using Control Center for recording works pretty well despite the lower quality and foveated resolution. If I can’t get Reality Composer Pro to capture in 4K, is it still acceptable to use screenshots and record videos from the Control Center?
Has anyone encountered similar issues or have any insights on what might be causing this? And regarding the secondary question, I’d appreciate any guidance from Apple on the acceptability of using Control Center recordings as a fallback. Here's a video preview I made with Control Center recordings. Is this quality acceptable?
https://youtu.be/z4VIO7obNNg?si=2irqHEfeGjkNBUvb
I have a custom material using Shader Graph in Reality Composer Pro, and I am trying to rig up sliders to values to control the shader. I am able to read the values from the Shader Graph without a problem, and I can even update them when setting them from the LLDB command line and then getting the values back. But the changes are not reflected in the graphics. Is there some sort of update() method or something that is required to read the changed parameter values?
On a related note, I am trying to understand what the MaterialParameters.Handle property is and why one would access a MaterialParameter via the handle vs just the name.
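In case it is relevant: ShaderGraphMaterial is a value type, so a common pitfall is updating parameters on a copy without writing the material back to the model. A hedged sketch (the parameter name and value are illustrative):

if var material = modelEntity.components[ModelComponent.self]?.materials.first as? ShaderGraphMaterial {
    // Update the promoted Shader Graph input...
    try? material.setParameter(name: "SliderValue", value: .float(0.75))
    // ...then assign the modified copy back so the renderer picks up the change.
    modelEntity.components[ModelComponent.self]?.materials = [material]
}

As for MaterialParameters.Handle, my understanding is that it exists so you can address a parameter repeatedly (for example every frame) without paying the string lookup each time; the name-based API is simply the convenient form.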
I have written code that triggers the Timeline in the Reality Composer Pro scene every 12.93 seconds.
RealityView { … }
    .onAppear {
        startTimer()
    }
    .onDisappear {
        stopTimer()
    }

func startTimer() {
    timer = Timer.scheduledTimer(withTimeInterval: 12.93, repeats: true) { _ in
        action()
    }
}

func stopTimer() {
    timer?.invalidate()
}

func action() {
    print("SunUpDown")
    NotificationCenter.default.post(
        name: NSNotification.Name("RealityKit.NotificationTrigger"),
        object: nil,
        userInfo: [
            "RealityKit.NotificationTrigger.Scene": scene as Any,
            "RealityKit.NotificationTrigger.Identifier": "SunUpDown"
        ]
    )
}
Upon receiving the "SunUpDown" notification, the Timeline is executed.
Everything functions normally when I first run the scene, and the loop keeps going, but when I attempt to zoom in on the window I find that it stops looping. Could you please provide an explanation for this behavior?
Note: The window type is volumetric, and the parameter of the defaultWorldScaling modifier is dynamic.
Hi,
I'm struggling to find a way to get a simple Unlit Material working with Reality Composer.
With all the standard objects, it doesn't seem like we have the option of an unlit material which seems really odd to me... This is kind of a basic feature.
The only way I was able to get an unlit material in Reality Converter was to import a mesh without a material, which gave me a white unlit material.
I have seen that you can set an unlit material using RealityKit, but from what I saw, RealityKit builds an app in the end, right? Honestly, I'm not sure what you get when creating an AR app using RealityKit... What I'm looking for is a simple .reality file to be displayed on the web.
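For the RealityKit side specifically, an unlit material is a one-liner; a minimal sketch (it does not address the .reality-file-for-the-web part):

// A flat, unlit red sphere in RealityKit.
let sphere = ModelEntity(
    mesh: .generateSphere(radius: 0.1),
    materials: [UnlitMaterial(color: .red)]
)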
In a scenario involving one of the entities in a Reality Composer Pro environment, I intend for this entity to display a blue material when viewed by the user. To achieve this, I have added the following Shader Graphs to the materials associated with this entity:
Additionally, I have added the HoverEffectComponent to the RealityView in code:
RealityView { content in
    if let model = try? await Entity(named: "WorldScene", in: realityKitContentBundle) {
        let hoverEffect = HoverEffectComponent(.shader(.default))
        model.components.set(hoverEffect)
        content.add(model)
    }
}
However, when hovering over this entity, I am unable to observe any visual reaction. Could you please provide guidance on how to resolve this issue?
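A hedged guess at the cause, sketched in code (the child entity name is illustrative): shader hover effects only apply to entities that can receive input, so the specific entity needs an InputTargetComponent and a CollisionComponent, and the HoverEffectComponent may need to go on that entity rather than on the loaded root.

RealityView { content in
    if let model = try? await Entity(named: "WorldScene", in: realityKitContentBundle),
       let target = model.findEntity(named: "BlueHighlightEntity") {
        target.components.set(InputTargetComponent())
        target.components.set(CollisionComponent(shapes: [.generateSphere(radius: 0.1)]))
        target.components.set(HoverEffectComponent(.shader(.default)))
        content.add(model)
    }
}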
Hello all,
I'm developing an application for visionOS and I'm trying to implement 2 different animations:
First animation
Initially, I have a map that should not be visible. I would like to create an animation effect where it appears as if a drop of water falls in the center of the map and the expanding waves gradually reveal the entire map.
Is there a way to do this directly in SwiftUI, or do I need an animation in my USDZ?
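For what it's worth, a rough SwiftUI-only sketch of the reveal (the asset name is illustrative); a true ripple or wave distortion would likely need a shader or an animation baked into the USDZ:

import SwiftUI

struct RippleReveal: View {
    @State private var radius: CGFloat = 0

    var body: some View {
        Image("map")   // illustrative asset name
            .resizable()
            .scaledToFit()
            // Reveal the map through an expanding circular mask.
            .mask(Circle().frame(width: radius, height: radius))
            .onAppear {
                withAnimation(.easeOut(duration: 2)) {
                    radius = 2000
                }
            }
    }
}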
Second animation
I want an animation effect similar to a cinema screen opening from the center, gradually revealing a video that was initially hidden.
Is there a way to do this directly in SwiftUI?
Can someone help me with this topic?
Thanks ;)
Ok, I am loading an object from a Reality Composer Pro scene that has two entities inside its hierarchy that both have a Physics Body and a Collision component, like this:
Root
  Outer Box Mesh
  Hinge + physics (static/kinematic) + collision
  Door + physics (dynamic) + collision
I tried to keep the physics/collision components only on the hinge and the door while I move the root or the outer box around via code. The behaviour I see is that it either
moves the hinge and the door around relative to the top level (despite me checking the movement locking), OR
starts rotating the root or outer box (!) even though I only set its position.
What is the correct setup in this case? What I want is to be able to move the whole object around, settle it somewhere, and still have the door pinned at a fixed relative position with one degree of freedom on the hinge axis.
I know how to do it in code, but I really want to use the built-in Reality Composer Pro settings/components. I am using the latest beta 4.
I'm using Reality Composer Pro Version 2.0 (448.0.10.0.2), available in Xcode 16 beta 4.
When adding an animation from the Animation Library component on my armature to a timeline, the animation does not 'freeze' on the last frame.
Is there a way to 'freeze' the first or last frames when adding animations to the timeline? And how should I expect the first and last keys on my animations to behave with the default 'rest pose' on the imported USD file?
How do I solve this problem: in a Unity project I use Model3D to load a local model file, and after clicking a NavigationLink multiple times to load the local model file, I receive the message "assertion failure: 'stagingBuffer.buffer.isValid()' (createMetalBuffer:line 2971) Failed to create staging buffer for texture upload"?