We are building an AR experience for deployment on iPhones. We are using Unity, but it looks as if Reality Composer Pro has better features for spatial audio. I am not sure whether Reality Composer Pro can only be used for the Vision Pro, or whether it can also be used for deployment on iPhone or iPad.
Reality Composer Pro
Prototype and produce content for AR experiences using Reality Composer Pro.
Hello! I’ve got USDZ exports from my Maya pipeline working with animation, and they load up nicely on the Vision Pro.
I’ve been checking out the animated sample files on the Augmented Reality/Quick Look sample page, specifically the first three at the top of the page.
I would like to know how they are created. I’m a 3D modeler and animator, not a programmer, so I’m dipping my toe into RCP and Xcode/SwiftUI, but I could use some informative tutorials on the proper workflow. For example, in the Lunar Rover sample, there are lines emanating from the model, and then text windows appear. Would I need to create all these extras inside Reality Composer Pro? I’d like to start creating immersive, narrative experiences (both in a volume and fully immersive), but for prototyping, I want to learn the proper way to add this type of functionality. I think I remember seeing something about “schemas” being involved. I’m assuming there might be some coding to set up in RCP so that when items are selected, an associated animation is triggered. Can anyone point me toward the relevant documentation to help me get started? Remember, I don’t code. ;)
Here are my recent Vision Pro experimentations.
https://youtube.com/playlist?list=PLCH753rZ9r6eqXxpIemaSlcyYxjFgR210&si=P_7AY2aL97Upm61i
I’m also proficient with Unreal Engine, but getting content packaged and over to the AVP is still not ready for prime time, so I’m exploring the native approach.
Thanks for helping point me in the right direction!
Topic:
Spatial Computing
SubTopic:
Reality Composer Pro
It's all about notifications to trigger actions from RCP's new Timeline system. After watching "Compose interactive 3D content in Reality Composer Pro," I'm actually starting to wonder why there was a need to call Entity.applyTapForBehaviors in code to trigger content in the Behaviors Component, given that in the Behaviors Component we already chose OnTap to let a "Tap Notification" trigger our action (on a selected target object).
Then I'd guess that by selecting the OnCollision trigger, I should write something like CollisionEvent.entityA.applyCollisionForBehaviors, which doesn't exist. And of course the collision on my object won't trigger this action (because I only did things in RCP, not in code).
Setting aside for now that this post has pointed out we could use the Behaviors Component's OnNotification trigger:
I found that I could still use the OnTap trigger and instead call Entity.applyTapForBehaviors inside my subscribed collision's begin event handler. That actually works better than OnCollision.
So what are the design principles here? And how can I trigger a collision notification so that my Behaviors Component's OnCollision trigger actually works?
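For reference, here is a minimal sketch of the workaround described above: keep the OnTap trigger in RCP, but fire it from code when a collision begins. The scene name "Scene" is hypothetical, and whether this is the intended design is exactly the open question.

import SwiftUI
import RealityKit
import RealityKitContent

struct CollisionTriggerView: View {
    @State private var subscription: EventSubscription?

    var body: some View {
        RealityView { content in
            // "Scene" is a hypothetical scene name in the RCP bundle.
            if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                content.add(scene)

                // When a collision begins, manually fire the entity's
                // OnTap behavior so its Timeline action runs.
                subscription = content.subscribe(to: CollisionEvents.Began.self, on: nil) { event in
                    event.entityA.applyTapForBehaviors()
                }
            }
        }
    }
}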
The ShaderGraph node Blurred Background (RealityKit) – https://developer.apple.com/documentation/shadergraph/realitykit/blurred-background-(realitykit) – works fine within the Reality Composer Pro 2 editor but isn't working on iOS 18 or macOS 15. Instead of the blurred content, it just renders as opaque in a single color (Screenshot 2).
Interestingly, it also fails to render within Reality Composer Pro when no other entities are in the scene, e.g. with only a background skybox set.
Expected Behavior: It would be great if this node worked the same way as it does on visionOS since this would allow for really interesting and nice effects for scenes.
Feedback ID: FB15081190
A ShaderGraphMaterial with an Occlusion Surface output generated with Reality Composer Pro 2 fails to load on iOS 18 and macOS 15 with the following error:
RealityFoundation.ShaderGraphMaterial.LoadError.invalidTypeFound (https://developer.apple.com/documentation/realitykit/shadergraphmaterial/loaderror/invalidtypefound)
This happens with both https://developer.apple.com/documentation/shadergraph/realitykit/occlusion-surface-(realitykit) and https://developer.apple.com/documentation/shadergraph/realitykit/shadow-receiving-occlusion-surface-(realitykit)
RealityView { content in
    do {
        // An opaque reference object placed behind the occluder.
        let bgEntity = ModelEntity(
            mesh: .generateCone(height: 0.5, radius: 0.1),
            materials: [SimpleMaterial(color: .red, isMetallic: true)]
        )
        bgEntity.position.z = -0.2
        content.add(bgEntity)

        // The occlusion material authored in Reality Composer Pro 2.
        let occlusionMaterial = try await ShaderGraphMaterial(
            named: "/Root/OcclusionMaterial",
            from: "OcclusionMaterial"
        )
        let testEntity = ModelEntity(
            mesh: .generateSphere(radius: 0.4),
            materials: [occlusionMaterial]
        )
        content.add(testEntity)
        content.cameraTarget = testEntity
    } catch {
        print("Shader Graph Load Error:")
        dump(error)
    }
}
.realityViewCameraControls(.orbit)
.edgesIgnoringSafeArea(.all)
Feedback ID: FB15081296
Topic:
Spatial Computing
SubTopic:
Reality Composer Pro
Tags:
RealityKit
Reality Composer Pro
Shader Graph Editor
I am currently developing a game that runs on visionOS using RealityKit and Swift.
I have a question regarding particle emitters.
It seems that there is a sorting order (render queue) between particle emitters themselves, but there doesn’t appear to be a render queue between particle emitters and regular model entities.
If such a feature exists, could you please provide a simple example?
Thank you!
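One possible starting point, sketched under the assumption that ModelSortGroupComponent also orders particle entities against regular model entities (since both are Entities); the entities and order values here are illustrative only:

import RealityKit

// Put both entities in the same sort group; lower "order" draws first.
let sortGroup = ModelSortGroup(depthPass: nil)

let particleEntity = Entity()
particleEntity.components.set(ParticleEmitterComponent())
particleEntity.components.set(ModelSortGroupComponent(group: sortGroup, order: 0))

let modelEntity = ModelEntity(mesh: .generateSphere(radius: 0.1))
modelEntity.components.set(ModelSortGroupComponent(group: sortGroup, order: 1))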
Hi!
I want to ask whether it's possible to make a mirror material with Shader Graph in Reality Composer Pro. The mirror should reflect entities in the Reality Composer Pro scene.
I found that this works with SceneKit, but I'm using RealityKitContent in my project. Is there a way to solve this?
Topic:
Spatial Computing
SubTopic:
Reality Composer Pro
Tags:
Reality Composer Pro
Shader Graph Editor
visionOS
Hello!
https://forums.developer.apple.com/forums/thread/762763
I read this thread, and it's similar to what I'm trying to do.
I have two entities in the scene: "HandTrackingEntity" and "HandScanner".
"HandTrackingEntity": I put the Anchoring component and a Collision component (Trigger) here.
"HandScanner": I put the Behaviors component (OnCollision) and a Collision component here.
Here are pictures of how I set up the components.
I also set the physicsSimulation property to .none.
I was expecting the Timeline to play when I put my hand (with HandTrackingEntity) on the "HandScanner" entity, but it didn't work.
Am I missing some steps? I could also use sample code to understand how to apply the physicsSimulation property. I'd appreciate any pointers.
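For what it's worth, here is a minimal sketch of setting that property in code, assuming this refers to visionOS 2's AnchoringComponent.physicsSimulation; the entity name follows the post:

import RealityKit

let handTrackingEntity = Entity()
var anchoring = AnchoringComponent(.hand(.left, location: .palm))
// Opt out of the isolated per-anchor simulation so the trigger collider
// can interact with entities in the main scene (like "HandScanner").
anchoring.physicsSimulation = .none
handTrackingEntity.components.set(anchoring)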
Topic:
Spatial Computing
SubTopic:
Reality Composer Pro
Tags:
RealityKit
Reality Composer Pro
visionOS
Reality Composer Pro question related to custom components
My custom component defines some properties to edit in RCP. Simple ones work fine, but SIMD3 and SIMD2 do not. I'd expect to see default values, but instead I get 0s. If I try to run it like this, the scene doesn't load. Once I enter some values it does, and on a subsequent build and run it works fine.
More generally, does Apple have documentation on creating properties for components? The only examples I've seen show simple strings and floats. There are no details about vectors, conditional options, grouping properties, etc.
public struct EntitySpawnerComponent: Component, Codable {
    public enum SpawnShape: String, Codable {
        case domeUpper
        case domeLower
        case sphere
        case box
        case plane
        case circle
    }

    // These properties get their default values in RCP.
    /// The number of clones to create
    public var Copies: Int = 12
    /// The shape to spawn entities in
    public var SpawnShape: SpawnShape = .domeUpper
    /// Radius for spherical shapes (dome, sphere, circle)
    public var Radius: Float = 5.0

    // These properties DO NOT get their default values in RCP. They all show 0.
    /// Dimensions for box spawning (width, height, depth)
    public var BoxDimensions: SIMD3<Float> = SIMD3(2.0, 2.0, 2.0)
    /// Dimensions for plane spawning (width, depth)
    public var PlaneDimensions: SIMD2<Float> = SIMD2(2.0, 2.0)
    /// Track if we've already spawned copies
    public var HasSpawned: Bool = false

    public init() {
    }
}
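One thing worth checking (an assumption on my part, since the post doesn't show it): custom components must be registered before any scene that uses them loads, typically in the app initializer. A minimal sketch, with a hypothetical app type:

import SwiftUI

@main
struct MyApp: App {  // hypothetical app name
    init() {
        // Register the custom component before any RCP scene loads.
        EntitySpawnerComponent.registerComponent()
    }

    var body: some Scene {
        WindowGroup {
            Text("Hello")
        }
    }
}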
I want to render a 3D/stereoscopic video in an Apple Vision Pro window using RealityKit/RealityView. The video is left-right stereo. The straightforward approach would be to spawn a quad and give it a custom Shader Graph material that has a Camera Index Switch node, which chooses between the right texture and the left texture.
https://i.sstatic.net/XawqjNcg.png
The issue I have here is that I have to extract the video frames from my AVSampleBufferVideoRenderer. This should work ok, but not if I'm playing FairPlay content.
So, my question is, how to render stereo FairPlay videos in a SwiftUI RealityView?
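For reference, a minimal sketch of the quad + Camera Index Switch approach described above for the non-FairPlay case. The material path, scene name, parameter names, and texture names are all hypothetical, and the FairPlay part remains the open question:

import RealityKit
import RealityKitContent

func makeStereoQuad() async throws -> ModelEntity {
    var material = try await ShaderGraphMaterial(
        named: "/Root/StereoMaterial",
        from: "Scene",
        in: realityKitContentBundle
    )
    // In the real pipeline these textures would be updated from the
    // AVSampleBufferVideoRenderer frames.
    let leftTexture = try await TextureResource(named: "LeftFrame")
    let rightTexture = try await TextureResource(named: "RightFrame")
    try material.setParameter(name: "LeftTexture", value: .textureResource(leftTexture))
    try material.setParameter(name: "RightTexture", value: .textureResource(rightTexture))

    return ModelEntity(
        mesh: .generatePlane(width: 1.6, height: 0.9),
        materials: [material]
    )
}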
Topic:
Spatial Computing
SubTopic:
Reality Composer Pro
Tags:
Metal
MetalKit
RealityKit
AVFoundation
I’m currently using the RealityKit/ObjectCaptureSession API to develop my app, and I’ve noticed that Apple’s official Reality Composer app also uses the same API. However, both my app and the Reality Composer app crash if the device doesn’t have enough storage space (approximately 4 GB free). Here is the debug log I’m seeing:
Insufficient storage: required 4000000000
Switch to error state. Got error = insufficientStorage(requiredBytes: 4000000000)
fromState == toState so punting transition! from=disabled toState=disabled
Punting transition since states match: disabled
Got error starting session! insufficientStorage(requiredBytes: 4000000000)
I would like to request:
A fix for the crash in the official Reality Composer app.
Guidance on how to properly handle this crash or error when using the ObjectCaptureSession API in my own app.
Thank you!
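In the meantime, a minimal sketch of a pre-flight check for the second request, using the roughly 4 GB figure from the log above (the exact threshold is an assumption):

import Foundation

func hasEnoughStorage(requiredBytes: Int64 = 4_000_000_000) -> Bool {
    let url = URL(fileURLWithPath: NSHomeDirectory())
    // Free space the system can make available for important usage.
    let capacity = (try? url.resourceValues(
        forKeys: [.volumeAvailableCapacityForImportantUsageKey]
    ))?.volumeAvailableCapacityForImportantUsage ?? 0
    return capacity >= requiredBytes
}

// Check before creating the ObjectCaptureSession and show UI instead
// of letting the session hit the insufficientStorage error.
if !hasEnoughStorage() {
    // Present an alert / free-up-space prompt here.
}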
Hi
I'm trying to create a water shader using the shader graph in Reality Composer Pro, but quite a few of the features you would need for realistic water rendering appear to be missing.
One big issue is the lack of a way to create refraction. We can easily control the transparency of the water by changing the opacity, but how can we distort what we see through the water? I can't find any obvious solution for that.
In Unity, there is a node called HD Scene Color, which is basically the scene rendered to an offscreen buffer that you can apply to the water and then distort to get a refraction effect. I guess the Background Blur node could be used for something like this if we could turn off the blur and distort it, but there's no control over the blur and no control over the texture coordinates.
Am I missing something? Any ideas are welcome :)
Topic:
Spatial Computing
SubTopic:
Reality Composer Pro
[xrsimulator] Component Compatibility: EnvironmentLightingConfiguration not available for 'xros 1.0', please update 'platforms' array in Package.swift
[xrsimulator] Exception thrown during compile: compileFailedBecause(reason: "compatibility faults")
Tool exited with code 1
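The fault says the scene uses EnvironmentLightingConfiguration, which requires a newer visionOS than the package declares. A sketch of the likely fix in the RealityKitContent package's Package.swift, raising the deployment target (the exact version number is an assumption):

// swift-tools-version:5.9
import PackageDescription

let package = Package(
    name: "RealityKitContent",
    platforms: [
        .visionOS("2.0")  // was likely .visionOS(.v1), which triggers the fault
    ],
    products: [
        .library(name: "RealityKitContent", targets: ["RealityKitContent"])
    ],
    targets: [
        .target(name: "RealityKitContent")
    ]
)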
Topic:
Spatial Computing
SubTopic:
Reality Composer Pro
Sample repo: https://github.com/ckse93/VideoDiffusionIssueSHowcase
The repo has a detailed step-by-step workflow, as well as screenshots, the Python script's compute results, and parameters.
After running computeDiffuseReflectionUVs.py and mapping the textures and reflection diffuse to objects, I noticed that the reflection diffuse does not produce any color.
The expected result is shown below: the diffused light has color.
Topic:
Spatial Computing
SubTopic:
Reality Composer Pro
Tags:
RealityKit
Reality Composer Pro
Shader Graph Editor
I am using RealityKit's ObjectCaptureSession API to capture objects, presenting the process with ObjectCaptureView. During the object capture session, there is default background audio that plays automatically.
I noticed this same audio behavior in Apple's official Reality Composer app, which seems to use the same API. I'd like to disable this audio in my app, but I have not been able to find any API or configuration option to do so. Is there an official method or workaround to disable this default audio in the ObjectCaptureSession API?
Any guidance would be appreciated. Thank you!
Trying to add a Reality Composer Pro project to my Swift Playgrounds application. I can't figure out what name to use to import the package.
Topic:
Spatial Computing
SubTopic:
Reality Composer Pro
Tags:
Swift Student Challenge
Swift Playground
RealityKit
Looking for help getting "On Tap" to work inside RCP for my AVP project. I can get it to work when using "On Added to Scene," but if I switch to "On Tap," the audio will not play when the audio is attached to an entity in my scene. I'm using the same entity for the tap gesture that the audio is using for the emitter. Here is my workflow for "On Added to Scene," which works correctly, to help troubleshoot my non-working "On Tap":
Behaviors: "on added to scene". action - timeline
Input target: check mark enabled, allowed all
Collision set to default
Audio library: source mp3 file
Channel Audio: resource mp3 file above
Timeline: Play Audio with mp3 file added
This setup in RCP allows my AVP project to launch correctly with audio "On Added to Scene." But when I switch the behavior to "On Tap," the audio no longer plays, and I cannot figure out why. I've tried several different options and nothing works. Please help!
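One thing that often matters here (an assumption, since the post doesn't show the app side): a RealityView does not fire RCP's OnTap trigger by itself; the app has to forward spatial taps to the entity. A minimal sketch, with a hypothetical scene name:

import SwiftUI
import RealityKit
import RealityKitContent

struct TapAudioView: View {
    var body: some View {
        RealityView { content in
            // Load the RCP scene containing the entity with the
            // Behaviors, Input Target, and Collision components.
            // ("Scene" is a hypothetical name.)
            if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                content.add(scene)
            }
        }
        .gesture(
            TapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    // Forward the tap so the OnTap trigger's Timeline
                    // (Play Audio) runs.
                    value.entity.applyTapForBehaviors()
                }
        )
    }
}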
Topic:
Spatial Computing
SubTopic:
Reality Composer Pro
Tags:
Audio
USDZ
Reality Composer Pro
visionOS
Is it possible to have a skydome that influences lighting in the scene but is otherwise invisible? In a raytracer, that would be visible to secondary rays but invisible to primary rays.
Cheers, thanks.
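One possible approach, sketched under the assumption that an image-based light meets the need: attach the environment map as an ImageBasedLightComponent instead of a visible skydome mesh, so it contributes lighting without being rendered directly. The resource name is hypothetical:

import RealityKit

func applyInvisibleSkylight(to rootEntity: Entity) async throws {
    // "SkyEnvironment" is a hypothetical image resource name.
    let environment = try await EnvironmentResource(named: "SkyEnvironment")
    rootEntity.components.set(
        ImageBasedLightComponent(source: .single(environment))
    )
    // Entities under rootEntity receive the image-based lighting.
    rootEntity.components.set(
        ImageBasedLightReceiverComponent(imageBasedLight: rootEntity)
    )
}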
Topic:
Spatial Computing
SubTopic:
Reality Composer Pro
I am submitting an entry to the Swift Student Challenge. I have created a RealityContent folder using Reality Composer Pro. How can I import this folder into the Swift Package Manager (.swiftpm) project hosted in Playgrounds to ensure that it becomes a usable package?
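A sketch of one common wiring, assuming the Reality Composer Pro package folder is named "RealityContent" (as above) and sits inside the .swiftpm project; Swift Playgrounds app projects may declare an .iOSApplication product instead, so adapt as needed:

// swift-tools-version:5.9
// Package.swift of the .swiftpm project; names and paths are assumptions.
import PackageDescription

let package = Package(
    name: "MyPlayground",  // hypothetical
    dependencies: [
        // Local path to the Reality Composer Pro package folder.
        .package(path: "RealityContent")
    ],
    targets: [
        .executableTarget(
            name: "MyPlayground",
            dependencies: [
                .product(name: "RealityContent", package: "RealityContent")
            ]
        )
    ]
)

After that, import RealityContent in your Swift files and load scenes through the bundle accessor the package generates.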
Topic:
Spatial Computing
SubTopic:
Reality Composer Pro
Tags:
Swift Student Challenge
Swift Playground
RealityKit
Reality Composer Pro
My friend cannot build my visionOS project in the simulator. He gets the following error.
Error:
[xrsimulator] Exception thrown during compile: cannotGetRkassetsContents(path: "/Users/path/to/Packages/RealityKitContent/Sources/RealityKitContent/RealityKitContent.rkassets")
In Xcode, he is able to open the RealityKitContent package in Reality Composer Pro by clicking on the Package.realitycomposerpro file. No warnings related to this error show up in RCP either, and all scenes appear to be usable/navigable there. The error only comes up when he tries to build the project in Xcode (Command+B). There is no other information about this error in the Report Navigator's build logs. The error is always followed by this next error.
Error:
Tool exited with code 1
Yikes, please help!
Topic:
Spatial Computing
SubTopic:
Reality Composer Pro