Reality Composer Pro

Prototype and produce content for AR experiences using Reality Composer Pro.

Posts under Reality Composer Pro subtopic

Exporting .reality files from Reality Composer Pro
I've been using the macOS Xcode Reality Composer to export interactive .reality files that can be hosted on the web and linked to, triggering Quick Look to open the interactive AR experience. That works really well. I've just downloaded the Xcode 15 beta, which ships with the new Reality Composer Pro, and I can't see a way to export .reality files anymore. It seems that this is only for building content that ships inside native iOS (etc.) apps, rather than experiences that can be viewed in Quick Look. Am I missing something, or is it no longer possible to export .reality files? Thanks.
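For context on the Quick Look flow itself, here is a minimal, hypothetical sketch of presenting a bundled .reality (or .usdz) file with AR Quick Look from an iOS app; "Experience.reality" is a placeholder name, and this does not address whether Reality Composer Pro can still export .reality files.

```swift
import UIKit
import QuickLook
import ARKit

// Hypothetical sketch: presenting a bundled .reality/.usdz file with AR Quick Look.
// "Experience.reality" is a placeholder asset name.
final class RealityPreviewPresenter: NSObject, QLPreviewControllerDataSource {
    func present(from viewController: UIViewController) {
        let preview = QLPreviewController()
        preview.dataSource = self
        viewController.present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
        // Assumes the file is bundled with the app.
        let url = Bundle.main.url(forResource: "Experience", withExtension: "reality")!
        return ARQuickLookPreviewItem(fileAt: url)
    }
}
```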
Replies: 3 · Boosts: 2 · Views: 2.0k · Created: Sep ’23
Animations exported from Blender do not show in Reality Composer Pro
I made an animation in Blender using geometry nodes and exported it to a USDC file (then used Reality Converter to convert it to USDZ). I can see the animation when previewing the file in Finder, but it does not play after importing it into RCP. Any idea how I can play the animation? Or can the animation be accessed through Xcode? Thanks!
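As a diagnostic, here is a minimal sketch, assuming the converted file is bundled as "Animation.usdz": load the entity in code and play whatever animations the file actually carries. If availableAnimations comes back empty, the animation was likely lost in the Blender-to-USDZ conversion rather than in RCP.

```swift
import RealityKit

// Minimal diagnostic sketch; "Animation" is a placeholder for the bundled USDZ name.
func loadAndPlayAnimations() async throws -> Entity {
    let entity = try await Entity(named: "Animation")
    // If this array is empty, the animation never made it into the USDZ file.
    for animation in entity.availableAnimations {
        entity.playAnimation(animation.repeat(), transitionDuration: 0.3, startsPaused: false)
    }
    return entity
}
```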
Replies: 4 · Boosts: 0 · Views: 1k · Created: Jun ’24
Blender to Reality Composer Pro 2.0 to SwiftUI + RealityKit visionOS Best Practices
Hi, I'm very new to 3D and am currently porting a SwiftUI iOS app to visionOS 2.0. I saw WWDC24 feature Blender in multiple spatial videos, and I have begun integrating Blender models and animations into my visionOS app (I would also like to integrate skeletons and programmatic rigging, more on that later). I'm wondering if there are "best practices" for this workflow, from Blender to USD to RCP 2.0 to visionOS 2 in Xcode. I've cobbled together the following, which has some obvious holes:

I've been able to find some pre-rigged and pre-animated models online that serve as a great starting point. As a reference, here is a free model from Sketchfab, a simple rigged skeleton with 6 built-in animations: https://sketchfab.com/3d-models/skeleton-character-low-poly-8856e0138f424d68a8e0b40e185951f6

When exporting to USD from Blender, I haven't been able to export more than one animation per USD file. Is there a workflow to export multiple animations in a single USDC file, or is this just not possible? As a temporary workaround, here is a Python script I've been using to loop through all Blender animations and export a model for each animation:

```python
import bpy
import os

# Set the directory where you want to save the USD files
output_directory = "/path/to/export"

# Ensure the directory exists
if not os.path.exists(output_directory):
    os.makedirs(output_directory)

# Function to export the current scene as USD
def export_scene_as_usd(output_path, start_frame, end_frame):
    bpy.context.scene.frame_start = start_frame
    bpy.context.scene.frame_end = end_frame

    # Export the scene as a USD file
    bpy.ops.wm.usd_export(
        filepath=output_path,
        export_animation=True
    )

# Save the current scene name
original_scene = bpy.context.scene.name

# Iterate through each action and export it as a USD file
for action in bpy.data.actions:
    # Create a new scene for each action
    bpy.context.window.scene = bpy.data.scenes[original_scene].copy()
    new_scene = bpy.context.scene

    # Link the action to all relevant objects
    for obj in new_scene.objects:
        if obj.animation_data is not None:
            obj.animation_data.action = action

    # Determine the frame range for the action
    start_frame, end_frame = action.frame_range

    # Export the scene as a USD file
    output_path = os.path.join(output_directory, f"{action.name}.usdc")
    export_scene_as_usd(output_path, int(start_frame), int(end_frame))

    # Delete the temporary scene to free memory
    bpy.data.scenes.remove(new_scene)

print("Export completed.")
```

I have also been able to successfully export rigging armatures as a single skeleton, with each "bone" imported into Reality Composer Pro 2.0 when exporting/importing manually.

I would like to have all of these animations available in a single scene to be used in a RealityView in visionOS, so I have placed all animation models in an RCP scene and created named Timeline Action animations for each, showing the correct model and hiding the rest when triggering specific animations. I apply materials/textures to each so they appear the same, using Shader Graph. Then, in SwiftUI, I use notifications (as shown here: https://forums.developer.apple.com/forums/thread/756978) to trigger each RCP Timeline Action animation from code.

Two questions:
1. Is there a better way than having multiple models of the same skeleton, each with a different animation, in a scene in order to trigger multiple animations? Or would this require recreating the Blender animations with skeleton rigging and keyframes inside RCP Timelines?
2. If I want to programmatically create custom animations and move parts of the skeleton/armature, do I need to do this by defining custom components in RCP, using IKRig, and defining the movement of each "bone" in Xcode?

I'm looking for any tips, tricks, or workflows from experienced engineers or 3D artists that can make a more efficient/optimized pipeline using Blender, USD, RCP 2, and visionOS 2 with SwiftUI. Thanks so much, I appreciate any help! I'm very excited about all the new tools that keep evolving to make spatial apps really fun to build!
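One hedged sketch of an alternative to keeping one model per animation in the scene: keep a single skeleton entity and load the per-animation USD exports only to harvest their AnimationResources, then play them on that one entity. This assumes every export shares an identical joint hierarchy; the asset names and the realityKitContentBundle package are assumptions for this example.

```swift
import RealityKit
import RealityKitContent  // assumption: the Reality Composer Pro package name

// Hedged sketch: harvest one AnimationResource from each per-animation USD export
// and play them on a single skeleton entity. Asset names are placeholders, and
// this only works if every export shares the same joint hierarchy.
@MainActor
func loadAnimationLibrary(named names: [String]) async throws -> [String: AnimationResource] {
    var library: [String: AnimationResource] = [:]
    for name in names {
        let donor = try await Entity(named: name, in: realityKitContentBundle)
        if let animation = donor.availableAnimations.first {
            library[name] = animation
        }
    }
    return library
}

// Usage (placeholder names): skeleton.playAnimation(library["Walk"]!.repeat(), transitionDuration: 0.25)
```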
Replies: 4 · Boosts: 2 · Views: 1.1k · Created: Jun ’24
How to create Lunar Rover USDZ Animated Sample File
Hello! I’ve got a USDZ export from a Maya pipeline working with animation, and the files load up nicely on the Vision Pro. I’ve been checking out the animated sample files on the Augmented Reality Quick Look sample page, specifically the first three at the top of the page. I would like to know how they are created. I’m a 3D modeler and animator, not a programmer, so I’m dipping my toe into RCP and Xcode/SwiftUI, but I could use some informative tutorials on the proper workflow. For example, in the Lunar Rover sample, there are lines emanating from the model, and then text windows appear. Would I need to create all these extras inside Reality Composer Pro? I’d like to start creating immersive, narrative experiences (both in a volume and fully immersive), but for prototyping I want to learn the proper way to add this type of functionality. I think I remember seeing something to do with “schemas” involved. I’m assuming there might be some coding to set up in RCP for when items are selected, so that an associated animation is triggered. Can anyone point me towards the relevant documentation to help me get started? Remember, I don’t code. ;) Here are my recent Vision Pro experimentations: https://youtube.com/playlist?list=PLCH753rZ9r6eqXxpIemaSlcyYxjFgR210&si=P_7AY2aL97Upm61i I’m also proficient with Unreal Engine, but getting content packaged and over to AVP is still not ready for prime time, so I’m exploring the native approach. Thanks for helping point me in the right direction!
Replies: 1 · Boosts: 0 · Views: 588 · Created: Jul ’24
How to trigger actions by OnCollision in Behaviors Component
It's all about notifications triggering actions from RCP's new Timeline system. After watching Compose interactive 3D content in Reality Composer Pro, I'm starting to wonder why we need to call Entity.applyTapForBehaviors in code to trigger content in a Behaviors Component at all, given that in the Behaviors Component we have already chosen OnTap so that a "Tap Notification" triggers our action (on a selected target object). By that logic, when selecting the OnCollision trigger I should write something like CollisionEvent.entityA.applyCollisionForBehaviors, which doesn't exist, and of course the collision on my object doesn't trigger the action (because I only set things up in RCP, not in code). Setting aside that this post points out we could use the Behaviors Component's OnNotification trigger for now, I found that I can still use the OnTap trigger and call Entity.applyTapForBehaviors from inside my subscribed collision-begin event; that actually works better than OnCollision. So what are the design principles here? And how can I send a collision notification so that my Behaviors Component's OnCollision trigger actually works?
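A sketch of the workaround described above, assuming a scene named "Scene" and a target entity named "InteractiveObject" in a Reality Composer Pro package exposed as realityKitContentBundle: keep the OnTap trigger in RCP and forward collision-begin events to applyTapForBehaviors in code.

```swift
import SwiftUI
import RealityKit
import RealityKitContent  // assumption: the Reality Composer Pro package name

// Sketch of the workaround; "Scene" and "InteractiveObject" are placeholder names.
struct CollisionForwardingView: View {
    @State private var subscriptions: [EventSubscription] = []

    var body: some View {
        RealityView { content in
            guard let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) else { return }
            content.add(scene)

            if let target = scene.findEntity(named: "InteractiveObject") {
                subscriptions.append(content.subscribe(to: CollisionEvents.Began.self, on: target) { _ in
                    // Fire the Behaviors component's OnTap trigger when a collision begins.
                    _ = target.applyTapForBehaviors()
                })
            }
        }
    }
}
```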
Replies: 1 · Boosts: 2 · Views: 594 · Created: Aug ’24
Blurred Background (RealityKit) Shader Graph Node not working on iOS/macOS
The Shader Graph node Blurred Background (RealityKit) – https://developer.apple.com/documentation/shadergraph/realitykit/blurred-background-(realitykit) – works fine within the Reality Composer Pro 2 editor but isn't working on iOS 18 or macOS 15. Instead of the blurred content, it just renders as opaque in a single color (Screenshot 2). Interestingly, it also fails to render within Reality Composer Pro when no other entities are in the scene, e.g. when only a background skybox is set. Expected behavior: it would be great if this node worked the same way as it does on visionOS, since this would allow for really interesting and nice effects in scenes. Feedback ID: FB15081190
Replies: 1 · Boosts: 1 · Views: 772 · Created: Sep ’24
ShaderGraphMaterial with Occlusion Surface Output fails to load on iOS and macOS
A ShaderGraphMaterial with an Occlusion Surface Output generated with Reality Composer Pro 2 fails to load on iOS 18 and macOS 15 with the following error: RealityFoundation.ShaderGraphMaterial.LoadError.invalidTypeFound (https://developer.apple.com/documentation/realitykit/shadergraphmaterial/loaderror/invalidtypefound). This happens with both https://developer.apple.com/documentation/shadergraph/realitykit/occlusion-surface-(realitykit) and https://developer.apple.com/documentation/shadergraph/realitykit/shadow-receiving-occlusion-surface-(realitykit).

```swift
RealityView { content in
    do {
        let bgEntity = ModelEntity(
            mesh: .generateCone(height: 0.5, radius: 0.1),
            materials: [SimpleMaterial(color: .red, isMetallic: true)]
        )
        bgEntity.position.z = -0.2
        content.add(bgEntity)

        let occlusionMaterial = try await ShaderGraphMaterial(
            named: "/Root/OcclusionMaterial",
            from: "OcclusionMaterial"
        )
        let testEntity = ModelEntity(
            mesh: .generateSphere(radius: 0.4),
            materials: [occlusionMaterial]
        )
        content.add(testEntity)
        content.cameraTarget = testEntity
    } catch {
        print("Shader Graph Load Error:")
        dump(error)
    }
}
.realityViewCameraControls(.orbit)
.edgesIgnoringSafeArea(.all)
```

Feedback ID: FB15081296
Replies: 1 · Boosts: 1 · Views: 731 · Created: Sep ’24
I have a question regarding particle emitters in RealityKit
I am currently developing a game that runs on visionOS using RealityKit and Swift. I have a question regarding particle emitters. It seems that there is a sorting order (render queue) between particle emitters themselves, but there doesn't appear to be a render queue between particle emitters and regular model entities. If such a feature exists, could you please provide a simple example? Thank you!
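One hedged possibility, not confirmed for every particle/model combination: ModelSortGroupComponent lets you impose an explicit draw order on entities that share a sort group, which may help order a particle emitter relative to a regular model entity. The entities and depth-pass choice below are placeholders.

```swift
import RealityKit

// Hedged sketch; `particleEntity` and `modelEntity` stand in for entities
// already in the scene. Lower `order` values draw earlier within the group.
func applyExplicitDrawOrder(particleEntity: Entity, modelEntity: Entity) {
    let group = ModelSortGroup(depthPass: .postPass)
    modelEntity.components.set(ModelSortGroupComponent(group: group, order: 0))
    particleEntity.components.set(ModelSortGroupComponent(group: group, order: 1))
}
```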
Replies: 1 · Boosts: 0 · Views: 414 · Created: Nov ’24
Timeline and behaviors in RCP and Xcode
Hi, everyone! I'm trying to bind a timeline (animation + audio) and behaviors to an entity in Reality Composer Pro. In Xcode I need to clone this entity and use the behaviors, but I found that the behaviors are not cloned (the notification is sent but not received by the Reality Composer Pro content, and the timeline does not execute). How can I solve this problem? Thanks!
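For reference, a sketch of the notification pattern for RCP triggers, with the detail that the scene in the userInfo must be the one containing the entity you expect to react (for a clone, the clone's own scene); the identifier "StartTimeline" is a placeholder for whatever name is set on the Notification trigger in RCP. If cloning really does drop the behaviors, they would still need to be restored on the clone.

```swift
import Foundation
import RealityKit

// Sketch of the RCP notification trigger pattern; "StartTimeline" is a placeholder
// for the identifier configured on the Behaviors component's Notification trigger.
// The scene in the userInfo must be the RealityKit scene that contains the entity
// you expect to react -- for a clone, that is the clone's own scene.
func triggerTimeline(on entity: Entity, identifier: String = "StartTimeline") {
    guard let scene = entity.scene else { return }
    NotificationCenter.default.post(
        name: Notification.Name("RealityKit.NotificationTrigger"),
        object: nil,
        userInfo: [
            "RealityKit.NotificationTrigger.Scene": scene,
            "RealityKit.NotificationTrigger.Identifier": identifier
        ]
    )
}
```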
Replies: 0 · Boosts: 0 · Views: 423 · Created: Nov ’24
Quadrophonic
Hi, in the downloadable WWDC sample project "CreatingASpaceshipGame" there is an audio file named "WorkMusic.aiff", which is also mentioned in the video. Info says it's PCM 4-channel Quadrophonic. Where can I find further information on how this file was authored? Was it simply exported from Logic Pro with Quadrophonic Surround settings, or did it have any other specific treatment? Thanks, Axel
Replies: 3 · Boosts: 0 · Views: 690 · Created: Dec ’24
[Reality Composer Pro] Detecting Collision in RealityComposerPro scene
Hello! I read this thread, https://forums.developer.apple.com/forums/thread/762763, and it is similar to what I'm trying to do. I have two entities in the scene, "HandTrackingEntity" and "HandScanner". On "HandTrackingEntity" I put an Anchoring component and a Collision component (Trigger). On "HandScanner" I put a Behaviors component (OnCollision) and a Collision component. Here are pictures of how I set the components, and I set the physicsSimulation property to .none. I was expecting the Timeline to play when I put my hand (with HandTrackingEntity) on the "HandScanner" entity, but it didn't work. Am I missing some steps? I'd also appreciate sample code showing how to apply the physicsSimulation property.
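A hedged sketch of building the same setup in code, with placeholder names, shapes, and sizes: a hand-anchored entity whose AnchoringComponent has physicsSimulation set to .none, a trigger-mode collider on the scanner, and a subscription to collision-begin events. Depending on the project, a SpatialTrackingSession with hand-tracking authorization may also be required.

```swift
import SwiftUI
import RealityKit

// Hypothetical sketch; entity names, shapes, and sizes are placeholders.
struct HandScannerView: View {
    @State private var subscriptions: [EventSubscription] = []

    var body: some View {
        RealityView { content in
            // Hand-anchored entity; physicsSimulation = .none lets its collider
            // participate in the main scene's collision simulation.
            var anchoring = AnchoringComponent(.hand(.right, location: .palm))
            anchoring.physicsSimulation = .none
            let handEntity = Entity()
            handEntity.components.set(anchoring)
            handEntity.components.set(CollisionComponent(shapes: [.generateSphere(radius: 0.05)]))
            content.add(handEntity)

            // The scanner volume the hand should trigger.
            let scanner = ModelEntity(mesh: .generateBox(size: 0.2), materials: [SimpleMaterial()])
            scanner.components.set(CollisionComponent(shapes: [.generateBox(size: [0.2, 0.2, 0.2])],
                                                      mode: .trigger,
                                                      filter: .default))
            content.add(scanner)

            // React when the hand collider enters the scanner volume, e.g. by
            // posting the notification that starts the RCP timeline.
            subscriptions.append(content.subscribe(to: CollisionEvents.Began.self, on: scanner) { _ in
                print("Hand entered the scanner volume")
            })
            // Note: a SpatialTrackingSession with .hand tracking may also be needed.
        }
    }
}
```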
Replies: 1 · Boosts: 0 · Views: 517 · Created: Dec ’24
Issue with Vertex Animation Scaling Not Working in Reality Composer Pro
Hi everyone, I’m working on a project in Reality Composer Pro, and I’ve encountered an issue with vertex animations. Here’s what I’ve done: I created a model in Blender that includes two animations, one that scales the vertices of the model and another that moves the model's position. I imported both animations into Reality Composer Pro; the position animation works fine, but the vertex scaling animation does not seem to work.

What I’m trying to achieve: I want the vertex scaling animation to play correctly in Reality Composer Pro alongside the movement animation.

Problem: The position animation works as expected, but the vertex scaling animation does not work when applied in Reality Composer Pro. I have checked the vertex scaling animation in other software, and it works fine there; the issue seems to be specific to Reality Composer Pro. Is vertex animation scaling supported in Reality Composer Pro? If so, what might be causing this issue? Any advice or solutions would be greatly appreciated!
Replies: 0 · Boosts: 0 · Views: 449 · Created: Dec ’24
Values for SIMD3 and SIMD2 not showing up in Reality Composer Pro
Reality Composer Pro question related to custom components: my custom component defines some properties to edit in RCP. Simple ones work fine, but SIMD3 and SIMD2 do not. I'd expect to see default values, but instead I get 0s. If I try to run this, the scene doesn't load; once I enter some values, then build and run again, it works fine. More generally, does Apple have documentation on creating properties for components? The only examples I've seen show simple strings and floats; there are no details about vectors, conditional options, grouping properties, etc.

```swift
public struct EntitySpawnerComponent: Component, Codable {
    public enum SpawnShape: String, Codable {
        case domeUpper
        case domeLower
        case sphere
        case box
        case plane
        case circle
    }

    // These properties get their default values in RCP
    /// The number of clones to create
    public var Copies: Int = 12
    /// The shape to spawn entities in
    public var SpawnShape: SpawnShape = .domeUpper
    /// Radius for spherical shapes (dome, sphere, circle)
    public var Radius: Float = 5.0

    // These properties DO NOT get their default values in RCP. They all show 0
    /// Dimensions for box spawning (width, height, depth)
    public var BoxDimensions: SIMD3<Float> = SIMD3(2.0, 2.0, 2.0)
    /// Dimensions for plane spawning (width, depth)
    public var PlaneDimensions: SIMD2<Float> = SIMD2(2.0, 2.0)

    /// Track if we've already spawned copies
    public var HasSpawned: Bool = false

    public init() {
    }
}
```
Replies: 1 · Boosts: 0 · Views: 516 · Created: Dec ’24
RealityKit texture sampling behaviour
Hey there, I am working on an app that displays environmental data using PNG color channels to represent data ranges, which gets overlaid on a map. The sampled values aren't what I'm expecting, though: for example, an RGB value of 0x7f0000 (R = 0.5, G = 0, B = 0) is seen as 0.21, 0, 0 in the shader. This basically makes it unusable if I'm trying to show scientific data. I'm half wondering if I am completely misunderstanding how sampling works in RealityKit / Reality Composer Pro. Anybody have any idea why it works like this?
Actual result (chart labels added in Photoshop):
Expected: Red > 0.1
Shader Graph
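One hedged observation: 0.5 mapping to roughly 0.21 matches an sRGB-to-linear conversion, which suggests the PNG is being loaded with a color (sRGB) texture semantic. Below is a sketch of loading it with the .raw semantic so the channel values reach the shader unconverted; the texture and parameter names are placeholders, and the exact initializer availability should be checked against the deployment target.

```swift
import RealityKit

// Hedged sketch: load the data texture with the .raw semantic so no sRGB-to-linear
// conversion is applied. "EnvironmentData" and "dataMap" are placeholder names.
func applyDataTexture(to material: inout ShaderGraphMaterial) async throws {
    let options = TextureResource.CreateOptions(semantic: .raw)
    let texture = try await TextureResource(named: "EnvironmentData", options: options)
    try material.setParameter(name: "dataMap", value: .textureResource(texture))
}
```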
Replies: 4 · Boosts: 0 · Views: 704 · Created: Dec ’24
RealityKit and macOS - How to load a Reality Composer scene? Documentation incomplete/inaccurate
"Although Xcode generates loading methods for all Reality Composer files in your Xcode project" - I do not find this to be true, sadly. Does anyone have any luck or insight on how one can build just a simple macOS app that imports a scene from a .reality file? The documentation suggests that the simple act of bringing a .reality file in (what about .realitycomposerpro?) will generate code, but that doesn't seem to happen. The sample code (Spaceship) does not compile for macOS. I'd really love just the most generic template of an Xcode project that compiles, with a button that pops open a scene, like the visionOS default immersive project.
Replies: 2 · Boosts: 0 · Views: 810 · Created: Dec ’24