Reality Composer Pro


Leverage the all-new Reality Composer Pro, designed to make it easy to preview and prepare 3D content for your visionOS apps.

Posts under Reality Composer Pro tag

140 Posts

Post | Replies | Boosts | Views | Activity

Xcode 16 / Reality Composer Pro 2 segmentation fault issue
After installing macOS Sequoia, Xcode 16, and Reality Composer Pro 2, my Apple Vision Pro projects (which worked perfectly with Xcode 15) started giving me a Tool terminated by signal 'Segmentation fault: 11' error while compiling RealityKitContent assets. This happens only when I try to build the project with .usdz models exported from Blender; when I use the sample models from Apple's website, it builds with no errors. Is there any solution?
2
0
599
Dec ’24
Getting vector3 or vector4 per-vertex data into Shader Graph Editor
Hi, I'm trying to understand how I can get 3- or 4-channel per-vertex data into the Shader Graph Editor. From my tests, it seems that:
the "Geometric Property" node does not give access to 4-channel data,
"Geometric Property (vector3)" does not give me access to custom properties besides the ones defined in MaterialX core,
the "Texture Coordinates" node has a vector4f mode (yay!), but according to the MaterialX spec, texcoords must have 2 or 3 channels, and I can't get 4-channel data to show up there either.
My assumption so far is that I must be missing some "magic" – for example, do the primvars in a file have to be in a specific order, independent of their names? Or do their names matter? (E.g. the convention would be primvars:st, primvars:st1, and so on.) Unfortunately the forum doesn't allow me to attach any USDZ or ZIP files or GDrive links; if there's a way to share a test file I'm happy to do so!
0
0
621
Sep ’24
Weird Reality Composer Pro Loop Timeline bug
TL;DR: The timeline does not play the animation when Repeat Forever is checked. Hi! I have created a timeline for my model that runs a built-in emphasize animation. Then I added a behavior to my model and set OnAddedToScene with an action to run that timeline. It works perfectly well on my device, but I want the timeline to loop. I realized there's no loop option on the timeline itself; however, I noticed I can loop it if I insert it into another timeline (the loop checkbox shows up). So I did that and had my model's behavior run that outer timeline, but then the model doesn't play the animation as intended. Note: I am not making a Vision Pro app, but an iOS app leveraging ARKit and RealityKit. Environment: iPhone 13 Pro Max with iOS 18.0. Code:
struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        arView.session.run(ARWorldTrackingConfiguration()) // session.run() requires a configuration
        Task {
            do {
                let anchor = AnchorEntity(plane: .horizontal)
                // Closing parenthesis was missing in the original post; "bubbleAR" is the poster's bundle identifier as written.
                let emojiScene = try await Entity(named: "SunglassesScene", in: bubbleAR)
                anchor.addChild(emojiScene)
                arView.scene.addAnchor(anchor)
            } catch {
                print("Failed to load models: \(error)")
            }
        }
        return arView
    }
}
Thank you!
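If the nested-timeline loop keeps misbehaving, one possible workaround (my suggestion, not from the thread) is to loop a baked animation from code instead of from a timeline. A minimal sketch, assuming the loaded entity actually exposes an animation through availableAnimations (which an RCP-only emphasize action may not produce):
import RealityKit

// Sketch: loop the entity's first baked animation indefinitely from code.
func loopFirstAnimation(on entity: Entity) {
    guard let animation = entity.availableAnimations.first else { return }
    // repeat() wraps the resource so playback restarts indefinitely.
    entity.playAnimation(animation.repeat(), transitionDuration: 0.25)
}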
0
0
572
Sep ’24
How to display spatial images or videos in a SwiftUI view
Hi! I am making a visionOS program. I have an idea to embed spatial videos or pictures into my UI, but I have run into problems and can't implement it. I have tried the following: Using AVPlayerViewController to play a spatial video, but it only displays the spatial video when modalPresentationStyle = .fullScreen; once embedded in a SwiftUI view, it shows as a normal 2D video. I also tried the method from https://developer.apple.com/forums/thread/733813, using a shader graph to display spatial images, but the material can only be attached to an entity, and I don't know how to make it show up in a view. I also tried to use CAMetalLayer and write a custom shader to display spatial images, but I couldn't find anything like Unity's unity_StereoEyeIndex to switch rendering between the two eyes. Does anyone have a good solution to my problem? Thank you!
0
0
571
Sep ’24
ShaderGraphMaterial with Occlusion Surface Output fails to load on iOS and macOS
A ShaderGraphMaterial with an Occlusion Surface Output generated with Reality Composer Pro 2 fails to load on iOS 18 and macOS 15 with the following error: RealityFoundation.ShaderGraphMaterial.LoadError.invalidTypeFound (https://developer.apple.com/documentation/realitykit/shadergraphmaterial/loaderror/invalidtypefound). This happens with both https://developer.apple.com/documentation/shadergraph/realitykit/occlusion-surface-(realitykit) and https://developer.apple.com/documentation/shadergraph/realitykit/shadow-receiving-occlusion-surface-(realitykit).
RealityView { content in
    do {
        let bgEntity = ModelEntity(mesh: .generateCone(height: 0.5, radius: 0.1), materials: [SimpleMaterial(color: .red, isMetallic: true)])
        bgEntity.position.z = -0.2
        content.add(bgEntity)

        let occlusionMaterial = try await ShaderGraphMaterial(named: "/Root/OcclusionMaterial", from: "OcclusionMaterial")
        let testEntity = ModelEntity(mesh: .generateSphere(radius: 0.4), materials: [occlusionMaterial])
        content.add(testEntity)
        content.cameraTarget = testEntity
    } catch {
        print("Shader Graph Load Error:")
        dump(error)
    }
}
.realityViewCameraControls(.orbit)
.edgesIgnoringSafeArea(.all)
Feedback ID: FB15081296
1
1
696
Jan ’25
Casting shadows on the ground
In visionOS 2 beta, I have a character loaded from a Reality Composer Pro scene standing on the floor, but he isn't casting a shadow on the floor. I added a GroundingShadowComponent in RealityView, and he does cast shadows on himself (e.g., his hands cast shadows on his shoes), but I don't see any shadow on the floor. Do I need to enable something to have my character cast a shadow on the real-world floor?
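A common suggestion (not confirmed as the fix in this thread) is that GroundingShadowComponent needs to be present on every ModelEntity in the hierarchy, not just on the root of the loaded scene. A minimal sketch of walking the character's hierarchy and setting the component on each model entity:
import RealityKit

// Sketch: attach a grounding shadow to every model entity under `root`,
// since setting the component only on the root may not affect child meshes.
func enableGroundingShadows(for root: Entity) {
    if root is ModelEntity {
        root.components.set(GroundingShadowComponent(castsShadow: true))
    }
    for child in root.children {
        enableGroundingShadows(for: child)
    }
}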
1
0
646
Sep ’24
Build Errors for Reality Composer Pro Packages in Xcode 16 Beta 6 for iOS 18
Using Xcode 15.4, I have successfully built and run my app using a Reality Composer Pro Version 1.0 package. I then successfully submitted that app version for release. Now, using Xcode 16 Beta 6, I've created a new branch repository for updating my app for iOS/iPadOS 18 and visionOS 2. However, once I created and switched to the new branch and did a build, I get build errors. They seem to concern the package manifest of the Reality Composer Pro package that is part of my app. When I go to the package file in my project navigator and click the Open in Reality Composer Pro button, my package opens in Reality Composer Pro 2.0, which makes sense since it is the version for Xcode 16. However, I don't know how to address/get rid of the build errors. I've added an image of my build errors.
5
0
1.1k
Sep ’24
Compose interactive 3D content in Reality Composer Pro -- Build Error
Compilation of the project for the WWDC 2024 session titled Compose interactive 3D content in Reality Composer Pro fails. After applying the fix mentioned here (https://developer.apple.com/forums/thread/762030?login=true), the project still won't compile. Using Xcode 16 beta 7, I get these errors:
error: [xrsimulator] Component Compatibility: EnvironmentLightingConfiguration not available for 'xros 1.0', please update 'platforms' array in Package.swift
error: [xrsimulator] Component Compatibility: AudioLibrary not available for 'xros 1.0', please update 'platforms' array in Package.swift
error: [xrsimulator] Component Compatibility: BlendShapeWeights not available for 'xros 1.0', please update 'platforms' array in Package.swift
error: [xrsimulator] Exception thrown during compile: compileFailedBecause(reason: "compatibility faults")
error: Tool exited with code 1
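For reference, the diagnostics point at the platforms array of the Reality Composer Pro content package. A hedged sketch of the kind of change they ask for, assuming the package is otherwise unchanged (the package and product names here are placeholders, not necessarily the sample project's):
// swift-tools-version:6.0
import PackageDescription

let package = Package(
    name: "RealityKitContent",   // placeholder name
    platforms: [
        // Raising the deployment targets is what the
        // "not available for 'xros 1.0'" diagnostics ask for.
        .visionOS(.v2),
        .iOS(.v18),
        .macOS(.v15)
    ],
    products: [
        .library(name: "RealityKitContent", targets: ["RealityKitContent"])
    ],
    targets: [
        .target(name: "RealityKitContent")
    ]
)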
4
0
835
Oct ’24
How to trigger actions by OnCollision in Behaviors Component
This is about using notifications to trigger actions from RCP's new Timeline system. After watching Compose interactive 3D content in Reality Composer Pro, I'm starting to wonder why we need to call Entity.applyTapForBehaviors in code to trigger content in the Behaviors component at all, given that in the Behaviors component we have already chosen OnTap to let a "Tap Notification" trigger our action (on a selected target object). By the same logic, after selecting the OnCollision trigger I would expect to write something like CollisionEvent.entityA.applyCollisionForBehaviors, which doesn't exist. And of course the collision on my object won't trigger the action (because I only set things up in RCP, not in code). Setting aside that this post points out we could use the Behaviors component's OnNotification trigger for now, I found that I could keep the OnTap trigger and simply call Entity.applyTapForBehaviors from my subscribed collision begin event. That actually works better than OnCollision. So what are the design principles here? And how can I fire a collision notification so that my Behaviors component's OnCollision trigger actually works?
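For context, here is a minimal sketch of the workaround described above: subscribing to the collision begin event in code and forwarding it to the entity's tap behavior. It assumes a visionOS RealityView (so `content` is a RealityViewContent) and an entity that already has a CollisionComponent; applyTapForBehaviors() is the call the poster mentions:
import RealityKit

// Sketch: forward a collision's begin event to the entity's OnTap behavior,
// since there is no applyCollisionForBehaviors() equivalent.
// Keep the returned EventSubscription alive for as long as you need the forwarding.
func forwardCollisionsToTapBehavior(for entity: Entity,
                                    in content: RealityViewContent) -> EventSubscription {
    content.subscribe(to: CollisionEvents.Began.self, on: entity) { event in
        // Trigger the Behaviors component's OnTap action when the collision begins.
        event.entityA.applyTapForBehaviors()
    }
}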
1
2
570
Mar ’25
Reality Composer Pro 2.0 shader graphs can't be loaded on visionOS 1
Using Reality Composer Pro 2.0, I created a simple shader graph that displays a texture on an unlit surface: On visionOS 2 beta, I can successfully use ShaderGraphMaterial(named:from:in:) to load that shader graph material and assign it to a model entity. However, on visionOS 1.2 and earlier, either in Simulator or on the device, ShaderGraphMaterial(named:from:in:) fails and I see the following logged to the console: If, using Reality Composer Pro 1.0, I experimentally open the same project and delete and recreate exactly the same nodes above, then ShaderGraphMaterial(named:from:in:) works as expected on visionOS 1.2. Is it a known issue that Reality Composer 2 can't be used with visionOS 1? Is this intentional behavior? I've submitted feedback as FB14828873, including a sample project and repro steps. If possible, I would appreciate guidance from an Apple engineer, like "This is a known issue for [list of node types]" or "Reality Composer Pro 2 is not supported for visionOS 1 development, please refer to [documentation]" or "We recommend [workaround]." Thank you.
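Until that's resolved, one hedged workaround (my suggestion, not from the thread) is to gate loading of the RCP 2 material behind an availability check and fall back to a plain material on visionOS 1.x. The material path, scene name, and realityKitContentBundle below are placeholders and assume `import RealityKitContent`:
import RealityKit
import RealityKitContent

// Sketch: load the Reality Composer Pro 2 shader graph where it works (visionOS 2+),
// and fall back to a simple unlit material on visionOS 1.x, where loading throws.
func makeMaterial() async -> any Material {
    if #available(visionOS 2.0, *) {
        if let material = try? await ShaderGraphMaterial(named: "/Root/UnlitTextureMaterial",
                                                         from: "Scene",
                                                         in: realityKitContentBundle) {
            return material
        }
    }
    return UnlitMaterial(color: .white)   // fallback for visionOS 1.x
}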
7
0
1.4k
May ’25
App Environment SkyDome's UV values
I started a visionOS app using Apple's new "App Environment" template, and when I looked at the UV mapping for the half SkyDome, the bottom edge had a UV 'Y' value of 0.318. Naively, I had assumed the bottom edge of a half dome would have a UV 'Y' value of 0.5 (halfway up the texture map). Is this the standard UV mapping for half a SkyDome? It has caused some issues when I've applied some HDRIs.
1
0
672
Sep ’24
Blender to Reality Composer Pro 2.0 to SwiftUI + RealityKit visionOS Best Practices
Hi, I'm very new to 3D and am currently porting a SwiftUI iOS app to visionOS 2.0. I saw WWDC24 feature Blender in multiple spatial videos, and have begun integrating Blender models and animations into my visionOS app (I would also like to integrate skeletons and programmatic rigging, more on that later). I'm wondering if there are "Best Practices" for this workflow, from Blender to USD to RCP 2.0 to visionOS 2 in Xcode. I've cobbled together the following, which has some obvious holes: I've been able to find some pre-rigged and pre-animated models online that can serve as a great starting point. As a reference, here is a free model from SketchFab, a simple rigged skeleton with 6 built-in animations: https://sketchfab.com/3d-models/skeleton-character-low-poly-8856e0138f424d68a8e0b40e185951f6 When exporting to USD from Blender, I haven't been able to export more than one animation per USD file. Is there a workflow to export multiple animations in a single USDC file, or is this just not possible? As a temporary workaround, here is a Python script I've been using to loop through all Blender animations and export a model for each animation:
import bpy
import os

# Set the directory where you want to save the USD files
output_directory = "/path/to/export"

# Ensure the directory exists
if not os.path.exists(output_directory):
    os.makedirs(output_directory)

# Function to export current scene as USD
def export_scene_as_usd(output_path, start_frame, end_frame):
    bpy.context.scene.frame_start = start_frame
    bpy.context.scene.frame_end = end_frame
    # Export the scene as a USD file
    bpy.ops.wm.usd_export(
        filepath=output_path,
        export_animation=True
    )

# Save the current scene name
original_scene = bpy.context.scene.name

# Iterate through each action and export it as a USD file
for action in bpy.data.actions:
    # Create a new scene for each action
    bpy.context.window.scene = bpy.data.scenes[original_scene].copy()
    new_scene = bpy.context.scene

    # Link the action to all relevant objects
    for obj in new_scene.objects:
        if obj.animation_data is not None:
            obj.animation_data.action = action

    # Determine the frame range for the action
    start_frame, end_frame = action.frame_range

    # Export the scene as a USD file
    output_path = os.path.join(output_directory, f"{action.name}.usdc")
    export_scene_as_usd(output_path, int(start_frame), int(end_frame))

    # Delete the temporary scene to free memory
    bpy.data.scenes.remove(new_scene)

print("Export completed.")
I have also been able to successfully export rigging armatures as a single Skeleton, with each "bone" getting imported into Reality Composer Pro 2.0 when exporting/importing manually. I would like to have all of these animations available in a single scene to be used in a RealityView in visionOS, so I have placed all animation models in an RCP scene and created named Timeline Action animations for each, showing the correct model and hiding the rest when triggering specific animations. I apply materials/textures to each so they appear the same, using Shader Graph. Then in SwiftUI I use notifications (as shown here - https://forums.developer.apple.com/forums/thread/756978) to trigger each RCP Timeline Action animation from code. Two questions: Is there a better way than having multiple models of the same skeleton, each with a different animation, in a scene in order to trigger multiple animations (see the sketch after these questions)? Or would this require recreating the Blender animations using skeleton rigging and keyframes from within RCP Timelines?
If I want to programmatically create custom animations and move parts of the skeleton/armatures, do I need to do this by defining custom components in RCP, using IKRig and defining the movement of each of the "bones" in Xcode? I'm looking for any tips/tricks/workflow from experienced engineers or 3D artists that can create a more efficient/optimized workflow using Blender, USD, RCP 2 and visionOS 2 with SwiftUI. Thanks so much, I appreciate any help! I am very excited about all the new tools that keep evolving to make spatial apps really fun to build!
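On question 1, a possible alternative (my sketch, not an established best practice) is to keep a single skeleton entity in the scene and load the per-animation USDZ exports only for their AnimationResources, playing them on that one model by name. This assumes every export shares the same joint hierarchy so the animations can retarget onto one model; the file names are placeholders:
import RealityKit

// Sketch: build a library of animations from the per-animation USD exports.
func loadAnimationLibrary(named fileNames: [String]) async throws -> [String: AnimationResource] {
    var library: [String: AnimationResource] = [:]
    for name in fileNames {
        // e.g. "Walk", "Run", each exported by the Blender script above.
        let entity = try await Entity(named: name, in: Bundle.main)
        if let animation = entity.availableAnimations.first {
            library[name] = animation
        }
    }
    return library
}

// Sketch: play one of the loaded animations on the single skeleton entity.
func play(_ animationName: String,
          from library: [String: AnimationResource],
          on skeleton: Entity) {
    guard let animation = library[animationName] else { return }
    skeleton.playAnimation(animation.repeat(), transitionDuration: 0.3)
}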
4
2
1.1k
Apr ’25
Object Tracking with RealityView
When I wanted to load the Reality Composer Pro scene containing Object Tracking, I tried the following code:
RealityView { content in
    if let model = try? await Entity(named: "Scene", in: realityKitContentBundle) {
        content.add(model)
    }
}
Obviously, this is wrong; we need to add some configuration that enables Object Tracking for the RealityView. What do we need to add? Note: I have seen https://developer.apple.com/videos/play/wwdc2024/10101/, but I don't know much about it.
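For what it's worth, the session linked above pairs the RCP object anchor with an ARKitSession running an ObjectTrackingProvider. A rough sketch of that setup, assuming a .referenceobject file trained in Create ML is bundled with the app ("MyObject" is a placeholder name) and the code runs inside an ImmersiveSpace:
import ARKit
import RealityKit

// Sketch: start object tracking so the Reality Composer Pro scene's
// object anchor has something to attach to at runtime.
func startObjectTracking() async throws {
    guard let url = Bundle.main.url(forResource: "MyObject",
                                    withExtension: "referenceobject") else { return }
    let referenceObject = try await ReferenceObject(from: url)
    let provider = ObjectTrackingProvider(referenceObjects: [referenceObject])

    let session = ARKitSession()
    try await session.run([provider])

    // React to the tracked object being detected, moved, or lost.
    for await update in provider.anchorUpdates {
        print("Object anchor:", update.anchor.id, update.event)
    }
}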
3
1
1.1k
Sep ’24
Animations exported from Blender do not show in Reality Composer Pro
I made an animation in Blender using geometry nodes and exported it to a USDC file (then I used Reality Converter to convert it to USDZ). I can see the animation when previewing the file in the Finder, but it does not play after importing into RCP. Any idea how I can play the animation? Or can the animation be accessed through Xcode? Thanks!
4
0
1k
Apr ’25
How can I simultaneously apply the drag gesture to multiple entities?
I wanted to drag EntityA while also dragging EntityB independently. I've tried to separate them by entity, but it only recognizes the latest drag gesture:
RealityView { content, attachments in
    ...
}
.gesture(
    DragGesture()
        .targetedToEntity(EntityA)
        .onChanged { value in ... }
)
.gesture(
    DragGesture()
        .targetedToEntity(EntityB)
        .onChanged { value in ... }
)
I also tried using simultaneously(with:), but that didn't work either; maybe I'm missing something:
.gesture(
    DragGesture()
        .targetedToEntity(EntityA)
        .onChanged { value in ... }
        .simultaneously(with:
            DragGesture()
                .targetedToEntity(EntityB)
                .onChanged { value in ... }
        )
)
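An alternative worth trying (a sketch, not confirmed as the fix in this thread): use a single gesture targeted to any entity and branch on which entity was hit, so both drags go through the same recognizer. The view and entity setup here are placeholders:
import RealityKit
import SwiftUI

// Sketch: one drag gesture for all entities; value.entity reports which one is being dragged.
struct DraggableSceneView: View {
    var body: some View {
        RealityView { content in
            // ... add EntityA and EntityB to content here ...
        }
        .gesture(
            DragGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    // Move whichever entity the drag started on.
                    // Assumes the entity has a parent to convert coordinates into.
                    guard let parent = value.entity.parent else { return }
                    value.entity.position = value.convert(value.location3D,
                                                          from: .local,
                                                          to: parent)
                }
        )
    }
}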
1
1
617
Mar ’25
Animated USD file | Starting position of the USD is not the same as first frame of animation
Hello all, I am building for visionOS with another engineer and using Reality Composer Pro to validate USD files. The starting position of my animated USDZ, its position when it's first loaded, is not the same as the first frame of the animation in the USDZ file. For testing, I am using the AR Quick Look asset 'toy_biplane_idle.usdz', which demonstrates the same 'error' we're currently getting with our own USDZ files. When the USDZ is loaded, it is on the ground plane, but when the animation is played, the plane 'snaps' to the position of the first frame of the animation. This 'snapping' behavior is giving us problems: we want the user to see the plane in its static 'load' position with the option to play the animation, but we don't want it to snap when the user presses play. Is it possible to load the .usdz in the position specified by the first frame of the animation? What is the best way to fix this issue? Thanks!
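One hedged workaround (my suggestion, not from the thread): play the animation immediately on load but start it paused, which may leave the entity posed at or near the first frame before the user ever presses play. A minimal sketch, assuming the animation is exposed via availableAnimations:
import RealityKit

// Sketch: try to make the loaded entity rest in the animation's first-frame pose.
// Resume the returned controller when the user taps play.
func prepareFirstFramePose(for entity: Entity) -> AnimationPlaybackController? {
    guard let animation = entity.availableAnimations.first else { return nil }
    return entity.playAnimation(animation, transitionDuration: 0, startsPaused: true)
}

// Later, when the user presses play:
// controller?.resume()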
1
0
874
Sep ’24
Exporting .reality files from Reality Composer Pro
I've been using the macOS Xcode Reality Composer to export interactive .reality files that can be hosted on the web and linked to, triggering Quick Look to open the interactive AR experience. That works really well. I've just downloaded the Xcode 15 beta, which ships with the new Reality Composer Pro, and I can't see a way to export .reality files anymore. It seems this is only for building content that ships in native iOS etc. apps, rather than content that can be viewed in Quick Look. Am I missing something, or is it no longer possible to export .reality files? Thanks.
3
2
1.9k
Jul ’25
Apple's Choice: USDZ over Other 3D File Formats like GLTF
Hello Dev Community, I've been thinking over Apple's preference for USDZ for AR and 3D content, especially given the widely used GLTF. I'm keen to discuss and hear your insights on this choice. USDZ, backed by Apple, has seen a surge in the AR community. It boasts advantages like compactness, animation support, and ARKit compatibility. In contrast, GLTF is also a popular format with its own merits, like being an open standard and offering flexibility. Here are some of my questions regarding the use of USDZ: Why did Apple choose USDZ over other 3D file formats like GLTF? What benefits does USDZ bring to Apple's AR and 3D content ecosystem? Are there any limitations of USDZ compared to other file formats? Could factors like compatibility, security, or integration ease have influenced Apple's decision? I would love to hear your thoughts on this. Feel free to share any experiences with USDZ or other 3D file formats within Apple's ecosystem!
4
1
6.0k
Oct ’24