RealityKit


Simulate and render 3D content for use in your augmented reality apps using RealityKit.


Posts under RealityKit subtopic


RealityKit Blend Modes
I have two textured planes that I want to intersect [ –|– ] with an additive blend mode. Currently I get z-fighting where the planes cross, and I can't see how to set blend modes at all. I've done this before in Unity and Godot in a fairly straightforward manner. How do I accomplish this with RealityKit, preferably in code only (my scene is quite dynamic)? Do I need to write a shader manually? How can I stop the z-fighting? (One possible approach is sketched below.)
2 replies · 0 boosts · 736 views · Aug ’25
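A minimal sketch of the z-fighting half of this question, assuming an iOS/visionOS app: standard RealityKit materials expose only opaque and transparent blending (an additive mode is not exposed as far as I know, so true additive blending likely needs a CustomMaterial with a Metal surface shader). Switching the planes to transparent blending disables depth writes, which usually removes z-fighting at the intersection:

```swift
import RealityKit
import UIKit

// Sketch only: transparent blending as a z-fighting workaround.
func makeTransparentPlane(texture: TextureResource) -> ModelEntity {
    var material = UnlitMaterial()
    material.color = .init(tint: .white, texture: .init(texture))
    // Transparent blending disables depth writes, which removes the
    // z-fighting where the two planes intersect.
    material.blending = .transparent(opacity: .init(floatLiteral: 1.0))
    return ModelEntity(mesh: .generatePlane(width: 0.5, depth: 0.5),
                       materials: [material])
}
```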
RealityKit migration
Hey there, I’m currently planning to use RealityKit in a new multiplatform app I’m building. Unfortunately, I noticed that watchOS is not supported by RealityKit, while SceneKit is being deprecated. However, I’d like to maintain the same codebase across platforms. What are my options? (A conditional-compilation sketch follows below.)
2 replies · 0 boosts · 399 views · Oct ’25
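Not an official answer, but one common pattern is to isolate the renderer behind a single view and branch with conditional compilation, so the platform divergence lives in one file. A rough sketch, assuming deployment targets new enough for RealityView (iOS 18/macOS 15/visionOS) and with made-up asset names:

```swift
import SwiftUI
#if os(watchOS)
import SceneKit
#else
import RealityKit
#endif

// Hypothetical shared wrapper: the rest of the codebase renders 3D
// content only through this view.
struct Shared3DView: View {
    var body: some View {
        #if os(watchOS)
        // SceneKit is deprecated but remains the only first-party
        // 3D engine available on watchOS.
        SceneView(scene: SCNScene(named: "model.scn"))
        #else
        RealityView { content in
            if let model = try? await Entity(named: "model") {
                content.add(model)
            }
        }
        #endif
    }
}
```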
RealityView postProcess effect depth texture
Hello, a question re: iOS RealityView postProcess. I've got a working postProcess kernel and I'd like to add some depth-based effects to it. Theoretically I should be able to just do: encoder.setTexture(context.sourceDepthTexture, index: 1) and then in the kernel: texture2d<float, access::read> depthIn [[texture(1)]] ... outTexture.write(depthIn.read(gid), gid); But I consistently see all black rendered to the view. The postProcess shader itself works, so that's not the issue; it just seems not to be receiving actual depth information. (If I set a breakpoint at the encoder's setTexture step, I can preview the scene's color texture, but the context's depth texture looks all NaN / blank.) I've looked at all the WWDC samples, but they all use ARView for the depth sample code, and ARView has a different set of configuration options than RealityView. So far I haven't found anywhere to explicitly tell RealityView to include depth information, so I'm not sure if I'm missing something there. It appears that a depth texture is indeed being passed, but it looks blank. Is there a working example somewhere that we can reference?
2 replies · 0 boosts · 484 views · 2w
How to convert `DragGesture().onEnded`'s velocity CGSize to the SIMD3<Float> required in `PhysicsMotionComponent(linearVelocity, angularVelocity)`?
So if I drag an entity in RealityView I have to disable the PhysicsBodyComponent to make sure nothing fights dragging the entity around. This makes sense. When I finish a drag, this closure gets executed: .gesture( DragGesture() .targetedToAnyEntity() .onChanged { e in // ... } .onEnded { e in let velocity: CGSize = e.gestureValue.velocity } ) If I now re-add the PhysicsBodyComponent to the entity I just dragged with mode: .dynamic, it will lose all velocity and drop straight down through gravity. Instead the solution is to apply mode: .kinematic and also add a PhysicsMotionComponent to the entity, which should retain the velocity after letting go of the object. However, I need to instantiate it with PhysicsMotionComponent(linearVelocity: SIMD3<Float>, angularVelocity: SIMD3<Float>). How can I calculate the linearVelocity and angularVelocity when the e.gestureValue.velocity I get is just a CGSize? Is there another property of gestureValue I should be looking at? (A conversion sketch follows below.)
1 reply · 0 boosts · 811 views · Jan ’25
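A minimal sketch under stated assumptions: the gesture velocity is 2D screen-space points per second, so it needs a points-to-meters scale (the constant below is a made-up starting value to tune) and an axis flip, and it carries no depth information, so z stays 0:

```swift
import SwiftUI
import RealityKit

// Sketch: map 2D gesture velocity (points/sec) to a 3D linear velocity.
func motionComponent(for velocity: CGSize) -> PhysicsMotionComponent {
    let pointsToMeters: Float = 0.001 // hypothetical tuning constant
    let linear = SIMD3<Float>(
        Float(velocity.width) * pointsToMeters,
        Float(-velocity.height) * pointsToMeters, // screen y points down
        0                                         // no depth in a 2D velocity
    )
    return PhysicsMotionComponent(linearVelocity: linear, angularVelocity: .zero)
}

// Usage in the gesture:
// .onEnded { e in
//     e.entity.components.set(motionComponent(for: e.gestureValue.velocity))
// }
```

A more robust alternative is to record the entity's world position over the last few .onChanged calls and compute a 3D velocity by finite differences, which also yields a usable angular velocity if orientation is tracked the same way.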
How can I simultaneously apply the drag gesture to multiple entities?
I want to drag EntityA while also dragging EntityB independently. I've tried separating them by entity, but only the latest drag gesture is recognized: RealityView { content, attachments in ... } .gesture( DragGesture() .targetedToEntity(EntityA) .onChanged { value in ... } ) .gesture( DragGesture() .targetedToEntity(EntityB) .onChanged { value in ... } ) I also tried simultaneously(with:), but that didn't work either; maybe I'm missing something: .gesture( DragGesture() .targetedToEntity(EntityA) .onChanged { value in ... } .simultaneously(with: DragGesture() .targetedToEntity(EntityB) .onChanged { value in ... } ) ) (A single-gesture alternative is sketched below.)
1 reply · 1 boost · 650 views · Mar ’25
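One pattern that may help, sketched under the assumption that a single recognizer is acceptable: two .gesture modifiers on the same view compete with each other, whereas one drag targeted to any entity can branch on whichever entity was actually hit (entityA/entityB are placeholder names):

```swift
import SwiftUI
import RealityKit

struct TwoEntityDragView: View {
    let entityA: Entity
    let entityB: Entity

    var body: some View {
        RealityView { content in
            content.add(entityA)
            content.add(entityB)
        }
        .gesture(
            DragGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    // Branch on the hit entity instead of attaching
                    // one gesture per entity.
                    if value.entity === entityA {
                        // move entityA
                    } else if value.entity === entityB {
                        // move entityB
                    }
                }
        )
    }
}
```

Note this still tracks one drag at a time; genuinely concurrent two-finger drags on two entities would need a different route, such as lower-level touch handling.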
Disable Foveation for ImmersiveSpace?
Does anyone know how I can disable foveation for an ImmersiveSpace? I'm aware that I could use a CompositorLayer and my own Metal rendering to control foveation, but I'm hoping I can configure an existing/underlying LayerRenderer (or similar) to disable it for an immersive scene. If there's another approach I should be taking, any pointers are appreciated. Thank you! (A CompositorServices sketch follows below.)
1 reply · 1 boost · 609 views · Dec ’24
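As far as I can tell, a RealityKit-rendered ImmersiveSpace exposes no public foveation toggle, so the CompositorServices route the poster mentions appears to be the supported one; a minimal sketch:

```swift
import SwiftUI
import CompositorServices

// Foveation is opted out of at layer-configuration time.
struct NoFoveationConfiguration: CompositorLayerConfiguration {
    func makeConfiguration(capabilities: LayerRenderer.Capabilities,
                           configuration: inout LayerRenderer.Configuration) {
        configuration.isFoveationEnabled = false
    }
}

// Usage:
// ImmersiveSpace(id: "immersive") {
//     CompositorLayer(configuration: NoFoveationConfiguration()) { renderer in
//         // drive the Metal render loop here
//     }
// }
```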
EXC_BAD_ACCESS when removing IKComponent from Entity
I'm trying to position an Entity with inverse kinematics while dragging the handle only, but use forward kinematics (posed jointTransforms) otherwise. The System, Components, Gestures, and Rig all seem to work individually. My approach is to add the IKComponent when dragging starts on the handle and remove the IKComponent when it is released. The switch into IK works, but when removing the IKComponent the app crashes:
* thread #1, queue = 'com.apple.main-thread', stop reason = EXC_BAD_ACCESS (code=1, address=0x8)
  * frame #0: 0x00000001aa5bb188 CoreRE`(anonymous namespace)::IKComponentSolverWrapper::getSolver() + 60
    frame #1: 0x00000001aa5bafb0 CoreRE`re::internal::ikParametersNodeCallback(re::Slice<re::StringID>, re::Slice<re::RigDataValue>, re::Slice<re::StringID>, re::MutableSlice<re::RigDataValue>, void*) + 48
    frame #2: 0x00000001aa52d090 CoreRE`re::(anonymous namespace)::resolveEvaluationContextCallback(re::EvaluationContext&, void*) + 152
    frame #3: 0x00000001aa68c024 CoreRE`re::(anonymous namespace)::$_76::__invoke(re::Slice<unsigned long>, re::(anonymous namespace)::RegisterTable&) + 1080
    frame #4: 0x00000001aa678c94 CoreRE`re::EvaluationModelSingleThread::evaluate(re::EvaluationContextSlices&) + 1188
    frame #5: 0x00000001aa866984 CoreRE`re::SkeletalPoseRuntimeData::executeEvaluationTree() + 136
    frame #6: 0x00000001aadf37ec CoreRE`re::ecs2::SkeletalPoseComponent::calculateSkeletalPoseBufferWithRig(re::ecs2::MeshComponent*, re::ecs2::RigComponent*, re::ecs2::SkeletalPoseBufferComponent*) + 492
    frame #7: 0x00000001aadf4a84 CoreRE`re::ecs2::SkeletalPoseComponentStateImpl::processPreparingComponents(re::ecs2::System::UpdateContext const&, re::ecs2::BasicComponentStateSceneData<re::ecs2::SkeletalPoseComponent>*, re::ecs2::ComponentBuckets<re::ecs2::SkeletalPoseComponent>::BucketIteration, void*) + 268
    frame #8: 0x00000001aadf54b0 CoreRE`re::ecs2::SkeletalPoseSystem::update(re::ecs2::System::UpdateContext) const + 732
    frame #9: 0x00000001aaed3e54 CoreRE`re::internal::Callable<re::ecs2::ECSManager::configurePhaseECSSystems(re::Scheduler::ScheduleDescriptor&, re::ecs2::ECSSystemGroup, unsigned long)::$_1, void (float)>::operator()(float&&) const + 168
    frame #10: 0x00000001ab40eda4 CoreRE`re::Scheduler::executePhase(unsigned long) + 440
    frame #11: 0x00000001aa6a3b74 CoreRE`re::Engine::executePhase(re::FramePhase) + 144
    frame #12: 0x000000023173de9c RealitySystemSupport`RCPSharedSimulationExecuteUpdate + 64
    frame #13: 0x00000002276c9820 MRUIKit`__65-[MRUISharedSimulation _doJoinWithConnectionConfiguration:error:]_block_invoke.35 + 168
    frame #14: 0x00000002276c8530 MRUIKit`__addCAPreFenceHandler_block_invoke + 32
    frame #15: 0x000000018af22058 QuartzCore`CA::Transaction::run_commit_handlers(CATransactionPhase) + 112
    frame #16: 0x000000018aef2ad4 QuartzCore`CA::Context::commit_transaction(CA::Transaction*, double, double*) + 592
    frame #17: 0x000000018af21898 QuartzCore`CA::Transaction::commit() + 652
    frame #18: 0x000000018af22dac QuartzCore`CA::Transaction::flush_as_runloop_observer(bool) + 68
    frame #19: 0x0000000185a26820 UIKitCore`_UIApplicationFlushCATransaction + 48
    frame #20: 0x0000000184f97af0 UIKitCore`_UIUpdateSequenceRun + 76
    frame #21: 0x0000000185954290 UIKitCore`schedulerStepScheduledMainSection + 168
    frame #22: 0x00000001859536d8 UIKitCore`runloopSourceCallback + 80
    frame #23: 0x00000001804157fc CoreFoundation`__CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 24
    frame #24: 0x0000000180415744 CoreFoundation`__CFRunLoopDoSource0 + 172
    frame #25: 0x0000000180414eb0 CoreFoundation`__CFRunLoopDoSources0 + 232
    frame #26: 0x000000018040f454 CoreFoundation`__CFRunLoopRun + 788
    frame #27: 0x000000018040ecd4 CoreFoundation`CFRunLoopRunSpecific + 552
    frame #28: 0x0000000190104b70 GraphicsServices`GSEventRunModal + 160
    frame #29: 0x0000000185a27e30 UIKitCore`-[UIApplication _run] + 796
    frame #30: 0x0000000185a2c058 UIKitCore`UIApplicationMain + 124
    frame #31: 0x00000001d29558b4 SwiftUI`closure #1 (Swift.UnsafeMutablePointer<Swift.Optional<Swift.UnsafeMutablePointer<Swift.Int8>>>) -> Swift.Never in SwiftUI.KitRendererCommon(Swift.AnyObject.Type) -> Swift.Never + 164
    frame #32: 0x00000001d29555dc SwiftUI`SwiftUI.runApp<τ_0_0 where τ_0_0: SwiftUI.App>(τ_0_0) -> Swift.Never + 84
    frame #33: 0x00000001d265ecdc SwiftUI`static SwiftUI.App.main() -> () + 164
    frame #34: 0x000000010303f1c4 Playground.debug.dylib`static PlaygroundApp.$main() at <compiler-generated>:0
    frame #35: 0x000000010303f290 Playground.debug.dylib`main at PlaygroundApp.swift:7:8
    frame #36: 0x0000000102f6d410 dyld_sim`start_sim + 20
    frame #37: 0x000000010312e274 dyld`start + 2840
Is there a workaround or another way to switch between IK and FK?
1 reply · 0 boosts · 445 views · Dec ’24
Object Capture Not Working – Blank Screen Displayed
Hello everyone, Since last night, the Object Capture feature in my app has stopped working. Whenever I try to use it, a blank screen is displayed instead of the expected functionality. I’ve also tested several other apps that rely on Object Capture, and they are experiencing the same issue. This makes me think it might not be a problem specific to my device or app. I’ve already tried restarting my device and ensuring all apps are up to date, but the issue persists. Does anyone have more information about this issue? If so, is there any update on when it might be resolved? Thank you in advance for your help! Best regards
1 reply · 0 boosts · 512 views · Dec ’24
Issues importing Tiled Image Material X shader node into Reality Composer Pro
Hey, I am having issues getting my MaterialX shaders, authored in Houdini, to work properly in Reality Composer Pro. The shader is very simple: it starts with a tiled image node whose output is written to the diffuse color of the preview surface node. This node is called mtxltileimage2. When I create a tiled image node in RCP and configure it with the same parameter values, the texture shows up correctly. This node is called TiledImage. One difference I can identify is that the second node has a grey icon whereas the first node has a blue icon; could this be related to the issue? Here is the USD viewer output for the two variants of the tiled image node. Any pointers, corrections to my misconceptions, and help would be greatly appreciated. My goal is to author these shaders in Houdini and import them into RCP; I'm trying to figure out the right pipeline for this workflow.
1 reply · 0 boosts · 670 views · Dec ’24
BillboardComponent causing Model Entity tap recognition issues on iOS 18
Hi, when I attach a BillboardComponent to my anchor entities, I can no longer retrieve the tapped entity, because the collision shapes fall out of sync as the entity constantly reorients toward the camera. The collision shapes clearly aren't updated: if I tap somewhere that is not my model entity, I still get a hit out of nowhere. I tried regenerating the collision shapes of the entity every frame: for child in existingPassport.mainEntity!.children { child.generateCollisionShapes(recursive: true) } However, nothing came of it, and it isn't a smart solution in the first place because recreating the shapes every frame is too heavy. I am using the usual AR view controller setup, which works just fine when I comment out the BillboardComponent line: private func setupTapRecognizer() { let tapRecognizer = UITapGestureRecognizer(target: self, action: #selector(handleTap)) arView.addGestureRecognizer(tapRecognizer) } @objc func handleTap(_ recognizer: UITapGestureRecognizer) { print("handle tap URL 1") let location = recognizer.location(in: arView) if let entity = arView.entity(at: location) { print("handle tap URL 2") // Assuming each entity has a URL stored in a component if let urlComponent = entity.components[URLComponent.self] { webViewPresenter?.presentFullScreenWebView(url: urlComponent.url) print("handle tap URL: \(urlComponent.url)") } } } How should we tackle this issue on iOS 18? Thanks! (A possible workaround is sketched below.)
1 reply · 0 boosts · 691 views · Dec ’24
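A possible workaround, sketched as an untested assumption for iOS 18: keep the CollisionComponent on a parent entity that never billboards, and attach the BillboardComponent only to the visual child, so the hit-test shape stays fixed while the model rotates toward the camera:

```swift
import RealityKit

// Sketch: stationary collision proxy wrapping a billboarded visual child.
func makeTappableBillboard(model: ModelEntity) -> Entity {
    let tapTarget = Entity()
    // Static collision volume sized to the model's bounds.
    let bounds = model.visualBounds(relativeTo: nil)
    tapTarget.components.set(CollisionComponent(
        shapes: [.generateBox(size: bounds.extents)]))

    // Only the child orients toward the camera.
    model.components.set(BillboardComponent())
    tapTarget.addChild(model)
    return tapTarget
}
```

arView.entity(at:) should then return the stationary parent, so the URLComponent lookup would move to that entity (or walk its children).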
RealityKit fails with EXC_BAD_ACCESS at CMClockGetAnchorTime in the simulator
Starting with iOS 18.0 beta 1, I've noticed that RealityKit frequently crashes in the simulator when an app launches and presents an ARView. I was able to create a small sample app with repro steps that demonstrates the issue, and I've submitted feedback: FB16144085 I've included a crash log with the feedback. If possible, I'd appreciate it if an Apple engineer could investigate and suggest a workaround. It's awkward to be restricted to the iOS 17 simulator, which does not exhibit this behavior. Please let me know if there's anything I can do to help. Thank you.
1 reply · 0 boosts · 604 views · Apr ’25
How to clip ModelEntity
I am trying to model something similar to an odometer in RealityKit, where 3D numbers scroll up or down as they increase or decrease, within a container entity. Is there a way for an Entity to clip its children so that anything extending beyond its dimensions is not rendered? (A portal-based idea is sketched below.)
1 reply · 0 boosts · 514 views · Dec ’24
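There is no general clip-children API in RealityKit that I know of, but on visionOS a portal can serve as an odometer "window", since content inside a WorldComponent hierarchy renders only through the portal mesh. A minimal sketch (on iOS, a CustomMaterial that discards fragments outside the container would be the likely equivalent):

```swift
import RealityKit

// Sketch, visionOS only: clip scrolling digits behind a portal plane.
func makeOdometerWindow(digits: Entity) -> Entity {
    let world = Entity()
    world.components.set(WorldComponent())
    world.addChild(digits) // digits render only through the portal

    let window = Entity()
    window.components.set(ModelComponent(
        mesh: .generatePlane(width: 0.2, height: 0.1),
        materials: [PortalMaterial()]))
    window.components.set(PortalComponent(target: world))

    let root = Entity()
    root.addChild(world)
    root.addChild(window)
    return root
}
```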
RealityView session not terminating when leaving view on iOS
Everything works fine, except that when tapping the navigation Back link and returning to the previous view, the AR session inside RealityView does not terminate. The green camera-indicator dot stays on, the session is still scanning the environment, and if the package has audio in it, the audio will still play, albeit panned hard to the right channel. I have no such issues terminating QuickLook or ARSCNView. I have a simple NavigationLink opening the RealityView... NavigationLink(destination: MyRealityView()) { Text("Open AR") } struct MyRealityView : View { var body: some View { RealityView { content in // Create horizontal plane anchor for the content let anchor = AnchorEntity(.plane(.horizontal, classification: .floor, minimumBounds: [0.5,0.5])) let scene = await loadEntity(named: "Scene") // Add model to anchor anchor.addChild(scene!) content.add(anchor) // View Settings content.camera = .spatialTracking } placeholder: { ProgressView() } .onDisappear { //print("RealityView is disappearing. Cleanup actions here.") } .edgesIgnoringSafeArea(.all) // Activate onTap from Reality Composer Pro .gesture(TapGesture().targetedToAnyEntity().onEnded { value in _ = value.entity.applyTapForBehaviors() }) }} I have experimented with several ways of trying to close it and can't figure it out. I have tried State variables and custom Back buttons. I also tried working with pause(), but I can't get that to function either. Anyone else have this issue or know of a solution? What am I missing? (One thing to try is sketched below.)
1 reply · 1 boost · 540 views · Jan ’25
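A sketch of one thing worth trying, assuming iOS 18's SpatialTrackingSession API: own the tracking session explicitly instead of relying solely on content.camera = .spatialTracking, then stop it when the view disappears. Whether stop() fully releases the camera here is an assumption, not a confirmed fix:

```swift
import SwiftUI
import RealityKit

struct MyRealityView: View {
    @State private var session = SpatialTrackingSession()

    var body: some View {
        RealityView { content in
            let config = SpatialTrackingSession.Configuration(tracking: [.plane])
            _ = await session.run(config)
            content.camera = .spatialTracking
            // ... add anchors and models here ...
        }
        .onDisappear {
            session.stop() // release the capture pipeline on Back
        }
    }
}
```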
Entity cross multiple portals at once?
If I have one portal on the ceiling and one on the floor, can a tall Entity cross multiple portals at once? Will the opposing portal directions cause it to fail? No matter what I try for the crossingMode and clippingMode of the PortalComponent I can only get it to fully work for one portal at a time. I have tried flipping the normals for the crossingMode and clippingMode planes. I have also tried creating a ceiling portal plane with inverted normals. It seems like whatever Entity is passing through a portal has one portal it wants to deal with at a time and that's it. My other option is to create portals using occlusion but I prefer the simplest way.
1 reply · 0 boosts · 484 views · Jan ’25
Comparing colors of two ModelEntities
I want to compare the colors of two model entities (spheres). How can I do it? The method I'm currently trying to apply is as follows: case let .color(controlColor) = controlMaterial.baseColor, controlColor == .green { // Flip target sphere colour if let targetMaterial = targetsphere.model?.materials.first as? SimpleMaterial, case let .color(targetColor) = targetMaterial.baseColor, targetColor == .blue { targetsphere.model?.materials = [SimpleMaterial(color: .green, isMetallic: false)] // Change to |1⟩ } else { targetsphere.model?.materials = [SimpleMaterial(color: .blue, isMetallic: false)] // Change to |0⟩ } } The baseColor property was deprecated in iOS 15.0 in favor of 'color', but I cannot compare the color values to each other. 👾 (A component-based alternative is sketched below.)
1 reply · 0 boosts · 630 views · Jan ’25
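An alternative that sidesteps color comparison entirely, sketched with a hypothetical QubitState component: keep the logical |0⟩/|1⟩ state as data and derive the material color from it, so materials never need to be read back:

```swift
import RealityKit

// Hypothetical component holding the logical state.
struct QubitState: Component {
    var isOne: Bool
}
// Call QubitState.registerComponent() once at startup.

func flipTargetIfControlIsOne(control: ModelEntity, target: ModelEntity) {
    guard control.components[QubitState.self]?.isOne == true else { return }
    var state = target.components[QubitState.self] ?? QubitState(isOne: false)
    state.isOne.toggle()
    target.components.set(state)
    // The material is derived from state, never the source of truth.
    let color: SimpleMaterial.Color = state.isOne ? .green : .blue // |1⟩ green, |0⟩ blue
    target.model?.materials = [SimpleMaterial(color: color, isMetallic: false)]
}
```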
RealityKit Character Skeleton animation weapons are not animating
Hello, I am experiencing an issue with a character animation in RealityKit. I have a file created in Blender that contains the rigged character, a sword, and a shield. The sword and the shield have bones connected to the character's hands so they can follow the character's animation. When I run the animation in Blender and preview the exported USDZ file on Mac, I can see the sword and shield attached to the hands, and the animation is fine. But when I add the USDZ model in RealityKit and play the animation, only the character animates; the sword and the shield don't move at all. This is the code I use to animate the character: private func loadAnimations() { let unifiedAnimations = children[0].availableAnimations.first!.definition let animationResource = try! AnimationResource.generate(with: unifiedAnimations) self.children[0].playAnimation( animationResource.repeat() ) } Here is a link with a demo Xcode project, the Blender file, and a video: https://www.dropbox.com/scl/fi/ypq2iwxc5f9dwzjggsvin/AppleTest.zip?rlkey=wiag3rg44urhjdh2wal8cdx2u&st=vbpf7x11&dl=0 STEPS TO REPRODUCE I have created a demo project that displays the character on a horizontal surface, and the animation starts playing. Run the app: the ARView will be set up, and a yellow square will appear in the middle of the screen. When a horizontal surface is detected, the yellow square changes to indicate that the surface was found. Tap the screen to load the USDZ model and position it at the yellow square's position. The animation starts playing, and you can see that the character animates but the sword and shield remain still. Thanks! (A diagnostic sketch follows below.)
1 reply · 0 boosts · 522 views · Jan ’25
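A quick diagnostic worth trying, under the assumption that Blender exported the sword and shield as separate skeletons with their own animation tracks: walk the whole hierarchy and play every animation each entity owns, instead of only the first animation of children[0]:

```swift
import RealityKit

// Sketch: play every animation found anywhere in the hierarchy.
func playAllAnimations(in root: Entity) {
    var stack: [Entity] = [root]
    while let entity = stack.popLast() {
        for animation in entity.availableAnimations {
            entity.playAnimation(animation.repeat())
        }
        stack.append(contentsOf: entity.children)
    }
}
```

If the weapons start moving with this, the fix is likely on the export side: binding everything to one armature so a single unified animation track drives the whole model.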
USDZ File Crashes in QuickLook on iPad 9th Gen (iPadOS 18.3) – Urgent Help Needed
Hi everyone, I’m experiencing a critical issue with USDZ files created in Reality Composer on an iPad 9th Generation (iPadOS 18.3). The files work perfectly on iPads from the 10th Generation onwards and on iPad Pros. However, on older devices like the iPad 9th Generation and older iPhones, QuickLook (file preview) crashes when opening them. This is a major issue because these USDZ files are part of an exhibition where artworks are extended with AR elements via a web page. If some visitors cannot view the 3D content, it significantly impacts the experience. What’s puzzling is that two years ago, we exported USDZ files from Reality Composer, made them available via a website, and they worked flawlessly on all devices, including older iPads and iPhones. Now, with the latest iPadOS, they consistently crash on older devices. Has anyone encountered a similar issue? Are there known limitations with QuickLook on older devices, or is there a way to optimize the USDZ files to prevent crashes? Could this be related to changes in iPadOS or RealityKit? Any advice or workaround would be greatly appreciated! Thanks in advance!
1 reply · 0 boosts · 578 views · Feb ’25
Animating a RealityComposerPro shader's uniform input value
I'm trying to build a shader in Reality Composer Pro that updates from a start time. Initially I tried the following: the idea was that when startTime was 0 the output would be 0, but then I would set startTime from code, it would be compared with the current GPU time, and the difference would drive another part of the shader graph: if let testEntity = root.findEntity(named: "Test"), var shaderGraphMaterial = testEntity.components[ModelComponent.self]?.materials.first as? ShaderGraphMaterial { let time = CFAbsoluteTimeGetCurrent() try! shaderGraphMaterial.setParameter(name: "StartTime", value: .float(Float(time))) testEntity.components[ModelComponent.self]?.materials[0] = shaderGraphMaterial } However, I haven't found a reference to the time the shader would be using. So now I am trying to write an EntityAction to achieve the same effect: instead of comparing a start time to the GPU's time, I'm trying to animate one of the shader's uniform inputs. However, I'm not sure how to specify the bind target. Here's my attempt so far: import RealityKit struct ShaderAction: EntityAction { let startValue: Float let targetValue: Float var animatedValueType: (any AnimatableData.Type)? { Float.self } static func registerEntityAction() { ShaderAction.subscribe(to: .updated) { event in guard let animationState = event.animationState else { return } let value = simd_mix(event.action.startValue, event.action.targetValue, Float(animationState.normalizedTime)) animationState.storeAnimatedValue(value) } } } extension Entity { func updateShader(from startValue: Float, to targetValue: Float, duration: Double) { let fadeAction = ShaderAction(startValue: startValue, targetValue: targetValue) if let shaderAnimation = try? AnimationResource.makeActionAnimation(for: fadeAction, duration: duration, bindTarget: .material(0).customValue) { playAnimation(shaderAnimation) } } } Currently when I run this I get an assertion failure: 'Index out of range (operator[]:line 797) index = 260, max = 8'. Furthermore, even if it didn't crash, I don't understand how to pass a binding to the custom shader value "startValue". Any clues on how to achieve this effect would be appreciated, even if it's a completely different way. (One alternative is sketched below.)
1 reply · 0 boosts · 588 views · Feb ’25
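A different route that avoids EntityAction bind targets entirely, sketched with an assumed "Progress" parameter name (match it to the graph's actual input): a RealityKit System that advances the uniform every frame through setParameter:

```swift
import RealityKit

// Sketch: drive a shader-graph uniform from a per-frame System.
final class ShaderClockSystem: System {
    static let query = EntityQuery(where: .has(ModelComponent.self))
    private var elapsed: TimeInterval = 0

    init(scene: RealityKit.Scene) {}

    func update(context: SceneUpdateContext) {
        elapsed += context.deltaTime
        for entity in context.entities(matching: Self.query,
                                       updatingSystemWhen: .rendering) {
            guard var model = entity.components[ModelComponent.self],
                  var material = model.materials.first as? ShaderGraphMaterial
            else { continue }
            try? material.setParameter(name: "Progress",
                                       value: .float(Float(elapsed)))
            model.materials[0] = material
            entity.components.set(model)
        }
    }
}
```

Register once at launch with ShaderClockSystem.registerSystem(); in practice, a marker component in the query would avoid touching every model each frame.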
ARMeshAnchor Data with RealityView
I want to use SwiftUI and RealityView to get AR scene-understanding data (ARMeshAnchor) on iOS devices with LiDAR. The only way we can do that is by using ARSession (unless there is another way). However, in previous iOS 18 builds there was this function: https://developer.apple.com/documentation/realitykit/spatialtrackingsession/run(_:session:arconfiguration:), which worked with a SpatialTrackingSession and a custom ARSession together. This function has since been removed from the RealityKit framework in the latest iOS and Xcode, but it is still in the documentation. I also want ARFaceAnchor data, which I still cannot get without ARSession; the closest I can get is by using: let target = AnchoringComponent.Target.face let anchoringComponent = AnchoringComponent(target, trackingMode: .predicted) entity = Entity() entity!.components.set(anchoringComponent) But I still can't find a way to get the current frame (ARFrame) or the anchors ([ARAnchor]) in the view. Alternatively, if I use this function: https://developer.apple.com/documentation/realitykit/spatialtrackingsession/run(_:) and start the ARSession separately, the session (didUpdate and didAdd) only runs for a few frames before getting interrupted. And if I completely remove the SpatialTrackingConfiguration and just run the ARSession, there is still a valid tracked entity for the AnchoringComponent.Target.face component if the ARSession's configuration is ARWorldTrackingConfiguration with face tracking, and I still get updated facial data each frame; but the ARSession didUpdate and didAdd functions don't get called past the first few frames. Interestingly, if I switch the RealityViewCameraContent.RealityViewCamera to .virtual, I get ARMeshAnchor and ARFaceAnchor data but no camera feed (as expected), with or without the SpatialTrackingConfiguration. My overarching question: what is the proper way to access ARMeshAnchors and other ARAnchors created by the system, and track them live, while also using SwiftUI? A GitHub repo with a sample project can be found here: https://github.com/bpate75/RealityViewTesting (A headless-session sketch follows below.)
1 reply · 1 boost · 439 views · Feb ’25
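A sketch of the headless-ARSession route: run your own session for mesh anchors and mirror them into RealityKit entities yourself. As the post observes, this may conflict with RealityView's internal session, so treat it as a starting point rather than a confirmed fix:

```swift
import ARKit
import Combine

// Sketch: standalone ARSession delivering LiDAR mesh anchors.
final class MeshTracker: NSObject, ARSessionDelegate, ObservableObject {
    let session = ARSession()
    @Published var meshAnchors: [ARMeshAnchor] = []

    func start() {
        let config = ARWorldTrackingConfiguration()
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            config.sceneReconstruction = .mesh // requires LiDAR
        }
        session.delegate = self
        session.run(config)
    }

    // Called continuously as the LiDAR mesh refines.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        let meshes = anchors.compactMap { $0 as? ARMeshAnchor }
        if !meshes.isEmpty { meshAnchors = meshes }
    }
}
```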