RealityKit


Simulate and render 3D content for use in your augmented reality apps using RealityKit.


Posts under RealityKit subtopic


Updated Object Capture -- needs LiDAR?
I have two apps released -- ReefScan and ReefBuild -- that are based on the WWDC21 sample photogrammetry apps for iOS and macOS. Those run fine without LiDAR and are used mostly for underwater models, where LiDAR does not work at all. It now appears that the updated photogrammetry session requires LiDAR data, and building my app with the current Xcode results in a non-working app. Has the "old" version of the photogrammetry session been broken by this update? It worked very well previously, so I would hate to see this regression to needing LiDAR. Most of my users do not have that.
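One thing worth checking before shipping a build: PhotogrammetrySession.isSupported reports whether the current device/OS combination can run photogrammetry at all. A minimal sketch, where the input folder path is a placeholder:

```swift
import RealityKit

// Minimal availability probe: guard on isSupported before constructing
// a session. The folder URL here is hypothetical.
func makeSessionIfSupported() throws -> PhotogrammetrySession? {
    guard PhotogrammetrySession.isSupported else {
        print("Photogrammetry is not supported on this device/OS.")
        return nil
    }
    let imagesFolder = URL(fileURLWithPath: "/path/to/images", isDirectory: true)
    var configuration = PhotogrammetrySession.Configuration()
    configuration.featureSensitivity = .normal
    return try PhotogrammetrySession(input: imagesFolder, configuration: configuration)
}
```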
Replies: 3 · Boosts: 0 · Views: 535 · Activity: Mar ’25

Object Capture Not Working – Blank Screen Displayed
Hello everyone, Since last night, the Object Capture feature in my app has stopped working. Whenever I try to use it, a blank screen is displayed instead of the expected functionality. I’ve also tested several other apps that rely on Object Capture, and they are experiencing the same issue. This makes me think it might not be a problem specific to my device or app. I’ve already tried restarting my device and ensuring all apps are up to date, but the issue persists. Does anyone have more information about this issue? If so, is there any update on when it might be resolved? Thank you in advance for your help! Best regards
Replies: 1 · Boosts: 0 · Views: 512 · Activity: Dec ’24

Recursively searching the realityKitContentBundle
Hi, hopefully someone can share some ideas on how to accomplish this. I know we can load models from realityKitContentBundle like:

```swift
let model = try? await Entity(named: "testModel", in: realityKitContentBundle)
```

But this is in the root of RealityKitContent.rkassets; if I have the models in some subfolder, then I have to add the complete path like:

```swift
let model = try? await Entity(named: "/superModels/testModel", in: realityKitContentBundle)
```

What I want is to be able to search recursively in all folders for that file, as I have several subfolders with different models. Any suggestion? Thanks in advance. Guillermo
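RealityKit does not appear to expose a way to enumerate the contents of an .rkassets bundle at runtime, so one pragmatic sketch is to try a fixed list of candidate subfolders until a load succeeds. The folder names below are placeholders, and RealityKitContent is assumed to be the usual Swift package from the visionOS template:

```swift
import RealityKit
import RealityKitContent

// Sketch: probe a known list of subfolders for the named entity.
// Replace the folder list with the folders in your .rkassets.
func entitySearchingSubfolders(named name: String) async -> Entity? {
    let folders = ["", "/superModels", "/props", "/characters"]
    for folder in folders {
        let path = folder.isEmpty ? name : "\(folder)/\(name)"
        if let entity = try? await Entity(named: path, in: realityKitContentBundle) {
            return entity
        }
    }
    return nil
}
```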
Replies: 3 · Boosts: 0 · Views: 687 · Activity: Dec ’24

BillboardComponent causing Model Entity tap recognition issues on iOS 18
Hi, when I attach a BillboardComponent to my anchor entities, I am no longer able to retrieve the tapped entity, because the collision shapes of the entity are messed up by it always being oriented towards the camera. And the collision shapes are not updated: if I tap anywhere that is not my model entity, I get a hit out of nowhere. I tried updating the collision shapes of the entity every frame:

```swift
for child in existingPassport.mainEntity!.children {
    child.generateCollisionShapes(recursive: true)
}
```

However, nothing comes of it, and it is not a smart solution in the first place, because it is too heavy to recreate the shapes every frame. I am using the usual ARView controller, which works fine when I comment out the BillboardComponent line:

```swift
private func setupTapRecognizer() {
    let tapRecognizer = UITapGestureRecognizer(target: self, action: #selector(handleTap))
    arView.addGestureRecognizer(tapRecognizer)
}

@objc func handleTap(_ recognizer: UITapGestureRecognizer) {
    print("handle tap URL 1")
    let location = recognizer.location(in: arView)
    if let entity = arView.entity(at: location) {
        print("handle tap URL 2")
        // Assuming each entity has a URL stored in a component
        if let urlComponent = entity.components[URLComponent.self] {
            webViewPresenter?.presentFullScreenWebView(url: urlComponent.url)
            print("handle tap URL: \(urlComponent.url)")
        }
    }
}
```

How should we tackle this issue on iOS 18? Thanks!
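One workaround sketch (not a confirmed fix): keep the CollisionComponent on a plain parent entity and attach the BillboardComponent only to the visual child, so the shape used for hit-testing never rotates with the billboard. Note that arView.entity(at:) would then return the parent proxy, so any URL component lookup has to move there too.

```swift
import RealityKit

// Sketch: separate the hit-test shape from the billboarded visuals.
// `modelEntity` is assumed to currently carry both the billboard and
// the collision shape.
func wrapForStableHitTesting(_ modelEntity: ModelEntity) -> Entity {
    let hitProxy = Entity()
    hitProxy.position = modelEntity.position

    // Collision lives on the non-billboarded parent.
    let bounds = modelEntity.visualBounds(relativeTo: nil)
    hitProxy.components.set(CollisionComponent(shapes: [.generateBox(size: bounds.extents)]))

    // Billboarding only affects the visual child.
    modelEntity.position = .zero
    modelEntity.components.remove(CollisionComponent.self)
    modelEntity.components.set(BillboardComponent())
    hitProxy.addChild(modelEntity)
    return hitProxy
}
```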
Replies: 1 · Boosts: 0 · Views: 691 · Activity: Dec ’24

How to clip ModelEntity
I am trying to model something similar to an odometer in RealityKit, where 3D numbers scroll up or down, as they increase or decrease, within a container entity. Is there a way for an Entity to clip its children so that anything that extends beyond its dimensions is not rendered?
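As far as I know, RealityKit has no built-in clipping of an entity's children, so a coarse fallback is to disable digits once they scroll outside the container's bounds. A sketch, assuming the digits are direct children moving along the container's local Y axis (true per-pixel clipping would need a custom material or shader-graph approach):

```swift
import RealityKit

// Sketch: crude "clipping" by disabling children whose centers leave
// the container's vertical extent. This hides whole digits rather
// than cutting them mid-glyph.
func hideChildrenOutsideBounds(of container: Entity, halfHeight: Float) {
    for child in container.children {
        child.isEnabled = abs(child.position.y) <= halfHeight
    }
}
```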
Replies: 1 · Boosts: 0 · Views: 514 · Activity: Dec ’24

Comparing colors of two ModelEntities
I want to compare the colors of two model entities (spheres). How can I do it? The method I'm currently trying to apply is as follows:

```swift
if case let .color(controlColor) = controlMaterial.baseColor, controlColor == .green {
    // Flip target sphere colour
    if let targetMaterial = targetsphere.model?.materials.first as? SimpleMaterial,
       case let .color(targetColor) = targetMaterial.baseColor, targetColor == .blue {
        targetsphere.model?.materials = [SimpleMaterial(color: .green, isMetallic: false)] // Change to |1⟩
    } else {
        targetsphere.model?.materials = [SimpleMaterial(color: .blue, isMetallic: false)] // Change to |0⟩
    }
}
```

This method (baseColor) was deprecated in iOS 15.0 in favour of 'color', but I cannot compare the color values to each other. 👾
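On iOS 15 and later, SimpleMaterial exposes its base color through the color property, whose tint is a plain UIColor that can be compared directly. A minimal sketch:

```swift
import RealityKit
import UIKit

// Sketch: compare the tints of two spheres' SimpleMaterials.
// UIColor equality is exact, so this works best when both tints were
// assigned from the same named colors (e.g. .green, .blue).
func tintsMatch(_ a: ModelEntity, _ b: ModelEntity) -> Bool {
    guard let materialA = a.model?.materials.first as? SimpleMaterial,
          let materialB = b.model?.materials.first as? SimpleMaterial else {
        return false
    }
    return materialA.color.tint == materialB.color.tint
}
```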
Replies: 1 · Boosts: 0 · Views: 630 · Activity: Jan ’25

RealityKit Character Skeleton animation weapons are not animating
Hello, I am experiencing an issue with a character animation using RealityKit. I have a file created in Blender that contains the rigged character, a sword, and a shield. The sword and the shield have bones connected to the character's hands so they can follow the character's animation. When I run the animation in Blender and preview the exported USDZ file on Mac, I can see the sword and shield attached to the hands and the animation is fine. But when I add the USDZ model in RealityKit and play the animation, only the character animates; the sword and the shield do not move at all. This is the code I use to animate the character:

```swift
private func loadAnimations() {
    let unifiedAnimations = children[0].availableAnimations.first!.definition
    let animationResource = try! AnimationResource.generate(with: unifiedAnimations)
    self.children[0].playAnimation(animationResource.repeat())
}
```

Here is a link with a demo Xcode project, the Blender file, and a video: https://www.dropbox.com/scl/fi/ypq2iwxc5f9dwzjggsvin/AppleTest.zip?rlkey=wiag3rg44urhjdh2wal8cdx2u&st=vbpf7x11&dl=0

STEPS TO REPRODUCE
I have created a demo project that displays the character on a horizontal surface, and the animation starts playing.
1. Run the app. The ARView will be set up, and a yellow square will appear in the middle of the screen.
2. When a horizontal surface is detected, the yellow square will change, indicating that the surface was found.
3. Tap on the screen to load the USDZ model and position it at the yellow square's location.
The animation will start playing and you can see that the character is animating, but the sword and shield remain still. Thanks
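One diagnostic worth trying (a sketch, not a confirmed fix): if the exporter placed the sword and shield animations on separate entities in the USDZ hierarchy, playing only the first clip on children[0] would leave them still. Starting every animation found on every descendant can narrow this down:

```swift
import RealityKit

// Sketch: recursively start every animation found anywhere in the
// hierarchy, instead of only the first clip on children[0].
func playAllAnimations(in entity: Entity) {
    for animation in entity.availableAnimations {
        entity.playAnimation(animation.repeat())
    }
    for child in entity.children {
        playAllAnimations(in: child)
    }
}
```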
Replies: 1 · Boosts: 0 · Views: 522 · Activity: Jan ’25

USDZ File Crashes in QuickLook on iPad 9th Gen (iPadOS 18.3) – Urgent Help Needed
Hi everyone, I’m experiencing a critical issue with USDZ files created in Reality Composer on an iPad 9th Generation (iPadOS 18.3). The files work perfectly on iPads from the 10th Generation onwards and on iPad Pros. However, on older devices like the iPad 9th Generation and older iPhones, QuickLook (file preview) crashes when opening them. This is a major issue because these USDZ files are part of an exhibition where artworks are extended with AR elements via a web page. If some visitors cannot view the 3D content, it significantly impacts the experience. What’s puzzling is that two years ago, we exported USDZ files from Reality Composer, made them available via a website, and they worked flawlessly on all devices, including older iPads and iPhones. Now, with the latest iPadOS, they consistently crash on older devices. Has anyone encountered a similar issue? Are there known limitations with QuickLook on older devices, or is there a way to optimize the USDZ files to prevent crashes? Could this be related to changes in iPadOS or RealityKit? Any advice or workaround would be greatly appreciated! Thanks in advance!
Replies: 1 · Boosts: 0 · Views: 578 · Activity: Feb ’25

Animating a RealityComposerPro shader's uniform input value
I'm trying to build a shader in Reality Composer Pro that updates from a start time. Initially I tried the following: the idea was that when startTime was 0, the output would be 0, but then I would set startTime from within code, it would be compared with the current GPU time, and the difference used to drive another part of the shader graph:

```swift
if let testEntity = root.findEntity(named: "Test"),
   var shaderGraphMaterial = testEntity.components[ModelComponent.self]?.materials.first as? ShaderGraphMaterial {
    let time = CFAbsoluteTimeGetCurrent()
    try! shaderGraphMaterial.setParameter(name: "StartTime", value: .float(Float(time)))
    testEntity.components[ModelComponent.self]?.materials[0] = shaderGraphMaterial
}
```

However, I haven't found a reference to the time the shader would be using. So now I am trying to write an EntityAction to achieve the same effect. Instead of comparing a start time to the GPU's time, I'm trying to animate one of the shader's uniform inputs. However, I'm not sure how to specify the bind target. Here's my attempt so far:

```swift
import RealityKit

struct ShaderAction: EntityAction {
    let startValue: Float
    let targetValue: Float

    var animatedValueType: (any AnimatableData.Type)? { Float.self }

    static func registerEntityAction() {
        ShaderAction.subscribe(to: .updated) { event in
            guard let animationState = event.animationState else { return }
            let value = simd_mix(event.action.startValue, event.action.targetValue, Float(animationState.normalizedTime))
            animationState.storeAnimatedValue(value)
        }
    }
}

extension Entity {
    func updateShader(from startValue: Float, to targetValue: Float, duration: Double) {
        let fadeAction = ShaderAction(startValue: startValue, targetValue: targetValue)
        if let shaderAnimation = try? AnimationResource.makeActionAnimation(for: fadeAction, duration: duration, bindTarget: .material(0).customValue) {
            playAnimation(shaderAnimation)
        }
    }
}
```

Currently when I run this I get an assertion failure: 'Index out of range (operator[]:line 797) index = 260, max = 8'. Furthermore, even if it didn't crash, I don't understand how to pass a binding to the custom shader value "startValue". Any clues on how to achieve this effect, even via a completely different approach, would be appreciated.
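An alternative sketch that sidesteps the animation system entirely: register a custom System that pushes elapsed time into the shader-graph material once per frame. ShaderClockComponent and the "ElapsedTime" parameter name are assumptions, not API from the post, and both the component and the system would need registering at app startup (ShaderClockComponent.registerComponent(), ShaderClockSystem.registerSystem()):

```swift
import Foundation
import RealityKit

// Hypothetical marker component for entities whose shader-graph
// material should receive an elapsed-time parameter.
struct ShaderClockComponent: Component {
    var startTime: TimeInterval = CFAbsoluteTimeGetCurrent()
}

// Sketch: writes elapsed seconds into an assumed promoted shader
// input named "ElapsedTime" once per frame.
struct ShaderClockSystem: System {
    static let query = EntityQuery(where: .has(ShaderClockComponent.self))

    init(scene: Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.entities(matching: Self.query, updatingSystemWhen: .rendering) {
            guard let clock = entity.components[ShaderClockComponent.self],
                  var model = entity.components[ModelComponent.self],
                  var material = model.materials.first as? ShaderGraphMaterial else { continue }
            let elapsed = Float(CFAbsoluteTimeGetCurrent() - clock.startTime)
            try? material.setParameter(name: "ElapsedTime", value: .float(elapsed))
            model.materials[0] = material
            entity.components.set(model)
        }
    }
}
```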
Replies: 1 · Boosts: 0 · Views: 588 · Activity: Feb ’25

alternative for CustomShader in visionOS
Following the post at https://developer.apple.com/documentation/realitykit/custommaterial, it's simple to use a shader for materials and get uniforms and params for each vertex. However, it's not available on visionOS. Is there any alternative to use in this case? I want to write a shader to fill the material myself. (I have shader experience from the web and am familiar with fragment shaders.)
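On visionOS the usual replacement for CustomMaterial is a ShaderGraphMaterial authored in Reality Composer Pro and loaded from the content bundle. A minimal sketch, where the material path and scene file name are placeholders:

```swift
import RealityKit
import RealityKitContent

// Sketch: load a shader-graph material authored in Reality Composer Pro.
// "/Root/MyMaterial" and "Scene.usda" are hypothetical names from a
// RealityKitContent package.
func loadGraphMaterial() async throws -> ShaderGraphMaterial {
    try await ShaderGraphMaterial(named: "/Root/MyMaterial",
                                  from: "Scene.usda",
                                  in: realityKitContentBundle)
}
```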
Replies: 1 · Boosts: 0 · Views: 444 · Activity: Feb ’25

Issue with FBX import -- models are ultra small
When importing FBX animations (generated by Cinema 4D or Blender), the models come in very far away and cannot be resized or zoomed in on. I have tried every setting in both programs to no avail. Is there a secret to providing the right export options? When importing without animations and rigging, the model imports fine and at the correct size. But once motion is included, something is awry. I also tried changing the base units in Converter, but it did not work. I have attached my model hierarchy in C4D as well as the imported result. It appears the animation is imported, as I can see it move, but I can barely see it :)
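A workaround sketch (it does not fix the export itself): measure the loaded entity's visual bounds and rescale it so its largest dimension matches a target size in meters:

```swift
import RealityKit

// Sketch: normalize a loaded model so its largest dimension becomes
// `targetSize` meters, compensating for unit mismatches in FBX exports.
func normalizeScale(of entity: Entity, targetSize: Float = 1.0) {
    let extents = entity.visualBounds(relativeTo: nil).extents
    let largest = max(extents.x, max(extents.y, extents.z))
    guard largest > 0 else { return }
    entity.scale *= SIMD3<Float>(repeating: targetSize / largest)
}
```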
Replies: 3 · Boosts: 0 · Views: 541 · Activity: Feb ’25

EntityAction for MaterialBaseTint - incorrect colours
Hello, I'm writing an EntityAction that animates a material base tint between two different colours. However, the colour actually being set differs in RGB values from the one requested. For example, trying to set an end target of R0.5, G0.5, B0.5 results in a value of R0.735357, G0.735357, B0.735357. I can also see during the animation cycle that the intermediate actual tint values are incorrect versus those being set. My understanding is that the values of the material base colour are passed as a SIMD4. Therefore I have a couple of helper extensions to convert a UIColor into this format and mix between two colours. Note, however, I don't think the issue is with these functions: even if their outputs were wrong, the final value of the base tint doesn't match the value being set. Could this be a colour space issue?

```swift
import simd
import RealityKit
import UIKit

typealias Float4 = SIMD4<Float>

extension Float4 {
    func mixedWith(_ value: Float4, by mix: Float) -> Float4 {
        Float4(
            simd_mix(x, value.x, mix),
            simd_mix(y, value.y, mix),
            simd_mix(z, value.z, mix),
            simd_mix(w, value.w, mix)
        )
    }
}

extension UIColor {
    var float4: Float4 {
        var r: CGFloat = 0.0
        var g: CGFloat = 0.0
        var b: CGFloat = 0.0
        var a: CGFloat = 0.0
        getRed(&r, green: &g, blue: &b, alpha: &a)
        return Float4(Float(r), Float(g), Float(b), Float(a))
    }
}

struct ColourAction: EntityAction {
    let startColour: SIMD4<Float>
    let targetColour: SIMD4<Float>

    var animatedValueType: (any AnimatableData.Type)? { SIMD4<Float>.self }

    init(startColour: UIColor, targetColour: UIColor) {
        self.startColour = startColour.float4
        self.targetColour = targetColour.float4
    }

    static func registerEntityAction() {
        ColourAction.subscribe(to: .updated) { event in
            guard let animationState = event.animationState else { return }
            let interpolatedColour = event.action.startColour.mixedWith(event.action.targetColour, by: Float(animationState.normalizedTime))
            animationState.storeAnimatedValue(interpolatedColour)
        }
    }
}

extension Entity {
    func updateColour(from currentColour: UIColor, to targetColour: UIColor, duration: Double) {
        let colourAction = ColourAction(startColour: currentColour, targetColour: targetColour)
        if let colourAnimation = try? AnimationResource.makeActionAnimation(for: colourAction, duration: duration, bindTarget: .material(0).baseColorTint) {
            playAnimation(colourAnimation)
        }
    }
}
```

The EntityAction can only be applied to an entity with a ModelComponent (because of the material), so it can be called like so:

```swift
guard let modelComponent = entity.components[ModelComponent.self],
      let material = modelComponent.materials.first as? PhysicallyBasedMaterial else { return }
let currentColour = material.baseColor.tint
let targetColour = UIColor(red: 0.5, green: 0.5, blue: 0.5, alpha: 1.0)
entity.updateColour(from: currentColour, to: targetColour, duration: 2)
```
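For what it's worth, 0.735357 is almost exactly what you get by sRGB-encoding a linear value of 0.5, which supports the colour-space theory: the tint appears to be stored or interpolated in linear space, while UIColor components are sRGB-encoded. A sketch of the standard sRGB transfer functions (not API from the post) for converting channels before mixing:

```swift
import Foundation

// Standard sRGB electro-optical transfer function: converts an
// sRGB-encoded channel (as read from UIColor) to linear light.
func srgbToLinear(_ c: Float) -> Float {
    c <= 0.04045 ? c / 12.92 : powf((c + 0.055) / 1.055, 2.4)
}

// The inverse, for checking what a linear value looks like after
// sRGB encoding: linearToSRGB(0.5) ≈ 0.7354, matching the report.
func linearToSRGB(_ c: Float) -> Float {
    c <= 0.0031308 ? c * 12.92 : 1.055 * powf(c, 1.0 / 2.4) - 0.055
}
```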
Replies: 1 · Boosts: 0 · Views: 557 · Activity: Feb ’25

RealityKit particleEmitter delay starting when toggling isEmitting
I have a scene built in Reality Composer Pro, in which I've added a ParticleEmitter with isEmitting set to false and Loop set to true. In my app, when I toggle isEmitting to true, there can be a delay of a few seconds before the ParticleEmitter starts. However, if I programmatically add the emitter in code at that point, it starts immediately. To be clear, I'm seeing this in the visionOS simulator; I don't have access to a device at this time. Am I misunderstanding how to control the ParticleEmitter when I need precise control over when it starts?
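A workaround sketch (unverified, but based on the observation that adding the emitter in code starts it immediately): replace the whole ParticleEmitterComponent at start time instead of mutating isEmitting in place:

```swift
import RealityKit

// Sketch: restart emission by removing and re-adding the component
// rather than toggling isEmitting on the existing one.
func startEmitting(_ entity: Entity) {
    guard var emitter = entity.components[ParticleEmitterComponent.self] else { return }
    entity.components.remove(ParticleEmitterComponent.self)
    emitter.isEmitting = true
    entity.components.set(emitter)
}
```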
Replies: 1 · Boosts: 0 · Views: 533 · Activity: Feb ’25

Photogrammetry requiring LiDAR-capable phones, curious why
Hello! I'm currently building an app where I feed images into a photogrammetry session to create a USDZ. Pretty straightforward, works great. We've recently started some testing on older devices, and have discovered that photogrammetry requires devices that have LiDAR (we've seen some console logs referencing LiDAR if we stumble through a photogrammetry process without checking isSupported first). Judging from @swredcam's post about ReefScan from November '24 (https://developer.apple.com/forums/thread/769221), it looks like photogrammetry did work on those non-LiDAR devices. In my own testing on an iPhone 12 mini with iOS 17, PhotogrammetrySession says it's not supported. Since we're only feeding in a sequence of photos that have never had depth data, and they process fine on Pro/Max devices, we're curious why this would require a LiDAR sensor to work, when it seems like it did work without LiDAR in the past. Or is there some other limitation of non-Pro devices that is causing photogrammetry to not be supported (especially on today's really powerful hardware)? Thanks! ++md
Replies: 4 · Boosts: 0 · Views: 567 · Activity: Mar ’25

Gestures not working correctly when setting the fov orientation to .horizontal
Hi there, I've discovered an issue with gesture handling in RealityKit when setting the camera’s fieldOfViewOrientation to horizontal. For instance, if I render a simple box at the center of the view with a collision shape that exactly matches its dimensions, the actual hit area behaves as if it's smaller than the box. Additionally, when attempting to drag the box away from the center, the hit area appears misaligned—offset slightly towards the center. Since the default fieldOfViewOrientation is vertical and everything works as expected in that mode, it seems that the gesture system might be assuming a vertical FOV. Given that the API allows setting it to horizontal, perhaps gestures should function correctly regardless of the orientation? Thank you!
Replies: 1 · Boosts: 0 · Views: 458 · Activity: Mar ’25

ModelEntity(named:in:) fails to load USD file from RealityKitContent bundle with misleading error?
My experience has been that ModelEntity(named:in:) can be used to load a USD file with a simple structure consisting of entities and model entities, and, critically, it will flatten the entity hierarchy down to a single ModelEntity, presumably reducing the number of draw calls. However, can anyone verify that the following is true? If ModelEntity(named:in:) is used to load a USD file from a RealityKit content bundle, it may fail when the USD file contains more complex data, such as shader-graph material definitions, or perhaps for some other reason; I am not sure. AND the error that ModelEntity(named:in:) throws in this case is Cannot load RealityKitContent entity: Failed to find resource with name "<name>" in bundle, which would literally suggest that the file does not exist, instead of what I assume the error actually is, which is "the file exists but its entity hierarchy could not be flattened to a single ModelEntity"? Is that an accurate description of the known behavior of ModelEntity(named:in:)? I understand that I could use Entity(named:in:) instead, without the flattening feature. My question is really more about the seemingly misleading error message. Thank you for any clarification you can provide.
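For diagnosis, a sketch that separates "file missing" from "cannot flatten": try ModelEntity(named:in:) first and fall back to Entity(named:in:). If the fallback succeeds, the resource clearly exists, and the thrown "failed to find resource" message was misleading:

```swift
import RealityKit
import RealityKitContent

// Sketch: fall back to the non-flattening loader to tell a missing
// file apart from a hierarchy that cannot be flattened.
func loadPreferringFlattened(named name: String) async -> Entity? {
    if let flattened = try? await ModelEntity(named: name, in: realityKitContentBundle) {
        return flattened
    }
    // If this succeeds, the resource exists; the ModelEntity error
    // about a missing resource was misleading.
    return try? await Entity(named: name, in: realityKitContentBundle)
}
```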
Replies: 2 · Boosts: 0 · Views: 149 · Activity: May ’25

RealityKit VideoMaterial renders pink on iOS 18
Our app is live, and it appears that since the iOS 18 update the VideoMaterial renders a pink / purple color instead of the video (picture attached). The audio is rendered properly. We found that it occurs on old devices: iPhone 11 and iPhone SE 2020. I've found this thread by Andy Jazz on Stack Overflow:

Steps to Reproduce:
1. Create a plane for the video screen.
2. Apply a VideoMaterial using AVPlayerItem.
3. Anchor the model entity to an ARImageAnchor.

Expected Outcome: The video should play as a material on the plane in RealityKit.
Actual Outcome: On iOS 18, the plane appears pink, indicating the VideoMaterial isn't applied.

What I've Tried:
- Verified the video URL is correct.
- Checked that the AVPlayerItem and VideoMaterial are initialised correctly.
- Ensured the AVPlayer is playing the video.

I also tried different formats (mov / mp4 / m4v) and verified that the video's status is readyToPlay. Any suggestions?
Replies: 1 · Boosts: 0 · Views: 189 · Activity: Jun ’25
