RealityKit

Simulate and render 3D content for use in your augmented reality apps using RealityKit.

Posts under RealityKit tag

217 results found
Post not yet marked as solved
460 Views

Can't import GLTF or GLB in Reality Converter

I'm not sure what happened; I'm pretty sure I wasn't having issues importing these file types a few weeks ago. Just to be sure, I checked the webpage for Reality Converter and, yup, GLTF is listed: "Simply drag-and-drop common 3D file formats, such as .obj, .gltf and .usd". I haven't updated anything on my computer, so I don't know how anything could have changed. Anyone know what the deal could be?
Asked by dpow. Last updated.
Post not yet marked as solved
400 Views

Add a UIImage to a Plane in RealityKit

I'm trying to dynamically add a UIImage that lives in memory (created at runtime and changed over time) to a RealityKit scene. I see that TextureResource makes this easy when creating a material from an image file in my bundle, but I don't see any way to do this with just a UIImage. Is there another way to go about this that I'm just missing? Any help would be appreciated!
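One approach that may work here (an untested sketch, not a confirmed answer): pull the CGImage out of the UIImage and pass it to TextureResource.generate(from:options:), which accepts an in-memory CGImage rather than a file URL. The function name `makeMaterial` is my own.

```swift
import RealityKit
import UIKit

// Sketch: build a material from an in-memory UIImage, no bundle file needed.
// Assumes `uiImage` is the dynamically generated image from the question.
func makeMaterial(from uiImage: UIImage) throws -> UnlitMaterial {
    guard let cgImage = uiImage.cgImage else {
        throw NSError(domain: "TextureError", code: 0)
    }
    // TextureResource.generate(from:options:) takes a CGImage directly.
    let texture = try TextureResource.generate(
        from: cgImage,
        options: .init(semantic: .color)
    )
    var material = UnlitMaterial()
    material.color = .init(tint: .white, texture: .init(texture))
    return material
}
```

Since the image changes over time, you would regenerate the texture (or material) whenever the UIImage is updated and reassign it to the model's materials.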
Post marked as solved
75 Views

How can I disable viewport culling in RealityKit?

I'm using a geometry modifier to displace vertices well beyond their original position. When viewing this displaced mesh in an .ar RealityKit view, the mesh disappears when the mesh's pre-modified coordinates fall outside of the viewing frustum. There's usually only this mesh in the scene, so it is weird for the user to walk up to the mesh they see, only for it to go away. Is there a way to disable viewport culling in RealityKit or in my geometry modifier? Or, is it possible to set the culling to happen after the geometry modifier has displaced the mesh? (I looked at this answer, but it looked like that person was building custom geometry in a Metal view.)
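As far as I know there is no public RealityKit API to disable frustum culling. One workaround sketch (an assumption, not a documented technique): since culling tests the mesh's pre-modified bounds, inflate those bounds by appending a degenerate (zero-area) triangle at the extremes of the displacement range, so the bounds always enclose the displaced geometry. `maxDisplacement` and `padded` are hypothetical names.

```swift
import RealityKit

// Sketch: pad a MeshDescriptor's bounds with a zero-area triangle so the
// displaced mesh is never culled. `maxDisplacement` is the largest offset
// the geometry modifier can apply (an assumption you must supply).
func padded(_ descriptor: MeshDescriptor, maxDisplacement: Float) -> MeshDescriptor {
    var desc = descriptor
    var positions = desc.positions.elements
    let base = UInt32(positions.count)
    // Two far-apart corners plus a repeated vertex form an invisible triangle.
    positions.append(SIMD3<Float>(repeating: -maxDisplacement))
    positions.append(SIMD3<Float>(repeating: maxDisplacement))
    positions.append(SIMD3<Float>(repeating: maxDisplacement))
    desc.positions = .init(positions)
    if case .triangles(var indices)? = desc.primitives {
        indices.append(contentsOf: [base, base + 1, base + 2])
        desc.primitives = .triangles(indices)
    }
    return desc
}
```

Caveat: if the descriptor also carries normals or texture coordinates, those buffers must be padded to the same vertex count or mesh generation will fail.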
Post marked as solved
738 Views

ARWorldMap loading works differently on iOS 15

There is a difference in how ARWorldMap loading/relocalization works between iOS 14 and iOS 15 that is not mentioned in the iOS 15 release notes. Whenever you start an AR session by providing an initialWorldMap and relocalization succeeds, on iOS 14 the AR world's origin is automatically updated to match the previously saved world's origin; on iOS 15 it is not. In my case, I used RealityKit for rendering and found out about this by creating an AnchorEntity() (default identity transform) and adding it to my ARView's scene after relocalization. On iOS 14 it is placed where the previously saved session started, as expected, while on iOS 15 it is placed exactly where the current session started. This doesn't affect Apple's sample Saving and Loading World Data, because there the virtual object is restored in an ARSessionDelegate method where a previously saved ARAnchor is provided by the session. That is also the workaround I use right now: add an anchor before saving the world so you can use it later, e.g. to update the world's origin using setWorldOrigin(relativeTransform:). I'm sharing in case someone runs into this problem, and it'd be great if someone at Apple could confirm whether this is the intended behavior.
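The workaround described above could be sketched like this (the anchor name "origin" is my own choice, not from the post):

```swift
import ARKit

// Before archiving the world map, drop a named reference anchor at the
// current origin so it survives inside the ARWorldMap:
//     session.add(anchor: ARAnchor(name: "origin", transform: matrix_identity_float4x4))

// After relocalization on iOS 15, realign the session's origin when the
// saved anchor comes back through the delegate.
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for anchor in anchors where anchor.name == "origin" {
        // Shift the world origin so it matches the saved session's origin,
        // restoring the iOS 14 behavior.
        session.setWorldOrigin(relativeTransform: anchor.transform)
    }
}
```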
Post marked as solved
79 Views

polygon count vs. triangle count?

In a previous post I asked if 100,000 polygons is still the recommended size for USDZ Quick Look models on the web. (The answer is yes) But I realize my polygons are 4-sided but are not planar, so they have to be broken down into 2 triangles when rendered. Given that, should I shoot for 50,000 polygons (i.e., 100,000 triangles)? Or does the 100,000 polygon statistic already assume polygons will be subdivided into triangles? (The models are generated from digital terrain (GeoTIFF) data, not a 3D modeling tool)
Asked by Todd2. Last updated.
Post not yet marked as solved
109 Views

RealityKit - MeshAnchor with Custom Material problems

I'm trying to put a custom material on ARMeshAnchors, but during the session the FPS begins to drop and the following warning appears.

Link to a video example of the dropping FPS: https://www.dropbox.com/s/p7g7qgvb5o95gdf/RPReplay_Final1637641112.mov?dl=0

Console warning: ARSessionDelegate is retaining 11 ARFrames. This can lead to future camera frames being dropped

CameraViewController:

```swift
final class CameraViewController: UIViewController {
    private let arView = ARView().do {
        $0.automaticallyConfigureSession = false
    }
    private var meshAnchorTracker: MeshAnchorTracker?

    override func viewDidLoad() {
        super.viewDidLoad()
        MetalLibLoader.initializeMetal()
        setupSubviews()
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        setupARSession()
    }

    private func setupSubviews() {
        view.addSubview(arView)
        arView.frame = view.bounds
    }

    private func setupARSession() {
        configureWorldTracking()
        setupPhysicsOrigin()
    }

    private func configureWorldTracking() {
        let configuration = ARWorldTrackingConfiguration()
        let sceneReconstruction: ARWorldTrackingConfiguration.SceneReconstruction = .mesh
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(sceneReconstruction) {
            configuration.sceneReconstruction = sceneReconstruction
            meshAnchorTracker = .init(arView: arView)
        }
        configuration.planeDetection = .horizontal
        arView.session.run(configuration, options: [.resetSceneReconstruction])
        arView.renderOptions.insert(.disableMotionBlur)
        arView.session.delegate = self
    }

    private func setupPhysicsOrigin() {
        let physicsOrigin = Entity()
        physicsOrigin.scale = .init(repeating: 0.1)
        let anchor = AnchorEntity(world: SIMD3<Float>())
        anchor.addChild(physicsOrigin)
        arView.scene.addAnchor(anchor)
        arView.physicsOrigin = physicsOrigin
    }

    func updateAnchors(anchors: [ARAnchor]) {
        for anchor in anchors.compactMap({ $0 as? ARMeshAnchor }) {
            meshAnchorTracker?.update(anchor)
        }
    }
}

extension CameraViewController: ARSessionDelegate {
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        updateAnchors(anchors: anchors)
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        updateAnchors(anchors: anchors)
    }

    func session(_ session: ARSession, didRemove anchors: [ARAnchor]) {
        for anchor in anchors.compactMap({ $0 as? ARMeshAnchor }) {
            meshAnchorTracker?.remove(anchor)
        }
    }
}
```

MeshAnchorTracker:

```swift
struct MeshAnchorTracker {
    var entries: [ARMeshAnchor: Entry] = [:]
    weak var arView: ARView?

    init(arView: ARView) {
        self.arView = arView
    }

    class Entry {
        var entity: AnchorEntity
        private var currentTask: Cancellable?
        var nextTask: LoadRequest<MeshResource>? {
            didSet {
                scheduleNextTask()
            }
        }

        static let material: RealityKit.CustomMaterial = {
            let customMaterial: CustomMaterial
            let surfaceShader = CustomMaterial.SurfaceShader(named: "plasma", in: MetalLibLoader.library)
            do {
                try customMaterial = CustomMaterial(surfaceShader: surfaceShader, lightingModel: .lit)
            } catch {
                fatalError(error.localizedDescription)
            }
            return customMaterial
        }()

        func scheduleNextTask() {
            guard let task = nextTask else { return }
            guard currentTask == nil else { return }
            self.nextTask = nil
            currentTask = task
                .sink(
                    receiveCompletion: { result in
                        switch result {
                        case .failure(let error): assertionFailure("\(error)")
                        default: return
                        }
                    },
                    receiveValue: { [weak self] mesh in
                        self?.currentTask = nil
                        self?.entity.components[ModelComponent.self] = ModelComponent(
                            mesh: mesh,
                            materials: [Self.material]
                        )
                        self?.scheduleNextTask()
                    }
                )
        }

        init(entity: AnchorEntity) {
            self.entity = entity
        }
    }

    mutating func update(_ anchor: ARMeshAnchor) {
        let tracker: Entry = {
            if let tracker = entries[anchor] { return tracker }
            let entity = AnchorEntity(world: SIMD3<Float>())
            let tracker = Entry(entity: entity)
            entries[anchor] = tracker
            arView?.scene.addAnchor(entity)
            return tracker
        }()
        let entity = tracker.entity
        do {
            entity.transform = .init(matrix: anchor.transform)
            let geom = anchor.geometry
            var desc = MeshDescriptor()
            let posValues = geom.vertices.asSIMD3(ofType: Float.self)
            desc.positions = .init(posValues)
            let normalValues = geom.normals.asSIMD3(ofType: Float.self)
            desc.normals = .init(normalValues)
            do {
                desc.primitives = .polygons(
                    (0..<geom.faces.count).map { _ in UInt8(geom.faces.indexCountPerPrimitive) },
                    (0..<geom.faces.count * geom.faces.indexCountPerPrimitive).map {
                        geom.faces.buffer.contents()
                            .advanced(by: $0 * geom.faces.bytesPerIndex)
                            .assumingMemoryBound(to: UInt32.self).pointee
                    }
                )
            }
            tracker.nextTask = MeshResource.generateAsync(from: [desc])
        }
    }

    mutating func remove(_ anchor: ARMeshAnchor) {
        if let entry = self.entries[anchor] {
            entry.entity.removeFromParent()
            self.entries[anchor] = nil
        }
    }
}

extension ARGeometrySource {
    func asArray<T>(ofType: T.Type) -> [T] {
        dispatchPrecondition(condition: .onQueue(.main))
        assert(MemoryLayout<T>.stride == stride, "Invalid stride \(MemoryLayout<T>.stride); expected \(stride)")
        return (0..<self.count).map {
            buffer.contents().advanced(by: offset + stride * Int($0)).assumingMemoryBound(to: T.self).pointee
        }
    }

    func asSIMD3<T>(ofType: T.Type) -> [SIMD3<T>] {
        return asArray(ofType: (T, T, T).self).map { .init($0.0, $0.1, $0.2) }
    }
}
```

What could the problem be? 🥵
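One thing worth trying (an assumption, not a confirmed fix): ARKit can deliver didUpdate for mesh anchors many times per second, and each call in the post rebuilds a full MeshDescriptor on the main thread, which keeps the delegate busy and ARFrames alive. Throttling rebuilds per anchor keeps the delegate fast. The names `lastRebuild` and `shouldRebuild` are hypothetical.

```swift
import Foundation
import ARKit

// Sketch: skip mesh rebuilds that arrive within `minInterval` seconds of
// the previous rebuild for the same anchor.
var lastRebuild: [UUID: TimeInterval] = [:]

func shouldRebuild(_ anchor: ARAnchor, minInterval: TimeInterval = 0.5) -> Bool {
    let now = ProcessInfo.processInfo.systemUptime
    if let last = lastRebuild[anchor.identifier], now - last < minInterval {
        return false  // too soon; drop this update
    }
    lastRebuild[anchor.identifier] = now
    return true
}
```

Usage would be a `guard shouldRebuild(anchor) else { return }` at the top of `update(_:)`, so most delegate callbacks return immediately and the retained-ARFrames warning has less opportunity to appear.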
Post not yet marked as solved
75 Views

Precompile/prewarm shaders to avoid jank

Hi, is there a way to force RealityKit to compile/prewarm and cache all shaders that will be used within a scene in advance, ideally in the background? This would be useful for adding complex models to the scene, which can sometimes cause quite a few dropped frames even on the newest devices (at least I assume the initial delay when displaying them is caused by shader compilation), but also for CustomMaterials. Note this also happens with models that are loaded asynchronously. Thanks!
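I'm not aware of a public prewarm API; one workaround sketch (built on the assumption that shaders are compiled the first time a material is rendered) is to briefly render every material on a tiny placeholder entity before the real content appears. `prewarm` is a hypothetical helper name.

```swift
import RealityKit
import UIKit

// Sketch: render each material once on a near-invisible box so its shaders
// get compiled and cached before the real models are shown.
func prewarm(materials: [Material], in arView: ARView) {
    let warmupAnchor = AnchorEntity(world: .zero)
    for material in materials {
        let mesh = MeshResource.generateBox(size: 0.001) // effectively invisible
        let entity = ModelEntity(mesh: mesh, materials: [material])
        warmupAnchor.addChild(entity)
    }
    arView.scene.addAnchor(warmupAnchor)
    // Remove the placeholder after a few frames, once the shaders are cached.
    DispatchQueue.main.asyncAfter(deadline: .now() + 0.5) {
        warmupAnchor.removeFromParent()
    }
}
```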
Post not yet marked as solved
103 Views

What causes »ARSessionDelegate is retaining X ARFrames« console warning?

Hi, since iOS 15 I've repeatedly noticed the console warning »ARSessionDelegate is retaining X ARFrames. This can lead to future camera frames being dropped«, even for rather simple projects using RealityKit and ARKit. Could someone from the ARKit team please elaborate on what causes this warning and what can be done to avoid it? If I remember correctly, I didn't even assign an ARSessionDelegate. Thank you!
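For context, the warning generally means ARFrame instances are being kept alive (each one pins a camera capture buffer, and the pool is small). A common accidental cause is stashing the frame itself; copying out only the values you need avoids it. A minimal sketch of the pattern:

```swift
import ARKit

final class SessionDelegate: NSObject, ARSessionDelegate {
    // Bad: `var lastFrame: ARFrame?` would retain the capture buffer and
    // can trigger the "retaining X ARFrames" warning.
    private var lastCameraTransform: simd_float4x4?

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Copy value types out and let `frame` go out of scope immediately.
        lastCameraTransform = frame.camera.transform
    }
}
```

The same applies to closures that capture a frame (or `session.currentFrame`) and outlive the callback, e.g. async image-processing work.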
Post marked as solved
78 Views

Is it possible to set a CALayer as the contents of a RealityKit material?

Is it possible to set a CALayer as the contents of a RealityKit material? Currently this is possible with SceneKit materials. I am wondering if there is something similar for RealityKit. https://developer.apple.com/documentation/scenekit/scnmaterialproperty/1395372-contents
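As far as I know RealityKit has no direct CALayer equivalent of SceneKit's `contents`; materials take a TextureResource. A workaround sketch (the helper name `texture(from:)` is my own): rasterize the layer to a CGImage and build a texture from that, re-running whenever the layer changes.

```swift
import RealityKit
import UIKit

// Sketch: turn a CALayer into a RealityKit texture by rasterizing it.
func texture(from layer: CALayer) throws -> TextureResource {
    let renderer = UIGraphicsImageRenderer(bounds: layer.bounds)
    let image = renderer.image { context in
        layer.render(in: context.cgContext)  // draw the layer into the bitmap
    }
    guard let cgImage = image.cgImage else {
        throw NSError(domain: "LayerTexture", code: 0)
    }
    return try TextureResource.generate(from: cgImage, options: .init(semantic: .color))
}
```

Unlike SceneKit's live `contents`, this is a snapshot, so animated layers would need periodic re-rasterization.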
Post not yet marked as solved
80 Views

Reality Converter - Unlit Material

Hi, I'm struggling to find a way to get a simple unlit material working with Reality Composer. With all the standard objects, it doesn't seem like we have the option of an unlit material, which seems really odd to me; this is kind of a basic feature. The only way I was able to get an unlit material inside Reality Converter was to import a mesh without a material, which gave me a white unlit material. I have seen that you can set an unlit material using RealityKit, but from what I saw, RealityKit builds an app at the end, right? Honestly, I'm not sure what you get when creating an AR app using RealityKit... What I'm looking for is a simple .reality file to be displayed on the web.
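For reference, the RealityKit API side of this is a one-liner; the limitation described above is in the Reality Composer/Converter tooling, not the framework. A minimal sketch:

```swift
import RealityKit
import UIKit

// UnlitMaterial ignores scene lighting entirely.
let sphere = ModelEntity(
    mesh: .generateSphere(radius: 0.1),
    materials: [UnlitMaterial(color: .systemRed)]
)
```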
Post marked as solved
63 Views

How can I add a UIView to the material of a ModelEntity?

How can I show a UIView on a ModelEntity? On Android, I use ViewRenderable to load a view and attach it to a Node to show the view in the AR scene. But I haven't found any way to show a view in the AR scene of an ARView using RealityKit. Can anyone help me? Thanks very much!
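RealityKit has no ViewRenderable equivalent, so a common workaround (sketched here under assumptions; `makeViewPlane` and the points-to-meters scale are my own) is to snapshot the UIView into an image, build a texture, and put it on a plane sized like the view.

```swift
import RealityKit
import UIKit

// Sketch: render a UIView snapshot onto a textured plane entity.
func makeViewPlane(from view: UIView) throws -> ModelEntity {
    let renderer = UIGraphicsImageRenderer(bounds: view.bounds)
    let image = renderer.image { _ in
        view.drawHierarchy(in: view.bounds, afterScreenUpdates: true)
    }
    guard let cgImage = image.cgImage else {
        throw NSError(domain: "Snapshot", code: 0)
    }
    let texture = try TextureResource.generate(from: cgImage, options: .init(semantic: .color))
    var material = UnlitMaterial()
    material.color = .init(tint: .white, texture: .init(texture))
    // Assumed scale: 1 point ≈ 1 mm in world space; adjust to taste.
    let plane = MeshResource.generatePlane(
        width: Float(view.bounds.width) * 0.001,
        height: Float(view.bounds.height) * 0.001
    )
    return ModelEntity(mesh: plane, materials: [material])
}
```

The snapshot is static; an interactive or animating view would need to be re-snapshotted on change.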
Post not yet marked as solved
200 Views

Create MTLLibrary from raw String for use within RealityKit

Hello, I have a use case where I need to download and compile Metal shaders on demand, as strings or .metal files. These should then be used for CustomMaterials and/or postprocessing within RealityKit. Essentially this boils down to having raw source code that needs to be compiled at runtime. My plan was to use the method makeLibrary(source:options:completionHandler:) to accomplish this. The problem is that I get the following error during compilation:

RealityKitARExperienceAssetProvider: An error occured while trying to compile shader library »testShaderLibrary« - Error Domain=MTLLibraryErrorDomain Code=3 "program_source:2:10: fatal error: 'RealityKit/RealityKit.h' file not found #include <RealityKit/RealityKit.h>

My code for creating the library looks like this (simplified example):

```swift
let librarySourceString: String = """
#include <metal_stdlib>
#include <RealityKit/RealityKit.h>

using namespace metal;

[[visible]]
void mySurfaceShader(realitykit::surface_parameters params)
{
    params.surface().set_base_color(half3(1, 1, 1));
}
"""

mtlDevice.makeLibrary(source: librarySourceString, options: nil) { library, error in
    if let error = error {
        dump(error)
        return
    }
    // do something with library
}
```

So I'm wondering if there's a way to tell the Metal compiler how to resolve this reference to the RealityKit header file. Would I need to replace that part of the source string, maybe with an absolute path to the RealityKit framework (if so, how would I get this at runtime)? Appreciate any hints - thanks!
Post not yet marked as solved
197 Views

Remove the focus point when the camera sees an ARImageAnchor and show it again when it doesn't

I'm trying to show a focus point entity while the camera doesn't see an ARImageAnchor, remove it once the camera sees the ARImageAnchor, and show it again when the camera loses the anchor. I used arView.session.delegate, but the delegate method may only be called once; I'm not sure. How can I make this work? Thank you.

```swift
var focusEntity: FocusEntity! // (FocusEntity: Entity, HasAnchoring)

override func viewDidLoad() {
    super.viewDidLoad()
    // ...
    focusEntity = FocusEntity(on: arView, style: .classic(color: .systemGray4))
    arView.session.delegate = self
}

extension CameraViewController: ARSessionDelegate {
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors {
            if let imageAnchor = anchor as? ARImageAnchor {
                focusEntity.destroy()
                focusEntity = nil
                // ... Obtain entity to image anchor
            }
        }
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // ... ???
    }
}
```
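One possible direction (a sketch, assuming the third-party FocusEntity type used in the post): instead of destroying the focus entity in didAdd, keep it around and toggle its visibility from the anchors variant of didUpdate, which ARKit calls whenever an anchor's tracking state changes.

```swift
import ARKit

// Sketch: hide the focus point while the image anchor is tracked,
// show it again when tracking is lost.
extension CameraViewController {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for anchor in anchors {
            guard let imageAnchor = anchor as? ARImageAnchor else { continue }
            // isTracked flips as the image enters/leaves the camera's view.
            focusEntity.isEnabled = !imageAnchor.isTracked
        }
    }
}
```

Note that ARImageTrackingConfiguration (or detectionImages with maximumNumberOfTrackedImages > 0) is required for isTracked to update continuously.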
Post not yet marked as solved
93 Views

RealityKit - ARImageAnchor with VideoMaterial problems

When I move the camera closer, sometimes the image from the ARResources group overlaps the video with itself. What could the problem be?

Links:
https://www.dropbox.com/s/b8yaczq4xjk9v1p/IMG_9429.PNG?dl=0
https://www.dropbox.com/s/59dj4ldf6l3yj4u/RPReplay_Final1637392988.mov?dl=0

VideoEntity class:

```swift
final class VideoEntity {
    var videoPlayer = AVPlayer()

    func videoModelEntity(width: Float?, height: Float?) -> ModelEntity {
        let plane = MeshResource.generatePlane(width: width ?? Float(), height: height ?? Float())
        let videoItem = createVideoItem(with: "Cooperation")
        let videoMaterial = createVideoMaterial(with: videoItem)
        return ModelEntity(mesh: plane, materials: [videoMaterial])
    }

    func placeVideoScreen(videoEntity: ModelEntity, imageAnchor: ARImageAnchor, arView: ARView) {
        let anchorEntity = AnchorEntity(anchor: imageAnchor)
        let rotationAngle = simd_quatf(angle: GLKMathDegreesToRadians(-90), axis: SIMD3<Float>(x: 1, y: 0, z: 0))
        videoEntity.setOrientation(rotationAngle, relativeTo: anchorEntity)
        videoEntity.setPosition(SIMD3<Float>(x: 0, y: 0.015, z: 0), relativeTo: anchorEntity)
        anchorEntity.addChild(videoEntity)
        arView.scene.addAnchor(anchorEntity)
    }

    private func createVideoItem(with filename: String) -> AVPlayerItem {
        guard let url = Bundle.main.url(forResource: filename, withExtension: "mov") else {
            fatalError("Fatal Error: - No file source.")
        }
        return AVPlayerItem(url: url)
    }

    private func createVideoMaterial(with videoItem: AVPlayerItem) -> VideoMaterial {
        let videoMaterial = VideoMaterial(avPlayer: videoPlayer)
        videoPlayer.replaceCurrentItem(with: videoItem)
        videoPlayer.actionAtItemEnd = .none
        videoPlayer.play()
        NotificationCenter.default.addObserver(self, selector: #selector(loopVideo), name: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: videoPlayer.currentItem)
        return videoMaterial
    }

    @objc
    private func loopVideo(notification: Notification) {
        guard let playerItem = notification.object as? AVPlayerItem else { return }
        playerItem.seek(to: CMTime.zero, completionHandler: nil)
        videoPlayer.play()
    }
}
```

ViewModel class:

```swift
func startImageTracking(arView: ARView) {
    guard let arReferenceImage = ARReferenceImage.referenceImages(inGroupNamed: "ARResources", bundle: Bundle.main) else { return }
    let configuration = ARImageTrackingConfiguration().do {
        $0.trackingImages = arReferenceImage
        $0.maximumNumberOfTrackedImages = 1
    }
    let personSegmentation: ARWorldTrackingConfiguration.FrameSemantics = .personSegmentationWithDepth
    if ARWorldTrackingConfiguration.supportsFrameSemantics(personSegmentation) {
        configuration.frameSemantics.insert(personSegmentation)
    }
    arView.session.run(configuration, options: [.resetTracking])
}
```

ARSessionDelegate protocol:

```swift
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for anchor in anchors {
        if let imageAnchor = anchor as? ARImageAnchor {
            let videoEntity = viewModel.videoEntity.videoModelEntity(width: Float(imageAnchor.referenceImage.physicalSize.width), height: Float(imageAnchor.referenceImage.physicalSize.height))
            viewModel.videoEntity.placeVideoScreen(videoEntity: videoEntity, imageAnchor: imageAnchor, arView: arView)
        }
    }
}
```