RealityKit


Simulate and render 3D content for use in your augmented reality apps using RealityKit.


Posts under RealityKit tag

434 Posts
Post not yet marked as solved
12 Replies
8.3k Views
I have a problem: I want to load a usdz file, place it, and animate it in RealityKit. It should also be possible to apply the standard gestures via arView.installGestures(for: entity). From what I know I have two choices to load an entity, for example Apple's "toy_biplane.usdz" (it also contains a skeletal animation). I have

let entity = try! Entity.load(named: "toy_biplane")

or

let entity = try! Entity.loadModel(named: "toy_biplane")

When I use the first one, the usdz is loaded as an Entity, not a ModelEntity. I can play its animation, e.g.

entity.playAnimation(entity.availableAnimations[0].repeat())

but I can't use arView.installGestures(for: entity) because Entity does not conform to HasCollision. I tried subclassing Entity and conforming to HasCollision and HasModel. It compiles, but even if I call generateCollisionShapes(recursive: true) the gestures don't work.

So I tried the loadModel approach, which returns a ModelEntity. There arView.installGestures works fine, exactly as expected. But when I want to play the animation, the airplane just rotates around my camera in a very weird way. I also tried loading asynchronously, with no success.

After a lot of debugging I found out that the Entity from the first approach contains many children from the usdz. Each of them is part of the skeleton and has its own animation. Not so with the ModelEntity: its children property is an empty set, and therefore the animation (e.g. the rotation of the propeller) is not applied to the skeletal element it belongs to but rather to the combined overall skeleton, causing a rotation of the whole plane, which is not what I want.

What am I doing wrong, or is some of this unintended behaviour in RealityKit? Thanks.
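One possible workaround, sketched under the assumption that the goal is to keep the skeletal children from Entity.load(named:) while still giving installGestures an entity that conforms to HasCollision: wrap the loaded entity in a plain ModelEntity, give that wrapper a collision shape sized to the model's visual bounds, and install the gestures on the wrapper (the function name is just for illustration):

import RealityKit

func placeBiplane(in arView: ARView) {
    // Load as a plain Entity so the skeletal children and their animations survive.
    let biplane = try! Entity.load(named: "toy_biplane")

    // Wrapper entity: ModelEntity conforms to HasCollision, so gestures can be
    // installed on it while the animated hierarchy underneath stays intact.
    let wrapper = ModelEntity()
    wrapper.addChild(biplane)

    // Give the wrapper a box collision shape matching the model's visual bounds.
    let bounds = biplane.visualBounds(relativeTo: wrapper)
    let shape = ShapeResource.generateBox(size: bounds.extents)
        .offsetBy(translation: bounds.center)
    wrapper.collision = CollisionComponent(shapes: [shape])

    let anchor = AnchorEntity(plane: .horizontal)
    anchor.addChild(wrapper)
    arView.scene.addAnchor(anchor)

    // Gestures go on the collision-bearing wrapper...
    arView.installGestures(.all, for: wrapper)

    // ...while the animation still plays on the loaded hierarchy.
    if let animation = biplane.availableAnimations.first {
        biplane.playAnimation(animation.repeat())
    }
}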
Post not yet marked as solved
1 Reply
796 Views
Hello everyone, I have a problem with buttons and tap triggers in AR Quick Look. If I place a button object and another object over the button (in my case a sphere, or an imported usdz object) and export the project from Reality Composer as a .reality file, the button loses its interactivity. It works in Reality Composer's play mode (in the example video attached, the sphere starts moving when you tap the button), but nothing happens if I export the project and test it in AR Quick Look. Here is a small example of the problem (with the attached .rcproject, .reality file and two videos of testing the scene in Reality Composer play mode and in AR Quick Look): https://drive.google.com/file/d/1eQa-pCEihRVtgP7jJUlpfhG5PjKZulJB/view?usp=sharing Do you have any ideas how to fix this problem?
Posted by pfc
Post marked as solved
2 Replies
509 Views
Hello, RealityKit offers an awesome interface to install gestures for the common interactions with a virtual object in 3D space. One of them is EntityTranslationGestureRecognizer, which moves the 3D object through the scene. When checking the documentation I found the velocity(in:) method, and I'd like to limit the speed at which an object can be moved through 3D space. https://developer.apple.com/documentation/realitykit/entitytranslationgesturerecognizer/3255581-velocity I didn't find a straightforward way to subclass and install this gesture recognizer myself yet. Am I missing something? Best, Lennart
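One possible approach, sketched under the assumption that subclassing isn't strictly required: installGestures(_:for:) returns the recognizers it creates, so you can keep the EntityTranslationGestureRecognizer it hands back, add your own target/action, and clamp the entity's motion there. The step limit below is an illustrative value, and the entity is assumed to already have collision shapes:

import RealityKit
import simd
import UIKit

final class TranslationSpeedLimiter: NSObject {
    private var lastPosition: SIMD3<Float>?
    private let maxStepPerUpdate: Float = 0.01   // metres per gesture update, illustrative value

    func install(on entity: ModelEntity, in arView: ARView) {
        // installGestures returns the recognizers it creates; hook into the translation one.
        let recognizers = arView.installGestures(.translation, for: entity)
        let translation = recognizers.compactMap { $0 as? EntityTranslationGestureRecognizer }.first
        translation?.addTarget(self, action: #selector(handle(_:)))
    }

    @objc private func handle(_ recognizer: EntityTranslationGestureRecognizer) {
        guard let entity = recognizer.entity else { return }
        let current = entity.position(relativeTo: nil)

        guard recognizer.state == .changed, let previous = lastPosition else {
            lastPosition = current
            return
        }

        // If this update moved the entity farther than allowed, pull it back
        // along the same direction so the drag feels speed-limited.
        let step = current - previous
        if simd_length(step) > maxStepPerUpdate {
            let clamped = previous + simd_normalize(step) * maxStepPerUpdate
            entity.setPosition(clamped, relativeTo: nil)
            lastPosition = clamped
        } else {
            lastPosition = current
        }
    }
}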
Post not yet marked as solved
1 Reply
634 Views
Hello, I'm setting up an ARView using scene anchors from Reality Composer. The scenes load perfectly fine the first time I enter the AR view. When I go back to the previous screen and re-enter the AR view, the app crashes before any of the scenes appear on screen. I've tried pausing and resuming the session and am still getting the following error:

validateFunctionArguments:3536: failed assertion `Fragment Function(fsRenderShadowReceiverPlane): incorrect type of texture (MTLTextureTypeCube) bound at texture binding at index 0 (expect MTLTextureType2D) for projectiveShadowMapTexture[0].'

Any help would be very much appreciated. Thanks
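A sketch of the kind of teardown the post alludes to, under the assumption that the crash is tied to reusing a stale ARView across presentations: pause the session and discard the view when leaving, then build a fresh ARView on re-entry (the view-controller shape is illustrative, not a confirmed fix):

import RealityKit
import UIKit

final class ARScreenViewController: UIViewController {
    private var arView: ARView?

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Create a fresh ARView each time the screen appears instead of reusing one.
        let arView = ARView(frame: view.bounds)
        arView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(arView)
        self.arView = arView
        // Load and append the Reality Composer scene anchors here.
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Tear the view down completely so the next entry starts clean.
        arView?.session.pause()
        arView?.scene.anchors.removeAll()
        arView?.removeFromSuperview()
        arView = nil
    }
}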
Post not yet marked as solved
2 Replies
1.1k Views
Hi, ARKit is a great tool. I have my small app doing things, and it's fun! But I wanted to try to migrate from ARWorldTrackingConfiguration to ARGeoTrackingConfiguration - https://developer.apple.com/documentation/arkit/argeotrackingconfiguration - and then you can see that this configuration is limited to a handful of US cities. I can't figure out why, and whether this will be expanded worldwide in the near future.
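Not an answer on the rollout plans, but a sketch of how an app can check geo-tracking support at runtime and fall back, assuming world tracking is an acceptable fallback:

import ARKit

func startTracking(on session: ARSession) {
    // Geo tracking needs supported hardware and a supported location,
    // so check both before committing to the configuration.
    guard ARGeoTrackingConfiguration.isSupported else {
        session.run(ARWorldTrackingConfiguration())
        return
    }

    ARGeoTrackingConfiguration.checkAvailability { available, _ in
        DispatchQueue.main.async {
            if available {
                session.run(ARGeoTrackingConfiguration())
            } else {
                // Outside a supported city (or no location permission): fall back.
                session.run(ARWorldTrackingConfiguration())
            }
        }
    }
}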
Post not yet marked as solved
3 Replies
2.2k Views
My RealityKit app uses an ARView with camera mode .nonAR. Later it puts another ARView with camera mode .ar on top of it. When I apply layout constraints to the second view, the program aborts with the messages below. If both views are of type .ar this doesn't occur; it only happens when the first view is .nonAR and the second is presented over it. I have so far been unable to reproduce this behavior in a demo program to provide to you, and the original code is complex and proprietary. Does anyone know what is happening? I've seen other questions concerning this assertion, but not under the same circumstances.

2021-12-01 17:59:11.974698-0500 MyApp[10615:6672868] -[MTLTextureDescriptorInternal validateWithDevice:], line 1325: error 'Texture Descriptor Validation
MTLTextureDescriptor has width (4294967295) greater than the maximum allowed size of 16384.
MTLTextureDescriptor has height (4294967295) greater than the maximum allowed size of 16384.
MTLTextureDescriptor has invalid pixelFormat (0).'
-[MTLTextureDescriptorInternal validateWithDevice:]:1325: failed assertion `Texture Descriptor Validation
MTLTextureDescriptor has width (4294967295) greater than the maximum allowed size of 16384.
MTLTextureDescriptor has height (4294967295) greater than the maximum allowed size of 16384.
MTLTextureDescriptor has invalid pixelFormat (0).'
Post not yet marked as solved
5 Replies
2.7k Views
How does metadata affect model creation? When I added image metadata to a PhotogrammetrySample, the created model changed. This is the code I use to get metadata from an image URL:

func getImageMetaData(url: URL) -> CFDictionary? {
    if let data = NSData(contentsOf: url),
       let source = CGImageSourceCreateWithData(data as CFData, nil) {
        let metadata = CGImageSourceCopyPropertiesAtIndex(source, 0, nil)
        return metadata
    }
    return nil
}

and this is how I add the metadata when I create the PhotogrammetrySample:

if let metaData = getImageMetaData(url: imageURL) as? [String: AnyObject] {
    sample.metadata = metaData
}
Post not yet marked as solved
2 Replies
2.0k Views
I'm using ARKit to collect LiDAR data. I read the depth map, whose format is kCVPixelFormatType_DepthFloat32. I saved the depth map to a PNG image by converting it to a UIImage, but the resulting PNG depth map is incorrect, because PNG only supports up to 16-bit data:

let ciImage = CIImage(cvPixelBuffer: pixelBuf)
let cgImage = context.createCGImage(ciImage, from: ciImage.extent)
let uiImage = UIImage(cgImage: cgImage!).pngData()

So instead I save the depth map to a TIFF image:

let ciImage = CIImage(cvPixelBuffer: pixelBuf)
do {
    try context.writeTIFFRepresentation(of: ciImage, to: path, format: context.workingFormat, colorSpace: context.workingColorSpace!, options: [:])
} catch {
    self.showInfo += "Save TIFF failed;"
    print(error)
}

How can I convert the depth map from kCVPixelFormatType_DepthFloat32 to Float16? Is there a correct way to save the depth map to a PNG image?
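A sketch of one way to get a usable 16-bit PNG, assuming sub-millimetre precision can be sacrificed: scale each Float32 depth value (metres) to UInt16 millimetres and encode the buffer as 16-bit grayscale PNG via ImageIO. The helper name and the millimetre scale factor are illustrative choices, not the only option:

import CoreGraphics
import CoreVideo
import ImageIO
import UniformTypeIdentifiers

// Converts a kCVPixelFormatType_DepthFloat32 buffer into 16-bit grayscale PNG data,
// storing depth in millimetres (assumes depth values stay below ~65 m).
func depthPNGData(from pixelBuffer: CVPixelBuffer) -> Data? {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    let rowBytes = CVPixelBufferGetBytesPerRow(pixelBuffer)
    guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return nil }

    // Scale each Float32 depth (metres) into a UInt16 (millimetres).
    var pixels = [UInt16](repeating: 0, count: width * height)
    for y in 0..<height {
        let row = (base + y * rowBytes).assumingMemoryBound(to: Float32.self)
        for x in 0..<width {
            let millimetres = row[x].isFinite ? row[x] * 1000 : 0
            pixels[y * width + x] = UInt16(clamping: Int(millimetres))
        }
    }

    // Wrap the 16-bit buffer in a grayscale CGImage.
    let data = pixels.withUnsafeBufferPointer { Data(buffer: $0) }
    guard let provider = CGDataProvider(data: data as CFData),
          let image = CGImage(width: width, height: height,
                              bitsPerComponent: 16, bitsPerPixel: 16,
                              bytesPerRow: width * 2,
                              space: CGColorSpaceCreateDeviceGray(),
                              bitmapInfo: CGBitmapInfo(rawValue:
                                  CGImageAlphaInfo.none.rawValue | CGBitmapInfo.byteOrder16Little.rawValue),
                              provider: provider, decode: nil,
                              shouldInterpolate: false, intent: .defaultIntent)
    else { return nil }

    // Encode as PNG via ImageIO, which preserves 16-bit samples.
    let output = NSMutableData()
    guard let destination = CGImageDestinationCreateWithData(
        output as CFMutableData, UTType.png.identifier as CFString, 1, nil) else { return nil }
    CGImageDestinationAddImage(destination, image, nil)
    return CGImageDestinationFinalize(destination) ? output as Data : nil
}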
Post not yet marked as solved
3 Replies
1.5k Views
Hello, I created a simple cube in Blender with 2 animations: one moves the cube up and down, and the other rotates the cube in place. I exported the file in glb format and converted it with Reality Converter, but unfortunately I can only see 1 animation. Is this a limitation of Reality Converter? Can I include more than one animation? The original glb file has both animations inside; as you can see from the screenshot, I checked the file with an online glb viewer and there is no problem, both animations are there. The converter unfortunately only sees the last one created. Any reason or explanation? I believe it is a limitation of Reality Converter. Regards
Post not yet marked as solved
4 Replies
715 Views
I created a plane with an unlit material using a PNG texture, and I want to use it as an indicator to show users where to put models in the scene. But if the grounding shadow is enabled, the plane flickers with the shadow and looks really weird. If the grounding shadow is disabled it works fine, but I do want my other models to have a grounding shadow. So I wonder whether I can disable the grounding shadow for this specific plane, or whether there is a way to solve the flickering. Here's my code:

let planeMesh = MeshResource.generatePlane(width: 0.8, depth: 0.8)
let planeAnchor = AnchorEntity(plane: .horizontal)
var material = UnlitMaterial(color: .white)
material.color.texture = try! .init(.load(named: "indicator"))
indicatorEntity = ModelEntity(mesh: planeMesh, materials: [material])
planeAnchor.addChild(indicatorEntity)
arView.scene.addAnchor(planeAnchor)

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    guard let result = self.arView.raycast(from: arView.center, allowing: .estimatedPlane, alignment: .horizontal).first else {
        return
    }
    indicatorEntity.setTransformMatrix(result.worldTransform, relativeTo: nil)
}
Post not yet marked as solved
6 Replies
1.3k Views
Hi, we're working on an app that uses RealityKit to present products in the customer's home. We have a bug where, every once in a while (somewhere around 1 in 100 runs), all entities are rendered using the green channel only (see image below). It seems to occur for all entities in the ARView, regardless of model or material type. Due to the flaky nature of this bug I'm finding it really hard to debug, and I can't rule out an internal issue in RealityKit. Did anyone run into similar issues, or have any hints on where to look for the culprit?
Post marked as solved
3 Replies
1.2k Views
I'm looking to get a GPU to use for Object Capture. The requirements are an AMD GPU with 4 GB of VRAM and ray-tracing support. The RX 580 seems to be able to do ray tracing from what I've found online, but it looks like someone had an issue with a 580X here: https://developer.apple.com/forums/thread/689891
Post marked as solved
2 Replies
1.3k Views
Hi, I'm trying to determine whether a point in 3D space is covered by other objects, like a human hand or a wall. I do not want to use a raycast, so my idea is to calculate two things: 1) the distance between the iPad camera and this point, and 2) the position of this 3D point projected onto the 2D ARView, so I can look up the depth information from the depthMap at that point. If the depth is smaller than the distance to the point, I can assume the point is covered by something. My code works well when the iPad faces the 3D point straight on, but when I rotate the iPad a little, calculation 2 (based on depth) picks up an error. It looks like calculations 1 and 2 take two different points on the iPad as a reference (camera position), but I could not find any logic in it. This is my code:

let viewSize = arView.bounds.size
let frame = arView.session.currentFrame!

// Transform to translate between arView and depth map
let displayTransform = frame.displayTransform(for: arView.interfaceOrientation, viewportSize: viewSize)

guard let depthPixelBuffer = frame.sceneDepth?.depthMap else { return }
let depthWidth = CVPixelBufferGetWidth(depthPixelBuffer)
let depthWidthFloat = CGFloat(depthWidth)
let depthHeight = CVPixelBufferGetHeight(depthPixelBuffer)
let depthHeightFloat = CGFloat(depthHeight)

// Point in 3D space (our point, red square on images)
let object3Dposition = self.position

// Calculate distance between camera and point in 3D space
// this always works well
let distanceToObject = distance(object3Dposition, arView.cameraTransform.translation)

// 2D point on the ARView projected from the 3D position (find where this point will be visible on the arView)
guard let pointOnArView = arView.project(object3Dposition) else { return }

// Normalize 2D point (0-1)
let pointOnArViewNormalized = CGPoint(x: pointOnArView.x / viewSize.width, y: pointOnArView.y / viewSize.height)

// Transform from ARView coordinates to depth-map coordinates
let pointOnDepthMapNormalized = CGPointApplyAffineTransform(pointOnArViewNormalized, displayTransform.inverted())

// Point on depth map (from normalized coordinates to true coordinates)
let pointOnDepthMap = CGPoint(x: pointOnDepthMapNormalized.x * depthWidthFloat, y: pointOnDepthMapNormalized.y * depthHeightFloat)

guard
    pointOnDepthMap.x >= 0 && pointOnDepthMap.y >= 0 && pointOnDepthMap.x < depthWidthFloat && pointOnDepthMap.y < depthHeightFloat
else {
    // Point not visible, outside of screen
    isVisibleByCamera = false
    return
}

// Read depth from buffer
let depth: Float32
CVPixelBufferLockBaseAddress(depthPixelBuffer, CVPixelBufferLockFlags(rawValue: 2))
let floatBuffer = unsafeBitCast(
    CVPixelBufferGetBaseAddress(depthPixelBuffer),
    to: UnsafeMutablePointer<Float32>.self
)

// Get depth in 'pointOnDepthMap' coordinates (convert from X,Y coordinates to buffer index)
let depthBufferIndex = depthWidth * Int(pointOnDepthMap.y) + Int(pointOnDepthMap.x)

// This depth is incorrect when the iPad is rotated
depth = floatBuffer[depthBufferIndex]

CVPixelBufferUnlockBaseAddress(depthPixelBuffer, CVPixelBufferLockFlags(rawValue: 2))

if distanceToObject > depth + 0.05 {
    isVisibleByCamera = false
} else {
    isVisibleByCamera = true
}

Thank you :)
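A possible explanation, stated as an assumption rather than a confirmed fact: the sceneDepth depth map may encode depth along the camera's viewing axis rather than the Euclidean distance to the camera, and the two values diverge as the point moves off-centre when the iPad rotates. A sketch of comparing against the along-axis depth instead, reusing object3Dposition and depth from the code above:

import RealityKit
import simd

// Camera position and forward axis in world space (-Z column of the camera transform).
let cameraMatrix = arView.cameraTransform.matrix
let cameraPosition = SIMD3<Float>(cameraMatrix.columns.3.x,
                                  cameraMatrix.columns.3.y,
                                  cameraMatrix.columns.3.z)
let cameraForward = -SIMD3<Float>(cameraMatrix.columns.2.x,
                                  cameraMatrix.columns.2.y,
                                  cameraMatrix.columns.2.z)

// Depth of the point measured along the viewing axis (what the depth map is
// assumed to store), instead of the straight-line distance used before.
let zDepthToObject = simd_dot(object3Dposition - cameraPosition, simd_normalize(cameraForward))

isVisibleByCamera = zDepthToObject <= depth + 0.05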
Post not yet marked as solved
3 Replies
1.3k Views
Hi, I am not sure what is going on... I have been working on this model for a while in Reality Composer and had no problem testing it that way; it always worked perfectly. So I imported the file into a brand new Xcode project: I created a new AR app and used SwiftUI. I actually did it twice, and also tested the version Apple ships with the box. In Apple's version, the app appears, but the part where it tries to detect planes doesn't show up. So I am confused. I found a question that mentions the error messages I am getting, but I am not sure how to get around it: https://developer.apple.com/forums/thread/691882

//
//  ContentView.swift
//  AppToTest-02-14-23
//
//  Created by M on 2/14/23.
//

import SwiftUI
import RealityKit

struct ContentView: View {
  var body: some View {
    return ARViewContainer().edgesIgnoringSafeArea(.all)
  }
}

struct ARViewContainer: UIViewRepresentable {

  func makeUIView(context: Context) -> ARView {
    let arView = ARView(frame: .zero)

    // Load the "Box" scene from the "Experience" Reality File
    //let boxAnchor = try! Experience.loadBox()
    let anchor = try! MyAppToTest.loadFirstScene()

    // Add the box anchor to the scene
    arView.scene.anchors.append(anchor)

    return arView
  }

  func updateUIView(_ uiView: ARView, context: Context) {}
}

#if DEBUG
struct ContentView_Previews: PreviewProvider {
  static var previews: some View {
    ContentView()
  }
}
#endif

This is what I get at the bottom:

2023-02-14 17:14:53.630477-0500 AppToTest-02-14-23[21446:1307215] Metal GPU Frame Capture Enabled
2023-02-14 17:14:53.631192-0500 AppToTest-02-14-23[21446:1307215] Metal API Validation Enabled
2023-02-14 17:14:54.531766-0500 AppToTest-02-14-23[21446:1307215] [AssetTypes] Registering library (/System/Library/PrivateFrameworks/CoreRE.framework/default.metallib) that already exists in shader manager. Library will be overwritten.
2023-02-14 17:14:54.716866-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/suFeatheringCreateMergedOcclusionMask.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.743580-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arKitPassthrough.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.744961-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/drPostAndComposition.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.745988-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arSegmentationComposite.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.747245-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute0.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.748750-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute1.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.749140-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute2.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.761189-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute3.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.761611-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute4.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.761983-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute5.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.762604-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute6.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.763575-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute7.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.764859-0500 AppToTest-02-14-23[21446:1307215] [Foundation.Serialization] Json Parse Error line 18: Json Deserialization; unknown member 'EnableARProbes' - skipping.
2023-02-14 17:14:54.764902-0500 AppToTest-02-14-23[21446:1307215] [Foundation.Serialization] Json Parse Error line 20: Json Deserialization; unknown member 'EnableGuidedFilterOcclusion' - skipping.
2023-02-14 17:14:55.531748-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534559-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534633-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534680-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534733-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534777-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534825-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534871-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534955-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:56.207438-0500 AppToTest-02-14-23[21446:1307383] [Technique] ARWorldTrackingTechnique <0x1149cd900>: World tracking performance is being affected by resource constraints [2]
2023-02-14 17:17:15.741931-0500 AppToTest-02-14-23[21446:1307414] [Technique] ARWorldTrackingTechnique <0x1149cd900>: World tracking performance is being affected by resource constraints [1]
2023-02-14 17:22:07.075990-0500 AppToTest-02-14-23[21446:1308137] [Technique] ARWorldTrackingTechnique <0x1149cd900>: World tracking performance is being affected by resource constraints [1]
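If the missing plane-detection UI is the coaching overlay, a sketch of adding it explicitly (an assumption about what "the part where it tries to detect planes" refers to, not a diagnosis of the log messages above):

import ARKit
import RealityKit
import UIKit

func addCoaching(to arView: ARView) {
    // The coaching overlay is the standard "move your device to find a surface" UI.
    let coachingOverlay = ARCoachingOverlayView()
    coachingOverlay.session = arView.session
    coachingOverlay.goal = .horizontalPlane
    coachingOverlay.frame = arView.bounds
    coachingOverlay.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    arView.addSubview(coachingOverlay)
}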
Post marked as solved
2 Replies
539 Views
When I launch any RealityKit app that uses a CustomMaterial with a surface shader on my iPad Pro 12.9" 1st gen (Model A1584, iPadOS 16.4.1, built with Xcode 14.3), it crashes with the following error message:

-[MTLDebugRenderCommandEncoder validateCommonDrawErrors:]:5446: failed assertion 'Draw Errors Validation Fragment Function (realitykit::fsSurfacePbr): the offset into the buffer clippingConstants that is bound at buffer index 6 must be multiple of 256 but was set to 128.

I can reproduce this with the sample code Altering RealityKit Rendering with Shader Functions as well as with an AR app I'm currently developing. Feedback FB12150033 already submitted to Apple. Replacing all occurrences of CustomMaterial with a SimpleMaterial resolves the crash. Can anyone confirm this? Any ideas for a workaround that lets me keep the shaders? I wouldn't want my app to crash on users of this iPad Pro, but surface shaders are essential in my app.
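A stopgap sketch rather than a fix, assuming the affected devices can be identified by GPU family (the .apple4 cutoff and the shader name below are guesses for illustration): keep the surface shader on newer GPUs and fall back to SimpleMaterial elsewhere, matching the workaround described above.

import Metal
import RealityKit

func makeMaterial(device: MTLDevice, library: MTLLibrary) -> RealityKit.Material {
    // Heuristic: older GPUs (e.g. the A9X in the 1st-gen 12.9" iPad Pro) hit the
    // clippingConstants alignment assertion, so fall back to SimpleMaterial there.
    guard device.supportsFamily(.apple4) else {
        return SimpleMaterial(color: .white, isMetallic: false)
    }

    let surfaceShader = CustomMaterial.SurfaceShader(named: "mySurfaceShader", in: library)
    if let material = try? CustomMaterial(surfaceShader: surfaceShader, lightingModel: .lit) {
        return material
    }
    return SimpleMaterial(color: .white, isMetallic: false)
}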