RealityKit

Simulate and render 3D content for use in your augmented reality apps using RealityKit.

Posts under RealityKit tag

200 Posts

Reality Composer - Buttons don't work after exporting .reality file
Hello everyone, I had some problems with buttons and tap triggers in AR Quick Look. If I place a button object and another object over the button (in my case it was a sphere, or an imported USDZ object) and export the project from Reality Composer as a .reality file, the button loses its interactivity. It works in Reality Composer's play mode (in the example video I attached, the sphere starts moving when you tap the button), but nothing happens if I export the project and test it in AR Quick Look. Here is a small example of this problem (with the attached .rcproject, .reality file, and two videos of testing the scene in Reality Composer play mode and in AR Quick Look): https://drive.google.com/file/d/1eQa-pCEihRVtgP7jJUlpfhG5PjKZulJB/view?usp=sharing Do you have any ideas how to fix this problem?
2 replies · 1 boost · 920 views · Sep ’23
Subclassing / Modifying the built-in gesture recognizers
Hello, RealityKit offers an awesome interface for installing gestures for the common interactions with a virtual object in 3D space. One of them is EntityTranslationGestureRecognizer, which moves the 3D object through the 3D space. Checking the documentation, I found the velocity(in:) method, which I'd like to use to limit the speed at which an object can be moved through the 3D space. https://developer.apple.com/documentation/realitykit/entitytranslationgesturerecognizer/3255581-velocity I haven't found a straightforward way to subclass and install this gesture recognizer yet. Am I missing something? Best, Lennart
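The built-in recognizers aren't designed for subclassing, but installGestures(_:for:) returns the recognizer instances it creates, so one workaround is to attach an extra target/action and intervene there. A minimal sketch — the 0.5 m/s cap and the cancel-by-toggling-isEnabled trick are assumptions of this sketch, not a documented RealityKit API:

    import RealityKit
    import UIKit
    import simd

    /// Sketch: caps how fast the built-in translation gesture may move an entity.
    final class TranslationSpeedLimiter: NSObject {
        private let maxSpeed: Float

        init(entity: Entity & HasCollision, arView: ARView, maxSpeed: Float = 0.5) {
            self.maxSpeed = maxSpeed
            super.init()
            // installGestures(_:for:) returns the recognizers it creates, so we
            // can attach a second target/action to the translation recognizer.
            for recognizer in arView.installGestures(.translation, for: entity) {
                if let translation = recognizer as? EntityTranslationGestureRecognizer {
                    translation.addTarget(self, action: #selector(handleTranslation(_:)))
                }
            }
        }

        @objc private func handleTranslation(_ recognizer: EntityTranslationGestureRecognizer) {
            guard recognizer.state == .changed else { return }
            // velocity(in:) is the documented method from the question;
            // passing nil measures the velocity in scene space.
            let speed = simd_length(recognizer.velocity(in: nil))
            if speed > maxSpeed {
                // Toggling isEnabled cancels the in-flight gesture, which
                // effectively caps how fast the entity can be dragged.
                recognizer.isEnabled = false
                recognizer.isEnabled = true
            }
        }
    }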
2 replies · 1 boost · 613 views · Oct ’23
Reloading a scene from Reality Composer
Hello, I'm setting up an ARView using scene anchors from Reality Composer. The scenes load perfectly fine the first time I enter the AR view. When I go back to the previous screen and re-enter the AR view, the app crashes before any of the scenes appear on screen. I've tried pausing and resuming the session and am still getting the following error:

    validateFunctionArguments:3536: failed assertion `Fragment Function(fsRenderShadowReceiverPlane): incorrect type of texture (MTLTextureTypeCube) bound at texture binding at index 0 (expect MTLTextureType2D) for projectiveShadowMapTexture[0].'

Any help would be very much appreciated. Thanks
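Until the root cause is clear, one workaround is to discard the ARView entirely when leaving the screen and build a fresh one on re-entry, rather than pausing and resuming the same session. A sketch, assuming a UIKit view controller owns the view (names illustrative):

    import UIKit
    import RealityKit

    final class ARScreenViewController: UIViewController {
        private var arView: ARView?

        override func viewDidAppear(_ animated: Bool) {
            super.viewDidAppear(animated)
            // Build a fresh ARView (and session) every time we enter the screen.
            let arView = ARView(frame: view.bounds)
            arView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
            view.addSubview(arView)
            self.arView = arView
            // ... load the Reality Composer scene anchors here ...
        }

        override func viewWillDisappear(_ animated: Bool) {
            super.viewWillDisappear(animated)
            // Pause the session and release the view so no stale render
            // resources (e.g. shadow-map textures) survive to the next visit.
            arView?.session.pause()
            arView?.removeFromSuperview()
            arView = nil
        }
    }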
1 reply · 0 boosts · 743 views · Oct ’23
RealityKit: Why isn't ARGeoTrackingConfiguration available everywhere?
Hi, ARKit is a great tool; I have my small app doing things, and it's fun! But I wanted to try migrating from ARWorldTrackingConfiguration to ARGeoTrackingConfiguration (https://developer.apple.com/documentation/arkit/argeotrackingconfiguration), and it turns out this configuration is limited to a handful of US cities only. I can't figure out why, and whether it will be expanded worldwide in the near future.
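Geo tracking depends on Apple Maps localization imagery that Apple has captured only for certain cities, and Apple has not published an expansion roadmap, so the practical approach is to probe availability at runtime and fall back to world tracking. A sketch using the documented ARKit calls:

    import ARKit

    func runBestConfiguration(on session: ARSession) {
        guard ARGeoTrackingConfiguration.isSupported else {
            // Device lacks the required hardware; fall back to world tracking.
            session.run(ARWorldTrackingConfiguration())
            return
        }
        // Asks ARKit whether geo tracking works at the user's current location.
        ARGeoTrackingConfiguration.checkAvailability { available, error in
            DispatchQueue.main.async {
                if available {
                    session.run(ARGeoTrackingConfiguration())
                } else {
                    // Not a supported region (or the availability check failed).
                    session.run(ARWorldTrackingConfiguration())
                }
            }
        }
    }

There is also checkAvailability(at:completionHandler:) for querying a specific coordinate in advance.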
2 replies · 0 boosts · 1.3k views · Jan ’24
Error: 'Texture Descriptor Validation'
My RealityKit app uses an ARView with camera mode .nonAR. Later it puts another ARView with camera mode .ar on top of this. When I apply layout constraints to the second view, the program aborts with the messages below. If both views are of type .ar this doesn't occur; it only happens when the first view is .nonAR and the second is presented over it. I have so far been unable to reproduce this behavior in a demo program to provide to you, and the original code is complex and proprietary. Does anyone know what is happening? I've seen other questions concerning this situation, but not under the same circumstances.

    2021-12-01 17:59:11.974698-0500 MyApp[10615:6672868] -[MTLTextureDescriptorInternal validateWithDevice:], line 1325: error 'Texture Descriptor Validation
    MTLTextureDescriptor has width (4294967295) greater than the maximum allowed size of 16384.
    MTLTextureDescriptor has height (4294967295) greater than the maximum allowed size of 16384.
    MTLTextureDescriptor has invalid pixelFormat (0).'
    -[MTLTextureDescriptorInternal validateWithDevice:]:1325: failed assertion `Texture Descriptor Validation
    MTLTextureDescriptor has width (4294967295) greater than the maximum allowed size of 16384.
    MTLTextureDescriptor has height (4294967295) greater than the maximum allowed size of 16384.
    MTLTextureDescriptor has invalid pixelFormat (0).
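For anyone trying to reproduce this, a minimal sketch of the setup as described — a .nonAR base view with an .ar view constrained over it (names illustrative):

    import UIKit
    import RealityKit

    final class OverlayViewController: UIViewController {
        override func viewDidLoad() {
            super.viewDidLoad()

            // Base view renders without the camera feed.
            let baseView = ARView(frame: view.bounds, cameraMode: .nonAR,
                                  automaticallyConfigureSession: false)
            baseView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
            view.addSubview(baseView)

            // Second view uses the camera; constraints applied as in the report.
            let overlayView = ARView(frame: .zero, cameraMode: .ar,
                                     automaticallyConfigureSession: true)
            overlayView.translatesAutoresizingMaskIntoConstraints = false
            view.addSubview(overlayView)
            NSLayoutConstraint.activate([
                overlayView.leadingAnchor.constraint(equalTo: view.leadingAnchor),
                overlayView.trailingAnchor.constraint(equalTo: view.trailingAnchor),
                overlayView.topAnchor.constraint(equalTo: view.topAnchor),
                overlayView.bottomAnchor.constraint(equalTo: view.bottomAnchor),
            ])
        }
    }

Incidentally, 4294967295 is UInt32.max, which suggests the second view is asked for a drawable before it has a valid size; giving it a nonzero frame before layout resolves may be worth trying.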
3 replies · 0 boosts · 2.4k views · Sep ’23
RealityKit PhotogrammetrySession not recognizing depth scale
I'm creating a custom scanning solution for iOS and using the RealityKit Object Capture PhotogrammetrySession API to build a 3D model. I'm finding that the data I'm sending it ignores the depth, so the model isn't built to scale. The documentation is a little light on how to format the depth, so I'm wondering if someone could take a look at some example files I send to the PhotogrammetrySession and tell me what I'm not doing correctly? https://drive.google.com/file/d/1-GoeR_KMhX_i7-y8M8ElDRrySasOdoHy/view?usp=sharing Thank you!
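Depth handling is easiest to verify when feeding PhotogrammetrySample values directly rather than a folder of images, because the format requirements become explicit. A sketch, assuming matching color and depth pixel buffers are already loaded (names illustrative):

    import RealityKit
    import CoreVideo
    import Foundation

    // Builds samples with explicit depth and kicks off reconstruction.
    // Depth buffers should be single-channel Float32 in meters
    // (kCVPixelFormatType_DepthFloat32); a disparity map will not give
    // a correctly scaled model.
    func reconstruct(images: [CVPixelBuffer],
                     depths: [CVPixelBuffer],
                     output: URL) throws -> PhotogrammetrySession {
        let samples = zip(images, depths).enumerated().map { (index, pair) -> PhotogrammetrySample in
            var sample = PhotogrammetrySample(id: index, image: pair.0)
            sample.depthDataMap = pair.1
            return sample
        }
        let session = try PhotogrammetrySession(input: samples)
        // Watch session.outputs for progress and .requestComplete.
        try session.process(requests: [.modelFile(url: output, detail: .medium)])
        return session
    }

If you stick with folder input instead, first confirm that the depth data actually survives inside your image files.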
1 reply · 0 boosts · 1.5k views · May ’24
Display USDZ 3D model on website without AR
I am trying to build a website that renders a USDZ 3D model in the browser without the AR feature. The user should be able to interact with the 3D model using a pointing device (mouse). If the user wants to see the 3D model in AR, they can do so by loading the page on a compatible device, where the model can be projected into AR. How can I display a USDZ 3D model in the browser without the AR feature?
2 replies · 1 boost · 2.6k views · May ’24
RealityKit MeshResource generated from SwiftUI shape
In SceneKit, using SCNShape we can create SCN geometry from SwiftUI 2D shapes/Béziers: https://developer.apple.com/documentation/scenekit/scnshape Is there an equivalent in RealityKit? Could we use generate(from:) for that? https://developer.apple.com/documentation/realitykit/meshresource/3768520-generate
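There's no direct SCNShape equivalent, but generate(from:) with a MeshDescriptor works if you flatten the path and triangulate it yourself. A sketch for the simplest case, assuming the flattened outline is a single convex polygon (concave shapes need a real triangulator such as ear clipping):

    import RealityKit
    import CoreGraphics

    // Builds a flat MeshResource from a 2D outline, assuming the outline is a
    // single convex polygon (so a simple triangle fan is a valid triangulation).
    // Flattening a SwiftUI Path/CGPath into these points is left to the caller.
    func meshResource(fromConvexOutline outline: [CGPoint]) throws -> MeshResource {
        precondition(outline.count >= 3, "need at least a triangle")
        var descriptor = MeshDescriptor(name: "shape")
        // Lay the polygon out in the XY plane, one metre per point unit.
        descriptor.positions = MeshBuffer(outline.map {
            SIMD3(Float($0.x), Float($0.y), 0)
        })
        // Fan triangulation: (0, i, i+1) for each consecutive pair of vertices.
        // RealityKit treats counterclockwise winding as front-facing.
        var indices: [UInt32] = []
        for i in 1..<(outline.count - 1) {
            indices.append(contentsOf: [0, UInt32(i), UInt32(i + 1)])
        }
        descriptor.primitives = .triangles(indices)
        return try MeshResource.generate(from: [descriptor])
    }

For extrusion (what SCNShape actually does) you'd emit a second ring of vertices offset along Z plus side quads, but the descriptor mechanics stay the same.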
2 replies · 0 boosts · 940 views · Jun ’24
Reality Converter: problem converting a file with multiple animations to USDZ
Hello, I created a simple cube in Blender with two animations: one moves the cube up and down, and the other rotates the cube in place. I exported this file in glb format and tried to convert it using Reality Converter; unfortunately, I can only see one animation. Is this a limitation of Reality Converter? Can it include more than one animation? The original glb file has both animations inside: as you can see from the screenshot, I checked the file using an online glb viewer and there is no problem, both animations are there. The converter unfortunately only sees the last one created. Any reason or explanation? I believe it is a limitation of Reality Converter. Regards
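One way to narrow down where the second animation is lost is to load the converted USDZ back into RealityKit and count what survived; a small sketch:

    import RealityKit
    import Foundation

    // Prints how many animations RealityKit finds inside a converted USDZ.
    // If only one shows up here, the second was dropped at conversion time,
    // not by the viewer.
    func listAnimations(in usdzURL: URL) throws {
        let entity = try Entity.load(contentsOf: usdzURL)
        print("Found \(entity.availableAnimations.count) animation(s)")
    }

In practice, many glTF-to-USD conversion paths only carry a single animation per file; exporting each action to its own file, or baking both motions into one timeline before export, is a common workaround.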
3 replies · 1 boost · 1.7k views · Sep ’23
RealityKit sometimes renders using only the green channel
Hi, we're working on an app that uses RealityKit to present products in the customer's home. We have a bug where, roughly 1 in 100 runs, all entities are rendered using only the green channel (see image below). It appears to affect every entity in the ARView, regardless of model or material type. Due to the flaky nature of this bug I've found it really hard to debug, and I can't rule out an internal issue in RealityKit. Did anyone run into similar issues, or have any hints on where to look for the culprit?
6 replies · 0 boosts · 1.4k views · Sep ’23
Reality Composer asset not showing up in my test app or Apple's own basic AR project
Hi, I am not sure what is going on. I have been working on this model for a while in Reality Composer and had no problem testing it there; it always worked perfectly. So I imported the file into a brand-new Xcode project: I created a new AR app using SwiftUI (I actually did this twice), and also tested the version Apple ships with the box. In Apple's version the app appears, but the part where it tries to detect planes never shows up, so I am confused. I found a question that mentions the error messages I am getting, but I am not sure how to get around it: https://developer.apple.com/forums/thread/691882

    //
    // ContentView.swift
    // AppToTest-02-14-23
    //
    // Created by M on 2/14/23.
    //

    import SwiftUI
    import RealityKit

    struct ContentView: View {
      var body: some View {
        return ARViewContainer().edgesIgnoringSafeArea(.all)
      }
    }

    struct ARViewContainer: UIViewRepresentable {
      func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        // Load the "Box" scene from the "Experience" Reality File
        //let boxAnchor = try! Experience.loadBox()
        let anchor = try! MyAppToTest.loadFirstScene()
        // Add the box anchor to the scene
        arView.scene.anchors.append(anchor)
        return arView
      }

      func updateUIView(_ uiView: ARView, context: Context) {}
    }

    #if DEBUG
    struct ContentView_Previews: PreviewProvider {
      static var previews: some View {
        ContentView()
      }
    }
    #endif

This is what I get at the bottom (the "Resolving material name … as an asset path -- this usage is deprecated" warning repeats for about a dozen BuiltinRenderGraphResources/AR materials, and "throwing -10878" repeats nine times):

    2023-02-14 17:14:53.630477-0500 AppToTest-02-14-23[21446:1307215] Metal GPU Frame Capture Enabled
    2023-02-14 17:14:53.631192-0500 AppToTest-02-14-23[21446:1307215] Metal API Validation Enabled
    2023-02-14 17:14:54.531766-0500 AppToTest-02-14-23[21446:1307215] [AssetTypes] Registering library (/System/Library/PrivateFrameworks/CoreRE.framework/default.metallib) that already exists in shader manager. Library will be overwritten.
    2023-02-14 17:14:54.716866-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/suFeatheringCreateMergedOcclusionMask.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
    [... same warning for arKitPassthrough, drPostAndComposition, arSegmentationComposite, and arInPlacePostProcessCombinedPermute0 through 7 ...]
    2023-02-14 17:14:54.764859-0500 AppToTest-02-14-23[21446:1307215] [Foundation.Serialization] Json Parse Error line 18: Json Deserialization; unknown member 'EnableARProbes' - skipping.
    2023-02-14 17:14:54.764902-0500 AppToTest-02-14-23[21446:1307215] [Foundation.Serialization] Json Parse Error line 20: Json Deserialization; unknown member 'EnableGuidedFilterOcclusion' - skipping.
    2023-02-14 17:14:55.531748-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
    [... repeated nine times ...]
    2023-02-14 17:14:56.207438-0500 AppToTest-02-14-23[21446:1307383] [Technique] ARWorldTrackingTechnique <0x1149cd900>: World tracking performance is being affected by resource constraints [2]
    2023-02-14 17:17:15.741931-0500 AppToTest-02-14-23[21446:1307414] [Technique] ARWorldTrackingTechnique <0x1149cd900>: World tracking performance is being affected by resource constraints [1]
    2023-02-14 17:22:07.075990-0500 AppToTest-02-14-23[21446:1308137] [Technique] ARWorldTrackingTechnique <0x1149cd900>: World tracking performance is being affected by resource constraints [1]
3 replies · 0 boosts · 1.5k views · Mar ’24
Apple AR
Hi, I am a student from London studying app development in Arizona. I have a few ideas for apps that I believe would add to the Apple AR experience. I was wondering where I should go to get started in the development process. Any guidance would be much appreciated :)
2 replies · 0 boosts · 732 views · Jul ’23
visionOS simulator on an iPadOS device
I started testing the visionOS SDK on an existing project that has been running fine on iPad (iOS 17) with Xcode 15. It can be configured to run in the visionOS simulator on an M1 MacBook without any change to the project's Build Settings in Xcode. However, the Apple Vision Pro simulator doesn't appear when I run Xcode 15 on an Intel MacBook Pro unless I change the SUPPORTED_PLATFORMS key in the project's Build Settings to visionOS. Although I understand that a MacBook with an M1/M2 chip is the ideal platform for the visionOS simulator, it would be much better if we could run the visionOS simulator on iPadOS: the iPad has the same arm64 architecture, and it has all the hardware needed for camera, GPS, and lidar. The Mac is not a good simulator host, even with an M1/M2 chip:

- It doesn't have a dual-facing camera (front and back)
- It doesn't have a lidar scanner
- It doesn't have GPS
- It doesn't have a 5G cellular radio
- It's not portable enough for developers to design use cases around spatial computing

Last but not least, it is unclear to me how to simulate ARKit with actual camera frames in a Vision Pro simulator, while I'd estimate this could be simulated perfectly on iPadOS. My suggestion is to provide developers with a simulator that runs on iPadOS; that would increase developer adoption and improve the design and prototyping phase of apps for the actual Vision Pro device.
4 replies · 2 boosts · 2.3k views · Sep ’23
RealityViewContent update
I am working on a project where changes in a window are reflected in a volumetric view that includes a RealityView. I have a shared data model between the window and the volumetric view, but it is unclear to me how I can programmatically refresh the RealityViewContent. Initially I tried holding the RealityViewContent passed from the RealityView closure in the data model, and I also tried embedding a .sink into the closure, but because the RealityViewContent is inout, neither of those works. And changes to the window's contents do not cause the RealityView's update closure to fire. Is there a way to notify the RealityViewContent to update?
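RealityViewContent isn't meant to be stored; instead, the update closure re-runs whenever observable state that it reads changes. A sketch assuming an @Observable model shared between the window and the volume (names illustrative):

    import SwiftUI
    import RealityKit
    import Observation

    @Observable
    final class SharedModel {
        var scale: Float = 1.0   // written by the window scene
    }

    struct VolumeView: View {
        let model: SharedModel
        @State private var box = ModelEntity(mesh: .generateBox(size: 0.1))

        var body: some View {
            RealityView { content in
                content.add(box)
            } update: { content in
                // Reading model.scale here registers it as a dependency, so
                // SwiftUI re-runs this closure when the window writes to it.
                box.scale = SIMD3<Float>(repeating: model.scale)
            }
        }
    }

If the closure still doesn't fire, check that the property being changed is actually read inside update: — dependencies are tracked per property, not per object.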
4 replies · 0 boosts · 846 views · Jul ’23
Per-vertex color in a custom RealityKit mesh? (macOS)
I'm working on an application for viewing AMF models on macOS, using RealityKit. AMF supports several different ways to color models, including per-vertex color (where the color of a triangle is interpolated from vertex to vertex) and per-face color (where the color of the triangle is the same across the entire face). I'm trying to figure out how to support those color models with a RealityKit mesh. Apple's documentation (https://developer.apple.com/documentation/realitykit/modifying-realitykit-rendering-using-custom-materials) talks about per-vertex colors, but I haven't found a way to create a mesh that includes per-vertex colors, other than using a texture map (which might be the correct solution). Can someone give me some pointers?
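MeshDescriptor has no vertex-color attribute, so for the per-face case one workaround is to give every face its own three vertices and point their UVs at a single texel of a small palette texture; truly interpolated per-vertex color would need a per-triangle texture bake or a custom material instead. A sketch of the per-face variant, assuming a palette CGImage drawn with one pixel per face:

    import RealityKit
    import CoreGraphics

    // Per-face color via a palette texture: each face's three (deliberately
    // unshared) vertices carry the same UV, pointing at one texel of an
    // N x 1 palette image, so the face renders in a single flat color.
    func perFaceColoredMesh(triangles: [[SIMD3<Float>]],   // 3 corners per face
                            paletteImage: CGImage) throws -> (MeshResource, PhysicallyBasedMaterial) {
        var positions: [SIMD3<Float>] = []
        var uvs: [SIMD2<Float>] = []
        let n = Float(triangles.count)
        for (face, corners) in triangles.enumerated() {
            positions.append(contentsOf: corners)           // no vertex sharing
            let uv = SIMD2<Float>((Float(face) + 0.5) / n, 0.5)
            uvs.append(contentsOf: [uv, uv, uv])            // one texel per face
        }
        var descriptor = MeshDescriptor(name: "perFaceColor")
        descriptor.positions = MeshBuffer(positions)
        descriptor.textureCoordinates = MeshBuffer(uvs)
        descriptor.primitives = .triangles(Array(0..<UInt32(positions.count)))
        let texture = try TextureResource.generate(from: paletteImage,
                                                   options: .init(semantic: .color))
        var material = PhysicallyBasedMaterial()
        material.baseColor = .init(texture: .init(texture))
        return (try MeshResource.generate(from: [descriptor]), material)
    }

A linear palette can't express smooth per-vertex gradients, because UVs interpolate through unrelated palette entries between the vertices; that case needs each triangle baked into its own texture patch.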
4 replies · 1 boost · 957 views · Mar ’24