Integrate iOS device camera and motion features to produce augmented reality experiences in your app or game using ARKit.

Posts under ARKit tag

200 Posts
Reloading a scene from Reality Composer
Hello, I'm setting up an AR view using scene anchors from Reality Composer. The scenes load perfectly the first time I enter the AR view. When I go back to the previous screen and re-enter the AR view, the app crashes before any of the scenes appear on screen. I've tried pausing and resuming the session and am still getting the following error:

validateFunctionArguments:3536: failed assertion `Fragment Function(fsRenderShadowReceiverPlane): incorrect type of texture (MTLTextureTypeCube) bound at texture binding at index 0 (expect MTLTextureType2D) for projectiveShadowMapTexture[0].'

Any help would be very much appreciated. Thanks
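A minimal sketch of one commonly suggested lifecycle pattern, assuming a UIKit-hosted ARView; instead of resuming the old session, it re-runs with a fresh configuration and .removeExistingAnchors when the screen reappears. All names here are illustrative, not the poster's code:

import UIKit
import ARKit
import RealityKit

final class ARScreenController: UIViewController {   // hypothetical host controller
    let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Re-run with a fresh configuration rather than resuming the stale session.
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal, .vertical]
        arView.session.run(config, options: [.resetTracking, .removeExistingAnchors])
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        arView.session.pause()
    }
}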
1 reply · 0 boosts · 743 views · Oct ’23
Workflow Suggestions from Blender to Reality Composer
Are there any good tutorials or suggestions for creating models in Blender and exporting them with the associated materials and nodes? Specifically, I'm looking to see whether there is a way to export the translucency associated with an object (e.g., a glass bottle). I have created a simple cube with a Principled BSDF shader, but the transmission and IOR settings are not porting over. Any tips or suggestions would be helpful.
4 replies · 0 boosts · 1.7k views · Oct ’23
ARKit 4.0 and body measurement
Hello, I'm working on an application that will take measurements of parts of a human body (hand, foot, ...). Now that LiDAR has been integrated into the iPhone Pro and ARKit 4.0 is out, I would like to know whether it seems feasible to combine features of the library, such as model creation and geometry measurement, to get a precise measurement of a body part. Thanks for your help, Tom
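As a rough sketch of the geometry-measurement half: with LiDAR scene reconstruction enabled, screen-point raycasts can hit the reconstructed surfaces, so two points can be raycast and the distance between the hits computed. The helper below is hypothetical, not an ARKit API:

import ARKit
import RealityKit
import simd

// Hypothetical helper: distance in metres between the surfaces under two screen points.
// .estimatedPlane is a rough stand-in; accuracy on curved body parts will vary.
func distanceBetween(_ p1: CGPoint, _ p2: CGPoint, in arView: ARView) -> Float? {
    guard
        let r1 = arView.raycast(from: p1, allowing: .estimatedPlane, alignment: .any).first,
        let r2 = arView.raycast(from: p2, allowing: .estimatedPlane, alignment: .any).first
    else { return nil }
    // The translation column of worldTransform is the hit position in world space.
    let a = SIMD3<Float>(r1.worldTransform.columns.3.x,
                         r1.worldTransform.columns.3.y,
                         r1.worldTransform.columns.3.z)
    let b = SIMD3<Float>(r2.worldTransform.columns.3.x,
                         r2.worldTransform.columns.3.y,
                         r2.worldTransform.columns.3.z)
    return simd_distance(a, b)
}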
6 replies · 0 boosts · 2.2k views · Nov ’23
RealityKit: Why is ARGeoTrackingConfiguration not available everywhere?
Hi, ARKit is a great tool. I have my small app doing things, and it's fun! But I wanted to try migrating from ARWorldTrackingConfiguration to ARGeoTrackingConfiguration - https://developer.apple.com/documentation/arkit/argeotrackingconfiguration - and it turns out this configuration is limited to a handful of US cities only. I can't figure out why, and whether it will be expanded worldwide in the near future.
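The documented reason is that geotracking localizes the camera image against region-specific Apple Maps data, so it only works where that data has been collected. Availability is meant to be checked at run time; a minimal sketch:

import ARKit

ARGeoTrackingConfiguration.checkAvailability { available, error in
    guard available else {
        // Fall back to ARWorldTrackingConfiguration in unsupported regions.
        print("Geotracking unavailable: \(error?.localizedDescription ?? "unsupported region")")
        return
    }
    // Safe to run an ARGeoTrackingConfiguration session here.
}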
2 replies · 0 boosts · 1.3k views · Jan ’24
Testing ARGeoTrackingConfiguration
I am developing an app using ARKit 4's ARGeoTrackingConfiguration, following https://developer.apple.com/documentation/arkit/content_anchors/tracking_geographic_locations_in_ar. I am outside of the US, so my location is not supported. I simulate a location, but the coaching state always stays at .initializing. Is there a way to test geotracking apps outside of the US?
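One thing that can be tested from anywhere is coordinate coverage, via checkAvailability(at:); the San Francisco coordinate below is purely an example. Note that geotracking also matches the live camera feed against localization imagery of the actual surroundings, so a simulated GPS fix alone plausibly can never get past .initializing outside a covered area:

import ARKit
import CoreLocation

let testCoordinate = CLLocationCoordinate2D(latitude: 37.7749, longitude: -122.4194)
ARGeoTrackingConfiguration.checkAvailability(at: testCoordinate) { available, error in
    // Reports whether geotracking data exists for that coordinate,
    // independent of where the device currently is.
    print("Geotracking available there: \(available), error: \(String(describing: error))")
}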
1 reply · 0 boosts · 634 views · Jan ’24
AR Session Replay Issue
I'm trying to replay a recorded session, but I keep getting errors:

2021-10-02 14:44:57.687259-0700 arZero[11120:1137758] MOVReaderInterface - ERROR - Error Domain=com.apple.videoeng.streamreaderwarning Code=0 "Cannot grab metadata. Unknown metadata stream 'CVAUserEvent'." UserInfo={NSLocalizedDescription=Cannot grab metadata. Unknown metadata stream 'CVAUserEvent'.}
2021-10-02 14:44:58.798050-0700 arZero[11120:1137758] [Session] ARSession <0x111d066d0>: did fail with error: Error Domain=com.apple.arkit.error Code=101 "Required sensor unavailable." UserInfo={NSLocalizedDescription=Required sensor unavailable., NSLocalizedFailureReason=A required sensor is not available on this device.}

The same recording works on the iPad Pro, but not on an iPhone 12 Pro or iPhone 13 Pro. I've tried recording the video with all the different phones.
10 replies · 4 boosts · 3k views · Aug ’23
RealityKit PhotogrammetrySession not recognizing depth scale
I'm creating a custom scanning solution for iOS and using the RealityKit Object Capture PhotogrammetrySession API to build a 3D model. I'm finding that it ignores the depth data I send and doesn't build the model to scale. The documentation is a little light on how to format the depth, so I'm wondering if someone could take a look at some example files I send to the PhotogrammetrySession. Could you tell me what I'm not doing correctly? https://drive.google.com/file/d/1-GoeR_KMhX_i7-y8M8ElDRrySasOdoHy/view?usp=sharing Thank you!
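For comparison, a minimal sketch of handing depth to the session through PhotogrammetrySample.depthDataMap, assuming frames is your own array of captured image/depth pixel-buffer pairs, with depth as a single-channel float map (e.g. kCVPixelFormatType_DepthFloat32):

import RealityKit
import CoreVideo

func makeSamples(from frames: [(image: CVPixelBuffer, depth: CVPixelBuffer)]) -> [PhotogrammetrySample] {
    frames.enumerated().map { index, frame in
        var sample = PhotogrammetrySample(id: index, image: frame.image)
        sample.depthDataMap = frame.depth   // this is what drives real-world scale
        return sample
    }
}

do {
    let session = try PhotogrammetrySession(
        input: makeSamples(from: frames),
        configuration: PhotogrammetrySession.Configuration())
    // Attach output handling and submit a process request as usual.
} catch {
    print("Failed to create session: \(error)")
}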
1 reply · 0 boosts · 1.5k views · May ’24
Setting EXIF data for Object Capture Photogrammetry
I'm making an app that captures data using ARKit and will ultimately send the images + depth + gravity to an Object Capture Photogrammetry agent. I need to use the depth data to produce a model at correct scale, so from what I understand I need to send the depth file and set proper EXIF data in the image. Since I'm getting the images and depth from ARKit, I'll need to set the EXIF data manually before saving the images. Unfortunately the documentation on this is a bit light, so could you let me know what EXIF data needs to be set for the Photogrammetry session to create the model at the proper scale? If I set up my PhotogrammetrySample with manual metadata like this:

var sample = PhotogrammetrySample(id: id, image: image)
var dict: [String: Any] = [:]
dict["FocalLength"] = 23.551325
dict["PixelWidth"] = 1920
dict["PixelHeight"] = 1440
sample.metadata = dict

I get the following error in the output, and depth is ignored:

[Photogrammetry] Can't use FocalLenIn35mmFilm to produce FocalLengthInPixel! Punting...
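If you save the images yourself, EXIF can be written with ImageIO when creating the file. A hedged sketch: the exact keys Object Capture reads are not fully documented, and the error above suggests it falls back to the 35mm-equivalent focal length, so that is the field set here (function name and values are illustrative):

import Foundation
import CoreGraphics
import ImageIO
import UniformTypeIdentifiers

func saveHEIC(_ image: CGImage, to url: URL, focalLen35mm: Double) {
    guard let dest = CGImageDestinationCreateWithURL(
        url as CFURL, UTType.heic.identifier as CFString, 1, nil) else { return }
    // EXIF sub-dictionary; FocalLenIn35mmFilm matches the field named in the error.
    let exif: [CFString: Any] = [kCGImagePropertyExifFocalLenIn35mmFilm: focalLen35mm]
    let properties: [CFString: Any] = [kCGImagePropertyExifDictionary: exif]
    CGImageDestinationAddImage(dest, image, properties as CFDictionary)
    CGImageDestinationFinalize(dest)
}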
1 reply · 0 boosts · 1.1k views · Nov ’23
Get distance from uvd and intrinsic matrix?
Hello! I am having trouble calculating accurate distances in the real world using the camera's returned intrinsic matrix and pixel coordinates/depths captured from the iPhone's LiDAR. For example, I set a mug 0.5 m from the phone; the mug is 8.5 cm wide. The intrinsic matrix returned from the phone's AVCameraCalibrationData class has focalx = 1464.9269, focaly = 1464.9269, cx = 960.94916, and cy = 686.3547. Selecting two pixel locations on either side of the mug, I calculated each one's xyz coordinates using the formula:

x = d * (u - cx) / focalx
y = d * (v - cy) / focaly
z = d

where I get d from the appropriate pixel in the depth map; I've verified that both depths were 0.5 m. I then calculate the distance between the two points to get the mug width. This gives me a calculated width of 0.0357 m, or 3.5 cm, instead of the 8.5 cm I was expecting. What could account for this discrepancy? Thank you so much for your help!
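The posted formula as a small Swift helper, with two made-up pixel locations for illustration. One hedged observation: if (u, v) are read from the LiDAR depth map, which has a much lower resolution than the 1920x1440 colour image, the intrinsics must be scaled down by the same resolution ratio; using full-resolution intrinsics with depth-map pixel coordinates shrinks computed widths in exactly this way.

import simd

// Back-project pixel (u, v) with depth d (metres) through intrinsics into camera space.
func backProject(u: Float, v: Float, d: Float,
                 fx: Float, fy: Float, cx: Float, cy: Float) -> SIMD3<Float> {
    SIMD3<Float>(d * (u - cx) / fx, d * (v - cy) / fy, d)
}

// Example with the posted intrinsics and two hypothetical pixels 249 columns apart:
let fx: Float = 1464.9269, fy: Float = 1464.9269
let cx: Float = 960.94916, cy: Float = 686.3547
let p1 = backProject(u: 836, v: 700, d: 0.5, fx: fx, fy: fy, cx: cx, cy: cy)
let p2 = backProject(u: 1085, v: 700, d: 0.5, fx: fx, fy: fy, cx: cx, cy: cy)
let width = simd_distance(p1, p2)   // ≈ 0.085 m, i.e. the expected 8.5 cm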
5 replies · 0 boosts · 1.9k views · Jun ’24
Reality Converter: problem converting usdz file with multiple animations
Hello, I created a simple cube in Blender with 2 animations: one moves the cube up and down, and the second rotates it in place. I exported the file in glb format and tried to convert it using Reality Converter, but unfortunately I can only see 1 animation. Is there a limitation in Reality Converter? Can I include more than 1 animation? The original glb file has both animations inside; I checked it using an online glb viewer and there is no problem, both animations are there. The converter unfortunately sees only the last one created. Any reason or explanation? I believe it is a limitation of Reality Converter. Regards
3 replies · 1 boost · 1.7k views · Sep ’23
ARKit: Couldn't load AR reference object from URL or AR Resource Group
I have an .arobject file that's already been tested and works perfectly in Reality Composer as an anchor. But for whatever reason, when I try it both as an AR Resource Group in my assets and by loading it directly from the URL, it always fails (returns nil). I've double-checked all the file/group names and they seem fine; I couldn't find the error, it's just always nil. This is my code:

var referenceObject: ARReferenceObject?
if let referenceObjects = ARReferenceObject.referenceObjects(inGroupNamed: "TestAR", bundle: Bundle.main) {
    referenceObject = referenceObjects[referenceObjects.startIndex]
}
if let referenceObject = referenceObject {
    delegate.didFinishScan(referenceObject, false)
} else {
    do {
        if let url = Bundle.main.url(forResource: "Dragonball1", withExtension: "arobject") {
            referenceObject = try ARReferenceObject(archiveURL: url)
        }
    } catch let myError {
        let error = myError as NSError
        print("try \(error.code)")
    }
}

Any idea? Thanks
1 reply · 1 boost · 517 views · Jun ’24
RealityKit sometimes renders using only the green channel
Hi, we're working on an app that uses RealityKit to present products in the customer's home. We have a bug where, every once in a while (roughly 1 in 100 runs), all entities are rendered using only the green channel. It seems to occur in all entities in the ARView, regardless of model or material type. Due to the flaky nature of this bug I've found it really hard to debug, and I can't rule out an internal issue in RealityKit. Did anyone run into similar issues, or have any hints on where to look for the culprit?
6 replies · 0 boosts · 1.5k views · Sep ’23
RealityComposer asset not showing up in test app OR Apple's own basic AR file project
Hi, I am not sure what is going on... I have been working on this model for a while in Reality Composer and had no problem testing it that way; it always worked perfectly. So I imported the file into a brand new Xcode project: I created a new AR App and used SwiftUI. I actually did it twice, and also tested the version Apple ships with the box. In Apple's version, the app appears but the part where it tries to detect planes didn't show up. So I am confused. I found a question that mentions the error messages I am getting, but I am not sure how to get around it: https://developer.apple.com/forums/thread/691882

//
// ContentView.swift
// AppToTest-02-14-23
//
// Created by M on 2/14/23.
//
import SwiftUI
import RealityKit

struct ContentView: View {
    var body: some View {
        return ARViewContainer().edgesIgnoringSafeArea(.all)
    }
}

struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        // Load the "Box" scene from the "Experience" Reality File
        //let boxAnchor = try! Experience.loadBox()
        let anchor = try! MyAppToTest.loadFirstScene()
        // Add the box anchor to the scene
        arView.scene.anchors.append(anchor)
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}

#if DEBUG
struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}
#endif

This is what I get at the bottom:

2023-02-14 17:14:53.630477-0500 AppToTest-02-14-23[21446:1307215] Metal GPU Frame Capture Enabled
2023-02-14 17:14:53.631192-0500 AppToTest-02-14-23[21446:1307215] Metal API Validation Enabled
2023-02-14 17:14:54.531766-0500 AppToTest-02-14-23[21446:1307215] [AssetTypes] Registering library (/System/Library/PrivateFrameworks/CoreRE.framework/default.metallib) that already exists in shader manager. Library will be overwritten.
2023-02-14 17:14:54.716866-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/suFeatheringCreateMergedOcclusionMask.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.743580-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arKitPassthrough.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.744961-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/drPostAndComposition.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.745988-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arSegmentationComposite.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.747245-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute0.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.748750-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute1.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.749140-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute2.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.761189-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute3.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.761611-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute4.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.761983-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute5.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.762604-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute6.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.763575-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute7.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.764859-0500 AppToTest-02-14-23[21446:1307215] [Foundation.Serialization] Json Parse Error line 18: Json Deserialization; unknown member 'EnableARProbes' - skipping.
2023-02-14 17:14:54.764902-0500 AppToTest-02-14-23[21446:1307215] [Foundation.Serialization] Json Parse Error line 20: Json Deserialization; unknown member 'EnableGuidedFilterOcclusion' - skipping.
2023-02-14 17:14:55.531748-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534559-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534633-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534680-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534733-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534777-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534825-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534871-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534955-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:56.207438-0500 AppToTest-02-14-23[21446:1307383] [Technique] ARWorldTrackingTechnique <0x1149cd900>: World tracking performance is being affected by resource constraints [2]
2023-02-14 17:17:15.741931-0500 AppToTest-02-14-23[21446:1307414] [Technique] ARWorldTrackingTechnique <0x1149cd900>: World tracking performance is being affected by resource constraints [1]
2023-02-14 17:22:07.075990-0500 AppToTest-02-14-23[21446:1308137] [Technique] ARWorldTrackingTechnique <0x1149cd900>: World tracking performance is being affected by resource constraints [1]
3 replies · 0 boosts · 1.5k views · Mar ’24
Cannot pause and deallocate ARSession using SwiftUI and ARKit
Hi, I'm developing an AR app using Apple ARKit. At the moment, in my AugmentedView I'm using the boilerplate provided by Apple's AR App template (chosen as the initial project type). When I run my app in debug mode, I see this warning/advice in the console:

ARSession is being deallocated without being paused. Please pause running sessions explicitly.

Is it possible to pause an ARSession in SwiftUI? As far as I know, all this stuff is managed by default by the OS; is that correct? Note that I have two views: a parent view and, via a NavigationStack / NavigationLink, the child "subView". Here is my code snippet for the parent view:

import SwiftUI

struct ArIntroView: View {
    var body: some View {
        NavigationStack {
            NavigationLink(destination: AugmentedView(), label: {
                HStack {
                    Text("Go to ARView")
                }
                .padding()
            })
        }
    }
}

struct ArIntroView_Previews: PreviewProvider {
    static var previews: some View {
        ArIntroView()
    }
}

Here is my code for the child view:

import SwiftUI
import RealityKit

struct AugmentedView: View {
    @State private var showingSheet = false
    // add state to try to explicitly end AR
    // @State private var isExitTriggered: Bool = false

    var body: some View {
        ZStack {
            ARViewContainer()
                // hide the toolbar in this View
                .toolbar(.hidden, for: .tabBar)
                .ignoresSafeArea(.all)
            VStack {
                Spacer()
                HStack {
                    Button {
                        // toggle the bottom sheet
                        showingSheet.toggle()
                    } label: {
                        Text("Menu")
                            .frame(maxWidth: .infinity)
                            .padding()
                    }
                    .background()
                    .foregroundColor(.black)
                    .cornerRadius(10)
                    // present the bottom sheet
                    .sheet(isPresented: $showingSheet) {
                        HStack {
                            Button {
                                // dismiss the bottom sheet
                                showingSheet.toggle()
                            } label: {
                                Label("Exit", systemImage: "cross")
                            }
                        }
                        .presentationDetents([.medium])
                        .presentationDragIndicator(.visible)
                    }
                }
                .padding()
            }
            Spacer()
        }
    }
}

struct ARViewContainer: UIViewRepresentable {
    // @Binding var isExitTriggered: Bool
    let arView = ARView(frame: .zero)

    func makeUIView(context: Context) -> ARView {
        // Load the "Box" scene from the "Experience" Reality File
        let boxAnchor = try! Experience.loadBox()
        // Add the box anchor to the scene
        arView.scene.anchors.append(boxAnchor)
        return arView
    }

    // doesn't work: it removes the View and then re-creates it from makeUIView
    func updateUIView(_ uiView: ARView, context: Context) {
        // if isExitTriggered {
        //     uiView.session.pause()
        //     uiView.removeFromSuperview()
        // }
    }
}

#if DEBUG
struct Augmented_Previews: PreviewProvider {
    static var previews: some View {
        AugmentedView()
    }
}
#endif

In addition, when running the app and checking performance with Apple Instruments, I see memory leaks during the AR session, apparently involving the CoreRE library. Any suggestion or critique is welcome.
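For reference, a minimal sketch of one hook SwiftUI does provide for pausing explicitly: UIViewRepresentable's static dismantleUIView(_:coordinator:), which SwiftUI calls when it discards the underlying UIView. The container name is illustrative, not from the post above:

import SwiftUI
import RealityKit

struct PausingARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        ARView(frame: .zero)
    }

    func updateUIView(_ uiView: ARView, context: Context) {}

    // Called when SwiftUI tears the view down, e.g. on navigating back;
    // pausing here avoids the deallocation warning.
    static func dismantleUIView(_ uiView: ARView, coordinator: ()) {
        uiView.session.pause()
    }
}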
2 replies · 0 boosts · 1.1k views · Sep ’23