I'm using the standard RealityKit template essentially out of the box, with a custom session configuration, in my SwiftUI + RealityKit project:
func makeUIView(context: Context) -> ARView {
    let arView = ARView(frame: .zero, cameraMode: .ar, automaticallyConfigureSession: false)

    let config = ARWorldTrackingConfiguration()
    config.sceneReconstruction = .meshWithClassification
    config.isAutoFocusEnabled = true

    // Check support for the same frame semantics we are about to enable.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        config.frameSemantics = .personSegmentationWithDepth
    }

    arView.environment.sceneUnderstanding.options.insert([.occlusion, .collision, .physics])
    arView.session.run(config)

    // Load the "Box" scene from the "Experience" Reality File.
    let boxAnchor = try! Experience.loadBox()

    // Add the box anchor to the scene.
    arView.scene.anchors.append(boxAnchor)

    return arView
}
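For reference, this is how I turn on the mesh debug visualization mentioned below (a minimal sketch; `.showSceneUnderstanding` is the ARView debug option that draws the LiDAR-reconstructed mesh, and the extra `.showWorldOrigin` line is just my own debugging aid):

```swift
import ARKit
import RealityKit

func enableMeshDebugging(on arView: ARView) {
    // Draws the reconstructed scene mesh as a colored wireframe overlay.
    // Only produces output when sceneReconstruction is enabled on the running configuration.
    arView.debugOptions.insert(.showSceneUnderstanding)
    // Optionally also show the world origin axes while debugging.
    arView.debugOptions.insert(.showWorldOrigin)
}
```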
And in my AppDelegate.swift:
func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
    guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) else {
        fatalError("Scene reconstruction requires a device with a LiDAR Scanner.")
    }

    // Create the SwiftUI view that provides the window contents.
    let contentView = ContentView()

    // Use a UIHostingController as the window's root view controller.
    let window = UIWindow(frame: UIScreen.main.bounds)
    window.rootViewController = UIHostingController(rootView: contentView)
    self.window = window
    window.makeKeyAndVisible()

    return true
}
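(As an aside: if crashing at launch on non-LiDAR devices ever becomes undesirable, the guard could be replaced by a feature check that simply skips scene reconstruction. This is a sketch of that alternative, not what my app currently does:)

```swift
import ARKit

// Sketch: enable scene reconstruction only where the hardware supports it,
// instead of terminating at launch on non-LiDAR devices.
func makeConfiguration() -> ARWorldTrackingConfiguration {
    let config = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
        config.sceneReconstruction = .meshWithClassification
    }
    return config
}
```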
The only change in the AppDelegate is the guard statement. The app launches fine with no crash, but the "Box" asset never appears. Even when I enable debug options so the reconstructed meshes are drawn, nothing shows up. When I skip the arView.session.run(config) call, the asset loads perfectly. What am I doing wrong?