Hello.
I am currently building an app using ARKit.
For the UI, I am using SwiftUI with NavigationStack + NavigationLink for navigation and screen transitions!
The app needs to go back and forth between the AR screen and other screens.
If the number of screen transitions is small, this is not a problem.
However, once the number of transitions grows to around 10 or 20, the app crashes at some point.
I am struggling with this problem. (The nature of the application requires many screen transitions.)
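For reference, the navigation is set up roughly like this (a simplified sketch with placeholder names such as `RootView`, `HomeView`, and `MeasureScreen`, not my actual code):

```swift
import SwiftUI

// Simplified sketch of the navigation structure (placeholder names).
struct RootView: View {
    var body: some View {
        NavigationStack {
            HomeView()
        }
    }
}

struct HomeView: View {
    var body: some View {
        List {
            // The AR measuring screen (it hosts the UIViewRepresentable shown
            // further below). The user pushes and pops this screen repeatedly.
            NavigationLink("Measure") {
                MeasureScreen()
            }
            NavigationLink("Settings") {
                Text("Settings screen")
            }
        }
        .navigationTitle("Home")
    }
}

// Placeholder for the AR screen; the real screen embeds the
// MeasureARViewContainer shown later in this post.
struct MeasureScreen: View {
    var body: some View {
        Text("AR measuring screen")
    }
}
```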
The crash log showed the following.
```
error: read memory from 0x1e387f2d4 failed

Incident Identifier: B23D806E-D578-4A95-8828-2A1E8D6BB7F8
Beta Identifier:     924A85AB-441C-41A7-9BC2-063940BDAF32
Hardware Model:      iPhone16,1
Process:             AR_Crash_Sample [2375]
Path:                /private/var/containers/Bundle/Application/FAC3D662-DB10-434E-A006-79B9515D8B7A/AR_Crash_Sample.app/AR_Crash_Sample
Identifier:          ar.crash.sample.AR.Crash.Sample
Version:             1.0 (1)
AppStoreTools:       16C7015
AppVariant:          1:iPhone16,1:18
Beta:                YES
Code Type:           ARM-64 (Native)
Role:                Foreground
Parent Process:      launchd [1]
Coalition:           ar.crash.sample.AR.Crash.Sample [1464]

Date/Time:           2025-03-07 11:59:14.3691 +0900
Launch Time:         2025-03-07 11:57:47.3955 +0900
OS Version:          iPhone OS 18.3.1 (22D72)
Release Type:        User
Baseband Version:    2.40.05
Report Version:      104

Exception Type:      EXC_CRASH (SIGABRT)
Exception Codes:     0x0000000000000000, 0x0000000000000000
Termination Reason:  SIGNAL 6 Abort trap: 6
Terminating Process: AR_Crash_Sample [2375]

Triggered by Thread: 7

Application Specific Information:
abort() called

Thread 7 name:  Dispatch queue: com.apple.arkit.depthtechnique
Thread 7 Crashed:
0   libsystem_kernel.dylib        0x1e387f2d4 __pthread_kill + 8
1   libsystem_pthread.dylib       0x21cedd59c pthread_kill + 268
2   libsystem_c.dylib             0x199f98b08 abort + 128
3   libc++abi.dylib               0x21ce035b8 abort_message + 132
4   libc++abi.dylib               0x21cdf1b90 demangling_terminate_handler() + 320
5   libobjc.A.dylib               0x18f6c72d4 _objc_terminate() + 172
6   libc++abi.dylib               0x21ce0287c std::__terminate(void (*)()) + 16
7   libc++abi.dylib               0x21ce02820 std::terminate() + 108
8   libdispatch.dylib             0x199edefbc _dispatch_client_callout + 40
9   libdispatch.dylib             0x199ee65cc _dispatch_lane_serial_drain + 768
10  libdispatch.dylib             0x199ee7158 _dispatch_lane_invoke + 432
11  libdispatch.dylib             0x199ee85c0 _dispatch_workloop_invoke + 1744
12  libdispatch.dylib             0x199ef238c _dispatch_root_queue_drain_deferred_wlh + 288
13  libdispatch.dylib             0x199ef1bd8 _dispatch_workloop_worker_thread + 540
14  libsystem_pthread.dylib       0x21ced8680 _pthread_wqthread + 288
15  libsystem_pthread.dylib       0x21ced6474 start_wqthread + 8
```
Perhaps I am using too much memory?
How can I address this problem?
For the AR functionality, I am using UIViewRepresentable to wrap the UIKit-based AR view so that it can be used from SwiftUI:
```swift
import ARKit
import AsyncAlgorithms
import AVFoundation
import SCNLine
import SwiftUI

internal struct MeasureARViewContainer: UIViewRepresentable {
    @Binding var tapCount: Int
    @Binding var distance: Double?
    @Binding var currentIndex: Int

    var focusSquare: FocusSquare = FocusSquare()
    let coachingOverlay: ARCoachingOverlayView = ARCoachingOverlayView()

    func makeUIView(context: Context) -> ARSCNView {
        let arView: ARSCNView = ARSCNView()
        arView.delegate = context.coordinator

        // Configure and start the AR session with plane detection and scene depth.
        let configuration: ARWorldTrackingConfiguration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal, .vertical]
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            configuration.frameSemantics = [.sceneDepth, .smoothedSceneDepth]
        }
        arView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])

        context.coordinator.sceneView = arView
        context.coordinator.scanTarget()

        // Coaching overlay that guides the user until a plane is found.
        coachingOverlay.session = arView.session
        coachingOverlay.delegate = context.coordinator
        coachingOverlay.goal = .horizontalPlane
        coachingOverlay.activatesAutomatically = true
        coachingOverlay.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        coachingOverlay.translatesAutoresizingMaskIntoConstraints = false
        arView.addSubview(coachingOverlay)

        return arView
    }

    func updateUIView(_ _: ARSCNView, context: Context) {
        context.coordinator.mode = MeasurementMode(rawValue: currentIndex) ?? .width

        if tapCount == 0 {
            context.coordinator.resetMeasurement()
            return
        }
        if distance != nil {
            return
        }
        DispatchQueue.main.async {
            if context.coordinator.distance == nil {
                context.coordinator.handleTap()
            }
        }
    }

    // Tear down the AR session when SwiftUI removes the view.
    static func dismantleUIView(_ uiView: ARSCNView, coordinator: Coordinator) {
        uiView.session.pause()
        coordinator.stopScanTarget()
        coordinator.stopSpeech()
        DispatchQueue.main.async {
            uiView.removeFromSuperview()
        }
    }

    func makeCoordinator() -> Coordinator {
        Coordinator(self)
    }

    class Coordinator: NSObject, ARSCNViewDelegate, ARSessionDelegate, ARCoachingOverlayViewDelegate {
        var parent: MeasureARViewContainer
        var sceneView: ARSCNView?
        var startPosition: SCNVector3?
        var pointedCount: Int = 0
        var distance: Float?
        var mode: MeasurementMode = .width
        let synthesizer: AVSpeechSynthesizer = AVSpeechSynthesizer()
        var scanTargetTask: Task<Void, Never>?
        var currentResult: ARRaycastResult?

        init(_ parent: MeasureARViewContainer) {
            self.parent = parent
        }

        // ... etc
    }
}
```
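The AR screen then embeds this container roughly as follows (again a simplified sketch; the `MeasureScreen` name and the `@State` property names are only for illustration):

```swift
import SwiftUI

// Simplified sketch of how the AR screen hosts the container
// (names here are illustrative, not the actual screen in my app).
struct MeasureScreen: View {
    @State private var tapCount: Int = 0
    @State private var distance: Double?
    @State private var currentIndex: Int = 0

    var body: some View {
        MeasureARViewContainer(
            tapCount: $tapCount,
            distance: $distance,
            currentIndex: $currentIndex
        )
        .ignoresSafeArea()
    }
}
```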