WorldTrackingProvider stops working on device

After re-launching the immersive space in my app 5-10 times, the WorldTrackingProvider stops working. Only restarting the app will allow it to start working again.

This only happens on device, not in the simulator.

I get these errors when it happens:

The device_anchor can only be queried when the world tracking provider is running.

ARPredictorRemoteService <0x107cbb5e0>: Service configured with error: Error Domain=com.apple.arkit.error Code=501 "(null)"

Remote Service was invalidated: <ARPredictorRemoteService: 0x107cbb5e0>, will stop all data_providers.

ARRemoteService: remote object proxy failed with error: Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service with pid 81 named com.apple.arkit.service.session was invalidated from this process." UserInfo={NSDebugDescription=The connection to service with pid 81 named com.apple.arkit.service.session was invalidated from this process.}

ARRemoteService: weak self released before invalidation

import ARKit
import QuartzCore
import SwiftUI

@Observable class VisionPro {
    let session = ARKitSession()
    let worldTracking = WorldTrackingProvider()

    func transformMatrix() async -> simd_float4x4 {
        // Fall back to the identity matrix when no device anchor is available.
        guard let deviceAnchor = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime())
        else { return .init() }
        return deviceAnchor.originFromAnchorTransform
    }

    func runArkitSession() async {
        try? await session.run([worldTracking])
    }
}

which I call from my RealityView:

.task {
    await visionPro.runArkitSession()
}

Hi @Appyou234

The system stops or pauses tracking for various reasons you can't control. To fix this error:

  • Check whether the provider is running using WorldTrackingProvider.state.
  • If it's not running, create a new provider and run it (see the sketch below).
  • Make sure the task modifier does not run until an immersive space is open.
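
Here's a minimal sketch of that recovery path, based on the VisionPro class above. The ensureRunning() helper and the replaceable session/provider properties are my own illustration rather than a required pattern; the only framework calls involved are ARKitSession.run(_:), WorldTrackingProvider.state, and queryDeviceAnchor(atTimestamp:):

import ARKit
import QuartzCore
import SwiftUI

@Observable class VisionPro {
    // Declared as var so both can be replaced after tracking stops:
    // a provider instance can't be run again once it has stopped.
    private var session = ARKitSession()
    private var worldTracking = WorldTrackingProvider()

    // Hypothetical helper: start tracking, or restart it with fresh
    // instances if the provider has stopped.
    func ensureRunning() async {
        switch worldTracking.state {
        case .initialized:
            try? await session.run([worldTracking])
        case .stopped:
            session = ARKitSession()
            worldTracking = WorldTrackingProvider()
            try? await session.run([worldTracking])
        default:
            break
        }
    }

    func transformMatrix() async -> simd_float4x4 {
        await ensureRunning()
        guard let deviceAnchor = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime())
        else { return .init() }
        return deviceAnchor.originFromAnchorTransform
    }
}

Because the .task in your code is attached to a view presented inside the immersive space, it won't start until that space is open, which takes care of the third point.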

Please follow up if that doesn't fix the issue.

I'm having the same issue: world tracking is initialized, but it stops running.

I also have scene reconstruction and hand tracking, but I had both commented out.

Tracking script

import ARKit
import QuartzCore
import RealityKit
import SwiftUI

@Observable
@MainActor class Tracking {
    let arSession = ARKitSession()
    let handTracking = HandTrackingProvider()
    let worldTracking = WorldTrackingProvider()
    let sceneReconstruction = SceneReconstructionProvider()
    var latestLeftHand: HandAnchor?
    var latestRightHand: HandAnchor?
    var headAnchor: DeviceAnchor?
    var meshAnchors = Entity()

    func startTracking() async {
        guard WorldTrackingProvider.isSupported else {
            print("WorldTrackingProvider is not supported on this device.")
            return
        }
        // MeshAnchorGenerator is a custom type defined elsewhere in the project.
        let meshGenerator = MeshAnchorGenerator(root: meshAnchors)
        do {
            print("head tracking \(worldTracking.state)")
            try await arSession.run([worldTracking])
            // try await arSession.run([handTracking, worldTracking, sceneReconstruction])
        } catch let error as ARKitSession.Error {
            print("Encountered an error while running providers: \(error.localizedDescription)")
        } catch {
            print("Encountered an unexpected error: \(error.localizedDescription)")
        }

        if worldTracking.state != .running {
            print("world tracking not running, stop the session.")
        }

        headAnchor = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime())
        print("start head tracking")

        await meshGenerator.run(sceneReconstruction)
        print("start scene reconstruction")

        for await anchorUpdate in handTracking.anchorUpdates {
            switch anchorUpdate.anchor.chirality {
            case .left:
                self.latestLeftHand = anchorUpdate.anchor
            case .right:
                self.latestRightHand = anchorUpdate.anchor
            }
        }
    }
}

ImmersiveView

import SwiftUI
import RealityKit
import RealityKitContent
import ARKit

struct ImmersiveView: View {
    @Environment(AppModel.self) var appModel
    @Environment(Tracking.self) var tracking

    var body: some View {
        let meshAnchors = Entity()
        RealityView { content in
            // Add the initial RealityKit content
            // content.add(meshAnchors)
            let sphere = ModelEntity()
            if let mat = try? await ShaderGraphMaterial(
                named: "/Root/Material",
                from: "Blur",
                in: realityKitContentBundle
            ) {
                sphere.components.set([
                    ModelComponent(
                        mesh: .generateSphere(radius: 1),
                        materials: [mat]),
                    CollisionComponent(shapes: [
                        ShapeResource.generateSphere(radius: 0.02)
                    ], mode: .trigger)
                ])
            }
            let root = Entity()
            root.addChild(sphere)
            content.add(root)

            // ClosureComponent and ConvertTrackingTransform are custom types
            // defined elsewhere in the project.
            root.components.set(ClosureComponent(closure: { deltaTime in
                let translation = ConvertTrackingTransform
                    .GetHeadPosition(tracking.headAnchor)
                sphere.transform.translation = translation
                // print("Head position \(translation)")
            }))
        }
        .task {
            await tracking.startTracking()
        }
    }
}

App

        ImmersiveSpace(id: appModel.immersiveSpaceID) {
            ImmersiveView()
                .environment(appModel)
                .environment(Tracking())
                .onAppear {
                    appModel.immersiveSpaceState = .open
                    avPlayerViewModel.play()
                }
                .onDisappear {
                    appModel.immersiveSpaceState = .closed
                    avPlayerViewModel.reset()
                }
        }
        .immersionStyle(selection: .constant(.mixed), in: .full, .mixed)
    }

I realized that having headAnchor = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) right after running the AR session is problematic: as the error above says, the device anchor can only be queried while the provider is running, and immediately after session.run it may not be yet. I moved the query into the ClosureComponent in my immersive view, which runs every frame, and it works fine now:

root.components.set(ClosureComponent(closure: { deltaTime in
    let headAnchor = tracking.worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime())
    let translation = ConvertTrackingTransform
        .GetHeadPosition(headAnchor)
    sphere.transform.translation = translation
    // print("Head position \(translation)")
}))

I also got scene reconstruction to work at the same time.
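
For anyone following along, running all three providers together looks like the line that was commented out in the tracking script above; a minimal sketch, assuming the same arSession and provider properties:

do {
    // One session lifecycle for hand tracking, world tracking,
    // and scene reconstruction.
    try await arSession.run([handTracking, worldTracking, sceneReconstruction])
} catch {
    print("Encountered an error while running providers: \(error.localizedDescription)")
}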
