Grab frames in Vision Pro using ARFrame

I'd like to grab the current camera frame in visionOS. I have a Swift file (I'm new to Swift) that looks like this:

import ARKit
import SwiftUI

class ARSessionManager: NSObject, ObservableObject, ARSessionDelegate {
    var arSession: ARSession
    
    override init() {
        arSession = ARSession()
        super.init()
        arSession.delegate = self
    }
    
    func startSession() {
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        arSession.run(configuration)
    }
    
    // ARSessionDelegate method to capture frames
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Process the frame, e.g., capture image data
    }
}

and I get errors including "Cannot find type 'ARSessionDelegate' in scope". Help? Is ARFrame called something different for Vision Pro?

As far as I know, there is no third-party camera access on the Vision Pro. To use ARKit, your app has to be in an ImmersiveSpace (rather than the shared space). `ARSessionDelegate` and `ARFrame` are part of the iOS/iPadOS ARKit API and aren't available on visionOS, which is why the compiler can't find the type. Instead, ARKit data comes through providers for things like anchors, planes, scene reconstruction, and hand tracking. You can get meshes of the geometry of the room the Vision Pro is seeing, but no camera images.

Check out the cube demo from the ARKit WWDC23 videos for good examples of ARKit and hand tracking.
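To illustrate the provider model, here is a minimal sketch of the visionOS-style equivalent of your manager class, using `ARKitSession` with `SceneReconstructionProvider` and `HandTrackingProvider` and consuming anchor updates as an async sequence. The class and method names (`SessionManager`, `start`) are just illustrative; it assumes your app has already opened an ImmersiveSpace before the session runs:

```swift
import ARKit  // On visionOS this exposes ARKitSession and data providers, not ARSession/ARFrame

@MainActor
class SessionManager: ObservableObject {
    let session = ARKitSession()
    let sceneReconstruction = SceneReconstructionProvider()
    let handTracking = HandTrackingProvider()

    func start() async {
        do {
            // Only works while the app is presenting an ImmersiveSpace.
            try await session.run([sceneReconstruction, handTracking])
        } catch {
            print("Failed to start ARKitSession: \(error)")
            return
        }

        // Room geometry arrives as mesh anchors; there is no camera-frame API here.
        for await update in sceneReconstruction.anchorUpdates {
            let meshAnchor = update.anchor
            print("Mesh anchor \(meshAnchor.id), event: \(update.event)")
        }
    }
}
```

Note that hand tracking additionally requires user authorization (and the corresponding usage-description key in your Info.plist) before the provider delivers data.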
