To what extent does ARKit face tracking still rely on the TrueDepth camera?

I'm exploring face tracking and experimenting with ARKit's ARSCNFaceGeometry face mesh. I'm running a minimal demo app on the latest 11-inch iPad Pro (M4); the code is provided below.

I've heard that Apple still offers some of the best face tracking technology on consumer devices, largely because it is one of the few vendors that combines depth and image data. Both a colleague and I tested the demo, and while it works as well as or better than some other solutions we tried, we weren't particularly impressed compared to Google's MediaPipe or Nvidia's Maxine, both of which rely solely on image data without depth. In our case, the ARKit face mesh doesn't always align well with the chin, and as the face rotates, some vertices shift by up to a centimeter from their original positions.
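
For context, here is a rough sketch of how one could quantify that kind of drift by comparing the anchor's vertex positions against a baseline. The vertices are expressed in the face's local coordinate space, so a purely rigid head rotation should not move them; the monitor class and the baseline-capture approach below are illustrative assumptions, not part of our demo:

import ARKit
import simd

final class VertexDriftMonitor {
    private var baseline: [SIMD3<Float>]?

    // Capture the first frame's vertices as a baseline, then report the
    // largest per-vertex displacement (in meters) on subsequent frames.
    // Note: expression changes also move vertices, so the baseline should
    // be captured with a neutral, front-facing expression.
    func maxDriftMeters(for faceAnchor: ARFaceAnchor) -> Float {
        let vertices = faceAnchor.geometry.vertices
        guard let baseline, baseline.count == vertices.count else {
            self.baseline = vertices
            return 0
        }
        return zip(vertices, baseline)
            .map { simd_distance($0, $1) }
            .max() ?? 0 // 0.01 corresponds to 1 cm
    }
}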

This led us to question whether our demo app was using the TrueDepth sensor at all. To test this, we punched a small hole in a piece of cardboard and taped it over the sensor array, leaving only the camera exposed. On the iOS lock screen this blocks Face ID, but we still get a clear image from the camera. With the TrueDepth sensor covered, face mesh tracking in our app still worked, and honestly, we couldn't detect a meaningful difference in tracking quality with or without the sensor obscured.
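
One way to check whether ARKit is delivering TrueDepth frames at all is to inspect ARFrame.capturedDepthData from the session delegate. A minimal sketch, assuming the delegate method is added to the controller and sceneView.session.delegate = self is set (as in the controller below):

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // capturedDepthData is nil when no depth frame accompanies this camera
    // image; the TrueDepth camera delivers depth at a lower rate than video,
    // so occasional nils are normal even when the sensor is working.
    if let depth = frame.capturedDepthData {
        let map = depth.depthDataMap
        print("depth frame: \(CVPixelBufferGetWidth(map)) x \(CVPixelBufferGetHeight(map))")
    } else {
        print("no depth data on this frame")
    }
}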

Could we be setting up the face tracking configuration incorrectly? Or has face tracking in newer versions of iOS become less dependent on the TrueDepth sensor?

The controller:

import SwiftUI
import ARKit

struct FaceTrackingView1: UIViewControllerRepresentable {
    func makeUIViewController(context: Context) -> FaceTrackingViewController1 {
        return FaceTrackingViewController1()
    }

    func updateUIViewController(_ uiViewController: FaceTrackingViewController1, context: Context) {
    }
}

class FaceTrackingViewController1: UIViewController, ARSCNViewDelegate, ARSessionDelegate {
    var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()

        sceneView = ARSCNView(frame: view.bounds)
        sceneView.delegate = self
        sceneView.session.delegate = self // Needed for ARSessionDelegate callbacks
        sceneView.automaticallyUpdatesLighting = true
        view.addSubview(sceneView)

        // Face tracking is only available on devices with a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }

        let config = ARFaceTrackingConfiguration()
        sceneView.session.run(config)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }

    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        // ARSCNFaceGeometry(device:) is failable, so avoid force unwrapping.
        guard anchor is ARFaceAnchor,
              let device = sceneView.device,
              let faceGeometry = ARSCNFaceGeometry(device: device) else { return nil }

        let faceNode = SCNNode(geometry: faceGeometry)
        faceNode.geometry?.firstMaterial?.fillMode = .lines // Render as a wireframe mesh

        return faceNode
    }

    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor,
              let faceGeometry = node.geometry as? ARSCNFaceGeometry else { return }

        faceGeometry.update(from: faceAnchor.geometry)
    }
}

The view:

import SwiftUI

struct ContentView: View {
    @State private var isFaceTrackingActive = false
    
    var body: some View {
        VStack {
            Text("Face mesh tracking demo")
                .font(.title)
                .padding()
            
            Button(action: {
                isFaceTrackingActive.toggle()
            }) {
                Text("Start Face Tracking")
                    .font(.title2)
                    .padding()
                    .background(Color.blue)
                    .foregroundColor(.white)
                    .cornerRadius(10)
            }        
            .fullScreenCover(isPresented: $isFaceTrackingActive) {
                FaceTrackingView1()
            }
        }
        .padding()
    }
}

#Preview {
    ContentView()
}
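
One note on running the demo: the app needs an NSCameraUsageDescription entry in Info.plist, and since face tracking is not supported on every device, a small preflight check can gate the button. A sketch, where the helper name is our own:

import ARKit
import AVFoundation

// Hypothetical helper: true only if the device supports face tracking and
// camera access has not been explicitly denied by the user.
func canStartFaceTracking() -> Bool {
    guard ARFaceTrackingConfiguration.isSupported else { return false }
    return AVCaptureDevice.authorizationStatus(for: .video) != .denied
}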

Hello @jonas87,

"Could we be setting up the face tracking configuration incorrectly?"

There is no issue with your face tracking configuration setup :)

"Or has face tracking in newer versions of iOS become less dependent on the TrueDepth sensor?"

How ARFaceTrackingConfiguration uses any particular piece of hardware is an implementation detail; it should not make a difference to your app how it achieves its results.
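
What the public API does let you inspect is limited to things like the capture formats available to the configuration, which expose resolution and frame rate but say nothing about depth usage. A purely illustrative sketch:

import ARKit

// Enumerates the capture formats ARFaceTrackingConfiguration can run with.
for format in ARFaceTrackingConfiguration.supportedVideoFormats {
    print("\(format.imageResolution) @ \(format.framesPerSecond) fps")
}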

Best regards,

Greg
