A configuration that tracks the movement and expressions of the user’s face with the TrueDepth camera.


class ARFaceTrackingConfiguration : ARConfiguration


A face tracking configuration detects the user’s face in view of the device’s front-facing TrueDepth camera. When running this configuration, an AR session adds an ARFaceAnchor object to its list of anchors whenever the user’s face is visible in the front-facing camera image. Each face anchor provides information about the face’s position and orientation, its topology, and features that describe facial expressions.
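The flow above can be sketched in a view controller that runs the configuration and observes face anchors through the session delegate; the class name and print statements here are illustrative, not part of the ARKit API.

```swift
import UIKit
import ARKit

class FaceTrackingViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self

        // Face tracking requires a TrueDepth camera; check support before running.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.run(ARFaceTrackingConfiguration())
    }

    // The session adds an ARFaceAnchor when it first detects the user's face.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            print("Face detected at transform: \(faceAnchor.transform)")
        }
    }

    // Subsequent updates carry the face's current pose, topology, and expression.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // blendShapes describes facial expressions, e.g. how far the jaw is open.
            let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0
            print("jawOpen coefficient: \(jawOpen)")
        }
    }
}
```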

The ARFaceTrackingConfiguration class provides no methods or properties of its own, but supports all properties inherited from its superclass, ARConfiguration. Additionally, when you enable the isLightEstimationEnabled setting, a face tracking configuration uses the detected face as a light probe and provides an estimate of directional or environmental lighting (an ARDirectionalLightEstimate object).
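A brief sketch of enabling light estimation and reading the resulting directional estimate from a frame; the helper function is hypothetical, but the properties it reads are part of ARDirectionalLightEstimate.

```swift
import ARKit

// Enable light estimation so the session uses the detected face as a light probe.
let configuration = ARFaceTrackingConfiguration()
configuration.isLightEstimationEnabled = true

// In a face tracking session, the frame's lightEstimate can be downcast to
// ARDirectionalLightEstimate to read directional lighting details.
func logLighting(for frame: ARFrame) {
    if let estimate = frame.lightEstimate as? ARDirectionalLightEstimate {
        print("Primary light direction: \(estimate.primaryLightDirection)")
        print("Primary light intensity: \(estimate.primaryLightIntensity)")
    }
}
```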


Creating a Configuration


init()

Creates a new face tracking configuration.


Inherits From

ARConfiguration

See Also

Face Tracking

Creating Face-Based AR Experiences

Place and animate 3D content using information provided by a face-tracking AR session.

class ARFaceAnchor

Information about the pose, topology, and expression of a face detected in a face-tracking AR session.

class ARDirectionalLightEstimate

Estimated environmental lighting information associated with a captured video frame in a face-tracking AR session.