A configuration you use when you want to track only faces, using the device's front-facing camera.


@interface ARFaceTrackingConfiguration : ARConfiguration


A face tracking configuration detects the faces that can be seen in the device’s front camera feed. When ARKit detects a face, it creates an ARFaceAnchor object that provides information about the face’s position and orientation, its topology, and features that describe facial expressions.

In addition to the properties it inherits from its superclass, ARConfiguration, the ARFaceTrackingConfiguration class provides properties for enabling world tracking alongside face tracking and for controlling how many faces to track. Additionally, when you enable the lightEstimationEnabled setting, a face tracking configuration uses the detected face as a light probe and provides an estimate of directional or environmental lighting (an ARDirectionalLightEstimate object).
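As a minimal sketch, running a face-tracking session with light estimation enabled might look like the following (the `runFaceTracking(on:)` function name is illustrative, not part of the framework):

```swift
import ARKit

// Illustrative helper: configure and run a face-tracking session.
func runFaceTracking(on session: ARSession) {
    // Face tracking requires a front camera capable of face tracking
    // (for example, a TrueDepth camera); always check support first.
    guard ARFaceTrackingConfiguration.isSupported else { return }

    let configuration = ARFaceTrackingConfiguration()
    // Use the detected face as a light probe for lighting estimates.
    configuration.isLightEstimationEnabled = true
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```

With light estimation enabled, you can cast a frame's `lightEstimate` to `ARDirectionalLightEstimate` in your session delegate to read the directional lighting information.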


Creating a Configuration

- init

Creates a new face-tracking configuration.

+ new

Creates a new face-tracking configuration.

Enabling World Tracking


supportsWorldTracking

A flag that indicates whether the iOS device supports world tracking with face tracking.


worldTrackingEnabled

A Boolean value that enables world tracking in conjunction with face tracking.
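A sketch of opting in to world tracking alongside face tracking, falling back gracefully on devices that don't support the combination (the `makeConfiguration()` function name is illustrative):

```swift
import ARKit

// Illustrative helper: build a face-tracking configuration that also
// tracks the device's pose in world space when the hardware allows it.
func makeConfiguration() -> ARFaceTrackingConfiguration {
    let configuration = ARFaceTrackingConfiguration()
    if ARFaceTrackingConfiguration.supportsWorldTracking {
        // The camera feed remains the front camera; ARKit additionally
        // reports the device's position and orientation in world space.
        configuration.isWorldTrackingEnabled = true
    }
    return configuration
}
```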

Tracking Multiple Faces


maximumNumberOfTrackedFaces

The maximum number of faces to track simultaneously.


supportedNumberOfTrackedFaces

The maximum number of faces that ARKit can simultaneously track.
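A sketch of requesting multiple tracked faces while clamping the request to what the device actually supports (the requested count of 3 is an assumed app requirement):

```swift
import ARKit

let configuration = ARFaceTrackingConfiguration()
let requestedFaceCount = 3  // hypothetical app requirement
// Never ask for more faces than the device can track.
configuration.maximumNumberOfTrackedFaces =
    min(requestedFaceCount, ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces)
```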


Inherits From

ARConfiguration

See Also

Face Tracking

Tracking and Visualizing Faces

Detect faces in a front-camera AR experience, overlay virtual content, and animate facial expressions in real time.


ARFaceAnchor

Information about the pose, topology, and expression of a face that ARKit detects in the front camera feed.