A configuration you use to track faces with the device's front camera.


class ARFaceTrackingConfiguration : ARConfiguration


A face tracking configuration detects faces in the device’s front camera feed. When ARKit detects a face, it creates an ARFaceAnchor object that provides information about the face’s position and orientation, its topology, and features that describe facial expressions.

In addition to the properties inherited from its superclass, ARConfiguration, the ARFaceTrackingConfiguration class provides properties for enabling world tracking and for controlling how many faces to track simultaneously. When you enable the isLightEstimationEnabled setting, a face tracking configuration uses the detected face as a light probe and provides an estimate of directional or environmental lighting (an ARDirectionalLightEstimate object).
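For illustration, a minimal sketch of running a face tracking session with light estimation enabled and reading the resulting directional estimate from each frame. The view controller and its structure are assumptions for the example; the ARKit calls follow the configuration described above.

```swift
import ARKit
import UIKit

class FaceTrackingViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Face tracking is only available on devices with supporting
        // front-camera hardware; always check before running the session.
        guard ARFaceTrackingConfiguration.isSupported else { return }

        let configuration = ARFaceTrackingConfiguration()
        // Use the detected face as a light probe.
        configuration.isLightEstimationEnabled = true
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // With light estimation enabled, the frame's light estimate is
        // an ARDirectionalLightEstimate.
        if let estimate = frame.lightEstimate as? ARDirectionalLightEstimate {
            print(estimate.primaryLightDirection, estimate.primaryLightIntensity)
        }
    }
}
```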


Creating a Configuration


init()

Creates a new face-tracking configuration.

Enabling World Tracking

class var supportsWorldTracking: Bool

A flag that indicates whether the iOS device supports world tracking with face tracking.

var isWorldTrackingEnabled: Bool

A Boolean value that enables world tracking with face tracking.
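A short sketch of how these two members work together: check the class-level capability flag before opting in to world tracking on an instance. The helper function name is an assumption for the example.

```swift
import ARKit

// Build a face tracking configuration that also tracks the device's
// pose in the world, when the current device supports it.
func makeFaceTrackingConfiguration() -> ARFaceTrackingConfiguration {
    let configuration = ARFaceTrackingConfiguration()
    if ARFaceTrackingConfiguration.supportsWorldTracking {
        // The session then reports a world-aligned camera transform
        // alongside face anchors.
        configuration.isWorldTrackingEnabled = true
    }
    return configuration
}
```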

Tracking Multiple Faces

var maximumNumberOfTrackedFaces: Int

The maximum number of faces to track simultaneously.

class var supportedNumberOfTrackedFaces: Int

The maximum number of faces that ARKit can track simultaneously.
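A sketch of the typical pattern for these two members: clamp the number of faces your app wants to track to the platform's supported maximum. The function name and the desired count of three are assumptions for the example.

```swift
import ARKit

// Track as many faces as the app wants, capped at what the
// platform supports.
func makeMultiFaceConfiguration(desiredFaces: Int = 3) -> ARFaceTrackingConfiguration {
    let configuration = ARFaceTrackingConfiguration()
    configuration.maximumNumberOfTrackedFaces =
        min(desiredFaces, ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces)
    return configuration
}
```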


Inherits From

ARConfiguration

Conforms To

See Also

Face Tracking

Tracking and Visualizing Faces

Detect faces in a front-camera AR experience, overlay virtual content, and animate facial expressions in real time.

Combining User Face-Tracking and World Tracking

Track the user’s face in an app that displays an AR experience with the rear camera.

class ARFaceAnchor

Information about the pose, topology, and expression of a face that ARKit detects in the front camera feed.