Class

ARFaceTrackingConfiguration

A configuration you use when you just want to track faces using the device's front camera.

Declaration

@interface ARFaceTrackingConfiguration : ARConfiguration

Overview

A face tracking configuration detects the faces that can be seen in the device’s front camera feed. When ARKit detects a face, it creates an ARFaceAnchor object that provides information about the face’s position and orientation, its topology, and features that describe facial expressions.
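As a sketch of consuming those anchors, the position and orientation arrive in the anchor's transform, and the facial-expression features surface as blend shape coefficients. This assumes an object that adopts ARSessionDelegate and is set as the session's delegate:

```objc
// Illustrative delegate method; assumes this class is the ARSession's delegate.
- (void)session:(ARSession *)session didUpdateAnchors:(NSArray<ARAnchor *> *)anchors {
    for (ARAnchor *anchor in anchors) {
        if ([anchor isKindOfClass:[ARFaceAnchor class]]) {
            ARFaceAnchor *faceAnchor = (ARFaceAnchor *)anchor;

            // Position and orientation of the face in world space.
            simd_float4x4 transform = faceAnchor.transform;
            (void)transform;

            // Expression features, e.g. how far the jaw is open (0.0–1.0).
            NSNumber *jawOpen = faceAnchor.blendShapes[ARBlendShapeLocationJawOpen];
            NSLog(@"jawOpen = %.2f", jawOpen.floatValue);
        }
    }
}
```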

In addition to the properties it inherits from its superclass, ARConfiguration, the ARFaceTrackingConfiguration class provides properties for enabling world tracking alongside face tracking and for tracking multiple faces at once. Additionally, when you enable the lightEstimationEnabled setting, a face tracking configuration uses the detected face as a light probe and provides an estimate of directional or environmental lighting (an ARDirectionalLightEstimate object).
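The setup described above can be sketched as follows; the `session` variable is assumed to be an existing ARSession (for example, one owned by an ARSCNView):

```objc
#import <ARKit/ARKit.h>

// Face tracking requires hardware support on the front camera,
// so check availability before creating the configuration.
if ([ARFaceTrackingConfiguration isSupported]) {
    ARFaceTrackingConfiguration *configuration =
        [[ARFaceTrackingConfiguration alloc] init];

    // Use the detected face as a light probe; the session then reports
    // an ARDirectionalLightEstimate in each frame's light estimate.
    configuration.lightEstimationEnabled = YES;

    [session runWithConfiguration:configuration];
}
```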

Topics

Creating a Configuration

- init

Creates a new face tracking configuration.

Enabling World Tracking

supportsWorldTracking

A Boolean value that indicates whether the iOS device supports world tracking with face tracking.

worldTrackingEnabled

A Boolean value that enables world tracking with face tracking.
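A brief sketch of using these two properties together; because only some devices can combine world tracking with face tracking, check supportsWorldTracking before opting in:

```objc
#import <ARKit/ARKit.h>

ARFaceTrackingConfiguration *configuration =
    [[ARFaceTrackingConfiguration alloc] init];

// World tracking alongside face tracking is available only on some devices.
if (ARFaceTrackingConfiguration.supportsWorldTracking) {
    configuration.worldTrackingEnabled = YES;
}
```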

Tracking Multiple Faces

maximumNumberOfTrackedFaces

The maximum number of faces to track simultaneously.

supportedNumberOfTrackedFaces

The maximum number of faces that ARKit can track simultaneously on the current device.
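These two properties pair naturally: supportedNumberOfTrackedFaces reports the device's limit, and maximumNumberOfTrackedFaces sets how many faces this configuration actually tracks. A minimal sketch of requesting the device maximum:

```objc
#import <ARKit/ARKit.h>

ARFaceTrackingConfiguration *configuration =
    [[ARFaceTrackingConfiguration alloc] init];

// Ask for as many simultaneously tracked faces as the device allows.
configuration.maximumNumberOfTrackedFaces =
    ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces;
```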

Relationships

Inherits From

ARConfiguration

See Also

Face Tracking

Tracking and Visualizing Faces

Detect faces in a camera feed, overlay matching virtual content, and animate facial expressions in real time.

ARFaceAnchor

Information about the pose, topology, and expression of a face that ARKit detects in the front camera feed.