A 3D mesh describing face topology used in face-tracking AR sessions.
SDK
- iOS 11.0+
Framework
- ARKit
Declaration
@interface ARFaceGeometry : NSObject
Overview
This class provides a general model for the detailed topology of a face, in the form of a 3D mesh appropriate for use with various rendering technologies or for exporting 3D assets. (For a quick way to visualize a face geometry using SceneKit, see the ARSCNFaceGeometry class.)
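For example, a minimal sketch of that SceneKit route in Swift (the view controller and outlet names here are illustrative, and the session is assumed to run an ARFaceTrackingConfiguration):

import UIKit
import SceneKit
import ARKit

class FaceMeshViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!   // illustrative outlet; any ARSCNView works

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        guard ARFaceTrackingConfiguration.isSupported else { return }
        sceneView.delegate = self
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // Provide a SceneKit node whose geometry mirrors the detected face.
    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        guard anchor is ARFaceAnchor,
              let device = sceneView.device,
              let faceGeometry = ARSCNFaceGeometry(device: device) else { return nil }
        faceGeometry.firstMaterial?.fillMode = .lines   // draw the mesh as a wireframe
        return SCNNode(geometry: faceGeometry)
    }

    // Keep the SceneKit geometry in sync with the tracked face.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor,
              let faceGeometry = node.geometry as? ARSCNFaceGeometry else { return }
        faceGeometry.update(from: faceAnchor.geometry)
    }
}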
When you obtain a face geometry from an ARFaceAnchor object in a face-tracking AR session, the model conforms to match the dimensions, shape, and current expression of the detected face. You can also create a face mesh using a dictionary of named blend shape coefficients, which provides a detailed but more efficient description of the face's current expression.
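As a sketch of both paths in Swift (the delegate class name and the blend shape values below are illustrative):

import ARKit

class FaceGeometryObserver: NSObject, ARSessionDelegate {
    // Each updated ARFaceAnchor carries a geometry matched to the user's
    // current face shape and expression.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            let geometry = faceAnchor.geometry          // ARFaceGeometry
            print("face mesh has \(geometry.vertexCount) vertices")
        }
    }
}

// Alternatively, build a face mesh from named blend shape coefficients alone.
// The coefficient values are arbitrary examples; the initializer is failable.
let smilingFace = ARFaceGeometry(blendShapes: [
    .mouthSmileLeft: 0.8,
    .mouthSmileRight: 0.8,
    .eyeBlinkLeft: 0.1
])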
In an AR session, you can use this model as the basis for overlaying content that follows the shape of the user’s face—for example, to apply virtual makeup or tattoos. You can also use this model to create occlusion geometry, which hides other virtual content behind the 3D shape of the detected face in the camera image.
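A sketch of the occlusion technique with SceneKit, again assuming the ARSCNFaceGeometry route: the mesh writes depth but no color, so the real face in the camera image hides virtual content positioned behind it.

import SceneKit
import Metal
import ARKit

// Build an invisible face mesh that still writes to the depth buffer.
// Update it each frame with update(from:), as in the earlier sketch.
func makeFaceOcclusionNode(device: MTLDevice) -> SCNNode? {
    guard let occlusionGeometry = ARSCNFaceGeometry(device: device) else { return nil }
    occlusionGeometry.firstMaterial?.colorBufferWriteMask = []  // depth only, no color
    let node = SCNNode(geometry: occlusionGeometry)
    node.renderingOrder = -1    // render before other virtual content
    return node
}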
Note
Face mesh topology is constant across ARFaceGeometry instances. That is, the values of the vertexCount, textureCoordinateCount, and triangleCount properties never change, the triangleIndices buffer always describes the same arrangement of vertices, and the textureCoordinates buffer always maps the same vertex indices to the same texture coordinates.
Only the vertices buffer changes between face meshes provided by an AR session, indicating the change in vertex positions as ARKit adapts the mesh to the shape and expression of the user's face.
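As a sketch of what this constancy allows in a custom renderer (the type and method names here are illustrative): the index and texture coordinate buffers can be captured once, while only the vertex positions are refreshed for each new mesh.

import ARKit

struct FaceMeshBuffers {
    // Constant across all ARFaceGeometry instances: capture (or upload) once.
    let triangleIndices: [Int16]
    let textureCoordinates: [SIMD2<Float>]
    // Changes with every mesh ARKit provides: refresh on each anchor update.
    var vertices: [SIMD3<Float>]

    init(from geometry: ARFaceGeometry) {
        triangleIndices = geometry.triangleIndices
        textureCoordinates = geometry.textureCoordinates
        vertices = geometry.vertices
    }

    // Only the vertex positions need to be copied for subsequent meshes.
    mutating func update(from geometry: ARFaceGeometry) {
        vertices = geometry.vertices
    }
}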