Estimated environmental lighting information associated with a captured video frame in a face-tracking AR session.


When you run a face-tracking AR session (see ARFaceTrackingConfiguration) with the isLightEstimationEnabled property set to true, ARKit uses the detected face as a light probe to estimate the directional lighting environment in the scene. The lightEstimate property of each frame the session vends is then an ARDirectionalLightEstimate instance carrying this information.

If you render your own overlay graphics for the AR scene, you can use this information in shading algorithms to help make those graphics match the real-world lighting conditions of the scene captured by the camera. (The ARSCNView class automatically uses this information to configure SceneKit lighting.)
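As a minimal sketch of reading these values, the delegate below downcasts each frame's light estimate and forwards the primary light parameters to a renderer hook (the `updateLighting` method is a hypothetical placeholder for app-specific shading code, not part of ARKit):

```swift
import ARKit

class FaceLightingDelegate: NSObject, ARSessionDelegate {
    func run(on session: ARSession) {
        // Enable light estimation in a face-tracking configuration.
        let configuration = ARFaceTrackingConfiguration()
        configuration.isLightEstimationEnabled = true
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // In a face-tracking session with light estimation enabled, the
        // frame's estimate can be downcast to ARDirectionalLightEstimate.
        guard let estimate = frame.lightEstimate as? ARDirectionalLightEstimate else { return }
        updateLighting(direction: estimate.primaryLightDirection,   // simd_float3
                       intensity: estimate.primaryLightIntensity)   // CGFloat, in lumens
    }

    func updateLighting(direction: simd_float3, intensity: CGFloat) {
        // Hypothetical hook: feed the values to your own shading code.
    }
}
```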


Examining Light Parameters

var sphericalHarmonicsCoefficients: Data

Data describing the estimated lighting environment in all directions.

var primaryLightDirection: simd_float3

A vector indicating the direction of the strongest directional light source in the scene.

var primaryLightIntensity: CGFloat

The estimated intensity, in lumens, of the strongest directional light source in the scene.
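If you pass the spherical harmonics coefficients to your own shaders, you first need to unpack the raw Data buffer. A sketch, assuming the buffer holds 32-bit float coefficients (9 per color channel for second-order spherical harmonics) as ARKit documents:

```swift
import ARKit

// Unpack the coefficient buffer into a [Float] suitable for a shader uniform.
func sphericalHarmonics(from estimate: ARDirectionalLightEstimate) -> [Float] {
    estimate.sphericalHarmonicsCoefficients.withUnsafeBytes { buffer in
        // Reinterpret the raw bytes as 32-bit floats and copy them out.
        Array(buffer.bindMemory(to: Float.self))
    }
}
```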


See Also

Face Tracking

Creating Face-Based AR Experiences

Use the information provided by a face tracking AR session to place and animate 3D content.

class ARFaceTrackingConfiguration

A configuration that tracks the movement and expressions of the user’s face with the TrueDepth camera.

class ARFaceAnchor

Information about the pose, topology, and expression of a face detected in a face-tracking AR session.