Estimated environmental lighting information associated with a captured video frame in a face-tracking AR session.
- iOS 11.0+
When you run a face-tracking AR session (see ARFaceTrackingConfiguration) with the lightEstimationEnabled property set to YES, ARKit uses the detected face as a light probe to estimate the directional lighting environment in the scene. The lightEstimate property of each frame vended by the session contains an ARDirectionalLightEstimate instance containing this information.
If you render your own overlay graphics for the AR scene, you can use this information in your shading algorithms to help match those graphics to the real-world lighting conditions of the scene captured by the camera. (The ARSCNView class automatically uses this information to configure SceneKit lighting.)
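As a sketch of how a custom renderer might consume the estimate, the following Swift example runs a face-tracking session with light estimation enabled and reads the directional light estimate from each frame. The `updateShading` method is a hypothetical hook standing in for your own rendering engine; everything else uses the real ARKit API.

```swift
import ARKit

class SessionHandler: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let configuration = ARFaceTrackingConfiguration()
        configuration.isLightEstimationEnabled = true  // opt in to light estimation
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // In a face-tracking session with light estimation enabled, the
        // frame's light estimate is an ARDirectionalLightEstimate.
        guard let estimate = frame.lightEstimate as? ARDirectionalLightEstimate else { return }

        // estimate.primaryLightDirection: dominant light direction (simd_float3)
        // estimate.primaryLightIntensity: intensity of that light, in lumens
        // estimate.sphericalHarmonicsCoefficients: second-order spherical
        //   harmonics data describing the ambient lighting environment
        updateShading(direction: estimate.primaryLightDirection,
                      intensity: estimate.primaryLightIntensity)
    }

    func updateShading(direction: simd_float3, intensity: CGFloat) {
        // Hypothetical: pass the values to your renderer's light uniforms here.
    }
}
```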