Augment the user's environment through either the front or back camera.
iOS devices come equipped with two cameras, and for each ARKit session you need to choose which camera's feed to augment. Since ARKit 3, ARKit can provide data from both cameras simultaneously, but you still must choose one camera feed to show to the user at a time.
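As a minimal sketch of simultaneous capture, a world-tracking session can also opt in to face data from the front camera on devices that support it (checking support first, since it varies by device):

```swift
import ARKit

// Build a world-tracking configuration that, where the hardware allows,
// also delivers ARFaceAnchor updates from the front camera while the
// back-camera feed is shown to the user.
func makeWorldTrackingConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsUserFaceTracking {
        // Front-camera face data arrives alongside back-camera world tracking.
        configuration.userFaceTrackingEnabled = true
    }
    return configuration
}
```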
Augmented Reality with the Back Camera
The most common kinds of AR experience display a view from the device's back camera, augmented by other visual content, giving the user a new way to see and interact with the world around them.
ARWorldTrackingConfiguration provides this kind of experience: ARKit tracks the real-world environment the user inhabits and matches it with a coordinate space in which you place virtual content. World tracking also offers features that make AR experiences more immersive, such as recognizing objects and images in the user's environment and responding to real-world lighting conditions.
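A minimal sketch of starting such a session, assuming a view controller that owns an `ARSCNView` outlet named `sceneView` (the outlet name is an assumption for illustration):

```swift
import UIKit
import ARKit

class WorldARViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!  // assumed outlet

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        // Detect horizontal surfaces so virtual content can rest on them.
        configuration.planeDetection = [.horizontal]
        // Generate environment textures for realistic reflections.
        configuration.environmentTexturing = .automatic
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Pause the session when the view is no longer on screen.
        sceneView.session.pause()
    }
}
```

Pausing the session in `viewWillDisappear(_:)` avoids running the camera and tracking machinery while the AR view is hidden.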
Augmented Reality with the Front Camera
For iOS devices that have a TrueDepth camera, ARFaceTrackingConfiguration enables you to augment the front-camera feed while providing real-time tracking of the pose and expression of faces. With that information you might, for example, overlay realistic virtual masks. Or, you might omit the camera view and use facial expression data to animate virtual characters, as the Animoji feature in iMessage does.
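A minimal sketch of face tracking, again assuming an `ARSCNView` outlet named `sceneView`: check device support, run the configuration, and read pose and expression data from the `ARFaceAnchor` the session delivers.

```swift
import UIKit
import ARKit

class FaceARViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!  // assumed outlet

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Face tracking requires a TrueDepth camera; bail out if unsupported.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        sceneView.delegate = self
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode,
                  for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }
        // faceAnchor.transform gives the head pose; blendShapes gives
        // expression coefficients (for example, how far the jaw is open).
        let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0
        _ = jawOpen  // drive a mask or a character rig from these values
    }
}
```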