Integrate iOS device camera and motion features to produce augmented reality experiences in your app or game.
- iOS 11.0+
Augmented reality (AR) describes user experiences that add 2D or 3D elements to the live view from a device's camera in a way that makes those elements appear to inhabit the real world. ARKit combines device motion tracking, camera scene capture, advanced scene processing, and display conveniences to simplify the task of building an AR experience. You can use these technologies to create many kinds of AR experiences using either the back camera or front camera of an iOS device.
Augmented Reality with the Back Camera
The most common kinds of AR experiences display a view from an iOS device's back-facing camera, augmented by other visual content, giving the user a new way to see and interact with the world around them.
ARWorldTrackingConfiguration provides this kind of experience: ARKit maps and tracks the real-world space the user inhabits, and matches it with a coordinate space for you to place virtual content. World tracking also offers features to make AR experiences more immersive, such as recognizing objects and images in the user's environment and responding to real-world lighting conditions.
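A world-tracking session is typically started from a view controller that owns an ARSCNView. The sketch below shows the common pattern, assuming the view is connected in a storyboard; the support check and plane-detection option are illustrative, not required.

```swift
import ARKit

class ARViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!   // assumed to be wired up in a storyboard

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking is not supported on all devices; check before running.
        guard ARWorldTrackingConfiguration.isSupported else { return }
        let configuration = ARWorldTrackingConfiguration()
        // Optionally detect flat horizontal surfaces for placing virtual content.
        configuration.planeDetection = [.horizontal]
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Pause the session when the view is no longer visible.
        sceneView.session.pause()
    }
}
```

Once the session is running, ARKit establishes a world coordinate space anchored to the real environment; content you add to the scene at a given position appears to stay fixed in the real world.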
Augmented Reality with the Front Camera
On iPhone X, ARFaceTrackingConfiguration uses the front-facing TrueDepth camera to provide real-time information about the pose and expression of the user's face for you to use in rendering virtual content. For example, you might show the user's face in a camera view and apply realistic virtual masks. You can also omit the camera view and use ARKit facial expression data to animate virtual characters, as seen in the Animoji app for iMessage.
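Face tracking follows the same session pattern, and the delegate delivers an ARFaceAnchor whose blend shapes describe the user's expression. This sketch assumes an ARSCNView wired up in a storyboard; the jawOpen example is illustrative.

```swift
import ARKit

class FaceViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!   // assumed to be wired up in a storyboard

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.delegate = self
        // Face tracking requires a device with a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // Called when ARKit updates the tracked face each frame.
    func renderer(_ renderer: SCNSceneRenderer,
                  didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }
        // Blend shape values run from 0.0 (neutral) to 1.0 (fully expressed).
        let jawOpen = faceAnchor.blendShapes[.jawOpen] as? Float ?? 0
        // Use jawOpen (and the other blend shapes) to drive a virtual
        // character's animation, for example.
        _ = jawOpen
    }
}
```

Because the blend shape dictionary covers many facial features, the same data can animate a full virtual character without ever showing the camera feed.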