Framework

ARKit

Integrate iOS device camera and motion features to produce augmented reality experiences in your app or game.

Overview

Augmented reality (AR) describes user experiences that add 2D or 3D elements to the live view from a device's camera, in a way that makes those elements appear to inhabit the real world. ARKit combines device motion tracking, camera scene capture, advanced scene processing, and display conveniences to simplify the task of building an AR experience. You can use these technologies to create many kinds of AR experiences, using either the back camera or front camera of an iOS device.

Topics

Essentials

Choosing Which Camera Feed to Augment

Augment the user's environment through either the front or back camera.

Verifying Device Support and User Permission

Check whether your app can use ARKit and respect user privacy at runtime.
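A minimal sketch of this check might look like the following; the function name `prepareForAR` is illustrative, but `ARWorldTrackingConfiguration.isSupported` and the AVFoundation camera-permission request are the standard APIs for this task.

```swift
import ARKit
import AVFoundation

// Verify ARKit support before offering AR features, then request camera
// access, which every ARKit session requires.
func prepareForAR(completion: @escaping (Bool) -> Void) {
    guard ARWorldTrackingConfiguration.isSupported else {
        completion(false)  // Hide or disable AR features on unsupported devices.
        return
    }
    AVCaptureDevice.requestAccess(for: .video) { granted in
        DispatchQueue.main.async { completion(granted) }
    }
}
```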

Managing Session Lifecycle and Tracking Quality

Keep the user informed on the current session state and recover from interruptions.

ARSession

The main object you use to control an AR experience.

ARConfiguration

An object that defines the particular ARKit features you enable in your session at a given time.
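The relationship between the two types is simple: you create a configuration and pass it to the session's `run(_:options:)` method. A brief sketch:

```swift
import ARKit

let session = ARSession()
let configuration = ARWorldTrackingConfiguration()

// Run (or re-run) the session with the chosen configuration; the options
// reset tracking state when you reconfigure mid-session.
session.run(configuration, options: [.resetTracking, .removeExistingAnchors])

// Pause when the AR view goes offscreen to save power.
session.pause()
```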

ARAnchor

A position and orientation of something of interest in the physical environment.

Camera

Get details about a user's iOS device, like its position and orientation in 3D space, and the camera's video data and exposure.

Occluding Virtual Content with People

Cover your app’s virtual content with people that ARKit perceives in the camera feed.
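In code, you opt in to people occlusion through the configuration's frame semantics, checking device support first (person segmentation requires the A12 chip or later). A sketch:

```swift
import ARKit

let session = ARSession()
let configuration = ARWorldTrackingConfiguration()

// Person segmentation with depth lets people in the camera feed
// occlude virtual content at the correct distance.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}
session.run(configuration)
```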

ARFrame

A video image captured as part of a session with position tracking information.

ARCamera

Information about the camera position and imaging characteristics for a given frame.
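You typically receive frames through the `ARSessionDelegate` callback sketched below; the camera's `transform` places the device in world space, and `capturedImage` carries the video data.

```swift
import ARKit

// An ARSessionDelegate method; ARKit calls it once per captured frame.
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    let camera: ARCamera = frame.camera
    let transform = camera.transform       // Device pose in world space.
    let position = transform.columns.3     // Translation column: x, y, z.
    let pixelBuffer = frame.capturedImage  // Camera image as a CVPixelBuffer.
    // Inspect camera.trackingState to react to limited or lost tracking.
    _ = (position, pixelBuffer)
}
```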

Quick Look

The easiest way to add an AR experience to your app or website.

Previewing a Model with AR Quick Look

Display a model or scene that the user can move, scale, and share with others.

ARQuickLookPreviewItem

An object you use to customize the AR Quick Look experience.
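A minimal data source for presenting a USDZ model with AR Quick Look might look like this; `model.usdz` is a placeholder file, and `QLPreviewController` handles rendering and interaction for you.

```swift
import QuickLook
import ARKit

// Supplies one ARQuickLookPreviewItem to a QLPreviewController.
class ModelPreviewDataSource: NSObject, QLPreviewControllerDataSource {
    let modelURL: URL  // URL to a .usdz or .reality file in your bundle.
    init(modelURL: URL) { self.modelURL = modelURL }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        let item = ARQuickLookPreviewItem(fileAt: modelURL)
        item.allowsContentScaling = false  // Keep the model at real-world scale.
        return item
    }
}
```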

Display

Create a full-featured AR experience using a view that handles the rendering for you.

ARSCNView

A view that enables you to display an AR experience with SceneKit.

ARSKView

A view that enables you to display an AR experience with SpriteKit.
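Setting up either view follows the same pattern: run the view's session when it appears and pause it when it disappears. A sketch using `ARSCNView`:

```swift
import ARKit
import UIKit

class ARViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // The view owns its session and renders the camera feed plus
        // any SceneKit content you attach to its scene.
        sceneView.session.run(ARWorldTrackingConfiguration())
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```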

World Tracking

Augment the environment surrounding the user by tracking surfaces, images, objects, people, and faces.

Understanding World Tracking

Discover supporting concepts, features, and best practices for building great AR experiences.

ARWorldTrackingConfiguration

A configuration that monitors the iOS device's position and orientation while enabling you to augment the environment that's in front of the user.

ARPlaneAnchor

A 2D surface that ARKit detects in the physical environment.

Tracking and Visualizing Planes

Detect surfaces in the physical environment and visualize their shape and location in 3D space.
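A compact sketch of that flow, assuming a SceneKit-based app: enable plane detection on the configuration, then respond in the `ARSCNViewDelegate` when ARKit adds a node for each new `ARPlaneAnchor`.

```swift
import ARKit
import SceneKit

class PlaneVisualizer: NSObject, ARSCNViewDelegate {
    func run(on sceneView: ARSCNView) {
        sceneView.delegate = self
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal, .vertical]
        sceneView.session.run(configuration)
    }

    // ARKit calls this when it adds a node for a newly detected anchor.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        // Visualize the detected surface as a semi-transparent plane.
        let plane = SCNPlane(width: CGFloat(planeAnchor.extent.x),
                             height: CGFloat(planeAnchor.extent.z))
        plane.firstMaterial?.diffuse.contents = UIColor.blue.withAlphaComponent(0.3)
        let planeNode = SCNNode(geometry: plane)
        planeNode.eulerAngles.x = -.pi / 2  // SCNPlane is vertical by default.
        node.addChildNode(planeNode)
    }
}
```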

ARCoachingOverlayView

A view that presents visual instructions that guide the user.

Placing Objects and Handling 3D Interaction

Place virtual content on real-world surfaces, and enable the user to interact with virtual content by using gestures.

ARWorldMap

The space-mapping state and set of anchors from a world-tracking AR session.

Saving and Loading World Data

Serialize a world-tracking session to resume it later.
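The save half of that round trip captures the current `ARWorldMap` and archives it; the load half unarchives it and assigns it as the session's starting state. A sketch (`mapFileURL` is a placeholder location):

```swift
import ARKit

let session = ARSession()
let mapFileURL = FileManager.default.temporaryDirectory.appendingPathComponent("worldMap")

// Save: capture the current world map and archive it to disk.
session.getCurrentWorldMap { worldMap, error in
    guard let map = worldMap,
          let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                       requiringSecureCoding: true)
    else { return }
    try? data.write(to: mapFileURL)
}

// Load: unarchive the map and run a new session from that state.
if let data = try? Data(contentsOf: mapFileURL),
   let map = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self, from: data) {
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = map
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```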

Ray-Casting and Hit-Testing

Find 3D positions on real-world surfaces given a screen point.
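For example, to turn a tap location into an anchor on a nearby surface, build a raycast query from the screen point and use the first result's world transform (a sketch; the function name is illustrative):

```swift
import ARKit

// Convert a 2D screen point (for example, a tap location) to a 3D position
// on a real-world surface, then anchor content there.
func placeAnchor(at point: CGPoint, in sceneView: ARSCNView) {
    guard let query = sceneView.raycastQuery(from: point,
                                             allowing: .estimatedPlane,
                                             alignment: .any),
          let result = sceneView.session.raycast(query).first
    else { return }
    sceneView.session.add(anchor: ARAnchor(transform: result.worldTransform))
}
```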

Face Tracking

Track faces that appear in the front camera feed.

Tracking and Visualizing Faces

Detect faces in a camera feed, overlay matching virtual content, and animate facial expressions in real time.

ARFaceAnchor

Information about the pose, topology, and expression of a face that ARKit detects in the front camera feed.

ARFaceTrackingConfiguration

A configuration you use when you just want to track faces using the device's front camera.
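A brief sketch of the face-tracking flow: check support (the configuration requires a TrueDepth front camera), run the session, then read expression coefficients from the anchor's blend shapes in the session delegate.

```swift
import ARKit

func startFaceTracking(with session: ARSession) {
    guard ARFaceTrackingConfiguration.isSupported else { return }
    session.run(ARFaceTrackingConfiguration())
}

// An ARSessionDelegate method; blend-shape values range from 0 to 1.
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
    let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
    // Drive a character rig or trigger effects from jawOpen.
    _ = jawOpen
}
```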

People

React to people that ARKit identifies in the camera feed.

Capturing Body Motion in 3D

Track a person in the physical environment and visualize their motion by applying the same body movements to a virtual character.

ARBodyTrackingConfiguration

A configuration you use to track a person's motion in 3D space.

ARBodyAnchor

An object that tracks the movement in 3D space of a body that ARKit recognizes in the camera feed.

ARBody2D

The screen-space representation of a person ARKit recognizes in the camera feed.
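The body-tracking flow mirrors face tracking: check support (body tracking requires the A12 chip or later), run the configuration, and read joint transforms from the anchor's skeleton. A sketch:

```swift
import ARKit

func startBodyTracking(with session: ARSession) {
    guard ARBodyTrackingConfiguration.isSupported else { return }
    session.run(ARBodyTrackingConfiguration())
}

// An ARSessionDelegate method; the skeleton exposes tracked joints.
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    guard let body = anchors.compactMap({ $0 as? ARBodyAnchor }).first else { return }
    // Transform of a joint relative to the body anchor's origin at the hip.
    if let headTransform = body.skeleton.modelTransform(for: .head) {
        // Apply to the corresponding joint of a virtual character.
        _ = headTransform
    }
}
```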

Image Tracking

Recognize images in the physical environment and track their position and orientation.

Tracking and Altering Images

Create images from rectangular shapes found in the user’s environment, and augment their appearance.

Detecting Images in an AR Experience

React to known 2D images in the user’s environment, and use their positions to place AR content.

ARReferenceImage

The description of an image you want ARKit to detect in the physical environment.

ARImageAnchor

Information about the position and orientation of an image detected in a world-tracking AR session.

ARImageTrackingConfiguration

A configuration you use when you just want to track known images using the device's back camera feed.
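To detect images in a world-tracking session, load reference images from an asset catalog group and assign them to the configuration; ARKit then adds an `ARImageAnchor` for each image it recognizes. A sketch ("AR Resources" is the group name Xcode creates by default — adjust to your project):

```swift
import ARKit

let configuration = ARWorldTrackingConfiguration()
if let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",
                                                          bundle: nil) {
    configuration.detectionImages = referenceImages
    // Track position updates for up to one recognized image at a time.
    configuration.maximumNumberOfTrackedImages = 1
}
```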

Object Tracking

Recognize known objects at runtime by first recording their spatial features with a scanning app.

Scanning and Detecting 3D Objects

Record spatial features of real-world objects, then use the results to find those objects in the user’s environment and trigger AR content.

ARReferenceObject

The description of a real-world object you want ARKit to look for in the physical environment during an AR session.

ARObjectAnchor

Information about the position and orientation of a real-world 3D object detected in a world-tracking AR session.

ARObjectScanningConfiguration

A configuration you use to collect high-fidelity spatial data about real objects in the physical environment.
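Note that `ARObjectScanningConfiguration` is for the development-time scanning step; detection itself happens in a world-tracking session, as in this sketch ("AR Objects" is a hypothetical asset catalog group name):

```swift
import ARKit

let configuration = ARWorldTrackingConfiguration()
if let referenceObjects = ARReferenceObject.referenceObjects(inGroupNamed: "AR Objects",
                                                             bundle: nil) {
    configuration.detectionObjects = referenceObjects
}
// ARKit adds an ARObjectAnchor when it recognizes a scanned object.
```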

Orientation Tracking

AROrientationTrackingConfiguration

A configuration you use when you just want to track the device's orientation using the device's back camera.

Positional Tracking

ARPositionalTrackingConfiguration

A configuration you use when you just want to track the device's position in space.

Rendering Effects

Adding Realistic Reflections to an AR Experience

Use ARKit to generate environment probe textures from camera imagery and render reflective virtual objects.

AREnvironmentProbeAnchor

An object that provides environmental lighting information for a specific area of space in a world-tracking AR session.

ARLightEstimate

Estimated scene lighting information associated with a captured video frame in an AR session.

ARDirectionalLightEstimate

Estimated environmental lighting information associated with a captured video frame in a face-tracking AR session.
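A sketch of reading the per-frame estimate in the session delegate and using it to match virtual lighting to the scene:

```swift
import ARKit

// An ARSessionDelegate method; each frame may carry a light estimate.
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    guard let estimate = frame.lightEstimate else { return }
    let intensity = estimate.ambientIntensity           // In lumens; ~1000 is neutral.
    let temperature = estimate.ambientColorTemperature  // In kelvin; ~6500 is neutral.
    // In a face-tracking session, the estimate is an ARDirectionalLightEstimate,
    // which adds a primary light direction and spherical-harmonics coefficients.
    _ = (intensity, temperature)
}
```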

Multiuser

Communicate with other devices to create a shared AR experience.

Creating a Collaborative Session

Enable nearby devices to share an AR experience by using a peer-to-peer multiuser strategy.

Creating a Multiuser AR Experience

Enable nearby devices to share an AR experience by using a host-guest multiuser strategy.

SwiftShot: Creating a Game for Augmented Reality

See how Apple built the featured demo for WWDC18, and get tips for making your own multiplayer games using ARKit, SceneKit, and Swift.

ARCollaborationData

An object that holds information that a user has collected about the physical environment.

ARParticipantAnchor

An anchor that represents another user in a multiuser AR experience.
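A sketch of the collaboration plumbing: opt in on the configuration, forward outgoing `ARCollaborationData` to peers over a transport you provide (for example, MultipeerConnectivity), and feed received data back into the local session.

```swift
import ARKit

let configuration = ARWorldTrackingConfiguration()
configuration.isCollaborationEnabled = true

// An ARSessionDelegate method; ARKit produces data to share with peers.
func session(_ session: ARSession, didOutputCollaborationData data: ARCollaborationData) {
    if let encoded = try? NSKeyedArchiver.archivedData(withRootObject: data,
                                                       requiringSecureCoding: true) {
        // Send `encoded` to peers with your own networking layer (placeholder).
        _ = encoded
    }
}

// Feed data received from a peer back into the local session.
func receive(_ encoded: Data, into session: ARSession) {
    if let data = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARCollaborationData.self,
                                                          from: encoded) {
        session.update(with: data)
    }
}
```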

Audio

Creating an Immersive AR Experience with Audio

Use sound effects and environmental sound layers to create an engaging AR experience.

Text

Annotate an AR experience by displaying anchored text.

Creating Screen Annotations for Objects in an AR Experience

Annotate an AR experience with virtual sticky notes that you display onscreen over real and virtual objects.

Recognizing and Labeling Arbitrary Objects

Create anchors that track objects you recognize in the camera feed, using a custom optical-recognition algorithm.

Custom Display

Create a full-featured AR experience by implementing your own renderer.

Displaying an AR Experience with Metal

Control rendering of your app's virtual content on top of a camera feed.

ARMatteGenerator

An object that creates matte textures you use to occlude your app's virtual content with people that ARKit recognizes in the camera feed.

Effecting People Occlusion in Custom Renderers

Occlude your app’s virtual content where ARKit recognizes people in the camera feed by using a matte generator.