Framework

ARKit

Integrate iOS device camera and motion features to produce augmented reality experiences in your app or game.

Overview

Augmented reality (AR) describes user experiences that add 2D or 3D elements to the live view from a device's camera, in a way that makes those elements appear to inhabit the real world. ARKit combines device motion tracking, camera scene capture, advanced scene processing, and display conveniences to simplify the task of building an AR experience. You can use these technologies to create many kinds of AR experiences, using either the back camera or front camera of an iOS device.

Topics

Essentials

Choosing Which Camera Feed to Augment

Augment the user's environment through either the front or back camera.

Verifying Device Support and User Permission

Check whether your app can use ARKit and respect user privacy at runtime.
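As a minimal sketch of this check (the function name `startARIfAvailable` is illustrative, not an ARKit API), you can gate AR features on `ARWorldTrackingConfiguration.isSupported` and the camera authorization status before running a session:

```swift
import ARKit
import AVFoundation

// Gate AR features on device capability before offering them in the UI.
// isSupported is false on devices that lack ARKit hardware support.
func startARIfAvailable(session: ARSession) {
    guard ARWorldTrackingConfiguration.isSupported else {
        return // Fall back to a non-AR experience on unsupported devices.
    }

    // ARKit prompts for camera access automatically, but checking up front
    // lets you explain the request or react to a prior denial.
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        session.run(ARWorldTrackingConfiguration())
    case .notDetermined:
        AVCaptureDevice.requestAccess(for: .video) { granted in
            guard granted else { return }
            DispatchQueue.main.async {
                session.run(ARWorldTrackingConfiguration())
            }
        }
    default:
        break // Denied or restricted; direct the user to Settings.
    }
}
```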

Managing Session Lifecycle and Tracking Quality

Keep the user informed on the current session state and recover from interruptions.

class ARSession

The main object you use to control an AR experience.

class ARConfiguration

An object that defines the particular ARKit features you enable in your session at a given time.
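A session runs one configuration at a time; calling `run(_:)` again switches the enabled features. A brief sketch of the session lifecycle:

```swift
import ARKit

// Create a session and enable world tracking with plane detection.
let session = ARSession()
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal]

// Start the session; calling run(_:) again with a different
// configuration reconfigures the same session in place.
session.run(configuration)

// Restarting with options discards accumulated state.
session.run(configuration, options: [.resetTracking, .removeExistingAnchors])

// Pause when the containing view disappears to stop camera capture.
session.pause()
```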

class ARAnchor

A position and orientation of something of interest in the physical environment.
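For example, you can place an anchor relative to the current camera pose; ARKit then refines the anchor's transform as its world understanding improves (the helper function here is illustrative):

```swift
import ARKit

// Place an anchor half a meter in front of the current camera pose.
func addAnchorInFrontOfCamera(to session: ARSession) {
    guard let frame = session.currentFrame else { return }
    var translation = matrix_identity_float4x4
    translation.columns.3.z = -0.5 // Negative z is forward in camera space.
    let transform = simd_mul(frame.camera.transform, translation)
    session.add(anchor: ARAnchor(transform: transform))
}
```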

Camera

Get details about a user's iOS device, like its position and orientation in 3D space, and the camera's video data and exposure.

Occluding Virtual Content with People

Enable people to cover your app’s virtual content where ARKit recognizes people in the camera feed.

class ARFrame

A video image captured as part of a session with position tracking information.

class ARCamera

Information about the camera position and imaging characteristics for a given frame.
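A short sketch of reading per-frame camera data from the session's most recent frame:

```swift
import ARKit

// Inspect the device pose and tracking quality for the latest frame.
func logCameraState(for session: ARSession) {
    guard let frame = session.currentFrame else { return }
    let camera = frame.camera

    // The device position in world space is the transform's translation column.
    let position = camera.transform.columns.3
    print("Camera position:", position.x, position.y, position.z)

    // Tracking state tells you whether pose data is currently reliable.
    switch camera.trackingState {
    case .normal:
        print("Tracking normally")
    case .limited(let reason):
        print("Limited tracking:", reason)
    case .notAvailable:
        print("Tracking unavailable")
    }

    // frame.capturedImage holds the camera pixel buffer for this timestamp.
    print("Frame timestamp:", frame.timestamp)
}
```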

Quick Look

The easiest way to add an AR experience to your app or website.

Previewing a Model with AR Quick Look

Display a model or scene that the user can move, scale, and share with others.

class ARQuickLookPreviewItem

An object you use to customize the AR Quick Look experience.

Beta

Display

Create a full-featured AR experience using a view that handles the rendering for you.

class ARView

A view that enables you to display an AR experience with RealityKit.

Beta
class ARSCNView

A view that enables you to display an AR experience with SceneKit.

class ARSKView

A view that enables you to display an AR experience with SpriteKit.
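A minimal view controller that hosts an AR experience with `ARSCNView` might look like this (a sketch; layout and error handling are omitted):

```swift
import UIKit
import ARKit

// Hosts an ARSCNView, which owns its own ARSession and renders
// SceneKit content over the camera feed for you.
class ARViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
        sceneView.automaticallyUpdatesLighting = true
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // The view owns an ARSession; run a configuration to start the feed.
        sceneView.session.run(ARWorldTrackingConfiguration())
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```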

World Tracking

Augment the environment surrounding the user, by tracking surfaces, images, objects, people, and faces.

Understanding World Tracking

Discover supporting concepts, features, and best practices for building great AR experiences.

class ARWorldTrackingConfiguration

A configuration that monitors the iOS device's position and orientation while enabling you to augment the environment that's in front of the user.

class ARPlaneAnchor

A 2D surface that ARKit detects in the physical environment.

Tracking and Visualizing Planes

Detect surfaces in the physical environment and visualize their shape and location in 3D space.
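As a sketch of plane visualization with SceneKit, a delegate (assigned to `sceneView.delegate` before running the session) can attach translucent geometry to each detected plane's node:

```swift
import UIKit
import ARKit
import SceneKit

// Visualizes each detected plane by attaching geometry to its node.
class PlaneVisualizer: NSObject, ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer,
                  didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }

        // Build a translucent plane matching the anchor's estimated extent.
        let plane = SCNPlane(width: CGFloat(planeAnchor.extent.x),
                             height: CGFloat(planeAnchor.extent.z))
        plane.firstMaterial?.diffuse.contents =
            UIColor.blue.withAlphaComponent(0.3)

        let planeNode = SCNNode(geometry: plane)
        planeNode.position = SCNVector3(planeAnchor.center.x, 0,
                                        planeAnchor.center.z)
        planeNode.eulerAngles.x = -.pi / 2 // SCNPlane is vertical by default.
        node.addChildNode(planeNode)
    }
}
```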

class ARCoachingOverlayView

A view that presents visual instructions that guide the user during session initialization and in limited tracking situations.

Beta

Placing Objects and Handling 3D Interaction

Place virtual content on real-world surfaces, and enable the user to interact with virtual content by using gestures.

class ARWorldMap

The space-mapping state and set of anchors from a world-tracking AR session.

Saving and Loading World Data

Serialize a world-tracking session so you can resume it later.
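A sketch of the save/restore round trip (the function names and file URL handling here are illustrative):

```swift
import ARKit

// Capture the current world map and archive it for a later session.
func saveWorldMap(from session: ARSession, to url: URL) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else { return }
        // ARWorldMap supports secure coding, so it archives directly.
        if let data = try? NSKeyedArchiver.archivedData(
            withRootObject: map, requiringSecureCoding: true) {
            try? data.write(to: url)
        }
    }
}

// Restore the map by assigning it to a configuration before running.
func resumeSession(_ session: ARSession, from url: URL) {
    guard let data = try? Data(contentsOf: url),
          let map = try? NSKeyedUnarchiver.unarchivedObject(
              ofClass: ARWorldMap.self, from: data)
    else { return }

    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = map
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```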

Ray-Casting and Hit-Testing

Find 3D positions on real-world surfaces given a screen point.
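Using the ray-casting API introduced alongside this release, a screen point (for example, from a tap gesture) converts to a world transform like so (a sketch; the function name is illustrative):

```swift
import ARKit

// Convert a 2D screen point into a 3D pose on a detected plane.
func worldTransform(at point: CGPoint,
                    in sceneView: ARSCNView) -> simd_float4x4? {
    guard let query = sceneView.raycastQuery(
        from: point,
        allowing: .existingPlaneGeometry,
        alignment: .horizontal)
    else { return nil }

    // The first result is the hit nearest to the camera.
    return sceneView.session.raycast(query).first?.worldTransform
}
```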

Face Tracking

Track faces that appear in the front camera feed.

Tracking and Visualizing Faces

Detect faces in a camera feed, overlay matching virtual content, and animate facial expressions in real time.

class ARFaceAnchor

Information about the pose, topology, and expression of a face that ARKit detects in the front camera feed.

class ARFaceTrackingConfiguration

A configuration you use when you just want to track faces using the device's front camera.
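A brief sketch of starting face tracking and reading an expression coefficient from a delivered face anchor (the two helper functions are illustrative):

```swift
import ARKit

// Face tracking requires a TrueDepth front camera; check support first.
func startFaceTracking(on session: ARSession) {
    guard ARFaceTrackingConfiguration.isSupported else { return }
    session.run(ARFaceTrackingConfiguration())
}

// Read one blend-shape coefficient from a face anchor that ARKit
// delivers to your session or view delegate; values range 0...1.
func jawOpenAmount(from anchor: ARFaceAnchor) -> Float {
    anchor.blendShapes[.jawOpen]?.floatValue ?? 0
}
```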

People

React to people that ARKit identifies in the camera feed.

Capturing Body Motion in 3D

Track a person in the physical environment and visualize their motion by applying the same body movements to a virtual puppet.

class ARBodyTrackingConfiguration

A configuration you use to track a person's motion in 3D space.

Beta

class ARBodyAnchor

An object that tracks the movement in 3D space of a body that ARKit recognizes in the camera feed.

Beta

class ARBody2D

The screen-space representation of a person ARKit recognizes in the camera feed.

Beta

Image Tracking

Recognize images in the physical environment and track their position and orientation.

Tracking and Altering Images

Create images from rectangular shapes found in the user’s environment, and augment their appearance.

Detecting Images in an AR Experience

React to known 2D images in the user’s environment, and use their positions to place AR content.

class ARReferenceImage

The description of an image you want ARKit to detect in the physical environment.

class ARImageAnchor

Information about the position and orientation of an image detected in a world-tracking AR session.

class ARImageTrackingConfiguration

A configuration you use when you just want to track known images using the device's back camera feed.
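A sketch of starting image tracking with reference images from an asset catalog. The group name "AR Resources" is an assumption for illustration; substitute your own asset group:

```swift
import ARKit

// Track known 2D images loaded from the app's asset catalog.
func startImageTracking(on session: ARSession) {
    // "AR Resources" is a placeholder asset-group name.
    guard let referenceImages = ARReferenceImage.referenceImages(
        inGroupNamed: "AR Resources", bundle: nil)
    else { return }

    let configuration = ARImageTrackingConfiguration()
    configuration.trackingImages = referenceImages
    configuration.maximumNumberOfTrackedImages = 2
    session.run(configuration)
}
```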

Object Tracking

Recognize known objects at runtime by first scanning them with a scanning app during development.

Scanning and Detecting 3D Objects

Record spatial features of real-world objects, then use the results to find those objects in the user’s environment and trigger AR content.

class ARReferenceObject

The description of a real-world object you want ARKit to look for in the physical environment during an AR session.

class ARObjectAnchor

Information about the position and orientation of a real-world 3D object detected in a world-tracking AR session.

class ARObjectScanningConfiguration

A configuration you use to collect high-fidelity spatial data about real objects in the physical environment.

Orientation Tracking

class AROrientationTrackingConfiguration

A configuration you use when you just want to track the device's orientation using the device's back camera.

Positional Tracking

class ARPositionalTrackingConfiguration

A configuration you use when you just want to track the device's position in space.

Beta

Rendering Effects

Adding Realistic Reflections to an AR Experience

Use ARKit to generate environment probe textures from camera imagery and render reflective virtual objects.

class AREnvironmentProbeAnchor

An object that provides environmental lighting information for a specific area of space in a world-tracking AR session.

class ARLightEstimate

Estimated scene lighting information associated with a captured video frame in an AR session.

class ARDirectionalLightEstimate

Estimated environmental lighting information associated with a captured video frame in a face-tracking AR session.

Multiuser

Communicate with other devices to create a multiuser AR app or multiplayer game.

Creating a Multiuser AR Experience

Transmit ARKit world-mapping data between nearby devices with the MultipeerConnectivity framework to create a shared basis for AR experiences.

SwiftShot: Creating a Game for Augmented Reality

See how Apple built the featured demo for WWDC18, and get tips for making your own multiplayer games using ARKit, SceneKit, and Swift.

class ARSession.CollaborationData

An object that holds information about the physical environment collected by a user.

Beta

class ARParticipantAnchor

An anchor that represents another user in a multiuser AR experience.

Beta

Audio

Creating an Immersive AR Experience with Audio

Use sound effects and environmental sound layers to create an engaging AR experience.

Custom Recognition

Recognizing and Labeling Arbitrary Objects

Create anchors that track objects you recognize in the camera feed, using a custom optical-recognition algorithm.

Custom Display

Create a full-featured AR experience by implementing your own renderer.

Displaying an AR Experience with Metal

Control rendering of your app's virtual content on top of a camera feed.

class ARMatteGenerator

An object that creates matte textures you use to occlude your app's virtual content with people that ARKit recognizes in the camera feed.

Beta

Effecting People Occlusion in Custom Renderers

Occlude your app’s virtual content where ARKit recognizes people in the camera feed by using a matte generator.

Beta Software

This documentation contains preliminary information about an API or technology in development. This information is subject to change, and software implemented according to this documentation should be tested with final operating system software.

Learn more about using Apple's beta software