World Tracking

Create a rear-camera AR experience by tracking surfaces, images, objects, people, or user faces.


When you run an AR session using ARWorldTrackingConfiguration, your app can detect real-world surfaces and interact with them by placing virtual content on them. You can also enable optional features, like image detection and tracking, known-object detection, body tracking and people occlusion, user face tracking, realistic reflections, world saving and loading, and scene reconstruction.

Illustration of an iPhone running an app that displays an AR experience using the rear camera. The physical environment is depicted as a living room with a couch, on which the app displays a virtual character.

Consider Device Resources

Although ARWorldTrackingConfiguration provides the widest range of features in a rear-camera experience, you should enable additional options sparingly. Each feature consumes device energy and compute cycles, so to maximize device uptime and performance, enable only the world-tracking features your app currently requires.
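As a minimal sketch of this advice, the configuration below enables only horizontal-plane detection and leaves every optional feature off; `session` is assumed to be your view's existing ARSession (for example, `arView.session`):

```swift
import ARKit

/// Runs a lean world-tracking session that requests only the features
/// a basic surface-placement experience needs.
func runLeanWorldTracking(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()

    // Enable only plane detection; optional features such as person
    // segmentation or scene reconstruction stay disabled to conserve
    // energy and compute.
    configuration.planeDetection = [.horizontal]

    session.run(configuration)
}
```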

Proactively disable options your app doesn't require. For example, refrain from turning on people occlusion for single-user experiences in which you don't expect people in the scene. If another AR configuration fulfills your requirements with a more concise feature set, use that configuration instead. For example, use ARBodyTrackingConfiguration instead of a world-tracking configuration for 3D motion capture if you don't need user face tracking, collaboration, or scene reconstruction. For a list of available configurations, see Choose Your Configuration.
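One way to apply this is to gate people occlusion on both device support and actual need. This is a sketch; `expectsPeople` is a hypothetical flag for illustration, not an ARKit API:

```swift
import ARKit

/// Enables people occlusion only when the device supports it and the
/// experience actually expects people to appear in the scene.
func configureOcclusion(for configuration: ARWorldTrackingConfiguration,
                        expectsPeople: Bool) {
    guard expectsPeople else { return }

    // Check device support before requesting the frame semantic.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        configuration.frameSemantics.insert(.personSegmentationWithDepth)
    }
}
```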

Gracefully downgrade the AR experience when low-power or thermal events occur. For example, you could temporarily switch from a world-tracking configuration to a positional-tracking configuration (ARPositionalTrackingConfiguration) if your app can function at a basic level in that limited capacity until the device cools down. When you change your app's configuration mid-experience, call run(with:) using the updated or new configuration. For more information, see Switch Configurations at Runtime.
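A hedged sketch of one such downgrade path, assuming your app can operate in a positional-tracking-only mode: observe the system's thermal-state notification and rerun the session with a leaner configuration when pressure rises.

```swift
import ARKit

/// Observes thermal pressure and downgrades the session to
/// positional tracking when the device runs hot.
final class ThermalAwareSessionController {
    private let session: ARSession

    init(session: ARSession) {
        self.session = session
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(thermalStateDidChange),
            name: ProcessInfo.thermalStateDidChangeNotification,
            object: nil)
    }

    @objc private func thermalStateDidChange() {
        guard ProcessInfo.processInfo.thermalState == .serious else { return }

        // Switch mid-experience by calling run(with:) on the same session.
        // Existing anchors are kept because no reset options are passed.
        session.run(ARPositionalTrackingConfiguration())
    }
}
```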


Incorporating World Tracking

Understanding World Tracking

Discover features and best practices for building rear-camera AR experiences.

class ARWorldTrackingConfiguration

A configuration that monitors the iOS device's position and orientation while enabling you to augment the environment that's in front of the user.

On-Boarding Users to the Experience

class ARCoachingOverlayView

A view that presents visual instructions that guide the user during session initialization and recovery.
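As a sketch of typical setup, the overlay is attached to the running session and added as a subview; `sceneView` is assumed to be an ARSCNView that's already hosting the experience:

```swift
import ARKit

/// Installs a coaching overlay that guides the user until a
/// horizontal plane is found, then hides itself automatically.
func installCoachingOverlay(on sceneView: ARSCNView) {
    let coachingOverlay = ARCoachingOverlayView()
    coachingOverlay.session = sceneView.session
    coachingOverlay.goal = .horizontalPlane
    coachingOverlay.activatesAutomatically = true

    // Size the overlay to cover the AR view.
    coachingOverlay.frame = sceneView.bounds
    coachingOverlay.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    sceneView.addSubview(coachingOverlay)
}
```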

Tracking and Interacting with Surfaces

Tracking and Visualizing Planes

Detect surfaces in the physical environment and visualize their shape and location in 3D space.

Visualizing and Interacting with a Reconstructed Scene

Estimate the shape of the physical environment using a polygonal mesh.

class ARPlaneAnchor

A 2D surface that ARKit detects in the physical environment.

class ARMeshAnchor

A section of the reconstructed-scene mesh.

Raycasting and Hit-Testing

Find points on real-world surfaces given a screen location.
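A minimal sketch of raycasting from a screen point, assuming `sceneView` is an ARSCNView; raycasting is the recommended replacement for the older hit-testing API:

```swift
import ARKit

/// Converts a screen location (for example, from a tap gesture) into a
/// transform on a real-world surface, or nil if no surface is found.
func worldTransform(at screenPoint: CGPoint,
                    in sceneView: ARSCNView) -> simd_float4x4? {
    guard let query = sceneView.raycastQuery(from: screenPoint,
                                             allowing: .estimatedPlane,
                                             alignment: .any) else {
        return nil
    }

    // Use the nearest result; results are sorted by distance from the camera.
    return sceneView.session.raycast(query).first?.worldTransform
}
```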

Placing Virtual Content

Placing Objects and Handling 3D Interaction

Place virtual content at tracked, real-world locations, and enable the user to interact with virtual content by using gestures.

Managing World Data

Saving and Loading World Data

Serialize a world-tracking session to resume it later on.
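A hedged sketch of the saving half: capture the current ARWorldMap, archive it, and write it to a file URL of your choosing (the destination URL here is an assumption, such as a file in the app's Documents directory):

```swift
import ARKit

/// Captures the session's current world map and serializes it to disk
/// so the experience can be resumed later.
func saveWorldMap(from session: ARSession, to url: URL) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else { return }
        do {
            // ARWorldMap supports secure coding, so archive it directly.
            let data = try NSKeyedArchiver.archivedData(withRootObject: map,
                                                        requiringSecureCoding: true)
            try data.write(to: url, options: .atomic)
        } catch {
            // Handle archiving or file-write failures here.
        }
    }
}
```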

class ARWorldMap

The space-mapping state and set of anchors from a world-tracking AR session.