A configuration that monitors the iOS device's position and orientation while enabling you to augment the environment that's in front of the user.


@interface ARWorldTrackingConfiguration : ARConfiguration


All AR configurations establish a correspondence between the real world that the device inhabits and a virtual 3D-coordinate space, where you can model content. When your app mixes virtual content with a live camera image, the user experiences the illusion that your virtual content is part of the real world.

Creating and maintaining this correspondence between spaces requires tracking the device's motion. The ARWorldTrackingConfiguration class tracks the device's movement with six degrees of freedom (6DOF): specifically, the three rotation axes (roll, pitch, and yaw), and three translation axes (movement in x, y, and z).

This kind of tracking can create immersive AR experiences: A virtual object can appear to stay in the same place relative to the real world, even as the user tilts the device to look above or below the object, or moves the device around to see the object's sides and back.

Figure 1: 6DOF tracking maintains an AR illusion regardless of device rotation or movement
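As a minimal sketch of how a world-tracking session is typically started, assuming an `ARSCNView` outlet named `sceneView` in an iOS view controller:

```swift
import ARKit

class ViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Create a configuration that tracks device position and
        // orientation with six degrees of freedom.
        let configuration = ARWorldTrackingConfiguration()
        // Run the view's session with that configuration.
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Pause the session when the view is no longer visible.
        sceneView.session.pause()
    }
}
```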

World-tracking sessions also provide several ways for your app to recognize or interact with elements of the real-world scene visible to the camera:

- Detect flat surfaces in the environment (see Enabling Plane Detection).
- Recognize known 2D images and track their movement (see Enabling Image Detection or Tracking).
- Recognize known 3D objects (see Enabling 3D Object Detection).
- Find 3D positions on real-world surfaces corresponding to a screen point (see Ray-Casting and Hit-Testing).


Creating a Configuration

- init

Initializes a new world-tracking configuration.

+ new

Creates a new world-tracking configuration.


initialWorldMap

The state from a previous AR session to attempt to resume with this session configuration.

Enabling Plane Detection


planeDetection

A value that specifies if and how the session automatically attempts to detect flat surfaces in the camera-captured image.


ARWorldTrackingConfiguration.PlaneDetection

Options for whether and how ARKit detects flat surfaces in captured images.
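A short sketch of opting in to plane detection, assuming `session` is an existing `ARSession`:

```swift
// Detect both horizontal and vertical flat surfaces.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal, .vertical]
session.run(configuration)
```

When detection is enabled, ARKit adds an `ARPlaneAnchor` for each surface it finds, which your session delegate can observe in `session(_:didAdd:)`.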

Enabling Image Detection or Tracking


detectionImages

A set of images that ARKit attempts to detect in the user's environment.


maximumNumberOfTrackedImages

The maximum number of detection images for which to simultaneously track movement.


automaticImageScaleEstimationEnabled

A flag that instructs ARKit to estimate and set the scale of a detected or tracked image on your behalf.
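A sketch of enabling image detection and tracking, assuming the reference images live in an asset catalog resource group named "AR Resources" (a hypothetical name) and `session` is an existing `ARSession`:

```swift
let configuration = ARWorldTrackingConfiguration()
// Load reference images from the app's asset catalog.
if let referenceImages = ARReferenceImage.referenceImages(
        inGroupNamed: "AR Resources", bundle: nil) {
    configuration.detectionImages = referenceImages
    // Track up to two of the detected images as they move.
    configuration.maximumNumberOfTrackedImages = 2
}
session.run(configuration)
```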

Enabling 3D Object Detection


detectionObjects

A set of 3D objects for ARKit to attempt to detect in the user's environment.
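A sketch of enabling 3D object detection, assuming previously scanned reference objects in an asset catalog group named "AR Objects" (a hypothetical name) and an existing `ARSession` named `session`:

```swift
let configuration = ARWorldTrackingConfiguration()
// Load reference objects from the app's asset catalog.
if let referenceObjects = ARReferenceObject.referenceObjects(
        inGroupNamed: "AR Objects", bundle: nil) {
    configuration.detectionObjects = referenceObjects
}
session.run(configuration)
```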

Enabling Face Tracking


userFaceTrackingEnabled

A flag that determines whether ARKit tracks the user's face in a world-tracking session.


supportsUserFaceTracking

A Boolean value that tells you whether the iOS device supports tracking the user's face during a world-tracking session.
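Because face tracking requires a front-facing TrueDepth camera, check device support before opting in. A minimal sketch, assuming an existing `ARSession` named `session`:

```swift
let configuration = ARWorldTrackingConfiguration()
// Only enable face tracking on devices that support it.
if ARWorldTrackingConfiguration.supportsUserFaceTracking {
    configuration.userFaceTrackingEnabled = true
}
session.run(configuration)
```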

Creating Realistic Reflections


environmentTexturing

The behavior ARKit uses for generating environment textures.


ARWorldTrackingConfiguration.EnvironmentTexturing

Options for generating environment textures in a world-tracking AR session.


AREnvironmentProbeAnchor

An object that provides environmental lighting information for a specific area of space in a world-tracking AR session.


wantsHDREnvironmentTextures

A flag that instructs ARKit to create environment textures in HDR format.
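A sketch of requesting automatically generated environment textures for realistic reflections, assuming an existing `ARSession` named `session`:

```swift
let configuration = ARWorldTrackingConfiguration()
// Let ARKit decide where to place environment probes and
// generate their textures automatically.
configuration.environmentTexturing = .automatic
// Request HDR environment textures for more realistic reflections.
configuration.wantsHDREnvironmentTextures = true
session.run(configuration)
```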

Managing Device Camera Behavior


isAutoFocusEnabled

A Boolean value that determines whether the device camera uses fixed focus or autofocus behavior.
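A sketch of switching the camera to fixed focus, which can help keep tracking stable when the user works close to small, stationary virtual content; assumes an existing `ARSession` named `session`:

```swift
let configuration = ARWorldTrackingConfiguration()
// Use fixed focus instead of the default autofocus behavior.
configuration.isAutoFocusEnabled = false
session.run(configuration)
```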

Enabling Collaboration


isCollaborationEnabled

A flag that opts you in to a peer-to-peer multiuser AR experience.
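A sketch of opting in to a collaborative session. When collaboration is enabled, the session periodically produces `ARCollaborationData` that your app must deliver to peers over its own networking layer (for example, MultipeerConnectivity); assumes an existing `ARSession` named `session`:

```swift
let configuration = ARWorldTrackingConfiguration()
configuration.isCollaborationEnabled = true
session.run(configuration)

// ARSessionDelegate callback carrying data to share with peers.
func session(_ session: ARSession,
             didOutputCollaborationData data: ARCollaborationData) {
    // Send `data` to connected peers; the transport is up to your app.
}
```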


Inherits From

ARConfiguration

See Also

World Tracking

Understanding World Tracking

Discover supporting concepts, features, and best practices for building great AR experiences.


ARPlaneAnchor

A 2D surface that ARKit detects in the physical environment.

Tracking and Visualizing Planes

Detect surfaces in the physical environment and visualize their shape and location in 3D space.


ARCoachingOverlayView

A view that presents visual instructions that guide the user.

Placing Objects and Handling 3D Interaction

Place virtual content on real-world surfaces, and enable the user to interact with virtual content by using gestures.


ARWorldMap

The space-mapping state and set of anchors from a world-tracking AR session.

Saving and Loading World Data

Serialize a world-tracking session to resume it later.
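A sketch of saving a session's world map and resuming from it later, assuming `session` is a running `ARSession` and `url` is a file URL your app manages:

```swift
// Capture the current world map and archive it to disk.
session.getCurrentWorldMap { worldMap, error in
    guard let worldMap = worldMap else { return }
    if let data = try? NSKeyedArchiver.archivedData(
            withRootObject: worldMap, requiringSecureCoding: true) {
        try? data.write(to: url)
    }
}

// Later: unarchive the map and resume tracking relative to it.
if let data = try? Data(contentsOf: url),
   let worldMap = try? NSKeyedUnarchiver.unarchivedObject(
        ofClass: ARWorldMap.self, from: data) {
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = worldMap
    session.run(configuration,
                options: [.resetTracking, .removeExistingAnchors])
}
```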

Ray-Casting and Hit-Testing

Find 3D positions on real-world surfaces given a screen point.
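A sketch of a ray cast from a screen point onto a real-world surface, assuming an `ARSCNView` named `sceneView` and a `CGPoint` named `screenPoint` (for example, from a tap gesture):

```swift
// Build a ray-cast query from the screen point, allowing hits against
// planes ARKit has estimated in the scene.
if let query = sceneView.raycastQuery(from: screenPoint,
                                      allowing: .estimatedPlane,
                                      alignment: .any),
   let result = sceneView.session.raycast(query).first {
    // The result's world transform gives the 3D position on the surface;
    // use it to place virtual content.
    let position = result.worldTransform.columns.3
    print("Hit surface at \(position)")
}
```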