A configuration that monitors the iOS device's position and orientation while enabling you to augment the environment that's in front of the user.
SDK
- iOS 11.0+
Framework
- ARKit
Declaration
@interface ARWorldTrackingConfiguration : ARConfiguration
Overview
All AR configurations establish a correspondence between the real world that the device inhabits and a virtual 3D-coordinate space, where you can model content. When your app mixes virtual content with a live camera image, the user experiences the illusion that your virtual content is part of the real world.
Creating and maintaining this correspondence between spaces requires tracking the device's motion. The ARWorldTrackingConfiguration class tracks the device's movement with six degrees of freedom (6DOF): specifically, the three rotation axes (roll, pitch, and yaw) and the three translation axes (movement in x, y, and z).
This kind of tracking can create immersive AR experiences: A virtual object can appear to stay in the same place relative to the real world, even as the user tilts the device to look above or below the object, or moves the device around to see the object's sides and back.
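As a minimal sketch of starting such an experience (assuming an ARSCNView named sceneView has already been set up, for example in a storyboard), a world-tracking session is run by passing the configuration to the view's session:

```swift
import ARKit

// Create a world-tracking configuration and run it on the view's session.
// From this point on, ARKit tracks the device with 6DOF, so virtual
// content placed in the scene appears fixed in the real world.
let configuration = ARWorldTrackingConfiguration()
sceneView.session.run(configuration)
```

Calling run(_:) again later with a changed configuration reconfigures the same session in place.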
6DOF tracking maintains an AR illusion regardless of device rotation or movement

World-tracking sessions also provide several ways for your app to recognize or interact with elements of the real-world scene visible to the camera:
- Use planeDetection to find real-world horizontal or vertical surfaces, adding them to the session as ARPlaneAnchor objects.
- Use detectionImages to recognize and track the movement of known 2D images, adding them to the scene as ARImageAnchor objects.
- Use detectionObjects to recognize known 3D objects, adding them to the scene as ARObjectAnchor objects.
- Use hit-testing methods on ARFrame, ARSCNView, or ARSKView to find the 3D positions of real-world features corresponding to a 2D point in the camera view.
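The first and last of these can be combined as in the sketch below, which enables plane detection and then hit-tests a screen point against the detected planes. It assumes an ARSCNView named sceneView and a CGPoint named tapPoint (e.g. from a tap gesture recognizer); note that vertical plane detection requires iOS 11.3 or later.

```swift
import ARKit

// Ask ARKit to detect flat surfaces; each detected surface is added
// to the session as an ARPlaneAnchor.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal, .vertical]
sceneView.session.run(configuration)

// Later, hit-test a 2D point in the camera view (e.g. from a tap)
// against the extents of planes the session has already detected.
let results = sceneView.hitTest(tapPoint, types: .existingPlaneUsingExtent)
if let result = results.first {
    // The fourth column of worldTransform is the hit's 3D world position,
    // a natural place to anchor virtual content.
    let position = result.worldTransform.columns.3
    print("Hit plane at \(position.x), \(position.y), \(position.z)")
}
```

An empty result array simply means no detected plane lies under that point yet; plane detection is incremental, so results improve as the user scans the scene.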