A configuration that monitors the iOS device's position and orientation and lets you augment the environment in front of the user.
All AR configurations establish a correspondence between the real world the device inhabits and a virtual 3D coordinate space in which you can model content. When your app mixes virtual content with a live camera image, the user experiences the illusion that your virtual content is part of the real world.
Creating and maintaining this correspondence between spaces requires tracking the device's motion. The ARWorldTrackingConfiguration class tracks the device's movement with six degrees of freedom (6DOF): specifically, the three rotation axes (roll, pitch, and yaw), and three translation axes (movement in x, y, and z).
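A world-tracking session is started by running this configuration on a session. A minimal sketch (assuming an `ARSCNView` named `sceneView` already exists in your view controller, as is typical for SceneKit-based AR apps):

```swift
import ARKit

// Create a configuration that tracks the device with six degrees of freedom
// (roll, pitch, yaw, and translation in x, y, and z).
let configuration = ARWorldTrackingConfiguration()

// Run the view's session with that configuration; tracking begins immediately.
// `sceneView` is an assumed, pre-existing ARSCNView in this sketch.
sceneView.session.run(configuration)
```

Once the session is running, the session's current frame exposes the camera transform that expresses the tracked 6DOF pose.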
This kind of tracking can create immersive AR experiences: A virtual object can appear to stay in the same place relative to the real world, even as the user tilts the device to look above or below the object, or moves the device around to see the object's sides and back.
World-tracking sessions also provide several ways for your app to recognize or interact with elements of the real-world scene visible to the camera:
- Find 3D positions on real-world surfaces given a screen point (hit testing).
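Hit testing can be sketched as follows. This is an illustrative helper (the function name `worldPosition(for:in:)` is an assumption, not part of the framework), assuming `sceneView` is an `ARSCNView` whose session is running a world-tracking configuration:

```swift
import ARKit

// Convert a 2D screen point (for example, a tap location) into a 3D position
// on a detected real-world surface, if one lies along that ray.
func worldPosition(for screenPoint: CGPoint, in sceneView: ARSCNView) -> simd_float3? {
    // Search for detected planes first, falling back to estimated planes.
    let results = sceneView.hitTest(screenPoint,
                                    types: [.existingPlaneUsingExtent, .estimatedHorizontalPlane])
    guard let result = results.first else { return nil }

    // The hit-test result's world transform encodes the 3D intersection point;
    // its fourth column holds the translation in world coordinates.
    let translation = result.worldTransform.columns.3
    return simd_float3(translation.x, translation.y, translation.z)
}
```

Results are sorted nearest-first, so taking the first result gives the surface closest to the camera along the ray through the screen point.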
This documentation contains preliminary information about an API or technology in development. This information is subject to change, and software implemented according to this documentation should be tested with final operating system software.