A configuration that uses the back-facing camera, tracks a device's orientation and position, and detects real-world flat surfaces.
- iOS 11.0+
All AR configurations establish a correspondence between the real world the device inhabits and a virtual 3D coordinate space where you can model content. When your app displays that content together with a live camera image, the user experiences the illusion that your virtual content is part of the real world.
Creating and maintaining this correspondence between spaces requires tracking the device's motion. The ARWorldTrackingConfiguration class tracks the device's movement with six degrees of freedom (6DOF): specifically, the three rotation axes (roll, pitch, and yaw) and the three translation axes (movement in x, y, and z).
This kind of tracking can create immersive AR experiences: A virtual object can appear to stay in the same place relative to the real world, even as the user tilts the device to look above or below the object, or moves the device around to see the object's sides and back.
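A minimal sketch of running such a session, assuming an ARSCNView-backed view controller (the class and outlet names are illustrative, not part of this API):

```swift
import ARKit

class ViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // Create a configuration that tracks the device's orientation
        // and position with six degrees of freedom.
        let configuration = ARWorldTrackingConfiguration()

        // Run the view's session with that configuration.
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)

        // Pause the session when the view is no longer visible.
        sceneView.session.pause()
    }
}
```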
If you enable the
planeDetection setting, ARKit analyzes the scene to find real-world horizontal or vertical surfaces. For each plane it detects, ARKit automatically adds an
ARPlaneAnchor object to the session.
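For example, a session delegate can observe the plane anchors ARKit adds (the PlaneObserver class name here is illustrative):

```swift
import ARKit

// Enable detection of both horizontal and vertical surfaces.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal, .vertical]

class PlaneObserver: NSObject, ARSessionDelegate {
    // ARKit calls this method when it adds anchors to the session,
    // including the ARPlaneAnchor objects it creates for detected planes.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let planeAnchor as ARPlaneAnchor in anchors {
            print("Detected a \(planeAnchor.alignment) plane")
        }
    }
}
```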
If you provide images for the detectionImages property, ARKit analyzes the scene to recognize those images. For each image it detects, ARKit automatically adds an
ARImageAnchor object to the session.
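A sketch of configuring image detection; the asset catalog group name "AR Resources" is an assumption about the app's resources:

```swift
import ARKit

let configuration = ARWorldTrackingConfiguration()

// Load reference images from an asset catalog group. The group name
// "AR Resources" is a placeholder for the app's own resource group.
if let referenceImages = ARReferenceImage.referenceImages(
    inGroupNamed: "AR Resources", bundle: nil) {
    configuration.detectionImages = referenceImages
}
```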