Create a rear-camera AR experience by tracking surfaces, images, objects, people, or user faces.
When you run an AR session using
ARWorld, your app can detect real-world surfaces and interact with them by placing virtual content. You can also enable optional features, like image detection and tracking, known-object detection, body tracking and people occlusion, user face-tracking, realistic reflections, world saving and loading, and scene reconstruction.
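As a sketch of how you might opt in to these features, the following assumes ARWorld exposes Boolean options for each capability; the property names (`detectsImages`, `occludesPeople`) and the `session` instance are illustrative assumptions, not a confirmed API.

```swift
// Hypothetical sketch: configure a rear-camera world-tracking session.
// Property names below are illustrative assumptions, not a confirmed API.
let configuration = ARWorld()

// Enable only the optional features this experience needs.
configuration.detectsImages = true      // image detection and tracking
configuration.occludesPeople = false    // no people expected in this scene

// Assumes `session` is your app's running AR session.
session.run(with: configuration)
```

Keeping each flag explicit makes it easy to audit which features your app pays for in energy and compute.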
Consider Device Resources
Because ARWorld provides the widest range of features in a rear-camera experience, you should enable additional options sparingly. Each feature consumes device energy and compute cycles, so to maximize device uptime and performance, enable only the world-tracking features your app currently requires.
Take proactive measures by disabling options your app doesn't require ahead of time. For example, refrain from turning on people occlusion for single-user experiences in which you don't expect people in the scene. If another AR configuration fulfills your requirements with a smaller feature set, use that configuration instead. For example, use
ARBody instead of a world-tracking configuration for 3D motion-capture if you don't need user face-tracking, collaboration, or scene reconstruction. For a list of available configurations, see Choose Your Configuration.
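One way to express this choice is to select the configuration from your app's actual requirements, as in the hedged sketch below. The shared base type `ARConfiguration` and the requirement flags are assumptions for illustration; only `ARWorld` and `ARBody` come from this document.

```swift
// Hypothetical sketch: prefer the leaner ARBody configuration when the
// experience only needs 3D motion capture. The ARConfiguration base type
// and the flags below are illustrative assumptions.
let needsFaceTracking = false
let needsCollaboration = false
let needsSceneReconstruction = false

let configuration: ARConfiguration
if needsFaceTracking || needsCollaboration || needsSceneReconstruction {
    configuration = ARWorld()   // full world-tracking feature set
} else {
    configuration = ARBody()    // narrower configuration for motion capture
}

// Assumes `session` is your app's running AR session.
session.run(with: configuration)
```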
Gracefully downgrade the AR experience when a low-power or thermal event occurs. For example, you could temporarily switch from a world-tracking configuration to a position-tracking configuration (
ARPositional) if your app can function at a basic level in that limited capacity until the device cools down. When you change your app's configuration mid-experience, call
run(with:) using the updated or new configuration. For more information, see Switch Configurations at Runtime.
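A minimal sketch of this downgrade might observe Foundation's thermal-state notification and swap configurations accordingly. `ProcessInfo.thermalStateDidChangeNotification` is a real Foundation mechanism; the AR types and the `session` instance follow this document's names and are otherwise assumptions.

```swift
import Foundation

// Hypothetical sketch: downgrade to position-only tracking under thermal
// pressure, then restore full world tracking once the device cools down.
// Assumes `session` is your app's running AR session.
NotificationCenter.default.addObserver(
    forName: ProcessInfo.thermalStateDidChangeNotification,
    object: nil,
    queue: .main
) { _ in
    switch ProcessInfo.processInfo.thermalState {
    case .serious, .critical:
        // Limited but lower-cost tracking while the device is hot.
        session.run(with: ARPositional())
    default:
        // Restore the full world-tracking experience.
        session.run(with: ARWorld())
    }
}
```

Calling run(with:) with the new configuration mid-experience, as described above, is what performs the actual switch.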