Class

ARBodyTrackingConfiguration

A configuration you use to track a person's motion in 3D space.

Declaration

class ARBodyTrackingConfiguration : ARConfiguration

Overview

When ARKit identifies a person in the back camera feed, it calls your session delegate's session(_:didAdd:) method, passing you an ARBodyAnchor that you can use to track the body's movement.
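
For example, a session owner that receives body anchors might look like the following sketch; the view controller and its session property are placeholder names, not part of ARKit.

import ARKit
import UIKit

// A minimal sketch, assuming a view controller that owns a plain ARSession;
// the class name and the session property are placeholders, not ARKit API.
class BodyTrackingViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self
        session.run(ARBodyTrackingConfiguration())
    }

    // ARKit calls this delegate method when it adds anchors, including the
    // ARBodyAnchor for a person it detects in the back camera feed.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let bodyAnchor as ARBodyAnchor in anchors {
            // The anchor's transform locates the body's hip (root) joint in world space.
            print("Body detected at \(bodyAnchor.transform.columns.3)")
        }
    }
}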

Plane detection and image detection are also enabled. If you use a body anchor to display a virtual character, you can place the character on a surface or image that you choose.

The ARConfiguration.FrameSemantics option bodyDetection is enabled by default, which gives you access to the joint positions of a person that ARKit detects in the camera feed through the frame's detectedBody property.
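
The following sketch reads that 2D result on each frame. It assumes the same session delegate as in the sketch above, and uses the skeleton's jointLandmarks property.

// Reads the 2D body detection result that ARKit delivers with each frame.
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    guard let person = frame.detectedBody else { return }
    // Landmarks are joint positions in normalized image coordinates.
    let joints = person.skeleton.jointLandmarks
    print("ARKit sees \(joints.count) 2D joints for the detected person")
}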

Topics

Creating a Configuration

init()

Creates a new body tracking configuration.

var initialWorldMap: ARWorldMap?

The state from a previous AR session to attempt to resume with this session configuration.
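
For example, the following sketch checks for device support, restores a saved world map, and runs the configuration. The session and savedWorldMap names are placeholders for objects your app owns.

// A sketch: verify support, resume from an archived world map, and run.
guard ARBodyTrackingConfiguration.isSupported else {
    fatalError("Body tracking isn't supported on this device.")
}

let configuration = ARBodyTrackingConfiguration()
configuration.initialWorldMap = savedWorldMap   // ARWorldMap from a previous session, if any
session.run(configuration)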

Estimating Body Scale

var automaticSkeletonScaleEstimationEnabled: Bool

A flag that determines whether ARKit estimates the height of a body that it's tracking.
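
The following sketch opts in to scale estimation; session stands in for an ARSession your app owns.

// Ask ARKit to estimate the tracked person's real-world size.
let configuration = ARBodyTrackingConfiguration()
configuration.automaticSkeletonScaleEstimationEnabled = true
session.run(configuration)

// Later, a tracked ARBodyAnchor reports the estimated factor relative to the
// default skeleton size through its estimatedScaleFactor property.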

Enabling Auto Focus

var isAutoFocusEnabled: Bool

A Boolean value that determines whether the device camera uses fixed focus or autofocus behavior.
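
The following sketch switches the camera to fixed focus, again assuming a session your app owns.

// Use fixed focus instead of autofocus.
let configuration = ARBodyTrackingConfiguration()
configuration.isAutoFocusEnabled = false
session.run(configuration)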

Enabling Plane Detection

var planeDetection: ARWorldTrackingConfiguration.PlaneDetection

A value specifying whether and how the session attempts to automatically detect flat surfaces in the camera-captured image.

struct ARWorldTrackingConfiguration.PlaneDetection

Options for whether and how ARKit detects flat surfaces in captured images.
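
For example, the following sketch asks for horizontal and vertical plane detection so you can place a character driven by the body anchor on a detected surface; session is assumed to exist elsewhere in your app.

// Detect horizontal and vertical planes alongside body tracking.
let configuration = ARBodyTrackingConfiguration()
configuration.planeDetection = [.horizontal, .vertical]
session.run(configuration)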

Enabling Image Tracking

var automaticImageScaleEstimationEnabled: Bool

A flag that instructs ARKit to estimate and set the scale of a tracked image on your behalf.

var detectionImages: Set<ARReferenceImage>

A set of images that ARKit attempts to detect in the user's environment.

var maximumNumberOfTrackedImages: Int

The maximum number of detection images for which the session simultaneously tracks movement.
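
For example, the following sketch enables image tracking alongside body tracking. The resource group name and the session are assumptions, not part of this API.

// Load reference images from an asset catalog group named "AR Resources"
// (a placeholder name) and track up to two of them at once.
let configuration = ARBodyTrackingConfiguration()
if let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",
                                                          bundle: nil) {
    configuration.detectionImages = referenceImages
    configuration.maximumNumberOfTrackedImages = 2
    configuration.automaticImageScaleEstimationEnabled = true
}
session.run(configuration)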

Adding Realistic Reflections

var wantsHDREnvironmentTextures: Bool

A flag that instructs ARKit to create environment textures in HDR format.

var environmentTexturing: ARWorldTrackingConfiguration.EnvironmentTexturing

The behavior ARKit uses for generating environment textures.
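
For example, the following sketch turns on automatic environment texturing and requests HDR textures, assuming the same session as in the earlier sketches.

// Generate environment textures automatically, in HDR, for realistic reflections.
let configuration = ARBodyTrackingConfiguration()
configuration.environmentTexturing = .automatic
configuration.wantsHDREnvironmentTextures = true
session.run(configuration)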

Relationships

Inherits From

ARConfiguration

Conforms To

See Also

People

Capturing Body Motion in 3D

Track a person in the physical environment and visualize their motion by applying the same body movements to a virtual puppet.

class ARBodyAnchor

An object that tracks the movement in 3D space of a body that ARKit recognizes in the camera feed.

Beta

class ARBody2D

The screen-space representation of a person ARKit recognizes in the camera feed.

Beta

Beta Software

This documentation contains preliminary information about an API or technology in development. This information is subject to change, and software implemented according to this documentation should be tested with final operating system software.
