Class

ARFrame

A video image captured as part of an AR session, together with position-tracking information.

Declaration

class ARFrame : NSObject

Overview

A running session continuously captures video frames from the device camera while ARKit analyzes them to estimate the user's position in the world. ARKit provides this information to you in the form of an ARFrame, delivered at the frequency of your app's frame rate.

Your app has two ways to receive an ARFrame:

- Implement the ARSessionDelegate method session(_:didUpdate:), which ARKit calls once for each frame the session captures.
- Read the session's currentFrame property whenever your app needs the latest frame, for example once per render pass.
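
A minimal sketch of both approaches, assuming a simple NSObject subclass (the FrameConsumer name is a placeholder):

import ARKit

class FrameConsumer: NSObject, ARSessionDelegate {
    let session = ARSession()

    override init() {
        super.init()
        session.delegate = self
    }

    // Way 1: ARKit calls this delegate method once per captured frame.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Inspect frame.camera, frame.anchors, frame.capturedImage, and so on.
    }

    // Way 2: poll the session's currentFrame on demand, such as from a render loop.
    func latestFrame() -> ARFrame? {
        session.currentFrame
    }
}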

Topics

Accessing Captured Video Frames

var capturedImage: CVPixelBuffer

A pixel buffer containing the image captured by the camera.

var timestamp: TimeInterval

The time at which the frame was captured.

var capturedDepthData: AVDepthData?

The depth map, if any, captured along with the video frame.

var capturedDepthDataTimestamp: TimeInterval

The time at which depth data for the frame (if any) was captured.
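
A sketch of reading these properties from a frame (the snapshot helper name is hypothetical):

import ARKit
import CoreImage

func snapshot(of frame: ARFrame) -> CIImage {
    print("Frame captured at \(frame.timestamp) s")
    if let depth = frame.capturedDepthData {
        print("Depth captured at \(frame.capturedDepthDataTimestamp) s: \(depth)")
    }
    // capturedImage is a CVPixelBuffer in the camera's native format;
    // CIImage wraps it without copying.
    return CIImage(cvPixelBuffer: frame.capturedImage)
}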

Checking World Mapping Status

var worldMappingStatus: ARFrame.WorldMappingStatus

The feasibility of generating or relocalizing a world map for this frame.

enum ARFrame.WorldMappingStatus

Possible values describing how thoroughly ARKit has mapped the area visible in a given frame.
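
For example, a sketch that defers saving a world map until the visible area is mapped (the function name is hypothetical):

import ARKit

func saveWorldMapIfReady(from session: ARSession, frame: ARFrame) {
    // Only request a world map once ARKit reports the visible area as mapped.
    guard frame.worldMappingStatus == .mapped else { return }
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else { return }
        // Archive `map` so a later session can relocalize to it.
        _ = map
    }
}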

Examining Scene Parameters

var camera: ARCamera

Information about the camera position, orientation, and imaging parameters used to capture the frame.

var lightEstimate: ARLightEstimate?

An estimate of lighting conditions based on the camera image.

func displayTransform(for: UIInterfaceOrientation, viewportSize: CGSize) -> CGAffineTransform

Returns an affine transform for converting between normalized image coordinates and a coordinate space appropriate for rendering the camera image onscreen.
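
A sketch that combines these members: it reads the ambient light estimate and projects a normalized image point into view coordinates (the function name is hypothetical):

import ARKit

func overlayPoint(for normalizedPoint: CGPoint,
                  in frame: ARFrame,
                  viewportSize: CGSize,
                  orientation: UIInterfaceOrientation) -> CGPoint {
    if let light = frame.lightEstimate {
        print("Ambient intensity: \(light.ambientIntensity) lumens")
    }
    // The transform maps normalized image coordinates to normalized
    // view coordinates; scale by the viewport to get screen points.
    let transform = frame.displayTransform(for: orientation, viewportSize: viewportSize)
    let p = normalizedPoint.applying(transform)
    return CGPoint(x: p.x * viewportSize.width, y: p.y * viewportSize.height)
}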

Tracking and Finding Objects

var anchors: [ARAnchor]

The list of anchors representing positions tracked or objects detected in the scene.

func hitTest(CGPoint, types: ARHitTestResult.ResultType) -> [ARHitTestResult]

Searches for real-world objects or AR anchors in the captured camera image.
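
For example, a sketch that hit-tests the center of the captured image against detected planes (the function name is hypothetical):

import ARKit

func centerPlaneHit(in frame: ARFrame) -> ARHitTestResult? {
    // hitTest takes a point in normalized image coordinates, where
    // (0, 0) is the top-left corner and (0.5, 0.5) is the center.
    let results = frame.hitTest(CGPoint(x: 0.5, y: 0.5),
                                types: [.existingPlaneUsingExtent, .estimatedHorizontalPlane])
    return results.first
}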

Debugging Scene Detection

var rawFeaturePoints: ARPointCloud?

The current intermediate results of the scene analysis ARKit uses to perform world tracking.

class ARPointCloud

A collection of points in the world coordinate space of the AR session.
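
A sketch that inspects the raw feature points while debugging tracking quality (the function name is hypothetical):

import ARKit

func logFeaturePoints(in frame: ARFrame) {
    // A sparse cloud often explains limited or degraded tracking.
    guard let cloud = frame.rawFeaturePoints else { return }
    print("Tracking \(cloud.points.count) feature points in world space")
}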

Tracking Human Bodies in 2D

var detectedBody: ARBody2D?

The screen position information of a body that ARKit recognizes in the camera image.

class ARBody2D

The screen-space representation of a person ARKit recognizes in the camera feed.
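
A sketch that reads the detected body's head landmark, assuming the session configuration has body detection enabled (the function name is hypothetical):

import ARKit

func headScreenPosition(in frame: ARFrame) -> CGPoint? {
    // detectedBody is populated only when the configuration's
    // frameSemantics includes .bodyDetection.
    guard let body = frame.detectedBody,
          let head = body.skeleton.landmark(for: .head) else { return nil }
    // Landmarks are in normalized image coordinates.
    return CGPoint(x: CGFloat(head.x), y: CGFloat(head.y))
}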

Occluding Virtual Content with People

var segmentationBuffer: CVPixelBuffer?

A buffer that contains pixel information identifying the shape of objects from the camera feed that you use to occlude virtual content.

var estimatedDepthData: CVPixelBuffer?

A buffer that represents the estimated depth values from the camera feed that you use to occlude virtual content.

enum ARFrame.SegmentationClass

A categorization of a pixel that defines a type of content you use to occlude your app's virtual content.
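
A sketch of the opt-in and the per-frame reads (the function names are hypothetical):

import ARKit

func runWithPeopleOcclusion(on session: ARSession) {
    // The buffers below are populated only when the configuration
    // opts in to person segmentation.
    let configuration = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        configuration.frameSemantics.insert(.personSegmentationWithDepth)
    }
    session.run(configuration)
}

func occlusionBuffers(in frame: ARFrame) -> (mask: CVPixelBuffer, depth: CVPixelBuffer)? {
    guard let mask = frame.segmentationBuffer,
          let depth = frame.estimatedDepthData else { return nil }
    // Feed both buffers to a compositing pass that draws people
    // in front of virtual content.
    return (mask, depth)
}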

Enabling Camera Grain

var cameraGrainIntensity: Float

A value that specifies the amount of grain present in the camera grain texture.

var cameraGrainTexture: MTLTexture?

A tileable Metal texture created by ARKit to match the visual characteristics of the current video stream.
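
A sketch that retrieves the grain texture and intensity for a custom Metal renderer; how you composite them is up to your render pass, and the function name is hypothetical:

import ARKit
import Metal

func cameraGrain(in frame: ARFrame) -> (texture: MTLTexture, intensity: Float)? {
    guard let texture = frame.cameraGrainTexture else { return nil }
    // In a custom Metal pass, sample this tileable noise texture and
    // blend it over virtual content, using cameraGrainIntensity to
    // match the video stream's noise level.
    return (texture, frame.cameraGrainIntensity)
}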

Relationships

Inherits From

NSObject

See Also

Camera

Occluding Virtual Content with People

Cover your app’s virtual content with people that ARKit perceives in the camera feed.

class ARCamera

Information about the camera position and imaging characteristics for a given frame.