Cameras and Media Capture

Capture photos and record video and audio; configure built-in cameras and microphones or external capture devices.

Overview

The AVFoundation Capture subsystem provides a common high-level architecture for video, photo, and audio capture services in iOS and macOS. Use this system if you want to:

  • Build a custom camera UI to integrate shooting photos or videos into your app’s user experience.

  • Give users more direct control over photo and video capture, such as focus, exposure, and stabilization options.

  • Produce different results than the system camera UI, such as RAW format photos, depth maps, or videos with custom timed metadata.

  • Get live access to pixel or audio data streaming directly from a capture device.

The main parts of the capture architecture are sessions, inputs, and outputs. A capture session connects one or more inputs to one or more outputs. Inputs are sources of media, including capture devices like the cameras and microphones built into an iOS device or Mac. Outputs acquire media from inputs to produce useful data, such as movie files written to disk or raw pixel buffers available for live processing.

Figure: The basic capture session architecture. An AVCaptureSession acquires data from an AVCaptureDevice through an AVCaptureDeviceInput and provides data to one or more AVCaptureOutput objects.
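
In Swift, a minimal sketch of that flow might look like the following: the default camera feeding a session that has a single photo output. Authorization checks, error handling, and threading concerns are omitted for brevity.

```swift
import AVFoundation

// A minimal sketch of the session–input–output flow: one input, one output, one session.
func makeBasicSession() -> AVCaptureSession? {
    let session = AVCaptureSession()

    // Input: the default video capture device (for example, the back camera on iPhone).
    guard let camera = AVCaptureDevice.default(for: .video),
          let cameraInput = try? AVCaptureDeviceInput(device: camera),
          session.canAddInput(cameraInput) else { return nil }
    session.addInput(cameraInput)

    // Output: a photo output that delivers AVCapturePhoto objects.
    let photoOutput = AVCapturePhotoOutput()
    guard session.canAddOutput(photoOutput) else { return nil }
    session.addOutput(photoOutput)

    // Start the flow of data from the input to the output.
    session.startRunning()
    return session
}
```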

Topics

User Privacy

Requesting Authorization for Media Capture on iOS

Respect user privacy by seeking permission to capture and store photos, audio, and video.

Requesting Authorization for Media Capture on macOS

Prompt the user to authorize access to the camera and microphone.
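
As a sketch, the iOS authorization check might look like the following; the macOS flow uses the same AVCaptureDevice APIs. The setUpCaptureSession() helper is hypothetical and stands in for your own configuration code, and your Info.plist must include a camera usage description before you request access.

```swift
import AVFoundation

// Hypothetical helper standing in for your own session configuration code.
func setUpCaptureSession() { /* configure and start an AVCaptureSession */ }

func requestCameraAccessIfNeeded() {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        setUpCaptureSession()
    case .notDetermined:
        // The system shows the permission prompt the first time you ask.
        AVCaptureDevice.requestAccess(for: .video) { granted in
            if granted { setUpCaptureSession() }
        }
    case .denied, .restricted:
        break   // Direct the user to Settings if capture is essential to your app.
    @unknown default:
        break
    }
}
```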

Capture Sessions

Setting Up a Capture Session

Configure input devices, output media, preview views, and basic settings before capturing photos or video.

AVCam: Building a Camera App

Capture photos with depth data and record video using the front and rear iPhone and iPad cameras.

AVMultiCamPiP: Capturing from Multiple Cameras

Simultaneously record the output from the front and back cameras into a single movie file by using a multi-camera capture session.

AVCaptureSession

An object that manages capture activity and coordinates the flow of data from input devices to capture outputs.

AVCaptureMultiCamSession

A capture session that supports simultaneous capture from multiple inputs of the same media type.
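
As a sketch of the configuration pattern these session classes share, changes to a session are typically batched between beginConfiguration() and commitConfiguration() so they take effect together; the input and output parameters here are assumed to be created elsewhere.

```swift
import AVFoundation

// Batch session changes so they apply atomically when committed.
func reconfigureForMovieCapture(_ session: AVCaptureSession,
                                adding audioInput: AVCaptureDeviceInput,
                                and movieOutput: AVCaptureMovieFileOutput) {
    session.beginConfiguration()
    session.sessionPreset = .high                 // A preset suited to movie recording.
    if session.canAddInput(audioInput) { session.addInput(audioInput) }
    if session.canAddOutput(movieOutput) { session.addOutput(movieOutput) }
    session.commitConfiguration()                 // All changes take effect here.
}
```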

Capture Devices

Choosing a Capture Device

Select the front or back camera, or use advanced features like the TrueDepth camera or dual camera.

AVCaptureDevice

A device that provides input (such as audio or video) for capture sessions and offers controls for hardware-specific capture features.

AVCaptureDeviceInput

A capture input that provides media from a capture device to a capture session.
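
A minimal sketch of device selection with AVCaptureDevice.DiscoverySession follows; the device types and position shown are assumptions to adapt to your app's needs.

```swift
import AVFoundation

// Discover back cameras, preferring the dual camera when present.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInDualCamera, .builtInWideAngleCamera],
    mediaType: .video,
    position: .back
)

// The devices array follows the order of the deviceTypes list, so the dual
// camera comes first when available, with the wide-angle camera as a fallback.
let backCamera = discovery.devices.first
```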

Photo Capture

Capturing Still and Live Photos

Configure and capture single or multiple still images, Live Photos, and other forms of photography.

Supporting Continuity Camera in Your Mac App

Incorporate scanned documents and pictures taken with a user's iPhone, iPad, or iPod touch into your Mac app using Continuity Camera.

AVCapturePhoto

A container for image data collected by a photo capture output.

AVCapturePhotoOutput

A capture output for still image, Live Photo, and other photography workflows.

AVCapturePhotoCaptureDelegate

Methods for monitoring progress and receiving results from a photo capture output.
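
A minimal sketch of a single photo capture using these classes; it assumes photoOutput is an AVCapturePhotoOutput already attached to a running session.

```swift
import AVFoundation

// Receives the finished photo from the photo output.
final class PhotoCaptureDelegate: NSObject, AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil, let data = photo.fileDataRepresentation() else { return }
        // Write `data` to disk or add it to the user's photo library.
        _ = data
    }
}

func capturePhoto(from photoOutput: AVCapturePhotoOutput, delegate: PhotoCaptureDelegate) {
    // Keep a strong reference to the delegate until the capture completes.
    let settings = AVCapturePhotoSettings()
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```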

Depth Data Capture

Capturing Photos with Depth

Get a depth map with a photo to create effects like the system camera’s Portrait mode (on compatible devices).

AVCamFilter: Applying Filters to a Capture Stream

Render a capture stream with rose-colored filtering and depth effects.

Streaming Depth Data from the TrueDepth Camera

Visualize depth data in 2D and 3D from the TrueDepth camera.

AVCaptureDepthDataOutput

A capture output that records scene depth information on compatible camera devices.

AVDepthData

A container for per-pixel distance or disparity information captured by compatible camera devices.

AVPortraitEffectsMatte

An auxiliary image used to separate foreground from background with high resolution.

AVSemanticSegmentationMatte

An object that wraps a matting image for a particular semantic segmentation.
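
A sketch of opting in to depth delivery for a photo capture, assuming the session's camera supports depth (for example, the TrueDepth or dual camera); the resulting AVCapturePhoto then carries an AVDepthData value in its depthData property.

```swift
import AVFoundation

// Enable depth delivery on the output, then request it per photo.
func makeDepthPhotoSettings(for photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings? {
    guard photoOutput.isDepthDataDeliverySupported else { return nil }

    // Depth delivery must be enabled on the output before any photo can request it.
    photoOutput.isDepthDataDeliveryEnabled = true

    let settings = AVCapturePhotoSettings()
    settings.isDepthDataDeliveryEnabled = true
    return settings
}
```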

Movie and Video Capture

Capturing Video in Alternative Formats

Change the format used for capturing movie files.

AVCaptureMovieFileOutput

A capture output that records video and audio to a QuickTime movie file.

AVCaptureVideoDataOutput

A capture output that records video and provides access to video frames for processing.
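
A sketch of live frame access with AVCaptureVideoDataOutput; the delegate queue label is arbitrary and the frame handling is only a placeholder.

```swift
import AVFoundation

// Receives video frames as they are captured.
final class FrameReceiver: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Process the CVPixelBuffer (for example, hand it to Vision or Metal).
        _ = pixelBuffer
    }
}

func addVideoDataOutput(to session: AVCaptureSession, delegate: FrameReceiver) {
    let videoOutput = AVCaptureVideoDataOutput()
    videoOutput.alwaysDiscardsLateVideoFrames = true
    videoOutput.setSampleBufferDelegate(delegate, queue: DispatchQueue(label: "video.frames"))
    if session.canAddOutput(videoOutput) {
        session.addOutput(videoOutput)
    }
}
```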

Audio Capture

AVCaptureAudioFileOutput

A capture output that records audio and saves the recorded audio to a file.

AVCaptureAudioDataOutput

A capture output that records audio and provides access to audio sample buffers as they are recorded.

Metadata Capture

AVMetadataBodyObject

An abstract class that defines the interface for a metadata body object.

AVMetadataCatBodyObject

An object representing a single detected cat body in a picture.

AVMetadataDogBodyObject

An object representing a single detected dog body in a picture.

AVMetadataHumanBodyObject

An object representing a single detected human body in a picture.

AVMetadataSalientObject

An object representing a single salient area in a picture.

AVCaptureMetadataInput

A capture input for providing timed metadata to a capture session.

AVCaptureMetadataOutput

A capture output for processing timed metadata produced by a capture session.

AVMetadataFaceObject

Face information detected by a metadata capture output.

AVMetadataMachineReadableCodeObject

Barcode information detected by a metadata capture output.

AVMetadataObject

The abstract superclass for objects provided by a metadata capture output.
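
A sketch of metadata detection, using QR codes as the example; the same delegate pattern applies to faces, bodies, and salient objects.

```swift
import AVFoundation

// Receives detected metadata objects and prints any machine-readable code payloads.
final class CodeScanner: NSObject, AVCaptureMetadataOutputObjectsDelegate {
    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        for case let code as AVMetadataMachineReadableCodeObject in metadataObjects {
            print("Scanned:", code.stringValue ?? "<binary payload>")
        }
    }
}

func addMetadataOutput(to session: AVCaptureSession, delegate: CodeScanner) {
    let metadataOutput = AVCaptureMetadataOutput()
    guard session.canAddOutput(metadataOutput) else { return }
    session.addOutput(metadataOutput)

    // Add the output to the session first, then restrict detection to the types you need.
    metadataOutput.setMetadataObjectsDelegate(delegate, queue: .main)
    metadataOutput.metadataObjectTypes = [.qr]
}
```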

Synchronized Capture

AVCaptureDataOutputSynchronizer

An object that coordinates time-matched delivery of data from multiple capture outputs.

AVCaptureSynchronizedDataCollection

A set of data samples collected simultaneously from multiple capture outputs.

AVCaptureSynchronizedDepthData

A container for scene depth information collected using synchronized capture.

AVCaptureSynchronizedMetadataObjectData

A container for metadata objects collected using synchronized capture.

AVCaptureSynchronizedSampleBufferData

A container for video or audio samples collected using synchronized capture.

AVCaptureSynchronizedData

The abstract superclass for media samples collected using synchronized capture.
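
A sketch of synchronizing a video data output with a depth data output; both outputs are assumed to already be attached to the same session.

```swift
import AVFoundation

// Receives time-matched video and depth samples in a single collection.
final class SyncedReceiver: NSObject, AVCaptureDataOutputSynchronizerDelegate {
    let depthOutput: AVCaptureDepthDataOutput
    var synchronizer: AVCaptureDataOutputSynchronizer?

    init(videoOutput: AVCaptureVideoDataOutput, depthOutput: AVCaptureDepthDataOutput) {
        self.depthOutput = depthOutput
        super.init()
        let sync = AVCaptureDataOutputSynchronizer(dataOutputs: [videoOutput, depthOutput])
        sync.setDelegate(self, queue: DispatchQueue(label: "synced.data"))
        self.synchronizer = sync
    }

    func dataOutputSynchronizer(_ synchronizer: AVCaptureDataOutputSynchronizer,
                                didOutput synchronizedDataCollection: AVCaptureSynchronizedDataCollection) {
        // Matching samples for each output arrive together in one collection.
        if let depthData = synchronizedDataCollection.synchronizedData(for: depthOutput)
            as? AVCaptureSynchronizedDepthData, !depthData.depthDataWasDropped {
            _ = depthData.depthData   // AVDepthData for this time slice.
        }
    }
}
```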

Media Capture Preview

AVCaptureVideoPreviewLayer

A Core Animation layer that displays the video as it’s captured.

AVCaptureAudioPreviewOutput

A capture output that provides preview playback for audio being recorded in a capture session.
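
A sketch of installing a preview layer in a UIKit view; the session and view are assumed to be configured elsewhere, and on macOS you would add the layer to an NSView's layer instead.

```swift
import AVFoundation
import UIKit

// Display the session's live video in the given view.
func installPreview(for session: AVCaptureSession, in view: UIView) {
    let previewLayer = AVCaptureVideoPreviewLayer(session: session)
    previewLayer.videoGravity = .resizeAspectFill   // Fill the view, cropping as needed.
    previewLayer.frame = view.bounds
    view.layer.addSublayer(previewLayer)
}
```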

Mac Capture Options

AVCaptureScreenInput

A capture input for recording from a screen in macOS.

AVCaptureStillImageOutput

A capture output for capturing still photos in macOS. (Deprecated: use AVCapturePhotoOutput instead.)

Session Configuration

AVCaptureInput

The abstract superclass for objects that provide input data to a capture session.

AVCaptureOutput

The abstract superclass for objects that output the media recorded in a capture session.

AVCaptureConnection

A connection between a specific pair of capture input and capture output objects in a capture session.

AVCaptureAudioChannel

An object that monitors average and peak power levels for an audio channel in a capture connection.

File Output

AVCaptureFileOutput

The abstract superclass for capture outputs that can record captured data to a file.

AVCaptureFileOutputDelegate

Methods for monitoring or controlling the output of a media file capture.

AVCaptureFileOutputRecordingDelegate

Methods for responding to events that occur while recording captured media to a file.