Cameras and Media Capture

Capture photos and record video and audio; configure built-in cameras and microphones or external capture devices.


The AVFoundation Capture subsystem provides a common high-level architecture for video, photo, and audio capture services in iOS and macOS. Use this system if you want to:

  • Build a custom camera UI to integrate shooting photos or videos into your app’s user experience.

  • Give users more direct control over photo and video capture, such as focus, exposure, and stabilization options.

  • Produce different results than the system camera UI, such as RAW format photos, depth maps, or videos with custom timed metadata.

  • Get live access to pixel or audio data streaming directly from a capture device.

The main parts of the capture architecture are sessions, inputs, and outputs: Capture sessions connect one or more inputs to one or more outputs. Inputs are sources of media, including capture devices like the cameras and microphones built into an iOS device or Mac. Outputs acquire media from inputs to produce useful data, such as movie files written to disk or raw pixel buffers available for live processing.

Block diagram of the basic capture session architecture: an AVCaptureSession acquires data from an AVCaptureDevice through AVCaptureDeviceInput, and provides data to one or more AVCaptureOutput objects.
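The session/input/output wiring described above can be sketched as follows. This is a minimal illustration, assuming camera authorization has already been granted; error handling is reduced to optional binding.

```swift
import AVFoundation

let session = AVCaptureSession()
session.beginConfiguration()

// Input: the default video capture device (the back camera on iOS).
if let camera = AVCaptureDevice.default(for: .video),
   let input = try? AVCaptureDeviceInput(device: camera),
   session.canAddInput(input) {
    session.addInput(input)
}

// Output: a photo output that produces still images from the input.
let photoOutput = AVCapturePhotoOutput()
if session.canAddOutput(photoOutput) {
    session.addOutput(photoOutput)
}

session.commitConfiguration()
// startRunning() blocks the calling thread; call it off the main queue in a real app.
session.startRunning()
```

Wrapping the configuration in `beginConfiguration()`/`commitConfiguration()` applies all changes atomically, so the session never runs in a half-configured state.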


User Privacy

Requesting Authorization for Media Capture on iOS

Respect user privacy by seeking permission to capture and store photos, audio, and video.

Requesting Authorization for Media Capture on macOS

Prompt the user to authorize access to the camera and microphone.
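The authorization flow is the same API on iOS and macOS. A minimal sketch (your app must also declare the `NSCameraUsageDescription` key, and `NSMicrophoneUsageDescription` for audio, in its Info.plist, or the system terminates it on first access):

```swift
import AVFoundation

switch AVCaptureDevice.authorizationStatus(for: .video) {
case .authorized:
    break // Already authorized; safe to configure the capture session.
case .notDetermined:
    // Prompts the user; the completion handler runs on an arbitrary queue.
    AVCaptureDevice.requestAccess(for: .video) { granted in
        if granted {
            // Dispatch back to your session queue and set up capture.
        }
    }
default:
    break // .denied or .restricted: capture is unavailable; direct the user to Settings.
}
```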

Capture Sessions

Setting Up a Capture Session

Configure input devices, output media, preview views, and basic settings before capturing photos or video.

AVCam: Building a Camera App

Capture photos with depth data and record video using the front and rear iPhone and iPad cameras.

AVMultiCamPiP: Capturing from Multiple Cameras

Simultaneously record the output from the front and back cameras into a single movie file by using a multi-camera capture session.

class AVCaptureSession

An object that manages capture activity and coordinates the flow of data from input devices to capture outputs.

class AVCaptureMultiCamSession

A capture session that supports simultaneous capture from multiple inputs of the same media type.

Capture Devices

Choosing a Capture Device

Select the front or back camera, or use advanced features like the TrueDepth camera or dual camera.

class AVCaptureDevice

A device that provides input (such as audio or video) for capture sessions and offers controls for hardware-specific capture features.

class AVCaptureDeviceInput

A capture input that provides media from a capture device to a capture session.

Photo Capture

Capturing Still and Live Photos

Configure and capture single or multiple still images, Live Photos, and other forms of photography.

Supporting Continuity Camera in Your Mac App

Incorporate scanned documents and pictures taken with a user’s iPhone, iPad, or iPod touch into your Mac app using Continuity Camera.


class AVCapturePhoto

A container for image data collected by a photo capture output.

class AVCapturePhotoOutput

A capture output for still image, Live Photo, and other photography workflows.

protocol AVCapturePhotoCaptureDelegate

Methods for monitoring progress and receiving results from a photo capture output.
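Putting the photo types together, a capture request pairs an `AVCapturePhotoSettings` with a delegate. A sketch, assuming `photoOutput` is an `AVCapturePhotoOutput` already attached to a running session:

```swift
import AVFoundation

// The photo output does not retain your delegate for you; keep a strong
// reference (as Apple's AVCam sample does) until the capture completes.
class PhotoCaptureDelegate: NSObject, AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil,
              let data = photo.fileDataRepresentation() else { return }
        // `data` holds the encoded image (HEIC or JPEG) plus its metadata,
        // ready to write to disk or add to the photo library.
        _ = data
    }
}

let delegate = PhotoCaptureDelegate()
let settings = AVCapturePhotoSettings()
photoOutput.capturePhoto(with: settings, delegate: delegate)
```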

Depth Data Capture

Capturing Photos with Depth

Get a depth map with a photo to create effects like the system camera’s Portrait mode (on compatible devices).
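Depth delivery must be opted into on both the output and the per-capture settings. A sketch, assuming `photoOutput` is attached to a session whose camera supports depth (TrueDepth or dual camera) and `delegate` is your `AVCapturePhotoCaptureDelegate`:

```swift
import AVFoundation

// Enable depth on the output first; this constrains the session to
// depth-capable formats.
if photoOutput.isDepthDataDeliverySupported {
    photoOutput.isDepthDataDeliveryEnabled = true
}

// Then request depth for this particular capture.
let settings = AVCapturePhotoSettings()
settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliveryEnabled
photoOutput.capturePhoto(with: settings, delegate: delegate)

// In the delegate callback, `photo.depthData` is an AVDepthData containing
// the per-pixel disparity or depth map for the captured image.
```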

AVCamFilter: Applying Filters to a Capture Stream

Render a capture stream with rose-colored filtering and depth effects.

Streaming Depth Data from the TrueDepth Camera

Visualize depth data in 2D and 3D from the TrueDepth camera.

Enhancing Live Video by Leveraging TrueDepth Camera Data

Apply your own background to a live capture feed streamed from the front-facing TrueDepth camera.

class AVCaptureDepthDataOutput

A capture output that records scene depth information on compatible camera devices.

class AVDepthData

A container for per-pixel distance or disparity information captured by compatible camera devices.

class AVPortraitEffectsMatte

An auxiliary image used to separate foreground from background with high resolution.

class AVSemanticSegmentationMatte

An object that wraps a matting image for a particular semantic segmentation.

Movie and Video Capture

Capturing Video in Alternative Formats

Change the format used for capturing movie files.

class AVCaptureMovieFileOutput

A capture output that records video and audio to a QuickTime movie file.
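Recording with a movie file output is a start/stop pair around a destination URL. A sketch, assuming `movieOutput` is an `AVCaptureMovieFileOutput` already added to a running session and `recordingDelegate` conforms to `AVCaptureFileOutputRecordingDelegate`:

```swift
import AVFoundation

let url = FileManager.default.temporaryDirectory
    .appendingPathComponent("capture.mov")
movieOutput.startRecording(to: url, recordingDelegate: recordingDelegate)

// ... later, end the recording. The delegate's
// fileOutput(_:didFinishRecordingTo:from:error:) callback reports when the
// movie file is finalized and safe to read.
movieOutput.stopRecording()
```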

class AVCaptureVideoDataOutput

A capture output that records video and provides access to video frames for processing.
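For live processing, a video data output delivers each frame to a delegate on a queue you supply. A sketch, assuming `session` is a configured `AVCaptureSession` and `self` conforms to `AVCaptureVideoDataOutputSampleBufferDelegate`:

```swift
import AVFoundation

let videoOutput = AVCaptureVideoDataOutput()
// Drop frames rather than queue them if processing falls behind.
videoOutput.alwaysDiscardsLateVideoFrames = true
videoOutput.setSampleBufferDelegate(self,
    queue: DispatchQueue(label: "video.frames"))
if session.canAddOutput(videoOutput) {
    session.addOutput(videoOutput)
}

// Delegate callback, invoked once per frame:
// func captureOutput(_ output: AVCaptureOutput,
//                    didOutput sampleBuffer: CMSampleBuffer,
//                    from connection: AVCaptureConnection) {
//     guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
//     // Process the CVPixelBuffer (e.g. hand it to Core Image or Metal).
// }
```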

Audio Capture

class AVCaptureAudioFileOutput

A capture output that records audio and saves the recorded audio to a file.

class AVCaptureAudioDataOutput

A capture output that records audio and provides access to audio sample buffers as they are recorded.

Metadata Capture

class AVMetadataBodyObject

An abstract class that defines the interface for a metadata body object.

class AVMetadataCatBodyObject

An object representing a single detected cat body in a picture.

class AVMetadataDogBodyObject

An object representing a single detected dog body in a picture.

class AVMetadataHumanBodyObject

An object representing a single detected human body in a picture.

class AVMetadataSalientObject

An object representing a single salient area in a picture.

class AVCaptureMetadataInput

A capture input for providing timed metadata to a capture session.

class AVCaptureMetadataOutput

A capture output for processing timed metadata produced by a capture session.

class AVMetadataFaceObject

Face information detected by a metadata capture output.

class AVMetadataMachineReadableCodeObject

Barcode information detected by a metadata capture output.
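Barcode and face detection both flow through a metadata output. A barcode-scanning sketch, assuming `session` already has a camera input and `self` conforms to `AVCaptureMetadataOutputObjectsDelegate`:

```swift
import AVFoundation

let metadataOutput = AVCaptureMetadataOutput()
if session.canAddOutput(metadataOutput) {
    session.addOutput(metadataOutput)
    // Object types must be set after adding the output to the session;
    // before that, availableMetadataObjectTypes is empty.
    metadataOutput.metadataObjectTypes = [.qr, .ean13]
    metadataOutput.setMetadataObjectsDelegate(self, queue: .main)
}

// Delegate callback:
// func metadataOutput(_ output: AVCaptureMetadataOutput,
//                     didOutput metadataObjects: [AVMetadataObject],
//                     from connection: AVCaptureConnection) {
//     for case let code as AVMetadataMachineReadableCodeObject in metadataObjects {
//         print(code.stringValue ?? "")  // decoded barcode payload
//     }
// }
```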

class AVMetadataObject

The abstract superclass for objects provided by a metadata capture output.

Synchronized Capture

class AVCaptureDataOutputSynchronizer

An object that coordinates time-matched delivery of data from multiple capture outputs.

class AVCaptureSynchronizedDataCollection

A set of data samples collected simultaneously from multiple capture outputs.

class AVCaptureSynchronizedDepthData

A container for scene depth information collected using synchronized capture.

class AVCaptureSynchronizedMetadataObjectData

A container for metadata objects collected using synchronized capture.

class AVCaptureSynchronizedSampleBufferData

A container for video or audio samples collected using synchronized capture.

class AVCaptureSynchronizedData

The abstract superclass for media samples collected using synchronized capture.
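The synchronizer ties these classes together: it holds back data from its outputs until all samples for a given timestamp are available, then delivers them as one collection. A sketch, assuming `videoOutput` and `depthOutput` are already attached to the same session and `self` conforms to `AVCaptureDataOutputSynchronizerDelegate`:

```swift
import AVFoundation

let synchronizer = AVCaptureDataOutputSynchronizer(
    dataOutputs: [videoOutput, depthOutput])
synchronizer.setDelegate(self, queue: DispatchQueue(label: "sync.queue"))

// Delegate callback, one call per time-matched group:
// func dataOutputSynchronizer(_ synchronizer: AVCaptureDataOutputSynchronizer,
//                             didOutput collection: AVCaptureSynchronizedDataCollection) {
//     if let syncedDepth = collection.synchronizedData(for: depthOutput)
//            as? AVCaptureSynchronizedDepthData,
//        !syncedDepth.depthDataWasDropped {
//         let depth = syncedDepth.depthData  // AVDepthData for this timestamp
//     }
// }
```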

Media Capture Preview

class AVCaptureVideoPreviewLayer

A Core Animation layer that displays the video as it’s captured.
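Because the preview is a Core Animation layer, displaying the camera feed is a matter of attaching it to a view's layer tree. A sketch, assuming an iOS app with UIKit, where `session` is a configured `AVCaptureSession` and `view` is the hosting `UIView` (on macOS, add the layer to an `NSView` instead):

```swift
import AVFoundation
import UIKit

let previewLayer = AVCaptureVideoPreviewLayer(session: session)
previewLayer.videoGravity = .resizeAspectFill  // fill the view, cropping edges
previewLayer.frame = view.bounds
view.layer.addSublayer(previewLayer)
session.startRunning()
```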

class AVCaptureAudioPreviewOutput

A capture output that provides preview playback for audio being recorded in a capture session.

Mac Capture Options

class AVCaptureScreenInput

A capture input for recording from a screen in macOS.

class AVCaptureStillImageOutput

A capture output for capturing still photos in macOS.


Session Configuration

class AVCaptureInput

The abstract superclass for objects that provide input data to a capture session.

class AVCaptureOutput

The abstract superclass for objects that output the media recorded in a capture session.

class AVCaptureConnection

A connection between a specific pair of capture input and capture output objects in a capture session.

class AVCaptureAudioChannel

An object that monitors average and peak power levels for an audio channel in a capture connection.
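Audio channels are read from a connection, so level metering needs no extra output configuration. A sketch, assuming `audioOutput` is an audio capture output (data or file) attached to a session with a microphone input; poll this from a timer to drive a level meter:

```swift
import AVFoundation

if let connection = audioOutput.connection(with: .audio) {
    for channel in connection.audioChannels {
        // Levels are in decibels: 0 dB is full scale, more negative is quieter.
        print("avg: \(channel.averagePowerLevel) dB, " +
              "peak: \(channel.peakHoldLevel) dB")
    }
}
```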

File Output

class AVCaptureFileOutput

The abstract superclass for capture outputs that can record captured data to a file.

protocol AVCaptureFileOutputDelegate

Methods for monitoring or controlling the output of a media file capture.

protocol AVCaptureFileOutputRecordingDelegate

Methods for responding to events that occur while recording captured media to a file.


Errors

struct AVError

The error codes for the AVError domain.