AVCaptureVideoDataOutput Class Reference

Inherits from
AVCaptureOutput
Framework
/System/Library/Frameworks/AVFoundation.framework
Availability
Available in iOS 4.0 and later.
Declared in
AVCaptureOutput.h
Overview

AVCaptureVideoDataOutput is a concrete subclass of AVCaptureOutput that you use to process uncompressed frames from the video being captured, or to access compressed frames.

An instance of AVCaptureVideoDataOutput produces video frames you can process using other media APIs. You can access the frames with the captureOutput:didOutputSampleBuffer:fromConnection: delegate method.

Tasks

Configuration
  videoSettings
  alwaysDiscardsLateVideoFrames
  recommendedVideoSettingsForAssetWriterWithOutputFileType:

Retrieving Supported Video Types
  availableVideoCodecTypes
  availableVideoCVPixelFormatTypes

Managing the Delegate
  sampleBufferDelegate
  sampleBufferCallbackQueue
  setSampleBufferDelegate:queue:
Properties

alwaysDiscardsLateVideoFrames

Indicates whether video frames are dropped if they arrive late.

@property(nonatomic) BOOL alwaysDiscardsLateVideoFrames
Discussion

When the value of this property is YES, the object immediately discards frames that are captured while the dispatch queue handling existing frames is blocked in the captureOutput:didOutputSampleBuffer:fromConnection: delegate method.

When the value of this property is NO, delegates are allowed more time to process old frames before new frames are discarded, but application memory usage may increase significantly as a result.

The default is YES.
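For example, a minimal sketch of leaving the default drop-late-frames behavior explicitly in place while configuring an output (the variable name is illustrative):

```objc
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
// Discard frames that arrive while the delegate queue is still busy,
// trading completeness for lower latency and memory use (the default).
videoOutput.alwaysDiscardsLateVideoFrames = YES;
```

Set the property to NO only if your delegate can tolerate the additional memory pressure of queued frames, for example when every captured frame matters more than latency.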

Availability
  • Available in iOS 4.0 and later.
Declared In
AVCaptureOutput.h

availableVideoCodecTypes

Indicates the supported video codec formats that can be specified in videoSettings. (read-only)

@property(nonatomic, readonly) NSArray *availableVideoCodecTypes
Discussion

The value of this property is an array of NSString objects you can use as values for the AVVideoCodecKey in the videoSettings property. The first format in the returned list is the most efficient output format.

Availability
  • Available in iOS 5.0 and later.
Declared In
AVCaptureOutput.h

availableVideoCVPixelFormatTypes

Indicates the supported video pixel formats that can be specified in videoSettings. (read-only)

@property(nonatomic, readonly) NSArray *availableVideoCVPixelFormatTypes
Discussion

The value of this property is an array of NSNumber objects you can use as values for the kCVPixelBufferPixelFormatTypeKey in the videoSettings property. The first format in the returned list is the most efficient output format.
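As an illustrative sketch (the videoOutput instance is assumed to exist), reading the most efficient format from the array:

```objc
// The first entry in the array is the most efficient output format.
NSNumber *preferred = videoOutput.availableVideoCVPixelFormatTypes.firstObject;
NSLog(@"Most efficient pixel format: 0x%08x", preferred.unsignedIntValue);
```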

Availability
  • Available in iOS 5.0 and later.
Declared In
AVCaptureOutput.h

sampleBufferCallbackQueue

The queue on which delegate callbacks are invoked. (read-only)

@property(nonatomic, readonly) dispatch_queue_t sampleBufferCallbackQueue
Discussion

You set the queue using setSampleBufferDelegate:queue:.

Availability
  • Available in iOS 4.0 and later.
Declared In
AVCaptureOutput.h

sampleBufferDelegate

The capture object’s delegate. (read-only)

@property(nonatomic, readonly) id<AVCaptureVideoDataOutputSampleBufferDelegate> sampleBufferDelegate
Discussion

The delegate receives sample buffers after they are captured.

You set the delegate using setSampleBufferDelegate:queue:.

Availability
  • Available in iOS 4.0 and later.
Declared In
AVCaptureOutput.h

videoSettings

The compression settings for the output.

@property(nonatomic, copy) NSDictionary *videoSettings
Discussion

The dictionary contains values for compression settings keys defined in Video Settings, or pixel buffer attributes keys defined in CVPixelBufferRef. The only key currently supported is kCVPixelBufferPixelFormatTypeKey.

To get possible values for the supported video pixel formats (kCVPixelBufferPixelFormatTypeKey) and video codec formats (AVVideoCodecKey), see availableVideoCVPixelFormatTypes and availableVideoCodecTypes respectively.
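For example, a sketch (videoOutput is an assumed AVCaptureVideoDataOutput instance) that requests 32BGRA output only when the device supports it:

```objc
NSNumber *bgra = @(kCVPixelFormatType_32BGRA);
if ([videoOutput.availableVideoCVPixelFormatTypes containsObject:bgra]) {
    // Ask for BGRA pixel buffers, which are convenient for Core Graphics
    // and OpenGL ES processing.
    videoOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : bgra };
}
```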

To receive samples in their device native format, set this property to nil:

AVCaptureVideoDataOutput *myVideoOutput;   // assume this exists
myVideoOutput.videoSettings = nil;  // receives samples in device format

If you set this property to nil and then subsequently query it, you get a dictionary reflecting the settings used by the capture session's current sessionPreset.

Availability
  • Available in iOS 4.0 and later.
Declared In
AVCaptureOutput.h

Instance Methods

recommendedVideoSettingsForAssetWriterWithOutputFileType:

Returns the recommended settings for use with an AVAssetWriterInput.

- (NSDictionary *)recommendedVideoSettingsForAssetWriterWithOutputFileType:(NSString *)outputFileType
Parameters
outputFileType

Specifies the UTI of the file type to be written. See File Format UTIs for supported types.

Return Value

A fully populated dictionary of keys and values that are compatible with AVAssetWriter.

Discussion

The returned value is an NSDictionary containing values for compression settings keys defined in Video Settings.

This dictionary is suitable for use as the outputSettings parameter when creating an AVAssetWriterInput, as follows (hint is a CMFormatDescriptionRef describing the source format, or NULL):

[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSettings sourceFormatHint:hint];

The returned dictionary contains all the keys and values needed by AVAssetWriter. See the AVAssetWriterInput class’s initWithMediaType:outputSettings: method for a more in-depth discussion.

For QuickTime movie and ISO file types, the recommended video settings will produce output comparable to that of AVCaptureMovieFileOutput.

Note that the dictionary of settings is dependent on the current configuration of the receiver's AVCaptureSession and its inputs. The settings dictionary may change if the session's configuration changes. As such, configure your session first, then query the recommended video settings.
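A sketch of that order of operations (the session and videoOutput objects are assumed to already exist and be connected; the preset is illustrative):

```objc
// Configure the session first; the recommended settings depend on it.
[session beginConfiguration];
session.sessionPreset = AVCaptureSessionPreset1280x720;
[session commitConfiguration];

// Then query the recommended settings and create the writer input.
NSDictionary *settings =
    [videoOutput recommendedVideoSettingsForAssetWriterWithOutputFileType:AVFileTypeQuickTimeMovie];
AVAssetWriterInput *writerInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:settings];
```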

Availability
  • Available in iOS 7.0 and later.
Declared In
AVCaptureOutput.h

setSampleBufferDelegate:queue:

Sets the sample buffer delegate and the queue on which callbacks should be invoked.

- (void)setSampleBufferDelegate:(id<AVCaptureVideoDataOutputSampleBufferDelegate>)sampleBufferDelegate queue:(dispatch_queue_t)sampleBufferCallbackQueue
Parameters
sampleBufferDelegate

An object conforming to the AVCaptureVideoDataOutputSampleBufferDelegate protocol that will receive sample buffers after they are captured.

sampleBufferCallbackQueue

The queue on which callbacks should be invoked. You must use a serial dispatch queue to guarantee that video frames are delivered in order.

The sampleBufferCallbackQueue parameter may not be NULL, except when setting the sampleBufferDelegate to nil.

Discussion

When a new video sample buffer is captured, it is sent to the sample buffer delegate using captureOutput:didOutputSampleBuffer:fromConnection:. All delegate methods are invoked on the specified dispatch queue.

If the queue is blocked when new frames are captured, those frames will be automatically dropped at a time determined by the value of the alwaysDiscardsLateVideoFrames property. This allows you to process existing frames on the same queue without having to manage the potential memory usage increases that would otherwise occur when that processing is unable to keep up with the rate of incoming frames.

If your frame processing is consistently unable to keep up with the rate of incoming frames, you should consider using the minFrameDuration property, which will generally yield better performance characteristics and more consistent frame rates than frame dropping alone.

If you need to minimize the chances of frames being dropped, you should specify a queue on which a sufficiently small amount of processing is being done outside of receiving sample buffers. However, if you migrate extra processing to another queue, you are responsible for ensuring that memory usage does not grow without bound from frames that have not been processed.
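A sketch of a typical setup (self is assumed to conform to AVCaptureVideoDataOutputSampleBufferDelegate; the queue label is illustrative):

```objc
// A serial queue guarantees frames are delivered to the delegate in order.
dispatch_queue_t videoQueue =
    dispatch_queue_create("com.example.videoDataQueue", DISPATCH_QUEUE_SERIAL);
[videoOutput setSampleBufferDelegate:self queue:videoQueue];
```

Keep the work done in captureOutput:didOutputSampleBuffer:fromConnection: short; if you hand heavier processing to another queue, retain the sample buffer for as long as you use it and release it when done.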

Special Considerations

This method uses dispatch_retain and dispatch_release to manage the queue.

Availability
  • Available in iOS 4.0 and later.
Declared In
AVCaptureOutput.h