iOS Developer Library

AVFoundation Framework Reference

AVCaptureVideoDataOutput

Inherits From

AVCaptureOutput
Conforms To


Import Statement


Swift

import AVFoundation

Objective-C

@import AVFoundation;

Availability


Available in iOS 4.0 and later

AVCaptureVideoDataOutput is a concrete subclass of AVCaptureOutput that you use to process uncompressed frames from the video being captured, or to access compressed frames.

An instance of AVCaptureVideoDataOutput produces video frames you can process using other media APIs. You can access the frames with the captureOutput:didOutputSampleBuffer:fromConnection: delegate method.
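
For context, a minimal Objective-C sketch of wiring an AVCaptureVideoDataOutput into a capture session might look like the following (the device input setup and error handling are elided, the queue label is an arbitrary example, and `self` is assumed to be the sample buffer delegate):

```objc
#import <AVFoundation/AVFoundation.h>

AVCaptureSession *session = [[AVCaptureSession alloc] init];
// ... add an AVCaptureDeviceInput for the camera here (omitted) ...

AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];

// Deliver sample buffers on a serial queue so frames arrive in order.
dispatch_queue_t queue = dispatch_queue_create("videoQueue", DISPATCH_QUEUE_SERIAL);
[videoOutput setSampleBufferDelegate:self queue:queue];

if ([session canAddOutput:videoOutput]) {
    [session addOutput:videoOutput];
}
[session startRunning];
```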

  • videoSettings Property

    The compression settings for the output.

    Declaration

    Swift

    var videoSettings: [NSObject : AnyObject]!

    Objective-C

    @property(nonatomic, copy) NSDictionary *videoSettings

    Discussion

    The dictionary contains values for compression settings keys defined in Video Settings, or pixel buffer attribute keys defined in CVPixelBufferRef. The only key currently supported is the kCVPixelBufferPixelFormatTypeKey key.

    To get possible values for the supported video pixel formats (kCVPixelBufferPixelFormatTypeKey) and video codec formats (AVVideoCodecKey), see availableVideoCVPixelFormatTypes and availableVideoCodecTypes respectively.

    To receive samples in their device native format, set this property to nil:

        AVCaptureVideoDataOutput *myVideoOutput; // assume this exists
        myVideoOutput.videoSettings = nil; // receives samples in device format

    If you set this property to nil and then subsequently query it, you get a dictionary reflecting the settings used by the capture session’s current sessionPreset.


    Availability

    Available in iOS 4.0 and later
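
    As a sketch, requesting 32-bit BGRA pixel buffers (assuming that format appears in the receiver's availableVideoCVPixelFormatTypes) could look like:

    ```objc
    AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    // Request 32-bit BGRA output; check availableVideoCVPixelFormatTypes first.
    videoOutput.videoSettings = @{
        (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)
    };
    ```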

  • minFrameDuration Property (Deprecated in iOS 5.0)

    The minimum frame duration.

    Deprecation Statement

    Use the AVCaptureConnection class videoMinFrameDuration property instead.

    Declaration

    Objective-C

    @property(nonatomic) CMTime minFrameDuration

    Discussion

    This property specifies the minimum duration of each video frame output by the receiver, placing a lower bound on the amount of time that should separate consecutive frames. This is equivalent to the inverse of the maximum frame rate. A value of kCMTimeZero or kCMTimeInvalid indicates an unlimited maximum frame rate.

    The default value is kCMTimeInvalid.


    Availability

    Available in iOS 4.0 and later

    Deprecated in iOS 5.0
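
    Per the deprecation statement above, a sketch of the iOS 5.0 replacement, capping delivery at 15 frames per second through the output's video connection (assuming such a connection exists):

    ```objc
    AVCaptureConnection *connection =
        [videoOutput connectionWithMediaType:AVMediaTypeVideo];
    if (connection.supportsVideoMinFrameDuration) {
        // A minimum duration of 1/15 s between frames means at most 15 fps.
        connection.videoMinFrameDuration = CMTimeMake(1, 15);
    }
    ```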

  • alwaysDiscardsLateVideoFrames Property

    Indicates whether video frames are dropped if they arrive late.

    Declaration

    Swift

    var alwaysDiscardsLateVideoFrames: Bool

    Objective-C

    @property(nonatomic) BOOL alwaysDiscardsLateVideoFrames

    Discussion

    When the value of this property is YES (true in Swift), the object immediately discards frames that are captured while the dispatch queue handling existing frames is blocked in the captureOutput:didOutputSampleBuffer:fromConnection: delegate method.

    When the value of this property is NO (false in Swift), delegates are allowed more time to process old frames before new frames are discarded, but application memory usage may increase significantly as a result.

    The default is YES (true).


    Availability

    Available in iOS 4.0 and later
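
    For example, an app that must see every frame and manages the resulting memory pressure itself might disable dropping; this is a sketch, not a recommendation for real-time capture:

    ```objc
    AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    // Allow late frames to queue up instead of being discarded immediately.
    // Memory usage may grow if the delegate cannot keep up.
    videoOutput.alwaysDiscardsLateVideoFrames = NO;
    ```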

  • recommendedVideoSettingsForAssetWriterWithOutputFileType: Method

    Specifies the recommended settings for use with an AVAssetWriterInput.

    Declaration

    Swift

    func recommendedVideoSettingsForAssetWriterWithOutputFileType(_ outputFileType: String!) -> [NSObject : AnyObject]!

    Objective-C

    - (NSDictionary *)recommendedVideoSettingsForAssetWriterWithOutputFileType:(NSString *)outputFileType

    Parameters

    outputFileType

    Specifies the UTI of the file type to be written. See File Format UTIs for supported types.

    Return Value

    A fully populated dictionary of keys and values that are compatible with AVAssetWriter.

    Discussion

    The returned NSDictionary contains values for compression settings keys defined in Video Settings.

    This dictionary is suitable for use as the outputSettings parameter when creating an AVAssetWriterInput as follows:

        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSettings sourceFormatHint:hint];

    The dictionary returned contains all necessary keys and values needed by AVAssetWriter. See the AVAssetWriterInput class’s initWithMediaType:outputSettings: method for a more in-depth discussion.

    For QuickTime movie and ISO file types, the recommended video settings will produce output comparable to that of AVCaptureMovieFileOutput.

    Note that the dictionary of settings is dependent on the current configuration of the receiver's AVCaptureSession and its inputs. The settings dictionary may change if the session's configuration changes. As such, configure your session first, then query the recommended video settings.


    Availability

    Available in iOS 7.0 and later
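
    Putting the return value to use, a sketch of creating an AVAssetWriterInput for a QuickTime movie (remember to configure the session before querying, as noted above):

    ```objc
    NSDictionary *outputSettings =
        [videoOutput recommendedVideoSettingsForAssetWriterWithOutputFileType:AVFileTypeQuickTimeMovie];
    AVAssetWriterInput *writerInput =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                           outputSettings:outputSettings];
    writerInput.expectsMediaDataInRealTime = YES; // capture is a real-time source
    ```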

  • setSampleBufferDelegate:queue: Method

    Sets the sample buffer delegate and the queue on which callbacks should be invoked.

    Declaration

    Swift

    func setSampleBufferDelegate(_ sampleBufferDelegate: AVCaptureVideoDataOutputSampleBufferDelegate!, queue sampleBufferCallbackQueue: dispatch_queue_t!)

    Objective-C

    - (void)setSampleBufferDelegate:(id<AVCaptureVideoDataOutputSampleBufferDelegate>)sampleBufferDelegate queue:(dispatch_queue_t)sampleBufferCallbackQueue

    Parameters

    sampleBufferDelegate

    An object conforming to the AVCaptureVideoDataOutputSampleBufferDelegate protocol that will receive sample buffers after they are captured.

    sampleBufferCallbackQueue

    The queue on which callbacks should be invoked. You must use a serial dispatch queue to guarantee that video frames are delivered in order.

    The sampleBufferCallbackQueue parameter may not be NULL, except when setting the sampleBufferDelegate to nil.

    Discussion

    When a new video sample buffer is captured, it is sent to the sample buffer delegate using captureOutput:didOutputSampleBuffer:fromConnection:. All delegate methods are invoked on the specified dispatch queue.

    If the queue is blocked when new frames are captured, those frames will be automatically dropped at a time determined by the value of the alwaysDiscardsLateVideoFrames property. This allows you to process existing frames on the same queue without having to manage the potential memory usage increases that would otherwise occur when that processing is unable to keep up with the rate of incoming frames.

    If your frame processing is consistently unable to keep up with the rate of incoming frames, consider limiting the frame rate with the AVCaptureConnection class’s videoMinFrameDuration property (or, prior to iOS 5.0, this class’s deprecated minFrameDuration property); a lower frame rate generally yields better performance characteristics and more consistent frame rates than frame dropping alone.

    If you need to minimize the chances of frames being dropped, you should specify a queue on which a sufficiently small amount of processing is being done outside of receiving sample buffers. However, if you migrate extra processing to another queue, you are responsible for ensuring that memory usage does not grow without bound from frames that have not been processed.

    Special Considerations

    This method uses dispatch_retain and dispatch_release to manage the queue.


    Availability

    Available in iOS 4.0 and later
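
    A sketch of a delegate implementation that pulls the pixel buffer out of each sample buffer (the actual processing is left as a placeholder; the class is assumed to declare conformance to AVCaptureVideoDataOutputSampleBufferDelegate):

    ```objc
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        if (imageBuffer == NULL) {
            return;
        }
        CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
        // ... read pixel data here ...
        CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
    }
    ```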

  • sampleBufferDelegate Property (read-only)

    The capture object’s delegate.

    Declaration

    Swift

    var sampleBufferDelegate: AVCaptureVideoDataOutputSampleBufferDelegate! { get }

    Objective-C

    @property(nonatomic, readonly) id< AVCaptureVideoDataOutputSampleBufferDelegate > sampleBufferDelegate

    Discussion

    The delegate receives sample buffers after they are captured.

    You set the delegate using setSampleBufferDelegate:queue:.


    Availability

    Available in iOS 4.0 and later

  • sampleBufferCallbackQueue Property (read-only)

    The queue on which delegate callbacks are invoked.

    Declaration

    Swift

    var sampleBufferCallbackQueue: dispatch_queue_t! { get }

    Objective-C

    @property(nonatomic, readonly) dispatch_queue_t sampleBufferCallbackQueue

    Discussion

    You set the queue using setSampleBufferDelegate:queue:.


    Availability

    Available in iOS 4.0 and later