Class

AVVideoComposition

An object that represents an immutable video composition.

Declaration

class AVVideoComposition : NSObject

Overview

The AVFoundation framework also provides a mutable subclass, AVMutableVideoComposition, that you can use to create new video compositions.

A video composition describes, for any time in the aggregate time range of its instructions, the number and IDs of the video tracks that are used to produce a composed video frame for that time. When you use AVFoundation’s built-in video compositor, the instructions an AVVideoComposition contains can specify a spatial transformation, an opacity value, and a cropping rectangle for each video source, and these values can vary over time through simple linear ramping functions.
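
For example, the following sketch (assuming an existing AVAsset named asset with at least one video track) uses the mutable counterparts of these classes to build a composition whose single instruction fades the track out over its first two seconds:

import AVFoundation

// Assumes `asset` contains at least one video track.
let videoTrack = asset.tracks(withMediaType: .video)[0]

let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)
// Ramp the track's opacity from fully opaque to transparent over the first two seconds.
layerInstruction.setOpacityRamp(fromStartOpacity: 1.0,
                                toEndOpacity: 0.0,
                                timeRange: CMTimeRange(start: .zero,
                                                       duration: CMTime(seconds: 2, preferredTimescale: 600)))

let instruction = AVMutableVideoCompositionInstruction()
instruction.timeRange = CMTimeRange(start: .zero, duration: asset.duration)
instruction.layerInstructions = [layerInstruction]

let videoComposition = AVMutableVideoComposition()
videoComposition.instructions = [instruction]
videoComposition.frameDuration = CMTime(value: 1, timescale: 30)   // 30 frames per second
videoComposition.renderSize = videoTrack.naturalSize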

You can implement your own custom video compositor by adopting the AVVideoCompositing protocol. During playback and other operations, a custom video compositor receives pixel buffers for each of its video sources and can perform arbitrary graphical operations on them to produce visual output.
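
A minimal, hypothetical sketch of a conforming compositor might simply pass each source frame through unchanged:

import AVFoundation
import CoreVideo

final class PassthroughCompositor: NSObject, AVVideoCompositing {

    let sourcePixelBufferAttributes: [String : Any]? =
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]

    let requiredPixelBufferAttributesForRenderContext: [String : Any] =
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]

    func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) {
        // Respond here if the render size or pixel buffer attributes change.
    }

    func startRequest(_ request: AVAsynchronousVideoCompositionRequest) {
        // Return the first source track's frame unchanged; a real compositor would instead
        // draw into a buffer obtained from request.renderContext.newPixelBuffer().
        guard let trackID = request.sourceTrackIDs.first?.int32Value,
              let sourceFrame = request.sourceFrame(byTrackID: trackID) else {
            request.finishCancelledRequest()
            return
        }
        request.finish(withComposedVideoFrame: sourceFrame)
    }
}

You opt in to a custom compositor by setting the customVideoCompositorClass property of a mutable video composition to the conforming type.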

Topics

Creating a Video Composition Object

init(propertiesOf: AVAsset)

Creates a video composition object configured to present the video tracks of the specified asset.
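
A brief sketch, assuming an existing AVAsset named asset:

import AVFoundation

// Render size, frame duration, and instructions are derived from the asset's video tracks.
let videoComposition = AVVideoComposition(propertiesOf: asset)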

init(asset: AVAsset, applyingCIFiltersWithHandler: (AVAsynchronousCIImageFilteringRequest) -> Void)

Creates a video composition configured to apply Core Image filters to each video frame of the specified asset.
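
A sketch of this initializer, assuming an existing AVAsset named asset, that applies a sepia-tone filter to every frame:

import AVFoundation
import CoreImage

let filter = CIFilter(name: "CISepiaTone")!
let videoComposition = AVVideoComposition(asset: asset) { request in
    filter.setValue(request.sourceImage.clampedToExtent(), forKey: kCIInputImageKey)
    filter.setValue(0.8, forKey: kCIInputIntensityKey)

    // Crop back to the frame's original extent and hand the result to the request.
    let output = filter.outputImage!.cropped(to: request.sourceImage.extent)
    request.finish(with: output, context: nil)
}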

Configuring Video Composition Properties

var frameDuration: CMTime

A time interval for which the video composition should render composed video frames.

var renderSize: CGSize

The size at which the video composition should render.

var renderScale: Float

The scale at which the video composition should render.
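
These properties are read-only on AVVideoComposition and settable on the mutable subclass; a brief sketch of configuring output timing and geometry on AVMutableVideoComposition:

import AVFoundation

let videoComposition = AVMutableVideoComposition()
videoComposition.frameDuration = CMTime(value: 1, timescale: 30)   // compose one frame every 1/30 second
videoComposition.renderSize = CGSize(width: 1920, height: 1080)    // output dimensions in pixels
videoComposition.renderScale = 1.0                                 // no additional scaling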

protocol AVVideoCompositionInstructionProtocol

Methods you can implement to represent operations to be performed by a compositor.

var animationTool: AVVideoCompositionCoreAnimationTool?

A video composition tool to use with Core Animation in offline rendering.

var sourceTrackIDForFrameTiming: CMPersistentTrackID

A value that indicates whether frame timing for the video composition is derived from the source's asset track.

var colorPrimaries: String?

The color primaries used for video composition.

var colorTransferFunction: String?

The transfer function used for video composition.

var colorYCbCrMatrix: String?

The YCbCr matrix used for video composition.
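
The color properties are likewise settable only on the mutable subclass; a sketch that tags a composition's working color space as Rec. 709 using the AVFoundation video settings constants:

import AVFoundation

let videoComposition = AVMutableVideoComposition()
videoComposition.colorPrimaries = AVVideoColorPrimaries_ITU_R_709_2
videoComposition.colorTransferFunction = AVVideoTransferFunction_ITU_R_709_2
videoComposition.colorYCbCrMatrix = AVVideoYCbCrMatrix_ITU_R_709_2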

Validating the Time Range

func isValid(for: AVAsset?, timeRange: CMTimeRange, validationDelegate: AVVideoCompositionValidationHandling?) -> Bool

Indicates whether the time ranges of the composition’s instructions conform to validation requirements.
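
A sketch of validating a composition before playback or export, assuming existing videoComposition and asset values:

import AVFoundation

let fullRange = CMTimeRange(start: .zero, duration: asset.duration)
let isUsable = videoComposition.isValid(for: asset,
                                        timeRange: fullRange,
                                        validationDelegate: nil)
if !isUsable {
    // Handle gaps or overlaps in the instruction time ranges here.
}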

protocol AVVideoCompositionValidationHandling

Methods you can implement to indicate whether validation of a video composition should continue after specific errors are found.

See Also

Video Composition

class AVMutableVideoComposition

An object that represents a mutable video composition.

class AVAsynchronousCIImageFilteringRequest

An object that supports using Core Image filters to process an individual video frame in a video composition.

class AVAsynchronousVideoCompositionRequest

An object that contains the information necessary for a video compositor to render an output pixel buffer.

class AVMutableVideoCompositionInstruction

An operation performed by a compositor.

class AVMutableVideoCompositionLayerInstruction

An object used to modify the transform, cropping, and opacity ramps applied to a given track in a mutable composition.

class AVVideoCompositionCoreAnimationTool

An object used to incorporate Core Animation into a video composition.

class AVVideoCompositionInstruction

An operation performed by a compositor.

class AVVideoCompositionLayerInstruction

An object used to modify the transform, cropping, and opacity ramps applied to a given track in a composition.

class AVVideoCompositionRenderContext

An object that defines the context within which custom compositors render new output pixel buffers.