Creates a video composition configured to apply Core Image filters to each video frame of the specified asset.
SDKs
- iOS 9.0+
- macOS 10.11+
- Mac Catalyst 13.0+
- tvOS 9.0+
Framework
- AVFoundation
Declaration
init(asset: AVAsset, applyingCIFiltersWithHandler applier: @escaping (AVAsynchronousCIImageFilteringRequest) -> Void)
Parameters
asset
The asset whose configuration matches the intended use of the video composition.
applier
A block that AVFoundation calls when processing each video frame.
The block takes a single parameter and has no return value:
- request
An AVAsynchronousCIImageFilteringRequest object representing the frame to be processed.
Return Value
A new video composition object.
Discussion
To process video frames using Core Image filters—whether for display or export—create a composition with this method. AVFoundation calls your applier block once for each frame to be displayed (or processed for export) from the asset’s first enabled video track. In that block, you access the video frame and return a filtered result using the provided AVAsynchronousCIImageFilteringRequest object. Use that object’s sourceImage property to get the video frame in the form of a CIImage object you can apply filters to. Pass the result of your filters to the request object’s finish(with:context:) method. If you cannot apply filters (for example, if filter rendering fails), call the request object’s finish(with:) method instead, passing an error.
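As a sketch of how the applier block might look, the example below applies a sepia-tone filter to each frame and attaches the resulting composition to a player item. The file path, the choice of CISepiaTone, and the NSError details are illustrative placeholders, not part of this API:

```swift
import AVFoundation
import CoreImage

// Placeholder URL for your media file.
let videoURL = URL(fileURLWithPath: "/path/to/movie.mov")
let asset = AVAsset(url: videoURL)

let composition = AVVideoComposition(asset: asset) { request in
    // Get the frame as a CIImage and apply an example filter (sepia tone).
    let source = request.sourceImage.clampedToExtent()
    let filter = CIFilter(name: "CISepiaTone")!
    filter.setValue(source, forKey: kCIInputImageKey)
    filter.setValue(0.8, forKey: kCIInputIntensityKey)

    if let output = filter.outputImage {
        // Crop back to the frame's extent and hand the result to AVFoundation.
        request.finish(with: output.cropped(to: request.sourceImage.extent), context: nil)
    } else {
        // Report failure; the error domain and code here are illustrative.
        request.finish(with: NSError(domain: "FilterError", code: -1))
    }
}

// For playback, attach the composition to a player item:
let playerItem = AVPlayerItem(asset: asset)
playerItem.videoComposition = composition
```

Passing nil for the context parameter lets AVFoundation choose an appropriate CIContext for rendering; supply your own context only if you need to control rendering options such as the working color space.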
Creating a composition with this method sets values for the following properties:
- The frameDuration property is set to accommodate the nominalFrameRate value for the asset's first enabled video track. If the nominal frame rate is zero, AVFoundation uses a default frame rate of 30 fps.
- The renderSize property is set to a size that encompasses the asset's first enabled video track, respecting the track's preferredTransform property.
- The renderScale property is set to 1.0.