Creates a video composition configured to apply Core Image filters to each video frame of the specified asset.
- iOS 9.0+
- macOS 10.11+
- Mac Catalyst 13.0+
- tvOS 9.0+
The asset whose configuration matches the intended use of the video composition.
A block that AVFoundation calls when processing each video frame.
The block takes a single parameter and has no return value:
An AVAsynchronousCIImageFilteringRequest object representing the frame to be processed.
A new video composition object.
To process video frames using Core Image filters, whether for display or export, create a composition with this method. AVFoundation calls your applier block once for each frame to be displayed (or processed for export) from the asset's first enabled video track. In that block, you access the video frame and return a filtered result using the provided AVAsynchronousCIImageFilteringRequest object. Use that object's sourceImage property to get the video frame as a CIImage object you can apply filters to. Pass the result of your filters to the finish(with:context:) method. If your filter rendering fails and you cannot apply filters, call the finish(with:) method with an error instead.
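The workflow above can be sketched as follows. The sepia filter, the local error type, and the asset URL are illustrative assumptions, not part of the API:

```swift
import AVFoundation
import CoreImage

// Illustrative error for the failure path; not defined by AVFoundation.
enum FilterError: Error {
    case noOutputImage
}

// Assumes an asset with at least one enabled video track.
let asset = AVAsset(url: URL(fileURLWithPath: "movie.mov"))
let sepia = CIFilter(name: "CISepiaTone")!

let composition = AVVideoComposition(asset: asset) { request in
    // sourceImage holds the current frame as a CIImage.
    sepia.setValue(request.sourceImage, forKey: kCIInputImageKey)
    sepia.setValue(0.8, forKey: kCIInputIntensityKey)

    if let output = sepia.outputImage {
        // Success: hand the filtered frame back to AVFoundation.
        request.finish(with: output, context: nil)
    } else {
        // Failure: report an error so playback or export can surface it.
        request.finish(with: FilterError.noOutputImage)
    }
}
```

You can then assign the composition to an AVPlayerItem's videoComposition property for playback, or to an AVAssetExportSession's videoComposition property for export.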
Creating a composition with this method sets values for the following properties:
- The frameDuration property is set to accommodate the nominalFrameRate value of the asset's first enabled video track. If the nominal frame rate is zero, AVFoundation uses a default frame rate of 30 fps.
- The renderSize property is set to match the naturalSize of the asset's first enabled video track.
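As a sketch of those defaults, assuming a hypothetical asset whose first video track is 1920 × 1080 at 30 fps:

```swift
import AVFoundation

// Assumed asset; the actual values below depend on its video track.
let asset = AVAsset(url: URL(fileURLWithPath: "movie.mov"))
let composition = AVVideoComposition(asset: asset) { request in
    // Pass frames through unfiltered.
    request.finish(with: request.sourceImage, context: nil)
}

// frameDuration derives from the track's nominalFrameRate,
// e.g. CMTime(value: 1, timescale: 30) for a 30 fps track.
print(composition.frameDuration)

// renderSize reflects the track's natural dimensions,
// e.g. 1920 × 1080 for the assumed asset.
print(composition.renderSize)
```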