An object that supports using Core Image filters to process an individual video frame in a video composition.
SDKs
- iOS 9.0+
- macOS 10.11+
- Mac Catalyst 13.0+
- tvOS 9.0+
Framework
- AVFoundation
Declaration
@interface AVAsynchronousCIImageFilteringRequest : NSObject
Overview
You use this class when creating a composition for Core Image filtering with the videoCompositionWithAsset:applyingCIFiltersWithHandler: method. In that method call, you provide a block to be called by AVFoundation as it processes each frame of video; the block’s sole parameter is an AVAsynchronousCIImageFilteringRequest object. Use that object both to get the video frame image to be filtered and to return a filtered image to AVFoundation for display or export. Listing 1 shows an example of applying a filter to an asset.
Listing 1 Applying Core Image filters to a video asset
CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"];
AVVideoComposition *composition = [AVVideoComposition videoCompositionWithAsset:asset
    applyingCIFiltersWithHandler:^(AVAsynchronousCIImageFilteringRequest *request){
        // Clamp to avoid blurring transparent pixels at the image edges
        CIImage *source = [request.sourceImage imageByClampingToExtent];
        [filter setValue:source forKey:kCIInputImageKey];

        // Vary filter parameters based on video timing
        Float64 seconds = CMTimeGetSeconds(request.compositionTime);
        [filter setValue:@(seconds * 10.0) forKey:kCIInputRadiusKey];

        // Crop the blurred output to the bounds of the original image
        CIImage *output = [filter.outputImage imageByCroppingToRect:request.sourceImage.extent];

        // Provide the filter output to the composition
        [request finishWithImage:output context:nil];
    }];
Tip
To use the created video composition for playback, create an AVPlayerItem object from the same asset used as the composition’s source, then assign the composition to the player item’s videoComposition property. To export the composition to a new movie file, create an AVAssetExportSession object from the same source asset, then assign the composition to the export session’s videoComposition property.
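The following is a minimal sketch of both paths described in the tip, not part of the reference listing above. It assumes the asset and composition variables from Listing 1 and a hypothetical outputURL for the exported movie file.

// Playback: attach the composition to a player item created from the same asset.
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
playerItem.videoComposition = composition;
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
[player play];

// Export: attach the composition to an export session created from the same asset.
AVAssetExportSession *exportSession =
    [AVAssetExportSession exportSessionWithAsset:asset
                                      presetName:AVAssetExportPresetHighestQuality];
exportSession.videoComposition = composition;
exportSession.outputURL = outputURL; // hypothetical destination URL
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    // Check exportSession.status and exportSession.error when the export finishes.
}];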