An object that supports using Core Image filters to process an individual video frame in a video composition.
Mac Catalyst 13.0+ (Beta)
You use this class when creating a composition for Core Image filtering with the init(asset:applyingCIFiltersWithHandler:) method. In that method call, you provide a block to be called by AVFoundation as it processes each frame of video; the block’s sole parameter is an AVAsynchronousCIImageFilteringRequest object. Use that object both to retrieve the video frame image to be filtered and to return a filtered image to AVFoundation for display or export. Listing 1 shows an example of applying a filter to an asset.
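The listing below is a minimal sketch of such a handler, reconstructing the kind of example Listing 1 refers to. It assumes an existing AVAsset named asset and applies a CIGaussianBlur filter whose radius varies with each frame’s composition time; the specific filter and parameter values are illustrative choices, not requirements of the API.

```swift
import AVFoundation
import CoreImage

// Assumption: `asset` is an AVAsset loaded elsewhere in your app.
let filter = CIFilter(name: "CIGaussianBlur")!

let composition = AVVideoComposition(asset: asset, applyingCIFiltersWithHandler: { request in
    // Clamp the source image to its extent so blurring doesn't
    // introduce transparent pixels at the frame edges.
    let source = request.sourceImage.clampedToExtent()
    filter.setValue(source, forKey: kCIInputImageKey)

    // Vary the blur radius over time using the frame's composition time.
    let seconds = CMTimeGetSeconds(request.compositionTime)
    filter.setValue(seconds * 10.0, forKey: kCIInputRadiusKey)

    // Crop the blurred output back to the original frame's extent.
    let output = filter.outputImage!.cropped(to: request.sourceImage.extent)

    // Return the filtered image to AVFoundation for display or export.
    request.finish(with: output, context: nil)
})

// For playback, assign `composition` to an AVPlayerItem's videoComposition property;
// for export, assign it to an AVAssetExportSession's videoComposition property.
```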