Provides the filtered video frame image to AVFoundation for further processing or display.
SDKs
- iOS 9.0+
- macOS 10.11+
- Mac Catalyst 13.0+
- tvOS 9.0+
Framework
- AVFoundation
Declaration
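A sketch of the declaration, reconstructed from the parameter descriptions below; the exact parameter labels and optionality are assumptions.

```swift
func finish(with filteredImage: CIImage, context: CIContext?)
```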
Parameters
filteredImage
A Core Image image representing the output of whatever filters you’ve applied to the source image.
context
A Core Image context to be used for rendering the output image, or nil to use a default context provided by AVFoundation.
Discussion
Call this method when your handler block has finished applying filters, passing the output CIImage object from the final filter in your filter chain for the filteredImage parameter. The pixel format for this image must be the BGRA8 format (of the kCVPixelFormatType_32BGRA type).
You can pass the request's sourceImage object for the filteredImage parameter to disable filtering for the current frame.
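As a minimal sketch of both cases, the per-frame handler below applies a single stand-in filter (sepia tone) or passes the source image straight through; the filteringEnabled flag and the handle function name are assumptions for illustration.

```swift
import AVFoundation
import CoreImage

// Sketch of a per-frame handler. `filteringEnabled` is an assumed flag
// controlled elsewhere in the app; CISepiaTone stands in for any filter chain.
func handle(_ request: AVAsynchronousCIImageFilteringRequest, filteringEnabled: Bool) {
    guard filteringEnabled else {
        // Passing the source image back unchanged disables filtering for this frame.
        request.finish(with: request.sourceImage, context: nil)
        return
    }
    // The output of the final (here, only) filter in the chain.
    let output = request.sourceImage.applyingFilter("CISepiaTone",
                                                    parameters: [kCIInputIntensityKey: 0.8])
    request.finish(with: output, context: nil)
}
```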
You can pass nil for the context parameter to use a default rendering context provided by Core Image. In iOS and tvOS, the default context uses the device RGB color space; in macOS, it uses the sRGB color space. AVFoundation automatically uses a GPU-accelerated context when possible. To use a different color space or control other rendering options, pass your own CIContext object instead.
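For example, a context pinned to the sRGB color space on every platform might be created like the sketch below; the specific option choices are assumptions about what your app needs.

```swift
import CoreImage

// A custom rendering context that works and outputs in sRGB,
// overriding the platform's default color space.
let srgb = CGColorSpace(name: CGColorSpace.sRGB)!
let renderContext = CIContext(options: [
    .workingColorSpace: srgb,
    .outputColorSpace: srgb
])
```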
Important
A CIContext instance is a heavyweight object that maintains expensive rendering state. Don't create a new context object in the block where you call this method (which runs once per video frame); instead, create a CIContext instance before creating a composition with the init(asset:applyingCIFiltersWithHandler:) method, and use that instance in your handler block.
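A sketch of that pattern, assuming videoURL points at a local movie file and a Gaussian blur stands in for the real filter chain:

```swift
import AVFoundation
import CoreImage

// Create the context once; the handler reuses it for every frame.
let ciContext = CIContext()

let asset = AVURLAsset(url: videoURL) // videoURL: assumed URL of a local movie file
let composition = AVVideoComposition(asset: asset) { request in
    // Apply the filter chain to the source frame, cropping back to the frame's extent.
    let output = request.sourceImage
        .applyingFilter("CIGaussianBlur", parameters: [kCIInputRadiusKey: 6.0])
        .cropped(to: request.sourceImage.extent)
    // Hand the result back, reusing the shared context instead of creating one per frame.
    request.finish(with: output, context: ciContext)
}

// Attach the composition for playback; export via AVAssetExportSession works the same way.
let playerItem = AVPlayerItem(asset: asset)
playerItem.videoComposition = composition
```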