A GPU-based image processing routine used to create custom Core Image filters.
SDKs
- iOS 8.0+
- macOS 10.4+
- Mac Catalyst 13.0+
- tvOS 9.0+
Framework
- Core Image
Declaration
class CIKernel : NSObject
Overview
Note
If your custom filter uses both color and geometry information, but does not require processing both at the same time, you can improve performance by separating your image processing code: use a CIColorKernel object for the color processing step and a CIWarpKernel object for the geometry processing step.
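For example, a filter that adjusts color and then warps geometry can be built from two separate kernels. The Swift sketch below assumes two Metal kernel functions, invert_color (a color kernel) and mirror_x (a warp kernel), compiled into a library bundled as default.metallib; those names are placeholders for illustration.

import CoreImage

// A sketch of the separation described in the note above. The function names and
// the "default.metallib" resource are assumptions, not part of this API.
func makeSeparatedKernels() throws -> (color: CIColorKernel, warp: CIWarpKernel) {
    let url = Bundle.main.url(forResource: "default", withExtension: "metallib")!
    let data = try Data(contentsOf: url)
    let colorKernel = try CIColorKernel(functionName: "invert_color", fromMetalLibraryData: data)
    let warpKernel = try CIWarpKernel(functionName: "mirror_x", fromMetalLibraryData: data)
    return (colorKernel, warpKernel)
}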
The kernel language routine for a general-purpose filter kernel has the following characteristics:
- Its return type is vec4 (Core Image Kernel Language) or float4 (Metal Shading Language); that is, it returns a pixel color for the output image.
- It may use zero or more input images. Each input image is represented by a parameter of type sampler.
A kernel routine typically produces its output by calculating source image coordinates (using the destCoord and samplerCoord or samplerTransform functions), sampling from the source images (using the sample function), and computing a final pixel color (output using the return keyword). For example, the Metal Shading Language source below implements a filter that passes through its input image unchanged.
#include <CoreImage/CoreImage.h>

extern "C" {
    namespace coreimage {
        float4 do_nothing(sampler src) {
            return src.sample(src.coord());
        }
    }
}
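In host code, you load the compiled kernel function and apply it to an image. The Swift sketch below assumes the do_nothing function was compiled with the Core Image Metal build flags into a library bundled as default.metallib; the resource name and the passThrough wrapper are illustrative only.

import CoreImage

// A minimal sketch: load the do_nothing kernel from an assumed "default.metallib"
// resource and apply it. For a pass-through filter, each output pixel reads the
// same source location, so the region of interest is simply the destination rectangle.
func passThrough(_ input: CIImage) throws -> CIImage? {
    let url = Bundle.main.url(forResource: "default", withExtension: "metallib")!
    let data = try Data(contentsOf: url)
    let kernel = try CIKernel(functionName: "do_nothing", fromMetalLibraryData: data)
    return kernel.apply(extent: input.extent,
                        roiCallback: { _, rect in rect },
                        arguments: [input])
}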
The equivalent code in Core Image Kernel Language is:
kernel vec4 do_nothing(sampler image) {
    vec2 dc = destCoord();
    return sample(image, samplerTransform(image, dc));
}
The Core Image Kernel Language is a dialect of the OpenGL Shading Language. See Core Image Kernel Language Reference and Core Image Programming Guide for more details.
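A kernel written in the Core Image Kernel Language is created from its source string at run time. The Swift sketch below uses the do_nothing source shown above; note that on newer SDKs this string-based initializer is deprecated in favor of Metal-based kernels.

import CoreImage

// A minimal sketch: build the kernel from the Core Image Kernel Language source above.
// CIKernel(source:) returns nil if the source fails to compile.
let source = """
kernel vec4 do_nothing(sampler image) {
    vec2 dc = destCoord();
    return sample(image, samplerTransform(image, dc));
}
"""
let kernel = CIKernel(source: source)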