Framework

Core Image

Use built-in or custom filters to process still and video images in near real time. Detect features such as faces and eyes, and track faces in video images.

Overview

Core Image is an image processing and analysis technology designed to provide high-performance processing for still and video images.

Getting Started with Core Image

Use Core Image classes to:

  • Process images using the many built-in image filters

  • Chain together filters and then archive them for later use

  • Detect features (such as faces and eyes) in still and video images, and track faces in video images

  • Analyze images to get a set of auto adjustment filters

  • Create custom filters for use in your app

In macOS, you can also package custom filters as Image Units that you can publish for use by others. For details, see Image Unit Tutorial.

To learn about using Core Image, read Core Image Programming Guide.
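The following sketch combines the first two tasks in the list above: it applies a built-in filter, chains a second filter onto its output, and renders the result with a CIContext. The file path is a placeholder, and the filter names and keys refer to the built-in CISepiaTone and CIVignette filters.

    import CoreImage
    import Foundation

    // Load an image lazily; a CIImage is a recipe, not a buffer of pixels.
    let url = URL(fileURLWithPath: "/path/to/photo.jpg")   // placeholder path
    guard let image = CIImage(contentsOf: url) else { fatalError("missing image") }

    // Apply a built-in filter.
    let sepia = CIFilter(name: "CISepiaTone")!
    sepia.setValue(image, forKey: kCIInputImageKey)
    sepia.setValue(0.8, forKey: kCIInputIntensityKey)

    // Chain a second filter onto the first filter's output.
    let vignette = CIFilter(name: "CIVignette")!
    vignette.setValue(sepia.outputImage, forKey: kCIInputImageKey)
    vignette.setValue(1.0, forKey: kCIInputIntensityKey)

    // Render the combined result; the processing work is deferred until here.
    let context = CIContext()
    if let output = vignette.outputImage,
       let cgImage = context.createCGImage(output, from: output.extent) {
        // Wrap cgImage in a UIImage or NSImage for display.
        _ = cgImage
    }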

Symbols

Classes

CIColor

The component values defining a color in a specific color space.
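A CIColor is commonly passed as a filter parameter. A minimal sketch, using the built-in CIConstantColorGenerator filter:

    import CoreImage

    // A color built from RGBA component values, used as a filter input.
    let red = CIColor(red: 1.0, green: 0.0, blue: 0.0, alpha: 1.0)
    let generator = CIFilter(name: "CIConstantColorGenerator")!
    generator.setValue(red, forKey: kCIInputColorKey)
    let solidRed = generator.outputImage   // an image of infinite extent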

CIColorKernel

A GPU-based image processing routine that processes only the color information in images, used to create custom Core Image filters.
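A minimal sketch of a custom color kernel, assuming a Core Image Kernel Language source string; the kernel name and brightening constant are illustrative:

    import CoreImage

    // A kernel routine that adjusts each pixel's color without changing geometry.
    let source = "kernel vec4 brighten(__sample s) { return vec4(s.rgb + 0.2, s.a); }"

    func brighten(_ input: CIImage) -> CIImage? {
        guard let kernel = CIColorKernel(source: source) else { return nil }
        // Arguments are matched, in order, to the kernel's parameters.
        return kernel.apply(extent: input.extent, arguments: [input])
    }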

CIContext

An evaluation context for rendering image processing results and performing image analysis.
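Creating a context is expensive, so apps typically create one context and reuse it for every render. A minimal sketch, assuming a Metal-capable device is available:

    import CoreImage
    import Metal

    // Create one context up front and reuse it across renders.
    let context: CIContext
    if let device = MTLCreateSystemDefaultDevice() {
        context = CIContext(mtlDevice: device)   // GPU-backed rendering
    } else {
        context = CIContext()                    // default renderer
    }

    // Later, render any CIImage through the same context, for example:
    // let cgImage = context.createCGImage(image, from: image.extent)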

CIDetector

An image processor that identifies notable features (such as faces and barcodes) in a still image or video.
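A minimal sketch of face detection with a CIDetector; the file path is a placeholder:

    import CoreImage
    import Foundation

    let url = URL(fileURLWithPath: "/path/to/photo.jpg")   // placeholder path
    guard let image = CIImage(contentsOf: url) else { fatalError("missing image") }

    // Detectors are costly to create; reuse one across images when possible.
    let detector = CIDetector(ofType: CIDetectorTypeFace,
                              context: nil,
                              options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])

    let faces = detector?.features(in: image) as? [CIFaceFeature] ?? []
    for face in faces {
        print("Face at \(face.bounds)")
    }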

CIFaceFeature

Information about a face detected in a still or video image.

CIFeature

The abstract superclass for objects representing notable features detected in an image.

CIFilter

An image processor that produces an image by manipulating one or more input images or by generating new image data.
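Filters can also generate image data rather than transform an input. A minimal sketch using the built-in CIQRCodeGenerator filter:

    import CoreImage
    import Foundation

    // Generator filters take no input image; they produce one from parameters.
    let message = Data("https://example.com".utf8)
    let generator = CIFilter(name: "CIQRCodeGenerator")!
    generator.setValue(message, forKey: "inputMessage")
    generator.setValue("M", forKey: "inputCorrectionLevel")   // L, M, Q, or H
    let qrImage = generator.outputImage   // a small CIImage containing the code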

CIFilterGenerator

An object that creates and configures chains of individual image filters.

CIFilterShape

A description of the bounding shape of a filter and the domain of definition for a filter operation.

CIImage

A representation of an image to be processed or produced by Core Image filters.
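A CIImage is a lightweight recipe for producing an image rather than a buffer of pixels; nothing is computed until a CIContext renders it. A minimal sketch, with a placeholder file path:

    import CoreGraphics
    import CoreImage
    import Foundation

    // Build up a recipe: load an image and scale it down. No pixels are
    // processed until a CIContext renders the result.
    let url = URL(fileURLWithPath: "/path/to/photo.jpg")   // placeholder path
    let halfSize = CIImage(contentsOf: url)?
        .transformed(by: CGAffineTransform(scaleX: 0.5, y: 0.5))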

CIImageAccumulator

An object that manages feedback-based image processing for tasks such as painting or fluid simulation.

CIImageProcessorKernel

The abstract class you extend to create custom image processors that can integrate with Core Image workflows.

CIKernel

A GPU-based image processing routine used to create custom Core Image filters.

CIPlugIn

The mechanism for loading image units—bundles containing custom Core Image filters—in macOS.

CIQRCodeFeature

Information about a Quick Response code (a kind of 2D barcode) detected in a still or video image.

CIRectangleFeature

Information about a rectangular region detected in a still or video image.

CISampler

An object that retrieves pixel samples for processing by a filter kernel.

CITextFeature

Information about a region likely to contain upright text detected in a still or video image.

CIVector

A container for coordinate values, direction vectors, matrices, and other non-scalar values, typically used in Core Image for filter parameters.
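A minimal sketch showing a CIVector used to pass a center point to a filter; the key and filter names refer to the built-in CIVignetteEffect filter, and the radius value is illustrative:

    import CoreGraphics
    import CoreImage

    // A two-component vector carrying a center point as a filter parameter.
    func vignette(_ image: CIImage, around center: CGPoint) -> CIImage? {
        let filter = CIFilter(name: "CIVignetteEffect")!
        filter.setValue(image, forKey: kCIInputImageKey)
        filter.setValue(CIVector(x: center.x, y: center.y), forKey: kCIInputCenterKey)
        filter.setValue(150.0, forKey: kCIInputRadiusKey)
        return filter.outputImage
    }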

CIWarpKernel

A GPU-based image processing routine that processes only the geometry information in an image, used to create custom Core Image filters.

Protocols

CIFilterConstructor

A general interface for objects that produce CIFilter instances.
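A minimal sketch of a conforming type; the MyHypotheticalFilter class and its name are illustrative, and a real filter would do actual work in outputImage:

    import CoreImage
    import Foundation

    // A trivial custom filter that passes its input through unchanged.
    class MyHypotheticalFilter: CIFilter {
        @objc dynamic var inputImage: CIImage?
        override var outputImage: CIImage? { return inputImage }
    }

    // The constructor maps registered filter names to filter instances.
    class MyFilterConstructor: NSObject, CIFilterConstructor {
        func filter(withName name: String) -> CIFilter? {
            return name == "MyHypotheticalFilter" ? MyHypotheticalFilter() : nil
        }
    }

An app typically registers the name and constructor with CIFilter so that CIFilter(name:) can find the custom filter.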

CIImageProcessorInput

A container of image data and information for use in a custom image processor.

CIImageProcessorOutput

A container for writing image data and information produced by a custom image processor.

CIPlugInRegistration

The interface for loading Core Image image units.

Extended Types

NSObject

NSObject is the root class of most Objective-C class hierarchies. Through NSObject, objects inherit a basic interface to the runtime system and the ability to behave as Objective-C objects.