Core Image Reference Collection

Core Image is an image processing and analysis technology designed to provide near real-time processing for still and video images. In iOS and OS X you can use Core Image classes to:

  • Process images using the many built-in image filters

  • Chain together filters and then archive them for later use

  • Detect features (such as faces and eyes) in still and video images, and track faces in video images

  • Analyze images to get a set of auto adjustment filters

  • Create custom filters for use in your app

On OS X you can also package custom filters as Image Units that you can publish for use by others. For details, see Image Unit Tutorial.

To learn about using Core Image, read Core Image Programming Guide.
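
Most of these tasks come together in one short pipeline: a CIImage supplies the pixels, one or more CIFilter objects describe the processing, and a CIContext renders the result. The Swift sketch below is a minimal illustration of that pipeline, not part of this reference; the file path, the CISepiaTone filter, and its parameter value are placeholder choices.

import CoreImage
import Foundation

// Load an image into Core Image. The path is a placeholder for any image file.
let imageURL = URL(fileURLWithPath: "/tmp/input.jpg")
guard let inputImage = CIImage(contentsOf: imageURL) else {
    fatalError("Could not load the input image")
}

// Apply a built-in filter by name, setting its parameters with key-value coding.
let sepia = CIFilter(name: "CISepiaTone")!
sepia.setValue(inputImage, forKey: kCIInputImageKey)
sepia.setValue(0.8, forKey: kCIInputIntensityKey)
var processedImage = sepia.outputImage!

// Chain on the auto adjustment filters that Core Image suggests for this image.
for filter in processedImage.autoAdjustmentFilters() {
    filter.setValue(processedImage, forKey: kCIInputImageKey)
    processedImage = filter.outputImage!
}

// Render the final image through a CIContext (GPU-backed where available).
let context = CIContext()
let renderedImage = context.createCGImage(processedImage, from: processedImage.extent)

Because CIFilter adopts NSSecureCoding, a configured filter chain like this can also be archived and recreated later, which is what the second task in the list above refers to.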

Classes

NSObject

NSObject is the root class of most Objective-C class hierarchies.

CIColor

The CIColor class contains color values and the color space for which the color values are valid.

CIContext

The CIContext class provides an evaluation context for rendering a CIImage object through Quartz 2D or OpenGL.

CIDetector

A CIDetector object uses image processing to search for and identify notable features (faces, rectangles, and barcodes) in a still image or video.

CIFeature

A CIFeature object represents a portion of an image that a detector believes matches its criteria.

CIFaceFeature

A CIFaceFeature object describes a face detected in a still or video image.

CIQRCodeFeature

A CIQRCodeFeature object describes a QR code (a kind of two-dimensional barcode) detected in a still or video image.

CIRectangleFeature

A CIRectangleFeature object describes a rectangular region detected in a still or video image.

CIFilter

The CIFilter class represents an image processing filter that takes one or more images, along with other parameters, as input and produces a CIImage object as output.

CIImage

The CIImage class represents an image to be processed or produced by Core Image filters.

CIKernel

A CIKernel object encapsulates a GPU-based image processing routine, written in the Core Image Kernel Language, that forms the basis of a custom filter.

CIColorKernel

A CIColorKernel object encapsulates a kernel routine that processes only the color information of pixels.

CIWarpKernel

A CIWarpKernel object encapsulates a kernel routine that processes only the geometry of an image, changing where pixels are sampled but not their colors.

CIVector

The CIVector class is used for coordinate values and direction vectors.
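
The detection classes above work together: a CIDetector scans a CIImage and returns an array of CIFeature subclasses (CIFaceFeature, CIRectangleFeature, or CIQRCodeFeature) describing what it found. The following Swift sketch shows the face-detection case; the function name and print statements are illustrative only.

import CoreImage

// Detect faces in a CIImage and report their bounds and eye positions.
func reportFaces(in image: CIImage) {
    // Configure a face detector; higher accuracy costs more processing time.
    let options: [String: Any] = [CIDetectorAccuracy: CIDetectorAccuracyHigh]
    guard let detector = CIDetector(ofType: CIDetectorTypeFace, context: nil, options: options) else {
        return
    }

    // Face detection returns CIFaceFeature objects, one per detected face.
    for case let face as CIFaceFeature in detector.features(in: image) {
        print("Face at \(face.bounds)")
        if face.hasLeftEyePosition {
            print("Left eye at \(face.leftEyePosition)")
        }
        if face.hasRightEyePosition {
            print("Right eye at \(face.rightEyePosition)")
        }
    }
}

The same pattern applies to rectangle and QR code detection by passing CIDetectorTypeRectangle or CIDetectorTypeQRCode instead of CIDetectorTypeFace.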

Other Reference

Core Image Kernel Language Reference

Core Image Filter Reference
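
The kernel classes listed under Classes (CIKernel, CIColorKernel, and CIWarpKernel) wrap routines written in the Core Image Kernel Language documented above and are the building blocks of custom filters. The Swift sketch below shows only the general pattern for a color kernel; the kernel source is a trivial color inversion chosen for illustration, and the Swift spellings of these initializers have changed slightly across SDK versions.

import CoreImage

// Invert the colors of a CIImage with a custom color kernel.
func invertColors(of image: CIImage) -> CIImage? {
    // A minimal Core Image Kernel Language routine: invert each pixel's color, keep its alpha.
    let kernelSource =
        "kernel vec4 invertColor(__sample s) {" +
        "    return vec4(1.0 - s.r, 1.0 - s.g, 1.0 - s.b, s.a);" +
        "}"

    // A CIColorKernel processes only per-pixel color; a CIWarpKernel would process geometry instead.
    guard let kernel = CIColorKernel(source: kernelSource) else {
        return nil
    }

    // Apply the kernel over the image's full extent.
    return kernel.apply(extent: image.extent, arguments: [image])
}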