Framework

Core Media

Represent time-based audio-visual assets with essential data types.

Overview

The Core Media framework defines the media pipeline used by AVFoundation and other high-level media frameworks found on Apple platforms. Use Core Media's low-level data types and interfaces to efficiently process media samples and manage queues of media data.

Topics

Sample Processing

CMSampleBuffer

An object containing zero or more media samples of a uniform media type.
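
A minimal sketch of inspecting a sample buffer's timing, assuming the buffer arrives from a delivery callback such as a capture output delegate; the function name `logTiming` is illustrative:

```swift
import CoreMedia

// Read the timing of an incoming sample buffer.
func logTiming(of sampleBuffer: CMSampleBuffer) {
    let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    let duration = CMSampleBufferGetDuration(sampleBuffer)
    let count = CMSampleBufferGetNumSamples(sampleBuffer)
    print("\(count) sample(s), PTS \(pts.seconds) s, duration \(duration.seconds) s")
}
```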

CMBlockBuffer

An object used to move blocks of memory through a processing system.
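
A sketch of creating a block buffer, assuming Core Media should allocate the 1 KiB backing memory itself (passing nil for the memory block):

```swift
import CoreMedia

var blockBuffer: CMBlockBuffer?
// With memoryBlock nil, Core Media allocates (and later frees)
// the backing block on the caller's behalf.
let status = CMBlockBufferCreateWithMemoryBlock(
    allocator: kCFAllocatorDefault,
    memoryBlock: nil,
    blockLength: 1024,
    blockAllocator: kCFAllocatorDefault,
    customBlockSource: nil,
    offsetToData: 0,
    dataLength: 1024,
    flags: kCMBlockBufferAssureMemoryNowFlag,
    blockBufferOut: &blockBuffer)
assert(status == kCMBlockBufferNoErr)
```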

CMFormatDescription

A media format descriptor that describes the samples in a sample buffer.
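
For example, a format description for 1920 x 1080 H.264 video can be created as follows; audio, muxed, and other media types have analogous create functions:

```swift
import CoreMedia

var formatDescription: CMVideoFormatDescription?
let status = CMVideoFormatDescriptionCreate(
    allocator: kCFAllocatorDefault,
    codecType: kCMVideoCodecType_H264,
    width: 1920,
    height: 1080,
    extensions: nil,
    formatDescriptionOut: &formatDescription)
if status == noErr, let description = formatDescription {
    print(CMVideoFormatDescriptionGetDimensions(description))
}
```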

CMAttachment

An API for attaching additional metadata to a sample buffer.
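
A sketch of setting a buffer-level attachment; the helper name `markForImmediateDisplay` is illustrative, the key and mode constants are Core Media's:

```swift
import CoreMedia

// Ask downstream consumers to display this buffer immediately; the
// ShouldPropagate mode copies the attachment to derived buffers.
func markForImmediateDisplay(_ sampleBuffer: CMSampleBuffer) {
    CMSetAttachment(sampleBuffer,
                    key: kCMSampleBufferAttachmentKey_DisplayImmediately,
                    value: kCFBooleanTrue,
                    attachmentMode: kCMAttachmentMode_ShouldPropagate)
}
```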

Time Representation

CMTime

A struct representing a time value such as a timestamp or duration.
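
Because CMTime stores a rational value (a numerator over a timescale), arithmetic stays exact instead of accumulating floating-point drift. A small example:

```swift
import CoreMedia

let frameDuration = CMTime(value: 1, timescale: 30)         // 1/30 s
let start = CMTime(seconds: 2.5, preferredTimescale: 600)
let next = start + frameDuration                            // exact rational addition
print(next.seconds)                                         // ≈ 2.533
```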

CMTimeRange

A struct representing a range of time.
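
A sketch of the common containment and intersection operations:

```swift
import CoreMedia

let clip = CMTimeRange(start: .zero,
                       duration: CMTime(seconds: 10, preferredTimescale: 600))
let t = CMTime(seconds: 4, preferredTimescale: 600)
print(clip.containsTime(t))                                 // true

let overlap = clip.intersection(
    CMTimeRange(start: CMTime(seconds: 8, preferredTimescale: 600),
                duration: CMTime(seconds: 5, preferredTimescale: 600)))
print(overlap.duration.seconds)                             // 2.0
```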

CMTimeMapping

A struct used to specify the mapping of a segment of one timeline into another.
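
A sketch mapping a 10-second source segment onto a 5-second target segment, which plays that portion of the source at twice normal speed:

```swift
import CoreMedia

let source = CMTimeRange(start: .zero,
                         duration: CMTime(seconds: 10, preferredTimescale: 600))
let target = CMTimeRange(start: CMTime(seconds: 30, preferredTimescale: 600),
                         duration: CMTime(seconds: 5, preferredTimescale: 600))
let mapping = CMTimeMapping(source: source, target: target)
```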

Media Synchronization

CMClock

A reference clock used to synchronize applications and devices.
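
The host time clock is always available, which makes it a common synchronization reference. Reading it:

```swift
import CoreMedia

let hostClock = CMClockGetHostTimeClock()
let now = CMClockGetTime(hostClock)
print(now.seconds)
```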

CMAudioClock

A specialized reference clock used to synchronize with audio sources.
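
A sketch of creating one; CMAudioClockCreate hands back an ordinary CMClock that advances with the audio device:

```swift
import CoreMedia

var audioClock: CMClock?
let status = CMAudioClockCreate(allocator: kCFAllocatorDefault,
                                clockOut: &audioClock)
if status == noErr, let clock = audioClock {
    print(CMClockGetTime(clock).seconds)
}
```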

CMTimebase

A model of a timeline under application control.
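
A sketch of driving a timebase from the host time clock. CMTimebaseCreateWithSourceClock assumes a recent SDK (iOS 15 / macOS 12 or later); older code uses the deprecated CMTimebaseCreateWithMasterClock:

```swift
import CoreMedia

var timebase: CMTimebase?
let status = CMTimebaseCreateWithSourceClock(
    allocator: kCFAllocatorDefault,
    sourceClock: CMClockGetHostTimeClock(),
    timebaseOut: &timebase)
if status == noErr, let tb = timebase {
    CMTimebaseSetTime(tb, time: .zero)   // position the timeline at 0
    CMTimebaseSetRate(tb, rate: 1.0)     // run at normal speed
}
```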

Text Markup

CMTextMarkup

The collection of text markup-related attributes supported by Core Media.
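
For example, a markup attribute dictionary requesting bold text in opaque white (the ARGB components range from 0 to 1):

```swift
import CoreMedia

let markupAttributes: [String: Any] = [
    kCMTextMarkupAttribute_BoldStyle as String: true,
    kCMTextMarkupAttribute_ForegroundColorARGB as String: [1.0, 1.0, 1.0, 1.0],
]
```

Dictionaries in this shape are typically consumed by higher-level APIs, for example AVFoundation's AVTextStyleRule.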

Metadata

CMMetadata

The APIs for working with the framework's Metadata Identifier Services and Metadata Data Type Registry.
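
A sketch of Metadata Identifier Services combining a key and key space into an identifier; the key "com.example.location" is a made-up example:

```swift
import CoreMedia

var identifier: CFString?
let status = CMMetadataCreateIdentifierForKeyAndKeySpace(
    allocator: kCFAllocatorDefault,
    key: "com.example.location" as CFString,
    keySpace: kCMMetadataKeySpace_QuickTimeMetadata,
    identifierOut: &identifier)
if status == noErr, let identifier = identifier {
    print(identifier)   // identifiers take the form "keyspace/key"
}
```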

Queues

CMSimpleQueue

A simple, lockless FIFO queue of (void *) elements.
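
A sketch of enqueuing and dequeuing a raw pointer. The queue does not retain what its elements point to, so the caller owns the element's memory:

```swift
import CoreMedia

var queue: CMSimpleQueue?
let status = CMSimpleQueueCreate(allocator: kCFAllocatorDefault,
                                 capacity: 16,
                                 queueOut: &queue)
if status == noErr, let queue = queue {
    let element = UnsafeMutablePointer<Int32>.allocate(capacity: 1)
    element.pointee = 42
    CMSimpleQueueEnqueue(queue, element: element)
    if let raw = CMSimpleQueueDequeue(queue) {
        print(raw.assumingMemoryBound(to: Int32.self).pointee)   // 42
    }
    element.deallocate()
}
```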

CMBufferQueue

A queue of timed buffers.
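
A sketch of creating a queue with the built-in callbacks for sample buffers that the queue does not need to reorder by presentation time:

```swift
import CoreMedia

var bufferQueue: CMBufferQueue?
let status = CMBufferQueueCreate(
    allocator: kCFAllocatorDefault,
    capacity: 8,
    callbacks: CMBufferQueueGetCallbacksForUnsortedSampleBuffers(),
    queueOut: &bufferQueue)
// Producers then call CMBufferQueueEnqueue(_:buffer:) and consumers
// call CMBufferQueueDequeueAndRetain(_:).
```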

CMMemoryPool

A pool used for optimizing memory allocation when large blocks of memory must be repeatedly allocated, deallocated, and then reallocated.
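
A sketch of allocating through a pool's CFAllocator. Deallocated blocks return to the pool and back subsequent allocations of the same size instead of going through the system allocator each time:

```swift
import CoreMedia

let pool = CMMemoryPoolCreate(options: nil)
let allocator = CMMemoryPoolGetAllocator(pool)

let block = CFAllocatorAllocate(allocator, 4096, 0)
CFAllocatorDeallocate(allocator, block)   // returns the block to the pool
```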