Framework

Core Media

Represent time-based audio-visual assets with essential data types.

Overview

The Core Media framework defines the media pipeline used by AVFoundation and other high-level media frameworks found on Apple platforms. Use Core Media's low-level data types and interfaces to efficiently process media samples and manage queues of media data.

Symbols

Sample Processing

CMSampleBuffer

An object containing zero or more media samples of a uniform media type.
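A minimal sketch of reading timing and format information from a sample buffer, such as one delivered to an AVCaptureVideoDataOutput delegate. The `inspect` helper name is illustrative, not a Core Media API.

```swift
import CoreMedia

// Illustrative helper: extract presentation timestamp and duration
// (both CMTime values) from a sample buffer.
func inspect(_ sampleBuffer: CMSampleBuffer) -> (pts: Double, duration: Double) {
    let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    let duration = CMSampleBufferGetDuration(sampleBuffer)
    // The format description, when present, identifies the media type and codec.
    if let format = CMSampleBufferGetFormatDescription(sampleBuffer) {
        print("media type:", CMFormatDescriptionGetMediaType(format))
    }
    return (pts.seconds, duration.seconds)
}
```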

CMBlockBuffer

An object used to move blocks of memory through a processing system.
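A sketch, assuming a recent Apple SDK, of allocating a small block buffer, writing bytes into it, and copying them back out.

```swift
import CoreMedia

// Create a 4-byte block buffer and let Core Media allocate the memory.
var blockBuffer: CMBlockBuffer?
CMBlockBufferCreateWithMemoryBlock(allocator: kCFAllocatorDefault,
                                   memoryBlock: nil,          // nil: allocate for us
                                   blockLength: 4,
                                   blockAllocator: kCFAllocatorDefault,
                                   customBlockSource: nil,
                                   offsetToData: 0,
                                   dataLength: 4,
                                   flags: kCMBlockBufferAssureMemoryNowFlag,
                                   blockBufferOut: &blockBuffer)
let buffer = blockBuffer!

// Write four bytes in, then read them back.
var input: [UInt8] = [1, 2, 3, 4]
CMBlockBufferReplaceDataBytes(with: &input, blockBuffer: buffer,
                              offsetIntoDestination: 0, dataLength: 4)
var output = [UInt8](repeating: 0, count: 4)
CMBlockBufferCopyDataBytes(buffer, atOffset: 0, dataLength: 4,
                           destination: &output)
print(output) // [1, 2, 3, 4]
```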

CMFormatDescription

A media format descriptor that describes the samples in a sample buffer.

CMAttachment

An API for attaching additional metadata to a sample buffer.

Time Representation

CMTime

A struct representing a time value such as a timestamp or duration.
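A CMTime stores time as a rational value over a timescale, which avoids floating-point drift when accumulating frame durations. A brief sketch using the Swift overlay's arithmetic operators:

```swift
import CoreMedia

let frameDuration = CMTime(value: 1, timescale: 30)  // one frame at 30 fps
let start = CMTime(value: 90, timescale: 30)         // 3 seconds in
let next = start + frameDuration                     // exact rational addition
print(next.seconds)                                  // just past 3.0 seconds
```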

CMTimeRange

A struct representing a range of time.
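A CMTimeRange pairs a start time with a duration; containment checks are common when trimming assets. A short sketch:

```swift
import CoreMedia

// The range covers [2 s, 7 s): start + duration, end-exclusive.
let range = CMTimeRange(start: CMTime(value: 2, timescale: 1),
                        duration: CMTime(value: 5, timescale: 1))
let t = CMTime(value: 4, timescale: 1)
print(range.containsTime(t))  // true: 4 s lies inside the range
print(range.end.seconds)      // 7.0
```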

CMTimeMapping

A struct used to specify the mapping of a segment of one timeline into another.
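For example, mapping a 10-second source segment onto a 5-second target segment describes a 2x speed-up in an edited timeline. A sketch using the struct's memberwise initializer:

```swift
import CoreMedia

let source = CMTimeRange(start: .zero,
                         duration: CMTime(value: 10, timescale: 1)) // 10 s of media
let target = CMTimeRange(start: CMTime(value: 30, timescale: 1),
                         duration: CMTime(value: 5, timescale: 1))  // plays at 30 s, lasts 5 s
let mapping = CMTimeMapping(source: source, target: target)
print(mapping.source.duration.seconds, mapping.target.start.seconds) // 10.0 30.0
```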

Media Synchronization

CMClock

A reference clock used to synchronize applications and devices.
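The host time clock is the default reference clock. A minimal sketch of reading it:

```swift
import CoreMedia

// Host time is monotonic: successive reads never decrease.
let clock = CMClockGetHostTimeClock()
let t1 = CMClockGetTime(clock)
let t2 = CMClockGetTime(clock)
print(t2 >= t1) // true
```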

CMAudioClock

A specialized reference clock used to synchronize with audio sources.

CMTimebase

A model of a timeline under application control.
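A timebase drives an application-controlled timeline relative to a source clock: rate 0 is paused, rate 1 advances in real time. A sketch, assuming a recent SDK that provides `CMTimebaseCreateWithSourceClock`:

```swift
import CoreMedia

// Create a timebase whose source is the host time clock.
var timebase: CMTimebase?
CMTimebaseCreateWithSourceClock(allocator: kCFAllocatorDefault,
                                sourceClock: CMClockGetHostTimeClock(),
                                timebaseOut: &timebase)
let tb = timebase!
CMTimebaseSetRate(tb, rate: 0)        // rate 0: the timebase is paused
CMTimebaseSetTime(tb, time: .zero)    // reset its current time to zero
print(CMTimebaseGetTime(tb).seconds)  // stays 0.0 while paused
```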

Text Markup

CMTextMarkup

The collection of text markup-related attributes supported by Core Media.

Metadata

CMMetadata

The APIs for working with the framework's Metadata Identifier Services and Metadata Data Type Registry.

Queues

CMSimpleQueue

A simple, lockless FIFO queue of (void *) elements.

CMBufferQueue

A queue of timed buffers.

CMMemory​Pool

A pool used for optimizing memory allocation when large blocks of memory must be repeatedly allocated, deallocated, and then reallocated.
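A sketch of backing a block-buffer allocation with a pool's allocator, so repeated allocations of the same size can reuse memory:

```swift
import CoreMedia

// The pool hands out a CFAllocator; blocks freed through it are recycled.
let pool = CMMemoryPoolCreate(options: nil)
let allocator = CMMemoryPoolGetAllocator(pool)

var blockBuffer: CMBlockBuffer?
CMBlockBufferCreateWithMemoryBlock(allocator: allocator,
                                   memoryBlock: nil,
                                   blockLength: 1024,
                                   blockAllocator: allocator,  // memory comes from the pool
                                   customBlockSource: nil,
                                   offsetToData: 0,
                                   dataLength: 1024,
                                   flags: kCMBlockBufferAssureMemoryNowFlag,
                                   blockBufferOut: &blockBuffer)
let length = CMBlockBufferGetDataLength(blockBuffer!)
print(length) // 1024

blockBuffer = nil             // return the block to the pool
CMMemoryPoolInvalidate(pool)  // stop recycling and free cached memory
```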