The Media layer contains the graphics, audio, and video technologies geared toward creating the best multimedia experience available on a mobile device. The technologies in this layer were designed to make it easy for you to build applications that look and sound great.
Graphics Technologies

High-quality graphics are an important part of all iOS applications. The simplest (and most efficient) way to create an application is to use prerendered images together with the standard views and controls of the UIKit framework and let the system do the drawing. However, there may be situations where you need to go beyond simple graphics. In those situations, you can use the following technologies to manage your application’s graphical content:
Core Graphics (also known as Quartz) handles native 2D vector- and image-based rendering; see “Core Graphics Framework.”
Core Animation (part of the Quartz Core framework) provides advanced support for animating views and other content; see “Quartz Core Framework.”
Core Image provides advanced support for manipulating video and still images; see “Core Image Framework.”
Core Text provides a sophisticated text layout and rendering engine; see “Core Text Framework.”
Image I/O provides interfaces for reading and writing most image formats; see “Image I/O Framework.”
The Assets Library framework provides access to the photos and videos in the user’s photo library; see “Assets Library Framework.”
For the most part, applications running on devices with Retina displays should work with little or no modification. Any content you draw is automatically scaled as needed to support high-resolution screens. For vector-based drawing code, the system frameworks automatically use any extra pixels to improve the crispness of your content. And if you use images in your application, UIKit provides support for loading high-resolution variants of your existing images automatically. For more information about what you need to do to support high-resolution screens, see “App-Related Resources” in iOS App Programming Guide.
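For example, image loading through UIKit picks up high-resolution variants without any extra code. The following is a minimal sketch; the image name is a hypothetical bundle resource:

    // Loads Button.png on standard-resolution screens and Button@2x.png
    // automatically on Retina displays. (Button.png is a hypothetical resource.)
    UIImage *buttonImage = [UIImage imageNamed:@"Button"];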
For information about the graphics-related frameworks, see the corresponding entries in “Media Layer Frameworks.”
Audio Technologies

The audio technologies available in iOS are designed to help you provide a rich audio experience for your users. This experience includes the ability to play high-quality audio, record high-quality audio, and trigger the vibration feature on certain devices.
The system provides several ways to play back and record audio content. The frameworks in the following list are ordered from high level to low level, with the Media Player framework offering the highest-level interfaces you can use. When choosing an audio technology, remember that higher-level frameworks are easier to use and are generally preferred. Lower-level frameworks offer more flexibility and control but require you to do more work.
The Media Player framework provides easy access to the user’s iTunes library and support for playing tracks and playlists; see “Media Player Framework.”
The AV Foundation framework provides a set of easy-to-use Objective-C interfaces for managing audio playback and recording; see “AV Foundation Framework.”
OpenAL provides a set of cross-platform interfaces for delivering positional audio; see “OpenAL Framework.”
The Core Audio frameworks offer both simple and sophisticated interfaces for playing and recording audio content. You use these interfaces for playing system alert sounds, triggering the vibrate capability of a device, and managing the buffering and playback of multichannel local or streamed audio content; see “Core Audio.”
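As an illustration of the high-level end of this spectrum, the AV Foundation framework can play a sound file in just a few lines. The following is a minimal sketch; the sound.caf resource name is a placeholder for your own audio file:

    #import <AVFoundation/AVFoundation.h>

    // Play a bundled sound file; "sound.caf" is a hypothetical resource name.
    NSURL *soundURL = [[NSBundle mainBundle] URLForResource:@"sound" withExtension:@"caf"];
    NSError *error = nil;
    AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:soundURL error:&error];
    if (player) {
        [player prepareToPlay];
        [player play];   // Keep a strong reference to the player while it plays.
    }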
The audio technologies in iOS support the following audio formats:
Apple Lossless (ALAC)
DVI/Intel IMA ADPCM
Microsoft GSM 6.10
For information about each of the audio frameworks, see the corresponding entry in “Media Layer Frameworks.”
Video Technologies

Whether you are playing movie files from your application or streaming them from the network, iOS provides several technologies to play your video-based content. On devices with the appropriate video hardware, you can also use these technologies to capture video and incorporate it into your application.
The system provides several ways to play and record video content that you can choose depending on your needs. When choosing a video technology, remember that the higher-level frameworks simplify the work you have to do to support the features you need and are generally preferred. The frameworks in the following list are ordered from highest to lowest level, with the Media Player framework offering the highest-level interfaces you can use.
The UIImagePickerController class in UIKit provides a standard interface for recording video on devices with a supported camera (a minimal sketch follows this list).
The Media Player framework provides a set of simple-to-use interfaces for presenting full- or partial-screen movies from your application; see “Media Player Framework.”
The AV Foundation framework provides a set of Objective-C interfaces for managing the capture and playback of movies; see “AV Foundation Framework.”
Core Media describes the low-level data types used by the higher-level frameworks and provides low-level interfaces for manipulating media; see “Core Media Framework.”
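The following is a minimal sketch of presenting the standard camera interface for movie capture, assuming a view controller that adopts the UIImagePickerControllerDelegate and UINavigationControllerDelegate protocols:

    #import <MobileCoreServices/MobileCoreServices.h>   // for kUTTypeMovie

    // Present the standard camera UI configured for movie capture.
    if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
        UIImagePickerController *picker = [[UIImagePickerController alloc] init];
        picker.sourceType = UIImagePickerControllerSourceTypeCamera;
        picker.mediaTypes = [NSArray arrayWithObject:(NSString *)kUTTypeMovie];
        picker.delegate = self;   // self adopts both delegate protocols
        [self presentViewController:picker animated:YES completion:NULL];
    }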
The video technologies in iOS support the playback of movie files with the .mov, .mp4, .m4v, and .3gp filename extensions and using the following compression standards:
H.264 video, up to 1.5 Mbps, 640 by 480 pixels, 30 frames per second, Low-Complexity version of the H.264 Baseline Profile with AAC-LC audio up to 160 Kbps, 48 kHz, stereo audio in .m4v, .mp4, and .mov file formats
H.264 video, up to 768 Kbps, 320 by 240 pixels, 30 frames per second, Baseline Profile up to Level 1.3 with AAC-LC audio up to 160 Kbps, 48 kHz, stereo audio in .m4v, .mp4, and .mov file formats
MPEG-4 video, up to 2.5 Mbps, 640 by 480 pixels, 30 frames per second, Simple Profile with AAC-LC audio up to 160 Kbps, 48 kHz, stereo audio in .m4v, .mp4, and .mov file formats
Numerous audio formats, including the ones listed in “Audio Technologies”
For information about each of the video frameworks in the Media layer, see the corresponding entry in “Media Layer Frameworks.” For more information on using the UIImagePickerController class, see Camera Programming Topics for iOS.
AirPlay

AirPlay is a technology that lets your application stream audio to Apple TV and to third-party AirPlay speakers and receivers. AirPlay support is built in to the AV Foundation framework and the Core Audio family of frameworks. Any audio content you play using these frameworks is automatically made eligible for AirPlay distribution. Once the user chooses to play your audio using AirPlay, it is routed automatically by the system.
In iOS 5, users can mirror the content of an iPad 2 to an Apple TV 2 using AirPlay for any application. And developers who want to display different content (instead of mirroring) can assign a new window object to any UIScreen object connected to an iPad 2 via AirPlay. iOS 5 also offers more ways to deliver content over AirPlay, including using the AVPlayer class in the AV Foundation framework and the UIWebView class in the UIKit framework. In addition, the Media Player framework now includes support for displaying “Now Playing” information in several places, including as part of the content delivered over AirPlay.
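For example, with the AV Foundation route, AirPlay video delivery can be toggled through a property on the player. The following is a minimal sketch; the media URL is hypothetical:

    #import <AVFoundation/AVFoundation.h>

    // Create a player for a streamable asset; the URL is a placeholder.
    NSURL *assetURL = [NSURL URLWithString:@"http://example.com/movie.m3u8"];
    AVPlayer *player = [AVPlayer playerWithURL:assetURL];
    player.allowsAirPlayVideo = YES;   // On by default in iOS 5; set to NO to opt out.
    [player play];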
For information on how to take advantage of AirPlay, see AirPlay Overview.
Media Layer Frameworks
The following sections describe the frameworks of the Media layer and the services they offer.
Assets Library Framework
Introduced in iOS 4.0, the Assets Library framework (AssetsLibrary.framework) provides a query-based interface for retrieving photos and videos from the user’s device. Using this framework, you can access the same assets that are normally managed by the Photos application, including items in the user’s saved photos album and any photos and videos that were imported onto the device. You can also save new photos and videos back to the user’s saved photos album.
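The following is a minimal sketch of the query-based interface, enumerating the saved photos album. The enumeration is asynchronous, so the library object must be kept alive until it completes:

    #import <AssetsLibrary/AssetsLibrary.h>

    // Keep a strong reference to the library until enumeration finishes.
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos
                           usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
        // group is nil once, after all groups have been enumerated.
        [group enumerateAssetsUsingBlock:^(ALAsset *asset, NSUInteger index, BOOL *innerStop) {
            if (asset != nil) {
                NSLog(@"Found asset of type %@", [asset valueForProperty:ALAssetPropertyType]);
            }
        }];
    }
                         failureBlock:^(NSError *error) {
        NSLog(@"Photo library access failed: %@", error);
    }];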
For more information about the classes and methods of this framework, see Assets Library Framework Reference.
AV Foundation Framework
Introduced in iOS 2.2, the AV Foundation framework (AVFoundation.framework) contains Objective-C classes for playing audio content. You can use these classes to play file- or memory-based sounds of any duration. You can play multiple sounds simultaneously and control various playback aspects of each sound. In iOS 3.0 and later, this framework also includes support for recording audio and managing audio session information.
In iOS 4.0 and later, the services offered by this framework were expanded to include:
Media asset management
Metadata management for media items
Precise synchronization between sounds
An Objective-C interface for determining details about sound files, such as the data format, sample rate, and number of channels
In iOS 5, the AV Foundation framework includes support for streaming audio and video content over AirPlay using the AVPlayer class. AirPlay support is enabled by default, but applications can opt out as needed.
The AV Foundation framework is a single source for recording and playing back audio and video in iOS. This framework also provides much more sophisticated support for handling and managing media items than higher-level frameworks.
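For example, recording audio requires only a destination URL and a settings dictionary. The following is a minimal sketch; the file name and format values are illustrative rather than required, and you may also need to configure the shared audio session for recording:

    #import <AVFoundation/AVFoundation.h>
    #import <CoreAudio/CoreAudioTypes.h>   // for kAudioFormatLinearPCM

    // Record mono linear PCM to a file in the application's Documents directory.
    NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                          NSUserDomainMask, YES) lastObject];
    NSURL *outputURL = [NSURL fileURLWithPath:
                          [docs stringByAppendingPathComponent:@"recording.caf"]];
    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
        [NSNumber numberWithFloat:44100.0],             AVSampleRateKey,
        [NSNumber numberWithInt:1],                     AVNumberOfChannelsKey,
        nil];
    NSError *error = nil;
    AVAudioRecorder *recorder = [[AVAudioRecorder alloc] initWithURL:outputURL
                                                            settings:settings
                                                               error:&error];
    [recorder record];   // Call -stop to finish; keep a strong reference while recording.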
For more information about the classes of the AV Foundation framework, see AV Foundation Framework Reference.
Core Audio

Native support for audio is provided by the Core Audio family of frameworks, which are listed in Table 2-1. Core Audio is a C-based interface that supports the manipulation of stereo-based audio. You can use Core Audio in iOS to generate, record, mix, and play audio in your applications. You can also use Core Audio to trigger the vibrate capability on devices that support it.
Table 2-1  Core Audio frameworks

CoreAudio.framework: Defines the audio data types used throughout Core Audio. See Core Audio Framework Reference.
AudioToolbox.framework: Provides playback and recording services for audio files and streams. This framework also provides support for managing audio files, playing system alert sounds, and triggering the vibrate capability on some devices. See Audio Toolbox Framework Reference.
AudioUnit.framework: Provides services for using the built-in audio units, which are audio processing modules. See Audio Unit Framework Reference.
CoreMIDI.framework: Provides low-level MIDI services. See Core MIDI Framework Reference.
MediaToolbox.framework: Provides access to the audio tap interfaces.
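As a simple example of these services, the Audio Toolbox framework can play a short alert sound or trigger vibration with a single call. The following is a minimal sketch; the sound file name is a placeholder:

    #import <AudioToolbox/AudioToolbox.h>

    // Trigger the vibrate capability (silently ignored on devices without it).
    AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);

    // Play a short alert sound from the bundle; "alert.caf" is hypothetical.
    NSURL *soundURL = [[NSBundle mainBundle] URLForResource:@"alert" withExtension:@"caf"];
    SystemSoundID soundID;
    AudioServicesCreateSystemSoundID((__bridge CFURLRef)soundURL, &soundID);
    AudioServicesPlaySystemSound(soundID);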
For more information about Core Audio, see Core Audio Overview. For information about how to use the Audio Toolbox framework to play sounds, see Audio Queue Services Programming Guide.
Core Graphics Framework
The Core Graphics framework (CoreGraphics.framework) contains the interfaces for the Quartz 2D drawing API. Quartz is the same advanced, vector-based drawing engine that is used in OS X. It provides support for path-based drawing, anti-aliased rendering, gradients, images, colors, coordinate-space transformations, and PDF document creation, display, and parsing. Although the API is C based, it uses object-based abstractions to represent fundamental drawing objects, making it easy to store and reuse your graphics content.
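The following is a minimal sketch of Quartz drawing inside a custom UIView subclass’s drawRect: method:

    // Draw a filled red circle with a stroked black outline using Quartz 2D.
    - (void)drawRect:(CGRect)rect
    {
        CGContextRef context = UIGraphicsGetCurrentContext();
        CGRect circleRect = CGRectInset(self.bounds, 10.0, 10.0);
        CGContextSetFillColorWithColor(context, [UIColor redColor].CGColor);
        CGContextFillEllipseInRect(context, circleRect);
        CGContextSetStrokeColorWithColor(context, [UIColor blackColor].CGColor);
        CGContextSetLineWidth(context, 2.0);
        CGContextStrokeEllipseInRect(context, circleRect);
    }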
For more information on how to use Quartz to draw content, see Quartz 2D Programming Guide and Core Graphics Framework Reference.
Core Image Framework
Introduced in iOS 5, the Core Image framework (CoreImage.framework) provides a powerful set of built-in filters for manipulating video and still images. You can use the built-in filters for everything from simple operations (like touching up and correcting photos) to more advanced operations (like face and feature detection). The advantage of using these filters is that they operate in a nondestructive manner so that your original images are never changed directly. In addition, Core Image takes advantage of the available CPU and GPU processing power to ensure that operations are fast and efficient.

The CIImage class provides access to a standard set of filters that you can use to improve the quality of a photograph. To create other types of filters, you can create and configure a CIFilter object for the appropriate filter type.
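The following is a minimal sketch that applies the built-in CISepiaTone filter nondestructively; inputUIImage is a hypothetical UIImage supplied by the caller:

    #import <CoreImage/CoreImage.h>

    // inputUIImage is a hypothetical source image; it is never modified.
    CIImage *inputImage = [CIImage imageWithCGImage:inputUIImage.CGImage];
    CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"];
    [sepia setValue:inputImage forKey:kCIInputImageKey];
    [sepia setValue:[NSNumber numberWithFloat:0.8f] forKey:kCIInputIntensityKey];

    // Render the filtered result into a new image.
    CIContext *context = [CIContext contextWithOptions:nil];
    CIImage *outputImage = [sepia outputImage];
    CGImageRef cgImage = [context createCGImage:outputImage fromRect:[outputImage extent]];
    UIImage *result = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);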
For information about the classes and filters of the Core Image framework, see Core Image Reference Collection.
Core MIDI Framework
Introduced in iOS 4.2, the Core MIDI framework (CoreMIDI.framework) provides a standard way to communicate with MIDI devices, including hardware keyboards and synthesizers. You use this framework to send and receive MIDI messages and to interact with MIDI peripherals connected to an iOS-based device using the dock connector or network.
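The following is a minimal sketch that creates a MIDI client and an input port and connects the port to the first available source; error handling is omitted:

    #include <CoreMIDI/CoreMIDI.h>

    // Called on a high-priority Core MIDI thread whenever packets arrive.
    static void MIDIReadCallback(const MIDIPacketList *pktlist,
                                 void *readProcRefCon, void *srcConnRefCon)
    {
        // Parse pktlist->packet[0] and subsequent packets here.
    }

    static void SetUpMIDIInput(void)
    {
        MIDIClientRef client;
        MIDIPortRef inputPort;
        MIDIClientCreate(CFSTR("MyMIDIClient"), NULL, NULL, &client);
        MIDIInputPortCreate(client, CFSTR("Input"), MIDIReadCallback, NULL, &inputPort);
        if (MIDIGetNumberOfSources() > 0) {
            MIDIPortConnectSource(inputPort, MIDIGetSource(0), NULL);
        }
    }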
For more information about using this framework, see Core MIDI Framework Reference.
Core Text Framework
Introduced in iOS 3.2, the Core Text framework (CoreText.framework) contains a set of simple, high-performance C-based interfaces for laying out text and handling fonts. The Core Text framework provides a complete text layout engine that you can use to manage the placement of text on the screen. The text you manage can also be styled with different fonts and rendering attributes.
This framework is intended for use by applications that require sophisticated text handling capabilities, such as word-processing applications. If your application requires only simple text input and display, you should continue to use the existing text classes of the UIKit framework.
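The following is a minimal sketch that lays out and draws an attributed string inside a view’s drawRect: method. Core Text draws in a flipped coordinate system relative to UIKit, hence the transform:

    #import <CoreText/CoreText.h>

    - (void)drawRect:(CGRect)rect
    {
        CGContextRef context = UIGraphicsGetCurrentContext();

        // Flip the context so Core Text output appears right side up under UIKit.
        CGContextSetTextMatrix(context, CGAffineTransformIdentity);
        CGContextTranslateCTM(context, 0.0, self.bounds.size.height);
        CGContextScaleCTM(context, 1.0, -1.0);

        NSAttributedString *string =
            [[NSAttributedString alloc] initWithString:@"Hello, Core Text"];
        CTFramesetterRef framesetter =
            CTFramesetterCreateWithAttributedString((__bridge CFAttributedStringRef)string);

        // Fill the entire view bounds with the laid-out text.
        CGMutablePathRef path = CGPathCreateMutable();
        CGPathAddRect(path, NULL, self.bounds);
        CTFrameRef frame = CTFramesetterCreateFrame(framesetter, CFRangeMake(0, 0), path, NULL);
        CTFrameDraw(frame, context);

        CFRelease(frame);
        CGPathRelease(path);
        CFRelease(framesetter);
    }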
For more information about using the Core Text interfaces, see Core Text Programming Guide and Core Text Reference Collection.
Core Video Framework
Introduced in iOS 4.0, the Core Video framework (CoreVideo.framework) provides buffer and buffer pool support for the Core Media framework (described in “Core Media Framework”). Most applications never need to use this framework directly.
Image I/O Framework
Introduced in iOS 4.0, the Image I/O framework (ImageIO.framework) provides interfaces for importing and exporting image data and image metadata. This framework makes use of the Core Graphics data types and functions and supports all of the standard image types available in iOS.
In iOS 6 and later, you can use this framework to access EXIF and IPTC metadata properties for images.
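The following is a minimal sketch that reads the EXIF dictionary from an image file without decoding the full image; imageURL is a hypothetical file URL:

    #import <ImageIO/ImageIO.h>

    // imageURL is a hypothetical NSURL pointing at a JPEG in the app bundle.
    CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)imageURL, NULL);
    if (source) {
        CFDictionaryRef properties = CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
        if (properties) {
            // The EXIF metadata, if present, is nested under its own key.
            CFDictionaryRef exif = (CFDictionaryRef)CFDictionaryGetValue(properties,
                                       kCGImagePropertyExifDictionary);
            NSLog(@"EXIF: %@", exif);
            CFRelease(properties);
        }
        CFRelease(source);
    }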
For more information about the functions and data types of this framework, see Image I/O Reference Collection.
GLKit Framework

Introduced in iOS 5, the GLKit framework (GLKit.framework) contains a set of Objective-C based utility classes that simplify the effort required to create an OpenGL ES 2.0 application. GLKit provides support for four key areas of application development:
The GLKView and GLKViewController classes provide a standard implementation of an OpenGL ES–enabled view and associated rendering loop (a minimal sketch follows this list). The view manages the underlying framebuffer object on behalf of the application; your application just draws to it.
The GLKTextureLoader class provides image conversion and loading routines to your application, allowing it to automatically load texture images into your context. It can load textures synchronously or asynchronously. When loading textures asynchronously, your application provides a completion handler block to be called when the texture is loaded into your context.
The GLKit framework provides implementations of vectors, matrices, and quaternions, as well as a matrix stack operation that provides the same functionality found in OpenGL ES 1.1.
The GLKBaseEffect, GLKSkyboxEffect, and GLKReflectionMapEffect classes provide existing, configurable graphics shaders that implement commonly used graphics operations. In particular, the GLKBaseEffect class implements the lighting and material model found in the OpenGL ES 1.1 specification, simplifying the effort required to migrate an application from OpenGL ES 1.1 to OpenGL ES 2.0.
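The following is a minimal sketch of the per-frame drawing method in a GLKViewController subclass, assuming the controller’s view is a GLKView with a valid OpenGL ES 2.0 context:

    #import <GLKit/GLKit.h>

    // GLKViewController calls this once per frame; the framebuffer is
    // already bound, so the method only needs to issue GL commands.
    - (void)glkView:(GLKView *)view drawInRect:(CGRect)rect
    {
        glClearColor(0.0f, 0.0f, 0.2f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);
        // Issue additional OpenGL ES 2.0 drawing commands here.
    }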
For information about the classes of the GLKit framework, see GLKit Framework Reference.
Media Player Framework
The Media Player framework (MediaPlayer.framework) provides high-level support for playing audio and video content from your application. You can use this framework to play video using a standard system interface.
In iOS 3.0, support was added for accessing the user’s iTunes music library. With this support, you can play music tracks and playlists, search for songs, and present a media picker interface to the user.
In iOS 3.2, changes were made to support the playback of video from a resizable view. (Previously, only full-screen support was available.) In addition, numerous interfaces were added to support the configuration and management of movie playback.
In iOS 5, support was added for displaying “Now Playing” information in the lock screen and multitasking controls. This information can also be displayed on an Apple TV and with content delivered via AirPlay. There are also interfaces for detecting whether video is being streamed over AirPlay.
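The following is a minimal sketch that queries the iTunes library for all songs and plays them through the application music player:

    #import <MediaPlayer/MediaPlayer.h>

    // Queue every song in the user's library and start playback.
    MPMediaQuery *songsQuery = [MPMediaQuery songsQuery];
    MPMusicPlayerController *player = [MPMusicPlayerController applicationMusicPlayer];
    [player setQueueWithQuery:songsQuery];
    [player play];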
For information about the classes of the Media Player framework, see Media Player Framework Reference. For information on how to use these classes to access the user’s iTunes library, see iPod Library Access Programming Guide.
OpenAL Framework

The Open Audio Library (OpenAL) interface is a cross-platform standard for delivering positional audio in applications. You can use it to implement high-performance, high-quality audio in games and other programs that require positional audio output. Because OpenAL is a cross-platform standard, the code modules you write using OpenAL on iOS can be ported to many other platforms easily.
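The following is a minimal sketch of OpenAL setup and positional playback. Buffer creation is omitted; the buffer parameter is assumed to have been created with alGenBuffers and filled with alBufferData elsewhere:

    #include <OpenAL/al.h>
    #include <OpenAL/alc.h>

    // Play a previously filled buffer from a point to the listener's right.
    static void PlayPositionalSound(ALuint buffer)
    {
        ALCdevice *device = alcOpenDevice(NULL);             // default output device
        ALCcontext *context = alcCreateContext(device, NULL);
        alcMakeContextCurrent(context);

        ALuint source;
        alGenSources(1, &source);
        alSourcei(source, AL_BUFFER, buffer);
        alSource3f(source, AL_POSITION, 1.0f, 0.0f, 0.0f);   // one unit to the right
        alSourcePlay(source);
    }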
For information about OpenAL, including how to use it, see http://www.openal.org.
OpenGL ES Framework
The OpenGL ES framework (OpenGLES.framework) provides tools for drawing 2D and 3D content. It is a C-based framework that works closely with the device hardware to provide high frame rates for full-screen game-style applications.
You always use the OpenGL ES framework in conjunction with the EAGL interfaces. These interfaces are part of the OpenGL ES framework and provide the interface between your OpenGL ES drawing code and the native window objects defined by UIKit.
In iOS 3.0 and later, the OpenGL ES framework includes support for both the OpenGL ES 2.0 and the OpenGL ES 1.1 interface specifications. The 2.0 specification provides support for fragment and vertex shaders and is available only on specific iOS-based devices running iOS 3.0 and later. Support for OpenGL ES 1.1 is available on all iOS-based devices and in all versions of iOS.
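The following is a minimal sketch of context creation through the EAGL interfaces, falling back from OpenGL ES 2.0 to 1.1 on hardware without shader support:

    #import <OpenGLES/EAGL.h>

    // Prefer an OpenGL ES 2.0 context; fall back to 1.1 if unavailable.
    EAGLContext *context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    if (!context) {
        context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1];
    }
    [EAGLContext setCurrentContext:context];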
For information on how to use OpenGL ES in your applications, see OpenGL ES Programming Guide for iOS. For reference information, see OpenGL ES Framework Reference.
Quartz Core Framework
The Quartz Core framework (QuartzCore.framework) contains the Core Animation interfaces. Core Animation is an advanced animation and compositing technology that uses an optimized rendering path to implement complex animations and visual effects. It provides a high-level Objective-C interface for configuring animations and effects that are then rendered in hardware for performance. Core Animation is integrated into many parts of iOS, including UIKit classes such as UIView, providing animations for many standard system behaviors. You can also use the Objective-C interface in this framework to create custom animations.
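The following is a minimal sketch of an explicit Core Animation animation applied to a view’s layer; myView is a hypothetical existing view:

    #import <QuartzCore/QuartzCore.h>

    // Fade a view's layer from opaque to half transparent over one second.
    CABasicAnimation *fade = [CABasicAnimation animationWithKeyPath:@"opacity"];
    fade.fromValue = [NSNumber numberWithFloat:1.0f];
    fade.toValue   = [NSNumber numberWithFloat:0.5f];
    fade.duration  = 1.0;
    [myView.layer addAnimation:fade forKey:@"fade"];
    myView.layer.opacity = 0.5f;   // Update the model value so the change persists.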
For more information on how to use Core Animation in your applications, see Core Animation Programming Guide and Core Animation Reference Collection.
© 2012 Apple Inc. All Rights Reserved. (Last updated: 2012-09-19)