Media Layer

The Media layer contains the graphics, audio, and video technologies you use to implement multimedia experiences in your apps. The technologies in this layer make it easy for you to build apps that look and sound great.

Graphics Technologies

High-quality graphics are an important part of all apps, and iOS provides numerous technologies to help put your custom art and graphics onscreen. The iOS graphics technologies offer a wide range of support, working seamlessly with the UIKit view architecture to make it easy to deliver content. You can use the standard views to deliver a high-quality interface quickly, or you can create your own custom views and use any of the technologies listed in Table 2-1 to deliver an even richer graphical experience.

Table 2-1  Graphics technologies in iOS

UIKit graphics

UIKit defines high-level support for drawing images and Bézier paths and for animating the content of your views. In addition to providing classes to implement drawing support, UIKit views provide a fast and efficient way to render images and text-based content. Views can also be animated, both explicitly and using UIKit dynamics, to provide feedback and promote user interactivity.

For more information about the classes of the UIKit framework, see UIKit Framework Reference.

Core Graphics framework

Core Graphics (also known as Quartz) is the native drawing engine for iOS apps and provides support for custom 2D vector- and image-based rendering. Although not as fast as OpenGL ES rendering, this framework is well suited for situations where you want to render custom 2D shapes and images dynamically.

For more information, see “Core Graphics Framework.”

Core Animation

Core Animation (part of the Quartz Core framework) is a foundational technology that optimizes the animation experience of your apps. UIKit views use Core Animation to provide view-level animation support. You can use Core Animation directly when you want more control over the behavior of your animations.

For more information, see “Quartz Core Framework.”

Core Image

Core Image provides advanced support for manipulating video and still images in a nondestructive manner.

For more information, see “Core Image Framework.”

OpenGL ES and GLKit

OpenGL ES handles advanced 2D and 3D rendering using hardware-accelerated interfaces. This framework is traditionally used by game developers, or by anyone wanting to implement an immersive graphical experience. This framework gives you full control over the rendering process and offers the frame rates needed to create smooth animations. For more information, see “OpenGL ES Framework.”

GLKit is a set of Objective-C classes that provide the power of OpenGL ES using an object-oriented interface. For more information, see “GLKit Framework.”

Text Kit and Core Text

Text Kit is a family of UIKit classes used to perform fine typography and text management. If your app performs advanced text manipulations, Text Kit provides seamless integration with the rest of your views. For more information, see “Text Kit.”

Core Text is a lower-level C-based framework for handling advanced typography and layout. For more information, see “Core Text Framework.”

Image I/O

Image I/O provides interfaces for reading and writing most image formats. For more information, see “Image I/O Framework.”

Assets Library

The Assets Library framework lets you access a user’s photos and videos. You use this framework in places where you want to integrate the user’s own content with your app. For more information, see “Assets Library Framework.”

iOS provides built-in support for apps running on either Retina displays or standard-resolution displays. For vector-based drawing, the system frameworks automatically use the extra pixels of a Retina display to improve the crispness of your content. And if you use images in your app, UIKit provides support for loading high-resolution variants of your existing images automatically. For more information about what you need to do to support high-resolution screens, see “App-Related Resources” in iOS App Programming Guide.
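
For example, here is a minimal sketch of this behavior, assuming hypothetical icon.png and icon@2x.png resources in the app bundle:

    // Hypothetical assets: icon.png (standard) and icon@2x.png (Retina).
    // UIKit chooses the @2x variant automatically on Retina displays.
    UIImage *image = [UIImage imageNamed:@"icon"];
    UIImageView *imageView = [[UIImageView alloc] initWithImage:image];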

Audio Technologies

The iOS audio technologies work with the underlying hardware to provide a rich audio experience for your users. This experience includes the ability to play and record high-quality audio, to handle MIDI content, and to work with a device’s built-in sounds.

If your app uses audio, there are several technologies available for you to use. Table 2-2 lists these frameworks and describes the situations where you might use each.

Table 2-2  Audio technologies in iOS

Media Player framework

This high-level framework provides easy access to a user’s iTunes library and support for playing tracks and playlists. Use this framework when you want to integrate audio into your app quickly and when you don’t need fine-grained control over playback behavior. For more information, see “Media Player Framework.”

AV Foundation

AV Foundation is an Objective-C interface for managing the recording and playback of audio and video. Use this framework for recording audio and when you need fine-grained control over the audio playback process. For more information, see “AV Foundation Framework.”

OpenAL

OpenAL is an industry-standard technology for delivering positional audio. Game developers frequently use this technology to deliver high-quality audio using a set of cross-platform interfaces. For more information, see “OpenAL Framework.”

Core Audio

Core Audio is a set of frameworks that provide both simple and sophisticated interfaces for the recording and playback of audio and MIDI content. These frameworks are for advanced developers who need fine-grained control over their audio. For more information, see “Core Audio.”

iOS supports many industry-standard and Apple-specific audio formats, including AAC, Apple Lossless (ALAC), MP3, and linear PCM.

Video Technologies

The iOS video technologies provide support for managing static video content in your app or playing back streaming content from the Internet. For devices with the appropriate recording hardware, you can also record video and incorporate it into your app. Table 2-3 lists the technologies that support video playback and recording.

Table 2-3  Video technologies in iOS

UIImagePickerController

The UIImagePickerController class is a UIKit view controller for choosing user media files. You can use this view controller to prompt the user for an existing picture or video or to let the user capture new content. For more information, see Camera Programming Topics for iOS.

Media Player

The Media Player framework provides a set of simple-to-use interfaces for presenting video. This framework supports both full-screen and partial-screen video playback and supports optional playback controls for the user. For more information, see “Media Player Framework.”

AV Foundation

AV Foundation provides advanced video playback and recording capabilities. Use this framework in situations where you need more control over the presentation or recording of video. For example, augmented reality apps could use this framework to layer live video content with other app-provided content. For more information, see “AV Foundation Framework.”

Core Media

The Core Media framework defines the low-level data types and interfaces for manipulating media. Most apps do not need to use this framework directly, but it is available when you need unparalleled control over your app’s video content. For more information, see “Core Media Framework.”

iOS supports many industry-standard video formats and compression standards, including H.264 and MPEG-4.

AirPlay

AirPlay lets your app stream audio and video content to Apple TV and stream audio content to third-party AirPlay speakers and receivers. AirPlay support is built into numerous frameworks—UIKit framework, Media Player framework, AV Foundation framework, and the Core Audio family of frameworks—so in most cases you do not need to do anything special to support it. Any content you play using these frameworks is automatically made eligible for AirPlay distribution. When the user chooses to play your content using AirPlay, it is routed automatically by the system.
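
If you want to offer an explicit route-selection control, one lightweight option is an MPVolumeView configured to show only its route button. This is a sketch of that approach, not the only way to adopt AirPlay:

    #import <MediaPlayer/MediaPlayer.h>

    // Adds the standard AirPlay route-selection button to a view,
    // hiding the volume slider so only the button appears.
    MPVolumeView *routePicker =
        [[MPVolumeView alloc] initWithFrame:CGRectMake(0.0, 0.0, 44.0, 44.0)];
    routePicker.showsVolumeSlider = NO;
    routePicker.showsRouteButton = YES;
    [self.view addSubview:routePicker];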

Additional options for delivering content over AirPlay are also available, such as using an AirPlay-connected Apple TV as a second display for app-specific content.

For information on how to take advantage of AirPlay in your apps, see AirPlay Overview.

Media Layer Frameworks

The following sections describe the frameworks of the Media layer and the services they offer.

Assets Library Framework

The Assets Library framework (AssetsLibrary.framework) provides access to the photos and videos managed by the Photos app on a user’s device. Use this framework to access items in the user’s saved photos album or in any albums imported onto the device. You can also save new photos and videos back to the user’s saved photos album.
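
For example, here is a minimal sketch that enumerates the user’s albums; the system may prompt the user for permission the first time this code runs:

    #import <AssetsLibrary/AssetsLibrary.h>

    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library enumerateGroupsWithTypes:ALAssetsGroupAll
                           usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
        // The enumeration delivers a nil group when it finishes.
        if (group) {
            NSLog(@"Album: %@", [group valueForProperty:ALAssetsGroupPropertyName]);
        }
    } failureBlock:^(NSError *error) {
        NSLog(@"Unable to access the library: %@", error);
    }];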

For more information about the classes and methods of this framework, see Assets Library Framework Reference.

AV Foundation Framework

The AV Foundation framework (AVFoundation.framework) provides a set of Objective-C classes for playing, recording, and managing audio and video content. Use this framework when you want to integrate media capabilities seamlessly into your app’s user interface. You can also use it for more advanced media handling. For example, you can use this framework to play multiple sounds simultaneously and to control numerous aspects of the playback and recording process.

The services offered by this framework include:

  • Audio session management, including support for declaring your app’s audio capabilities to the system

  • Management of your app’s media assets

  • Support for editing media content

  • The ability to capture audio and video

  • The ability to play back audio and video

  • Track management

  • Metadata management for media items

  • Stereophonic panning

  • Precise synchronization between sounds

  • An Objective-C interface for determining details about sound files, such as the data format, sample rate, and number of channels

  • Support for streaming content over AirPlay
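
As a brief illustration of the session-management and playback items above, the following sketch declares a playback audio session and plays a sound; the file name thunder.caf is hypothetical:

    #import <AVFoundation/AVFoundation.h>

    // Declare the app’s audio behavior to the system.
    NSError *error = nil;
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback
                                           error:&error];
    [[AVAudioSession sharedInstance] setActive:YES error:&error];

    // Play a sound file from the app bundle (thunder.caf is hypothetical).
    NSURL *soundURL = [[NSBundle mainBundle] URLForResource:@"thunder"
                                              withExtension:@"caf"];
    AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:soundURL
                                                                   error:&error];
    [player prepareToPlay];
    [player play];   // keep a strong reference to player while it plays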

For more information about how to use AV Foundation, see AV Foundation Programming Guide. For information about the classes of the AV Foundation framework, see AV Foundation Framework Reference.

Core Audio

Core Audio is a family of frameworks (listed in Table 2-4) that provides native support for handling audio. These frameworks support the generation, recording, mixing, and playing of audio in your apps. You can also use these interfaces to work with MIDI content and to stream audio and MIDI content to other apps.

Table 2-4  Core Audio frameworks

CoreAudio.framework

Defines the audio data types used throughout Core Audio. For more information, see Core Audio Framework Reference.

AudioToolbox.framework

Provides playback and recording services for audio files and streams. This framework also provides support for managing audio files, playing system alert sounds, and triggering the vibrate capability on some devices. For more information, see Audio Toolbox Framework Reference.

AudioUnit.framework

Provides services for using the built-in audio units, which are audio processing modules. This framework also supports vending your app’s audio content as an audio component that is visible to other apps. For more information, see Audio Unit Framework Reference.

CoreMIDI.framework

Provides a standard way to communicate with MIDI devices, including hardware keyboards and synthesizers. You use this framework to send and receive MIDI messages and to interact with MIDI peripherals connected to an iOS device through the dock connector or over a network. For more information, see Core MIDI Framework Reference.

MediaToolbox.framework

Provides access to the audio tap interfaces.

For more information about Core Audio, see Core Audio Overview. For information about how to use the Audio Toolbox framework to play sounds, see Audio Queue Services Programming Guide.
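
For example, here is a minimal Audio Toolbox sketch that registers a short sound (the file name tap.caf is hypothetical) and plays it as a system sound:

    #import <AudioToolbox/AudioToolbox.h>

    NSURL *soundURL = [[NSBundle mainBundle] URLForResource:@"tap"
                                              withExtension:@"caf"];
    SystemSoundID soundID = 0;
    AudioServicesCreateSystemSoundID((__bridge CFURLRef)soundURL, &soundID);
    AudioServicesPlaySystemSound(soundID);

    // On devices with the vibrate capability, trigger a vibration.
    AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);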

Core Graphics Framework

The Core Graphics framework (CoreGraphics.framework) contains the interfaces for the Quartz 2D drawing API. Quartz is the same advanced, vector-based drawing engine that is used in OS X. It supports path-based drawing, antialiased rendering, gradients, images, colors, coordinate-space transformations, and PDF document creation, display, and parsing. Although the API is C based, it uses object-based abstractions to represent fundamental drawing objects, making it easy to store and reuse your graphics content.
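
For example, a custom view might override drawRect: to render a shape with Quartz; this is a minimal sketch of the pattern:

    // In a custom UIView subclass. UIKit configures the current
    // drawing context before calling this method.
    - (void)drawRect:(CGRect)rect {
        CGContextRef context = UIGraphicsGetCurrentContext();
        CGContextSetFillColorWithColor(context, [UIColor redColor].CGColor);
        CGContextFillEllipseInRect(context, CGRectInset(self.bounds, 10.0, 10.0));
    }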

For more information on how to use Quartz to draw content, see Quartz 2D Programming Guide and Core Graphics Framework Reference.

Core Image Framework

The Core Image framework (CoreImage.framework) provides a powerful set of built-in filters for manipulating video and still images. You can use the built-in filters for everything from touching up and correcting photos to face and feature detection. The advantage of these filters is that they operate in a nondestructive manner, leaving your original images unchanged. Because the filters are optimized for the underlying hardware, they are fast and efficient.
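
For example, here is a minimal sketch that applies the built-in sepia filter to an existing UIImage (originalImage is an assumed variable) without touching the original:

    #import <CoreImage/CoreImage.h>

    CIImage *inputImage = [CIImage imageWithCGImage:originalImage.CGImage];
    CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"];
    [filter setValue:inputImage forKey:kCIInputImageKey];
    [filter setValue:@0.8 forKey:kCIInputIntensityKey];

    // Render the filtered result; the input image is left unchanged.
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef result = [context createCGImage:filter.outputImage
                                      fromRect:[filter.outputImage extent]];
    // Use result, then release it with CGImageRelease(result).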

For information about the classes and filters of the Core Image framework, see Core Image Reference Collection.

Core Text Framework

The Core Text framework (CoreText.framework) offers a simple, high-performance C-based interface for laying out text and handling fonts. This framework is for apps that do not use Text Kit but that still want the kind of advanced text handling capabilities found in word processor apps. The framework provides a sophisticated text layout engine, including the ability to wrap text around other content. It also supports advanced text styling using multiple fonts and rendering attributes.
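
For example, here is a minimal sketch that lays out an attributed string in a rectangular path and draws it into a Quartz context (context is an assumed CGContextRef, already flipped for Core Text drawing):

    #import <CoreText/CoreText.h>

    NSAttributedString *text =
        [[NSAttributedString alloc] initWithString:@"Hello, Core Text"];
    CTFramesetterRef framesetter =
        CTFramesetterCreateWithAttributedString((__bridge CFAttributedStringRef)text);
    CGPathRef path = CGPathCreateWithRect(CGRectMake(0.0, 0.0, 300.0, 200.0), NULL);
    CTFrameRef frame = CTFramesetterCreateFrame(framesetter, CFRangeMake(0, 0), path, NULL);
    CTFrameDraw(frame, context);

    CFRelease(frame);
    CGPathRelease(path);
    CFRelease(framesetter);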

For more information about the Core Text interfaces, see Core Text Programming Guide and Core Text Reference Collection.

Core Video Framework

The Core Video framework (CoreVideo.framework) provides buffer and buffer-pool support for the Core Media framework (described in “Core Media Framework”). Most apps never need to use this framework directly.

Game Controller Framework

The Game Controller framework (GameController.framework) lets you discover and configure Made-for-iPhone/iPod/iPad (MFi) game controller hardware in your app. Game controllers can be devices connected physically to an iOS device or connected wirelessly over Bluetooth. The Game Controller framework notifies your app when controllers become available and lets you specify which controller inputs are relevant to your app.
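
For example, here is a minimal sketch that observes controller connections and reads input from the gamepad profile (which is nil on controllers that do not support that profile):

    #import <GameController/GameController.h>

    [[NSNotificationCenter defaultCenter]
        addObserverForName:GCControllerDidConnectNotification
                    object:nil
                     queue:[NSOperationQueue mainQueue]
                usingBlock:^(NSNotification *note) {
        GCController *controller = note.object;
        controller.gamepad.valueChangedHandler =
            ^(GCGamepad *gamepad, GCControllerElement *element) {
            if (gamepad.buttonA.isPressed) {
                // Respond to the A button.
            }
        };
    }];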

For more information about supporting game controllers, see Game Controller Programming Guide.

GLKit Framework

The GLKit framework (GLKit.framework) contains a set of Objective-C based utility classes that simplify the effort required to create an OpenGL ES app. GLKit supports four key areas of app development:

  • The GLKView and GLKViewController classes provide a standard implementation of an OpenGL ES–enabled view and associated rendering loop. The view manages the underlying framebuffer object on behalf of the app; your app just draws to it.

  • The GLKTextureLoader class provides image conversion and loading routines to your app, allowing it to automatically load texture images into your context. It can load textures synchronously or asynchronously. When loading textures asynchronously, your app provides a completion handler block to be called when the texture is loaded into your context.

  • The GLKit framework provides implementations of vectors, matrices, and quaternions, as well as a matrix stack operation that provides the same functionality found in OpenGL ES 1.1.

  • The GLKBaseEffect, GLKSkyboxEffect, and GLKReflectionMapEffect classes provide existing, configurable graphics shaders that implement commonly used graphics operations. In particular, the GLKBaseEffect class implements the lighting and material model found in the OpenGL ES 1.1 specification, simplifying the effort required to migrate an app from OpenGL ES 1.1 to later versions of OpenGL ES.
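
For example, here is a minimal GLKViewController subclass (MyViewController is a hypothetical name); GLKit manages the framebuffer and the rendering loop, so the subclass only issues drawing calls:

    #import <GLKit/GLKit.h>
    #import <OpenGLES/ES2/gl.h>

    @interface MyViewController : GLKViewController
    @end

    @implementation MyViewController

    - (void)viewDidLoad {
        [super viewDidLoad];
        GLKView *view = (GLKView *)self.view;
        view.context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
        [EAGLContext setCurrentContext:view.context];
    }

    // GLKit calls this method once per frame of the rendering loop.
    - (void)glkView:(GLKView *)view drawInRect:(CGRect)rect {
        glClearColor(0.0f, 0.0f, 0.2f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);
    }

    @end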

For information about the classes of the GLKit framework, see GLKit Framework Reference.

Image I/O Framework

The Image I/O framework (ImageIO.framework) provides interfaces for importing and exporting image data and image metadata. This framework makes use of the Core Graphics data types and functions and supports all of the standard image types available in iOS. You can also use this framework to access Exif and IPTC metadata properties for images.
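
For example, here is a minimal sketch that reads the first image and its Exif metadata from a file URL (imageURL is an assumed NSURL):

    #import <ImageIO/ImageIO.h>

    CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)imageURL, NULL);
    CGImageRef image = CGImageSourceCreateImageAtIndex(source, 0, NULL);
    NSDictionary *properties = (__bridge_transfer NSDictionary *)
        CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
    NSLog(@"Exif: %@", properties[(NSString *)kCGImagePropertyExifDictionary]);

    CGImageRelease(image);
    CFRelease(source);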

For information about the functions and data types of this framework, see Image I/O Reference Collection.

Media Accessibility Framework

The Media Accessibility framework (MediaAccessibility.framework) manages the presentation of closed-caption content in your media files. This framework works in conjunction with system settings that let the user enable the display of closed captions.

For information about the contents of this framework, see the header files.

Media Player Framework

The Media Player framework (MediaPlayer.framework) provides high-level support for playing audio and video content from your app. You can use this framework to do the following:

  • Play video to a user’s screen or to another device over AirPlay. You can play this video full screen or in a resizable view.

  • Access the user’s iTunes music library. You can play music tracks and playlists, search for songs, and present a media picker interface to the user.

  • Configure and manage movie playback.

  • Display Now Playing information in the lock screen and App Switcher. You can also display this information on an Apple TV when content is delivered via AirPlay.

  • Detect when video is being streamed over AirPlay.
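
For example, here is a minimal sketch that presents full-screen movie playback from a view controller (movieURL is an assumed NSURL pointing to playable content):

    #import <MediaPlayer/MediaPlayer.h>

    MPMoviePlayerViewController *playerController =
        [[MPMoviePlayerViewController alloc] initWithContentURL:movieURL];
    [self presentMoviePlayerViewControllerAnimated:playerController];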

For information about the classes of the Media Player framework, see Media Player Framework Reference. For information on how to use these classes to access the user’s iTunes library, see iPod Library Access Programming Guide.

OpenAL Framework

The Open Audio Library (OpenAL) interface is a cross-platform standard for delivering positional audio in apps. You can use it to implement high-performance, high-quality audio in games and other programs that require positional audio output. Because OpenAL is a cross-platform standard, the code modules you write using OpenAL on iOS can be ported to many other platforms easily.

For information about OpenAL, including how to use it, see http://www.openal.org.

OpenGL ES Framework

The OpenGL ES framework (OpenGLES.framework) provides tools for drawing 2D and 3D content. It is a C-based framework that works closely with the device hardware to provide fine-grained graphics control and high frame rates for full-screen immersive apps such as games. You use the OpenGL ES framework in conjunction with the EAGL interfaces, which connect your OpenGL ES drawing calls to the native window objects in UIKit.

The framework supports OpenGL ES 1.1, 2.0, and 3.0. The 2.0 specification added support for fragment and vertex shaders, and the 3.0 specification added support for many more features, including multiple render targets and transform feedback.
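
For example, here is a minimal sketch that creates a rendering context, falling back from OpenGL ES 3.0 to 2.0 on devices that do not support the newer version:

    #import <OpenGLES/EAGL.h>

    EAGLContext *context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES3];
    if (context == nil) {
        context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    }
    [EAGLContext setCurrentContext:context];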

For information on how to use OpenGL ES in your apps, see OpenGL ES Programming Guide for iOS. For reference information, see OpenGL ES Framework Reference.

Quartz Core Framework

The Quartz Core framework (QuartzCore.framework) contains the Core Animation interfaces. Core Animation is an advanced compositing technology that makes it easy to create view-based animations that are fast and efficient. The compositing engine takes advantage of the underlying hardware to manipulate your view’s contents efficiently and in real time. Specify the start and end points of the animation, and let Core Animation do the rest. And because Core Animation is built in to the underlying UIView architecture, it is always available.
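
For example, here is a minimal sketch that fades out a view’s layer (myView is an assumed UIView); you specify the start and end values, and Core Animation interpolates the intermediate frames:

    #import <QuartzCore/QuartzCore.h>

    CABasicAnimation *fade = [CABasicAnimation animationWithKeyPath:@"opacity"];
    fade.fromValue = @1.0;
    fade.toValue = @0.0;
    fade.duration = 0.5;
    [myView.layer addAnimation:fade forKey:@"fade"];
    myView.layer.opacity = 0.0;   // update the model value to match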

For more information on how to use Core Animation in your apps, see Core Animation Programming Guide and Core Animation Reference Collection.

Sprite Kit Framework

The Sprite Kit framework (SpriteKit.framework) provides a hardware-accelerated animation system for 2D and 2.5D games. Sprite Kit provides the infrastructure that most games need, including a graphics rendering and animation system, sound playback support, and a physics simulation engine. Using Sprite Kit frees you from creating these things yourself and lets you focus on the design of your content and the high-level interactions for that content.

Content in a Sprite Kit app is organized into scenes. A scene can include textured objects, video, path-based shapes, Core Image filters, and other special effects. Sprite Kit takes those objects and determines the most efficient way to render them onscreen. When it comes time to animate the content in your scenes, you can use Sprite Kit to specify explicit actions you want to perform or use the physics simulation engine to define physical behaviors (such as gravity, attraction, or repulsion) for your objects.
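
For example, here is a minimal scene sketch (MyScene and the "rocket" image are hypothetical) that combines an explicit action with the physics simulation:

    #import <SpriteKit/SpriteKit.h>

    @interface MyScene : SKScene
    @end

    @implementation MyScene

    - (void)didMoveToView:(SKView *)view {
        self.physicsWorld.gravity = CGVectorMake(0.0, -9.8);

        SKSpriteNode *sprite = [SKSpriteNode spriteNodeWithImageNamed:@"rocket"];
        sprite.position = CGPointMake(CGRectGetMidX(self.frame),
                                      CGRectGetMaxY(self.frame));
        sprite.physicsBody = [SKPhysicsBody bodyWithRectangleOfSize:sprite.size];
        [self addChild:sprite];

        // An explicit action runs alongside the physics simulation.
        [sprite runAction:[SKAction rotateByAngle:M_PI duration:1.0]];
    }

    @end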

In addition to the Sprite Kit framework, there are Xcode tools for creating particle emitter effects and texture atlases. You can use the Xcode tools to manage app assets and update Sprite Kit scenes quickly.

For more information about how to use Sprite Kit, see Sprite Kit Programming Guide. For an example of how to use Sprite Kit to build a working app, see code:Explained Adventure.