Audio & Video Starting Point
Multimedia technologies in iOS let you access the sophisticated audio and video capabilities of iPhone, iPad, and iPod touch. Specialized classes let you easily add basic features such as iPod library playback and movie capture, while rich multimedia APIs support advanced solutions.
Choose the right technology for your needs:
To play the audio items in a user’s iPod library, or to play local or streamed movies, use the Media Player framework. Classes in this framework automatically support sending audio and video to AirPlay devices such as Apple TV.
To easily add picture or movie capture to your app, employ dedicated classes and functions from the UIKit framework.
For basic audio recording and playback, including stereo panning, synchronization, and metering, use the audio classes from the AV Foundation framework.
To add high-performance positional audio playback to your OpenGL-based game or other app, take advantage of the open-source OpenAL (Open Audio Library) API.
To work directly with audio and video data—for high performance or advanced solutions such as VoIP, streaming, virtual music instruments, or MIDI (Musical Instrument Digital Interface)—use the AV Foundation framework, the Assets Library framework, the various Core Audio frameworks (including the Core Audio, Audio Toolbox, and Audio Unit frameworks), and the Core MIDI framework.
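As a concrete illustration of the first option above, queueing and playing songs from the user's iPod library with the Media Player framework takes only a few lines. This is a minimal sketch (error handling and UI omitted); the classes and methods shown are the framework's documented API:

```objc
#import <MediaPlayer/MediaPlayer.h>

// Use the shared iPod music player, which plays using the iPod app's state.
MPMusicPlayerController *player = [MPMusicPlayerController iPodMusicPlayer];

// Queue every song in the user's iPod library and start playback.
MPMediaQuery *everySong = [MPMediaQuery songsQuery];
[player setQueueWithQuery:everySong];
[player play];
```

Because the iPod music player shares state with the built-in iPod app, playback continues after your app exits; use applicationMusicPlayer instead for playback that is local to your app.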
Familiarize yourself with iOS audio development by checking out these resources:
Read “Using Audio” in Multimedia Programming Guide to learn about audio development for iOS devices. Make sure to understand the importance of audio session objects as introduced in “The Basics: Audio Codecs, Supported Audio Formats, and Audio Sessions” in Multimedia Programming Guide.
View the avTouch sample code project, which shows how to play sounds with the AVAudioPlayer class; the SpeakHere project, which demonstrates basic recording and playback; and the Audio UI Sounds (SysSound) project, which demonstrates how to invoke vibration and play alerts and user-interface sound effects.

Download and explore the AddMusic sample code project to see a simple demonstration of how to add iPod library playback to your app.
Read MPVolumeView Class Reference to learn how to quickly add AirPlay capability to your app.
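Adding AirPlay output selection can be as simple as placing an MPVolumeView in your interface; when AirPlay routes are available, the view displays a route button alongside the volume slider. A minimal sketch (the containing view and the frame values are assumptions for illustration; memory management shown pre-ARC):

```objc
#import <MediaPlayer/MediaPlayer.h>

// Add a system volume slider, including an AirPlay route button when
// routes are available. 'containerView' is a placeholder for a view
// in your own hierarchy.
MPVolumeView *volumeView = [[[MPVolumeView alloc]
    initWithFrame:CGRectMake(20.0, 20.0, 280.0, 20.0)] autorelease];
[containerView addSubview:volumeView];
```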
Get up and running with iOS video development with these resources:
View the MoviePlayer sample code project, which demonstrates the powerful MPMoviePlayerController class for playing local or streamed video content; and the PhotoPicker project, which demonstrates simple movie and picture capture using the UIKit framework.
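As the PhotoPicker sample shows, basic movie and picture capture needs only the UIKit image picker. A minimal sketch, assuming a view controller (self) that adopts the UIImagePickerControllerDelegate and UINavigationControllerDelegate protocols:

```objc
#import <UIKit/UIKit.h>

// Present the camera interface if the device has a camera.
if ([UIImagePickerController isSourceTypeAvailable:
        UIImagePickerControllerSourceTypeCamera]) {
    UIImagePickerController *picker =
        [[[UIImagePickerController alloc] init] autorelease];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    // Allow both still images and movies, as the device supports them.
    picker.mediaTypes = [UIImagePickerController
        availableMediaTypesForSourceType:UIImagePickerControllerSourceTypeCamera];
    picker.delegate = self;
    [self presentModalViewController:picker animated:YES];
}
```

Your delegate receives the captured image or movie URL in the imagePickerController:didFinishPickingMediaWithInfo: callback.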
Gain a complete understanding of audio session objects, and how they determine your app’s audio behavior, by reading Audio Session Programming Guide. Also be sure to read “Sound” in iOS Human Interface Guidelines, which explains how your app should handle sound to meet user expectations.
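Configuring an audio session typically means choosing a category and activating the session before playback begins. A minimal sketch using the AV Foundation session API (error handling abbreviated):

```objc
#import <AVFoundation/AVFoundation.h>

// Declare that this app plays audio, so playback continues even with
// the Ring/Silent switch set to silent.
AVAudioSession *session = [AVAudioSession sharedInstance];
NSError *error = nil;
[session setCategory:AVAudioSessionCategoryPlayback error:&error];
[session setActive:YES error:&error];
if (error) {
    NSLog(@"Audio session setup failed: %@", error);
}
```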
No matter which iOS audio technologies you employ, users expect to be able to play and pause your app’s audio using the system transport controls in the multitasking UI. To learn how to support this feature, read “Remote Control of Multimedia” in Event Handling Guide for iOS.
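Supporting the transport controls means registering for remote control events and handling them in a responder object. A minimal sketch in a view controller (the play/pause action is a hypothetical method on your own player controller):

```objc
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    // Begin receiving events from the multitasking UI transport controls.
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];
}

- (BOOL)canBecomeFirstResponder {
    return YES; // Required to receive remote control events.
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    if (event.type == UIEventTypeRemoteControl &&
        event.subtype == UIEventSubtypeRemoteControlTogglePlayPause) {
        [self togglePlayPause]; // Hypothetical method that starts or pauses audio.
    }
}
```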
Take full advantage of the iPod library by reading iPod Library Access Programming Guide along with Media Player Framework Reference.
To learn how to play audio using OpenAL, view the oalTouch project. The website openal.org hosts documentation for the open-source OpenAL API.
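Getting a positional source playing with OpenAL follows the same pattern on iOS as on other platforms: open a device, create and activate a context, then generate and position sources. A minimal sketch (buffer creation and audio data loading are omitted):

```objc
#import <OpenAL/al.h>
#import <OpenAL/alc.h>

// Open the default output device and make a rendering context current.
ALCdevice *device = alcOpenDevice(NULL);
ALCcontext *context = alcCreateContext(device, NULL);
alcMakeContextCurrent(context);

// Create a source and place it one unit to the listener's right.
ALuint source;
alGenSources(1, &source);
alSource3f(source, AL_POSITION, 1.0f, 0.0f, 0.0f);
// Attach a filled buffer with alSourcei(source, AL_BUFFER, buffer)
// and start playback with alSourcePlay(source).
```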
To play streamed audio content, such as from a network connection, use an AVPlayer object as described in “Playback” in AV Foundation Programming Guide. You can also play certain Internet audio files by using the MPMoviePlayerController class; for sample code that shows how, see MoviePlayer.
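Playing a network stream with AVPlayer requires only a URL. A minimal sketch (the stream URL is a placeholder; in a real app, observe the player item's status before starting playback):

```objc
#import <AVFoundation/AVFoundation.h>

// Create a player for an HTTP Live Streaming URL and start playback.
NSURL *streamURL = [NSURL URLWithString:@"http://example.com/stream.m3u8"];
AVPlayer *player = [[AVPlayer alloc] initWithURL:streamURL];
[player play];
```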
To play audio files with stereo panning, synchronization, and metering, use the AVAudioPlayer class. To record audio, use the AVAudioRecorder class. Apple recommends these classes for audio playback and recording when you do not need direct access to audio data.
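A minimal AVAudioPlayer sketch showing panning and metering (the sound file name is an assumption; error handling abbreviated):

```objc
#import <AVFoundation/AVFoundation.h>

// Create a player for a sound file bundled with the app.
NSURL *soundURL = [[NSBundle mainBundle] URLForResource:@"song"
                                          withExtension:@"m4a"];
NSError *error = nil;
AVAudioPlayer *player =
    [[AVAudioPlayer alloc] initWithContentsOfURL:soundURL error:&error];

player.pan = -0.5f;           // Pan halfway left (-1.0 left ... 1.0 right).
player.meteringEnabled = YES; // Enable audio-level metering.
[player prepareToPlay];
[player play];

// Later, e.g. from a timer that drives a level meter:
[player updateMeters];
float decibels = [player averagePowerForChannel:0];
```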
To get started with handling audio data directly, read Core Audio Essentials in Core Audio Overview to learn about the architecture, programming conventions, and use of Core Audio.
If you are creating a VoIP (voice over Internet protocol) app or a virtual music instrument, you need the highest audio performance available in iOS. The solution to use is audio units, the iOS audio plug-in technology. Audio units also provide advanced capabilities including mixing and equalization. Read Audio Unit Hosting Guide for iOS to learn how to use audio units. View the Audio Mixer (MixerHost) and iPhoneMixerEQGraphTest sample code projects.
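As described in Audio Unit Hosting Guide for iOS, you obtain an audio unit at runtime by describing the unit you want and asking the system for a matching component. A minimal sketch that instantiates the Remote I/O unit, the unit that connects to input and output hardware:

```objc
#import <AudioUnit/AudioUnit.h>

// Describe the Remote I/O audio unit.
AudioComponentDescription ioUnitDescription;
ioUnitDescription.componentType         = kAudioUnitType_Output;
ioUnitDescription.componentSubType      = kAudioUnitSubType_RemoteIO;
ioUnitDescription.componentManufacturer = kAudioUnitManufacturer_Apple;
ioUnitDescription.componentFlags        = 0;
ioUnitDescription.componentFlagsMask    = 0;

// Find the matching component and create an instance of it.
AudioComponent ioComponent = AudioComponentFindNext(NULL, &ioUnitDescription);
AudioUnit ioUnit;
AudioComponentInstanceNew(ioComponent, &ioUnit);
```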
To create a MIDI app for connecting hardware keyboards or synthesizers to an iOS device, refer to Core MIDI Framework Reference and look at the MFi program.
To progress beyond the capabilities of the Media Player framework and the UIImagePickerController class, read AV Foundation Programming Guide. AV Foundation, with support from the Assets Library and Core Media frameworks, provides tools for advanced video solutions including track-based editing, transcoding, and direct access to data from the camera and microphone.
View the AVCam sample code project to see how to capture still images and movies using the AV Foundation framework. AVCam demonstrates the use of several important AV Foundation classes. View the AVPlayerDemo project to see how to play movies from the iPod library.
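The capture pipeline at the heart of AVCam can be sketched in a few lines: a capture session connects a device input to one or more outputs. A minimal sketch for still-image capture (error handling and session configuration omitted):

```objc
#import <AVFoundation/AVFoundation.h>

AVCaptureSession *session = [[AVCaptureSession alloc] init];

// Connect the default camera as the session's input.
AVCaptureDevice *camera =
    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input =
    [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if ([session canAddInput:input]) {
    [session addInput:input];
}

// Add a still-image output and start the flow of data.
AVCaptureStillImageOutput *stillOutput =
    [[AVCaptureStillImageOutput alloc] init];
if ([session canAddOutput:stillOutput]) {
    [session addOutput:stillOutput];
}
[session startRunning];
```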