Introduction
Whether your multimedia needs are basic or advanced, OS X brings world-class support for adding professional-grade audio and video features to your application.
For audio recording, playback, and synchronization, Audio Queue Services offers a flexible, high-level API. For even more control, look at Extended Audio File Services and audio units—OS X’s audio plug-in architecture.
Audio units provide digital signal processing for filtering, effects, format conversion, I/O, and MIDI-based music synthesis. Use one of the many system-supplied audio units or develop your own. Other OS X interfaces support audio streaming, surround sound, custom codec development, hardware access for driver development and disc recording, and MIDI control.
If your application needs to play video, including content purchased through iTunes, you can take advantage of the new, lightweight, and more efficient media playback capability provided in QuickTime X. You gain access to this capability through the QTKit framework, a feature-rich Objective-C API for manipulating and rendering time-based media such as movies, audio files, animations, and streaming content.
Start Here
Before you embark on adding OS X audio technologies to your application, become familiar with Core Audio’s features and architecture by reading Core Audio Overview. Learn about OS X video support by reading QTKit Application Tutorial.
Want to get familiar with the fundamentals?
Audio Queue Services Programming Guide explains how to add audio recording, playback, and synchronization to your application. Audio Queue Services can work with any OS X audio format.
Sound Programming Topics for Cocoa describes a simple playback interface suitable for playing uncompressed audio.
Audio Unit Programming Guide explains how to create audio processing plug-ins.
Getting Started with Hardware and Drivers provides orientation for supporting or developing audio peripheral devices.
You can also review presentations from past Worldwide Developers Conference sessions on ADC on iTunes, including Understanding the Core Audio Architecture.
Prefer to learn by example?
For audio:
AudioQueueTools demonstrates how to record to an audio file and play it back using Audio Queue Services.
RecordAudioToFile shows how to perform low-latency audio recording using the AUHAL audio unit and Extended Audio File Services.
PlayFile demonstrates audio file playback using Audio File Services, the Audio File Player audio unit, and the Default Output audio unit.
PlaySoftMIDI shows how to play back a MIDI file using system-supplied audio units.
StarterAudioUnitExample is a simple effect audio unit that corresponds to the tutorial in Audio Unit Programming Guide.
Audio Toolbox Convert File provides examples of audio format conversion using Extended Audio File Services and Audio Converter Services.
OpenALExample demonstrates how to bind OpenAL audio sources to OpenGL objects to create an immersive audio environment.
For video, QTKit Application Tutorial explains how to build three different Cocoa applications for playing, editing, and recording audio and video media:
MyMediaPlayer demonstrates how to build a media player using Cocoa bindings. You can extend the media player by adding new capabilities for movie editing and custom movie playback.
MyMediaRecorder builds an application for capturing and recording audio and video, and then outputting that media to QuickTime movies.
StopMotion lets you construct a stop-motion application to capture single frames of video and assemble those frames into an animated QuickTime movie for playback.
Go In Depth
To perform spatial manipulation of sound in your application, especially if you are a game developer, use the OS X OpenAL framework. Learn more about OpenAL on the OpenAL website. The AU Lab application (Apple’s reference audio unit host, included with Xcode Tools) supports working with surround sound.
To add recording capability to your application, use Audio Queue Services. Read Audio Queue Services Programming Guide to learn how to record linear PCM or compressed audio.
To parse an audio file stream, use Audio File Stream Services, part of the Audio Toolbox framework. Read Audio File Stream Services Reference and Audio File Services Reference, which describe the C interfaces you need.
To support MIDI interfacing, play MIDI data from a file, or record incoming MIDI data, read Core MIDI Services and MIDI Server Services in Core Audio Overview. Look at MIDI File Formats to learn how to use MIDI data in QuickTime.
If you are a hardware vendor, you may need to supply drivers to allow your product to interact with Mac apps. Core Audio supports driver development. Consult Audio Device Driver Programming Guide.
If your application offers disc recording capability, refer to Disc Recording Framework Reference and Disc Recording UI Framework Reference for comprehensive descriptions of these interfaces.
Ready for More?
The OS X Reference Library contains many additional resources to make your job easier. Browse by topic, framework, or resource type (such as guides or sample code). Set filters to focus on what you are looking for.
Copyright © 2004, 2009 Apple Inc. All Rights Reserved. Updated: 2009-05-27