Core Audio provides software interfaces for implementing audio features in applications you create for iOS and OS X. Under the hood, it handles all aspects of audio on each of these platforms. In iOS, Core Audio capabilities include recording, playback, sound effects, positioning, format conversion, and file stream parsing, as well as:
A built-in equalizer and mixer that you can use in your applications
Automatic access to audio input and output hardware
APIs to let you manage the audio aspects of your application in the context of a device that can take phone calls
Optimizations to extend battery life without impacting audio quality
On the Mac, Core Audio encompasses recording, editing, playback, compression and decompression, MIDI, signal processing, file stream parsing, and audio synthesis. You can use it to write standalone applications or modular effects and codec plug-ins that work with existing products.
Core Audio combines C and Objective-C programming interfaces with tight system integration, resulting in a flexible programming environment that maintains low latency through the signal chain.
Core Audio is available in all versions of OS X, although older versions may not contain some features described here. Core Audio is available in iOS starting with version 2.0. This document describes Core Audio features available as of iOS 2.2 and OS X v10.5.
Core Audio Overview is for all developers interested in creating audio software. Before reading this document, you should have a basic knowledge of general audio, digital audio, and MIDI terminology. Familiarity with object-oriented programming concepts and with Apple’s development environment, Xcode, is also helpful. If you are developing for iOS-based devices, you should be familiar with Cocoa Touch development as introduced in iOS App Programming Guide.
Organization of This Document
This document is organized into the following chapters:
“What Is Core Audio?” describes the features of Core Audio and what you can use it for.
“Core Audio Essentials” describes the architecture of Core Audio, introduces you to its programming patterns and idioms, and shows you the basics of how to use it in your applications.
“Common Tasks in OS X” outlines how you can use Core Audio to accomplish several audio tasks in OS X.
This document also contains four appendixes:
“Core Audio Frameworks” lists the frameworks and headers that define Core Audio.
“Core Audio Services” provides an alternate view of Core Audio, listing the services available in iOS only, in OS X only, and on both platforms.
“System-Supplied Audio Units in OS X” lists the audio units that ship in OS X v10.5.
“Supported Audio File and Data Formats in OS X” lists the audio file and data formats that Core Audio supports in OS X v10.5.
For more detailed information about audio and Core Audio, see the following resources:
AVAudioPlayer Class Reference, which describes a simple Objective-C interface for audio playback in iOS applications.
Audio Session Programming Guide, which explains how to specify important aspects of audio behavior for iOS applications.
Audio Queue Services Programming Guide, which explains how to implement recording and playback in your application.
Core Audio Data Types Reference, which describes the data types used throughout Core Audio.
Audio File Stream Services Reference, which describes the interfaces you use for working with streamed audio.
Audio Unit Programming Guide, which contains detailed information about creating audio units for OS X.
Core Audio Glossary, which defines terms used throughout the Core Audio documentation suite.
Apple Core Audio Format Specification 1.0, which describes Apple’s universal audio container format, the Core Audio File (CAF) format.
The Core Audio mailing list: http://lists.apple.com/mailman/listinfo/coreaudio-api
The OS X audio developer site: http://developer.apple.com/audio/
The Core Audio SDK (software development kit), available at http://developer.apple.com/sdk/
© 2008 Apple Inc. All Rights Reserved. (Last updated: 2008-11-13)