iOS handles audio behavior at the app, inter-app, and device levels through audio sessions. Using the
AVAudioSession API, you resolve questions such as:
Should your app’s audio be silenced by the Ring/Silent switch? Yes, if audio is not essential to your app. An example is an app that lets users take notes in a meeting without its sound disturbing others. But a pronouncing dictionary should ignore the Ring/Silent switch and always play—the central purpose of the app requires sound.
Should music continue when your audio starts? Yes, if your app is a virtual piano that lets users play along to songs from their music libraries. On the other hand, music should stop when your app starts if your app plays streaming Internet radio.
Users may plug in or unplug headsets, phone calls may arrive, and alarms may sound. Indeed, the audio environment on an iOS device is quite complex. iOS does the heavy lifting, while you employ audio session APIs to specify configuration and to respond gracefully to system requests, using very little code.
At a Glance
AVAudioSession gives you control over your app’s audio behavior. You can:
Select the appropriate input and output routes for your app
Determine how your app integrates audio from other apps
Handle interruptions from other apps
Automatically configure audio for the type of app you are creating
An Audio Session Encapsulates a Set of Behaviors
An audio session is the intermediary between your app and iOS used to configure your app’s audio behavior. Upon launch, your app automatically gets a singleton audio session.
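A minimal sketch in Swift of retrieving and activating that singleton (the guide’s reference documentation describes the Objective-C interface; the Swift API shown here is the equivalent):

```swift
import AVFoundation

// Each app gets exactly one audio session; retrieve it rather than creating one.
let session = AVAudioSession.sharedInstance()

do {
    // Activate the session when your app is ready to begin using audio.
    try session.setActive(true)
} catch {
    print("Failed to activate audio session: \(error)")
}
```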
Categories Express Audio Roles
The primary mechanism for expressing audio behaviors is the audio session category. By setting the category, you indicate whether your app uses input or output routes, whether you want music to continue playing along with your audio, and so on. The behavior you specify should meet user expectations as described in “Sound” in iOS Human Interface Guidelines.
Seven audio session categories, along with a set of override and modifier switches, let you customize audio behavior according to your app’s personality or role. Various categories support playback, recording, and playback along with recording. When the system knows your app’s audio role, it affords you appropriate access to hardware resources. The system also ensures that other audio on the device behaves in a way that works for your app; for example, if you need the Music app to be interrupted, it is.
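As a sketch, a playback-only app such as the pronouncing dictionary described above might set the playback category, which keeps audio playing with the Ring/Silent switch set to silent and, by default, silences other audio:

```swift
import AVFoundation

let session = AVAudioSession.sharedInstance()
do {
    // The playback category: output-only audio that ignores the
    // Ring/Silent switch and interrupts nonmixable audio from other apps.
    try session.setCategory(.playback)
    try session.setActive(true)
} catch {
    print("Failed to set audio session category: \(error)")
}
```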
Modes Customize Categories
Users expect certain behaviors from certain categories of apps. Modes specialize the behavior of a given category. For example, when an app uses the Video Recording mode, the system may choose a different built-in microphone than it would if it was using the default mode. The system may also engage microphone signal processing that is tuned for video recording use cases.
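A sketch of pairing a category with a mode, assuming a camera-style app that records both audio and video:

```swift
import AVFoundation

let session = AVAudioSession.sharedInstance()
do {
    // Combining the play-and-record category with the video-recording mode
    // lets the system choose a microphone and signal processing
    // appropriate for video capture.
    try session.setCategory(.playAndRecord, mode: .videoRecording)
} catch {
    print("Failed to configure audio session: \(error)")
}
```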
Notifications Support Interruption Handling
An audio interruption is the deactivation of your app’s audio session—which immediately stops your audio. Interruptions occur when a competing audio session from an app activates and that session is not categorized by the system to mix with yours. After your session goes inactive, the system sends a “you were interrupted” message which you can respond to by saving state, updating the user interface, and so on.
To handle interruptions, register for the AVAudioSessionInterruptionNotification notification provided by AVAudioSession. Write your interruption handlers to ensure the minimum possible disruption, and the most graceful possible recovery, from the perspective of the user.
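A sketch of such a handler in Swift, where the notification’s userInfo dictionary reports whether the interruption began or ended (the notification is named AVAudioSession.interruptionNotification in Swift):

```swift
import AVFoundation

NotificationCenter.default.addObserver(
    forName: AVAudioSession.interruptionNotification,
    object: AVAudioSession.sharedInstance(),
    queue: .main
) { notification in
    guard let rawType = notification.userInfo?[AVAudioSessionInterruptionTypeKey] as? UInt,
          let type = AVAudioSession.InterruptionType(rawValue: rawType) else { return }
    switch type {
    case .began:
        // Your audio has already stopped; save state and update the UI here.
        break
    case .ended:
        // Reactivate the session and resume playback if appropriate.
        break
    @unknown default:
        break
    }
}
```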
Notifications Support Audio Route Change Handling
Users have particular expectations when they initiate an audio route change by docking or undocking a device, or by plugging in or unplugging a headset. “Sound” in iOS Human Interface Guidelines describes these expectations and provides guidelines on how to meet them. Handle route changes by registering for the AVAudioSessionRouteChangeNotification notification.
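A sketch of a route-change observer that honors a common expectation, pausing when a headset is unplugged (pausePlayback is a hypothetical method standing in for your player code):

```swift
import AVFoundation

NotificationCenter.default.addObserver(
    forName: AVAudioSession.routeChangeNotification,
    object: AVAudioSession.sharedInstance(),
    queue: .main
) { notification in
    guard let rawReason = notification.userInfo?[AVAudioSessionRouteChangeReasonKey] as? UInt,
          let reason = AVAudioSession.RouteChangeReason(rawValue: rawReason) else { return }
    if reason == .oldDeviceUnavailable {
        // The previous output (for example, a headset) is gone; pause rather
        // than continue playing through the built-in speaker.
        // pausePlayback()
    }
}
```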
Categories Support Advanced Features
You can fine-tune an audio session category in a variety of ways. Depending on the category, you can:
Allow other audio (such as from the Music app) to mix with yours when a category normally disallows it.
Change the audio output route from the receiver to the speaker.
Allow Bluetooth audio input.
Specify that other audio should reduce in volume (“duck”) when your audio plays.
Optimize your app for device hardware at runtime. Your code adapts to the device it’s running on and to changes initiated by the user (such as by plugging in a headset) as your app runs.
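A sketch of several of these refinements expressed as category options and an output override, assuming the categories shown support them:

```swift
import AVFoundation

let session = AVAudioSession.sharedInstance()
do {
    // Let other audio (such as the Music app) mix with yours,
    // ducking its volume while your audio plays.
    try session.setCategory(.playback, options: [.mixWithOthers, .duckOthers])

    // With the play-and-record category, allow Bluetooth audio input
    // and redirect output from the receiver to the built-in speaker.
    try session.setCategory(.playAndRecord, options: [.allowBluetooth])
    try session.overrideOutputAudioPort(.speaker)
} catch {
    print("Failed to configure audio session options: \(error)")
}
```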
Prerequisites
Be familiar with Cocoa Touch development as introduced in iOS App Programming Guide, and with the basics of Core Audio as described in that document and in Core Audio Overview. Because audio sessions bear on practical end-user scenarios, you should also be familiar with iOS devices and with iOS Human Interface Guidelines, especially the “Sound” section.
You may find the following resources helpful:
AVAudioSession Class Reference, which describes the Objective-C interface for configuring and using this technology.
AddMusic, a sample code project that demonstrates use of an audio session object in the context of a playback app. This sample also demonstrates coordination between app audio and Music app audio.