This article summarizes the key developer-related features introduced in iOS 5.0. This version of the operating system runs on current iOS-based devices. In addition to describing the key new features, this article lists the documents that describe those features in more detail.
For late-breaking news and information about known issues, see iOS 5.0 Release Notes. For the complete list of new APIs added in iOS 5.0, see iOS 5.0 API Diffs.
iCloud Storage APIs
The iCloud storage APIs let your app write user documents and data to a central location and access those items from all of a user’s computers and iOS devices. Making a user’s documents ubiquitous using iCloud means that a user can view or edit those documents from any device without having to sync or transfer files explicitly. Storing documents in a user’s iCloud account also provides a layer of security for that user. Even if a user loses a device, the documents on that device are not lost if they are in iCloud storage.
There are two ways that apps can take advantage of iCloud storage, each of which has a different intended usage:
iCloud document storage—Use this feature to store user documents and data in the user’s iCloud account.
iCloud key-value data storage—Use this feature to share small amounts of data among instances of your app.
Most apps will use iCloud document storage to share documents from a user’s iCloud account. This is the feature that users think of when they think of iCloud storage. A user cares about whether documents are shared across devices and can see and manage those documents from a given device. In contrast, the iCloud key-value data store is not something a user would see. It is a way for your app to share small amounts of data (up to a per-app total of 1 MB and a maximum of 1,024 keys) with other instances of itself. Apps can use this feature to store important state information. A magazine app might save the issue and page that the user read last, while a stocks app might store the stock symbols the user is tracking.
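As a sketch of the magazine example, an app might record the last-read page in the iCloud key-value store. The key name used here is an assumption, not part of the API:

```objectivec
#import <Foundation/Foundation.h>

// Persist a small piece of state in the user's iCloud key-value store.
void SaveLastReadPage(NSInteger page) {
    NSUbiquitousKeyValueStore *store = [NSUbiquitousKeyValueStore defaultStore];
    [store setLongLong:page forKey:@"lastReadPage"];  // key name is an assumption
    [store synchronize];  // request a sync with iCloud at an appropriate time
}

// Read the value back on any of the user's devices.
NSInteger LastReadPage(void) {
    return (NSInteger)[[NSUbiquitousKeyValueStore defaultStore]
                           longLongForKey:@"lastReadPage"];
}
```

Because the store is limited to 1 MB per app, it is suited only to this kind of lightweight state, not to document content.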
For information on how to use iCloud document and key-value data storage, see iCloud Design Guide.
iCloud Backup

Users can now opt to have their apps and app data backed up directly to their iCloud account, making it easier to restore apps to their most recent state. Having data backed up in iCloud makes it easy for a user to reinstall that data to a new or existing iOS device. However, because the amount of space in a user’s iCloud account is limited, apps must be even more selective about where they store files.

The placement of files in your app’s home directory determines what gets backed up and what does not. Anything that would be backed up to a user’s computer is also backed up wirelessly to iCloud; this includes everything in the Documents directory and most (but not all) of your app’s Library directory. To minimize the amount of data stored in the user’s iCloud account, developers are encouraged to put more files in the Library/Caches directory, especially if those files can be easily re-created or obtained in another way.
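For example, a brief sketch of locating the Library/Caches directory with Foundation, which is where re-creatable files should go:

```objectivec
#import <Foundation/Foundation.h>

// Return the app's Library/Caches directory, which is excluded from iCloud backup.
NSString *CachesDirectory(void) {
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSCachesDirectory,
                                                         NSUserDomainMask, YES);
    return [paths objectAtIndex:0];
}
```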
For information on which directories are backed up, and for information about iCloud document storage, see App Programming Guide for iOS.
Automatic Reference Counting
Automatic Reference Counting (ARC) is a compiler-level feature that simplifies the process of managing the lifetimes of Objective-C objects. Instead of you having to remember when to retain or release an object, ARC evaluates the lifetime requirements of your objects and automatically inserts the appropriate method calls at compile time.
To be able to deliver these features, ARC imposes some restrictions—primarily enforcing some best practices and disallowing some other practices:

Do not call the retain, release, or autorelease methods. In addition, you cannot implement custom retain or release methods. Because you do not call the release method, there is often no need to implement a custom dealloc method—the compiler synthesizes all that is required to relinquish ownership of instance variables. You can provide a custom implementation of dealloc if you need to manage other resources.

Do not store object pointers in C structures. Store object pointers in other objects instead of in C structures.

Do not directly cast between object and nonobject types (for example, between id and void *). You must use special functions or casts that tell the compiler about an object’s lifetime. You use these to cast between Objective-C objects and Core Foundation objects.

You cannot use NSAutoreleasePool objects. Instead, you must use the new @autoreleasepool keyword to mark the start of an autorelease block. The contents of the block are enclosed by curly braces, as shown in the following example:

@autoreleasepool {
    // Your code here
}
ARC encourages you to think in terms of object graphs, and the relationships between objects, rather than in terms of retain and release. For this reason, ARC introduces new lifetime qualifiers for objects, including zeroing weak references. The value of a zeroing weak reference is automatically set to nil if the object to which it points is deallocated. There are qualifiers for variables, and new weak and strong declared property attributes, as illustrated in the following examples:
// The following declaration is a synonym for: @property(retain) MyClass *myObject;
@property(strong) MyClass *myObject;
// The following declaration is similar to "@property(assign) MyOtherClass *delegate;"
// except that if the MyOtherClass instance is deallocated,
// the property value is set to nil instead of remaining as a dangling pointer
@property(weak) MyOtherClass *delegate;
Xcode provides migration tools to help convert existing projects to use ARC. For more information about how to perform this migration, see What’s New in Xcode. For more information about ARC itself, see Transitioning to ARC Release Notes.
Storyboards

Storyboards are the new way to define your app’s user interface. In the past, you used nib files to define your user interface one view controller at a time. A storyboard file captures your entire user interface in one place and lets you define both the individual view controllers and the transitions between those view controllers. As a result, storyboards capture the flow of your overall user interface in addition to the content you present.
If you are creating new apps, the Xcode templates come preconfigured to use storyboards. For other apps, the process for using storyboards is as follows:
Configure your app’s Info.plist file to use storyboards:

Add the UIMainStoryboardFile key and set its value to the name of your storyboard file.

Remove the existing NSMainNibFile key. (Storyboards replace the main nib file.)
Create and configure the storyboard file in Xcode; see Creating Storyboard Files.
Update your view controllers to handle storyboard transitions; see Preparing to Transition to a New View Controller.
If you ever need to present a view controller manually (perhaps to support motion-related events), use the storyboard classes to retrieve and present the appropriate view controller; see Presenting Storyboard View Controllers Programmatically.
Apps can use a single storyboard file to store all of their view controllers and views. At build time, Interface Builder takes the contents of the storyboard file and divides it up into discrete pieces that can be loaded individually for better performance. Your app never needs to manipulate these pieces directly, though. All you must do is declare the main storyboard in your app’s
Info.plist file. UIKit handles the rest.
Creating Storyboard Files
You use Interface Builder to create storyboard files for your app. Most apps need only one storyboard file, but you can create multiple storyboard files if you want. Every storyboard file has a view controller known as the initial view controller. This view controller represents the entry point into the storyboard. For example, in your app’s main storyboard file, the initial view controller would be the first view controller presented by your app.
Each view controller in a storyboard file manages a single scene. For iPhone apps, a scene manages one screen’s worth of content, but for iPad apps the content from multiple scenes can be on screen simultaneously. To add new scenes to your storyboard file, drag a view controller from the library to the storyboard canvas. You can then add controls and views to the view controller’s view just as you would for a nib file. And as before, you can configure outlets and actions between your view controller and its views.
When you want to transition from one view controller to another, Control-click a button, table view cell, or other trigger object in one view controller, and drag to the view controller for a different scene. Dragging between view controllers creates a segue, which appears in Interface Builder as a configurable object. Segues support all of the same types of transitions available in UIKit, such as modal transitions and navigation transitions. You can also define custom transitions and transitions that replace one view controller with another.
For more information about using Interface Builder to configure your storyboard files, see What’s New in Xcode.
Preparing to Transition to a New View Controller
Whenever a user triggers a segue in the current scene, the storyboard runtime calls the
prepareForSegue:sender: method of the current view controller. This method gives the current view controller an opportunity to pass any needed data to the view controller that is about to be displayed. When implementing your view controller classes, you should override this method and use it to handle these transitions.
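For illustration, a minimal sketch of such an override is shown below. The "ShowDetail" identifier, the DetailViewController class, and its item and selectedItem properties are hypothetical names for this example, not part of UIKit:

```objectivec
// In a UIViewController subclass. The segue identifier, DetailViewController,
// and the item/selectedItem properties are hypothetical.
- (void)prepareForSegue:(UIStoryboardSegue *)segue sender:(id)sender
{
    if ([[segue identifier] isEqualToString:@"ShowDetail"]) {
        DetailViewController *detail = [segue destinationViewController];
        detail.item = self.selectedItem;  // hand the model object to the new scene
    }
}
```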
For more information about implementing the methods of the
UIViewController class, see UIViewController Class Reference.
Presenting Storyboard View Controllers Programmatically
Although the storyboard runtime usually handles transitions between view controllers, you can also trigger segues programmatically from your code. You might do so when it is not possible to set up the segue entirely in Interface Builder—for example, when doing so involves using accelerometer events to trigger the transition. There are several options for transitioning to a new view controller:
If a storyboard file contains an existing segue between the current view controller and the destination view controller (perhaps triggered by some other control in the view controller), you can trigger that segue programmatically using the performSegueWithIdentifier:sender: method of UIViewController.
If there is no segue between the view controllers but the destination view controller is defined in the storyboard file, first load the view controller programmatically using the instantiateViewControllerWithIdentifier: method of UIStoryboard. Then present the view controller using any of the existing programmatic means, such as by pushing it on a navigation stack.
If the destination view controller is not in the storyboard file, create it programmatically and present it as described in View Controller Programming Guide for iOS.
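As a sketch of the second option, assuming a storyboard named MainStoryboard that contains a view controller with the identifier "DetailViewController" (both names are assumptions), and code running inside a view controller on a navigation stack:

```objectivec
// Load a view controller from a storyboard and push it onto the navigation stack.
// The storyboard name and view controller identifier are assumptions.
UIStoryboard *storyboard = [UIStoryboard storyboardWithName:@"MainStoryboard"
                                                     bundle:nil];
UIViewController *detail =
    [storyboard instantiateViewControllerWithIdentifier:@"DetailViewController"];
[self.navigationController pushViewController:detail animated:YES];
```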
Newsstand Apps

Newsstand provides a central place for users to read magazines and newspapers. Publishers who want to deliver their magazine and newspaper content through Newsstand can create their own iOS apps using the Newsstand Kit framework (NewsstandKit.framework), although doing so is not required. A big advantage of the Newsstand Kit framework, however, is that you can use it to initiate background downloads of new magazine and newspaper issues. After you start a download, the system handles the download operation and notifies your app when the new content is available.

Unlike other iOS apps, Newsstand apps appear only in Newsstand itself, not on a user’s Home screen. And instead of displaying an app icon, a Newsstand app typically displays the cover of its most recent issue, with some additional adornments provided by Newsstand. When the user taps that cover art, your app launches normally to present the current issue or any back issues that were downloaded and are still available.
Creating an app that uses Newsstand Kit requires some interplay between the actual app and the content servers that you manage. Your servers are responsible for notifying the app when new content is available, typically using a push notification. If your Newsstand app includes the
UIBackgroundModes key with the
newsstand-content value in its
Info.plist file, your Newsstand app is launched in the background so that it can start downloading the latest issue. The download process itself is managed by the system, which notifies your app when the content is fully downloaded and available.
When your server is alerting your app of a new issue, that server should include the
content-available property (with a value of
1) in the JSON payload. This property tells the system that it should launch your app so that it can begin downloading the new issue. Apps are launched and alerted to new issues once in a 24-hour period at most, although if your app is running when the notification arrives, it can begin downloading the content immediately.
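As a sketch, a minimal push payload that triggers a background launch might look like the following; any additional custom keys your server includes are up to your implementation:

```json
{
    "aps" : {
        "content-available" : 1
    }
}
```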
In addition to providing content for each new issue, your server should also provide cover art to present in Newsstand when that issue is available. This cover art is displayed in place of the app’s default Newsstand icon, which is specified using the Newsstand-specific icon entries in the CFBundleIcons key of your app’s Info.plist file. Cover art gives users a more visual cue that a new issue is available. Your app can also add a badge to new issues.
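For example, when a new issue finishes downloading, an app might update its Newsstand appearance as follows; coverImage is assumed to be a UIImage holding the new cover art:

```objectivec
// Update the Newsstand cover art and badge the app for the new issue.
UIApplication *app = [UIApplication sharedApplication];
[app setNewsstandIconImage:coverImage];  // coverImage is an assumption
app.applicationIconBadgeNumber = 1;      // mark the new issue as unread
```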
For information about the classes you use to manage Newsstand downloads, see NewsstandKit Framework Reference. For information about how to use push notifications to notify your apps, see Local and Remote Notification Programming Guide. For more information about setting up Newsstand subscriptions, see In-App Purchase Programming Guide.
AirPlay Improvements

AirPlay lets users stream audio and video from an iOS-based device to AirPlay-enabled devices such as televisions and audio systems. In iOS 5, developers can now use AirPlay to present app content on a nearby Apple TV 2. Users can now mirror the content of an iPad 2 to an Apple TV using AirPlay for any app. And developers who want to display different content (instead of mirroring) can assign a new window object to any UIScreen objects connected to an iPad 2 via AirPlay.
In addition, you can now take advantage of AirPlay in the following ways:
The Media Player framework includes support for displaying “Now Playing” information in several locations, including as part of the content delivered over AirPlay; see MPNowPlayingInfoCenter Class Reference.
The UIWebView class now supports the presentation of multimedia content over AirPlay. This support is enabled by default, but you can opt out of it if you want to.
For more information about delivering content over AirPlay, and the supported media formats, see AirPlay Overview.
In iOS 5.0, there are several new frameworks you should investigate.
GLKit Framework

The GLKit framework (GLKit.framework) contains a set of Objective-C based utility classes that simplify the effort required to create an OpenGL ES 2.0 app. GLKit provides support for four key areas of app development:
The GLKView and GLKViewController classes provide a standard implementation of an OpenGL ES–enabled view and associated rendering loop. The view manages the underlying framebuffer object on behalf of the app; your app just draws to it.
The GLKTextureLoader class provides image conversion and loading routines to your app, allowing it to automatically load texture images into your context. It can load textures synchronously or asynchronously. When loading textures asynchronously, your app provides a completion handler block to be called when the texture is loaded into your context.
The GLKit framework provides implementations of vectors, matrices, and quaternions, as well as a matrix stack, to provide the same functionality found in OpenGL ES 1.1.
The GLKBaseEffect, GLKSkyboxEffect, and GLKReflectionMapEffect classes provide configurable graphics shaders that implement commonly used graphics operations. In particular, the GLKBaseEffect class implements the lighting and material model found in the OpenGL ES 1.1 specification, simplifying the effort required to migrate an app from OpenGL ES 1.1 to OpenGL ES 2.0.
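As a brief sketch of the math utilities, the following builds a model-view-projection matrix using GLKit math functions:

```objectivec
#import <GLKit/GLKit.h>

// Build a model-view-projection matrix with the GLKit math utilities.
GLKMatrix4 projection = GLKMatrix4MakePerspective(GLKMathDegreesToRadians(65.0f),
                                                  4.0f / 3.0f,   // aspect ratio
                                                  0.1f, 100.0f); // near, far planes
GLKMatrix4 modelview = GLKMatrix4MakeTranslation(0.0f, 0.0f, -5.0f);
GLKMatrix4 mvp = GLKMatrix4Multiply(projection, modelview);
```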
For information about the classes of the GLKit framework, see GLKit Framework Reference.
Core Image Framework
The Core Image framework (
CoreImage.framework) provides a powerful set of built-in filters for manipulating video and still images. You can use the built-in filters for everything from simple operations (like touching up and correcting photos) to more advanced operations (like face and feature detection). The advantage of using these filters is that they operate in a nondestructive manner so that your original images are never changed directly. In addition, Core Image takes advantage of the available CPU and GPU processing power to ensure that operations are fast and efficient.
The CIImage class provides access to a standard set of filters that you can use to improve the quality of a photograph. To create other types of filters, you can create and configure a
CIFilter object for the appropriate filter type.
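A minimal sketch of configuring a built-in filter, assuming imageURL points to a source image:

```objectivec
#import <CoreImage/CoreImage.h>

// Apply a sepia-tone filter to an image without modifying the original.
CIImage *input = [CIImage imageWithContentsOfURL:imageURL];  // imageURL is an assumption
CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"];
[sepia setValue:input forKey:kCIInputImageKey];
[sepia setValue:[NSNumber numberWithFloat:0.8f] forKey:@"inputIntensity"];
CIImage *output = [sepia outputImage];  // render via a CIContext when needed
```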
For information about the classes and filters of the Core Image framework, see Core Image Reference Collection.
Twitter Framework

The Twitter framework (
Twitter.framework) provides support for sending Twitter requests on behalf of the user and for composing and sending tweets. For requests, the framework handles the user authentication part of the request for you and provides a template for creating the HTTP portion of the request. (Refer to the Twitter API for populating the content of the request.) The composition of tweets is accomplished using the
TWTweetComposeViewController class, which is a view controller that you present with your proposed tweet content. This class gives the user a chance to edit or modify the tweet before sending it.
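From within a view controller, presenting the composition sheet might look like this sketch; the initial tweet text is an arbitrary placeholder:

```objectivec
#import <Twitter/Twitter.h>

// Present the system tweet sheet if a Twitter account is configured.
if ([TWTweetComposeViewController canSendTweet]) {
    TWTweetComposeViewController *composer =
        [[TWTweetComposeViewController alloc] init];
    [composer setInitialText:@"Trying out the new Twitter framework"];
    [self presentViewController:composer animated:YES completion:NULL];
}
```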
Users control whether an app is allowed to communicate with Twitter on their behalf using Settings. The Twitter framework also works in conjunction with the Accounts framework (
Accounts.framework) to access the user’s account.
For information about the classes of the Twitter framework, see Twitter Framework Reference. For information about the Accounts framework, see Accounts Framework.
Accounts Framework

The Accounts framework (
Accounts.framework) provides a single sign-on model for certain user accounts. Single sign-on improves the user experience, because apps no longer need to prompt a user separately for login information related to an account. It also simplifies the development model for you by managing the account authorization process for your app. In iOS 5.0, apps can use this framework in conjunction with the Twitter framework to access a user’s Twitter account.
For more information about the classes of the Accounts framework, see Accounts Framework Reference.
Generic Security Services Framework
The Generic Security Services framework (
GSS.framework) provides a standard set of security-related services to iOS apps. The basic interfaces of this framework are specified in IETF RFC 2743 and RFC 4401. In addition to offering the standard interfaces, iOS includes some additions for managing credentials that are not specified by the standard but that are required by many apps.
For information about the interfaces of the GSS framework, see the header files.
Core Bluetooth Framework

The Core Bluetooth framework (CoreBluetooth.framework) allows developers to interact specifically with Bluetooth Low Energy (“LE”) accessories. The Objective-C interfaces of this framework allow you to scan for LE accessories, connect to and disconnect from accessories you find, read and write attributes within a service, register for service and attribute change notifications, and much more.
For more information about the interfaces of the Core Bluetooth framework, see the header files.
App Design-Level Improvements
The following sections describe new capabilities that you can incorporate into the model, view, and controller layers of your app.
Documents

Cocoa Touch now includes a
UIDocument class for managing the data associated with user documents. If you are implementing a document-based app, this class reduces the amount of work you must do to manage your document data. In addition to providing a container for all of your document-related data, the
UIDocument class provides built-in support for a number of features:
Asynchronous reading and writing of data on a background queue, allowing your app to remain responsive to users while reading and writing operations occur.
Support for coordinated reading and writing of documents, which is required for documents in iCloud storage.
Safe saving of document data by writing data first to a temporary file and then replacing the current document file with it.
Support for resolving conflicts between different versions of your document if a conflict occurs.
Automatic saving of document data at opportune moments.
Support for flat file and package file representations on disk.
For apps that use Core Data to manage their content, there is also a UIManagedDocument subclass to manage interactions with documents whose data is stored in a Core Data database.
If you are implementing an app that supports iCloud storage, the use of document objects makes the job of storing files in iCloud much easier. Document objects are file presenters and handle many of the iCloud-related notifications that you might otherwise have to handle yourself. For more information about supporting iCloud storage, see iCloud Storage APIs.
For information about the
UIDocument class, see UIDocument Class Reference. For information about the
UIManagedDocument class, see UIManagedDocument Class Reference.
Data Protection Improvements
Introduced in iOS 4.0, data protection lets you store app and user data files on disk in an encrypted format so that they can be accessed only when a user’s device is unlocked. In iOS 5.0, you now have more flexibility regarding when your app can access protected files.
If you protect a file using the NSFileProtectionCompleteUnlessOpen option, your app can access the file while the device is unlocked and, if you keep the file open, can continue to access that file after the user locks the device.

If you protect a file using the NSFileProtectionCompleteUntilFirstUserAuthentication option, your app cannot access the file while the device is booting or until the user unlocks the device. After the user unlocks the device for the first time, you can access the file even if the user subsequently locks the device again.
You should protect files as soon as possible after creating them. For information about how to protect files, see App Programming Guide for iOS.
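A sketch of applying a protection class to a file your app has written; filePath is assumed to hold the file’s path:

```objectivec
#import <Foundation/Foundation.h>

// Assign a data-protection class to an existing file. filePath is an assumption.
NSDictionary *attrs =
    [NSDictionary dictionaryWithObject:NSFileProtectionCompleteUnlessOpen
                                forKey:NSFileProtectionKey];
NSError *error = nil;
BOOL ok = [[NSFileManager defaultManager] setAttributes:attrs
                                           ofItemAtPath:filePath
                                                  error:&error];
if (!ok) {
    NSLog(@"Could not protect file: %@", error);
}
```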
Custom Appearance for UIKit Controls
You can now customize the appearance of many UIKit views and controls to give your app a unique look and feel. For example, you might use these customizations to make the standard system controls match the branding for the rest of your app.
UIKit supports the following customizations:
You can set the tint color, background image, and title position properties (among others) on a wide variety of objects, including toolbars, navigation bars, search bars, buttons, sliders, and some other controls.
You can set attributes of some objects directly, or you can set the default attributes to use for a class using an appearance proxy.
An appearance proxy is an object you use to modify the default appearance of visual objects such as views and bar items. Classes that adopt the UIAppearance protocol support the use of an appearance proxy. To modify the default appearance of such a class, retrieve its proxy object using the appearance class method and call the returned object’s methods to set new default values. A proxy object implements those methods and properties from its proxied class that are tagged with the UI_APPEARANCE_SELECTOR macro. For example, you can use a proxy object to change the default tint colors (through the progressTintColor and trackTintColor properties) of the UIProgressView class.
If you want to set a different default appearance based on how a given object is used in your app, you can do so using the proxy object returned by the appearanceWhenContainedIn: method instead. For example, you can use this proxy object to set specific default values for a button only when it is contained inside a navigation bar.
Any changes you make with a proxy object are applied, at view layout time, to all instances of the class that exist or that are subsequently created. However, you can still override the proxy defaults later using the methods and properties of a given instance.
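For example, the following sketch sets app-wide appearance defaults using the proxies; the color choices are arbitrary:

```objectivec
#import <UIKit/UIKit.h>

// Change the default progress-view tint for every instance in the app.
[[UIProgressView appearance] setProgressTintColor:[UIColor greenColor]];

// Change the bar button tint only when the button sits inside a navigation bar.
[[UIBarButtonItem appearanceWhenContainedIn:[UINavigationBar class], nil]
    setTintColor:[UIColor darkGrayColor]];
```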
For information about the methods for customizing the appearance of a class, see the description of that class in UIKit Framework Reference.
Container View Controller Support
The UIViewController class now allows you to define your own custom container view controllers and present content in new and interesting ways. Examples of existing container view controllers include UINavigationController, UITabBarController, and UISplitViewController. These view controllers mix custom content with content provided by one or more separate view controller objects to create a unique presentation for app content. Container view controllers act as a parent for their contained view controllers, forwarding important messages about rotations and other relevant events to their children.
For more information about view controllers and the methods you need to implement for container view controllers, see UIViewController Class Reference.
Settings

Apps that deliver custom preferences can now use a new radio group element. This element is similar to the multivalue element, which lets the user select one item from a list of choices. The difference is that the radio group element displays its choices inline with your other preferences instead of on a separate page.
For more information about displaying app preferences using the Settings app, see App-Related Resources in App Programming Guide for iOS. For information about the property-list keys you use to build your Settings bundle, see Settings Application Schema Reference.
Xcode Tools

The following sections describe the improvements to the Xcode tools and the support for developing iOS apps. For detailed information about the features available in Xcode 4.2, see What’s New in Xcode.
Xcode 4.2 adds support for many features that are available in iOS 5.0.
The LLVM compiler supports Automatic Reference Counting (ARC), and Xcode includes a menu item to convert targets to use ARC. (For more information about ARC and about how to use it in your apps, see Automatic Reference Counting.)
The Interface Builder user interface provides support for creating storyboard files for your iOS apps. (For more information about using storyboards in your iOS apps, see Storyboards.)
In iOS Simulator, you can now simulate different locations for apps using the Core Location framework.
You can download your app data from an iOS device and automatically restore that data when debugging or testing in iOS Simulator or on a device.
OpenGL ES Debugging
The debugging experience in Xcode has been updated to include a new workflow for debugging OpenGL ES apps. You can now use Xcode to do the following for your OpenGL ES apps:
Introspect OpenGL ES state information and objects such as view textures, shaders, and so on.
Set breakpoints on OpenGL ES errors, set conditional OpenGL ES entry point breakpoints, break on frame boundaries, and so on.
UI Automation Enhancements
The Automation instrument now includes a script editor and the ability to capture (record) actions into your script as you perform them on a device. There are also enhancements to the objects that you use in the Automation instrument to automate UI testing:
The UIATarget object can now simulate rotate gestures and location changes.

The UIAHost object supports executing a task from the Instruments app itself.

The UIAElement object can now simulate a rotate gesture centered on the element.

Several functions that were previously available only in UIAPopover were moved to UIAElement because they are common to all element objects.

The UIAKeyboard object now supports performing a sequence of keyboard taps to simulate the typing of a string.
The Instruments app in Xcode 4.2 adds several new instruments for iOS developers:
System Trace for iOS—Uses several instruments to profile aspects of Mac OS X or iOS that could be affecting app performance, such as system calls, thread scheduling, and VM operations.
Network Connections instrument—Inspect how your app is using TCP/IP and UDP/IP connections. With this instrument it is possible to see how much data is flowing through each connection and for each app. You can also use it to display interesting statistics, such as round trip times and retransmission request information.
Network Activity (located in Energy Diagnostics)—Helps bridge the gap between networking (cellular and WiFi) and energy usage. You use it to display device-wide data flow through each network interface alongside energy usage level data that is taken directly from the battery.
For information about using these new instruments, see Instruments New Features User Guide.
Additional Framework Enhancements
In addition to the items discussed in the preceding sections, the following frameworks have additional enhancements. For a complete list of new interfaces, see iOS 5.0 API Diffs.
The UIKit framework (
UIKit.framework) includes the following enhancements:
The UIKit framework provides support for loading and using storyboards; see Storyboards.
Bars and bar button items can now be tinted and customized for your app; see Custom Appearance for UIKit Controls.
The UIPageViewController class is a new container view controller for creating page-turn transitions between view controllers.

The UIReferenceLibraryViewController class adds support for presenting a custom dictionary service to the user.

The UIImagePickerController class supports new options for specifying the quality of video recordings.

The UIStepper class is a new control for incrementing a floating-point value up or down within a configurable range.
View-based animations now support cross-dissolve, flip-from-top, and flip-from-bottom animations; see UIView Class Reference.
The UIApplication class now reports the language directionality of the running app.

The UITableView class adds support for automatic row animations, moving rows and sections, multiselection, and copy and paste behaviors for cells.

The UIScreen class lets you specify overscan compensation behaviors for video delivered over AirPlay or through an attached HDMI cable. You can also programmatically set a screen’s brightness.

The UIScrollView class now exposes its gesture recognizers so that you can configure them more precisely for your app.

The UISegmentedControl class now supports proportional segment widths.

The UIAlertView class now supports password-style text entry and special configurations for entering text securely.

The UIColor class includes support for Core Image and new methods to retrieve individual color values.

The UIImage class includes support for Core Image, support for stretching an image by tiling part of its content, and support for looping animations.

The UITextInputTraits protocol adds support for a Twitter-specific keyboard and separate spell-checking behavior.

The UIAccessibility protocol includes new interfaces that define the activation point within an element and indicate whether an element is modal or contains hidden elements. There are also new notifications that inform you of changes in system-provided accessibility features, such as zoom, audio status, and closed captioning.

The UIAccessibilityReadingContent protocol allows you to provide a continuous, page-turning reading experience to VoiceOver users.

The UIAccessibilityIdentification protocol allows you to uniquely identify elements in your app so that you can refer to them in automation scripts.

The UIWebView class automatically supports the presentation of multimedia content over AirPlay. You can opt out of this behavior by changing the value of the mediaPlaybackAllowsAirPlay property. This class also exposes a scrollView property so that you can access the scrolling properties of your web interfaces.
For information about the classes of the UIKit framework, see UIKit Framework Reference.
OpenGL ES (
OpenGLES.framework) now includes the following new extensions:
The EXT_debug_label and EXT_debug_marker extensions allow you to annotate your OpenGL ES drawing code with information specific to your app. The OpenGL ES Performance Detective, the OpenGL ES Debugger, and the OpenGL ES Analyzer tools provided by Xcode all take advantage of these annotations.
The EXT_color_buffer_half_float extension allows 16-bit floating point formats to be specified for a frame buffer's color renderbuffer.
The EXT_occlusion_query_boolean extension allows your app to determine whether any pixels would be drawn by a primitive or by a group of primitives.
The EXT_separate_shader_objects extension allows your app to specify separate vertex and fragment shader programs.
The EXT_shadow_samplers extension provides support for shadow maps.
The EXT_texture_rg extension adds one-component and two-component texture formats suitable for use in programmable shaders.
As always, check for the existence of an extension before using it in your app.
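One way to perform that existence check is to search the extension string returned by OpenGL ES. The sketch below assumes a current EAGL context; note that strstr performs a substring match, so in production code you should match complete, space-delimited extension names.

```objc
#import <OpenGLES/ES2/gl.h>
#import <OpenGLES/ES2/glext.h>
#include <string.h>

// Sketch: check whether a named extension is present before using it.
// GL_EXTENSIONS is a space-separated list of extension names.
static BOOL HasExtension(const char *name) {
    const char *extensions = (const char *)glGetString(GL_EXTENSIONS);
    return (extensions != NULL && strstr(extensions, name) != NULL);
}

// Usage, for example with EXT_debug_marker:
// if (HasExtension("GL_EXT_debug_marker")) {
//     glPushGroupMarkerEXT(0, "Draw terrain");
//     /* ... drawing commands annotated for the Xcode GL tools ... */
//     glPopGroupMarkerEXT();
// }
```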
The OpenAL framework (OpenAL.framework) has two significant extensions:
You can get notifications about source state changes and changes regarding the number of audio buffers that have been processed.
The Apple Spatial Audio extension in iOS 5.0 adds three audio effects that are especially useful for games: reverberation, obstruction effects, and occlusion effects.
For information about the OpenAL interfaces, see the header files.
The Message UI framework (MessageUI.framework) adds a new notification for tracking changes to the device’s ability to send text messages. For information about the interfaces of the Message UI framework, see Message UI Framework Reference.
The Media Player framework (MediaPlayer.framework) includes the following enhancements:
There is now support for displaying “Now Playing” information in the lock screen and multitasking controls. This information can also be displayed on an Apple TV and with content delivered via AirPlay.
Apps can now use the framework to play content from iTunes U.
For information about the classes in the Media Player framework, see Media Player Framework Reference.
The Map Kit framework (MapKit.framework) supports the ability to use heading data to rotate a map based on the user’s current orientation. As you can with the Maps app, you can configure your map view to scroll the map according to the user’s current location. For example, a walking tour app might use this to show the user their current location on the tour.
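A minimal sketch of this behavior, assuming mapView is an existing MKMapView in your view controller and that location services are authorized:

```objc
#import <MapKit/MapKit.h>

// Sketch: have the map scroll with the user and rotate to match the
// user's heading, using the new user-tracking modes in iOS 5.
mapView.showsUserLocation = YES;
[mapView setUserTrackingMode:MKUserTrackingModeFollowWithHeading animated:YES];
```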
For information on the interfaces you use to implement map scrolling and rotation, see MapKit Framework Reference.
The iAd framework (iAd.framework) provides new callback methods for developers who use multiple ad networks and want to be notified when a new ad is available. The bannerViewWillLoadAd: method (defined in the ADBannerViewDelegate protocol) is called when a banner has confirmed that an ad is available but before the ad is fully downloaded and ready to be presented. The interstitialAdWillLoad: method (defined in the ADInterstitialAdDelegate protocol) offers similar behavior for interstitial ads.
For information about the classes of the iAd framework, see iAd Framework Reference.
The Game Kit framework (GameKit.framework) and Game Center now have the following features:
The GKTurnBasedMatch class provides support for turn-based gaming, which allows games to create persistent matches whose state is stored in iCloud. Your game manages the state information for the match and determines which player must act to advance the state of the match.
Your game can now adjust the default leaderboard (implemented by the GKLeaderboard class) shown to each player. If your game does not change the default leaderboard for a player, that player sees the leaderboard configured for your app in iTunes Connect.
The GKNotificationBanner class implements a customizable banner similar to the banner shown to players when they log in to Game Center. Your game may use this banner to display messages to the player. When your game reports an achievement, it can automatically display a banner to the player.
A GKMatchmakerViewController object can now add players to an existing match in addition to creating a new match.
The GKMatchDelegate protocol now includes a method to reconnect devices when a two-player match is disconnected.
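The turn-based flow can be sketched as follows, assuming the local player is already authenticated with Game Center; the game-state serialization is left as a placeholder.

```objc
#import <GameKit/GameKit.h>

// Sketch: create a two-player turn-based match and pass the turn.
GKMatchRequest *request = [[GKMatchRequest alloc] init];
request.minPlayers = 2;
request.maxPlayers = 2;

[GKTurnBasedMatch findMatchForRequest:request
                withCompletionHandler:^(GKTurnBasedMatch *match, NSError *error) {
    if (match == nil) { /* handle error */ return; }

    // Advance the match: store your serialized game state in matchData
    // and hand the turn to the next participant.
    NSData *state = nil; // placeholder for your serialized game state
    GKTurnBasedParticipant *next = [match.participants objectAtIndex:1];
    [match endTurnWithNextParticipant:next
                            matchData:state
                    completionHandler:^(NSError *turnError) {
        // The updated match state is now persisted in iCloud.
    }];
}];
```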
For information about the classes of the Game Kit framework, see GameKit Framework Reference.
The Foundation framework (Foundation.framework) includes the following enhancements:
The NSFileManager class includes new methods for moving a file to a user’s iCloud storage.
The NSFileVersion class reports and manages conflicts between different versions of a file in iCloud.
The NSURL class includes new methods and constants to support syncing items to a user’s iCloud storage.
The NSMetadataQuery class supports attributes for items synced to a user’s iCloud storage. Several other metadata-related classes were also added.
The new NSLinguisticTagger class lets you break a sentence down into its grammatical components, allowing you to determine nouns, verbs, adverbs, and so on. This tagging works fully for English, and the class also provides a method to find out which capabilities are available for other languages.
This framework now includes the NSFileWrapper class for managing file packages—that is, files implemented as an opaque directory.
The NSOrderedSet collection class offers the semantics of sets, whereby each element occurs at most once in the collection, but where elements are in a specific order.
Most delegate methods are now declared using formal protocols instead of as categories on NSObject.
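Moving a document into iCloud with the new NSFileManager methods can be sketched as below. The container layout and file names are illustrative, and the code assumes your app has an iCloud entitlement. Note that URLForUbiquityContainerIdentifier: can block and is best called off the main thread.

```objc
#import <Foundation/Foundation.h>

// Sketch: move a local file into the app's default iCloud container.
// "Documents" is the conventional subdirectory for user-visible files.
NSFileManager *fm = [NSFileManager defaultManager];
NSURL *containerURL = [fm URLForUbiquityContainerIdentifier:nil]; // default container
NSURL *localURL = nil; // placeholder: URL of the local file to move
NSURL *destURL = [[containerURL URLByAppendingPathComponent:@"Documents"]
                   URLByAppendingPathComponent:@"Notes.txt"];

NSError *error = nil;
if (![fm setUbiquitous:YES itemAtURL:localURL destinationURL:destURL error:&error]) {
    NSLog(@"Could not move file to iCloud: %@", error);
}
```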
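The linguistic tagging described above can be sketched like this; the sample sentence is arbitrary. You can call availableTagSchemesForLanguage: to find out what the tagger supports for a given language.

```objc
#import <Foundation/Foundation.h>

// Sketch: tag each word in a sentence with its lexical class.
NSString *sentence = @"The quick brown fox jumps over the lazy dog";
NSLinguisticTagger *tagger = [[NSLinguisticTagger alloc]
    initWithTagSchemes:[NSArray arrayWithObject:NSLinguisticTagSchemeLexicalClass]
               options:0];
tagger.string = sentence;

[tagger enumerateTagsInRange:NSMakeRange(0, [sentence length])
                      scheme:NSLinguisticTagSchemeLexicalClass
                     options:NSLinguisticTaggerOmitWhitespace | NSLinguisticTaggerOmitPunctuation
                  usingBlock:^(NSString *tag, NSRange tokenRange,
                               NSRange sentenceRange, BOOL *stop) {
    NSString *token = [sentence substringWithRange:tokenRange];
    NSLog(@"%@: %@", token, tag); // e.g. "fox" tagged as a noun
}];
```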
For information about the classes of the Foundation framework, see Foundation Framework Reference.
Apps that use the External Accessory framework to communicate with external accessories can now ask to be woken up if the app is suspended when its accessory delivers new data. Including the UIBackgroundModes key with the external-accessory value in your app’s Info.plist file keeps your accessory sessions open even when your app is suspended. (Prior to iOS 5, these sessions were closed automatically at suspend time.) When new data arrives for a given session, a suspended app is woken up and given time to process the new data. This type of behavior is designed for apps that work with heart-rate monitors and other types of accessories that need to deliver data at regular intervals.
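In the Info.plist file, this declaration is a property-list XML fragment of the following form:

```xml
<key>UIBackgroundModes</key>
<array>
    <string>external-accessory</string>
</array>
```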
For more information about the UIBackgroundModes key, see Information Property List Key Reference. For information about interacting with external accessories, see External Accessory Programming Topics.
Event Kit and Event Kit UI
The Event Kit framework (EventKit.framework) includes the following enhancements:
Using the EKEventStore class, you can now create and delete calendars programmatically, fetch calendars based on their identifier, save and remove events in batches, and trigger a programmatic refresh of calendar data.
The EKSource class represents the source for creating new calendars and events.
The EKCalendar class now provides access to a calendar’s UUID, source, and other attributes.
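Programmatic calendar creation can be sketched as below. The calendar title is illustrative, and the sketch simply picks the first local source; a real app should let the user choose where the calendar lives.

```objc
#import <EventKit/EventKit.h>

// Sketch: create and save a new calendar with the iOS 5 additions.
EKEventStore *store = [[EKEventStore alloc] init];
EKCalendar *calendar = [EKCalendar calendarWithEventStore:store]; // new in iOS 5
calendar.title = @"Walking Tours";

// Attach the calendar to a source (here, the first local source found).
for (EKSource *source in store.sources) {
    if (source.sourceType == EKSourceTypeLocal) {
        calendar.source = source;
        break;
    }
}

NSError *error = nil;
if (![store saveCalendar:calendar commit:YES error:&error]) {
    NSLog(@"Could not save calendar: %@", error);
}
```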
The Event Kit UI framework (EventKitUI.framework) now includes the EKCalendarChooser class, which provides a standard way for selecting one or more of the user’s calendars.
For information about the classes of the Event Kit framework, see EventKit Framework Reference. For information about the classes of the Event Kit UI framework, see EventKit UI Framework Reference.
The Core Motion framework (CoreMotion.framework) now supports reporting heading information and magnetometer data for devices that have the corresponding hardware.
For information about the classes of the Core Motion framework, see Core Motion Framework Reference.
The Core Location framework (CoreLocation.framework) now includes support for both forward and reverse geocoding location data. This support allows you to convert back and forth between a set of map coordinates and information about the street, city, country (and so on) at that coordinate.
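Both directions of geocoding go through the new CLGeocoder class; the address and coordinates below are illustrative. The completion handlers are delivered asynchronously on the main thread.

```objc
#import <CoreLocation/CoreLocation.h>

// Sketch: forward and reverse geocoding with CLGeocoder (iOS 5).
CLGeocoder *geocoder = [[CLGeocoder alloc] init];

// Forward geocoding: an address string becomes map coordinates.
[geocoder geocodeAddressString:@"1 Infinite Loop, Cupertino, CA"
             completionHandler:^(NSArray *placemarks, NSError *error) {
    CLPlacemark *placemark = [placemarks lastObject];
    CLLocationCoordinate2D coord = placemark.location.coordinate;
    NSLog(@"%.4f, %.4f", coord.latitude, coord.longitude);
}];

// Reverse geocoding: coordinates become street, city, country, and so on.
CLLocation *location = [[CLLocation alloc] initWithLatitude:37.3317
                                                  longitude:-122.0307];
[geocoder reverseGeocodeLocation:location
               completionHandler:^(NSArray *placemarks, NSError *error) {
    CLPlacemark *placemark = [placemarks lastObject];
    NSLog(@"%@, %@", placemark.locality, placemark.country);
}];
```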
For information about the classes of the Core Location framework, see Core Location Framework Reference.
The Core Graphics framework (CoreGraphics.framework) includes some new interfaces to support the creation of paths. Specifically, there are new interfaces for creating paths with an ellipse or rectangle and for adding arcs to existing paths.
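The new path functions can be sketched as follows; the geometry values are arbitrary.

```objc
#include <CoreGraphics/CoreGraphics.h>
#include <math.h>

// Sketch: the iOS 5 path-creation additions.
CGPathRef ellipse = CGPathCreateWithEllipseInRect(CGRectMake(0, 0, 100, 50), NULL);
CGPathRef rect    = CGPathCreateWithRect(CGRectMake(0, 0, 100, 100), NULL);

// Adding an arc to a mutable path; the second parameter is an
// optional affine transform. CGPathAddRelativeArc is new in iOS 5.
CGMutablePathRef path = CGPathCreateMutable();
CGPathMoveToPoint(path, NULL, 0, 50);
CGPathAddRelativeArc(path, NULL, 50, 50, 25, 0, M_PI);

CGPathRelease(ellipse);
CGPathRelease(rect);
CGPathRelease(path);
```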
For more information about the Core Graphics interfaces, see Core Graphics Framework Reference.
The Core Data framework includes the following enhancements:
Core Data provides integration with the iOS document architecture and iCloud storage. The UIManagedDocument class is a concrete subclass of UIDocument that uses a Core Data persistent store for document data storage.
For apps built for iOS 5.0 or later, persistent stores now store data by default in an encrypted format on disk. The default protection level prevents access to the data until after the user unlocks the device for the first time. You can change the protection level by assigning a custom value to the NSPersistentStoreFileProtectionKey key when configuring your persistent stores. For additional information about the data protection features that are new in iOS 5.0, see Data Protection Improvements.
Core Data formalizes the concurrency model for the NSManagedObjectContext class with new options. When you create a context, you can specify the concurrency pattern to use with it: thread confinement, a private dispatch queue, or the main dispatch queue. The NSConfinementConcurrencyType option provides the same behavior that was present on versions of iOS prior to 5.0 and is the default. When sending messages to a context created with a queue association, you must use the performBlock: or performBlockAndWait: method if your code is not already executing on that queue (for the main queue type) or within the scope of a performBlock... invocation (for the private queue type). Within the blocks passed to those methods, you can use the methods of NSManagedObjectContext as usual. The performBlockAndWait: method supports API reentrancy. The performBlock: method includes an autorelease pool and calls the processPendingChanges method upon completion.
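A private-queue context can be sketched like this; the entity name is illustrative, and coordinator is assumed to be an existing NSPersistentStoreCoordinator.

```objc
#import <CoreData/CoreData.h>

// Sketch: a private-queue context (iOS 5). Touch the context only
// from inside performBlock: or performBlockAndWait:.
NSManagedObjectContext *context = [[NSManagedObjectContext alloc]
    initWithConcurrencyType:NSPrivateQueueConcurrencyType];
context.persistentStoreCoordinator = coordinator;

[context performBlock:^{
    // Safe: this block runs on the context's private queue.
    NSFetchRequest *request = [[NSFetchRequest alloc] initWithEntityName:@"Note"];
    NSError *error = nil;
    NSArray *results = [context executeFetchRequest:request error:&error];
    NSLog(@"Fetched %lu notes", (unsigned long)[results count]);
}];
```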
You can create nested managed object contexts, in which the parent object store of a context is another managed object context rather than the persistent store coordinator. This means that fetch and save operations are mediated by the parent context instead of by a coordinator. This pattern has a number of usage scenarios, including performing background operations on a second thread or queue and managing discardable edits, such as in an inspector window or view.
Nested contexts make it more important than ever that you adopt the “pass the baton” approach of accessing a context (by passing a context from one view controller to the next) rather than retrieving it directly from the app delegate.
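The discardable-edits scenario can be sketched as follows, assuming mainContext is an existing queue-based context:

```objc
#import <CoreData/CoreData.h>

// Sketch: a child context for discardable edits (iOS 5). Saving the
// child pushes changes up to its parent; discarding the edits is just
// a matter of throwing the child context away.
NSManagedObjectContext *editingContext = [[NSManagedObjectContext alloc]
    initWithConcurrencyType:NSMainQueueConcurrencyType];
editingContext.parentContext = mainContext; // new in iOS 5

// ... let the user edit objects in editingContext ...

NSError *error = nil;
if ([editingContext save:&error]) {
    // Changes are now in mainContext; save mainContext to persist them.
    [mainContext save:&error];
}
```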
Managed objects support two significant new features: ordered relationships and external storage for attribute values. If you specify that the value of a managed object attribute may be stored as an external record, Core Data heuristically decides on a per-value basis whether it should save the data directly in the database or store a URI to a separate file that it manages for you.
There are two new classes, NSIncrementalStore and NSIncrementalStoreNode, that you can use to implement support for nonatomic persistent stores. The store does not have to be a relational database—for example, you could use a web service as the back end.
For more details about the new features in Core Data, see Core Data Release Notes for OS X v10.7 and iOS 5.0.
The Core Audio family of frameworks includes the following changes in iOS 5.0:
Audio-session routing information is now specified using dictionary keys. There are also new modes for managing your app’s audio behavior:
Voice chat mode optimizes the system for two-way voice conversation.
Video recording mode configures the device for video capture.
Measurement mode disables automatic compression and limiting for audio input.
Default mode provides iOS 4.3.3 behavior.
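These modes are exposed through AVAudioSession; a minimal sketch of configuring a session for voice chat follows.

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: select one of the new audio session modes (iOS 5).
// AVAudioSessionModeVoiceChat optimizes routing and processing for
// two-way voice conversation.
AVAudioSession *session = [AVAudioSession sharedInstance];
NSError *error = nil;
[session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
[session setMode:AVAudioSessionModeVoiceChat error:&error]; // new in iOS 5
[session setActive:YES error:&error];
```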
Core Audio adds seven new audio units for handling advanced audio processing features in your app, such as reverb, adjustable equalization, and time compression and stretching. The new Sampler unit lets you create music instruments, for which you can provide your own sounds. The new AUFilePlayer unit lets you play sound files and feed them directly to other audio units.
The 3D Mixer audio unit is enhanced in iOS 5.0 to provide reverb and other effects useful in game audio.
You can automate audio unit parameters in an audio processing graph, which lets you build a music mixer that remembers fader positions and changes.
You can now use the advanced features of Apple Core Audio Format files in iOS. For example, you might create new voices for the Sampler audio unit.
There is now programmatic support for adjusting the audio input gain.
Core Audio now supports 32-bit floating-point audio data for apps that need to provide high quality audio.
For information about the audio technologies available in iOS, see Multimedia Programming Guide. For information about the new audio units, see Audio Unit Component Services Reference.
The AV Foundation framework includes the following enhancements:
There is automatic support for playing audio and video content over AirPlay. Apps can opt out of transmitting video over AirPlay using the allowsAirPlayVideo property of the AVPlayer class.
New properties on the AVPlayerItem class indicate whether the item supports fast-forwarding or rewinding of the content.
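A brief sketch of both additions, assuming url points to playable media:

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: opt out of AirPlay video and check the new AVPlayerItem
// capability properties (iOS 5).
AVPlayer *player = [AVPlayer playerWithURL:url];
player.allowsAirPlayVideo = NO; // audio may still be sent over AirPlay

AVPlayerItem *item = player.currentItem;
if (item.canPlayFastForward) {
    player.rate = 2.0; // fast-forward playback
}
```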
For information about the classes of the AV Foundation framework, see AV Foundation Framework Reference.
The Assets Library framework includes the following enhancements:
Support for accessing photo streams
Support for creating new albums in the user’s photo library
Support for adding assets to albums
The ability to get an aspect ratio thumbnail for an asset
The ability to modify saved assets
For information about the classes of the Assets Library framework, see Assets Library Framework Reference.
The Address Book framework adds support for importing and exporting vCard data. It also adds new keys to associate social network affiliations with a user record.
For more information about the new features in the Address Book framework, see Address Book Framework Reference for iOS.
The Security framework (Security.framework) now includes the Secure Transport interfaces, which are Apple’s implementation of the SSL/TLS protocols. You can use these interfaces to configure and manage SSL sessions, manage ciphers, and manage certificates.
For information about the Secure Transport interfaces, see the SecureTransport.h header file of the Security framework.
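The basic shape of a Secure Transport client session is sketched below. MyReadFunc, MyWriteFunc, and mySocketInfo are placeholders: Secure Transport performs no I/O itself, so you supply callbacks that move raw bytes over your own socket.

```objc
#include <Security/SecureTransport.h>
#include <string.h>

// Sketch: set up and handshake an SSL/TLS client session (iOS 5).
static OSStatus MyReadFunc(SSLConnectionRef conn, void *data, size_t *len);
static OSStatus MyWriteFunc(SSLConnectionRef conn, const void *data, size_t *len);

SSLContextRef ssl = SSLCreateContext(NULL, kSSLClientSide, kSSLStreamType);
SSLSetIOFuncs(ssl, MyReadFunc, MyWriteFunc);
SSLSetConnection(ssl, (SSLConnectionRef)mySocketInfo); // your socket state
SSLSetPeerDomainName(ssl, "example.com", strlen("example.com"));

OSStatus status = SSLHandshake(ssl);
// Once the handshake succeeds, use SSLRead/SSLWrite for encrypted I/O.
```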