iOS 7 is a major update with compelling features for developers to incorporate into their apps. The user interface has been completely redesigned. In addition, iOS 7 introduces a new animation system for creating 2D and 2.5D games. Multitasking enhancements, peer-to-peer connectivity, and many other important features make iOS 7 the most significant release since the first iPhone SDK.
This article summarizes the key developer-related features introduced in iOS 7. This version of the operating system runs on current iOS devices. In addition to describing the key new features, this article lists the documents that describe those features in more detail.
For late-breaking news and information about known issues, see iOS 7.0 Release Notes. For the complete list of new APIs added in iOS 7, see iOS 7.0 API Diffs.
User Interface Changes
iOS 7 includes many new features intended to help you create great user interfaces.
The iOS 7 user interface has been completely redesigned. Throughout the system, a sharpened focus on functionality and on the user’s content informs every aspect of design. Translucency, refined visual touches, and fluid, realistic motion impart clarity, depth, and vitality to the user experience. Whether you are creating a new app or updating an existing one, keep these qualities in mind as you work on the design.
Apps compiled against the iOS 7 SDK automatically receive the new appearance for any standard system views when the app is run on iOS 7. If you use Auto Layout to set the size and position of your views, those views are repositioned as needed. But there may still be additional work to do to make sure your interface has the appearance you want. Similarly, if you customize your app’s views, you may need to make changes to support the new appearance fully.
For guidance on how to design apps that take full advantage of the new look in iOS 7, see iOS 7 Design Resources.
Dynamic Behaviors for Views
Apps can now specify dynamic behaviors for UIView objects and for other objects that conform to the UIDynamicItem protocol. (Objects that conform to this protocol are called dynamic items.) Dynamic behaviors offer a way to improve the user experience of your app by incorporating real-world behavior and characteristics, such as gravity, into your app's animations. UIKit supports the following types of dynamic behaviors:
A UIAttachmentBehavior object specifies a connection between two dynamic items or between an item and a point. When one item (or point) moves, the attached item also moves. The connection is not completely static, though. An attachment behavior has damping and oscillation properties that determine how the behavior changes over time.
A UICollisionBehavior object lets dynamic items participate in collisions with each other and with the behavior's specified boundaries. The behavior also lets those items respond appropriately to collisions.
A UIGravityBehavior object specifies a gravity vector for its dynamic items. Dynamic items accelerate in the vector's direction until they collide with other appropriately configured items or with a boundary.
A UIPushBehavior object specifies a continuous or instantaneous force vector for its dynamic items.
A UISnapBehavior object specifies a snap point for a dynamic item. The item snaps to the point with a configured effect. For example, it can snap to the point as if it were attached to a spring.
Dynamic behaviors become active when you add them to an animator object, which is an instance of the
UIDynamicAnimator class. The animator provides the context in which dynamic behaviors execute. A given dynamic item can have multiple behaviors, but all of those behaviors must be animated by the same animator object.
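A minimal sketch of this flow, assuming a view controller with a subview held in a hypothetical box property:

```objc
// Keep a strong reference to the animator (for example, in a property);
// behaviors stop running when the animator is deallocated.
self.animator = [[UIDynamicAnimator alloc] initWithReferenceView:self.view];

// Make the box fall under gravity and collide with the view's edges.
UIGravityBehavior *gravity = [[UIGravityBehavior alloc] initWithItems:@[self.box]];
UICollisionBehavior *collision = [[UICollisionBehavior alloc] initWithItems:@[self.box]];
collision.translatesReferenceBoundsIntoBoundary = YES;

[self.animator addBehavior:gravity];
[self.animator addBehavior:collision];
```

Adding the behaviors to the animator is what starts the simulation; removing a behavior stops its effect on the items.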
For information about the behaviors you can apply, see UIKit Framework Reference.
Text Kit
Text Kit is a full-featured set of UIKit classes for managing text and fine typography. Text Kit can lay out styled text into paragraphs, columns, and pages; it easily flows text around arbitrary regions such as graphics; and it manages multiple fonts. Text Kit is integrated with all UIKit text-based controls to enable apps to create, edit, display, and store text more easily—and with less code than was previously possible in iOS.
Text Kit comprises new classes and extensions to existing classes, including the following:
The NSAttributedString class has been extended to support new attributes.
The NSLayoutManager class generates glyphs and lays out text.
The NSTextContainer class defines a region where text is laid out.
The NSTextStorage class defines the fundamental interface for managing text-based content.
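As a sketch of how these classes fit together, a text view can be assembled from an explicit storage, layout manager, and container stack (the sizes here are arbitrary):

```objc
NSTextStorage *textStorage = [[NSTextStorage alloc] initWithString:@"Hello, Text Kit."];
NSLayoutManager *layoutManager = [[NSLayoutManager alloc] init];
NSTextContainer *textContainer = [[NSTextContainer alloc] initWithSize:CGSizeMake(200.0, CGFLOAT_MAX)];

// The storage feeds the layout manager, which lays glyphs into the container.
[layoutManager addTextContainer:textContainer];
[textStorage addLayoutManager:layoutManager];

UITextView *textView = [[UITextView alloc] initWithFrame:CGRectMake(0.0, 0.0, 200.0, 400.0)
                                           textContainer:textContainer];
```

A single storage object can drive multiple layout managers, which is how the same text can be rendered into several containers at once.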
For more information about Text Kit, see Text Programming Guide for iOS.
64-Bit Support
Apps can now be compiled for the 64-bit runtime. All system libraries and frameworks are 64-bit ready, meaning that they can be used in both 32-bit and 64-bit apps. When compiled for the 64-bit runtime, apps may run faster because of the availability of extra processor resources in 64-bit mode.
iOS uses the same LP64 model that is used by OS X and other 64-bit UNIX systems, which means fewer problems when porting code. For information about the iOS 64-bit runtime and how to write 64-bit apps, see 64-Bit Transition Guide for Cocoa Touch.
Multitasking Enhancements
iOS 7 supports two new background execution modes for apps:
Apps that regularly update their content by contacting a server can register with the system and be launched periodically to retrieve that content in the background. To register, include the UIBackgroundModes key with the fetch value in your app's Info.plist file. Then, when your app is launched, call the setMinimumBackgroundFetchInterval: method to specify how often it receives update messages. Finally, implement the application:performFetchWithCompletionHandler: method in your app delegate.
Apps that use push notifications to notify the user that new content is available can fetch the content in the background. To support this mode, include the UIBackgroundModes key with the remote-notification value in your app's Info.plist file. You must also implement the application:didReceiveRemoteNotification:fetchCompletionHandler: method in your app delegate.
Apps supporting either the fetch or the remote-notification background mode may be launched or moved from the suspended state to the background state at appropriate times. In the case of the fetch background mode, the system uses available information to determine the best time to launch or wake apps. For example, it does so when networking conditions are good or when the device is already awake. You can also send silent push notifications (notifications that do not display alerts or otherwise disturb the user).
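Put together, background fetch support might look like this in an app delegate; the fetch logic itself is a placeholder:

```objc
// Requires the UIBackgroundModes key in Info.plist with the "fetch" value.
- (BOOL)application:(UIApplication *)application
didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    // Let the system choose a fetch interval.
    [application setMinimumBackgroundFetchInterval:UIApplicationBackgroundFetchIntervalMinimum];
    return YES;
}

- (void)application:(UIApplication *)application
performFetchWithCompletionHandler:(void (^)(UIBackgroundFetchResult))completionHandler {
    // Fetch new content here; always call the handler when done so the
    // system can update the app's snapshot and suspend the app again.
    BOOL foundNewData = NO; // placeholder result
    completionHandler(foundNewData ? UIBackgroundFetchResultNewData
                                   : UIBackgroundFetchResultNoData);
}
```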
For small content updates, use the
NSURLRequest class. To upload or download larger pieces of content in the background, use the new
NSURLSession class. This class improves on the existing
NSURLConnection class by providing a simple, task-based interface for initiating and processing
NSURLRequest objects. A single
NSURLSession object can initiate multiple download and upload tasks, and use its delegate to handle any authentication requests coming from the server.
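A sketch of a background download, assuming the session identifier and URL are placeholders and that self implements the session delegate methods:

```objc
// A background configuration lets transfers continue while the app is suspended.
NSURLSessionConfiguration *configuration =
    [NSURLSessionConfiguration backgroundSessionConfiguration:@"com.example.downloads"];
NSURLSession *session = [NSURLSession sessionWithConfiguration:configuration
                                                      delegate:self
                                                 delegateQueue:nil];

NSURL *url = [NSURL URLWithString:@"https://example.com/large-file.zip"];
NSURLSessionDownloadTask *task = [session downloadTaskWithURL:url];
[task resume];

// The delegate's URLSession:downloadTask:didFinishDownloadingToURL: method
// receives the temporary location of the downloaded file.
```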
For more information about the new background modes, see App States and Multitasking in App Programming Guide for iOS.
Games
iOS 7 includes enhanced support for games.
Sprite Kit Framework
The Sprite Kit framework (
SpriteKit.framework) provides a hardware-accelerated animation system optimized for creating 2D and 2.5D games. Sprite Kit provides the infrastructure that most games need, including a graphics rendering and animation system, sound playback support, and a physics simulation engine. Using Sprite Kit frees you from creating these things yourself, and it lets you focus on the design of your content and the high-level interactions for that content.
Content in a Sprite Kit app is organized into scenes. A scene can include textured objects, video, path-based shapes, Core Image filters, and other special effects. Sprite Kit takes those objects and determines the most efficient way to render them onscreen. When it is time to animate the content in your scenes, you can use Sprite Kit to specify explicit actions you want performed, or you can use the physics simulation engine to define physical behaviors (such as gravity, attraction, or repulsion) for your objects.
In addition to the Sprite Kit framework, there are Xcode tools for creating particle emitter effects and texture atlases. You can use the Xcode tools to manage app assets and update Sprite Kit scenes quickly.
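A minimal sketch of a scene with one sprite, an explicit action, and a physics body; the "spaceship" image name is a hypothetical asset:

```objc
SKScene *scene = [SKScene sceneWithSize:CGSizeMake(320.0, 568.0)];

// "spaceship" is a hypothetical texture in the app bundle or a texture atlas.
SKSpriteNode *ship = [SKSpriteNode spriteNodeWithImageNamed:@"spaceship"];
ship.position = CGPointMake(160.0, 284.0);

// Give the sprite a physics body so the simulation can act on it.
ship.physicsBody = [SKPhysicsBody bodyWithRectangleOfSize:ship.size];
[scene addChild:ship];

// An explicit action: spin the sprite indefinitely.
[ship runAction:[SKAction repeatActionForever:
                    [SKAction rotateByAngle:M_PI duration:1.0]]];

// Present the scene from an SKView, such as a view controller's SKView-backed view:
// [(SKView *)self.view presentScene:scene];
```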
For more information about how to use Sprite Kit, see SpriteKit Programming Guide. To see an example of how to use Sprite Kit to build a working app, see code:Explained Adventure.
Game Controller Framework
The Game Controller framework (
GameController.framework) lets you discover and configure Made-for-iPhone/iPod/iPad (MFi) game controller hardware in your app. Game controllers can be devices connected physically to an iOS device or connected wirelessly over Bluetooth. The Game Controller framework notifies your app when controllers become available and lets you specify which controller inputs are relevant to your app.
For more information about supporting game controllers, see Game Controller Programming Guide.
Game Center Improvements
Game Center includes the following improvements:
Turn-based matches now support a new feature known as exchanges. Exchanges let players initiate actions with other players, even when it is not their turn. You can use this feature to implement simultaneous turns, player chats, and trading between players.
The limit on per-app leaderboards has been raised from 25 to 100. You can also organize your leaderboards using a GKLeaderboardSet object, which increases the limit to 500.
You can add conditions to challenges that define when the challenge has been met. For example, a challenge to beat a time in a driving game might stipulate that other players must use the same vehicle.
The framework has improved its authentication support and added other features to prevent cheating.
For more information about how to use the new Game Center features, see Game Center Programming Guide. For information about the classes of the Game Kit framework, see Game Kit Framework Reference.
Maps
The Map Kit framework (
MapKit.framework) includes numerous improvements and features for apps that use map-based information. Apps that use maps to display location-based information can now take full advantage of the 3D map support found in the Maps app, including controlling the viewing perspective programmatically. Map Kit also enhances maps in your app in the following ways:
Overlays can be placed at different levels in the map content so that they appear above or below other relevant data.
You can apply an MKMapCamera object to a map to add position, tilt, and heading information to its appearance. The information you specify using the camera object imparts a 3D perspective to the map.
The MKDirections class lets you ask Apple for direction-related route information. You can use that route information to create overlays for display on your own maps.
The MKGeodesicPolyline class lets you create a line-based overlay that follows the curvature of the earth.
Apps can use the MKMapSnapshotter class to capture map-based images.
The visual representation of overlays is now based on the MKOverlayRenderer class, which replaces overlay views and offers a simpler rendering approach.
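A sketch of the camera API, assuming an existing MKMapView and placeholder coordinates:

```objc
// Hypothetical coordinates: look at a landmark from a nearby vantage point.
CLLocationCoordinate2D target = CLLocationCoordinate2DMake(37.8020, -122.4058);
CLLocationCoordinate2D eye = CLLocationCoordinate2DMake(37.7950, -122.4000);

MKMapCamera *camera = [MKMapCamera cameraLookingAtCenterCoordinate:target
                                                 fromEyeCoordinate:eye
                                                       eyeAltitude:500.0];
mapView.camera = camera; // mapView is an existing MKMapView instance
```

Because the eye coordinate differs from the center coordinate, the map is rendered with a tilted, 3D perspective rather than a straight-down view.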
For more information about the classes of the Map Kit framework, see Map Kit Framework Reference.
AirDrop
AirDrop lets users share photos, documents, URLs, and other kinds of data with nearby devices. AirDrop support is now built in to the existing
UIActivityViewController class. This class displays different options for sharing the content that you specify. If you are not yet using this class, you should consider adding it to your interface.
To receive files sent via AirDrop, do the following:
In Xcode, declare support for the document types your app supports. (Xcode adds the appropriate keys to your app's Info.plist file.) The system uses this information to determine whether your app can open a given file.
Implement the application:openURL:sourceApplication:annotation: method in your app delegate. (The system calls this method when a new file is received.)
Files sent to your app are placed in the
Documents/Inbox directory of your app’s home directory. If you plan to modify the file, you must move it out of this directory before doing so. (The system allows your app to read and delete files in this directory only.) Files stored in this directory are encrypted using data protection, so you must be prepared for the file to be inaccessible if the device is currently locked.
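A sketch of the receiving side, moving an incoming file out of the Inbox before using it:

```objc
- (BOOL)application:(UIApplication *)application
            openURL:(NSURL *)url
  sourceApplication:(NSString *)sourceApplication
         annotation:(id)annotation {
    // url points into the Documents/Inbox directory; move the file
    // out before modifying it, because Inbox contents are read/delete only.
    NSFileManager *fileManager = [NSFileManager defaultManager];
    NSURL *documentsURL = [[fileManager URLsForDirectory:NSDocumentDirectory
                                               inDomains:NSUserDomainMask] lastObject];
    NSURL *destinationURL = [documentsURL URLByAppendingPathComponent:[url lastPathComponent]];

    NSError *error = nil;
    BOOL moved = [fileManager moveItemAtURL:url toURL:destinationURL error:&error];
    // Because Inbox files are protected, the move can fail if the device
    // is locked; be prepared to retry after the device is unlocked.
    return moved;
}
```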
For more information about using an activity view controller to share data, see UIActivityViewController Class Reference.
Inter-App Audio
The Audio Unit framework (AudioUnit.framework) adds support for Inter-App Audio, which lets apps send MIDI commands and stream audio between apps on the same device. For example, you might use this feature to record music from an app acting as an instrument, or to send audio to another app for processing. To vend your app's audio data, publish an I/O audio unit (AURemoteIO) that is visible to other processes. To use audio features from another app, use the audio component discovery interfaces in iOS 7.
For information about the new interfaces, see the framework header files. For general information about the interfaces of this framework, see Audio Unit Framework Reference.
Peer-to-Peer Connectivity
The Multipeer Connectivity framework (MultipeerConnectivity.framework) supports discovering nearby devices and communicating with them directly, without requiring Internet connectivity. The framework makes it easy to create multipeer sessions and supports both reliable in-order data transmission and real-time data transmission, so your app can seamlessly exchange data with nearby devices.
The framework provides programmatic and UI-based options for discovering and managing network services. Apps can integrate the
MCBrowserViewController class into their user interface to display a list of peer devices for the user to choose from. Alternatively, you can use the
MCNearbyServiceBrowser class to look for and manage peer devices programmatically.
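A sketch of the programmatic path; the service type string is an assumption (use a short identifier of your own), and self is assumed to implement the session and browser delegate protocols:

```objc
MCPeerID *peerID = [[MCPeerID alloc] initWithDisplayName:[[UIDevice currentDevice] name]];
MCSession *session = [[MCSession alloc] initWithPeer:peerID];
session.delegate = self;

MCNearbyServiceBrowser *browser =
    [[MCNearbyServiceBrowser alloc] initWithPeer:peerID serviceType:@"example-chat"];
browser.delegate = self;
[browser startBrowsingForPeers];

// In the browser delegate, invite a discovered peer into the session:
// [browser invitePeer:foundPeerID toSession:session withContext:nil timeout:30];
```

A nearby device must advertise the same service type (with MCNearbyServiceAdvertiser or MCAdvertiserAssistant) for the browser to find it.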
For more information about the interfaces of this framework, see Multipeer Connectivity Framework Reference.
New Frameworks
iOS 7 includes the following new frameworks:
The Game Controller framework (
GameController.framework) provides an interface for communicating with game-related hardware; see
Game Controller Framework.
The Sprite Kit framework (
SpriteKit.framework) provides support for sprite-based animations and graphics rendering; see Sprite Kit Framework.
The Multipeer Connectivity framework (
MultipeerConnectivity.framework) provides peer-to-peer networking for apps; see Peer-to-Peer Connectivity.
The Media Accessibility framework (
MediaAccessibility.framework) manages the presentation of closed-captioned content in your media files. This framework works in conjunction with new settings that let the user enable the display of closed captions.
The Safari Services framework (
SafariServices.framework) provides support for programmatically adding URLs to the user’s Safari reading list. For information about the class provided by this framework, see the framework header files.
Enhancements to Existing Frameworks
In addition to its new features, iOS 7 also includes significant enhancements, organized here by framework. For a complete list of new interfaces, see iOS 7.0 API Diffs.
UIKit Framework
The UIKit framework (
UIKit.framework) includes the following enhancements:
All UI elements have been updated to present the new look associated with iOS 7.
UIKit Dynamics lets you mimic real-world effects such as gravity in your animations; see Dynamic Behaviors for Views.
Text Kit provides sophisticated text editing and display capabilities; see Text Kit.
The UIView class defines the following additions:
The tintColor property applies a tint color to both the view and its subviews. For information on how to apply tint colors, see iOS 7 UI Transition Guide.
You can create keyframe-based animations using views. You can also make changes to your views and specifically prevent any animations from being performed.
The UIViewController class defines the following additions:
View controller transitions can be customized, driven interactively, or replaced altogether with ones you designate.
View controllers can now specify their preferred status bar style and visibility. The system uses the provided information to manage the status bar style as new view controllers appear. You can also control how this behavior is applied using the
UIViewControllerBasedStatusBarAppearance key in your app's Info.plist file.
The UIMotionEffect class defines the basic behavior for motion effects, which are objects that define how a view responds to device-based motion.
Collection views add support for intermediate layout transitions and invalidation contexts (invalidation contexts help you improve the performance of your custom layout code). You can also apply UIKit Dynamics to collection view layout attributes to animate the items in the collection.
The UIImage class supports retrieving images stored in asset catalogs, which are a way to manage and optimize assets that have multiple sizes and resolutions. You create asset catalogs in Xcode.
There are methods on the UIView and UIScreen classes for creating a snapshot of their contents. Generating snapshots using these new interfaces is significantly faster than rendering the view or screen contents yourself.
Gesture recognizers can specify dependencies dynamically to ensure that one gesture recognizer fails before another is considered.
The UIKeyCommand class wraps keyboard events received from an external hardware keyboard. These events are delivered to the app's responder chain for processing.
A UIFontDescriptor object describes a font using a dictionary of attributes. Use font descriptors to interoperate with other platforms.
The UIFont and UIFontDescriptor classes support dynamic text sizing, which improves legibility for text in apps. With this feature, the user controls the desired font size that all apps in the system should use.
The UIActivity class now supports new activity types, including activities for sending items via AirDrop, adding items to a Safari reading list, and posting content to Flickr, Tencent Weibo, and Vimeo.
The UIApplicationDelegate protocol adds methods for handling background fetch behaviors.
The UIScreenEdgePanGestureRecognizer class is a new gesture recognizer that tracks pan gestures that originate near an edge of the screen.
UIKit adds support for running in a guided-access mode, which allows an app to lock itself to prevent modification by the user. This mode is intended for institutions such as schools, where users bring their own devices but need to run apps provided by the institution.
State restoration now allows the saving and restoration of any object. Objects adopting the UIStateRestoring protocol can write out state information when the app moves to the background and have that state restored during subsequent launches.
Table views now support estimating the height of rows and other elements, which improves scrolling performance.
For information about the classes of this framework, see UIKit Framework Reference.
Store Kit Framework
The Store Kit framework (
StoreKit.framework) has migrated to a new receipt system that developers can use to verify in-app purchases on the device itself. You can also use it to verify the app purchase receipt on the server.
For more information about how to use this new receipt system, see Receipt Validation Programming Guide.
Pass Kit Framework
The Pass Kit framework (
PassKit.framework) includes new APIs for adding multiple passes in a single operation.
These new features were added to the pass file format:
New keys specify the expiration date for a pass.
You can specify that a pass is relevant only when it is in the vicinity of specific iBeacons.
New attributes control how a pass is displayed. You can group passes together, display links with custom text on the back of a pass, and control how time values are displayed on the pass.
You can now associate extra data with a pass. This data is available to your app but is not displayed to the user.
You can designate which data detectors to apply to the fields of your passes.
For information about how to use Pass Kit in your app, see Wallet Developer Guide. For information about the pass file format, see PassKit Package Format Reference.
OpenGL ES
iOS 7 adds support for OpenGL ES 3.0 and adds new features to OpenGL ES 2.0.
OpenGL ES 3.0 includes as core functionality the features of many extensions supported in OpenGL ES 2.0 on iOS. But OpenGL ES 3.0 also adds new features to the OpenGL ES shading language and new core functionality that has never been available on mobile processors before, including multiple render targets and transform feedback. You can use OpenGL ES 3 to more easily implement advanced rendering techniques, such as deferred rendering.
To create an OpenGL ES 3 context on devices that support it, pass the kEAGLRenderingAPIOpenGLES3 constant to the initWithAPI: method of the EAGLContext class.
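Because not all devices support OpenGL ES 3.0, a common pattern is to attempt the 3.0 context and fall back to 2.0:

```objc
EAGLContext *context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES3];
if (context == nil) {
    // The device's GPU does not support OpenGL ES 3.0; fall back to 2.0.
    context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
}
[EAGLContext setCurrentContext:context];
```

The rest of your rendering code can then branch on the context's API to decide which feature set to use.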
OpenGL ES 2 adds the following new extensions:
The EXT_sRGB extension adds support for sRGB framebuffer operations.
The GL_EXT_pvrtc_sRGB extension adds support for sRGB texture data compressed in the PVRTC texture compression format. (This extension is also supported in OpenGL ES 3.0).
The GL_EXT_draw_instanced and GL_EXT_instanced_arrays extensions can improve rendering performance when your app draws multiple instances of the same object. You use a single call to draw instances of the same object. You add variation to each instance by specifying how fast each vertex attribute advances or by referencing an ID for each instance in your shader.
Textures can be accessed in vertex shaders in both OpenGL ES 2.0 and 3.0. Query the value of the MAX_VERTEX_TEXTURE_IMAGE_UNITS attribute to determine the exact number of textures you can access. In earlier versions of iOS, this attribute always had a value of 0.
For more information, see OpenGL ES Programming Guide for iOS and iOS Device Compatibility Reference.
Message UI Framework
In the Message UI framework, the
MFMessageComposeViewController class adds support for attaching files to messages.
For information about the new interfaces, see the framework header files. For information about the classes of this framework, see Message UI Framework Reference.
Media Player Framework
In the Media Player framework, the
MPVolumeView class provides support for determining whether wireless routes such as AirPlay and Bluetooth are available for the user to select. You can also determine whether one of these wireless routes is currently active. For information about the new interfaces, see the framework header files.
For information about the classes of Media Player framework, see Media Player Framework Reference.
Map Kit Framework
The Map Kit framework (
MapKit.framework) includes changes that are described in Maps.
For information about the classes of this framework, see Map Kit Framework Reference.
Image I/O Framework
The Image I/O framework (
ImageIO.framework) now has interfaces for getting and setting image metadata.
For information about the new interfaces, see the framework header files. For information about the classes of this framework, see Image I/O Reference Collection.
iAd Framework
The iAd framework (
iAd.framework) includes two extensions to other frameworks that make it easier to incorporate ads into your app’s content:
The framework introduces new methods on the MPMoviePlayerController class that let you run ads before a movie.
The framework extends the UIViewController class to make it easier to create ad-supported content. You can now configure your view controllers to display ads before displaying the actual content they manage.
For information about the new interfaces, see the framework header files. For information about the classes of this framework, see iAd Framework Reference.
Game Kit Framework
The Game Kit framework (
GameKit.framework) includes numerous changes, which are described in Game Center Improvements.
For information about the classes of this framework, see Game Kit Framework Reference.
Foundation Framework
The Foundation framework (
Foundation.framework) includes the following enhancements:
The NSData class adds support for Base64 encoding.
The NSURLSession class is a new class for managing the acquisition of network-based resources. (You can use it to download content even when your app is suspended or not running.) This class serves as a replacement for the NSURLConnection class and its delegate; it also replaces the NSURLDownload class and its delegate.
The NSURLComponents class is a new class for parsing the components of a URL. This class supports the URI standard (RFC 3986/STD 66) for parsing URLs.
The NSURLCredential and NSURLCredentialStorage classes let you create credentials with a synchronizable policy, and they provide the option of removing credentials with a synchronizable policy from iCloud.
The NSCalendar class supports new calendar types.
The NSProgress class provides a general-purpose way to monitor the progress of an operation and report that progress to other parts of your app that want to use it.
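The new NSData Base64 support replaces ad hoc third-party implementations; a minimal round-trip sketch:

```objc
NSData *data = [@"Hello, iOS 7" dataUsingEncoding:NSUTF8StringEncoding];

// Encode to a Base64 string, then decode it back to the original bytes.
NSString *encoded = [data base64EncodedStringWithOptions:0];
NSData *decoded = [[NSData alloc] initWithBase64EncodedString:encoded options:0];
```

Pass options such as NSDataBase64Encoding64CharacterLineLength when you need line-wrapped output.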
For information about the new interfaces, see the framework header files and Foundation release notes. For general information about the classes of this framework, see Foundation Framework Reference.
Core Telephony Framework
The Core Telephony framework (
CoreTelephony.framework) lets you get information about the type of radio technology in use by the device. Apps developed in conjunction with a carrier can also authenticate against a particular subscriber for that carrier.
For information about the new interfaces, see the framework header files. For general information about the classes of the Core Telephony framework, see Core Telephony Framework Reference.
Core Motion Framework
The Core Motion framework (
CoreMotion.framework) adds support for step counting and motion tracking. With step counting, the framework detects movements that correspond to user motion and uses that information to report the number of steps to your app. Because the system detects the motion, it can continue to gather step data even when your app is not running. The framework can also distinguish among types of motion, such as walking, running, and travel by automobile. Navigation apps might use that data to change the type of directions they give to users.
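A sketch of step counting with the CMStepCounter class; the update granularity of 10 steps is an arbitrary choice:

```objc
// Step counting is only available on devices with the required motion hardware.
if ([CMStepCounter isStepCountingAvailable]) {
    CMStepCounter *stepCounter = [[CMStepCounter alloc] init]; // keep a strong reference
    [stepCounter startStepCountingUpdatesToQueue:[NSOperationQueue mainQueue]
                                        updateOn:10 // deliver updates every 10 steps
                                     withHandler:^(NSInteger numberOfSteps,
                                                   NSDate *timestamp,
                                                   NSError *error) {
        if (error == nil) {
            NSLog(@"Steps since starting updates: %ld", (long)numberOfSteps);
        }
    }];
}
```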
For information about the classes of this framework, see Core Motion Framework Reference.
Core Location Framework
The Core Location framework (
CoreLocation.framework) supports region monitoring and ranging using Bluetooth devices. Region monitoring lets you determine whether the iOS device enters a specific area, and ranging lets you determine the relative range of nearby Bluetooth devices. For example, an art museum might use region monitoring to determine whether a person is inside a particular gallery, and then place iBeacons near each painting. When the person is standing by a painting, the app would display information about it.
The framework also supports deferring the delivery of location updates until a specific time has elapsed or the user has moved a minimum distance.
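The museum scenario might be sketched as follows; the UUID and identifier are placeholders, and self is assumed to be a CLLocationManager delegate:

```objc
// The proximity UUID identifies your beacons; this value is a placeholder.
NSUUID *uuid = [[NSUUID alloc] initWithUUIDString:@"E2C56DB5-DFFB-48D2-B060-D0F5A71096E0"];
CLBeaconRegion *region = [[CLBeaconRegion alloc] initWithProximityUUID:uuid
                                                            identifier:@"com.example.gallery"];

self.locationManager = [[CLLocationManager alloc] init]; // keep a strong reference
self.locationManager.delegate = self;
[self.locationManager startMonitoringForRegion:region];
[self.locationManager startRangingBeaconsInRegion:region];

// The locationManager:didRangeBeacons:inRegion: delegate method then reports
// CLBeacon objects whose proximity values indicate how close each beacon is.
```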
For general information about the classes of this framework, see Core Location Framework Reference.
Core Foundation Framework
The Core Foundation framework (
CoreFoundation.framework) now lets you schedule stream objects on dispatch queues.
For information about the new interfaces, see the framework header files. For general information about the interfaces of this framework, see Core Foundation Framework Reference.
Core Bluetooth Framework
The Core Bluetooth framework (
CoreBluetooth.framework) includes the following enhancements:
The framework supports saving state information for central and peripheral objects and restoring that state at app launch time. You can use this feature to support long-term actions involving Bluetooth devices.
The central and peripheral classes now use an NSUUID object to store unique identifiers.
You can now retrieve peripheral objects from a central manager synchronously.
For information about the classes of this framework, see Core Bluetooth Framework Reference and Core Bluetooth Programming Guide.
AV Foundation Framework
The AV Foundation framework (
AVFoundation.framework) includes the following enhancements:
The AVAudioSession class supports the following new behaviors:
Selecting the preferred audio input, including audio from built-in microphones
Multichannel input and output
The AVVideoCompositing protocol and related classes let you support custom video compositors.
The AVSpeechSynthesizer class and related classes provide speech synthesis capabilities.
The capture classes add support and interfaces for the following features:
Discovery of a camera’s supported formats and frame rates
High fps recording
Still image stabilization
Video zoom (true and digital) in recordings and video preview, including custom ramping
Real-time discovery of machine-readable metadata (barcodes)
Autofocus range restriction
Smooth autofocus for capture
Sharing your app’s audio session during capture
Access to the clocks used during capture
Access to capture device authorization status (user must now grant access to the microphone and camera)
Recommended settings for use with data outputs and asset writer
There are new metadata key spaces for supported ISO formats such as MPEG-4 and 3GPP, and improved support for filtering metadata items when copying those items from source assets to output files.
The AVAssetWriter class provides assistance in formulating output settings, and there are new level constants for H.264 encoding.
The AVPlayerLayer class adds the videoRect property, which you can use to get the size and position of the video image.
The AVPlayerItem class supports the following changes:
Asset properties can be loaded automatically when AVPlayerItem objects are prepared for playback.
When you link your app against the iOS 7 SDK, the behavior when getting the values of player item properties, such as the presentationSize property, differs from the behavior in previous versions of iOS. The properties of this class now return a default value and no longer block your app if the AVPlayerItem object is not yet ready to play. As soon as the player item's status changes to AVPlayerItemStatusReadyToPlay, the getters reflect the actual values of the underlying media resource. If you use key-value observing to monitor changes to the properties, your observers are notified as soon as changes are available.
The AVPlayerItemLegibleOutput class can process timed text from media files.
The AVAssetResourceLoaderDelegate protocol now supports loading arbitrary ranges of bytes from a media resource.
For information about the new interfaces, see the framework header files. For general information about the classes of this framework, see AV Foundation Framework Reference.
Accelerate Framework
The Accelerate framework (
Accelerate.framework) includes the following enhancements:
Improved support for manipulating Core Graphics data types
Support for working with grayscale images of 1, 2, or 4 bits per pixel
New routines for converting images between different formats and transforming image contents
Support for biquad (IIR) operations
For information about the new interfaces, see the framework header files. For general information about the functions and types of this framework, see Accelerate Framework Reference.
Objective-C
The Objective-C programming language has been enhanced to support modules, which yield faster builds and shorter project indexing times. Module support is enabled in all new projects created using Xcode 5. If you have an existing project, you must enable this support explicitly by modifying the project's Enable Modules build setting.
Deprecated APIs
From time to time, Apple adds deprecation macros to APIs to indicate that those APIs should no longer be used in active development. A deprecation is not an immediate end of life for the specified API. Instead, it is the beginning of a grace period for transitioning off that API and onto newer and more modern replacements. Deprecated APIs typically remain present and usable in the system for a reasonable amount of time past the release in which they were deprecated. However, active development on them ceases, and the APIs receive only minor changes to accommodate security patches or fix other critical bugs. Deprecated APIs may be removed entirely from a future version of the operating system.
As a developer, stop using deprecated APIs in your code as soon as possible. At a minimum, new code you write should never use deprecated APIs, and existing code that uses them should be updated promptly. Fortunately, the compiler generates a warning whenever it spots the use of a deprecated API, and you can use those warnings to track down and remove all references to those APIs.
This release includes deprecations in the following technology areas:
The Map Kit framework includes deprecations for the MKOverlayView class and its various subclasses. The existing overlay views have been replaced with an updated set of overlay renderer objects that descend from the MKOverlayRenderer class. For more information about the classes of this framework, see Map Kit Framework Reference.
The Audio Session API in the Audio Toolbox framework is deprecated. Apps should use the AVAudioSession class in the AV Foundation framework instead.
The CLRegion class in the Core Location framework is replaced by the CLCircularRegion class. The CLRegion class continues to exist as an abstract base class that supports both geographic and beacon regions.
The Game Kit framework contains assorted deprecations intended to clean up the existing API and provide better support for new features.
The UIKit framework contains the following deprecations:
The wantsFullScreenLayout property of UIViewController is deprecated. In iOS 7 and later, view controllers always support full-screen layout.
The UIColor objects that provided background textures for earlier versions of iOS are gone.
Many drawing additions to the NSString class are deprecated in favor of newer variants.
The gethostuuid function in the libsyscall library is deprecated.
In iOS 7 and later, if you ask for the MAC address of an iOS device, the system returns the value 02:00:00:00:00:00. If you need to identify the device, use the identifierForVendor property of the UIDevice class instead. (Apps that need an identifier for their own advertising purposes should consider using the advertisingIdentifier property of the ASIdentifierManager class instead.)
For a complete list of specific API deprecations, see iOS 7.0 API Diffs.