AV Foundation Release Notes for iOS 4.3
This article summarizes some of the new features and changes in functionality in AV Foundation in iOS 4.3.
Contents:
Determining Whether an Operation Can Be Performed on an Asset
Enhancements for HTTP Live Streaming
Duration of Timed Media Resources for Playback
Determining Whether an Item Has Played Successfully
Access to Chapter Metadata
Attempting to Run an AVCaptureSession in the Background

Determining Whether an Operation Can Be Performed on an Asset
Four new properties have been defined on AVAsset to help you determine whether the operations supported by AV Foundation can be performed on a particular asset:

- playable indicates whether an AVPlayerItem initialized with the asset can be played by an AVPlayer.
- exportable indicates whether an AVAssetExportSession initialized with the asset can produce an output file.
- readable indicates whether an AVAssetReader initialized with the asset can provide media derived or extracted from the asset.
- composable indicates whether the asset or any of its AVAssetTrack objects can be inserted into an AVMutableComposition.
Note that the value of each of these properties is YES even if the associated operation is only conditionally supported. Examples:

- playable is YES even if the asset has protected content and requires authorization of both the application and the content for playback. You can determine whether an asset has protected content via hasProtectedContent (AVAsset).
- exportable is YES even if only some of the export presets are compatible with the asset. You can obtain an array of export presets that can be used with an asset via exportPresetsCompatibleWithAsset: (AVAssetExportSession).
- readable is YES even if only some classes of AVAssetReaderOutput or only some configurations of settings on the outputs can be used with the asset. You can determine whether a particular instance of AVAssetReaderOutput with its settings can be used by invoking canAddOutput:.
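Because determining these values may require I/O, it is generally best to load the corresponding keys asynchronously before branching on them. The following is a minimal sketch of that pattern; the file URL is illustrative, and the comments stand in for application-specific handling.

#import <AVFoundation/AVFoundation.h>

// Load the capability-related keys asynchronously before using them.
NSURL *assetURL = [NSURL fileURLWithPath:@"/path/to/movie.m4v"];   // illustrative URL
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:assetURL options:nil];

NSArray *keys = [NSArray arrayWithObjects:@"playable", @"exportable", nil];
[asset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
    NSError *error = nil;
    if ([asset statusOfValueForKey:@"playable" error:&error] == AVKeyValueStatusLoaded && [asset isPlayable]) {
        // Safe to create an AVPlayerItem with this asset.
    }
    if ([asset statusOfValueForKey:@"exportable" error:&error] == AVKeyValueStatusLoaded && [asset isExportable]) {
        // Safe to create an AVAssetExportSession; use
        // +exportPresetsCompatibleWithAsset: to find usable presets.
    }
}];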
Enhancements for HTTP Live Streaming
The inspection features of AVURLAsset have been enhanced to handle HTTP Live Streaming Media resources. For this reason, starting with iOS 4.3 you can prepare any asset for playback in a uniform way, according to the best practices originally outlined for file-based assets in the AVFoundation Programming Guide. Those steps are as follows:

1. Create an asset using AVURLAsset and load its tracks using loadValuesAsynchronouslyForKeys:completionHandler:.
2. When the asset has loaded its tracks, create an instance of AVPlayerItem using the asset.
3. Associate the item with an instance of AVPlayer.
4. Wait until the item's status indicates that it's ready to play. Typically you use key-value observing to receive a notification when the status changes.
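For example, a minimal sketch of these steps might look like the following. The URL is illustrative, and the code is assumed to run in a controller object that declares an AVPlayer property named player; the same code applies whether the URL references a file-based resource or HTTP Live Streaming Media.

#import <AVFoundation/AVFoundation.h>

static void *ItemStatusContext = &ItemStatusContext;   // assumed KVO context, declared at file scope

// 1. Create the asset and load its tracks asynchronously.
NSURL *url = [NSURL URLWithString:@"http://example.com/stream/prog_index.m3u8"];   // illustrative URL
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];

[asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        NSError *error = nil;
        if ([asset statusOfValueForKey:@"tracks" error:&error] != AVKeyValueStatusLoaded) {
            // Handle the failure described by error.
            return;
        }
        // 2. Create a player item from the asset.
        AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];

        // 4. Observe the item's status to learn when it becomes ready to play.
        [playerItem addObserver:self forKeyPath:@"status"
                        options:NSKeyValueObservingOptionNew context:ItemStatusContext];

        // 3. Associate the item with a player.
        self.player = [AVPlayer playerWithPlayerItem:playerItem];
    });
}];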
While you can still prepare stream-based assets for playback according to the steps described specifically for them in the AVFoundation Programming Guide, starting with iOS 4.3 it’s no longer necessary to do so and therefore no longer necessary for you to determine whether a URL references HTTP Live Streaming Media.
Note that AVURLAsset provides information about the persistent state of a timed media resource. Because of the dynamic nature of HTTP Live Streaming Media, the duration of the media and the specifics of the tracks available can change during playback. Therefore URL assets initialized with URLs that reference HTTP Live Streaming Media may have values for their duration and tracks properties that are different from the values of the duration and tracks properties of the AVPlayerItem objects that play them. In particular, the duration reported by the URL asset for streaming-based media is typically kCMTimeIndefinite, while the duration of a corresponding AVPlayerItem may be different and may change while it plays. Similarly, the array of AVAssetTrack objects available via the tracks property of a URL asset is typically empty for streaming-based media, while the array of AVPlayerItemTrack objects available via the tracks property of the corresponding player item may have a different count and may change while it plays. If you need to, you can observe both the duration and tracks keys of AVPlayerItem to remain in sync with the current state of playback.
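A minimal sketch of such observation follows; the setup method name is illustrative, and both methods are assumed to belong to the controller object that owns the player item.

// Observe the item's duration and tracks so that the user interface stays in
// sync with the current state of streaming playback.
- (void)beginObservingStreamingItem:(AVPlayerItem *)playerItem
{
    [playerItem addObserver:self forKeyPath:@"duration" options:0 context:NULL];
    [playerItem addObserver:self forKeyPath:@"tracks" options:0 context:NULL];
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context
{
    if ([keyPath isEqualToString:@"duration"]) {
        // The duration of streaming media may change during playback; refresh
        // duration-dependent UI using [(AVPlayerItem *)object duration].
    }
    else if ([keyPath isEqualToString:@"tracks"]) {
        // The set of AVPlayerItemTrack objects may also change; refresh track-dependent UI.
    }
    else {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
}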
AVPlayerItem has been enhanced to provide access and error information during HTTP Live Streaming playback. A network access log is available via accessLog. A log of error information is available via errorLog.
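For diagnostic purposes, these logs can be rendered as text, for example as in the following sketch (which assumes manual reference counting, as was standard in the iOS 4.3 era):

AVPlayerItemAccessLog *accessLog = [playerItem accessLog];
if (accessLog) {
    // Convert the access log to a string using its declared string encoding.
    NSString *accessLogText = [[[NSString alloc] initWithData:[accessLog extendedLogData]
                                                     encoding:[accessLog extendedLogDataStringEncoding]] autorelease];
    NSLog(@"Access log:\n%@", accessLogText);
}

AVPlayerItemErrorLog *errorLog = [playerItem errorLog];
if (errorLog) {
    NSString *errorLogText = [[[NSString alloc] initWithData:[errorLog extendedLogData]
                                                    encoding:[errorLog extendedLogDataStringEncoding]] autorelease];
    NSLog(@"Error log:\n%@", errorLogText);
}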
Duration of Timed Media Resources for Playback
Because of the dynamic nature of HTTP Live Streaming Media, our best practice for obtaining the duration of an AVPlayerItem object has changed in iOS 4.3. Prior to iOS 4.3, you would obtain the duration of a player item by fetching the value of the duration property of its associated AVAsset object. As noted above, however, for HTTP Live Streaming Media the duration of a player item during any particular playback session may differ from the duration of its asset. For this reason a new key-value observable duration property has been defined on AVPlayerItem.

To make your code compatible with all available revisions of AV Foundation, you can check whether the duration property of an AVPlayerItem instance is available and obtain the duration for playback as follows:
CMTime itemDuration = kCMTimeInvalid;

// Once the AVPlayerItem becomes ready to play, i.e. [playerItem status] == AVPlayerItemStatusReadyToPlay,
// its duration can be fetched from the item as follows.

if ([AVPlayerItem instancesRespondToSelector:@selector(duration)]) {
    // Fetch the duration directly from the AVPlayerItem.
    itemDuration = [playerItem duration];
}
else {
    // Reach through the AVPlayerItem to its asset to get the duration.
    itemDuration = [[playerItem asset] duration];
}
Determining Whether an Item Has Played Successfully
AVPlayerItem posts the notification AVPlayerItemDidPlayToEndTimeNotification when it successfully reaches its end time during normal playback. Starting with iOS 4.3, AVPlayerItem will post the notification AVPlayerItemFailedToPlayToEndTimeNotification if playback is interrupted by an unrecoverable error. The userInfo dictionary of the notification will contain an NSError object describing the problem, which can be obtained by using the key AVPlayerItemFailedToPlayToEndTimeErrorKey. For example, if the underlying timed media resource contains corrupted data associated with a particular playback time that prevents playback from proceeding beyond that time, AVPlayerItem will post AVPlayerItemFailedToPlayToEndTimeNotification.
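A sketch of handling this notification follows; the method and selector names are illustrative and are assumed to be implemented by the object that manages playback.

// Register for the failure notification for a specific player item.
- (void)beginObservingPlaybackFailureForItem:(AVPlayerItem *)playerItem
{
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(playerItemFailedToPlayToEndTime:)
                                                 name:AVPlayerItemFailedToPlayToEndTimeNotification
                                               object:playerItem];
}

- (void)playerItemFailedToPlayToEndTime:(NSNotification *)notification
{
    // The userInfo dictionary carries an NSError describing the failure.
    NSError *error = [[notification userInfo] objectForKey:AVPlayerItemFailedToPlayToEndTimeErrorKey];
    NSLog(@"Playback failed to reach the end time: %@", [error localizedDescription]);
}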
Access to Chapter Metadata
In iOS 4.3, AVAsset has been enhanced to provide metadata information associated with chapters, including chapter titles and chapter images. The class AVTimedMetadataGroup has been defined to organize collections of metadata items by time range; each chapter of an asset will be represented by a corresponding instance of AVTimedMetadataGroup.
To load chapter information for an asset, request the value for the key availableChapterLocales in a call to loadValuesAsynchronouslyForKeys:completionHandler:. Once the value of availableChapterLocales has been successfully loaded, you can obtain the chapter information for any specific locale via chapterMetadataGroupsWithTitleLocale:containingItemsWithCommonKeys:. If you wish to obtain chapter images along with chapter titles, include the metadata key AVMetadataCommonKeyArtwork in the array of common keys that you specify. This method will return an array of instances of AVTimedMetadataGroup, one per chapter.
You can obtain the time range of a chapter via timeRange. The metadata information associated with the chapter, typically including an item that has the common metadata key AVMetadataCommonKeyTitle, is available via items, which returns an array of instances of AVMetadataItem.
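Putting these pieces together, a sketch of loading chapter groups and reading each chapter's title might look like the following; the choice of locale and the logging are illustrative, and asset is assumed to have been created earlier.

// Load the chapter locales, then fetch one AVTimedMetadataGroup per chapter.
NSArray *keys = [NSArray arrayWithObject:@"availableChapterLocales"];
[asset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
    NSError *error = nil;
    if ([asset statusOfValueForKey:@"availableChapterLocales" error:&error] != AVKeyValueStatusLoaded) {
        // Handle the failure described by error.
        return;
    }

    NSLocale *locale = [[asset availableChapterLocales] lastObject];   // pick a locale appropriate for your UI
    NSArray *commonKeys = [NSArray arrayWithObject:AVMetadataCommonKeyArtwork];   // also request chapter images
    NSArray *chapterGroups = [asset chapterMetadataGroupsWithTitleLocale:locale
                                          containingItemsWithCommonKeys:commonKeys];

    for (AVTimedMetadataGroup *group in chapterGroups) {
        CMTimeRange chapterRange = [group timeRange];
        for (AVMetadataItem *item in [group items]) {
            if ([[item commonKey] isEqualToString:AVMetadataCommonKeyTitle]) {
                // Reading an item's value may require additional I/O; see the
                // asynchronous loading discussion below.
                NSLog(@"Chapter starting at %.1fs: %@",
                      CMTimeGetSeconds(chapterRange.start), [item stringValue]);
            }
        }
    }
}];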
Note that some metadata items included in an AVTimedMetadataGroup may not have a time and duration that correspond exactly with the timeRange of the group. This can occur, for example, with a QuickTime movie file that has a chapter image track with times that do not align precisely with the times of its chapter text track. So that all information for a group's timeRange is accessible to you, all items that overlap in time with the timeRange of the group will be included in the group, and you can decide which item to use at any particular time according to the included items' times and durations.

Because of the way chapter information can be stored within timed media resources, additional I/O may be required to obtain the value of an AVMetadataItem included in an AVTimedMetadataGroup. To avoid potentially lengthy blocking, you can load the values of AVMetadataItem objects asynchronously using the same AVAsynchronousKeyValueLoading protocol already supported by AVAsset and AVAssetTrack. Call loadValuesAsynchronouslyForKeys:completionHandler: and include the key @"value" in the specified array of keys to trigger the loading of a value.
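For example, a brief sketch of loading the value of a chapter artwork item asynchronously follows; metadataItem is assumed to have been obtained from a chapter group as above, and its artwork value is assumed to be image data that UIImage can decode.

#import <UIKit/UIKit.h>

// Load the item's value asynchronously before reading it.
[metadataItem loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"value"] completionHandler:^{
    NSError *error = nil;
    if ([metadataItem statusOfValueForKey:@"value" error:&error] == AVKeyValueStatusLoaded) {
        if ([[metadataItem commonKey] isEqualToString:AVMetadataCommonKeyArtwork]) {
            // The exact form of artwork data can vary by file format; here it is
            // assumed to be decodable image data.
            UIImage *chapterImage = [UIImage imageWithData:[metadataItem dataValue]];
            // Use chapterImage, for example as a thumbnail for the chapter.
        }
    }
}];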
Attempting to Run an AVCaptureSession in the Background
AV Foundation does not support running an AVCaptureSession in the background. Starting with iOS 4.3, if you attempt to run a capture session in the background, the session will post the notification AVCaptureSessionRuntimeErrorNotification with a payload that contains an NSError object with the error code AVErrorDeviceIsNotAvailableInBackground.
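A sketch of detecting this condition follows; the method and selector names are illustrative, and the methods are assumed to belong to the object that owns the capture session.

// Register for runtime error notifications from the capture session.
- (void)beginObservingCaptureSession:(AVCaptureSession *)session
{
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(captureSessionRuntimeError:)
                                                 name:AVCaptureSessionRuntimeErrorNotification
                                               object:session];
}

- (void)captureSessionRuntimeError:(NSNotification *)notification
{
    NSError *error = [[notification userInfo] objectForKey:AVCaptureSessionErrorKey];
    if ([error code] == AVErrorDeviceIsNotAvailableInBackground) {
        // The session was started while the application was in the background;
        // stop it and start it again when the application becomes active.
    }
}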
Copyright © 2018 Apple Inc. All rights reserved. Updated: 2011-10-12