Presenting Chapter Markers

Add chapter markers to enable users to quickly navigate your content.


Chapter markers enable users to quickly navigate your content. AVPlayerViewController in tvOS and macOS automatically presents a chapter-selection interface if markers are found in the currently played asset. You can also directly retrieve this data whenever you want to create your own custom chapter-selection interface.

Retrieve the Timed Metadata

Chapter markers are a type of timed metadata, which applies only to a range of time within the asset's timeline. You retrieve an asset's chapter metadata using either the chapterMetadataGroups(bestMatchingPreferredLanguages:) or chapterMetadataGroups(withTitleLocale:containingItemsWithCommonKeys:) method. Before calling either method, asynchronously load the value of the asset's availableChapterLocales key so the call can return without blocking.

let asset = AVAsset(url: <# Asset URL #>)
let chapterLocalesKey = "availableChapterLocales"
asset.loadValuesAsynchronously(forKeys: [chapterLocalesKey]) {
    var error: NSError?
    let status = asset.statusOfValue(forKey: chapterLocalesKey, error: &error)
    if status == .loaded {
        let languages = Locale.preferredLanguages
        let chapterMetadata = asset.chapterMetadataGroups(bestMatchingPreferredLanguages: languages)
        // Process chapter metadata.
    } else {
        // Handle other status cases.
    }
}
Convert Timed Metadata into Chapter Data

The value returned from these methods is an array of AVTimedMetadataGroup objects, each representing an individual chapter marker. An AVTimedMetadataGroup object is composed of a CMTimeRange, which defines the time range to which its metadata applies, and an array of AVMetadataItem objects representing the chapter's title and, optionally, its thumbnail image. The following example shows how to convert the AVTimedMetadataGroup data into an array of custom model objects, called Chapter, to be presented in the app's view layer.

func convertTimedMetadataGroupsToChapters(groups: [AVTimedMetadataGroup]) -> [Chapter] {
    return groups.map { group -> Chapter in
        // Retrieve the title metadata items.
        let titleItems = AVMetadataItem.metadataItems(from: group.items,
                                                      filteredByIdentifier: .commonIdentifierTitle)
        // Retrieve the artwork metadata items.
        let artworkItems = AVMetadataItem.metadataItems(from: group.items,
                                                        filteredByIdentifier: .commonIdentifierArtwork)

        var title = "Default Title"
        var image = UIImage(named: "placeholder")!

        if let titleValue = titleItems.first?.stringValue {
            title = titleValue
        }
        if let imgData = artworkItems.first?.dataValue, let imageValue = UIImage(data: imgData) {
            image = imageValue
        }

        return Chapter(time: group.timeRange.start, title: title, image: image)
    }
}

With the relevant data converted, you can build a chapter-selection interface, and use each chapter object's time value to seek the current presentation with the player's seek(to:) method.
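One detail a custom interface often needs is the reverse mapping: given the player's current playback time, which chapter is active (for example, to highlight it in a list)? The helper below is a hypothetical sketch of that lookup, not part of the AVFoundation API; for brevity it works with start times expressed in seconds (as you might obtain from each chapter's CMTime via CMTimeGetSeconds) rather than with CMTime values directly.

```swift
/// Hypothetical helper: returns the index of the chapter that contains
/// the given playback time, or nil if the time precedes the first chapter.
/// Assumes `startTimes` holds each chapter's start time in seconds,
/// sorted in ascending order.
func chapterIndex(for time: Double, startTimes: [Double]) -> Int? {
    // The active chapter is the last one whose start time
    // is at or before the current playback time.
    return startTimes.lastIndex { $0 <= time }
}
```

To jump to a chapter the user selects, you would pass that chapter's stored CMTime to the player, for example `player.seek(to: chapter.time)`.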

See Also

Using Chapter Metadata

var availableChapterLocales: [Locale]

The locales available for chapter metadata in the asset.

func chapterMetadataGroups(withTitleLocale: Locale, containingItemsWithCommonKeys: [AVMetadataKey]?) -> [AVTimedMetadataGroup]

Returns an array of chapters with a given title locale and containing specified keys.

func chapterMetadataGroups(bestMatchingPreferredLanguages: [String]) -> [AVTimedMetadataGroup]

Returns an array of chapters whose locale best matches the list of preferred languages.