Framework

AVFoundation

Record, edit, and play audio and video; configure your audio session; and respond to changes in the device audio environment. If necessary, customize the default system behavior that you implement with AVKit.

Overview

The AV Foundation framework provides an Objective-C interface for managing and playing audio-visual media in iOS and macOS applications. To learn more about AV Foundation, see AVFoundation Programming Guide.

Symbols

Classes

AVAsset

AVAsset is an abstract, immutable class used to model timed audiovisual media such as videos and sounds. An asset may contain one or more tracks that are intended to be presented or processed together, each of a uniform media type, including but not limited to audio, video, text, closed captions, and subtitles.
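
For illustration, here is a minimal Swift sketch of inspecting an asset's tracks, assuming a hypothetical local movie path; the "tracks" key is loaded asynchronously before it is read.

    import AVFoundation

    // A minimal sketch, assuming a hypothetical local movie file: load the "tracks"
    // key asynchronously, then count the asset's tracks by media type.
    let asset = AVURLAsset(url: URL(fileURLWithPath: "/path/to/movie.mov"))
    asset.loadValuesAsynchronously(forKeys: ["tracks"]) {
        var error: NSError?
        guard asset.statusOfValue(forKey: "tracks", error: &error) == .loaded else {
            print("Could not load tracks: \(String(describing: error))")
            return
        }
        let videoTracks = asset.tracks(withMediaType: AVMediaTypeVideo)
        let audioTracks = asset.tracks(withMediaType: AVMediaTypeAudio)
        print("Video tracks: \(videoTracks.count), audio tracks: \(audioTracks.count)")
    }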

AVAssetCache

An AVAssetCache is used to inspect the state of an asset’s locally cached media data.

AVAssetDownloadTask

AVAssetDownloadTask is a URLSessionTask subclass used to download HTTP Live Streaming assets. Instances of this class are created using the makeAssetDownloadTask(asset:assetTitle:assetArtworkData:options:) method of AVAssetDownloadURLSession.

AVAssetDownloadURLSession

A subclass of URLSession used to support creating and executing instances of AVAssetDownloadTask.
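
A sketch of how the download session and task fit together; the session identifier, stream URL, and DownloadCoordinator class below are hypothetical.

    import AVFoundation

    // A minimal sketch: schedule a background download of an HLS stream for offline playback.
    class DownloadCoordinator: NSObject, AVAssetDownloadDelegate {
        func urlSession(_ session: URLSession, assetDownloadTask: AVAssetDownloadTask,
                        didFinishDownloadingTo location: URL) {
            // Persist this location to play the downloaded asset later.
            print("Asset stored at \(location)")
        }
    }

    let coordinator = DownloadCoordinator()
    let configuration = URLSessionConfiguration.background(withIdentifier: "com.example.hls-downloads")
    let downloadSession = AVAssetDownloadURLSession(configuration: configuration,
                                                    assetDownloadDelegate: coordinator,
                                                    delegateQueue: OperationQueue.main)
    let hlsAsset = AVURLAsset(url: URL(string: "https://example.com/stream.m3u8")!)
    let task = downloadSession.makeAssetDownloadTask(asset: hlsAsset,
                                                     assetTitle: "Example Stream",
                                                     assetArtworkData: nil,
                                                     options: nil)
    task?.resume()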

AVAssetExportSession

An AVAssetExportSession object transcodes the contents of an AVAsset source object to create an output of the form described by a specified export preset.
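
A sketch of a typical export, assuming hypothetical input and output paths and a built-in preset.

    import AVFoundation

    // A minimal sketch: transcode a source asset to an MPEG-4 file.
    let source = AVURLAsset(url: URL(fileURLWithPath: "/path/to/source.mov"))
    if let export = AVAssetExportSession(asset: source, presetName: AVAssetExportPresetMediumQuality) {
        export.outputURL = URL(fileURLWithPath: "/path/to/output.mp4")
        export.outputFileType = AVFileTypeMPEG4
        export.exportAsynchronously {
            print("Export finished with status: \(export.status.rawValue)")
        }
    }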

AVAssetImageGenerator

An AVAssetImageGenerator object provides thumbnail or preview images of assets independently of playback.
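
For example, a minimal sketch that grabs a thumbnail one second into a hypothetical local movie.

    import AVFoundation

    // A minimal sketch: generate a single thumbnail image at the 1-second mark.
    let asset = AVURLAsset(url: URL(fileURLWithPath: "/path/to/movie.mov"))
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true  // respect the track's orientation

    let time = CMTime(seconds: 1.0, preferredTimescale: 600)
    do {
        let cgImage = try generator.copyCGImage(at: time, actualTime: nil)
        print("Generated a \(cgImage.width)x\(cgImage.height) thumbnail")
    } catch {
        print("Thumbnail generation failed: \(error)")
    }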

AVAssetReader

You use an AVAssetReader object to obtain media data of an asset, whether the asset is file-based or represents an assemblage of media data from multiple sources (as with an AVComposition object).
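
A minimal sketch of the reader workflow together with an AVAssetReaderTrackOutput, assuming a hypothetical local file that contains an audio track.

    import AVFoundation

    // A minimal sketch: pull raw sample buffers from an asset's first audio track.
    let asset = AVURLAsset(url: URL(fileURLWithPath: "/path/to/movie.mov"))
    if let audioTrack = asset.tracks(withMediaType: AVMediaTypeAudio).first,
       let reader = try? AVAssetReader(asset: asset) {
        // Passing nil for outputSettings delivers samples in their stored format.
        let output = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: nil)
        reader.add(output)
        if reader.startReading() {
            while let sampleBuffer = output.copyNextSampleBuffer() {
                print("Read \(CMSampleBufferGetNumSamples(sampleBuffer)) samples")
            }
        }
    }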

AVAssetReaderAudioMixOutput

AVAssetReaderAudioMixOutput is a concrete subclass of AVAssetReaderOutput that defines an interface for reading audio samples that result from mixing the audio from one or more tracks of an AVAssetReader object's asset.

AVAssetReaderOutput

AVAssetReaderOutput is an abstract class that defines an interface for reading a single collection of samples of a common media type from an AVAssetReader object.

AVAssetReaderOutputMetadataAdaptor

The AVAssetReaderOutputMetadataAdaptor class defines an interface for reading metadata, packaged as instances of AVTimedMetadataGroup, from a single AVAssetReaderTrackOutput object.

AVAssetReaderSampleReferenceOutput

AVAssetReaderSampleReferenceOutput is a concrete subclass of the AVAssetReaderOutput class that defines an interface for reading sample references from a single AVAssetTrack of an AVAsset instance contained in an AVAssetReader object.

AVAssetReaderTrackOutput

AVAssetReaderTrackOutput defines an interface for reading media data from a single AVAssetTrack object of an asset reader's asset.

AVAssetReaderVideoCompositionOutput

AVAssetReaderVideoCompositionOutput is a subclass of AVAssetReaderOutput you use to read video frames that have been composited together from the frames in one or more tracks of an AVAssetReader object's asset.

AVAssetResourceLoader

An AVAssetResourceLoader object mediates resource requests from an AVURLAsset object with a delegate object that you provide. When a request arrives, the resource loader asks your delegate if it is able to handle the request and reports the results back to the asset.

AVAssetResourceLoadingContentInformationRequest

The AVAssetResourceLoadingContentInformationRequest class represents a query for essential information about a resource referenced by an asset resource loading request.

AVAssetResourceLoadingDataRequest

Use the AVAssetResourceLoadingDataRequest class to request data from a resource referenced by an AVAssetResourceLoadingRequest instance.

AVAssetResourceLoadingRequest

An AVAssetResourceLoadingRequest object encapsulates information about a resource request issued from a resource loader object.

AVAssetResourceRenewalRequest

The AVAssetResourceRenewalRequest class is a subclass of AVAssetResourceLoadingRequest that encapsulates information about a resource request issued by a resource loader for the purpose of renewing a request previously issued.

AVAssetTrack

An AVAssetTrack object provides the track-level inspection interface for an asset’s media tracks.

AVAssetTrackGroup

The AVAssetTrackGroup class encapsulates a single group of related tracks in an asset.

AVAssetTrackSegment

An AVAssetTrackSegment object represents a segment of an AVAssetTrack object, consisting of a time mapping from the source to the asset track timeline.

AVAssetWriter

You use an AVAssetWriter object to write media data to a new file of a specified audiovisual container type, such as a QuickTime movie file or an MPEG-4 file, with support for automatic interleaving of media data for multiple concurrent tracks.

AVAssetWriterInput

You use an AVAssetWriterInput to append media samples packaged as CMSampleBuffer objects (see CMSampleBuffer), or collections of metadata, to a single track of the output file of an AVAssetWriter object.
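
A condensed sketch of how a writer and one of its inputs fit together, assuming a hypothetical output path; in a real pipeline you would append CMSampleBuffer objects where the comment indicates.

    import AVFoundation

    // A minimal sketch of the setup phase: create a QuickTime movie writer and
    // attach a single H.264 video input.
    let outputURL = URL(fileURLWithPath: "/path/to/output.mov")
    do {
        let writer = try AVAssetWriter(outputURL: outputURL, fileType: AVFileTypeQuickTimeMovie)
        let settings: [String: Any] = [
            AVVideoCodecKey: AVVideoCodecH264,
            AVVideoWidthKey: 1280,
            AVVideoHeightKey: 720
        ]
        let input = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: settings)
        input.expectsMediaDataInRealTime = false
        if writer.canAdd(input) {
            writer.add(input)
        }
        if writer.startWriting() {
            writer.startSession(atSourceTime: kCMTimeZero)
            // Append CMSampleBuffer objects via input.append(_:) here, then finish:
            input.markAsFinished()
            writer.finishWriting {
                print("Writer finished with status \(writer.status.rawValue)")
            }
        }
    } catch {
        print("Could not create writer: \(error)")
    }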

AVAssetWriterInputGroup

The AVAssetWriterInputGroup class associates tracks corresponding to inputs with each other in a mutually exclusive relationship.

AVAssetWriterInputMetadataAdaptor

The AVAssetWriterInputMetadataAdaptor class defines an interface for writing metadata packaged as instances of AVTimedMetadataGroup to a single AVAssetWriterInput object.

AVAssetWriterInputPassDescription

The AVAssetWriterInputPassDescription class defines an interface for querying information about the requirements of the current pass, such as the time ranges of media data to append.

AVAssetWriterInputPixelBufferAdaptor

You use an AVAssetWriterInputPixelBufferAdaptor to append video samples packaged as CVPixelBuffer objects to a single AVAssetWriterInput object.

AVAsynchronousCIImageFilteringRequest

An AVAsynchronousCIImageFilteringRequest object provides for using Core Image filters to process an individual video frame in a video composition (an AVVideoComposition or AVMutableVideoComposition object).

AVAsynchronousVideoCompositionRequest

An AVAsynchronousVideoCompositionRequest instance contains the information necessary for a video compositor to render an output pixel buffer.

AVAudioBuffer

The AVAudioBuffer class represents a buffer of audio data and its format.

AVAudioChannelLayout

The AVAudioChannelLayout class describes the roles of a set of audio channels.

AVAudioCompressedBuffer
AVAudioConnectionPoint
AVAudioConverter
AVAudioEngine

The AVAudioEngine class defines a group of connected AVAudioNode objects, known as audio nodes. You use audio nodes to generate audio signals, process them, and perform audio input and output.
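
A minimal sketch of an engine graph, assuming a hypothetical audio file: a player node is attached, connected to the main mixer, and scheduled for playback.

    import AVFoundation

    // A minimal sketch: play a file through an AVAudioEngine graph.
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    engine.attach(player)

    do {
        let file = try AVAudioFile(forReading: URL(fileURLWithPath: "/path/to/loop.caf"))
        engine.connect(player, to: engine.mainMixerNode, format: file.processingFormat)
        try engine.start()
        player.scheduleFile(file, at: nil, completionHandler: nil)
        player.play()
    } catch {
        print("Engine setup failed: \(error)")
    }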

AVAudioEnvironmentDistanceAttenuationParameters

The AVAudioEnvironmentDistanceAttenuationParameters class specifies the distance attenuation characteristics applied to a source: the gradual loss in audio intensity as the distance between the source and the listener increases.

AVAudioEnvironmentNode

The AVAudioEnvironmentNode class is a mixer node that simulates a 3D audio environment. Any node that conforms to the AVAudioMixing protocol (for example, AVAudioPlayerNode) can act as a source in this environment.

AVAudioEnvironmentReverbParameters

The AVAudioEnvironmentReverbParameters class encapsulates the parameters that you use to control the reverb of the AVAudioEnvironmentNode class.

AVAudioFile

The AVAudioFile class represents an audio file that can be opened for reading or writing.

AVAudioFormat

The AVAudioFormat class wraps a Core Audio AudioStreamBasicDescription struct, with convenience initializers and accessors for common formats, including Core Audio’s standard deinterleaved 32-bit floating point format.

AVAudioIONode

The AVAudioIONode class is the base class for nodes that connect to the system's audio input or output.

AVAudioInputNode

The AVAudioInputNode class represents a node that connects to the system's audio input.

AVAudioMix

An AVAudioMix object manages the input parameters for mixing audio tracks. It allows custom audio processing to be performed on audio tracks during playback or other operations.

AVAudioMixInputParameters

An AVAudioMixInputParameters object represents the parameters that should be applied to an audio track when it is added to a mix.

AVAudioMixerNode

The AVAudioMixerNode class represents a node that mixes its inputs to a single output.

AVAudioMixingDestination
AVAudioNode

The AVAudioNode class is an abstract class for an audio generation, processing, or I/O block.

AVAudioOutputNode

The AVAudioOutputNode class represents an audio node that connects to the system's audio output.

AVAudioPCMBuffer

The AVAudioPCMBuffer class is a subclass of AVAudioBuffer for use with PCM audio formats.

AVAudioPlayer

An instance of the AVAudioPlayer class, called an audio player, provides playback of audio data from a file or memory.
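
For example, a minimal playback sketch assuming a hypothetical sound file URL.

    import AVFoundation

    // A minimal sketch: play a local sound file once at 80% volume.
    do {
        let soundURL = URL(fileURLWithPath: "/path/to/sound.m4a")
        let audioPlayer = try AVAudioPlayer(contentsOf: soundURL)
        audioPlayer.numberOfLoops = 0   // play once
        audioPlayer.volume = 0.8
        audioPlayer.play()
        // Keep a strong reference to audioPlayer for the duration of playback.
    } catch {
        print("Playback failed: \(error)")
    }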

AVAudioPlayerNode

The AVAudioPlayerNode class plays buffers or segments of audio files.

AVAudioRecorder

An instance of the AVAudioRecorder class, called an audio recorder, provides audio recording capability in your application. Using an audio recorder you can record until the user stops the recording or for a specified duration, pause and resume a recording, and obtain input audio-level data that you can use for level metering.
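
A minimal recording sketch, assuming a hypothetical output URL and that microphone permission and an active audio session are already in place.

    import AVFoundation

    // A minimal sketch: record up to ten seconds of mono AAC audio with metering enabled.
    let recordingURL = URL(fileURLWithPath: "/path/to/recording.m4a")
    let settings: [String: Any] = [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVSampleRateKey: 44100.0,
        AVNumberOfChannelsKey: 1
    ]
    do {
        let recorder = try AVAudioRecorder(url: recordingURL, settings: settings)
        recorder.isMeteringEnabled = true
        recorder.record(forDuration: 10)
    } catch {
        print("Could not start recording: \(error)")
    }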

AVAudioSequencer
AVAudioSession

An audio session is a singleton object that you employ to set the audio context for your app and to express to the system your intentions for your app’s audio behavior.
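
For example, a minimal sketch that expresses playback intent and activates the session.

    import AVFoundation

    // A minimal sketch: tell the system this app plays audio, then activate the session.
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(AVAudioSessionCategoryPlayback)
        try session.setActive(true)
    } catch {
        print("Audio session configuration failed: \(error)")
    }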

AVAudioSessionChannelDescription

The AVAudioSessionChannelDescription class provides descriptive information about a hardware channel on the current device. You typically do not create instances of this class yourself but can retrieve them from the port AVAudioSessionPortDescription object used to reference the intended input or output port.

AVAudioSessionDataSourceDescription

The AVAudioSessionDataSourceDescription class defines a data source for an audio input or output, providing information such as the source’s name, location and orientation.

AVAudioSessionPortDescription

An AVAudioSessionPortDescription object describes a single input or output port associated with an audio route. You can use the information in this class to obtain information about the capabilities of the port and the hardware channels it supports.

AVAudioSessionRouteDescription

An AVAudioSessionRouteDescription manages the input and output ports associated with the current audio route for a session.

AVAudioTime

The AVAudioTime class is used by AVAudioEngine to represent time. Instances of the class are immutable.

AVAudioUnit

The AVAudioUnit class is a subclass of the AVAudioNode class that, depending on the type of the audio unit, processes audio either in real-time or non real-time.

AVAudioUnitComponent

The AVAudioUnitComponent class provides details about an audio unit, such as its type, subtype, manufacturer, and location. User tags can be added to the AVAudioUnitComponent and queried later for display.

AVAudioUnitComponentManager

The AVAudioUnitComponentManager class is a singleton object that provides a way to find audio components that are registered with the system. It provides methods to search and query various information about the audio components without opening them. Currently, only audio components that are audio units can be searched.

AVAudioUnitDelay

The AVAudioUnitDelay class is an AVAudioUnitEffect subclass that implements a delay effect.

AVAudioUnitDistortion

The AVAudioUnitDistortion class is an AVAudioUnitEffect subclass that implements a multi-stage distortion effect.

AVAudioUnitEQ

The AVAudioUnitEQ class is an AVAudioUnitEffect subclass that implements a multi-band equalizer.

AVAudioUnitEQFilterParameters

The AVAudioUnitEQFilterParameters class encapsulates the parameters used by an AVAudioUnitEQ instance.

AVAudioUnitEffect

The AVAudioUnitEffect class is an AVAudioUnit subclass that processes audio in real-time using audio units of type effect, music effect, panner, remote effect, or remote music effect. These effects run in real-time and process some number of audio input samples to produce a number of audio output samples. A delay unit is an example of an effect unit.

AVAudioUnitGenerator

The AVAudioUnitGenerator is an AVAudioUnit subclass that generates audio output.

AVAudioUnitMIDIInstrument

The AVAudioUnitMIDIInstrument class is an abstract class representing music devices or remote instruments.

AVAudioUnitReverb

The AVAudioUnitReverb class is an AVAudioUnitEffect subclass that implements a reverb effect.

AVAudioUnitSampler

The AVAudioUnitSampler class encapsulates Apple's Sampler Audio Unit. The sampler audio unit can be configured by loading different types of instruments such as an “.aupreset” file, a DLS or SF2 sound bank, an EXS24 instrument, a single audio file or with an array of audio files. The output is a single stereo bus.

AVAudioUnitTimeEffect

The AVAudioUnitTimeEffect class is an AVAudioUnit subclass that processes audio in non-realtime.

AVAudioUnitTimePitch

The AVAudioUnitTimePitch class is an AVAudioUnitTimeEffect subclass that provides good quality playback rate and pitch shifting independent of each other.

AVAudioUnitVarispeed

The AVAudioUnitVarispeed class is an AVAudioUnitTimeEffect subclass that allows control of the playback rate.

AVCaptureAudioChannel

You use an AVCaptureAudioChannel to monitor the average and peak power levels in an audio channel in a capture connection (see AVCaptureConnection).

AVCaptureAudioDataOutput

AVCaptureAudioDataOutput is a concrete sub-class of AVCaptureOutput that you use, via its delegate, to process audio sample buffers from the audio being captured.

AVCaptureAudioFileOutput

AVCaptureAudioFileOutput is a concrete sub-class of AVCaptureFileOutput that writes captured audio to any audio file type supported by Core Audio.

AVCaptureAudioPreviewOutput

AVCaptureAudioPreviewOutput is a concrete subclass of AVCaptureOutput that you use to preview audio being captured.

AVCaptureAutoExposureBracketedStillImageSettings

The AVCaptureAutoExposureBracketedStillImageSettings class is a concrete subclass of the AVCaptureBracketedStillImageSettings class that is used when bracketing exposure target bias.

AVCaptureBracketedStillImageSettings

AVCaptureBracketedStillImageSettings is an abstract class that defines an interface for settings pertaining to a bracketed capture.

AVCaptureConnection

An AVCaptureConnection object represents a connection between capture input and capture output objects associated with a capture session.

AVCaptureDevice

An AVCaptureDevice object represents a physical capture device and the properties associated with that device. You use a capture device to configure the properties of the underlying hardware. A capture device also provides input data (such as audio or video) to an AVCaptureSession object.

AVCaptureDeviceDiscoverySession

A query for finding and monitoring available capture devices.

AVCaptureDeviceFormat

An AVCaptureDeviceFormat object provides information about a media capture format for use with an AVCaptureDevice instance, such as video frame rates and zoom factors.

AVCaptureDeviceInput

AVCaptureDeviceInput is a concrete sub-class of AVCaptureInput you use to capture data from an AVCaptureDevice object.

AVCaptureDeviceInputSource

An AVCaptureDeviceInputSource object represents a distinct input source on an AVCaptureDevice object.

AVCaptureFileOutput
AVCaptureInput

AVCaptureInput is an abstract base-class describing an input data source to an AVCaptureSession object.

AVCaptureInputPort

An AVCaptureInputPort represents a stream of data from a capture input.

AVCaptureManualExposureBracketedStillImageSettings

The AVCaptureManualExposureBracketedStillImageSettings class is a concrete subclass of the AVCaptureBracketedStillImageSettings class used when bracketing exposure duration and ISO.

AVCaptureMetadataInput
AVCaptureMetadataOutput

An AVCaptureMetadataOutput object intercepts metadata objects emitted by its associated capture connection and forwards them to a delegate object for processing. You can use instances of this class to process specific types of metadata included with the input data. You use this class the way you do other output objects, typically by adding it as an output to an AVCaptureSession object.

AVCaptureMovieFileOutput

AVCaptureMovieFileOutput is a concrete sub-class of AVCaptureFileOutput you use to capture data to a QuickTime movie.

AVCaptureOutput

AVCaptureOutput is an abstract base class describing an output destination of an AVCaptureSession object.

AVCapturePhotoBracketSettings

An AVCapturePhotoBracketSettings object describes desired features and settings for a photo capture request that involves capturing multiple images together with varied settings. To take a bracketed capture, you create and configure an AVCapturePhotoBracketSettings object, using AVCaptureBracketedStillImageSettings objects to describe the individual captures in the bracket, and then pass it to the AVCapturePhotoOutput capturePhoto(with:delegate:) method.

AVCapturePhotoOutput

AVCapturePhotoOutput is a concrete subclass of AVCaptureOutput that provides a modern interface for most capture workflows related to still photography. In addition to basic capture of still images, a photo output supports RAW-format capture, bracketed capture of multiple images, Live Photos, and wide-gamut color. You can have images delivered in RAW format, a compressed format such as JPEG, or both. You can also enable automatic delivery of preview-sized images in addition to a main image. In addition, the AVCapturePhotoOutput class can format captured photos for output in the JPEG/JFIF and DNG file formats.

AVCapturePhotoSettings

An AVCapturePhotoSettings instance is a mutable object describing all the desired features and settings for a single photo capture request. To take a photo, you create and configure an AVCapturePhotoSettings object, then pass it to the AVCapturePhotoOutput capturePhoto(with:delegate:) method.
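
A minimal sketch of that workflow; the takePhoto(with:delegate:) helper below is hypothetical, and the output is assumed to belong to a running capture session.

    import AVFoundation

    // A minimal sketch: configure per-capture settings and request a photo.
    func takePhoto(with output: AVCapturePhotoOutput, delegate: AVCapturePhotoCaptureDelegate) {
        let settings = AVCapturePhotoSettings()
        settings.flashMode = .auto
        settings.isAutoStillImageStabilizationEnabled = true
        output.capturePhoto(with: settings, delegate: delegate)
    }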

AVCaptureResolvedPhotoSettings

An AVCaptureResolvedPhotoSettings object provides an immutable description of the photo settings for a photo capture request that is either in progress or has completed. When you request a photo capture using the AVCapturePhotoOutput capturePhoto(with:delegate:) method, you describe the settings for that capture request in an AVCapturePhotoSettings object. When the capture begins, the photo output calls your delegate methods and provides an AVCaptureResolvedPhotoSettings object detailing the settings that are in effect for that capture.

AVCaptureScreenInput

AVCaptureScreenInput is a concrete subclass of AVCaptureInput that provides an interface for capturing media from a screen or a portion of a screen.

AVCaptureSession

You use an AVCaptureSession object to coordinate the flow of data from AV input devices to outputs.
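
A minimal sketch of wiring a session, assuming camera permission has been granted; the makeRecordingSession() helper is hypothetical.

    import AVFoundation

    // A minimal sketch: route video from the default camera into a movie file output.
    func makeRecordingSession() -> AVCaptureSession? {
        let session = AVCaptureSession()
        session.sessionPreset = AVCaptureSessionPresetHigh

        guard let camera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return nil }
        session.addInput(input)

        let movieOutput = AVCaptureMovieFileOutput()
        guard session.canAddOutput(movieOutput) else { return nil }
        session.addOutput(movieOutput)

        session.startRunning()
        return session
    }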

AVCaptureStillImageOutput

AVCaptureStillImageOutput is a concrete subclass of AVCaptureOutput that you use to capture a high-quality still image with accompanying metadata.

AVCaptureVideoDataOutput

AVCaptureVideoDataOutput is a concrete sub-class of AVCaptureOutput you use to process uncompressed frames from the video being captured, or to access compressed frames.

AVCaptureVideoPreviewLayer

AVCaptureVideoPreviewLayer is a subclass of CALayer that you use to display video as it is being captured by an input device.

AVComposition

An AVComposition object combines media data from multiple file-based sources in a custom temporal arrangement, in order to present or process media data from multiple sources together. All file-based audiovisual assets are eligible to be combined, regardless of container type. The tracks in an AVComposition object are fixed; to change the tracks, you use an instance of its subclass, AVMutableComposition.

AVCompositionTrack

An AVCompositionTrack object provides the low-level representation of a track in an AVComposition object, comprising a media type, a track identifier, and an array of AVCompositionTrackSegment objects, each comprising a URL, a track identifier, and a time mapping.

AVCompositionTrackSegment

An AVCompositionTrackSegment object represents a segment of an AVCompositionTrack object, comprising a URL, a track identifier, and a time mapping from the source track to the composition track.

AVDateRangeMetadataGroup

AVDateRangeMetadataGroup is used to represent a collection of metadata items that are valid for use within a specific range of dates.

AVFragmentedAsset
AVFragmentedAssetMinder
AVFragmentedAssetTrack
AVFragmentedMovie
AVFragmentedMovieMinder
AVFragmentedMovieTrack
AVFrameRateRange

An AVFrameRateRange object expresses a range of valid frame rates as minimum and maximum rate and minimum and maximum duration.

AVMIDIPlayer

The AVMIDIPlayer class is a player for music file formats such as MIDI and iMelody.

AVMediaDataStorage
AVMediaSelection

An AVMediaSelection represents a complete rendition of media selection options on an AVAsset.

AVMediaSelectionGroup

An AVMediaSelectionGroup represents a collection of mutually exclusive options for the presentation of media within an asset.

AVMediaSelectionOption

An AVMediaSelectionOption object represents a specific option for the presentation of media within a group of options.

AVMetadataFaceObject

The AVMetadataFaceObject class is a concrete subclass of AVMetadataObject that defines the features of a single detected face. You can retrieve instances of this class from the output of an AVCaptureMetadataOutput object on devices that support face detection.

AVMetadataGroup

AVMetadataGroup is the common superclass for objects representing a collection of metadata items associated with a segment of a timeline.

AVMetadataItem

An AVMetadataItem object represents an item of metadata associated with an audiovisual asset or with one of its tracks. To create metadata items for your own assets, you use the mutable subclass, AVMutableMetadataItem.

AVMetadataItemFilter

The AVMetadataItemFilter class is used to filter selected information from AVMetadataItem objects.

AVMetadataItemValueRequest

An AVMetadataItemValueRequest is used to respond to a request to load the value for an AVMetadataItem created using the init(propertiesOf:valueLoadingHandler:) method.

AVMetadataMachineReadableCodeObject

The AVMetadataMachineReadableCodeObject class is a concrete subclass of AVMetadataObject defining the features of a detected one-dimensional or two-dimensional barcode.

AVMetadataObject

The AVMetadataObject class is an abstract class that defines the basic properties associated with a piece of metadata. These attributes reflect information either about the metadata itself or the media from which the metadata originated. Subclasses are responsible for providing appropriate values for each of the relevant properties.

AVMovie
AVMovieTrack
AVMusicTrack
AVMutableAudioMix

An AVMutableAudioMix object manages the input parameters for mixing audio tracks. It allows custom audio processing to be performed on audio tracks during playback or other operations.

AVMutableAudioMixInputParameters

An AVMutableAudioMixInputParameters object represents the parameters that should be applied to an audio track when it is added to a mix.

AVMutableComposition

AVMutableComposition is a mutable subclass of AVComposition you use when you want to create a new composition from existing assets. You can add and remove tracks, and you can add, remove, and scale time ranges.
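
For example, a minimal sketch that splices two hypothetical clips end to end.

    import AVFoundation

    // A minimal sketch: insert clip A, then append clip B at the composition's current end.
    let composition = AVMutableComposition()
    let clipA = AVURLAsset(url: URL(fileURLWithPath: "/path/to/a.mov"))
    let clipB = AVURLAsset(url: URL(fileURLWithPath: "/path/to/b.mov"))
    do {
        try composition.insertTimeRange(CMTimeRange(start: kCMTimeZero, duration: clipA.duration),
                                        of: clipA, at: kCMTimeZero)
        try composition.insertTimeRange(CMTimeRange(start: kCMTimeZero, duration: clipB.duration),
                                        of: clipB, at: composition.duration)
        // The composition can now be played with AVPlayer or exported with AVAssetExportSession.
    } catch {
        print("Could not build composition: \(error)")
    }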

AVMutableCompositionTrack

AVMutableCompositionTrack is a mutable subclass of AVCompositionTrack that lets you insert, remove, and scale track segments without affecting their low-level representation (that is, the operations you perform are non-destructive on the original).

AVMutableDateRangeMetadataGroup

AVMutableDateRangeMetadataGroup is a mutable subclass of AVDateRangeMetadataGroup used to represent a mutable collection of metadata items that are valid for use within a specific range of dates.

AVMutableMediaSelection

AVMutableMediaSelection is a mutable subclass of AVMediaSelection allowing for the selection of a media option.

AVMutableMetadataItem

AVMutableMetadataItem is a mutable subclass of AVMetadataItem that lets you build collections of metadata to be written to asset files using AVAssetExportSession.

AVMutableMovie
AVMutableMovieTrack
AVMutableTimedMetadataGroup

You use an AVMutableTimedMetadataGroup object to represent a mutable collection of metadata items.

AVMutableVideoComposition

The AVMutableVideoComposition class is a mutable subclass of AVVideoComposition.

AVMutableVideoCompositionInstruction

An AVMutableVideoCompositionInstruction object represents an operation to be performed by a compositor.

AVMutableVideoCompositionLayerInstruction

AVMutableVideoCompositionLayerInstruction is a mutable subclass of AVVideoCompositionLayerInstruction that is used to modify the transform, cropping, and opacity ramps to apply to a given track in a composition.

AVOutputSettingsAssistant

The AVOutputSettingsAssistant class specifies a set of parameters for configuring objects that use output settings dictionaries, so that the resulting media file conforms to specific criteria.

AVPlayer

An AVPlayer is a controller object used to manage the playback and timing of a media asset. It provides the interface to control the player’s transport behavior such as its ability to play, pause, change the playback rate, and seek to various points in time within the media’s timeline. You can use an AVPlayer to play local and remote file-based media, such as QuickTime movies and MP3 audio files, as well as audiovisual media served using HTTP Live Streaming.
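
A minimal playback sketch, assuming a hypothetical HLS stream URL.

    import AVFoundation

    // A minimal sketch: build the item/player stack, start playback, and log the
    // playback position once per second.
    let streamURL = URL(string: "https://example.com/stream.m3u8")!
    let playerItem = AVPlayerItem(url: streamURL)
    let player = AVPlayer(playerItem: playerItem)
    player.play()

    let interval = CMTime(seconds: 1, preferredTimescale: 600)
    let timeObserver = player.addPeriodicTimeObserver(forInterval: interval, queue: .main) { time in
        print("Playback position: \(CMTimeGetSeconds(time)) s")
    }
    // Keep the player and timeObserver alive for as long as observation is needed.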

AVPlayerItem

AVPlayerItem models the timing and presentation state of an asset played by an AVPlayer object. It provides the interface to seek to various times in the media, determine its presentation size, identify its current time, and much more.

AVPlayerItemAccessLog

You use an AVPlayerItemAccessLog object to retrieve the access log associated with an AVPlayerItem object.

AVPlayerItemAccessLogEvent

An AVPlayerItemAccessLogEvent object represents a single entry in an AVPlayerItem object’s access log.

AVPlayerItemErrorLog

You use an AVPlayerItemErrorLog object to retrieve the error log associated with an AVPlayerItem object.

AVPlayerItemErrorLogEvent

An AVPlayerItemErrorLogEvent object represents a single item in an AVPlayerItem object’s error log.

AVPlayerItemLegibleOutput

The AVPlayerItemLegibleOutput class is a subclass of AVPlayerItemOutput that can vend media with a legible characteristic as an attributed string.

AVPlayerItemMediaDataCollector

AVPlayerItemMediaDataCollector is the abstract base of media data collectors such as AVPlayerItemMetadataCollector.

AVPlayerItemMetadataCollector

AVPlayerItemMetadataCollector is a subclass of AVPlayerItemMediaDataCollector used to capture the date range metadata defined for an HTTP Live Streaming (HLS) asset.

AVPlayerItemMetadataOutput

The AVPlayerItemMetadataOutput class is a subclass of AVPlayerItemOutput that vends collections of metadata items carried in metadata tracks.

AVPlayerItemOutput

The AVPlayerItemOutput class is an abstract class that defines the common interface for moving samples from an asset to an AVPlayer object. You do not create instances of this class directly but instead use one of the concrete subclasses that manage specific types of assets.

AVPlayerItemTrack

You use an AVPlayerItemTrack object to modify the presentation state of an asset track (AVAssetTrack) being presented by an AVPlayer object.

AVPlayerItemVideoOutput

The AVPlayerItemVideoOutput class lets you coordinate the output of content associated with a Core Video pixel buffer.

AVPlayerLayer

AVPlayerLayer is a subclass of CALayer to which an AVPlayer object can direct its visual output. It can be used as the backing layer for a UIView or NSView or can be manually added to the layer hierarchy to present your video content on screen.
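
A common pattern is to back a view with a player layer; the PlayerView class below is a hypothetical UIKit sketch.

    import AVFoundation
    import UIKit

    // A minimal sketch: a UIView whose backing layer is an AVPlayerLayer, so an
    // AVPlayer can render its visual output directly into the view.
    class PlayerView: UIView {
        override class var layerClass: AnyClass { return AVPlayerLayer.self }

        var player: AVPlayer? {
            get { return (layer as? AVPlayerLayer)?.player }
            set { (layer as? AVPlayerLayer)?.player = newValue }
        }
    }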

AVPlayerLooper

AVPlayerLooper is a helper class used to simplify playing looping media content using AVQueuePlayer.
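
For example, a minimal looping sketch assuming a hypothetical local clip.

    import AVFoundation

    // A minimal sketch: loop a clip gaplessly by pairing an AVQueuePlayer with an AVPlayerLooper.
    let queuePlayer = AVQueuePlayer()
    let templateItem = AVPlayerItem(url: URL(fileURLWithPath: "/path/to/clip.mp4"))
    let looper = AVPlayerLooper(player: queuePlayer, templateItem: templateItem)
    queuePlayer.play()
    // Keep a strong reference to `looper`; looping stops if it is deallocated.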

AVPlayerMediaSelectionCriteria

The AVPlayerMediaSelectionCriteria class specifies the preferred languages and media characteristics for an AVPlayer instance.

AVQueuePlayer

AVQueuePlayer is a subclass of AVPlayer used to play a number of items in sequence. Using this class you can create and manage a queue of player items composed of local or progressively downloaded file-based media, such as QuickTime movies or MP3 audio files, as well as media served using HTTP Live Streaming.

AVSampleBufferDisplayLayer

The AVSampleBufferDisplayLayer class is a subclass of CALayer that displays compressed or uncompressed video frames.

AVSampleBufferGenerator

The AVSampleBufferGenerator class is used to create CMSampleBuffer opaque objects.

AVSampleBufferRequest

An AVSampleBufferRequest instance describes a CMSampleBuffer creation request.

AVSampleCursor

An AVSampleCursor instance is always positioned at a specific media sample in a sequence of samples as defined by a higher-level construct, such as an AVAssetTrack. It can be moved to a new position in that sequence either backwards or forwards, either in decode order or in presentation order. Movement can be requested according to a count of samples or according to a delta in time.

AVSpeechSynthesisVoice

An AVSpeechSynthesisVoice object defines a distinct voice for use in speech synthesis. Voices are distinguished primarily by language and locale.

AVSpeechSynthesizer

The AVSpeechSynthesizer class produces synthesized speech from text on an iOS device, and provides methods for controlling or monitoring the progress of ongoing speech.

AVSpeechUtterance

An AVSpeechUtterance is the basic unit of speech synthesis. An utterance encapsulates some amount of text to be spoken and a set of parameters affecting its speech: voice, pitch, rate, and delay.
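
A minimal sketch that combines an utterance, a voice, and a synthesizer.

    import AVFoundation

    // A minimal sketch: speak a sentence with a US English voice at a slightly
    // reduced rate and raised pitch.
    let utterance = AVSpeechUtterance(string: "Hello from AVFoundation.")
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    utterance.rate = AVSpeechUtteranceDefaultSpeechRate * 0.9
    utterance.pitchMultiplier = 1.1

    let synthesizer = AVSpeechSynthesizer()
    synthesizer.speak(utterance)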

AVSynchronizedLayer

AVSynchronizedLayer is a subclass of CALayer with layer timing that synchronizes with a specific AVPlayerItem.

AVTextStyleRule

An AVTextStyleRule object represents text styling rules that can be applied to text in a media item. You use text style objects to format subtitles, closed captions, and other text-related content of the item. The rules you specify can be applied to all or part of the text in the media item.

AVTimedMetadataGroup

The AVTimedMetadataGroup class represents a collection of metadata items that are valid for use during a specific range of time.

AVURLAsset

AVURLAsset is a concrete subclass of AVAsset that you use to initialize an asset from a local or remote URL.

AVVideoComposition

An AVVideoComposition object represents an immutable video composition.

AVVideoCompositionCoreAnimationTool

You use an AVVideoCompositionCoreAnimationTool object to incorporate Core Animation in a video composition.

AVVideoCompositionInstruction

An AVVideoCompositionInstruction object represents an operation to be performed by a compositor.

AVVideoCompositionLayerInstruction

An AVVideoCompositionLayerInstruction object represents the transform, opacity, and cropping ramps to apply to a given track.

AVVideoCompositionRenderContext

The AVVideoCompositionRenderContext class defines the context within which custom compositors render new output pixel buffers.

Protocols

AVAssetDownloadDelegate

The AVAssetDownloadDelegate protocol describes the methods that AVAssetDownloadURLSession objects call on their delegates to handle download-related events. These methods should be implemented to be notified of download progress and completion events.

AVAssetResourceLoaderDelegate

The AVAssetResourceLoaderDelegate protocol defines a method that lets your code handle resource loading requests coming from an AVURLAsset object.

AVAsynchronousKeyValueLoading

The AVAsynchronousKeyValueLoading protocol defines methods that let you use an AVAsset or AVAssetTrack object without blocking the calling thread. A “key” is any property of a class that implements this protocol. Using the protocol’s methods, you can find out the current status of a key (for example, whether the corresponding value has been loaded) and ask the object to load its values asynchronously, informing you when the operation has completed.

AVAudio3DMixing

The AVAudio3DMixing protocol defines 3D mixing properties. Currently these properties are only implemented by the AVAudioEnvironmentNode mixer.

AVAudioMixing

The AVAudioMixing protocol defines properties applicable to the input bus of a mixer node.

AVAudioPlayerDelegate

The delegate of an AVAudioPlayer object must adopt the AVAudioPlayerDelegate protocol. All of the methods in this protocol are optional. They allow a delegate to respond to audio interruptions and audio decoding errors, and to the completion of a sound’s playback.

AVAudioRecorderDelegate

The delegate of an AVAudioRecorder object must adopt the AVAudioRecorderDelegate protocol. All of the methods in this protocol are optional. They allow a delegate to respond to audio interruptions and audio decoding errors, and to the completion of a recording.

AVAudioSessionDelegate

The use of this protocol is deprecated in iOS 6 and later. Instead, you should use the notifications declared in AVAudioSession.

AVAudioStereoMixing

The AVAudioStereoMixing protocol defines stereo mixing properties used by mixers.

AVCaptureAudioDataOutputSampleBufferDelegate

The delegate of an AVCaptureAudioDataOutput object must adopt the AVCaptureAudioDataOutputSampleBufferDelegate protocol. The method in this protocol is optional.

AVCaptureFileOutputDelegate

The AVCaptureFileOutputDelegate protocol defines an interface for delegates of an AVCaptureFileOutput object to monitor and control recordings along exact sample boundaries.

AVCaptureFileOutputRecordingDelegate

Defines an interface for delegates of AVCaptureFileOutput to respond to events that occur in the process of recording a single file.

AVCaptureMetadataOutputObjectsDelegate

The AVCaptureMetadataOutputObjectsDelegate protocol must be adopted by the delegate of an AVCaptureMetadataOutput object. The single method in this protocol is optional. The method allows a delegate to respond when a capture metadata output object receives relevant metadata objects through its connection.

AVCapturePhotoCaptureDelegate

You implement methods in the AVCapturePhotoCaptureDelegate protocol to be notified of progress and results when capturing photos with the AVCapturePhotoOutput class.

AVCaptureVideoDataOutputSampleBufferDelegate

This protocol defines an interface for delegates of an AVCaptureVideoDataOutput object to receive captured video sample buffers and be notified of late sample buffers that were dropped.

AVFragmentMinding
AVPlayerItemLegibleOutputPushDelegate

The AVPlayerItemLegibleOutputPushDelegate protocol extends the AVPlayerItemOutputPushDelegate protocol to provide additional methods specific to attributed string output.

AVPlayerItemMetadataCollectorPushDelegate

The AVPlayerItemMetadataCollectorPushDelegate protocol should be adopted by objects interested in receiving metadata callbacks from an AVPlayerItemMetadataCollector.

AVPlayerItemMetadataOutputPushDelegate

The AVPlayerItemMetadataOutputPushDelegate protocol extends the AVPlayerItemOutputPushDelegate protocol to provide additional methods specific to metadata output.

AVPlayerItemOutputPullDelegate

The AVPlayerItemOutputPullDelegate protocol defines the methods that are called by an AVPlayerItemVideoOutput object in response to pixel buffer changes.

AVPlayerItemOutputPushDelegate

The AVPlayerItemOutputPushDelegate protocol defines common delegate methods for objects participating in AVPlayerItemOutput push sample output acquisition.

AVSpeechSynthesizerDelegate

The AVSpeechSynthesizerDelegate protocol defines methods that the delegate of an AVSpeechSynthesizer object may implement; all methods in this protocol are optional. You can implement these methods to respond to events that occur during speech synthesis.

AVVideoCompositing

The AVVideoCompositing protocol defines properties and methods that custom video compositors must implement.

AVVideoCompositionInstructionProtocol

The AVVideoCompositionInstruction protocol represents operations to be performed by a compositor. An AVVideoComposition object maintains an array of instructions to perform its composition.

AVVideoCompositionValidationHandling

The AVVideoCompositionValidationHandling protocol declares methods that you can implement in the delegate of an AVVideoComposition object to indicate whether validation of a video composition should continue after specific errors have been found.

Extended Types

AVError
NSCoder

The NSCoder abstract class declares the interface used by concrete subclasses to transfer objects and other values between memory and some other format. This capability provides the basis for archiving (where objects and data items are stored on disk) and distribution (where objects and data items are copied between different processes or threads). The concrete subclasses provided by Foundation for these purposes are NSArchiver, NSUnarchiver, NSKeyedArchiver, NSKeyedUnarchiver, and NSPortCoder. Concrete subclasses of NSCoder are referred to in general as coder classes, and instances of these classes as coder objects (or simply coders). A coder object that can only encode values is referred to as an encoder object, and one that can only decode values as a decoder object.

NSNotification.Name

The type used for the name of a notification.

NSValue

An NSValue object is a simple container for a single C or Objective-C data item. It can hold any of the scalar types such as int, float, and char, as well as pointers, structures, and object id references. Use this class to work with such data types in collections (such as NSArray and NSSet), Key-value coding, and other APIs that require Objective-C objects. NSValue objects are always immutable.

Structures

AVCaptureDeviceType

Values identifying the general type of a capture device, used with the defaultDevice(withDeviceType:mediaType:position:) method and the AVCaptureDeviceDiscoverySession class.