Class

AVSampleBufferDisplayLayer

An object that displays compressed or uncompressed video frames.

Declaration

class AVSampleBufferDisplayLayer : CALayer
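
A minimal setup sketch, assuming a UIKit app; the PlayerViewController name is hypothetical and stands in for whatever object hosts the layer:

import AVFoundation
import UIKit

class PlayerViewController: UIViewController {
    // The display layer that will render enqueued sample buffers.
    let displayLayer = AVSampleBufferDisplayLayer()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Size the layer to the view and add it to the layer tree.
        displayLayer.frame = view.bounds
        view.layer.addSublayer(displayLayer)
    }
}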

Topics

Initiating Media Data Requests

func requestMediaDataWhenReady(on: DispatchQueue, using: () -> Void)

Instructs the layer to invoke a client-supplied block repeatedly, at its convenience, to gather sample buffers for display.

var isReadyForMoreMediaData: Bool

A Boolean value that indicates whether the layer is ready to accept more sample buffers.

func stopRequestingMediaData()

Cancels any current media data request.
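
A sketch of a typical feeding loop. The displayLayer constant and the nextSampleBuffer() function are assumptions for illustration; the latter stands in for whatever produces CMSampleBuffers (for example, an AVAssetReaderTrackOutput):

import AVFoundation

// Assumed to exist: a configured display layer and a hypothetical buffer source.
let displayLayer = AVSampleBufferDisplayLayer()
func nextSampleBuffer() -> CMSampleBuffer? { /* e.g. copy from a reader output */ return nil }

let feedingQueue = DispatchQueue(label: "sample-buffer-feeding")

displayLayer.requestMediaDataWhenReady(on: feedingQueue) {
    // Invoked repeatedly on `feedingQueue` while the layer can accept more data.
    while displayLayer.isReadyForMoreMediaData {
        guard let buffer = nextSampleBuffer() else {
            // Source exhausted: stop the callbacks until more data becomes available.
            displayLayer.stopRequestingMediaData()
            return
        }
        displayLayer.enqueue(buffer)
    }
}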

Flushing Sample Buffers

func flush()

Instructs the layer to discard any pending enqueued sample buffers.

func flushAndRemoveImage()

Instructs the layer to discard pending enqueued sample buffers and remove any currently displayed image.
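
A brief sketch of when each call might be used, assuming displayLayer is an already configured AVSampleBufferDisplayLayer:

// Discard pending buffers but keep the current frame on screen, for example during a seek.
displayLayer.flush()

// Discard pending buffers and clear the displayed image, for example when switching sources.
displayLayer.flushAndRemoveImage()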

Configuring the Timebase

var controlTimebase: CMTimebase?

The layer's control timebase, which governs how timestamps are interpreted.
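
A sketch of attaching a control timebase, assuming displayLayer already exists; driving the timebase from the host clock and starting it at zero are assumptions for illustration:

import CoreMedia

var timebase: CMTimebase?
let status = CMTimebaseCreateWithMasterClock(allocator: kCFAllocatorDefault,
                                             masterClock: CMClockGetHostTimeClock(),
                                             timebaseOut: &timebase)
if status == 0, let timebase = timebase {
    // Start the timeline at zero and advance it in real time.
    CMTimebaseSetTime(timebase, time: .zero)
    CMTimebaseSetRate(timebase, rate: 1.0)
    displayLayer.controlTimebase = timebase
}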

Enqueuing the Sample Buffer

func enqueue(CMSampleBuffer)

Sends a sample buffer for display.
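
A minimal sketch; displayLayer and sampleBuffer are assumed to have been created elsewhere:

// Assumes `displayLayer` is configured and `sampleBuffer` was produced elsewhere
// (for example by an AVAssetReaderTrackOutput or an AVCaptureVideoDataOutput).
if displayLayer.isReadyForMoreMediaData {
    displayLayer.enqueue(sampleBuffer)
}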

Setting the Video Gravity

var videoGravity: AVLayerVideoGravity

A value that defines how the video is displayed within the bounds rectangle of a sample buffer display layer.

struct AVLayerVideoGravity

A value that defines how the video is displayed within a layer’s bounds rectangle.
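
For example, to scale the video to fill the layer's bounds, cropping if the aspect ratios differ (assumes displayLayer exists):

displayLayer.videoGravity = .resizeAspectFill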

Getting Display Layer Settings

var status: AVQueuedSampleBufferRenderingStatus

A status value that indicates whether the display layer can currently be used for enqueuing sample buffers.

enum AVQueuedSampleBufferRenderingStatus

The statuses for sample buffer rendering.
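
A sketch of checking the status before continuing to enqueue buffers, assuming displayLayer exists:

switch displayLayer.status {
case .unknown:
    break   // Nothing has been enqueued yet.
case .rendering:
    break   // The layer is rendering enqueued sample buffers normally.
case .failed:
    // Rendering failed; inspect the layer's error and flush before retrying.
    print("Rendering failed: \(String(describing: displayLayer.error))")
@unknown default:
    break
}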

Handling Errors

var error: Error?

The error that caused the layer to fail, if any.

static let AVSampleBufferDisplayLayerFailedToDecode: NSNotification.Name

Posted when the display layer fails to decode a sample buffer.
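
A sketch of observing decode failures, assuming displayLayer is already set up:

import AVFoundation

let failureObserver = NotificationCenter.default.addObserver(
    forName: .AVSampleBufferDisplayLayerFailedToDecode,
    object: displayLayer,
    queue: .main
) { _ in
    // The underlying error is also available on the layer itself.
    if let error = displayLayer.error {
        print("Decode failed: \(error)")
    }
}

// Later, when tearing down, remove the observation.
NotificationCenter.default.removeObserver(failureObserver)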

See Also

Media Playback

class AVPlayer

An object that provides the interface to control the player’s transport behavior.

class AVQueuePlayer

A player used to play a number of items in sequence.

class AVPlayerLayer

An object that manages a player's visual output.

class AVPlayerItem

An object used to model the timing and presentation state of an asset played by the player.

class AVPlayerItemMetadataCollector

An object used to capture the date range metadata defined for an HTTP Live Streaming asset.

class AVPlayerItemTrack

An object used to modify the presentation state of an asset track being presented by a player.

class AVSynchronizedLayer

An object used to synchronize with a specific player item.

class AVPlayerMediaSelectionCriteria

An object that specifies the preferred languages and media characteristics for a player.

class AVSampleBufferAudioRenderer

An object that plays compressed or uncompressed audio, decompressing it as needed.

class AVSampleBufferRenderSynchronizer

An object used to synchronize multiple queued sample buffers to a single timeline.

class AVRouteDetector

An object that detects the presence of media playback routes.

Beta Software

This documentation contains preliminary information about an API or technology in development. This information is subject to change, and software implemented according to this documentation should be tested with final operating system software.
