AVPictureInPictureController with AVSampleBufferDisplayLayer: Video not scaled in PiP window on macOS

Platform: macOS 26.4 (Tahoe)
Framework: AVKit / AVFoundation
Xcode: 26.4

Summary

When using AVPictureInPictureController with ContentSource(sampleBufferDisplayLayer:playbackDelegate:) on macOS, the video content in the PiP window is not scaled to fit — it renders at 1:1 pixel resolution, showing only the bottom-left portion of the video (zoomed/cropped). The same code works correctly on iOS.

Setup

let displayLayer = AVSampleBufferDisplayLayer()
displayLayer.videoGravity = .resizeAspect
// Host displayLayer as a sublayer of an NSView, enqueue CMSampleBuffers

let source = AVPictureInPictureController.ContentSource(
    sampleBufferDisplayLayer: displayLayer,
    playbackDelegate: self
)
let pip = AVPictureInPictureController(contentSource: source)
pip.delegate = self

The source display layer is 1280×720, matching the video stream resolution. PiP starts successfully — isPictureInPicturePossible is true, the PiP button works, and the PIPPanel window appears.
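For completeness, the playback delegate is a minimal AVPictureInPictureSampleBufferPlaybackDelegate conformance along the lines below. `PlayerController`, `resumeDecoding()`, `pauseDecoding()`, and `isPaused` are placeholder names for my own playback state; the delegate method signatures are the real protocol requirements:

```swift
import AVKit
import CoreMedia

extension PlayerController: AVPictureInPictureSampleBufferPlaybackDelegate {

    func pictureInPictureController(_ pictureInPictureController: AVPictureInPictureController,
                                    setPlaying playing: Bool) {
        playing ? resumeDecoding() : pauseDecoding()
    }

    func pictureInPictureControllerTimeRangeForPlayback(
        _ pictureInPictureController: AVPictureInPictureController) -> CMTimeRange {
        // Live/indefinite content; seekable streams should report a concrete range.
        CMTimeRange(start: .negativeInfinity, duration: .positiveInfinity)
    }

    func pictureInPictureControllerIsPlaybackPaused(
        _ pictureInPictureController: AVPictureInPictureController) -> Bool {
        isPaused
    }

    func pictureInPictureController(_ pictureInPictureController: AVPictureInPictureController,
                                    didTransitionToRenderSize newRenderSize: CMVideoDimensions) {
        // On macOS this fires when PiP starts; see item 6 under "What I tried".
    }

    func pictureInPictureController(_ pictureInPictureController: AVPictureInPictureController,
                                    skipByInterval skipInterval: CMTime,
                                    completion completionHandler: @escaping () -> Void) {
        completionHandler()
    }
}
```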

However, the video in the PiP window (~480×270) shows only the bottom-left 480×270 pixels of the 1280×720 content, rather than scaling the full frame to fit.

Investigation

Inspecting the PiP window hierarchy reveals:

PIPPanel (480×270)
  └─ AVPictureInPictureSampleBufferDisplayLayerView
       └─ AVPictureInPictureSampleBufferDisplayLayerHostView (layer = CALayerHost)
            └─ AVPictureInPictureCALayerHostView

The CALayerHost mirrors the source AVSampleBufferDisplayLayer at 1:1 pixel resolution. Unlike AVPlayerLayer-based PiP (which works correctly on macOS), the sample buffer display layer path does not apply any scaling transform to the mirrored content.
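For reference, the hierarchy above was obtained with a simple recursive dump over the panel's content view; `dumpHierarchy` is my own helper, not an API:

```swift
import AppKit

/// Recursively print an NSView subtree with each view's size and backing layer class.
func dumpHierarchy(_ view: NSView, indent: String = "") {
    let layerName = view.layer.map { String(describing: type(of: $0)) } ?? "nil"
    print("\(indent)\(type(of: view)) \(view.frame.size) layer=\(layerName)")
    for subview in view.subviews {
        dumpHierarchy(subview, indent: indent + "  ")
    }
}

// Usage: locate the (private) PIPPanel among the app's windows and dump it.
for window in NSApp.windows where String(describing: type(of: window)) == "PIPPanel" {
    if let content = window.contentView { dumpHierarchy(content) }
}
```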

On iOS, PiP with AVSampleBufferDisplayLayer works correctly because the system reparents the layer into the PiP window, so standard layer scaling applies. On macOS, the system uses CALayerHost mirroring instead, and the scaling step is missing.

What I tried (none of these fixes the issue)

  1. Setting autoresizingMask on all PiP internal subviews — views resize correctly, but CALayerHost content remains at 1:1 pixel scale
  2. Applying CATransform3DMakeScale on the CALayerHost layer — creates a black rectangle artifact; the mirrored content does not transform
  3. Setting CALayerHost.bounds to the source layer size — no effect on rendering
  4. Reparenting the internal AVPictureInPictureCALayerHostView out of the host view — video disappears entirely
  5. Hiding the CALayerHost — PiP window goes white (confirming it is the sole video renderer)
  6. Resizing the source AVSampleBufferDisplayLayer to match the PiP window size — partially works (a 1:1 mirror of the smaller source now fits the window), but it causes visible lag during resize, it affects the main window's "This video is playing in Picture in Picture" placeholder, and didTransitionToRenderSize stops being called after the initial resize
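For illustration, attempt 6 amounts to resizing the source layer from the render-size callback, roughly as below (a sketch only — as noted, the callback stops firing after the first resize, so this cannot track subsequent PiP window resizes). `displayLayer` is the AVSampleBufferDisplayLayer backing the content source:

```swift
import AVKit
import CoreMedia
import QuartzCore

func pictureInPictureController(_ pictureInPictureController: AVPictureInPictureController,
                                didTransitionToRenderSize newRenderSize: CMVideoDimensions) {
    guard newRenderSize.width > 0, newRenderSize.height > 0 else { return }
    CATransaction.begin()
    CATransaction.setDisableActions(true)  // avoid implicit resize animation
    displayLayer.frame = CGRect(x: 0, y: 0,
                                width: CGFloat(newRenderSize.width),
                                height: CGFloat(newRenderSize.height))
    CATransaction.commit()
}
```

The side effect on the main window follows directly: the source layer is shared between the app's view and the mirrored PiP content, so shrinking it for PiP also shrinks the in-app placeholder.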

Expected behavior

The video content should be scaled to fit the PiP window, respecting the display layer's videoGravity setting (.resizeAspect), consistent with:

  • iOS PiP with AVSampleBufferDisplayLayer (works correctly)
  • macOS PiP with AVPlayerLayer (works correctly)

Environment

  • macOS 26.4 (Tahoe)
  • Xcode 26.4
  • Apple Silicon (M-series)
  • Retina display (contentsScale = 2.0)
  • Video: H.264 1280×720, hardware decoded via VTDecompressionSession, enqueued as CMSampleBuffer