AVPictureInPictureController with AVSampleBufferDisplayLayer: Video not scaled in PiP window on macOS

Platform: macOS 26.4 (Tahoe)
Framework: AVKit / AVFoundation
Xcode: 26.4

Summary

When using AVPictureInPictureController with ContentSource(sampleBufferDisplayLayer:playbackDelegate:) on macOS, the video content in the PiP window is not scaled to fit — it renders at 1:1 pixel resolution, showing only the bottom-left portion of the video (zoomed/cropped). The same code works correctly on iOS.

Setup

let displayLayer = AVSampleBufferDisplayLayer()
displayLayer.videoGravity = .resizeAspect
// Host displayLayer as a sublayer of an NSView, enqueue CMSampleBuffers

let source = AVPictureInPictureController.ContentSource(
    sampleBufferDisplayLayer: displayLayer,
    playbackDelegate: self
)
let pip = AVPictureInPictureController(contentSource: source)
pip.delegate = self

The source display layer is 1280×720, matching the video stream resolution. PiP starts successfully — isPictureInPicturePossible is true, the PiP button works, and the PIPPanel window appears.

However, the video in the PiP window (~480×270) shows only the bottom-left 480×270 pixels of the 1280×720 content, rather than scaling the full frame to fit.
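For context, the playbackDelegate passed to the content source conforms to AVPictureInPictureSampleBufferPlaybackDelegate. A minimal conformance looks roughly like this (PlayerController and the placeholder bodies are from my test case; nothing here is PiP-scaling-specific):

```swift
import AVKit
import CoreMedia

extension PlayerController: AVPictureInPictureSampleBufferPlaybackDelegate {
    func pictureInPictureController(_ controller: AVPictureInPictureController,
                                    setPlaying playing: Bool) {
        // Start or stop the decode/enqueue pipeline.
    }

    func pictureInPictureControllerTimeRangeForPlayback(
        _ controller: AVPictureInPictureController
    ) -> CMTimeRange {
        // An indefinite range marks the content as live and hides the scrubber.
        CMTimeRange(start: .negativeInfinity, duration: .positiveInfinity)
    }

    func pictureInPictureControllerIsPlaybackPaused(
        _ controller: AVPictureInPictureController
    ) -> Bool {
        false
    }

    func pictureInPictureController(_ controller: AVPictureInPictureController,
                                    didTransitionToRenderSize newRenderSize: CMVideoDimensions) {
        // Called when the PiP render size changes.
    }

    func pictureInPictureController(_ controller: AVPictureInPictureController,
                                    skipByInterval skipInterval: CMTime,
                                    completion completionHandler: @escaping () -> Void) {
        completionHandler()
    }
}
```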

Investigation

Inspecting the PiP window hierarchy reveals:

PIPPanel (480×270)
  └─ AVPictureInPictureSampleBufferDisplayLayerView
       └─ AVPictureInPictureSampleBufferDisplayLayerHostView (layer = CALayerHost)
            └─ AVPictureInPictureCALayerHostView

The CALayerHost mirrors the source AVSampleBufferDisplayLayer at 1:1 pixel resolution. Unlike AVPlayerLayer-based PiP (which works correctly on macOS), the sample buffer display layer path does not apply any scaling transform to the mirrored content.

On iOS, PiP with AVSampleBufferDisplayLayer works correctly because the system reparents the layer into the PiP window, so standard layer scaling applies. On macOS, the system uses CALayerHost mirroring instead, and the scaling step is missing.
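The hierarchy listings in this post come from recursively walking the PiP window's view tree, along these lines (the PIPPanel class-name match targets a private class and may change between releases):

```swift
import AppKit

/// Recursively print each view's class name, frame, and backing layer class.
func dumpHierarchy(_ view: NSView, indent: String = "") {
    let layerName = view.layer.map { String(describing: type(of: $0)) } ?? "nil"
    print("\(indent)\(type(of: view)) \(view.frame) layer=\(layerName)")
    for sub in view.subviews { dumpHierarchy(sub, indent: indent + "  ") }
}

// Find the PiP panel by its (private) class name and dump its content view.
if let pipWindow = NSApplication.shared.windows.first(where: {
    String(describing: type(of: $0)).contains("PIPPanel")
}), let content = pipWindow.contentView {
    dumpHierarchy(content)
}
```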

What I tried (none fix the issue)

  1. Setting autoresizingMask on all PiP internal subviews — views resize correctly, but CALayerHost content remains at 1:1 pixel scale
  2. Applying CATransform3DMakeScale on the CALayerHost layer — creates a black rectangle artifact; the mirrored content does not transform
  3. Setting CALayerHost.bounds to the source layer size — no effect on rendering
  4. Reparenting the internal AVPictureInPictureCALayerHostView out of the host view — video disappears entirely
  5. Hiding the CALayerHost — PiP window goes white (confirming it is the sole video renderer)
  6. Resizing the source AVSampleBufferDisplayLayer to match the PiP window size — partially works (a 1:1 mirror of a smaller source fits), but it causes visible lag during resizes, shrinks the main window's "This video is playing in Picture in Picture" placeholder, and didTransitionToRenderSize stops being called after the initial resize
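For reference, attempt 6 was driven from the playback delegate's render-size callback, roughly like this (displayLayer is the source layer from the setup above):

```swift
func pictureInPictureController(_ controller: AVPictureInPictureController,
                                didTransitionToRenderSize newRenderSize: CMVideoDimensions) {
    // Attempt 6: shrink the source layer so the 1:1 mirror fits the PiP window.
    // Works once, but this callback stops firing after the first resize, so
    // later PiP window resizes are never reported back.
    DispatchQueue.main.async {
        CATransaction.begin()
        CATransaction.setDisableActions(true) // avoid implicit resize animation
        self.displayLayer.frame = CGRect(x: 0, y: 0,
                                         width: CGFloat(newRenderSize.width),
                                         height: CGFloat(newRenderSize.height))
        CATransaction.commit()
    }
}
```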

Expected behavior

The video content should be scaled to fit the PiP window, respecting the display layer's videoGravity setting (.resizeAspect), consistent with:

  • iOS PiP with AVSampleBufferDisplayLayer (works correctly)
  • macOS PiP with AVPlayerLayer (works correctly)

Environment

  • macOS 26.4 (Tahoe)
  • Xcode 26.4
  • Apple Silicon (M-series)
  • Retina display (contentsScale = 2.0)
  • Video: H.264 1280×720, hardware decoded via VTDecompressionSession, enqueued as CMSampleBuffer
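The decode/enqueue pipeline feeding the layer is the standard one; in the VTDecompressionSession output path it amounts to the following (decoder setup and sample-buffer timing omitted):

```swift
import AVFoundation

// Wrap each decoded image buffer in a CMSampleBuffer with display timing,
// then hand it to the layer whenever it can accept more data.
func display(_ sampleBuffer: CMSampleBuffer, on layer: AVSampleBufferDisplayLayer) {
    if layer.status == .failed {
        layer.flush() // recover the layer after a rendering failure
    }
    if layer.isReadyForMoreMediaData {
        layer.enqueue(sampleBuffer)
    }
}
```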

Additional finding: I looked at how IINA (open-source macOS media player, github.com/iina/iina) implements PiP. They use Apple's private PIP.framework (PIPViewController), which physically reparents the video view into the PiP window — and their PiP works perfectly.

This confirms the root cause: PiP works when the view/layer is reparented (private PIPViewController on macOS, public API on iOS), but fails when using CALayerHost mirroring (the public AVPictureInPictureController + AVSampleBufferDisplayLayer path on macOS).

The fix would be either:

  1. Reparent the AVSampleBufferDisplayLayer into the PiP window (as iOS already does), or
  2. Apply a scaling transform to the CALayerHost mirror content to fit the PiP window bounds

Update — Private PIP.framework also affected

I tested an alternative approach using Apple's private PIP.framework (PIPViewController), which physically reparents the source NSView into the PiP window — unlike AVPictureInPictureController, which mirrors via CALayerHost.

The result is identical: the PiP window shows only the bottom-left portion of the video at 1:1 pixel resolution, without scaling to fit the window bounds.

This confirms the issue is with AVSampleBufferDisplayLayer itself, not with how the PiP window is created. The layer does not honour its videoGravity or respond to frame/bounds changes when hosted inside a PiP window on macOS. The same layer scales correctly in regular windows and works correctly on iOS.

For reference, apps like IINA that use PIPViewController with Metal/OpenGL rendering views work fine — the problem is specific to AVSampleBufferDisplayLayer.

This affects any macOS app using AVSampleBufferDisplayLayer for custom streaming (IPTV, HLS demuxers, hardware-decoded pipelines via VTDecompressionSession) that wants to support Picture in Picture.

Filed as FB22411168.

Update — Working workaround found

After extensive debugging, I found a two-part workaround that gives fully functional PiP with AVSampleBufferDisplayLayer on macOS.

Root cause analysis

The PiP window actually contains TWO stacked rendering paths:

PIPPanel (contentView: 484×272)
  └─ AVPictureInPictureSampleBufferDisplayLayerView (1280×720)
       └─ AVPictureInPictureSampleBufferDisplayLayerHostView (layer = CALayerHost, contentsScale = 1.0)
            ├─ AVSampleBufferDisplayLayerContentLayer (734×413) ← correctly scaled video
            └─ AVPictureInPictureCALayerHostView (1600×900)    ← broken overlay

The "good" path (AVSampleBufferDisplayLayerContentLayer) renders the video correctly scaled to the PiP window size.

The "bad" path (AVPictureInPictureCALayerHostView) is drawn ON TOP and is the source of all the visual problems. Inspecting its properties reveals:

  • hasContents = false, sublayers = 0 — it contains no actual video content
  • backgroundColor = solid black (CGColor gray=0, alpha=1)
  • transform = [m11=0.8, m22=0.8, m33=0.8] — Apple applies a scale transform, but since there's no content, it just draws a scaled black rectangle
  • contentsScale = 2.0 (while its parent CALayerHost has contentsScale = 1.0 — a retina mismatch)

This view appears to be a leftover from the AVPlayerLayer PiP code path where the CALayerHost mirror is the primary renderer. For AVSampleBufferDisplayLayer, the mirroring is never set up (0 sublayers, no contents), but the view is still displayed with its black background.

Workaround (two parts)

Part A — Resize the source window to match PiP content size:

The "good" content layer also copies pixels at 1:1 resolution. If the source window is larger than the PiP window, only the bottom-left corner is visible. Fix: resize the player window to match the PiP content dimensions (hidden with alphaValue = 0), and track PiP window resizes via NSView.frameDidChangeNotification to stay in sync.

func pictureInPictureControllerDidStartPictureInPicture(_ pip: AVPictureInPictureController) {
    // Locate the system PiP panel by its (private) class name.
    guard let window = playerWindow,
          let pipWindow = NSApplication.shared.windows.first(where: {
              String(describing: type(of: $0)).contains("PIPPanel")
          }),
          let pipContentView = pipWindow.contentView else { return }

    // Shrink the player window to the PiP content size so the 1:1 pixel
    // copy shows the full frame; keep the top edge fixed.
    let pipSize = pipContentView.bounds.size
    savedWindowFrame = window.frame
    window.setFrame(NSRect(
        x: window.frame.origin.x,
        y: window.frame.maxY - pipSize.height,
        width: pipSize.width,
        height: pipSize.height
    ), display: true)
    window.alphaValue = 0

    // Also observe PiP resize to keep window in sync
    pipContentView.postsFrameChangedNotifications = true
    // ... add NSView.frameDidChangeNotification observer
}
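The observer elided in the snippet above can be added like this (pipObserver and syncWindowToPiP(size:) are names from my project, not AVKit API):

```swift
// Keep the hidden player window matched to the PiP content view's size.
pipObserver = NotificationCenter.default.addObserver(
    forName: NSView.frameDidChangeNotification,
    object: pipContentView,
    queue: .main
) { [weak self] note in
    guard let view = note.object as? NSView else { return }
    self?.syncWindowToPiP(size: view.bounds.size)
}
```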

Part B — Hide the broken overlay view:

After PiP starts (with a short delay to let the system build its view hierarchy), walk the PiP window's views and hide AVPictureInPictureCALayerHostView. This removes the black rectangle drawn on top of the correctly-scaled video.

DispatchQueue.main.asyncAfter(deadline: .now() + 0.1) {
    guard let pipWindow = NSApplication.shared.windows.first(where: {
        String(describing: type(of: $0)).contains("PIPPanel")
    }), let cv = pipWindow.contentView else { return }
    
    func walkViews(_ view: NSView) {
        if String(describing: type(of: view)) == "AVPictureInPictureCALayerHostView" {
            view.isHidden = true
        }
        for sub in view.subviews { walkViews(sub) }
    }
    walkViews(cv)
}

Important: Do NOT hide the parent CALayerHost view — it carries the good rendering path.

Result

PiP works correctly: full video frame, proper scaling, play/pause controls functional, PiP window resizable. Restore the original window frame and alpha when PiP ends.
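Restoring the player window when PiP ends is symmetric; a sketch (savedWindowFrame is the frame stashed in the start callback):

```swift
func pictureInPictureControllerDidStopPictureInPicture(_ pip: AVPictureInPictureController) {
    guard let window = playerWindow else { return }
    if let frame = savedWindowFrame {
        window.setFrame(frame, display: true) // put the player window back
        savedWindowFrame = nil
    }
    window.alphaValue = 1 // make it visible again
    // Also remove the NSView.frameDidChangeNotification observer added at start.
}
```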

This workaround is safe — the hidden view is an empty black rectangle with no content, no sublayers, and no functional purpose for AVSampleBufferDisplayLayer-based PiP. The underlying bug remains: Apple should either disable this view for the sample buffer path, or properly set up its CALayerHost mirroring and fix the contentsScale mismatch.

Tested on macOS 26.4 (Tahoe), Xcode 26.4, Apple Silicon, Retina display.
