Hi @Luuis: Yes, you can live-render tagged buffers as you describe.
In addition to the sample you referenced, you might also consider Converting projected video to Apple Projected Media Profile. That sample demonstrates how to split frames from a stereo side-by-side input file and append the resulting tagged buffers to an output file.
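If it helps to see the frame-splitting step in isolation, here's a rough sketch of one way to crop each half of a side-by-side frame into its own pixel buffer using Core Image. The helper name and the use of CIContext are my own illustration, not necessarily how the sample does it:

import CoreImage
import CoreVideo

let ciContext = CIContext()

// Illustrative helper: crop one half of a side-by-side frame into its own pixel buffer.
func cropHalf(of frame: CVPixelBuffer, leftHalf: Bool) -> CVPixelBuffer? {
    let image = CIImage(cvPixelBuffer: frame)
    let halfWidth = image.extent.width / 2

    var output: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                     Int(halfWidth),
                                     Int(image.extent.height),
                                     CVPixelBufferGetPixelFormatType(frame),
                                     nil,
                                     &output)
    guard status == kCVReturnSuccess, let output else { return nil }

    // Select the requested half and shift it back to the origin before rendering.
    let cropRect = CGRect(x: leftHalf ? 0 : halfWidth, y: 0,
                          width: halfWidth, height: image.extent.height)
    let cropped = image.cropped(to: cropRect)
        .transformed(by: CGAffineTransform(translationX: leftHalf ? 0 : -halfWidth, y: 0))
    ciContext.render(cropped, to: output)
    return output
}

// Usage (pixel buffer names are placeholders):
// let leftPixelBuffer = cropHalf(of: sideBySidePixelBuffer, leftHalf: true)
// let rightPixelBuffer = cropHalf(of: sideBySidePixelBuffer, leftHalf: false)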
Where the APMP sample uses AVAssetWriterInput.TaggedPixelBufferGroupReceiver, you could instead enqueue sample buffers to an AVSampleBufferVideoRenderer. Both VideoPlayerComponent and VideoMaterial can be used with AVSampleBufferVideoRenderer.
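Here's a minimal sketch of that path. The renderer-based VideoPlayerComponent(videoRenderer:) and VideoMaterial(videoRenderer:) initializers are the variants I'm assuming here, and makeNextSampleBuffer() is a hypothetical stand-in for however you produce buffers (for example, as assembled in the snippet further down):

import AVFoundation
import RealityKit

let videoRenderer = AVSampleBufferVideoRenderer()

// Drive a RealityKit entity from the renderer...
let videoEntity = Entity()
videoEntity.components.set(VideoPlayerComponent(videoRenderer: videoRenderer))

// ...or back a material with it instead:
// let material = VideoMaterial(videoRenderer: videoRenderer)

// Hypothetical stand-in for your buffer production (see the snippet below).
func makeNextSampleBuffer() -> CMSampleBuffer? { nil }

// Enqueue buffers whenever the renderer is ready for more data.
videoRenderer.requestMediaDataWhenReady(on: DispatchQueue(label: "enqueue")) {
    while videoRenderer.isReadyForMoreMediaData {
        guard let sampleBuffer = makeNextSampleBuffer() else {
            videoRenderer.stopRequestingMediaData()
            return
        }
        videoRenderer.enqueue(sampleBuffer)
    }
}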
I'll close with a snippet that demonstrates creation of a sample buffer from individual pixel buffers.
import CoreMedia
import CoreVideo

// 1a. Create the tagged buffer for the left eye
let tags: [CMTag] = [
    .mediaType(.video),      // CMFormatDescription.MediaType
    .stereoView(.leftEye),   // CMStereoViewComponents
    .videoLayerID(Int64(0))
]
let leftTaggedBuffer = CMTaggedDynamicBuffer(
    tags: tags,
    content: .pixelBuffer(CVReadOnlyPixelBuffer(leftPixelBuffer))
)

// 1b. Adapt the above & repeat for the right eye ...

// 2. Collect the tagged buffers, presentation timestamp, and duration
let taggedBuffers = [leftTaggedBuffer, rightTaggedBuffer]
let presentationTimeStamp: CMTime // derive from the input buffers
let duration: CMTime              // derive from the input buffers

// 3. Assemble the sample buffer
let buffer = CMReadySampleBuffer(
    taggedBuffers: taggedBuffers,
    formatDescription: CMTaggedBufferGroupFormatDescription(taggedBuffers: taggedBuffers),
    presentationTimeStamp: presentationTimeStamp,
    duration: duration
)
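For step 2, if the left and right pixel buffers came out of an input CMSampleBuffer, the timing can usually be carried over directly (inputSampleBuffer is a placeholder name):

// Carry timing across from the source buffer (placeholder name).
let presentationTimeStamp = inputSampleBuffer.presentationTimeStamp
let duration = inputSampleBuffer.duration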
Please let us know if you have questions or need additional information.
Best,
Steve