We use an AVPlayer for video playback with a custom video composition to render the video frames. Occasionally we replace the item's video composition with a new one to alter the rendering. To display the changes, we check hasNewPixelBuffer(forItemTime:) and then call copyPixelBuffer(forItemTime:itemTimeForDisplay:) on the AVPlayerItemVideoOutput instance, obtaining the item time via itemTime(forHostTime: CACurrentMediaTime()) on the output. However, while the player is paused no new pixel buffer is produced, so hasNewPixelBuffer(forItemTime:) returns false. How can we force a re-render of the current frame?
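For reference, this is roughly our frame-pull path (simplified; `render(_:)` stands in for our actual drawing code):

```swift
import AVFoundation
import QuartzCore

// Simplified sketch of how we pull frames. `videoOutput` is the
// AVPlayerItemVideoOutput attached to the current AVPlayerItem.
func pullFrameIfAvailable(from videoOutput: AVPlayerItemVideoOutput) {
    // Map the current host time onto the item's timeline.
    let itemTime = videoOutput.itemTime(forHostTime: CACurrentMediaTime())

    // While the player is paused, no new frame is produced, so this
    // returns false and we never reach copyPixelBuffer(...).
    guard videoOutput.hasNewPixelBuffer(forItemTime: itemTime) else { return }

    if let pixelBuffer = videoOutput.copyPixelBuffer(forItemTime: itemTime,
                                                     itemTimeForDisplay: nil) {
        render(pixelBuffer) // placeholder for our display code
    }
}
```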
According to https://developer.apple.com/library/archive/qa/qa1966/_index.html, re-setting the video composition should trigger a re-render, but that does not work for us. When we set a composition a second time, the composition set just before it is applied instead, so the rendering is always one composition behind. The video also "jumps" slightly, i.e. a slightly different frame is rendered.
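This is a minimal sketch of the workaround as we understood it from the QA (the function name is ours, not Apple's):

```swift
import AVFoundation

// Our attempt at the QA1966 approach: re-assigning the item's video
// composition while paused, to force the output to produce a fresh
// pixel buffer rendered with the new composition.
func applyComposition(_ composition: AVVideoComposition,
                      to playerItem: AVPlayerItem) {
    playerItem.videoComposition = composition
    // Expected: the next hasNewPixelBuffer(forItemTime:) call returns
    // true and copyPixelBuffer(...) yields a frame rendered with
    // `composition`.
    // Observed: the frame is rendered with the *previously* assigned
    // composition, and the displayed frame shifts slightly in time.
}
```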