Is the output frame rate of a CMIOExtension rounded or capped?

I made a CMIOExtension (a virtual camera) that generates its own output, for use in our in-house software testing. I wanted to make a video source with 29.97, 30, 59.94 and 60 fps output.

To this end, I created a CMIOExtensionDeviceSource that creates a CMIOExtensionDevice with one CMIOExtensionStreamSource whose stream formats ([CMIOExtensionStreamFormat]) include one with both maxFrameDuration and minFrameDuration set to CMTimeMake(value: 1000, timescale: 30000) (exactly 30 fps) and another with both set to CMTimeMake(value: 1001, timescale: 30000) (29.97 fps).
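For concreteness, a minimal sketch of that setup (the 1920x1080 BGRA format description is an assumption here; the real extension builds its own):

import CoreMedia
import CoreMediaIO
import CoreVideo

// A size-specific video format description; error handling elided.
var desc: CMFormatDescription?
CMVideoFormatDescriptionCreate(allocator: kCFAllocatorDefault,
                               codecType: kCVPixelFormatType_32BGRA,
                               width: 1920, height: 1080,
                               extensions: nil,
                               formatDescriptionOut: &desc)

// One format pinned to exactly 30 fps (min == max, no valid-duration list)...
let format30 = CMIOExtensionStreamFormat(formatDescription: desc!,
                                         maxFrameDuration: CMTimeMake(value: 1000, timescale: 30000),
                                         minFrameDuration: CMTimeMake(value: 1000, timescale: 30000),
                                         validFrameDurations: nil)

// ...and a second format pinned to 29.97 fps (the NTSC rate).
let format2997 = CMIOExtensionStreamFormat(formatDescription: desc!,
                                           maxFrameDuration: CMTimeMake(value: 1001, timescale: 30000),
                                           minFrameDuration: CMTimeMake(value: 1001, timescale: 30000),
                                           validFrameDurations: nil)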

I've held off on creating the 59.94/60 fps source until this problem is resolved.

My virtual camera works and produces a signal, but when I examine its associated AVCaptureDevice in the debugger, I find:

(lldb) po self.captureDevice?.formats[0].videoSupportedFrameRateRanges[0].maxFrameDuration
▿ Optional<CMTime>
  ▿ some : CMTime
    - value : 1000000
    - timescale : 30000000
    ▿ flags : CMTimeFlags
      - rawValue : 1
    - epoch : 0

I get the same value, 1000000/30000000, or exactly 30 fps, for all the formats of my AVCaptureDevice.

Is there something I'm doing wrong, or do CMIOExtensionDevices always round the frame rates?

I can't force CoreMediaIO to produce frames at exactly my desired frame interval, but I'd like to ensure that the average frame rate matches the desired rate. How can I do that? Frame emission is currently governed by a repeating DispatchSourceTimer whose repeat interval is specified in nanoseconds, with TimerFlags set to .strict, roughly as sketched below.
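The emission loop looks roughly like this (the queue label and emitFrame callback are placeholders):

import Dispatch

let queue = DispatchQueue(label: "camera.frame.generator")
let timer = DispatchSource.makeTimerSource(flags: .strict, queue: queue)
// 1/30 s is 33,333,333.3 ns; the integer interval necessarily truncates,
// and a repeating timer replays that rounding error on every tick.
let interval = DispatchTimeInterval.nanoseconds(33_333_333)
timer.schedule(deadline: .now(), repeating: interval, leeway: .nanoseconds(0))
timer.setEventHandler {
    // emitFrame(): deliver the next sample buffer to the stream (placeholder)
}
timer.resume()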

Hi there! Can you share a little more information: what app are you testing with? Is it a native or Catalyst app? macOS or iOS?

Sorry about the delay; I didn't see your message on the day it was posted (that little bell icon doesn't do much).

I figured out what I was doing wrong but hadn't got around to replying to my own post.

I was using this initializer:

@nonobjc public convenience init(formatDescription: CMFormatDescription, maxFrameDuration: CMTime, minFrameDuration: CMTime, validFrameDurations: [CMTime]?)

In previous code, I'd created a format like this (pseudocode):

CMIOExtensionStreamFormat(formatDescription: description, maxFrameDuration: 30 fps, minFrameDuration: 30 fps, validFrameDurations: nil)

which seemed to work fine - my virtual camera had a single format at 30 fps. When I made a generator with multiple formats, I tried to add another format with a different frame rate like this:

CMIOExtensionStreamFormat(formatDescription: description, maxFrameDuration: 29.97 fps, minFrameDuration: 29.97 fps, validFrameDurations: nil)

and ended up with an AVCaptureDevice that offered two formats, both with min and max frame durations of 30 fps. That was the wrong thing to do.

I only need a new CMIOExtensionStreamFormat for a different output stream size (all my streams use the same pixel format). Each size gets one format with multiple frame rates - pseudocode:

CMIOExtensionStreamFormat(formatDescription: description, maxFrameDuration: 29.97 fps, minFrameDuration: 60 fps, validFrameDurations: [29.97 fps, 30 fps, 59.94 fps, 60 fps])

That works.
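In Swift, a sketch of that corrected setup (the 1920x1080 BGRA description is an assumption; each real output size would get its own):

import CoreMedia
import CoreMediaIO
import CoreVideo

// Frame durations for 29.97, 30, 59.94 and 60 fps, longest (slowest) first.
let durations: [CMTime] = [
    CMTimeMake(value: 1001, timescale: 30000), // 29.97 fps
    CMTimeMake(value: 1000, timescale: 30000), // 30 fps
    CMTimeMake(value: 1001, timescale: 60000), // 59.94 fps
    CMTimeMake(value: 1000, timescale: 60000), // 60 fps
]

// One size-specific format description.
var desc: CMFormatDescription?
CMVideoFormatDescriptionCreate(allocator: kCFAllocatorDefault,
                               codecType: kCVPixelFormatType_32BGRA,
                               width: 1920, height: 1080,
                               extensions: nil,
                               formatDescriptionOut: &desc)

// A single stream format advertising all four rates: maxFrameDuration is the
// longest duration (slowest rate) and minFrameDuration the shortest (fastest).
let streamFormat = CMIOExtensionStreamFormat(formatDescription: desc!,
                                             maxFrameDuration: durations.first!,
                                             minFrameDuration: durations.last!,
                                             validFrameDurations: durations)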

Apple's built-in webcam on my laptop can provide arbitrary frame rates between 1 and 30 fps, while most UVC cameras only support a limited, fixed set of frame rates.

With that mystery solved, my remaining question is how best to deliver an actual frame rate as close as possible to the promised value. Currently I do this by counting frames since the start of generation and setting a one-shot timer to fire at the anticipated time of the next frame (startTime + frameCount*frameDuration), rather than using a repeating timer; a sketch follows.
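A sketch of that scheme (class and property names are illustrative):

import CoreMedia
import Dispatch

final class FrameClock {
    private let queue = DispatchQueue(label: "camera.frame.clock")
    private let frameDuration: CMTime            // e.g. 1001/30000 for 29.97 fps
    private let emitFrame: () -> Void            // placeholder for sample delivery
    private var startNanos: UInt64 = 0
    private var frameCount: Int64 = 0
    private var timer: DispatchSourceTimer?

    init(frameDuration: CMTime, emitFrame: @escaping () -> Void) {
        self.frameDuration = frameDuration
        self.emitFrame = emitFrame
    }

    func start() {
        startNanos = DispatchTime.now().uptimeNanoseconds
        frameCount = 0
        scheduleNext()
    }

    private func scheduleNext() {
        // Absolute deadline: startTime + frameCount * frameDuration, computed
        // in exact integer arithmetic (whole seconds first, then the remainder)
        // so truncation stays below one nanosecond and never accumulates.
        let total = frameCount * frameDuration.value
        let scale = Int64(frameDuration.timescale)
        let elapsed = (total / scale) * 1_000_000_000 + (total % scale) * 1_000_000_000 / scale
        let deadline = DispatchTime(uptimeNanoseconds: startNanos + UInt64(elapsed))

        let t = DispatchSource.makeTimerSource(flags: .strict, queue: queue)
        t.schedule(deadline: deadline)           // one-shot; no repeat interval
        t.setEventHandler { [weak self] in
            guard let self = self else { return }
            self.emitFrame()
            self.frameCount += 1
            self.scheduleNext()
        }
        t.resume()
        timer = t                                // keep the source alive
    }
}

Because each deadline is absolute, a late frame makes the next timer fire immediately rather than pushing every subsequent frame later, so the long-run average converges on the promised rate.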
