What is a good way to track the time ranges associated with a particular action classifier label?

After creating a custom action classifier in Create ML, previewing it (see the bottom of the page) with an input video shows the label associated with a segment of the video. What would be a good way to store the duration for a given label, say, the CMTimeRange of each segment of video frames classified as containing "Jumping Jacks"?

I previously found that storing time ranges of trajectory results was convenient, since each VNTrajectoryObservation vended by Apple had an associated CMTimeRange.

However, with my custom action classifier, each VNObservation result's CMTimeRange has a duration that is always zero.

func completionHandler(request: VNRequest, error: Error?) {
    guard let results = request.results as? [VNHumanBodyPoseObservation] else {
        return
    }

    // Keep a window of pose observations to feed the action classifier.
    if let result = results.first {
        storeObservation(result)
    }

    do {
        for result in results {
            // Only keep frames my classifier labels as "Playing".
            guard try self.getLastTennisActionType(from: [result]) == .playing else {
                continue
            }
            // Make the observation's time range relative to the output file.
            var fileRelativeTimeRange = result.timeRange
            fileRelativeTimeRange.start = fileRelativeTimeRange.start - self.assetWriterStartTime
            // Keyed by whole seconds, so at most one range per second is stored.
            self.timeRangesOfInterest[Int(fileRelativeTimeRange.start.seconds)] = fileRelativeTimeRange
        }
    } catch {
        print("Unable to perform the request: \(error.localizedDescription)")
    }
}

In this case I'm interested in frames with the label "Playing," and I can classify them successfully, but I'm not sure how to go from these per-frame results to the duration of video segments whose consecutive frames all carry that label.
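The best approach I can think of is to coalesce consecutive same-label frames into ranges myself after the fact. Here's a rough sketch of what I mean; LabeledFrame and segments(for:in:frameDuration:) are names I made up for illustration, not Vision API, and this assumes I record a label and presentation timestamp per classified frame:

import CoreMedia

// Hypothetical per-frame record populated as each frame is classified.
struct LabeledFrame {
    let label: String
    let time: CMTime   // presentation timestamp of the frame
}

// Merge runs of consecutive frames sharing `label` into CMTimeRanges.
func segments(for label: String,
              in frames: [LabeledFrame],
              frameDuration: CMTime) -> [CMTimeRange] {
    var ranges: [CMTimeRange] = []
    var runStart: CMTime?
    var lastTime: CMTime = .zero

    for frame in frames {
        if frame.label == label {
            if runStart == nil { runStart = frame.time }
            lastTime = frame.time
        } else if let start = runStart {
            // Run ended: close the segment, extending through the last frame.
            ranges.append(CMTimeRange(start: start, end: lastTime + frameDuration))
            runStart = nil
        }
    }
    // Close a run that extends to the end of the video.
    if let start = runStart {
        ranges.append(CMTimeRange(start: start, end: lastTime + frameDuration))
    }
    return ranges
}

But this feels like I'd be rebuilding something the framework might already provide, and I'm not sure it's the intended pattern.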

What is a good way to track the time ranges associated with a particular action classifier label?