I am attempting to write an app that captures the flight of a ball from the iPhone's video preview, but I need some help.
I am using the following code:
request = VNDetectTrajectoriesRequest(frameAnalysisSpacing: frameCnt, trajectoryLength: trajLength, completionHandler: completionHandler)
to initiate a request to capture a "ball" from a videoPreview.
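For context, the surrounding setup looks roughly like this (the frameAnalysisSpacing and trajectoryLength values here are illustrative; I varied them during testing):

```swift
import Vision
import CoreMedia

// Illustrative values only, not my exact settings.
// A frameAnalysisSpacing of .zero asks Vision to analyze every frame.
let frameCnt = CMTime.zero
// trajectoryLength is the number of points a trajectory needs before
// it is reported (the documented minimum is 5).
let trajLength = 10

let request = VNDetectTrajectoriesRequest(
    frameAnalysisSpacing: frameCnt,
    trajectoryLength: trajLength,
    completionHandler: completionHandler)
```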
In the "completionHandler" I use:
guard let observations = request.results as? [VNTrajectoryObservation] else {
    //print("observations not set up#######")
    return
}
to capture observations.
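In full, the completion handler is roughly the following (the logging inside the loop is just what I use to inspect results):

```swift
import Vision

func completionHandler(request: VNRequest, error: Error?) {
    guard let observations = request.results as? [VNTrajectoryObservation] else {
        //print("observations not set up#######")
        return
    }
    for observation in observations {
        // detectedPoints are in Vision's normalized (0...1) coordinate
        // space, origin at the lower left, so they must be converted
        // before drawing over the preview layer.
        let points = observation.detectedPoints.map { CGPoint(x: $0.x, y: $0.y) }
        print("trajectory: \(points.count) points, confidence \(observation.confidence)")
    }
}
```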
In the video capture setup I am using captureSession!.sessionPreset = .hd1920x1080
In the AVCaptureVideoDataOutputSampleBufferDelegate, I am using
trajectoryQueue.async { [self] in
    do {
        try sequenceHandler.perform([request], on: sampleBuffer, orientation: .right)
    } catch {
        print("VNSequenceRequestHandler perform error: \(error)")
    }
}
I have also tried using a VNImageRequestHandler to capture observations in the delegate,
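i.e. something along these lines, creating a fresh handler per frame but reusing the same request object (since VNDetectTrajectoriesRequest is stateful and accumulates evidence across frames):

```swift
import Vision

// Per-frame alternative tried in the delegate; the same `request`
// instance is reused across frames so it can keep its state.
let handler = VNImageRequestHandler(cmSampleBuffer: sampleBuffer,
                                    orientation: .right)
do {
    try handler.perform([request])
} catch {
    print("VNImageRequestHandler perform error: \(error)")
}
```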
A ball is detected only while it is rolling on the ground. If the ball is flying or bouncing, no observations are delivered. I have tried different frame counts and trajectory lengths with no effect.
I am now developing the app primarily on an iPhone 14 Pro running iOS 26.3.1. It should be noted that I started development on an old iPhone 6 Plus running iOS 15.7 with
captureSession!.sessionPreset = .vga640x480, and I did get some good results.
If I try the VGA resolution on the iPhone 14 Pro, I still see no ball flight.
The basis for my app is software from 5 years ago, so I'm hoping that there has been some development on ball tracking since then.
Thanks in advance for any help/suggestions.
Topic: Media Technologies
SubTopic: Video