Hi,
I have two questions regarding the ActionAndVision sample application.
After setting up the live AVCaptureSession, how exactly, and in which function, is the sample buffer handed to a vision handler to perform a Vision request (e.g. the getLastThrowType request)?
When and how is the captureOutput(...) func in the CameraViewController called? (line 268 ff)
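For context, the general delegate pattern behind this (a sketch of the usual AVFoundation flow, not the sample's exact code; the request name below is a placeholder): `captureOutput(_:didOutput:from:)` is never called directly. AVFoundation invokes it once per frame after the controller registers itself as the sample-buffer delegate of an `AVCaptureVideoDataOutput`, and each delivered buffer can then be wrapped in a `VNImageRequestHandler`.

```swift
import AVFoundation
import Vision

// Sketch, assuming a configured session; names are illustrative.
class CameraViewController: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    let videoDataOutput = AVCaptureVideoDataOutput()
    let outputQueue = DispatchQueue(label: "video.data.output")

    func setupOutput() {
        // Registering as delegate is what causes captureOutput(...) to be called.
        videoDataOutput.setSampleBufferDelegate(self, queue: outputQueue)
        if session.canAddOutput(videoDataOutput) {
            session.addOutput(videoDataOutput)
        }
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Called by AVFoundation for every captured frame on outputQueue.
        // The buffer is handed to Vision via an image request handler:
        let handler = VNImageRequestHandler(cmSampleBuffer: sampleBuffer,
                                            orientation: .up)
        // try? handler.perform([someVisionRequest])  // placeholder request
    }
}
```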
I appreciate any help, thank you very much.
Hi,
in which unit of measurement does VNDetectTrajectoriesRequest expect the diameters when specifying a minimum and maximum size?
For example: A soccer ball has a diameter of roughly 22 cm.
Is the expected size then (if the unit is meters)?
request.minimumObjectSize = 0.17 / videoWidth
request.maximumObjectSize = 0.27 / videoWidth
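A sketch of one reading of those properties, under the assumption (not confirmed here) that the sizes are normalized fractions of the frame dimension rather than meters; in that case the ball's apparent on-screen size in pixels would be estimated first, and the physical 22 cm diameter would not enter the computation directly. The pixel estimate below is purely illustrative.

```swift
import Vision
import CoreMedia

// Assumption: object sizes are fractions of the frame width, not meters.
let videoWidth: Float = 1920
let ballDiameterInPixels: Float = 40  // illustrative on-screen estimate

let request = VNDetectTrajectoriesRequest(frameAnalysisSpacing: .zero,
                                          trajectoryLength: 10,
                                          completionHandler: nil)
// Bracket the estimated apparent size with some tolerance.
request.minimumObjectSize = ballDiameterInPixels * 0.8 / videoWidth
request.maximumObjectSize = ballDiameterInPixels * 1.2 / videoWidth
```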
Thanks for your help!
Hi,
I've set up a captureSession and am now trying to set the framerate to 60. I am using an iPhone 12 Pro Max.
I am trying to set the frame rate with: videoDevice?.activeVideoMinFrameDuration = CMTimeMake(value: 1, timescale: 60), but printing my .activeFormat tells me my iPhone only supports 30 fps.
What am I doing wrong?
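One likely explanation (a sketch, not a confirmed diagnosis): `activeVideoMinFrameDuration` can only reach 60 fps if the device's *active format* supports it, and the default format may top out at 30. The usual approach is to search `device.formats` for one whose frame-rate range includes 60, set it as `activeFormat` inside a configuration lock, and only then set the frame durations:

```swift
import AVFoundation

// Sketch: select a 60 fps capable format before setting frame durations.
func configure60fps(on device: AVCaptureDevice) throws {
    guard let format = device.formats.first(where: { format in
        format.videoSupportedFrameRateRanges.contains { $0.maxFrameRate >= 60 }
    }) else { return }  // no 60 fps format available

    try device.lockForConfiguration()
    device.activeFormat = format
    device.activeVideoMinFrameDuration = CMTimeMake(value: 1, timescale: 60)
    device.activeVideoMaxFrameDuration = CMTimeMake(value: 1, timescale: 60)
    device.unlockForConfiguration()
}
```

In practice one may also want to filter formats by resolution, since several formats can support 60 fps at different dimensions.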
Thanks
Hi,
does anyone know which type of network Apple uses for classification in the Create ML app? Do they use CNNs?
Any help is appreciated.
Thanks.