@Francesco_Esimple from my experience, I was able to get PhotogrammetrySession running on the server side to process and reconstruct a 3D model from data captured on my iPhone. The captured data consisted of 1. the HEIC images and 2. a "Snapshot" folder containing the data that ObjectCaptureSession saved to its checkpointDirectory (check out the docs here).
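For reference, here's a minimal sketch of the capture side, assuming an iOS 17+ app; the directory names are just placeholders for wherever you stage the files before uploading them to your server:

```swift
import RealityKit // iOS 17+, ObjectCaptureSession

@MainActor
func startCapture() -> ObjectCaptureSession {
    // Placeholder directories; in practice these are whatever locations you
    // later zip up and upload to the server.
    let imagesDirectory = FileManager.default.temporaryDirectory
        .appendingPathComponent("Images/")
    let snapshotsDirectory = FileManager.default.temporaryDirectory
        .appendingPathComponent("Snapshots/")

    var configuration = ObjectCaptureSession.Configuration()
    // Persist the session's intermediate state (point cloud, etc.) so it can be
    // uploaded alongside the HEIC images for server-side reconstruction.
    configuration.checkpointDirectory = snapshotsDirectory

    let session = ObjectCaptureSession()
    session.start(imagesDirectory: imagesDirectory, configuration: configuration)
    return session
}
```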
I used Vapor as my server-side Swift framework, imported RealityKit's PhotogrammetrySession, and set its checkpointDirectory property to the directory that was uploaded to the server from the iPhone ObjectCaptureSession's checkpointDirectory. By doing this, the server-side PhotogrammetrySession was able to reuse the point cloud and other intermediate data together with the images to reconstruct the 3D model.
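The server-side setup looked roughly like the sketch below (macOS 14+). This is simplified, and the paths are placeholders for wherever your Vapor route saved the uploaded images and Snapshot folder:

```swift
import Foundation
import RealityKit // PhotogrammetrySession

func reconstruct() async throws {
    // Placeholder paths for data the iPhone uploaded to the server.
    let imagesFolder = URL(fileURLWithPath: "/var/uploads/capture-123/Images")
    let snapshotFolder = URL(fileURLWithPath: "/var/uploads/capture-123/Snapshots")
    let outputModel = URL(fileURLWithPath: "/var/uploads/capture-123/model.usdz")

    var configuration = PhotogrammetrySession.Configuration()
    // Point the server-side session at the checkpoint data captured on device,
    // so reconstruction can reuse the point cloud and other intermediate results.
    configuration.checkpointDirectory = snapshotFolder

    let session = try PhotogrammetrySession(input: imagesFolder,
                                            configuration: configuration)
    try session.process(requests: [.modelFile(url: outputModel, detail: .medium)])

    for try await output in session.outputs {
        switch output {
        case .processingComplete:
            print("Reconstruction finished: \(outputModel.path)")
        case .requestError(let request, let error):
            print("Request \(request) failed: \(error)")
        default:
            break
        }
    }
}
```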
I wasn't able to get access to the inner bits of data saved to the checkpointDirectory, though. The folder itself was a random collection of folders and .bin files, and opening them in a hex viewer, I could only decipher bundle paths like com.oc.PointCloud. Hopefully more features and documentation will come out to help us figure these things out.