Hello Apple Developer Team,
We are currently developing an enterprise medical navigation application for Apple Vision Pro and would like to request clarification regarding camera access through the currently available visionOS Enterprise APIs.
Our application scenario involves real-time medical/surgical navigation and instrument tracking in a professional enterprise environment.
We would like to better understand the following:
1. How many cameras on Apple Vision Pro are currently accessible through the Enterprise APIs?

2. Which specific cameras are accessible? For example:
   - Main RGB cameras
   - Passthrough cameras
   - Tracking cameras
   - Front-facing cameras
   - Depth sensors
   - LiDAR or structured-light sensors

3. Are simultaneous multi-camera streams supported?

4. Does the Enterprise API provide:
   - Real-time image frames
   - Camera intrinsic/extrinsic parameters
   - Stereo camera data
   - Depth information
   - Low-latency tracking-related data

5. Are there any restrictions regarding the use of Vision Pro cameras for:
   - Medical navigation
   - Surgical guidance
   - Instrument tracking
   - Enterprise healthcare software

6. Is Apple Vision Pro currently permitted or recommended for medical enterprise spatial-navigation workflows under the Enterprise APIs?
We would greatly appreciate any official clarification regarding the current capabilities and limitations of camera access on Apple Vision Pro for enterprise medical applications.
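For context, the sketch below shows how we currently understand main-camera access to work through the documented `CameraFrameProvider` Enterprise API on visionOS 2 (with an Enterprise license file and the `com.apple.developer.arkit.main-camera-access.allow` entitlement). This reflects our reading of the public documentation rather than tested behavior, which is partly why we are asking about the remaining cameras and data streams above:

```swift
import ARKit
import CoreVideo

/// Sketch: stream left main-camera frames via the visionOS Enterprise API.
/// Assumes the Enterprise license file is bundled and the
/// com.apple.developer.arkit.main-camera-access.allow entitlement is granted.
@MainActor
func streamMainCameraFrames() async {
    let session = ARKitSession()
    let provider = CameraFrameProvider()

    // Query the video formats exposed by the main (passthrough) cameras.
    let formats = CameraVideoFormat.supportedVideoFormats(
        for: .main, cameraPositions: [.left])
    guard let format = formats.first else { return }

    do {
        try await session.run([provider])
    } catch {
        print("ARKitSession failed to run: \(error)")
        return
    }

    // Consume frames as an async sequence.
    guard let updates = provider.cameraFrameUpdates(for: format) else { return }
    for await frame in updates {
        if let sample = frame.sample(for: .left) {
            let pixelBuffer: CVPixelBuffer = sample.pixelBuffer
            // sample.parameters carries per-frame intrinsics/extrinsics,
            // which is why we ask whether the other cameras expose the same.
            _ = pixelBuffer
        }
    }
}
```

In particular, we would like to confirm whether streams beyond the main cameras (tracking, depth, LiDAR) are reachable through this or any other Enterprise API surface.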