Technical Inquiry regarding iPhone LiDAR Specifications and ARKit Data Integrity

  1. Hardware Specifications: Regarding the LiDAR scanner in the iPhone 13/14/15/16/17 Pro series, could you please provide the following technical details for academic verification:
  • Point Cloud Density / Resolution: The effective resolution of the depth map.

  • Sampling Frequency: The sensor's refresh rate.

  • Accuracy Metrics: Official tolerance levels for depth accuracy as a function of distance (specifically within the 0.5 m – 2 m range).

  2. Data Acquisition Methodology: For a scientific thesis requiring high data integrity, does Apple recommend a custom ARKit implementation over third-party applications (e.g., Polycam) for accessing raw depth data? I need to confirm whether third-party apps typically apply smoothing or post-processing that would obscure the sensor's native performance, which must be avoided for my error analysis. (A minimal sketch of what I mean by a custom implementation is below.)
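
For illustration, this is the kind of setup I have in mind, assuming ARKit's .sceneDepth frame semantic is the unsmoothed path (the class name and wiring are mine, not Apple's):

```swift
import ARKit
import CoreVideo

final class RawDepthCapture: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let config = ARWorldTrackingConfiguration()
        // Request the per-frame LiDAR depth map. Using .smoothedSceneDepth instead
        // would add temporal filtering, which I want to avoid for error analysis.
        config.frameSemantics = [.sceneDepth]
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depth = frame.sceneDepth else { return }
        // depth.depthMap is a CVPixelBuffer of Float32 metric depths (meters);
        // depth.confidenceMap carries per-pixel ARConfidenceLevel values.
        // Buffers would be logged or stored here for offline analysis.
        print("depth frame at \(frame.timestamp): \(CVPixelBufferGetWidth(depth.depthMap))x\(CVPixelBufferGetHeight(depth.depthMap))")
    }
}
```

My assumption is that frame.sceneDepth bypasses the temporal smoothing applied by .smoothedSceneDepth (and, presumably, by apps such as Polycam); please correct me if that is wrong.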

Who are you addressing these questions to?

If the spec is not published (I assume you already performed a web search), developers here are not likely to have the information.

If you want to ask Apple directly, these forums are for developers and may not be the best place "for academic verification".

Googling the three words "apple dtof doe" will help you.

  • Apple's LiDAR 3D camera has 64 (16x4) physical laser emitters (VCSELs).
  • These are multiplied 3x3 by a diffractive optical element (DOE) into 576 laser pulses.
  • The returns are interpolated with the live RGB images to generate a 256x192 depthMap at 60 Hz (see the sketch below for reading it at runtime).
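
A minimal sketch (not an Apple sample; the helper name is illustrative) of reading one value out of that 256x192 buffer in a custom app:

```swift
import ARKit
import CoreVideo

/// Returns the metric depth (in meters) at the center of an ARFrame's LiDAR depth map.
func centerDepth(of frame: ARFrame) -> Float? {
    guard let depthMap = frame.sceneDepth?.depthMap else { return nil }
    let width = CVPixelBufferGetWidth(depthMap)    // 256 on current LiDAR devices
    let height = CVPixelBufferGetHeight(depthMap)  // 192 on current LiDAR devices

    CVPixelBufferLockBaseAddress(depthMap, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }
    // The buffer holds one Float32 per pixel; respect bytesPerRow padding.
    let bytesPerRow = CVPixelBufferGetBytesPerRow(depthMap)
    let row = base.advanced(by: (height / 2) * bytesPerRow)
        .assumingMemoryBound(to: Float32.self)
    return row[width / 2]
}
```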

We used an empirical error model of base_error(distance) = a + b * distance², where the distance is in meters, a = 0.001, and b = 0.00005.
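
The same model sketched in Swift; the function name, default parameters, and sample distances are just for illustration:

```swift
/// Empirical base error (in meters) of a LiDAR depth measurement
/// at a given distance (in meters): a constant offset plus a quadratic term.
func baseError(distance: Double, a: Double = 0.001, b: Double = 0.00005) -> Double {
    a + b * distance * distance
}

// Over the 0.5 m – 2 m range mentioned in the question:
// baseError(distance: 0.5)  // 0.0010125 m (~1.0 mm)
// baseError(distance: 2.0)  // 0.0012 m    (1.2 mm)
```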

For demo apps, explore the GitHub repo: https://github.com/CurvSurf/FindSurface-iOS
