Reply to LiDAR Projector Pattern iPhone 15 Pro vs. 12 Pro – Research Project Question
Multiple Apple patents describe measuring where the IR dots land on the SPAD sensor. The algorithm is implemented as hardware switching beneath the SPAD array, so the position (SPAD pixel coordinates) and distance values are computed almost instantly. These sparse SPAD measurements are then interpolated with the live RGB video stream by Apple's machine-learning algorithms to produce a real-time depth map stream.
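Whatever happens at the silicon level, the fused result is what ARKit exposes per frame. A minimal Swift sketch, assuming a LiDAR-equipped device, that reads the fused depth map via ARFrame.sceneDepth (the DepthReader class name is illustrative; the ARKit APIs are real):

```swift
import ARKit

final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // sceneDepth requires LiDAR hardware (iPhone 12 Pro and later Pro models).
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = .sceneDepth
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        // The fused depth map is a Float32 CVPixelBuffer in meters, 256x192.
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        print("depth map: \(width)x\(height)")
    }
}
```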
1w
Reply to Technical Inquiry regarding iPhone LiDAR Specifications and ARKit Data Integrity
Googling the three words "apple dtof doe" will help you. Apple's LiDAR 3D camera has 64 (16x4) physical laser emitters (VCSELs), which are multiplied 3x3 by a DOE (diffractive optical element) into 576 laser pulses. These sparse measurements are interpolated with live RGB images to generate a 256x192 depthMap at 60 Hz. We used an empirical error model of base_error(distance) = a + b * distance², where distance is in meters, a = 0.001, and b = 0.00005. For demo apps, explore the GitHub repo: https://github.com/CurvSurf/FindSurface-iOS
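A minimal Swift sketch of that error model, assuming the quadratic form stated above (the function name baseError is mine, not from the repo):

```swift
// Empirical LiDAR error model: base_error(d) = a + b * d², d in meters.
func baseError(distance: Double) -> Double {
    let a = 0.001    // meters: constant noise floor
    let b = 0.00005  // meters per square meter: distance-dependent term
    return a + b * distance * distance
}

// Example: at 3 m, base error = 0.001 + 0.00005 * 9 = 0.00145 m (1.45 mm).
print(baseError(distance: 3.0))
```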
2w