Reply to LiDAR Projector Pattern iPhone 15 Pro vs. 12 Pro – Research Project Question
Multiple Apple patents describe measuring where the IR dots land on the SPAD sensor. The algorithm is implemented as hardware switching beneath the SPAD sensor, so the position (SPAD pixel coordinates) and distance values are computed instantly. These SPAD measurements are then interpolated with the RGB video stream by Apple's AI algorithms to produce a depth-map stream in real time.
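That internal pipeline is not visible to developers; what ARKit hands out is the end product. A minimal sketch of reading that fused depth-map stream via the public ARKit API (standard calls only, nothing Apple-internal):

```swift
import ARKit
import CoreVideo

// Minimal sketch: receiving the fused depth-map stream described above.
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // Ask ARKit for per-frame scene depth (the LiDAR + RGB fusion result).
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depth = frame.sceneDepth else { return }
        let map = depth.depthMap             // CVPixelBuffer of Float32 depth, in meters
        let w = CVPixelBufferGetWidth(map)   // 256 on current devices
        let h = CVPixelBufferGetHeight(map)  // 192 on current devices
        print("depth map \(w)x\(h)")
    }
}
```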
Topic: Spatial Computing, SubTopic: ARKit
Jan ’26
Reply to Technical Inquiry regarding iPhone LiDAR Specifications and ARKit Data Integrity
Googling the three words "apple dtof doe" will help you. The Apple LiDAR 3D camera has 64 (16x4) physical laser emitters (VCSELs). A diffractive optical element (DOE) multiplies them 3x3 into 576 laser pulses. These are interpolated with the live RGB images to generate the 256x192 depthMap at 60 Hz. We used an empirical error model of base_error(distance) = a + b * distance^2 (distance in meters), with a = 0.001 and b = 0.00005. For demo apps, explore the GitHub repo: https://github.com/CurvSurf/FindSurface-iOS
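For clarity, a minimal sketch of that empirical error model in Swift (our own fit, not an Apple specification):

```swift
import Foundation

// Empirical LiDAR error model (our own fit, not an Apple specification):
// base_error(d) = a + b * d^2, with d in meters.
let a = 0.001      // 1 mm error floor
let b = 0.00005    // quadratic growth coefficient

func baseError(atDistance d: Double) -> Double {
    a + b * d * d
}

// Expected error at a few working distances:
for d in [0.5, 1.0, 2.0, 5.0] {
    print(String(format: "d = %.1f m  ->  base_error = %.5f m", d, baseError(atDistance: d)))
}
```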
Jan ’26
Reply to LiDAR Projector Pattern iPhone 15 Pro vs. 12 Pro – Research Project Question
The question is whether any API exposes the 111 or 576 raw laser points.
Topic: Spatial Computing, SubTopic: ARKit
Jan ’26
Reply to LiDAR Projector Pattern iPhone 15 Pro vs. 12 Pro – Research Project Question
How the 64 (multiplied by a DOE to 576) or 111 (without a DOE) VCSELs are arranged plays no role. The SPAD pixel coordinates and the dToF timing (i.e., distance) are what ultimately matter. Apple does not expose them through any API. Our point is that the SPAD pixel coordinates and the dToF timing (i.e., distance) should be opened up.
Topic: Spatial Computing, SubTopic: ARKit
Jan ’26
Reply to LiDAR Projector Pattern iPhone 15 Pro vs. 12 Pro – Research Project Question
ARKit offers developers the following:

- Depth data from raw laser points (111 or 576): currently unknown
- Depth map image (256x192): iOS/iPadOS
- Mesh data: iOS/iPadOS, visionOS

I've never heard of Apple opening up an API for the raw laser points.
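For the paths that are available, a minimal sketch of enabling the mesh-data product via the standard ARKit API (the depth map is requested analogously through the .sceneDepth frame semantic):

```swift
import ARKit

// Sketch of the mesh-data path listed above: LiDAR-derived geometry
// arrives as ARMeshAnchor updates once scene reconstruction is enabled.
func makeMeshConfiguration() -> ARWorldTrackingConfiguration {
    let config = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }
    return config
}

// Belongs in your ARSessionDelegate:
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for case let meshAnchor as ARMeshAnchor in anchors {
        let g = meshAnchor.geometry
        print("mesh chunk: \(g.vertices.count) vertices, \(g.faces.count) faces")
    }
}
```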
Topic: Spatial Computing, SubTopic: ARKit
Jan ’26
Reply to LiDAR Projector Pattern iPhone 15 Pro vs. 12 Pro – Research Project Question
Interesting... My initial impression is that the IR dots on the door indicate that no DOE is used, i.e., 111 VCSELs are employed. The advantage of not using a DOE would be uniform sensitivity across the SPAD image plane. And, of course, the LiDAR 3D camera will be thinner.
Topic: Spatial Computing, SubTopic: ARKit
Jan ’26
Reply to Spatial Computing, ARPointCloud (rawFeaturePoints)
Now, the distance limits of raw feature points generated by Apple ARKit and Google ARCore (YouTube videos):

- Apple ARKit: much larger than 10 m; 65 m? W2laF37YdB4
- Google ARCore: ca. 65 m, UxBIon3GnXs
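For anyone reproducing this measurement, a minimal sketch of checking the reach of the raw feature points per frame with the standard ARKit API:

```swift
import ARKit
import simd

// Sketch: farthest raw feature point from the camera in one frame,
// the quantity being compared in the videos above.
func maxFeatureDistance(in frame: ARFrame) -> Float? {
    guard let points = frame.rawFeaturePoints?.points, !points.isEmpty else {
        return nil
    }
    // Camera position = translation column of the camera transform.
    let t = frame.camera.transform.columns.3
    let cameraPosition = simd_float3(t.x, t.y, t.z)
    return points.map { simd_distance($0, cameraPosition) }.max()
}
```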
Topic: Spatial Computing, SubTopic: ARKit
Jan ’26
Reply to Spatial Computing, ARPointCloud (rawFeaturePoints)
The distance limits of raw feature points generated by Apple ARKit and Google ARCore (YouTube videos):

- Apple ARKit: ca. 10 m, SIdQRiLj2jY
- Google ARCore: ca. 65 m, UxBIon3GnXs
Topic: Spatial Computing, SubTopic: ARKit
Nov ’25
Reply to Spatial Computing, ARPointCloud (rawFeaturePoints)
FB20473494
Topic: Spatial Computing, SubTopic: ARKit
Oct ’25
Reply to Spatial Computing, ARPointCloud (rawFeaturePoints)
The source code of the app is now public: a sample project for iOS/iPadOS that searches ARKit's rawFeaturePoints for geometries using FindSurface's real-time geometry-detection capabilities.
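Only the ARKit side is sketched here; `findSurface(in:)` below is a hypothetical stand-in for the library's actual detection call (see the repo for the real API):

```swift
import ARKit
import simd

// Sketch of the ARKit side only: gathering the point cloud that the
// sample feeds to the detector. `findSurface(in:)` is a hypothetical
// stand-in, not the real FindSurface API; see the linked repo.
func pointCloud(from frame: ARFrame) -> [simd_float3] {
    frame.rawFeaturePoints?.points ?? []
}

// Hypothetical usage:
// if let surface = findSurface(in: pointCloud(from: frame)) {
//     print("detected \(surface)")
// }
```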
Topic: Spatial Computing, SubTopic: ARKit
Sep ’25
Reply to Spatial Computing, ARPointCloud (rawFeaturePoints)
Sorry, here is the link: https://github.com/CurvSurf/FindSurface-iOS
Topic: Spatial Computing, SubTopic: ARKit
Sep ’25
Reply to Spatial Computing, ARPointCloud (rawFeaturePoints)
The source code of the app will be available on GitHub: https://github.com/CurvSurf/FindSurface-visionOS
Topic: Spatial Computing, SubTopic: ARKit
Sep ’25