LiDAR Projector Pattern iPhone 15 Pro vs. 12 Pro – Research Project Question

Dear Apple Team,

I’m a high school student (vocational upper secondary school) working on my final research project about LiDAR sensors in smartphones, specifically Apple’s iPhone implementation.

My current understanding (for context): I understand that Apple’s LiDAR uses dToF with SPAD detectors. A VCSEL emits laser pulses, a DOE splits the beam into a dot pattern, and each spot’s return time is measured separately → point cloud generation (a short worked sketch of this timing calculation follows my questions below).

My specific questions:

  1. How many active projection dots does the LiDAR projector have in the iPhone 15 Pro vs. iPhone 12 Pro?
  2. Are the dots static or do they shift/move over time?
  3. How many depth measurement points does the system deliver internally (after processing)?
  4. What is the ranging accuracy (cm-level precision) of each measurement point?
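For reference, the dToF ranging principle mentioned above reduces to a one-line calculation: distance = speed of light × round-trip time / 2. A minimal sketch with an assumed, purely illustrative round-trip time (not an Apple-specific value):

```swift
// dToF ranging principle: the emitted pulse travels to the target and
// back, so the one-way distance is half the round trip.
let c = 299_792_458.0       // speed of light in m/s
let roundTrip = 10e-9       // assumed round-trip time: 10 ns (illustrative)
let distanceMeters = c * roundTrip / 2
print(distanceMeters)       // ≈ 1.5 m
```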

Experimental background: Using an IR night vision camera, I counted approximately 111 dots on the 15 Pro vs. 576 dots on the 12 Pro. Do these match the internal specifications? Photos of my measurements are available if helpful.

Contact request: I would be very grateful if you could connect me with an Apple engineer or ARKit specialist who works with LiDAR technology. I would love to ask follow-up questions directly and would be happy to provide my contact details for this purpose.

These specifications would be essential for my research paper. Thank you very much in advance!

Best regards,
Max
Vocational Upper Secondary School Hans-Leipelt-Schule Donauwörth
Research Project: “LiDAR Sensor Technology in Smartphones”

Answered by DTS Engineer in 873109022

Thank you for reaching out and sharing your excellent research project! It's fantastic to see your dedication to understanding the intricacies of Apple's LiDAR technology, especially as a high school student. Your current understanding of dToF, SPAD detectors, VCSELs, and DOEs is very impressive and shows a solid grasp of the core concepts. I would recommend working on general LiDAR technology rather than just Apple’s implementation.

Regarding your specific questions, while we appreciate your detailed experimental observations, please understand that many of the specifics you're asking about involve proprietary hardware designs and internal processing details that Apple does not publicly disclose.

While we truly appreciate your enthusiasm and the depth of your interest, I am unable to facilitate direct contact between external individuals and our internal engineering teams or specialists for LiDAR. Our engineers are focused on product development, and direct engagement in this manner is not part of our standard support or outreach.

However, I encourage you to continue exploring the publicly available resources that Apple provides. The ARKit documentation is an excellent resource for understanding how developers interact with the depth data provided by the LiDAR Scanner; while it won't give you hardware specifics, it details the capabilities and outputs. Past Worldwide Developers Conference (WWDC) sessions also often include presentations on ARKit and the LiDAR Scanner, explaining new features and how they work.

Your project sounds fascinating, and your hands-on approach is commendable. Keep up the great work, and we wish you the best of luck with your final research!

Albert Pascual
  Worldwide Developer Relations.

Interesting ...

My initial impression is that the IR dot pattern on the door indicates that no DOE is used; instead, 111 individual VCSELs are employed. The advantage of not using a DOE would be uniform sensitivity across the SPAD image plane. The LiDAR 3D camera module can, of course, also be made thinner.

Multiple Apple patents describe measuring where the IR dots land on the SPAD array. The algorithm is implemented as hardware switching beneath the SPAD sensor, meaning the position (SPAD pixel coordinates) and distance values are computed instantly. These SPAD measurements are then interpolated in real time with the RGB video stream by Apple's AI algorithms to produce a depth-map stream.
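That fused stream is what ARKit ultimately exposes to developers. A minimal sketch of subscribing to it on a LiDAR-equipped device; the class name DepthReader is mine, but sceneDepth is the public API:

```swift
import ARKit

// Minimal sketch: receive the LiDAR-derived depth-map stream via ARKit.
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // sceneDepth is only supported on LiDAR-equipped devices.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depth = frame.sceneDepth else { return }
        // depthMap is the fused, already-interpolated result:
        // a CVPixelBuffer of Float32 meters (256x192 on current devices).
        let map = depth.depthMap
        print("depth map: \(CVPixelBufferGetWidth(map))x\(CVPixelBufferGetHeight(map))")
    }
}
```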

ARKit offers developers the following:

  • Depth data from raw laser points (111 or 576): currently unknown

  • Depth map image (256x192): iOS/iPadOS

  • Mesh data: iOS/iPadOS, visionOS.

I've never heard of Apple opening up the API for raw laser points.
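For completeness, the 256x192 depth map listed above can be sampled per pixel; every value it yields is already-interpolated output, not a raw laser point. A sketch assuming an ARFrame captured with the .sceneDepth semantic:

```swift
import ARKit

// Sketch: read the fused depth value (in meters) at a pixel coordinate.
func depthInMeters(x: Int, y: Int, frame: ARFrame) -> Float? {
    guard let map = frame.sceneDepth?.depthMap else { return nil }
    CVPixelBufferLockBaseAddress(map, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(map, .readOnly) }
    guard x >= 0, y >= 0,
          x < CVPixelBufferGetWidth(map), y < CVPixelBufferGetHeight(map),
          let base = CVPixelBufferGetBaseAddress(map) else { return nil }
    // The buffer stores one Float32 depth value per pixel.
    let bytesPerRow = CVPixelBufferGetBytesPerRow(map)
    let row = base.advanced(by: y * bytesPerRow)
                  .assumingMemoryBound(to: Float32.self)
    return row[x]
}
```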

How the VCSELs are arranged, whether 64 (expanded to 576 by a DOE) or 111 (without a DOE), plays no role. What ultimately matters are the SPAD pixel coordinates and the dToF timing (i.e., distance). Apple does not open that API.
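For intuition only: the dot counts quoted in this thread are consistent with a DOE that replicates each emitter into a 3x3 grid of diffraction orders, though that replication factor is an assumption on my part, not a published spec:

```swift
// Assumption, not a confirmed Apple spec: a DOE replicating each of
// 64 VCSEL spots into a 3x3 grid of diffraction orders yields 576 dots.
let vcsels = 64
let doeOrders = 3 * 3
print(vcsels * doeOrders)   // 576, matching the iPhone 12 Pro count above
```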

As stated above, it is the SPAD pixel coordinates and the dToF timing (i.e., distance) that would need to be opened.


The question is whether the API is open for 111 or 576 raw laser points.
