Dear Apple Team,
I’m a high school student (vocational upper secondary school) working on my final research project about LiDAR sensors in smartphones, specifically Apple’s iPhone implementation.
My current understanding (for context):
I understand that Apple’s LiDAR uses direct time-of-flight (dToF) with SPAD (single-photon avalanche diode) detectors: a VCSEL laser emits pulses, a diffractive optical element (DOE) splits the beam into a dot pattern, and the return time of each spot is measured individually to generate a point cloud.
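To check that I have the principle right, here is the simple round-trip calculation I am working from (my own illustration with a made-up return time, not Apple’s actual processing):

```swift
import Foundation

// Basic dToF geometry: a pulse travels to the target and back,
// so distance = speed of light * round-trip time / 2.
let speedOfLight = 299_792_458.0   // m/s
let roundTripTime = 6.67e-9        // s (hypothetical return from ~1 m away)
let distance = speedOfLight * roundTripTime / 2.0
print("Estimated distance: \(distance) m")  // prints roughly 1.0 m
```

This also shows why timing precision matters: a 1 cm distance error corresponds to only about 67 ps of round-trip time, which is why single-photon detectors with picosecond-scale timing are used.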
My specific questions:
- How many active projection dots does the LiDAR projector have in the iPhone 15 Pro vs. iPhone 12 Pro?
- Are the dots static or do they shift/move over time?
- How many depth measurement points does the system deliver internally (after processing)?
- What is the ranging accuracy (cm-level precision) of each measurement point?
Experimental background: Using an IR night-vision camera, I counted approximately 111 dots on the iPhone 15 Pro versus 576 dots on the iPhone 12 Pro. Do these counts match the internal specifications? Photos of my measurements are available if helpful.
Contact request: I would be very grateful if you could connect me with an Apple engineer or ARKit specialist who works with LiDAR technology. I would love to ask follow-up questions directly and would be happy to provide my contact details for this purpose.
These specifications would be essential for my research paper. Thank you very much in advance!
Best regards,
Max
Hans-Leipelt-Schule Donauwörth (Vocational Upper Secondary School)
Research Project: “LiDAR Sensor Technology in Smartphones”
Thank you for reaching out and sharing your excellent research project! It's fantastic to see your dedication to understanding the intricacies of Apple's LiDAR technology, especially as a high school student. Your current understanding of dToF, SPAD detectors, VCSELs, and DOEs is impressive and shows a solid grasp of the core concepts. Since much of Apple's implementation is not public, I would recommend working on general LiDAR technology rather than just Apple's implementation.
Regarding your questions: while we appreciate your detailed experimental observations, please understand that the specifics you're asking about (dot counts, internal measurement-point density, and per-point ranging accuracy) involve proprietary hardware designs and internal processing details that Apple does not publicly disclose.
While we truly appreciate your enthusiasm and the depth of your interest, I am unable to connect you directly with our internal engineering teams or LiDAR specialists. Our engineers are focused on product development, and direct engagement of this kind is not part of our standard support or outreach.
However, I encourage you to keep exploring the publicly available resources Apple provides. The ARKit developer documentation is an excellent resource for understanding how developers interact with the depth data provided by the LiDAR Scanner; while it won't give you hardware specifics, it details the system's capabilities and outputs. Past Worldwide Developers Conference (WWDC) sessions also often include presentations on ARKit and the LiDAR Scanner, explaining new features and how they work.
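For example, the public ARKit API lets you request the per-frame depth data that the LiDAR Scanner contributes to. Here is a minimal sketch using the documented scene-depth frame semantics (illustrative only, with error handling omitted):

```swift
import ARKit

// Minimal sketch: reading LiDAR-derived depth via the public ARKit API.
// This only uses the documented sceneDepth path; it does not expose
// hardware internals such as dot counts or per-point accuracy.
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth is only available on LiDAR-equipped devices.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
            print("Scene depth is not supported on this device.")
            return
        }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics = .sceneDepth
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // ARDepthData bundles a depth map (Float32 meters) with a
        // per-pixel confidence map.
        guard let depthData = frame.sceneDepth else { return }
        let depthMap = depthData.depthMap
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        print("Depth map resolution: \(width) x \(height)")
    }
}
```

Keep in mind that the depth map you receive this way is a processed output rather than the raw dot pattern, so it won't answer your hardware questions directly, but it does show what the system delivers to developers after processing.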
Your project sounds fascinating, and your hands-on approach is commendable. Keep up the great work, and we wish you the best of luck with your final research!
Albert Pascual
Worldwide Developer Relations