LiDAR Projector Pattern iPhone 15 Pro vs. 12 Pro – Research Project Question

Dear Apple Team,

I'm a high school student (vocational upper secondary school) working on my final research project about LiDAR sensors in smartphones, specifically Apple's iPhone implementation.

My current understanding (for context): Apple's LiDAR uses dToF (direct time-of-flight) with SPAD detectors. A VCSEL emits laser pulses, a DOE splits the beam into a dot pattern, and each spot's return time is measured separately to generate a point cloud (a worked distance calculation follows the questions below). My specific questions:

  1. How many active projection dots does the LiDAR projector have in the iPhone 15 Pro vs. iPhone 12 Pro?
  2. Are the dots static or do they shift/move over time?
  3. How many depth measurement points does the system deliver internally (after processing)?
  4. What is the ranging accuracy (cm-level precision) of each measurement point?
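
For context on question 4, the underlying dToF relationship is d = c·Δt/2, where Δt is the measured round-trip pulse time. A minimal sketch with illustrative numbers (not Apple specifications):

```swift
import Foundation

/// Basic dToF ranging: distance = speed of light × round-trip time / 2.
func dtofDistance(roundTripSeconds: Double) -> Double {
    let c = 299_792_458.0  // speed of light, m/s
    return c * roundTripSeconds / 2.0
}

// Illustrative: a target 2 m away returns the pulse after ~13.34 ns.
print(String(format: "%.3f m", dtofDistance(roundTripSeconds: 13.34e-9)))  // ≈ 2.000 m

// Conversely, 1 cm of range precision requires resolving ~67 ps of
// round-trip time — this is where SPAD timing resolution dominates.
```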

Experimental background: Using an IR night vision camera, I counted approximately 111 dots on the 15 Pro vs. 576 dots on the 12 Pro. Do these match the internal specifications? Photos of my measurements are available if helpful.

Contact request: I would be very grateful if you could connect me with an Apple engineer or ARKit specialist who works with LiDAR technology. I would love to ask follow-up questions directly and would be happy to provide my contact details for this purpose.

These specifications would be essential for my research paper. Thank you very much in advance!

Best regards,
Max
Vocational Upper Secondary School Hans-Leipelt-Schule Donauwörth
Research Project: “LiDAR Sensor Technology in Smartphones”

Interesting ...

My initial impression is that the IR dots on the door indicate that no DOE is used; instead, 111 individual VCSEL emitters are employed. The advantage of not using a DOE would be uniform sensitivity across the SPAD image plane, and a DOE-free LiDAR 3D camera module can, of course, also be thinner.
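
One way to sanity-check this against the counted dots: if a DOE replicated an emitter array k×k times, the observed count should factor as emitters × k². A rough sketch using the counts reported above (the tiling hypothesis is an assumption, not a confirmed specification):

```swift
/// Which k×k DOE tilings are numerically consistent with a dot count?
func doeHypotheses(dotCount: Int, maxTiling: Int = 5) -> [(emitters: Int, tiling: Int)] {
    var hypotheses: [(emitters: Int, tiling: Int)] = []
    for k in 2...maxTiling where dotCount % (k * k) == 0 {
        hypotheses.append((emitters: dotCount / (k * k), tiling: k))
    }
    return hypotheses
}

print(doeHypotheses(dotCount: 576))
// [(emitters: 144, tiling: 2), (emitters: 64, tiling: 3), (emitters: 36, tiling: 4)]
// e.g. a 64-emitter VCSEL behind a 3×3 DOE would yield 576 dots.

print(doeHypotheses(dotCount: 111))
// [] — 111 = 3 × 37 admits no square tiling, consistent with a DOE-free projector.
```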

Multiple Apple patents describe measuring where the IR dots land on the SPAD array. The algorithm is implemented as hardware switching beneath the SPAD sensor, so each dot's position (SPAD pixel coordinates) and distance are computed immediately. These sparse SPAD measurements are then interpolated with the RGB video stream by Apple's AI algorithms to produce a real-time depth map.
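
This densified depth stream is what ARKit exposes to developers, which also bears on question 3: the depth map ARKit delivers on LiDAR devices is typically 256×192 (≈49k points), far denser than the projected dot count. A minimal sketch of reading it (standard ARKit API; the resolution noted in the comments is typical, not guaranteed):

```swift
import ARKit

/// Minimal sketch: stream ARKit's LiDAR-derived depth map (ARDepthData).
final class DepthStreamer: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // .sceneDepth is only available on LiDAR-equipped devices.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        config.frameSemantics.insert(.sceneDepth)
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depth = frame.sceneDepth else { return }
        let map = depth.depthMap  // CVPixelBuffer, Float32 metres per pixel
        // On current LiDAR iPhones this is typically 256×192 = 49,152 points.
        print("Depth map:",
              CVPixelBufferGetWidth(map), "x", CVPixelBufferGetHeight(map))
    }
}
```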
