What are the values or data that can be obtained from a LiDAR sensor on an iPhone?
The LiDAR sensor can provide you with depth information about your environment. I am answering the question in the context of ARKit, but LiDAR data is also exposed through other frameworks, such as AVFoundation.
In ARKit, you can benefit from LiDAR data in the following ways:
- The Scene Depth frame semantics option can provide you with an ARDepthData object that contains a depth buffer and per-pixel confidence values for every frame (https://developer.apple.com/documentation/arkit/arframe/3566299-scenedepth). This lets you correlate the captured RGB image with depth information; see the first sketch below this list. This developer sample demonstrates how to create a fog effect using scene depth.
- The Scene Reconstruction API utilizes depth data to create a 3D reconstruction of your environment (https://developer.apple.com/documentation/arkit/arconfiguration/scenereconstruction); see the second sketch below. This developer sample explains how to visualize and interact with a reconstructed scene.
- Furthermore, LiDAR data is used under the hood to improve the speed and accuracy of many other ARKit features, such as placing objects with raycasting, tracking initialization, and surface detection. It also enables scene understanding in RealityKit, such as occlusion and physics based on the reconstructed 3D geometry (see the last sketch below), as well as 3D floor plan creation in RoomPlan.
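To make the Scene Depth point concrete, here is a minimal sketch of opting into scene depth and reading the per-frame buffers. The DepthReader class name and the way the buffers are consumed are my own, for illustration; in a real app you would feed them into Metal, Vision, or your own processing.

```swift
import ARKit

final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth requires a LiDAR-equipped device, so check support first.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }

        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics.insert(.sceneDepth)

        session.delegate = self
        session.run(configuration)
    }

    // Called once per frame; sceneDepth carries the LiDAR depth and confidence buffers.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depth = frame.sceneDepth else { return }

        let depthMap: CVPixelBuffer = depth.depthMap             // per-pixel distance from the camera, in meters
        let confidenceMap: CVPixelBuffer? = depth.confidenceMap  // ARConfidenceLevel per pixel

        // The depth buffer has a lower resolution than frame.capturedImage, but both come
        // from the same camera, so pixels can be correlated after scaling.
        _ = (depthMap, confidenceMap, frame.capturedImage)
    }
}
```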
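For the Scene Reconstruction API, the typical pattern is: check for support, opt into mesh reconstruction, then read the resulting ARMeshAnchor instances. A sketch (the helper function names are mine):

```swift
import ARKit

// Returns a configuration with mesh reconstruction enabled, or nil on non-LiDAR devices.
func makeSceneReconstructionConfiguration() -> ARWorldTrackingConfiguration? {
    guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else { return nil }

    let configuration = ARWorldTrackingConfiguration()
    configuration.sceneReconstruction = .mesh  // or .meshWithClassification
    return configuration
}

// ARKit delivers the reconstructed geometry as ARMeshAnchor instances on each frame.
func meshAnchors(in frame: ARFrame) -> [ARMeshAnchor] {
    frame.anchors.compactMap { $0 as? ARMeshAnchor }
}
```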
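And for the RealityKit side, occlusion and physics against the reconstructed mesh are opted into on the ARView. Again just a sketch, assuming you already have an ARView on screen:

```swift
import ARKit
import RealityKit

func enableSceneUnderstanding(on arView: ARView) {
    guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else { return }

    // Let RealityKit use the LiDAR scene mesh for occlusion and physics.
    arView.environment.sceneUnderstanding.options.insert(.occlusion)
    arView.environment.sceneUnderstanding.options.insert(.physics)

    let configuration = ARWorldTrackingConfiguration()
    configuration.sceneReconstruction = .mesh
    arView.session.run(configuration)
}
```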