How to get real-world distance from LiDAR?

Hello folks!

How can I get a real-world distance between the device (iPad Pro, 5th gen) and an object, measured with the LiDAR sensor?

Let's say I have a reticle in the middle of my CameraView and want to measure precisely from my position to the point I'm aiming at, almost like Apple's Measure app.

sceneDepth doesn't give me anything. I also looked into the sample code "Capturing Depth Using the LiDAR Camera".

Any ideas how to do that? A push in the right direction would also be very helpful.

Thanks in advance!

Replies

Hi Erik13,

Both of the ideas you mention could be used to achieve this. The values in a depth buffer indicate the distance to the real-world geometry at each pixel, which is precisely what you're interested in, so you're almost there. Let's assume you want to measure from the center of the screen.

  1. You could use ARKit's scene depth API. The sceneDepth property on ARFrame is nil by default; before you run your session, add the .sceneDepth frame semantic to your configuration's frameSemantics to instruct ARKit to populate this value with ARDepthData captured by the LiDAR sensor. You can then retrieve the depth buffer from ARDepthData's depthMap property. Each pixel gives you the depth value in meters at that location, which corresponds to the distance you want to measure, so you would look at the pixel corresponding to the screen-center location (see the first sketch after this list).

  2. Alternatively, you can use ARKit's raycasting API to perform a raycast from the center of the screen. Then you can compute the distance between the raycast result's transform (the location where the ray intersects the real-world geometry) and the camera transform (the location of your device); see the second sketch after this list.

  3. If you simply want to perform a measurement and do not want to render any 3D content in AR, you may not need ARKit at all and can use AVFoundation instead. Creating a capture session with the LiDAR camera will provide you with depth buffers; the developer sample Capturing Depth Using the LiDAR Camera that you already found is a good starting point, and a condensed sketch of the setup follows below. Similar to 1., you can inspect the depthData's depthDataMap and read the depth value for each pixel you're interested in.
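
For 1., a rough sketch of what that could look like (the function names here are just placeholders; the depth map is read at its center pixel, and sceneDepth delivers 32-bit float depth values in meters):

    import ARKit

    // Enable scene depth before running the session (check device support first).
    func runWithSceneDepth(_ session: ARSession) {
        let configuration = ARWorldTrackingConfiguration()
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            configuration.frameSemantics.insert(.sceneDepth)
        }
        session.run(configuration)
    }

    // Then, e.g. in session(_:didUpdate:), read the depth at the buffer's center.
    func centerDepthInMeters(of frame: ARFrame) -> Float? {
        guard let depthMap = frame.sceneDepth?.depthMap else { return nil }

        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

        guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)

        // sceneDepth's depthMap holds one 32-bit float per pixel, in meters.
        let centerRow = base.advanced(by: (height / 2) * rowBytes)
        return centerRow.assumingMemoryBound(to: Float32.self)[width / 2]
    }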
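
For 2., a minimal sketch assuming you render with a RealityKit ARView (with another renderer you would build the ARRaycastQuery from the frame or session instead):

    import ARKit
    import RealityKit
    import simd

    // Distance in meters from the device to whatever the screen center is aiming at.
    func distanceToScreenCenter(in arView: ARView) -> Float? {
        let screenCenter = CGPoint(x: arView.bounds.midX, y: arView.bounds.midY)

        // Ray-cast from the screen center against the estimated real-world geometry.
        guard let result = arView.raycast(from: screenCenter,
                                          allowing: .estimatedPlane,
                                          alignment: .any).first,
              let frame = arView.session.currentFrame else { return nil }

        // Compare the hit location with the camera (device) position.
        let cameraPosition = SIMD3<Float>(frame.camera.transform.columns.3.x,
                                          frame.camera.transform.columns.3.y,
                                          frame.camera.transform.columns.3.z)
        let hitPosition = SIMD3<Float>(result.worldTransform.columns.3.x,
                                       result.worldTransform.columns.3.y,
                                       result.worldTransform.columns.3.z)
        return simd_distance(cameraPosition, hitPosition)
    }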
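
For 3., a heavily condensed sketch of the AVFoundation setup; the sample code shows the full format selection and output synchronization, which I'm omitting here:

    import AVFoundation

    // Minimal LiDAR capture-session setup; error handling and video output omitted.
    func makeLiDARDepthSession(delegate: AVCaptureDepthDataOutputDelegate,
                               queue: DispatchQueue) -> AVCaptureSession? {
        guard let device = AVCaptureDevice.default(.builtInLiDARDepthCamera,
                                                   for: .video,
                                                   position: .back),
              let input = try? AVCaptureDeviceInput(device: device) else { return nil }

        let depthOutput = AVCaptureDepthDataOutput()
        depthOutput.isFilteringEnabled = true   // smoothed depth; set false for raw LiDAR values
        depthOutput.setDelegate(delegate, callbackQueue: queue)

        let session = AVCaptureSession()
        session.beginConfiguration()
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(depthOutput) { session.addOutput(depthOutput) }
        session.commitConfiguration()

        // The delegate's depthDataOutput(_:didOutput:timestamp:connection:) then
        // receives AVDepthData, whose depthDataMap you can sample per pixel.
        return session
    }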

Hi, I'm working with the same sample, Capturing Depth Using the LiDAR Camera. Long distances are fine, but when the distance is < 1 meter, the data is not accurate. What is the recommendation for measuring distances from 0.2 to 1 meter? Here is the code:

    let depthData = syncedDepthData.depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat16)
    let depthMap = depthData.depthDataMap
    let depthMapWidth = CVPixelBufferGetWidth(depthMap)
    let depthMapHeight = CVPixelBufferGetHeight(depthMap)
    let centerX = depthMapWidth / 2
    let centerY = depthMapHeight / 2
    print("Depth value at the center (x,y): \(centerX) \(centerY)")

    CVPixelBufferLockBaseAddress(depthMap, .readOnly)
    if let baseAddress = CVPixelBufferGetBaseAddress(depthMap) {
        // Index via bytes-per-row: rows can be padded, so indexing the raw buffer
        // with `y * width + x` may land on the wrong pixel.
        let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
        let rowData = (baseAddress + centerY * rowBytes).assumingMemoryBound(to: Float16.self)
        let depthPoint = rowData[centerX]         // depth at the center, in meters
        let depthPoint1 = rowData[centerX + 10]   // 10 pixels to the right of center
        let depthPoint2 = rowData[centerX + 100]  // 100 pixels to the right of center
        print("Depth value at the center in meters: \(depthPoint) \(depthPoint1) \(depthPoint2) meters")
    }
    CVPixelBufferUnlockBaseAddress(depthMap, .readOnly)