Extracting pixels with depth less than 3 meters from captured CVPixelBuffer type

Hello -

I'm new to Swift and SwiftUI, and I've been trying to capture video or images from the LiDAR sensor. I've looked at the examples provided by Apple and made a simple app that colors objects according to each pixel's depth: red for close, blue for far. Now, however, I want to filter the captured data buffer, which is a CVPixelBuffer. I'm not sure how to extract the pixels with a depth of less than 3 meters, for example, or less than 0.3 after normalizing the values.

My goal is something like the code below:

    var depthData = syncedDepthData.depthData

    if depthData.depthDataType != kCVPixelFormatType_DisparityFloat32 {
        depthData = depthData.converting(toDepthDataType: kCVPixelFormatType_DisparityFloat32)
    }

    // normalize() is the CVPixelBuffer extension from Apple's sample code.
    depthData.depthDataMap.normalize()

    // After getting the depth map, how can I extract all the pixels (or their
    // indices) whose depth is less than some value? In pseudocode:
    //
    //     depthData.depthDataMap.filter(depth: 3)
    //
    // The result I'm looking for is all the pixels with depth less than 3.
    // I'm sure there is a way to do this from the CPU.
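
To make the question concrete, here's a rough, untested sketch of the kind of CPU-side loop I imagine (the function name is just a placeholder of mine, and I'm assuming the buffer stores one Float32 per pixel, as the DepthFloat32/DisparityFloat32 formats do):

    import CoreVideo

    /// Collects the (x, y) coordinates of every pixel whose value is below
    /// `threshold`. Assumes one Float32 per pixel. Note that if the buffer
    /// holds disparity rather than depth, the comparison would flip, since
    /// disparity is roughly 1/depth.
    func pixels(below threshold: Float, in pixelBuffer: CVPixelBuffer) -> [(x: Int, y: Int)] {
        CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

        guard let baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer) else {
            return []
        }

        let width = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)
        let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)

        var matches: [(x: Int, y: Int)] = []
        for y in 0..<height {
            // Rows can be padded, so advance by bytesPerRow, not width * 4.
            let row = baseAddress.advanced(by: y * bytesPerRow)
                .assumingMemoryBound(to: Float32.self)
            for x in 0..<width where row[x] < threshold {
                matches.append((x: x, y: y))
            }
        }
        return matches
    }

Is this the right direction, or is there a more idiomatic or faster way (vDSP, Metal, Core Image) to do this kind of thresholding?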

Thanks for any help or information.

Matt
