ARKit LiDAR more accurate than AVFoundation LiDAR

The idea behind using AVFoundation is that it should provide higher-resolution, more accurate depth maps than ARKit, but in my tests on an iPhone 12 Pro and 13 Pro that isn't the case: the filtered depth map jumps around too much between depth values, and the unfiltered depth map has a lot of holes and sometimes goes entirely black.

Am I missing something? How should I be testing it?
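One thing worth checking is whether you are comparing against ARKit's temporally smoothed depth. As a hedged sketch (the class and method names around the session setup are illustrative; the ARKit APIs themselves — `smoothedSceneDepth`, `supportsFrameSemantics(_:)`, `ARDepthData` — are real, available on LiDAR devices since iOS 14), requesting `.smoothedSceneDepth` instead of raw `.sceneDepth` applies temporal averaging that typically reduces the frame-to-frame jumping described above:

```swift
import UIKit
import ARKit

// Sketch: requesting ARKit's temporally smoothed LiDAR depth (iOS 14+).
// .smoothedSceneDepth averages depth across frames, which usually reduces
// the jumping seen in the raw .sceneDepth map.
final class DepthViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        let config = ARWorldTrackingConfiguration()
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.smoothedSceneDepth) {
            config.frameSemantics.insert(.smoothedSceneDepth)
        }
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // On LiDAR devices the depth map is a 256x192 Float32 buffer in meters.
        guard let depth = frame.smoothedSceneDepth else { return }
        let depthMap: CVPixelBuffer = depth.depthMap
        // The confidence map lets you discard low-confidence pixels, which
        // helps with the holes seen in unfiltered depth.
        let confidence: CVPixelBuffer? = depth.confidenceMap
        _ = (depthMap, confidence)
    }
}
```

Filtering out pixels whose confidence is below `ARConfidenceLevel.medium` is a common way to trade coverage for stability when the raw map shows holes.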

Replies

As far as I have observed, Apple has been improving the SOFTWARE performance of LiDAR and ARKit since their releases in 2020 and 2017, respectively.

The overall performance of the combination of Apple's LiDAR and ARKit can be checked by using CurvSurf's FindSurface runtime library:

  1. 3D measurement accuracy of the LiDAR
  2. Robustness of the depth map against darkness
  3. Accuracy of ARKit's motion tracking
  4. Robustness of ARKit's motion tracking against darkness and device shake

CurvSurf's FindSurface runtime library determines the shape, size, position, and orientation of an object surface by processing the 3D measurement points via orthogonal distance fitting.
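FindSurface itself is a closed-source library, but the idea of orthogonal distance fitting can be illustrated with a minimal, self-contained sketch (this is not CurvSurf's code): a Gauss-Newton fit of a sphere that minimizes the orthogonal distances r_i = |p_i − c| − R from each measurement point to the surface, rather than an algebraic residual.

```swift
import Foundation

// Hedged sketch of orthogonal distance fitting (illustrative only, not
// CurvSurf's implementation): Gauss-Newton fit of a sphere (center c,
// radius R) minimizing the orthogonal distances r_i = |p_i - c| - R.
struct Sphere { var center: (x: Double, y: Double, z: Double); var radius: Double }

// Gaussian elimination with partial pivoting for the 4x4 normal equations.
func solve4(_ A: inout [[Double]], _ b: inout [Double]) -> [Double] {
    let n = 4
    for k in 0..<n {
        var p = k
        for i in (k + 1)..<n where abs(A[i][k]) > abs(A[p][k]) { p = i }
        A.swapAt(k, p); b.swapAt(k, p)
        for i in (k + 1)..<n {
            let f = A[i][k] / A[k][k]
            for j in k..<n { A[i][j] -= f * A[k][j] }
            b[i] -= f * b[k]
        }
    }
    var x = [Double](repeating: 0, count: n)
    for i in stride(from: n - 1, through: 0, by: -1) {
        var s = b[i]
        for j in (i + 1)..<n { s -= A[i][j] * x[j] }
        x[i] = s / A[i][i]
    }
    return x
}

func fitSphere(_ pts: [(Double, Double, Double)], iterations: Int = 20) -> Sphere {
    // Initial guess: centroid of the points and mean distance to it.
    let n = Double(pts.count)
    var c = pts.reduce((0.0, 0.0, 0.0)) { ($0.0 + $1.0 / n, $0.1 + $1.1 / n, $0.2 + $1.2 / n) }
    var R = pts.map { p in sqrt(pow(p.0 - c.0, 2) + pow(p.1 - c.1, 2) + pow(p.2 - c.2, 2)) }
               .reduce(0, +) / n
    for _ in 0..<iterations {
        var JtJ = [[Double]](repeating: [Double](repeating: 0, count: 4), count: 4)
        var Jtr = [Double](repeating: 0, count: 4)
        for p in pts {
            let dx = p.0 - c.0, dy = p.1 - c.1, dz = p.2 - c.2
            let d = max(sqrt(dx * dx + dy * dy + dz * dz), 1e-12)
            let r = d - R                         // orthogonal distance residual
            let J = [-dx / d, -dy / d, -dz / d, -1.0]  // dr/d(cx, cy, cz, R)
            for i in 0..<4 {
                Jtr[i] += J[i] * r
                for j in 0..<4 { JtJ[i][j] += J[i] * J[j] }
            }
        }
        var rhs = Jtr.map { -$0 }                 // solve (J^T J) step = -J^T r
        let step = solve4(&JtJ, &rhs)
        c = (c.0 + step[0], c.1 + step[1], c.2 + step[2])
        R += step[3]
    }
    return Sphere(center: c, radius: R)
}
```

Given points sampled from a sphere of known center and radius, the fit recovers those parameters; FindSurface applies the same principle to planes, cylinders, cones, and tori as well.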

The accuracy of the orthogonal distance fitting algorithms adopted by CurvSurf's FindSurface runtime library has been approved by the German PTB according to ISO 10360-6 (see my PhD thesis, ISBN 3540239669, 2004).

If there are any shortcomings in the overall performance of Apple's LiDAR and ARKit in the respects mentioned above, we will observe some misalignment between the real object surfaces and the virtual ads.

In the same way, we can check the overall performance of '3D measurement and motion tracking' of Google ARCore and Microsoft HoloLens.

Virtual ads inside Chungmuro Station, Line 3, on an iPhone 12 Pro: YouTube video BmKNmZCiMkw

The source code of the app that produced the above video is available on GitHub: CurvSurf/FindSurface-SceneKit-ARDemo-iOS