Developer information about the new LiDAR

Hello fellow devs,


First post. I've tried searching the docs and forums, but haven't found any useful specs on the new LiDAR capability. Needless to say, this is an amazing and revolutionary development - I'd like to know how and when this will be available for us as developers to start working with.


We have an existing CAD viewer app that would benefit tremendously from the increased precision and stability this might bring to ARKit.


Any pointers where to start looking would be most appreciated! Cheers!

Replies

Hello Bobjt, can you elaborate on how/if the use of the LiDAR hardware and ARConfiguration.SceneReconstruction.mesh or .meshWithClassification affects the world map generated in an AR session in terms of fidelity (e.g. robustness against varying lighting conditions) and reusability on other devices (persistent AR)?


My use case: I am pre-producing world maps for later use in the release version, and I'm wondering whether world maps generated with the LiDAR iPad are any improvement over world maps from, say, an iPhone 11 Pro. I am using ARKit with SceneKit.


Thanks!
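For reference, here is roughly how the two pieces fit together - a minimal sketch only, since whether the reconstructed mesh actually improves world-map fidelity is exactly the open question above. `startSession` and `saveWorldMap` are hypothetical helper names:

```swift
import ARKit

// Run a world-tracking session with mesh reconstruction enabled
// (only on devices that support it), then capture the ARWorldMap.
func startSession(on session: ARSession) {
    let config = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh   // or .meshWithClassification
    }
    session.run(config)
}

func saveWorldMap(from session: ARSession, to url: URL) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else { return }
        // The map archives feature points and anchors; whether mesh data
        // feeds into cross-device relocalization is not documented.
        if let data = try? NSKeyedArchiver.archivedData(
            withRootObject: map, requiringSecureCoding: true) {
            try? data.write(to: url)
        }
    }
}
```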

I have the new iPad Pro and the latest Xcode - there appears to be no beta at the moment. But when I download https://developer.apple.com/documentation/arkit/world_tracking/visualizing_and_interacting_with_a_reconstructed_scene

I get tons of errors, as if it isn't supported. And when I try to set up scene reconstruction manually in a new project, I also get errors. Are we waiting on a beta release before we can develop with this?


Has anyone been able to get the demo project to work?


thanks,

Dale
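My guess (an assumption, not confirmed): building the sample needs an Xcode/SDK version that includes the new scene-reconstruction APIs, and at runtime it only works on LiDAR hardware. A guard along these lines fails fast on unsupported devices:

```swift
import ARKit

// Scene reconstruction only runs on LiDAR-equipped devices; on other
// hardware the sample will not work, which may explain some errors.
func makeConfiguration() -> ARWorldTrackingConfiguration {
    guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else {
        fatalError("Scene reconstruction requires a LiDAR-equipped device.")
    }
    let config = ARWorldTrackingConfiguration()
    config.sceneReconstruction = .meshWithClassification
    return config
}
```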

Just got the new iPad Pro, and the tracking is the best I have seen on any platform I've used - pretty much on par with HoloLens 2. To that end, can you export the mesh as a USDZ file or something? The documentation is pretty sparse; I'd love to develop for this, but there's not much to go on.

Yes you can, see https://forums.developer.apple.com/thread/130599, which explains how to create an SCNScene with the mesh geometry. Once you have an SCNScene, you can use https://developer.apple.com/documentation/scenekit/scnscene/1523577-write to export to a supported file format.
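Assuming you have already converted the session's ARMeshAnchors into SCNGeometry (as the linked thread describes), the export step itself is a sketch like this:

```swift
import SceneKit

// Export an SCNScene to a file; the format (.usdz, .scn, ...) is
// inferred from the URL's file extension.
func export(scene: SCNScene, to url: URL) {
    _ = scene.write(to: url, options: nil, delegate: nil, progressHandler: nil)
}
```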

We are also very interested in accessing the LiDAR depth map directly.

I'm also curious how the Complete Anatomy app is taking advantage of the LiDAR:

https://www.youtube.com/watch?v=vkWdj9CNebg


The video suggests that the LiDAR is used to measure / pose track the body in real-time. However it is not a SLAM-type application where you walk around to mesh the environment.

So is Complete Anatomy only using the standard ARKit APIs for this or are there other APIs that take advantage of the LiDAR, e.g. body pose tracking?
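For comparison, body pose tracking has had its own configuration since ARKit 3, separate from world tracking; whether the LiDAR feeds into it is not documented, so this is only a sketch of the standard API:

```swift
import ARKit

// A minimal body-tracking session (ARKit 3+). Whether LiDAR improves
// this pipeline is an open question; the API itself predates LiDAR.
func startBodyTracking(on session: ARSession) {
    guard ARBodyTrackingConfiguration.isSupported else { return }
    session.run(ARBodyTrackingConfiguration())
    // ARBodyAnchor updates then arrive via the ARSessionDelegate
    // callback session(_:didUpdate:), carrying the tracked skeleton.
}
```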

I've now received a device and performed some initial testing using the sample application, and it looks very promising indeed. The performance is generally comparable to that of Magic Leap One or HoloLens 2, which is great.


One thing I noticed is that the reconstruction does not seem to take previous passes into account when visiting a space the second time (loop closure). Is there a setting for the individual mesh anchors to not be "overwritten" when there is a slight mismatch, and instead try to use as much as possible of the existing mesh for reconciliation/relocalization?

New sample code and documentation for ARKit 4 have been released since WWDC20.
Some resources are:
Creating a Fog Effect Using Scene Depth 
Visualizing a Point Cloud Using Scene Depth
https://developer.apple.com/videos/play/wwdc2020/10611/
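Per those resources, ARKit 4 exposes the LiDAR depth map through ARFrame.sceneDepth once the corresponding frame semantic is enabled - a minimal sketch:

```swift
import ARKit

// Opt in to per-frame LiDAR depth (ARKit 4), where supported.
let config = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
    config.frameSemantics.insert(.sceneDepth)
}
// Then, for each ARFrame delivered by the session:
// let depthMap = frame.sceneDepth?.depthMap            // CVPixelBuffer, meters
// let confidence = frame.sceneDepth?.confidenceMap     // per-pixel confidence
```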

I too am looking for this information; we are scanning objects at work to reconstruct them. The '3D Scanner App' seems to be the only one that supports the LiDAR well. I would like to use Autodesk's ReCap Pro, but it doesn't recognise the built-in LiDAR.
The 3D Scanner App shows a resolution tolerance of 3 mm to 20 mm. I wonder if these numbers are arbitrarily set by the developer or hard-locked by the built-in hardware. Can these numbers be pushed to a tighter tolerance?

It seems the only preloaded tool that uses the LiDAR is the Measure app that comes on the iPad. I wish Apple would develop, or integrate into Measure, the 'point cloud' mapping ability for creatives like me who do 3D printing, work with the Unreal Engine & dabble in code.

Great questions & looking forward to hearing more from Apple and other developers