ARKit 4 | Save point cloud

Hi,

I was playing around with this code released by Apple at WWDC20. When I sweep my iPad Pro around my room multiple times, a dense point cloud appears on the screen. I would like to save this point cloud for further processing. I've searched a lot over the last week but couldn't find anything. Does anyone know how this could be achieved?
What I want to save for each point:
  1. Coordinates

  2. Normals

  3. Color

Replies

Hi 3ventHoriz0n,

I'm working on the same problem.
I am apparently too stupid to find out where I can get the point cloud data from so I can save it to a file or post it to an API for rendering.

Would be awesome if someone could post a solution.

Regards,
Vincent
Hey Vincent,

Well, I figured out that it's stored in the Metal buffer, but I have no idea how to get it out of there.

Anand
Hello,

There are four pieces of information that you need to store to be able to recreate the point cloud for each frame:
  1. The sceneDepth depthMap texture of the current frame.

  2. The viewMatrix for the current frame and orientation.

  3. The camera intrinsics for the current frame.

  4. The capturedImage pixel buffer for the current frame (to color your points).

With that information stored, you can recreate the point cloud using the same approach that the sample you linked to uses.
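
For illustration, here is a minimal sketch of what capturing those four pieces per frame could look like in the session delegate. The FrameRecorder class and CapturedFrameData struct are just names made up for this sketch, and the interface orientation is hard-coded:

```swift
import ARKit
import CoreVideo
import UIKit
import simd

// Hypothetical container for the four per-frame pieces listed above.
struct CapturedFrameData {
    let depthMap: CVPixelBuffer        // 1. sceneDepth depth texture
    let viewMatrix: simd_float4x4      // 2. view matrix for the frame/orientation
    let intrinsics: simd_float3x3      // 3. camera intrinsics
    let capturedImage: CVPixelBuffer   // 4. camera image, for point colors
}

final class FrameRecorder: NSObject, ARSessionDelegate {
    private(set) var frames: [CapturedFrameData] = []

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // sceneDepth is only populated when the .sceneDepth frame semantics
        // are enabled and the device has a LiDAR scanner.
        guard let depthMap = frame.sceneDepth?.depthMap else { return }

        frames.append(CapturedFrameData(
            depthMap: depthMap,
            // Orientation is hard-coded for brevity; use your real interface orientation.
            viewMatrix: frame.camera.viewMatrix(for: .landscapeRight),
            intrinsics: frame.camera.intrinsics,
            capturedImage: frame.capturedImage
        ))
        // Note: ARKit recycles its pixel buffers from a small pool, so in a real
        // app you would copy them or write them to disk instead of retaining
        // every buffer like this.
    }
}
```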

Alternatively, you could store the contents of the buffer that holds the ParticleUniforms, but that may actually lose information: if you aren't constrained to real time, you could theoretically sample more points from the depth texture, whereas the sample you linked to only samples a fixed number of points per frame and only keeps a fixed number of points in the buffer.
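
If you do go the route of reading the sample's particle buffer back, a rough sketch could look like the following. The ParticleUniforms layout (position, color, confidence) and the point-count parameter are assumed to match the sample's ShaderTypes.h and its currentPointCount, so double-check against your copy:

```swift
import Metal
import simd

// Assumed to mirror the sample's ParticleUniforms struct from ShaderTypes.h;
// verify the field order and alignment against your copy of the project.
struct ParticleUniforms {
    var position: simd_float3
    var color: simd_float3
    var confidence: Float
}

/// Copies the particle data out of the Metal buffer into a Swift array so it
/// can be written to a file (PLY, JSON, ...). `particlesBuffer` is the raw
/// MTLBuffer the sample renders from, and `pointCount` is the number of valid
/// points currently in it.
func exportParticles(from particlesBuffer: MTLBuffer, pointCount: Int) -> [ParticleUniforms] {
    let pointer = particlesBuffer.contents()
        .bindMemory(to: ParticleUniforms.self, capacity: pointCount)
    return Array(UnsafeBufferPointer(start: pointer, count: pointCount))
}
```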

Also note that points do not have normal vectors. If you would like to derive some sort of normal vector, you will need to process the point cloud in some way to create surfaces, and then calculate the normals of those surfaces.
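
As a crude starting point for normals, you can estimate one per point from its local neighborhood. The sketch below just crosses the vectors to the two nearest neighbors, which is only illustrative; a real implementation would use a spatial index instead of brute force, fit a plane over a larger neighborhood (e.g. via PCA), and orient the normals toward the camera:

```swift
import simd

/// Very rough per-point normal estimate: take each point's two nearest
/// neighbors and use the cross product of the vectors to them as the normal.
func estimateNormals(for points: [simd_float3]) -> [simd_float3] {
    points.map { p -> simd_float3 in
        // Brute-force nearest-neighbor search; fine for small clouds,
        // far too slow for millions of points.
        let neighbors = points
            .filter { $0 != p }
            .sorted { simd_length_squared($0 - p) < simd_length_squared($1 - p) }
            .prefix(2)
        guard neighbors.count == 2 else { return simd_float3(0, 0, 1) }

        let n = simd_cross(neighbors[0] - p, neighbors[1] - p)
        let length = simd_length(n)
        // Fall back to an arbitrary normal if the neighbors are (nearly) collinear.
        return length > 1e-6 ? n / length : simd_float3(0, 0, 1)
    }
}
```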
Hi gchiste,

Thanks for the reply, that helps a lot. I'll try this approach out.