In Apple's PhotogrammetrySession, which variables relate to real-world scale?

In ARKit, I captured a few color CVPixelBuffers and depth CVPixelBuffers, then ran a PhotogrammetrySession with PhotogrammetrySamples.
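For context, this is roughly how I feed the buffers in (simplified; `CapturedFrame`, `reconstruct`, and the gravity handling are placeholders for my actual capture code):

```swift
import RealityKit
import CoreVideo
import simd

// Simplified stand-in for what I collect per ARFrame.
struct CapturedFrame {
    let color: CVPixelBuffer      // from ARFrame.capturedImage
    let depth: CVPixelBuffer?     // from ARFrame.sceneDepth?.depthMap
    let gravity: simd_float3?     // gravity direction for the frame, if available
}

func reconstruct(frames: [CapturedFrame], outputURL: URL) throws -> PhotogrammetrySession {
    // Build one PhotogrammetrySample per captured frame. Attaching
    // depthDataMap is what should let the session recover real-world scale.
    let samples: [PhotogrammetrySample] = frames.enumerated().map { index, frame in
        var sample = PhotogrammetrySample(id: index, image: frame.color)
        sample.depthDataMap = frame.depth
        sample.gravity = frame.gravity
        return sample
    }

    let session = try PhotogrammetrySession(
        input: samples,
        configuration: PhotogrammetrySession.Configuration())

    // Kick off reconstruction; progress and results arrive on session.outputs.
    try session.process(requests: [.modelFile(url: outputURL, detail: .medium)])
    return session
}
```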

In my service, precise real-world scale is important, so I tried to figure out what determines whether the generated model comes out at real scale.

I ran some experiments, controlling the number of images (10), the object, the shot angles, and the distance to the object (30 cm, 50 cm, 100 cm).

But even with these variables held constant, the session sometimes generates a real-scale model and sometimes does not.

Since I can't see the photogrammetry source code or how it works internally, I'd like to know what I'm missing and how, if it's possible, I can get a real-scale model every time.
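One thing I'm ruling out on my side is frames silently missing usable depth, since (as I understand the docs) depthDataMap should be single-channel Float32 depth or disparity, and samples without valid depth would presumably leave the session to estimate scale on its own. A small check I run before building samples (my own guess at a failure mode, not something Apple documents):

```swift
import CoreVideo

// Returns true if the buffer exists and is in one of the Float32
// depth/disparity formats that PhotogrammetrySample.depthDataMap expects.
func hasUsableDepth(_ depth: CVPixelBuffer?) -> Bool {
    guard let depth else { return false }
    let format = CVPixelBufferGetPixelFormatType(depth)
    return format == kCVPixelFormatType_DepthFloat32
        || format == kCVPixelFormatType_DisparityFloat32
}
```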

Replies

You could request that Apple support scale bar detection. That way you could define a scale from the bar and get very good accuracy with photogrammetry. Even the LiDAR currently has a 3 cm variance, which is too much if you are after high accuracy.

  • Thanks for the reply, but the thing is that in my experiments, when the scale comes out right, the error is within 5 mm. When the scale is wrong, the difference is much larger, a few centimeters. I want to raise the rate at which scale comes out right, but I couldn't figure out what affects it.
