Within Apple's PhotogrammetrySession, which variables relate to real scale?

In ARKit, I captured a few color CVPixelBuffers and depth CVPixelBuffers, then ran a PhotogrammetrySession on the corresponding PhotogrammetrySamples.
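Roughly, the sample construction looks like this. This is a simplified sketch: `CapturedFrame` is a hypothetical stand-in for however the buffers are stored per ARFrame, while `PhotogrammetrySample` and its `depthDataMap`/`gravity` properties are the actual RealityKit API:

```swift
import RealityKit
import CoreVideo
import simd

// Hypothetical container for the data saved from each ARFrame at capture time.
struct CapturedFrame {
    let color: CVPixelBuffer      // from ARFrame.capturedImage
    let depth: CVPixelBuffer?     // from ARFrame.sceneDepth?.depthMap (Float32, meters)
    let gravity: simd_double3?    // gravity direction at capture time, if recorded
}

// Build PhotogrammetrySamples, attaching depth and gravity to each one.
// Per Apple's documentation, supplying depthDataMap is what allows the
// session to recover real-world scale; samples without depth reconstruct
// only up to an unknown scale factor.
func makeSamples(from frames: [CapturedFrame]) -> [PhotogrammetrySample] {
    frames.enumerated().map { index, frame in
        var sample = PhotogrammetrySample(id: index, image: frame.color)
        sample.depthDataMap = frame.depth
        sample.gravity = frame.gravity
        return sample
    }
}
```

The resulting array then goes to `PhotogrammetrySession(input:configuration:)`, followed by a `.modelFile` request.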

In my service, precise real-world scale is important, so I tried to figure out which variables determine whether the generated model comes out at real scale.

I ran some experiments, holding the number of images (10), the object, and the shot angles constant, and testing fixed distances to the object (30 cm, 50 cm, 100 cm).

But even with these variables controlled, the session sometimes generates a model at real scale and sometimes does not.

Since I don't have access to the photogrammetry source code or its internals, I'd like to know what I might be missing, and whether it is possible to get real scale every time.
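One check that may matter, assuming real-scale recovery depends on per-sample depth: verify that every frame actually carries a usable 32-bit float depth map before it becomes a sample, since `sceneDepth` can be missing on individual frames. A minimal sketch (`frames` is a hypothetical array of retained frames; in practice you would copy the buffers out rather than hold ARFrames):

```swift
import ARKit
import CoreVideo

// Keep only frames that carry Float32 depth, on the assumption that any
// sample missing depthDataMap can cause the session to fall back to an
// arbitrary scale.
func framesWithValidDepth(_ frames: [ARFrame]) -> [ARFrame] {
    frames.filter { frame in
        guard let depth = frame.sceneDepth?.depthMap else { return false }
        return CVPixelBufferGetPixelFormatType(depth) == kCVPixelFormatType_DepthFloat32
    }
}
```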

You could file a request for Apple to support scale-bar detection. That way you could define a scale from the bar and get very good accuracy with photogrammetry. Even the LiDAR currently has about a 3 cm variance, which is too much if you are after high accuracy.
