Automatic Plane Measurements like in the Apple Measure App

I’m working on an iOS app that needs to measure the area of planes or surfaces, like the length and width of objects, just like the Apple Measure app does. I’ve been exploring ARKit, but I’m curious if there are any APIs or techniques that can help automate the process of detecting and measuring planes.

Specifically, I’m looking for a way to automatically detect and measure planes (e.g., from a top-down view) — for example, measuring a box’s width and length. I have attached a screenshot and a video of the Apple Measure app doing it.

Does Apple provide any tools or APIs for this, or are there any best practices I should know about? I’d love to hear from anyone who’s tackled something similar.

Video: https://drive.google.com/file/d/1BxM7fIbFxsCsYwY7w8ZxIeq_4WTGkkwA/view?usp=drive_link

Answered by DTS Engineer in 822837022

Thank you for the interesting conversation and the details about possible solutions.

There seem to be two topics here: box detection, and "measure the area of planes or surfaces, like the length and width of objects, just like the Apple Measure app does". Box detection is the more specific and complex problem, so we’ll set it aside for now.

ARKit world tracking can automatically detect planes and report their dimensions. That’s all the API provides, and on its own it’s insufficient for your needs because it lacks scene understanding.
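As a minimal sketch of what that API gives you, the session below enables plane detection and logs each detected plane’s estimated dimensions as ARKit refines them (the `PlaneMeasurer` class name is illustrative; `planeExtent` requires iOS 16 or later):

```swift
import ARKit

final class PlaneMeasurer: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal, .vertical]
        session.delegate = self
        session.run(config)
    }

    // ARKit calls this repeatedly as it detects and refines planes.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let plane as ARPlaneAnchor in anchors {
            // planeExtent is the plane's estimated size, in meters.
            let w = plane.planeExtent.width
            let h = plane.planeExtent.height
            print(String(format: "plane %@: %.2f m x %.2f m",
                         plane.identifier.uuidString, w, h))
        }
    }
}
```

Note that these are the dimensions of whatever surface ARKit segmented out — nothing here says which real-world object the plane belongs to.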

Re: scene understanding, a plane is a plane is a plane… is it the top of a box? Is it the “top” of some rectangular surface that we need to measure? We have no idea. All we know is that ARKit thought some compelling detected features comprised the surface of a plane.

If you add ARKit’s sceneReconstruction things get more interesting (although not required):

“When you enable scene reconstruction, ARKit provides a polygonal mesh that estimates the shape of the physical environment. … If you enable plane detection, ARKit applies that information to the mesh. Where the LiDAR scanner may produce a slightly uneven mesh on a real-world surface, ARKit smooths out the mesh where it detects a plane on that surface.”

Better yet, you can classify (identify) common types of detected planes with ARPlaneAnchor’s classification property.

Assuming you’re not measuring walls, floors, ceilings, tables, seats, doors, or windows, anything classified as ARPlaneAnchor.Classification.none(_:) is a plane of interest.
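That filtering idea can be sketched as a predicate over detected plane anchors — keep the unclassified planes and drop the known room surfaces (the function name is illustrative):

```swift
import ARKit

// Returns true for planes that are NOT one of the known room surfaces
// (wall, floor, ceiling, table, seat, door, window), i.e. candidates
// for "the top of some object we want to measure".
func isPlaneOfInterest(_ plane: ARPlaneAnchor) -> Bool {
    // Classification requires LiDAR hardware; without it, keep everything.
    guard ARPlaneAnchor.isClassificationSupported else { return true }
    switch plane.classification {
    case .none:   // unclassified: not a recognized room surface
        return true
    default:
        return false
    }
}
```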

So that’s what’s possible with the ARKit API.

If you need something more rigorous or specific, you’ll want to consider techniques from the Core ML and Vision frameworks, which may require training a machine learning model for bespoke object detection and tracking.

Box measurement (size and 6DoF pose) is an unsolved industrial problem in warehouse and airport automation, and a robust solution would be very valuable. It could be solved by processing both the 3D point cloud and the 2D image.

The following videos show partial solutions that process only the point cloud.

YouTube Videos:

  • Real-Time Box Measurement, l-i2E7aZY6A
  • Measuring & Tracking of Boxes in Real-Time, mEHaR2mjm9c

Hello @JoonAhn

Thanks for your reply.

There's currently no simple solution for real-time box measurement. The videos you sent me mainly use OpenCV for processing point clouds. ARKit, with its LiDAR and depth capabilities, could be a potential tool to tackle this problem. I want to know if ARKit provides any documentation or demo app based on this scenario.

Thanks

The videos were recorded about 10 years ago with Intel RealSense 3D cameras.

A box can be measured from the inside or the outside. The videos show examples of external observation; room measurement is internal observation.

OpenCV has nothing to do with the videos.

Real-time box measurement requires many problems to be considered and solved. Not an easy task.

If allowed, in a few months we may release the source code of a real-time box measurement app for iOS that uses our proprietary runtime library “FindBox” that processes Apple LiDAR 3D point clouds.


ARKit generates and processes LiDAR 3D data in 3 phases (data formats):

  1. 576 (= 4 x 16 x 3 x 3) raw distance points at very high frequency (2,000 Hz?): 4 VCSEL (Vertical-Cavity Surface Emitting Laser), 16 stacks, 3 x 3 DOE (Diffractive Optical Elements).
  2. DepthMap (256 x 192) at 60 Hz.
  3. MeshAnchor at about 1 Hz.
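Of these, phase 2 is directly accessible on iOS/iPadOS: with `frameSemantics = .sceneDepth` set on the configuration, each `ARFrame` carries a depth map you can sample. A hedged sketch of reading one depth value (the function name is illustrative; depth values are `Float32` meters):

```swift
import ARKit

// Phase 2 data: read the center pixel of the 256 x 192 depth map
// from an ARFrame. Requires an ARWorldTrackingConfiguration with
// frameSemantics containing .sceneDepth.
func centerDepth(of frame: ARFrame) -> Float? {
    guard let depthMap = frame.sceneDepth?.depthMap else { return nil }
    CVPixelBufferLockBaseAddress(depthMap, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

    let width = CVPixelBufferGetWidth(depthMap)    // typically 256
    let height = CVPixelBufferGetHeight(depthMap)  // typically 192
    guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }
    let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)

    // The buffer's pixel format is 32-bit float depth, in meters.
    let row = base.advanced(by: (height / 2) * rowBytes)
                  .assumingMemoryBound(to: Float32.self)
    return row[width / 2]
}
```

Phase 3 (mesh anchors) is delivered through `ARMeshAnchor` updates when scene reconstruction is enabled; phase 1 (the raw distance points) is not exposed by any public API.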

Apple provides the ARKit API to app developers:

  • iOS/iPadOS: DepthMap and MeshAnchor.
  • visionOS: MeshAnchor.

An ARKit API for the 576 raw distance points, currently undisclosed, would open up new applications.

Boxes, planes, spheres, cylinders, cones, and tori can be detected and measured at up to 1,000 Hz on Vision Pro: https://github.com/CurvSurf/FindSurface-RealityKit-visionOS-Real-Time

Any content (materials, images, videos) can be drawn on the detected object surfaces: YouTube video, Hv2uA6k8Oig

Is there any way to get API access to the 576 raw distance points from the LiDAR?
