Hardware Technical Specifications about Vision Pro

Does anyone know the details of the Vision Pro's cameras, ToF, LiDAR, and IMU? I can't find anything about them on any website (almost).

As an ARKit developer, I can confirm that the overall accuracy of the cameras, LiDAR, and IMU is good enough for AR applications. Our FindSurface framework, which determines the shape, size, and 6DoF pose of objects, is also accurate enough for AR applications. If any of the cameras (data capture and content display), LiDAR (3D measurement), IMU (device tracking), or FindSurface (object detection and measurement) were not accurate enough, digital content could not be seamlessly blended with the physical space (see the videos below).

  • FindSurface for Spatial Computing

https://youtu.be/bcIXoTEeKek

  • Virtual ads inside Chungmuro station Line 3 - iPhone 12 Pro

https://youtu.be/BmKNmZCiMkw

As you mentioned, visionOS ARKit provides developers only with restricted, processed sensor data in the form of anchors:

  • WorldAnchor
  • DeviceAnchor
  • MeshAnchor
  • PlaneAnchor
  • HandAnchor
  • ImageAnchor

visionOS app developers must work within these constraints.
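For context, on visionOS these anchor types are obtained through data providers attached to an ARKitSession, rather than through an ARSession delegate as on iOS. A minimal sketch (provider and session names as documented in Apple's visionOS ARKit API; authorization prompts and error handling elided):

```swift
import ARKit

// Each provider exposes one of the restricted anchor streams listed above.
let session = ARKitSession()
let worldTracking = WorldTrackingProvider()              // DeviceAnchor, WorldAnchor
let sceneReconstruction = SceneReconstructionProvider()  // MeshAnchor
let planeDetection = PlaneDetectionProvider(alignments: [.horizontal, .vertical]) // PlaneAnchor
let handTracking = HandTrackingProvider()                // HandAnchor

func startSession() async {
    do {
        // The user must grant world-sensing (and hand-tracking) authorization,
        // otherwise run(_:) throws and no anchor data is delivered.
        try await session.run([worldTracking, sceneReconstruction, planeDetection, handTracking])
    } catch {
        print("ARKitSession failed to start: \(error)")
    }
}
```

Note that raw camera, LiDAR, or IMU streams are not reachable through any of these providers; the anchors are the lowest-level data an app can get.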

About two weeks ago, we received a Vision Pro. We are currently using:

  • DeviceAnchor
  • MeshAnchor

DeviceAnchor is generated from the image and IMU sensor data streams; MeshAnchors are generated from the LiDAR and image sensor data streams.
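In code, the two anchor types are consumed differently: DeviceAnchor is polled per frame by timestamp, while MeshAnchors arrive as an asynchronous update stream. A sketch, assuming the providers are already running on an ARKitSession (API names per Apple's documentation):

```swift
import ARKit
import QuartzCore

// DeviceAnchor: fused image + IMU pose, queried at a given timestamp.
func currentDevicePose(worldTracking: WorldTrackingProvider) {
    if let device = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) {
        let pose = device.originFromAnchorTransform // 4x4 world-from-device transform
        print("device position: \(pose.columns.3)")
    }
}

// MeshAnchor: LiDAR + image reconstruction, delivered as added/updated/removed events.
func observeMeshes(sceneReconstruction: SceneReconstructionProvider) async {
    for await update in sceneReconstruction.anchorUpdates {
        let mesh = update.anchor
        print("mesh \(mesh.id), event \(update.event), \(mesh.geometry.vertices.count) vertices")
    }
}
```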

The FindSurface framework recognizes and measures, in real time, the shapes, sizes, and 6DoF poses of object surfaces by processing the vertex points of MeshAnchors.
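Feeding MeshAnchor vertices to a point-cloud algorithm requires reading the packed vertex buffer and transforming each point into world space. A hypothetical helper illustrating this (the buffer/offset/stride layout follows Apple's MeshAnchor.Geometry documentation; the function name is ours, and FindSurface's actual input path may differ):

```swift
import ARKit
import simd

// Collect a MeshAnchor's vertices as world-space points -- the kind of
// point cloud a surface-fitting algorithm such as FindSurface consumes.
func worldVertices(of anchor: MeshAnchor) -> [SIMD3<Float>] {
    let source = anchor.geometry.vertices          // packed float3 GeometrySource
    let transform = anchor.originFromAnchorTransform
    var points: [SIMD3<Float>] = []
    points.reserveCapacity(source.count)
    for i in 0..<source.count {
        // Read three tightly packed floats; SIMD3<Float> has 16-byte layout,
        // so a plain tuple avoids over-reading the buffer.
        let address = source.buffer.contents() + source.offset + i * source.stride
        let v = address.assumingMemoryBound(to: (Float, Float, Float).self).pointee
        let world = transform * SIMD4<Float>(v.0, v.1, v.2, 1)
        points.append(SIMD3<Float>(world.x, world.y, world.z))
    }
    return points
}
```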

The overall accuracy of the FindSurface framework and of the image, IMU, and LiDAR sensor data has been confirmed internally. The 6DoF poses of the Vision Pro and of the object surfaces are accurate to the sub-centimeter level.

The problem, however, is that MeshAnchors shave off convex object surfaces and fill in concave ones. Consequently, FindSurface estimates the radii of spherical, cylindrical, conical, and toroidal objects to be smaller than those of the actual object surfaces.
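One possible workaround for this systematic underestimation (our own assumption for illustration, not a feature of FindSurface) is to calibrate a radius bias against reference objects of known size and add it back to new measurements:

```swift
// Hypothetical calibration of the mesh-induced radius bias.
// All names and numbers are illustrative, not part of FindSurface.
struct RadiusCalibration {
    private(set) var bias: Double = 0  // meters; mean of (actual - measured)

    mutating func calibrate(measured: [Double], actual: [Double]) {
        precondition(measured.count == actual.count && !measured.isEmpty)
        let diffs = zip(actual, measured).map { $0 - $1 }
        bias = diffs.reduce(0, +) / Double(diffs.count)
    }

    func corrected(_ measuredRadius: Double) -> Double {
        measuredRadius + bias
    }
}

// Example: two reference spheres of known radius 5 cm and 10 cm,
// measured from the mesh as 4.8 cm and 9.5 cm.
var calibration = RadiusCalibration()
calibration.calibrate(measured: [0.048, 0.095], actual: [0.050, 0.100])
let fixed = calibration.corrected(0.048)  // measured radius plus mean bias
```

In practice the bias would likely depend on object size and mesh resolution, so a per-shape or per-scale table may be needed rather than a single constant.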

The project folder, including the app source code together with the FindSurface framework, will be available on the CurvSurf GitHub at the end of June 2024.

The first result of the visionOS app with the FindSurface framework: the shape, size, and 6DoF pose of the object in the line of sight are recognized and measured in real time by the FindSurface framework. Inlier vertex points are colored pink.

FindSurface 1 - Apple Vision Pro https://youtu.be/p5msrVsEpa0
