Can iOS and visionOS Devices Share the Same Spatial World in a Multiuser AR Session?

Hey everyone,

I'm working on an object viewer where users can place objects in a real room using AR, and I want both visionOS (Apple Vision Pro) and iOS devices (iPad, iPhone) to participate in the same shared spatial experience. The idea is that a user with a Vision Pro can place an object, and peers using iPhones/iPads can see the same object in the same position in their AR view.

I've looked into ARKit's Shared ARWorldMap and MultipeerConnectivity, but I'm not sure if this extends seamlessly to visionOS or if Apple has an official way to sync spatial data between visionOS and iOS devices.

Has anyone tried sharing a spatial world between visionOS and iOS? Are there any built-in frameworks that allow for a shared multiuser AR session across these devices? If not, what would be the best way to sync object positions between them? Would love to hear if anyone has insights or experience with this! 🚀

Thanks!

Answered by JoonAhn in 827362022

Accepted Answer

At the idea level, could a solution like the following be developed?

Indoor physical objects such as floors, walls, ceilings, columns, and tables can act as a kind of environmental anchor. For example, a single wall (plane) provides only its normal and a point on it (an incomplete 6DoF anchor), but three planes whose normals are linearly independent (e.g. a floor and two non-parallel walls) together provide a complete 6DoF anchor.

Would it be possible to calculate the 6DoF pose of individual devices using such environmental anchors as intermediaries? The gravity and magnetic-field directions would also be useful information.

YouTube video: Hv2uA6k8Oig
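
To make the three-plane idea concrete, here is a minimal geometric sketch, assuming each plane has already been detected and reduced to a unit normal plus a point on the plane. The DetectedPlane type and the environmentAnchor function are illustrative names, not ARKit API:

```swift
import simd

// A detected plane reduced to a unit normal and a point on the plane.
// (Illustrative type; in practice this would come from plane detection.)
struct DetectedPlane {
    var normal: simd_float3   // unit normal
    var point: simd_float3    // any point on the plane
}

/// Builds a complete 6DoF reference frame from three planes with linearly
/// independent normals (e.g. the floor and two walls). Returns nil if the
/// planes are nearly degenerate.
func environmentAnchor(floor: DetectedPlane,
                       wallA: DetectedPlane,
                       wallB: DetectedPlane) -> simd_float4x4? {
    // Origin: the common intersection point, solving N * x = d where each row
    // of N is a plane normal and d[i] = dot(normal[i], point[i]).
    let n = simd_float3x3(rows: [floor.normal, wallA.normal, wallB.normal])
    guard abs(n.determinant) > 1e-4 else { return nil }   // normals nearly coplanar
    let d = simd_float3(simd_dot(floor.normal, floor.point),
                        simd_dot(wallA.normal, wallA.point),
                        simd_dot(wallB.normal, wallB.point))
    let origin = n.inverse * d

    // Orientation: floor normal as "up", wall A's normal projected into the
    // floor plane as one horizontal axis, and their cross product as the third.
    let up = simd_normalize(floor.normal)
    let x  = simd_normalize(wallA.normal - simd_dot(wallA.normal, up) * up)
    let z  = simd_normalize(simd_cross(x, up))

    return simd_float4x4(columns: (simd_float4(x, 0),
                                   simd_float4(up, 0),
                                   simd_float4(z, 0),
                                   simd_float4(origin, 1)))
}
```

Gravity from the accelerometer could help identify which plane is the floor, and magnetic north could help devices agree on a consistent wall ordering.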

Hi @bbthegreat

While there's no built-in abstraction for automatically synchronizing content location across iPhone and Apple Vision Pro, it's achievable. Here's how:

  • Establish a common origin by using an image as a shared reference point. Both the visionOS and iOS apps can locate this image using image anchors: ImageAnchor in visionOS and ARImageAnchor in iOS. This lets content be placed relative to the detected image in a consistent, shared spatial context.
  • Synchronize the content's position across devices using either MultipeerConnectivity or the Network framework, both of which are available on visionOS and iOS (see the sketch after this list).
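
As a rough sketch of how those two pieces fit together, assuming both devices have already detected the same reference image: each device expresses an object's pose relative to its locally detected image anchor, sends that relative transform over the wire, and the receiver converts it back into its own world space. The ObjectPoseMessage type and helper functions below are illustrative, not Apple API; worldFromAnchor would come from ImageAnchor.originFromAnchorTransform on visionOS or ARImageAnchor.transform on iOS.

```swift
import Foundation
import simd
import MultipeerConnectivity

// Pose of a virtual object expressed in the shared image anchor's space,
// flattened to 16 column-major floats for transport. (Illustrative type.)
struct ObjectPoseMessage: Codable {
    var objectID: UUID
    var anchorFromObject: [Float]

    init(objectID: UUID, transform: simd_float4x4) {
        self.objectID = objectID
        self.anchorFromObject = (0..<4).flatMap { c in
            [transform[c].x, transform[c].y, transform[c].z, transform[c].w]
        }
    }

    var transform: simd_float4x4 {
        let cols = stride(from: 0, to: 16, by: 4).map { i in
            simd_float4(anchorFromObject[i], anchorFromObject[i + 1],
                        anchorFromObject[i + 2], anchorFromObject[i + 3])
        }
        return simd_float4x4(cols)
    }
}

/// Sender: re-express the object's world pose relative to the shared image anchor
/// and broadcast it to connected peers.
func broadcast(objectID: UUID,
               worldFromObject: simd_float4x4,
               worldFromAnchor: simd_float4x4,
               over session: MCSession) throws {
    // anchorFromObject = inverse(worldFromAnchor) * worldFromObject
    let anchorFromObject = worldFromAnchor.inverse * worldFromObject
    let message = ObjectPoseMessage(objectID: objectID, transform: anchorFromObject)
    let data = try JSONEncoder().encode(message)
    try session.send(data, toPeers: session.connectedPeers, with: .reliable)
}

/// Receiver: place the object in the local world using the locally detected image anchor.
func localWorldTransform(for message: ObjectPoseMessage,
                         worldFromAnchor: simd_float4x4) -> simd_float4x4 {
    worldFromAnchor * message.transform
}
```

The same payload works in both directions, so an iPhone and a Vision Pro can each move content and stay in agreement, provided both devices have detected the shared reference image.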

Goal: Multiple users in a real 3D space (e.g., a conference or operating room) share spatial experiences using AR.

Given:

  • A concrete, real-world 3D space is given.
  • Real-time RGB and depth data are collected from each user's device sensors.
  • Real-time accelerometer and magnetometer data are also available.
  • The real objects (floor, walls, ceiling, tables, ...) are stationary.
  • Users move around the space.
  • Virtual objects/content are moved by users.
  • Users, real objects, and virtual objects/content interact with one another.

Solutions:

  • The real indoor space must be shared by all users and serve as the common reference data.
  • Floors, walls, ceilings, columns, tables, etc. serve as permanent environmental anchors.
  • Users and virtual objects/content move within that space (see the plane-detection sketch below).
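
On the implementation side, visionOS ARKit can already supply such environmental planes with semantic labels. The sketch below is visionOS-only and untested, based on my reading of the ARKitSession / PlaneDetectionProvider APIs (on iOS the equivalent data would come from ARPlaneAnchor via ARWorldTrackingConfiguration with plane detection enabled); it simply collects classified floor/wall/ceiling/table planes that could feed the environmental-anchor scheme described above.

```swift
import ARKit
import simd

// visionOS-only sketch: collect classified plane anchors (floor, walls, ceiling,
// tables) as candidate environmental anchors. Authorization handling, error
// handling, and persistence are omitted.
@MainActor
final class EnvironmentPlaneStore {
    private let session = ARKitSession()
    private let planeDetection = PlaneDetectionProvider(alignments: [.horizontal, .vertical])

    /// Latest transform of each usefully classified plane, keyed by anchor ID.
    private(set) var planes: [UUID: (PlaneAnchor.Classification, simd_float4x4)] = [:]

    func run() async throws {
        try await session.run([planeDetection])
        for await update in planeDetection.anchorUpdates {
            let anchor = update.anchor
            switch update.event {
            case .added, .updated:
                // Keep only planes with a semantic label that can serve as a stable anchor.
                if [.floor, .wall, .ceiling, .table].contains(anchor.classification) {
                    planes[anchor.id] = (anchor.classification, anchor.originFromAnchorTransform)
                }
            case .removed:
                planes[anchor.id] = nil
            }
        }
    }
}
```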