Hey everyone,
I'm working on an object viewer where users can place objects in a real room using AR, and I want both visionOS (Apple Vision Pro) and iOS devices (iPad, iPhone) to participate in the same shared spatial experience. The idea is that a user with a Vision Pro can place an object, and peers using iPhones/iPads can see the same object in the same position in their AR view.
I've looked into sharing an ARWorldMap over MultipeerConnectivity, but I'm not sure whether that approach extends to visionOS, or whether Apple provides an official way to sync spatial data between visionOS and iOS devices.
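In case it helps frame the question: if there's no built-in cross-device session, I imagine falling back to something like the sketch below, where each peer broadcasts placed-object transforms over MultipeerConnectivity (the `ObjectUpdate` payload and `ObjectSync` wrapper are just names I made up for illustration):

```swift
import Foundation
import MultipeerConnectivity
import simd

// Hypothetical payload: an object ID plus its 4x4 transform,
// expressed in some reference frame both peers agree on.
struct ObjectUpdate: Codable {
    let objectID: UUID
    let transform: [Float]  // 16 floats, column-major 4x4 matrix

    init(objectID: UUID, matrix: simd_float4x4) {
        self.objectID = objectID
        self.transform = (0..<4).flatMap { c in
            (0..<4).map { r in matrix[c][r] }
        }
    }
}

final class ObjectSync {
    let session: MCSession

    init(peerID: MCPeerID) {
        session = MCSession(peer: peerID,
                            securityIdentity: nil,
                            encryptionPreference: .required)
    }

    // Broadcast one object's pose to all connected peers.
    func send(_ update: ObjectUpdate) throws {
        guard !session.connectedPeers.isEmpty else { return }
        let data = try JSONEncoder().encode(update)
        try session.send(data, toPeers: session.connectedPeers, with: .reliable)
    }
}
```

Transforms sent this way are only meaningful if both peers agree on a common reference frame, though, which is what the idea further down tries to address.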
Has anyone tried sharing a spatial world between visionOS and iOS? Are there any built-in frameworks that allow for a shared multiuser AR session across these devices? If not, what would be the best way to sync object positions between them? Would love to hear if anyone has insights or experience with this! 🚀
Thanks!
At the idea level, could a solution like the following work?
Indoor physical features such as floors, walls, ceilings, columns, and tables could act as a kind of environmental anchor. A single wall (plane) only provides its normal and a point on it, which is an incomplete 6DoF anchor: rotation about the normal and translation within the plane remain free. But a set of three planes with linearly independent normals (e.g., the floor plus two non-parallel walls) pins down a complete 6DoF anchor, as in the sketch below.
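Here's a minimal sketch of that construction, assuming each device can express its detected planes as n · x = d in its own world coordinates (the `DetectedPlane` type is hypothetical): intersect the three planes to get an origin, then orthonormalize the normals into axes.

```swift
import simd

// A detected plane in device-local world coordinates: n · x = d.
struct DetectedPlane {
    let normal: simd_float3   // unit normal
    let d: Float              // signed distance from the world origin
}

// Build a 6DoF frame from three planes with linearly independent
// normals (e.g. floor + two non-parallel walls). Returns nil if the
// normals are (near) coplanar and no unique intersection point exists.
func sharedFrame(_ p0: DetectedPlane,
                 _ p1: DetectedPlane,
                 _ p2: DetectedPlane) -> simd_float4x4? {
    // Solve N * x = d for the single point common to all three planes.
    let n = simd_float3x3(rows: [p0.normal, p1.normal, p2.normal])
    guard abs(n.determinant) > 1e-4 else { return nil }
    let origin = n.inverse * simd_float3(p0.d, p1.d, p2.d)

    // Orthonormal axes: take p0's normal as "up" (e.g. the floor),
    // then Gram-Schmidt p1's normal against it.
    let up = simd_normalize(p0.normal)
    var right = p1.normal - simd_dot(p1.normal, up) * up
    right = simd_normalize(right)
    let forward = simd_cross(up, right)

    return simd_float4x4(columns: (
        simd_float4(right, 0),
        simd_float4(up, 0),
        simd_float4(forward, 0),
        simd_float4(origin, 1)
    ))
}
```

Both devices would of course need some way to agree that they're using the *same* three planes (and in the same order), which seems like the hard part in practice.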
Would it be possible to calculate each device's 6DoF pose using such environmental anchors as intermediaries? Gravity and magnetic-north directions would be useful extra constraints (ARKit's world frame is already gravity-aligned by default, so only the heading and translation should differ between devices).
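If that works, relocalizing between devices is just matrix algebra: once device A and device B have each expressed the same environmental anchor in their own world frames, an object pose placed by A can be re-expressed in B's frame. A hedged sketch of that step:

```swift
import simd

// Re-express an object's pose from device A's world frame in device
// B's world frame, via the shared environmental anchor that both
// devices have observed independently.
//
//   objectInA: object pose placed by device A, in A's world frame
//   anchorInA: the shared anchor frame as seen by device A
//   anchorInB: the same anchor frame as seen by device B
func transformToPeer(objectInA: simd_float4x4,
                     anchorInA: simd_float4x4,
                     anchorInB: simd_float4x4) -> simd_float4x4 {
    // Object relative to the shared anchor, then back into B's world.
    let objectInAnchor = anchorInA.inverse * objectInA
    return anchorInB * objectInAnchor
}
```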
YouTube video: https://www.youtube.com/watch?v=Hv2uA6k8Oig