Building a Full Space app that enables sharing a visionOS experience with nearby users.

Hello,

I am currently considering developing a Full Space app that enables a shared visionOS experience with nearby users.

Intended Features

A Mixed Full Space app in which dozens of 3D models are placed in the space. These 3D models may play embedded animations when tapped, be programmatically moved or rotated, or be controlled via Reality Composer Pro timelines.

The app also includes audio, spatial audio, videos with audio, and videos without audio, which are rendered as VideoTextures on planes and played back in the space. Some media elements play automatically, while others are triggered by user interaction.

However, it is unclear whether AVPlaybackCoordinator supports shared playback across multiple types of media, such as:

  • audio only
  • spatial audio
  • video without audio
  • video with audio

I am also unsure whether there are alternative or recommended approaches for synchronizing playback in this scenario.

Questions

Is it technically possible to implement the experience described above using visionOS?

Are there any important implementation considerations or limitations that should be taken into account?

For example, when two participants experience the app simultaneously, how is the content positioned for each participant? Is the spatial placement of content shared across participants, or is it positioned relative to each participant’s viewpoint?

For nearby participants, is it necessary to register a spatial Persona? My understanding is that spatial Personas are not visible for nearby users during the experience; is this correct?

When experiencing SharePlay with nearby users, is it possible to share the experience without registering the other participant’s contact information?

I have watched the following session, but I was unable to fully understand the feasibility of the above use case or the concrete implementation details: https://developer.apple.com/videos/play/wwdc2025/318/

Thank you.

Hey @sadaotokuyama,

It is technically possible to implement the described experience. However, AVPlaybackCoordinator synchronizes the timing of a single AVPlayer object across devices; in your use case you will need to perform the synchronization across multiple AVPlayer objects yourself. Review Synchronizing data during a SharePlay activity to understand how to accomplish this. Consider sending synchronization timing messages frequently with the unreliable delivery mode so that each user sees the same content; however, when a video starts or stops playing due to user interaction, use the reliable delivery mode.
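As a rough sketch of that pattern, you could define a Codable message per media item and exchange it over a GroupSessionMessenger. The `PlaybackSyncMessage` type and its fields below are illustrative, not from the session or documentation; the `canImport` guard keeps the sketch compilable outside Apple platforms:

```swift
import Foundation
#if canImport(GroupActivities)
import GroupActivities
#endif

// Hypothetical message describing the state of one media item.
// Field names are illustrative; define whatever your app actually needs.
struct PlaybackSyncMessage: Codable {
    let mediaID: String      // identifies which AVPlayer / media item this refers to
    let time: TimeInterval   // current playback position in seconds
    let isPlaying: Bool      // play/pause state
}

#if canImport(GroupActivities)
// Frequent timing updates: create this messenger with the unreliable
// delivery mode, since a dropped update is superseded by the next one.
// For user-initiated start/stop events, use a second messenger created
// with the default reliable delivery mode instead.
func sendTimingUpdate(_ message: PlaybackSyncMessage,
                      over messenger: GroupSessionMessenger) async {
    try? await messenger.send(message)
}

// Receive loop: apply each incoming message to the matching AVPlayer.
func receiveSyncMessages(from messenger: GroupSessionMessenger) async {
    for await (message, _) in messenger.messages(of: PlaybackSyncMessage.self) {
        // Seek the AVPlayer for message.mediaID to message.time
        // and apply message.isPlaying here.
        _ = message
    }
}
#endif
```

Creating one messenger per delivery mode (reliable for state changes, unreliable for periodic timing) keeps the two traffic classes independent.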

The supportsGroupImmersiveSpace flag ensures entities with identical transforms appear in the same relative location for all participants, including both nearby and remote participants. For more information please review Configure your visionOS app for sharing with people nearby.
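As a minimal configuration sketch, the flag is set on the session's SystemCoordinator; the function name here is illustrative:

```swift
#if canImport(GroupActivities)
import GroupActivities

// Opt the activity in to a shared immersive space so that entities with
// identical transforms appear in the same relative location for all
// participants, nearby and remote.
func configureSharedSpace<Activity: GroupActivity>(for session: GroupSession<Activity>) async {
    guard let coordinator = await session.systemCoordinator else { return }
    var configuration = SystemCoordinator.Configuration()
    configuration.supportsGroupImmersiveSpace = true
    coordinator.configuration = configuration
}
#endif
```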

Nearby participants appear naturally via passthrough. Nearby participants do not need to register a spatial Persona.

You can share with someone nearby who is not in your contacts by using a PIN, but once this is complete you will see the name and photo of the person you are sharing with.

You may also want to review the sample code Building a guessing game for visionOS.

Let me know if you have additional questions,
Michael
