RealityView and Persistent World Data?

I was watching the Developer videos, and there was mention that RealityView handles persistent world data differently and also automatically for us.

I am having an issue finding the material I need to get up to speed on that.

In ARKit, I was able to place a model with the world data and recall that .map data. It even stored a reference image for the scene to help match the world data.

I'm looking for information on how to implement and work with those same features in RealityView, since it seems to be better and more automatically integrated?

I need help being pointed in the right direction. Sample code would be amazing.

I wanted to update this post with resources I found.

It appears the automation for persistent anchors and world map data is handled through WorldAnchors. Currently, it looks like this is only supported on visionOS.

https://developer.apple.com/documentation/visionos/tracking-points-in-world-space

It appears that by simply adding a WorldAnchor, visionOS automatically tracks the world map, loading and unloading anchors based on your location in the background. This is amazing.
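To illustrate, here is a minimal sketch of the visionOS pattern from that documentation page, as I understand it. Names like `session`, `worldTracking`, and `placeAnchor` are my own; this is untested and only meant to show the shape of the API:

```swift
import ARKit
import RealityKit

let session = ARKitSession()
let worldTracking = WorldTrackingProvider()

func placeAnchor(at transform: simd_float4x4) async throws {
    // Run the session with world tracking enabled.
    try await session.run([worldTracking])

    // A WorldAnchor is persisted by the system across app launches;
    // visionOS re-localizes it for you in the background.
    let anchor = WorldAnchor(originFromAnchorTransform: transform)
    try await worldTracking.addAnchor(anchor)

    // Observe anchor updates so you can re-attach your content when
    // a previously saved anchor is recovered on a later launch.
    for await update in worldTracking.anchorUpdates {
        switch update.event {
        case .added, .updated:
            print("Anchor \(update.anchor.id) tracked: \(update.anchor.isTracked)")
        case .removed:
            print("Anchor \(update.anchor.id) removed")
        }
    }
}
```

The key difference from the old ARWorldMap flow is that there is no map file to save or load yourself; the anchor IDs are the handle you persist.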

Though, I'm not sure why this wouldn't be supported on iOS and iPadOS as well. Perhaps it will be implemented as a core ARKit feature there in the future.

To the best of my limited knowledge, it appears we will have to continue to use the previous methods for persistent data, which can be found here:

https://developer.apple.com/documentation/arkit/arkit_in_ios/data_management/saving_and_loading_world_data
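For reference, the classic iOS flow from that page boils down to archiving an ARWorldMap to disk and feeding it back in as the initial world map. A rough sketch (the `mapURL` location and function names are my own choices, not from the docs):

```swift
import ARKit

// Where the serialized world map lives on disk (my own choice of location).
let mapURL = FileManager.default.urls(for: .documentDirectory,
                                      in: .userDomainMask)[0]
    .appendingPathComponent("worldMap")

// Saving: capture the current ARWorldMap and archive it to disk.
func saveWorldMap(from session: ARSession) {
    session.getCurrentWorldMap { worldMap, error in
        guard let worldMap else { return }
        if let data = try? NSKeyedArchiver.archivedData(withRootObject: worldMap,
                                                        requiringSecureCoding: true) {
            try? data.write(to: mapURL)
        }
    }
}

// Loading: unarchive the map and rerun the session with it as the
// initial world map, so saved ARAnchors come back once relocalized.
func restoreWorldMap(into session: ARSession) throws {
    let data = try Data(contentsOf: mapURL)
    guard let worldMap = try NSKeyedUnarchiver.unarchivedObject(
        ofClass: ARWorldMap.self, from: data) else { return }

    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = worldMap
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```

Any ARAnchors in the saved map are restored with it once the device relocalizes against the old scene.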

However, I still have to try implementing this with RealityView, as it is my understanding that only RealityView supports Reality Composer Pro packages.

The goal here is to simply place a Reality Composer Pro package with AR Persistence...

I wanted to update this thread since I have learned more about RealityView.

Technically there is a way to use RealityView and access the world map data on iOS and iPadOS, but I can't seem to figure it out. Someone with more experience would be better suited to answer that question, since it would involve knowing how to bridge between the low-level and high-level kits. At this point in time, at least on iOS and iPadOS, RealityView doesn't necessarily provide any additional benefits, though I think it will in the future, if only in simplicity and ease of use.

The other correction is that ARView does support RCP projects; you just need to use Entity(named:in:) to load them from the bundle. So, using the known methods for persistent data with RCP projects will still get you there.

import RealityKit
import yourproject // the Swift package generated by Reality Composer Pro

// "Scene" is the name of a scene inside the RCP package.
let rcpProject = try await Entity(named: "Scene", in: yourprojectBundle)
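Putting the pieces together, here is a sketch of how the loaded RCP entity could be tied to a persistable ARAnchor in ARView. The function name is mine, and `yourprojectBundle` is the same placeholder as above; untested, but it shows the intended wiring:

```swift
import ARKit
import RealityKit

// Attach an RCP entity to a named ARAnchor in ARView. Named ARAnchors
// are captured in the ARWorldMap, so the same anchor returns when you
// restore a saved map later.
func place(in arView: ARView, at transform: simd_float4x4) async throws {
    let anchor = ARAnchor(name: "rcpScene", transform: transform)
    arView.session.add(anchor: anchor)

    // "Scene" / yourprojectBundle are placeholders for your RCP package.
    let rcpProject = try await Entity(named: "Scene", in: yourprojectBundle)

    // AnchorEntity(anchor:) bridges the ARKit anchor to RealityKit content.
    let anchorEntity = AnchorEntity(anchor: anchor)
    anchorEntity.addChild(rcpProject)
    arView.scene.addAnchor(anchorEntity)
}
```

On restore, match the recovered anchor by its name and rebuild the same AnchorEntity hierarchy.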

Thank you for sharing this. I'm finding it incredible how little documentation exists on how to translate the basic AR concepts Apple used in ARKit/UIKit applications to RealityKit.
I'll post here if I make any progress.
