I wanted to update this thread since I have learned more about RealityView.
Technically there is a way to use RealityView and access the world map data on iOS and iPadOS, but I can't seem to figure it out; someone with more experience would be better suited to answer that question. It would involve knowing how to bridge between the low-level and high-level kits, and at this point in time, at least on iOS and iPadOS, RealityView doesn't necessarily provide any additional benefits. Though I think it will in the future, if only for simplicity and ease of use.
The other correction is that ARView does support RCP projects; you just need to use Entity() to load them from the bundle. So, using the known methods for persistent data with RCP projects will still get you there.
// Import your Reality Composer Pro package
import yourproject
// Load the RCP scene from the package bundle
let rcpProject = try await Entity(named: "Scene", in: yourprojectBundle)
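For anyone who wants the fuller picture, here's a minimal sketch of how that fits into an ARView setup (the package name yourproject, its yourprojectBundle var, and the scene name "Scene" are placeholders from my project; the function name is mine):
import ARKit
import RealityKit
import yourproject // your Reality Composer Pro package

// Assumes an existing ARView; loads the RCP scene and anchors it to a horizontal plane.
func loadRCPScene(into arView: ARView) async {
    do {
        // Entity(named:in:) preserves the scene hierarchy from the RCP package.
        let rcpProject = try await Entity(named: "Scene", in: yourprojectBundle)
        let anchor = AnchorEntity(.plane(.horizontal, classification: .any, minimumBounds: SIMD2<Float>(0.2, 0.2)))
        anchor.addChild(rcpProject)
        arView.scene.addAnchor(anchor)
    } catch {
        print("Failed to load RCP scene: \(error)")
    }
}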
I am no longer at a loss! haha
Entity() is different from ModelEntity():
Entity() keeps the hierarchy (each child keeps its own transform).
ModelEntity() flattens it (a single model transform).
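For contrast, here's what I mean (the scene and bundle names are placeholders from my project):
// Flattened: one ModelEntity, hierarchy collapsed into a single transform
let flat = try ModelEntity.loadModel(named: "Scene", in: yourprojectBundle)
// Hierarchy preserved: children keep their own transforms
let tree = try await Entity(named: "Scene", in: yourprojectBundle)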
You have to make sure you're interacting with the root element of your RCP scene. If you're not, the anchor or model transforms will have no effect.
//import your bundle
import yourproject
//Load async
let scene = try await Entity(named: "Scene", in: yourprojectBundle)
//Target root object
let rootEntity = scene.children.first?.children.first(where: { $0.name == "rootObject" })
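For example, once you have the root, transforms behave as expected (the offset here is just an arbitrary value for illustration):
// Moving the whole RCP scene up 10 cm works through the root entity;
// applying the same transform to a wrapper entity may appear to do nothing.
rootEntity?.transform.translation = SIMD3<Float>(0, 0.1, 0)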
Bonus: you can print out your hierarchy with this...
// Helper function to print entity hierarchy
func printEntityHierarchy(_ entity: Entity, level: Int) {
    let indent = String(repeating: " ", count: level)
    print("\(indent)Entity: \(entity.name) Transform: \(entity.transform)")
    for child in entity.children {
        printEntityHierarchy(child, level: level + 1)
    }
}
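Then call it on your loaded scene like this:
printEntityHierarchy(scene, level: 0)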
Note: the same thing applies to gesture control. You have to make sure you're targeting the root object, or the specific object you want, as in the sketch below.
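Here's a rough sketch of what I mean with a RealityView tap gesture (this assumes rootEntity already has a CollisionComponent and an InputTargetComponent so it can receive input; the view and names are just for illustration):
import SwiftUI
import RealityKit

struct TapDemoView: View {
    let rootEntity: Entity // the RCP scene's root, loaded elsewhere

    var body: some View {
        RealityView { content in
            content.add(rootEntity)
        }
        .gesture(
            TapGesture()
                .targetedToEntity(rootEntity) // target the root, not a nested child
                .onEnded { value in
                    print("Tapped: \(value.entity.name)")
                }
        )
    }
}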
@lijiaxu Thank you for your input. Definitely, ARView for now on iOS and iPadOS.
In the project I was working on, I had decided to abandon RealityView. The question was still lingering in my mind, but you have answered it here. I think I was worried about missing out on the latest realism/rendering features, but I don't think ARView is missing any at this point.
Perhaps it was wishful thinking on my part that things like persistent data would become automated in the background with a simple boolean control. That said, with visionOS that appears to be the direction, and that is exciting. That was the primary functionality I was after.
To correct or update my original note: When using RealityView, after applying the anchor to a specific surface type, such as floor or table, the tracking was much improved and matched what I am familiar with in ARView.
I wanted to update this post with resources I found.
It appears the automation for persistent anchors and world map data has been implemented as WorldAnchors. Currently, it looks like this is only supported on visionOS.
https://developer.apple.com/documentation/visionos/tracking-points-in-world-space
It appears that by simply adding a WorldAnchor, visionOS automatically tracks the world map, loading and unloading it in the background based on your location. This is amazing.
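Based on the linked article, the basic flow looks roughly like this (a sketch; the worldTracking and placePersistentAnchor names are mine):
import ARKit

// visionOS only: run a WorldTrackingProvider and persist a point in space.
let session = ARKitSession()
let worldTracking = WorldTrackingProvider()

func placePersistentAnchor(at transform: simd_float4x4) async {
    do {
        try await session.run([worldTracking])
        // visionOS saves this anchor and restores it across app launches.
        let anchor = WorldAnchor(originFromAnchorTransform: transform)
        try await worldTracking.addAnchor(anchor)
    } catch {
        print("World anchor setup failed: \(error)")
    }
}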
Though, I'm not sure why this wouldn't be supported on iOS and iPadOS as well. Perhaps in the future it will be implemented as a core ARKit feature there too.
To the best of my limited knowledge, it appears we will have to continue to use the previous methods for persistent data, which can be found here:
https://developer.apple.com/documentation/arkit/arkit_in_ios/data_management/saving_and_loading_world_data
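For reference, the save/load flow from that article boils down to something like this (a condensed sketch using an ARView session; the function names and URL parameter are mine):
import ARKit
import RealityKit

// Save: capture the current ARWorldMap and archive it to disk.
func saveWorldMap(from arView: ARView, to url: URL) {
    arView.session.getCurrentWorldMap { worldMap, error in
        guard let worldMap else {
            print("Can't get world map: \(error?.localizedDescription ?? "unknown")")
            return
        }
        do {
            let data = try NSKeyedArchiver.archivedData(withRootObject: worldMap, requiringSecureCoding: true)
            try data.write(to: url, options: .atomic)
        } catch {
            print("Save failed: \(error)")
        }
    }
}

// Load: unarchive the map and relocalize by passing it as initialWorldMap.
func loadWorldMap(into arView: ARView, from url: URL) throws {
    let data = try Data(contentsOf: url)
    guard let worldMap = try NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self, from: data) else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = worldMap
    arView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}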
However, I still have to try to implement this with RealityView, as it is my understanding that only RealityView supports Reality Composer Pro packages.
The goal here is to simply place a Reality Composer Pro package with AR Persistence...
I was struggling to get QuickLook functioning in my app. It would display the text "No File to View." The OP helped me fix my issues, which were in the coordinator setup. So here's a functioning (as of 12/'24) QuickLook wrapped in SwiftUI, using URL pass-through from a button, if anyone needs it.
import SwiftUI
import QuickLook
import ARKit
// 1. Create a SwiftUI view that will present the QLPreviewController.
struct QuickLookView: View {
    var fileURL: URL // Accept the URL as a parameter from the button

    var body: some View {
        QuickLookPreview(fileURL: fileURL) // Pass the fileURL to the QuickLookPreview
            .edgesIgnoringSafeArea(.all)
    }
}
// 2. Create a UIViewControllerRepresentable that wraps the UIKit QLPreviewController.
struct QuickLookPreview: UIViewControllerRepresentable {
    var fileURL: URL // Accept the fileURL parameter

    func makeUIViewController(context: Context) -> QLPreviewController {
        let previewController = QLPreviewController()
        previewController.dataSource = context.coordinator
        return previewController
    }

    func updateUIViewController(_ uiViewController: QLPreviewController, context: Context) {
        // No updates required
    }

    // 3. Create a Coordinator to handle the QLPreviewControllerDataSource methods.
    class Coordinator: NSObject, QLPreviewControllerDataSource {
        var fileURL: URL // Store the fileURL in the Coordinator

        init(fileURL: URL) {
            self.fileURL = fileURL
        }

        func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
            return 1
        }

        func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
            return fileURL as QLPreviewItem // Return the fileURL as the preview item
        }
    }

    // Create a Coordinator instance
    func makeCoordinator() -> Coordinator {
        return Coordinator(fileURL: fileURL) // Pass the fileURL to the Coordinator
    }
}
And then the button:
NavigationLink(destination: QuickLookView(fileURL: Bundle.main.url(forResource: "modelName", withExtension: "usdz")!)) {
    Text("Quick Look")
        .font(.system(size: 22))
        .fontWeight(.bold)
        .padding(.horizontal, 40)
        .padding(.vertical, 20)
        .background(Color.purple)
        .foregroundColor(.white)
        .cornerRadius(6)
}
Well, I figured it out from the Happy Beam demo code here: Happy Beam Docs
Problem: the Bundle var wasn’t found in scope.
Solution:
1. Make sure your Reality Composer Pro package has been added as a framework in the General project settings.
2. Import your package (import projectname).
3. In the Sources directory that Reality Composer Pro created, there is a Swift file that contains the var you're looking for, usually your project name + Bundle (i.e., "projectnameBundle").
4. Load by creating an entity: scene = try await Entity(named: "Scene", in: projectnameBundle)
5. Add the entity to your RealityView: content.add(scene)
Note: that will place the scene at your camera's location, so be sure to move the camera away from the starting point to verify. But it'd be best to add a horizontal anchor, add the entity to the anchor, then add the anchor to the RealityView, to be less confusing visually.
import SwiftUI
import RealityKit
import projectName
struct ContentView: View {
    var body: some View {
        RealityView { content in
            // Create a horizontal plane anchor
            let anchor = AnchorEntity(.plane(.horizontal, classification: .any, minimumBounds: SIMD2<Float>(0.2, 0.2)))

            // Load the scene from the Reality Composer Pro package
            do {
                let scene = try await Entity(named: "Scene", in: projectnameBundle)
                // Add the model to the anchor
                anchor.addChild(scene)
                // Add the anchor to the RealityView
                content.add(anchor)
            } catch is CancellationError {
                // The entity initializer can throw this error if an enclosing
                // RealityView disappears before the model loads. Exit gracefully.
                return
            } catch let error {
                // Other errors indicate unrecoverable problems.
                print("Failed to load scene: \(error)")
            }

            // View settings
            content.camera = .spatialTracking
        }
        .edgesIgnoringSafeArea(.all)
    }
}