Having issue with two AR sessions together

I have the following issue with running two AR sessions. I am trying to develop an app for my master's thesis.

Case 1: I first scan the room using the RoomPlan API. Then I stop the RoomPlan session and start the RealityKit session. When the RealityKit session starts, the camera shows nothing but a black screen.

Case 2: While debugging Case 1, I built a separate test app with two unrelated screens, one for the RoomPlan API and one for RealityKit. But as soon as I introduced the RoomPlan API, RealityKit stopped working, with the same black screen as above.

The RoomPlan API might be changing some state that prevents RealityKit from accessing the camera. Let me know if you have any idea about this or any sample code.
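
For reference, this is roughly the Case 1 flow, reduced to the essential calls (the SwiftUI view setup around it is omitted):

    import RoomPlan
    import RealityKit
    import ARKit

    // 1. Scan the room with RoomPlan.
    let roomCaptureView = RoomCaptureView(frame: .zero)
    roomCaptureView.captureSession.run(configuration: RoomCaptureSession.Configuration())

    // 2. Later, stop the RoomPlan session ...
    roomCaptureView.captureSession.stop()

    // 3. ... and start the RealityKit session. At this point the camera feed stays black.
    let arView = ARView(frame: .zero)
    arView.session.run(ARWorldTrackingConfiguration())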

I am using the following stack: latest Xcode, SwiftUI, and the latest OS on both the Mac mini and the iPhone.

Same issue here. Got any solutions or suggestions?

I have been having the same issue and have done a lot of testing and tweaking. I could not solve the problem yet, but here's what I found after doing the following:

  1. Enter and start a RoomPlan session with a custom ARSession passed into the RoomCaptureView constructor.
  2. Stop the RoomPlan session with the captureSession.stop() function (I do not continue the session) and exit the view (or the tab; I use tabs).
  3. Enter another view, unrelated to RoomPlan, with a custom ARSession to try to prevent ARSession conflicts. I also set an ARWorldTrackingConfiguration and make sure cameraMode is .ar in the ARView constructor for the second AR view:

         let arView = ARView(frame: .zero, cameraMode: .ar, automaticallyConfigureSession: false)
         arView.session = ARViewRepresentable.sharedARSession

     I make sure to start and stop the sessions of RoomPlan and the custom ARView so that they don't conflict (see the fuller sketch after this list).
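
Concretely, the second AR view from step 3 is set up roughly like this (ARViewRepresentable is my own SwiftUI wrapper, not a framework type):

    import SwiftUI
    import RealityKit
    import ARKit

    struct ARViewRepresentable: UIViewRepresentable {
        // Single session shared by my AR views, so RoomPlan and RealityKit
        // are not each spinning up their own.
        static let sharedARSession = ARSession()

        func makeUIView(context: Context) -> ARView {
            // Disable automatic configuration so RealityKit does not create its own session.
            let arView = ARView(frame: .zero,
                                cameraMode: .ar,
                                automaticallyConfigureSession: false)
            arView.session = ARViewRepresentable.sharedARSession

            // Run world tracking explicitly on the shared session.
            arView.session.run(ARWorldTrackingConfiguration())
            return arView
        }

        func updateUIView(_ uiView: ARView, context: Context) {}
    }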

However, the camera feed is not available when I go back to the ARView. On the other hand, if I start the app and go into the custom ARView first, the camera feed works; but as soon as I open the RoomPlan session, going back to the ARView "hijacks" the camera feed.

Please note that during my tests I was able to discern that only the camera feed seems to be missing: when I set the debugOptions of the arView to [.showSceneUnderstanding, .showWorldOrigin], I can see the origin of the session and move around it even though the screen background is black.
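
For completeness, this is the debug setup from that test:

    // With these options set, the world-origin axes render and track device
    // movement correctly, even though the camera background stays black.
    arView.debugOptions = [.showSceneUnderstanding, .showWorldOrigin]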

I would also point out that Apple seems to want us to create an ARSession and pass it to RoomPlan in order to continue an AR session. From the RoomCaptureSession interface in the RoomPlan framework:

    /// Creates a room-capture session with the given AR session.
    ///
    /// By providing your own <doc://com.apple.documentation/documentation/arkit/arsession> object, you can continue your app's existing AR experience by seamlessly transitioning into a room-scanning session with RoomPlan. In addition, continuing an `ARSession` across multiple room-capture sessions — specifically, different rooms in the same vicinity — enables you to merge multiple ``CapturedRoom`` objects into a single captured structure. For more information, see ``CapturedStructure``.
    ///
    /// You can access the AR session at runtime with the ``RoomCaptureSession/arSession`` property.
    ///
    /// - parameter arSession: A world-tracking session that your app creates and runs with an <doc://com.apple.documentation/documentation/arkit/arworldtrackingconfiguration> before calling this function. If you pass an `ARSession` instance, RoomPlan preserves all of the AR session's settings. If you leave the argument blank, RoomPlan creates its own `ARSession` instance.
    @available(iOS 17.0, *)
    public init(arSession: ARSession)
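
Based on that documentation, continuing a single session across RoomPlan and RealityKit would look roughly like this; note that this is only a sketch of what the doc comment describes (it assumes the init(arSession:) initializer quoted above), not something that has fixed the black camera feed for me:

    import RoomPlan
    import RealityKit
    import ARKit

    // One ARSession owned by the app, already running world tracking.
    let sharedSession = ARSession()
    sharedSession.run(ARWorldTrackingConfiguration())

    // Hand the running session to RoomPlan so it continues the same AR experience.
    let captureSession = RoomCaptureSession(arSession: sharedSession)
    captureSession.run(configuration: RoomCaptureSession.Configuration())

    // ... scanning ...
    captureSession.stop()

    // Reuse the same session in RealityKit instead of letting ARView create its own.
    let arView = ARView(frame: .zero, cameraMode: .ar, automaticallyConfigureSession: false)
    arView.session = sharedSession
    arView.session.run(ARWorldTrackingConfiguration())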