When scanning multiple rooms (10+) in a single structure using ARWorldMap for coordinate-space consistency, RoomCaptureSession throws CaptureError.exceedSceneSizeLimit. I am following the instructions here (https://developer.apple.com/documentation/roomplan/scanning-the-rooms-of-a-single-structure) exactly: keep the underlying ARSession alive (by calling captureSession.stop(pauseARSession: false)) and save the results before the user moves to the next room. Scanning 11 or so rooms reliably causes the user to hit the exceedSceneSizeLimit error. The ARWorldMap is about 58 MB, and it is always around this size when the issue hits. No anchors are present, so all of the size appears to come from tracking data.
On iPad devices (where I do not see this issue) the ARWorldMap grows in size at a significantly slower rate.
I save the ARWorldMap after each room is scanned and confirmed by the user. If I use that ARWorldMap to initialize the ARSession (as described in the docs), the session errors with exceedSceneSizeLimit immediately once captureSession.run() is executed. Occasionally it lets me/the user scan again, but it either breaks mid-scan or fails with the same error shortly after.
This has been working fine for the past two years, and users have been able to scan dozens of rooms without issue. It only seems to have become a problem recently.
I would expect much larger ARWorldMaps to be supported. At this point I can cover more of my house in a single scan than I can across multiple capture sessions.
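For reference, here is a minimal sketch of the flow described above (assuming the iOS 17 RoomCaptureSession.arSession property and ARSession.getCurrentWorldMap(completionHandler:); the class and method names are placeholders of mine, not API):
import ARKit
import RoomPlan

final class MultiRoomScanner {
    let captureSession = RoomCaptureSession()
    private var savedWorldMap: ARWorldMap?   // persisted after each confirmed room

    // The user confirms the current room: keep the ARSession alive so the next
    // room shares the same coordinate space, then snapshot the world map.
    func finishCurrentRoom() {
        captureSession.stop(pauseARSession: false)
        captureSession.arSession.getCurrentWorldMap { map, _ in
            if let map {
                self.savedWorldMap = map   // ~58 MB by room 11 in my case
            }
        }
    }

    // Start scanning the next room in the same, still-running session.
    func startNextRoom() {
        captureSession.run(configuration: RoomCaptureSession.Configuration())
        // exceedSceneSizeLimit surfaces here once the accumulated map grows too large.
    }
}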
A few observations:
This happens on my iPhone 15 Pro Max and my iPhone 17 Pro, but not my iPad M4 (maybe memory related?). It is possible that scanning many more rooms would trigger it on the iPad too.
I have tried things such as re-running the ARConfiguration on the underlying ARSession to reset some of the state, but this doesn't help.
I have tried creating a new ARWorldMap and moving its origin onto the older map to clear out the tracking data. This almost works, but it causes a mess of issues as soon as the user moves, because the coordinate spaces are no longer shared.
I believe there are three active issues regarding this: FB14454922, FB15035788, FB20642944
Could we get an update on this issue? It is a production issue and severely limits the user experience in my production application.
                    
                  
RoomPlan
Create parametric 3D scans of rooms and room-defining objects.
Posts under the RoomPlan tag (29 posts)
                    
                      Hi,
We’ve been successfully using the RoomPlan API in our application for over two years. Recently, however, users have reported encountering persistent capture errors during their sessions. Specifically, the errors observed are:
CaptureError.worldTrackingFailure
CaptureError.exceedSceneSizeLimit
What we have observed:
Persistent Errors: The errors continue to occur even after initiating new capture sessions.
Normal Usage: Our implementation adheres to typical usage patterns of the RoomPlan API without exceeding any documented room size limits.
Limited Feature Usage: We are not using the world-tracking feature with the StructureBuilder functionality to stitch rooms together.
Potential State Caching: Given that these errors persist across sessions, we suspect that there might be memory or state cached between sessions that is not being cleared, particularly since we are not taking advantage of StructureBuilder.
Request:
Could you please advise if there is any internal caching or memory retention between capture sessions that might lead to these errors? Additionally, we would appreciate guidance on how to clear or manage this state when the StructureBuilder feature is not in use.
Here is a generalised version of our capture session initialization code to help diagnose the issue.
struct RoomARCaptureView: UIViewRepresentable {
    typealias Handler = (CapturedRoom, Error?) -> Void
    @Binding var stop: Bool
    @Binding var done: Bool
    let completion: Handler?
    func makeUIView(context: Self.Context) -> RoomCaptureView {
        let view = RoomCaptureView(frame: .zero)
        view.delegate = context.coordinator
        view.captureSession.run(configuration: .init())
        return view
    }
    func updateUIView(_ uiView: RoomCaptureView, context: Self.Context) {
        if stop {
            // Stop the session only once, multiple times causes issues with the final presentation
            uiView.captureSession.stop()
            stop = false
            done = true
        }
    }
    static func dismantleUIView(_ uiView: RoomCaptureView, coordinator: Self.Coordinator) {
        uiView.captureSession.stop()
    }
    func makeCoordinator() -> ARViewCoordinator {
        ARViewCoordinator(completion)
    }
    @objc(ARViewCoordinator)
    class ARViewCoordinator: NSObject, RoomCaptureViewDelegate {
        var completion: Handler?
        // RoomCaptureViewDelegate requires NSCoding conformance; these are no-op implementations.
        public required init?(coder: NSCoder) {}
        public func encode(with coder: NSCoder) {}
        public init(_ completion: Handler?) {
            super.init()
            self.completion = completion
        }
        public func captureView(shouldPresent roomDataForProcessing: CapturedRoomData, error: (Error)?) -> Bool {
            return true
        }
        public func captureView(didPresent processedResult: CapturedRoom, error: (Error)?) {
            completion?(processedResult, error)
        }
    }
}
Thank you for your assistance.
                    
                  
                
                    
                      Error:
RoomCaptureSession.CaptureError.exceedSceneSizeLimit
Apple Documentation Explanation:
An error that indicates when the scene size grows past the framework’s limitations.
Issue:
This error pops up on my iPhone 14 Pro (128 GB) after a few RoomPlan scans are done, even if the room size is small. It occurs immediately after I start the RoomCaptureSession following relocalisation of the previous AR session (in world-tracking configuration). I am having trouble understanding exactly why this error appears and how to debug or solve it.
Does anyone have any idea how to approach this issue?
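In case it helps, here is a minimal illustrative sketch (not a fix) of logging which CaptureError is actually raised, via RoomCaptureSessionDelegate:
import RoomPlan

// Logs the specific CaptureError when a capture session ends.
final class CaptureErrorLogger: NSObject, RoomCaptureSessionDelegate {
    func captureSession(_ session: RoomCaptureSession,
                        didEndWith data: CapturedRoomData,
                        error: Error?) {
        guard let captureError = error as? RoomCaptureSession.CaptureError else { return }
        switch captureError {
        case .exceedSceneSizeLimit:
            print("Scene size limit exceeded right after relocalisation")
        case .worldTrackingFailure:
            print("World tracking failed")
        default:
            print("Other capture error:", captureError)
        }
    }
}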
                    
                  
                
                    
The Section struct only publicly exposes the center property, but this is a SIMD3 that doesn't seem to line up with the rest of the model. All other objects have a 4x4 transform matrix that accurately gives their position and rotation.
When inspecting a Section in the debugger, many more properties are visible, such as polygon and transform. Why are these not exposed publicly? The transform in particular seems necessary to make any real use of Sections.
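To illustrate the asymmetry, a small sketch (room is assumed to be a CapturedRoom from a completed scan):
import RoomPlan
import simd

// Surfaces (walls, doors, windows) carry a full 4x4 transform,
// while sections publicly expose only a center point and a label.
func dumpGeometry(of room: CapturedRoom) {
    for wall in room.walls {
        let transform: simd_float4x4 = wall.transform   // position + rotation
        print("wall origin:", transform.columns.3)
    }
    for section in room.sections {
        let center: simd_float3 = section.center        // no public transform or polygon
        print("section \(section.label) center:", center)
    }
}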
                    
                  
                
                    
When I run my app from Xcode on a device running iOS 26, the RoomPlan capture is corrupted and the recording is green and purple. This issue does not occur on older iOS versions or when I run the app via TestFlight or the App Store.
                    
                  
                
                    
                      Hello Community,
I'm encountering an issue with the latest iOS 17 update, specifically related to RoomPlan version-2. In iOS 16, when using RoomPlan version-1, we were able to display stairs in our app. However, after upgrading to iOS 17 and implementing RoomPlan version-2, the stairs are no longer visible.
Despite thorough investigation, I couldn't find any option within the code to show or hide stairs, or any other objects for that matter. It seems like a specific issue with the update rather than a coding error on our part.
Has anyone else encountered a similar problem? If so, I would greatly appreciate any insights or solutions you might have. It's crucial for our app functionality to have stairs displayed accurately, and we're currently at a loss on how to address this issue.
Thank you in advance for any assistance you can provide.
Best regards
                    
                  
                
                    
With iOS 26 unveiled, has anyone noticed or found any changes related to RoomPlan?
I can't find anything myself, which is disappointing.
Has anyone found any improvements or changes?
                    
                  
                
                    
Has RoomPlan been abandoned? Two years have gone by without comments from Apple on improvements. Are the improvements happening behind the scenes? Are there going to be any major updates?
                    
                  
                
                    
I have a specific requirement, as described below.
I have an apartment building with 4 units, and each unit has the same layout. The furniture etc. might change, but not the walls and layout, as far as I know.
Step 1:
I enter one unit, scan the entire unit, and save it using the RoomPlan API; at the end of the scan I also save the ARWorldMap.
Step 2:
I want to take the previously saved world map into another unit and relocalize against only the walls, because the furniture might be different.
Is that possible? If not, is there any other way to achieve this functionality? I really appreciate your valuable time. Thank you.
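For reference, a minimal sketch of the save/restore flow I am asking about (whether relocalisation actually succeeds with different furniture is exactly the open question; the URL and helper names are placeholders):
import ARKit
import RoomPlan

// Step 1: after scanning the first unit, snapshot and persist the world map.
func saveWorldMap(from captureSession: RoomCaptureSession, to url: URL) {
    captureSession.arSession.getCurrentWorldMap { map, _ in
        guard let map,
              let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                           requiringSecureCoding: true)
        else { return }
        try? data.write(to: url)
    }
}

// Step 2: in the next unit, start world tracking from the saved map and wait
// for ARKit to report relocalisation before starting the RoomPlan scan.
func startRelocalizedTracking(on arSession: ARSession, mapURL: URL) throws {
    let data = try Data(contentsOf: mapURL)
    guard let map = try NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self, from: data)
    else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = map
    arSession.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}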
                    
                  
                
                    
How can one match the walls and floor of a given CapturedRoom?
The transform.eulerAngles of a floor are always 0 in z and y!
And the polygons seem to have a different orientation than the walls.
So how do I figure out the rotation and match it with the one from the walls?
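One way to compare orientations directly from the 4x4 transforms instead of going through Euler angles (a sketch; assumes Y is the vertical axis, which is ARKit's convention):
import RoomPlan
import simd

// Yaw (rotation about the vertical Y axis) of any CapturedRoom.Surface,
// taken from where its local X axis points in world space.
func yaw(of surface: CapturedRoom.Surface) -> Float {
    let xAxis = simd_make_float3(surface.transform.columns.0)
    return atan2(xAxis.z, xAxis.x)
}

// Usage: compare the floor's yaw against each wall's yaw.
func compareFloorToWalls(in room: CapturedRoom) {
    guard let floor = room.floors.first else { return }
    for wall in room.walls {
        print("wall yaw:", yaw(of: wall), "floor yaw:", yaw(of: floor))
    }
}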
                    
                  
                
                    
I am allowing users to go through and capture different rooms, and add a custom label to each room. Is there a way to store data about this in the captured room so that it persists into the final merge? As it is now, my users mark all their scans with custom labels, but after merging there is no way to tell which room is which, so they have to go through and manually add the labels back. For larger floor plans this is not ideal.
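A workaround sketch, assuming CapturedRoom's identifier property (iOS 17+) stays stable through the StructureBuilder merge, which I have not verified: keep the custom labels outside the model, keyed by room identifier.
import Foundation
import RoomPlan

// Hypothetical helper: store user-entered room names keyed by CapturedRoom.identifier,
// then look them up again on the rooms inside the merged CapturedStructure.
struct RoomLabelStore {
    private var labels: [UUID: String] = [:]

    mutating func setLabel(_ label: String, for room: CapturedRoom) {
        labels[room.identifier] = label
    }

    func label(for room: CapturedRoom) -> String? {
        labels[room.identifier]
    }

    func labels(in structure: CapturedStructure) -> [String?] {
        structure.rooms.map { labels[$0.identifier] }
    }
}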
                    
                  
                
                    
                      Hi everyone,
My app crashed when using the merge room feature.
I suspect the issue might be caused by a wall having more than 4 edges.
Has anyone experienced a similar problem and found a solution?
I’d really appreciate any advice or suggestions.
Thank you all, and have a great day!
let capturedStructure = try await structure.capturedStructure(from: self.rooms)
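In case it helps narrow things down, here is a debugging sketch I put together (my own helper, not part of the API) that merges an increasing prefix of the rooms to isolate the one that triggers the throw; if it hard-crashes instead of throwing, the log at least shows the last prefix that merged cleanly:
import RoomPlan

// Merge rooms 1...n for growing n to find the problematic CapturedRoom.
func findProblematicRoom(structureBuilder: StructureBuilder, rooms: [CapturedRoom]) async {
    guard !rooms.isEmpty else { return }
    for count in 1...rooms.count {
        do {
            _ = try await structureBuilder.capturedStructure(from: Array(rooms.prefix(count)))
            print("Rooms 1...\(count) merge fine")
        } catch {
            print("Merge fails once room \(count) is included:", error)
            return
        }
    }
}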
                    
                  
                
                    
                      Hi,
I'm encountering an issue in our app that uses RoomPlan and ARSession for scanning.
After prolonged use—especially under heavy load from both the scanning process and other unrelated app operations—the iPhone becomes very hot, and the following warning begins to appear more frequently:
"ARSession <0x107559680>: The delegate of ARSession is retaining 11 ARFrames. The camera will stop delivering camera images if the delegate keeps holding on to too many ARFrames. This could be a threading or memory management issue in the delegate and should be fixed."
I was able to reproduce this behavior using Apple’s RoomPlanExampleApp, with only one change: I introduced a CPU-intensive workload at the end of the startSession() function:
        // Artificial CPU load: four background threads spinning in busy loops.
        DispatchQueue.global().asyncAfter(deadline: .now() + 5) {
            for i in 0..<4 {
                var value = 10_000
                DispatchQueue.global().async {
                    while true {
                        value *= 10_000
                        value /= 10_000
                        value ^= 10_000
                        value = 10_000
                    }
                }
            }
        }
I suspect this is a RoomPlan API problem, which is why I filed feedback 17441091.
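For reference, the warning text points at the usual mitigation when an app's own ARSession delegate is involved (a sketch; in the unmodified RoomPlanExampleApp repro the delegate belongs to the framework, so this only applies if your app adds its own delegate): copy the values you need out of each ARFrame instead of retaining the frame itself.
import ARKit

final class FrameObserver: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Copy out plain values; never store the ARFrame (or its pixel buffer) itself.
        let timestamp = frame.timestamp
        let cameraTransform = frame.camera.transform
        DispatchQueue.global(qos: .utility).async {
            // Heavy work uses only the copied values, so the frame is released promptly.
            _ = (timestamp, cameraTransform)
        }
    }
}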
                    
                  
                
                    
I've encountered an unexpected crash with RoomPlan on iOS 16 devices. The odd part is that the code is protected by an #available check, since I'm using newer RoomPlan features.
Xcode error
dyld[40588]: Symbol not found: _$s8RoomPlan08CapturedA0V16USDExportOptionsV5modelAEvgZ
I can repro using the Apple sample code.
https://developer.apple.com/documentation/roomplan/create-a-3d-model-of-an-interior-room-by-guiding-the-user-through-an-ar-experience
Modify RoomCaptureViewController.swift as follows.
Remove
try finalResults?.export(to: destinationURL, exportOptions: .parametric)
Add
            if #available(iOS 17.0, *) {
                try finalResults?.export(to: destinationURL, exportOptions: .model)
            } else {
                try finalResults?.export(to: destinationURL, exportOptions: .parametric)
            }
I would have expected this code to at least compile and run on older devices.
When the app was targeting iOS 15, the availability checks worked as expected and the app was able to launch properly.
                    
                  
                
                    
Hi, is it possible to put text data or information into RoomPlan API elements, such as wall color data, and export them to USDZ after a LiDAR scan?
                    
                  
                
                    
First, I scan the first room using the RoomPlan API. Because I need to scan a second room, I stop it with captureSession.stop(pauseARSession: false); I believe the ARSession continues running at that point.
Second, before scanning the next room, I want to run another ARView (in order to detect some objects that RoomPlan did not detect in the first room).
But at this point the second ARView (RoomPlan has its own ARView internally, I think) always shows a black screen and cannot work normally. This is the problem I want to resolve. Please help me get the second ARView working.
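A sketch of the kind of setup I am trying to achieve, in case it clarifies the question (assumptions: the iOS 17 RoomCaptureSession.arSession property, and that ARView.session can be assigned on the deployment target):
import ARKit
import RealityKit
import RoomPlan

// Reuse the ARSession that RoomPlan kept alive, so the second ARView gets camera frames
// instead of competing with the still-running session for the camera.
func makeObjectDetectionView(sharing captureSession: RoomCaptureSession) -> ARView {
    let arView = ARView(frame: .zero,
                        cameraMode: .ar,
                        automaticallyConfigureSession: false)
    arView.session = captureSession.arSession   // share the live session
    return arView
}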
                    
                  
                
                    
I'm unable to debug on an iPhone 12 running iOS 17 beta 3 via the network. I'm running Xcode 15 beta 6.
The device shows up in Devices and Simulators, and I can debug when connected with a cable.
However, the "Connect via network" option is grayed out; oddly, the checkbox is still ticked.
I'm developing an app using RoomPlan, so network debugging is a must.
Has anyone else encountered this and found a way around it?
Other devices running iOS 16 connect via the network just fine.
                    
                  
                
                    
Is it possible to modify or mark elements in the RoomPlan model generated by the framework?
                    
                  
                
                    
                      Hello,
We are using the RoomPlan API and our users are facing issues during scanning every time.
Error: RoomCaptureSession.CaptureError.exceedSceneSizeLimit
Apple Documentation Explanation: An error that indicates when the scene size grows past the framework’s limitations.
Issue: This error pops up on my iPhone 15 Pro (128 GB) after just ONE RoomPlan scan is done, even if the room size is small. It occurs immediately after I start the RoomCaptureSession following relocalisation of the previous AR session (in world-tracking configuration). I am having trouble understanding exactly why this error appears and how to debug or solve it.
                    
                  
                
                    
I am currently working on creating a virtual interior design app.
Can an app made with the RoomPlan API be used on iPhones without LiDAR? If so, how much accuracy would be lost compared to iPhones with LiDAR?