Create parametric 3D room scans with RoomPlan
RoomPlan can help your app quickly create simplified parametric 3D scans of a room. Learn how you can use this API to easily add a room scanning experience. We'll show you how to adopt this API, explore the 3D parametric output, and share best practices to help your app get great results with every scan.
-
♪ (Mellow instrumental hip-hop music) ♪
Praveen Sharma: Hi. My name is Praveen, and I'm from the Prototyping team here at Apple.
Kai Kang: Hi. My name is Kai, and I'm from the Video Engineering team.
Praveen: Over the past few years, Apple has enabled powerful new ways for people to bring the world into their apps.
Last year, we introduced Object Capture, which takes in photos of real-world objects and, using the Photogrammetry API in RealityKit, turns them into a 3D model ready for use in your app.
Before Object Capture, we released the Scene Reconstruction API, which gives you a coarse understanding of the geometric structure of your space and enables brand-new augmented reality use cases in your apps.
This year, we are very excited to announce a brand-new framework called RoomPlan.
RoomPlan allows you to scan your room using your LiDAR-enabled iPhone or iPad.
It generates a parametric 3D model of the room and its room-defining objects which you can use in your app.
Let's take a look at what a RoomPlan scanning experience looks like.
RoomPlan uses sophisticated machine learning algorithms powered by ARKit to detect walls, windows, openings, and doors, as well as room-defining objects like fireplaces, couches, tables, and cabinets.
With our RoomCaptureView API, which uses RealityKit to render scanning progress in real time, you can easily integrate a scanning experience into your app.
And when you are finished scanning, RoomCaptureView presents the final post-processed results for you to use however best fits your use case.
For the first time, without the complexities of implementing machine learning and computer vision algorithms, people can now interact with their room in brand-new ways.
For example, interior design apps can preview wall color changes and accurately calculate the amount of paint required to repaint a room.
Architecture apps can now easily allow someone to preview and edit changes to their room's layout in real time.
Real estate apps can now seamlessly enable agents to capture floor plans and 3D models of a listing.
And e-commerce apps can engage customers through product visualization in their physical spaces.
These are just a few examples of applications RoomPlan enables, and you'll be surprised to see how simple it is to integrate RoomPlan into your app.
Let's take a look.
There are two main ways you can use RoomPlan.
The first is our out-of-the-box scanning experience which allows you to seamlessly integrate RoomPlan into your app.
The second is our data API which enables your app to use the live parametric data from a scan however best suits your use case.
With both of these APIs, we recommend some best practices to help you achieve the best possible scan results, which we'll go over in the last section of this presentation.
First, let's talk about the scanning experience that you can bring into your app using our new RoomCaptureView API.
RoomCaptureView is a UIView subclass that you can easily place in your app.
It handles the presentation of world space scanning feedback, real-time room model generation, as well as coaching and user guidance.
Let's take a closer look at the design elements presented during a RoomCaptureView-based scan.
During an active RoomCaptureView session, animated lines outline detected walls, windows, openings, doors, and room-defining objects in real time.
The interactive 3D model, generated in real time at the bottom of the RoomCaptureView, gives you an overview of your scanning progress at a glance.
Finally, text coaching guides you to the best possible scanning results.
Let's take a look at how you can start using RoomCaptureView in just four easy steps.
First, we create a RoomCaptureView reference in our ViewController.
Second, we create a reference to our RoomCaptureSession configuration object.
Third, we start our scan session, passing in our configuration to the capture session's run function.
And finally, our application tells the capture session to stop scanning.
Optionally, your app can adopt our RoomCaptureViewDelegate protocol to opt out of post-processed results and their presentation, or to handle the post-processed scan results once they have been presented.
For example, you can export a USDZ of the results by calling the export function available on the provided CapturedRoom data struct.
And that's how simple it is to integrate RoomPlan into your app.
We are so excited to see what you make with this API.
Now my colleague Kai will talk about RoomCaptureSession and RoomPlan's Data API.
Kai: Thanks, Praveen.
In this section, we will walk you through the Data APIs that give you access to the underlying data structures during scanning and help you build a custom visualization of the scanning experience from the ground up.
The basic workflow consists of three parts: scan, process, and export.
For scanning, we will cover the basics of how to set up and start the capture session, as well as display and monitor the capture process.
Then we'll look at how your scanned data is processed and the final model is received for presentation.
Finally, we'll discuss how you can generate and export the output USD file which can also be used in your USD workflows.
Now, let's look into the Scan step in detail.
We will use the RoomCaptureSession API to set up the session and display the progress as we continue scanning.
Let me show you in code.
Here's a simple RealityKit app as an example.
To start, simply import RoomPlan into your Swift project.
In the ViewController of your app, you can have a custom type to visualize the results and initialize a RoomCaptureSession instance.
Additionally, RoomCaptureSession provides a handle to the underlying AR session so that your apps can draw planes and object bounding boxes in the AR view.
RoomCaptureSession adopts the delegate pattern.
In your ViewController class, you can assign the ViewController itself as the captureSession's delegate.
This allows the ViewController to get real-time updates from the RoomCaptureSession.
These updates include 3D models and instructions to guide people during the capture.
To get these updates, your ViewController needs to conform to the RoomCaptureSessionDelegate protocol and implement two methods.
The first one is the captureSession(_:didUpdate:) method, which delivers the real-time CapturedRoom data structure.
Your visualizer can use it to update the AR view of the 3D model, providing real-time feedback to people on their progress.
We will dive into the CapturedRoom structure more in a later part of the talk.
This method will be called when we detect updates to the captured room.
The second method is captureSession(_:didProvide:).
This method provides you with an instruction structure which contains real-time feedback.
Your visualizer can use the instruction to guide people during the scan.
Let's go through the instructions that this API provides.
These instructions include the distance to objects, scanning speed, lighting adjustments in the room, as well as focusing on specific areas of the room that have more texture.
These instructions will be provided during the scan in order to guide people with real-time feedback.
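As a minimal sketch of how your app might turn these updates into on-screen coaching, here is one way to map each instruction to a text prompt. The case names come from RoomCaptureSession.Instruction; the coachingText(for:) helper and the coaching strings are illustrative, not part of the API.

import RoomPlan

// Map each scan instruction to optional coaching text (copy is illustrative).
func coachingText(for instruction: RoomCaptureSession.Instruction) -> String? {
    switch instruction {
    case .normal:
        return nil                              // Scan is going well; no guidance needed.
    case .moveCloseToWall:
        return "Move closer to the wall"
    case .moveAwayFromWall:
        return "Move farther from the wall"
    case .slowDown:
        return "Slow down"
    case .turnOnLight:
        return "Turn on more lights"
    case .lowTexture:
        return "Point at areas with more detail"
    @unknown default:
        return nil
    }
}

Your captureSession(_:didProvide:) implementation could pass its instruction through a helper like this to drive an on-screen label.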
Next, we will move on to the process part.
In this section, we will use the RoomBuilder class to process the scanned data and generate the final 3D models.
To process the captured data, the first step is to initiate a RoomBuilder instance in your ViewController class.
Next, in order to receive the sensor data after the capture process, your app needs to implement the captureSession(_:didEndWith:error:) method.
When the RoomCaptureSession is stopped by calling the stop() function in your app, or due to an error, this method will be called with a CapturedRoomData object and an optional error.
Finally, to process the captured data, we call the roomBuilder's async capturedRoom(from:) method with the await keyword.
The method runs asynchronously to process the scanned data and build the final 3D model.
It uses the Swift async/await pattern that we introduced at last year's WWDC.
Within just a few seconds, the model will be available for the final presentation in your app.
Now, let's dive into the details of the CapturedRoom data structure and how you can export it to use in your app.
At the top level, there is CapturedRoom which consists of Surfaces and Objects.
Surface contains attributes unique to architectural structures: curve information, such as the radius and the starting and ending angles; the four different edges of the surface; and the categories of wall, opening, window, and door.
Object contains furniture categories such as table, bed, sofa, etc.
Surface and Object share some common attributes such as dimensions; confidence, which gives you three levels of confidence for the scanned surface or object; the 3D transform matrix; as well as a unique identifier.
Let's see how they are represented in code.
The CapturedRoom structure is a fully parametric representation of the elements in the room.
It contains five properties including walls, openings, doors, windows, and objects in the room.
The first four are represented as the Surface structure, which models 2D planar architectural structures.
On the right, you can see the various properties of Surface we covered earlier.
The last property is an array of the 3D objects in the room, which are represented as cuboids.
On the right, you can see the various properties of Object.
Here is the list of object types we support in RoomPlan.
These include a variety of common furniture types such as sofa, table, chair, bed, and many more.
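To make these shared attributes concrete, here is a minimal sketch that walks a CapturedRoom and prints each wall and object. The summarize(_:) helper is hypothetical; the properties it reads (dimensions, confidence, transform, identifier, category) are the ones described above and shown in the code listing at 11:20.

import RoomPlan

// Inspect the surfaces and objects of a CapturedRoom (helper name is illustrative).
func summarize(_ room: CapturedRoom) {
    for wall in room.walls {
        // dimensions is a simd_float3; confidence is one of three levels.
        print("Wall \(wall.identifier): dimensions \(wall.dimensions), confidence \(wall.confidence)")
    }
    for object in room.objects {
        // Each object is a cuboid with a furniture category and a 3D transform.
        print("\(object.category): dimensions \(object.dimensions), transform \(object.transform)")
    }
}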
Finally, the export function allows you to export this CapturedRoom as USD or USDZ data for your existing workflows.
Here is an example showing how you can open the USD output directly in Cinema 4D to browse and edit the hierarchical data structure of the room, as well as the dimensions and location of each room element or object.
You can also leverage your existing USD and USDZ workflows to add renders of the captured room into a variety of applications such as real estate, e-commerce, utilities, and interior design.
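As a sketch of that export step, assuming a destination in the app's temporary directory (the file name and the exportRoom(_:) helper are illustrative):

import Foundation
import RoomPlan

// Export the final CapturedRoom to a USDZ file.
func exportRoom(_ room: CapturedRoom) {
    // The destination URL is an assumption for this example.
    let destinationURL = FileManager.default.temporaryDirectory.appendingPathComponent("Room.usdz")
    do {
        try room.export(to: destinationURL)
        print("Exported room to \(destinationURL)")
    } catch {
        print("Export failed: \(error)")
    }
}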
So far, we covered the scanning experience and underlying RoomPlan APIs.
We'll now go through some best practices to help you get good results with RoomPlan.
We will cover the recommended conditions that allow for a good scan, room features to look out for while selecting a room, as well as a few scanning and thermal considerations to keep in mind.
RoomPlan API supports most common architectural structures and objects in a typical household.
It works best for a single residential room with a maximum room size of 30 feet by 30 feet or around 9 by 9 meters.
Lighting is also important for the API to get a clear video stream and good AR tracking performance.
A minimum of 50 lux is recommended for using the API, which is typical for a family living room at night.
For the hardware, RoomPlan API is supported on all LiDAR-enabled iPhone and iPad Pro models.
There are some special conditions that can present a challenge for the API.
For example, full-height mirrors and glass pose a challenge for the LiDAR sensor to produce the expected output.
High ceilings, too, could exceed the scanning range of the LiDAR sensor.
Also, very dark surfaces could be hard for the device to scan.
There are some considerations to get better scanning results.
First, for applications that have high accuracy requirements, preparing the room before scanning can enhance the quality of the scan.
For example, opening the curtains can let more natural light in and reduce window occlusions, which works best for daytime scans.
Closing the doors can reduce the chance of scanning unnecessary area outside of the room.
Following a good scanning motion is also very important to achieving good scanning results with the API.
That is why we provide the user instruction delegate method: to give people real-time feedback on texture, distance, speed, and lighting conditions during a scan.
Another thing to keep in mind is battery and thermals of the device.
We have done many optimizations on RoomPlan API to ensure a good scanning experience.
Nevertheless, it's best to avoid repeated scans or single long scans over 5 minutes.
These could not only cause fatigue, but also drain the battery and create thermal issues, which might in turn impact the user experience of your app.
There is a lot that we covered today.
We introduced a brand-new API, RoomPlan.
It provides an intuitive scanning experience to capture your rooms, powerful machine learning models to understand the environment, as well as a fully parametric USD output format for easy integration in your apps.
For guidance on how to better design and implement your new RoomPlan experience, please check out the related talks below.
Praveen: It's time for you to try RoomPlan in your app.
We can't wait to see what you can create with this new API.
Kai: Thanks for watching! ♪
-
4:36 - RoomCaptureView API - Scan & Process
// RoomCaptureView API - Scan & Process
import UIKit
import RoomPlan

class RoomCaptureViewController: UIViewController {

    var roomCaptureView: RoomCaptureView?
    var captureSessionConfig = RoomCaptureSession.Configuration()

    private func startSession() {
        // Pass the configuration to the capture session's run function.
        roomCaptureView?.captureSession.run(configuration: captureSessionConfig)
    }

    private func stopSession() {
        roomCaptureView?.captureSession.stop()
    }
}
-
5:00 - RoomCaptureView API - Export
// RoomCaptureView API - Export
import UIKit
import RoomPlan

class RoomCaptureViewController: UIViewController {
    …
    func captureView(shouldPresent roomDataForProcessing: CapturedRoomData, error: Error?) -> Bool {
        // Optionally opt out of post-processed scan results.
        return false
    }

    func captureView(didPresent processedResult: CapturedRoom, error: Error?) {
        // Handle final, post-processed results and optional error.
        // Export processedResult.
        …
        try processedResult.export(to: destinationURL)
        …
    }
}
-
6:50 - RoomCaptureSession - setup previewVisualizer
import UIKit
import RealityKit
import RoomPlan
import ARKit

class ViewController: UIViewController {

    @IBOutlet weak var arView: ARView!

    var previewVisualizer: Visualizer!

    lazy var captureSession: RoomCaptureSession = {
        let captureSession = RoomCaptureSession()
        // Use the underlying ARSession to draw planes and bounding boxes in the AR view.
        arView.session = captureSession.arSession
        return captureSession
    }()

    override func viewDidLoad() {
        super.viewDidLoad()
        captureSession.delegate = self
        // Set up previewVisualizer.
    }
}
-
7:40 - RoomCaptureSession - live results and user instructions
// Getting live results and user instructions
extension ViewController: RoomCaptureSessionDelegate {

    func captureSession(_ session: RoomCaptureSession, didUpdate room: CapturedRoom) {
        previewVisualizer.update(model: room)
    }

    func captureSession(_ session: RoomCaptureSession, didProvide instruction: RoomCaptureSession.Instruction) {
        previewVisualizer.provide(instruction)
    }
}
-
9:12 - Setup RoomBuilder
// RoomBuilder
import UIKit
import RealityKit
import RoomPlan
import ARKit

class ViewController: UIViewController {

    @IBOutlet weak var arView: ARView!

    var previewVisualizer: Visualizer!

    // Set up RoomBuilder.
    var roomBuilder = RoomBuilder(options: [.beautifyObjects])
}
-
9:30 - RoomBuilder - generate final 3D CapturedRoom
// RoomBuilder with the latest CapturedRoomData to generate the final 3D CapturedRoom
extension ViewController: RoomCaptureSessionDelegate {

    func captureSession(_ session: RoomCaptureSession, didEndWith data: CapturedRoomData, error: Error?) {
        if let error = error {
            print("Error: \(error)")
        }
        Task {
            let finalRoom = try! await roomBuilder.capturedRoom(from: data)
            previewVisualizer.update(model: finalRoom)
        }
    }
}
-
11:20 - CapturedRoom and export
// CapturedRoom and export
public struct CapturedRoom: Codable, Sendable {

    public let walls: [Surface]
    public let doors: [Surface]
    public let windows: [Surface]
    public let openings: [Surface]
    public let objects: [Object]

    public func export(to url: URL) throws

    // Surface definitions
    ...

    // Object definitions
    ...
}