- iOS 12.0+
- Xcode 10.0+
This sample app demonstrates a simple AR experience for iOS 12 devices. Before exploring the code, try building and running the app to familiarize yourself with the user experience it demonstrates:
1. Run the app. You can look around and tap to place a virtual 3D object on real-world surfaces. (Tap again to relocate the object.)
2. After you’ve explored the environment, the Save Experience button becomes available. Tap it to save ARKit’s world-mapping data to local storage.
3. Tap the Load Experience button. (You can do this immediately, or after quitting and relaunching the app, even if the app has been terminated in the background.)
4. While ARKit attempts to resume an AR session from the saved world-mapping data, the app displays a snapshot of the camera view from the time that data was saved. For best results, move the device so that the camera view matches the screenshot.
Follow the steps below to see how this app uses the `ARWorldMap` class to save and restore ARKit’s spatial mapping state.
Requires Xcode 10.0, iOS 12.0, and an iOS device with an A9 or later processor.
## Run the AR Session and Place AR Content
This app extends the basic workflow for building an ARKit app. (For details, see Tracking and Visualizing Planes.) It defines an `ARWorldTrackingConfiguration` with plane detection enabled, then runs that configuration in the `ARSession` attached to the `ARSCNView` that displays the AR experience.
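The session setup described above can be sketched roughly as follows. The view controller shape and outlet name are assumptions; the ARKit calls themselves are standard:

```swift
import ARKit

class ViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Enable plane detection so the user has real-world
        // surfaces to tap when placing virtual content.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }
}
```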
When a `UITapGestureRecognizer` detects a tap on the screen, the handler method uses ARKit hit-testing to find a 3D point on a real-world surface, then places an `ARAnchor` marking that position. When ARKit calls the delegate method `renderer(_:didAdd:for:)`, the app loads a 3D model for `ARSCNView` to display at the anchor’s position.
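A minimal sketch of that tap-to-place flow, assuming a gesture handler named `handleSceneTap(_:)` and an anchor name of `"virtualObject"` (both illustrative):

```swift
@objc func handleSceneTap(_ sender: UITapGestureRecognizer) {
    guard let sceneView = sender.view as? ARSCNView else { return }
    let point = sender.location(in: sceneView)
    // Hit-test against detected planes to find a point on a real-world surface.
    guard let hit = sceneView.hitTest(point, types: [.existingPlaneUsingGeometry,
                                                     .estimatedHorizontalPlane]).first
        else { return }
    // Place an anchor at the tapped surface position; naming it makes the
    // anchor easy to recognize later in the delegate callback.
    let anchor = ARAnchor(name: "virtualObject", transform: hit.worldTransform)
    sceneView.session.add(anchor: anchor)
}

// ARSCNViewDelegate: called when ARKit adds a node for the new anchor.
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard anchor.name == "virtualObject" else { return }
    // Load and attach a 3D model here.
}
```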
## Capture and Save the AR World Map
An `ARWorldMap` object contains a snapshot of all the spatial mapping information that ARKit uses to locate the user’s device in real-world space. To save a map that can reliably be used to restore your AR session later, you first need to find a good time to capture one.
ARKit provides a `worldMappingStatus` value on each `ARFrame` that indicates whether it’s currently a good time to capture a world map (or whether it’s better to wait until ARKit has mapped more of the local environment). This app uses that value to provide visual feedback and to choose when to make the Save Experience button available:
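A sketch of that feedback logic in the session delegate; the `saveExperienceButton` outlet name is an assumption:

```swift
// ARSessionDelegate: called for every new camera frame.
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // Enable saving only once ARKit reports the map is in a usable state.
    switch frame.worldMappingStatus {
    case .extending, .mapped:
        saveExperienceButton.isEnabled = true
    default:
        // .notAvailable or .limited: keep exploring before saving.
        saveExperienceButton.isEnabled = false
    }
}
```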
When the user taps the Save Experience button, the app calls `getCurrentWorldMap(completionHandler:)` to capture the map from the running `ARSession`, then serializes it to a `Data` object with `NSKeyedArchiver` and writes it to local storage:
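Roughly, that capture-and-archive step might look like this; `mapSaveURL` is an assumed file URL in the app’s documents directory:

```swift
func saveExperience() {
    sceneView.session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else {
            print("Can't get current world map: \(error!.localizedDescription)")
            return
        }
        do {
            // ARWorldMap supports NSSecureCoding, so require it when archiving.
            let data = try NSKeyedArchiver.archivedData(withRootObject: map,
                                                        requiringSecureCoding: true)
            try data.write(to: self.mapSaveURL, options: [.atomic])
        } catch {
            print("Can't save map: \(error.localizedDescription)")
        }
    }
}
```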
To help a user resume the AR experience from this map later, the app also captures a snapshot of the camera view with the example `SnapshotAnchor` class and stores it in the world map.
## Load and Relocalize to a Saved Map
When the app launches, it checks local storage for a world map file it may have saved in an earlier session:
If that file exists and can be deserialized as an `ARWorldMap` object, the app makes its Load Experience button available. When you tap the button, the app tells ARKit to attempt to resume the session captured in that world map by creating and running an `ARWorldTrackingConfiguration` that uses the map as its `initialWorldMap`.
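A sketch of the load-and-resume step, again assuming a `mapSaveURL` file location:

```swift
func loadExperience() throws {
    // Read and unarchive the previously saved world map.
    let data = try Data(contentsOf: mapSaveURL)
    guard let worldMap = try NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                                from: data)
        else { throw ARError(.invalidWorldMap) }

    // Run a new configuration seeded with the saved map; ARKit will
    // attempt to relocalize against it.
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = .horizontal
    configuration.initialWorldMap = worldMap
    sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```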
ARKit then attempts to relocalize to the new world map—that is, to reconcile the received spatial-mapping information with what it senses of the local environment. This process is more likely to succeed if the user moves to areas of the local environment that they visited during the previous session. To help the user successfully resume the saved experience, this app uses the example `SnapshotAnchor` class to save a camera image in the world map, then displays that image while ARKit is relocalizing.
## Restore AR Content After Relocalization
Saving a world map also archives all anchors currently associated with the AR session. After you successfully run a session from a saved world map, the session contains all anchors previously saved in the map. You can use saved anchors to restore virtual content from a previous session.
In this app, after relocalizing to a previously saved world map, the virtual object placed in the previous session automatically appears at its saved position. The same `ARSCNView` delegate method `renderer(_:didAdd:for:)` fires both when you directly add an anchor to the session and when the session restores anchors from a world map. To determine which saved anchor represents the virtual object, this app checks the anchor’s `name` property.
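For example, the delegate callback might distinguish the restored object anchor by name. In this sketch, the `"virtualObject"` anchor name and the `loadVirtualObject()` helper are assumptions:

```swift
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    // This callback runs both for freshly placed anchors and for anchors
    // restored from a saved world map; the name identifies the object anchor.
    guard anchor.name == "virtualObject" else { return }
    node.addChildNode(loadVirtualObject())  // assumed model-loading helper
}
```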
In your own AR experience, you can choose among various techniques for restoring virtual content associated with saved anchors. For example:
- An app for visualizing furniture from a fixed catalog might save an identifier for each placed object in the corresponding anchor’s `name` property, then use that identifier to determine which 3D model to display when resuming a session from a saved map.
- A game that places virtual characters in the user’s environment might create custom `ARAnchor` subclasses to store gameplay data specific to each character, so that resuming a session from a saved map also restores the state of the game. (See the Subclassing Notes in the `ARAnchor` documentation.)
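The custom-subclass idea in the last bullet might look roughly like the following. `CharacterAnchor` and its `characterName` property are illustrative, not part of the sample; the overridden initializers and secure-coding support are what lets the anchor survive world-map serialization:

```swift
import ARKit

class CharacterAnchor: ARAnchor {
    let characterName: String

    init(characterName: String, transform: simd_float4x4) {
        self.characterName = characterName
        super.init(name: "character", transform: transform)
    }

    // Required so ARKit can copy anchors internally.
    required init(anchor: ARAnchor) {
        self.characterName = (anchor as! CharacterAnchor).characterName
        super.init(anchor: anchor)
    }

    // NSSecureCoding support: the custom property is archived and
    // restored along with the rest of the world map.
    override class var supportsSecureCoding: Bool { return true }

    required init?(coder: NSCoder) {
        guard let name = coder.decodeObject(of: NSString.self,
                                            forKey: "characterName") as String?
            else { return nil }
        self.characterName = name
        super.init(coder: coder)
    }

    override func encode(with coder: NSCoder) {
        super.encode(with: coder)
        coder.encode(characterName as NSString, forKey: "characterName")
    }
}
```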