- iOS 11.0+
- Xcode 9.1+
This sample app runs an ARKit world-tracking session with content displayed in a SceneKit view. To demonstrate plane detection, the app simply places an SCNPlane object to visualize each detected ARPlaneAnchor object.
Configure and Run the AR Session
The ARSCNView class is a SceneKit view that includes an ARSession object, which manages the motion tracking and image processing required to create an augmented reality (AR) experience. However, to run a session you must provide a session configuration.
The ARWorldTrackingConfiguration class provides high-precision motion tracking and enables features to help you place virtual content in relation to real-world surfaces. To start an AR session, create a session configuration object with the options you want (such as plane detection), then call the run(_:options:) method on the session object of your ARSCNView.
Run your session only when the view that will display it is onscreen.
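A minimal sketch of this setup, assuming a view controller with an ARSCNView outlet named `sceneView` (the outlet name is an assumption, not taken from the sample):

```swift
import ARKit
import UIKit

class ViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // Create a world-tracking configuration and enable horizontal plane detection.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal

        // Run the session only while the view that displays it is onscreen.
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)

        // Pause the session when the view is no longer visible.
        sceneView.session.pause()
    }
}
```

Pausing in `viewWillDisappear(_:)` mirrors the guidance above: the session consumes camera and motion data, so it should run only while its view is onscreen.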
Important: If your app requires ARKit for its core functionality, use the arkit key in the UIRequiredDeviceCapabilities section of your app’s Info.plist file to make your app available only on devices that support ARKit. If AR is a secondary feature of your app, use the ARConfiguration.isSupported property to determine whether to offer AR-based features.
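For the secondary-feature case, a hedged sketch of the runtime check (the fallback behavior here is illustrative, not from the sample):

```swift
import ARKit

// Check whether this device supports world tracking before offering AR features.
// ARConfiguration subclasses expose a class-level isSupported property.
if ARWorldTrackingConfiguration.isSupported {
    // Offer the AR-based feature.
} else {
    // Fall back to a non-AR experience on unsupported devices.
}
```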
Place 3D Content for Detected Planes
After you’ve set up your AR session, you can use SceneKit to place virtual content in the view.
When plane detection is enabled, ARKit adds and updates anchors for each detected plane. By default, the ARSCNView class adds an SCNNode object to the SceneKit scene for each anchor. Your view’s delegate can implement the renderer(_:didAdd:for:) method to add content to the scene.
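A sketch of that delegate method, placing a semitransparent SCNPlane to visualize a detected plane anchor (the opacity value and child-node layout are assumptions for illustration):

```swift
import ARKit
import SceneKit

func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    // Place visualization content only for plane anchors.
    guard let planeAnchor = anchor as? ARPlaneAnchor else { return }

    // Create a plane geometry sized to the anchor's estimated extent.
    let plane = SCNPlane(width: CGFloat(planeAnchor.extent.x),
                         height: CGFloat(planeAnchor.extent.z))
    let planeNode = SCNNode(geometry: plane)

    // Position the node at the anchor's center, relative to the anchor's node.
    planeNode.position = SCNVector3(planeAnchor.center.x, 0, planeAnchor.center.z)

    // SCNPlane is vertical in its local coordinate space; rotate it to lie flat.
    planeNode.eulerAngles.x = -.pi / 2
    planeNode.opacity = 0.25

    // Adding the content as a child of the anchor's node lets ARSCNView
    // move it automatically as ARKit refines the plane estimate.
    node.addChildNode(planeNode)
}
```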
If you add content as a child of the node corresponding to the anchor, the ARSCNView class automatically moves that content as ARKit refines its estimate of the plane’s position and extent. To show the full extent of the estimated plane, this sample app also implements the renderer(_:didUpdate:for:) method, updating the SCNPlane object’s size to reflect the estimate provided by ARKit.
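A sketch of that update method, assuming the plane visualization was added as the anchor node's first child (as in the previous sketch):

```swift
import ARKit
import SceneKit

func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    // Update only plane anchors whose visualization node we added earlier.
    guard let planeAnchor = anchor as? ARPlaneAnchor,
          let planeNode = node.childNodes.first,
          let plane = planeNode.geometry as? SCNPlane
    else { return }

    // Recenter the visualization as ARKit refines the plane's position.
    planeNode.position = SCNVector3(planeAnchor.center.x, 0, planeAnchor.center.z)

    // Resize the SCNPlane to match ARKit's refined extent estimate.
    plane.width = CGFloat(planeAnchor.extent.x)
    plane.height = CGFloat(planeAnchor.extent.z)
}
```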