The description of a real-world object you want ARKit to look for in the physical environment during an AR session.
- iOS 12.0+
Object detection in ARKit lets you trigger AR content when the session recognizes a known 3D object. For example, your app could detect sculptures in an art museum and provide a virtual curator, or detect tabletop gaming figures and create visual effects for the game.
To provide a known 3D object for detection, you scan a real-world object using ARKit:
Run an AR session using ARObjectScanningConfiguration to enable collection of high-fidelity spatial mapping data.
In that session, point the device camera at the real-world object from various angles, allowing ARKit to build up an internal map of the object and its surroundings. For an example of guiding user interactions to produce good scan data, see Scanning and Detecting 3D Objects.
Determine the portion of the session's world coordinate space representing the object to be recognized, and call createReferenceObject(transform:center:extent:completionHandler:) to get that portion as an ARReferenceObject ready for use in object detection.
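The scanning flow above can be sketched as follows. This is a minimal illustration, not a complete scanning app: the transform, center, and extent values passed to createReferenceObject(transform:center:extent:completionHandler:) are placeholders that a real app would derive from a user-adjustable bounding box.

```swift
import ARKit

// Step 1: run a scanning session. ARObjectScanningConfiguration is
// resource-intensive, so use it only during the scanning phase.
let sceneView = ARSCNView()
sceneView.session.run(ARObjectScanningConfiguration())

// Steps 2-3: after the user has framed the object from several angles,
// extract the mapped region as a reference object.
func extractObject(from session: ARSession) {
    session.createReferenceObject(
        transform: matrix_identity_float4x4,  // placeholder: box position/orientation in world space
        center: SIMD3<Float>(0, 0, 0),        // placeholder: box center, relative to transform
        extent: SIMD3<Float>(0.1, 0.1, 0.1)   // placeholder: box size in meters
    ) { referenceObject, error in
        if let object = referenceObject {
            // The ARReferenceObject is now ready to export or detect.
            print("Scanned object extent: \(object.extent)")
        } else if let error = error {
            print("Extraction failed: \(error.localizedDescription)")
        }
    }
}
```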
To save the reference object for use later or elsewhere, use the export(to:previewImage:) method to create an .arobject file.
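Exporting might look like the sketch below; the destination file name is a hypothetical example, and the preview image (shown by Xcode in the asset catalog) is optional.

```swift
import ARKit

// Serialize a scanned ARReferenceObject to an .arobject file on disk
// so it can be bundled into another app.
func save(_ object: ARReferenceObject) throws {
    let url = FileManager.default
        .urls(for: .documentDirectory, in: .userDomainMask)[0]
        .appendingPathComponent("Sculpture.arobject")  // hypothetical file name
    try object.export(to: url, previewImage: nil)
}
```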
To detect objects in an AR session, pass a collection of reference objects to your session configuration's detectionObjects property. You need not scan and detect objects in the same app: For example, you might create one app for scanning museum collections that outputs
.arobject files, then bundle those files into another app meant for museum visitors.
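Configuring detection can be sketched as follows; "Gallery" is a hypothetical AR Resource Group name, and loading from an asset catalog assumes the objects were bundled as described below.

```swift
import ARKit

// Enable detection of previously scanned objects in a world-tracking session.
func runDetection(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    // Load every reference object in the hypothetical "Gallery" resource group.
    configuration.detectionObjects = ARReferenceObject.referenceObjects(
        inGroupNamed: "Gallery", bundle: nil) ?? []
    session.run(configuration)
}
```

When ARKit recognizes one of these objects, it adds an ARObjectAnchor to the session, which you can observe through the session delegate.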
To bundle reference objects into an app, use your Xcode project's asset catalog:
In your asset catalog, use the Add (+) button to create an AR Resource Group.
Drag .arobject files into the resource group to create AR Reference Object entries in the asset catalog.
Optionally, use the Xcode inspector panel to provide a descriptive name for the object, which appears as the name property at runtime and can be useful for debugging.
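The name assigned in the asset catalog surfaces when detection succeeds. A minimal delegate sketch, assuming a session configured for object detection:

```swift
import ARKit

// Respond to detected objects; the name set in the asset catalog is
// available on the anchor's reference object.
class DetectionDelegate: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let objectAnchor as ARObjectAnchor in anchors {
            print("Detected: \(objectAnchor.referenceObject.name ?? "unnamed")")
        }
    }
}
```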