Annotate an AR experience with virtual sticky notes that you display onscreen over real and virtual objects.
- iOS 13.0+
- Xcode 11.2+
At times, the user may want to annotate real or virtual objects in your AR experience. For example, they might want to place a virtual name plate on a painting at a museum. By fixing annotations to the screen, you enable the user to annotate their AR experience in screen space. To demonstrate screen-space annotations, this sample app lets the user tap the screen to place one or more virtual sticky notes with text in the real world.
Text displayed in screen space remains readable at all viewing angles and distances. The sample app implements sticky notes using a UITextView that's flush with the screen, which allows the user to quickly define the note's text using regular touch input. Using UIKit to annotate an AR experience also provides the benefits of localization and accessibility.
To display text that’s anchored in world space instead, see Recognizing and Labeling Arbitrary Objects.
Resolve the User’s Tap to a 3D Location
To annotate an object in an AR experience, you first determine where it is in the physical environment. This sample app enables the user to tap the screen to place a sticky note by first adding a tap gesture recognizer to the view.
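A minimal version of that setup might look like the following sketch; the handler name tappedScreen(_:) is illustrative:

```swift
import UIKit
import RealityKit

class ViewController: UIViewController {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Route taps on the AR view to the sticky-note handler.
        let tapRecognizer = UITapGestureRecognizer(target: self,
                                                   action: #selector(tappedScreen(_:)))
        arView.addGestureRecognizer(tapRecognizer)
    }

    @objc func tappedScreen(_ sender: UITapGestureRecognizer) {
        // Resolve the tap to a 3D location, as described next.
    }
}
```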
When the input handler is called, you read the tap's screen coordinates by calling the gesture recognizer's location(in:) function.
To get a 3D world position that corresponds to the tap location, cast a ray from the camera’s origin through the touch location to check for intersection with any real-world surfaces along that ray.
If ARKit finds a planar surface where the user tapped, the ray-cast result provides you the 3D intersection point in its worldTransform property.
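Put together, the tap handler might look like the following sketch; addStickyNote(at:) stands in for whatever your app does with the resulting transform:

```swift
@objc func tappedScreen(_ sender: UITapGestureRecognizer) {
    // Read the tap's screen coordinates.
    let touchLocation = sender.location(in: arView)

    // Cast a ray from the camera through the touch location, allowing
    // intersections with surfaces that ARKit estimates as planes.
    guard let result = arView.raycast(from: touchLocation,
                                      allowing: .estimatedPlane,
                                      alignment: .any).first else {
        // The ray didn't hit a real-world surface; don't place a note.
        return
    }

    // The result's worldTransform holds the 3D intersection point.
    addStickyNote(at: result.worldTransform)
}
```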
Anchor a Sticky Note in the Environment
To keep track of a real-world location, you create an anchor positioned there. RealityKit implements an anchor as an Entity that conforms to the HasAnchoring protocol. Thus, you adopt that protocol when designing a sticky note in RealityKit.
Create the entity by calling its initializer and passing in the ray-cast result's worldTransform.
In the sticky note entity's init function, position the entity at the tap location by setting its transformation matrix to the worldTransform argument.
Let RealityKit know about your entity by adding it to the scene hierarchy. RealityKit then registers an
ARAnchor for your entity with ARKit.
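A sketch of such an entity, with an illustrative addStickyNote(at:) helper that adds it to the scene:

```swift
import RealityKit
import simd

// A sticky note entity has no geometry; it only marks a 3D location.
class StickyNoteEntity: Entity, HasAnchoring {
    init(worldTransform: simd_float4x4) {
        super.init()
        // Position the entity at the tapped real-world location.
        self.transform.matrix = worldTransform
    }

    required init() {
        super.init()
    }
}

extension ViewController {
    func addStickyNote(at worldTransform: simd_float4x4) {
        let note = StickyNoteEntity(worldTransform: worldTransform)
        // Adding the entity to the scene prompts RealityKit to register
        // a corresponding ARAnchor with ARKit.
        arView.scene.addAnchor(note)
    }
}
```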
Display the Sticky Note’s Annotation
For the purposes of this sample app, the sticky note entity has no geometry and thus, no appearance. Its anchor provides a 3D location only, and it's the sticky note's screen-space annotation that has an appearance. To display it, you define the sticky note's annotation. Following RealityKit's entity-component model, design a component that houses the annotation, which in this case is a view.
As a prepackaged UI element that renders text for you, UITextView is useful as a screen-space annotation.
Expose the screen-space component in its own protocol, and implement the protocol in your entity.
To display the entity’s annotation, add the sticky-note view to the view hierarchy.
To put the annotation in the right place on the screen, ask the ARView to convert the entity's world location to a 2D screen point by calling its project(_:) function.
To enhance visual accuracy, center the note’s view around the anchor’s projected world location.
To do that, calculate the midpoint and set the view’s origin.
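One way to sketch that projection, assuming each note exposes its annotation UIView as an illustrative view property:

```swift
func updateScreenPosition(of note: StickyNoteEntity) {
    // Convert the entity's 3D world location to a 2D screen point.
    guard let projectedPoint = arView.project(note.position(relativeTo: nil)) else {
        return
    }
    // Center the view around the projected point by offsetting the
    // origin by half the view's size.
    note.view.frame.origin = CGPoint(x: projectedPoint.x - note.view.frame.width / 2,
                                     y: projectedPoint.y - note.view.frame.height / 2)
}
```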
Update the Annotation’s Position
Because users move their device during an AR experience, the annotation's screen position quickly becomes out of sync with its anchor's world position. To keep the annotation's screen position accurate, call the ARView's project(_:) function every frame, updating the annotation's position with the result.
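One way to do that is to subscribe to RealityKit's per-frame scene update event; stickyNotes, updateSubscription, and updateScreenPosition(of:) are illustrative names:

```swift
import Combine
import RealityKit

extension ViewController {
    func beginUpdatingAnnotations() {
        // Keep the returned Cancellable alive (for example, in a stored
        // property) so the subscription isn't torn down immediately.
        updateSubscription = arView.scene.subscribe(to: SceneEvents.Update.self) { _ in
            // Reproject every annotation each rendered frame.
            for note in self.stickyNotes {
                self.updateScreenPosition(of: note)
            }
        }
    }
}
```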
Handle User Interaction
A benefit of using UIView types for screen annotations is that they simplify user interaction. The sample implements sticky notes using UITextView, which enables users to easily edit their text. The sample implements minimal gesture-recognizer code to manage sticky notes.
The following code enables the capability to create a note by tapping the screen.
Because UITextView implements its own tap gesture handling to control editing, the user can tap an existing note to edit its text. To be notified when the user finishes editing a note, implement a UITextViewDelegate callback, such as textViewDidEndEditing(_:).
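For example, assuming an illustrative StickyNoteView that owns the note's UITextView, the delegate callback might be sketched as:

```swift
import UIKit

extension StickyNoteView: UITextViewDelegate {
    func textViewDidEndEditing(_ textView: UITextView) {
        // The user finished editing: save the note's text and dismiss
        // the keyboard.
        textView.resignFirstResponder()
    }
}
```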
The following code enables the capability to move a note by panning the screen.
When the user pans to reposition a sticky note, you convert the screen touch location to a 3D world position using ARView's raycast(from:allowing:alignment:) function. The user thereby repositions the sticky note's anchor in the real world, rather than simply moving the annotation to an arbitrary new screen location. If a ray cast from the pan gesture's final screen location doesn't intersect a real-world surface, don't move the sticky note there.
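A sketch of that pan handler; pannedNoteView and pannedNoteEntity are illustrative references to the note being dragged:

```swift
@objc func pannedStickyNote(_ sender: UIPanGestureRecognizer) {
    let screenLocation = sender.location(in: arView)

    switch sender.state {
    case .changed:
        // While the pan is in flight, move only the 2D annotation.
        pannedNoteView?.center = screenLocation
    case .ended:
        // On release, try to re-anchor the note in the real world.
        if let result = arView.raycast(from: screenLocation,
                                       allowing: .estimatedPlane,
                                       alignment: .any).first {
            pannedNoteEntity?.transform.matrix = result.worldTransform
        }
        // If the ray cast found no surface, leave the anchor where it
        // was, so the annotation returns to its previous position.
    default:
        break
    }
}
```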
The following portion of the pan gesture handler enables the capability to remove a sticky note when the user drags it to the text that says “delete” at the top of the screen.
Enhance the Experience with Animation
Keeping screen-space annotations to a minimum maximizes the user's immersion in the AR experience. The sample app makes sticky notes small when the user isn't editing text, minimizing distractions so they can focus on the real-world environment. But for similar reasons, you should enlarge a sticky note while the user is editing its text. To create a seamless transition between the editing and nonediting states, animate the sticky note's size instead of changing it abruptly.
Bring even more focus to the editing experience by dimming the background and by lighting the sticky note that the user is editing.
To prevent the user from losing track of a sticky note’s real-world location, animate the note smoothly from one position to the next. For example, if an annotation fails to reposition, animate the sticky note back to its original screen position. This increases the user’s ability to track the annotation if they want to try moving it again.
To animate the note's movement, you continually set its location over the course of a short animation, rather than jumping it to its final position in a single frame.
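For example, a failed reposition might animate the annotation back to its last valid screen point (the names are illustrative):

```swift
import UIKit

func animateNote(_ noteView: UIView, backTo lastValidCenter: CGPoint) {
    // Smoothly return the annotation instead of snapping, so the user
    // can keep visual track of the note.
    UIView.animate(withDuration: 0.25) {
        noteView.center = lastValidCenter
    }
}
```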