Augmented Reality

ARKit, Apple's augmented reality (AR) technology, delivers immersive, engaging experiences that seamlessly blend virtual objects with the real world. In AR apps, the device's camera presents a live, onscreen view of the physical world. Three-dimensional virtual objects are superimposed over this view, creating the illusion that they actually exist. The user can reorient their device to explore the objects from different angles and, if appropriate for the experience, interact with objects using gestures and movement.

Designing an Engaging Experience

Use the entire display. Devote as much of the screen as possible to viewing and exploring the physical world and your app's virtual objects. Avoid cluttering the screen with controls and information that diminish the immersive experience.

Create convincing illusions when placing realistic objects. Not all AR experiences require realistic virtual objects. Those that do should include objects that appear to inhabit the physical environment in which they're placed. For best results, design detailed 3D assets with lifelike textures. Use the information ARKit provides to position objects on detected real-world surfaces, scale objects properly, reflect environmental lighting conditions on virtual objects, cast virtual object shadows on real-world surfaces, and update visuals as the camera's position changes.

Anticipate that people will use your app in environments that aren’t optimal for AR. People may open your app in a location where there isn't much room to move around or there aren't large, flat surface areas. Try to anticipate scenarios that present challenges, and clearly communicate requirements or expectations to people up front. Consider offering varying sets of features for use in different environments.

Be mindful of the user's comfort. Holding a device at a certain distance or angle for a prolonged period can be fatiguing. Consider how people must hold their device when using your app, and strive for an enjoyable experience that doesn't cause discomfort. For example, you could place objects at a default distance that reduces the need to move the device closer. A game could keep levels short and intermix them with brief periods of downtime.

If your app encourages user motion, introduce it gradually. In a game, the user shouldn't need to move out of the way to avoid a virtual projectile as soon as they enter AR. Give them time to adapt to the experience first. Then, progressively encourage movement.

Be mindful of the user's safety. Moving around too much can be dangerous if other people or objects are nearby. Consider ways of making your app safe to operate. A game could avoid encouraging large or sudden movements.

Use audio and haptic feedback to enhance the immersive experience. A sound effect or bump sensation is a great way to confirm that a virtual object has made contact with a physical surface or other virtual object. In an immersive game, background music can help envelop the user in the virtual world. For related guidance, see Audio and Haptic Feedback.

Wherever possible, provide hints in context. Placing a three-dimensional rotation indicator around an object, for example, is more intuitive than presenting text-based instructions in an overlay. Textual overlay hints may be warranted prior to surface detection, however, or if the user isn't responding to contextual hints.

Consider guiding people toward offscreen virtual objects. It can sometimes be hard to locate an object that’s positioned offscreen. If it seems like the user is having trouble finding an offscreen object, consider offering visual or audible cues. For example, if an object is offscreen to the left, you could show an indicator along the left edge of the screen so the user knows to aim the camera in that direction.
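The edge-cue logic above can be sketched in plain Swift. This is an illustrative sketch, not ARKit API: it assumes you already have the object's position projected into 2D screen coordinates (a renderer method such as SceneKit's projectPoint provides this) and simply decides which screen edge, if any, should show the indicator.

```swift
// Which edge of the screen should show a "look this way" cue, if any.
enum EdgeCue: Equatable {
    case visible          // object is onscreen; no cue needed
    case left, right, top, bottom
}

// Decide on a cue from an already-projected 2D point and the screen size.
func edgeCue(forProjectedPoint p: (x: Double, y: Double),
             screenWidth: Double, screenHeight: Double) -> EdgeCue {
    // Inside the screen bounds: the object is visible, so show no cue.
    if p.x >= 0 && p.x <= screenWidth && p.y >= 0 && p.y <= screenHeight {
        return .visible
    }
    // Otherwise, cue along the axis with the larger overshoot, so a point far
    // off to one side produces the cue for the dominant direction.
    let dx = p.x < 0 ? -p.x : max(0, p.x - screenWidth)
    let dy = p.y < 0 ? -p.y : max(0, p.y - screenHeight)
    if dx >= dy {
        return p.x < 0 ? .left : .right
    } else {
        return p.y < 0 ? .top : .bottom
    }
}
```

A fuller implementation might also fade the cue in and out, or animate it along the edge as the object moves.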

If you must display instructional text, use approachable terminology. AR is an advanced concept that may be intimidating to some users. To help make it approachable, avoid referring to technical, developer-oriented terms like ARKit, world detection, and tracking. Instead, use friendly, conversational terms that most people will understand.

Do: Unable to find a surface. Try moving to the side or repositioning your phone.
Don't: Unable to find a plane. Adjust tracking.

Do: Tap a location to place the [name of object to be placed].
Don't: Tap a plane to anchor an object.

Do: Try turning on more lights and moving around.
Don't: Insufficient features.

Do: Try moving your phone slower.
Don't: Excessive motion detected.

Entering Augmented Reality

Indicate when initialization and surface detection are in progress, and involve the user. Each time your app enters AR, an initialization process occurs during which your app evaluates the surroundings and detects surfaces. Surface detection time can vary based on a number of factors. To reduce possible confusion, indicate that your app is attempting to detect a surface, and encourage people to speed up the process by slowly scanning their surroundings.

Placing Virtual Objects

[Images: surface detection indicator, object placement indicator, and app-specific indicator]

Help people understand when to locate a surface and place an object. A visual indicator is a great way to communicate that surface targeting mode is active. A trapezoidal reticle in the center of the screen, for example, helps people infer that they should find a horizontal or vertical flat surface. Once a surface is targeted, the indicator should change in appearance to suggest that object placement is now possible. If the indicator's orientation follows the alignment of the detected surface, it can help people anticipate how the placed object will be aligned. Design visual indicators that feel like part of your app experience.
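One way to structure the indicator's behavior is as a small state machine. The sketch below is illustrative only — the states and inputs are assumptions, not ARKit types; a real app would derive "surface targeted" from hit-testing or plane detection, and "placement confirmed" from a tap.

```swift
// Possible states of a placement reticle.
enum ReticleState: Equatable {
    case searching        // no surface under the reticle yet
    case readyToPlace     // surface targeted; change the indicator's appearance
    case placed           // object placed; hide or repurpose the indicator
}

// Compute the next reticle state from the current state and session inputs.
func nextReticleState(current: ReticleState,
                      surfaceTargeted: Bool,
                      placementConfirmed: Bool) -> ReticleState {
    switch current {
    case .searching:
        return surfaceTargeted ? .readyToPlace : .searching
    case .readyToPlace:
        if !surfaceTargeted { return .searching }   // lost the surface
        return placementConfirmed ? .placed : .readyToPlace
    case .placed:
        return .placed
    }
}
```

Keeping the transitions in one place makes it easy to drive both the indicator's appearance and any accompanying hint text from a single source of truth.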

Respond appropriately when the user places an object. Accuracy is progressively refined (over a very short time) during surface detection. If the user taps the screen to place an object, place it immediately by using the information that's currently available. Then, once surface detection is complete, subtly refine the object's position. If an object is placed beyond the bounds of the detected surface, gently nudge the object back onto the surface.

Avoid trying to precisely align objects with the edges of detected surfaces. In AR, surface boundaries are approximations that may change as the user's surroundings are further analyzed.
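The "nudge back onto the surface" behavior can be approximated by clamping the object's position to the detected surface's extent. A minimal sketch, assuming the surface is modeled as an axis-aligned rectangle in its own coordinate space (ARKit's plane anchors expose a center-plus-extent estimate you could convert to this form):

```swift
// A detected surface's approximate bounds in its own 2D coordinate space.
struct SurfaceBounds {
    var minX: Double, maxX: Double
    var minZ: Double, maxZ: Double
}

// Clamp an object's in-plane position so it stays on the surface.
func nudgedPosition(x: Double, z: Double, onto s: SurfaceBounds) -> (x: Double, z: Double) {
    (x: min(max(x, s.minX), s.maxX),
     z: min(max(z, s.minZ), s.maxZ))
}
```

Because surface bounds are approximations that keep changing, apply the nudge gently (for example, with a short animation) rather than snapping the object.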

User Interaction with Virtual Objects

Favor direct manipulation over separate onscreen controls. It's more immersive and intuitive when a user can touch an object onscreen and interact with it directly, rather than interact with separate controls on a different part of the screen. Bear in mind, however, that direct manipulation can sometimes be confusing or difficult when the user is moving around.

Allow people to directly interact with virtual objects using standard, familiar gestures. For example, consider supporting a single-finger drag gesture for moving objects, and a two-finger rotation gesture for spinning objects. Rotation should generally occur relative to the surface on which an object rests—for example, an object placed on a horizontal surface would typically rotate around the object's vertical axis. For related guidance, see Gestures.

In general, keep interactions simple. Touch gestures are inherently two-dimensional, but an AR experience involves the three dimensions of the real world. Consider the following approaches to simplifying user interactions with virtual objects.

Limit movement to the two-dimensional surface on which the object rests.

Limit object rotation to a single axis.
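The two simplifications above can be sketched as plain math: a drag changes only the in-plane coordinates, never the height, and rotation accumulates about a single (vertical) axis. The types here are hypothetical stand-ins, not ARKit or SceneKit types.

```swift
// A hypothetical 3D position; y is height above the surface.
struct Position { var x, y, z: Double }

// Apply a 2D drag delta within the surface plane; y never changes.
func dragged(_ p: Position, dx: Double, dz: Double) -> Position {
    Position(x: p.x + dx, y: p.y, z: p.z + dz)
}

// Accumulate a two-finger rotation as yaw only, wrapped to [0, 2π).
func rotatedYaw(current: Double, delta: Double) -> Double {
    var yaw = (current + delta).truncatingRemainder(dividingBy: 2 * Double.pi)
    if yaw < 0 { yaw += 2 * Double.pi }
    return yaw
}
```

Constraining the math this way means a two-dimensional gesture always maps to a predictable, two-dimensional result, which is exactly what makes the interaction feel simple.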

Respond to gestures within reasonable proximity of interactive virtual objects. It may be difficult for the user to precisely touch specific points on objects that are small, thin, or placed at a distance. When your app detects a gesture near an interactive object, it's usually best to assume the user wants to affect the object.
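A simple way to implement this is to expand the object's projected screen bounds by a tolerance before hit-testing the tap. The sketch below is illustrative: the projected rectangle is assumed to come from your renderer, and the 44-point default mirrors the familiar minimum tap-target size.

```swift
// An object's bounds projected into screen coordinates (hypothetical type).
struct ScreenRect {
    var x, y, width, height: Double
}

// Accept a tap that lands within `tolerance` points of the projected bounds,
// so small, thin, or distant objects remain easy to hit.
func tapHits(_ rect: ScreenRect, tapX: Double, tapY: Double, tolerance: Double = 44) -> Bool {
    tapX >= rect.x - tolerance && tapX <= rect.x + rect.width + tolerance &&
    tapY >= rect.y - tolerance && tapY <= rect.y + rect.height + tolerance
}
```

If several objects fall within tolerance of the same tap, prefer the one whose bounds are closest to the tap point.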

Consider whether user-initiated object scaling is necessary. Scaling is generally appropriate when an object, like a toy or game character, doesn't have an intrinsic size and the user wants to see it larger or smaller. For an object with a finite size relative to the real world, like a piece of furniture, scaling is irrelevant if the item is placed at an accurate size. Scaling isn’t a remedy for adjusting the distance of an object—making an object larger to make it appear closer, for example, just results in a larger object that's still far away.

Be wary of potentially conflicting gestures. A two-finger pinch gesture, for example, is quite similar to a two-finger rotation gesture. If you implement two similar gestures like this, be sure to test your app and confirm that each is interpreted correctly.

Make sure virtual object movements are smooth. Objects shouldn't appear to jump when the user resizes them, rotates them, or moves them to a new location.

Explore even more engaging methods of interaction. Gestures aren't the only way for people to interact with virtual objects in AR. Your app can use other factors, like motion and proximity, to bring content to life. A game character, for example, could turn its head to look at the user as the user walks toward it.

Reacting to Imagery in the User's Environment

You can enhance an AR experience by using known imagery in the user’s environment to trigger the appearance of virtual content. Your app provides a set of 2D reference images, and ARKit indicates when and where it detects any of those images in the user’s environment. For example, an app might recognize theater posters for a sci-fi film and then have virtual spaceships emerge from the posters and fly around the environment. Or, an app for a retail store could make a virtual character appear to emerge from a store’s front door by recognizing posters placed on either side of the door.

Design and display reference images to optimize detection. When you provide reference images, you specify the physical size at which you expect to find those images in the user’s environment. Providing a more precise size measurement helps ARKit detect images faster and provide more accurate estimates of their real-world position. Detection performance and accuracy are best for flat rectangular images with high contrast and bold details. Avoid trying to detect images that appear on reflective or curved real-world surfaces.

Use detected imagery only as a frame of reference for displaying virtual content. ARKit doesn’t track changes to the position or orientation of detected imagery. Therefore, if you try to place virtual content precisely—like positioning a mustache on a face in a painting—that content may not appear to stay in place.

Limit the number of reference images in use at one time. Image detection performance works best when ARKit looks for 25 or fewer distinct images in the user’s environment. If your use case calls for more than 25 reference images, you can change the set of active reference images based on context. For example, a museum guide app could use Core Location to determine which part of the museum the user is currently in, and then look only for images displayed in that area.
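Swapping the active set might look like the following sketch. The ReferenceImageInfo type and its area tag are hypothetical; in a real app you would rebuild the configuration's detection images from the selected subset and rerun the session.

```swift
// A hypothetical record describing one reference image and where it's displayed.
struct ReferenceImageInfo: Equatable {
    var name: String
    var area: String   // e.g. which wing of the museum displays this image
}

// Select only the images relevant to the user's current context, capped at the
// suggested limit of 25 distinct images.
func activeImages(from all: [ReferenceImageInfo],
                  inArea area: String,
                  limit: Int = 25) -> [ReferenceImageInfo] {
    Array(all.filter { $0.area == area }.prefix(limit))
}
```

Reselecting whenever the context changes (for example, when Core Location reports a new area) keeps detection fast without permanently giving up any imagery.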

For developer guidance, see Recognizing Images in an AR Experience.

Handling Interruptions

Avoid unnecessarily interrupting the AR experience. ARKit can't track device position and orientation when AR isn’t active. One way to avoid interruptions is to let people adjust objects and settings within the experience. For example, if a user places a chair they’re considering purchasing into their living room and that chair is available in different fabrics, allow them to change the fabric without exiting AR.

Use relocalization to recover from other interruptions. ARKit can't track device position and orientation during an interruption, such as the user temporarily switching to another app or accepting a phone call. After the interruption, previously placed virtual objects are likely to appear in the wrong real-world positions. When you enable relocalization, ARKit attempts to recover the information needed to restore those virtual objects to their original real-world positions. This process requires the user to position and orient their device near where it was before the interruption. For developer guidance, see ARSessionObserver.

Consider hiding previously placed virtual objects until relocalization completes. During relocalization, ARKit attempts to reconcile its previous state with new observations of the user environment. Until this process completes, the positions of virtual objects are likely incorrect.

Allow users to cancel relocalization. If the user is unable to position and orient their device near where it was before an interruption, relocalization continues indefinitely without success. Guide the user to resume their session successfully, or provide a Reset button or other way for the user to restart the AR experience in case relocalization does not succeed.
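One lightweight approach is to surface the Reset option only after relocalization has gone on for a while, so users who recover quickly never see it. This is an illustrative sketch; the grace period is an arbitrary value, not an ARKit recommendation.

```swift
// Offer an escape hatch once relocalization has run past a grace period,
// since relocalization can otherwise continue indefinitely without success.
func shouldOfferReset(relocalizing: Bool,
                      elapsedSeconds: Double,
                      gracePeriod: Double = 10) -> Bool {
    relocalizing && elapsedSeconds >= gracePeriod
}
```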

Handling Problems

Allow users to reset the experience if it doesn’t meet their expectations. Don't force people to wait for conditions to improve or struggle with object placement. Give them a way to start over again and see if they have better results.

Suggest possible fixes if problems occur. Analysis of the user's environment and surface detection can fail or take too long for a variety of reasons—insufficient light, an overly reflective surface, a surface without enough detail, or too much camera motion. If your app is notified of these problems, offer suggestions for resolving them.

Problem: Insufficient features detected
Suggestion: Try turning on more lights and moving around.

Problem: Excessive motion detected
Suggestion: Try moving your phone slower.

Problem: Surface detection takes too long
Suggestion: Try moving around, turning on more lights, and making sure your phone is pointed at a sufficiently textured surface.
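The suggestions above can be modeled as a simple mapping from problem state to user-facing text. The enum below is an illustrative stand-in for whatever tracking-state information your session reports.

```swift
// Illustrative problem states; a real app would derive these from the
// session's tracking-state callbacks.
enum TrackingProblem {
    case insufficientFeatures
    case excessiveMotion
    case surfaceDetectionTooLong
}

// Map each problem to a friendly, conversational suggestion.
func suggestion(for problem: TrackingProblem) -> String {
    switch problem {
    case .insufficientFeatures:
        return "Try turning on more lights and moving around."
    case .excessiveMotion:
        return "Try moving your phone slower."
    case .surfaceDetectionTooLong:
        return "Try moving around, turning on more lights, and making sure your phone is pointed at a sufficiently textured surface."
    }
}
```

Centralizing the copy in one function makes it easy to keep the wording friendly and consistent, and to localize it.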

Offer AR features only on capable devices. If your app's primary purpose is AR, make your app available only to devices that support ARKit. If your app offers AR as a secondary feature—like a furniture catalog that includes product photos and allows some products to be viewed in AR—avoid displaying an error if the user tries to enter AR on an unsupported device. If the device doesn't support ARKit, don't present optional AR features in the first place. For developer guidance, see the arkit key in the UIRequiredDeviceCapabilities section of Information Property List Key Reference, and the isSupported property of ARConfiguration.
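For an AR-primary app, the device requirement mentioned above is declared in the app's Info.plist. A minimal fragment might look like this:

```xml
<key>UIRequiredDeviceCapabilities</key>
<array>
    <string>arkit</string>
</array>
```

With this key present, the App Store makes the app available only on ARKit-capable devices. An AR-secondary app should omit the key and instead check isSupported on the relevant ARConfiguration subclass at runtime before showing any AR entry points.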

AR Glyph

Apps can display an AR glyph in controls that launch ARKit-based experiences. You can download this glyph in Resources.

Use the AR glyph as intended. The glyph should be used strictly for initiating an ARKit-based experience. Never alter the glyph (other than adjusting its size and color), use it for other purposes, or use it in conjunction with AR experiences not created using ARKit.

Maintain minimum clear space. The minimum amount of clear space required around an AR glyph is 10% of the glyph's height. Don’t let other elements infringe on this space or occlude the glyph in any way.

AR Badges

Apps that include collections of products or other objects can use badging to identify specific items that can be viewed in AR using ARKit. For example, a department store app might use a badge to mark furniture that people can preview in their home before making a purchase.

Use the AR badges as intended and don’t alter them. You can download AR badges, available in collapsed and expanded form, in Resources. Use these images exclusively to identify products or other objects that can be viewed in AR using ARKit. Never alter the badges, change their color, use them for other purposes, or use them in conjunction with AR experiences not created with ARKit.

[Images: AR badge and glyph-only AR badge]

The AR badge is preferable to the glyph-only badge. In general, use the glyph-only badge when space is constrained and won't accommodate the AR badge. Both badges work well at their default size.

Use badging only when your app contains a mixture of objects that can be viewed in AR and objects that cannot. If all objects in your app can be viewed in AR, then badging is redundant and unnecessary.

Keep badge placement consistent and clear. A badge looks best when displayed in one corner of an object's photo. Always place it in the same corner and make sure it's large enough to be seen clearly (but not so large that it occludes important detail in the photo).

Maintain minimum clear space. The minimum amount of clear space required around an AR badge is 10% of the badge's height. Don't let other elements infringe on this space or occlude the badge in any way.

Learn More

For developer guidance, see ARKit.