Reality Composer

Now anyone can quickly prototype and produce content for AR experiences that are ready to integrate into apps using Xcode or export to AR Quick Look. Reality Composer lets you build animations and interactions on iPhone, iPad, and Mac to enrich your 3D content.

Built-in AR Library

Import your own USDZ files or take advantage of the hundreds of ready-to-use virtual objects in the built-in AR library. This library harnesses the power of procedural content generation for a variety of assets, so you can customize a virtual object’s size, style, and more.

Animations and Audio

Add animations that let you move, scale, and add emphasis like a “wiggle” or “spin” to virtual objects. You can choose for actions to happen when a user taps an object, comes in close proximity with it, or activates some other trigger. You can also take advantage of spatial audio to add a new level of reality to your AR scene.
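Triggers authored in Reality Composer surface in Swift through code that Xcode generates from your project file. A minimal sketch of reacting to a tap trigger — note that `Experience`, `loadBox`, and `boxTapped` are hypothetical generated names derived from an `Experience.rcproject` file and the scene and behavior names you authored, so substitute your own:

```swift
import UIKit
import RealityKit

class ARViewController: UIViewController {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        do {
            // Load the scene authored in Reality Composer.
            let scene = try Experience.loadBox()
            arView.scene.anchors.append(scene)

            // Runs when the "Notify" action attached to the tap trigger fires.
            scene.actions.boxTapped.onAction = { entity in
                print("Tapped: \(entity?.name ?? "unnamed entity")")
            }
        } catch {
            print("Failed to load scene: \(error)")
        }
    }
}
```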

Seamless Tools

Reality Composer is included with Xcode and is also available as an iOS and iPadOS app, so you can build, test, tune, and simulate AR experiences entirely on iPhone or iPad. And thanks to live linking, you can rapidly move between platforms to create stunning, complex AR experiences on the devices that work best for you.

Record and Play

With Reality Composer for iOS, you can record sensor and camera data in the location where the AR experience will take place, then replay it later on your iOS device while building your app.


RealityKit

This brand-new, high-level framework was built from the ground up specifically for augmented reality, and it features a native Swift API. With ARKit integration, incredibly realistic physics-based rendering, camera effects, transform and skeletal animations, spatial audio, and rigid body physics, RealityKit makes AR development faster and easier than ever before.
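A minimal sketch of what a RealityKit scene looks like in Swift — here placing a small metallic box on the first horizontal plane ARKit detects. The size and color are purely illustrative:

```swift
import UIKit
import RealityKit

func makeARView(frame: CGRect) -> ARView {
    let arView = ARView(frame: frame)

    // A 10 cm cube with a simple physically based material.
    let box = ModelEntity(
        mesh: .generateBox(size: 0.1),
        materials: [SimpleMaterial(color: .systemBlue,
                                   roughness: 0.15,
                                   isMetallic: true)]
    )

    // Anchor the box to the first horizontal plane ARKit finds.
    let anchor = AnchorEntity(plane: .horizontal)
    anchor.addChild(box)
    arView.scene.addAnchor(anchor)

    return arView
}
```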

Learn more about RealityKit

World-class Rendering

RealityKit seamlessly blends virtual content with the real world using realistic physically-based materials, environment reflections, grounding shadows, camera noise, motion blur, and more, making virtual content nearly indistinguishable from reality.

Scalable Performance

Utilizing the latest Metal features to get the most out of the GPU, RealityKit takes full advantage of CPU caches and multiple cores to deliver incredibly fluid visuals and physics simulations. And because it automatically scales the performance of an AR experience to each iPhone or iPad, you only need to build a single AR experience.

Swift API

Easy to use yet incredibly powerful, RealityKit uses Swift’s rich language features to automatically provide the full feature set so you can build AR experiences even more quickly, without the need for boilerplate code.

Export to USDZ

Reality Composer now supports export to USDZ, including all animations, anchors, and spatial audio authored in Reality Composer.

Shared AR Experiences

RealityKit simplifies building shared AR experiences by taking on the hard work of networking, such as maintaining consistent state, optimizing network traffic, handling packet loss, and performing ownership transfers.
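In code, this comes down to attaching a synchronization service to the scene. A sketch using RealityKit's `MultipeerConnectivityService` over a MultipeerConnectivity session — peer discovery (advertising and browsing) is omitted here and would be needed in a real app:

```swift
import UIKit
import MultipeerConnectivity
import RealityKit

func enableSharing(for arView: ARView) throws {
    let peerID = MCPeerID(displayName: UIDevice.current.name)
    let session = MCSession(peer: peerID,
                            securityIdentity: nil,
                            encryptionPreference: .required)

    // RealityKit now keeps entity state consistent across all connected
    // peers automatically, handling traffic and ownership under the hood.
    arView.scene.synchronizationService =
        try MultipeerConnectivityService(session: session)
}
```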


Reality Composer for iOS, iPadOS, and macOS makes it easy to build, test, tune, and simulate AR experiences for iPhone or iPad. With live linking, you can rapidly move between Mac and iPhone or Mac and iPad to create stunning AR experiences, then export them to AR Quick Look or integrate them into your app with Xcode.

Xcode 11

Reality Composer for macOS is bundled with Xcode 11, which is available on the Mac App Store.

View on the Mac App Store

Reality Composer

Reality Composer for iOS and iPadOS is available on the App Store.

View on the App Store