Reality Composer

Now anyone can quickly prototype and produce AR content that's ready to integrate into apps with Xcode or to export to AR Quick Look. Reality Composer lets you build animations and interactions on iOS and Mac to enrich your 3D content.

Download Reality Composer

Built-in AR Library

Import your own USDZ files or take advantage of the hundreds of ready-to-use virtual objects in the built-in AR library. This library harnesses the power of procedural content generation for a variety of assets, so you can customize a virtual object’s size, style, and more.

Animations and Audio

Add animations that move and scale virtual objects, or add emphasis with effects like a “wiggle” or “spin.” You can choose for actions to happen when a user taps an object, comes into close proximity to it, or activates some other trigger. You can also take advantage of spatial audio to add a new level of realism to your AR scene.
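In code, a tap trigger like the ones Reality Composer sets up visually can be sketched with RealityKit's hit-testing. This is a minimal, hypothetical example (the scale values and duration are illustrative, not part of any Reality Composer scene):

```swift
import UIKit
import RealityKit

class ARViewController: UIViewController {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Respond to taps on virtual objects, mirroring a "Tap" trigger.
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        arView.addGestureRecognizer(tap)
    }

    @objc func handleTap(_ sender: UITapGestureRecognizer) {
        let point = sender.location(in: arView)
        // entity(at:) returns the topmost entity under the screen point, if any.
        if let entity = arView.entity(at: point) {
            // Illustrative emphasis: briefly scale the entity up as a simple "pop".
            var transform = entity.transform
            transform.scale *= 1.2
            entity.move(to: transform, relativeTo: entity.parent, duration: 0.25)
        }
    }
}
```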

Seamless Tools

Reality Composer is included with Xcode and is also an iOS app, so you can build, test, tune, and simulate AR experiences entirely on iPhone or iPad. And thanks to live linking, you can rapidly move between Mac and iOS to create stunning, complex AR experiences on the devices that work best for you.

Record and Play

With Reality Composer for iOS, you can record sensor and camera data in the location where the AR experience will take place, then replay it later on your iOS device while building your app.

RealityKit

RealityKit is a new high-level framework built from the ground up for augmented reality, with photorealistic rendering, camera effects, animations, physics, and a native Swift API. With ARKit integration, physically based rendering, transform and skeletal animations, spatial audio, and rigid body physics, RealityKit makes AR development faster and easier than ever before.
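A few lines of Swift are enough to see the framework's shape. The sketch below, assuming an iOS app with RealityKit available, anchors a simple box to the first detected horizontal surface:

```swift
import RealityKit

// Minimal sketch: place a small box on a detected horizontal plane.
let arView = ARView(frame: .zero)

// A model entity pairs a mesh with one or more materials.
let box = ModelEntity(
    mesh: .generateBox(size: 0.1),  // 10 cm cube
    materials: [SimpleMaterial(color: .blue, isMetallic: false)]
)

// AnchorEntity ties virtual content to real-world tracking data from ARKit.
let anchor = AnchorEntity(plane: .horizontal)
anchor.addChild(box)
arView.scene.addAnchor(anchor)
```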

Learn more about RealityKit

World-class Rendering

RealityKit seamlessly blends virtual content with the real world using realistic physically based materials, environment reflections, grounding shadows, camera noise, motion blur, and more, making virtual content nearly indistinguishable from reality.
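Physically based materials are expressed directly in the API. In this sketch, the roughness and metallic parameters control how strongly the engine blends environment reflections into the surface (the specific values are illustrative):

```swift
import RealityKit

// A physically based material: low roughness + metallic = mirror-like reflections.
let metal = SimpleMaterial(color: .gray, roughness: 0.15, isMetallic: true)

let sphere = ModelEntity(
    mesh: .generateSphere(radius: 0.05),
    materials: [metal]
)
// Environment reflections, grounding shadows, and camera effects such as
// noise and motion blur are applied automatically when rendered in an ARView.
```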

Scalable Performance

Utilizing the latest Metal features to get the most out of the GPU, RealityKit takes full advantage of CPU caches and multiple cores to deliver incredibly fluid visuals and physics simulations. And because it automatically scales the performance of an AR experience to each iOS device, you only need to build a single AR experience.

Swift API

Easy to use yet incredibly powerful, RealityKit leverages Swift’s rich language features to expose its full feature set with minimal code, so you can build AR experiences even more quickly, without the need for boilerplate.
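The entity-component style of the API illustrates this brevity: capabilities are value-type components you attach with a line or two, no subclassing or setup code required. A small sketch, assuming a scene with collisions enabled:

```swift
import RealityKit

// Entities gain capabilities by attaching components.
let ball = ModelEntity(mesh: .generateSphere(radius: 0.05))

// One call derives collision shapes from the entity's mesh hierarchy.
ball.generateCollisionShapes(recursive: true)

// Opt in to rigid-body physics by adding a single component.
ball.components[PhysicsBodyComponent.self] = PhysicsBodyComponent()
```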

Shared AR Experiences

RealityKit simplifies building shared AR experiences by taking on the hard work of networking, such as maintaining consistent state, optimizing network traffic, handling packet loss, and transferring ownership.
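Opting in is a single assignment on the scene. In this sketch, `session` is assumed to be an already-configured Multipeer Connectivity session shared by the participating devices:

```swift
import MultipeerConnectivity
import RealityKit

// Hedged sketch: enable RealityKit's built-in entity synchronization.
// `session` is assumed to be an MCSession you have already set up.
func enableSharing(for arView: ARView, using session: MCSession) throws {
    arView.scene.synchronizationService =
        try MultipeerConnectivityService(session: session)
    // From here, RealityKit keeps entity state consistent across peers
    // and handles traffic optimization, packet loss, and ownership transfers.
}
```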