RealityKit

The RealityKit framework was built from the ground up specifically for augmented reality, with photo-realistic rendering, camera effects, animations, physics, and more. With native Swift APIs, ARKit integration, incredibly realistic physically based rendering, transform and skeletal animations, spatial audio, and rigid body physics, RealityKit makes AR development faster and easier than ever before.

What’s new

With RealityKit 4, you can build for iOS, iPadOS, macOS, and visionOS — all at once.

RealityKit 4 aligns its rich feature set across iPhone, iPad, Mac, and Apple Vision Pro. Reality Composer Pro, a new tool introduced with Apple Vision Pro, supports development of spatial apps on all of these platforms. Shaders built with MaterialX, portals, particles, and many other features can now be used with RealityView on all four platforms. This includes APIs for adding materials, shader-based hover effects, and virtual lighting, as well as new features that expand character animation, such as blend shapes, inverse kinematics, skeletal poses, and animation timelines.
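
Below is a minimal sketch of how a single RealityView can host RealityKit content from SwiftUI across platforms; the view name and the sphere it builds are illustrative rather than taken from a sample project.

```swift
import SwiftUI
import RealityKit

// A minimal sketch of cross-platform RealityKit content in SwiftUI.
struct SphereView: View {
    var body: some View {
        RealityView { content in
            // Build a simple entity with a physically based material and
            // add it to the view's content; with RealityKit 4 the same
            // code runs on iOS, iPadOS, macOS, and visionOS.
            let sphere = ModelEntity(
                mesh: .generateSphere(radius: 0.1),
                materials: [SimpleMaterial(color: .blue, isMetallic: true)]
            )
            content.add(sphere)
        }
    }
}
```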

RealityKit 4 also provides more direct access to rendering, with new low-level mesh and texture APIs that interoperate with Metal compute shaders. And because Xcode view debugging now supports 3D scene content, it’s easier than ever to inspect and debug your RealityKit scenes.

Object Capture

Turn photos from your iPhone or iPad into photo-realistic 3D models, optimized for AR, in minutes using the new Object Capture API on macOS. Object Capture uses photogrammetry to turn a series of pictures taken on iPhone or iPad into a 3D model that can be viewed instantly in AR Quick Look or integrated into your Xcode project.
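
The sketch below shows the general shape of the photogrammetry API, assuming a folder of captured photos already on disk; the input and output paths are placeholders.

```swift
import Foundation
import RealityKit

// A minimal sketch of the Object Capture photogrammetry API on macOS.
func reconstructModel() throws {
    let imagesFolder = URL(fileURLWithPath: "/path/to/captured-images", isDirectory: true)
    let outputURL = URL(fileURLWithPath: "/path/to/model.usdz")

    var configuration = PhotogrammetrySession.Configuration()
    configuration.sampleOrdering = .unordered     // photos need not be in capture order
    configuration.featureSensitivity = .normal

    let session = try PhotogrammetrySession(input: imagesFolder,
                                            configuration: configuration)

    // Watch the session's output stream for progress and the finished model.
    Task {
        for try await output in session.outputs {
            switch output {
            case .requestProgress(_, let fractionComplete):
                print("Progress: \(fractionComplete)")
            case .requestComplete(_, .modelFile(let url)):
                print("Model written to \(url)")
            case .processingComplete:
                print("Reconstruction finished")
            default:
                break
            }
        }
    }

    // Request a reduced-detail USDZ, a good fit for AR Quick Look.
    try session.process(requests: [.modelFile(url: outputURL, detail: .reduced)])
}
```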


Custom shaders

RealityKit seamlessly blends virtual content with the real world using realistic, physically based materials, environment reflections, grounding shadows, camera noise, motion blur, and more to make virtual content nearly indistinguishable from reality. RealityKit gives you more control over the rendering pipeline with custom render targets and materials, so you can fine-tune the look and feel of your AR objects and scene.
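
One way to hook into this pipeline is a custom surface shader; the sketch below assumes a hypothetical [[visible]] Metal function named WavySurface compiled into the app’s default Metal library.

```swift
import Metal
import RealityKit

// A minimal sketch of plugging a custom Metal surface shader into
// RealityKit's physically based rendering via CustomMaterial.
func applyCustomShader(to entity: ModelEntity) throws {
    guard let device = MTLCreateSystemDefaultDevice(),
          let library = device.makeDefaultLibrary() else { return }

    // "WavySurface" is a hypothetical shader function name.
    let surfaceShader = CustomMaterial.SurfaceShader(named: "WavySurface", in: library)
    let material = try CustomMaterial(surfaceShader: surfaceShader, lightingModel: .lit)

    // Replace the entity's materials so the shader drives its final appearance.
    entity.model?.materials = [material]
}
```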

Custom systems

Build your own components and systems on top of RealityKit’s Entity Component System to organize the assets in your AR scene and move more complex functionality into the system layer.
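
A minimal sketch of a custom component and system follows; the SpinComponent and SpinSystem names are illustrative.

```swift
import RealityKit
import simd

// A component holding per-entity data: how fast this entity should spin.
struct SpinComponent: Component {
    var radiansPerSecond: Float = .pi
}

// A system that updates every entity carrying a SpinComponent each frame.
final class SpinSystem: System {
    private static let query = EntityQuery(where: .has(SpinComponent.self))

    init(scene: Scene) { }

    func update(context: SceneUpdateContext) {
        let dt = Float(context.deltaTime)
        for entity in context.scene.performQuery(Self.query) {
            guard let spin = entity.components[SpinComponent.self] else { continue }
            entity.transform.rotation *= simd_quatf(angle: spin.radiansPerSecond * dt,
                                                    axis: [0, 1, 0])
        }
    }
}

// Register once, early in the app's life cycle:
//   SpinComponent.registerComponent()
//   SpinSystem.registerSystem()
```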

Object occlusion

By combining depth information from the LiDAR Scanner with edge detection, RealityKit lets virtual objects interact with your physical surroundings just as you’d expect. Place virtual objects under tables, behind walls, or around corners, and you’ll see only the parts you’d expect to see, with a crisp boundary wherever a physical object hides part of the virtual one.
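
The sketch below shows one way to enable this behavior on a LiDAR-equipped device, pairing ARKit scene reconstruction with RealityKit’s scene-understanding occlusion option.

```swift
import ARKit
import RealityKit

// A minimal sketch of enabling LiDAR-driven occlusion for an ARView.
func enableOcclusion(in arView: ARView) {
    let configuration = ARWorldTrackingConfiguration()

    // Scene reconstruction requires a device with a LiDAR Scanner.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }
    arView.session.run(configuration)

    // Hide the parts of virtual objects that real-world geometry covers.
    arView.environment.sceneUnderstanding.options.insert(.occlusion)
}
```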

Video texture

You can now apply video textures to any part of your scene in RealityKit, bringing objects, surfaces, and even characters to life: animate a virtual TV screen to play a movie, or make a virtual character smile.
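
A minimal sketch of a video-textured surface follows; the promo.mp4 resource name is a placeholder.

```swift
import AVFoundation
import RealityKit

// A minimal sketch of mapping video onto a mesh with VideoMaterial.
func makeVideoScreen() -> ModelEntity? {
    guard let url = Bundle.main.url(forResource: "promo", withExtension: "mp4") else {
        return nil
    }
    let player = AVPlayer(url: url)
    let material = VideoMaterial(avPlayer: player)

    // A thin plane that plays the video like a virtual TV screen.
    let screen = ModelEntity(mesh: .generatePlane(width: 1.6, height: 0.9),
                             materials: [material])
    player.play()
    return screen
}
```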

Swift API

Easy to use yet incredibly powerful, RealityKit uses the rich language features of Swift to expose its full feature set, so you can build AR experiences even more quickly and without boilerplate code.
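
For example, a complete, anchored AR object takes only a few lines; the sketch below assumes an existing ARView elsewhere in your app.

```swift
import RealityKit

// A minimal sketch: a complete, anchored AR object in a few lines of Swift.
func addBox(to arView: ARView) {
    let box = ModelEntity(
        mesh: .generateBox(size: 0.1),
        materials: [SimpleMaterial(color: .orange, isMetallic: false)]
    )

    // Anchor the box to the first horizontal plane ARKit finds.
    let anchor = AnchorEntity(plane: .horizontal)
    anchor.addChild(box)
    arView.scene.addAnchor(anchor)
}
```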

Dynamic assets

Customize how assets load and update at runtime for more flexibility in tailoring your RealityKit-based experiences, for example by programmatically changing an image or mesh with every frame.
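
The sketch below builds mesh geometry in code as one illustration of a dynamic asset; regenerating the mesh with new data each frame follows the same pattern.

```swift
import RealityKit

// A minimal sketch of generating mesh geometry programmatically.
func makeTriangleEntity() throws -> ModelEntity {
    var descriptor = MeshDescriptor(name: "triangle")
    let positions: [SIMD3<Float>] = [[0, 0, 0], [0.1, 0, 0], [0.05, 0.1, 0]]
    descriptor.positions = MeshBuffers.Positions(positions)
    descriptor.primitives = .triangles([0, 1, 2])

    // Generate a MeshResource from the descriptor and wrap it in an entity.
    let mesh = try MeshResource.generate(from: [descriptor])
    return ModelEntity(mesh: mesh,
                       materials: [SimpleMaterial(color: .green, isMetallic: false)])
}
```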

Character Controller

Easily create player-controlled characters using this powerful Swift API, so your users can jump, scale, and explore the AR worlds and RealityKit-based games you create.
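
A minimal sketch of the character controller follows; the capsule dimensions and movement vector are illustrative values.

```swift
import RealityKit

// A minimal sketch of a player-controlled character.
func configurePlayer(_ player: Entity) {
    // The character controller wraps the player in a collision capsule.
    player.components.set(CharacterControllerComponent(radius: 0.25, height: 1.8))
}

func movePlayerForward(_ player: Entity, deltaTime: Float) {
    // RealityKit resolves collisions, steps, and slopes during the move.
    let step = SIMD3<Float>(0, 0, -1.5) * deltaTime
    player.moveCharacter(by: step, deltaTime: deltaTime, relativeTo: nil)
}
```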

Scalable performance

Utilizing the latest Metal features to get the most out of the GPU, RealityKit takes full advantage of CPU caches and multiple cores to deliver incredibly fluid visuals and physics simulations. And because it automatically scales performance to each iPhone and iPad, you only need to build your AR experience once.

Shared AR experiences

RealityKit simplifies building shared AR experiences by taking on the hard work of networking, such as maintaining consistent state, optimizing network traffic, handling packet loss, and performing ownership transfers.
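
The sketch below enables RealityKit’s built-in synchronization over Multipeer Connectivity, assuming the MCSession and peer discovery are set up elsewhere in the app.

```swift
import MultipeerConnectivity
import RealityKit

// A minimal sketch of turning on RealityKit's scene synchronization.
func enableSharedExperience(arView: ARView, multipeerSession: MCSession) throws {
    arView.scene.synchronizationService =
        try MultipeerConnectivityService(session: multipeerSession)
    // From here, entity state and ownership replicate to connected peers
    // through each entity's SynchronizationComponent.
}
```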