Discover visionOS

All-new platform. Familiar frameworks and tools. Get ready to design and build an entirely new universe of apps and games for Apple Vision Pro.

What’s new

Volumetric APIs

Create apps with richer spatial experiences that take full advantage of depth and space, and that can run side by side with other apps in the Shared Space. You can now resize volumes using the SwiftUI scene modifier windowResizability. Volumes can also have a fixed or dynamic scale, so a 3D object that moves away from the user either appears constant in size or gets smaller with distance. And ornaments can now be affixed to volumes.
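These modifiers combine in a single SwiftUI scene declaration. A minimal sketch, assuming a 3D asset named "Globe" in your app bundle (the scene id, sizes, and ornament text are illustrative):

```swift
import SwiftUI
import RealityKit

struct GlobeVolume: Scene {
    var body: some Scene {
        WindowGroup(id: "globe") {
            // Model3D loads a 3D asset; "Globe" is a hypothetical asset name.
            Model3D(named: "Globe")
                // Affix an ornament to the bottom edge of the volume.
                .ornament(attachmentAnchor: .scene(.bottom)) {
                    Text("Earth")
                }
        }
        // Display the window as a volume rather than a 2D plane.
        .windowStyle(.volumetric)
        .defaultSize(width: 0.6, height: 0.6, depth: 0.6, in: .meters)
        // Let people resize the volume, constrained by its content.
        .windowResizability(.contentSize)
    }
}
```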


TabletopKit

This new framework makes it easy to build collaborative experiences centered around a table. TabletopKit handles the manipulation of cards and pieces, establishes placement and layout, and defines game boards.

Enterprise APIs

New APIs for visionOS grant enhanced sensor access and increased control, so you can create more powerful enterprise solutions and spatial experiences. Access the main camera, spatial barcode and QR code scanning, the Apple Neural Engine, and more.


Updates to inputs on Apple Vision Pro let you decide whether the user’s hands appear in front of or behind your digital content.

Additional features

Scene understanding capabilities have been greatly extended. Planes can now be detected in all orientations, and support anchoring objects to surfaces in your surroundings. Room Anchors consider the user’s surroundings on a per-room basis. And the new Object Tracking API for visionOS lets you attach content to individual objects around the user.
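A sketch of the plane-detection flow using the visionOS ARKit API. This assumes a Full Space is open and the user has granted world-sensing permission; the `.slanted` alignment reflects the expanded orientation support described above:

```swift
import ARKit

func detectPlanes() async throws {
    let session = ARKitSession()
    // Detect planes in all orientations: horizontal, vertical, and slanted.
    let planeData = PlaneDetectionProvider(alignments: [.horizontal, .vertical, .slanted])
    try await session.run([planeData])

    // Stream plane anchors as they are added, updated, or removed.
    for await update in planeData.anchorUpdates {
        print("Plane \(update.anchor.id): \(update.anchor.classification)")
    }
}
```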

A spectrum of immersion

Apple Vision Pro offers an infinite spatial canvas to explore, experiment, and play, giving you the freedom to completely rethink your experience in 3D. People can interact with your app while staying connected to their surroundings, or immerse themselves completely in a world of your creation. And your experiences can be fluid: start in a window, bring in 3D content, transition to a fully immersive scene, and come right back.

The choice is yours, and it all starts with the building blocks of spatial computing in visionOS.


Windows

You can create one or more windows in your visionOS app. They’re built with SwiftUI and contain traditional views and controls, and you can add depth to your experience with 3D content.
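A minimal window scene looks just like it does on other SwiftUI platforms (the app and view contents here are illustrative):

```swift
import SwiftUI

@main
struct HelloVisionApp: App {
    var body: some Scene {
        // A standard SwiftUI window, shown as a 2D plane in the person's space.
        WindowGroup {
            VStack(spacing: 12) {
                Text("Hello, visionOS")
                Button("Say hi") { print("Hi!") }
            }
            .padding()
        }
    }
}
```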


Volumes

Add depth to your app with a 3D volume. Volumes are SwiftUI scenes that can showcase 3D content using RealityKit or Unity, creating experiences that are viewable from any angle in the Shared Space or an app’s Full Space.
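A volume is declared by applying the volumetric window style to a SwiftUI scene. A hedged sketch that fills the volume with RealityKit content generated in code:

```swift
import SwiftUI
import RealityKit

struct SphereVolume: Scene {
    var body: some Scene {
        WindowGroup {
            // RealityKit content, viewable from any angle in the volume.
            RealityView { content in
                let sphere = ModelEntity(
                    mesh: .generateSphere(radius: 0.15),
                    materials: [SimpleMaterial(color: .blue, isMetallic: false)]
                )
                content.add(sphere)
            }
        }
        // Render this window group as a 3D volume.
        .windowStyle(.volumetric)
    }
}
```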


Spaces

By default, apps launch into the Shared Space, where they exist side by side — much like multiple apps on a Mac desktop. Apps can use windows and volumes to show content, and the user can reposition these elements wherever they like. For a more immersive experience, an app can open a dedicated Full Space where only that app’s content will appear. Inside a Full Space, an app can use windows and volumes, create unbounded 3D content, open a portal to a different world, or even fully immerse people in an environment.
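This window-to-immersion progression can be declared with SwiftUI’s ImmersiveSpace scene type. A sketch; the ids and view names are hypothetical:

```swift
import SwiftUI

@main
struct WorldApp: App {
    @State private var style: ImmersionStyle = .mixed

    var body: some Scene {
        // Start in a window in the Shared Space.
        WindowGroup {
            LaunchView()
        }

        // A dedicated Full Space where only this app's content appears.
        ImmersiveSpace(id: "solar-system") {
            SolarSystemView()
        }
        .immersionStyle(selection: $style, in: .mixed, .full)
    }
}

// Hypothetical placeholder views.
struct LaunchView: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace
    var body: some View {
        Button("Enter the solar system") {
            Task { await openImmersiveSpace(id: "solar-system") }
        }
    }
}

struct SolarSystemView: View {
    var body: some View { Text("Unbounded 3D content goes here") }
}
```

The `openImmersiveSpace` environment action transitions from the Shared Space into the Full Space; a matching `dismissImmersiveSpace` action brings the person right back.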

Apple frameworks — extended for spatial computing


SwiftUI

Whether you’re creating windows, volumes, or spatial experiences, SwiftUI is the best way to build a new visionOS app or bring your existing iPadOS or iOS app to the platform. With all-new 3D capabilities and support for depth, gestures, effects, and immersive scene types, SwiftUI can help you build beautiful and compelling apps for Apple Vision Pro. RealityKit is also deeply integrated with SwiftUI to help you build sharp, responsive, and volumetric interfaces. SwiftUI also works seamlessly with UIKit to help you build apps for visionOS.


RealityKit

Present 3D content, animations, and visual effects in your app with RealityKit, Apple’s 3D rendering engine. RealityKit can automatically adjust to physical lighting conditions and cast shadows, open portals to a different world, build stunning visual effects, and so much more. And for authoring your materials, RealityKit has adopted MaterialX, an open standard for specifying surface and geometry shaders used by leading film, visual effects, entertainment, and gaming companies.
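MaterialX shader graphs authored in Reality Composer Pro can be loaded at runtime as a `ShaderGraphMaterial`. A hedged sketch, where the material path, scene file, and `realityKitContentBundle` all come from a hypothetical Reality Composer Pro package added to your project:

```swift
import RealityKit
import RealityKitContent  // hypothetical package generated by Reality Composer Pro

func applyCustomMaterial(to entity: ModelEntity) async throws {
    // Load a MaterialX shader graph authored in Reality Composer Pro.
    let material = try await ShaderGraphMaterial(
        named: "/Root/GlowMaterial",  // hypothetical material path
        from: "Scene.usda",           // hypothetical scene file
        in: realityKitContentBundle
    )
    entity.model?.materials = [material]
}
```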


ARKit

On Apple Vision Pro, ARKit can fully understand a person’s surroundings, giving your apps new ways to interact with the space around them. By default, ARKit powers core system capabilities that your apps automatically benefit from when they’re in the Shared Space — but when your app moves to a Full Space and asks permission, you can take advantage of powerful ARKit APIs, like Plane Estimation, Scene Reconstruction, Image Anchoring, World Tracking, and Skeletal Hand Tracking. So splash water on a wall. Bounce a ball off the floor. Make experiences that wow people by blending the real world with your content.
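The Skeletal Hand Tracking flow in a Full Space might look like this sketch (it requires the user’s permission; the type and joint names follow the visionOS ARKit API):

```swift
import ARKit

func trackHands() async throws {
    let session = ARKitSession()
    let hands = HandTrackingProvider()
    try await session.run([hands])

    // Stream skeletal hand anchors as they update.
    for await update in hands.anchorUpdates {
        let anchor = update.anchor
        if let indexTip = anchor.handSkeleton?.joint(.indexFingerTip) {
            // The joint's transform, relative to the hand anchor.
            print(anchor.chirality, indexTip.anchorFromJointTransform)
        }
    }
}
```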


Accessibility

visionOS is designed with accessibility in mind for people who want to interact with their device entirely with their eyes, voice, or a combination of both. And for people who prefer a different way to navigate content, Pointer Control lets them select their index finger, wrist, or head as an alternative pointer. You can create accessible apps for visionOS using the same techniques and tools you already use on other Apple platforms and help make Apple Vision Pro a great experience for everyone.
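The familiar accessibility modifiers from other Apple platforms carry over directly; for example (the view and strings are illustrative):

```swift
import SwiftUI

struct GalleryItemView: View {
    var body: some View {
        // The same accessibility modifiers used on iOS and iPadOS
        // describe this element to assistive technologies on visionOS.
        Image(systemName: "photo")
            .accessibilityLabel("Gallery photo")
            .accessibilityAddTraits(.isButton)
            .accessibilityHint("Double tap to open")
    }
}
```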

All the tools you need


Xcode

Development for visionOS starts with Xcode, which supports the visionOS SDK. Add a visionOS target to your existing project or build an entirely new app. Iterate on your app in Xcode Previews. Interact with your app in the all-new visionOS simulator and explore various room layouts and lighting conditions. Create tests and visualizations to explore collisions, occlusions, and scene understanding for your spatial content.

Download the latest version of Xcode

Reality Composer Pro

Discover the all-new Reality Composer Pro, designed to make it easy to preview and prepare 3D content for your visionOS apps. Available with Xcode, Reality Composer Pro can help you import and organize assets, such as 3D models, materials, and sounds. Best of all, it integrates tightly with the Xcode build process to preview and optimize your visionOS assets.


Unity

Now you can use Unity’s robust and familiar authoring tools to create new apps and games or reimagine your existing Unity-created projects for visionOS. Your apps get access to all the benefits of visionOS, like passthrough and Dynamically Foveated Rendering, in addition to familiar Unity features, like AR Foundation. By combining Unity’s authoring and simulation capabilities with RealityKit-managed app rendering, content created with Unity looks and feels at home in visionOS.

Learn more

Your visionOS journey begins here

Start developing with the visionOS SDK, Xcode, Simulator, Reality Composer Pro, documentation, sample code, design guidance, and more.

visionOS Pathway

Pathways are simple and easy-to-navigate collections of the videos, documentation, and resources you’ll need to start building great apps and games.

Get started

Submit your app

Whether you’ve created a new visionOS app or are making your existing iPad or iPhone app available on Apple Vision Pro, here’s everything you need to know to prepare and submit your app to the App Store.

Submit your app

Work with Apple

Get direct support from Apple as you develop your apps and games for visionOS. Learn about upcoming events, testing opportunities, and other programs to support you as you create incredible experiences for this platform.

Learn about working with Apple