Get ready for iOS 18

Create more customized apps that feel great on iOS and appear in more places across the system with controls, widgets, and Live Activities. And with Apple Intelligence, you can bring personal intelligence into your apps to deliver new capabilities — all with great performance and built-in privacy.

Want the highlights? Download the iOS one-sheet

Apple Intelligence

Apple Intelligence is the personal intelligence system that puts powerful generative models right at the core of your iPhone, iPad, and Mac. It powers incredible new features to help people communicate, work, and express themselves.

Writing Tools are available system-wide and help users rewrite, proofread, and summarize text. The Image Playground API delivers an easy-to-use experience where people can create fun, playful images right in your app. Genmoji brings fun new ways to communicate by letting people create an emoji for any occasion. And Siri will be able to access text displayed in your app and take hundreds of new actions in and across apps using the App Intents you make available.

Learn about Apple Intelligence

Watch more of the latest videos

App Intents

New orchestration capabilities provided by Apple Intelligence and significant enhancements to App Intents will enable Siri to take hundreds of new actions in and across apps. Using the Transferable API, File Representations, Item Providers, Spotlight Index, and more, you can make your entities more meaningful to the platform, and surface that functionality in Siri and Spotlight. You can also explore new APIs for error handling, deferred properties, and associated enums.
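As a minimal sketch of the App Intents shape this describes — the entity, query, and intent names (`NoteEntity`, `NoteQuery`, `OpenNoteIntent`) are hypothetical, not part of the SDK:

```swift
import AppIntents

// Hypothetical entity representing a note in your app.
struct NoteEntity: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Note"
    static var defaultQuery = NoteQuery()

    var id: UUID
    var title: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title)")
    }
}

// Query that lets the system resolve entities by identifier.
struct NoteQuery: EntityQuery {
    func entities(for identifiers: [UUID]) async throws -> [NoteEntity] {
        // Look the notes up in your model layer.
        []
    }
    func suggestedEntities() async throws -> [NoteEntity] { [] }
}

// An action Siri and Shortcuts can invoke in your app.
struct OpenNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Note"
    static var openAppWhenRun = true

    @Parameter(title: "Note")
    var note: NoteEntity

    func perform() async throws -> some IntentResult {
        // Navigate to the note in your UI here.
        return .result()
    }
}
```

Exposing entities and intents this way is what makes actions addressable by Siri and discoverable in Spotlight.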



Controls

Now it’s faster and easier for people to complete frequent tasks from your apps with new controls in Control Center and on the Lock Screen. Controls can toggle a setting, execute an action, or deep link right to a specific experience — and you can create them with just a few lines of code and the new Controls API.
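A control is built much like a widget. Here's a sketch of a Control Center toggle, assuming a hypothetical `ToggleTimerIntent` and the `com.example.timer` kind string:

```swift
import AppIntents
import SwiftUI
import WidgetKit

// Hypothetical intent that starts or stops a timer in the app.
struct ToggleTimerIntent: SetValueIntent {
    static var title: LocalizedStringResource = "Toggle Timer"

    @Parameter(title: "Running")
    var value: Bool

    func perform() async throws -> some IntentResult {
        // Update the timer state in your app's model layer here.
        return .result()
    }
}

// A toggle control for Control Center and the Lock Screen.
struct TimerControl: ControlWidget {
    var body: some ControlWidgetConfiguration {
        StaticControlConfiguration(kind: "com.example.timer") {
            ControlWidgetToggle(
                "Work Timer",
                isOn: false, // read the real state from shared storage in practice
                action: ToggleTimerIntent()
            ) { isOn in
                Label(isOn ? "Running" : "Stopped", systemImage: "timer")
            }
        }
    }
}
```

The control's action is an App Intent, so the same logic you expose to Siri and Shortcuts can power the toggle.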

Learn about widgets

Machine learning

Core ML

Updates to Core ML will help you optimize and run advanced generative machine learning and AI models on device faster and more efficiently. Core ML Tools offer more granular and composable weight compression techniques to help you bring your large language models and diffusion models to Apple silicon. Models can now hold multiple functions and efficiently manage state, enabling more flexible and efficient execution of large language models and adapters. The Core ML framework also adds a new MLTensor type that provides an efficient, simple, and familiar API for expressing operations on multi-dimensional arrays. And Core ML performance reports in Xcode have been updated to provide more insight into support and estimated cost of each operation in your model.
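A brief sketch of what MLTensor usage can look like, assuming the `MLTensor(shape:scalars:scalarType:)` initializer and operator set introduced at WWDC24:

```swift
import CoreML

// MLTensor operations are expressed eagerly and evaluated asynchronously.
func tensorDemo() async {
    let a = MLTensor(shape: [2, 2], scalars: [1, 2, 3, 4], scalarType: Float.self)
    let b = MLTensor(shape: [2, 2], scalars: [5, 6, 7, 8], scalarType: Float.self)

    let sum = a + b            // elementwise addition
    let product = a.matmul(b)  // matrix multiplication

    // Materialize the result back into a shaped array on the CPU.
    let values = await sum.shapedArray(of: Float.self)
    print(values.scalars)
    _ = product
}
```

The familiar array-style operators make pre- and post-processing around generative models easier to express without dropping down to raw buffers.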

Create ML

Object tracking, the first spatial computing template, is designed to help you track real-world objects in your visionOS app. Enhance your customized model training workflow with the new data preview functionality in the Create ML app, and use new Swift APIs from Create ML Components to create time series models directly within your app.

Machine learning APIs

The new Translation framework allows you to translate text across different languages in your app. The Vision framework API has been redesigned to leverage modern Swift features, and also supports two new features: image aesthetics and holistic body pose. And the Natural Language framework offers extended language support with multilingual contextual embedding.
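As a sketch of the Translation framework in SwiftUI — the view and the English-to-German language pair are illustrative, assuming the configuration-based `translationTask` modifier:

```swift
import SwiftUI
import Translation

struct TranslatableView: View {
    @State private var original = "Hello, world"
    @State private var translated = ""

    var body: some View {
        Text(translated.isEmpty ? original : translated)
            // Runs when the view appears with this configuration.
            .translationTask(
                TranslationSession.Configuration(
                    source: Locale.Language(identifier: "en"),
                    target: Locale.Language(identifier: "de")
                )
            ) { session in
                // The session may prompt to download language assets first.
                if let response = try? await session.translate(original) {
                    translated = response.targetText
                }
            }
    }
}
```

Translation happens on device, so no text leaves the user's iPhone, iPad, or Mac.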

RealityKit 4

RealityKit 4 aligns its rich feature set across iPhone, iPad, Mac, and Apple Vision Pro. Reality Composer Pro, a new tool that launched with Apple Vision Pro, enables development of spatial apps on all these platforms.

Portals, particles, shaders built with MaterialX, and many other features can now be used with RealityView on all four platforms. This includes APIs for adding materials, shader-based hover effects, and virtual lighting, as well as new features — like blend shapes, inverse kinematics, skeletal poses, and animation timelines — that expand character animation capabilities.

RealityKit 4 also provides more direct access to rendering with new APIs for low-level mesh and textures, which work with Metal compute shaders. And because Xcode view debugging now supports inspecting 3D scene content, it’s easier than ever to inspect and debug your RealityKit content.

Home Screen

App icons and widgets can now appear Light, Dark, or with a Tint. And no matter how your icon is rendered, you can make sure it always looks great by customizing each version.
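Widgets can adapt their appearance to the current rendering mode. A small sketch, assuming the `widgetRenderingMode` environment value and the `widgetAccentable` modifier from WidgetKit:

```swift
import SwiftUI
import WidgetKit

struct StatusWidgetView: View {
    @Environment(\.widgetRenderingMode) private var renderingMode

    var body: some View {
        HStack {
            Image(systemName: "bolt.fill")
                .widgetAccentable() // picks up the user's tint in accented mode
            Text(renderingMode == .accented ? "Tinted" : "Full color")
        }
    }
}
```

Marking key elements as accentable lets the system tint them coherently while the rest of the widget recedes.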

Learn about icons


Passkeys

Passkeys replace passwords with credentials that are more secure, easier to use, and resistant to phishing. They offer faster sign-in, fewer password resets, and reduced support costs. Use the new automatic passkey upgrade API to create a passkey when someone signs in to your app and let them know that the passkey was saved — all without interrupting their flow.
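A sketch of an automatic upgrade, assuming the conditional registration style added to AuthenticationServices in iOS 18; the relying-party identifier and parameter values are placeholders:

```swift
import AuthenticationServices

// Call after a successful password sign-in to offer a passkey upgrade.
func upgradeToPasskey(userName: String, userID: Data, challenge: Data) {
    let provider = ASAuthorizationPlatformPublicKeyCredentialProvider(
        relyingPartyIdentifier: "example.com")

    let request = provider.createCredentialRegistrationRequest(
        challenge: challenge,   // fetched from your server
        name: userName,
        userID: userID,
        requestStyle: .conditional) // upgrade silently, with no sign-in UI

    // In a real app, set a delegate and presentation context provider
    // to receive the created credential and register it with your server.
    let controller = ASAuthorizationController(authorizationRequests: [request])
    controller.performRequests()
}
```

With the conditional style, the system creates the passkey only when conditions allow (for example, a recent credential-manager sign-in), so the user's flow is never interrupted.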

App Store and StoreKit

Find out how to nominate your apps for featuring on the App Store, share exciting moments (like a version launch) with marketing assets generated for you, and deep link to specific content in your app from custom product pages. Enhancements to StoreKit views give you more flexibility and customization options when building your merchandising experiences. Improvements to StoreKit Testing in Xcode and the Apple sandbox environment help you test additional purchase scenarios. And win-back offers give you a new way to re-engage previous subscribers.
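StoreKit views let you merchandise a product with very little code. A minimal sketch, assuming a hypothetical `com.example.coins.100` product identifier:

```swift
import StoreKit
import SwiftUI

struct CoinsShopView: View {
    var body: some View {
        // Loads product metadata from the App Store and renders
        // a localized price and purchase button automatically.
        ProductView(id: "com.example.coins.100") {
            Image(systemName: "dollarsign.circle")
        }
        .productViewStyle(.large)
    }
}
```

The built-in styles handle loading, pricing, and purchase states, and the new customization options let you adapt the presentation to your app's design.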

Wallet and Apple Pay

Make your event tickets shine with rich pass designs in Wallet, bring great Apple Pay experiences to more people with third-party browser support, and use new API updates to integrate Apple Pay into additional purchasing flows.

Get started with Xcode 16 beta

Use Xcode 16 and the iOS 18 SDK to build the latest iOS capabilities into your app.

Download Xcode