Core ML

Core ML is optimized for on-device performance of a broad variety of model types by leveraging Apple hardware and minimizing memory footprint and power consumption.

What’s new

Updates to the Core ML framework bring even faster model loading and inference. The new Async Prediction API simplifies building interactive ML-powered experiences and helps maximize hardware utilization. Use the new Core ML Tools optimization module to compress and optimize your models for deployment on Apple hardware. Weight pruning, quantization, and palettization utilities can be applied during model conversion, or while training your model in frameworks like PyTorch to better preserve accuracy under compression.
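As an illustration, here is a minimal sketch of serving predictions through the async prediction call from Swift concurrency. The model name "ImageClassifier" and its bundled .mlmodelc file are placeholders, and the sketch assumes the async prediction(from:) overload introduced with the Async Prediction API.

```swift
import CoreML

/// Load a bundled, compiled model once and serve predictions from Swift concurrency.
/// "ImageClassifier" is a placeholder model name.
final class Predictor {
    private let model: MLModel

    init() throws {
        // A compiled Core ML model (.mlmodelc) shipped inside the app bundle.
        guard let url = Bundle.main.url(forResource: "ImageClassifier",
                                        withExtension: "mlmodelc") else {
            throw CocoaError(.fileNoSuchFile)
        }
        model = try MLModel(contentsOf: url)
    }

    /// The async prediction call can be issued from multiple concurrent tasks,
    /// which helps keep the CPU, GPU, and Neural Engine busy.
    func predict(_ input: MLFeatureProvider) async throws -> MLFeatureProvider {
        try await model.prediction(from: input)
    }
}
```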

Experience more with Core ML

Run models fully on-device

Core ML models run strictly on the user’s device and remove any need for a network connection, keeping your app responsive and your users’ data private.

Run advanced neural networks

Core ML supports the latest models, such as cutting-edge neural networks designed to understand images, video, sound, and other rich media.

Convert models to Core ML

Models from libraries like TensorFlow or PyTorch can be converted to Core ML using Core ML Tools more easily than ever before.

Personalize models on-device

Models bundled in apps can be updated with user data on-device, helping models stay relevant to user behavior without compromising privacy.
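A rough sketch of what an on-device update can look like with MLUpdateTask, assuming an updatable model bundled with the app; the URLs and the training data provider are placeholders supplied by the caller.

```swift
import CoreML

/// Retrain an updatable model with user data entirely on-device and persist the result.
/// All URLs and the MLBatchProvider contents are placeholders for illustration.
func personalize(modelAt modelURL: URL,
                 with trainingData: MLBatchProvider,
                 saveTo updatedModelURL: URL) throws {
    let task = try MLUpdateTask(forModelAt: modelURL,
                                trainingData: trainingData,
                                configuration: nil) { context in
        // The retrained model never leaves the device; write it out for future loads.
        try? context.model.write(to: updatedModelURL)
    }
    task.resume()
}
```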

Xcode integration

Core ML is tightly integrated with Xcode. Explore your model’s behavior and performance before writing a single line of code. Easily integrate models in your app using automatically generated Swift and Objective‑C interfaces. Profile your app’s Core ML‑powered features using the Core ML and Neural Engine instruments.
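For instance, a hedged sketch of calling an automatically generated Swift interface: for a hypothetical "SentimentClassifier.mlmodel" added to the project, Xcode generates a SentimentClassifier class whose initializer and typed prediction method mirror the model's input and output features (the "text" and "label" feature names here are placeholders).

```swift
import CoreML

/// Use the Xcode-generated class instead of untyped MLFeatureProvider dictionaries.
/// "SentimentClassifier", "text", and "label" are placeholder names.
func classify(_ text: String) throws -> String {
    let model = try SentimentClassifier(configuration: MLModelConfiguration())
    let output = try model.prediction(text: text)
    return output.label
}
```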

Performance reports

Generate model performance reports measured on connected devices without having to write any code. Review a summary of load and prediction times along with a breakdown of compute unit usage.

Profile with instruments

Profile your app to view Core ML API calls and associated models using the Core ML instrument. Find out where and when Core ML dispatches work to the hardware and gain more visibility with the Metal and Neural Engine instruments.

Live preview

Preview your model’s behavior on sample data files or live from your device’s camera and microphone, all right in Xcode.

Encrypt models

Xcode supports model encryption, enabling additional security for your machine learning models.
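A minimal sketch of loading a model that Xcode compiled with an encryption key. Decryption is handled by Core ML; the asynchronous load method lets it fetch the key and report any failure through the result. "ProtectedModel" is a placeholder for an Xcode-generated model class.

```swift
import CoreML

/// Load an encrypted model asynchronously and handle decryption failures gracefully.
/// "ProtectedModel" is a placeholder generated class name.
func loadProtectedModel() {
    ProtectedModel.load(configuration: MLModelConfiguration()) { result in
        switch result {
        case .success(let model):
            print("Model ready: \(model)")
        case .failure(let error):
            print("Could not decrypt and load model: \(error)")
        }
    }
}
```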

Powerful Apple silicon

Core ML is designed to seamlessly take advantage of powerful hardware, including the CPU, GPU, and Neural Engine, in the most efficient way, maximizing performance while minimizing memory and power consumption.
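A short sketch of how an app can steer this behavior through MLModelConfiguration: by default Core ML decides where each operation runs, and restricting the compute units can be useful when profiling. The "ImageClassifier" resource name is a placeholder.

```swift
import CoreML

// Restrict execution to the CPU and Neural Engine; .all (the default),
// .cpuOnly, and .cpuAndGPU are the other options.
let configuration = MLModelConfiguration()
configuration.computeUnits = .cpuAndNeuralEngine

// "ImageClassifier" is a placeholder for a compiled model bundled with the app.
let url = Bundle.main.url(forResource: "ImageClassifier", withExtension: "mlmodelc")!
let model = try MLModel(contentsOf: url, configuration: configuration)
```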

Get started with Core ML

Create ML

Build and train Core ML models right on your Mac with no code.

Learn more

Core ML Tools

Convert models from third-party training libraries into Core ML using the coremltools Python package.

Learn more

Models

Get started with models from the research community that have been converted to Core ML.

Browse models