What’s New in the iOS SDK

Learn about the key technologies and capabilities available in the iOS SDK, the toolkit you use to build apps for iPhone, iPad, or iPod touch. For detailed information on API changes in the latest released versions, including each beta release, see the iOS & iPadOS Release Notes.

iOS 14

With the iOS 14 SDK, users can more easily discover your app’s core functionality through app clips. SwiftUI introduces a new app life cycle and new view layouts, and it supports the new WidgetKit framework, which allows your app to display information directly on the iOS Home screen. Machine learning adds style transfer and action classification models, and offers a CloudKit-based deployment solution. Vision API additions help your app analyze images and video more thoroughly. ARKit advances promote an even tighter integration with the world around the device, and you can include markup in your emails and websites that helps Siri Event Suggestions surface your events.

App Clips

An app clip is a lightweight version of your app that offers users some of its functionality. It’s discoverable at the moment it’s needed, fast, and quick to launch. Users discover and open app clips from a number of places, including Safari, Maps, and Messages, as well as in the real world through QR codes and NFC tags. App clips also provide opportunities for users to download the full app from the App Store. To learn how to create your own app clips, see the app clips documentation.
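When the system launches an app clip, it delivers the invocation URL through an NSUserActivity. The following is a minimal sketch of handling that activity in a SwiftUI-based app clip; the routing logic is an assumption about your own app:

```swift
import SwiftUI

struct ContentView: View {
    var body: some View { Text("App Clip") }
}

@main
struct MyAppClip: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
                // The system passes the invocation URL in an activity
                // of type NSUserActivityTypeBrowsingWeb.
                .onContinueUserActivity(NSUserActivityTypeBrowsingWeb) { activity in
                    guard let url = activity.webpageURL else { return }
                    // Route to the relevant experience, e.g. by parsing
                    // the URL's query items (app-specific logic).
                    print("Invoked with \(url)")
                }
        }
    }
}
```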

Widgets

Widgets give users quick access to timely, at-a-glance information from your app right on the iOS Home screen. iOS 14 offers a redesigned widget experience. Your app can present widgets in multiple sizes, allow user customization, include interactive features, and update content at appropriate times. To learn about designing widgets, see the Human Interface Guidelines. To learn how to support widgets in your app, see the WidgetKit framework.
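As a sketch of the moving parts, a widget pairs a timeline provider with a SwiftUI view. The widget kind, display name, and one-hour refresh policy below are illustrative choices, not requirements:

```swift
import WidgetKit
import SwiftUI

struct ClockEntry: TimelineEntry {
    let date: Date
}

struct ClockProvider: TimelineProvider {
    func placeholder(in context: Context) -> ClockEntry {
        ClockEntry(date: Date())
    }
    func getSnapshot(in context: Context, completion: @escaping (ClockEntry) -> Void) {
        completion(ClockEntry(date: Date()))
    }
    func getTimeline(in context: Context, completion: @escaping (Timeline<ClockEntry>) -> Void) {
        // Ask WidgetKit to request a fresh timeline in about an hour.
        let refresh = Date().addingTimeInterval(3600)
        completion(Timeline(entries: [ClockEntry(date: Date())], policy: .after(refresh)))
    }
}

@main
struct ClockWidget: Widget {
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "ClockWidget", provider: ClockProvider()) { entry in
            Text(entry.date, style: .time)
        }
        .configurationDisplayName("Clock")
        .supportedFamilies([.systemSmall, .systemMedium])
    }
}
```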

SwiftUI

SwiftUI provides a selection of new built-in views, including a progress indicator and a text editor. It also supports new view layouts, like grids and outlines. Grids and the new lazy version of stacks load items only as needed.
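For example, a lazy grid creates its item views only as they scroll into view. This sketch shows a three-column grid with placeholder content:

```swift
import SwiftUI

struct PhotoGrid: View {
    // Three flexible columns; LazyVGrid instantiates item views lazily.
    let columns = [GridItem(.flexible()), GridItem(.flexible()), GridItem(.flexible())]

    var body: some View {
        ScrollView {
            LazyVGrid(columns: columns, spacing: 8) {
                ForEach(0..<1000) { index in
                    Text("Item \(index)")
                        .frame(height: 60)
                }
            }
        }
    }
}
```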

Starting in Xcode 12, you can now use SwiftUI to define the structure and behavior of an entire app. Compose your app from scenes containing the view hierarchies that define an app's user interface. Add menu commands, handle life-cycle events, invoke system actions, and manage storage across all of your apps. By incorporating WidgetKit into your app, you can also create widgets that provide quick access to important content right on the iOS Home screen or the macOS Notification Center. For more information, see App Structure and Behavior.
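A minimal sketch of the new life cycle: the @main attribute marks the app's entry point, and the scenePhase environment value reports life-cycle transitions:

```swift
import SwiftUI

@main
struct MyApp: App {
    // Reports transitions such as moving to the background.
    @Environment(\.scenePhase) private var scenePhase

    var body: some Scene {
        WindowGroup {
            Text("Hello, world")
        }
        .onChange(of: scenePhase) { phase in
            if phase == .background {
                // Save state here.
            }
        }
    }
}
```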

ARKit

ARKit adds Location Anchors, which leverage the higher-resolution map data in Apple Maps to enable rear-camera AR experiences at specific geographic locations. A new Depth API lets you access even more precise distance and depth information captured by the LiDAR Scanner on iPad Pro. To learn more about these features, see the ARKit framework documentation.
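A sketch of starting a geotracked session and placing a location anchor; the coordinate is illustrative, and geotracking is only available on supported devices in supported regions:

```swift
import ARKit
import CoreLocation

func startGeoTracking(on session: ARSession) {
    // Check availability before running the configuration.
    ARGeoTrackingConfiguration.checkAvailability { available, error in
        guard available else { return }
        session.run(ARGeoTrackingConfiguration())

        // Anchor content at a real-world coordinate (values are illustrative).
        let coordinate = CLLocationCoordinate2D(latitude: 37.3349, longitude: -122.0090)
        session.add(anchor: ARGeoAnchor(coordinate: coordinate))
    }
}
```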

Machine Learning

Your machine learning apps gain new functionality, flexibility, and security with the updates in iOS 14. Core ML adds model deployment with a dashboard for hosting and deploying models using CloudKit, so you can easily make updates to your models without updating your app or hosting the models yourself. Core ML model encryption adds another layer of security for your models, handling the encryption process and key management for you. The Core ML converter supports direct conversion of PyTorch models to Core ML.
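Loading a deployed model might look like the following sketch, where "MyModels" and "Classifier" are hypothetical names configured in the Core ML model deployment dashboard:

```swift
import CoreML

func loadDeployedModel() {
    MLModelCollection.beginAccessing(identifier: "MyModels") { result in
        switch result {
        case .success(let collection):
            // Entry keys match the names in your deployed collection.
            if let entry = collection.entries["Classifier"] {
                let model = try? MLModel(contentsOf: entry.modelURL)
                _ = model // Use the freshly deployed model.
            }
        case .failure(let error):
            // Fall back to the model bundled with the app.
            print("Using bundled model: \(error)")
        }
    }
}
```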

The Create ML app’s new Style Transfer template stylizes photos and videos in real time, and the new Action Classification template classifies a single person’s actions in a video clip. For more information, see the Core ML and Create ML developer documentation.

Vision

With iOS 14, the Vision framework has added APIs for trajectory detection in video, hand and body pose estimation for images and video, contour detection to trace the edges of objects and features in images and video, and optical flow to define the pattern of motion between consecutive video frames. To learn more about these features, see the Vision framework documentation. In particular, read Building a Feature-Rich App for Sports Analysis to find out how these features come together in a sample app.
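As one example, hand pose estimation returns named joints whose locations you can query. This sketch reads the index fingertip from a still image; `cgImage` is assumed to be supplied by the caller, and the confidence threshold is an arbitrary choice:

```swift
import Vision

func indexFingertip(in cgImage: CGImage) throws -> CGPoint? {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 1

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])

    guard let observation = request.results?.first as? VNHumanHandPoseObservation else {
        return nil
    }
    let tip = try observation.recognizedPoint(.indexTip)
    // Recognized points use normalized coordinates with a lower-left origin.
    return tip.confidence > 0.3 ? tip.location : nil
}
```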

Natural Language

The Natural Language framework adds new APIs for sentence embedding, which creates a vector representation of any string; word tagging, to train models that classify natural language customized for your specific domain; and confidence scores that rank the framework’s predictions. For more information, see the Natural Language framework documentation.
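A small sketch of the sentence-embedding API, comparing two strings by their distance in the embedding space:

```swift
import NaturalLanguage

if let embedding = NLEmbedding.sentenceEmbedding(for: .english) {
    let distance = embedding.distance(
        between: "The weather is nice today",
        and: "It is sunny outside"
    )
    // Smaller distances indicate more similar meanings.
    print(distance)
}
```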

App Store Privacy Information

Privacy is at the core of the entire iOS experience, and new privacy information in the App Store gives users even more transparency and control over their personal information. On iOS 14, apps will be required to ask users for permission to track them across apps and websites owned by other companies. Later this year, the App Store will help users understand apps’ privacy practices, and you’ll need to enter your privacy practice details into App Store Connect for display on your App Store product page.

Siri Event Suggestions Markup

You can use the Siri Event Suggestions Markup to provide event details on a webpage and in email. Siri parses travel arrangements, movies, sporting events, live shows, restaurant reservations, and social events. Once parsed, Siri can suggest driving directions, a ride share to a scheduled event, or activation of Do Not Disturb just before a show starts. To learn how to integrate your own events with Siri, see the Siri Event Suggestions Markup documentation.
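The markup uses schema.org vocabulary embedded in your webpage or email HTML. The following JSON-LD sketch is illustrative (names, dates, and the reservation number are placeholder values); the exact properties Siri recognizes are listed in the markup documentation:

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "EventReservation",
  "reservationNumber": "ABC123",
  "underName": { "@type": "Person", "name": "Jane Appleseed" },
  "reservationFor": {
    "@type": "Event",
    "name": "Example Concert",
    "startDate": "2020-10-01T19:30:00-07:00",
    "location": {
      "@type": "Place",
      "name": "Example Hall",
      "address": "123 Main St, San Jose, CA"
    }
  }
}
</script>
```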

PencilKit

PencilKit now enables handwriting recognition inside text fields. Using gestures, users can also select or delete text, and join or break up words. You can add data detection to your app, as well as text and shape recognition and selection. For more information, see the PencilKit framework documentation.

Accessibility

A new Accessibility framework lets your app dynamically deliver a subset of accessible content to a user based on context.

MetricKit

MetricKit adds Diagnostics, a new type of payload that tracks specific app failures, such as crashes or disk-write exceptions. For more information, see the MetricKit framework documentation.
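Receiving diagnostics is a matter of subscribing to the shared MXMetricManager; a minimal sketch:

```swift
import MetricKit

final class DiagnosticsSubscriber: NSObject, MXMetricManagerSubscriber {
    func start() {
        MXMetricManager.shared.add(self)
    }

    // New in iOS 14: the system delivers diagnostic payloads describing
    // failures such as crashes, hangs, and disk-write exceptions.
    func didReceive(_ payloads: [MXDiagnosticPayload]) {
        for payload in payloads {
            if let crashes = payload.crashDiagnostics {
                print("Received \(crashes.count) crash diagnostics")
            }
        }
    }
}
```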

Family Sharing for In-App Purchases

Family Sharing is a simple way for users to share subscriptions, purchases, and more with everyone in their household. And with iOS 14, you can choose to offer Family Sharing for your users’ in-app purchases and subscriptions so their whole family can enjoy the added benefits. See SKProduct and SKPaymentTransactionObserver for the new APIs.
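A sketch of checking shareability after an SKProductsRequest completes; whether a product is shareable is configured in App Store Connect:

```swift
import StoreKit

func logShareability(of products: [SKProduct]) {
    // isFamilyShareable is new in iOS 14.
    for product in products where product.isFamilyShareable {
        print("\(product.productIdentifier) supports Family Sharing")
    }
    // SKPaymentTransactionObserver also gains a callback, invoked when a
    // user leaves a family group and entitlements are revoked:
    // paymentQueue(_:didRevokeEntitlementsForProductIdentifiers:)
}
```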

Screen Time

iOS 14 includes Screen Time APIs for sharing and managing web-usage data and observing changes a parent or guardian makes. For more details, see the Screen Time framework documentation.

Uniform Type Identifiers

Use the new Uniform Type Identifiers framework to describe file formats and in-memory data for transfer, such as to the pasteboard, and to identify resources such as directories, volumes, and packages.
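A small sketch of the new UTType struct, which replaces string-based type identifiers:

```swift
import UniformTypeIdentifiers

let png = UTType.png
print(png.preferredFilenameExtension ?? "none")  // "png"
print(png.conforms(to: .image))                  // true

// Look up a type from a file extension (nil for unknown extensions).
if let type = UTType(filenameExtension: "mp4") {
    print(type.identifier)
}
```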

File Compression

Use the new Apple Archive framework to perform fast, multithreaded, lossless compression of directories, files, and data in iOS.