View sample code to see how the latest Apple technologies are implemented.
Add menus to your user interface, with built-in button support and bar-button items, and create custom menu experiences.
Detect and classify human activity in real time using computer vision and machine learning.
Create widgets to show your app’s content on the Home screen, with custom intents for user-customizable settings.
Create a shared codebase to build a multiplatform app that offers widgets and an app clip.
Improve navigation in your app by using keyboard shortcuts and accessibility containers.
Enhance interactions with your app by handling raw keyboard events, writing custom keyboard shortcuts, and working with gesture recognizers.
Reveal your app’s shortcuts inside the Health app.
Store and fetch images asynchronously to make your app more responsive.
Use Bonjour and TLS to establish secure connections between devices, and define a protocol for sending messages to play a simple game of Tic-Tac-Toe.
Create a custom view with data-driven transitions and animations in SwiftUI.
Create a health app that allows a clinical care team to send and receive mobility data.
Make your app accessible to everyone by applying accessibility modifiers to your SwiftUI views.
Build complications that display current information from your app.
Save data to tags, and interact with them using native tag protocols.
Enable writing on a non-text-input view by adding interactions.
Create a virtual drawing app by using Vision’s capability to detect hand poses.
Locate people and the stance of their bodies by analyzing an image with a PoseNet model.
Customize and enhance web pages by building a Safari web extension.
Convert data into readable strings or Swift objects using formatters.
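The formatter entry above can be sketched with two common Foundation formatters; the specific values and locales below are illustrative, not taken from the sample.

```swift
import Foundation

// NumberFormatter: turn a fraction into a locale-aware percentage string.
let percent = NumberFormatter()
percent.numberStyle = .percent
percent.locale = Locale(identifier: "en_US")
let share = percent.string(from: NSNumber(value: 0.72)) ?? ""

// DateFormatter: parse a fixed-format string into a Date, then format it back.
// en_US_POSIX pins the parsing behavior regardless of the user's settings.
let df = DateFormatter()
df.locale = Locale(identifier: "en_US_POSIX")
df.dateFormat = "yyyy-MM-dd"
let parsed = df.date(from: "2021-06-07")
let roundTrip = parsed.map { df.string(from: $0) } ?? ""
```

Going through a formatter rather than string interpolation keeps output correct for every locale the app supports.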
Add expressive, low-latency drawing to your app using PencilKit.
Provide a great user experience with pointing devices by incorporating pointer content effects and shape customizations.
Add visible alert notifications to your app by using the UserNotifications framework.
Add background notifications to your app by using the UserNotifications framework.
Enable an iPhone to measure the relative position of other iPhones.
Bring compositional layouts to your app and simplify updates to and management of your user interface with diffable data sources.
Score users’ ability to match PencilKit drawings generated from text, by accessing the strokes and points inside PencilKit drawings.
Donate reservations and provide quick access to event details throughout the system.
Consume data in the background, and lower memory usage by batching imports and preventing duplicate records.
Control audio playback and handle requests to add media using SiriKit Media Intents.
Communicate between your Safari web extension and its containing app.
Receive notifications and authorization requests for sensitive operations by creating an Endpoint Security client for your app.
Resolve, confirm, and handle intents without an extension.
Provide voice and text communication on a local network isolated from Apple Push Notification service by adopting Local Push Connectivity.
Make it easy for people to use Siri with your app by providing shortcuts to your app’s actions.
Start, stop, and save workouts on Apple Watch with the Workout Builder API.
Store separate data for each user with the new Runs as Current User capability.
Customize your app’s user interface with views and controls in UIKit.
Place points in the real world using the scene’s depth data to visualize the shape of the physical environment.
Implement ray-traced rendering using GPU-based parallel processing.
Run your own graph functions on the GPU by building the function programmatically.
Convert an RGB image to discrete luminance and chrominance channels, and apply color and contrast treatments.
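The sample above performs this conversion with vImage; as a hedged sketch, the pure-Swift function below shows the underlying ITU-R BT.601 full-range matrix that maps one RGB pixel to a luminance value (Y′) and two chrominance values (Cb, Cr).

```swift
// BT.601 full-range RGB -> Y'CbCr for a single pixel, channels in 0...1.
// Chrominance is centered at zero here; integer pipelines typically add a bias.
func rgbToYCbCr(r: Double, g: Double, b: Double) -> (y: Double, cb: Double, cr: Double) {
    let y  =  0.299    * r + 0.587    * g + 0.114    * b
    let cb = -0.168736 * r - 0.331264 * g + 0.5      * b
    let cr =  0.5      * r - 0.418688 * g - 0.081312 * b
    return (y, cb, cr)
}

// A neutral gray carries all its information in luminance:
let (y, cb, cr) = rgbToYCbCr(r: 0.5, g: 0.5, b: 0.5)
```

Separating the channels this way is what lets color and contrast treatments act on chrominance and luminance independently.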
Filter an image by convolving it with custom and high-speed kernels.
Enable nearby devices to share an AR experience by using a peer-to-peer multiuser strategy.
Apply virtual fog to the physical environment.
Create AR games and experiences that interact with real-world objects on LiDAR-equipped iOS devices.
Compile a library of shaders and write it to a file as a dynamically linked library.
Enable nearby devices to share an AR experience by using a host-guest multiuser strategy.
Shape audio output using discrete cosine transforms and biquadratic filters.
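The sample above applies vDSP’s biquadratic (biquad) filters; as a hedged sketch, the struct below shows the Direct Form I difference equation such a filter evaluates per sample: y[n] = b0·x[n] + b1·x[n−1] + b2·x[n−2] − a1·y[n−1] − a2·y[n−2].

```swift
// A single second-order (biquad) IIR filter section in pure Swift.
// The coefficients b0...a2 determine the response (low-pass, notch, and so on).
struct Biquad {
    let b0, b1, b2, a1, a2: Double
    var x1 = 0.0, x2 = 0.0, y1 = 0.0, y2 = 0.0  // delayed input/output samples

    mutating func process(_ x: Double) -> Double {
        let y = b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2
        x2 = x1; x1 = x
        y2 = y1; y1 = y
        return y
    }
}

// With b0 = 1 and all other coefficients 0, the filter passes input through unchanged.
var identity = Biquad(b0: 1, b1: 0, b2: 0, a1: 0, a2: 0)
let out = [1.0, 2.0, 3.0].map { identity.process($0) }
```

vDSP runs this same recurrence vectorized across whole buffers, which is why it is preferred over a scalar loop in production audio code.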
Share image data between vDSP and vImage to compute the sharpest image from a bracketed photo sequence.
See how Apple built the featured demo for WWDC18, and get tips for making your own multiplayer games using ARKit, SceneKit, and Swift.
Track specific geographic areas of interest and render them in an AR experience.
Add advanced multitasking capabilities to your video apps by using Picture in Picture playback in tvOS.
Configure an iOS device’s built-in microphones to add stereo recording capabilities to your app.
Augment the macOS Photos app with extensions that support project creation.
Support high-dynamic-range (HDR) video content in your app by using the HDR editing and playback capabilities of AVFoundation.
Add haptic feedback to supported game controllers by using Core Haptics.
Share screen recordings, or broadcast live audio and video of your app, by adding ReplayKit to your macOS apps and games.
Improve the user experience of finding and selecting visual media by using the Photos picker.
Create an HTTP Live Streaming presentation by turning a movie file into a sequence of fragmented MPEG-4 files.
Add auto layout constraints to your app to achieve localizable views.
Determine a customer’s entitlement to your service, offers, and messaging by analyzing a validated receipt and the state of their subscription.
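The entitlement decision in the entry above reduces to a date comparison; the sketch below shows only that core check. The type and field names are illustrative, not the App Store receipt format; real code would read these values from a validated receipt.

```swift
import Foundation

// Hypothetical model for one auto-renewable subscription from a receipt.
struct Subscription {
    let expirationDate: Date
    let isRevoked: Bool
}

// A user is entitled while the subscription is unrevoked and unexpired.
func isEntitled(_ sub: Subscription, at now: Date = Date()) -> Bool {
    !sub.isRevoked && sub.expirationDate > now
}

let active = Subscription(expirationDate: Date().addingTimeInterval(86_400), isRevoked: false)
let lapsed = Subscription(expirationDate: Date().addingTimeInterval(-86_400), isRevoked: false)
```

The `at now:` parameter keeps the check testable and lets callers evaluate entitlement at a past or future moment, such as a billing grace period boundary.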