Platforms State of the Union
Take a deeper dive into the latest tools, technologies, and advances across Apple platforms to help you create even better apps.
♪ ♪ ♪ ♪ Susan Prescott: Welcome to the Platforms State of the Union for WWDC 2022. We're always excited for WWDC because it's an opportunity for us to connect with all of you, share some news about what we've been working on, and better understand what you need from our developer platforms. What you do as developers is amazing. You transform your ideas, the stuff of imagination, to take users' experiences to another level. And we want to help you take your ideas even further. In the Keynote, we already talked about some new capabilities across iPhone, iPad, Mac, Apple Watch, and Apple TV, and the incredible power of Apple silicon to help bring even the most ambitious ideas to life. We have a lot to cover today. Let's start with some updates. Earlier this year, we opened the brand-new Apple Developer Center, a place designed for you to connect and collaborate with Apple engineers and designers right here at Apple Park. And last fall, thousands of you from all over the world attended our new online Tech Talks, with hundreds of live sessions originating in 11 countries and in five languages. For us, one of the best parts of the Tech Talk series was one-on-one meetings, which were a great opportunity to hear about what you're up to and share some advice and guidance. Last fall, Swift Playgrounds 4 shipped with the power to build apps and submit them directly to the App Store and support for SwiftUI, making it an incredible tool for experimentation and UI prototyping. And of course, there's Xcode Cloud. We built Xcode Cloud to help you build better apps faster and more easily. It's a continuous integration and delivery service built right into Xcode and hosted in the cloud. Xcode Cloud supports development for all Apple platforms. It integrates with TestFlight and App Store Connect as well as every major Git-based source control provider. It even has REST APIs to help connect to other aspects of your development workflow.
It's built with advanced security to protect you and your projects. And I'm delighted to say that Xcode Cloud is available starting today.
We think nearly every development team can benefit from Xcode Cloud, and we've priced it to be accessible to developers of all sizes. We're offering the 25-hour-per-month subscription free to all Apple Developer Program members through the end of 2023! And you'll be able to subscribe to any of the Xcode Cloud subscription levels in the Developer app later this summer. Today we're gonna talk about three big topics. First, we want to share more about our vision for developing for Apple platforms, where we are with our platforms, and where we're headed. Then, we'll share some compelling new ways your apps can integrate with the system experience on Apple platforms. And finally, we'll discuss some powerful new APIs and show you how they can enable you to do even more ambitious things with your apps. Let's start with the vision for our developer platform and how it's been evolving. Josh is here to tell you all about it. Josh Shaffer: A great developer platform provides tight integration between programming language, frameworks, and tools. When all three fully complement one another, we can ensure that common things are easy, and even uncommon things are possible.
Now, getting that right shortens the path to building a great app, and it benefits everyone. Customers get a consistent experience, like scrolling that always feels perfect. And you're able to focus your time and effort on what makes your app unique. But designs evolve, hardware advances, and what was once cutting edge becomes the expected baseline. The Objective-C language, AppKit & UIKit frameworks, and Interface Builder have empowered generations of developers. These technologies were built for each other, and will continue to serve us well for a long time to come, but over time new abstractions become necessary. For a while now, you've seen us hard at work defining the next generation of integrated language, frameworks, and tools: Swift, SwiftUI, and Xcode Previews.
Tight integration in a development platform like this requires that all three pieces be designed and evolved together, both driving and driven by one another. Swift result builders were inspired by SwiftUI's compositional structure. SwiftUI's declarative views were enabled by Swift value types. And Xcode Previews was specifically designed for, and enabled by, both. Now, the result is the best development platform that we have ever built. And this year, Swift, SwiftUI, and Xcode all have fantastic updates that take this vision further, and make it even easier for you to build great apps for all of our platforms. And it all starts with Swift. Now Ben from the Swift team is gonna tell you all about what's next.
♪ ♪ Ben Cohen: Swift is fast, modern, and safe. It combines the speed of a strongly typed language with an expressive syntax that's easy to read and write. And its design eliminates entire categories of programming errors. Swift is the absolute best language for building apps across our devices.
Swift is also open source, with an amazing community of contributors organized at swift.org, supporting one another through initiatives like Diversity in Swift and the Swift Mentorship Program, and advancing the language with working groups on topics like Swift on server, and C++ interoperability.
Over the last year, Swift has gotten even better, with enhancements in concurrency, upgrades to make Swift code easier to read and write, tooling to help you customize your workflow, and amazing under-the-hood improvements. It started last year, with the introduction of Swift Concurrency.
Swift Concurrency dramatically simplified reading and writing code that runs in parallel, and has been a huge hit, with more than 40,000 apps in the App Store adopting it in just the first year. Because it's such a fundamental and important improvement to your app's code base, it's now possible to deploy code with Swift Concurrency to all operating systems released in the last three years. Swift Concurrency also introduced async sequences. This year, there's a new, open-source package that brings concurrency to Swift's rich set of existing sequence algorithms. It's called Async Algorithms. For example, where Swift's Sequence protocol supports a zip algorithm to combine two sequences, Async Algorithms brings a version for zipping together two asynchronous sequences. Because async sequences are integrated directly into the Swift language, they use familiar constructs like 'for' loops that, thanks to the async/await syntax, look like regular straight-line code. You're also able to use the familiar try/catch pattern to handle things like network failures from asynchronous data streaming over the network. A key thing about async sequences is how they deliver data values over time.
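As a rough sketch of the kind of code just described (the stream names here are invented for illustration), zipping two async sequences with the open-source Async Algorithms package reads like ordinary straight-line Swift:

```swift
import AsyncAlgorithms

// Two hypothetical async sequences: order updates and rider locations.
// zip pairs up an element from each, consumed with a plain for-await loop.
func processDeliveries(
    orders: AsyncStream<String>,
    locations: AsyncStream<String>
) async {
    for await (order, location) in zip(orders, locations) {
        print("Dispatching \(order) from \(location)")
    }
}
```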
Swift now includes a new set of clock types for representing time units, and async algorithms builds on them to provide many time-based algorithms, like throttle here, which can help slow down updates from a sequence. Swift's concurrency model is designed to make asynchronous code as easy and safe to write as your synchronous code. A big part of that is Swift's actor model. Actors allow you to isolate your data using thread-safe, concurrently-executing code. Swift prevents you from accidentally sharing that state between parallel threads, defining away a major source of bugs.
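Here's a minimal sketch of that actor model in action: the compiler only lets callers touch the actor's state through await, so concurrent tasks can't race on it.

```swift
// An actor protecting a mutable order count; access is serialized by the actor,
// so two tasks incrementing at once can't corrupt the underlying state.
actor OrderCounter {
    private var filled = 0

    func recordOrder() -> Int {
        filled += 1
        return filled
    }
}

// Usage from async code:
// let counter = OrderCounter()
// let total = await counter.recordOrder()
```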
Communication between actors is easy and efficient through async/await. Now, Swift takes the idea of actor isolation further with distributed actors. Distributed actors can communicate across multiple processes or devices. The "distributed" keyword marks these actors and methods that can be accessed remotely, whether that's between separate processes on your Mac, peer to peer between different devices, or from a device talking to your backend written with Swift on Server.
Just as actors help Swift protect your state data from race conditions, distributed actors help Swift make them available outside your process, using a pluggable transport mechanism. The Swift compiler can then perform checks that help guarantee correct behavior in a distributed environment, allowing you to get back to working on the features that you care about. Distributed actors and other concurrency features show just how easy Swift code can be to read and write when enhancements are crafted deep within the syntax. To tell you more about usability enhancements in Swift, here's Ken.
Ken Orr: Strings are one of the most important features of any programming language. But dealing with strings can be a common source of frustration.
At some point in a developer's journey, they may find themselves needing to extract information from a string like this.
Writing code to parse the string is easy to get wrong, with many details to track. And the resulting code– it's hard to read and modify.
Regular expressions are a powerful solution to this challenge. They allow you to describe the pattern you expect to see in your string and specify which pieces of information you're interested in capturing. This year, Swift is delivering a huge improvement to the developer experience around regular expressions, starting with a new regular expression literal.
Regex literals are built directly into the language, allowing the Swift compiler to check for correctness. And they unlock the power of Swift's type system when you're extracting information with a regular expression. And they take full advantage of Swift's best-in-class Unicode support. Let's take a look. I'm working on an app called Food Truck that organizes everything from taking orders to tracking sales. And some orders come in as strings, packed with data. Now, regular expressions are perfect for extracting the details I want, and there's no better place to experiment with them than here in a Playground. I'll start by creating a regex literal.
Now I'll type the expression, and pull out who made the order, the donut type, and the number of donuts. Now, as I type, the regex is syntax highlighted, which helps me confirm my expression is correct. Now I'll try it out.
I'll use the order string from above and look for the first match of the regex.
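Here's a sketch of roughly what that Playground code might look like; the order string format and capture names are invented for illustration:

```swift
// A hypothetical order string and a regex literal with named captures.
// The compiler checks the literal, and the captures come back strongly typed.
let order = "12 chocolate donuts for Susan"
let orderRegex = /(?<quantity>\d+) (?<flavor>\w+) donuts for (?<name>\w+)/

if let match = order.firstMatch(of: orderRegex) {
    print(match.output.name, match.output.flavor, match.output.quantity)
    // Susan chocolate 12
}
```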
Now, when I run the Playground, I can see with the inline result exactly what parts of the order string the regex matches. And here, it's finding just what I was looking for. Swift's new regex support doesn't stop here. As literals become more complex, like this one that matches parts of a log file, Swift offers an even better way to craft these patterns– regex builders.
And it's easy to convert a literal to a builder.
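As a sketch of what a converted builder can look like (the log format here is made up), the same pattern reads as structured Swift code:

```swift
import RegexBuilder

// A hypothetical log fragment like "fault 0x01AF", written as a regex builder.
// The spelled-out character class below is exactly what .hexDigit can replace.
let faultCode = Regex {
    "fault 0x"
    Capture {
        OneOrMore(CharacterClass.anyOf("0123456789abcdefABCDEF"))
    }
}

if let match = "kernel: fault 0x01AF in driver".firstMatch(of: faultCode) {
    print(match.output.1) // "01AF"
}
```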
Now I have structured code that's easier to read and change. I can simplify this one even more. Here, where I'm looking for a hex digit, I'll use the new .hexDigit CharacterClass, helping make my intent even clearer. Now, the builder syntax makes it so much easier for me to create and extend my expressions, and get the results I'm looking for. And that's the powerful, new developer experience around regular expressions in Swift. Ben: Beyond string syntax, Swift is also getting easier to read and write through improvements to a language feature known as generics. Generics power features of Swift you use every day, like the Array type, which holds any kind of element, from strings to your own custom types. Generic code uses the concept of a placeholder type to stand in for another type to be determined later. By removing assumptions about specific types, you can be more clear about the intent of your code, and make it easier to re-use. But this can also make your code harder to read.
For example, if you wanted to handle a generic collection of songs as a function parameter, you'd have to write quite a bit of code to express your intent.
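To make that concrete (the Song type and functions are invented), here's the long-hand spelling next to the new 'some' and 'any' spellings described next:

```swift
struct Song: Hashable { let title: String }

// The long-hand generic version: a named type parameter plus a where clause.
func playAll<Songs: Collection>(_ songs: Songs) where Songs.Element == Song {
    for song in songs { print("Playing \(song.title)") }
}

// The same intent expressed with the 'some' keyword.
func playAllConcisely(_ songs: some Collection<Song>) {
    for song in songs { print("Playing \(song.title)") }
}

// 'any' boxes different concrete collection types in a single array.
let playlists: [any Collection<Song>] = [
    [Song(title: "Glazed")],            // an Array<Song>
    Set([Song(title: "Sprinkled")])     // a Set<Song>
]
```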
Now in Swift, writing a function that accepts some collection of songs is as easy as using the 'some' keyword to tell Swift about the parameter. You get the same meaning, but with less code. In other cases, you might need more dynamic behavior, like with this music library's array of playlists, which might need to contain different types of collections of songs– sets of songs, or arrays of songs. That's where the new 'any' keyword can help. The 'any' keyword is built right into Swift, and allows you to express a type that can hold any collection of songs. And it works seamlessly with generic functions too. By adopting familiar syntax and using more natural keywords, writing generic code in Swift has never been easier. Just as important as the features built into the Swift language are the tools built around it. The Swift Package Manager makes it easy to manage your app's dependencies and take advantage of the fantastic packages published by developers around the world. To date, those developers have published thousands of Swift packages, providing code to help with everything from authentication and web services to data management and reusable UI components. And this year, the Swift Package Manager is amplifying the ways you can create and build code with all-new Package Plugins. Plugins are Swift packages you can add to your project as easily as any other dependency. They download and build automatically on a fresh checkout, except instead of being code in your app, they're code that helps build your app.
Package Plugins can be invoked from the command line or within Xcode, either as part of your build phase or on-demand. They run in a sandbox environment which prompts you for permission before reading or modifying your code.
There are endless possibilities for extending your workflow with Package Plugins. You could use them to lint and format your code to match the team style guide with packages like SwiftLint or SwiftFormat, or automatically generate source code at build time with tools like Sourcery. Anything that helps you get the job done.
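As a rough sketch (tool and type names invented, error handling omitted), a command plugin's entry point is just Swift conforming to the CommandPlugin protocol:

```swift
import PackagePlugin

@main
struct FormatImportsPlugin: CommandPlugin {
    func performCommand(context: PluginContext, arguments: [String]) async throws {
        // Look up a formatting tool that the package depends on...
        let formatter = try context.tool(named: "swiftformat")

        // ...and note the source directories it would run over.
        for target in context.package.targets {
            print("Would run \(formatter.path) over \(target.directory)")
        }
    }
}
```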
Ken: Package Plugins are a great way to extend Xcode, just by writing a little Swift. And you can do that in two ways. With command plugins that you use on demand, and with build plugins for whenever your project builds. Now, back here in our Food Truck app, here's the code for a command plugin I created.
My team has a unique code aesthetic. We like our imports sorted in string length order. Shortest to longest. And since Package Plugins are all about customization and control, we created a command plugin that uses SwiftFormat to take care of that.
It finds all the locally modified files, and then it sorts their imports. Now, here's a file I've been editing with some unsorted imports at the top. I'll use the command on the entire project. I can select any number of targets. I'll run it on everything.
And I can review the plugin's code if I want. I'm all set. I'll run the command.
And then, just like that, the plugin goes to work on my files.
It finds all of the locally modified source files, and then it sorts their imports in length order. With plugins, you can go beyond just formatting. You can generate source code, work with Git, even surface your own custom errors and warnings. I have another plugin to make sure my code is well documented. It's a build plugin, and it's based on the open-source SwiftLint package.
So now when I build, I can easily see all the places in my code where I need to add documentation. And build plugins extend all the way to Xcode Cloud, where they run as part of every build. With Swift Package Plugins, my team and I can create our own commands, customize builds locally and in Xcode Cloud, and then share those plugins with others. All using the power of a few lines of Swift. And that's a quick look at the ways Package Plugins can level up your development workflow. Ben: Finally, Swift has some impressive changes under the hood. Building Swift projects is quicker than ever. Thanks to new parallelization efforts, link time is up to twice as fast. And the Swift concurrency runtime is now more tightly integrated with the OS to better ensure the priority of your asynchronous tasks, helping your apps stay efficient and responsive. Lastly, launch time for apps written in Swift is dramatically faster on iOS 16, with apps like Lyft or Airbnb launching almost twice as fast thanks to improvements in the dynamic linker. With these under-the-hood improvements, new abilities in tooling, an evolved syntax that's easier to read and write, and improvements in concurrency, there has never been a better time to develop in Swift. Swift is the absolute best language for building apps across our devices. But a language is just part of what you need to build your best apps. You have to pair a language with a powerful user interface framework. And Eliza is gonna tell you more. Eliza Block: A powerful UI framework provides abstractions that make it easy to describe your interface, to populate it with data, and to keep it up to date. It should scale well with complexity. And it should be designed for the platform you're targeting, giving you full access to the power of the device. Your UI framework should help you make your app feel familiar and intuitive. It should make it easy to create standard controls and native interaction patterns, with options for advanced customization. And it needs to have an expressive API that allows you to quickly prototype your ideas and see the results across a range of devices.
SwiftUI offers all this and more. Like Swift, SwiftUI is designed with strong opinions about the best way to build apps. It has a declarative syntax that's easy to read and write. You describe what your interface looks like, instead of how to build it.
And this leaves room for SwiftUI to provide intelligent defaults for each platform. SwiftUI automatically keeps your interface up-to-date with changes to the underlying data model, so your app's UI never ends up in an inconsistent state.
SwiftUI handles all these details for you, so you can focus your time and energy on what makes your app unique. Writing a new UI framework is a huge undertaking. Since its introduction we've been continually expanding SwiftUI's API coverage, guided by your feedback.
This year we've made it even easier to adopt SwiftUI incrementally in your existing apps, and we've made some exciting enhancements to its power and flexibility, starting with app navigation. With SwiftUI, it has always been easy to create the common kinds of navigation hierarchies found in many apps. And this year, SwiftUI is expanding that support with an all-new navigation API.
The new navigation API makes it easy to express the style of navigation that best fits the needs of your app. With robust programmatic control over the presentation of your app's views, you can easily save and restore selection, and even replace the full contents of a navigation stack. This is really useful when handling important behaviors like setting the launch state of your app, managing transitions between size classes, and responding to deep links.
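A minimal sketch of that programmatic control, using invented model names: the navigation path is just state you can read, save, and replace.

```swift
import SwiftUI

struct Donut: Hashable { let name: String }

struct OrdersNavigationView: View {
    // The whole stack is driven by this value; setting it programmatically
    // (for launch state, deep links, and so on) replaces what's shown.
    @State private var path: [Donut] = []

    let donuts = [Donut(name: "Sprinkles"), Donut(name: "Chocolate")]

    var body: some View {
        NavigationStack(path: $path) {
            List(donuts, id: \.self) { donut in
                NavigationLink(donut.name, value: donut)
            }
            .navigationDestination(for: Donut.self) { donut in
                Text("Details for \(donut.name)")
            }
            .navigationTitle("Orders")
        }
    }
}
```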
SwiftUI also has huge improvements when it comes to controlling the layout of your app's interface.
The layout of many app interfaces can be described using SwiftUI's model of horizontal or vertical stacks of elements. And while this model works for many common layouts, sometimes you need something more flexible. This year, we're adding a new Grid API, which makes it easier to lay out a set of views aligned across multiple rows and columns. And you can take your layouts even further with the all-new custom layout API. The custom layout API gives you the flexibility to build any type of layout you want. For example, you could create a flow layout where your views are arranged like the content of a newspaper, wrapping to the next column when more space is needed. Or you could create a radial layout that draws your views in a circle, like the numbers on a watch face.
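Here are two small sketches in that direction, with placeholder content: a Grid aligning views across rows and columns, and a deliberately bare-bones custom layout conforming to the Layout protocol.

```swift
import SwiftUI

// A Grid aligns its views across rows and columns.
struct SalesSummaryGrid: View {
    var body: some View {
        Grid(alignment: .leading, horizontalSpacing: 12, verticalSpacing: 8) {
            GridRow {
                Text("Flavor")
                Text("Sold")
            }
            GridRow {
                Text("Chocolate")
                Text("128")
            }
            GridRow {
                Text("Sprinkles")
                Text("93")
            }
        }
    }
}

// A tiny custom layout: a naive flow that wraps to a new row
// when the next subview would overflow the available width.
struct SimpleFlowLayout: Layout {
    func sizeThatFits(proposal: ProposedViewSize, subviews: Subviews, cache: inout ()) -> CGSize {
        proposal.replacingUnspecifiedDimensions()
    }

    func placeSubviews(in bounds: CGRect, proposal: ProposedViewSize, subviews: Subviews, cache: inout ()) {
        var x = bounds.minX
        var y = bounds.minY
        var rowHeight: CGFloat = 0
        for subview in subviews {
            let size = subview.sizeThatFits(.unspecified)
            if x + size.width > bounds.maxX {   // wrap to the next row
                x = bounds.minX
                y += rowHeight
                rowHeight = 0
            }
            subview.place(at: CGPoint(x: x, y: y), proposal: .unspecified)
            x += size.width
            rowHeight = max(rowHeight, size.height)
        }
    }
}
```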
The custom layout API makes it easy to re-use your layout logic, making your view code simpler and easier to read. SwiftUI continues to grow to offer many more types of interface elements. Like half sheets, which define secondary views that slide above a main view. These are great to provide quick access to information on smaller screens. And SwiftUI now supports Share Sheets, so your app can easily leverage all of the Share extensions available on a user's device. Share Sheet support is powered by the new Transferable protocol, which introduces a type-safe API for transferring app data.
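Two quick sketches of those, with placeholder content: a resizable half sheet via presentation detents, and a ShareLink sharing a Transferable value (String already conforms).

```swift
import SwiftUI

struct TruckDetailView: View {
    @State private var showingHours = false

    var body: some View {
        VStack(spacing: 16) {
            Button("Show Opening Hours") { showingHours = true }
                .sheet(isPresented: $showingHours) {
                    Text("Open 8:00–18:00")
                        // A "half sheet": the user can resize between detents.
                        .presentationDetents([.medium, .large])
                }

            // ShareLink brings up the share sheet for any Transferable item.
            ShareLink(item: "Check out today's donut specials!")
        }
    }
}
```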
We've also made it easier to adopt SwiftUI incrementally in your existing apps with a special collection view cell that can host SwiftUI views. If you already have a collection view in your UIKit app, you can now write custom cells using SwiftUI's declarative syntax. These cells are tightly integrated with UIKit, supporting swipe actions, cell backgrounds, and all the other features of UICollectionView. Today we're also introducing a brand-new framework that complements SwiftUI and will allow you to express even more of your interface. Here's Jo to tell you more. Jo Arreaza-Taylor: Today's world is filled with data. Data to help understand, make decisions, and see new perspectives. A well-designed and accessible data visualization can communicate complexities to your users in a way that feels clear and natural, empowering them as they move through the day.
Like helping to show changing trends in their health, highlighting their progress towards personal goals, and preparing them for what's to come.
Today, we're introducing a new framework to help empower your users to unlock the data within your apps. Say hello to Swift Charts. Swift Charts is a highly customizable charting framework built on top of SwiftUI that makes it easy to create gorgeous visualizations. It uses the same declarative syntax as SwiftUI to make it easy to read and write code which conveys visual information.
Swift Charts lets you customize the presentation of information to best fit the needs of your app, creating everything from line and bar charts to more sophisticated examples like heat maps and stream graphs, and many, many more types.
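As a rough, self-contained sketch (with invented sample data), a basic bar chart is just a Chart containing marks:

```swift
import SwiftUI
import Charts

struct DonutSales: Identifiable {
    let id = UUID()
    let flavor: String
    let count: Int
}

struct SalesChart: View {
    // Invented sample data.
    let sales = [
        DonutSales(flavor: "Chocolate", count: 12),
        DonutSales(flavor: "Sprinkles", count: 8),
        DonutSales(flavor: "Glazed", count: 15)
    ]

    var body: some View {
        Chart(sales) { sale in
            BarMark(
                x: .value("Flavor", sale.flavor),
                y: .value("Sold", sale.count)
            )
        }
    }
}
```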
And because Swift Charts is built on top of SwiftUI, charts have great support for accessibility features, like a terrific, out-of-the-box VoiceOver experience that's easy to customize. Being built on SwiftUI also means you can animate your charts, to help you give your app just the right look and feel. And of course, Swift Charts works great across all our devices. Eliza: Back in our Food Truck app, here are the beautiful new Swift Charts in Xcode's fully redesigned preview area. I'm also using the new multicolumn SwiftUI table view. Let me show you how easy it is to build this chart. And as I scroll, check out the awesome new structured headers in the source editor. They make it really easy to see where you are in the file.
Here's the code for the chart. Now, this is actually a stacked bar chart, but you can't really tell. Let's have each donut use its own color.
Maybe it would be easier to see how the donuts compare if we position the bars side by side.
I love how I can make all these big changes with just a couple simple modifiers. We can customize the styling, too. Let's make the bars reflect the donut colors. And we can even add annotations to the bars with another modifier.
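The modifiers being narrated here correspond roughly to mark-level modifiers like these; the data model is invented and this isn't the session's exact code:

```swift
import SwiftUI
import Charts

struct DailySale: Identifiable {
    let id = UUID()
    let day: String
    let flavor: String
    let count: Int
}

struct ComparisonChart: View {
    let sales = [
        DailySale(day: "Mon", flavor: "Chocolate", count: 12),
        DailySale(day: "Mon", flavor: "Sprinkles", count: 8),
        DailySale(day: "Tue", flavor: "Chocolate", count: 9),
        DailySale(day: "Tue", flavor: "Sprinkles", count: 14)
    ]

    var body: some View {
        Chart(sales) { sale in
            BarMark(
                x: .value("Day", sale.day),
                y: .value("Sold", sale.count)
            )
            // One color per donut, drawn side by side instead of stacked.
            .foregroundStyle(by: .value("Flavor", sale.flavor))
            .position(by: .value("Flavor", sale.flavor))
            // A small count label above each bar.
            .annotation(position: .top) {
                Text("\(sale.count)").font(.caption2)
            }
        }
    }
}
```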
Looking great. Previews are now live by default, so I can immediately interact with my view. I'm gonna change the sort order. Watch how the bars animate beautifully, with Swift Charts doing all the heavy lifting. Let's fetch more data. The chart and table both automatically update as my model changes. The chart even recalculates its y-axis to reflect the new totals. Let me show you one more chart. I've stubbed out a line chart which we can add to the view. I'll jump to the implementation. Line charts with Swift Charts have some really cool options. We can add symbols for each donut. We can annotate the lines. We can even smooth the curves with a variety of interpolation strategies. Let's use catmullRom. Finally, I'll override the chart scale style by providing my own mapping. That will make my chart fit in better with the app's color scheme. Really nice. The redesigned preview area makes it easier than ever to see how my view looks in different environments. By pressing this button in the canvas, I can see my view in Dark and Light mode. I can even look at my layout in every interface orientation, all without adding a single additional preview. Let's zoom in on landscape. It looks like my UI isn't quite fitting here. A few controls are offscreen, and the charts have an awkward aspect ratio. Let's see where we're describing this layout. These views here are in an implicit VStack.
This year there are some powerful new APIs in SwiftUI that can create more flexible layouts. Here, I'm going to use a ViewThatFits to switch between a vertical and a horizontal stack, depending on the available space.
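Roughly, that pattern looks like this sketch (placeholder views stand in for the real charts):

```swift
import SwiftUI

// ViewThatFits tries each child in order and shows the first one that fits:
// a side-by-side layout when there's room, a stacked one otherwise.
struct ChartsSection: View {
    var body: some View {
        ViewThatFits {
            HStack {
                chartPlaceholder
                chartPlaceholder
            }
            VStack {
                chartPlaceholder
                chartPlaceholder
            }
        }
    }

    private var chartPlaceholder: some View {
        RoundedRectangle(cornerRadius: 8)
            .fill(.quaternary)
            .frame(minWidth: 250, minHeight: 140)
    }
}
```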
That looks much better. Let's wire this up so we can navigate to it from the main screen.
I'm using SwiftUI's new navigation split view, which makes this really easy. The split view has a sidebar to track the selection and a NavigationStack that changes its content as the sidebar selection changes. I'll jump into the sidebar and add a navigation link for our Donut Champion view. And then, we can try it out in the interactive preview.
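In sketch form (panel and label names invented), the split view pairs a selection-owning sidebar with a detail column:

```swift
import SwiftUI

struct FoodTruckSplitView: View {
    enum Panel: Hashable {
        case trucks
        case donutChampion
    }

    // The sidebar owns the selection...
    @State private var selection: Panel? = .trucks

    var body: some View {
        NavigationSplitView {
            List(selection: $selection) {
                NavigationLink(value: Panel.trucks) {
                    Label("Trucks", systemImage: "box.truck")
                }
                NavigationLink(value: Panel.donutChampion) {
                    Label("Donut Champion", systemImage: "trophy")
                }
            }
        } detail: {
            // ...and the detail column swaps its content as the selection changes.
            switch selection {
            case .donutChampion:
                Text("Donut Champion")
            default:
                Text("Trucks")
            }
        }
    }
}
```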
I'd like to see my split view in landscape, so I'll use the new canvas settings to rotate the live preview. Works great. I'm happy with how this is looking on iPad, but now I'd love to bring it to Mac, and it's only a few clicks to do that. I want to take full advantage of the Mac, so I'm going to use the native macOS SDK.
With just a single target backing my app, I can share almost all my code, and SwiftUI makes my app look great on each platform. I can also easily add device-specific features. For my Mac app, let's add a menu bar extra. Those are the useful little icons in the upper-right corner of your screen, like Wi-Fi and Spotlight. SwiftUI has a new API for this. I just add it to the body of my app. Now let's run this for Mac.
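A sketch of what that addition can look like (ContentView is a placeholder for the app's main view):

```swift
import SwiftUI
#if os(macOS)
import AppKit
#endif

@main
struct FoodTruckApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()   // placeholder for the app's main view
        }

        #if os(macOS)
        // A macOS-only menu bar extra scene, added alongside the main window.
        MenuBarExtra("Food Truck", systemImage: "box.truck") {
            Text("Next stop: Apple Park")
            Divider()
            Button("Quit") { NSApplication.shared.terminate(nil) }
        }
        #endif
    }
}
```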
Our Donut Champion view looks great on Mac right out of the box. And here's that menu bar extra. That'll be handy. And that's a quick look at Swift Charts and just a few of the enhancements coming to SwiftUI and Xcode. And now back to Josh. Josh: We're continuing to expand our adoption of SwiftUI across our apps and system interfaces. For example, iOS's new Lock Screen widgets were designed from the ground up using SwiftUI. The new Font Book app was completely rewritten with it. And the modern, forward-looking design of the new macOS System Settings app was built using it. Swift and SwiftUI were designed from the start to provide a single, native language and API for all Apple platforms. You can learn them once and apply them everywhere. Whether your vision is to provide quick access to information at a glance on Apple Watch, productivity tools on MacBook Pro and iPad, new experiences on iPhone, or a new way to relax with Apple TV, Swift, SwiftUI, and Xcode provide a next-generation integrated development platform to help you build apps for all of our products. Now, if you have an existing app, it's easy to adopt these new technologies incrementally. And if you're new to our platforms or if you're starting a brand-new app, the best way to build an app is with Swift and SwiftUI. Now, of course that's just the beginning. We're also continuing to evolve the user experience of our platforms to give you more ways to engage your users. And to tell you more, here's Sebastien.
Sebastien Marineau-Mes: Now, apps are about turning ideas, code, and APIs into user experiences. And the best apps are the ones that can meet users where they are in the moment. We've created ways to help you take user experience beyond your apps, and build it into the system experience on Apple devices.
This journey started with extensions, integration with the Share Sheet, and custom keyboards. And more recently, it's included the ability to have your app display key information on the Home Screen using widgets. Now, this year, there are a number of new ways for your app to integrate with the system experience across our platforms, and it really starts with the Lock Screen, which gets its biggest update ever. It re-imagines how the Lock Screen looks and how it works, and it gives your ideas and your apps another place to engage users. And to tell you more, here's Robert.
Robert Dhaene: In reimagining the Lock Screen, we set out to make it even more personal and beautiful, while improving everyday utility. As part of this, we knew we needed to bring the power of widgets to the all-new Lock Screen. Widgets have been an incredible way to elevate key information from your app and display it where people can view it at a glance. They make it easy to access rich, timely information right from the Home Screen.
The Lock Screen is the first thing you see every time you pick up your iPhone, and it's always been a place to check the date and time and look out for key messages. When thinking about the best format for displaying even more information here, we didn't have to look far for design inspiration.
Complications on Apple Watch already provide glanceable, relevant, and up-to-date information, presented beautifully right when users need it.
The design language naturally extends to iOS and feels right at home on the new Lock Screen. So using WidgetKit, we brought some of those same designs to widgets on the Lock Screen, including Circular, which displays a small image, gauge, or a few characters of text. Circular widgets are great for displaying whether you've been active enough today or if you need to go out for a run. Rectangular provides a large canvas for displaying things like the upcoming weather forecast. Inline provides a powerful way to convey information with a tiny amount of text and SF Symbols above the clock on iPhone, next to a system-supplied date string, such as Monday the 6th. And by the way, all of these widgets work on both iOS and watchOS because starting in watchOS 9, complications are also powered by WidgetKit. For the first time, you can use the same code to generate glanceable data on both platforms. WidgetKit manages platform differences for you automatically, using the appropriate system fonts by default, and tinting the widgets on the Lock Screen for maximum legibility. To show you how to use WidgetKit to create widgets for the Lock Screen on iPhone and complications on Apple Watch using the same code, I'll hand it over to Michael.
Michael Kent: Building widgets on the iPhone Lock Screen and complications on Apple Watch is really easy with WidgetKit. If you've made Home Screen widgets, you're already most of the way there, including how your data and timeline are updated. In our Food Truck app, we already have a systemSmall widget that users can add to their Home Screen to see how many orders they've filled out of their quota today. This kind of information would be great to show on the Lock Screen or in a complication on the watch face. Let's start by building out the Circular family.
We'll first declare support for it in our Supported Families array.
You'll notice that we're using some platform conditionals here. This is because we want this widget to continue supporting macOS and iOS with systemSmall, but that family isn't available on watchOS.
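A sketch of that configuration, with invented provider and view names standing in for the demo's:

```swift
import WidgetKit
import SwiftUI

struct OrdersWidget: Widget {
    #if os(watchOS)
    // watchOS complications use only the accessory families.
    let families: [WidgetFamily] = [.accessoryCircular, .accessoryRectangular, .accessoryInline]
    #else
    // Keep the existing Home Screen size and add the Lock Screen families.
    let families: [WidgetFamily] = [.systemSmall, .accessoryCircular, .accessoryRectangular, .accessoryInline]
    #endif

    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "OrdersWidget", provider: OrdersProvider()) { entry in
            OrdersWidgetView(entry: entry)   // OrdersProvider and OrdersWidgetView are placeholders
        }
        .configurationDisplayName("Orders Filled")
        .supportedFamilies(families)
    }
}
```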
Then, we'll add a case to define its view.
Let's use a gauge that shows the current number of orders from 0 to the daily quota so the users can quickly see their progress at a glance.
We'll display the current order count as text in the center, along with a donut symbol. There. Let's take a look at this in Xcode Previews now.
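That view might look roughly like this (an SF Symbol stands in for the donut artwork):

```swift
import SwiftUI
import WidgetKit

struct OrdersCircularView: View {
    let ordersFilled: Int
    let dailyQuota: Int

    var body: some View {
        Gauge(value: Double(ordersFilled), in: 0...Double(dailyQuota)) {
            Image(systemName: "circle.hexagongrid.fill")   // stand-in for a donut symbol
        } currentValueLabel: {
            Text("\(ordersFilled)")
        }
        .gaugeStyle(.accessoryCircular)
    }
}
```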
Awesome! That feels right at home on the Lock Screen.
In order to show a bit more detail at a glance, we can also add support for the rectangular family.
For this view, we'll make a VStack, starting with a title for the data that's shown, and that same donut symbol.
By using the Headline style for the font, we'll get a treatment that looks great on both platforms. And we'll make sure it pops with the widgetAccentable modifier. Since the rectangular family gives us a bit more space, we'll show a cool custom segmented gauge and display the current number of orders out of the daily quota for the gauge's label.
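In sketch form, with a simple linear capacity gauge standing in for the custom segmented one:

```swift
import SwiftUI
import WidgetKit

struct OrdersRectangularView: View {
    let ordersFilled: Int
    let dailyQuota: Int

    var body: some View {
        VStack(alignment: .leading) {
            Label("Orders Filled", systemImage: "circle.hexagongrid.fill")
                .font(.headline)
                .widgetAccentable()   // stays prominent when the system tints the widget
            Gauge(value: Double(ordersFilled), in: 0...Double(dailyQuota)) {
                Text("\(ordersFilled) of \(dailyQuota)")
            }
            .gaugeStyle(.accessoryLinearCapacity)
        }
    }
}
```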
Looking back at the canvas, we can see the rectangular widget in previews as well. I really love that gauge. Now let's take a look at how this widget would appear as a circular complication on a watch face.
Well, all right, everything is there, but for complications, we also need to consider the full color rendering mode, which is the default in Xcode Previews. Let's do that by adding a tint to each of our gauges.
And a foreground color to the rectangular headline.
For a bit of a pop in full color, we can check the rendering mode with an environment property to replace the donut symbol with a donut emoji on both the circular and rectangular views.
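A sketch of that rendering-mode check, with the same stand-in symbol:

```swift
import SwiftUI
import WidgetKit

struct DonutSymbol: View {
    // .fullColor on watch faces, .accented or .vibrant when the system tints the widget.
    @Environment(\.widgetRenderingMode) private var renderingMode

    var body: some View {
        if renderingMode == .fullColor {
            Text("🍩")                                    // full color: use the emoji
        } else {
            Image(systemName: "circle.hexagongrid.fill")  // tinted modes: use a symbol
        }
    }
}
```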
That looks really great! With the new variants UI in previews, we can change the color we're previewing with no code at all.
Or even look at several at once.
And since we used default spacing, system font styles, and adapted to the rendering mode, the same views look right at home on both the Lock Screen and watch face. It's that easy to make a widget on the brand-new Lock Screen in iOS 16 and a great complication in watchOS 9, all with the same code. But this isn't the only way we've brought the power of WidgetKit to the Lock Screen. Here's Matt to tell you more.
Matt Shepherd: With WidgetKit, you can give people access to glanceable information. But what about those moments when they need live updates, information tied to an activity, or an event that they care about right now? For that, we are working on something new we call Live Activities. Live Activities makes it easier to stay on top of things that are happening in real time, right from the Lock Screen. Things like the latest score from a game, the progress of a ride share, or a workout, right on the Lock Screen, and always up-to-date. Just like with widgets, you create Live Activities with WidgetKit. The difference is, you update your Live Activity's presentation and state in real time. Since they're built with SwiftUI, you can even animate your updates from one state to the next. These updates make sure your Live Activity has the most current information when the user chooses to glance at it. Live Activities will be available starting in an update to iOS 16 later this year. So those are the updates to the all-new Lock Screen. We think they're gonna be a great way to help you give people more information at a glance in the moments they need it most.
Next, let's talk about a brand-new way to enhance collaborative experiences. To tell you more, here's Pierre. Pierre de Fillipis: Collaboration is an important part of what people do on iOS, iPadOS, and macOS. And that's due in large part to the incredible wealth of apps that many of you have built, to support teams who are collaborating across any distance. There's collaboration for work, like a product road map in Airtable, and there's also collaboration for play, like finding your dream home in Redfin. Whether it's for work or play, collaboration often starts with a conversation. And with the new Messages Collaboration API, you can bring your app's existing collaboration experiences into Messages and FaceTime. When users share a link to content in your app, the API makes it easy for you to mark that link as collaborative, enabling a seamless experience. We provide the identifiers you need so you can give access to the recipients immediately when they tap the link to join. And of course, this works without compromising privacy. Messages identities and app identities remain private and are not shared. And the best part is, you can do this with existing technologies your app is most likely already using. With one object, your users can initiate collaboration in two convenient ways that they're already familiar with. One, the Share Sheet, which has been updated to put collaboration front and center, and two, drag & drop, where you can share content you want to collaborate on by dragging it directly into the Messages conversation. And once the conversation is started, you can even post notices about content updates right to the Messages conversation. With a couple lines of code, your users can get back to collaborating in your app with a single tap in Messages. And with the collaboration popover, your users can get back to the conversation in Messages or FaceTime right from your app. So with the Messages Collaboration API, your app is truly woven into the fabric of Messages and FaceTime. We take care of giving your users powerful communication tools so you can focus on the powerful collaboration tools you deliver in your app. So this is going to level up collaboration on iOS, iPadOS, and macOS, creating a consistent experience that's deeply rooted in the connection between the people collaborating, whether for work or play. Next is Ari, who's going to tell you about a new framework called App Intents. Ari Weinstein: I'm excited to tell you about the App Intents framework, which makes your app's features available to the system, so people can use them automatically through Siri and Shortcuts. People love using Shortcuts with their apps. They let them get things done so fast, just by asking Siri, or by quickly tapping a shortcut on the Home Screen. And it's amazing to see how people remix app capabilities into totally new pieces of functionality with custom shortcuts. Today, people have to add shortcuts manually before they can use them at all. We're making this automatic in iOS 16 with the new App Intents framework.
App Intents works together with Shortcuts to form App Shortcuts, which people can use with Siri right away, without having to set anything up first, like, "Hey, Siri, clean the kitchen with Roomba." But it's not just Siri. App shortcuts give your users a front-row seat to your app's features throughout the system, like in Spotlight, where anytime people search for your app, your shortcuts show up, too, and your shortcuts will be suggested right below app suggestions without needing to adopt any additional APIs, like donations. Your shortcuts also appear immediately in the Shortcuts app, where people can run them with a tap.
App Intents is the next step for the SiriKit Intents framework that we introduced in iOS 10. If you adopt Intents to integrate with Widgets or domains like media or messaging, you should keep using the SiriKit Intents framework, but for developers who build custom intents for Siri and Shortcuts, you should go ahead and upgrade to App Intents. You can easily upgrade to App Intents in Xcode by pushing the Convert button in your intent definition file. Xcode will generate the equivalent App Intents source code, and then you fill in the blanks with your intent handling code. The App Intents framework is really easy to develop for because it's designed from the ground up for Swift, and it requires much less code. The Swift code that you write is the only source of truth. There are no separate intent definition files or code generation to keep in sync. And the code is easy to add to your project. You don't need to rearchitect your codebase. Even if you have Objective-C code, you can use it with App Intents by wrapping it with Swift. An app intent represents something people can do inside of your app, and it makes it possible to do it from outside of your app. You can define an intent and add an app shortcut in just a few lines of code. Let's give it a try together. Back in the Food Truck app, I have this great chart view that lets me see the top five best-selling donuts over a given period of time, like today or this week. I want to expose this to Siri and Shortcuts so people can pull it up super quickly, so first, in Xcode, I'll go to a new Swift file. I'll import the App Intents framework and SwiftUI.
Then I define the intent by creating a struct that conforms to the AppIntent protocol. I'll give it a title. And I'll add a parameter for which time frame of trends to look at. This uses the time frame enum that's already defined in my codebase. I need to extend it to conform to the AppEnum protocol so that we can extract human-readable names for each enum case, like "today" and "this week." Next, on the intent, I'll implement the perform method. Here, I return a result that includes the SwiftUI chart view. I could also include a dialog or an output value. I want people to be able to use this intent automatically, without setup, so I'll define an app shortcut.
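In sketch form (type and view names are invented to stand in for the demo's), the enum and intent conformances look like this:

```swift
import AppIntents
import SwiftUI

enum TimeFrame: String, AppEnum {
    case today, week

    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Time Frame")
    static var caseDisplayRepresentations: [TimeFrame: DisplayRepresentation] = [
        .today: DisplayRepresentation(title: "Today"),
        .week: DisplayRepresentation(title: "This Week")
    ]
}

struct ShowTrendsIntent: AppIntent {
    static var title: LocalizedStringResource = "Show Top Donuts"

    @Parameter(title: "Time Frame")
    var timeFrame: TimeFrame

    func perform() async throws -> some IntentResult & ShowsSnippetView {
        // TopDonutsChart is a placeholder for the app's SwiftUI chart view.
        .result(view: TopDonutsChart(timeFrame: timeFrame))
    }
}
```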
This includes the phrase that people can say to Siri to use this intent. The phrase has to include the app name as a variable, and I've included the time frame parameter so people can say "Food truck trends for today," or "Food truck trends for this week." The last thing I need to do is make this discoverable for my users. People need to see the phrase at some point, so they know what to say to Siri, so I'm going to switch to the file for the Top 5 Donuts view that we were looking at a second ago, and I'll add a Siri tip.
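A sketch of the shortcut provider and the tip, reusing the hypothetical intent above:

```swift
import AppIntents
import SwiftUI

struct FoodTruckShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: ShowTrendsIntent(),
            // The app name is required in the phrase; the parameter lets Siri
            // offer a variant per time frame ("for today", "for this week").
            phrases: ["\(.applicationName) trends for \(\.$timeFrame)"]
        )
    }
}

// And in the Top 5 Donuts view, a tip that teaches users the phrase:
// SiriTipView(intent: ShowTrendsIntent())
```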
Now I can build and run the app and hop over to my phone. Let's give it a try.
I can see the shortcut now appears in the Shortcuts app, with variants for each parameter value, and I can run one just by tapping on it. And I can run them from Siri just by saying the phrase, "Food truck trends for today." Or I could say, "food truck trends for this week." When people are in the Top Five view of my app, they'll see this tip we added at the bottom, so they know what to say to Siri to ask for this feature. Lastly, people can quickly access these in Spotlight when they search for the app, like this.
It's super useful. App Intents will make it easier than ever before to make your app's functionality available throughout the system experience on all of these platforms. Next, Ricky will tell us about some big updates to authentication technologies.
Ricky Mondello: For as long as we can remember, we've been creating and using passwords. But passwords have serious issues, like phishing, reuse across accounts, and website leaks. The good news is that together we can solve these issues. And we can do this today with Passkeys. Passkeys will streamline your authentication flows and address the top security issues with passwords. Passkeys were designed to be incredibly easy to use. The interface uses familiar AutoFill-style UI and Face ID and Touch ID for biometric verification. These elements create a seamless transition away from passwords, while delivering a profound increase in security. Let's check out Passkeys in action.
When setting up an account with a passkey, I don't need to create a password. I'll type a user name and save the passkey to my iCloud Keychain. This will securely sync this passkey to all of my other Apple devices. And if I sign out, signing back in is a breeze. Just Face ID, and I'm in. Because passkeys are built on open industry standards that platforms are adopting, I can use the passkey I just created on my iPhone to sign into the Food Truck website on my friend's PC. On the website, I'll type my username, submit, and choose the option to sign in using a phone, scan the QR code, let the iPhone and PC securely connect, and I'm signed in. In Safari on my Mac, it's even easier to sign in. My passkey is already here, thanks to iCloud Keychain, and I can sign in directly from the website's username field. It's easy to integrate passkeys into existing sign-in flows. For example, this website's username field lets me sign in with a passkey or a password. If I type a username for a password based account, I can quickly sign in. With passkeys, the device does the hard work, and it's secure every time. When creating a passkey, the device generates a unique key that is specific to the website or app it was created for and protects it behind biometrics. It's impossible to have a weak passkey. It can't be forgotten, reused, or guessed. Passkeys are based on public key cryptography, which makes credential leaks from servers a thing of the past. Instead of storing salted, hashed passwords, that can leak and be cracked, your server keeps only a public key. Public keys are designed to be truly public, and not at all valuable to hackers. This significantly reduces your risk as a website owner. With passkeys– and this point is huge– credential phishing as we know it today is gone, eliminating the number one security vulnerability that users face. Passkeys are intrinsically linked to the website or app they were set up for, so users can never be tricked into using their passkey on the wrong website. And unlike passwords, it's not possible to type or copy a passkey into a convincing fake website, or even give anything away to someone looking over your shoulder. When you put it all together, what we're talking about here is a new era of account security. Bringing passkeys to your app and website takes only a few steps. First, you'll teach your account backend to store public keys and issue authentication challenges. Then, on your website and in your app, you'll offer passkeys to users and adopt API to create a new passkey and sign in with it. Passkeys are based on the Web Authentication, or WebAuthn, standard, which has been a collaborative effort across the industry from both platform vendors and service owners. The standard itself is mature and well documented, and passkeys fit it like a glove. All of this is ready for you to build on right now. Next generation security, a seamless user experience, and a design that works beautifully alongside passwords during the transition. Back to you, Sebastien. Sebastien: You've just seen a few of the newest ways that you can integrate your apps with the system experience on all of our platforms. And beyond those integration points, there are a ton of new APIs and frameworks across all of our platforms that open up even more possibilities for you and your apps this year.
And I'd like to walk you through a few before diving into some others in more detail. Let's start with iPadOS. With iPadOS 16, you'll be able to make the most powerful iPad apps yet, with a consistent, desktop-like experience. There's a seamless find-and-replace experience for UI text views that your apps get automatically, as well as updates to the navigation bar, toolbars, and the document menu that make it easy for your users to manage documents and customize their experience. To enable even more powerful applications of iPad with connected hardware, DriverKit comes to iPad, helping to unlock the incredible power of the M1 chip. It's the same API that's available on Mac today, enabling you to easily deliver support for your USB, audio, and PCI devices to an even larger audience.
Now, watchOS is creating new opportunities for apps through deeper integration with system services. The CallKit framework in watchOS 9 includes a new Voice over IP background mode that lets apps make voice calls directly from Apple Watch, with the same familiar user experience as FaceTime audio and phone calls. And Bluetooth-connected medical devices get more robust connectivity and data delivery, allowing for timely alerts when a critical condition is detected.
Now, tvOS 16 gives you new ways to create connected experiences between your apps on Apple TV and iPhone, iPad, or Apple Watch apps on nearby devices. So a workout could use motion data from Apple Watch, or you could use iPhone or iPad as a custom controller for your turn-based games. And tvOS manages device discovery and connection for you, so your app doesn't even need to be running on the other device. In fact, if your app isn't installed, the user is automatically prompted to download it right from the App Store.
Now, for iPhone and iPad, there are new tools for advertisers. We know that effective advertising is important to a lot of your businesses, which is why we created SKAdNetwork. It's an API that helps ad networks and advertisers measure the performance of campaigns without tracking users, and we've been pleased to see many third-party ad networks adopt it. Now, we've heard feedback from ad networks and developers, and this year, we made a number of improvements to SKAdNetwork that reflect some of the biggest requests and give you dramatically more flexibility, all without compromising user privacy.
Now, on iPhone and iPad, there are new cool features that use AR and LiDAR scanning with ScanKit and RoomPlan. These APIs let your apps create rich 3D parametric room models in USD and USDZ formats. So you can create a variety of workflows and experiences, from architecture and design, to retail and hospitality, and the models include furniture classification for categories such as sofas, cabinets, TVs, and yes, even kitchen sinks. Now, last year, we introduced Focus for iPhone, iPad, Mac, and Apple Watch, and with it, ways for your app to manage notifications based on a user's Focus. And this year, Focus goes further with Focus filters. They're built on top of App Intents, and Focus filters let you adjust the content of your app based on the user's current focus. So, for example, an app could create a Focus filter to only show work accounts when the user is in their Work Focus. And those examples are really just scratching the surface. Across the board at every level, there are new tools and APIs with the power that you need to take your apps further than ever and to create entirely new apps and experiences. So next, let's go a little deeper starting with Metal, a technology that's really taking things to the next level. And to tell you more, let's go to Sarah. ♪ ♪ Sarah Clawson: Metal is the powerful graphics and compute API that helps you create amazing games and pro apps for Apple platforms. Metal makes it easy to take advantage of the groundbreaking Apple GPUs and unified memory system now spanning the latest iPhone, iPad, and Mac lineups shipping with Apple silicon. And this year, we're introducing Metal 3, with powerful new features that help you render immersive graphics with even higher frame rates and enable new levels of computational performance. For instance, you'll get huge performance gains for the machine learning framework, PyTorch, which now uses the new Metal backend to enable ML training with the GPU. And the biggest focus area is on gaming, starting with game loading, a key element to the gaming experience that can affect launch time and loading new levels. Modern games deliver a rich gaming experience by providing high-quality assets, and loading these assets quickly from storage to the GPU can be challenging. Often, games will hide asset loading behind a loading screen, and one technique to launch gameplay faster is to load and draw a lower quality version until the high-quality visuals are available. This is not an ideal gaming experience since the user sees lower quality graphics for longer. Metal 3 introduces fast resource loading with the Metal IO API that takes advantage of the Apple GPU's unified memory architecture to minimize loading overhead and ensures that the high-speed SSD storage that ships with every Apple silicon Mac has enough requests in its queues to maximize throughput. This new API provides faster and more consistent performance so that more time is spent drawing at the ideal quality.
In addition to moving resources from storage to memory, game loading is also about shader compilation. Shaders always need to be compiled for the user's unique hardware configuration, and with the wide variety of PC hardware permutations, this usually has to be done at runtime. This in-game compilation can affect the gamer's experience causing dropped frames, slower frame rates, and longer loading. In contrast, Apple silicon and Metal 3 are designed together to support all Apple devices. And now, with offline shader compilation, you can generate GPU shader binaries at project build time, enabling you to eliminate in-game shader compilation to reduce load times and improve rendering performance. Another important aspect to gaming is providing rich, detailed assets, and one way to increase the visual fidelity of your game's graphics is by generating much more sophisticated geometric meshes. Traditionally this is done with a compute pass that'll evaluate the surface and generate geometry to be used in a later render pass. The challenge is that this can introduce latency and take an unpredictable amount of memory. Metal 3 introduces a new Mesh Shading API, which gives you precise control over an optimized geometry processing pipeline from a single render pass. The Object shader decides how many meshes to generate, and the Mesh shader generates the actual geometry to be sent directly to the rasterizer, avoiding a trip to device memory and increasing performance. Gamers also want to see these stunning visuals at the highest possible frame rate, but rendering advanced graphics at ultra-high resolutions can cost precious milliseconds. MetalFX upscaling helps you render immersive graphics in less time per frame. Here's how it works. Previously, you would render your full frame at native resolution, but the GPU render time might not hit the target frame time. Now, you can render the same complex scene at a lower resolution to meet the target frame times, and use MetalFX framework to perform temporal antialiasing and upscaling to the target resolution. With Apple silicon and Metal 3's optimized features, gaming has never looked so good on the Mac. And developers agree. Leading game studios have plans to bring their titles to the Mac, like Grid Legends, taking advantage of Apple silicon to help you reach maximum speeds. Or Resident Evil Village, using features like MetalFX upscaling to deliver hauntingly beautiful scenes at the highest resolution. And No Man's Sky, taking advantage of Metal 3 to explore rich, expansive worlds on both Mac and iPad. Metal 3 is incredible, with features to boost the performance of your apps and provide an amazing gaming experience. Now to tell us more about the direction MapKit is headed, here's Kathy. Kathy Lin: Whether you're navigating to a favorite restaurant, planning that next vacation, or just checking where your favorite food truck is parked on a map, we rely on our devices more than ever to help us explore the world around us. MapKit is the best way to help users discover and navigate the world with rich and flexible mapping and location services, powered by Apple Maps and available to developers for free. With MapKit, you can display map or satellite imagery in your app, find and call out points of interest, add annotations and overlays, get directions, and more. MapKit is powered by our all-new map, built from the ground up by Apple. It offers improved detail and accuracy, and can bring useful mapping and location services to your app. 
With iOS 16, we're building on this map to introduce our biggest update ever for MapKit, starting with making the 3D City Experience available to all developers. Users of your apps will be able to see incredible details, like 3D elevation, turn lanes, crosswalks and bike lanes, and amazing handcrafted 3D landmarks like the Golden Gate Bridge, or the Ferry Building. The additional detail of the map allows you to provide context and precision that was never before possible. You can, for example, show that a point of interest is between the crosswalk and where the bike lane starts. No other digital map lets you do that, and we've made it incredibly easy to implement. To show you more, let's create an experience that makes it easy for a user to find where their favorite food truck is parked using the details of the new map. Map views like this one will automatically get the 3D City Experience where it's available. Just select iOS 16 as the deployment target. Next, I can utilize the extraordinary detail of the map to illustrate the exact location of the food truck. MapKit has powerful controls that allow us to position the camera in 3D space to create a precise view of the map. Here, I can choose how far we want to be zoomed in by setting the distance of the center coordinate of the camera to 600 meters. By adjusting the pitch and heading and tilting the camera into 3D, you can see amazing and useful details like turn lanes, crosswalks, and even trees. By default, elevation will be flattened. In order to help users understand the terrain they'll encounter, I can specify a preferredConfiguration with elevationStyle 'realistic' to include 3D elevation.
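A sketch of that camera and elevation setup (the coordinate and numbers are illustrative):

```swift
import MapKit

// A map view configured for the 3D City Experience.
let mapView = MKMapView()
mapView.preferredConfiguration = MKStandardMapConfiguration(elevationStyle: .realistic)

let truckLocation = CLLocationCoordinate2D(latitude: 37.7955, longitude: -122.3937)
mapView.camera = MKMapCamera(
    lookingAtCenter: truckLocation,
    fromDistance: 600,   // meters from the center coordinate
    pitch: 60,           // tilt the camera into 3D
    heading: 0
)
```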
When adding an annotation or a route line sourced from MapKit's Directions API, MapKit automatically handles elevation and will adjust the annotation or route line by placing it on top of the 3D terrain. Animating the camera heading by adding a slow pan really brings the map view to life. When a user switches into Dark Mode, the map will adjust together with the rest of the UI. We're very excited to make this immersive experience available to developers with iOS 16. We're also bringing another popular Apple Maps feature to MapKit, Look Around, which is a great way to explore the world at ground level, with high-resolution 3D photography and smooth animations. Users can simply tap to move down the street.
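A static Look Around preview like the one in the next demo step can be wired up roughly as follows. This is a sketch assuming MapKit's MKLookAroundSceneRequest and MKLookAroundViewController; the foodTruckMapItem parameter is a placeholder for whatever MKMapItem the app already has:

    import MapKit

    func makeLookAroundPreview(for foodTruckMapItem: MKMapItem) async throws -> MKLookAroundViewController? {
        // Ask MapKit whether Look Around imagery exists for this place.
        let request = MKLookAroundSceneRequest(mapItem: foodTruckMapItem)
        guard let scene = try await request.scene else { return nil }  // no coverage here

        // Embed this view controller below the map; it frames the location automatically.
        return MKLookAroundViewController(scene: scene)
    }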
I can add a static Look Around preview just below the map by dropping in a View Controller and specifying a MapItem. The Look Around view automatically frames the location correctly. When a user taps on the preview, I can choose to provide a full-screen Look Around view, where users can see the address and the date the imagery was collected, and tap to move around freely to get a better understanding of their surroundings. There's one more new, highly requested capability we're introducing to MapKit in iOS 16– Apple Maps Server APIs. Maps Server APIs are RESTful and support four of the most used functions of MapKit: Geocode, which turns an address into GPS coordinates; Reverse Geocode, which does the opposite– it turns a lat/long into an address; Search; and Estimated Times of Arrival. Our new Maps Server APIs are a great way to make your own backend services richer and more performant. Of course, MapKit is built from the ground up on the same foundation of privacy as Apple Maps, and does not associate users' data with their identity or keep a history of where they've been. And that's a quick look at what's new with MapKit in iOS 16. Now for the weather, or at least how you can build it into your app, here's Novall. Novall Khan: We announced today that we're bringing the Weather app to iPad and Mac, and introducing powerful new features, including severe weather notifications, rich detail views, and ten days of hourly temperature and precipitation forecasts. And there are all kinds of other experiences across Apple devices and platforms that get better because of the weather data we provide– from asking Siri for today's forecast, to rerouting navigation around flooding. All of that is built on our Apple Weather service. Apple Weather delivers a world-class global weather forecast using high-resolution meteorological models combined with machine learning and prediction algorithms. Apple Weather provides current weather, 10-day hourly forecasts, daily forecasts, and historical weather so you can evaluate trends in data. Severe weather alerts and minute-by-minute precipitation are also available for select countries around the world. Forecasts feature 10 days of hour-by-hour temperature, precipitation, and UV index data, and much more.
And all of this data is available to you through WeatherKit. WeatherKit is a native Swift API for all Apple platforms, and a REST API you can use from anywhere. These APIs deliver accurate, hyperlocal weather forecasts, to help your users stay safe, informed, and prepared. Let me show you how easy it is to get weather information through WeatherKit's great Swift API in a quick demo. Let's revisit our Food Truck app. To make sure my customers don't get caught in the rain, my app is set up to recommend a parking spot with clear skies. Let me show you how I can get the weather. Here I have a list of safe parking spots. I've already added the WeatherKit capability in Xcode, and all it takes is a few lines of code. With Swift Concurrency, requesting weather is simple. We call weather(for:) on WeatherService, and pass in a location. Then I can get the relevant data I need for my app, like condition, precipitation, and cloud cover.
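Here is a minimal sketch of that request, assuming a made-up ParkingSpot type and illustrative clear-skies thresholds; only the WeatherService call itself comes from the demo:

    import CoreLocation
    import WeatherKit

    struct ParkingSpot {
        let name: String
        let location: CLLocation
    }

    func hasClearSkies(at spot: ParkingSpot) async throws -> Bool {
        // One call returns current, hourly, and daily data for the location.
        let weather = try await WeatherService.shared.weather(for: spot.location)

        let current = weather.currentWeather
        let nextHour = weather.hourlyForecast.forecast.first

        // Recommend the spot only if it's mostly clear now and rain is unlikely soon.
        return current.cloudCover < 0.3 && (nextHour?.precipitationChance ?? 0) < 0.2
    }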
Now that I have the data I need for each of my parking spots, when I run my app, my custom view has updated to recommend a location with clear skies.
There are all kinds of ways you can use weather data to make the experiences in your apps better. You might use weather forecasts to help you with inventory, predicting that your ice-cream-filled donuts are going to be a popular order on a hot day, so you should stock up on ice cream. That's all it takes to get the weather for our food truck, and there's so much more to uncover with WeatherKit. In keeping with Apple's commitment to privacy, location is used only to provide weather forecasts, is not associated with any personally identifying information, and is never shared or sold. Privacy is a shared responsibility, and through WeatherKit, you can get accurate weather data while protecting user privacy. Because we want to make it easy for you to get started with WeatherKit, we're including 500,000 weather(for:) API calls per month in your Apple Developer Program membership. Those of you who need more will be able to purchase additional tiers of service right in the Developer app, starting this fall. So that's WeatherKit: accurate, hyperlocal weather forecasts powered by the Apple Weather service. We're starting with a beta, and it's available now on all platforms. There are so many creative ways that you can use WeatherKit in your apps. And now here's Ryan to give us some perspective on what your apps can see with Live Text. Ryan Dixon: Our users are loving Live Text, and we have heard from many of you that you want to bring it to your apps too. So this year, we are expanding VisionKit with two new APIs that will allow you to do just that.
The Live Text API unlocks the ability to analyze image content, allowing users to interact with text and QR codes found in photos and paused video frames, and it provides quick actions so your users are just a tap away from taking action on relevant data. It's great for any app that displays visual media, like Apollo for Reddit or Vimeo. And the Data Scanner API unlocks the ability to analyze a live camera feed. It dramatically simplifies text and barcode ingestion for users. All you need to do is add any overlays or custom controls that tailor the live camera experience to the needs of your app. This is especially useful for consumer apps that rely on QR codes or enterprise apps built for back-of-warehouse inventory management, pick-and-pack delivery services, and point-of-sale kiosks.
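To give a feel for the Data Scanner API ahead of the demo later in this section, here is a minimal sketch of a QR-only scanner, assuming VisionKit's DataScannerViewController; the coordinator class, symbology choice, and delegate behavior are illustrative:

    import VisionKit

    @MainActor
    final class OrderScannerCoordinator: NSObject, DataScannerViewControllerDelegate {

        func makeScanner() -> DataScannerViewController {
            // Ask for QR codes only; text and other barcode symbologies work the same way.
            let scanner = DataScannerViewController(
                recognizedDataTypes: [.barcode(symbologies: [.qr])],
                qualityLevel: .balanced,
                isGuidanceEnabled: true,
                isHighlightingEnabled: true)
            scanner.delegate = self
            return scanner
        }

        func startScanning(with scanner: DataScannerViewController) throws {
            // Check device support and availability before starting the camera feed.
            guard DataScannerViewController.isSupported, DataScannerViewController.isAvailable else { return }
            try scanner.startScanning()
        }

        // Called when the user taps a highlighted item in the camera feed.
        func dataScanner(_ dataScanner: DataScannerViewController, didTapOn item: RecognizedItem) {
            if case .barcode(let barcode) = item {
                print("Scanned order:", barcode.payloadStringValue ?? "unknown")
            }
        }
    }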
Both the Live Text and Data Scanner APIs support automatic detection of nine languages, including this year's additions of Japanese and Korean.
These VisionKit APIs will bring years of computer vision innovation to your app with just a few lines of code. Here's Jenny to show you how. Jenny Chen: For the demo, we're back in our trusty pop-up Food Truck app. We're doing a promotion: if users post a picture to the social channel in our app holding up a sign with the hashtag #freedonut and their address, we'll drive to their address and deliver a free donut to them. We'll head to our social donut feed. We want to add Live Text to the images so that drivers can extract the text to get the addresses for delivery.
Here's where the new Live Text APIs come in. I can easily add an ImageInteraction on top of my view, and that will add the Live Text button with quick action support to it. While the Live Text button normally sits in the bottom right, I already have a heart button in my app, so I can adjust the placement using custom insets. I can also set the button configuration to customize the style of the button so that it matches my app better. Now that I've added it in, I can tap the Live Text button, select the text, or use quick actions to easily grab the address.
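Here is a minimal sketch of that setup, assuming VisionKit's ImageAnalyzer and ImageAnalysisInteraction classes (the interaction referred to above) plus a hypothetical postImageView owned by the feed cell; the inset value used to dodge the heart button is illustrative:

    import UIKit
    import VisionKit

    @MainActor
    func enableLiveText(on postImageView: UIImageView, image: UIImage) async throws {
        let interaction = ImageAnalysisInteraction()
        postImageView.addInteraction(interaction)

        // Nudge the Live Text button away from the app's own heart button.
        interaction.supplementaryInterfaceContentInsets = UIEdgeInsets(top: 0, left: 0, bottom: 44, right: 0)

        // Analyze the image for text (addresses, hashtags) once, then hand the result
        // to the interaction so users can select text or use quick actions.
        let analyzer = ImageAnalyzer()
        let analysis = try await analyzer.analyze(image, configuration: ImageAnalyzer.Configuration([.text]))
        interaction.analysis = analysis
        interaction.preferredInteractionTypes = .automatic
    }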
I love that users don't have to learn a new interaction model, as it provides the same ease of use as the system Live Text experience. The UI is consistent and familiar. It feels integrated with the OS, but I can still adjust the placement even when I have my own custom UI.
Of course, like any good delivery app, we also want to provide the best service to our customers and make sure we're giving people the right donuts. So we track our donut orders via QR code. Using the new Data Scanner APIs, I can easily add that as the first step of any customer interaction. Right now, that button doesn't do anything, but I can easily instantiate a new DataScanner object that looks for text, QR codes, or barcodes that I can then import into my app. With just a few lines of code, I can bring up the camera, specify that I want QR codes, and startScanning! When the driver taps on the QR code, I want to show that the scan is a success. I'll add in the delegate handler in Xcode...
...and on tap, show an alert to the user that the scan was a success so that I can go ahead and get the donut order started. Now when I run the app, this brings up a view controller with the camera view, and I can see the guidance and the reticle view highlighting the QR code. When I tap on the QR code, I can see my scan was a success and the donut order is confirmed. And with that, my #freedonut delivery is on its way. With VisionKit, the new Live Text and Data Scanner APIs make it easy to bring these powerful vision capabilities to your app. And now, back to Susan. Susan: This is an exciting time to be building apps. Xcode Cloud is now ready to help you build better apps faster. With Swift and SwiftUI, it's easier than ever to transform your ideas into apps that work across Apple platforms. There are cool new ways for your apps to bring your ideas deeper into the system experience. Lock Screen widgets and Live Activities bring your app to the Lock Screen. Messages Collaboration makes it incredibly easy for your users to connect and collaborate. And App Intents help integrate your app with Siri. There are entirely new APIs and major updates to existing APIs, like WeatherKit, MapKit, Live Text, and Metal. And that's not the end of the story. It's another big WWDC this year, with 175 sessions, hundreds of labs, and Digital Lounge activities running all week. We can't wait to connect with you this week, and more importantly, this week is for you. We're eager to see what you create next. Thank you! ♪ ♪