When you create an intent for your app, you can help people accomplish tasks quickly by using it as part of a shortcut or when asking Siri. Learn how to adopt Siri more easily than ever with SiriKit's in-app intent handling, and how to improve Siri performance with existing Intents app extensions. We'll also show you how to leverage features in SiriKit to improve the experience of using your actions, like including images and subtitles for a rich conversational experience. And find out how to fine-tune support for intents in your codebase to make your life as a developer easier.
Hello and thanks for joining us. My name is Roman and I'm a shortcuts engineer at Apple.
Today I'm going to be sharing with you some techniques and some strategies for efficiently implementing Siri and Shortcut support in your applications.
First, we are going to take a tour of the new APIs in SiriKit, and then we are going to learn all about how you can fine-tune custom intents in your app. So let's start with a brief overview. SiriKit encompasses the Intents and Intents UI frameworks, which you use to integrate your services with Siri, Shortcuts, and Maps. And in iOS 14, you can use intents to add configuration and intelligence to widgets. An Intents app extension receives user requests from SiriKit and turns them into app-specific actions, such as sending a message, playing some music, getting current weather conditions, or ordering soup. An Intents UI app extension displays custom UI in the Siri, Shortcuts, or Maps interface after your Intents app extension fulfills a user request. SiriKit defines a large number of system intents that represent common tasks people take. For system intents, Siri defines the conversational flow, while your app provides the data to complete the interaction. If your app lets people perform an everyday task that doesn't fit into any of the SiriKit domains, you can create a custom intent to represent it.
The job of your Intents extension is to provide SiriKit with the handler objects that you use to handle specific intents. You provide these objects from the handler(for:) method of your extension. Let's take a look at the lifecycle of handling an intent. Every time the user interacts with an intent, whether in the resolve, confirm, or handle phases, your intent handler has ten seconds to complete the request. The ten-second timeout starts as soon as the user request initiates a connection to your extension.
When this happens, if your extension is not yet running, it will be launched by the system. The amount of time it takes to launch your extension depends on how long it takes to load all of the frameworks your process links, and how much time it takes to run the +load methods and static initializers included in your process or any frameworks it links. Your extension also needs some time to perform your business logic, so it's important to optimize for launch time by making sure that your extension only links the frameworks that it really needs, because the time spent loading all of the linked frameworks counts toward the ten-second timeout. The amount of time it takes to handle requests also affects how people perceive interactions with your app. Siri interactions are intended to be quick and lightweight, so you should avoid making the user wait while your intent handler is working. Ten seconds may seem like a lot, but in most cases your Intents extension doesn't really need to link all of the frameworks that your app links, so you have the opportunity to minimize the number of symbols that you import. Another noteworthy characteristic of Intents extensions is that they are modular, independently run processes with a lower memory footprint than an app.
However, it's not always possible or convenient to use extensions. In iOS 14, we are introducing in-app intent handling. Now you have an option to add an intent handler to your app, where you can handle SiriKit requests such as resolve, confirm, and handle. Let's talk about a few good use cases that should help you decide when to handle your intent in your app versus your extension.
Starting media playback or starting a workout previously required you to perform resolve and confirm in your Intents extension and then handle in your app. It is more efficient to do this entirely in your app process now.
In another scenario, if handling your intent affects your app's user interface live on screen, it's also a good candidate for in-app intent handling. In-app intent handling also opens up an opportunity for new use cases that weren't possible before due to the memory constraints of extensions, such as photo and video processing. And, well, let's be honest: in some cases your app structure doesn't currently allow you to factor out code into an Intents extension or a shared framework. But of course, be mindful about the launch time of your app, because it will eat into the ten-second timeout we talked about earlier in this session. When you're designing your intent handlers, you should evaluate which intents need to be handled in your Intents extension and which can be moved to your app. Let's see how you can implement support for in-app intent handling in your app. The first thing you need to do is make sure that your app supports multiple windows and has adopted the UIScene lifecycle. When your app is launched in response to a SiriKit request, it will be launched without any UIScene objects connected to your app.
Then you need to list all of the intents that you would like to handle inside of your application in the Supported Intents section of your app's target.
And finally, you need to implement the handler(for intent:) method in your app delegate. This method acts as a dispatcher, mapping incoming intents to the objects capable of handling them. In your implementation, check the type of the intent parameter and return an object capable of handling that type of intent. The object returned must adopt the protocol used to handle that intent.
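As a minimal sketch, the dispatcher might look like the following, assuming a hypothetical code-generated custom intent called ShowDirectionsIntent and a matching handler class (both names are illustrative, not exact code from the session):

```swift
import Intents
import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {

    // Called by the system when a SiriKit request arrives for the app
    // itself (iOS 14). Return an object that adopts the handling protocol
    // for the given intent type, or nil for intents you don't handle in-app.
    func application(_ application: UIApplication,
                     handlerFor intent: INIntent) -> Any? {
        switch intent {
        case is ShowDirectionsIntent:
            // Hypothetical handler adopting ShowDirectionsIntentHandling.
            return ShowDirectionsIntentHandler()
        default:
            return nil
        }
    }
}
```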
For example, when handling an INSendMessageIntent object, return an object that adopts the INSendMessageIntentHandling protocol. If handling an intent updates your application's user interface, or requires users to be looking at something in the app before using the intent, your intent handler may need to make sure that the relevant UI is on screen in the handle phase. You can do this by checking to make sure the app is not in the background.
And if it is, ask the system to open it by responding with the .continueInApp response code. Let's see how to add in-app intent handling to an app in Xcode. I got into cooking at home recently, so I've been working on this app called Recipe Assistant that allows me to browse my favorite recipes.
When I tap on a recipe, I can see all of the ingredients needed to make it. I can tap on the Directions button to view step-by-step directions, and when I tap on the Next Step button, I can advance to the next step. However, it's not always convenient to tap buttons on screen while you're preparing a meal, so I would like users of my app to be able to advance to the next step by using a shortcut that they invoke with their voice using Siri. I'm going to implement in-app intent handling in my app, since my users will be interacting with the content on screen.
Here in Xcode, I have my intent definition file where I defined my custom intent called Show Directions. I've also added the Add to Siri button to my recipe directions view so users can easily set up a shortcut here. This is what it looks like in the app. Now, back in Xcode: I have already adopted multi-window support in my app, so I need to add the Show Directions intent to the list of intents eligible for in-app intent handling. To do that, I'm going to click on the plus button and add my intent. Now, what I want to do is make every view controller in my app able to respond to the "next step" command.
To do that, I'm going to define a common intent handler that accepts an object conforming to the NextStepProviding protocol.
Each view controller that adopts the NextStepProviding protocol will need to return an instance of the intent handler class. It will also need to implement the nextStep function that will take the user to the next step in the app. Let's conform our intent handler to the ShowDirectionsIntentHandling protocol that was code-generated for us and implement the resolve method. In the resolve method for the recipe parameter, we are going to check if we have a recipe. If we don't have a recipe, we'll ask for disambiguation.
Otherwise, we will return success. In the handle method, we are going to tell the next step provider to go to the next step. We need to make sure that our app is in the foreground, and if it's not, we are going to launch the app using the .continueInApp response code. We will need to handle launching with the user activity in our scene delegate, and I will come back to this in a moment. Now I will create a new singleton class that will hold a weak reference to the current intent handler. We are going to assign the current intent handler in viewDidAppear of each view controller in our app. In my app delegate, I need to implement the new handler(for intent:) method and return the current intent handler instance. In my scene delegate object, I need to implement both willConnect(to session:) and continue(userActivity:). continue(userActivity:) is called when we respond with the .continueInApp response code in our handle method; willConnect(to session:) will be invoked when the app does not have a UIScene object connected to it. Now we need to conform each view controller to the NextStepProviding protocol. And finally, we need to make sure to assign the current intent handler in viewDidAppear.
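Putting the handle phase together, a sketch of the foreground check and the .continueInApp fallback might look like this. ShowDirectionsIntent, its code-generated response type, and the NextStepProviding protocol are illustrative names, not exact code from the session:

```swift
import Intents
import UIKit

// App-defined protocol adopted by each directions view controller.
protocol NextStepProviding: AnyObject {
    func nextStep()
}

class ShowDirectionsIntentHandler: NSObject, ShowDirectionsIntentHandling {

    // Weak reference to whichever view controller is currently on screen.
    weak var nextStepProvider: NextStepProviding?

    func handle(intent: ShowDirectionsIntent,
                completion: @escaping (ShowDirectionsIntentResponse) -> Void) {
        // If the app is backgrounded (or no UI is connected), ask the system
        // to open it; the scene delegate's continue(userActivity:) then
        // finishes the job. applicationState must be read on the main thread.
        guard UIApplication.shared.applicationState != .background,
              let provider = nextStepProvider else {
            completion(ShowDirectionsIntentResponse(code: .continueInApp,
                                                    userActivity: nil))
            return
        }
        provider.nextStep() // advance the on-screen directions
        completion(ShowDirectionsIntentResponse(code: .success,
                                                userActivity: nil))
    }
}
```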
Let's give it a try. First I need to add my shortcut to Siri.
Now let's see what the experience looks like. "Hey Siri, next step."
"Which recipe would you like to see: spicy tomato sauce, chickpea curry, or cinnamon apple cake?" "The first one." "OK, viewing. Here are the ingredients for spicy tomato sauce." "Hey Siri, next step." "OK, viewing step one: in a large pot, heat olive oil on medium heat." "Hey Siri, next step." "OK, viewing step two: add minced garlic and sauté for a few seconds until fragrant." Pretty cool, isn't it?
When deciding between handling an intent in an extension versus an app, you should first ask yourself: can this task be accomplished in an extension? Because if so, it's better to do that. Intents extensions can be more lightweight and faster to launch, depending on the number of frameworks and symbols they link, and this is your opportunity to optimize for launch time, since you decide which frameworks your extension links. To summarize what we learned about in-app intent handling today: consider implementing an Intents extension first, and always be mindful of how many frameworks you link in both your app and your extensions. In-app intent handling is only supported for multiple-window apps, so make sure to check out "Introducing Multiple Windows on iPad" and "Architecting Your App for Multiple Windows" from WWDC 2019. We are very excited to see all the great new SiriKit integrations that you will build using in-app intent handling. Now let's take a look at another API enhancement that we made this year. We call it rich disambiguation.
In iOS 13, we introduced parameters for shortcuts, which allow your users to provide intent parameter values at runtime. When resolving parameter values of your intent, you can return a disambiguation resolution result, and the user will be prompted to pick from a list of values. This year, we are introducing the ability for your app to include subtitles and images in those lists.
The API is pretty simple: you just need to provide the subtitle as a String and an image as an INImage for your custom types at runtime. What's really cool is that you can also provide dynamic options with images and subtitles that your users will see when they configure your intent in the Shortcuts app.
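As a sketch, assuming a code-generated custom type Recipe (an INObject subclass) with the iOS 14 initializer that accepts a subtitle and an INImage — the identifiers, asset names, and resolution-result type here are illustrative:

```swift
import Intents

// Hypothetical resolve method for a "recipe" parameter of a custom intent.
func resolveRecipe(for intent: ShowDirectionsIntent,
                   with completion: @escaping (RecipeResolutionResult) -> Void) {
    let candidates = [
        Recipe(identifier: "chickpea-curry",
               display: "Chickpea Curry",
               subtitle: "30 minutes, vegetarian",         // iOS 14: subtitle in the list
               image: INImage(named: "curry-thumbnail")),  // iOS 14: image in the list
        Recipe(identifier: "cinnamon-apple-cake",
               display: "Cinnamon Apple Cake",
               subtitle: "1 hour, dessert",
               image: INImage(named: "cake-thumbnail"))
    ]
    // Prompt the user to pick; Siri and Shortcuts render the title,
    // subtitle, and image for each row.
    completion(.disambiguation(with: candidates))
}
```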
Another addition to disambiguation lists is pagination in Siri. As a developer, you can now provide the number of items that Siri should speak to the user at once, and also provide subsequent introductions spoken by Siri. Note that disambiguation pagination will only be used when the user invokes Siri by saying "Hey Siri". Here is how to support rich disambiguation with Xcode: open your intent definition file where you define your custom intent.
And expand the Siri Dialog section for the parameter that needs customized disambiguation pagination. Here you can simply specify the maximum number of items that can be spoken to the user at once, as well as the subsequent introduction string. You can also override the disambiguation introduction dialog provided by Siri in voice-only mode by specifying your own dialog in the intent editor. Another addition to the SiriKit API this year is dynamic search for dynamic options.
Last year, we introduced the dynamic options API, which allows you to provide a set of values for eligible parameters dynamically when the user is configuring your intent in the Shortcuts app. This year, we are expanding this API to include the search term provided by the user. There is a new checkbox to provide search results as you type. If you check it, Xcode will code-generate a new method that includes the search term. This method will be called repeatedly while the user is typing; if the search term is empty, you can provide the user with a list of default values. Dynamic search should only be adopted for searching large catalogs, not for filtering small static collections, because the Shortcuts app supports filtering by default. The provide-options-collection method's completion handler accepts a new INObjectCollection object. Using this new object, you can now group your dynamic options into sections with titles, and optionally use an indexed collection. In iOS 13, we introduced configurable parameters for shortcuts; each parameter that you specified as user-facing had to be resolved at runtime. Now, in iOS 14, you can mark each parameter as configurable and resolvable separately. Siri and the Shortcuts app will not resolve parameters which are marked as unresolvable.
You don't need to provide Siri dialogue for unresolvable parameters. Today you learned all about in-app intent handling and how you can add support for it in your apps. You can enhance your disambiguation lists and dynamic options with rich disambiguation in iOS 14. Dynamic search gives you a new flexible way of providing dynamic options for your intent parameters.
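A sketch of the search-aware dynamic options method, grouping results with INObjectCollection sections. ShowDirectionsIntent, Recipe, and RecipeStore are illustrative names rather than exact code from the session:

```swift
import Intents

// Hypothetical catalog backing the dynamic options.
// Assumed here for illustration; not part of the Intents framework.
class RecipeStore {
    static let shared = RecipeStore()
    func search(matching term: String) -> [Recipe] { /* query catalog */ [] }
    var favorites: [Recipe] { [] }
}

func provideRecipeOptionsCollection(
    for intent: ShowDirectionsIntent,
    searchTerm: String?,
    with completion: @escaping (INObjectCollection<Recipe>?, Error?) -> Void
) {
    let matches: [Recipe]
    if let term = searchTerm, !term.isEmpty {
        // Called repeatedly as the user types; search the (large) catalog.
        matches = RecipeStore.shared.search(matching: term)
    } else {
        // Empty search term: offer a list of default values.
        matches = RecipeStore.shared.favorites
    }
    // Group options into a titled section for display in the Shortcuts app.
    let section = INObjectSection(title: "Recipes", items: matches)
    completion(INObjectCollection(sections: [section]), nil)
}
```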
When you're designing your intents, decide which parameters should be configurable and resolvable at runtime. Now let's take a look at some useful tips and tricks that could help you take your custom intents to the next level. If one of your custom intents is no longer needed, because you discontinued the corresponding feature in your app or you're replacing the intent with another intent, it is now possible to deprecate custom intents in iOS 14. Using Xcode 12, in the intent editor, select the intent that you would like to deprecate and reveal the inspectors by clicking on the expand-inspectors button in the toolbar. Now all you need to do is check the Deprecated checkbox.
In the Shortcuts app, your existing users will see that this intent's action may no longer be available in future versions of your app. Additionally, this intent's action will be hidden from the Shortcuts action list. Now let's talk about how you can specify your custom intent class names. When you define a custom intent, a custom type, or a custom enum in the intent editor,
you define a type name, not the actual class name. The actual class will be code-generated for you based on the type name. So if your app uses a class prefix, this is not the right place to specify it for your custom intents, types, and enums. Instead, provide your desired class name in the Custom Class inspector of your intent. Alternatively, you can specify a common class prefix for all custom intents, types, and enums in the project document inspector of the target where the code generation needs to happen. This allows you to have different class prefixes in different targets if you need.
Some custom intents that you define in your intent definition file may require an Intents UI view for the confirm and handle phases, while others don't.
Let's see how you can efficiently manage that. When you add your intent definition file to an Intents UI extension target, Xcode automatically lists all of the intents from that intent definition file as supported by this Intents UI extension. However, for some of your custom intents, you might not want to display any UI at all.
You can easily achieve this by creating a separate intent definition file for your non-UI intents. Then, in the target membership inspector of that intent definition file, simply don't include it in your Intents UI extension target.
Sometimes you might want to explicitly choose the code-generation language for your custom intents, and you can easily do this in your project's build settings.
By default, Xcode will automatically decide the intent code-generation language for each of the targets eligible for code generation, based on the existing source files included in the target, but here you have an option to customize this behavior. So we've gone over a lot today. We've gone over some of the major enhancements in the SiriKit API this year.
We've also gone over some of the best practices that you can keep in mind when you're designing your custom intents. Thank you so much for watching.