Increase Usage of Your App With Proactive Suggestions
iOS and macOS can proactively promote your app and data, thereby increasing user engagement. See how adopting a few simple APIs to inform the OS about your app's capabilities can lead to your app being suggested in various places like the app switcher, the lock screen, Contacts, and more.
Thank you for coming out
and spending your Friday afternoon with us.
Hopefully you had a great week at WWDC.
And I'm excited for our talk now because I think we're going
to discuss something that all of you want.
Fundamentally, everybody is here
because you love building great apps,
and you probably want those great apps to have more users.
Now, the good news is, we want to figure out opportunities
to offer your app at the right time to our customers,
so that you get more engagement.
And we're going to spend the entirety of this talk discussing
the different ways that you can do that.
We promote apps in various forms,
throughout the operating system today in many different places.
Siri app suggestions are available to our customers
when they swipe left from home,
as well as when they invoke search.
Even after entering a query, you may get results
from inside other third party apps.
Handoff is another way we advertise third party apps,
your apps, as well.
As you start an activity on one device and move to the next,
we try to anticipate what you're about to do and offer
that app's icon in the bottom left hand corner of the screen.
In certain situations, where we think what you really want
to do is play a form of media, we'll promote your app's content
to the lock screen of the device.
So say after you plug in headphones or Bluetooth pair
to your car or your speakers, if there's an app
that you usually use in that circumstance,
the system will learn that, and then offer not only that app,
but that app's content.
And we're going to talk today
about how you can build a great experience for that environment.
As you're using the operating system,
say reading a news article, you can create reminders
that are contextual to the content you're staring at.
So for example, if you're reading an article
and you decide that you want to take another look at it at home
or you want to read it at home because you're in the middle
of watching a WWDC presentation, you can invoke Siri.
And Siri will understand what you mean when you say,
"Remind me about this."
It will have the context of what's currently on screen.
And this will work in any app that adopts some of the APIs
that we're going to discuss later today.
This is particularly exciting to you because this means
that when customers open the Reminders app,
they'll be deep-linked right back into your content.
New in iOS 10 is the promotion
of locations throughout the system
when we think the customer has a particular intent
to go somewhere, or to use a location
that they've recently been looking at.
We'll be talking later today
about how you can get your app's data flowing through the system.
Multitasking as well is another interface
where we will promote locations we think the customer is
interested in.
You can get your app, your icon, suggested right there
at the prime real estate in the Multitasking UI.
New in iOS 10 as well is our promotion
of your app's contact information
within the native contact app.
So you notice here, there's a handle that's been found for this contact.
And we'll talk later today
about how you can get your app's content in that interface.
So, to learn when to promote you, the operating system needs
to learn about your data and your app.
Now, the good news is that a few simple APIs will give you much
of the value that I just described and more.
In addition to having your app promoted at various times,
you'll also get deeper Siri integration like I described
with the contextual reminders.
So what we're going to do today, is we're going to talk
about these APIs and we're going to walk you
through proper adoption of them so that all of you end
up getting more engagement from users due
to promotion throughout the OS.
First, we're going to talk about NSUserActivity and schema.org.
NSUserActivity is, in a sense, the eyes
of the operating system.
It helps us understand what the customer is currently staring
at on screen.
And schema.org is somewhat of an equivalent for the web.
Next, we're going to talk about some new enhancements
and new APIs around apps that handle locations.
So if you're an app like Yelp that's a directory
that offers many different locations, or conversely,
if you're an app like Uber that consumes locations,
this will be very relevant to you.
Then we'll talk about building a great experience
around media apps suggestions.
I showed you earlier an example of the type of content promotion
that will happen, say after a customer plugs
in their headphones, and we'll talk
about how you can get your content front
and center in that interface.
And then, we'll recap
all the different things we've hopefully learned today.
To kick things off with NSUserActivity and schema.org,
I'd love to invite Sofiane onto the stage.
Good afternoon everyone.
My name is Sofiane.
And I'm really excited to be here today to talk to you
about some of the features that we've been up to lately.
Now, we just heard Daniel talk about great ways
to promote the content in your apps, throughout the system.
Now let's talk about some of the APIs
that you can adopt to make this happen.
Throughout this presentation, we'll talk about features.
Some that you may be already familiar with, like Handoff
or Spotlight Search, and some new features we're introducing
in iOS 10, like location suggestions for instance.
Now all of the features that you're looking at here,
have something in common.
And it's a single API called NSUserActivity.
First, some background.
NSUserActivity was introduced in iOS 8 to support Handoff,
which is the amazing feature that allows you
to start an activity on a device and pick it up right
where you left off on another.
Last year in iOS 9, we added support
to promote content directly
from NSUserActivity inside Spotlight Search results.
Now, this year in iOS 10, we've further enhanced NSUserActivity
so it can capture locations viewed inside your app
and promote them in many places throughout the system.
Even inside other apps.
NSUserActivity now also provides context to Siri,
so Siri can now help you get directions, or make a call
to the place you're looking at inside an app.
Also new in iOS 10,
NSUserActivity enables your communication app
to be promoted straight from a contact card,
as an alternate communication method.
Now, I know what you guys are thinking.
"That's cool," hopefully.
"But how does this work?"
Well, let me tell you, it's really straightforward
and we have a lot of great content to talk about.
So let's dive right in.
We're going to talk about NSUserActivity and schema.org.
These APIs allow you
to seamlessly integrate with the system.
NSUserActivity for native apps, and schema.org for the web.
We'll focus on the NSUserActivity first.
NSUserActivity is a lightweight interface
to capture application state as users move through your app,
in a way that can be restored later.
So for instance, here we're looking at the Yelp app
which is a local search app, and as the user browses the app,
we created an NSUserActivity capturing the information
that we need to recreate the state later.
So for instance, when I trigger a search for a restaurant,
we follow the same pattern of creating activities.
And again, when I actually view a result from that list.
Now, in this particular case, we're looking at a location.
In iOS 10, NSUserActivity is now more aware
about certain concepts such as locations
or communication interactions.
We'll talk about this in a minute, but first I want
to take a closer look at this screen.
So here we create an activity and describe it with the information
needed to recreate the state later, as well as some metadata --
for instance, the location's name or address.
Then we inform the system
that this represents the current user state,
and we decide whether it will be advertised for Handoff,
made available temporarily throughout the system
for location suggestions,
or added to the Spotlight on-device index, so that it appears
in Spotlight Search results.
Now there are a few related sessions on this topic,
specifically on adoption of NSUserActivities
for Handoff and App Search.
Really recommend that you guys go check them out.
They provide a lot of great content.
Alright, now let's dig into the code, shall we?
So, I'm going to show you how you create these activities.
So here we're creating an instance of NSUserActivity.
And providing it with an activity type.
That's the string that you provide.
It's the same string that you specify in your Info.plist.
Now we recommend using a reverse-DNS style notation
here to keep the strings unique.
Next, our activity needs a title, and we're going
to use a title that describes the content we're looking at.
So here we're looking at a restaurant.
And something to keep in mind is that this is the public face
of your user activities.
This is how, for instance, they're represented
in Spotlight Search results.
So you want to make sure you use a title
that is descriptive and meaningful.
Next, we're enabling Handoff, App Search, and public indexing,
since the content we're viewing here is public.
And then we're setting a dictionary
on the userInfo property, capturing the information
that we need to recreate the state
when the activity is restored.
Now in that case, we use the unique ID
of that location we were working with, and typically,
when the activity is being restored, for instance,
when it's handed off to another device, you can fetch back
that ID from the server and restore the location
as the user would expect.
Now to fully describe your activity
and for richer search results,
you can use this class we introduced
in iOS 9 called CSSearchableItemAttributeSet.
It provides a common language between your app and the system
to describe your content better.
Now, if you have a website that mirrors the content
that you're looking at inside your app,
you can provide the web URL so that if your activity is being,
for instance, handed off to a device
where your app is not installed,
it will be appropriately launched
in Safari instead, following that link.
And finally, we call becomeCurrent on the activity,
indicating to the system
that this represents the current user state.
This is what the user is seeing on their screen.
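The steps just described can be sketched in Swift roughly like this. The activity type, title, identifier, phone number, and URL below are all placeholder values for illustration, not the ones from the talk:

```swift
import UIKit
import MobileCoreServices
import CoreSpotlight

func prepareActivity(for restaurantID: String) -> NSUserActivity {
    // The type string should also be declared under NSUserActivityTypes
    // in the app's Info.plist; reverse-DNS notation keeps it unique.
    let activity = NSUserActivity(activityType: "com.example.myapp.view-restaurant")

    // The title is the public face of the activity,
    // e.g. in Spotlight Search results, so keep it descriptive.
    activity.title = "Caroline's Pizza"

    // Opt into Handoff, on-device search, and public indexing.
    activity.isEligibleForHandoff = true
    activity.isEligibleForSearch = true
    activity.isEligibleForPublicIndexing = true

    // Just enough state to recreate this screen later.
    activity.userInfo = ["restaurantID": restaurantID]

    // Optionally, describe the content more richly for search results.
    let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeItem as String)
    attributes.phoneNumbers = ["(415) 555-0100"]
    activity.contentAttributeSet = attributes

    // Fallback: if the receiving device doesn't have the app,
    // Handoff opens this URL in Safari instead.
    activity.webpageURL = URL(string: "https://example.com/restaurants/\(restaurantID)")

    // Tell the system this is what the user is looking at right now.
    activity.becomeCurrent()
    return activity
}
```
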
Now, what happens when your app is actually being launched?
When your activity is being restored, for instance?
Or handed off to another device?
Your application is launched, which is good,
and a UIApplicationDelegate method is called:
application:continueUserActivity:restorationHandler:.
The first thing we need to do here is check the activity type
and make sure that it matches one that we expect.
In this case, it matches the one
that we were just working with earlier.
And then we can use the userInfo dictionary
to start restoring the state.
Earlier, we had put the unique ID of that location.
So typically here, we would be able to fetch it back
from the server and then restore it back
and display the right controller as the user would expect it.
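A minimal sketch of that restoration path, assuming the same placeholder activity type and a hypothetical showRestaurant(withID:) helper in your app delegate:

```swift
func application(_ application: UIApplication,
                 continue userActivity: NSUserActivity,
                 restorationHandler: @escaping ([Any]?) -> Void) -> Bool {
    // Only handle activity types we declared and expect.
    guard userActivity.activityType == "com.example.myapp.view-restaurant",
          let restaurantID = userActivity.userInfo?["restaurantID"] as? String else {
        return false
    }
    // Fetch the restaurant by its unique ID (e.g. from the server)
    // and present the matching view controller.
    showRestaurant(withID: restaurantID)
    return true
}
```
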
So with minimal code here, we were able to benefit
from three great features; Handoff, Spotlight Search,
and contextual Siri reminders.
Now, we'll talk about some of the other features listed below.
But before that, let me ask you this.
Have you ever been in a situation where you're hungry
and you're trying to find a good restaurant to eat?
So you start in Yelp and you find the perfect place.
Then you want to text your friends
so that they can meet you there.
So you switch to messages, start writing,
"Hey guys, I found this place."
But then you realize you don't actually have the address
of that place, so you switch back to Yelp,
you copy the address, then you switch back to messages.
You paste it in there, and then you're done finally.
Except you're not, because at some point,
you're also going to need to get there.
So you're going to switch to Maps,
and there's probably more back and forth in there.
And you get the idea.
It's not a great experience.
So in iOS 10, we're trying
to make this experience much easier.
You still start at a same point.
You're looking at a location,
say a restaurant, inside your app.
And by adopting NSUserActivity, your app can promote
that same location in many different places throughout the system.
When you switch away from the app,
the multitasking UI makes it very easy
and offers this nice proactive suggestion at the bottom,
offering to get directions using your favorite routing app
to the place you were just looking at.
If you go to Messages, no more back and forth.
You can start writing something like, "Let's meet at--
" and the QuickType UI will automatically promote
that location directly into QuickType from the content
that you're providing.
Same thing with Maps.
Maps promotes these locations directly in the Maps app.
Not only in the Maps app,
but also in the new Today View maps widget,
as well as in CarPlay which is my favorite feature actually.
Look at your app, get in your car, tap on your screen,
that's it, you're done.
You can be on your way.
These locations can even be promoted inside other apps.
So what we're looking at here is a location provided by Yelp
through NSUserActivity displayed inside the Uber app,
as the user engages with a destination text field.
Again, it also provides context to Siri,
so you can now be looking at your app and ask Siri
to get you directions or make a call
to the place you're looking at.
So all of these features with the adoption of NSUserActivity.
And all of these features have something in common:
they all indicate
where the suggestion is originally coming from,
in that case from Yelp.
Now imagine if this were your app.
Wouldn't that be great?
Well, this is possible now.
NSUserActivity now allows you to do so by capturing locations
that are viewed inside your apps.
We're introducing new simple APIs in both MapKit
and Core Spotlight, allowing you to do that, and integrate
with all of these places that I just talked about.
In terms of code, we're going to create an NSUserActivity
and we're going to reuse the same activity
that we had defined earlier, and just add on top of it.
We don't have to create separate activities.
So, for apps using MapKit, it's as easy
as setting your MKMapItem to this new mapItem property,
defined on NSUserActivity.
And that's pretty much it.
This also has the great side effect
of populating the contentAttributeSet
for you automatically.
So if you want to adopt App Search, all you have to do is
specify that you want to opt into App Search,
and these locations will also be indexed.
Now there's another way, which is by adopting App Search directly,
and for those of you who are already familiar with that,
you can use CSSearchableItemAttributeSet
to describe the location.
Now, let's break this down a little bit.
First thing we're doing here is setting the name
of the location, which matches what we're looking at.
And that's required for all
of the UIs we just talked about earlier.
Next, we're specifying the text representation of the address,
and that's required for all of the text based UIs,
such as the keyboard, for instance.
Next, we're specifying the latitude and longitudes.
That's optional, but it ensures
that your users are accurately routed
to the place you intend to route them to.
Latitude and longitude is the most precise representation
that you can use if you have this information.
Next, we're setting the phone number, and that allows us
to get access to Siri so that we can say,
"Call this place," for instance.
And finally, we're indicating
that this content supports both navigation and phone call.
This way, the results will be presented
in the Spotlight Search results UI,
with the right quick action icons to make a phone call
or get directions, for instance.
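Those location attributes might be set up like this. The name, address, coordinates, and phone number are placeholder values, and the activity type is hypothetical; MapKit apps can instead simply assign their MKMapItem to the activity's mapItem property and get these attributes populated for free:

```swift
import CoreSpotlight
import MobileCoreServices

let activity = NSUserActivity(activityType: "com.example.myapp.view-restaurant")

let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeItem as String)
attributes.namedLocation = "Caroline's Pizza"                        // required for all location UIs
attributes.fullyFormattedAddress = "123 Main St, San Francisco, CA"  // required for text-based UIs like QuickType
attributes.latitude = 37.7775                                        // optional, but the most precise routing info
attributes.longitude = -122.4163
attributes.phoneNumbers = ["(415) 555-0100"]                         // lets Siri offer "call this place"
attributes.supportsNavigation = true                                 // quick-action icons in Spotlight results
attributes.supportsPhoneCall = true
activity.contentAttributeSet = attributes
```
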
That's how you get location suggestions.
It's that easy.
Now, your app is going to be promoted in all
of these different places.
We think this is going
to provide a much richer experience, both inside
but also outside your app.
Now, your brand can be further promoted throughout the system,
not only within the confines of your app.
Let's move on to contact interactions.
iOS 10 deeply integrates communication apps directly
into the contact card.
So what we're looking at here is a contact card
where the WhatsApp handle was learned and displayed there.
When I go ahead and tap on the Quick Action button at the top
to send a message, I'm asked to select what method
of contact I'd like to use.
And if I do so, this choice can even be remembered and persisted
as a default means of communication.
So your app can be promoted there as a default channel
for messaging for that specific contact.
That's also supported by NSUserActivity together
with the new Intents framework,
which is the same framework you use
to get deep Siri integration.
There were two great talks about this topic earlier this week.
I really recommend that you guys go check them out.
So here we're looking at the WhatsApp app,
sending a text message, and we're going
to walk through how this works.
So the first thing we do is create an INInteraction object,
which is a new object we're defining in the Intents API.
This contains information about the intent,
which represents the user action --
so for instance, sending a text message or making a video call --
as well as some other information about the recipient,
the sender, and some metadata, including things
like whether the communication was successful, for instance.
Next, you call the donate method on your interaction.
This way the system will learn about this interaction
and promote your app directly inside a contact card.
Now, when users interact with your app
through the contact card, we follow the same pattern.
Create an INInteraction representing what the user is
trying to do.
And wrap it up inside an NSUserActivity
which your app is then launched with.
Now let's see how this translates into code.
The first thing here,
we're creating the sender and recipients.
And something to note is that if your app has access
to the user's address book,
you can specify which contact identifier you're working with.
This way the system will be able to better figure
out the right contact to be augmented.
Next, we're creating intent.
And here we're using the INSendMessageIntent
since we're sending a text message.
Now there are three communication intents available
to you for sending text messages,
making audio, and video calls.
Now it's important to note that you want
to also make sure you specify what intents your app is capable
of handling in your Info.plist --
that's pretty much the same thing as with the activity types
that we talked about earlier.
Next, we'll provide the service name we're using.
And here we're using WhatsApp.
And this is particularly useful if your app deals
with multiple protocols.
So for instance, if you have multi-client chat app,
which deals with both Jabber and ICQ for instance,
you'll want to make sure you use the right service name here.
Next, we create an INInteraction object which captures all
of the information we created before.
And some metadata including things
like directionality of the communication.
Now since here we're sending a text message,
we're using the outgoing direction,
and that's the only supported direction
for donations of interactions.
And finally, we donate this interaction to the system.
This way, the system will be able
to augment the contacts you were communicating with.
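Put together, the donation might look like the following sketch. The recipient details and the "MyChatService" service name are placeholder values for illustration:

```swift
import Intents

// Describe who we're messaging. Passing a contactIdentifier here
// (if the app has Contacts access) helps the system match the card.
let recipient = INPerson(personHandle: INPersonHandle(value: "+14155550100", type: .phoneNumber),
                         nameComponents: nil,
                         displayName: "Jane Appleseed",
                         image: nil,
                         contactIdentifier: nil,
                         customIdentifier: nil)

let intent = INSendMessageIntent(recipients: [recipient],
                                 content: "Let's meet at Caroline's Pizza",
                                 groupName: nil,
                                 serviceName: "MyChatService",  // matters for multi-protocol apps
                                 sender: nil)

let interaction = INInteraction(intent: intent, response: nil)
interaction.direction = .outgoing  // the only direction supported for donations

// Donate so the system can promote the app on the contact card.
interaction.donate { error in
    if let error = error {
        print("Donation failed: \(error)")
    }
}
```
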
Now again, like I said, when users interact with your app
from a contact card, we follow the same pattern
and use the same UI application delegate method
to launch your app.
The first thing we're doing here is checking the intent
and interaction from the NSUserActivity,
and then we can start communication
with the intended recipients.
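That launch path might be handled roughly like this, assuming a hypothetical startConversation(with:) helper in your app:

```swift
func application(_ application: UIApplication,
                 continue userActivity: NSUserActivity,
                 restorationHandler: @escaping ([Any]?) -> Void) -> Bool {
    // The interaction the user triggered from the contact card, if any,
    // arrives wrapped inside the NSUserActivity.
    guard let interaction = userActivity.interaction,
          let messageIntent = interaction.intent as? INSendMessageIntent else {
        return false
    }
    // Start a conversation with the intended recipients.
    startConversation(with: messageIntent.recipients)
    return true
}
```
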
And that's how you get contact interactions.
Now, your app can be promoted straight from a contact card,
which is the most natural way of communicating.
So we did it.
We checked everything in that list.
But we're not done.
We have some more to talk about.
So let's talk about some best practices to keep
in mind while working with NSUserActivity.
The first thing is to be lazy.
Okay, now we're talking.
So, imagine you're working in a Mail client, and you're trying
to adopt Handoff so that your users are able
to start writing an email on their Mac and pick it up right
where they left off on their iPhone, when they're
on the go for instance.
So here what we're doing is we have a method
that is called every single time a key is pressed
on the keyboard.
And every single time, we recompute
and update the userInfo dictionary.
Now that can be costly and inefficient.
So instead, what you should do is use the needsSave property,
indicating to the system that this activity is dirty.
And this way, the system will be able to call you
at an opportune time to update your userInfo dictionary.
For instance, right before the activity is being handed off
to another device.
Now here, this is what we're doing:
we're implementing the updateUserActivityState
method, and this is our opportunity
to update the userInfo dictionary.
This way we can do it only once
and we're way more efficient like that.
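The lazy pattern just described might look like this in a view controller, with draftBody standing in for whatever state your app tracks:

```swift
// Called on every keystroke; instead of rebuilding userInfo each
// time, just mark the current activity as dirty.
func textViewDidChange(_ textView: UITextView) {
    userActivity?.needsSave = true
}

// The system calls this at an opportune moment, e.g. right before
// the activity is handed off to another device.
override func updateUserActivityState(_ activity: NSUserActivity) {
    activity.addUserInfoEntries(from: ["draftBody": draftBody])
    super.updateUserActivityState(activity)
}
```
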
Next, make sure you keep a strong reference
to the current activity.
So what we're doing here is creating an NSUserActivity,
doing a bunch of things to it, and calling becomeCurrent on it.
So that's good.
Except after that, we're not keeping a strong reference
and exiting the function.
And therefore, the activity is released
because it goes out of scope.
A released activity cannot be current
because it no longer exists.
So instead what you should do is keep a strong reference
to the current activity.
If you're dealing with UIViewControllers or anything
that inherits from UIResponder,
you can also use the userActivity property
defined on UIResponder.
And this has the great effect
of having UIKit manage the activity's currentness for you,
so you don't have to call becomeCurrent or invalidate it.
UIKit does it for you as your view controllers,
for instance, are linked to the view hierarchy.
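As a sketch, with a hypothetical view controller and placeholder activity type:

```swift
class RestaurantViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let activity = NSUserActivity(activityType: "com.example.myapp.view-restaurant")
        activity.title = "Caroline's Pizza"
        // Assigning to UIResponder's userActivity property keeps a
        // strong reference, and UIKit calls becomeCurrent()/invalidate()
        // as the view controller joins and leaves the view hierarchy.
        userActivity = activity
    }
}
```
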
Next, transfer a small payload.
So make sure that you keep only just enough information
in your userinfo dictionary,
to recreate the states, but not much more.
And remember that these userInfo dictionaries and activities are sent
over the air, via Bluetooth, so that's why they need to be kept
as lightweight as possible.
So for instance, imagine you're working on an app that deals
with large photos downloaded from the internet.
So what we're doing here is downloading these photos,
serializing them as data, and putting them
in the userinfo dictionary,
which could be very easily several megabytes,
which is not efficient.
Instead, what you can do is keep a unique identifier
for the content you're working with -- for instance, the web URL --
this way when you're restoring the activity,
you can maybe fetch it back from a server
or obtain it through other means.
Now if you really have to deal with large payloads,
you can use continuation streams,
and these are specifically designed for that.
There's a ton of information on this
in the developer documentation.
Okay, one more.
Make sure you keep your activity types unique and descriptive.
And that's to avoid collisions.
But it's also important
that they represent the current user activity,
or the current user actions.
So instead of having a single activity type here,
that I reuse everywhere inside my app,
it's good to have multiple activity types describing the
different parts of my app.
So for instance, one when I'm viewing a location,
one when I'm searching for locations,
and what you'll notice here as well is
that I'm using a reverse DNS style notation which makes sure
that there's no collision with other activity types
that might have been defined inside other apps.
Alright, time for a demo now.
So we're going to use our Proactive Toolbox app
which is a sample we're making available to you
on the Developer Library.
And we'll see how we can adopt NSUserActivity
to promote locations.
We'll also briefly touch on Handoff as well as App Search.
Let's get writing.
Alright, so I'm going to launch the Proactive Toolbox app here
and I have my device running iOS 10 on the left,
and Xcode here right behind.
So I'm going to give you a quick tour of the app first
to get an idea of what it does.
So here we have a list of pizza places.
This is an app that searches for pizza locally,
and uses core location to get my current location as well
as MapKit to fetch places that match the pizza criteria.
So I'm going to look at the first one here,
and I get a richer page with more details about it,
like the name, phone number, address, website,
as well as a map of this place.
Now we're going to work on how we can go ahead
and implement location suggestions
so that this same location is promoted
in many different places.
But before this, let's do that.
I'm going to time myself.
How about that?
See how long it takes to do so.
Now, I'll start the timer and let's go.
Alright, so I'm going to go in Xcode now.
And this is my location view controller, which is the controller
that I use to display the richer results page
for the pizza place.
And this is the method
that I call every time a map item is updated inside my app.
So I'm going to go ahead and drag and drop this right there.
And this is where I'm creating in NSUserActivity.
So as you can see, I am creating an activity, providing it
with an activity type that meaningfully represents what
we're looking at.
In this case, I'm also using a reverse-DNS style notation
which makes sure that my activity types are unique.
And I already defined this activity type
in my info plist earlier.
Next, we're setting the title and the keywords.
And again, here I'm using a meaningful title
because that's what's going
to be the public face of my user activity.
Next, I'm setting Handoff, App Search, and Public Indexing,
because the content we're looking at is public here.
And the more important part here is this,
which is where I'm setting my mapItem inside my activity.
Now again, this has the side effect
of populating the contentAttributeSets for us.
So that's what I'm doing right there.
I'm just completing that by indicating
that this content supports navigation and phone call.
This way it will have the right nice icons
in the Spotlight Search results.
Okay now, remember about being lazy?
That's what I'm doing here.
So, I'm setting needsSave to true and setting the delegate.
This way, the userActivity delegate will be called
when I need to update my userInfo dictionary.
And finally, I'm using the userActivity property defined
by UIResponder on my UIViewController,
and setting the activity to it
so that I keep a strong reference
and that UIKit manages it automatically for me.
Next, I'm going to implement this
userActivity delegate method.
And this is where I'm going to update my userInfo dictionary
with the content that I define in this method below,
which returns the dictionary of the information,
the relevant information,
that I need to recreate this state later.
Okay, let's go ahead, build and run.
Okay, so we're back at the same place here.
Okay, I didn't break anything for a change.
So I'm going to go ahead and double-tap Home,
and now as you can see, we're seeing this nice banner
at the bottom to get directions to the place
that I was just looking at.
It's very easy.
Same thing when I go to Maps.
I see that same location directly here,
even showing my application icon.
Alright, let's zoom back out.
Same thing when I go to Messages, and say something
like "Let's meet at-- ."
You see that I have the same location right there being
suggested directly inside the QuickType UI provided by my app,
so I can just tap on it and both the name of the place
and the address get inserted.
But, it even works with incoming messages.
So here I'm getting a text message asking me what their
phone number is, and it's pulling the information right
from my NSUserActivity so that I can just tap on it
and insert the phone number as well.
That's really cool.
Now let's go check on timing.
Three minutes, forty.
So that's not bad.
All of these location suggestions features,
provided by my app with a single API.
Well, I cheated a little bit.
I had some sample codes but you get the idea.
It's not a tremendous investment.
So messages, QuickType, Siri, Maps, multitasking,
all of that, a single API.
Alright, so we had worked on App Search as well,
indexing this content.
So I'm going to go back to the app first
and restore it in its initial state.
Then I'm going to go ahead and search for pizza.
Pizza. And sure enough, I see that place right there.
But when I go ahead and tap on it, my app is launched,
but my app doesn't do the right thing.
It doesn't restore the context back
to the user as they would expect.
It doesn't show the rich UI of the pizza place.
I'm going to go back to my code here,
and this is because I did not implement the restoration handler.
So I'm going to go to my app delegate, drag and drop
that part, and what I'm doing here is,
I'm implementing the continue userActivity restorationHandler
UIApplicationDelegate, checking the activity type first.
This is the one that we were just working with.
And then calling this method that I defined right below,
which looks at the userInfo and restores the state
as the user would expect.
I'm going to build and run again,
and see that one more time.
So, I'll go back to Spotlight, tap on this result,
and now we're doing the right thing and restoring the content
as the user would expect.
So that's a quick tour of how you can adopt NSUserActivity
for location suggestions, App Search, and Handoff.
Alright, let's switch back.
So again, this sample is available to you
on the Developer Library.
It actually does a couple more things over there.
And it helps you test your integration with NSUserActivity.
So it's a great tool to keep by your side while you're working
on the location suggestions feature.
Okay, so we just saw how easy it was to adopt NSUserActivity,
and promote your content in many different parts of the system.
And we'll take a look at how schema.org can provide some
of these benefits, specifically for location suggestions.
Now, when we designed these location suggestion features,
we wanted them to work just as well
with websites you view in Safari.
And so for this, for instance, when you're looking at yelp.com
which mirrors the same content we were looking at earlier
in the Yelp app, when you switch away from Safari,
you get the same, nice, handy suggestions
to get directions to this place.
And that's because yelp.com adopts schema.org
which is the technology we use for this.
A little bit of background:
schema.org is an open web markup standard that allows you
to semantically annotate your content
with rich, structured metadata.
So there are many schemas providing a bunch
of representations for various concepts.
And schema.org is also intended
to provide a rich search experience to your users,
both inside iOS, but also with all major search engines.
So these schemas are all organized
in a tree-like structure.
So for instance, more specific schemas
like Restaurant inherit
from more generic ones like LocalBusiness.
Let's take a look at restaurant as an example.
So, it inherits from multiple schemas and therefore
inherits all of the properties
that each of them defines.
And for instance, the name property which is defined
in the Thing schema would capture things like the name
of the restaurant in that case.
Same with the address which is defined
in the Local Business schema.
And things that are more specific to restaurants,
such as whether it accepts reservations are defined
directly inside the Restaurant schema.
So here's an example of schema.org in action.
On the left, we're seeing the same website we were looking
at earlier, and on the right, content that semantically
describes it in a way that can be understood by Safari,
capturing things like the phone number,
the name of the restaurant, address,
rating, website, and more.
It can even do more.
Let's look at a simple example.
Here is a very simple HTML document,
describing content about a restaurant.
Now, let's look at what this page could look like,
with the addition of JSON-LD schema.org markups.
As you can see, we haven't changed anything
to the actual content
or the actual structure of the document.
We've just provided metadata alongside,
describing the content in a more machine friendly format.
It includes simple properties like the telephone,
but also more complex structured properties, like the address,
which is itself another schema of type PostalAddress.
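As a rough sketch of what such a JSON-LD block could look like (every value here is invented, purely for illustration):

```html
<!-- Hypothetical restaurant details, for illustration only -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Restaurant",
  "name": "The Example Bistro",
  "telephone": "+1-415-555-0123",
  "acceptsReservations": "True",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main Street",
    "addressLocality": "San Francisco",
    "addressRegion": "CA"
  }
}
</script>
```

Note how the address property is itself a nested PostalAddress schema, while simple properties like telephone are plain strings.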
If you prefer, you can also annotate your content
in line with microdata.
So that's what we're doing here.
We're changing the actual structure
of our initial document and augmenting it
with inline microdata markups.
So Safari in iOS 10 extracts these location-related schemas,
and promotes them much
like a native app would promote an NSUserActivity.
That also gives you some of these benefits that we talked
about earlier for location suggestions.
So these are some of the schemas that Safari extracts.
Anything with a PostalAddress, GeoCoordinates,
or a telephone property, and the Restaurant schema for instance
that we were working with earlier, is a perfect example.
You have all of these properties and you can combine
or specify the ones that you have available at the time.
Alright, so we talked about NSUserActivity and schema.org.
Use NSUserActivity, to promote locations
that users view inside your app in many places,
throughout the system, effortlessly.
Also for communication apps.
And also of course for all cases of Handoff and App Search.
Use schema.org for your website, to get some of these benefits
for location suggestions, much like you would get with NSUserActivity.
Alright, now let's move
on to how your apps can actually consume these location
suggestions like we just saw earlier.
So here we're going to talk
about two different ways to do that.
One through the keyboard, and one specific to routing apps.
Let's start with Number 1.
So, if your app deals with addresses in text format,
you can benefit from location suggestions inside your app
through the QuickType UI.
Now here we're looking at the Uber app,
which is a ride-sharing app for those of you who don't know.
And as the user enters a text field, which is annotated
as a location text field, we're promoting content in that place
from Yelp, directly inside Uber.
Now this includes locations that have been recently promoted
in NSUserActivities by other apps, or schema.org.
Upcoming locations based on your calendar.
Locations that you may have copied in your pasteboard.
So if you receive a text message with an address, for instance.
And even, recent places that you may have interacted
with using Siri.
So for instance, if you ask Siri
to show you restaurants nearby,
and then interact with a result, you can expect to see it
in there as part of location suggestions.
So, we're introducing a new API in UIKit.
So your app can inform the system what content type your
text fields are expecting.
And in this case, we were working with locations.
Now based on this hint, the keyboard will be able
to make the right proactive suggestions,
if any are available at that time in context.
But if there's no proactive suggestions available,
it still provides a much richer experience
in terms of auto correction.
Because autocorrect is now able
to know what content your app is expecting.
So, we added a new property
in UITextInputTraits called text content type,
which allows you to specify this.
So here, we're working with a UITextField which conforms
to UITextInputTraits, and therefore, gets this property.
And we're indicating that this text field expects a full street address.
There's a bunch of text content types that you can work with,
ranging from describing people to locations
and a bunch of others.
There's a full list that's available to you
in the Developer Library.
But let's take a closer look at two in particular.
So these two are both related to locations,
but they have different levels of granularity.
So you want to use the one
that represents your use case the best
with the right level of granularity.
So for instance, a navigation app would typically expect a
full street address so that it can accurately route the user
to their destination, whereas a weather app may only typically
care about the state and city
because that might just be enough for a weather use case.
And as you can see here, based on the different content type
that we specified, we're getting different content promoted
inside the QuickType bar.
In one case, the full street address, and in the other case,
just San Francisco, California.
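A minimal sketch of those two hints in Swift (the field names here are invented for illustration):

```swift
import UIKit

// Hint the semantic type of each field so the QuickType bar can
// surface proactive location suggestions at the right granularity.

// e.g. in a navigation app, which needs the exact destination:
let destinationField = UITextField()
destinationField.textContentType = .fullStreetAddress

// e.g. in a weather app, where city and state are enough:
let cityField = UITextField()
cityField.textContentType = .addressCityAndState
```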
And that's how easy it is for you
to get suggestions inside the keyboard.
Now let's look at how your routing apps can benefit
from these nice direction banners at the bottom.
Like Daniel said earlier,
third party routing apps can also be elevated here,
accelerating users directly into your apps
when we think there is an intent for the user to get directions.
So we're leveraging an existing API called MKDirectionsRequest
which was introduced in iOS 6,
and lets your routing app register as such
and then handle directions requests.
iOS will learn over time what a user's preferred routing app is
based on various factors, such as engagement for instance.
And then suggest it
in the Multitasking UI when appropriate.
So, to do so, you want to make sure you configure your app
to receive directions requests.
And that's easily done inside Xcode
through the capabilities panel.
In iOS 10, we're introducing a new routing mode
which is specifically for ride sharing apps.
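For instance, the declaration that the capabilities panel manages for you ends up in your Info.plist looking roughly like this (the set of modes shown is just an example):

```xml
<!-- Info.plist fragment: the directions modes your app supports -->
<key>MKDirectionsApplicationSupportedModes</key>
<array>
    <string>MKDirectionsModeCar</string>
    <!-- The new ride sharing mode introduced in iOS 10 -->
    <string>MKDirectionsModeRideShare</string>
</array>
```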
Next, you want to declare the map regions
where your app is actually relevant.
So if your app is a local metro app for instance,
you can specify that your app is only relevant in this area.
And finally, you want to make sure
that you take the appropriate action when your app is launched.
That is, your app should automatically start directions,
or populate the UI in a way that makes it easy
for the user to get directions.
In terms of code, this is an example
of how you can adopt MKDirectionsRequest.
MKDirectionsRequest uses a URL scheme.
So when your app is launched, it's launched with a URL.
First thing we're doing is checking
that the URL we're getting is actually a directions request URL.
MKDirectionsRequest offers the isDirectionsRequestURL:
class method for that.
Next, we're instantiating an instance of MKDirectionsRequest
from the contents of that URL which will contain information
about the origin and the destination
that the user is trying to go to.
And next and new in iOS 10, something really important,
especially for those of you
who already adopt MKDirectionsRequest,
is that you can now be launched with map items
that don't have geo-coordinates.
And in this case, you'd want to geocode
that address dictionary you were getting, using CLGeocoder's
geocodeAddressDictionary:, which will give you back a placemark
with the right latitude and longitude,
which will help you then start directions
to the intended location.
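Putting those steps together, here's a sketch in Swift 3-era syntax; startDirections(to:) stands in for whatever your app actually does to begin routing:

```swift
import UIKit
import MapKit
import CoreLocation

func application(_ app: UIApplication, open url: URL,
                 options: [UIApplicationOpenURLOptionsKey: Any] = [:]) -> Bool {
    // 1. Make sure the launch URL is actually a directions request.
    guard MKDirectionsRequest.isDirectionsRequest(url) else { return false }

    // 2. Rebuild the request; it carries the origin and destination.
    let request = MKDirectionsRequest(contentsOf: url)
    guard let destination = request.destination else { return false }

    if destination.placemark.location != nil {
        // Coordinates are present: start routing right away.
        startDirections(to: destination)  // hypothetical helper
    } else if let address = destination.placemark.addressDictionary {
        // New in iOS 10: map items may arrive without coordinates,
        // so geocode the address dictionary first.
        CLGeocoder().geocodeAddressDictionary(address) { placemarks, _ in
            guard let location = placemarks?.first?.location else { return }
            let placemark = MKPlacemark(coordinate: location.coordinate,
                                        addressDictionary: nil)
            startDirections(to: MKMapItem(placemark: placemark))
        }
    }
    return true
}
```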
And that's pretty much it.
That's how your routing app can be promoted right there
in this prime real estate of the Multitasking UI.
Now with that, I'd like to hand it back to Daniel to talk
about media app suggestions.
That was awesome.
Alright. Now, let's talk about media app suggestions.
So, if you're an app that handles any form of media
like a podcast app or a Spotify-like app, or even an app
that plays video, you'll want to stay tuned for this.
iOS today, promotes the app that we think you're likely
to use based on your behavior.
And we promote that app in a bunch of different UIs
that I showed you earlier.
We in particular, offer those suggestions in Spotlight
and in the Today View.
If the suggestion follows a particular trigger,
like for example when you plug in headphones or Bluetooth pair,
or even arriving at a certain location, and it's a media app,
we may further elevate that.
So, let's see what this looks like.
This is what the promotion in Spotlight looks like.
If say, I always or frequently listen to podcasts
after plugging in headphones.
Notice, in the upper left hand side,
the Podcasts app as a suggestion.
In certain situations though, the promotion
of that will be elevated to the Lock Screen itself.
And so what you're seeing is, the UI we traditionally use
for Handoff, is now being used to promote an app
that we think I'm likely to use because it followed one
of those triggers I mentioned.
Now, this is a pretty good experience, but it's not as good
as it could be because at the end of the day, what you have
to do now is you have to unlock your device and swipe
in the right direction in order to get to the content
that we're predicting, we think you want to consume.
With a pretty simple API,
you can build a far better experience for your users.
This is what it looks like before.
This is what I just showed you.
And this is what it looks like after.
Far, far better.
Not only can the customer engage on the content without having
to unlock their device, but you also get your album art front
and center, in front of the user.
Now, this might be quite obvious in hindsight,
but this particular interface offers also far better
conversion if you look at the actual numbers.
And so, if you're interested
in getting your users actually playing your content,
which presumably you are, this is an API for you.
So how do we do this?
I'm going to walk you through it.
The adoption is pretty simple.
It all boils down to a class called MPPlayableContentManager.
So let's imagine you've got a dictionary with a bunch
of different properties.
You know, a title, an artist, an album, and what you want
to do is you want to get this elevated to the Lock Screen
of the device when the system thinks the customer's likely
to engage with it.
Well, you're going to want to start off
by importing MediaPlayer and declaring yourself
as conforming to that class's delegate protocol.
And then you're going to want
to implement a delegate method,
playableContentManager(_:initiatePlaybackOfContentItemAt:completionHandler:).
It's pretty simple.
First, you want to grab the media item
through whatever means you have.
Now, you'll obviously want to be prepared,
especially if you're fetching remote assets
for it not working.
In which case, you're going to call
that call back handler you see below as such
and the system will understand, not to necessarily promote you.
Next, you're going to want to populate the NowPlayingInfo
on the Lock Screen of the device.
This is a method that I implemented, so I'm going
to show you what's behind it.
The first thing that you want to do is grab a hold
onto the infoCenter object,
and then if you do have a particular image
that this media item has, you can promote
that to the Lock Screen of the device.
If you have a default image that you use, this is an opportunity
for you to get your brand elevated as well.
Then you're going to translate your representation
to the nowPlayingInfo representation,
and most importantly, you're going to set the nowPlayingInfo
on the infoCenter, before the callback
that I showed you earlier, completes.
So you want to do this before your method closes.
If for whatever reason your app fails
in preparing playback,
say, if the phone's in Airplane Mode,
this is another opportunity for you to error out early.
Now note, you don't actually want
to start playback at this point.
You plugged in your headphones, you want to wait
for the customer to hit the Play button.
But you want to get prepared for it,
load the stream, and so forth.
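Here's a sketch of that whole flow; Library, FetchError, and the episode properties are hypothetical stand-ins for your own model layer, while the MediaPlayer calls are the real API:

```swift
import MediaPlayer

enum FetchError: Error { case unavailable }  // hypothetical error type

class PodcastContentDelegate: NSObject, MPPlayableContentDelegate {
    func playableContentManager(_ contentManager: MPPlayableContentManager,
                                initiatePlaybackOfContentItemAt indexPath: IndexPath,
                                completionHandler: @escaping (Error?) -> Void) {
        // Fetch the media item however your app does it; Library is
        // a hypothetical stand-in for your own model layer.
        guard let episode = Library.shared.episode(at: indexPath) else {
            // The asset isn't available (e.g. a remote fetch failed):
            // report the error so the system knows not to promote it.
            completionHandler(FetchError.unavailable)
            return
        }

        // Populate the lock screen's now-playing info *before*
        // completing the callback.
        var info: [String: Any] = [
            MPMediaItemPropertyTitle: episode.title,
            MPMediaItemPropertyArtist: episode.author
        ]
        if let image = episode.artwork {
            info[MPMediaItemPropertyArtwork] =
                MPMediaItemArtwork(boundsSize: image.size) { _ in image }
        }
        MPNowPlayingInfoCenter.default().nowPlayingInfo = info

        // Prepare playback (buffer the stream, etc.) but don't start
        // playing yet; wait for the user to hit Play.
        completionHandler(nil)
    }
}
```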
And that's it.
So again, today, iOS promotes apps based on your behavior.
If they follow a particular trigger and they're a media app,
we elevate them to the Lock Screen.
Through pretty simple adoption that I just showed you,
you can get a much better experience and an opportunity
to promote your brand
and whatever gorgeous album art you have.
Alright, so let's summarize what we walked through today.
We spoke about a few simple APIs
that helped you deeply integrate your app
into the operating system.
We spoke about NSUserActivity, the kind of eyes
of the operating system.
It helps us understand semantically what the customer
is currently staring at.
Then we spoke about schema.org, which is in many ways,
similar to NSUserActivities for the web.
We spoke about a new use case for an existing API, as well
as a brand new UIKit API, to help us as the system understand
when to elevate locations.
Then we spoke about MPPlayableContentManager,
which is particularly relevant if you're a media app.
And hopefully it's been clear to all of you today
that these APIs are easy to adopt and easy to test.
There's some more information available online,
as well as some great related sessions
that you should check out, if you haven't already.
There's a lot of related work with SiriKit.
A lot of related work around our search APIs that you saw today,
as well as some previous sessions from the past two years
around Handoff and aforementioned search APIs.
Lastly, I wanted to close by saying that this is an area
that we are going to continue to invest in as a company.
And what we need from you, developers, is to inform us
about the capabilities of your app
in the most detailed way possible.
Because fundamentally the more we know about your app,
the more we'll understand when it's best to promote it.
And so, you saw today a lot of different situations
where there were varying levels of semantic detail
that you could annotate an activity or a text field in.
I'd encourage you as much as possible to try to be
as explicit as you can when you do so,
not only to support the features that we're discussing today,
but also to put your application
in a good place, towards the future.
Thank you all for coming today.
I hope you had a great WWDC and a great Friday.