ADAM DRISCOLL: So good afternoon and welcome
to What's New In Core Location.
My name is Adam Driscoll, and I'm an engineer
on the Core Location frameworks team.
I'm glad to bring this session to you today
because things are getting more complicated.
As you know, we have Apple Watch and iPhone now.
So we thought we would focus on simplicity.
What can we do to simplify the tasks for you guys,
and we think we have good stuff for you today.
Some of you may be completely new
to Core Location as a framework.
I thought I would start out with an overview of what we do,
the full breadth of our API.
Users, the people who buy your apps, know
about Core Location, and they know that they are
in charge of whether you can use it or not.
You can't access their location right off the bat,
you have to ask permission and we have the API
that lets you do that.
After that we have the bread and butter,
which is location updates.
This is the portion of the API you can use to get access
to where the user is at that moment.
Starting in iOS 8 we introduced the ability to do that indoors,
maybe in a large venue like an airport or shopping mall.
Then we have APIs that we collectively think
of as monitoring APIs.
These are APIs that allow you to specify an interest
in an event that may happen, and then be launched,
even if you are not running,
when that event does happen.
The first of these is region monitoring.
This allows you to specify a geographic region
or iBeacon based region and be launched into the background
when the user enters or exits that region.
We have visit monitoring, sort of a complementary technology
that allows you to specify that you would
like to be notified whenever the user arrives at
or departs from a place, but you don't have to know
in advance what that place is.
Region monitoring is for reminders-type things:
remind me to get milk when I get to the store.
Visit monitoring is for diary-type things:
I want to know where I went today and what I did.
We have other APIs as well, like geocoding,
that I'm not going to go into today.
We have improved a number of these areas of the API,
but I want to especially call out indoor location.
Indoor location is more accurate.
It's faster at detecting floor changes.
Most importantly, the sample has been rewritten
to make it more straightforward and to use MapKit.
If you think that's relevant
for your users, check it out.
It should be much easier for you to use.
For the most part we will be talking about the first two
of these points, authorization and location updates.
Specifically, we've got four parts today
and the first one is background location.
This is the portion of our API that you use to record
where the user has gone.
Maybe you have a run tracking app and they want to have a map
of where they went, or a navigational app and you want
to give them turn by turn directions.
We have some new API for the other cases of location use,
where you just need to know where the user is right now.
You don't need to build a map; you just want to provide them
with information about where the nearest store is,
that kind of thing.
Then we're going to talk about authorization, how that changed
in iOS 8 and how it applies now in the Apple Watch era.
Finally we will dive into detail on best practices
for Apple Watch, which hopefully should be very useful
if you're getting started there.
Okay. Background location updates.
Well, we decided not to have any.
I was just kidding.
This is what the slide would have looked like in iOS 2 or 3.
If you were a user then, you remember having
to leave a phone on, unlocked, in your pocket when you went
for a run if you wanted a map.
Starting in iOS 4 we improved on this greatly,
with what we call Background modes.
You still by default will be running only in the foreground.
However now you can use this handy capabilities tab in Xcode
that says that you want to be kept alive
in the background unsuspended in order
to receive location updates.
What it does is set a key in your Info.plist,
statically for the whole app.
The key is UIBackgroundModes, and the value is location.
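As a concrete sketch, the capabilities checkbox adds an entry like this to the app's Info.plist (shown here in the plist's XML form):

```xml
<key>UIBackgroundModes</key>
<array>
    <string>location</string>
</array>
```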
What happens when this has been done?
We will see a video in a second.
Before we do, let me list out what to look for.
It allows you, when the user is using your application,
to start location updates.
Then let's say the user gets bored.
Maybe they check their mail or their music,
maybe they're going to stare at the home screen for a while,
but you will be able
to keep accessing location during that time.
Then the user launches you again.
You can stop the session and stop accessing their location,
and thereby stop draining their battery with the GPS.
So here is the video.
So user launches our app.
There's a button in the middle there to indicate
that they would like us to start recording, which we do.
In the eye chart, notice that the numbers are counting up.
The user is going to put us in the background.
And you see at the top now there is what we call the Blue Bar.
This indicates that our sample app is still running.
It gives the user the ability to tap on it to come back,
which is what they just did.
Notice that the numbers in the eye chart got bigger,
not bigger in the eye chart sense.
That indicates that we continue to receive location updates.
That's just a count of the ones we've received.
We continue to receive them in the background.
The user asks us to stop, we go back to square one.
App developers have created a lot
of great experiences off of this.
We are wondering if there's something we can do
to improve it.
Is there room for improvement here?
With our particular theme of simplification, you'll notice
that once your application has opted into this capability,
it has a responsibility: if you continue running
in the background, the user may not know this is happening,
and either through a bug or through a corner case
in your UI you could use up a lot
of battery without their intent.
We think of that as high stakes, and we want to lower them,
so that the consequences of a bug
or an unexpected circumstance are lower.
Another thing that you have to do right now that we would
like to improve is, you have to be closely coupled
to the transitions that your app goes through when it is entering
and exiting the foreground.
Once you've opted in to keep running and using the user's
location in the background, you have to tell us that you're
no longer interested when you do go to the background.
And you only find out that you're going
to the background after the fact.
That leads to this view that we have in the video here
that you may have seen in your app as well.
We modified the sample so that it stops location updates
in the delegate callback that indicates
that it went to the background.
The Blue Bar appears briefly and then disappears.
That can be kind of confusing to users.
The unifying characteristic here is that you
signed up for the ability
to access the user's location in a continuous manner
in the background, but that was done on a kind
of carte blanche basis, for your whole app, forever.
In fact you only want to do it at certain times,
only when the user is actively using your app
in the background.
Starting in iOS 9 we have a new API that helps you express that.
It's called allowsBackgroundLocationUpdates,
and it's a property you set on your location manager.
If you have multiple location managers, you have to set it
on each one when appropriate.
You can have it set differently on different location managers.
What does it do?
Well, when you set this property to YES,
it opts you into the behavior
that you requested in your Info.plist.
If you said you'd like the background capability,
and you set this property to YES and start location updates,
you will be kept running in the background
to receive those location updates.
However, if you set it to NO, you will not.
Okay. Crucial question then: What is the default value?
How do we know when to set this?
The default value is no.
So I want to reiterate this.
The default value here is no.
If you have an application that runs for purposes
of recording location tracks in the background
or for navigational purposes,
you must make a change when you adopt iOS 9.
You have to go into your app and find
where the user indicates that they really want you
to record a session in the background,
and set this property to YES on the location manager there.
When they finish, set it back to NO.
Once you've done that, however, you no longer have to worry
about transitions to the foreground
and background in an immediate case.
You don't have to rush to turn off location updates when you go
to the background because you are already configured either
to continue or not continue as you go.
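Here's a minimal sketch of what that might look like in Swift; the RunTracker class and its start/stop methods are hypothetical names wired to the user's recording button:

```swift
import CoreLocation

class RunTracker: NSObject, CLLocationManagerDelegate {
    let manager = CLLocationManager()

    // Called when the user explicitly starts recording a session.
    func startRecording() {
        // Opt in to background delivery only while a session is active.
        manager.allowsBackgroundLocationUpdates = true
        manager.startUpdatingLocation()
    }

    // Called when the user finishes; no need to race the
    // foreground/background transitions anymore.
    func stopRecording() {
        manager.stopUpdatingLocation()
        manager.allowsBackgroundLocationUpdates = false
    }
}
```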
Okay. So you must make this change if you have an app
that uses background location updates.
What is the compatibility story?
Will the apps in the Store be broken?
No, we grandfathered in all the apps in the Store, but you need
to make this change as soon as you adopt the iOS 9 SDK.
What if you want to support users not
on the latest and greatest iOS?
Maybe they're still running iOS 8.
You can do that in the standard fashion:
set a minimum deployment target on your app and check at runtime
with respondsToSelector: to see
if the location manager you are running
against currently has this property or not.
That's what it looks like in Objective-C.
In Swift, we have the fancy new #available check for this.
If at runtime your location manager
has knowledge of this property, you can set it to YES.
Otherwise, you can assume it's effectively already YES
and you don't have to set it.
Unfortunately, that also means you can't set it to NO
for extra protection.
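A sketch of that runtime check in Swift, using #available (the surrounding flow is illustrative):

```swift
import CoreLocation

let manager = CLLocationManager()

if #available(iOS 9.0, *) {
    // iOS 9 and later: opt in explicitly while a session is active.
    manager.allowsBackgroundLocationUpdates = true
} else {
    // iOS 8: the property doesn't exist. Background delivery is
    // effectively always on for apps with the location background
    // mode, so there is nothing to set (and no way to set it to NO).
}
```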
Okay. A brief sidebar, because I know many of you
who are interested in background location updates
of this kind may also play audible cues,
probably the go-to way to communicate with the user
when they have their phone in their armband
or mounted on their dash.
Starting in iOS 9, the Core Audio team introduced a new,
very long-named option.
You use this option to indicate
that you are going to do spoken output,
and the system can mix you with other output the user has,
so you are not garbled but you do mix with music.
Check out What's New In Core Audio,
which happened yesterday, if this is appropriate for you.
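If this applies to you, the configuration might look roughly like this in Swift. The option named here is my reading of the long-named option the session refers to, so treat it as an assumption and check the Core Audio session for the authoritative details:

```swift
import AVFoundation

// Assumed option: marks our output as spoken audio so it is not
// garbled over other spoken audio, while still mixing with music.
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.playback,
                            options: [.interruptSpokenAudioAndMixWithOthers])
    try session.setActive(true)
} catch {
    // Handle audio session configuration failure.
}
```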
What about those times when you don't need
to access the user's location in a continuous fashion?
You are not building a map for them.
You are trying to provide them context
for where they are right now.
Starting in iOS 9 we've introduced a new method
for that, called requestLocation.
It basically automates the process that you would have had
to do in the past to accomplish this goal.
If you are experienced with Core Location,
maybe you know the drill already.
You start updating location.
You watch those locations come into your delegate.
You'll notice that they come in very quickly at first but not
with the best accuracy.
So you need to keep an eye on the accuracy level.
Figure out when it's good enough or when it is not going
to get better, pick the right tradeoff between waiting
and getting converged accuracy, use that location,
and then remember
to stop location updates after that.
We are going to do that automatically for you.
When you call requestLocation,
we'll start automatically getting location updates
but we will not give them to your delegate callback.
We threshold them internally according
to your desired accuracy
and then we will call your delegate back once
with just the location you need.
We'll apply our logic
to determine what the best location is.
We'll automatically stop location updates for you
at that time so you don't have any risk of forgetting it
or having a bug there.
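In code, the one-shot flow might look like this minimal Swift sketch (the StoreFinder class and its method are hypothetical):

```swift
import CoreLocation

class StoreFinder: NSObject, CLLocationManagerDelegate {
    let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
        // requestLocation honors desiredAccuracy, just like
        // startUpdatingLocation does.
        manager.desiredAccuracy = kCLLocationAccuracyHundredMeters
    }

    func findNearestStore() {
        // One-shot: Core Location converges internally, calls the
        // delegate back once, and stops updates automatically.
        manager.requestLocation()
    }
}
```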
I have a little visualization of how this works
to explain what we are doing behind the scenes.
You couldn't generate this by using this API,
because remember, we only give you one delegate callback,
but it shows what it looks like.
This ring represents the desired accuracy that you set.
Think of it as a diameter;
the fix accuracy that you want should be tighter than that.
We are just going to graph it on top
of this map here as locations come in.
Now, you can see that the center of a location is represented
by a dark circle, like in Maps.
The accuracy level is represented
by a larger-diameter translucent circle, as we're about to see.
Watch how that changes over time.
The first fix came in quickly but not terribly accurate.
It gets better over time.
When it gets to a level that's good enough, we will accept it.
This is the location it will give to your delegate callback.
I mentioned this is configurable,
what your desired accuracy is.
How do you set this?
Fortunately, we already have a property named desiredAccuracy
on the location manager.
You may be familiar with it.
It is how you control what kind of accuracy you're hoping
for from startUpdatingLocation.
It applies to requestLocation as well.
Let's just dial that down a little bit
and make it very small, tight accuracy.
And maybe a little bit too tight.
We'll rerun the simulation and see what happens.
First fix, we can see the convergence going on.
Sometimes you have regressions in convergence.
Overall the trend will be narrower until you reach kind
of a break-even point.
At that point, notice, we didn't actually meet the
desired accuracy; we got to a point where we weren't getting better.
At that point we return this location
to your delegate callback.
Now, that means that the locations that you receive
after you set a desired accuracy
in your request location may be either higher accuracy,
that's smaller value for accuracy,
or they could be lower depending on how we got them.
Maybe you got lucky in the first case,
and maybe it's challenging conditions in the second case.
They get delivered to your location manager delegate
on the usual delegate callback,
locationManager:didUpdateLocations:.
Bear in mind that last parameter is an array.
We only give you one location
so it will be the zero element in the array.
If, on the other hand, you request a location in an environment
where we can't establish the user's location,
maybe at the bottom of a coal mine, we'll call back
on locationManager:didFailWithError: instead.
So you're guaranteed, as long
as you don't interrupt requestLocation, that you will get one
or the other of these two callbacks,
so you can hang some application logic off
of that if you want to.
And the error that you get
in the error callback is kCLErrorLocationUnknown.
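A sketch of handling the two possible callbacks in Swift (the class name is hypothetical):

```swift
import CoreLocation

class SingleFixDelegate: NSObject, CLLocationManagerDelegate {
    // Exactly one of these two callbacks fires per requestLocation call.
    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        // requestLocation delivers exactly one location, at index zero.
        if let fix = locations.first {
            print("fix: \(fix.coordinate.latitude), \(fix.coordinate.longitude)")
        }
    }

    func locationManager(_ manager: CLLocationManager,
                         didFailWithError error: Error) {
        if let clError = error as? CLError, clError.code == .locationUnknown {
            // We couldn't establish a location, e.g. deep underground.
        }
    }
}
```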
Two other things to keep in mind with request location.
The first is that it's mutually exclusive
with startUpdatingLocation.
Since we are using the same delegate callback,
if you have a startUpdatingLocation session already going,
started and not stopped, then a requestLocation call will
simply be ignored.
On the other hand, if you have a requestLocation outstanding
but we're still converging, we haven't settled on a fix for you
yet, and you call startUpdatingLocation, that will take over:
the request will be canceled
and you'll start getting regular updates
to the delegate callback.
You can have only one at a time,
and startUpdatingLocation effectively takes precedence.
The other one is that you can use stopUpdatingLocation
to cancel a request.
We encourage you to do this if the user navigates away
from the scope where you had requested their location.
So you are no longer interested, we can save some power.
Great. So that's the two APIs that we have for you today.
The first two of our overall points.
So now let's talk about the topic
that probably you are all very interested in,
you are here to see.
That's Apple Watch.
So Apple Watch, first thing to know about any story
about Apple Watch is that it is really a story
about Apple Watch and iPhone.
So these devices, they are a pair.
They have a direct one to one relationship with each other.
We are going to want to make them cooperate in order
to get the best possible user experience.
The other thing that sort
of mirrors this situation is the way you deliver your apps.
You can now write watchOS apps, but the user doesn't get them
from a watchOS App Store; they get them bundled
with your iPhone app from the regular App Store.
When they do that, if they've configured the system to do so
or choose to, iOS will ship that component
over to the Apple Watch.
The lesson to take from this is that your watchOS component
and your iPhone component have a close relationship as well.
From Core Location's perspective they are one
and the same authorization identity.
So whatever kind of authorization you have,
whatever state you're in, it applies to both.
Okay. So subtopics on this.
First one, authorization story.
There's some wrinkles.
We'll get into them.
The second one is cooperation.
Now that we have two devices, two pieces of your app,
how do we get them to work together and some best practices
to keep the complexity lower.
Let's do a quick recap to make sure we're all
on the same page with authorization.
Users know they are in charge.
So you ask for permission to access the user's location,
either with requestWhenInUseAuthorization
or with requestAlwaysAuthorization.
When in use, always.
These are new terms in iOS 8.
What do they mean?
How does your ability to use location vary
when you use these different kinds of authorization?
Well, hopefully this table will help answer that question.
So on iOS we will consider an app which has requested
when in use authorization and been granted it and an app
that requested always authorization
and been granted it.
And so first, the easy points.
If you have either of these kinds of authorization
and the user is tapping on your app,
can you access their location?
The answer is yes.
No doubt you already know that.
So what about the slightly more complicated case
where the user has interacted with your app
and they've indicated they would like you to track their run
or some other reason they want you
to continue accessing their location in the background
in a continuous fashion?
As you may know, you can do this with both forms
of authorization as well.
However, the behavior you get in this case is slightly different.
So the Blue Bar that we noticed is reserved
for when in use apps.
The reason for that is an app which is authorized for when
in use has an implicit transactional model
with the user.
The user is expected to know
that you are accessing their location.
They specifically want you to do so, as in the case with a run
or navigational session.
We reflect that, and also help them remember it's ongoing
in case they get distracted and forget,
by displaying the Blue Bar.
They can tap it to get back to the app without scraping
around in the launcher to find your app.
Okay. But that doesn't apply to always apps,
because there's no transaction.
There's no expectation.
If you ask for always authorization,
then you are basically asking for carte blanche
and the user doesn't expect to know
when you are accessing location.
We won't give them that breadcrumb.
Okay. Now, background access in the case
that I've called intermittent here.
This is when you're accessing location from the background,
but not in reaction to a session started in the foreground
or otherwise known to the user.
This is a background app refresh launch
or a region monitoring launch.
Well, to access the user's location
in this case you need always authorization.
That's because the user doesn't know;
they can't intuitively understand what is going on
if they are expected to consider you in use.
So you can't access this with when in use authorization.
A brief reappearance of the monitoring APIs, just for the purposes
of this extra row we have.
A little-known fact is that you can access region monitoring
in both cases.
You can access it if you have always authorization, but also
with when in use authorization, by using a UILocalNotification
that includes a CLRegion as its triggering event.
The user will be notified whenever they enter or exit
that region, and they can decide whether to pass
that on to you or not.
They are in the loop
and implicitly aware of what's going on.
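As a sketch of that when-in-use path; the coordinates, radius, identifier, and alert text here are made up for illustration:

```swift
import UIKit
import CoreLocation

// With only when-in-use authorization, a local notification can
// carry the region trigger, keeping the user in the loop.
let region = CLCircularRegion(
    center: CLLocationCoordinate2D(latitude: 37.33, longitude: -122.03),
    radius: 100,                      // meters (hypothetical)
    identifier: "grocery-store")      // hypothetical identifier
region.notifyOnEntry = true

let note = UILocalNotification()
note.region = region                  // fire when the user enters
note.regionTriggersOnce = false
note.alertBody = "Remember to get milk!"
UIApplication.shared.scheduleLocalNotification(note)
```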
How does this story change for watchOS?
Keep in mind we're talking about the watchOS component
of your app only, not the complex of iPhone app
and Watch app, but what can you do just on Apple Watch?
Can you access location when the user is poking
at your Watch app?
Yes, but with limitations.
We only provide the requestLocation API on watchOS.
You can get single locations for the user,
but you can't start a background tracking session,
and you can't do a tracking session
in the foreground either.
Why is this?
Apple Watch is really optimized around the quick look.
We don't want the user to sit there and hold their arm up
and get tired looking at it for a long period of time
as they walk around the block.
The requestLocation API is the most appropriate one
to use on Apple Watch.
That doesn't mean that you can't start a background session
from Apple Watch, or that the user can't record a run
interacting exclusively with Apple Watch;
we will use cooperation with iPhone to do that, and we'll get
into how in a little bit.
There are not many cases where your watch app will be able
to run in the background.
If you find yourself running
in the background you can get access
to the user's location only if you have always authorization.
That's not the most relevant row in the table
in most cases for watchOS, though.
The monitoring APIs are not available:
we can't launch your app
into the background on Apple Watch,
so there's no point in having those APIs.
Okay. So I have been referring to this concept of being in use.
Let's drill down on that a little bit
and get more specific about what it means.
So your app is considered in use if it's in the foreground.
This applies to both iPhone and Apple Watch.
That is, the user is actively poking at it.
And since the cases on Apple Watch are so limited,
no continuous updates and no background tracking,
that's the only way on Apple Watch to be considered in use.
However, on iPhone you can also be considered to be in use
if you have a Blue Bar,
which means you started a session and it's ongoing.
So what is the unifying characteristic here?
How do we extend that to the Apple Watch case?
Both of these represent times when you are visible
to the user, yes, but more specifically
when the user is aware that you're there, running
and doing something on their behalf.
So if the user is just using Apple Watch,
and Apple Watch sends a message to iPhone in order
to do something on its behalf, in that case
in use status will be transferred
to the iPhone app even though maybe it's in the user's pocket
and they are not using it.
They are using the aggregate, the whole complex
and authorization is handled on the complex level.
You need to send the message using an API we'll get
into in a moment, in order to accomplish this.
While you are handling work from Apple Watch
on iPhone you will be considered in use.
How do you get authorization in the first place?
Well, you have to ask for it.
Those are the two APIs that I mentioned earlier:
requestWhenInUseAuthorization and requestAlwaysAuthorization.
Can you do that from iPhone, from Apple Watch?
From both, neither?
Neither would be kind of fun.
So, what does this look like on iPhone?
Well, the user will launch your app.
And then they'll do something to indicate
that you need to access location.
You'll realize you don't have authorization yet.
You'll call request when in use authorization.
You'll see that the dialog shows
at the top the request that's being made to the user.
Then below that we have your usage description key,
which you can set in your app's Info.plist
to explain what you're going to use location for.
We encourage you to do a good job of explaining it,
so users will understand why they should grant you this
ability and understand the tradeoff.
You can see in this case that it's
for purposes of demonstration.
That sounds pretty appropriate.
Let's allow authorization in this case.
And you can see immediately that location updates start.
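The request itself might look like this minimal Swift sketch. It assumes the NSLocationWhenInUseUsageDescription key has been set in the Info.plist, which is where the explanation text under the prompt comes from:

```swift
import CoreLocation

let manager = CLLocationManager()

func beginUsingLocation() {
    // Only prompts if the user hasn't decided yet; the usage
    // description from the Info.plist is shown under the request.
    if CLLocationManager.authorizationStatus() == .notDetermined {
        manager.requestWhenInUseAuthorization()
    }
    manager.startUpdatingLocation()  // delegate handling not shown
}
```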
Okay. So let's send this to the background after stopping it,
and summarize what we just saw.
When you request location authorization, from iPhone,
iPhone shows a prompt you are all used to.
But also note that Apple Watch didn't show anything.
It was just along for the ride there.
That's because when the user is using their phone they are not
likely to be using their Watch at the same time.
If they do use it, they want to see the time
or the app that's there.
So we don't want to block it.
What about the other direction, then?
Can you ask for location authorization from Apple Watch?
Yes, you can do this, too.
It has a slightly different behavior so you'll want
to do this in certain circumstances and not in others.
Let's see what happens when we do this.
The user launches your app.
And then they navigate to the part where you are going
to need location authorization to fulfill their requirements.
Pretend we didn't just receive it back there,
it's a fresh start.
You call request when in use authorization.
The prompt appears on iPhone and on Apple Watch.
There's an alert that appears on Apple Watch.
The alert on Apple Watch says go look at your iPhone,
you have a prompt to deal with there.
But it has a dismiss button.
That lets the user dismiss this alert.
It is not modal.
You should expect to see users interacting
with the app while you have an outstanding request,
which is different from iOS.
Dismissing the alert doesn't constitute an answer
to the prompt question.
The user hasn't given up their option
to make a decision here; they just said get out of the way.
It's still appropriate for demo purposes, so the user will go
to their phone and authorize us.
The demo resets quickly; it looks like we didn't get a location,
but if you look closely you'll notice we got one.
You can run it again and again and see that you get one.
What did we see overall?
What is the story there?
When you initiate a request on Apple Watch,
Apple Watch shows an alert so the user knows what is going
on, and understands that the workflow needs
to be interrupted to answer the authorization prompt before
whatever they were trying to do can go through.
The iPhone shows a prompt as you're used to and importantly,
the user has to answer the prompt on iPhone.
This may seem a little inconvenient,
so why did we do it?
There's two good reasons for this.
The first one is that we really do want you
to give a good explanation for your use
in that usage description key.
If you did, displaying it would mean scrolling
on Apple Watch, and that's not a great user experience.
We would rather use the large screen real estate that we have
on iPhone for this prompt.
The second is that we want to set user expectations
of where they can control authorization.
Right now iPhone is the root location
for controlling location authorization
and privacy authorization in general.
We want to have all the interactions concentrated
on iPhone so the user is not surprised.
They can go to Settings > Privacy > Location Services later
on and change the settings there if they want to.
I mentioned earlier that requestLocation is the API
that we are making available in watchOS
to access the user's location directly from watchOS.
What kind of accuracy can you expect?
If you are issuing the request on Apple Watch,
it depends on whether you have iPhone on hand or not.
If you do, you get everything, even indoor location fixes
which are accurate down to only a few meters.
On the other hand, if you don't, then you can prototype this
and understand the expectation here as the same
as if you had requested a location
with a desired accuracy of kCLLocationAccuracyHundredMeters.
This is not kCLLocationAccuracyBest, but bear in mind
that the result will usually be much better than 100 meters.
In fact, in ideal environments the accuracy could even be
good enough to map a run, but there are no continuous updates,
so don't try that.
The other thing to know here, it's best effort,
because Apple Watch isn't connected to the rest
of the world very well.
It could be that, if it's in a novel location
where it doesn't have a lot of experience with the user,
it is unable to fix the user's location
in their context.
All right, that's the best effort disclaimer,
but again it is a really good effort.
When we have iPhone on hand we make an aggressive effort
to use its facilities to learn more about the environment
and cache a good long distance out.
Probably for a user's run, unless they are training
for a really long run, you will be fine.
But just take this into account when you are trying
to position your app for your users,
whether they should expect to bring iPhone with them or not.
If your app is used around their home or work
or somewhere they are familiar with,
probably they don't need to.
On the other hand if you are expecting it to be used on trips
or where the user has turned the data plan off
so they don't get charged ridiculously,
you want to encourage them
to bring iPhone along anyway even if their data is off.
But let's suppose we have iPhone on hand.
Well, if you need continuous location updates,
common case in the background, or you need region monitoring
or for that matter if you need anything else available on iOS
but prohibited on watchOS, you want to use cooperation
with your iPhone to do that.
How do we do that?
There is a great session, actually the last session
in this room today before this one, before lunch.
Unfortunately it's in the past
but you can go see it on the WWDC app.
If you were here before, great,
this is a little bit of overview.
This is the portion
of the Watch Connectivity framework
that's particularly relevant to us.
The first class of interest is WCSession.
This is the class that represents the connection
between Apple Watch and iPhone.
There are two methods on it that are
of particular relevance to us.
The first is sendMessage:replyHandler:errorHandler:
and the second is updateApplicationContext:.
sendMessage:replyHandler:errorHandler: is bidirectional,
you can send it from either side;
updateApplicationContext: has to be initiated from iPhone.
Let's take a look at how these work in practice for us.
We are going to start by communicating from the Watch app
to our iPhone app
using sendMessage:replyHandler:errorHandler:.
where we need help from the other side.
And they ask us to start.
We send that message across, and you can see the app comes up.
The Blue Bar came down in the video, too,
to indicate that it was able
to start a continuous background location session.
It can start sending that information back,
and you can see the updates coming
in on the Watch display, if it's not too tiny.
We are going to stop and put everything away.
Okay. Then we can review what happened there.
So crucially, if you send a message to the iPhone app
and it's not running, it can be launched if necessary
by sendMessage from Apple Watch to iPhone.
So you don't have to worry
about having the user go start the app.
They can leave the phone in their pocket
and everything can take care of itself.
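On the Watch side, that might look like this sketch; the message payload keys are made up, and it assumes the WCSession was activated earlier with a delegate:

```swift
import WatchConnectivity

// Ask the phone to start a continuous tracking session on our behalf.
func askPhoneToStartTracking() {
    let session = WCSession.default
    session.sendMessage(["command": "startTracking"],  // hypothetical payload
                        replyHandler: { reply in
                            // Phone confirmed it started the session.
                        },
                        errorHandler: { error in
                            // Phone unreachable; surface this to the user.
                        })
}
```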
Of note: in use status is transferred.
I mentioned this before, and you just saw it in action.
Because the user is using the Watch, they are using your app,
and that means your iPhone app is entitled
to start a continuous background location tracking session.
That also means you should take special account
of the allowsBackgroundLocationUpdates property.
You need to set it to YES if you intend
to start a continuous background location session,
and set it to NO if you don't.
When the user pulls their phone out and happens to be looking
at it while they are using their watch they won't see mysterious
Blue Bars dipping in and out.
What about the other direction?
So iPhone to Apple Watch.
Well, we can use SendMessage reply handler error handler
because it's a bidirectional API.
When we do that, you see we have a bit of a problem here.
Our messages aren't getting through.
In fact, what happens in this case,
because the Watch app is not running, is that we are just going
to get the error handler called
each time we do this.
sendMessage:replyHandler:errorHandler: launches the counterpart app
in the case of Watch to iPhone, and it does not launch it
in the case of iPhone to Watch.
Maybe we should look into the other API I mentioned earlier,
updateApplicationContext:.
This is phone to watch.
We can use this API.
And when we do, something different happens.
So our Apple Watch app is still not running, but the messages
that we send hang out.
In particular, the last one hangs out.
You don't want to use this facility
to send delta updates, but you do want to use it
to send state updates, full context updates.
When you do, the system will hold on to the latest one for you
until the user does launch your app.
Okay, at that time your app launches and it gets
that update that's waiting for it.
You can see if you look carefully
that it already knew it was started.
You can send continuing periodic updates
across from your phone app later on to update the UI.
That initial state was already there.
updateApplicationContext: is appropriate to use
for subsequent communications as well.
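A sketch of that iPhone-side call is below. The dictionary keys and function name are illustrative; the key point, per the session, is that the system keeps only the most recent context and delivers it when the Watch app next launches, so each call should carry a full state snapshot, not a delta:

```swift
import WatchConnectivity
import CoreLocation

// iPhone-side: push the latest full state to the Watch. Even if the
// Watch app isn't running, the most recent context hangs out and is
// delivered when the Watch app launches.
func publishState(isStarted: Bool, location: CLLocation) {
    do {
        try WCSession.default.updateApplicationContext([
            "started": isStarted,          // full snapshot, not a delta
            "latitude": location.coordinate.latitude,
            "longitude": location.coordinate.longitude
        ])
    } catch {
        print("Could not update context: \(error)")
    }
}
```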
Then one thing to bring up with regard to this whole path
from iPhone to Apple Watch: also take a look at the API
that we introduced in Core Location
in iOS 6 called
allowDeferredLocationUpdatesUntilTraveled:timeout:.
This API is sort of a natural complement to the system,
because when the user is not looking at Apple Watch,
you don't need to keep it up to the second.
You only need to keep it somewhat up to date.
The best way to do that is batch up the updates,
send them over every so often,
so you don't keep the user's Bluetooth radio running all the
time and we can save some power.
On the iOS side, on the phone you can use this API
on Core Location in order to indicate
that right now you are running in such a latency tolerant mode
and you don't need to be woken
up until the user has gone a certain distance
or until a certain amount of time has passed.
Maybe they went a mile, or maybe
it has been five minutes.
When you hit the distance threshold or the timeout,
you wake up and send an update to your Apple Watch.
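That latency-tolerant mode might look something like this on the iPhone side. The thresholds and function names are illustrative; the two CLLocationManager methods are real, and an active standard location session is assumed:

```swift
import CoreLocation

let manager = CLLocationManager()  // assumed already updating location

// While the user isn't looking at the Watch, let the system batch
// updates: wake us after roughly a mile or five minutes, whichever
// comes first, instead of keeping the radios busy continuously.
func enterLatencyTolerantMode() {
    manager.allowDeferredLocationUpdates(untilTraveled: 1609,  // ~1 mile
                                         timeout: 300)         // 5 minutes
}

// When the Watch UI is active again, go back to live updates.
func enterLiveMode() {
    manager.disallowDeferredLocationUpdates()
}
```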
Okay, so that's it.
So we've got allowsBackgroundLocationUpdates.
This is a new property.
You have to take action on this but it will free you
from a couple of hassles with close coupling and high stakes.
After that we've got requestLocation.
This is a new method on CLLocationManager that allows you
to cut out some
of the boilerplate you probably all have and get one location
when that's all you need.
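A minimal sketch of that one-shot pattern (the class and method names here are illustrative; requestLocation and the two delegate callbacks are the real API):

```swift
import CoreLocation

class OneShotLocator: NSObject, CLLocationManagerDelegate {
    let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
    }

    // requestLocation delivers exactly one update (or one error) and
    // then stops itself: no start/stop boilerplate to maintain.
    func locateOnce() {
        manager.requestLocation()
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        if let location = locations.last {
            print("Got one-shot location: \(location)")
        }
    }

    func locationManager(_ manager: CLLocationManager,
                         didFailWithError error: Error) {
        print("One-shot location failed: \(error)")
    }
}
```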
The theme for using both of these devices is cooperation.
And the great news, I'm happy
to say we've got a sample called Potluck
which demonstrates everything we talked about today.
In fact, that's the app that we demoed for these videos,
and you can install it yourself and run it.
Hopefully, if the app you're working on isn't working,
Potluck will, and you can compare the two and figure
out where the error might be.
Here is a bunch of information,
I'm sure you can all scribble down quickly.
The top part is probably the most important.
Check out Potluck.
This link takes you to the sample code.
Scroll down until you get to the P section and there's Potluck.
These are the sessions I thought might be especially relevant.
They are all in the past, but they're all on tape as well.
You can go check them out later.
And especially Introducing WatchKit 2.0 at the top
and Introducing Watch Connectivity, that's the API
that I was just talking about at the bottom.
I hope to see you all there.
I can't wait to see what you can create
with iOS 9, watchOS 2, and Core Location.
Keep up the good work.
[Applause and cheers]