Designing security into your app requires that you follow secure coding practices and use the security features that are built into the operating system. Learn about new developments in Security on iOS, macOS, watchOS and tvOS that impact your apps. Hear about best practices for developing and distributing secure apps and protecting people's data.
[ Music ]
[laughs] Welcome everyone.
Thanks for coming out to hear about what's new in security.
My name's Lucia Ballard
and I manage the Secure Transports Team here at Apple.
And together with my colleague Simon we'll be talking you
through a bunch of new stuff that we've been working
on to help you improve the security for your customers.
So, I hope a lot of you caught Evone's [assumed spelling] talk.
It's a great overview of why we do what we do and sort
of the fundamental underpinnings of security on our system.
Here we're going to dive a little more into the details.
So, here's what's on deck.
First up we'll be talking
about some network security changes on iOS.
Then we'll discuss a couple of updates
to our cryptography APIs.
And then Simon's going to talk
about platform security on macOS.
So, diving into network security; if you use your phone
as much as I do, you know
that it's accumulated an incredible wealth of detail
about your personal life,
and a lot of that comes across the network.
Whether it's articles you're reading,
or messages you're sending to your friends,
all these little bits of detail,
even if they seem insignificant alone,
can add up to a really incredible picture of a person.
So, that's why we think at Apple all
of this detail should be protected by default.
We think HTTPS is the new HTTP.
So, for every resource you're loading across the network,
you should provide confidentiality
and data integrity to your users.
The other key point here is that not all HTTPS is created equal.
HTTPS is built on SSL, or, in the modern age, TLS,
and not all versions of this protocol actually provide enough
security to keep your users resistant against attacks.
So, building on these principles,
last year we introduced App Transport Security.
App Transport Security means that for all loads
that you're doing, using the NSURLSession,
or even the older NSURLConnection APIs,
you have to be using top-of-the-line, solid TLS connections.
First off that means TLS version 1.2; this version's been
out for a while, but it's the only one that's fully resistant
to all the vulnerabilities that we know about, like BEAST attack
or the POODLE attack, or other exploits with these scary names.
It also means you have to use strong cryptography
so that's ciphers like AES-128 or better,
and certificates signed with SHA-2.
Certificates signed with SHA-1 are getting easier to attack.
Finally, it also means forward secrecy.
So, this is a way to exchange keys
between the server and the client.
It gives you an amazing property: even
if the server's certificate is compromised
at some point in the future,
an attacker can't reveal the content of any communications
that have happened in the past.
So, you add all these together and you have, what we think,
is a secure connection
that protects the data of your clients.
Now, we know that it takes some time to get up to speed
with App Transport Security, so we also introduced Exceptions.
You can turn it off globally, or you could set Exceptions
up for particular domains
that you knew couldn't move to TLS that fast.
Now I'm wondering how many people
in this audience are thinking, "Oh yeah,
I set that Exception last year and haven't thought
about it for a while."
Well, now is the time to revisit it
because this year we're starting
to enforce App Transport Security at the App Store.
This is going to kick in at the end of 2016 and it means
that for most Exceptions you'll need
to provide a reasonable justification.
So, for all of these Exceptions
that actually turn off App Transport Security,
or its key properties like using TLS 1.2,
you'll need to explain why you need
to use this Exception in the first place.
For other Exceptions like Forward Secrecy, we recognize
that support for those is not fully universal,
so for now those Exceptions will be granted automatically
without any justification.
So for example, if you have a partner server you're working
with and you don't have control
over what cipher suites they're offering, we will be able
to enable you to keep talking with that server.
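As a concrete sketch, an exception like that lives under the NSAppTransportSecurity dictionary in your app's Info.plist; the partner domain name here is purely illustrative:

```xml
<key>NSAppTransportSecurity</key>
<dict>
    <key>NSExceptionDomains</key>
    <dict>
        <!-- hypothetical partner server you don't control -->
        <key>partner.example.com</key>
        <dict>
            <!-- forward-secrecy exceptions are granted automatically -->
            <key>NSExceptionRequiresForwardSecrecy</key>
            <false/>
        </dict>
    </dict>
</dict>
```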
We're also adding some new Exceptions
to make it easier to adopt.
So for example, if you're using Streaming Media
and that media's already encrypted in bulk, we'll be able
to offer an Exception through AV Foundation for you to load
that media without connecting over TLS.
I want to be clear, we still think the right answer here is
that you use TLS for everything,
but in some cases we will be able to provide this Exception
to help you transition more smoothly.
We're also offering a Web Content Exception.
So, here sometimes your app needs to load arbitrary content
from around the web
and of course you can't guarantee that's using HTTPS.
So, if you're using WKWebView, then you can just set this key
in your app's Info.plist:
NSAllowsArbitraryLoadsInWebContent. Then all
of those loads will be exempted
from the App Transport Security requirements,
but everything else your app is doing, like talking
to your own server, will still get the protection.
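In the Info.plist, that looks something like this; the key applies only to web view content, so your own NSURLSession traffic keeps full ATS enforcement:

```xml
<key>NSAppTransportSecurity</key>
<dict>
    <!-- exempt only web views, not the rest of the app's traffic -->
    <key>NSAllowsArbitraryLoadsInWebContent</key>
    <true/>
</dict>
```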
So, that's App Transport Security,
but we're also making changes to TLS across the system,
because attacks against cipher suites keep evolving
and keep getting more and more effective.
So, no matter what your ATS settings are,
we've now disabled RC4 by default
for anyone using our networking APIs.
We've also disabled SSLv3 in Secure Transport.
So, even if you're dialing
down into our lower level APIs you won't be able
to access SSLv3.
Research has simply moved too far for both
of these technologies, and we no longer think
that they provide effective security for our users.
So, these are disabled.
Other algorithms are starting to show their age;
specifically SHA-1 is showing more and more vulnerability
to attacks, as is 3DES.
So, if you know that you have a dependency on either
of these algorithms now is the time to drop it.
And I recognize many people in this room are app developers
and aren't in charge of their own back ends, so you'll need
to reach out to the folks at your company
or whoever's hosting your back end to make sure
that they've dropped any dependency
on these older algorithms that are getting deprecated.
It would be a great time to check in about the status
of loading things using HTTPS and make sure
that you are cleared for App Transport Security,
then you'll be able to fly through app review.
So, that's App Transport Security.
It's making sure everything loads using strong TLS,
but strong TLS is not enough.
You also need to have confidence that the certificate
that you're using to validate
that TLS connection represents the right server.
So, I want to talk about a couple of different technologies
that we're using today to help you have confidence that you're
in fact reaching out to the server that you think you are
when you're establishing these secure communications.
So, first let's back up and talk a little bit
about how certificates work today.
So, if you're connecting to a server,
that server has had a certificate issued
from a certificate authority.
That authority validates the host name
and says, "Yep, sounds good.
You are in fact example.com.
Here's your cryptographic proof of that."
And when you connect the server sends that certificate over.
But this is not always a perfect mechanism.
If there's an attacker who manages to get a certificate
for your host name, then the attacker can provide
that certificate, and you as the client don't have a way
to distinguish the attacker's server from your own server.
This could happen
if the certificate authority makes a mistake or,
worst-case scenario,
if the certificate authority gets compromised
and someone else uses its private keys to sign things.
So, today I'm proud to announce that we're joining the effort
for certificate transparency.
Certificate transparency is a technology that uses public,
verifiable logs of issued certificates.
These logs collect certificates from a huge variety of sources;
there are many participating certificate authorities,
but in fact anyone can submit to a log.
Then the logs issue a cryptographic proof
that the certificate has been logged
and a client can check for that proof.
There are a bunch of different ways to do it.
Proof can be embedded in the certificate itself,
or it can get handed over in the TLS handshake,
or it can be delivered by OCSP stapling,
which I'll talk more about in a minute.
So, here's a quick overview on how it works.
First, the certificate authority not only issues a certificate
to the server, but also sends it over to the log,
where it gets added into the public record.
The log then sends a signed proof
that the certificate has been included,
and the server hands both that and the original certificate
over to you, the client.
That means you can validate the pair together.
So, certificate transparency makes it more difficult
to launch attacks.
Basically it puts the attacker in a bind.
If the attacker can get a certificate
from an authority that's not participating, they have no way
to get that cryptographic proof
that the certificate has been included in the log.
So, they hand over the certificate alone
and the client can reject it.
Alternatively, if they are using a certificate authority that's
participating, then that tainted certificate gets logged
and is publicly visible, and that gives you an opportunity
to revoke the certificate at the certificate authority level.
So, we think this is a critical technology to enable you
to have confidence that the certificate that you're talking
to is in fact the certificate you want to be talking to.
So, here's how you can try it out.
You can actually use the same Info.plist that you used to set
up your App Transport Security configuration.
There's a new key: for each domain whose certificate comes
from a participating authority, you just set this key,
and then your client will reject any certificate it can't prove
has been publicly logged.
Our current policy roughly requires that you need a proof
from at least two logs.
These logs -- we're adding new logs as we qualify them,
but basically if your certificate works
in Chromium it'll work for us as well.
And there's a lot more information
about the general technology at certificatetransparency.org.
So, I encourage you to go check that out.
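As a sketch, the opt-in looks something like this in the Info.plist, using the per-domain NSRequiresCertificateTransparency key; example.com stands in for your real host:

```xml
<key>NSAppTransportSecurity</key>
<dict>
    <key>NSExceptionDomains</key>
    <dict>
        <key>example.com</key>
        <dict>
            <!-- reject certificates without a valid CT proof -->
            <key>NSRequiresCertificateTransparency</key>
            <true/>
        </dict>
    </dict>
</dict>
```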
Now certificate transparency is a great piece of this ecosystem,
but it doesn't totally replace revocation.
There's still that last step: once you've found
out the certificate is flawed for some reason, you have
to actually stop trusting it.
So, I want to take a couple of minutes to talk
about our recommended practice here.
It's called OCSP stapling.
Now, this is a standard that's been out for a couple of years,
but we think now's the time for folks to actually move to it
and start adopting it, because support
for it is now quite widespread.
OCSP stapling is an enhancement
to the online certificate status protocol
and solves a bunch of problems with it.
So, as a refresher, this is how OCSP works.
We have the same set up, right?
Where your certificate authority is issuing the certificate
to a server, every time a client connects
to that server it sees the certificate and it wants
to know whether it's still valid.
So, it asks the certificate authority right there
in the middle of the TLS handshake,
and the certificate authority says, "Yep,
the cert's still valid," or "No, sorry, it's invalid.
Don't trust it."
This has some issues.
One of them is that it's slow.
You're right in the middle of this handshake; you're trying
to get your resources, you don't want to wait
to make your network connection to some other entity,
especially if that server's gone down,
you might be hanging for a while.
The other major issue is that it leaks a little bit
about whatever activity you're doing online.
Your certificate authority gets to see
which host names you're connecting
to, because you're sending a request
up each time you connect.
OCSP stapling resolves a lot of these concerns.
So, here's basically how it works.
Instead of the client asking,
the server asks the certificate authority,
and the certificate authority hands a signed response back
to the server.
Now it's signed by the certificate authority
so you know you can trust it.
Then the server gives both the certificate and a promise
that it's valid over to the client, all in line,
all in the same handshake.
So, this means that your revocation information is
delivered reliably and quickly.
There's no extra waiting.
There's no concern there.
And it protects your user's privacy
because the only connection they're making is back
to your server.
You may have noticed that diagram looked a lot
like the certificate transparency diagram and that's
because you can use the same mechanism
to deliver certificate transparency proofs.
As long as your certificate authority is participating you
can hand all of that information together in the same handshake.
And, like I said, OCSP stapling is widely supported
across many operating systems and is backwards compatible,
so you can go turn it on today, in Apache, in nginx,
whatever your backend is hosted on.
And it's now fully supported across every Apple platform.
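For example, enabling stapling is typically a one- or two-line server change. These are the standard directives for nginx and Apache httpd; cache sizes and paths are illustrative, and your distribution's defaults may differ:

```
# nginx (inside the TLS server block)
ssl_stapling on;
ssl_stapling_verify on;

# Apache httpd 2.4 (mod_ssl), outside the VirtualHost:
SSLStaplingCache "shmcb:logs/ssl_stapling(32768)"
# and inside the TLS VirtualHost:
SSLUseStapling on
```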
So, to take a step back and summarize where we are
with network security: now's the time to move forward
to the App Transport Security standards.
That's strong algorithms and strong ciphers: TLS 1.2,
forward secrecy, and SHA-2 certificates.
Also, it'd be a great time to start experimenting
with certificate transparency.
Find certificate authorities that are participating
and get integrated into this ecosystem.
And please, go enable OCSP stapling
so we connect the full loop and know
that you can have confidence
in making secured connections back to your servers.
So, that's network security.
Now I'd like to take a couple of minutes to talk
about some cryptographic improvements.
So, first up is SecKey.
SecKey is our API for asymmetric cryptographic operations.
And in this release we've unified it
across the macOS and iOS APIs.
SecKey now has support for all the common operations you would
want to do with asymmetric key pairs, with RSA and ECC.
So, that's signing and verifying with asymmetric keys.
This means that SecKey is a total replacement
for the deprecated CDSA calls on macOS.
And it also replaces any use of SecTransform you might be doing
to do cryptography with asymmetric keys.
So, we strongly recommend moving forward with SecKey.
We've also tied this into a new kit called CryptoTokenKit.
CryptoTokenKit is system support for cryptographic devices.
So, for smart cards you might be using to prove your identity
in various enterprises, or for USB tokens,
we now have out-of-the-box integration,
and they can integrate into system services.
That means the token content is available in the keychain,
as you would expect, and the token operations are available
using the SecKey API.
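As a rough sketch of the unified SecKey API, here's how generating an elliptic-curve key pair and signing with it looks in Swift on macOS 10.12 or iOS 10; error handling is abbreviated, and keychain or token storage attributes are omitted:

```swift
import Foundation
import Security

// Generate a 256-bit elliptic-curve key pair in memory.
let attributes: [String: Any] = [
    kSecAttrKeyType as String: kSecAttrKeyTypeECSECPrimeRandom,
    kSecAttrKeySizeInBits as String: 256,
]
var error: Unmanaged<CFError>?
guard let privateKey = SecKeyCreateRandomKey(attributes as CFDictionary, &error),
      let publicKey = SecKeyCopyPublicKey(privateKey) else {
    fatalError("key generation failed: \(String(describing: error))")
}

// Sign a message with the private key...
let message = Data("hello".utf8) as CFData
guard let signature = SecKeyCreateSignature(privateKey,
                                            .ecdsaSignatureMessageX962SHA256,
                                            message, &error) else {
    fatalError("signing failed")
}

// ...and verify it with the public key.
let valid = SecKeyVerifySignature(publicKey,
                                  .ecdsaSignatureMessageX962SHA256,
                                  message, signature, &error)
```

The same SecKey calls work whether the key lives in memory, in the keychain, or on a CryptoTokenKit-backed smart card, which is the point of the unification.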
Now this is a complex topic
and there's a lot of detail to go into.
So, please come see us in the security labs and we can talk
through your use case.
So, thanks for your attention and with that I'll hand it off
to Simon to talk about what's new in Platform Security.
Thank you very much Lucia.
Hello. I'm Simon Cooper.
I'm the manager of the Trusted Engineering Team.
So, I'm going to talk about what's new in security,
but first of all I want to talk about --
a little bit about how software is delivered on the Mac,
a little bit about Developer ID, and I'm going to talk
about Gatekeeper and some packaging issues.
So, let's start off
by talking about how software is delivered
on a couple of our platforms.
So, for iOS you can get apps from the App Store.
You can build and run apps using Xcode
and install them on your own devices.
And there's some enterprise programs that allow you
to deliver and manage content to devices.
You'll notice that all of these installation mechanisms are
silently handled by the iOS platform.
So, now I'd like to talk a little bit about macOS.
So, you can also get apps from the Mac App Store
and that's a very good way to get your apps.
You can also get apps that are signed with Developer ID.
You can also use Xcode to build your own apps
and use the traditional command-line UNIX tools
to build things in the normal UNIX way.
So, let's go back and talk a little bit about Developer ID.
So, what is Developer ID?
Developer ID allows you
to deliver apps outside of the App Store.
These apps are usually downloaded using a web browser.
The Developer ID Program will issue you
with a Developer ID signing identity.
And when you sign apps using
that identity they are treated specially by Gatekeeper.
There are actually some improved flows in Xcode 8 that allow you
to properly export your Developer ID-signed apps.
So, there's something we're actually going to do
and that's a change to the Developer ID Program,
and this was announced yesterday.
We're actually allowing Developer ID
to be used with iCloud.
So, Developer ID can now use iCloud features.
That includes iCloud Drive, iCloud Keychain,
Push Notifications, and VPN.
So, what does this mean?
This means that you can deliver iCloud enabled apps outside
of the App Store and you can now use Developer ID to share data
with your iCloud enabled iOS apps.
You will be able to deploy these new Developer ID apps back
to macOS 10.9.
So, I'm sure you want to know
when you'll be able to do all of this.
So, you can start iCloud Development testing today.
That's using the Xcode 8 tools.
And I'm sure you're aware that when you're doing testing
of iCloud there is a development and production environment.
When you're going to deploy your iCloud App you'll want
to be deploying against the production environment
and you'll have to wait for an upcoming seed in order
to start testing that.
When you are going to do that, please use the new flows
in Xcode 8 because they make sure that you are deploying
against the production servers.
The Xcode team also asks me to remind you:
please don't distribute apps unless you're using the release tools.
So, the other side of Developer ID is Gatekeeper.
Gatekeeper allows us to control what apps are allowed
to run on your machine.
And there is a Preference Panel.
And on that Preference Panel, in El Capitan,
there are these options: you can run apps from the Mac App Store;
from the Mac App Store and identified developers;
or you can run apps from anywhere.
When you first run a downloaded app,
Gatekeeper will prompt you before its first run.
We're actually making some changes to Gatekeeper
and we're making changes to the Gatekeeper UI in Sierra.
We are changing the default options,
and those default options are going to be:
you can run from the Mac App Store,
or from the Mac App Store and identified developers.
Now, if Gatekeeper rejects your app for some reason,
there's usually a button that appears in this Preference Pane
that allows you to open the app anyway.
Unfortunately I have to say,
in the seed build this function is not actually working,
but it will be fixed in a later seed build.
I also want to say that we haven't actually changed the way
that this underlying mechanism works and the way
that the policy works.
So, this means that if you have managed configurations,
or you use the command-line policy tools,
you can re-enable the Allow Anywhere option.
So, I want to talk a little bit now about some other changes
to Gatekeeper, but I first need to talk
about a repackaging problem and the Gatekeeper enhancement
that we've developed to address this.
Now the repackaging problem is a problem
because of the way certain apps are written.
There are certain types of apps
that, as written, reach outside of themselves
and use external resources.
These apps may be delivered in various ways and they may
in fact be correctly signed.
But the external resources that they reach
out to may not be signed
and they can be code or code equivalent.
They could be libraries, they could be plug-ins,
they could even be HTML content.
And you may not be aware that local HTML content
can be code equivalent, too.
These resources could also be Lua scripts or Python scripts
or even AppleScripts.
So, what's the problem?
Well, if you put these two things together inside a
container and deliver the app in this way, maybe through a zip
or a disk image or using an ISO image, and you put this app
into the container, and then you put the resources alongside it,
when the app runs and talks
to the external resources, it's loading this potentially
untrusted content.
And you'll notice that the app, with its green border,
meaning that it's signed, is reaching outside of itself,
outside of the signature,
and trying to touch these external resources.
So, if I take that app and I repackage it,
but instead of packaging it with the external resources that you,
the developer, would like to be there,
and I put malicious content there,
then when the user runs the app they're not going
to be getting your experience, they're going
to be getting the experience of the malicious code.
Now if that malicious code is a [inaudible] library
that you may have put into a plug ins directory, for example,
then your app is almost certainly not going
to be doing what you expect it to be doing.
So, we're calling this a repackaging problem,
and there are some things
that are actually not affected directly
by the packaging problem, and those things are apps
that are delivered from the Mac App Store.
And that is the best way to get apps.
Other things that are not directly affected is
if you install your apps using a signed Apple Installer Package.
That doesn't mean to say
that the app once it's been installed can't be repackaged
and delivered in another mechanism,
but the actual Apple Installer Package doesn't have the problem
of reaching outside the resources.
So, there are things that are affected,
and that is if you are delivering your app using a zip,
or a disk image, or if you've got a combination of things
in an ISO image, and any other kind of archive format.
It's also possible that if your disk image is structured
in such a way that you are trying
to assemble an app using a combination of pulling resources
from here and there, you can also be affected by this.
So, we need your help with this problem.
And we also need to protect our customers
because of this problem.
So, what do we need to do?
So, if you're delivering something that has a signed app
with some external resources, whether you're doing this
through a zip, a disk image, or an ISO image,
we need to ask you to switch to using a disk image.
And the reason we're asking you to do this is
because we can now sign disk images.
You can sign a disk image using macOS 10.11.5,
which is the current release of El Capitan,
or any later release of OS X.
You can use the codesign tool
to sign the disk image,
and that will basically bind the external resources
and the app together.
These signatures are actually embedded in the disk image
and are carried along with it,
and this signed disk image is compatible
with older OS releases.
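Signing and then checking a disk image might look like this from the command line; the identity name and file name here are illustrative:

```sh
# Sign the disk image with your Developer ID identity (name is illustrative).
codesign -s "Developer ID Application: Example Corp" MyApp.dmg

# Verify the embedded signature, and ask the assessment policy
# whether Gatekeeper would accept the disk image.
codesign --verify --verbose MyApp.dmg
spctl -a -t open --context context:primary-signature -v MyApp.dmg
```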
So, I'd now like to give some packaging advice.
So, one way to avoid this packaging problem --
repackaging problem is to avoid it completely
and put your resources inside the app bundle
and then sign the whole thing.
If you're distributing just a single app bundle you should
consider delivering it via the Mac App Store.
You should certainly sign the app.
Perhaps package it in a zip archive,
but please verify the signature before you release it.
Alternatively if you have a complex set
up you can use a signed Apple Installer Package.
For a container with apps and resources,
please switch to using a signed disk image.
Sign any content that you have inside the container,
so the app that's inside there should also be signed,
and you should sign the disk image itself.
And please verify all the signatures before you release.
So, some final words on packaging,
so if you're adding personalization
or licensing information to your app
when it's being downloaded, please use an extended attribute
on the bundle root of the app.
And there's a tech note that describes how to do this
and that's Tech Note 2206.
And you could also do this
by signing the personalized disk image.
Here are some things that you should definitely not do.
You should not modify your app after signing,
because this will break the code signature,
and when Gatekeeper comes
to verify your app it will get grumpy.
You should never deliver your app with a broken signature
and please stop shipping ISO images.
So, now I'd like to talk about a Gatekeeper enhancement
that we have developed that helps combat this problem.
And this is all about protecting customers.
This mechanism, which we're calling Gatekeeper Path
Randomization, supplements
all the existing Gatekeeper protections,
and there's no change for Mac App Store apps.
And there's no change when you're upgrading to Sierra
for any apps that were previously run on your system.
This mechanism will come into play
for any newly downloaded apps and it applies to apps that are
on unsigned disk images.
So, let me describe exactly how this mechanism is going to work.
So, this is a complex object, now this could be
in your Downloads folder and it could be an unpacked zip,
or this could actually be a mounted disk image
that contains your app and those extra resources.
So, when you double click on your app,
when your app runs it will be relocated, with some tricks,
into a randomized place on the file system and it won't be able
to access the resources.
So, this stops the repackaging attack.
So, there are some situations
where this relocation does not happen.
If the user explicitly moves the app,
and it's just the app itself, not the app
together with something else, then this mechanism does not get
applied.
If the user moves the single app by itself,
maybe to /Applications,
then this mechanism will be turned off.
This does not happen if you sign your disk image.
So, any apps that are run
from a signed disk image will not have this mechanism applied.
This mechanism also doesn't come into play
if you install software using a signed Apple Installer Package.
It also doesn't apply to any apps from the Mac App Store.
So, in summary, if there were two things to take away
from this talk: sign the things that you deliver, and check
that the things you get are valid.
So, more information about this talk can be found here.