The best tvOS apps are not only beautiful and well designed, they're also highly optimized. Learn techniques for diagnosing and correcting graphics performance issues and find out about the performance hit-list you should apply during tvOS development.
So, I'm going to talk to you about tuning your apps today.
And what I mean by tuning is identifying and tackling areas
in your app where performance matters most: your launch time,
your memory footprint, your animations, the responsiveness of your UI.
And in the Evangelism team,
we work with developers all the time.
And we do this to help you create fantastic, great apps.
And we've done it for a long time.
We did it with Mac developers.
We've done it with iOS developers.
And now, we're doing it with tvOS developers.
And throughout all of these years,
we see certain trends emerging.
And just in the past few weeks looking at the first phase
of tvOS apps, we see a lot of similarities.
So, I'm going to talk to you about areas and cases
and scenarios that are particularly relevant
to tvOS apps and their characteristics.
So, let's imagine your application and all
of its assets, you have your view hierarchies, your images,
your textures, your shaders, your media, and your app walks
and processes and traverses these assets in order
to create and render one frame.
And when performance is concerned, we want this path
between the processing of assets and generating this frame
to be as lean as possible, as fast as possible,
and to do basically as little work as possible.
We want you to have nice, small footprint,
again very smooth animations, but we're not always there.
There are detours, and your app ends up doing more work
than it thought it was going to do, which effectively takes you off
of this fast path, the green path.
And when we look at each of these detours, graphics seems
to have a lot to do with it, one way or the other.
Either it's causing redundant compositing
by just overdrawing things that it doesn't need to draw.
Or excessive use of memory,
resource usage can bloat your memory footprint
and cause slowdowns.
Or sometimes, we just simply see too much drawing during your
performance sensitive path.
And that could be caused
by either really overly complex view hierarchies
or complex drawing during the compositing phase.
So, the goal is to minimize or eliminate these detours and stay
as close as possible to the green path.
Again, stay lean and do as little work as possible in order
to render this one frame.
What you see here on the right is this app we wrote just
for this session to go over some of these scenarios.
It showcases a lot of common characteristics of tvOS apps,
especially UIKit-based apps.
It has heavy use of images, large images, a lot of blurs,
a lot of scrolling galleries, some text and all
of these are just we see in common among the tvOS apps.
So, we'll start talking about Instruments
which is a profiling tool we use to look under the cover
of what's going on with your app at runtime.
I'll talk a little bit about Blended Pixels,
why are they important?
Why are they something we want to pay very close attention to?
Images, again, a lot of Apple TV apps make heavy use of images,
so they're important as part of the performance analysis.
And finally, I want to touch on Visual Effects.
So, Instruments, let's get started.
Talking to you folks, to developers,
we noticed that not many folks have actually met Instruments
or used Instruments.
So, I wanted to start by introducing you to Instruments.
It's an amazing profiling tool.
It ships with Xcode, it allows you to take a look
under the covers and see what's going on with your app.
Nowadays, our GPUs and CPUs are so powerful that a lot
of performance issues are hidden behind graphical correctness,
behind functional correctness.
So, it's really easy to have an app that looks good,
you know it works and think that everything is fine.
So, Instruments will help you identify these cases.
There it is.
It's under the Developer Tool, Instruments.
And when you launch it, this is the open panel,
we ship many dozens of profiles that you can use,
but since this session is about graphics,
we're going to concentrate on Core Animation.
This is the open panel of Core Animation
and we particularly will work with the debugging options
under the display options.
Now, for folks who have not used Instruments,
I'm going to quickly go over what these mean.
These color options place these color overlays over your UI
to visually communicate
to you what might be happening under the covers.
Again, a lot of performance issues are masked behind correctness,
so it's important to get a feel for what's happening.
We're going to start with color blended layers.
And what this option does is places a green overlay
over your opaque pixels which is what we want and a red overlay
on top of your blended pixels.
These are pixels that are either transparent or have some kind
of blending effect applied to them.
And it's a cumulative color overlay.
So the darker the red, the worse the problem.
So I want to show you what this looks
like before we move on.
This is part of our app,
it's a little image gallery of our trip to Italy.
And it looks good.
And when we turn on Color Blended Layers,
this is what it looks like.
Definitely, not pretty.
So, the red pieces are areas that we're blending.
And we'll talk about what it means and what we can do
to actually mitigate the situation.
The next option we'll talk about is Color Misaligned Images
and that tracks two different issues,
we'll talk about those later.
And finally, we'll talk about Color Offscreen-Rendered.
These two last options are warnings
to the designer and to the developer.
They're not necessarily errors per se, but there are --
they're things that we want you to pay attention to.
Kind of investigate each of them on a one by one basis and see
if there's anything you can do to help the situation.
And one final note, all of these color options are also available
in the Simulator.
So, you don't even need the device to test your app.
And we recommend that you run Simulator very often,
very early, and as much as you can, a lot of the changes
that you will end
up implementing are architectural changes.
So, you don't want to wait till the last minute
until you're ready to submit before you run Instruments
and realize something is wrong.
And the most common form of blending is source-over
which a lot of people are familiar with.
That's where you blend the source
into the destination using the alpha channel,
but as our GPUs are getting more sophisticated,
so are our blending options.
And these sophisticated blending options put more
and more pressure on the GPU.
So, the option we're going to use,
just a reminder is color blended layers and here's our app,
let's start looking at it.
It looks good.
And I know it works, I've tested it.
We turn on our Color Blended Layers option
and I am in trouble.
This shows that every single pixel
in this view has been blended.
Some of them more, as you see there are some really dark red
pieces in there.
So, let's backtrack and see how we got to this point.
We started with our beautiful image and it was green
as we turned on Color Blended Layers.
So, everything is good so far.
So, here is what our compositing path looked like.
We went ahead and drew the image and it routed directly
to the frame buffer, no extra work, awesome.
But then, Rachel comes along and she wants this image kind
of dimmed a little bit because it's a little bit too bright
for what our usage is.
So, our first inclination is to just place a gradient layer
on top of this layer and we do that.
So, here's our background.
We'll place a gradient layer
and we blend the two together and awesome.
It looks dark.
It looks dim.
We're happy with the look of it.
But then when we turn on color blended layers,
that's where our culprit is, we are now blending everything.
Let's take a look at our composite path,
we had our background, instead of just being able to copy
that into the frame buffer and move on,
we have to go grab our gradient layer, apply a blend to the two
of them and then write them into the frame buffer,
continue on with compositing the rest of the frame.
And this is where our detour comes from.
We spent a lot of effort blending,
and it's not necessarily the act of blending each pixel,
but that we ended up touching all
of these pixels multiple times.
In this case, twice, once with the background,
once with the gradient on it.
So, the solution lies
in remembering how layer-based compositing works,
which is what we have here.
Our compositor is layer-based.
And what it does is it will render and composite every layer,
and it only ignores what's underneath an opaque layer.
So, as long as you have transparency on your layers,
remember that those pixels are getting touched multiple times.
What we did in this case is we took our gradient
and we burnt it into the image.
And the combined image is now opaque,
because we actually physically touched every pixel
and applied the gradient to them,
and we did this even before building our app, which is great.
So, we call this flattening.
And there are many other cases
that you can apply flattening to, adornments, badges, sashes,
a lot of gradients can be burnt
in like we did, highlights and text.
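Flattening like this can be done at build time or at image-preparation time. As a minimal sketch, assuming a placeholder asset name ("tuscany") and a dimming amount I picked arbitrarily, burning a gradient into an image with the modern UIGraphicsImageRenderer API might look like this:

```swift
import UIKit

// Sketch of "flattening": burn a dimming gradient into an image once,
// so the compositor can treat the result as a single opaque layer
// instead of blending a gradient layer over the image every frame.
// "tuscany" is a placeholder for your own asset.
func flattenedDimmedImage(named name: String) -> UIImage? {
    guard let base = UIImage(named: name) else { return nil }
    let renderer = UIGraphicsImageRenderer(size: base.size)
    return renderer.image { context in
        // Draw the original image first...
        base.draw(at: .zero)
        // ...then draw a black-to-clear gradient on top, touching every
        // pixel once at prepare time instead of on every composited frame.
        let colors = [UIColor.black.withAlphaComponent(0.6).cgColor,
                      UIColor.clear.cgColor] as CFArray
        if let gradient = CGGradient(colorsSpace: CGColorSpaceCreateDeviceRGB(),
                                     colors: colors, locations: [0, 1]) {
            context.cgContext.drawLinearGradient(
                gradient,
                start: CGPoint(x: 0, y: base.size.height),
                end: .zero,
                options: [])
        }
    }
}
```

If the dimmed look never changes at runtime, doing this on your server or in your build pipeline is even better, since the app then never pays for the gradient at all.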
So, I want to talk about text in detail.
Text is one of the most common cases of unnecessary blending.
Let's go back to our gallery, our image gallery.
So, we have a bunch of thumbnails
with our text and it looks great.
Turning on our blended layers, as you can see,
we are blending every single text layer that we have.
And conceptually, it's really easy to think about text
as having transparent background because we think of text
as glyphs just by themselves.
And that makes a lot of sense.
And in fact, that's exactly what you want to do in a lot of cases.
If you have text on top of images, if you have text on top
of other gradients or other transparent background
of something that's dynamic and is changing, of course,
you definitely want to have transparent text.
But, let's look at our case, in our case,
our background was solid black.
It really did not need any kind of blending applied to it.
So, what I did is I turned the background of my text views
opaque by setting them opaque, using UIKit only,
and then setting the background color
to my view's background color.
And as you can see, it eliminated a lot
of the blending that we had, which is great.
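As a minimal sketch, assuming a label sitting on a solid black background, the UIKit side of that change is just two property assignments:

```swift
import UIKit

// Sketch of making a text label opaque when it sits on a solid background.
// Giving the label a backgroundColor that matches the view behind it, and
// marking it opaque, lets the compositor skip blending those pixels.
let caption = UILabel()
caption.text = "Cinque Terre"          // placeholder caption text
caption.isOpaque = true                // promise full pixel coverage
caption.backgroundColor = .black       // must match the view behind the label
caption.textColor = .white
```

The key detail is that the background color really must match what is behind the label; if it doesn't, the text will render in a visibly wrong box.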
So, let's talk about one more concept,
I call it accidental transparency,
and this is where your otherwise opaque layer turns
into a transparent layer because of elements added to it.
So, for example, rounded corners,
if you have an image that's typically opaque
and you add rounded corners to it,
now that entire layer will be transparent because you have
to actually apply blending
to the corners that are transparent.
The other one is shadows.
Now, with Apple TV apps, we see a lot of use
of galleries, thumbnail galleries.
And one of the common practices that seems
to have become very prolific these days is burning these shadows
into your thumbnails, especially if you have a custom view.
And what that does is it does simplify the view hierarchy
which is great because now, you just have this one thumbnail
with the shadow in it.
But what it also does is it will turn that beautiful thumbnail
that was otherwise opaque
into a transparent layer, like it's done here.
So, what are the solutions?
In this particular case especially with shadows,
we have a shortcut,
Core Animation has a property called Shadow Path.
And what it does is it will distinguish the opaque part
of your layer from the shadow part of the layer.
And it's a shortcut because we see this stuff all the time.
So, setting the shadow path on this layer will turn the core
of my image back to opaque and non-transparent.
And then all we end up blending
are the edges where the shadow is.
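As a minimal sketch, assuming a thumbnail image view with a placeholder asset ("venice") and shadow values chosen just for illustration, setting the shadow path looks like this:

```swift
import UIKit

// Sketch of CALayer's shadowPath: tell Core Animation the exact shape of
// the shadow up front, so the opaque core of the thumbnail stays opaque
// and only the shadow edges need blending.
// "venice" is a placeholder asset name.
let thumbnail = UIImageView(image: UIImage(named: "venice"))
thumbnail.layer.shadowColor = UIColor.black.cgColor
thumbnail.layer.shadowOpacity = 0.5
thumbnail.layer.shadowOffset = CGSize(width: 0, height: 8)
// Without this line, the whole layer is treated as transparent so the
// renderer can derive the shadow from the layer's alpha channel.
thumbnail.layer.shadowPath = UIBezierPath(rect: thumbnail.bounds).cgPath
```

If your thumbnail has rounded corners, a rounded-rect bezier path with the same corner radius keeps the shadow shape accurate.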
So, let's summarize blended pixels.
Use color blended layers to identify blending.
Explicitly set your views as opaque
if you know your content is opaque.
Layers by default are transparent.
And one more note here, incidentally: if you're OK
with your photographic content being lossy, use JPEGs.
We have dedicated hardware for decoding them,
and they're by definition opaque.
I use JPEGs all the time, myself.
Simplify your view hierarchy by flattening.
Put your adornments into your layer if you can.
And a good rule of thumb here is if your assets can be presented
at the same time, the same way, then go ahead and burn them in.
There is no reason why you need to walk down this view hierarchy
if the resulting image is always going to be the same thing.
And again, avoid text blending, it's very common and very often,
you don't need to blend your text.
Text looks really nice on opaque background anyway.
So very often, the blending is just wasted effort.
And the last comment I want to make, I've said a lot of things
about blending but it's not all doom and gloom.
We don't want you to sacrifice your view,
your design to get rid of the red regions.
We get that you need to have transparency.
We get that you need to have some of these effects
to have a beautiful design.
What this section is about is identifying unnecessary blending
and trying to eliminate those.
So, keep what's necessary.
Don't upset your designers.
Apple TV apps make heavy use of images, large images,
often a lot of full-screen background images.
So, it's an important topic to talk about.
The option we're going to use is called Color Misaligned Images.
And as I mentioned earlier,
it tracks two different issues.
It tracks scaled images and misaligned images.
And I'll talk about both of those.
So, we got rid of the red on our background, which is great.
Now, we turn on Color Misaligned Images and it turns yellow.
And again, these are warnings, not necessarily red-worthy.
And in this case, I know exactly what's happening.
This image is being scaled.
So, let's look at how we got here.
So, we had our background image and it is a very large image
and I'll tell you in a little bit how large this image was.
But before we could actually render from it
into the frame buffer, we had to scale it
down to a much smaller image, my view size,
and then continue on.
And the scaling is our detour that's causing the slowdown,
not necessarily because we are actually scaling,
but because of the memory footprint that it gives us.
So, let's talk about that a little bit.
So, I started with a JPEG.
This JPEG, I decoded into memory, great.
So we allocated that chunk of memory, then we created a texture
from it, which then got mipmapped.
And then, finally, we were able to pick the correct level
of detail and render from it.
And all of this memory that we allocated gets allocated,
granted, once at runtime, so it's a one-time hit.
But what's really important is
that you will take this one-time hit right
as your image comes into view.
This is right as the user is scrolling down a gallery.
It's when you really want that smooth animation
that you'll end up taking this hit.
So, what do we do with this?
Resize your assets to an appropriately sized asset.
Don't make them too small
but also don't make them huge if you don't need to.
If your assets are part of your app bundle,
just resize them before you build your app.
If they're part of a dynamic download or some dynamic asset,
then go ahead and resize them on your server and save effort
in downloading and in On-Demand Resources.
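If you do have to resize on device, a minimal sketch, using UIGraphicsImageRenderer and a function name of my own choosing, might look like this:

```swift
import UIKit

// Sketch of resizing an image to the size you actually display, so the
// decoded bitmap and texture match the view rather than the huge source.
// Doing this once, off the scrolling path, avoids paying the decode and
// scale cost right as the image comes into view.
func resized(_ image: UIImage, to targetSize: CGSize) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: targetSize)
    return renderer.image { _ in
        image.draw(in: CGRect(origin: .zero, size: targetSize))
    }
}
```

But again, the best place to do this is before the asset ever reaches the device.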
Now, let me mention one more benefit of this.
This particular image, when it was decompressed,
was over 20 megabytes.
I burned my gradient into it and resized it to my view size,
and it went down to less than half a megabyte.
So, I had a little bit of savings there,
so I'm happy about that.
So that was scaled images.
And now our gallery has these magenta boxes in it.
What are those?
They are misaligned images.
All that means is that these images were rendered
at fractional pixel boundary.
Why is that bad?
It's not bad, but since we have
to actually cover the entire pixel,
it's causing a very teeny bit of scaling
up to include these pixels that we are touching.
Now, I have to confess that when we were writing this app
and I wanted to have the screenshot,
I had a really hard time actually creating misaligned
images because auto layout takes care of this for you.
So, this is the easiest answer you have for your problems.
Don't turn off auto layout.
Let it take care of the misaligned images for you.
But if you have a custom view
and you must have auto layout turned off,
what you can do is turn your pixel coordinates
to integral pixel coordinates.
And Core Graphics has APIs for this, like CGRectIntegral.
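As a minimal sketch, assuming a manually computed frame in a custom view with Auto Layout off, the Swift spelling of that API is the `integral` property on CGRect:

```swift
import UIKit

// Sketch of snapping a manually computed frame to integral coordinates
// when Auto Layout is off. `integral` rounds the rect outward to the
// smallest whole-point rect that contains it, so the image is no longer
// drawn on a fractional pixel boundary.
var frame = CGRect(x: 10.25, y: 20.6, width: 300.4, height: 200.0)
frame = frame.integral   // the CGRect equivalent of CGRectIntegral
// frame is now (10.0, 20.0, 301.0, 201.0)
```

Note that `integral` rounds outward, which can grow the rect by up to a point on each side; that is exactly the behavior you want here, since it guarantees whole-pixel coverage.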
So, to recap images, use Color Misaligned Images.
Again remember, it tracks two different issues.
Deliver appropriately sized assets.
Not too big, not too small.
Analyze your view hierarchy,
figure out what is the largest size image you really need
and resize your images ahead of time.
And finally, avoid misaligned pixels
by using integral coordinates or not turning off auto layout.
Visual effects are UI elements
that have those beautiful blurs and vibrancy to them.
They bring our UI to life.
And it's what we use all the time and we want you guys
to use all the time too.
So, let's take a look at them.
This is our menu.
If I zoom into it, you can see these pixels are not just
blurred, they're pulling in the background.
They're what we call multi-pass pixels.
What it means is that, for each pixel's value
to be mathematically correctly calculated, the compositing
has to happen in multiple passes.
And these multiple passes get stored off-screen.
So, we call them off-screen passes.
The option we're going to use
for this section is Color Offscreen-Rendered.
And that refers to these off-screen passes
that are caused by these visual effects.
Now, in this case, in this section,
I'm going to completely geek you guys out and talk a little bit
about what happens down in the GPU
as we render these special effects,
these off-screen passes.
So, here is our view hierarchy and we have our texture sitting
in VRAM and this is a portion
of the texture that's underneath the menu.
The first thing we do is we blur it.
And I can tell you that this blur is actually a very large
footprint blur, which means that we rendered a large area of texture,
convolved that area, and created every pixel
in this blurred region.
So, right off the bat, it caused an off-screen pass
with a lot of work.
Then I went ahead and saturated that and then I tinted that.
And finally, I lightened it.
And now I have the background to my menu that I'm happy with
and I can render from and continue
on with the rest of the menu.
So, that was everything we did once for that layer.
Now, if that thing was moving, I would do this at 60 hertz.
Sixty times a second, that's what would happen.
And this is where our detour comes from.
I'm telling you this not because there's anything you can do
to mitigate this portion, but I want you to be very aware
of the workload that you're placing on the GPU
by using some of these blurs.
Now for this particular case, Core Animation
has a very smart workaround, a property.
It is called shouldRasterize.
And what that does is it will snapshot the subtree,
your subview, into a bitmap,
cache that bitmap, and all the subsequent renderings will be
done from that bitmap.
So, let's look at how that works.
So, if you set shouldRasterize on this root layer,
we draw the background, we go through everything we just went
through to blur the white piece, we render the items on it
and now we cache this bitmap.
And from now on, every time I animate, we use this bitmap
to actually do the rendering.
It has massively simplified our rendering path
which is exactly what we want to do.
Now, I want to talk a little bit about use cases
of shouldRasterize, or what we call bitmap caching.
Use it on layers and views with heavy rendering cost.
Don't just set it everywhere.
It might slow you down,
since it will actually do the rendering off-screen and cache
that off-screen bitmap.
Use it on subviews with static content.
As you can imagine, if the content changes,
we will not render the wrong thing for you;
we will indeed render the correct data.
Which means if your content changes, we will toss the cached bitmap,
re-render, cache the new bitmap, and then continue on.
So, if you're doing that at 60 hertz,
it will definitely add extra effort to your compositing path.
Now, that doesn't mean you can't set it on sublayers that animate.
If you can, turn off shouldRasterize,
do the animation, and turn it back on.
So, during the animation, your content is not changing.
And this includes resizing of your layers if you're shrinking,
if you're expanding, anything that changes this bitmap.
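As a minimal sketch, assuming a `menu` view whose subtree is expensive to render but mostly static, and animation values picked just for illustration, that pattern might look like this:

```swift
import UIKit

// Sketch of bitmap caching with shouldRasterize on a heavy, mostly
// static subtree. rasterizationScale should match the screen scale,
// or the cached bitmap will look soft.
let menu = UIView(frame: CGRect(x: 0, y: 0, width: 400, height: 1080))
menu.layer.shouldRasterize = true
menu.layer.rasterizationScale = UIScreen.main.scale

// When the subtree is about to change, e.g. a resize animation, turn
// rasterization off for the duration so the cache is not regenerated
// on every frame, then turn it back on when the content is static again.
menu.layer.shouldRasterize = false
UIView.animate(withDuration: 0.3, animations: {
    menu.frame.size.width = 500   // resizing invalidates the bitmap
}, completion: { _ in
    menu.layer.shouldRasterize = true
})
```

The completion handler is the natural place to re-enable rasterization, since it runs exactly when the animated content stops changing.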
And there are few caveats I want to talk about.
Don't set it on every view, these bitmaps get cached
and the cache size is limited just like any cache
and you might end up constantly evicting your own bitmaps
out of cache if you do this.
Be mindful of non-opaque views,
you might not be changing your view bounds or animating it,
but if it's non-opaque, if it's transparent,
that transparency might cause the cache to be evicted.
And on this note, what changes the content of your cached bitmap
might not even be your own view or the subview
that you're caching; it might be something underneath it.
But since they're non-opaque and transparent,
you will pull off what's in the background into your view
and you will end up evicting your cache.
And finally remember that, again,
we don't render incorrectly if the bitmap changes,
we will toss the bitmap out of cache and regenerate it.
So, if your content is changing all the time,
you will purge your cache.
So, the most common design element that can get you
into these off-screen passes is using UIVisualEffect.
That has two variants, the BlurEffect and the VibrancyEffect.
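As a minimal sketch, with frame sizes chosen just for illustration, the standard UIKit way to set these up is UIVisualEffectView:

```swift
import UIKit

// Sketch of the standard blur-plus-vibrancy setup. These are exactly
// the effects that cost off-screen passes, so use them deliberately.
let blur = UIBlurEffect(style: .dark)
let blurView = UIVisualEffectView(effect: blur)
blurView.frame = CGRect(x: 0, y: 0, width: 400, height: 1080)

let vibrancy = UIVibrancyEffect(blurEffect: blur)
let vibrancyView = UIVisualEffectView(effect: vibrancy)
vibrancyView.frame = blurView.bounds
// Vibrant content (labels, icons) must go into the contentView,
// not the effect view itself.
blurView.contentView.addSubview(vibrancyView)
```

Note that the vibrancy effect is created from the same blur effect it sits on, which is what lets the system compute the vibrant colors correctly.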
I'm going to mention this one case.
It's a little bit uncommon, but for folks who are using Metal
and EAGL layers: if you're using Core Image filters,
there are some Core Image filters
that also cause off-screen passes.
There's the Gaussian blur filter that will definitely do that.
And finally, some of Core Animation's properties
like Group Opacity will also cause off-screen passes.
And that's not a very visually sophisticated feature;
it's just that its characteristics will cause an off-screen pass.
And if you don't know what Group Opacity is,
catch me outside, I'll let you know.
So, to take away, our option is Color Offscreen-Rendered.
It's a yellow option, so it will put a yellow overlay
over your blurred areas or off-screen areas.
Be smart about your design.
Simplify your design if you can.
Sometimes, and I'm going to tie that into the next item,
you can mathematically combine your visual effects
to simplify these off-screen passes.
In my case, except for the blur pass, which had to be done
by itself, I was able to combine all the other off-screen passes
that we saw, the saturate, the tint, the lighten,
into one extra pass.
So, quite often, you can actually do that too,
especially if you're using the CoreImage filters.
And also visually inspect your design.
If adding an extra UI visual effect doesn't really visually
change the pixels, then don't add it.
And use shouldRasterize when possible but remember
that it's a cache and it's not unlimited.
So, we talked about all of these cases
and we use Core Animation instruments.
But before wrapping up,
I want to bring your attention once again to Instruments
and talk about a couple more.
One of them is Time Profiler.
This is a lightweight sampling of where your app
is spending its time, and it happens at runtime.
And if you've never run it, I promise, you'll be surprised
at where you're spending your time.
The second one I want to talk about is Allocations
and it goes hand in hand with Leaks.
Allocations will track your VM allocations and your heap.
This is the instrument that caught my error
of having this massive image
decompressed into memory.
It will catch the large allocations, and you can
go investigate and see why you're doing that.
And the second one is Leaks.
And, of course, I don't have to tell you folks, what Leaks does.
It's a good profiler.
To summarize our tuning session,
we talked about how graphics adds these extra detours
that we normally don't think about.
So, to reduce redundant rendering,
just flatten your assets.
Remember that we are in layer-based compositing system.
We will render the layers one by one, as long as we need to,
and we touch these pixels multiple times.
So, the flatter you make your hierarchy,
the fewer times you touch those pixels.
Reduce your download and allocations
by resizing your assets appropriately.
And finally, minimize your per-frame workload
by simplifying your hierarchy and just being very judicious
about using some of these expensive GPU effects.
And use bitmap caching when you need it.
This is the shouldRasterize but, again, it's a cache,
use it very carefully.
Of course, Instruments.
Please use Instruments, it's a great tool.
It's available for you.
It's there for a reason.
The profiles ship for a reason.
They represent issues that we see very often.
So it will be very helpful for you.
More information, of course,
our landing page has a lot of information.
Here, I've listed a couple of Core Animation documents.
Core Animation Programming Guide,
it has a lot of tips and tricks.
What I've talked about today is very, very relevant
to tvOS apps, but this document has other tips and tricks
that will be helpful for iOS apps.
And then CALayer Reference Guide also talks a little bit more
about how the system works under the hood
and it will help you architect your app to work better with it.