Learn about the latest image formats and video technologies supported in Safari 17. Discover how you can use JPEG XL, AVIF, and HEIC in your websites and experiences and learn how they differ from previous formats. We'll also show you how the Managed Media Source API draws less power than Media Source Extensions (MSE) and explore how you can use it to more efficiently manage streaming video over 5G.
♪ ♪ Jean-Yves: Welcome to "Explore media formats for the web." I'm Jean-Yves Avenard, a WebKit engineer, and today I will go over media formats supported in Safari, focusing on images and video, and introduce you to some new technologies we made available in Safari 17. We are adding support for some new image formats, and I'm going to help you choose the right one for your site after a short presentation on the most commonly found ones. Then I'll guide you through optimizing MSE with a brand-new technology we've implemented in Safari 17. Finally, I will present how you can add AirPlay support to your videos using Media Source Extensions. Over the years, three formats have been the most widely used. They are supported by all browsers and are easily created and deployed, but technology has made tremendous progress over the last decade, and great new formats are now available. The incumbent formats are GIF, JPEG, and PNG. Let's look at them in more detail.
GIF, or as we properly say in my country of birth, "jeef," is a format introduced 36 years ago that is best suited for simple animations, memes, and social media content. It does not support a full color palette, being limited to 8-bit color, that is, 256 colors per frame. Since it uses lossless compression, file sizes can be quite large, making it less suitable for longer animations.
By using these formats, you can reach all your users, regardless of what web browser they are using.
Safari 17 supports four additional modern formats, which you will want to use alongside the legacy ones. While they are all great and largely interchangeable, each has its key advantages over the others. WebP was added in Safari 14 and macOS Big Sur. This is a modern image format that uses advanced compression algorithms to achieve smaller file sizes without sacrificing image quality.
WebP files are typically smaller than those of the earlier image formats, which can help improve website performance and loading times. WebP also lets you do animations with video-like quality, so use it where GIF would be a bad idea due to its size or limited colors.
One exciting addition to Safari 17 is JPEG-XL, a new image format that's designed to offer high compression rates and image quality.
JPEG-XL uses a new compression algorithm called "Modular Entropy Coding" that allows for greater flexibility in adjusting the compression ratio.
It is well suited for images that may be served over slow connections; like JPEG, it can be displayed progressively, so users can see something before the whole image is fully loaded. A key feature of JPEG-XL is that you can losslessly convert, that is, without incurring any data loss, your existing JPEG files to JPEG-XL, all while significantly reducing their size, by up to 60%. It's a relatively new format, and so may not yet be supported by all devices and browsers. AVIF is another modern image format that uses the AV1 video codec to achieve high compression rates without sacrificing image quality. It is well suited for live photos and supports up to 12-bit color depth. Of these new formats, it has the broadest support across browsers, and you should include it as a fallback. AVIF can be up to ten times smaller than JPEG. In Safari 17, we added support for HEIC, also known as HEIF. It's an image format that uses the HEVC video codec's compression algorithm to achieve small file sizes. But since it's not widely supported on other platforms, you will probably only want to use it as an alternative format. This is the format used by iPhones and iPads to save photos, so you can directly handle photos uploaded from an iPhone with no conversion. If you intend to display images using a WKWebView inside your app, this is the format you should be using, as it's hardware accelerated and can be rendered quickly and efficiently.
One of the key advantages of JPEG-XL, AVIF and HEIC, however, is that they support both wide color gamut and HDR. Supporting billions of colors, the large color gamut allows more colors to be preserved on file and represented on screen.
HDR lets you better define how dark the darks can be, how bright the highlights can be, and how much light can be captured. Together, you get more vibrancy in outdoor landscapes, or very bright scenes with lots of contrast, or perfect rendering of those beautiful, complex skin tones.
As you want your website to continue supporting all widely deployed web browsers, you will likely still need to provide GIF, JPEG, and PNG for the years to come.
However, by offering these extra formats, you can make your site quicker to load and use less bandwidth, while still staying compatible. You really don't need to choose. Let me show you how. Declaring an image element with only a JPEG-XL image means older browsers, and browsers that don't support the format, will show a broken image.
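As a sketch, here is a bare image element with a single JPEG-XL source, followed by a picture-based fallback chain; the file names and alt text are hypothetical:

```html
<!-- A bare img with only a JPEG-XL source: browsers without
     JPEG-XL support show a broken image. -->
<img src="photo.jxl" alt="A sunset over the bay">

<!-- With the picture element, the browser picks the first source
     it supports, top to bottom, and otherwise falls back to the
     img element's JPEG. -->
<picture>
  <source srcset="photo.heic" type="image/heic">
  <source srcset="photo.jxl" type="image/jxl">
  <source srcset="photo.avif" type="image/avif">
  <source srcset="photo.webp" type="image/webp">
  <img src="photo.jpg" alt="A sunset over the bay">
</picture>
```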
The picture element in HTML allows you to specify alternative sources, letting the browser choose a format that it supports. You can even provide multiple alternative sources, prioritizing the format that gives the best performance. The browser will look over the list of available formats from top to bottom. So here, it will use HEIC first if supported. If no match is found, or if decoding failed along the way, the URL of the image element's source is selected. That's how easy it is to provide the right format for people regardless of device support, without needing to look at the User-Agent string or worry about which browser the user will use. You don't need to choose. Let the browser do it for you.

Now that we know the modern image formats we can use and when to use them, let's take a look at video, and specifically, let's dive into adaptive streaming video. The evolution of video presentation on websites has been a fascinating one, and it has come a long way since the early days of the web. Here are some of the key milestones. In the early days of the web, video was not commonly used on websites due to technical limitations. Websites were primarily made up of text and static images. In the early 2000s, browser plugins like Flash and QuickTime emerged as a popular way to add video to websites. And in 2010, HTML5 was introduced, which made it possible to embed video directly into webpages without the need for those plugins. This made it easier to add video to websites, and also allowed for greater flexibility and control over how video content was displayed and played. And WebKit was at the forefront of this revolution. With the rise of mobile devices, it became increasingly important for websites to be able to display video content on smaller screens. This led to the development of new techniques allowing websites to adapt to different screen sizes and orientations.
HTTP Live Streaming was introduced by Apple in 2009. One of the key features of HLS is its support for adaptive bitrate streaming, which allows for the delivery of the best possible video quality based on the user's internet connection speed and device capabilities. Adaptive streaming in HLS works by dividing the video content into small chunks or segments, typically between two and ten seconds in length. Each segment is encoded at multiple bitrates, and these different bitrate versions are made available to the client via a manifest file in the form of an M3U8 multi-variant playlist. HLS does a brilliant job at selecting the most suitable variant. It's very simple to use and is the best solution for end users. Unfortunately, on desktop, not all browsers included support for HLS, and even today, only Safari supports it. Web developers wanted more control and more flexibility, such as over the selection and transfer of media data, or the ability to play DRMed content on desktop. And so in 2013, the Media Source Extensions specification was published by the W3C. Safari 8, along with other browsers, quickly added support for it. Media Source Extensions, or MSE, is a low-level toolkit that allows for adaptive streaming by giving the webpage more control over, and responsibility for, managing buffering and resolution. Overall, MSE has been a game changer for web developers. It has enabled the development of high-quality streaming experiences on the web and is now the most used web video technology. But MSE has some drawbacks. It isn't particularly good at managing buffer levels, the timing and amount of network access, or media variant selection. These inefficiencies have largely been immaterial on relatively powerful devices like modern general-purpose computers. Power usage on mobile devices, however, was much higher than with the HLS native player, and so MSE was never made available on iPhone, because we couldn't achieve the required battery savings with MSE.
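As a brief aside, the M3U8 multi-variant playlist mentioned above pairs each bitrate variant with a stream URL, roughly like this; all bandwidths, resolutions, codec strings, and paths here are hypothetical:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360,CODECS="avc1.4d401e,mp4a.40.2"
low/stream.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720,CODECS="avc1.4d401f,mp4a.40.2"
mid/stream.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=6000000,RESOLUTION=1920x1080,CODECS="avc1.640028,mp4a.40.2"
high/stream.m3u8
```

The client reads this manifest, then picks and fetches segments from whichever variant best matches current network conditions.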
All our testing of various sites showed that enabling MSE would have led to a regression in battery life.
On devices with narrower capabilities, or where the connectivity is spotty at best, it can also be difficult to achieve the same quality of playback with the MediaSource API as is possible with HLS.
One of the reasons for this is that MSE transferred most control over the streaming of media data from the User Agent to the application running in the page.
This transfer of control introduced inefficiencies: the page does not have the same level of knowledge, or even the same goals, as the User Agent (which may, for example, prefer the cheapest networking path), and this typically led to much higher power usage. This year, we wanted to address those drawbacks, and we worked hard to find a way to combine the flexibility provided by MSE with the efficiency of HLS. And so I am super pleased to introduce this new technology that combines the best of MSE with the elements that make HLS awesome: the Managed Media Source API. A "managed" MediaSource is one where more control over the MediaSource and its associated objects has been given over to the browser. It makes it easier for media website authors to support streaming media playback on constrained-capability devices, all while allowing User Agents to react to changes in available memory and networking capabilities. Let's go over some of the differences from the old MSE. Managed Media Source can reduce power usage by telling the webpage when it's a good time to buffer more media data. When not buffering, it allows the cellular modem to go into a low power state for longer periods of time, increasing battery life.
When the system gets into a low memory state, Managed Media Source will intelligently clear out unused or abandoned buffered memory, making pages more efficient.
Because Managed Media Source tracks when buffering should start and stop, the page's job of detecting low buffer and full buffer states becomes much easier. The browser does it for you.
With these improvements in place, Safari can send media requests over the 5G modem. This allows your site to use the blazingly fast 5G network to load media data incredibly quickly, while having minimal impact on power use. And if you need to play a live show, Managed Media Source will automatically detect it and switch to LTE or 4G, where available, to extend battery life. You are still in the driver's seat. You are still in control of which resolution to fetch, and how and from where you download each segment. Managed Media Source only provides hints and gives you a more efficient version of MSE. By using ManagedMediaSource, you will save bandwidth and battery life, allowing your users to watch your video for longer, not only on their iPhones, but also on their iPads and Macs. Let me show you how easy it is to migrate from MSE to Managed Media Source. Transitioning your video player from MSE to Managed Media Source only requires a few steps. Here, I will open a very simple HTML page I have used many times in the past to test MSE development. It creates a video element, loads 12 seconds of data, and then plays it. All the logic actually occurs in a utility file, mediasource.js, that is included. Let's look at it, and in particular, the method runWithMSE. runWithMSE waits for the page to load, creates a video element, attaches it to a MediaSource object, and appends it to our HTML's body. First, you need to ensure that Managed Media Source is available. This is easily done by checking that the ManagedMediaSource object is defined, and if not, falling back to using MSE. Then replace any call to MediaSource with ManagedMediaSource itself. Another way, and in my opinion an easier one, is to override MediaSource itself, like so: you define a method getMediaSource() and set a MediaSource shim.
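A minimal sketch of such a shim, assuming the getMediaSource() helper name from the talk; the globalScope parameter is added here so the selection logic can also run outside a browser:

```javascript
// Return ManagedMediaSource when the browser provides it (Safari 17+),
// otherwise fall back to the classic MediaSource constructor.
function getMediaSource(globalScope = globalThis) {
  if (globalScope.ManagedMediaSource) {
    return globalScope.ManagedMediaSource;
  }
  if (globalScope.MediaSource) {
    return globalScope.MediaSource;
  }
  throw new Error("Neither ManagedMediaSource nor MediaSource is available");
}

// In a page, you would then construct whichever class was selected:
//   const mediaSource = new (getMediaSource())();
```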
And now, whenever you are referring to MediaSource, you will actually be using ManagedMediaSource instead. Managed Media Source should always be your first choice over the older MSE. Now, back to my HTML page.
After creating the SourceBuffer, now a ManagedSourceBuffer, you add two event handlers. One handles the "startstreaming" event, which notifies the player when it should start fetching new content and appending it to the managed SourceBuffer.
And one handles the "endstreaming" event, which tells the player when to stop fetching new data: the user agent has determined that it has enough data and can now enter low power mode. For this demonstration, the endstreaming event handler is just a placeholder. Unlike with MSE, your SourceBuffer may evict content at any time, not just when appending data. Even with MSE, it was never a good idea to assume your buffered range only grew when appending new data; that assumption could cause playback to stall, and the MSE specification encouraged you to check the buffered ranges regularly. So you also need to add an event handler for the "bufferedchange" event, where you check which data was evicted.
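The event wiring just described can be sketched like this. The event names come from the talk; the helper names and the onStart/onStop/onEvicted callbacks are hypothetical, and the code is written against plain EventTargets so the logic works on any conforming implementation:

```javascript
// Wire the managed source's streaming hints to player callbacks.
function attachStreamingHandlers(source, { onStart, onStop }) {
  // Fired when the browser wants the player to fetch and append more data.
  source.addEventListener("startstreaming", onStart);
  // Fired when the browser has enough data buffered; fetching can pause
  // and the modem can drop into a low power state.
  source.addEventListener("endstreaming", onStop);
}

// Watch for evictions: a managed SourceBuffer may drop data at any time,
// and "bufferedchange" reports what was removed so it can be re-fetched.
function attachBufferHandlers(sourceBuffer, onEvicted) {
  sourceBuffer.addEventListener("bufferedchange", (event) =>
    onEvicted(event.removedRanges));
}
```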
If you follow the guidance of the Managed Media Source events and only append data when the element asks you to, you will get access to 5G speed on iPhone and iPad, giving your users access to higher resolutions, shorter rebuffering times, and the best possible battery life. You are now ready to handle adaptive streaming using the new ManagedMediaSource available in Safari 17. Though if all you care about are Apple devices, using HLS instead will likely make more sense. There's one more thing your users will want to continue doing: AirPlaying to their favorite TVs. One of the great benefits of using native HLS was the automatic support for AirPlay. With AirPlay, while sitting on your couch, you can move the video from your phone to your big AirPlay device. AirPlay requires a URL that you can send, which doesn't exist in MSE, and this created a problem we also wanted to resolve. Earlier, when choosing the right image format, I showed you how the picture element lets you add alternative sources. The video element offers the same mechanism. Simply add your HTTP Live Streaming playlist to a child source element of the video, and when the user AirPlays your content, Safari will switch away from your Managed Media Source and play the HLS stream on the AirPlay device. Safari will automatically add an AirPlay icon to the video player controls and let the user AirPlay the video. If this all sounded too complicated, you can use frameworks, such as HLS.js, that will support Managed Media Source automatically when available and will do all the hard work for you. Making use of HLS.js to handle your video is fairly easy, and it works on all web browsers, even those not natively supporting HLS. First, you will need to create a video element in your HTML file, as usual. We first check if HLS is natively supported by the browser. If it is, we can directly set the video source attribute to the manifest URL.
If not, we check if HLS.js can run, and if so, create a new instance of the HLS.js library and attach it to the video element with ID "my-video." We then load the HLS playlist file, in this case, my-video.m3u8. That's it. With these steps, you should be able to play HLS videos on most browsers.
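These steps can be sketched as a small helper; setupHls is a hypothetical name, the video element id "my-video" and the playlist my-video.m3u8 come from the talk, and hlsLib stands for the HLS.js global (Hls). Hls.isSupported(), loadSource(), and attachMedia() are real HLS.js API calls:

```javascript
// Choose the playback path: native HLS if the browser supports it,
// HLS.js (driving Media Source) otherwise.
function setupHls(video, hlsLib, playlistUrl) {
  if (video.canPlayType("application/vnd.apple.mpegurl")) {
    // Native HLS (Safari): point the element straight at the playlist.
    video.src = playlistUrl;
    return "native";
  }
  if (hlsLib && hlsLib.isSupported()) {
    // Otherwise let HLS.js handle fetching and buffering for us.
    const hls = new hlsLib();
    hls.loadSource(playlistUrl);
    hls.attachMedia(video);
    return "hls.js";
  }
  return "unsupported";
}

// In a page:
//   setupHls(document.getElementById("my-video"), window.Hls, "my-video.m3u8");
```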
When designing Managed MSE, we wanted to make sure that nothing was left out by accident and that users continue to get the same level of features as they did in the past. So to activate Managed MSE on Mac, iPad, and iPhone, your player must provide an AirPlay source alternative. You can still have access to Managed MSE without one, but you must then explicitly disable AirPlay by setting disableRemotePlayback, from the Remote Playback API, on your media element. And that's it. Managed MSE supports all the same great technology we added last year, such as SharePlay, spatialized audio, and HDR. Managed MSE is available in Safari 17 on macOS and iPadOS, and behind an experimental flag on iPhone. We are so excited for this to finally be coming to iPhone. I hope you'll try the new image formats and experiment with Managed Media Source. Please make sure to test your site with Safari. Also, we release Safari Technology Preview fortnightly, where you can test the latest features as they become available and before they reach end users. Like all actively developed programs, glitches and bugs happen from time to time, and we would be immensely grateful if you reported them to bugs.webkit.org should you encounter any. You can also submit comments or suggestions. We are always listening. You can learn about new CSS features in Safari in "What's new in CSS." And check out "Rediscover Safari developer features" to learn about turning on feature flags to try out Managed Media Source. Thank you for watching. ♪ ♪