Now Playing and Remote Commands on tvOS
Consistent and intuitive control of media playback is key to many apps on tvOS, and proper use and configuration of MPNowPlayingInfoCenter and MPRemoteCommandCenter are critical to delivering a great user experience. Dive deeper into these frameworks and learn how to ensure a seamless experience whether your app is being controlled using Siri, the Siri Remote, or the iOS Remote app.
Welcome to Now Playing and Remote Commands on tvOS. I'm Justin Voss, an engineer on tvOS, and in this talk we'll cover how you can provide great metadata and playback experiences for your viewers. First, let's talk about providing Now Playing information. Having accurate and complete metadata about your content will help viewers understand what they're watching across a variety of interfaces on tvOS and iOS. The metadata you provide is displayed in several places. For video content, the built-in AVPlayerViewController displays metadata along the top of the video screen.
For both video and audio content, the TV Remote app for iOS displays both metadata and playback controls. And for audio content, your metadata will be displayed in a badge in the corner of the screen, and as a full-screen display when the user is idle.
Let's look at some screenshots of these to better understand them. In this screenshot, you can already tell that we're watching a WWDC talk from last year, because the metadata onscreen makes that super clear. This is an example of using the built-in AVPlayerViewController for video content. Here you can see the same talk, but displayed in the TV Remote app for iOS. You can also see the playback controls, which we'll cover later in the talk. For audio content, when the Now Playing information changes, the listener will see a notification on their screen with the updated metadata. In this screenshot, you can see what this would look like if we were listening to an audio-only version of that WWDC talk.
Finally, if the user puts down the remote for a while, tvOS will automatically display the Now Playing metadata in a full-screen view like the one you see here, which really shows off the artwork.
So, hopefully I've convinced you that providing the complete set of Now Playing information is worth your time.
Let's talk about some of the different ways you can do that. You have a few choices, depending on what kind of technology you're using in your app.
Those of you using TVML can add metadata using the MediaItem JavaScript object.
This object has several properties for Now Playing info: a title, subtitle, and description; a URL to an artwork image; and even information about the content rating, such as PG-13 or R, and whether the content is explicit. In the code sample here, you can see that I'm creating a MediaItem object with a URL to a video, then configuring a title and description on it and providing a URL to an artwork image.
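In TVJS, a minimal sketch of that sample might look like this (the video and artwork URLs here are placeholders):

```javascript
// Create a MediaItem for a video and attach Now Playing metadata.
var video = new MediaItem("video", "https://example.com/videos/talk.m3u8");
video.title = "Now Playing and Remote Commands on tvOS";
video.description = "Providing great metadata and playback experiences.";
video.artworkImageURL = "https://example.com/artwork/talk.jpg";

// Queue the item in a player as usual.
var player = new Player();
player.playlist = new Playlist();
player.playlist.push(video);
player.play();
```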
If you're using AVKit to play your content, you'll want to use the AVMetadataItem class to provide Now Playing info. For each piece of metadata you want to provide, such as a title or artwork, you'll create an AVMutableMetadataItem and configure several properties on it. You provide an identifier, which tells AVKit what kind of metadata this represents; this is how we tell the difference between the object that represents the title and the one that represents the description, and so on. You provide the value of the metadata itself, which may be a string or raw data bytes. And you should set the extendedLanguageTag to specify what language the metadata is intended for. Unless you have a good reason to do otherwise, we strongly recommend that you use the value "und", which means undefined.
The way this property works is that the metadata will only be visible to the user if the user's locale matches the metadata's locale. So, for example, if you had a movie that was produced in the USA, you might be tempted to provide the title of that movie with the language tag for English, since the title is in English. But if you did that, a viewer in Germany wouldn't see any title at all, because the metadata language doesn't match their language. If you had instead used the tag "und", everyone would see the metadata regardless of their language settings.
Finally, in some cases you need to give us a hint about how to interpret the value you've provided. In the case of artwork, you'll give us the raw bytes of the image as the value, and then use the dataType to tell us whether the image is a JPEG or a PNG. Let's look at some code samples to get a feel for how this works. Here you can see the code where I'm creating an AVPlayerItem and giving it a title and description.
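A sketch of that sample, with a placeholder asset URL and example text values:

```swift
import AVFoundation

let asset = AVURLAsset(url: URL(string: "https://example.com/videos/talk.m3u8")!)
let playerItem = AVPlayerItem(asset: asset)

// First large code block: a metadata item representing the title.
let titleItem = AVMutableMetadataItem()
titleItem.identifier = AVMetadataIdentifier.commonIdentifierTitle
titleItem.value = "Now Playing and Remote Commands on tvOS" as NSString
titleItem.extendedLanguageTag = "und"

// Second large code block: a metadata item representing the description.
let descriptionItem = AVMutableMetadataItem()
descriptionItem.identifier = AVMetadataIdentifier.commonIdentifierDescription
descriptionItem.value = "Providing great metadata and playback experiences for your viewers." as NSString
descriptionItem.extendedLanguageTag = "und"

// Last line: attach the items as external metadata on the player item.
playerItem.externalMetadata = [titleItem, descriptionItem]
```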
In the first large code block, I'm creating an AVMutableMetadataItem to represent the title. The first thing I do is set the identifier to AVMetadataCommonIdentifierTitle so that AVKit knows that this metadata item is the title. Then I set the actual value as a string. The AVMetadataItem API requires these values to be NSObject types, so I'm casting the string to an NSString.
Finally, I'm setting the extendedLanguageTag to "und", because the title of this talk should appear the same worldwide. In the second large code block, I'm doing most of the same things to assign the description. Other than providing a different value, you can see that on this line I'm setting the identifier to AVMetadataCommonIdentifierDescription so that AVKit knows how to interpret this item.
Finally, on the last line, I assign an array of these metadata items to the AVPlayerItem's externalMetadata property. If you want to provide the release date of your content as a metadata item, you can do that by formatting your release date as a string in a specific format. This code sample shows you how. Here you see that I'm creating the date object for an arbitrary date. Then I create a date formatter and provide a specific date format string on this line. To create the metadata item, I use the identifier AVMetadataCommonIdentifierCreationDate, and finally, I use the date formatter to convert the date object into a formatted string and cast that string to an NSString.
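As a sketch, assuming a plain ISO 8601 date format (the exact format string may differ):

```swift
import AVFoundation

// An arbitrary release date.
let components = DateComponents(year: 2017, month: 6, day: 5)
let releaseDate = Calendar.current.date(from: components)!

// Format the date as a string in a specific format.
let formatter = DateFormatter()
formatter.dateFormat = "yyyy-MM-dd"

let dateItem = AVMutableMetadataItem()
dateItem.identifier = AVMetadataIdentifier.commonIdentifierCreationDate
dateItem.value = formatter.string(from: releaseDate) as NSString
dateItem.extendedLanguageTag = "und"
```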
Now in AVPlayerViewController viewers will be able to see the release year of the video alongside the other Now Playing info.
Finally, here's an example of how to provide image metadata for your content.
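Here's a sketch, assuming a placeholder JPEG named artwork.jpg in the app bundle:

```swift
import AVFoundation
import CoreMedia

// Load the artwork image as raw data from the app bundle.
let artworkURL = Bundle.main.url(forResource: "artwork", withExtension: "jpg")!
let artworkData = try! Data(contentsOf: artworkURL)

let artworkItem = AVMutableMetadataItem()
artworkItem.identifier = AVMetadataIdentifier.commonIdentifierArtwork
artworkItem.value = artworkData as NSData
// Tell AVKit how to interpret the raw bytes: this image is a JPEG.
artworkItem.dataType = kCMMetadataBaseDataType_JPEG as String
artworkItem.extendedLanguageTag = "und"
```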
Here you can see that the first thing I need to do is get my image as a Data object. I'm doing that by loading the image from a JPEG that's in my app bundle, but you could get your artwork from any source. Then, to create the metadata item, the first thing I need to specify is the identifier, which in this case should be AVMetadataCommonIdentifierArtwork.
Next, I provide the artwork data itself, casting the Swift Data object to an NSData.
Finally, to indicate what kind of image this is, I need to set the dataType property. Since I know this image is a JPEG, I provide the corresponding constant value to indicate that.

Okay, to give you a visual reference of where some of these items appear onscreen, here's a screenshot of AVPlayerViewController with some annotations on it. The color coding shows where each metadata identifier is displayed: this is where the artwork is displayed, then the title, the creation date, and the description. And here's a similar screenshot of the TV Remote app. You can see that the Remote app doesn't display exactly the same set of metadata as AVPlayerViewController: it displays the same artwork and title, but instead of the description, it displays the artist and the album name.
You can see the AVFoundation identifiers to use here.
Okay, that's all for AVKit. If you're playing your video using some other technology, like Video Toolbox, or if you're playing audio-only content, then you'll need another way to provide Now Playing info. In those cases, you can use MPNowPlayingInfoCenter.
This is a singleton object with a dictionary property that you write your metadata into: there's a key for the title, for the album name, for artwork, and so on. In addition to the metadata you're probably expecting, you can also specify explicitly whether the content is audio or video. You can also provide information about the content's duration and the user's current playback position. Your artwork is not provided as raw bytes, but as an MPMediaItemArtwork object, which we'll talk about more in just a minute.
Finally, it's your responsibility to update this metadata dictionary as the playback state changes. You don't need to update it every second or even every minute, but you should update it when certain events occur: when the currently playing item changes; when any metadata about the currently playing item changes, such as the title or artist name; when the user seeks to a new position in the content; when the playback rate changes; and when playback begins or stops. Let's take a look at some code.
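A sketch of the two code blocks described next (the asset name, title, and duration are placeholder values):

```swift
import MediaPlayer
import UIKit

// First block: an artwork object whose handler returns the best-fitting image.
let artworkImage = UIImage(named: "artwork-large")!
let artwork = MPMediaItemArtwork(boundsSize: artworkImage.size) { requestedSize in
    // Return the already-downloaded variant that most closely fits
    // requestedSize; avoid expensive resizing here.
    return artworkImage
}

// Second block: publish the metadata dictionary.
MPNowPlayingInfoCenter.default().nowPlayingInfo = [
    MPMediaItemPropertyTitle: "Now Playing and Remote Commands on tvOS",
    MPMediaItemPropertyArtwork: artwork,
    MPNowPlayingInfoPropertyMediaType: MPNowPlayingInfoMediaType.audio.rawValue,
    MPNowPlayingInfoPropertyElapsedPlaybackTime: 0.0,
    MPMediaItemPropertyPlaybackDuration: 1800.0,
    MPNowPlayingInfoPropertyPlaybackRate: 1.0
]
```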
Here in the first code block, I'm creating the object that's going to represent my artwork.
The way this MPMediaItemArtwork class works is that you provide us with the native size of the image, plus a block that we'll call later with specific image sizes. This block should return a UIImage object that closely fits inside the size that's passed to the block. So, for example, your app may have downloaded several different sizes of the same artwork, say in small, medium, and large sizes. When we call this block, you should take the size we provide and return the image that most closely fits the requested size.
We would discourage you from trying to perform expensive image resizing operations here; just return the image that you already have.
In the second block of code, this is where I'm actually providing all of my metadata to MPNowPlayingInfoCenter. You can see that this is just a plain Swift dictionary with some keys that are provided by the framework and I'm simply providing my values as regular Swift types.
There are two properties I want to call out in particular, though: the elapsed playback time and the playback duration. Like I mentioned earlier, you should provide these keys so that tvOS knows how long your content is and where the user currently is within the content. As various playback events occur, you should update the elapsed playback time to match where the user currently is.
There's no AVPlayerViewController when you use this API on tvOS, but I can show you what this metadata looks like in the TV Remote app. You can see where each property is displayed onscreen. These should look pretty familiar compared to the AVKit version.
Take note, though, that in this version the scrubber bar is under your control: the elapsed playback time and the duration need to be provided by your app to be displayed correctly here.
All right, let's change gears a bit now and talk about how your app can handle external playback commands. Of course, your app will have its own controls that the user can interact with when your app is displayed, but the user may also want to control your app from the TV Remote app on iOS. And if they're listening to background audio from your app on tvOS, pressing the play/pause button on the Siri Remote can pause your app even while it's in the background. To support these kinds of interactions, you will need to make sure your app can handle these Remote Commands.
The way to do that is with an API called MPRemoteCommandCenter.
This is another singleton object that has a property for each kind of command your app can choose to support. For each command you want to support, you register either a target-action pair or a callback block, which will be invoked when the command is performed. If you provide a target-action or a block, the command is assumed to be supported. If you want to provide a handler for the command but need to temporarily indicate that it's not available, you can mark it as disabled. The method or block that you provide must return an enum value indicating whether your app was able to successfully perform the command. The definition of success here is pretty broad. For example, if the user is playing the last song in a playlist and requests to skip to the next track, your app may choose to simply end playback. That would still be considered successfully handling the command, so you should return the success value in that case.
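For instance, here's a minimal sketch of registering a pause handler, assuming a placeholder stream URL and an AVPlayer named player:

```swift
import AVFoundation
import MediaPlayer

let player = AVPlayer(url: URL(string: "https://example.com/audio/talk.m3u8")!)
let commandCenter = MPRemoteCommandCenter.shared()

// Register a block to be invoked when the pause command is performed.
commandCenter.pauseCommand.addTarget { _ in
    player.pause()
    return .success
}

// A command with a handler can be temporarily marked as unavailable.
commandCenter.nextTrackCommand.isEnabled = false
```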
Some commands, like play and pause, are pretty simple: if the handler for the pause command is invoked, it's pretty clear what your app needs to do. But other commands are more flexible; they both accept parameters from your app and provide parameters back to your handler.
For example, the MPSkipIntervalCommand is used to allow the user to skip forward or backward by several seconds.
Your app can express a preference for how many seconds the user should be allowed to skip, and the skip command object allows you to configure that preference. When the user actually performs the skip command, your handler will be invoked with an object parameter of type MPSkipIntervalCommandEvent, which has an interval property that tells your app how many seconds to skip. And be careful: this may not be the same as the skip preference you provided earlier.
Let's look at an example. Here's how an app might implement the skip backward command.
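A sketch of that implementation; player is the AVPlayer from the earlier sketch, and updateNowPlaying() is the function shown a little later:

```swift
import AVFoundation
import MediaPlayer

let skipBackwardCommand = MPRemoteCommandCenter.shared().skipBackwardCommand

// Prefer 10-second skip intervals.
skipBackwardCommand.preferredIntervals = [10]

skipBackwardCommand.addTarget { event in
    // Every handler receives a generic MPRemoteCommandEvent; cast it
    // to reach the interval property.
    guard let skipEvent = event as? MPSkipIntervalCommandEvent else {
        return .commandFailed
    }
    // AVPlayer expresses time as CMTime, so compute the new position
    // as a CMTime as well.
    let currentTime = player.currentTime()
    let interval = CMTime(seconds: skipEvent.interval,
                          preferredTimescale: currentTime.timescale)
    let newTime = CMTimeSubtract(currentTime, interval)
    player.seek(to: newTime) { _ in
        // Seeking changes playback state, so republish Now Playing info.
        updateNowPlaying()
    }
    return .success
}
```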
You can see here that on the first line, I would prefer that the user skip backward in 10-second intervals, so I'm assigning the preferredIntervals property on the skipBackwardCommand object. Then I provide my handler block. Now, technically, every command handler receives the generic type MPRemoteCommandEvent as its parameter, but I need to cast it to an MPSkipIntervalCommandEvent so that I can get to its interval property.
Then I'm going to actually perform the command. In this sample, I'm playing my content with an AVPlayer. Since AVPlayer expresses its time using CMTime structs, that's how I'll need to calculate my new playback position.
You can see here that I'm getting the player's current time, then creating a new CMTime struct from the skip interval.
Finally, I'm subtracting that interval from the current time.
I ask the player to seek to that time and then return success from the command handler. When the player finishes seeking to the new time, my completion handler will be invoked, and that's where I'm going to update my Now Playing info. Remember, if we modify playback state, like seeking to a new time, we need to publish new Now Playing info so we can update the elapsed playback time.
Here I'm calling a function called updateNowPlaying which I'll show you on this next slide.
Here you can see how I'm going to update my Now Playing info. This is basically the same as the example I showed you at the beginning of this section, but I want to call out one specific detail.
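A sketch of that function; the title and duration are placeholder values, and player is the AVPlayer from the earlier sketches:

```swift
import AVFoundation
import MediaPlayer

func updateNowPlaying() {
    var info: [String: Any] = [
        MPMediaItemPropertyTitle: "Now Playing and Remote Commands on tvOS",
        MPMediaItemPropertyPlaybackDuration: 1800.0,
        MPNowPlayingInfoPropertyPlaybackRate: player.rate
    ]
    // AVPlayer reports the position as a CMTime, but the info dictionary
    // wants elapsed seconds as a floating-point value.
    info[MPNowPlayingInfoPropertyElapsedPlaybackTime] =
        CMTimeGetSeconds(player.currentTime())
    MPNowPlayingInfoCenter.default().nowPlayingInfo = info
}
```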
Here you can see that I'm using the AVPlayer's currentTime property in order to provide an accurate value for the elapsed playback time property.
In this case, since AVPlayer returns a CMTime but MPNowPlayingInfoCenter requires a floating-point value (the number of elapsed seconds), I need to use the CMTimeGetSeconds function to convert the value.
And that's all there is to providing a great playback experience with Remote Commands and Now Playing information.
For more information, you can visit the URL shown here to find links to documentation and related sessions. Thanks for watching.