tvOS 10
This article summarizes the key developer-related features introduced in tvOS 10. The article also lists the documents that describe new features in more detail.
For late-breaking news and information about known issues, see tvOS downloads. For the complete list of new APIs added in tvOS 10, see tvOS 10.0 API Diffs.
Security and Privacy Enhancements
tvOS 10 introduces several changes and additions that help you improve the security of your code and maintain the privacy of user data. To learn more about these items, see https://developer.apple.com/security/.
The new NSAllowsArbitraryLoadsInWebContent key for your Info.plist file gives you a convenient way to allow arbitrary web page loads to work while retaining ATS protections for the rest of your app.
The SecKey API includes improvements for asymmetric key generation. Use the SecKey API instead of the deprecated Common Data Security Architecture (CDSA) APIs.
The SSLv3 cryptographic protocol and the RC4 symmetric cipher suite are no longer supported. It’s recommended that you stop using the SHA-1 and 3DES cryptographic algorithms as soon as possible.
The UIPasteboard class supports the Clipboard feature, which lets users copy and paste between devices, and includes API you can use to restrict a pasteboard to a specific device and set an expiration timestamp after which the pasteboard is cleared (a brief sketch follows this list). Additionally, named pasteboards are no longer persistent—instead, you should use shared containers—and the “Find” pasteboard (that is, the pasteboard identified by the UIPasteboardNameFind constant) is unavailable.
You must statically declare your app’s intended use of protected data classes by including the appropriate purpose string keys in your Info.plist file. For example, you must include the NSCalendarsUsageDescription key to access the user’s Calendar data. If you don’t include the relevant purpose string keys, your app exits when it tries to access the data.
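As a sketch of the pasteboard point above, the following assumes a UIKit target where UIPasteboard is available; the item contents and the five-minute lifetime are hypothetical.

```swift
import UIKit

// A minimal sketch: place a string on the general pasteboard, keep it on this
// device only, and have the system clear it after five minutes.
let expiration = Date(timeIntervalSinceNow: 300)
UIPasteboard.general.setItems(
    [["public.utf8-plain-text": "one-time code"]],
    options: [
        .localOnly: true,            // do not make the item available on other devices
        .expirationDate: expiration  // the pasteboard is cleared after this date
    ]
)
```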
Swift 3
The latest release of Swift includes significant refinements to API naming designed to enhance the consistency and clarity of your code. Also in this release, important frameworks such as Core Graphics and Grand Central Dispatch have been upgraded to native Swift interfaces. To learn about what’s new in Swift, see The Swift Programming Language (Swift 3).
New User Interface Styles
tvOS 10 lets users switch between light and dark interface styles. The existing UIKit classes adapt their behavior to this style change. If you are implementing your own custom classes, you can use the new UIUserInterfaceStyle trait in UITraitCollection to adapt to the current interface style, as shown in the sketch below.
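A minimal sketch of adapting a custom view; HypotheticalCardView and its colors are illustrative, not part of UIKit.

```swift
import UIKit

// A custom view that adapts its background to the current interface style.
class HypotheticalCardView: UIView {
    override func traitCollectionDidChange(_ previousTraitCollection: UITraitCollection?) {
        super.traitCollectionDidChange(previousTraitCollection)
        if traitCollection.userInterfaceStyle == .dark {
            backgroundColor = .black
        } else {
            backgroundColor = .white
        }
    }
}
```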
Video Subscriber Account
tvOS 10 introduces the Video Subscriber Account framework (VideoSubscriberAccount.framework) to help apps that support authenticated streaming or authenticated video on demand (also known as TV Everywhere) authenticate with their cable or satellite TV provider. Using the APIs in this framework can help you support a single sign-in experience in which users sign in once to unlock access in all of the streaming video apps that their subscription supports.
Wide Color
Most graphics frameworks throughout the system, including Core Graphics, Core Image, Metal, and AVFoundation, have substantially improved support for extended-range pixel formats and wide-gamut color spaces. In addition, UIKit standardizes on working in a new extended sRGB color space, making it easy to mix sRGB colors with colors in other, wider color gamuts without a significant performance penalty.
Here are some best practices to adopt as you start working with Wide Color.
In tvOS 10, the UIColor class uses the extended sRGB color space, and its initializers no longer clamp raw component values to between 0.0 and 1.0. If your app relies on UIKit to clamp component values (whether you’re creating a color or asking a color for its component values), you need to change your app’s behavior when you link against tvOS 10.
If your app renders custom image objects, use the new UIGraphicsImageRenderer class to control whether the destination bitmap is created using an extended-range or standard-range format (see the sketch at the end of this section).
If you are performing your own image processing on wide-gamut devices using a lower-level API, such as Core Graphics or Metal, you should use an extended-range color space and a pixel format that supports 16-bit floating-point component values. When clamping of color values is necessary, you should do so explicitly.
Core Graphics, Core Image, and Metal Performance Shaders provide new options for easily converting colors and images between color spaces.
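As a sketch of the UIGraphicsImageRenderer point above, the following requests an extended-range destination bitmap; the size and fill color are illustrative.

```swift
import UIKit

// Request a bitmap format that can represent wide-gamut, extended-range values.
let format = UIGraphicsImageRendererFormat()
format.prefersExtendedRange = true

let renderer = UIGraphicsImageRenderer(size: CGSize(width: 200, height: 200), format: format)
let image = renderer.image { context in
    // Display P3 red lies outside sRGB; in extended sRGB its components exceed the 0.0–1.0 range.
    UIColor(displayP3Red: 1.0, green: 0.0, blue: 0.0, alpha: 1.0).setFill()
    context.fill(CGRect(x: 0, y: 0, width: 200, height: 200))
}
```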
Existing Frameworks Now Available in tvOS
The following iOS frameworks were added to tvOS 10:
ExternalAccessory
HomeKit
MultipeerConnectivity
Photos
ReplayKit
UserNotifications
Changes to Existing Frameworks
AVFoundation
The AVFoundation framework (AVFoundation.framework) includes the following enhancements:
You no longer need to implement different behaviors for AVPlayerItem depending on whether the content is a movie file or HLS content. In apps that link on or after tvOS 10, you simply set the rate property and AVFoundation determines when enough content has been buffered to play without stalling.
The AVPlayerLooper class makes it easier to loop a particular piece of media content during playback (see the sketch after this list).
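A minimal sketch of both points; the clip path is hypothetical, and you must keep a strong reference to the looper for as long as looping should continue.

```swift
import AVFoundation

let url = URL(fileURLWithPath: "/path/to/clip.mov")   // hypothetical media file
let item = AVPlayerItem(url: url)
let queuePlayer = AVQueuePlayer()

// AVPlayerLooper repeatedly enqueues copies of the template item on the queue player.
let looper = AVPlayerLooper(player: queuePlayer, templateItem: item)

// Setting rate starts playback; for streamed content, AVFoundation decides when
// enough data has been buffered to play without stalling.
queuePlayer.rate = 1.0
```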
AVKit
The AVKit framework (AVKit.framework) includes the following enhancements:
When displaying an AVPlayerViewController object, you can configure the skipping behavior of the player. For example, a skipping gesture might advance the content or it might skip to the next item in the playlist (see the sketch below).
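A minimal sketch, assuming the tvOS skippingBehavior property with the .skipItem behavior; the playlist URLs are hypothetical.

```swift
import AVKit
import AVFoundation

let items = [
    AVPlayerItem(url: URL(string: "https://example.com/episode1.m3u8")!),
    AVPlayerItem(url: URL(string: "https://example.com/episode2.m3u8")!)
]
let playerViewController = AVPlayerViewController()
playerViewController.player = AVQueuePlayer(items: items)
// .skipItem makes the skip gesture advance to the next item in the queue rather
// than seeking within the current item.
playerViewController.skippingBehavior = .skipItem
```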
Core Data
The Core Data framework (CoreData.framework) includes the following enhancements:
NSPersistentStoreCoordinator now maintains a connection pool for SQLite stores. Root NSManagedObjectContext objects (those without parent MOCs) transparently support concurrent fetching and faulting without serializing against each other.
NSManagedObjectContext objects with SQLite stores in WAL journal_mode support a new feature called query generations. These allow a MOC to be pinned to a version of the database at a point in time and to perform all future fetching and faulting against that version of the database. Pinned MOCs are moved to the most recent transaction with any save, and query generations do not survive the process’s lifetime.
The new NSPersistentContainer class provides your app with a high-level integration point that maintains references to your NSPersistentStoreCoordinator, NSManagedObjectModel, and other configuration resources (see the sketch after this list).
Core Data now has tighter integration with Xcode and automatically generates and updates your NSManagedObject subclasses.
NSManagedObject includes several additional convenience methods, making it easier to fetch and create subclasses.
NSManagedObject subclasses that have a 1:1 relationship with an entity now support entity().
Core Data introduces several API adjustments that provide better integration with Swift, including parameterized NSFetchRequest objects.
For more information, see Core Data Framework Reference.
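A minimal sketch of the new setup path and of pinning a context to a query generation; the model name "Model" is hypothetical.

```swift
import CoreData

let container = NSPersistentContainer(name: "Model")   // hypothetical data model name
container.loadPersistentStores { storeDescription, error in
    if let error = error {
        fatalError("Failed to load store: \(error)")
    }
    do {
        // Pin the view context so later fetches and faults see a consistent
        // snapshot (query generation) of the SQLite store.
        try container.viewContext.setQueryGenerationFrom(NSQueryGenerationToken.current)
    } catch {
        print("Could not pin query generation: \(error)")
    }
}
```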
Core Graphics
Core Graphics now supports rendering to bitmaps with floating-point component values.
The CGColorConverterRef type can be used to perform a series of color conversions.
Core Image
The Core Image framework (CoreImage.framework) includes several enhancements:
You can now insert custom processing into a Core Image filter graph by using the imageWithExtent:processorDescription:argumentDigest:inputFormat:outputFormat:options:roiCallback:processor: method. This method adds a callback block that Core Image invokes in between filters when processing an image for display or output; in the block, you can access the pixel buffers or Metal textures containing the current state of the processed image and apply your own image processing algorithms.
When using a custom processor block or writing filter kernels, you can process images in a color space other than the Core Image context’s working color space. Use the imageByColorMatchingWorkingSpaceToColorSpace: and imageByColorMatchingColorSpaceToWorkingSpace: methods to convert into and out of your color space before and after processing (see the sketch after this list).
Performance is significantly improved for rendering UIImage objects that are backed by Core Image images (such as those created by using the initWithCIImage: initializer) in a UIImageView object.
Core Image kernel code can now request a specific output pixel format.
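A minimal sketch of converting out of and back into the working space, assuming the Swift 3 spellings matchedFromWorkingSpace(to:) and matchedToWorkingSpace(from:) of the methods named above; the source image and linear sRGB target space are illustrative.

```swift
import CoreImage

let source = CIImage(color: CIColor(red: 1, green: 0, blue: 0))
if let linearSRGB = CGColorSpace(name: CGColorSpace.linearSRGB),
    let inLinear = source.matchedFromWorkingSpace(to: linearSRGB) {
    // ...apply custom processing that expects linear components here...
    let backInWorkingSpace = inLinear.matchedToWorkingSpace(from: linearSRGB)
    _ = backInWorkingSpace
}
```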
Core Image introduces five new filters:
CINinePartTiled
CINinePartStretched
CIHueSaturationValueGradient
CIEdgePreserveUpsampleFilter
CIClamp
Foundation
The Foundation framework (Foundation.framework) contains many enhancements, such as:
The new NSDateInterval class defines a programmatic interface for calculating the duration of a time interval and determining whether a date falls within it, as well as comparing date intervals and checking to see whether they intersect.
The NSLocale class defines many new properties that you can use to get information about a locale and how it can be displayed.
The new NSMeasurement class helps you convert measurements into different units and calculate the sum or difference between two measurements. The new NSMeasurementFormatter class helps you create localized representations of measurements when displaying quantities of units to the user (a brief sketch of these APIs follows this list).
The new NSUnit class and concrete NSDimension subclasses help you represent specific units of measure.
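A brief sketch of the measurement and date-interval APIs; the values are illustrative.

```swift
import Foundation

// Unit conversion and localized formatting.
let distance = Measurement(value: 5, unit: UnitLength.kilometers)
let inMiles = distance.converted(to: .miles)

let formatter = MeasurementFormatter()
formatter.unitOptions = .providedUnit
print(formatter.string(from: inMiles))   // e.g. "3.107 mi", locale-dependent

// Does a date fall within the next hour?
let nextHour = DateInterval(start: Date(), duration: 3600)
print(nextHour.contains(Date(timeIntervalSinceNow: 600)))   // true
```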
GameKit
The GameKit framework (GameKit.framework) includes the following changes and enhancements:
A new account type, implemented by the GKCloudPlayer class, supports iCloud-only game accounts.
Game Center provides a new generalized solution for managing persistent storage of data on Game Center. A game session (GKGameSession) has a list of players who are the session’s participants. Your game’s implementation defines when and how a participant stores or retrieves data from the server or exchanges data between players. Game sessions can often replace existing turn-based matches, real-time matches, and persistent save games, and also enable other models of interaction between participants.
GameplayKit
The GameplayKit framework (GameplayKit.framework) includes the following changes and enhancements:
Procedural noise generation can be used to generate rich game worlds, create sophisticated natural-looking textures, and add realism to camera movement.
Spatial partitioning lets you partition your game-world data so that it can be searched efficiently.
A new Monte Carlo strategist (GKMonteCarloStrategist) helps you model games where exhaustive computation of possible moves is difficult.
The new decision tree API can enhance your game-building AI when you adopt decision-tree learning to generalize behavior based on data mining of logged player actions. To learn more, see GKDecisionTree and GKDecisionNode.
The GKAgent3D and GKGraphNode3D classes introduce 3D support to existing agent and path-finding behavior.
The new GKMeshGraph class provides a higher-performance alternative to GKObstacleGraph, allowing you to produce more natural-looking output at the cost of less mathematically perfect paths.
The new GKScene and GKSKNodeComponent classes, combined with changes in SpriteKit and the Xcode editor, make integrating GameplayKit with SpriteKit easier than ever.
Metal
In tvOS 10, Metal includes several new features and enhancements, such as:
Function Specialization, which makes it easy to create a collection of highly optimized functions to handle all the material and light combinations in a scene.
Resource Heaps and Memoryless Render Targets, which grant even finer-grained control of resource allocation to further optimize the performance of Metal-based apps.
For further information, see the What’s New in iOS 10, tvOS 10, and OS X 10.12 chapter of the Metal Programming Guide.
Metal Performance Shaders
The Metal Performance Shaders framework (MetalPerformanceShaders.framework) has many new kernels to help you take advantage of highly optimized data-parallel computations, such as color space conversions and neural network operations.
ModelIO
The ModelIO framework (ModelIO.framework) includes the following enhancements:
The USD file format is now supported.
The new MDLMaterialPropertyGraph class makes it easier to support runtime procedural changes to models.
The MDLVoxelArray class adds support for signed distance fields.
You can add assisted light probe placement by implementing the MDLLightProbeIrradianceDataSource protocol.
Photos
The Photos framework (Photos.framework) makes Live Photo editing available to apps that use Photos framework APIs to access the user’s Photos library and to photo editing app extensions for use in the Photos and Camera apps. Specifically, the new PHLivePhotoEditingContext class lets you apply edits to the video and still photo content of a Live Photo, with an easy-to-use API based on Core Image enhancements. In addition, you can take advantage of the new Core Image processor feature to use other image processing technologies to perform edits. To learn more, see CIImageProcessorInput and CIImageProcessorOutput.
SceneKit
The SceneKit framework (SceneKit.framework) includes several enhancements.
A new Physically Based Rendering (PBR) system allows you to leverage the latest in 3D graphics research to create more realistic results with simpler asset authoring. Specifically:
Use the new SCNLightingModelPhysicallyBased shading model to opt into PBR shading for materials (see the sketch after this list). PBR materials require only three fundamental properties—diffuse, metalness, and roughness—to produce a wide range of realistic shading effects. (The normal, ambientOcclusion, and selfIllumination material properties also remain useful for PBR materials, but you can now ignore the large number of other properties used for traditional materials.)
PBR shading works best with environment-based lighting, which causes even diffuse surfaces to pick up the colors of the scene around them. Use the lightingEnvironment property to assign global image-based lighting to an entire scene, and place light probes in the Xcode scene editor to pick up the local lighting contributions from objects within your scene.
Authors of PBR scene content often prefer working in physically based terms, so you can now define lighting using intensity (in lumens) and color temperature (in degrees Kelvin), and import specifications for real-world light fixtures using the IESProfileURL property.
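A minimal sketch of a physically based material and image-based lighting; the numeric values and the environment asset name are hypothetical.

```swift
import SceneKit
import UIKit

let material = SCNMaterial()
material.lightingModel = .physicallyBased
material.diffuse.contents = UIColor(red: 0.72, green: 0.45, blue: 0.2, alpha: 1.0)
material.metalness.contents = 0.8   // 0 = dielectric, 1 = metal
material.roughness.contents = 0.3   // 0 = mirror-smooth, 1 = fully diffuse

// Image-based lighting for the whole scene; even diffuse surfaces pick up these colors.
let scene = SCNScene()
scene.lightingEnvironment.contents = "environment.hdr"   // hypothetical asset name
```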
Add even more realism with the new HDR features and effects in the SCNCamera class. With HDR rendering, SceneKit captures a much wider range of brightness and contrast in a scene, then allows you to customize the tone mapping that adapts that scene for the narrower range of a device’s display. Enable exposure adaptation to create automatic effects when, for example, the player in your game moves from a darkened area into sunlight. Or use vignetting, color fringing, and color grading to add a filmic look to your game.
Although linear, more color-accurate rendering is the basis for PBR shading and HDR camera features, it produces better results even for traditional rendering. By default, SceneKit now performs all color calculations in a linear (not gamma-adjusted) color space, and uses the P3 color gamut of devices that include wide-color displays. This feature is enabled automatically for all apps linking against the tvOS 10 SDK, and has a few ramifications for content design and asset management:
SceneKit color matches all colors. In previous versions, SceneKit would read only the color values from material colors specified as NSColor or UIColor objects, ignoring color profile information and assuming the sRGB color space.
SceneKit interprets color component values specified within shader modifier or custom Metal or OpenGL shader code in linear RGB space.
SceneKit reads and adjusts for color profile information in texture images. Design textures for a linear brightness ramp, and use Asset Catalogs in Xcode to make sure your images use the correct color profile.
If necessary, you can disable linear space rendering with the SCNDisableLinearSpaceRendering key in your app’s Info.plist file, and wide color rendering with the SCNDisableWideGamut key.
Geometry can now be loaded from scene files or programmatically defined using arbitrary polygon primitives (SCNGeometryPrimitiveTypePolygon). SceneKit automatically triangulates polygon meshes for rendering, but makes use of the underlying polygon mesh for more accurate surface subdivision (to learn more, see the subdivisionLevel property).
SpriteKit
The SpriteKit framework (SpriteKit.framework) includes the following enhancements:
A new tilemap solution supports square, hexagonal, and isometric tilemaps that make it easy to create 2D, 2.5D, and side-scroller games. The Xcode editor provides comprehensive support for organizing your tiles and creating your tilemap. For more information, see the SKTileMapNode, SKTileGroup, SKTileGroupRule, and SKTileSet classes.
The new SKWarpGeometry class is used to stretch or distort how an SKSpriteNode or SKEffectNode object is rendered. The warp is specified by a set of control points. New SKAction types can be used to animate between different warp effects.
A custom shader can use attributes that can be configured separately by each node that uses the shader. To add an attribute, create an SKAttribute object and attach it to your shader. Then, for each node that uses that shader, attach an SKAttributeValue object (see the sketch after this list).
The SKView class defines new methods that give you finer control over when and how your scene is rendered.
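A minimal sketch of per-node shader attributes; the shader source and the a_brightness attribute name are hypothetical.

```swift
import SpriteKit

// A fragment shader that reads a per-node attribute named a_brightness.
let shader = SKShader(source:
    "void main() { gl_FragColor = vec4(vec3(a_brightness), 1.0); }")
shader.attributes = [SKAttribute(name: "a_brightness", type: .float)]

let sprite = SKSpriteNode(color: .white, size: CGSize(width: 64, height: 64))
sprite.shader = shader
// Each node that uses the shader supplies its own value for the attribute.
sprite.setValue(SKAttributeValue(float: 0.5), forAttribute: "a_brightness")
```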
TVMLKit
The TVMLKit framework (TVMLKit.framework) includes many enhancements, such as:
New Light and Dark appearance—Set your TVML pages to explicitly take on a light or dark appearance. You can set your TVML pages to automatically change to light or dark based on the system preferences.
Embedded video playback—Embed videos directly into TVML elements. Set videos to play immediately upon page presentation or when an element comes into focus.
Interactive video overlays—Create overlays for videos that contain focusable elements.
Now Playing—Add a ‘Now Playing’ tab to menu bars. Users can now see what audio track is currently playing.
Animatable DOM Updates—Animate changes to the DOM as a user is filtering search results.
New styles and attributes—Several new styles and attributes have been added.
UIKit
The UIKit framework (UIKit.framework) includes many enhancements, such as:
Previously, focus-based interaction in the user interface was limited to views. The focus API has been improved to support focus on non-view objects. A focus item implements the UIFocusItem protocol. In this release, only UIView and SKNode implement this protocol. Apps that used the older focus API continue to work without modification.
The new UIGraphicsRenderer class provides an object-oriented model for creating bitmaps or PDFs using UIKit rendering or Core Graphics. This class replaces the old mechanism provided by the UIGraphicsBeginImageContext function. The new design uses a graphics renderer object to specify the default drawing characteristics of the target environment. For example, you can specify whether an image should be rendered using an extended-range pixel format.
A new trait collection type was added to determine which user interface style is in use. See UIUserInterfaceStyle.
New object-based, fully interactive and interruptible animation support that lets you retain control of your animations and link them with gesture-based interactions (see the sketch after this list). To learn more, see UIViewAnimating Protocol Reference, UIViewPropertyAnimator Class Reference, UITimingCurveProvider Protocol Reference, UICubicTimingParameters Class Reference, and UISpringTimingParameters Class Reference.
The new UIPreviewInteraction class and UIPreviewInteractionDelegate protocol, which let you provide a custom user interface related to the peek and pop experience.
The new UIAccessibilityCustomRotor class and related classes, which help you provide custom, context-specific functionality that assistive technologies such as VoiceOver can expose to users. For example, you might create a custom rotor that lets VoiceOver users find misspelled words in a document by repeatedly returning the range of text that contains the next misspelled word.
The UIAccessibilityIsAssistiveTouchRunning and UIAccessibilityAssistiveTouchStatusDidChangeNotification symbols, which let you determine when AssistiveTouch is enabled, and the UIAccessibilityHearingDevicePairedEar and UIAccessibilityHearingDevicePairedEarDidChangeNotification symbols, which give you the pairing status of MFi hearing aids.
New UIPasteboard API that automatically declares compatible content types for common class instances, and new options that limit the lifetime of objects on the pasteboard.
The new preferredFontForTextStyle:compatibleWithTraitCollection: UIFont method, which lets you add support for Dynamic Type in labels, text fields, and other text areas.
The UIContentSizeCategoryAdjusting protocol, which provides the adjustsFontForContentSizeCategory property that you can use to determine if the adopting element should update its font when the device’s UIContentSizeCategory changes.
Additional control over the appearance of the badge on a tab bar item, such as background color and text attributes.
Support for the refresh control in all scroll views and scroll-view subclasses, such as UICollectionView.
The new UIApplication method openURL:options:completionHandler:, which is executed asynchronously and calls the specified completion handler on the main queue (this method replaces openURL:).
The new UICloudSharingController class and UICloudSharingControllerDelegate protocol, which help you initiate a CloudKit sharing operation and display a view controller that lets users view and modify participants and start and stop sharing.
Enhancements to UICollectionView and the new UICollectionViewDataSourcePrefetching protocol, which help you take advantage of automatic prefetching of cells to improve the scrolling experience.
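A minimal sketch of the object-based animation support described above; menuView and the timing values are hypothetical.

```swift
import UIKit

let menuView = UIView(frame: CGRect(x: 0, y: 0, width: 400, height: 300))

// An interruptible animation object; it can be paused, scrubbed, reversed, or resumed,
// which makes it suitable for driving with gestures.
let animator = UIViewPropertyAnimator(duration: 0.6, curve: .easeInOut) {
    menuView.alpha = 0.0
    menuView.transform = CGAffineTransform(scaleX: 0.8, y: 0.8)
}
animator.startAnimation()

animator.pauseAnimation()
animator.fractionComplete = 0.5   // scrub to the halfway point
animator.startAnimation()         // resume from there
```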
Copyright © 2017 Apple Inc. All Rights Reserved. Updated: 2017-06-06