Going Beyond 2D with SpriteKit
SpriteKit makes it easy to create high-performance, power-efficient 2D games and more. See how to take SpriteKit objects into Augmented Reality through seamless integration with ARKit. Learn about mixing 2D and 3D content and applying realistic transformations. Take direct control over SpriteKit rendering and walk through offline rendering into a Metal texture.
Morning everyone.
I'm Ross Dexter. I'm an engineer on the Games Technologies Team at Apple. And I'd like to welcome you to Going Beyond 2D with SpriteKit.
So before we dive in I'd like to quickly talk about what SpriteKit is and where it fits in the Apple rendering picture. SpriteKit is Apple's 2D graphics framework for games. And it's designed to be flexible, fast, and easy to use. It's supported across all of our platforms and has an Xcode-integrated live editor to make laying out and previewing your game content quick and easy.
SpriteKit sits alongside SceneKit, our other games-oriented graphics framework, and both sit on top of Metal. Traditionally they've all been used separately in different contexts: SpriteKit for quick and easy 2D, SceneKit as a ready-to-use 3D engine, and Metal to give you direct access to your device's rendering hardware. Instead of keeping all three separate, we think it's time that SpriteKit breaks out of its 2D mold.
SpriteKit has a great deal to offer to make using it in combination with SceneKit and Metal attractive. Since they both use Metal under the hood, it should be trivial to render SpriteKit content in SceneKit, or to pipe it back into Metal to use however you wish.
Lots of 3D games and apps feature 2D content, and SpriteKit provides the perfect means for creating and rendering that content. On top of that, this year Apple is introducing ARKit, which takes all the hard work out of creating augmented reality apps. The addition of this new framework provides one more reason why it's time for SpriteKit to go beyond 2D and into the third dimension. And today we're going to show you how to do it and what you can achieve. In this session we're going to cover how to render SpriteKit content in ARKit, bringing SpriteKit to the world of augmented reality.
Next, we'll show you how to get your SpriteKit scenes into SceneKit and how that can improve your augmented reality apps. And then finally we'll introduce you to SKRenderer, which allows you to take more control of how SpriteKit updates and renders. All right, let's dive right in with how you can use SpriteKit and ARKit together.
But first we should talk about what augmented reality actually is. Augmented reality combines a real-world view with computer-rendered content. That content gets attached to locations in the real world so that as you move your device and the view shifts, the content appears to remain in place. That allows you to inspect the content from different angles as if it were a physical object in front of your device. This requires a lot of complex tracking and is a real challenge to implement.
Thankfully, ARKit does all the hard work for you. When you use ARKit, it leverages your device's camera, accelerometer, and other hardware to track its position and orientation in the real world. All you have to do is provide it with the content you want to appear in AR, and ARKit automatically updates the relative positioning of your content as your device moves.
If you'd like a deeper dive into exactly how this all works, I highly recommend checking out the Introducing ARKit session that happened earlier this week. ARKit is able to track and update the position of your content through the use of anchors. They are what make AR work.
Anchors are 3D points that correspond to real-world features that ARKit detects through scene understanding, which uses your device's camera to perceive and process the world around you.
Anchors are easy to create. You can request ARKit to detect one at any time through the API, or you can create one manually using your device's position and orientation. So how do we get ARKit working with SpriteKit content? ARKit is designed to interact directly with SpriteKit. ARKit will first ask you for SpriteKit nodes to attach to anchors, and then will automatically position, rotate, and scale those nodes as the device moves.
It does this so your SpriteKit content will stay aligned with the anchors giving the appearance that your content is rooted in the real world.
Sprites are rendered so they are always facing the camera, no matter what angle you view them from.
This is a technique known as billboarding and it's commonly used in early 3D games.
You may not be familiar with how billboarding works, so let's go over a few quick examples of how it lets you use 2D content in 3D space.
So say we have a sprite positioned in 3D space and a camera observing it.
As the camera moves closer to the sprite, the sprite grows larger, as you'd expect, taking up more of the view.
As the camera moves further away, the sprite shrinks.
Now I'll rotate the camera. As the camera changes its point of view, the sprite continues to face the camera at all times. And this should be the case from any angle we view it from.
Let's add another sprite to our 3D scene here to show how this works with multiple 2D objects. Sprites that are further away are rendered behind sprites that are closer to the camera. As the camera moves the more distant sprite comes into view.
Both sprites always face the camera, and this simple technique allows your 2D sprite content to work in 3D space. So now that we've shown you how ARKit and SpriteKit work together at a conceptual level, let's talk about the actual objects you'll need to implement your app. To work with ARKit and SpriteKit there are four important objects for you to know about: ARSession, ARAnchor, ARSKView, and ARSKViewDelegate. ARSession is the heart of ARKit. It handles all the device tracking and orchestrates the interactions between ARKit and SpriteKit.
It has methods for adding and removing anchors that you create in your app. To get started, you just call the run method and ARSession will begin tracking your device. You just need to provide it an ARSessionConfiguration, which tells ARKit which AR techniques it should use. When working with SpriteKit you should just use ARWorldTrackingSessionConfiguration, which enables all the functionality you'll need from ARKit. ARKit defines real-world features through ARAnchor. It represents a position in the real world and contains transform data as well as a unique identifier.
ARKit maps ARAnchors to the SKNodes that we provide to render our content. ARKit interacts with SpriteKit through ARSKView, which is derived from SKView.
It creates and contains the ARSession, so you don't need to create it manually. And it has methods for getting related anchors and nodes, so you don't need to manually track which node corresponds to which anchor and vice versa. It also has a hitTest method, which is your primary way of creating anchors. It takes a point on your device's screen and shoots a ray through it, looking for the nearest point in the real world for you to attach content to. Finally, there's ARSKViewDelegate, a protocol derived from SKViewDelegate, which helps you react to anchors being added, updated, and removed from the session.
All of its methods are optional and they're the key to using SpriteKit and ARKit together. But we'll come back to that in a bit. Let's get started on creating our first ARKit app with SpriteKit.
First we'll create a new iOS project in Xcode. You'll see that in Xcode 9, there's a new Augmented Reality App template for you to choose. Once you choose the app template, select SpriteKit as your content technology to get started with it. And that's all there is to it. You're now ready to enter augmented reality.
The resulting project looks pretty standard for an iOS app, but let's go through the files that are important to using SpriteKit and ARKit together.
First there's Scene.sks. This is a standard SpriteKit scene and it's where you create and layout any non-AR content that you want to appear in your app. It will act like an overlay for the AR content. And so it's useful for things like HUD elements, help text, stuff like that.
Nodes that have a Z position greater than or equal to 0 will draw over any AR content that ARKit adds to the scene.
All nodes that are managed by ARKit have Z positions that are less than 0.
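As a small, hypothetical sketch of an overlay element (the label text, size, and position are made up):

```swift
import SpriteKit

// Hypothetical HUD label added to the overlay scene loaded from Scene.sks.
// A zPosition of 0 or above keeps it drawn in front of the AR content,
// since ARKit-managed nodes sit at negative z positions.
func addScoreLabel(to overlayScene: SKScene) {
    let scoreLabel = SKLabelNode(text: "Score: 0")
    scoreLabel.fontSize = 24
    scoreLabel.zPosition = 1
    scoreLabel.position = CGPoint(x: 0, y: 280)
    overlayScene.addChild(scoreLabel)
}
```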
Next, Scene.swift. This is the SpriteKit scene's corresponding source file.
As with normal SpriteKit apps, this is where you put code to manage your scene and gameplay logic, and it's a good place to leverage GameplayKit's various features. And finally we have ViewController.swift. The view controller conforms to ARSKViewDelegate, and its sceneView property is an instance of ARSKView, which contains the ARSession.
The view controller class is your primary means of interacting with ARKit. In the template it's automatically set up to call run on the ARSession with an ARWorldTrackingSessionConfiguration, so you don't need to add that yourself.
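As a rough sketch, the relevant part of that setup looks something like this (the configuration class name follows the naming used in this session; later SDK releases rename it to ARWorldTrackingConfiguration):

```swift
import ARKit
import UIKit

class ViewController: UIViewController, ARSKViewDelegate {
    @IBOutlet var sceneView: ARSKView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // Tell ARKit which AR techniques to use, then start tracking.
        let configuration = ARWorldTrackingSessionConfiguration()
        sceneView.session.run(configuration)
    }
}
```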
This is also where you implement the ARSKViewDelegate methods that are relevant to you. Now let's talk about the ARKit events that the view controller needs to react to.
The first event is when a new anchor gets added to the ARSession.
When this happens ARKit will ask the view controller for the SpriteKit nodes you want to associate with an anchor. So this is when we create our AR content.
The second event is when an existing anchor is updated by the session.
When this occurs, ARKit informs the view controller so you can react to the update.
And the third and final event is when an anchor is removed from the session.
ARKit tells the view controller so you can perform any necessary cleanup in your app.
ARSKViewDelegate provides methods tied to each of these events. As we mentioned, each of these methods is optional so you only need to implement the ones that matter to you. Let's go over each one. First is the node for anchor method.
This gets called when a new anchor is added to the session. And ARKit maps the node returned from this method to the anchor that's passed in.
You should implement this if you want to create a custom node for an anchor. If you don't implement the method a default empty SKNode is created for you automatically. The node that gets returned from this method will be moved, rotated, and scaled by ARKit to match its anchor. So if you try and make any changes to the transform they'll likely be overwritten by ARKit when the device moves.
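As a rough sketch (not from the session itself), a custom node for anchor implementation can be as simple as returning a node; the emoji label here is just illustrative content:

```swift
import ARKit
import SpriteKit

// Inside your ARSKViewDelegate conformer. Whatever node you return is mapped
// to the anchor, added to the scene graph for you, and then positioned,
// rotated, and scaled by ARKit as the device moves.
func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? {
    return SKLabelNode(text: "👾")   // illustrative placeholder content
}
```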
It's useful to know that any children that are assigned to this node won't have their transforms modified. But we'll talk more about this with our next method.
Also know that ARKit automatically adds this node to the scene graph so you don't have to. Next, we have didAdd node for anchor. This is called after an SKNode is mapped to an anchor, so after the previous node for anchor method is executed.
If you implemented the node for anchor method, the node that gets passed in here will be the one you returned from there. If you didn't implement it, it will be a default empty node. As we mentioned on the previous slide, the node that's mapped to the anchor has its transform modified by ARKit to follow the anchor as the device moves.
As such, if you want to modify the transforms of your content, you should add them as children in here, as ARKit won't modify them. Next, willUpdate node for anchor and didUpdate node for anchor.
These methods are called before and after the node is updated with a given anchor's data.
True to their names willUpdate node for anchor is called before the update, and didUpdate node for anchor is called after the update.
This occurs when the device moves and the view changes.
The node's position, rotation, and/or scale is subject to change between calls to these methods.
Finally, didRemove node for anchor.
This gets called when the node is removed from the scene graph, which occurs when its corresponding anchor is removed from the ARSession.
All right, that covers the important parts of the API. Let's take a look at some code. So let's cover creating an anchor. Here we're looking at a handler for the touchesBegan event. When the device reports a touch, we get the location of the touch in our ARSKView. We then provide the touch location to the ARSKView's hitTest method, which shoots a ray out into the real world looking for feature points that we can turn into anchors.
It returns an array of all the hits that it registers, sorted from nearest to furthest. We take the nearest hit and use its world transform to create an ARAnchor, which we then add to the session. And that's all there is to it. Creating anchors could not be simpler. So now that we've added a new anchor to the session, the session will ask the view controller for the SpriteKit content that we want to attach to it.
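A sketch of that handler, assuming it lives in the template's Scene class (the .featurePoint hit-test type is one reasonable choice here):

```swift
import ARKit
import SpriteKit
import UIKit

class Scene: SKScene {
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let sceneView = self.view as? ARSKView,
              let touch = touches.first else { return }

        // Touch location in the ARSKView's coordinate space.
        let location = touch.location(in: sceneView)

        // hitTest shoots a ray into the world; results come back nearest first.
        if let nearest = sceneView.hitTest(location, types: .featurePoint).first {
            let anchor = ARAnchor(transform: nearest.worldTransform)
            sceneView.session.add(anchor: anchor)
        }
    }
}
```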
To do this we've implemented ARSKViewDelegate's didAdd node for anchor method.
We haven't implemented node for anchor, so a default empty node was created for us, and that's what gets passed into this method.
So now all we need to do is create the content we want to attach to the anchor and then add it as a child of the node that's passed into this method.
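Here's a rough sketch of what that looks like, again inside your ARSKViewDelegate conformer, with an emoji label standing in for whatever content you want to attach:

```swift
import ARKit
import SpriteKit

// ARKit manages the passed-in node's transform; our content is added as a
// child, so its own transform stays under our control. The emoji is just
// illustrative placeholder content.
func view(_ view: ARSKView, didAdd node: SKNode, for anchor: ARAnchor) {
    let label = SKLabelNode(text: "🚀")
    label.horizontalAlignmentMode = .center
    label.verticalAlignmentMode = .center
    node.addChild(label)
}
```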
ARKit will automatically update the node so that it will follow the anchor as the device moves so you don't need to do anything else. So now that we've shown you exactly how to use the ARKit API with SpriteKit, let's enter augmented reality.
Let's open our app here. So we see we're doing our video pass through now. We can see our lovely audience out there. You're all famous now.
So let's start placing some content in augmented reality. As I tap on my screen I'm placing content half a meter out in front of me. Here I'm just placing SKLabelNodes with emoji. Fun fact: you can use emoji in your SpriteKit apps just by using labels. Just paste them in there. And you see we have them floating in 3D space. And as I move the camera around they move relative to one another. Just placing them in space is a little boring. I can also place them on a surface.
It detects the intersection with the surface using the hitTest method that we were talking about before, and then it places the emoji on the surface of the table that it's detected. But just placing emoji is a little boring. So now we're going to switch into blasting mode, and then we can blow up our emoji, which is a little bit more fun.
Also pay attention to the text in the lower left-hand corner here, and the way that our emoji flip around when we destroy them, because that's going to become relevant in a moment.
So you see how easy it is to quickly create an app and use SpriteKit content to enter augmented reality. This is just a slightly tweaked version of the template app that you can create right now using Xcode 9.
And that's SpriteKit with ARKit.
So you see how easy it is to enter augmented reality when using ARKit and SpriteKit together.
As I mentioned, there are a few other new SpriteKit features, some of which were present in the demo, that I'd like to quickly cover.
So hopefully you noticed the text at the bottom of the screen in the demo app animating.
Attributed strings allow you to specify the attributes of each character in a string, letting you mix things like different colors and fonts in the same label. It uses NSAttributedString, and all you have to do is set it on SKLabelNode's new attributedText property.
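As a minimal sketch (the string, ranges, font, and colors are arbitrary):

```swift
import SpriteKit
import UIKit

// Mix colors and fonts in a single label via attributedText.
let text = NSMutableAttributedString(string: "GAME OVER")
text.addAttribute(.foregroundColor,
                  value: UIColor.red,
                  range: NSRange(location: 0, length: 4))
text.addAttribute(.font,
                  value: UIFont.boldSystemFont(ofSize: 32),
                  range: NSRange(location: 5, length: 4))

let label = SKLabelNode()
label.attributedText = text
```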
The emoji in our augmented reality app spun about in ways that weren't possible until now, thanks to the new SKTransformNode.
SKNode already had z rotation, but SKTransformNode adds the ability to rotate around the x- and y-axes as well, allowing for full 3D rotations of SpriteKit content.
And these rotations apply to all child nodes as well.
SKTransformNode uses orthographic projection. So it doesn't apply any perspective skewing to your nodes.
You can specify your rotations through the x, y, and z rotation properties, or you can use specialized getters and setters for Euler angles, rotation matrices, and quaternions.
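As a sketch of how those properties might be used (the asset name and angles are illustrative):

```swift
import SpriteKit

// Tilt a 2D sprite in 3D by parenting it under an SKTransformNode.
// Children inherit the rotation, and there's no perspective skewing
// because the projection is orthographic.
func makeTiltedCard() -> SKTransformNode {
    let card = SKSpriteNode(imageNamed: "card")   // hypothetical texture name

    let transform = SKTransformNode()
    transform.xRotation = .pi / 6   // rotate around the x-axis
    transform.yRotation = .pi / 4   // rotate around the y-axis
    transform.addChild(card)
    return transform
}
```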
Finally, SpriteKit is fully compatible with Xcode's View Debugger. It displays the entire scene graph and gives you a neat exploded 3D view of your scene, which can be a great help when you're trying to debug content layout issues. It also allows you to inspect all of your node properties, so you can see the state of everything in your scene at the moment the app was paused. This is a really great new feature, so please check out the Debugging with Xcode 9 session for more information on it.
We've shown you how quick and easy it is to get started with using ARKit and SpriteKit to create an augmented reality app.
ARKit handles all of the hard parts of augmented reality for you and SpriteKit makes rendering content a snap.
We've also introduced new features to SpriteKit that give you greater flexibility in developing your apps and give you new options for debugging SpriteKit content. But you may have a few lingering questions. What if we don't want billboarded sprites? What if we want our SpriteKit content to be affected by perspective? What if we want to mix 2D and 3D content in augmented reality? What if we wanted to take SpriteKit further into 3D? The answer lies in combining SceneKit, SpriteKit, and ARKit all at once. SceneKit is SpriteKit's 3D counterpart; it's a ready-to-use 3D engine with its own Xcode-integrated live editor.
What you may not know is that you can use SpriteKit Scenes as the material on geometry in SceneKit.
That allows you to render SpriteKit content with complex 3D transforms and perspective effects.
Additionally, you can easily mix 2D SpriteKit content with 3D SceneKit content in the same context. Like SpriteKit, SceneKit is also integrated with ARKit, and creating a project that uses SceneKit in an augmented reality app is the same as with SpriteKit.
Just change the content technology to SceneKit. The API is designed to be very similar to the one you use with SpriteKit; only the names of a few objects are different. ARSKView becomes ARSCNView and ARSKViewDelegate becomes ARSCNViewDelegate.
As with SpriteKit, the template creates the ARSCNView for you and the ViewController conforms to ARSCNViewDelegate. Now, in the interest of brevity and because the API is so similar, we won't cover the rest here. So next we want to get our SpriteKit content rendering within a SceneKit scene.
Normally with SpriteKit you have your scene and you set it on an SKView.
SKView then works with UIKit, or AppKit on macOS, to get your content on screen.
To get your content rendering in SceneKit, things are handled a little differently. Instead of setting your scene on the view, you set it on the material property of the geometry on which you want the SpriteKit content to appear. That material works with SceneKit to render your SpriteKit content and then texture map it onto the geometry the material is associated with. So let's go over a few examples of rendering SpriteKit content on SceneKit geometry.
Here we have a basic SpriteKit scene. And here's what we get when we render that scene on a plane in SceneKit.
We can also apply the SpriteKit scene to a cube, or even a sphere. You can use it just like a regular texture, and as the SpriteKit scene updates, your texture will be updated along with it.
So now I'd like to quickly show you how easy it is to use SpriteKit with SceneKit.
First, get the SpriteKit scene that you want to use in SceneKit. Next, create the geometry you want the SpriteKit scene to be rendered on. Here we're creating a simple plane.
Then you just need to set the SpriteKit scene as the contents of the diffuse property of the plane's material.
That will cause SceneKit to render the SpriteKit scene to a texture and then apply it to the geometry. Here we're setting the material to be double-sided, which causes the SpriteKit scene to appear on both sides of the plane. Then we just need to create a SceneKit node for the plane and add it to the SceneKit scene. Now the plane will show up in the scene with the contents of your SpriteKit scene texture mapped onto it.
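Put together, that walkthrough looks roughly like this (the scene file name and plane size are assumptions):

```swift
import SceneKit
import SpriteKit

func addSpriteKitPlane(to scnScene: SCNScene) {
    // The SpriteKit scene we want to show in 3D (hypothetical .sks file name).
    let skScene = SKScene(fileNamed: "MyScene")

    // A simple plane whose material is textured with the live SpriteKit scene.
    let plane = SCNPlane(width: 1.0, height: 1.0)
    plane.firstMaterial?.diffuse.contents = skScene
    plane.firstMaterial?.isDoubleSided = true   // show the scene on both sides

    let planeNode = SCNNode(geometry: plane)
    scnScene.rootNode.addChildNode(planeNode)
}
```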
So now I'd like to show you a demo of some of the things you can do when you use SpriteKit and SceneKit together with ARKit.
All right.
So here I have a demo that I've built on top of the sample code that the ARKit team has released, which you can find on their session website. And here it's just detecting a plane for us. Now if I tap on the screen, it places a SpriteKit scene for us in the world here. And you see that this is a fully live SpriteKit scene; you see the trees are animating. And in fact I can actually interact with this directly. Here, let's blow it up a little bit. Since we are in 3D we can do all kinds of cool things. We can move this guy around, we can rotate, and we can scale to make it a little bigger.
So now I can actually interact with this scene directly. I have controls on my device here, so I can move my character around and jump around.
Hard to get over that thing. Yeah. And so we can jump around in real time, interact with this just like a normal SpriteKit scene rendered in 3D.
But just having it sit here on the surface is a little boring. We should do something a little more interesting, a little more 3D. So if I touch this button up here, the scene flips up, and if I touch it again it separates the different layers that I have in here. So we don't actually have just one SpriteKit scene; we actually have three SpriteKit scenes here, one for each layer: front, middle, and background.
So this shows you the kind of stuff that you can do when you use SpriteKit and SceneKit together with ARKit. But maybe you start to feel a little constrained by the level here.
And maybe we want to have our little guy go out on an adventure on his own. He can run out into the real world here. Oh this button looks kind of tempting.
Yay.
So that gives you an idea of what you can do when you use SpriteKit in a 3D context with SceneKit together with ARKit.
So as you saw, using SpriteKit with SceneKit allows for full 3D transforms and perspective, which lets you do some pretty neat things. It allows you to mix 2D and 3D content within the same context, it's fully compatible with ARKit, and it works great in general 3D apps as well. So now that we've covered how to work with ARKit and SceneKit, I'd like to introduce you to another new SpriteKit feature: SKRenderer.
Let's talk a bit about how SpriteKit works under the hood.
As we've mentioned in the previous section, with normal SpriteKit Rendering you have your scene and you set it on an SKView, which then works with UIKit and AppKit to get your content on the screen.
SKView handles all of the updating and rendering for you.
The upside of this is that it makes it very easy to get started with SpriteKit.
But what if you want to render SpriteKit content in a 3D context? One solution as we showed you is to use SpriteKit with SceneKit. Instead of setting your scene on the view, you use it as a material in SceneKit.
And this lets you do all kinds of cool stuff. But when SpriteKit updates and renders is still out of your hands.
What if you want more control? Maybe you want to update with exact fixed time steps or update without rendering or render without updating or update once and render twice, each time from a different viewpoint.
What if we wanted to work directly with Metal? Enter SKRenderer.
You use it instead of SKView to gain more control over SpriteKit.
Like SKView, to use it you just set your scene on the renderer. Unlike SKView, however, SKRenderer lets you determine when SpriteKit performs updating and rendering.
It allows you to work directly with Metal, and then you can do things like render SpriteKit into an offscreen texture to use however you want.
This by the way is how SceneKit is able to efficiently render SpriteKit content in 3D. It uses SKRenderer under the hood.
There are four stages to using SKRenderer -- initialization, setting the scene, updating, and rendering.
Initialization occurs once. You set your scene at the start, and again whenever you want to transition to a new scene. Update and render repeat every frame in your app. Let's look at some code for each of these stages. Stage one, initialization. To initialize SKRenderer, all you need to do is provide it with a Metal device. Stage two, setting the scene. This works exactly the same as with SKView: just set your scene on SKRenderer's scene property.
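As a sketch of those first two stages (the device creation and the scene file name are assumptions):

```swift
import Metal
import SpriteKit

// Stage one: initialize SKRenderer with a Metal device.
// Stage two: set the scene, exactly as you would on an SKView.
func makeRenderer() -> SKRenderer? {
    guard let device = MTLCreateSystemDefaultDevice() else { return nil }
    let renderer = SKRenderer(device: device)
    renderer.scene = SKScene(fileNamed: "MyScene")   // hypothetical .sks file
    return renderer
}
```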
Stage three, updating. Also very simple: all you need to do is pass in the current time. Stage four, rendering. This is done by calling the render method, of which there are two flavors. Which one you want to use is situational, depending on how you want to use your SpriteKit content with Metal. Both methods ask you to specify the viewport you'd like to render into, which is just a CGRect that defines the area SpriteKit will draw into in the render target.
And they both take a Metal render pass descriptor, which describes the render target that you want the SpriteKit content to draw into. Now, the first method allows you to specify the command buffer to which SpriteKit will schedule its rendering commands.
A good case in which to call this method is if you're not directly mixing SpriteKit content with other Metal content in the same render target. If you want to render a SpriteKit scene into a texture and then apply it to some 3D geometry, like we did in the second demo we showed you, this is the method you'd want to call.
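A sketch of a per-frame call using this first flavor, assuming you already have a command queue and a render pass descriptor targeting your offscreen texture:

```swift
import Metal
import QuartzCore
import SpriteKit

// Stage three (update) and stage four (render) each frame, scheduling the
// SpriteKit rendering commands onto our own command buffer.
func drawSpriteKitFrame(renderer: SKRenderer,
                        commandQueue: MTLCommandQueue,
                        renderPassDescriptor: MTLRenderPassDescriptor,
                        viewport: CGRect) {
    guard let commandBuffer = commandQueue.makeCommandBuffer() else { return }

    renderer.update(atTime: CACurrentMediaTime())
    renderer.render(withViewport: viewport,
                    commandBuffer: commandBuffer,
                    renderPassDescriptor: renderPassDescriptor)

    commandBuffer.commit()
}
```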
The second method gives you more granularity by allowing you to specify the render command encoder into which SpriteKit will encode its render commands. This is good if you want to directly mix SpriteKit and Metal content in the same render target. Say you want to render some 2D Metal content along with your SpriteKit content, or maybe you want to overlay SpriteKit on top of a Metal scene to display HUD elements. By using the same render command encoder, you can do this much more efficiently than if you were using the first method. All right, that's enough of an API crawl. Let's jump into a quick demo showing SpriteKit rendering in 3D with Metal.
So here we have a 3D scene in Metal. You can see we've got some nice lighting and shadows going on here, and we have this very tempting-looking arcade cabinet with our beautiful SpriteKit framework logo on it.
We walk up to it, and Insert Coin sounds kind of tempting. I've got a quarter here, so let's plop that baby in. Oh, you see we've got a full SpriteKit scene rendering in this 3D Metal scene. We're using SKRenderer to render SpriteKit into a texture, which we're then mapping onto the front of our arcade cabinet here. And then we're just applying a cool CRT shader in Metal to give it this old-school look. I can move around from any angle I want to here and I can interact with it like this. It's a little bit askew, but maybe this gives you some memories of playing on an arcade cabinet with your friends, all crowded up against it. You can view this from any angle, or from a distance, and it shows you some of the things that you can do with SpriteKit when you use it with Metal. It lets you use it in 3D, and then you can do whatever you want with that texture.
So that's SpriteKit and Metal working together.
SKRenderer gives you more control over SpriteKit than ever before. It allows you to determine exactly when it updates and renders, and by interacting directly with Metal you can use rendered SpriteKit content any way you see fit.
As you've seen today SpriteKit is useful in both 2D and 3D.
It's built to work well with other graphics frameworks like SceneKit and Metal. And it's closely integrated with ARKit, so that creating augmented reality apps is as easy as possible. We've introduced new features that give you more control than ever, allowing you to view SpriteKit content in the View Debugger and giving you the ability to take direct control over when and how SpriteKit updates and renders with the new SKRenderer. Today we've shown SpriteKit in an entirely new light, and we hope that it's given you some perspective on how you can use it in ways that you may have never thought of before.
So for information and access to this session video, please visit developer.apple.com/wwdc17/609. And please check out these related sessions. We gave just a small taste of Metal in today's session, so please check out Introducing Metal 2 if you're interested in learning more. I highly recommend watching "Introducing ARKit" to learn more about how it works in detail. And if our quick look at SceneKit got your attention, you should look at their main session this year.
And there's a lot of great stuff in the Debugging with Xcode 9 session, on top of SpriteKit working with the new View Debugger, like wireless debugging, which is really awesome, so I recommend watching that as well.
Thanks everyone and please enjoy the rest of the conference.