Letting a touch "fall through"

When dealing with touches in SKNodes, sometimes it would be useful in the touchesBegan method of a node to let a touch "fall through" to whatever node(s) may be below it (that also handle touches). In other words, for a node to say "I'm not handling this touch, pass it to other nodes below me, if there are any".


In Cocos2d this is supported. However, it doesn't seem to be supported in SpriteKit.


The UIResponder documentation says: "The default implementation of this method does nothing. However immediate UIKit subclasses of UIResponder, particularly UIView, forward the message up the responder chain. To forward the message to the next responder, send the message to super (the superclass implementation); do not send the message directly to the next responder."


This would indicate that in order to "not handle" the touch, and let it "fall through", you would simply call the super implementation of touchesBegan from the derived class implementation. However, doing so does nothing. The touch is still swallowed by the node, and not passed to anything else. (I'm guessing the implementation of SKNode is empty.)


Is this yet another limitation of SpriteKit, and there's basically no way around it (other than implementing your own touch handler by catching touches at the scene level and then passing it to its child nodes as needed)?

WarpRulez, I have nothing to say to help you. However, I can mention that I always work with touches at the scene level and I find it much more convenient. When you catch touches at the scene level you can handle them according to your state machine (and parent state machine), keep all the level logic in one class, and have zero problems like this.
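The scene-level approach described above can be sketched like this (a minimal sketch in Swift 2 syntax to match the other snippets in this thread; the node names and `handlePauseTap` are assumptions, not from the thread):

    import SpriteKit

    class GameScene: SKScene {

        override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
            guard let touch = touches.first else { return }
            let location = touch.locationInNode(self)

            // Inspect every node under the touch and let the scene's
            // state machine decide which one (if any) handles it.
            for node in nodesAtPoint(location) {
                switch node.name {
                case "pauseButton"?:
                    handlePauseTap()
                    return          // swallow the touch here
                default:
                    continue        // effectively "fall through" to other nodes
                }
            }
        }

        func handlePauseTap() { /* game-specific logic */ }
    }

Because the scene owns all touch dispatch, no node ever swallows a touch you wanted elsewhere.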

I asked the same question some time ago. I never really got it to work. I'm also coming from the Cocos2d world, where this was pretty standard functionality. The reason I generally want some classes to handle their own touches is that they may be plug-and-play components that I use in multiple projects, for example a button class. I want such a class to be as independent of its environment as possible.


In my case I have a swiper class (similar to a button). The swiper has a mask that displays the "button area", but really it's composed of a row of swiper pages that are swiped through, with the visible page being the active one. My problem was that I wanted all touches outside of the mask to be passed through, but it wouldn't work, which meant that essentially invisible pages were swallowing touches all over my app. Anyway, I fixed it by writing a function that actively hides all pages (page.hidden = true) while the swiper is dragged. Being hidden takes away their ability to swallow touches.
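The hiding workaround above might look roughly like this (a sketch only; the `Swiper` class, `pages`, and `activePage` names are my assumptions, not the poster's actual code):

    import SpriteKit

    class Swiper: SKNode {
        var pages = [SKSpriteNode]()   // the row of swipeable pages
        var activePage: SKSpriteNode?

        // While a drag is in progress, hide every page except the active
        // one; a hidden node no longer participates in hit testing, so
        // it can't swallow touches meant for nodes below it.
        func beginDrag() {
            for page in pages where page !== activePage {
                page.hidden = true
            }
        }

        func endDrag() {
            for page in pages {
                page.hidden = false
            }
        }
    }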


Anyway, if someone has a nice solution for passing touches, I'm all ears 🙂.


Here's the thread:

https://forums.developer.apple.com/message/109128

All good advice here from pavelgubarev and Pinxto Creative.


One thing I may add is that you can mix both styles of touch handling if needed. I'll give you a concrete example that I used. I have a menu with buttons; those buttons subclass SKSpriteNode and handle touchesBegan/touchesMoved/touchesCancelled/touchesEnded there. What I do with these events is purely cosmetic: I simply animate the buttons when pressed/unpressed.


At the same time, at the scene level, I set up multiple gesture recognizers, one of them for taps. So in didMoveToView I have something like this:


    // Tap gesture 
    tapRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(respondToTapGesture:)];
    tapRecognizer.numberOfTapsRequired = 1;
    [self.view addGestureRecognizer:tapRecognizer];


Both methods will get fired, and in respondToTapGesture you can get the touched node this way:


    CGPoint location = [recognizer locationInView:self.view];
    location.y = self.size.height-location.y;
    SKSpriteNode *touchedNode = (SKSpriteNode *)[self nodeAtPoint:location];


Notice that I used nodeAtPoint, which will get you the topmost node. Depending on your needs, have a look at nodesAtPoint, which gives you back an array of all the nodes at that point. See the documentation here: https://developer.apple.com/library/ios/documentation/SpriteKit/Reference/SKNode_Ref/index.html#//apple_ref/occ/instm/SKNode/nodesAtPoint:
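A Swift version of the same gesture handler using nodesAtPoint might look like this (a sketch in Swift 2 syntax; the `"button"` name prefix is an assumption for illustration, and the manual y-flip assumes the scene size matches the view size, like the Objective-C snippet above):

    import SpriteKit

    class MenuScene: SKScene {

        func respondToTapGesture(recognizer: UITapGestureRecognizer) {
            guard let view = view else { return }
            var location = recognizer.locationInView(view)
            location.y = size.height - location.y  // flip view -> scene coordinates

            // nodesAtPoint returns every node intersecting the point,
            // not just the topmost one, so you can skip nodes you
            // don't care about instead of letting them block the tap.
            for node in nodesAtPoint(location) {
                if let name = node.name where name.hasPrefix("button") {
                    print("tapped \(name)")
                    return
                }
            }
        }
    }

Alternatively, SKScene's convertPointFromView(_:) performs the view-to-scene conversion without assuming matching sizes.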


This is convoluted but it should help you do what you want.

By the way, why don't you call parent's event from your node?


like


    if fallThrough {
        self.parent!.touchesEnded(touches, withEvent: event)
    }

WarpRulez,


There is a standard convention to pass touches on to the next responder (thus, to the next SKNode below) – you call super's touches* method such as:


    override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {

        // do some logic and decide to pass the touch to nodes below:
        if forwardToNextResponder {
            super.touchesBegan(touches, withEvent: event)
            return
        }
    }


(pavelgubarev has the right idea, but use `super`, not `parent`; they *are not* the same.)

If you read the OP's post closely, he mentions that "you would simply call the super implementation of touchesBegan from the derived class implementation. However, doing so does nothing."


That's why I suggested sending touches to the parent and tracking them there.

I realize this question is a bit aged, but I ran into this issue and managed to solve it fairly cleanly. First I'll summarize my situation for context. In my game I wanted to design the game elements using SpriteKit but to have multiple UIKit views built in a storyboard for the UI. Since presenting those views takes you out of the SKView, I used multiple overlapping container views embedded within the SKView and ran into the issue I think you're having: the frontmost view swallows up the touches or passes them to the SKView, which leaves nothing for the sibling views.

So in my situation I made the frontmost view a "command" view which basically passes the touch to any sibling view that is not hidden. Also, for good measure, I made sure that any view that became visible was also moved to the front of the SKView's subview stack via a bringSubviewToFront() call.

Obviously this is a game scenario and probably not the most elegant, but it offers some options, such as manipulating the stack of views, or creating outlets to those views globally if necessary and sending the touchesEnded, touchesMoved, and touchesBegan calls yourself, avoiding the responder chain. Not a direct fix, but a feasible workaround. Good luck.

This design flaw in SpriteKit still persists. All nodes (e.g. SKSpriteNodes) swallow touches, regardless of whether they have userInteractionEnabled true or false. And subclassing such a node and adding a call to super in its touchesBegan method does nothing.


There's also another annoyance in this design: even if, in a particular situation, it would be enough for a node to pass the touches it receives to, e.g., its parent node (rather than its superclass), you still need to create a new class, e.g. inherited from SKSpriteNode or whatever you are using, and write that code. This can be a design nightmare because it can't be generalized or automated, especially if you already have SKSpriteNode subclasses and you are using those instead. You would need to replicate this solution for every single class type you are using.


It would be so much easier, nicer, and cleaner if SKNode had a property like ignoreTouches, or touchesFallThrough, or similarly named, which would mean that it just ignores all touches altogether and doesn't participate in deciding which node the touch should be sent to. (Or, alternatively, its internal implementation would pass the touches it receives to the next node below it.) This way you could just set that property to true, as a one-liner, for any node you are using, no matter whether it's a standard SpriteKit class or a custom class inherited from one.
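In the absence of such a property, one generic workaround is a subclass that manually forwards the touch to the next interactive node under the same point. This is only a hypothetical sketch (Swift 2 syntax to match the thread); `FallThroughNode` is my name, not a SpriteKit API, and you would still need one such subclass per node type, which is exactly the nightmare described above:

    import SpriteKit

    // Hypothetical sketch of the requested "fall through" behavior:
    // on touch, find another touch-enabled node under the same point
    // and hand the event to it manually.
    class FallThroughNode: SKSpriteNode {

        override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
            guard let touch = touches.first, scene = self.scene else { return }
            let point = touch.locationInNode(scene)

            // Forward to the first other node under the touch that
            // actually wants user interaction.
            for node in scene.nodesAtPoint(point)
                where node !== self && node.userInteractionEnabled {
                node.touchesBegan(touches, withEvent: event)
                return
            }
        }
    }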


This is how it works in cocos2d.

I agree wholeheartedly with this. The same behavior persists for me as well, and indeed this worked in Cocos2d back in the day. With SpriteKit, it's simply not logical. Indeed, it is a nightmare if object-oriented touch handling is your thing (in any dynamic program it should be). Agreed: userInteractionEnabled = false should automatically exclude the node from touch handling. If I wish for a node to block another node, I would specifically build it to do so.


I've had to work a lot to circumvent this issue. Essentially, when I have a hidden element that would still be on-screen (such as a swipeable vertical dial), I move all hidden elements out of the screen when they aren't accessed. Massive hassle, but necessary.


It's sad that we were discussing this stuff in this very thread already two years ago and the issue still persists (it did before 2016, too). We should organize and make a feature request as a bug report to Apple!

You guys ever just do a hit test for all children in the area of the touch point and iterate through them to see which ones you actually care about? Done that on a few projects; works well.
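The hit-test-and-filter idea above can be sketched as follows (Swift 2 syntax to match the thread; the `"card"` node name is an assumption for illustration):

    import SpriteKit

    class GameScene: SKScene {

        override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
            guard let touch = touches.first else { return }
            let point = touch.locationInNode(self)

            // Hit-test everything under the touch, then keep only the
            // nodes this scene actually cares about.
            let hits = nodesAtPoint(point).filter { $0.name == "card" }
            for card in hits {
                // handle the touch for each relevant node
                print("touched \(card)")
            }
        }
    }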
