Multitouch Events

Generally, you can handle almost all of your touch events with the standard controls and gesture recognizers in UIKit. Gesture recognizers allow you to separate the recognition of a touch from the action that the touch produces. In some cases, you want to do something in your app—such as drawing under a touch—where there is no benefit to decoupling the touch recognition from the effect of the touch. If the view’s contents are intimately related to the touch itself, you can handle touch events directly. You receive touch events when the user touches your view, interpret those events based on their properties, and then respond appropriately.

Creating a Subclass of UIResponder

For your app to implement custom touch-event handling, first create a subclass of a responder class. This subclass could be any one of the following:

UIView: Subclass UIView to implement a custom drawing view.

UIViewController: Subclass UIViewController if you are also handling other types of events, such as shake motion events.

UIControl: Subclass UIControl to implement custom controls with touch behavior.

UIApplication or UIWindow: This choice is rare, because you typically do not subclass UIApplication or UIWindow.

Then, for instances of your subclass to receive multitouch events:

  1. Your subclass must implement the UIResponder methods for touch-event handling, described in “Implementing the Touch-Event Handling Methods in Your Subclass.”

  2. The view receiving touches must have its userInteractionEnabled property set to YES. If you are subclassing a view controller, the view that it manages must support user interactions.

  3. The view receiving touches must be visible; it can’t be transparent or hidden.
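For example, the skeleton of a custom drawing view might look like the following minimal sketch. The TouchTrackingView name is illustrative, and the empty method bodies are placeholders for your own handling code:

@interface TouchTrackingView : UIView
@end
 
@implementation TouchTrackingView
 
- (instancetype)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    if (self) {
        // userInteractionEnabled is YES by default for UIView; it is
        // set explicitly here because touch delivery depends on it
        self.userInteractionEnabled = YES;
    }
    return self;
}
 
// Implement all four methods to receive the full touch-event stream
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {}
 
@end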

Implementing the Touch-Event Handling Methods in Your Subclass

iOS recognizes touches as part of a multitouch sequence. During a multitouch sequence, the app sends a series of event messages to the target responder. To receive and handle these messages, the responder object’s class must implement the following UIResponder methods:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event;

Each of these touch methods corresponds to a touch phase: Began, Moved, Ended, and Canceled. When there are new or changed touches for a given phase, the app object calls one of these methods. Each method takes two parameters: a set of touches and an event.

The set of touches is a set (NSSet) of UITouch objects, representing new or changed touches for that phase. For example, when a touch transitions from the Began phase to the Moved phase, the app calls the touchesMoved:withEvent: method. The set of touches passed in to the touchesMoved:withEvent: method will now include this touch and all other touches in the Moved phase. The other parameter is an event (UIEvent object) that includes all touch objects for the event. This differs from the set of touches because some of the touch objects in the event may not have changed since the previous event message.
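Inside a handler, the difference is easy to see. A minimal sketch (the logging is illustrative):

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // The passed-in set holds only the touches that changed in this phase
    NSLog(@"Touches that moved: %lu", (unsigned long)[touches count]);
 
    // The event holds every touch in the multitouch sequence, including
    // touches that have not changed since the previous event message
    NSLog(@"All touches in the event: %lu", (unsigned long)[[event allTouches] count]);
}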

All views that process touches expect to receive a full touch-event stream, so when you create your subclass, keep in mind the following rules:

If you prevent a responder object from receiving touches for a certain phase of an event, the resulting behavior may be undefined and probably undesirable.

If a responder creates persistent objects while handling events, it should implement the touchesCancelled:withEvent: method to dispose of those objects if the system cancels the sequence. Cancellation occurs when an external event—for example, an incoming phone call—disrupts the current app’s event processing. Note that a responder object should also dispose of any persistent objects when it receives the last touchesEnded:withEvent: message for a multitouch sequence. See “Forwarding Touch Events” to find out how to determine the last UITouchPhaseEnded touch object in a multitouch sequence.
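For example, a drawing responder might clean up like this sketch, where currentStroke is a hypothetical property holding in-progress state:

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    // Dispose of objects created earlier in the sequence
    self.currentStroke = nil;
 
    // Remove any partial results the sequence drew on screen
    [self setNeedsDisplay];
}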

Tracking the Phase and Location of a Touch Event

iOS tracks touches in a multitouch sequence. It records attributes for each of them, including the phase of the touch, its location in a view, its previous location, and its timestamp. Use these properties to determine how to respond to a touch.

A touch object stores phase information in the phase property, and each phase corresponds to one of the touch event methods. A touch object stores location in three ways: the window in which the touch occurred, the view within that window, and the exact location of the touch within that view. Figure 3-1 shows an example event with two touches in progress.

Figure 3-1  Relationship of a UIEvent object and its UITouch objects

When a finger touches the screen, that touch is associated with both the underlying window and the view for the lifetime of the event, even if the event is later passed to another view for handling. Use a touch’s location information to determine how to respond to the touch. For example, if two touches occur in quick succession, they are treated as a double tap only if they both occurred in the same view. A touch object stores both its current location and its previous location, if there is one.

Retrieving and Querying Touch Objects

Within an event-handling method, you get information about the event by retrieving touch objects from one of two places: the passed-in set, which contains the touches that are new or changed for the phase the method represents, or the passed-in UIEvent object, which contains all of the touches for the multitouch sequence.

The multipleTouchEnabled property is set to NO by default, which means that a view receives only the first touch in a multitouch sequence. When this property is disabled, you can retrieve a touch object by calling the anyObject method on the set object because there is only one object in the set.

If you want to know the location of a touch, use the locationInView: method. By passing the parameter self to this method, you get the location of the touch in the coordinate system of the receiving view. Similarly, the previousLocationInView: method tells you the previous location of the touch. You can also determine how many taps a touch has (tapCount), when the touch was created or last mutated (timestamp), and what phase the touch is in (phase).
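For example, a handler that queries these properties might look like the following sketch:

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
 
    // Current and previous locations, in the receiving view's coordinates
    CGPoint location = [touch locationInView:self];
    CGPoint previousLocation = [touch previousLocationInView:self];
 
    // Other recorded attributes of the touch
    NSUInteger taps = touch.tapCount;
    NSTimeInterval timestamp = touch.timestamp;
    UITouchPhase phase = touch.phase;
 
    // ... use these values to decide how to respond
}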

If you are interested in touches that have not changed since the last phase or that are in a different phase than the touches in the passed-in set, you can find those in the event object. Figure 3-2 depicts an event object that contains touch objects. To get all of these touch objects, call the allTouches method on the event object.

Figure 3-2  All touches for a given touch event

If you are interested only in touches associated with a specific window, call the touchesForWindow: method of the UIEvent object. Figure 3-3 shows all the touches for window A.

Figure 3-3  All touches belonging to a specific window

If you want to get the touches associated with a specific view, call the touchesForView: method of the event object. Figure 3-4 shows all the touches for view A.

Figure 3-4  All touches belonging to a specific view
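In code, the three retrieval methods look like this; windowA and viewA stand in for objects in your own interface:

// All touches for the event, regardless of window or view (Figure 3-2)
NSSet *allTouches = [event allTouches];
 
// Only the touches that belong to a specific window (Figure 3-3)
NSSet *touchesInWindow = [event touchesForWindow:windowA];
 
// Only the touches that belong to a specific view (Figure 3-4)
NSSet *touchesInView = [event touchesForView:viewA];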

Handling Tap Gestures

Besides recognizing a tap gesture in your app, you’ll probably want to distinguish a single tap from a double tap, or even a triple tap. Use a touch’s tapCount property to determine the number of times the user tapped a view.

The best place to find this value is the touchesEnded:withEvent: method, because it corresponds to when the user lifts a finger from a tap. By looking for the tap count in the touch-up phase—when the sequence has ended—you ensure that the finger is really tapping and not, for instance, touching down and dragging. Listing 3-1 shows an example of how to determine whether a double tap occurred in one of your views.

Listing 3-1  Detecting a double tap gesture

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Implemented so the view receives the full touch-event stream
}
 
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // No action needed; tap detection happens in touchesEnded:withEvent:
}
 
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *aTouch in touches) {
        if (aTouch.tapCount >= 2) {
             // The view responds to the tap
             [self respondToDoubleTapGesture:aTouch];
        }
    }
}
 
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    // No state to clean up for simple tap detection
}

Handling Swipe and Drag Gestures

Horizontal and vertical swipes are a simple type of gesture that you can track. To detect a swipe gesture, track the movement of the user’s finger along the desired axis of motion. Then, decide whether the movement is a swipe by examining the following questions for each gesture:

  1. Did the finger move far enough along the axis of motion?

  2. Did the finger stay close to that axis, straying only slightly in the perpendicular direction?

To answer these questions, store the touch’s initial location and compare its location as the touch moves.

Listing 3-2 shows some basic tracking methods you could use to detect horizontal swipes in a view. In this example, a view has a startTouchPosition property that it uses to store a touch’s initial location. In the touchesEnded:withEvent: method, it compares the ending touch position to the starting location to determine whether the movement is a swipe. If the touch moves too far vertically or does not move far enough horizontally, it is not considered to be a swipe. This example does not show the implementation for the myProcessRightSwipe:withEvent: or myProcessLeftSwipe:withEvent: methods, but the custom view would handle the swipe gesture there.

Listing 3-2  Tracking a swipe gesture in a view

#define HORIZ_SWIPE_DRAG_MIN  12
#define VERT_SWIPE_DRAG_MAX    4
 
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *aTouch = [touches anyObject];
    // startTouchPosition is a property
    self.startTouchPosition = [aTouch locationInView:self];
}
 
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
}
 
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *aTouch = [touches anyObject];
    CGPoint currentTouchPosition = [aTouch locationInView:self];
 
    // Check whether the touch moved horizontally and far enough
    if (fabsf(self.startTouchPosition.x - currentTouchPosition.x) >= HORIZ_SWIPE_DRAG_MIN &&
        fabsf(self.startTouchPosition.y - currentTouchPosition.y) <= VERT_SWIPE_DRAG_MAX)
    {
        // The touch appears to be a swipe; determine its direction
        if (self.startTouchPosition.x < currentTouchPosition.x) {
            [self myProcessRightSwipe:touches withEvent:event];
        } else {
            [self myProcessLeftSwipe:touches withEvent:event];
        }
    }
    self.startTouchPosition = CGPointZero;
}
 
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    self.startTouchPosition = CGPointZero;
}

Notice that this code does not check the location of the touch in the middle of the gesture, which means that a gesture could go all over the screen but still be considered a swipe if its start and end points are in line. A more sophisticated swipe gesture recognizer should also check middle locations in the touchesMoved:withEvent: method. To detect swipe gestures in the vertical direction, you would use similar code but would swap the x and y components.
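For example, a midstream check in touchesMoved:withEvent: might abandon the gesture as soon as the touch strays. This sketch reuses the constants and startTouchPosition property from Listing 3-2; trackingSwipe is a hypothetical BOOL property:

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *aTouch = [touches anyObject];
    CGPoint currentTouchPosition = [aTouch locationInView:self];
 
    // Abandon the swipe if the touch strays too far vertically
    if (fabsf(self.startTouchPosition.y - currentTouchPosition.y) > VERT_SWIPE_DRAG_MAX) {
        self.trackingSwipe = NO;
    }
}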

Listing 3-3 shows an even simpler implementation of tracking a single touch; this time, the user drags a view around the screen. Here, the custom view class fully implements only the touchesMoved:withEvent: method. This method computes a delta value between the touch’s current and previous locations in the view. It then uses this delta value to reset the origin of the view’s frame.

Listing 3-3  Dragging a view using a single touch

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
}
 
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *aTouch = [touches anyObject];
    CGPoint loc = [aTouch locationInView:self];
    CGPoint prevloc = [aTouch previousLocationInView:self];
 
    CGRect myFrame = self.frame;
    CGFloat deltaX = loc.x - prevloc.x;
    CGFloat deltaY = loc.y - prevloc.y;
    myFrame.origin.x += deltaX;
    myFrame.origin.y += deltaY;
    [self setFrame:myFrame];
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
}
 
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
}

Handling a Complex Multitouch Sequence

Taps, drags, and swipes typically involve only one touch and are simple to track. Handling a touch event consisting of two or more touches is more challenging. You may have to track all touches through all phases, recording the touch attributes that have changed and altering internal state appropriately. To track and handle multiple touches, you need to:

  1. Set the view’s multipleTouchEnabled property to YES.

  2. Retain state for each touch across phases, typically in a Core Foundation dictionary (CFDictionaryRef) keyed by the addresses of the UITouch objects.

When handling an event with multiple touches, you often store information about a touch’s state so that you can compare touches later. As an example, say you want to compare the final location of each touch with its original location. In the touchesBegan:withEvent: method, you get the original location of each touch from the locationInView: method and store those locations in a CFDictionaryRef object, using the addresses of the UITouch objects as keys. A Core Foundation dictionary is used rather than an NSDictionary object because NSDictionary copies its keys, and UITouch objects do not adopt the NSCopying protocol. Then, in the touchesEnded:withEvent: method, you can use the address of each passed-in touch object to get the object’s original location and compare it with its current location.
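Creating such a dictionary is straightforward. A minimal sketch, assuming a touchBeginPoints instance variable like the one used in Listing 3-4:

// Pass NULL callbacks so the dictionary stores the UITouch pointers
// and the malloc'd CGPoint values as-is, without retaining or copying
CFMutableDictionaryRef touchBeginPoints =
    CFDictionaryCreateMutable(kCFAllocatorDefault, 0, NULL, NULL);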

Listing 3-4 illustrates how to store the starting locations of UITouch objects in a Core Foundation dictionary. The cacheBeginPointForTouches: method stores the location of each touch in the superview’s coordinates so that it has a common coordinate system to compare the location of all of the touches.

Listing 3-4  Storing the beginning locations of multiple touches

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
     [self cacheBeginPointForTouches:touches];
}
 
- (void)cacheBeginPointForTouches:(NSSet *)touches {
    if ([touches count] > 0) {
        for (UITouch *touch in touches) {
            // touchBeginPoints is a CFMutableDictionaryRef instance
            // variable created with NULL key and value callbacks
            CGPoint *point = (CGPoint *)CFDictionaryGetValue(touchBeginPoints, touch);
            if (point == NULL) {
                // Allocate storage for the point; free it when the
                // touch ends or is cancelled
                point = (CGPoint *)malloc(sizeof(CGPoint));
                CFDictionarySetValue(touchBeginPoints, touch, point);
            }
            // Store the location in the superview's coordinate system so
            // that all touches share a common frame of reference
            *point = [touch locationInView:view.superview];
        }
    }
}

Listing 3-5 builds on the previous example. It illustrates how to retrieve the initial locations from the dictionary. Then, it gets the current locations of the same touches so that you can use these values to compute an affine transformation (not shown).

Listing 3-5  Retrieving the initial locations of touch objects

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
     CGAffineTransform newTransform = [self incrementalTransformWithTouches:touches];
}
 
- (CGAffineTransform)incrementalTransformWithTouches:(NSSet *)touches {
     // compareAddress: is a custom comparison method, defined in a
     // UITouch category, that orders the touches by memory address
     NSArray *sortedTouches = [[touches allObjects] sortedArrayUsingSelector:@selector(compareAddress:)];
 
     // Other code here
     CGAffineTransform transform = CGAffineTransformIdentity;
 
     // This method assumes that at least two touches are in progress
     UITouch *touch1 = [sortedTouches objectAtIndex:0];
     UITouch *touch2 = [sortedTouches objectAtIndex:1];
 
     CGPoint beginPoint1 = *(CGPoint *)CFDictionaryGetValue(touchBeginPoints, touch1);
     CGPoint currentPoint1 = [touch1 locationInView:view.superview];
     CGPoint beginPoint2 = *(CGPoint *)CFDictionaryGetValue(touchBeginPoints, touch2);
     CGPoint currentPoint2 = [touch2 locationInView:view.superview];
 
 
     // Compute the affine transform
     return transform;
}
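Because the begin points are allocated with malloc, the responder should also remove and free them when touches end or are cancelled, following the cleanup rule described in “Implementing the Touch-Event Handling Methods in Your Subclass.” A sketch of such a cleanup method, assuming the same touchBeginPoints variable (the method name is illustrative):

- (void)removeCachedPointsForTouches:(NSSet *)touches {
    for (UITouch *touch in touches) {
        CGPoint *point = (CGPoint *)CFDictionaryGetValue(touchBeginPoints, touch);
        if (point != NULL) {
            // Remove the entry before freeing its storage
            CFDictionaryRemoveValue(touchBeginPoints, touch);
            free(point);
        }
    }
}

You would call this method from both touchesEnded:withEvent: and touchesCancelled:withEvent:.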

The next example, Listing 3-6, does not use a dictionary to track touch mutations; however, it handles multiple touches during an event. It shows a custom UIView object responding to touches by animating the movement of a “Welcome” placard as a finger moves it around the screen. It also changes the language of the placard when the user double taps. This example comes from the MoveMe sample code project, which you can examine to get a better understanding of the event handling context.

Listing 3-6  Handling a complex multitouch sequence

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
 
     // App supports only single touches, so anyObject retrieves just
     // that touch from touches
     UITouch *touch = [touches anyObject];
 
     // Move the placard view only if the touch was in the placard view
     if ([touch view] != placardView) {
          // In case of a double tap outside the placard view, update
          // the placard's display string
          if ([touch tapCount] == 2) {
               [placardView setupNextDisplayString];
          }
          return;
     }
 
     // Animate the first touch
     CGPoint touchPoint = [touch locationInView:self];
     [self animateFirstTouchAtPoint:touchPoint];
}
 
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
 
     UITouch *touch = [touches anyObject];
 
     // If the touch was in the placardView, move the placardView to its location
     if ([touch view] == placardView) {
          CGPoint location = [touch locationInView:self];
          placardView.center = location;
     }
}
 
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
 
    UITouch *touch = [touches anyObject];
 
    // If the touch was in the placardView, bounce it back to the center
    if ([touch view] == placardView) {
        // Disable user interaction so subsequent touches
        // don't interfere with animation
        self.userInteractionEnabled = NO;
        [self animatePlacardViewToCenter];
    }
}
 
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
 
     // On cancellation, restore the placard view's center and
     // transform to their original values
     placardView.center = self.center;
     placardView.transform = CGAffineTransformIdentity;
}
 

To find out when the last finger in a multitouch sequence is lifted from a view, compare the number of touch objects in the passed-in set with the number of touches for that view in the passed-in UIEvent object. If the two counts are equal, the multitouch sequence has concluded. Listing 3-7 illustrates how to do this in code.

Listing 3-7  Determining when the last touch in a multitouch sequence has ended

- (void)touchesEnded:(NSSet*)touches withEvent:(UIEvent*)event {
    if ([touches count] == [[event touchesForView:self] count]) {
        // Last finger has lifted
    }
}

Remember that a passed-in set contains only the touch objects associated with the view that are new or changed for a given phase, whereas the set returned from the touchesForView: method includes all touch objects associated with the specified view.

Specifying Custom Touch Event Behavior

Customize the way your app handles touches by altering the behavior of a specific gesture, a specific view, or all of the touch events in your app. You can alter the stream of touch events in the following ways:

Turning on delivery of multiple touches. By default, a view ignores all but the first touch during a multitouch sequence. To let a view receive multiple touches, set its multipleTouchEnabled property to YES.

Restricting event delivery to a single view. By default, a view’s exclusiveTouch property is set to NO, which means that the view does not block other views in the same window from receiving touches. Setting the property to YES makes the view block touch delivery to other views in its window while it is tracking touches.

Restricting event delivery to subviews. A custom view can override hitTest:withEvent: so that touches are delivered to the view itself rather than to its subviews, as described in “Intercepting Touches by Overriding Hit-Testing.”

You can also turn off touch-event delivery completely, or just for a period of time:

To turn off delivery completely, set the view’s userInteractionEnabled property to NO. Remember that a view also receives no touch events while it is hidden or fully transparent.

To turn off delivery for a period of time, call the UIApplication method beginIgnoringInteractionEvents and, later, the endIgnoringInteractionEvents method; for example, while an animation runs, as shown in the sketch below.
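A minimal sketch of the second technique (the animation block contents are illustrative):

// Suppress touch delivery while the animation runs
[[UIApplication sharedApplication] beginIgnoringInteractionEvents];
[UIView animateWithDuration:0.5
                 animations:^{
                     // Move views, change alpha values, and so on
                 }
                 completion:^(BOOL finished) {
                     // Restore normal touch delivery
                     [[UIApplication sharedApplication] endIgnoringInteractionEvents];
                 }];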

Intercepting Touches by Overriding Hit-Testing

If you have a custom view with subviews, you need to determine whether you want to handle touches at the subview level or the superview level. If you choose to handle touches at the superview level—meaning that your subviews do not implement the touchesBegan:withEvent:, touchesEnded:withEvent:, or touchesMoved:withEvent: methods—then the superview should override hitTest:withEvent: to return itself rather than any of its subviews.

Overriding hit-testing ensures that the superview receives all touches because, by setting itself as the hit-test view, the superview intercepts and receives touches that are normally passed to the subview first. If a superview does not override hitTest:withEvent:, touch events are associated with the subviews where they first occurred and are never sent to the superview.
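A minimal override looks like the following sketch; the superview claims any touch that lands within its bounds, so subviews never become the hit-test view:

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // Claim the touch for this view instead of letting a subview receive it
    if ([self pointInside:point withEvent:event]) {
        return self;
    }
    return [super hitTest:point withEvent:event];
}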

Recall that there are two hit-test methods: the hitTest:withEvent: method of views and the hitTest: method of layers, as described in “Hit-Testing Returns the View Where a Touch Occurred.” You rarely need to call these methods yourself. It’s more likely that you will override them to intercept touch events from subviews. However, sometimes responders perform hit-testing prior to event forwarding (see “Forwarding Touch Events”).

Forwarding Touch Events

To forward touch events to another responder object, send the appropriate touch-event handling messages to that object. Use this technique with caution because UIKit classes are not designed to receive touches that are not bound to them. For a responder object to handle a touch, the touch’s view property must hold a reference to the responder. If you want to conditionally forward touches to other responders in your app, all of the responders should be instances of your own UIView subclass.

For example, let’s say an app has three custom views: A, B, and C. When the user touches view A, the app’s window determines that it is the hit-test view and sends to it the initial touch event. Depending on certain conditions, view A forwards the event to either view B or view C. In this case, views A, B, and C must be aware of this forwarding, and views B and C must be able to deal with touches that are not bound to them.
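The forwarding itself is an ordinary message send. A hypothetical sketch for view A, where viewB, viewC, and shouldUseAlternateTarget are illustrative properties:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Forward the touches based on some app-specific condition;
    // viewB and viewC must be custom views prepared to receive
    // touches that are not bound to them
    if (self.shouldUseAlternateTarget) {
        [self.viewB touchesBegan:touches withEvent:event];
    } else {
        [self.viewC touchesBegan:touches withEvent:event];
    }
}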

Event forwarding often requires analyzing touch objects to determine where they should be forwarded. There are a couple of approaches you can take for this analysis:

With an “overlay” view, such as a common superview, use hit-testing to intercept touch events for analysis before forwarding them to subviews (see “Intercepting Touches by Overriding Hit-Testing”).

Override the sendEvent: method in a subclass of UIWindow, analyze the touches, and forward them to the appropriate responder objects.

Overriding the sendEvent: method allows you to monitor the events your app receives. Both the UIApplication object and each UIWindow object dispatch events in the sendEvent: method, so this method serves as a funnel point for events coming in to an app. This is something that very few apps need to do and, if you do override sendEvent:, be sure to invoke the superclass implementation—[super sendEvent:theEvent]. Never tamper with the distribution of events.

Listing 3-8 illustrates this technique in a subclass of UIWindow. In this example, events are forwarded to a custom helper responder that performs affine transformations on the view that it is associated with.

Listing 3-8  Forwarding touch events to helper responder objects

- (void)sendEvent:(UIEvent *)event {
    for (TransformGesture *gesture in transformGestures) {
        // Collect all the touches you care about from the event
        NSSet *touches = [gesture observedTouchesForEvent:event];
        NSMutableSet *began = nil;
        NSMutableSet *moved = nil;
        NSMutableSet *ended = nil;
        NSMutableSet *canceled = nil;
 
        // Sort the touches by phase, similar to normal event dispatch
        for (UITouch *touch in touches) {
            switch ([touch phase]) {
                case UITouchPhaseBegan:
                    if (!began) began = [NSMutableSet set];
                    [began addObject:touch];
                    break;
                case UITouchPhaseMoved:
                    if (!moved) moved = [NSMutableSet set];
                    [moved addObject:touch];
                    break;
                case UITouchPhaseEnded:
                    if (!ended) ended = [NSMutableSet set];
                    [ended addObject:touch];
                    break;
                case UITouchPhaseCancelled:
                    if (!canceled) canceled = [NSMutableSet set];
                    [canceled addObject:touch];
                    break;
                default:
                    break;
            }
        }
        // Call methods to handle the touches
        if (began)     [gesture touchesBegan:began withEvent:event];
        if (moved)     [gesture touchesMoved:moved withEvent:event];
        if (ended)     [gesture touchesEnded:ended withEvent:event];
        if (canceled) [gesture touchesCancelled:canceled withEvent:event];
    }
    [super sendEvent:event];
}

Notice that the overriding subclass invokes the superclass implementation of the sendEvent: method. This is important to the integrity of the touch-event stream.

Best Practices for Handling Multitouch Events

When handling both touch and motion events, there are a few recommended techniques and patterns you should follow:

Always implement the event-cancellation methods. In touchesCancelled:withEvent:, restore the view’s state and dispose of any objects you created earlier in the sequence.

If you handle events in a subclass of UIView, UIViewController, or UIResponder, implement all of the event-handling methods, even if the implementations are empty, and do not call the superclass implementation of those methods.

If you handle events in a subclass of any other UIKit responder class, you do not have to implement all of the event-handling methods, but in the methods you do implement, be sure to call the superclass implementation.

Do not forward events to UIKit objects that are not designed to receive them. Forward touches only to instances of your own responder subclasses, as described in “Forwarding Touch Events.”