Use touch events directly in a view subclass if the touch handling is intricately linked to the view's visual content.
If you don't plan to use gesture recognizers with a custom view, you can handle touch events directly from the view itself. Because views are responders, they can handle multitouch events and many other types of events. When UIKit determines that a touch event occurred in a view, it calls the view's touchesBegan(_:with:), touchesMoved(_:with:), or touchesEnded(_:with:) method. You can override these methods in your custom views and use them to provide a response to touch events.
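As a minimal sketch, a custom view might override these responder methods like this (the class name and the logging are illustrative, not part of UIKit):

```swift
import UIKit

// Hypothetical view that responds to touches by overriding the
// UIResponder touch-handling methods directly.
class TouchableView: UIView {
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        // The touch's location is reported in this view's coordinate space.
        print("Touch began at \(touch.location(in: self))")
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        print("Touch moved to \(touch.location(in: self))")
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        print("Touch sequence ended")
    }
}
```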
The system may cancel an ongoing touch sequence at any time—for example, when an incoming phone call interrupts the app. When it does, UIKit notifies your view by calling the touchesCancelled(_:with:) method. You use that method to perform any needed cleanup of your view's data structures.
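The cleanup might look like the following sketch, where `currentStroke` is a hypothetical property used for illustration:

```swift
import UIKit

// Hypothetical drawing view: accumulate points while a touch moves,
// and discard the partial stroke if the system cancels the sequence.
class CanvasView: UIView {
    private var currentStroke: [CGPoint] = []

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        currentStroke.append(touch.location(in: self))
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Clean up: throw away the in-progress stroke rather than
        // committing it, then redraw the view without it.
        currentStroke.removeAll()
        setNeedsDisplay()
    }
}
```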
The methods you override in your views (or in any responder) to handle touches correspond to different phases of the touch event–handling process. When a finger (or Apple Pencil) touches the screen, UIKit creates a UITouch object, sets the touch location to the appropriate point, and sets its phase property to began. When the same finger moves around the screen, UIKit updates the touch location and changes the phase property of the touch object to moved. When the user lifts the finger from the screen, UIKit changes the phase property to ended and the touch sequence ends. Figure 1 illustrates the different phases of a touch event.
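One way to see these phases in code is to switch on a touch's phase property in a single helper, as in this sketch (the function is illustrative, not a UIKit API):

```swift
import UIKit

// Hypothetical helper that reacts to each phase of a touch's lifecycle.
func handle(_ touch: UITouch, in view: UIView) {
    switch touch.phase {
    case .began:
        print("Touch began at \(touch.location(in: view))")
    case .moved:
        print("Touch moved to \(touch.location(in: view))")
    case .ended:
        print("Touch ended; sequence is over")
    case .cancelled:
        print("Touch cancelled by the system")
    default:
        break // e.g. .stationary, or newer hover-related phases
    }
}
```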
UIKit creates a new UITouch object for each new finger that touches the screen. The touches themselves are delivered with the current UIEvent object. UIKit distinguishes between touches originating from a finger and from Apple Pencil, and you can treat each of them differently.
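A touch's type property tells you where it originated, so a view can branch on it, as in this sketch:

```swift
import UIKit

// Hypothetical view that treats Apple Pencil input differently
// from finger input by inspecting each touch's type property.
class InputAwareView: UIView {
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            switch touch.type {
            case .pencil:
                // Pencil touches carry extra data, such as force.
                print("Pencil touch, force \(touch.force)")
            case .direct:
                print("Finger touch")
            default:
                break // e.g. .indirect input
            }
        }
    }
}
```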