Article

Minimizing Latency with Predicted Touches

Create a smooth and responsive drawing experience using UIKit's predictions for touch locations.

Overview

It takes time for UIKit to generate and deliver touch events to your app, and it takes time for your app to process those events and render the results. In fact, it takes enough time that there can be a visible lag between the movements of the user's finger or Apple Pencil and the rendered results. To minimize the perceived latency between touch input and rendered content, you can incorporate predicted touches into your event handling.

Predicted touches are the system’s best guess of where the next touch events will occur. Figure 1 illustrates the concept in a drawing app. When a drawing sequence begins, UIKit uses previous touch locations from the user's finger or Apple Pencil to predict where the next touch may occur. UIKit generates additional UITouch objects for these predicted locations and makes them available to your app.

Figure 1

Predicting the path of touch events

A diagram demonstrating Apple Pencil tracing a path, with actual and predicted touch locations.

To retrieve predicted touch data, call the predictedTouchesForTouch: method of the UIEvent object containing the original UITouch object. That method returns an array of touches predicted to occur after the last actual touch. Always treat predicted touches as temporary data in your app, and discard them upon receipt of each new touch event.
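The pattern above can be sketched in a drawing view's touch handler. This is a minimal illustration, not a complete drawing implementation; the CanvasView class and its point arrays are hypothetical names chosen for the example.

```swift
import UIKit

class CanvasView: UIView {
    // Points from actual touches; these persist across events.
    private var committedPoints: [CGPoint] = []
    // Points from predicted touches; temporary data that is
    // replaced on every new touch event, never stored permanently.
    private var predictedPoints: [CGPoint] = []

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }

        // Discard the previous event's predictions before
        // processing the new touch.
        predictedPoints.removeAll()

        // Record the actual touch location.
        committedPoints.append(touch.location(in: self))

        // Extend the stroke ahead of the last actual touch with
        // the system's predicted locations.
        if let predicted = event?.predictedTouches(for: touch) {
            predictedPoints = predicted.map { $0.location(in: self) }
        }

        // Redraw with committed points plus the temporary
        // predicted segment.
        setNeedsDisplay()
    }
}
```

Because the predicted segment is cleared and rebuilt on every event, the rendered stroke never commits speculative points that the next actual touch may contradict.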

Topics

Example

Incorporating Predicted Touches into an App

Learn how to create a simple app that incorporates predicted touches into its drawing code.

See Also

Advanced Touch Handling

Implementing a Multi-Touch App

Learn how to create a simple app that handles multitouch input.

Getting High-Fidelity Input with Coalesced Touches

Learn how to support high-precision touches in your app.