I developed a small drawing application for the iPad in Swift using Metal. It supports drawing lines with the Apple Pencil, erasing them, selecting them, and so on.
Everything is drawn onto an offscreen texture, so I don't have to redraw all of the lines every frame. The app also supports zooming and panning; when you zoom, everything has to be redrawn, and that is where the problem begins. At the moment, while panning I simply draw the offscreen texture (rendered at zoom level 1.0) and move it around. This looks very pixelated, because I am zooming into the offscreen texture without re-rendering the texture itself. But I can't re-render the whole offscreen texture every frame while panning, for performance reasons.
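For reference, the transform I apply when compositing the cached texture looks roughly like this (a simplified sketch; `CanvasTransform` and its fields are placeholder names, not my exact code):

```swift
import simd

// Rough sketch of the pan/zoom transform applied to the cached texture.
struct CanvasTransform {
    var zoom: Float = 1.0          // cached texture is rendered at zoom 1.0
    var pan: SIMD2<Float> = .zero  // pan offset in screen points

    // screen = canvas * zoom + pan; panning only changes `pan`, so the
    // cached texture can be re-drawn without re-rendering any strokes --
    // but zoom > 1.0 just magnifies the zoom-1.0 pixels.
    func matrix(viewportSize: SIMD2<Float>) -> simd_float4x4 {
        let sx = 2 * zoom / viewportSize.x
        let sy = -2 * zoom / viewportSize.y
        let tx = 2 * pan.x / viewportSize.x - 1
        let ty = 1 - 2 * pan.y / viewportSize.y
        return simd_float4x4(columns: (
            SIMD4<Float>(sx, 0, 0, 0),
            SIMD4<Float>(0, sy, 0, 0),
            SIMD4<Float>(0, 0, 1, 0),
            SIMD4<Float>(tx, ty, 0, 1)))
    }
}
```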
My idea was to implement a tile-based progressive rendering technique: split the screen into tiles of, say, 256x256 pixels, so that while panning I can just move the existing tiles and only render the newly visible ones. But I don't really know where to start, and in particular I don't know how to render the lines onto those tiles. At the moment every line has a single vertex buffer storing its triangles. I thought maybe I could use multiple color attachments on the render encoder, so that every colorAttachments[n].texture would be a tile, perhaps with the help of viewport culling? I don't have much experience in this area, so I have no idea.
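From what I read, a render pass supports at most 8 color attachments, so I wonder whether one render pass per tile (with the tile texture as colorAttachments[0]) is the simpler route. Here is a rough sketch of what I mean, assuming a `Line` type with a vertex buffer and a canvas-space bounding box (`Line`, `TileRenderer`, and all field names are placeholders for my actual types, and I'm not sure this is correct):

```swift
import Metal
import simd

// Placeholder for my line type: one vertex buffer of triangles plus
// an axis-aligned bounding box in canvas coordinates.
struct Line {
    var vertexBuffer: MTLBuffer
    var vertexCount: Int
    var bounds: CGRect
}

final class TileRenderer {
    let tileSize = 256
    let device: MTLDevice
    let pipeline: MTLRenderPipelineState  // the existing line pipeline
    // Keyed by integer tile coordinates; would need to be cleared
    // whenever the zoom level changes.
    var tileCache: [SIMD2<Int32>: MTLTexture] = [:]

    init(device: MTLDevice, pipeline: MTLRenderPipelineState) {
        self.device = device
        self.pipeline = pipeline
    }

    // Render one tile: the tile texture is the *only* color attachment,
    // and the projection matrix shifts canvas space so this tile's
    // region fills the 256x256 target.
    func renderTile(at coord: SIMD2<Int32>, zoom: Float, lines: [Line],
                    commandBuffer: MTLCommandBuffer) -> MTLTexture? {
        if let cached = tileCache[coord] { return cached }

        let desc = MTLTextureDescriptor.texture2DDescriptor(
            pixelFormat: .bgra8Unorm, width: tileSize, height: tileSize,
            mipmapped: false)
        desc.usage = [.renderTarget, .shaderRead]
        guard let texture = device.makeTexture(descriptor: desc) else { return nil }

        // The tile's footprint in canvas coordinates at the current zoom.
        let side = CGFloat(tileSize) / CGFloat(zoom)
        let tileRect = CGRect(x: CGFloat(coord.x) * side,
                              y: CGFloat(coord.y) * side,
                              width: side, height: side)

        let pass = MTLRenderPassDescriptor()
        pass.colorAttachments[0].texture = texture
        pass.colorAttachments[0].loadAction = .clear
        pass.colorAttachments[0].storeAction = .store

        guard let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: pass)
        else { return nil }
        encoder.setRenderPipelineState(pipeline)

        // Orthographic projection mapping tileRect to clip space, so the
        // same line vertex buffers render correctly into any tile.
        var projection = makeOrtho(rect: tileRect)
        encoder.setVertexBytes(&projection,
                               length: MemoryLayout<simd_float4x4>.stride,
                               index: 1)

        // CPU-side culling: only draw lines whose bounds touch this tile.
        for line in lines where line.bounds.intersects(tileRect) {
            encoder.setVertexBuffer(line.vertexBuffer, offset: 0, index: 0)
            encoder.drawPrimitives(type: .triangle, vertexStart: 0,
                                   vertexCount: line.vertexCount)
        }
        encoder.endEncoding()

        tileCache[coord] = texture
        return texture
    }

    // Maps canvas-space `rect` to clip space (y flipped, canvas y down).
    private func makeOrtho(rect: CGRect) -> simd_float4x4 {
        let sx = Float(2 / rect.width)
        let sy = Float(-2 / rect.height)
        let tx = Float(-1 - 2 * rect.minX / rect.width)
        let ty = Float(1 + 2 * rect.minY / rect.height)
        return simd_float4x4(columns: (
            SIMD4<Float>(sx, 0, 0, 0),
            SIMD4<Float>(0, sy, 0, 0),
            SIMD4<Float>(0, 0, 1, 0),
            SIMD4<Float>(tx, ty, 0, 1)))
    }
}
```

The idea would be that each frame only the tiles that just became visible get rendered (one cheap pass each), while the rest come from the cache. Is that a reasonable way to do it, or am I missing something?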
I also found the sparse texture feature, which looks like it would solve my problem, but I need to support iPads that don't have it. Has someone else built something similar, or does anyone have example code for this? Or a completely different idea that could help?
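For the runtime check I would do something like this (a sketch; as far as I can tell, sparse textures require MTLGPUFamily.apple6, i.e. A13 and newer, so older iPads would need the manual tile fallback):

```swift
import Metal

guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("no Metal device available")
}

if device.supportsFamily(.apple6) {
    // Sparse-texture path: map only the tiles that are currently visible.
} else {
    // Fallback path: the manual 256x256 tile cache sketched above.
}
```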
Thank you very much for your help!