Hey all,
I'm building a tile-based puzzle game for iOS and have been experimenting with using SwiftUI for the HUD (score, timer, pause menu, settings sheet) layered on top of a SpriteView that hosts the actual gameplay scene. So far the integration has been smoother than I expected — SpriteView drops cleanly into a ZStack, and I can drive SwiftUI state from the SpriteKit scene via an ObservableObject shared between them.
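For context, the wiring looks roughly like this (GameState, PuzzleScene, and the scene size are illustrative names, not my real project):

```swift
import SwiftUI
import SpriteKit

// Shared state bridge between the SKScene and the SwiftUI HUD.
final class GameState: ObservableObject {
    @Published var score = 0
}

final class PuzzleScene: SKScene {
    weak var gameState: GameState?

    func tileMatched(points: Int) {
        // SKScene callbacks run on the main thread, so it's safe to
        // mutate @Published state directly here.
        gameState?.score += points
    }
}

struct GameView: View {
    @StateObject private var gameState = GameState()
    // Build the scene once; a computed property would recreate it on
    // every body evaluation and reset gameplay.
    @State private var scene: PuzzleScene = {
        let s = PuzzleScene(size: CGSize(width: 390, height: 844))
        s.scaleMode = .resizeFill
        return s
    }()

    var body: some View {
        ZStack {
            SpriteView(scene: scene)
                .ignoresSafeArea()
            VStack {
                Text("Score: \(gameState.score)")  // HUD driven by scene updates
                Spacer()
            }
        }
        .onAppear { scene.gameState = gameState }
    }
}
```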
That said, I've run into a few rough edges that I'd love some input on. The biggest one is touch handling: when a SwiftUI overlay (like a semi-transparent pause button) sits over the SpriteView, taps near the edges of the button occasionally get swallowed by the underlying scene, even when the button's hit area looks correct in the view debugger. I've tried .contentShape(Rectangle()) and bumping the frame, which helps but doesn't fully eliminate it. Curious if anyone has landed on a reliable pattern here, especially for transient overlays like toast notifications that need to ignore touches everywhere except on the toast itself.
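Here's roughly what I'm doing now for both cases (names illustrative). The detail that seems to matter most is modifier order: padding before contentShape, so the padded area is actually hittable. It helps, but edge taps still slip through occasionally:

```swift
import SwiftUI

// Pause button with an enlarged hit area. Order matters: padding
// first, then contentShape, so the padding becomes tappable.
struct PauseButton: View {
    let action: () -> Void

    var body: some View {
        Button(action: action) {
            Image(systemName: "pause.circle.fill")
                .font(.largeTitle)
                .padding(12)                 // enlarge the frame
                .contentShape(Rectangle())   // make the padding hittable
        }
    }
}

// Transient toast: only the capsule has a content shape, so touches
// in the surrounding transparent area fall through to the SpriteView.
struct Toast: View {
    let message: String
    let onTap: () -> Void

    var body: some View {
        VStack {
            Spacer()
            Text(message)
                .padding()
                .background(.ultraThinMaterial, in: Capsule())
                .contentShape(Capsule())
                .onTapGesture(perform: onTap)
                .padding(.bottom, 40)
        }
    }
}
```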
The other thing I'm weighing is animation ownership. Right now, gameplay animations (tile slides, match effects) live in SpriteKit, and HUD animations (score pop, combo counter) live in SwiftUI with withAnimation. It works, but the two animation systems don't share a clock, so when I want a "tile matched → score increment" effect to feel synchronized, I end up dispatching from the SKScene back to the ObservableObject and hoping the frame timing lines up. Has anyone found a cleaner way to coordinate timing across the two, or is this just the cost of mixing the frameworks?
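One variant I've been experimenting with: instead of dispatching from update(_:) and hoping, fire the SwiftUI change from the SKAction completion block, so the score pop starts exactly when the tile effect ends. Sketch below (GameState and the action parameters are illustrative):

```swift
import SpriteKit
import SwiftUI

final class GameState: ObservableObject {
    @Published var score = 0
}

extension SKScene {
    func runMatchEffect(on tile: SKNode, points: Int, state: GameState) {
        let pop = SKAction.sequence([
            .scale(to: 1.2, duration: 0.08),
            .fadeOut(withDuration: 0.12),
            .removeFromParent()
        ])
        tile.run(pop) {
            // The completion fires on the main thread as part of the
            // scene's action processing; hand off to SwiftUI with its
            // own animation at that exact moment.
            withAnimation(.spring(response: 0.2, dampingFraction: 0.7)) {
                state.score += points
            }
        }
    }
}
```

It still isn't a shared clock, but anchoring the handoff to the SKAction's end point at least removes the guesswork about which frame the dispatch lands on.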
Would also love to hear from anyone who has shipped a game with this hybrid setup — any gotchas around ScenePhase transitions, backgrounding, or memory pressure that bit you in production?
Thanks!