Flutter app using RealityKit to highlight buttons on visionOS?

Hi!

I have a Flutter project that targets Web and iOS. Overall, our app works quite well on Vision Pro; the only issue is that our UI elements do not highlight when the user looks at them. (Our UI does highlight on mouseover; we tried tinkering with the mouseover visuals, but that didn't help with gaze.)

We're considering writing some native Swift code to patch this hole in Flutter's visionOS support. However, after some searching, we haven't found any obvious solution in the documentation.

The HoverEffectComponent ( https://developer.apple.com/documentation/realitykit/hovereffectcomponent ) in RealityKit seems like the closest thing to adding focus-based behavior. However, if I understand correctly, that would mean adding an Entity for every Flutter UI element the user can interact with, and then rebuilding the list of Entities every time the UI is repainted... which doesn't sound especially performant.
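For reference, here's a minimal sketch of what that entity-per-element approach might look like on the Swift side, assuming a SwiftUI `RealityView` hosts the content (the box geometry and sizes are placeholders). Note that a hover effect only fires on entities that also carry an `InputTargetComponent` and a `CollisionComponent`:

```swift
import SwiftUI
import RealityKit

struct HoverDemoView: View {
    var body: some View {
        RealityView { content in
            // Placeholder entity standing in for one interactive UI element.
            let entity = ModelEntity(mesh: .generateBox(size: 0.1))

            // Hover effects require the entity to be an input target
            // and to have a collision shape for hit-testing.
            entity.components.set(InputTargetComponent())
            entity.components.set(
                CollisionComponent(shapes: [.generateBox(size: [0.1, 0.1, 0.1])])
            )

            // Applies the system gaze-highlight effect; the actual gaze
            // location is never exposed to the app.
            entity.components.set(HoverEffectComponent())

            content.add(entity)
        }
    }
}
```

One such entity would be needed per interactive Flutter element, positioned and resized to track the Flutter layout, which is the rebuild-on-repaint cost described above.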

Is there some other method of capturing the user's gaze in the context of an iOS app?

Yesterday I was playing with an accessibility feature in visionOS that makes it easier to see which buttons the user (me) is looking at. You might want to check it out (I'm sure I found it in System Settings) and see if it helps at all.

I think that, for privacy reasons, Apple prevents apps from capturing the user's gaze information directly.
