How to track gaze?

My visionOS app was rejected for not supporting gaze-and-pinch interaction, but as far as I can tell there is no way to track the user's eye movements in visionOS. (The intended means of interaction for my app is an extended gamepad.) I can detect a pinch, but I can't find any way to detect where the user is looking. ARFaceAnchor and lookAtPoint appear not to be available in visionOS. So how do we go about doing this? For context, I am porting a Metal game to visionOS from iOS.

Answered by Vision Pro Engineer in 790177022

Are you rendering with Metal in a Window using CAMetalLayer or are you using an ImmersiveSpace with CompositorServices?

Both. The user can switch between immersive and 2D using a toggle.

When using CAMetalLayer, you can use UIKit and SwiftUI input support. The event location will be the intersection of the user's gaze with the CAMetalLayer at the moment of the pinch.
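
For the windowed case, something like the following sketch should work. MetalView here is a hypothetical UIViewRepresentable standing in for your CAMetalLayer-backed view; the actual Metal setup and draw loop are omitted:

```swift
import SwiftUI
import UIKit

// Hypothetical stand-in for the app's CAMetalLayer-backed view;
// Metal device/pipeline setup and rendering are omitted.
struct MetalView: UIViewRepresentable {
    final class MetalBackedView: UIView {
        // Back this UIView with a CAMetalLayer.
        override class var layerClass: AnyClass { CAMetalLayer.self }
    }
    func makeUIView(context: Context) -> MetalBackedView { MetalBackedView() }
    func updateUIView(_ uiView: MetalBackedView, context: Context) {}
}

struct GameView: View {
    var body: some View {
        MetalView()
            .gesture(
                SpatialTapGesture().onEnded { value in
                    // value.location is where the system resolved the gaze
                    // on this view when the pinch occurred (local
                    // coordinates). The app never receives continuous
                    // eye-tracking data.
                    print("Pinch resolved at \(value.location)")
                }
            )
    }
}
```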

For the CompositorLayer, you can use the onSpatialEvent callback, which provides the pinch along with the gaze direction captured when the pinch began: https://developer.apple.com/documentation/compositorservices/layerrenderer/4245856-onspatialevent
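
For the immersive case, a minimal sketch of wiring up onSpatialEvent inside a CompositorLayer might look like this (the space id is a placeholder, and the render loop itself is not shown):

```swift
import SwiftUI
import CompositorServices

struct ImmersiveScene: Scene {
    var body: some Scene {
        ImmersiveSpace(id: "Game") {
            CompositorLayer { layerRenderer in
                // Called with a SpatialEventCollection whenever spatial
                // input (e.g. a pinch) occurs in the immersive space.
                layerRenderer.onSpatialEvent = { events in
                    for event in events where event.phase == .active {
                        // selectionRay is the gaze-derived ray captured
                        // when the pinch began; cast it into your scene
                        // to find what the user was looking at.
                        if let ray = event.selectionRay {
                            print("Pinch ray from \(ray.origin) toward \(ray.direction)")
                        }
                    }
                }
                // Start the app's Metal render loop here (not shown).
            }
        }
    }
}
```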

Thank you. I'll take a look at that.
