PSVR2 controller button quirks

I have an open Feedback conversation with Apple on this topic, but I'm curious whether others have run into this, or want to try out my sample code in their setup.

There are two APIs for reading controller buttons, axes, and D-pads: GCPhysicalInputProfile and GCControllerLiveInput. There are inconsistencies in behaviour between the two. Apple recommends we use GCControllerLiveInput; however, some capabilities of these controllers are only accessible through GCPhysicalInputProfile, as I'll discuss below.

  • PSVR2 R2/L2 buttons, a.k.a. triggers, report analogue force input values. These can only be accessed through GCPhysicalInputProfile.

  • PSVR2 thumbstick direction values are read through “axes” on GCPhysicalInputProfile, but only through “dpads” on GCControllerLiveInput.

  • On both GCPhysicalInputProfile and GCControllerLiveInput, pressed events for all buttons fire properly using the generic aliases (Trigger, Grip, Menu, Right Thumbstick, Left Thumbstick, Right Buttons A & B (Circle & Cross), Left Buttons A & B (Triangle & Square)). Apple reserves the system button as the equivalent of a home button for the OS.

  • On GCPhysicalInputProfile, touch events fire only when the button is also pressed, never for a touch alone.

  • On GCControllerLiveInput, touch events only work for the following buttons: Left Thumbstick, Right Thumbstick, Right Button A (Circle), and Right Button B (Cross). However, the Right Button B touch event isn’t labelled correctly: it fires as the Right Button A event.
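To make the comparison concrete, the polling I'm doing looks roughly like this (a simplified sketch, not the attached sample verbatim; the function name is mine, and it assumes a connected controller is passed in):

```swift
import GameController

// Poll the same controller through both APIs so the inconsistencies
// above can be logged side by side.
func pollBothAPIs(_ controller: GCController) {
    // Older flat API: buttons, axes, and dpads as dictionaries.
    let profile = controller.physicalInputProfile
    for (name, button) in profile.buttons {
        print("profile \(name): pressed=\(button.isPressed), touched=\(button.isTouched), value=\(button.value)")
    }
    for (name, axis) in profile.axes {
        // Thumbstick directions show up here as axes…
        print("profile axis \(name): \(axis.value)")
    }

    // Newer element-based API (visionOS / iOS 17+).
    let live = controller.input
    for button in live.buttons {
        let touched = button.touchedInput.map { "\($0.isTouched)" } ?? "n/a"
        print("live \(button.aliases): pressed=\(button.pressedInput.isPressed), touched=\(touched)")
    }
    for dpad in live.dpads {
        // …but only as dpads here.
        print("live dpad \(dpad.aliases): x=\(dpad.xAxis.value), y=\(dpad.yAxis.value)")
    }
}
```

Running this while touching (but not pressing) each button is what surfaced the discrepancies listed above.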

I observed this inside ALVR, which uses a polling-based approach to event processing:

https://github.com/alvr-org/alvr-visionos/blob/17b5968f9d894944b53e97134b39dfce0993302a/ALVRClient/WorldTracker.swift#L301

To make this easier to see in a very simple app, I used Apple’s TrackingAccessories example application: https://developer.apple.com/documentation/ARKit/tracking-accessories-in-volumetric-windows

I’ve attached the code that replaces the AccessoryTrackingModel class. I added code that prints out what is touched/pressed; see the trackAllConnectedSpatialControllers method: https://github.com/svrc/TrackingAccessories

Hello

We received your feedback reports (thanks!). While I have not yet looked into why you are observing the reported inconsistencies with input when using the GCControllerLiveInput collection of APIs, let me clarify how input is expected to be reported for the PlayStation VR 2 Sense controllers on visionOS.

I am going to assume you have followed the first step in this article to declare your application's support for the Spatial Gamepad controller profile. That is, you have added the GCSupportedGameControllers key to your app's Info.plist, with a SpatialGamepad profile name entry.
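For reference, that declaration has this shape in the Info.plist source (key and profile name as described in the article; treat this as a sketch of the structure rather than a copy from a shipping app):

```xml
<key>GCSupportedGameControllers</key>
<array>
    <dict>
        <key>ProfileName</key>
        <string>SpatialGamepad</string>
    </dict>
</array>
```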

Each PSVR2 controller includes one thumbstick and seven buttons. Six of these buttons are available to the application (the PS logo button is reserved for system use). Five of these buttons include capacitive sensors that can detect finger contact. The Game Controller framework provides access to these elements (minus the logo button) using the following alias names. These aliases are the same between the left and right controllers.

  • GCInputThumbstick (dpad & button): Thumbstick (includes capacitive sensor)
  • GCInputButtonA (button): Cross/Square buttons (includes capacitive sensor)
  • GCInputButtonB (button): Circle/Triangle buttons (includes capacitive sensor)
  • GCInputGripButton (button): L1/R1 button (includes capacitive sensor)
  • GCInputTrigger (button): L2/R2 (includes capacitive sensor)
  • GCInputButtonMenu (button): Menu/Create button

I recommend using the GCControllerLiveInput collection of APIs to access PSVR2 input. Elements representing each of the above control surfaces are found by querying the dpads (for dpads & thumbsticks) and buttons collections on the GCControllerLiveInput object returned from controller.input. Both collections will contain an element for GCInputThumbstick because the thumbstick on PSVR2 is "clickable".

The button elements are represented by an object conforming to GCButtonElement. Your app can query the button press state, and analog trigger position, from the button element's pressedInput object. For the buttons that include a capacitive sensor, your app can query the contact state from the button element's touchedInput object. The forceInput property on GCButtonElement is (at this time) only used for the Logitech Muse stylus, and will always return nil for all PSVR2 buttons.
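Put together, the recommended access pattern looks roughly like this (a sketch using the alias names listed above; the function name is mine, and it assumes a connected PSVR2 Sense controller):

```swift
import GameController

// Read PSVR2 element state the recommended way, via GCControllerLiveInput.
func readLiveInput(from controller: GCController) {
    let input = controller.input

    // The clickable thumbstick appears in both the dpads and buttons
    // collections; read direction values from the dpad element.
    if let stick = input.dpads[GCInputThumbstick] {
        print("thumbstick x=\(stick.xAxis.value), y=\(stick.yAxis.value)")
    }

    if let trigger = input.buttons[GCInputTrigger] {
        // Press state and analog pull both come from pressedInput.
        print("trigger pressed=\(trigger.pressedInput.isPressed), value=\(trigger.pressedInput.value)")
        // Capacitive contact, for buttons that include a sensor.
        if let touch = trigger.touchedInput {
            print("trigger touched=\(touch.isTouched)")
        }
        // Note: trigger.forceInput is nil on PSVR2 buttons, per the above.
    }
}
```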

Now to your specific points.

PSVR2 R2/L2 buttons, a.k.a. triggers, have force input analogue values. These can only be accessed on GCPhysicalInputProfile

These should be accessible from a GCButtonElement retrieved from GCControllerLiveInput as explained above.

PSVR2 thumbstick direction values are read through “axes” on GCPhysicalInputProfile, but only “dpads” on GCControllerLiveInput

GCPhysicalInputProfile supports element "nesting". A GCDeviceDirectionPad has two child axis elements, and four child button elements, in addition to the top-level dpad element. All seven of these elements are merged into the dpads, axes, and buttons collections on GCPhysicalInputProfile.

GCControllerLiveInput elements do not have child elements.
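A sketch of the nesting just described, using the classic GCPhysicalInputProfile types (the function name is mine; GCInputRightThumbstick is the framework's standard alias for the right stick):

```swift
import GameController

// The thumbstick dpad, its two child axes, and its four child direction
// buttons are all merged into the profile's flat collections.
func showDpadNesting(_ profile: GCPhysicalInputProfile) {
    if let stick = profile.dpads[GCInputRightThumbstick] {
        // Child axes, reachable through the dpad element…
        print("x=\(stick.xAxis.value), y=\(stick.yAxis.value)")
        // …and child direction buttons, also through the dpad element.
        print("up=\(stick.up.value), down=\(stick.down.value)")
    }
    // The same child axes are merged into the top-level axes collection.
    for (name, axis) in profile.axes {
        print("axis \(name): \(axis.value)")
    }
}
```

This merging is why the original post sees thumbstick direction values under "axes" on GCPhysicalInputProfile but only under "dpads" on GCControllerLiveInput, whose elements have no children.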

on GCPhysicalInputProfile, touch events are fired when the button is also pressed, but not for only touches.

This is not intentional. But please use GCControllerLiveInput to access PSVR2 input state.

-- Justin
