Hello! I'm currently watching the "Envision the Future: Build great apps for visionOS" webinar, and lots of questions are coming up. Thanks for offering this online!
For those of us with "VR legs", how can we go about setting up custom hand/finger gestures that would let us teleport and navigate within our fully immersive environments? Both smooth and snap turn/teleport options would be great, thanks! This is adjacent to my previous question about setting up a PS5 controller to do something similar. Think Half-Life: Alyx as the gold standard for VR navigation.
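To make the question concrete, here is a rough sketch of the snap-turn half of what I mean (my own naming throughout; it assumes all immersive content is grouped under a single worldRoot entity and that a controller exposing the extended gamepad profile is connected):
import RealityKit
import GameController

// Sketch only: snap-turn by rotating the world root in 45° steps when the
// right thumbstick is flicked past a threshold. `worldRoot` is assumed to be
// the parent of all immersive content, since visionOS has no movable camera.
final class SnapTurnController {
    private let worldRoot: Entity
    private var armed = true   // prevents repeated turns while the stick is held

    init(worldRoot: Entity, gamepad: GCExtendedGamepad) {
        self.worldRoot = worldRoot
        gamepad.rightThumbstick.xAxis.valueChangedHandler = { [weak self] _, x in
            guard let self else { return }
            if abs(x) > 0.8, self.armed {
                self.armed = false
                let step = simd_quatf(angle: (x > 0 ? -1 : 1) * .pi / 4, axis: [0, 1, 0])
                self.worldRoot.transform.rotation = step * self.worldRoot.transform.rotation
            } else if abs(x) < 0.2 {
                self.armed = true   // re-arm once the stick returns to center
            }
        }
    }
}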
Game Controller
Support hardware game controllers in your game using the Game Controller framework.
Posts under Game Controller tag
23 Posts
I recently completed a freelance project where I was tasked with creating room-scale environments that could be used as AR elements. As a bonus, I suggested that these could be done to scale and repurposed for eventual viewing in Vision Pro. To illustrate, I was able to quickly create a simple Immersive project in Xcode, add the USDZ file (authored in Maya, with baked lighting from Arnold) to Reality Composer Pro, and compile for quick sending to the headset. I would then do screen recordings inside the immersive space, which the client loved to see. However, I am unable to walk around due to the boundary limitations.
My next obvious thought is: how can I set up the "player" camera so that I can control it with a PS5 controller inside AVP? In addition to Maya, I'm an Unreal Engine artist and have been waiting patiently to get any projects compiled for AVP. With the 5.5 release, I was able to get a VR Template test over to AVP, where I have rudimentary navigation control via the PS5 controller.
Ideally, I'd also love to learn how to set this up natively, so I can take simple USDZ scenes created in Maya, import them into RCP, set up a simple camera controller, and then use this to navigate my VR immersive spaces on Vision Pro. How can we go about doing this?
Part two of this question/suggestion is: how would I go about controlling a rigged, animated character in AR/passthrough mode in a similar fashion? Thanks!
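To illustrate the navigation half of the question, here is a minimal sketch of what I am imagining natively (names like attachNavigation and worldRoot are mine; since visionOS exposes no movable player camera in an immersive space, the idea is to translate the world root inversely to the stick input):
import RealityKit
import GameController

// Sketch: emulate "player" movement by moving the world root around the viewer.
// Assumes all scene content is parented under `worldRoot` and a PS5 controller
// (extended gamepad profile) is already connected.
func attachNavigation(to worldRoot: Entity, using gamepad: GCExtendedGamepad) {
    gamepad.leftThumbstick.valueChangedHandler = { _, x, y in
        // Pushing the stick forward (+y) should carry the viewer forward,
        // which means pulling the world toward the viewer.
        let speed: Float = 0.05   // meters per input change; tune to taste
        worldRoot.position += SIMD3<Float>(-x, 0, y) * speed
    }
    // For smooth continuous motion you would poll these axis values once per
    // frame (e.g. in a RealityKit System) instead of reacting to change events.
}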
My phone glitches a lot more: the flashlight won't turn on half the time, even when it says it's on. The Settings page likes to freeze randomly and will restart my phone as well, and my controller has more issues connecting on certain games.
Hi,
I'm building a windowed game on visionOS that requires a gamepad, and I'm using a PS5 controller for this.
However, I found a few problems:
When the user looks at the window close button and presses X on the gamepad, the window closes. This can happen accidentally while the user is playing.
When the PS button (the home button) is pressed, I want my app to handle it, e.g., take the user to the title screen. But visionOS always captures this input and opens the Home Screen (App Library).
How do I avoid these two issues? Or in general, how do I make gamepad input available only to my app and not to the visionOS system?
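For the close-button problem, the closest thing I have found so far is the scene-level modifier below, which is supposed to route gamepad input to the app instead of the system on visionOS 2 and later (sketch; GameView is a placeholder for my root view, and as far as I can tell the PS/home button remains reserved by the system either way):
// Sketch: ask visionOS to deliver gamepad events to this scene rather than
// letting the system interpret them (e.g. pressing X while looking at the
// window close button). The PS/home button appears to stay system-reserved.
WindowGroup {
    GameView()   // hypothetical root view of the game window
        .handlesGameControllerEvents(matching: .gamepad)
}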
The specification states that the timestamp property is a DOMHighResTimeStamp, which always represents a value in milliseconds. However, in my testing (on iOS, as I don't have a Mac), the timestamp is given in seconds. This could break compatibility if an app/library is not tested in Safari.
Worse, the deviation seems to be undocumented: every compatibility chart states full compatibility and defines the property as milliseconds.
Any workaround would also be up to one frame off, so a stutter can affect input accuracy.
I know this might seem small, but it could mean that an app or game is unplayable in Safari if it relies on this for input.
I am unable to get the visionOS 2.0 simulator to receive the GCControllerDidConnect notification and thus am unable to set up support for a gamepad. However, it works in visionOS 1.2.
For visionOS 2.0 I've tried adding:
.handlesGameControllerEvents(matching: .gamepad) attribute to the view
Supports Controller User Interaction to Info.plist
Supported game controller types -> Extended Gamepad to Info.plist
...but the notification still doesn't fire. It does when the code is run in the visionOS 1.2 simulator; both simulators have the Send Game Controller To Device option enabled.
Here is the example code. It's based on the Xcode project template. The only files updated were ImmersiveView.swift and Info.plist, as detailed above:
import SwiftUI
import GameController
import RealityKit
import RealityKitContent

struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            // Add the initial RealityKit content
            if let immersiveContentEntity = try? await Entity(named: "Immersive", in: realityKitContentBundle) {
                content.add(immersiveContentEntity)
            }
            NotificationCenter.default.addObserver(
                forName: NSNotification.Name.GCControllerDidConnect,
                object: nil, queue: nil
            ) { _ in
                print("Handling GCControllerDidConnect notification")
            }
        }
        .modify {
            if #available(visionOS 2.0, *) {
                $0.handlesGameControllerEvents(matching: .gamepad)
            } else {
                $0
            }
        }
    }
}

extension View {
    func modify<T: View>(@ViewBuilder _ modifier: (Self) -> T) -> some View {
        return modifier(self)
    }
}
I have a simple example of a motion-matching (MxM for Unity) character controller that uses Unity's input system and gamepad support. In the editor, the scene and inputs work as expected. When I build to the headset, the app stops at an initialization step where my game controller should kick in. The app doesn't crash, but my character is frozen in A-pose and doesn't respond to input.
I'm wondering if the error I'm seeing in the logs is what's causing this, and if so, how do I fix it?
error 15:56:11.724200-0700 PolySpatialProjectTemplate NSBundle file:///System/Library/Frameworks/GameController.framework/ principal class is nil because all fallbacks have failed
I'm using Xcode 16 beta 6
Unity 6000.0.17f1
visionOS 2.0 beta 9
The question is: we did not add the GameController framework, so we do not know why it is loaded at startup.
I am checking the launch time of our app using Instruments -> App Launch, and I was confused to find the GameController framework loaded.
I checked the project: in the plist file, there is no GCSupportedGameControllers or GCSupportsControllerUserInteraction key configured.
What else causes the GameController framework to load?
struct GameSystem: System {
    static let rootQuery = EntityQuery(where: .has(GameMoveComponent.self))

    init(scene: RealityKit.Scene) { }

    func update(context: SceneUpdateContext) {
        let root = context.scene.performQuery(Self.rootQuery)
        for entity in root {
            let game = entity.components[GameMoveComponent.self]!
            if let xMove = game.game.gc?.extendedGamepad?.dpad.xAxis.value,
               let yMove = game.game.gc?.extendedGamepad?.dpad.yAxis.value {
                print("x:\(xMove),y:\(yMove)")
                let x = entity.transform.translation.x + xMove * 0.01
                let y = entity.transform.translation.z - yMove * 0.01
                entity.transform.translation = [x, entity.transform.translation.y, y]
            }
        }
    }
}
I want to use the game controller's direction keys to control continuous movement of an Entity in visionOS. When I added a query for handling button presses in my ECS System, I found that the update method was not called at 30 frames per second; instead, it executes once when I press or release the key.
What is the reason for this?
I want to keep moving while holding down the controller button; is there a better solution? I'd like the movement to be smooth, without stutter.
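For reference, the alternative I am considering is to poll the pad state once per frame inside the System's update, rather than reacting to value-changed events; a rough sketch, mirroring the component layout above (PolledMoveSystem is my own name):
import RealityKit

// Sketch: read the dpad's current value every frame so a held button keeps
// moving the entity smoothly; valueChangedHandler alone only fires on changes.
struct PolledMoveSystem: System {
    static let query = EntityQuery(where: .has(GameMoveComponent.self))

    init(scene: RealityKit.Scene) { }

    func update(context: SceneUpdateContext) {
        for entity in context.scene.performQuery(Self.query) {
            guard let pad = entity.components[GameMoveComponent.self]?.game.gc?.extendedGamepad
            else { continue }
            let dt = Float(context.deltaTime)   // keeps speed frame-rate independent
            let x = pad.dpad.xAxis.value        // current state, no event required
            let y = pad.dpad.yAxis.value
            entity.transform.translation += SIMD3<Float>(x, 0, -y) * dt * 0.5
        }
    }
}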
Hello,
I have an iPad app that users are running on their M1 / M2 MacBooks thanks to the "Designed for iPad" feature.
Some of them reported to me that pressed keyboard keys were not recognized.
According to my source code, it seems that my custom views (which are set as firstResponder) do not receive pressesBegan events.
While I could not reproduce this specific problem on my M1 MacBook, I found a strange problem that may be related.
My view also supports interaction with game controllers.
I found that the emulated controller (which uses the keyboard when the controller-emulation feature of the Designed for iPad app is set to On) has some problems.
The valueChangedHandler of the GCExtendedGamepad from the GCController is only fired if the app is compiled with the "Requires full screen" option set to On.
If Requires full screen is unchecked in Xcode, the emulated controller is still present in the list of game controllers, but no button callbacks are triggered.
Note that a real game controller connected via USB works correctly no matter how Requires full screen is set.
Could there be a focus bug where the keyboard does not send events when the app is not a full-screen app (resizable on Mac, multitasking on iPad)?
Or is there something I can change to avoid this problem?
I'm lost.
Hi,
I have a small app in the Apple TV simulator which is supposed to use the Siri Remote's swipe gesture.
It works perfectly on the real device, but in the simulator the swipe gesture is not recognized in the app, although it works on the Home Screen of the simulator using the simulated Siri Remote app.
Here is the code which sets up the xAxis and yAxis value-changed handlers:
#if targetEnvironment(simulator)
// Simulator: the simulated Siri Remote reports itself with vendorName "Gamepad"
let siriRemote = GCController.controllers().filter { controller in
    controller.vendorName == "Gamepad"
}
if let sController = siriRemote.first {
    let inputProfile = sController.physicalInputProfile
    self.dPad = inputProfile.dpads["Direction Pad"]
    self.dPad?.xAxis.valueChangedHandler = self.directionPadXAxisValueChangeHandler
    self.dPad?.yAxis.valueChangedHandler = self.directionPadYAxisValueChangeHandler
}
#else
// Device: use the micro gamepad profile of the connected controller
if let microProfileController = notification.object as? GCController,
   let microGamePad = microProfileController.microGamepad {
    self.microGamePad = microGamePad
    self.dPad = microGamePad.dpad
    self.dPad?.xAxis.valueChangedHandler = self.directionPadXAxisValueChangeHandler
    self.dPad?.yAxis.valueChangedHandler = self.directionPadYAxisValueChangeHandler
}
#endif
Any help is greatly appreciated.
Cheers,
Frank
I am having problems getting button input from an Xbox game controller.
I have the visionOS 2 beta on my Apple Vision Pro, and I am trying to use an Xbox game controller with a RealityView, following the instructions from the WWDC session "Explore game input in visionOS".
The game controller connection notification picks up the controller and finds GCInputButtonA, and I am setting closures for touchedChangedHandler, pressedChangedHandler, and valueChangedHandler that just print an os_log statement.
buttonA.valueChangedHandler = { button, value, pressed in
    os_log("Got valueChangedHandler")
}
At the end of the RealityView, I have the modifier:
RealityView { content in
    // stuff
}
.handlesGameControllerEvents(matching: .gamepad)
But I never see the log message appear in the console when I press the 'A' button (or any other button).
Any ideas what I might be doing wrong?
The Xbox controller is pretty old; Settings reports it as version 9.0.3.
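In case I have simply mis-wired something, this is the condensed shape of the whole path I believe should work (sketch; setupControllerHandling and wire are my own helper names):
import GameController
import os

// Sketch: wire the A-button handler both for controllers that connect later
// and for any controller already connected before the observer was added.
func setupControllerHandling() {
    func wire(_ controller: GCController) {
        guard let gamepad = controller.extendedGamepad else { return }
        gamepad.buttonA.pressedChangedHandler = { _, _, pressed in
            os_log("Button A pressed: %d", pressed ? 1 : 0)
        }
    }
    NotificationCenter.default.addObserver(
        forName: .GCControllerDidConnect, object: nil, queue: .main
    ) { note in
        if let controller = note.object as? GCController { wire(controller) }
    }
    GCController.controllers().forEach(wire)
}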
I would like to code some RealityViews to run on my Mac first (and then incorporate them in a visionOS project) so that my code/test loop is faster, but I have not been able to find a simple example that supports Mac.
Is it possible to have volumes on a Mac? Is there support for using a game controller to move around the RealityView, like in the visionOS simulator?
Hi there,
I'm reaching out to inquire about compatibility restrictions regarding the 8BitDo Ultimate 2.4G Wireless Controller on macOS systems, particularly its functionality with third-party applications such as Construct 3, a browser-based game development software.
As an indie game developer, I recently acquired the 8BitDo Ultimate controller for its touted compatibility and functionality. While I've primarily used it in Bluetooth mode on my Mac without encountering any significant issues, I've stumbled upon a peculiar problem when using the controller while developing projects in Construct 3, specifically within the preview mode of the game engine.
When testing my projects using Construct 3's preview feature, none of my controller inputs are detected. This is particularly confounding, as the controller operates flawlessly when I export a proper macOS build of the game from the engine and play it externally.
To further investigate, I experimented with Nintendo Switch Joy-Cons, which surprisingly worked seamlessly without encountering any issues during testing within Construct 3's preview mode.
Upon reaching out to both Construct 3 and 8BitDo customer support, I was informed that the compatibility restriction with third-party controllers on macOS is due to limitations imposed by Apple, which are beyond their control. Consequently, I sought assistance from Apple Developer Program Support, who directed me to contact the Code-level Technical Support team or inquire on the developer forum.
The lack of compatibility with crucial development tools like Construct 3 significantly hampers my workflow as an indie developer, and I'm eager to find a resolution to this issue.
Thank you for your time, and I'm hopeful that someone within the community might be able to provide guidance on this matter.
Hi all, I have a MacBook Air (2022) with the M2 chip inside. I am trying to use a wired Xbox 360 controller to play games on Steam. However, whenever I plug it in, the light shines green for a little bit and then turns off.
I checked my USB device tree in the system report, and it says that a controller is plugged in. The controller doesn't work in games either; are there any fixes?
I am also on macOS Monterey version 12.4.
In my code, I use GCMouse to detect mouse events, and it works fine on iOS 16. But after I upgraded my iPhone to iOS 17.3.1, GCMouse no longer works, although it works fine on iPadOS 17.3.1. How can I detect mouse events on iOS 17.3.1?
I'm searching for simple SwiftUI code that can move a cube in immersive space when I click the X button on a Sony PlayStation game controller.
I did load a USD entity and connect the game controller, but when it comes to getting a simple click into the RealityView to change the position or rotate the entity, the input is never accepted in code, although I can control it with finger or hand gestures. And yes, I have disabled "Send Game Controller to Device" in the simulator to make sure the controller does not affect the system, only the game.
Here is a sample:
import SwiftUI
import RealityKit
import GameController

struct ImmersiveView: View {
    @State var entity = ModelEntity()
    @State var z: Float = -4.0

    var body: some View {
        RealityView { content in
            if let cube = try? await ModelEntity(named: "am.usdz") {
                entity = cube
                entity.generateCollisionShapes(recursive: true)
                entity.name = "Cube"
                entity.position = [0, 0, -4]
                entity.components.set(InputTargetComponent(allowedInputTypes: .indirect))
                entity.components.set(HoverEffectComponent())
                content.add(entity)
            }
        }
        .gesture(
            SpatialTapGesture()
                .targetedToEntity(entity)
                .onEnded { value in
                    print(value)
                    z -= 1.0
                    entity.position = [0, 0, z]
                }
        )
        // What to do here to make the game controller's X button act as the gesture does?
    }
}
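What I imagine the missing piece looks like is something along these lines, replacing the comment near the end of the view body (sketch only; on the extended gamepad profile the PS5 cross (X) button surfaces as buttonA):
// Sketch: observe controller connection and drive the same movement from the
// cross (X) button that the SpatialTapGesture performs above.
.onAppear {
    NotificationCenter.default.addObserver(
        forName: .GCControllerDidConnect, object: nil, queue: .main
    ) { note in
        guard let pad = (note.object as? GCController)?.extendedGamepad else { return }
        pad.buttonA.pressedChangedHandler = { _, _, pressed in
            guard pressed else { return }
            z -= 1.0                      // mirror the tap-gesture action
            entity.position = [0, 0, z]
        }
    }
}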
Hi guys,
Since SwiftUI does not completely support the tvOS remote and swipe/pan gestures (targeting tvOS 16 and higher, using Xcode 15.1 and its tvOS SDK), I have implemented custom remote-control handling using the GameController framework and the GCController class to detect the Siri Remote (1st and 2nd generation), the iOS Remote (the tvOS remote controller on iPhone), and Nimbus+ and PS game controllers.
For a video streaming app, I specifically needed to handle arrows on the old and new Siri Remote the same way: an arrow press (not just a touch) switches the TV channel.
In the standard SwiftUI remote-control handling, arrow presses are triggered on the old remote just by touching the edges of the touchpad. We use up/down arrow presses to switch TV channels. Our testers reported that they too often changed the channel accidentally when picking up the remote or simply laying a finger on the pad in preparation to use the remote for some action. That's why we needed to override the default behavior on the old remote and detect an arrow press only when the user touches the edge and "clicks" the touchpad.
Simultaneously, we detect input from all controllers because, for example, when you have a Nimbus game controller connected, it becomes the current controller (it is set in GCController.current), while we still need to handle the Siri Remote, so the user doesn't have to turn off the game controller and can just pick up the Siri Remote and use it right away.
But there are still some problems. For example, if I open the tvOS remote-control app on an iPhone, it takes priority over the Siri Remote. Although I'm able to hold the iPhone in one hand and the Siri Remote in the other and use them simultaneously, pressing arrow buttons on the Siri Remote triggers an event that is detected as if it came from the iOS Remote. This occurs even if the user locks the iPhone while the remote app is active; the user has to unlock the iPhone and minimize the iOS Remote to make the Siri Remote the fully active controller again, with arrows working as expected.
Is there a way of detecting from which remote the button press event came?
GCController.current reports that the iOS Remote is the current controller. The GCControllerDidConnect and GCControllerDidBecomeCurrent notifications do not help either: when I use the iOS Remote, it becomes the current controller, but when I start using the Siri Remote and press a button, it does not become the current controller. GCController.physicalInputProfile does not help either.
It is as if the iOS Remote, while connected to tvOS, has higher priority than the Siri Remote. Why don't I receive a current-controller change when pressing a button on the Siri Remote?
When pressing a button on the Siri Remote and receiving the event, it comes from a handler with GCController.microGamepad.elements.count == 10 instead of 17, so although I'm pressing a button on the new Siri Remote, I'm getting events as if I pressed a button on the old remote (or the iOS Remote, which is handled as an old Siri Remote).
Even the button.controller instance is wrong and holds a reference to the iOS Remote instead of the correct new Siri Remote.
I tried looking at button.aliases to see if I could find "Cardinal Direction", which would hint at an arrow press on the new controller, but it does not contain this alias. I'm trying to find some hack to detect the currently used Siri Remote, but without any success.
It looks like somewhere in its core, GCController converts button presses into more generic events, making the presses anonymous (losing hardware info). But clearly the low-level code must see which controller the button press came from.
Any idea if there is a way of detecting from which remote controller the button press came?
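For reference, this is the shape of the per-controller registration I have been experimenting with: capturing the GCController instance in each handler closure so the registration point is at least known, even though the press still appears to be routed through the iOS Remote's profile (sketch; registerHandlers is my own name):
import GameController

// Sketch: register a handler per connected controller and capture that
// controller in the closure, since button.controller proved unreliable here.
func registerHandlers() {
    for controller in GCController.controllers() {
        guard let pad = controller.microGamepad else { continue }
        pad.buttonA.pressedChangedHandler = { [weak controller] _, _, pressed in
            guard pressed, let controller else { return }
            // vendorName / productCategory are the only identity hints I've found.
            print("Press via: \(controller.vendorName ?? "?") (\(controller.productCategory))")
        }
    }
}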
DS4macOS compiles (partially) and runs, but it has 30+ yellow warnings and doesn't show the settings
Hi Apple and Swift friends.
I'm not really a fluent Swift programmer (or fluent in any language), just knowledgeable; I have only a vague idea of the logic of what's going on. This is a gamepad controller remapper similar to DS4Windows on the PC and DSX on Steam.
On macOS Sonoma, it successfully compiled and connected to the amazing Sony PlayStation DualSense, but because there are 33 yellow warnings, the settings don't show. It says something about something being outdated and needing to be replaced by a newer framework (Network). It also says it can't use self:
It should look like this:
What could be the syntax changes that won't produce the 30+ yellow warnings?
The file can be had here:
https://github.com/marcowindt/ds4macos
God bless.
I have my Xbox controller plugged into my MacBook, but the controller won't stay on (no batteries are needed for this controller, and the cable I am using works). Also, in System Settings, when I scroll to Xbox Controllers, it says no devices found. How do I fix this?