Game Controller


Support hardware game controllers in your game using the Game Controller framework.

Posts under Game Controller tag

25 Posts


Game Controller Input Limitations in visionOS Volumetric Windows - Need Clarification
Hello Apple Developer Community,

I'm developing a game for visionOS and have encountered significant limitations with game controller input when using volumetric windows (WindowGroup with .volumetric style). I'd appreciate clarification on whether this is expected behavior and any guidance on best practices.

🧩 Issue Summary
When using a DualSense controller with a volumetric window in visionOS, only a subset of controller inputs is available to the app. The remaining inputs appear to be reserved by the system for UI navigation.

✅ Working Inputs (Volumetric Window)
- D-Pad (all directions)
- L3 (left thumbstick button click)
- R3 (right thumbstick button click)
- Menu button
- Options button

❌ Not Working Inputs (Volumetric Window)
- Left thumbstick analog movement (used for UI scrolling instead)
- Right thumbstick analog movement (used for UI scrolling instead)
- Face buttons (Cross, Circle, Square, Triangle / A, B, X, Y)
- Shoulder buttons (L1, R1)
- Triggers (L2, R2)

Key observation: When moving the left thumbstick in a volumetric window, the window's UI scrolls vertically instead of sending input to my app's GameController handlers. Similarly, face buttons seem to be reserved for system UI interactions.

⚙️ Implementation Details
I'm using the standard GameController framework:
- Connect to the controller via GCController.controllers()
- Access the extendedGamepad profile
- Set up valueChangedHandler and pressedChangedHandler for all inputs
- Handlers confirmed registered via logging
- Working inputs (D-Pad, L3, R3) trigger immediately and consistently
- Non-working inputs (thumbsticks, face buttons) never trigger

(A sketch of this setup appears after this post.)

🧠 Critical Finding: ImmersiveSpace Works Perfectly
When testing the exact same code in an ImmersiveSpace (.mixed immersion style), all controller inputs work perfectly:
- ✅ Both thumbsticks provide full analog input
- ✅ All face buttons trigger their handlers
- ✅ All shoulder buttons and triggers work correctly
- ✅ 100% success rate with no intermittent issues

This suggests the issue isn't with my code, but rather with how visionOS handles controller input differently between volumetric windows and ImmersiveSpace.

🧪 Test Environment
I created a minimal test project (Controller-Playground) to isolate the issue:
- A simple ControllerTester class that registers all GameController handlers
- A visual UI showing real-time input state
- No game logic, RealityKit physics, or other complexity

Results:
- In a volumetric window: only D-Pad, L3, R3, Menu, and Options work
- In an ImmersiveSpace: all inputs work perfectly

This confirms the limitation exists at the visionOS platform level, not in app code.

🧰 Attempted Workarounds
I tried the following without success:
- Setting GCSupportsControllerUserInteraction = false in Info.plist
- Setting UIRequiresFullScreen = true
- Changing window styles (.plain, .volumetric)
- Polling vs. handler-based input approaches
- Various threading models (MainActor, separate thread)

Result: the only way to enable full controller support is to switch to ImmersiveSpace.

❓ Questions for Apple
1. Is this input reservation behavior in volumetric windows intended and documented?
2. Are game controllers expected to have limited functionality in volumetric windows while full functionality is reserved for ImmersiveSpace?
3. Is there a way to request full controller input access in a volumetric window, or is ImmersiveSpace the only option for complete controller support?
4. Where can I find official documentation about controller input differences between window types?
5. Are there any APIs or configuration options to disable system controller shortcuts in volumetric windows?

🎯 Impact
This limitation has a significant effect on game design and architecture:
- Volumetric windows offer a multitasking-friendly, less immersive experience
- ImmersiveSpace provides full controller support but may be more immersive than some games require
- Games that only need basic D-Pad and button input can work fine in volumetric windows
- Games requiring analog sticks or face buttons must currently use ImmersiveSpace

It would be very helpful if Apple could clarify or reference existing documentation regarding controller input handling in different visionOS window types. If such documentation doesn't exist yet, it might be valuable to include this information in future developer guides or best-practice documents.

🕹 Current Workaround
For now, I'm using:
- D-Pad for character movement (digital 8-direction)
- R3 (right stick click) as a substitute for the "X" button

This setup allows the game to function within a volumetric window, though full controller support still requires ImmersiveSpace.

📄 Request
If this is expected behavior, I may have simply missed the relevant documentation — could you please point me to any existing resources that explain this design? If there isn't one yet, it would be great if future visionOS documentation could:
- Clearly outline controller input behavior across window types
- Provide guidance on when to use volumetric windows vs. ImmersiveSpace for games
- Consider adding an API option to request full controller access when appropriate

If this is not expected behavior, I'm happy to file a detailed bug report with sample code.

💻 System Information
- visionOS: latest Simulator
- Xcode: latest version
- Controller: Sony DualSense
- Framework: GameController (standard extendedGamepad profile)
- Test project: minimal reproducible example available

Thank you for any clarification or guidance you can provide. This information would be valuable for many developers working on visionOS games.
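A minimal sketch of the handler setup described under Implementation Details; the class name and the particular elements wired up are illustrative, not taken from the actual Controller-Playground project:

import GameController

// Sketch: register handlers the way the post describes. In a volumetric
// window, reportedly only the D-Pad/L3/R3/Menu/Options handlers fire;
// in an ImmersiveSpace all of them do.
final class ControllerTester {
    func attach(to controller: GCController) {
        guard let gamepad = controller.extendedGamepad else { return }

        gamepad.dpad.valueChangedHandler = { _, x, y in
            print("D-Pad: \(x), \(y)")             // works in both contexts
        }
        gamepad.leftThumbstick.valueChangedHandler = { _, x, y in
            print("Left stick: \(x), \(y)")        // reportedly silent in a volumetric window
        }
        gamepad.buttonA.pressedChangedHandler = { _, _, pressed in
            print("Button A pressed: \(pressed)")  // reportedly silent in a volumetric window
        }
    }
}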
1 reply · 0 boosts · 139 views · 22h
PSVR2 controller button quirks
I have an open Feedback conversation with Apple on this topic, but I am curious if others have run into this, or want to try out my sample code in their setup.

There are two APIs for reading controller buttons, axes, and D-pads: GCPhysicalInputProfile and GCControllerLiveInput. There are inconsistencies in behaviour between the two of them. Apple recommends we use GCControllerLiveInput; however, there are some capabilities on these controllers that are only accessible through GCPhysicalInputProfile, as I'll discuss below.

- PSVR2 R2/L2 buttons, a.k.a. triggers, have force-input analogue values. These can only be accessed on GCPhysicalInputProfile.
- PSVR2 thumbstick direction values are read through "axes" on GCPhysicalInputProfile, but only "dpads" on GCControllerLiveInput.
- On both GCPhysicalInputProfile and GCControllerLiveInput, all pressed events of all buttons are fired properly using generic aliases (Trigger, Grip, Menu, Right Thumbstick, Left Thumbstick, Right Button A & B (Circle & Cross), Left Button A & B (Triangle and Square)). Apple reserves the system button as the equivalent of a home button for the OS.
- On GCPhysicalInputProfile, touch events are fired when the button is also pressed, but not for touches alone.
- On GCControllerLiveInput, touch events only work for the following buttons: Left Thumbstick, Right Thumbstick, Right Button A (Circle), and Right Button B (Cross). But the Right Button B touch event isn't labelled correctly; it fires as the Right Button A event.

I observed this inside ALVR, which uses a polling-based approach to event processing: https://github.com/alvr-org/alvr-visionos/blob/17b5968f9d894944b53e97134b39dfce0993302a/ALVRClient/WorldTracker.swift#L301

To see this in a very simple app, I used the Apple example TrackingAccessories application: https://developer.apple.com/documentation/ARKit/tracking-accessories-in-volumetric-windows

I've attached the code that replaces the AccessoryTrackingModel class. I added code that prints out what is touched/pressed; see the trackAllConnectedSpatialControllers method: https://github.com/svrc/TrackingAccessories
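A minimal sketch, assuming a connected controller, of the GCPhysicalInputProfile side of the comparison described above; the GCInputRightTrigger lookup is illustrative and may not match the PSVR2's actual element names:

import GameController

// Sketch: dump what GCPhysicalInputProfile exposes for the first
// connected controller. (GCControllerLiveInput, the newer API Apple
// recommends, surfaces elements differently, as the post describes;
// its enumeration is left out here.)
func dumpPhysicalInputProfile() {
    guard let controller = GCController.controllers().first else { return }
    let profile = controller.physicalInputProfile

    print("axes:", profile.axes.keys.sorted())      // PSVR2 stick axes / trigger force appear here
    print("buttons:", profile.buttons.keys.sorted())
    print("dpads:", profile.dpads.keys.sorted())

    // Read an analogue trigger value by its generic alias, if present.
    if let r2 = profile.buttons[GCInputRightTrigger] {
        print("R2 value: \(r2.value) pressed: \(r2.isPressed) touched: \(r2.isTouched)")
    }
}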
3 replies · 0 boosts · 473 views · 1w
Request: Option to Disable PSVR2 Sense Controller Low-Power Mode on visionOS (ARKit + Vision Pro Development)
Hi everyone,

We're developing a Unity project for Apple Vision Pro that connects PSVR2 Sense controllers for advanced interaction and input. We've encountered a major limitation: when the controller is not held close to the designated hand (e.g., resting on a table or held by the non-designated hand), the Sense controller enters a low-power or reduced-update mode. This results in noticeably reduced tracking update frequency and responsiveness until the controller is held again.

For certain use cases, this behavior is undesirable. In our case, it prevents continuous real-time tracking of the controller even when it's stationary or being tracked externally.

Request: Please consider exposing an API flag or developer option in ARKit to disable, or optionally delay, the low-power mode when the app requires full-rate updates regardless of proximity or hand-pose detection.
2 replies · 0 boosts · 125 views · 1w
[Bug] iPadOS 26: 4-Finger Fast Tap/Swipe Gesture Not Detected (Multitouch Issue)
Hi everyone,

I'm experiencing an issue with iPadOS 26 regarding multi-touch gesture detection. When performing a quick four-finger gesture (tap or swipe), the system often fails to recognize the input. This especially affects apps built around fast multi-touch input, such as rhythm games with difficult levels.

Steps to Reproduce:
1. Place four fingers on the screen.
2. Perform a quick tap or a quick horizontal swipe (like the one used to switch apps).
3. Observe whether the gesture is ignored or detected inconsistently.

Expected Behavior: 4-finger multitouch gestures should be recognized regardless of gesture speed, just like in previous iPadOS versions.

Actual Behavior: Gestures fail to be detected when executed quickly; the same gestures performed more slowly still work. In rhythm games this causes missed notes.

You can check out my posts on Twitter/X and Facebook:
https://x.com/kokona_fwa/status/1978131164104728949?s=61
https://m.facebook.com/groups/idipad/permalink/24438964899058806/?
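A minimal sketch of one way to observe the in-app side of this, assuming a plain UIKit view controller; the class and logging are illustrative:

import UIKit

// Sketch: log four-finger taps to check whether fast taps are delivered.
final class FourFingerTestViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let tap = UITapGestureRecognizer(target: self, action: #selector(fourFingerTap))
        tap.numberOfTouchesRequired = 4   // fire only for four simultaneous touches
        view.addGestureRecognizer(tap)
    }

    @objc private func fourFingerTap(_ recognizer: UITapGestureRecognizer) {
        print("Four-finger tap recognized at \(Date())")
    }
}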
1 reply · 0 boosts · 731 views · 2w
PSVR2 controllers don't report anything in snapshot
I typically read an extended gamepad capture() and get all the state. But PSVR2 controllers seem to report nothing, so the sticks and other buttons don't do anything in a built app. They do register as left/right controllers. This is on visionOS 26, Xcode 26, etc.

They work correctly in the main icon view, although they don't honor inverted vertical and horizontal scrolling. Both of the default scrolls just feel wrong: when I move left I want to scroll left, not right, and the same for up/down.
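A minimal sketch of the capture()-based polling referred to above, assuming the controller vends an extendedGamepad profile at all (if spatial controllers don't, that alone would explain the empty state):

import GameController

// Sketch: poll a snapshot of the extended gamepad state.
func pollGamepad() {
    guard let gamepad = GCController.controllers().first?.extendedGamepad else {
        print("No extendedGamepad profile; spatial controllers may not vend one")
        return
    }
    let snapshot = gamepad.capture()   // immutable copy of the current state
    print("left stick:", snapshot.leftThumbstick.xAxis.value, snapshot.leftThumbstick.yAxis.value)
    print("buttonA pressed:", snapshot.buttonA.isPressed)
}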
5 replies · 1 boost · 614 views · Sep ’25
virtual game controller + SwiftUI warning
Hi, I've just moved my SpriteKit-based game from UIView to SwiftUI + SpriteView and I'm getting this message:

Adding 'GCControllerView' as a subview of UIHostingController.view is not supported and may result in a broken view hierarchy. Add your view above UIHostingController.view in a common superview or insert it into your SwiftUI content in a UIViewRepresentable instead.

Here's how I'm doing this:

struct ContentView: View {
    @State var alreadyStarted = false
    let initialScene = GKScene(fileNamed: "StartScene")!.rootNode as! SKScene

    var body: some View {
        ZStack {
            SpriteView(scene: initialScene,
                       transition: .crossFade(withDuration: 1),
                       isPaused: false,
                       preferredFramesPerSecond: 60)
                .edgesIgnoringSafeArea(.all)
                .onAppear {
                    if !self.alreadyStarted {
                        self.alreadyStarted.toggle()
                        initialScene.scaleMode = .aspectFit
                    }
                }
            VirtualControllerView()
                .onAppear {
                    let virtualController = BTTSUtilities.shared.makeVirtualController()
                    BTTSSharedData.shared.virtualGameController = virtualController
                    BTTSSharedData.shared.virtualGameController?.connect()
                }
                .onDisappear {
                    BTTSSharedData.shared.virtualGameController?.disconnect()
                }
        }
    }
}

struct VirtualControllerView: UIViewRepresentable {
    func makeUIView(context: Context) -> UIView {
        let result = PassthroughView()
        return result
    }

    func updateUIView(_ uiView: UIView, context: Context) { }
}

class PassthroughView: UIView {
    override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        for subview in subviews.reversed() {
            let convertedPoint = convert(point, to: subview)
            if let hitView = subview.hitTest(convertedPoint, with: event) {
                return hitView
            }
        }
        return nil
    }
}
1 reply · 0 boosts · 253 views · Sep ’25
GCMouse causes onDisappear to not run after dismissImmersiveSpace?
I have created a simple app that enters and exits an immersive space. I have not changed the basic code that gets created when you start a new visionOS project. I have connected a Magic Mouse to the AVP using Bluetooth.

I have added a simple call to print(GCMouse.mice()), and I have also tried print(GCController.controllers()), when the RealityView is launched inside the body closure. If I do not have the GCMouse/GCController call, everything works fine.

However, with this perfect storm (Magic Mouse connected, enter immersive space, call GCMouse/GCController, exit immersive space), the onDisappear closure is never called, so I cannot reset my ToggleImmersiveSpace button out of the disabled state it enters during the transition. Once I power off and disconnect the mouse, the onDisappear closure is finally called.

I have attempted to profile the issue, and I see that right before onDisappear is called after the controllers are disconnected (which is long after dismissImmersiveSpace was called), there are two mentions of some deallocation/destruction of GameController objects:

-[GCMouse.cxx_destruct]
-[GCPhysicalInputProfile(Pooling) release]
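A minimal sketch of the repro described, assuming the stock visionOS template; the print call inside the RealityView closure is the only addition:

import SwiftUI
import RealityKit
import GameController

struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            // The offending call: merely touching the GCMouse API here
            // reportedly keeps onDisappear from firing later.
            print(GCMouse.mice())
        }
        .onDisappear {
            print("onDisappear")   // reportedly never printed while the mouse stays connected
        }
    }
}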
2 replies · 0 boosts · 99 views · Aug ’25
How To Move Camera with PS5 Game Controller
Hello again! So, I am trying my best to figure out how to get a simple, fully immersive scene that is navigable via the PS5 controller to work on Vision Pro. I am even using ChatGPT (both externally AND in Xcode when it makes fix recommendations); however, I seem to be moving around in circles, having to constantly send error messages back to ChatGPT, then try the new "confirmed to work" fixed code, only to have more errors. I have also explicitly let the AI know that I'm using all the current betas, only to learn there has been lots of stuff deprecated (of course!).

This is my initial query to ChatGPT: "How can I create a fully immersive VR experience for Apple Vision Pro, using a PS5 game controller to navigate the user via first person view through a 3d scene?" And then down the rabbit hole the past few days. Seriously, I thought AI was supposed to be helpful! LOL

Per Xcode logs, I'm seeing thumbstick movement, but simply cannot get the camera to move. Is this even possible?! Is it a camera defined in Reality Composer Pro? Or does it need to be defined outside? I'm trying to follow along with the AI, but it seems to go back and forth with different solutions. Here is the latest it's spitting out:

__

Thanks for this super clear log — and you're absolutely right: the gamepad input is working, CameraRoot is moving as expected… but your actual camera view is not changing. This strongly suggests the issue is that RealityKit is not using your custom camera (PerspectiveCameraComponent) for rendering the view.

🔍 What's happening

Even though CameraRoot is moving (with correct logs), RealityKit will default to its own internal camera unless:
1. You explicitly assign your camera entity to be the one used, or
2. Your camera is the only one and RealityKit picks it up automatically.

But in visionOS 2.0 / Xcode 26, there seems to be a bug or behavior change where RealityKit no longer auto-uses your custom PerspectiveCameraComponent in RealityView unless it's:
• properly set with content.camera, or
• part of a .usda/Reality Composer camera hierarchy that's already active.

Unfortunately:
🟥 RealityView.camera = .entity(cameraRoot)
❌ Is no longer allowed (we confirmed this breaks with a type error)

✅ The Reliable Workaround: Attach your camera to the Immersive scene itself

Since we're loading a .usda scene (Reality Composer Pro), here's the most reliable path forward:

✔️ Option A: Embed your camera into the Reality Composer Pro scene
1. Open your Reality Composer Pro scene
2. Add a Camera entity directly in your Immersive.usda scene
3. Name it something like "PlayerCamera"
4. Position it at [0, 1.6, 0]
5. Leave it selected as the default camera

Then in your Swift code:
• Find the camera entity with .findEntity(named:)
• Move that entity instead of creating a new one.

✔️ Option B: Force RealityKit to use your custom camera (via transform parenting)
Since direct camera assignment is unavailable, we can use a trick: move the whole RealityView scene via a parent entity with a camera.

But let's go with Option A, since you've already got a Reality Composer Pro scene.

__

Can any developer over there make a suggestion on how to move forward? I JUST want to create a sandbox template so I can start exploring 3d scenes created in Maya and sent over to the headset. I'll deal with animation next, but come on, why is this so difficult to get working? I'm not a programmer, but I have been trying to wrap my head around Xcode and SwiftUI. This needs to be much simpler. Or, you need to provide us creatives with better sample templates and non-programmer speak on how to set this up properly. Ideally, you HIRE us 3d professionals to work side by side with the programmers to help make these tools usable - especially Reality Composer Pro. Seriously, I am making a concerted effort to use the native tools, even though I would love to be porting Unreal Engine scenes over.

If anyone can help point me in the right direction, coming from a 3D Creator/Animator/Modeler perspective, I, and my fellow peers in the XR/AR/VR community, would greatly appreciate it. Thank you.
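For what it's worth: in a visionOS immersive space the wearer's head is effectively the render camera, so app code generally cannot reassign it the way the transcript suggests; the usual workaround is to translate the scene's root entity in the opposite direction. A minimal sketch under that assumption, with illustrative names (worldRoot, speed):

import RealityKit
import GameController

// Sketch: "move the player" by translating the world root the other way.
// worldRoot is assumed to be the entity all Reality Composer Pro content
// is parented to.
func updateMovement(worldRoot: Entity, gamepad: GCExtendedGamepad, deltaTime: Float) {
    let speed: Float = 2.0   // metres per second, illustrative
    let x = gamepad.leftThumbstick.xAxis.value
    let z = gamepad.leftThumbstick.yAxis.value
    // Moving the world by -input gives the feel of the player moving by +input.
    worldRoot.position -= SIMD3<Float>(x, 0, -z) * speed * deltaTime
}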
8 replies · 0 boosts · 702 views · Jul ’25
VisionOS Beta 3 spatial sculpting apple sample crash
Hello, since updating to beta 3 the sculpting sample app doesn't work; it crashes when run. It seems to be something in AnchorEntity or AccessoryAnchoringSource:

Referenced from: <00B81486-1A74-30A0-B75B-4B39E3AF57DF> /private/var/containers/Bundle/Application/3D2EBF59-19F0-4BF4-8567-6962AA36A2C6/delete.app/delete.debug.dylib
Expected in: <BAA9B221-78A1-3B99-AA2F-B8DFCD179FC7> /System/Library/Frameworks/RealityFoundation.framework/RealityFoundation
1 reply · 0 boosts · 309 views · Jul ’25
GCController.shouldMonitorBackgroundEvents = true broken?
I suspect that setting GCController.shouldMonitorBackgroundEvents = true does not actually make game controller inputs accessible to the app when it is in the background. About this value the official documentation says:

A Boolean value that indicates whether the app needs to respond to controller events when it isn't the frontmost app.

The behavior now is that when the app is in focus the user's inputs do get correctly recognized, but as soon as the app enters the background no inputs get recognized. The controller does not get reported as disconnecting and still works, for example, in Launchpad. I am sure that about 2 months ago when I first used this it did work as one would expect. I have also seen that an app which lets users execute certain actions using their controller has stopped working recently, adding to my suspicion of the feature being broken.

Here is a minimum reproducible example:

import SwiftUI
import GameController

@main
struct TestingControllerConnectionApp: App {
    @NSApplicationDelegateAdaptor(AppDelegate.self) var appDelegate

    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

class AppDelegate: NSObject, NSApplicationDelegate {
    var statusItem: NSStatusItem?
    var controller: GCController?

    func applicationDidFinishLaunching(_ notification: Notification) {
        setupMenuBar()
        GCController.shouldMonitorBackgroundEvents = true
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(controllerDidConnect),
            name: .GCControllerDidConnect,
            object: nil
        )
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(controllerDidDisconnect),
            name: .GCControllerDidDisconnect,
            object: nil
        )
    }

    @objc private func setupMenuBar() {
        let menu = NSMenu()
        menu.addItem(NSMenuItem(title: "Quit", action: #selector(quitApp), keyEquivalent: "q"))
        statusItem = NSStatusBar.system.statusItem(withLength: NSStatusItem.variableLength)
        statusItem?.button?.image = NSImage(resource: .controllerBar)
        statusItem?.menu = menu
    }

    @objc private func quitApp() {
        NSApp.terminate(nil)
    }

    @objc private func controllerDidConnect(_ notification: Notification) {
        if let controller = notification.object as? GCController {
            print("Controller connected")
            self.controller = controller
            if let gamepad = controller.extendedGamepad {
                gamepad.buttonA.pressedChangedHandler = { _, _, pressed in
                    print("Button A pressed: \(pressed)")
                }
            }
        }
    }

    @objc private func controllerDidDisconnect(_ notification: Notification) {
        print("Controller disconnected")
    }
}

This is created in a completely fresh Xcode project and NSHumanInterfaceDeviceUsageDescription has been added. I am using a PS5 controller and a Mac running macOS 15.4.1, which has been restarted, with only Xcode and the app open.

I have tested this with a multitude of different entitlements and capabilities, including:
- NSHumanInterfaceDeviceUsageDescription
- Supports Controller User Interaction
- Required background modes -> App communicates with an accessory
- com.apple.security.device.bluetooth
- com.apple.security.device.hid
- com.apple.security.device.usb

I have also set this value at different points in the code with no change of effect.

Does anybody see a fault in my code or in my understanding of the effect of 'shouldMonitorBackgroundEvents'? Or is this functionality actually broken on Apple's part?
1 reply · 0 boosts · 132 views · Apr ’25
[macOS 15.4] Game Controller Background Input Capture Broken - Accessibility App No Longer Functions
Our application, https://apps.apple.com/us/app/gamecontroller-mapper/id6737088417, which maps game controller inputs to keyboard/mouse events system-wide, has stopped functioning properly after the macOS 15.4 update. Specifically, the app can no longer capture game controller inputs when running in the background, severely impacting its core functionality.

Environment
- macOS version: 15.4
- Previous working versions: all versions prior to 15.4
- App type: background utility with accessibility permissions
- Hardware: all game controller brands compatible with macOS

Detailed Description

Before macOS 15.4: Our application correctly captured game controller inputs from any brand connected to the Mac and successfully translated them to keyboard/mouse events system-wide. Users could control any application (e.g., scrolling through documents in Preview using controller buttons) while our app ran in the background with the accessibility permissions granted.

After macOS 15.4: The application only works when it has active focus (is in the foreground). When any other application gains focus, our app completely stops receiving or detecting any input events from the game controller while running in the background. For instance, pressing the 'down' button on the controller while another app is active results in no event being registered within our application. We've tried updating the app to work in accessory mode (in the menu bar), but the issue persists.

Steps to Reproduce
1. Install our application on macOS 15.3 or earlier
2. Grant accessibility permissions when prompted
3. Connect a compatible game controller (e.g., Xbox or other controller)
4. Open another application (e.g., Preview with a PDF document)
5. Press buttons on the controller to navigate the document without touching the keyboard
Expected result on 15.3: controller inputs are translated to keyboard events, even when our app is in the background
6. Upgrade to macOS 15.4
7. Repeat steps 2-5
Actual result on 15.4: controller inputs are only translated to keyboard events when our application has focus

Technical Implementation

Our app uses:
- CGEvent.tapCreate() to create a global event tap
- CGEvent for simulating keyboard and mouse events
- GCController.extendedGamepad?.valueChangedHandler for detecting controller inputs
- Proper NSAccessibilityUsageDescription and appropriate entitlements
- GCController.shouldMonitorBackgroundEvents = true to ensure controller events continue when the app is inactive

(A sketch of this pattern appears after this post.)

Possible Relation to Recent Changes

We noticed in the macOS 15.4 Release Notes:

Game Controller - Resolved Issues: Fixed: Game controllers might stop responding when accessibility features, such as VoiceOver, are enabled. (141497799)

We suspect this fix might have introduced a regression or intentional limitation affecting applications like ours that rely on background event simulation with game controller input.

Impact

This change severely impacts:
- Applications designed to use game controllers as assistive input devices for users who may have difficulty using traditional keyboard and mouse inputs
- Applications for media control, presentation navigation, and other similar use cases
- Users who rely on our application for accessibility purposes

Questions
1. Is this an intentional security change or an unintended side effect of the controller fix mentioned in the release notes?
2. Are there any new APIs or alternative approaches we should implement to restore functionality?
3. If this is a system bug, when can we expect a fix?

We would greatly appreciate any guidance on how to restore our application's functionality. Thank you for your assistance.
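A minimal sketch of the controller-to-keyboard forwarding pattern described under Technical Implementation, assuming accessibility permission is granted; the D-pad choice and key code are illustrative:

import GameController
import CoreGraphics

// Sketch: translate a D-pad press into a synthetic arrow-key event.
func forwardDpadDown(on controller: GCController) {
    controller.extendedGamepad?.dpad.down.pressedChangedHandler = { _, _, pressed in
        // 0x7D is the macOS virtual key code for the down arrow (kVK_DownArrow).
        if let event = CGEvent(keyboardEventSource: nil, virtualKey: 0x7D, keyDown: pressed) {
            event.post(tap: .cghidEventTap)   // delivered system-wide, to the focused app
        }
    }
}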
4 replies · 0 boosts · 254 views · Apr ’25
VisionOS - Gamepad steals focus
I am developing a visionOS app for controlling an underwater ROV. I have ornaments with telemetry and buttons around a central video feed view, and custom button mappings, such as "A" for locking the depth of the drone. However, when I look at buttons or certain ornaments, my custom gamepad logic is kept from running. This means that when a SwiftUI Button gains focus on visionOS, pressing the controller's A button triggers the system's default "click" on that Button rather than my custom buttonA handler. Essentially, focus interception by the system is stealing my A-press events and preventing my custom gamepad logic from running. Is there a way to disable the built-in gamepad interaction and only allow my custom gamepad mappings?
1 reply · 0 boosts · 129 views · Apr ’25
CHHapticAdvancedPatternPlayer not working with GCController
Hello everyone,

I want to send haptics to a PS4 controller. Both CHHapticPatternPlayer and CHHapticAdvancedPatternPlayer work well on iPhone. On the PS4 controller, if I use CHHapticPatternPlayer everything works, but if I use CHHapticAdvancedPatternPlayer I get an error. I want to use CHHapticAdvancedPatternPlayer for its additional settings. I haven't found any information on how to fix it:

CHHapticEngine.mm:624 -[CHHapticEngine finishInit:]_block_invoke: ERROR: Server connection broke with error 'The operation could not be completed. (com.apple.CoreHaptics, error -4811)'
The engine stopped because a system error occurred.
AVHapticClient.mm:1228 -[AVHapticClient getSyncDelegateForMethod:errorHandler:]_block_invoke: ERROR: Sync XPC call for 'loadAndPrepareHapticSequenceFromEvents:reply:' (client ID 0x21) failed: Could not communicate with the helper application.
Failed to create or play the pattern: Error Domain=NSCocoaErrorDomain Code=4097 "connection to service with pid 5087 named com.apple.GameController.gamecontrollerd.haptics" UserInfo={NSDebugDescription=connection to service with pid 5087 named com.apple.GameController.gamecontrollerd.haptics}

My haptics class:

import Foundation
import CoreHaptics
import GameController

protocol HapticsControllerDelegate: AnyObject {
    func didConnectController()
    func didDisconnectController()
    func enginePlayerStart(value: Bool)
}

final class HapticsControllerManager {
    static let shared = HapticsControllerManager()

    private var isSetup = false
    private var hapticEngine: CHHapticEngine?
    private var hapticPlayer: CHHapticAdvancedPatternPlayer?

    weak var delegate: HapticsControllerDelegate? {
        didSet {
            if delegate != nil {
                startObserving()
            }
        }
    }

    deinit {
        NotificationCenter.default.removeObserver(self)
    }

    private func startObserving() {
        guard !isSetup else { return }
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(controllerDidConnect),
            name: .GCControllerDidConnect,
            object: nil
        )
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(controllerDidDisconnect),
            name: .GCControllerDidDisconnect,
            object: nil
        )
        isSetup = true
    }

    @objc private func controllerDidConnect(notification: Notification) {
        delegate?.didConnectController()
        self.createAndStartHapticEngine()
    }

    @objc private func controllerDidDisconnect(notification: Notification) {
        delegate?.didDisconnectController()
        hapticEngine = nil
        hapticPlayer = nil
    }

    private func createAndStartHapticEngine() {
        guard let controller = GCController.controllers().first else {
            print("No controller connected")
            return
        }
        guard controller.haptics != nil else {
            print("Haptics not supported on this controller")
            return
        }
        hapticEngine = createEngine(for: controller, locality: .default)
        hapticEngine?.playsHapticsOnly = true
        do {
            try hapticEngine?.start()
        } catch {
            print("Failed to start the haptic feedback engine: \(error)")
        }
    }

    private func createEngine(for controller: GCController, locality: GCHapticsLocality) -> CHHapticEngine? {
        guard let engine = controller.haptics?.createEngine(withLocality: locality) else {
            print("Failed to create engine.")
            return nil
        }
        print("Successfully created engine.")
        engine.stoppedHandler = { reason in
            print("The engine stopped because \(reason.message)")
        }
        engine.resetHandler = {
            print("The engine reset --> Restarting now!")
            do {
                try engine.start()
            } catch {
                print("Failed to restart the engine: \(error)")
            }
        }
        return engine
    }

    func startHapticFeedback(haptics: [CHHapticEvent]) {
        do {
            let pattern = try CHHapticPattern(events: haptics, parameters: [])
            hapticPlayer = try hapticEngine?.makeAdvancedPlayer(with: pattern)
            hapticPlayer?.loopEnabled = true
            try hapticPlayer?.start(atTime: 0)
            self.delegate?.enginePlayerStart(value: true)
        } catch {
            self.delegate?.enginePlayerStart(value: false)
            print("Failed to create or play the pattern: \(error)")
        }
    }

    func stopHapticFeedback() {
        do {
            try hapticPlayer?.stop(atTime: 0)
            self.delegate?.enginePlayerStart(value: false)
        } catch {
            self.delegate?.enginePlayerStart(value: true)
            print("Failed to stop haptic playback: \(error)")
        }
    }
}

extension CHHapticEngine.StoppedReason {
    var message: String {
        switch self {
        case .audioSessionInterrupt: return "the audio session was interrupted."
        case .applicationSuspended: return "the application was suspended."
        case .idleTimeout: return "an idle timeout occurred."
        case .systemError: return "a system error occurred."
        case .notifyWhenFinished: return "playback finished."
        case .engineDestroyed: return "the engine was destroyed."
        case .gameControllerDisconnect: return "the game controller disconnected."
        @unknown default: return "an unknown error occurred."
        }
    }
}

Custom haptic events:

static func changeVibrationPower(power: HapricPower) -> [CHHapticEvent] {
    let continuousEvent = CHHapticEvent(
        eventType: .hapticContinuous,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 1.0),
            CHHapticEventParameter(parameterID: .hapticIntensity, value: power.value)
        ],
        relativeTime: 0,
        duration: 0.5
    )
    return [continuousEvent]
}
1 reply · 0 boosts · 377 views · Feb ’25
GCDualSenseAdaptiveTrigger API set trigger mode not working
I'm trying to add support for the PS5 DualSense controller. When I try to use the API from here: https://developer.apple.com/documentation/gamecontroller/gcdualsenseadaptivetrigger?language=objc none of it works. Am I missing anything? The code is like this:

if ( [ controller.extendedGamepad isKindOfClass:[ GCDualSenseGamepad class ] ] )
{
    GCDualSenseGamepad * dualSenseGamePad = ( GCDualSenseGamepad * )controller.extendedGamepad;
    auto funcSetEffectTrigger = []( TriggerEffectParams& params, GCDualSenseAdaptiveTrigger *trigger )
    {
        if ( params.m_mode == TriggerEffectMode::Off )
        {
            [ trigger setModeOff ];
            NSLog(@"setModeOff trigger.mode:%d", trigger.mode );
        }
        else if ( params.m_mode == TriggerEffectMode::Feedback )
        {
            [ trigger setModeFeedbackWithStartPosition: 0.2f resistiveStrength: 0.5f ];
        }
        else if ( params.m_mode == TriggerEffectMode::Weapon )
        {
            [ trigger setModeWeaponWithStartPosition: 0.2f endPosition: 0.4f resistiveStrength: 0.5f ];
        }
        else if ( params.m_mode == TriggerEffectMode::Vibration )
        {
            [ trigger setModeVibrationWithStartPosition: position amplitude: amplitude frequency: frequency ];
        }
    };
    if ( L2 )
    {
        funcSetEffectTrigger( params, dualSenseGamePad.leftTrigger );
    }
    if ( R2 )
    {
        funcSetEffectTrigger( params, dualSenseGamePad.rightTrigger );
    }
}

I've also tested adding the "Game Controllers" capability to the target; it's still not working. I can't find anything else in the documentation or on the forums, and I have no idea what else to try.
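For comparison, a minimal Swift sketch of the same adaptive-trigger call, assuming a DualSense is connected; reading trigger.mode afterwards is one way to check whether the effect was applied:

import GameController

// Sketch: set a feedback resistance effect on a DualSense left trigger.
func applyTriggerResistance(_ controller: GCController) {
    guard let dualSense = controller.extendedGamepad as? GCDualSenseGamepad else { return }
    let trigger = dualSense.leftTrigger   // GCDualSenseAdaptiveTrigger

    // Resist from 20% travel onward at half strength.
    trigger.setModeFeedback(startPosition: 0.2, resistiveStrength: 0.5)
    print("mode after set:", trigger.mode)
}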
5 replies · 0 boosts · 654 views · Jan ’25
tvOS: GCController does not send button press events for "Button A" and "Button Center" when VoiceOver is On
When turning VoiceOver ON, GCController does not send button press events for "Button A" and "Button Center". This happens when using the Siri Remote 2nd generation (with dedicated arrow buttons on the circle around the center button) and also when using the iOS remote. I didn't test it on the old Siri Remote 1st generation with the touchpad and no arrow buttons.

Example:

gameController.microGamepad?.allButtons.forEach { button in
    button.valueChangedHandler = { [weak self] _, _, _ in
        self?.buttonHandler(gameController: gameController, button: button)
    }
}

private func buttonHandler(gameController: GCController, button: GCControllerButtonInput) {
    print("BUTTON: Pressed \(button.description) isPressed=\(button.isPressed) isTouched=\(button.isTouched)")
}

VoiceOver ON (incorrect behavior):

BUTTON: Pressed Direction Pad Left (value: 0.030, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Direction Pad Down (value: 0.079, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Direction Pad Left (value: 0.000, pressed: 0) isPressed=false isTouched=false
BUTTON: Pressed Direction Pad Down (value: 0.000, pressed: 0) isPressed=false isTouched=false

VoiceOver OFF (correct behavior):

BUTTON: Pressed Direction Pad Left (value: 0.137, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Direction Pad Up (value: 0.078, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Button A (value: 1.000, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Button Center (value: 1.000, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Button A (value: 0.000, pressed: 0) isPressed=false isTouched=false
BUTTON: Pressed Button Center (value: 0.000, pressed: 0) isPressed=false isTouched=false
BUTTON: Pressed Direction Pad Left (value: 0.000, pressed: 0) isPressed=false isTouched=false
BUTTON: Pressed Direction Pad Up (value: 0.000, pressed: 0) isPressed=false isTouched=false

I could use Direction Pad Left/Right/Up/Down for detection and treat a position between -0.7 and +0.7 as a center-button press, because I already do that on the old Siri Remote, where I need to distinguish the center button from the arrows (Up/Down for switching TV channels and Left/Right for skip forward/back). But for the new Siri Remote it would be an unnecessary workaround; see the sketch after this post.

Does anybody know why the center/select button is not detected when VoiceOver is ON? Is there another way of detecting it using GCController? I don't want to use SwiftUI onTapGesture for this one particular case. Is it an unexpected bug in tvOS APIs, or is there some specific reason why the center button is not handled by GCController when VoiceOver is ON? Thanks.
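A minimal sketch of the dpad-position workaround described above; the ±0.7 threshold is the post's own heuristic:

import GameController

// Sketch: when VoiceOver swallows Button Center, a "directional" press
// whose pad position is near the center (as in the post's VoiceOver ON
// log) can be treated as a center/select press.
func installCenterPressWorkaround(on gamepad: GCMicroGamepad) {
    gamepad.dpad.valueChangedHandler = { dpad, x, y in
        let anyDirectionPressed = dpad.left.isPressed || dpad.right.isPressed
            || dpad.up.isPressed || dpad.down.isPressed
        if anyDirectionPressed && abs(x) < 0.7 && abs(y) < 0.7 {
            print("Treating as center-button press")
        }
    }
}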
0 replies · 0 boosts · 622 views · Jan ’25
GCControllerDidConnect notification not received in VisionOS 2.0
I am unable to get visionOS 2.0 (simulator) to receive the GCControllerDidConnect notification and thus am unable to set up support for a gamepad. However, it works in visionOS 1.2.

For visionOS 2.0 I've tried adding:
- the .handlesGameControllerEvents(matching: .gamepad) attribute to the view
- Supports Controller User Interaction to Info.plist
- Supported game controller types -> Extended Gamepad to Info.plist

...but the notification still doesn't fire. It does when the code is run from the visionOS 1.2 simulator; both simulators have the Send Game Controller To Device option enabled.

Here is the example code. It's based on the Xcode project template; the only files updated were ImmersiveView.swift and Info.plist, as detailed above:

import SwiftUI
import GameController
import RealityKit
import RealityKitContent

struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            // Add the initial RealityKit content
            if let immersiveContentEntity = try? await Entity(named: "Immersive", in: realityKitContentBundle) {
                content.add(immersiveContentEntity)
            }

            NotificationCenter.default.addObserver(
                forName: NSNotification.Name.GCControllerDidConnect,
                object: nil,
                queue: nil) { _ in
                    print("Handling GCControllerDidConnect notification")
                }
        }
        .modify {
            if #available(visionOS 2.0, *) {
                $0.handlesGameControllerEvents(matching: .gamepad)
            } else {
                $0
            }
        }
    }
}

extension View {
    func modify<T: View>(@ViewBuilder _ modifier: (Self) -> T) -> some View {
        return modifier(self)
    }
}
2 replies · 1 boost · 897 views · Dec ’24
prefersPointerLocked not working properly when run in a macOS environment
Hi community,

I am trying to adapt my first-person shooter iOS game to run in a macOS environment. I need to lock the pointer when I enter battle mode, and unlock it in the lobby. On iOS everything works fine (with mouse and keyboard): the pointer locks and unlocks according to my commands. However, on macOS I see the following behavior: after switching the pointer-lock state and invoking setNeedsUpdateOfPrefersPointerLocked, the pointer does not lock immediately. To enable pointer lock, the user must click in the window.

I checked the criteria listed in the documentation: I do have fullscreen mode, I monitor UISceneActivationState and can confirm it is UISceneActivationStateForegroundActive, and I do not use Mac Catalyst (it is disabled in the app's capabilities). However, the pointer locks only after a click in the window, which is weird.

Can someone confirm whether this is the exact behaviour as designed by Apple, or am I doing anything wrong? I have read the note: "Bringing an app built with Mac Catalyst to the foreground doesn't immediately enable pointer lock. To enable pointer lock, the user must click in the window. To exit pointer lock, users can use Command-tab to switch to another app, or using Command-tilde.", but again, I don't use Mac Catalyst.

Any hints are highly appreciated! Best regards.

refs:
https://developer.apple.com/documentation/apple-silicon/running-your-ios-apps-in-macos
https://developer.apple.com/documentation/uikit/uiviewcontroller/3601235-preferspointerlocked?language=objc
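A minimal sketch of the pointer-lock plumbing being described, assuming a plain UIKit (not Mac Catalyst) view controller; the battleMode flag is illustrative:

import UIKit

// Sketch: toggle pointer lock from game state.
final class GameViewController: UIViewController {
    var battleMode = false {
        didSet {
            // Ask UIKit to re-query prefersPointerLocked.
            setNeedsUpdateOfPrefersPointerLocked()
        }
    }

    // UIKit reads this when deciding whether to lock the pointer.
    override var prefersPointerLocked: Bool {
        battleMode
    }
}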
1 reply · 0 boosts · 488 views · Dec ’24