Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

Posts under Graphics & Games topic


Unity GameKit Plugin w/ Matchmaking Queue is not working
TL;DR: I can't get QueueName to work when matchmaking a turn-based match in Unity using matchmaking rules.

Long version: I'm using the Apple Unity plugin found here: https://github.com/apple/unityplugins/blob/main/plug-ins/Apple.GameKit/Apple.GameKit_Unity/Assets/Apple.GameKit/Documentation~/Apple.GameKit.md

I have created a queue, a rule set, and a simple rule to match players by following these docs closely: https://developer.apple.com/documentation/gamekit/finding-players-using-matchmaking-rules. Here is the single rule I have that drives matchmaking:

{
  "data" : {
    "type" : "gameCenterMatchmakingRules",
    "id" : "[hidden-rule-id]",
    "attributes" : {
      "referenceName" : "ComplimentaryFactionPreference",
      "description" : "default desc",
      "type" : "MATCH",
      "expression" : "requests[0].properties.preference != requests[1].properties.preference",
      "weight" : null
    },
    "links" : {
      "self" : "https://api.appstoreconnect.apple.com/v1/gameCenterMatchmakingRules/[hidden-rule-id]"
    }
  },
  "links" : {
    "self" : "https://api.appstoreconnect.apple.com/v1/gameCenterMatchmakingRules"
  }
}

The rule belongs to a rule set, which belongs to a queue. I have verified that these are set up and linked via the App Store Connect API. Additionally, when I tested queue-based matchmaking without a queue established, I got an error in Unity; now, with the queue in place, I do not. However, there is a problem when I attempt to use the queue for matchmaking. I have this basic C# function:

public override void StartSearch(NSMutableDictionary<NSString, NSObject> properties)
{
    if (searching) return;
    base.StartSearch(properties);

    // Establish the matchmaking request
    _matchRequest = GKMatchRequest.Init();
    _matchRequest.QueueName = _PreferencesToQueue(GetSerializedPreferences());
    _matchRequest.Properties = properties;
    _matchRequest.MaxPlayers = PLAYERS_COUNT;
    _matchRequest.MinPlayers = PLAYERS_COUNT;

    _matchTask = GKTurnBasedMatch.Find(_matchRequest);
}

_PreferencesToQueue(GetSerializedPreferences()) returns the exact name of the queue I added my rule set to. After this function is called, I poll the task generated by the Find(...) function. Every time I run this function, a new match is created almost instantly; no two players are ever added to the same match. Further, when I run two built game instances, one on a Mac and another on an iPad, and test simultaneously, I am unable to join games this way. Can someone help me debug why I cannot seem to matchmake when using a queue-based approach?
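For cross-checking the plugin against the native API, here is a minimal Swift sketch of the equivalent queue-based, turn-based matchmaking call. The queue name and the "preference" property key are hypothetical placeholders, not values from the post; queueName must exactly match the queue's reference name in App Store Connect.

import GameKit

// Hedged sketch: native GameKit equivalent of the Unity call above.
// "MyFactionQueue" and "preference" are hypothetical placeholders.
func findQueueBasedMatch() async throws -> GKTurnBasedMatch {
    let request = GKMatchRequest()
    request.minPlayers = 2
    request.maxPlayers = 2
    // Must exactly match the queue's reference name in App Store Connect.
    request.queueName = "MyFactionQueue"
    // Properties read by the rule expression:
    // requests[0].properties.preference != requests[1].properties.preference
    request.properties = ["preference": "red"]
    return try await GKTurnBasedMatch.find(for: request)
}

If the native call also creates a fresh match instantly for each player, the problem is more likely in the queue/rule configuration than in the Unity plugin layer.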
Replies: 1 · Boosts: 0 · Views: 893 · Activity: Sep ’25
RealityKit - How to change camera target in response to a touch event?
Hello, I’m porting my UIKit/SceneKit app to SwiftUI/RealityKit and I’m wondering how to change the camera target programmatically. I created a simple scene in Reality Composer Pro with two spheres. My goal is straightforward: when the user taps a sphere, the camera should look at it as the main target. Following Apple’s videos, I implemented the .gesture modifier and it is printing the tapped sphere correctly, but updating my targetEntity state doesn’t change anything, so the camera won't update its target. Is there a way to access the scene content at that level? Or what else should I do? Here’s my current code implementation: Thanks!
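In the absence of the original code, here is a minimal sketch of one possible approach, assuming an iOS RealityView with an explicit PerspectiveCamera driven from SwiftUI state. The scene name is a placeholder, and the spheres need CollisionComponent and InputTargetComponent for the tap gesture to hit them.

import SwiftUI
import RealityKit

// Hedged sketch: re-aim a PerspectiveCamera at whichever entity was tapped.
// "Scene" is a hypothetical scene name.
struct SpheresView: View {
    @State private var targetEntity: Entity?
    @State private var camera = PerspectiveCamera()

    var body: some View {
        RealityView { content in
            if let scene = try? await Entity(named: "Scene") {
                content.add(scene)
            }
            camera.position = [0, 0, 2]
            content.add(camera)
        } update: { content in
            // Runs when SwiftUI state changes: point the camera at the target.
            if let target = targetEntity {
                camera.look(at: target.position(relativeTo: nil),
                            from: camera.position(relativeTo: nil),
                            relativeTo: nil)
            }
        }
        .gesture(TapGesture().targetedToAnyEntity().onEnded { value in
            targetEntity = value.entity
        })
    }
}

The key idea is that the update closure is where RealityView exposes the scene again after a state change, so camera retargeting belongs there rather than in the gesture handler.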
Replies: 1 · Boosts: 0 · Views: 216 · Activity: Sep ’25
Unity GameCenter Leaderboard issue
Hey, I have created a game in Unity with the Apple Core and Apple GameKit plugins present. I set up 5 leaderboards in App Store Connect. I made a Unity build and went through the whole TestFlight build loop to test everything. When I open my Game Center panel via the button, I see my leaderboards, but they show as MISSING TITLE, which is weird because I'm sure I set them up correctly: each has a leaderboard reference name and a leaderboard ID. When debugging, I can see that my submit-score function submits with no error, but I also don't see the score appear anywhere. Keep in mind the leaderboards are not live and are being tested through TestFlight first.
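A quick native-side check, sketched in Swift: after Game Center authentication, load the leaderboards by ID and print their titles. A nil title here would point at configuration or propagation rather than at the Unity plugin. The ID below is a hypothetical placeholder.

import GameKit

// Sketch: verify leaderboard metadata is visible to this build.
// "my_leaderboard_id" is a hypothetical placeholder.
func checkLeaderboards() async throws {
    let boards = try await GKLeaderboard.loadLeaderboards(IDs: ["my_leaderboard_id"])
    for board in boards {
        print(board.baseLeaderboardID, board.title ?? "MISSING TITLE")
    }
}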
Replies: 2 · Boosts: 0 · Views: 1.2k · Activity: Jan ’25
ARView ignores multi-touch events
Hi,

How to enable multitouch on ARView? Touch functions (touchesBegan, touchesMoved, ...) seem to only handle one touch at a time. In order to handle multiple touches at a time with ARView, I have to either:

- Use SwiftUI .simultaneousGesture on top of an ARView representable
- Position a UIView on top of ARView to capture touches and do hit testing by passing a reference to ARView

Expected behavior: ARView should capture all touches via touchesBegan/Moved/Ended/Cancelled.

Here is what I tried, on iOS 26.1 and macOS 26.1:

ARView Multitouch

The setup below is a minimal ARView presented by SwiftUI, with touch events handled inside ARView. Multitouch doesn't work with this setup. Note that multitouch wouldn't work either if the ARView is presented with a UIViewController instead of SwiftUI.

import RealityKit
import SwiftUI

struct ARViewMultiTouchView: View {
    var body: some View {
        ZStack {
            ARViewMultiTouchRepresentable()
                .ignoresSafeArea()
        }
    }
}

#Preview {
    ARViewMultiTouchView()
}

// MARK: Representable ARView

struct ARViewMultiTouchRepresentable: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARViewMultiTouch(frame: .zero)

        let anchor = AnchorEntity()
        arView.scene.addAnchor(anchor)

        let boxWidth: Float = 0.4
        let boxMaterial = SimpleMaterial(color: .red, isMetallic: false)
        let box = ModelEntity(mesh: .generateBox(size: boxWidth), materials: [boxMaterial])
        box.name = "Box"
        box.components.set(CollisionComponent(shapes: [.generateBox(width: boxWidth, height: boxWidth, depth: boxWidth)]))
        anchor.addChild(box)

        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) { }
}

// MARK: ARView

class ARViewMultiTouch: ARView {
    required init(frame: CGRect) {
        super.init(frame: frame)

        /// Enable multi-touch
        isMultipleTouchEnabled = true

        cameraMode = .nonAR
        automaticallyConfigureSession = false
        environment.background = .color(.gray)

        /// Disable gesture recognizers to not conflict with touch events
        /// But it doesn't fix the issue
        gestureRecognizers?.forEach { $0.isEnabled = false }
    }

    required dynamic init?(coder decoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            /// # Problem
            /// This should print for every new touch, up to 5 simultaneously on an iPhone (multi-touch)
            /// But it only fires for one touch at a time (single-touch)
            print("Touch began at: \(touch.location(in: self))")
        }
    }
}

Multitouch with an Overlay

This setup works, but it doesn't seem right. There must be a solution to make ARView handle multi-touch directly, right?
import SwiftUI
import RealityKit

struct MultiTouchOverlayView: View {
    var body: some View {
        ZStack {
            MultiTouchOverlayRepresentable()
                .ignoresSafeArea()
            Text("Multi touch with overlay view")
                .font(.system(size: 24, weight: .medium))
                .foregroundStyle(.white)
                .offset(CGSize(width: 0, height: -150))
        }
    }
}

#Preview {
    MultiTouchOverlayView()
}

// MARK: Representable Container

struct MultiTouchOverlayRepresentable: UIViewRepresentable {
    func makeUIView(context: Context) -> UIView {
        /// The view that SwiftUI will present
        let container = UIView()

        /// ARView
        let arView = ARView(frame: container.bounds)
        arView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        arView.cameraMode = .nonAR
        arView.automaticallyConfigureSession = false
        arView.environment.background = .color(.gray)

        let anchor = AnchorEntity()
        arView.scene.addAnchor(anchor)

        let boxWidth: Float = 0.4
        let boxMaterial = SimpleMaterial(color: .red, isMetallic: false)
        let box = ModelEntity(mesh: .generateBox(size: boxWidth), materials: [boxMaterial])
        box.name = "Box"
        box.components.set(CollisionComponent(shapes: [.generateBox(width: boxWidth, height: boxWidth, depth: boxWidth)]))
        anchor.addChild(box)

        /// The view that will capture touches
        let touchOverlay = TouchOverlayView(frame: container.bounds)
        touchOverlay.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        touchOverlay.backgroundColor = .clear
        /// Pass an arView reference to the overlay for hit testing
        touchOverlay.arView = arView

        /// Add views to the container.
        /// ARView goes in first, at the bottom.
        container.addSubview(arView)
        /// TouchOverlay goes in last, on top.
        container.addSubview(touchOverlay)

        return container
    }

    func updateUIView(_ uiView: UIView, context: Context) { }
}

// MARK: Touch Overlay View

/// A UIView to handle multi-touch on top of ARView
class TouchOverlayView: UIView {
    weak var arView: ARView?

    override init(frame: CGRect) {
        super.init(frame: frame)
        isMultipleTouchEnabled = true
        isUserInteractionEnabled = true
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        let totalTouches = event?.allTouches?.count ?? touches.count
        print("--- Touches Began --- (New: \(touches.count), Total: \(totalTouches))")

        for touch in touches {
            let location = touch.location(in: self)
            /// Hit testing.
            /// ARView and Touch View must be of the same size
            if let arView = arView {
                let entity = arView.entity(at: location)
                if let entity = entity {
                    print("Touched entity: \(entity.name)")
                } else {
                    print("Touched: none")
                }
            }
        }
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        let totalTouches = event?.allTouches?.count ?? touches.count
        print("--- Touches Cancelled --- (Cancelled: \(touches.count), Total: \(totalTouches))")
    }
}
Replies: 1 · Boosts: 0 · Views: 252 · Activity: Nov ’25
Morphing Animation Not Playing in an Xcode VisionOS App
I have a 3D model with a morphing animation that works correctly in Blender. I exported this model as a USDZ file and tried to display it in an Xcode-developed visionOS app, but the morphing animation does not play.

What I Have Tried:
- The morphing animation works correctly in Blender.
- After exporting to USDZ, the morphing animation does not play in the Xcode app.
- Linear motion animations (such as object movement) work fine.

Behavior in Reality Converter:
- GLB files do not display.
- USDZ files load, but morphing animations do not play.

What I Want to Know:
- Is there a way to play morphing animations in an Xcode-developed app?
- Does RealityKit support morphing animations?
- If RealityKit does not support morphing animations, what alternative methods can be used to play them?

I am looking for a way to use the existing animations without recreating them.

Additional Information:
- I have both the Blender file (where animations work) and the USDZ file (where animations do not play).
- I am developing a visionOS app using Xcode.

Any advice or solutions would be greatly appreciated. Thank you in advance!
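One way to narrow this down: load the USDZ in RealityKit and list its animation clips. If the morph clip is missing from availableAnimations, it was lost in export rather than ignored at playback. A minimal Swift sketch; the file name is a placeholder.

import RealityKit

// Sketch: check whether the morph animation survived the USDZ export.
// "Model" is a hypothetical file name.
func inspectAnimations() async throws {
    let entity = try await Entity(named: "Model")
    print("Available animations: \(entity.availableAnimations.count)")
    for animation in entity.availableAnimations {
        print(animation.name ?? "unnamed")
        // Playing each clip helps distinguish transform tracks
        // (which the post says work) from blend-shape tracks.
        entity.playAnimation(animation.repeat())
    }
}

If the clip is present but has no visible effect, blend-shape weights may need to be driven explicitly; RealityKit's BlendShapeWeightsComponent (added with visionOS 2) is worth investigating for that case.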
Replies: 2 · Boosts: 0 · Views: 487 · Activity: Feb ’25
How to apply the same SystemImage to both mainEmitter and spawnedEmitter without clipping in ParticleEmitterComponent?
Hi everyone, I’m currently learning about ParticleEmitterComponent and exploring the sample app provided in the "Simulating particles in your visionOS app" documentation. In the sample app, when I set the EmitterPreset to fireworks from the settings panel on the left side of the window and choose SystemImage, I noticed two issues:

- The image applied to mainEmitter appears clipped or cropped.
- The image on spawnedEmitter does not update to the selected SystemImage.

What I want to achieve:

- Apply the same SystemImage to both mainEmitter and spawnedEmitter so that it displays correctly without clipping.
- Remove the animation that changes the size of spawnedEmitter over time and keep it at a constant size.

Could someone explain which properties should be adjusted to achieve this behavior? Any guidance or examples would be greatly appreciated. Thanks in advance!
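A hedged sketch of one way this might look in code, assuming from memory that ParticleEmitter exposes an image texture and an end-of-life size multiplier (property names unverified), and using a hypothetical bundled texture in place of an SF Symbol:

import RealityKit

// Hedged sketch, assuming ParticleEmitter's `image` and
// `sizeMultiplierAtEndOfLife` properties (names from memory, unverified).
// "sparkImage" is a hypothetical texture asset in the app bundle.
func makeFireworksEntity() throws -> Entity {
    var particles = ParticleEmitterComponent.Presets.fireworks

    // Use one texture for both the main and the spawned emitter.
    let texture = try TextureResource.load(named: "sparkImage")
    particles.mainEmitter.image = texture
    particles.spawnedEmitter?.image = texture

    // Keep spawned particles at a constant size over their lifetime.
    particles.spawnedEmitter?.sizeMultiplierAtEndOfLife = 1.0

    let entity = Entity()
    entity.components.set(particles)
    return entity
}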
Replies: 0 · Boosts: 0 · Views: 449 · Activity: Sep ’25
Game Center Challenges and Activities are not appearing
Hi, I'm trying to add Game Center challenges and activities to an already-live game, but they are not appearing for in-game testing, in Game Center, or in the Games app. I know the game is set up with GameKit entitlements, since this is a live game with working leaderboards and achievements. I've updated to Tahoe beta 8, added a challenge and an activity in App Store Connect, added those to a new distribution, and added that distribution to 'Add for Review'. I'm using Unity and the Apple Unity plugin. Not sure what other steps I'm missing. Thanks
Replies: 0 · Boosts: 0 · Views: 1.1k · Activity: Sep ’25
MTLCaptureManager.sharedCaptureManager generates corrupted .gputrace files (0KB, invalid internal structure)
Hello, I am experiencing an issue with programmatically capturing a GPU trace using MTLCaptureManager. The .gputrace file that is generated appears to be corrupted, and I'm looking for guidance or a solution.

Description of the problem:

I am using MTLCaptureManager.sharedCaptureManager to capture a Metal frame and save it to disk. The generated .gputrace file is consistently reported as 0 bytes in size by the file system. Crucially, when I compress this 0-byte .gputrace file into a .zip archive, the resulting archive contains the full, expected data; after unzipping, the file can be opened and viewed correctly in Xcode.

However, when inspecting the file's contents using NSFileManager in Objective-C (treating it as a directory), the internal structure is different from a .gputrace file captured directly from Xcode's Metal debugger. (Screenshots attached: capture in Xcode vs. capture to file.)

Finally, when capturing multiple frames programmatically, the first captured frame contains valid buffer data, but for subsequent frames (starting from the second frame), the corresponding buffer contents are all zero-filled:

- Frame 1: all MTLBuffer data is correctly captured and populated.
- Frame 2 and onward: the same MTLBuffer objects are present in the trace, but their contents are entirely 0 (i.e., the data is not captured or is corrupted).

In this case, the on-screen display is normal, but the captured frame is incorrect. A frame captured directly in Xcode is also correct; only a frame captured to a file is abnormal.
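For reference, a minimal sketch of the programmatic capture flow in Swift. One detail worth noting: a .gputrace document is a package (a directory), so a 0-byte size report from the file system is not necessarily corruption by itself.

import Metal

// Sketch of the programmatic capture flow described above,
// for a single frame written to a .gputrace document on disk.
func captureFrame(device: MTLDevice, url: URL, encodeFrame: () -> Void) throws {
    let manager = MTLCaptureManager.shared()
    guard manager.supportsDestination(.gpuTraceDocument) else {
        throw NSError(domain: "CaptureUnsupported", code: -1)
    }

    let descriptor = MTLCaptureDescriptor()
    descriptor.captureObject = device          // or a specific MTLCommandQueue
    descriptor.destination = .gpuTraceDocument
    descriptor.outputURL = url                 // should end in .gputrace

    try manager.startCapture(with: descriptor)
    encodeFrame()                              // encode and commit this frame's work
    manager.stopCapture()
}

With multi-frame captures, the start/stop boundaries relative to command buffer commits are the usual suspect; buffers committed outside the capture window can appear in the trace without contents.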
Replies: 1 · Boosts: 0 · Views: 474 · Activity: Aug ’25
Sparse Texture Writes
Hey, I've been struggling with this for some days now. I am trying to write to a sparse texture in a compute shader. I'm performing the following steps:

1. Set up a sparse heap and create a texture from it
2. Map the whole area of the sparse texture using updateTextureMapping(..)
3. Overwrite every value with the value "4" in a compute shader
4. Blit the texture to a shared buffer
5. Assert that the values in the buffer are "4"

I have a minimal example (which is still pretty long unfortunately). It works perfectly when removing the line heapDesc.type = .sparse. What am I missing? I could not find any information that writes to sparse textures are unsupported. Any help would be greatly appreciated.

import Metal

func sparseTexture64x64Demo() throws {
    // ── Metal objects
    guard let device = MTLCreateSystemDefaultDevice() else {
        throw NSError(domain: "SparseNotSupported", code: -1)
    }
    let queue = device.makeCommandQueue()!
    let lib = device.makeDefaultLibrary()!
    let pipeline = try device.makeComputePipelineState(function: lib.makeFunction(name: "addOne")!)

    // ── Texture descriptor
    let width = 64, height = 64
    let format: MTLPixelFormat = .r32Uint // 4 B per texel
    let desc = MTLTextureDescriptor()
    desc.textureType = .type2D
    desc.pixelFormat = format
    desc.width = width
    desc.height = height
    desc.storageMode = .private
    desc.usage = [.shaderWrite, .shaderRead]

    // ── Sparse heap
    let bytesPerTile = device.sparseTileSizeInBytes
    let meta = device.heapTextureSizeAndAlign(descriptor: desc)
    let heapBytes = ((bytesPerTile + meta.size + bytesPerTile - 1) / bytesPerTile) * bytesPerTile
    let heapDesc = MTLHeapDescriptor()
    heapDesc.type = .sparse
    heapDesc.storageMode = .private
    heapDesc.size = heapBytes
    let heap = device.makeHeap(descriptor: heapDesc)!
    let tex = heap.makeTexture(descriptor: desc)!

    // ── CPU buffers
    let bytesPerPixel = MemoryLayout<UInt32>.stride
    let rowStride = width * bytesPerPixel
    let totalBytes = rowStride * height
    let dstBuf = device.makeBuffer(length: totalBytes, options: .storageModeShared)!

    let cb = queue.makeCommandBuffer()!
    let fence = device.makeFence()!

    // 2. Map the sparse tile, then signal the fence
    let rse = cb.makeResourceStateCommandEncoder()!
    rse.updateTextureMapping(
        tex,
        mode: .map,
        region: MTLRegionMake2D(0, 0, width, height),
        mipLevel: 0,
        slice: 0)
    rse.update(fence) // ← capture all work so far
    rse.endEncoding()

    let ce = cb.makeComputeCommandEncoder()!
    ce.waitForFence(fence)
    ce.setComputePipelineState(pipeline)
    ce.setTexture(tex, index: 0)
    let threadsPerTG = MTLSize(width: 8, height: 8, depth: 1)
    let tgCount = MTLSize(width: (width + 7) / 8, height: (height + 7) / 8, depth: 1)
    ce.dispatchThreadgroups(tgCount, threadsPerThreadgroup: threadsPerTG)
    ce.updateFence(fence)
    ce.endEncoding()

    // Blit texture into shared buffer
    let blit = cb.makeBlitCommandEncoder()!
    blit.waitForFence(fence)
    blit.copy(
        from: tex,
        sourceSlice: 0,
        sourceLevel: 0,
        sourceOrigin: MTLOrigin(x: 0, y: 0, z: 0),
        sourceSize: MTLSize(width: width, height: height, depth: 1),
        to: dstBuf,
        destinationOffset: 0,
        destinationBytesPerRow: rowStride,
        destinationBytesPerImage: totalBytes)
    blit.endEncoding()

    cb.commit()
    cb.waitUntilCompleted()
    assert(cb.error == nil, "GPU error: \(String(describing: cb.error))")

    // ── Verify a few texels
    let out = dstBuf.contents().bindMemory(to: UInt32.self, capacity: width * height)
    print("first three texels:", out[0], out[1], out[width]) // 0 1 64
    assert(out[0] == 4 && out[1] == 4 && out[width] == 4)
}

Metal shader:

#include <metal_stdlib>
using namespace metal;

kernel void addOne(texture2d<uint, access::write> tex [[texture(0)]],
                   uint2 gid [[thread_position_in_grid]]) {
    tex.write(4, gid);
}
Replies: 1 · Boosts: 0 · Views: 118 · Activity: May ’25
Metal 4 & Acceleration Structures
I have really enjoyed looking through the code and videos related to Metal 4. Currently, my interest is in updating a ReSTIR project to take advantage of more robust ways to refit acceleration structures and more powerful ways to access resources. I am working in Swift and have encountered a couple of puzzles:

1. What is the 'accepted' way to create a MTL4BufferRange to store indices and vertices?
2. How do I properly rewrite Swift code to build and compact an acceleration structure?

I do realize that this is all in beta and will happily look through code samples this fall. If other guidance is available earlier, that would be fabulous! Thank you
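While waiting on Metal 4 samples, a sketch of the Metal 3 build-and-compact flow may help as a porting baseline; this is the pre-Metal 4 API, not the new one:

import Metal

// Sketch of the Metal 3 build-and-compact flow, as a baseline for porting.
// (Metal 4 equivalents are still in beta; names here are the Metal 3 API.)
func buildCompacted(device: MTLDevice,
                    queue: MTLCommandQueue,
                    descriptor: MTLPrimitiveAccelerationStructureDescriptor) -> MTLAccelerationStructure {
    let sizes = device.accelerationStructureSizes(descriptor: descriptor)
    let accel = device.makeAccelerationStructure(size: sizes.accelerationStructureSize)!
    let scratch = device.makeBuffer(length: sizes.buildScratchBufferSize, options: .storageModePrivate)!
    let sizeBuf = device.makeBuffer(length: MemoryLayout<UInt32>.stride, options: .storageModeShared)!

    // Build, then read back the compacted size.
    var cb = queue.makeCommandBuffer()!
    var enc = cb.makeAccelerationStructureCommandEncoder()!
    enc.build(accelerationStructure: accel, descriptor: descriptor,
              scratchBuffer: scratch, scratchBufferOffset: 0)
    enc.writeCompactedSize(accelerationStructure: accel, buffer: sizeBuf, offset: 0)
    enc.endEncoding()
    cb.commit()
    cb.waitUntilCompleted()

    // Copy into a right-sized structure.
    let compactedSize = Int(sizeBuf.contents().load(as: UInt32.self))
    let compacted = device.makeAccelerationStructure(size: compactedSize)!
    cb = queue.makeCommandBuffer()!
    enc = cb.makeAccelerationStructureCommandEncoder()!
    enc.copyAndCompact(sourceAccelerationStructure: accel,
                       destinationAccelerationStructure: compacted)
    enc.endEncoding()
    cb.commit()
    cb.waitUntilCompleted()
    return compacted
}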
Replies: 4 · Boosts: 0 · Views: 611 · Activity: Sep ’25
Metal useResource vs. MTLFence
Hello, I'm tracking down a bug where useResource doesn't seem to apply proper synchronization when a resource is produced by a render pass and then consumed by a compute pass, but when I use an MTLFence to signal and wait between the render and compute encoders, the artifact goes away. The resource is created with MTLHazardTrackingModeTracked, and useResource is called on the compute encoder after the render pass. Metal API Validation doesn't report any warnings or errors. Am I misunderstanding the difference between the two APIs? I dug through the Metal documentation, and it looks like useResource should handle synchronization given the resource has MTLHazardTrackingModeTracked, but on the other hand, MTLFence should be used to ensure proper synchronization between command encoders. Can someone clarify the difference between the two APIs and when to use them?
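For reference, a sketch of the explicit-fence pattern the post describes as working: the render encoder signals after the fragment stage, and the compute encoder waits before reading.

import Metal

// Sketch of the explicit-fence pattern: render writes, compute reads.
func encodePasses(commandBuffer: MTLCommandBuffer,
                  renderPass: MTLRenderPassDescriptor,
                  fence: MTLFence,
                  texture: MTLTexture) {
    let render = commandBuffer.makeRenderCommandEncoder(descriptor: renderPass)!
    // ... draw calls that write `texture` ...
    render.updateFence(fence, after: .fragment)
    render.endEncoding()

    let compute = commandBuffer.makeComputeCommandEncoder()!
    compute.waitForFence(fence)
    compute.setTexture(texture, index: 0)
    // ... dispatch that reads `texture` ...
    compute.endEncoding()
}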
Replies: 3 · Boosts: 0 · Views: 144 · Activity: Jul ’25
Distortion Artifacts on VisionOS When Rendering Opaque/Alpha Clipped Foliage in URP (Unity 6.0, Metal)
I'm running into a persistent visual issue while deploying a floral corridor scene to Apple Vision Pro using Unity 6.0 with URP and Metal. The issue only appears on the Vision Pro device; everything looks fine in the Unity Editor.

Issue Description

When the frame rate drops to around 60–70 FPS, noticeable distortion artifacts appear around the edges of foliage models. It seems like the background meshes (behind the plants) get warped and leak through the edges of the foliage. Although this is most visible around the leaves, even solid objects like standard URP wall or box models show distorted edges when the issue occurs. All the foliage uses Opaque or Alpha Clipping materials.

Things I've Tried

- Changing the foliage materials to Transparent mode: the distortion around edges disappears, but using Transparent for a large number of foliage assets is not ideal for performance or sorting complexity.
- Reducing the number of foliage objects: with only a few plants in the scene and the frame rate staying around 100 FPS, the distortion disappears. However, this isn't a practical solution for a full environment.

Possible Cause?

I came across this note in the Unity documentation: "Ensure depth-buffer for each pixel is non-zero - on visionOS, the depth buffer is used for reprojection. To ensure visual effects like skyboxes and shaders are displayed beautifully, ensure that some value is written to the depth for each pixel."

Could this be related to the issue? Is it possible that Alpha Clipping with low pixel coverage leads to some pixels not writing to the depth buffer, which then causes problems during Vision Pro's reprojection or foveated rendering? However, even when I disable Alpha Clipping entirely, the distortion issue still persists, so it may not be solely caused by clipping itself.

Project Setup

- Unity 6.0 (URP)
- Depth Texture: Enabled
- Metal as the graphics backend
- Running on real Vision Pro hardware (not the simulator)

Any advice on how to avoid these distortion issues on Vision Pro would be greatly appreciated. Thanks!
Replies: 1 · Boosts: 0 · Views: 117 · Activity: Jul ’25
RealityKit SIMD3<Float> precision decreases with distance?
Does positioning become less accurate the farther away the center of a large entity is? For example, I am changing only the y-axis position of an entity that is tens of meters long, but I notice x and z drifting slowly the farther away the center of the entity is. I would not expect x and z to move. It might be compounding rounding errors somewhere, or maybe the RealityKit engine is deciding not to be super precise about distant objects. Otherwise I just have a bug somewhere.
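On the compounding-rounding-error theory: Float32 spacing grows with the magnitude of the stored value, and a two-line Swift check quantifies the scale (no RealityKit assumptions here):

// The gap between adjacent Float32 values grows with magnitude,
// so transform math far from the origin quantizes more coarsely.
let near = Float(1.0).ulp    // ≈ 1.2e-7 m at 1 m
let far = Float(100.0).ulp   // ≈ 7.6e-6 m at 100 m
print(near, far)

Composing rotations and translations at such offsets can plausibly surface as small x/z drift even when only y is being changed.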
Replies: 5 · Boosts: 0 · Views: 562 · Activity: Mar ’25
Regression: RealityKit spatial audio crackles and pops on iOS 26.0 beta 5 (FB19423059)
RealityKit spatial audio crackles and pops on iOS 26.0 beta 5. It works correctly on iOS 18.6 and visionOS 26.0 beta 5. The APIs used are AudioPlaybackController, Entity.prepareAudio, Entity.play Videos of the expected and observed behavior are attached to the feedback FB19423059. The audio should be a consistent, repeating sound, but it seems oddly abbreviated and the volume varies unexpectedly. Thank you for investigating this issue.
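For anyone trying to reproduce, a minimal sketch of the playback path named in the post; the audio file name is a hypothetical placeholder.

import RealityKit

// Sketch: looping spatial audio via the APIs named in the post.
// "loop.wav" is a hypothetical audio file in the app bundle.
func playLoopingSpatialAudio(on entity: Entity) async throws {
    let resource = try await AudioFileResource(named: "loop.wav",
                                               configuration: .init(shouldLoop: true))
    entity.components.set(SpatialAudioComponent())
    let controller = entity.prepareAudio(resource)
    controller.play()
}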
Replies: 0 · Boosts: 0 · Views: 258 · Activity: Aug ’25
Broadcast Upload Extension
I am trying to use a Broadcast Upload Extension, but the broadcast picker starts its countdown and then stops (SwiftUI). Steps I followed:

- Added BroadcastUploadExtension as a target
- Used the same app group for the main app and the extension
- Added packages using SPM

It seems the extension functions are not getting triggered. I also checked using UIScreen.main.isCaptured, which always comes back false. I tried using logs, which never appeared.
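For comparison, a minimal sketch of embedding the system broadcast picker in SwiftUI. The extension bundle identifier below is a hypothetical placeholder; it must match the upload extension's real bundle ID exactly.

import ReplayKit
import SwiftUI

// Sketch: system broadcast picker wrapped for SwiftUI.
// "com.example.MyApp.BroadcastExtension" is a hypothetical placeholder.
struct BroadcastPickerView: UIViewRepresentable {
    func makeUIView(context: Context) -> RPSystemBroadcastPickerView {
        let picker = RPSystemBroadcastPickerView(frame: CGRect(x: 0, y: 0, width: 60, height: 60))
        picker.preferredExtension = "com.example.MyApp.BroadcastExtension"
        picker.showsMicrophoneButton = false
        return picker
    }

    func updateUIView(_ uiView: RPSystemBroadcastPickerView, context: Context) { }
}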
Replies: 1 · Boosts: 0 · Views: 582 · Activity: Mar ’25
GPTK did not build error
Hey there, I tried to install GPTK again, since I had to reinstall the OS for unrelated reasons. But every time I try to install the toolkit, it gives me the "Error: apple/apple/game-porting-toolkit 1.1 did not build" error. Before that error occurred, I had the OpenSSL error, which I fixed with the rbenv version of OpenSSL. Is there any way to fix this error? Below you'll find the full error message. The specs for my Mac (if they are helpful in any way): M1 Pro MBP 14" with 16GB RAM and 512GB SSD. Thanks!

Error: apple/apple/game-porting-toolkit 1.1 did not build
Logs:
/Users/myuser/Library/Logs/Homebrew/game-porting-toolkit/00.options.out
/Users/myuser/Library/Logs/Homebrew/game-porting-toolkit/01.configure
/Users/myuser/Library/Logs/Homebrew/game-porting-toolkit/01.configure.cc
/Users/myuser/Library/Logs/Homebrew/game-porting-toolkit/wine64-build
If reporting this issue please do so at (not Homebrew/brew or Homebrew/homebrew-core):
https://github.com/apple/homebrew-apple/issues
Replies: 2 · Boosts: 0 · Views: 638 · Activity: Mar ’25