Delve into the world of graphics and game development. Discuss creating stunning visuals, optimizing game mechanics, and sharing resources for game developers.

Posts under the Graphics & Games topic (all subtopics)

VisionOS VideoMaterial on 3D Mesh
I'm trying to get a VideoMaterial to work on an imported 3D asset (a USDC file). There's an example in a WWDC video from Apple: you can see it running on the flag of an airplane at 10:34 in the video. But there's no sample code for it, and I can't find any other examples online. Does anybody know how to do this? https://developer.apple.com/documentation/realitykit/videomaterial
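A minimal sketch of one way this is commonly done, assuming the target mesh can be located by name in the imported hierarchy; the entity name "Flag" and the file name "flag.mp4" are placeholders, not from the post:

import AVFoundation
import RealityKit

// Sketch: swap the materials on an imported USDC entity for a VideoMaterial.
func applyVideoMaterial(to root: Entity) {
    guard let url = Bundle.main.url(forResource: "flag", withExtension: "mp4") else { return }
    let player = AVPlayer(url: url)
    let videoMaterial = VideoMaterial(avPlayer: player)

    // "Flag" is a hypothetical child name from the imported USDC hierarchy.
    if let flag = root.findEntity(named: "Flag"),
       var model = flag.components[ModelComponent.self] {
        model.materials = [videoMaterial]   // replaces whatever material the USDC shipped with
        flag.components.set(model)
    }
    player.play()
}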
2 replies · 0 boosts · 992 views · Dec ’25
How can I assign priorities to my app’s GPU workloads?
My app has a number of heterogeneous GPU workloads that all run concurrently. Some of these should be executed with the highest priority because the app’s responsiveness depends on them, while others are triggered by file imports and the like and should have a low priority. If this were running on the CPU, I’d assign the former User Interactive QoS and the latter Utility QoS. Is there an equivalent to this for GPU work?
1 reply · 0 boosts · 906 views · Jan ’26
Physics bug in WWE 2K25 with GPTK2.1
The game physics work as expected with GPTK 2.0 under Crossover 24 or Whisky. However, with GPTK 2.1 and Crossover 25, the player and camera physics misbehave. See https://www.reddit.com/r/WWEGames/comments/1jx9mph/the_siamese_elbow/ and https://www.reddit.com/r/WWEGames/comments/1jx9ow4/camera_glitch/ (a full video is also linked in the Reddit posts). I have also submitted this bug via Feedback Assistant.
2 replies · 0 boosts · 212 views · Apr ’25
SCNCamera SSAO on visionOS
Hi, looking at the documentation for screenSpaceAmbientOcclusionIntensity, I noticed that it says this is supported on visionOS 1.0+: https://developer.apple.com/documentation/scenekit/scncamera/screenspaceambientocclusionintensity Could someone enlighten me as to how that would work? As far as I know, we don't use an SCNCamera on visionOS, so what's the idea here? Can we activate SSAO on visionOS?
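For reference, a minimal sketch of how the property is used where SceneKit rendering is available; whether any of this applies on visionOS is exactly the open question here:

import SceneKit

// Sketch: enable screen-space ambient occlusion on a SceneKit camera.
let scene = SCNScene()
let camera = SCNCamera()
camera.screenSpaceAmbientOcclusionIntensity = 1.0   // 0 disables the effect
camera.screenSpaceAmbientOcclusionRadius = 0.5      // sampling radius in scene units

let cameraNode = SCNNode()
cameraNode.camera = camera
scene.rootNode.addChildNode(cameraNode)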
1 reply · 0 boosts · 528 views · Feb ’25
Issue with Non-Consumable In-App Purchase in Unity (iOS Sandbox Environment)
Question: I'm encountering an issue with in-app purchases (IAP) in Unity, specifically for a non-consumable product in the iOS sandbox environment. Details below.
Environment:
Unity version: 2022.3.55f1
Unity In-App Purchasing version: v4.12.2
Device: iPhone 15, iOS 18.1.1
Connection: Wi-Fi
iOS Settings: In-App Purchases initially set to “Allowed”
Problem behavior:
1. I attempt to purchase a non-consumable item for the first time. The payment completes successfully after entering the password.
2. I then background the game, go to iOS Settings, and set In-App Purchases to "Don't Allow."
3. After returning to the game and either closing or killing the app, I try to purchase the same non-consumable item again. I check canMakePayments(), and the app correctly detects that purchases are not allowed because of the restriction.
4. I go back to Settings and set In-App Purchases to "Allow."
5. Back in the game, I try purchasing the non-consumable item again. A pop-up appears saying, "You’ve already purchased this. Would you like to get it again for free?"
The issue: will the user be charged a second time, and why does the system allow the same non-consumable item to be purchased again after it has already been bought once? Is this the expected behavior for Unity In-App Purchasing, or is there something I'm missing in how I handle non-consumable purchases in this scenario?
Additional information: I’ve confirmed that In-App Purchases are set to “Allowed” before attempting the purchase again. I understand that non-consumable products should not be purchasable more than once, so I’m unsure why the system offers to let the user purchase it again. I’d appreciate any insight into whether this is expected behavior or whether I need to adjust my purchase flow.
1 reply · 0 boosts · 462 views · Apr ’25
Game Center leaderboards not posting scores
My app is live but the leaderboards still aren’t updating. The app was built with Unreal Engine 5 using Blueprints. I have the leaderboard stat info entered into the "write integer to leaderboard" node and a "show platform-specific leaderboard" node. The leaderboards show as live in App Store Connect. When I run the app, the Game Center login works and the leaderboard interface launches as expected, but it just lists a group of friends to invite. No scores are listed and it says the number of players is 0, even though I have posted scores from two different devices and accounts. I have the Game Center entitlement added in Xcode. Not sure where else to look.
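For comparison, a minimal native GameKit sketch for submitting a score outside the engine; this is not the Unreal Blueprint path, just a way to sanity-check the leaderboard itself, and the leaderboard ID below is a placeholder:

import GameKit

// Sketch: authenticate the local player, then submit a test score.
GKLocalPlayer.local.authenticateHandler = { viewController, error in
    guard error == nil else { print("Game Center auth failed: \(String(describing: error))"); return }
    GKLeaderboard.submitScore(100,
                              context: 0,
                              player: GKLocalPlayer.local,
                              leaderboardIDs: ["my_leaderboard_id"]) { error in
        if let error { print("Score submit failed: \(error)") }
    }
}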
0 replies · 0 boosts · 713 views · Nov ’25
D3DMetal unsupported CreateCommandQueue1 API while running simple wglgears using Mesa 24.3 GLon12 driver..
Hi, I wanted to test whether it's possible to use Mesa3D OGLon12 + D3DMetal 2b3 to get GL > 4.1 support in Windows apps via D3D12. Using a simple wglgears.c app (similar to glxgears) and running:
GALLIUM_DRIVER=d3d12 wine64 wglgears64 -info
with opengl32.dll overridden using the contents of https://github.com/pal1000/mesa-dist-win/releases/download/24.3.0-rc1/mesa3d-24.3.0-rc1-release-msvc.7z I get:
[D3DMetal:LOG:5E53] Unsupported API: CreateCommandQueue1
caused by https://gitlab.freedesktop.org/mesa/mesa/-/commit/c022c9603d500b59ff5e6f93c8a214d1785ab20a (API: https://learn.microsoft.com/en-us/windows/win32/api/d3d12/nf-d3d12-id3d12device9-createcommandqueue1).
Note that the setup is otherwise correct: running
GALLIUM_DRIVER=llvmpipe wine64 wglgears64 -info
I get:
GL_RENDERER = llvmpipe (LLVM 19.1.3, 128 bits)
GL_VERSION = 4.5 (Compatibility Profile) Mesa 24.3.0-rc1 (git-85ba713d76)
GL_VENDOR = Mesa
GL_EXTENSIONS = GL_ARB_multisample GL_EXT_abgr GL_EXT_bgra GL_EXT_blend_color GL_EXT_blend_minmax GL_EXT_blend_subtract GL_EXT_texture… etc.
1 reply · 0 boosts · 598 views · May ’25
Multiple App Icons
Hi, I have a Unity game. I need multiple app icons for my game so it can be recognized in different countries. In other words, is it possible to have an iOS app in which the app icon changes based on the device locale/language? On Android this is possible using the Unity Localization package "com.unity.localization".
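On the native side, iOS can switch between alternate icons at runtime. A minimal sketch, assuming alternate icons are declared under CFBundleAlternateIcons in Info.plist; the icon name below is a placeholder, and note that this only works while the app is running and shows a system alert:

import UIKit

// Sketch: pick an alternate icon based on the current language.
func applyLocalizedIcon() {
    guard UIApplication.shared.supportsAlternateIcons else { return }
    let language = Locale.current.language.languageCode?.identifier
    let iconName = (language == "ja") ? "AppIcon-ja" : nil   // nil restores the primary icon
    UIApplication.shared.setAlternateIconName(iconName) { error in
        if let error { print("Icon switch failed: \(error)") }
    }
}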
0 replies · 0 boosts · 259 views · Oct ’25
How can I uninstall game-porting-toolkit completely
So, I'm done with GPTK and decided to delete it. The only things I installed were brew -v install apple/apple/game-porting-toolkit and the external libraries from the ditto command. I tried to remove it, but even after brew remove game-porting-toolkit and brew autoremove, all of the dependencies installed with brew are still there. The most obvious was game-porting-toolkit-compiler, but even after removing that there are so many now-orphaned libraries that it's just impossible to identify them all manually. Is there a better way, or is the easiest option simply to uninstall Homebrew completely and reinstall it?
0 replies · 0 boosts · 256 views · May ’25
GKAccessPoint triggerAccessPointWithState handler not invoked on iOS 26.0 and iOS 15.8.4
Incorrect/unexpected behaviour: when calling [GKAccessPoint.shared triggerAccessPointWithState:GKGameCenterViewControllerStateAchievements handler:^{}] on a real device running the iOS 26 beta, the overlay appears as expected, but the handler block is never called. The same thing happens on earlier iOS versions (tested on iOS 15.8.4).
Steps to reproduce:
1. Authenticate GKLocalPlayer.
2. Call triggerAccessPointWithState:handler: with a block that logs or performs logic.
3. Observe that the overlay appears but the block is never executed.
Expected result: the handler should fire once the dashboard is shown.
Actual result: the UI appears correctly, but the handler is never invoked.
Use case: GKGameCenterViewController is deprecated, so we are moving to GKAccessPoint, but this issue is blocking the migration.
Environment:
Devices: iPhone 16, iPhone 7
iOS: 26.0 and 15.8.4
Xcode: 26.0 beta and 16.4
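For reference, a minimal sketch of the Swift spelling of the call in question; the report is that the trailing handler never runs even though the overlay appears:

import GameKit

// Sketch: trigger the Game Center access point on the achievements page.
GKAccessPoint.shared.trigger(state: .achievements) {
    print("Access point dashboard presented")   // reportedly never prints
}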
6 replies · 0 boosts · 534 views · Oct ’25
Vsync, drawable present, and the Instruments GUI
Hi. When analyzing our game in Instruments, I've always been confused by the two items "Drawable Present" and "Drawable Presented" in the GPU column. The timing of "Drawable Present" seems to be when the CPU calls present(_:) on the command buffer, rather than when the actual encoding is completed on the GPU. Also, what does "Drawable Presented" specifically mean? In our case, when a CPU stall occurs, the vsync interval appears to change in the next frame, and a surface that has already been rendered is not displayed. Why is this happening?
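One way to get timestamps to compare against those Instruments rows, sketched under the assumption of a CAMetalLayer-based draw loop with its own command queue:

import Metal
import QuartzCore

// Sketch: record when a frame is scheduled vs. when it actually reaches the screen.
func drawFrame(layer: CAMetalLayer, queue: MTLCommandQueue) {
    guard let drawable = layer.nextDrawable(),
          let commandBuffer = queue.makeCommandBuffer() else { return }

    // ... encode the frame here ...

    drawable.addPresentedHandler { presented in
        // Fires once the drawable is actually on screen.
        print("presented at \(presented.presentedTime)")
    }
    commandBuffer.present(drawable)   // only schedules the present; nothing is on screen yet
    commandBuffer.commit()
}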
0 replies · 0 boosts · 155 views · May ’25
Custom Cameras in RealityKit
Hi all, I've developed some code that enables an arcball camera interaction with my scene. I've done this using components and systems. The implementation feels a bit messy: I've got gesture code on my RealityView, and then a bunch of other code that consumes those gesture inputs in my component and system. Is there a demo app, or some example code, that shows a nice way to encapsulate these things into one item for custom cameras, something like Apple's .realityViewCameraControls(.orbit)? If not, can anyone recommend an approach to take?
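Not an official sample, but a minimal sketch of one way to keep the camera math in one place: the gestures only write into a component, and a system owns the transform update. All names below are hypothetical:

import Foundation
import RealityKit

// Sketch: the gesture handlers mutate this component; nothing else touches the camera.
struct ArcballCameraComponent: Component {
    var target: SIMD3<Float> = .zero
    var radius: Float = 2
    var azimuth: Float = 0      // radians, driven by horizontal drag
    var elevation: Float = 0    // radians, driven by vertical drag
}

// Sketch: the system converts the spherical parameters into the camera transform each frame.
struct ArcballCameraSystem: System {
    static let query = EntityQuery(where: .has(ArcballCameraComponent.self))
    init(scene: Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.entities(matching: Self.query, updatingSystemWhen: .rendering) {
            guard let arcball = entity.components[ArcballCameraComponent.self] else { continue }
            let position = arcball.target + SIMD3<Float>(
                arcball.radius * cos(arcball.elevation) * sin(arcball.azimuth),
                arcball.radius * sin(arcball.elevation),
                arcball.radius * cos(arcball.elevation) * cos(arcball.azimuth))
            entity.look(at: arcball.target, from: position, relativeTo: nil)
        }
    }
}

// Register once at startup:
// ArcballCameraComponent.registerComponent()
// ArcballCameraSystem.registerSystem()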
0 replies · 0 boosts · 288 views · Oct ’25
How to configure RealityKit entities for animations on a modular character?
I am currently using RealityKit (with a perspective camera) to render a character in my SwiftUI app. The character has customization such as clothing items and hair, and all objects are properly weighted to the rig. The model is set up in Blender like so: groups of objects that will be swapped (e.g. Shoes -> shoe objects) and an armature. I then export it to USDC with all objects active. This is the resulting entity hierarchy, viewed in Reality Composer Pro. My problem is that when I export with the Armature modifier applied to the objects, so that animations get exported, the ModelComponent gets flattened into the armature and swapping entities is no longer as simple as removing the entity with the corresponding name. What's the best practice here? Should the animation be exported separately and then applied to the skeleton? If so, how is that achieved? I'm not really sure how to proceed.
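One possible direction, sketched under assumptions (not from the post): keep the animation in a separate USD file whose skeleton uses the same joint names, load it as an AnimationResource, and replay it on the assembled character. Whether retargeting works across your specific exports would need testing; "Character_Idle" is a hypothetical file name:

import RealityKit

// Sketch: load an animation-only USD and play its first clip on the character.
func playSeparateAnimation(on character: Entity) async throws {
    let animationSource = try await Entity(named: "Character_Idle")
    if let clip = animationSource.availableAnimations.first {
        character.playAnimation(clip.repeat(), transitionDuration: 0.3, startsPaused: false)
    }
}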
1 reply · 0 boosts · 91 views · May ’25
How to use CharacterControllerComponent.
I am trying to implement a CharacterControllerComponent using the following documentation: https://developer.apple.com/documentation/realitykit/charactercontrollercomponent I have written sample code, but PhysicsSimulationEvents.WillSimulate is not executed and nothing happens.

import SwiftUI
import RealityKit
import RealityKitContent

struct ImmersiveView: View {
    let gravity: SIMD3<Float> = [0, -50, 0]
    let jumpSpeed: Float = 10

    enum PlayerInput {
        case none, jump
    }

    @State private var testCharacter: Entity = Entity()
    @State private var myPlayerInput = PlayerInput.none

    var body: some View {
        RealityView { content in
            // Add the initial RealityKit content
            if let immersiveContentEntity = try? await Entity(named: "Immersive", in: realityKitContentBundle) {
                content.add(immersiveContentEntity)
                testCharacter = immersiveContentEntity.findEntity(named: "Capsule")!
                testCharacter.components.set(CharacterControllerComponent())

                let _ = content.subscribe(to: PhysicsSimulationEvents.WillSimulate.self, on: testCharacter) { event in
                    print("subscribe run")
                    let deltaTime: Float = Float(event.deltaTime)
                    var velocity: SIMD3<Float> = .zero
                    var isOnGround: Bool = false

                    // RealityKit automatically adds `CharacterControllerStateComponent` after moving the character for the first time.
                    if let ccState = testCharacter.components[CharacterControllerStateComponent.self] {
                        velocity = ccState.velocity
                        isOnGround = ccState.isOnGround
                    }

                    if !isOnGround {
                        // Gravity is a force, so you need to accumulate it for each frame.
                        velocity += gravity * deltaTime
                    } else if myPlayerInput == .jump {
                        // Set the character's velocity directly to launch it in the air when the player jumps.
                        velocity.y = jumpSpeed
                    }

                    testCharacter.moveCharacter(by: velocity * deltaTime, deltaTime: deltaTime, relativeTo: nil) { event in
                        print("playerEntity collided with \(event.hitEntity.name)")
                    }
                }
            }
        }
    }
}

The scene is loaded from RCP. It is simple, just a capsule on a pedestal. Do I need separate code to run testCharacter from this state?
0 replies · 0 boosts · 162 views · May ’25
macOS Catalina 10.15.7: symbols not found in CoreGraphics.framework
I recently needed to develop an application that obtains the window list, which requires Screen Recording permission. Apple's official documentation mentions using the two functions CGPreflightScreenCaptureAccess and CGRequestScreenCaptureAccess to request the permission. These functions are stated to be available since macOS 10.15. However, when I used them on a device running macOS 10.15.7, I encountered the errors shown in the attached screenshot. I used the nm tool to inspect the symbols in CoreGraphics.framework and found that these two functions were not present. Could you help me understand why this is happening?
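A defensive sketch, under the assumption that the symbol can be missing at runtime on some systems: resolve it with dlsym before calling it instead of linking it directly (the -2 handle is RTLD_DEFAULT):

import CoreGraphics
import Darwin

// Sketch: resolve CGPreflightScreenCaptureAccess dynamically so the app still
// launches on systems where the symbol does not exist.
typealias PreflightScreenCapture = @convention(c) () -> Bool
let rtldDefault = UnsafeMutableRawPointer(bitPattern: -2)   // RTLD_DEFAULT
if let symbol = dlsym(rtldDefault, "CGPreflightScreenCaptureAccess") {
    let preflight = unsafeBitCast(symbol, to: PreflightScreenCapture.self)
    print("Screen capture access granted: \(preflight())")
} else {
    print("CGPreflightScreenCaptureAccess is not available on this system")
}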
0 replies · 0 boosts · 96 views · May ’25
Core Image recipe for QR code icon image
Create the QR code:

CIFilter<CIQRCodeGenerator> *f = CIFilter.QRCodeGenerator;
f.message = [@"Message" dataUsingEncoding:NSASCIIStringEncoding];
f.correctionLevel = @"Q"; // increase level
CIImage *qrcode = f.outputImage;

Overlay the icon:

CIImage *icon = [CIImage imageWithURL:url];
CGAffineTransform t = CGAffineTransformMakeTranslation(
    (qrcode.extent.width - icon.extent.width) / 2.0,
    (qrcode.extent.height - icon.extent.height) / 2.0);
icon = [icon imageByApplyingTransform:t];
qrcode = [icon imageByCompositingOver:qrcode];

Round off the corners:

static dispatch_once_t onceToken;
static CIWarpKernel *k;
dispatch_once(&onceToken, ^{
    k = [CIWarpKernel kernelWithFunctionName:@"bend_corners"
                        fromMetalLibraryData:metalLibData()
                                       error:nil];
});
qrcode = [k applyWithExtent:qrcode.extent
                roiCallback:^CGRect(int i, CGRect r) { return CGRectInset(r, -radius, -radius); }
                 inputImage:qrcode
                  arguments:@[[CIVector vectorWithCGRect:qrcode.extent], @(radius)]];

…and this code for the kernel should go in a separate .ci.metal source file:

float2 bend_corners(float4 extent, float s, destination dest)
{
    float2 p, dc = dest.coord();
    float ratio = 1.0;

    // Round lower left corner
    p = float2(extent.x + s, extent.y + s);
    if (dc.x < p.x && dc.y < p.y) {
        float2 d = abs(dc - p);
        ratio = min(d.x, d.y) / max(d.x, d.y);
        ratio = sqrt(1.0 + ratio * ratio);
        return (dc - p) * ratio + p;
    }

    // Round lower right corner
    p = float2(extent.x + extent.z - s, extent.y + s);
    if (dc.x > p.x && dc.y < p.y) {
        float2 d = abs(dc - p);
        ratio = min(d.x, d.y) / max(d.x, d.y);
        ratio = sqrt(1.0 + ratio * ratio);
        return (dc - p) * ratio + p;
    }

    // Round upper left corner
    p = float2(extent.x + s, extent.y + extent.w - s);
    if (dc.x < p.x && dc.y > p.y) {
        float2 d = abs(dc - p);
        ratio = min(d.x, d.y) / max(d.x, d.y);
        ratio = sqrt(1.0 + ratio * ratio);
        return (dc - p) * ratio + p;
    }

    // Round upper right corner
    p = float2(extent.x + extent.z - s, extent.y + extent.w - s);
    if (dc.x > p.x && dc.y > p.y) {
        float2 d = abs(dc - p);
        ratio = min(d.x, d.y) / max(d.x, d.y);
        ratio = sqrt(1.0 + ratio * ratio);
        return (dc - p) * ratio + p;
    }

    return dc;
}
0 replies · 0 boosts · 117 views · Mar ’25
Animations for streaming
We have a macOS app (not yet released, but in use by ourselves) that provides scoreboards for streaming sport events. These days it is expected that there are nice animations for goals, etc. We stream using NDI, which requires a CVPixelBuffer for each frame. We currently create these animations using CABasicAnimation, CAAnimation, and CAKeyframeAnimation, and we use ScreenCaptureKit to generate the frames. This works fine at 25/30 fps, as long as the window where our animations are performed is visible. But this is not how it should be: we want a smaller window as the main app window and control display, performing the animations at reduced size, while the streamed animations need to be in HD and later maybe 4K. When we use an offscreen window, the animations are not calculated; we get about 1 frame per second. So we currently have to connect an external display to the MacBook and open the large windows there, which is an ugly solution. Are we using a completely wrong approach? Or is there a way to tell macOS to perform the animations even though the window is offscreen? If it cannot work that way, what is an alternative?
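One possible alternative, sketched under assumptions (not from the post): render the animated layer tree into a Metal texture with CARenderer, so no window, on-screen or offscreen, is involved at all, and then copy the texture into the CVPixelBuffer handed to NDI:

import Metal
import QuartzCore

// Sketch: set up a CARenderer that draws a layer tree into an offscreen Metal texture.
func makeOffscreenRenderer(for rootLayer: CALayer, device: MTLDevice,
                           width: Int, height: Int) -> CARenderer? {
    let descriptor = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .bgra8Unorm,
                                                              width: width,
                                                              height: height,
                                                              mipmapped: false)
    descriptor.usage = [.renderTarget, .shaderRead]
    guard let texture = device.makeTexture(descriptor: descriptor) else { return nil }

    let renderer = CARenderer(mtlTexture: texture, options: nil)
    renderer.layer = rootLayer
    renderer.bounds = CGRect(x: 0, y: 0, width: width, height: height)
    return renderer
}

// Per frame, driven by your own timer or CVDisplayLink:
//   renderer.beginFrame(atTime: CACurrentMediaTime(), timeStamp: nil)
//   renderer.addUpdate(renderer.bounds)
//   renderer.render()
//   renderer.endFrame()
// then copy the texture contents into the CVPixelBuffer for NDI.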
0 replies · 0 boosts · 141 views · May ’25