Delve into the world of graphics and game development. Discuss creating stunning visuals, optimizing game mechanics, and sharing resources for game developers.

All subtopics

Posts under Graphics & Games topic
SCNGeometry and .copy()
Up to now I have created multiple new SCNNodes from a single SCNGeometry instance, and it was fine that they all had the same appearance. Now I want variety, so I make a copy of that instance using let newGeo = myGeoInstance.copy() as! SCNGeometry (the force cast is needed because copy() returns Any), and all elements are verified present. :-) Likewise, node.geometry?.replaceMaterial(at: index, with: myNewMaterial) is verified to correctly change the material(s) at the correct index(es). The only problem is that the modified "teapot" is not visible, and yes, I have set node.isHidden = false. Has anyone experienced this? In the old days reversing the verts was a solution; in desperation I tried that. |-( A hedged sketch of the copy-and-replace flow follows this entry.
Replies: 6 · Boosts: 0 · Views: 809 · Activity: Dec ’24
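A minimal sketch of the copy-and-replace flow described in the post above; the geometry instance, material, and color are hypothetical stand-ins, and the node still has to be attached to the scene graph.

import SceneKit

// Stand-in for the existing geometry instance (hypothetical).
let myGeoInstance: SCNGeometry = SCNSphere(radius: 1.0)

// copy() returns Any, hence the force cast.
let newGeo = myGeoInstance.copy() as! SCNGeometry

// Optionally give the copy its own material objects so later in-place edits
// don't affect the original geometry's appearance.
newGeo.materials = myGeoInstance.materials.map { $0.copy() as! SCNMaterial }

let myNewMaterial = SCNMaterial()
myNewMaterial.diffuse.contents = CGColor(red: 1, green: 0, blue: 0, alpha: 1)
newGeo.replaceMaterial(at: 0, with: myNewMaterial)

let node = SCNNode(geometry: newGeo)
node.isHidden = false
// node still needs to be added to the scene, e.g. scene.rootNode.addChildNode(node)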
Per-vertex color. in a custom RealityKit mesh? (macOS)
I'm working on an application for viewing AMF models on macOS, using RealityKit. AMF supports several different ways to color models, including per-vertex color (where the color of a triangle is interpolated from vertex to vertex) as well as per-face color (where the color of the triangle is the same across the entire face). I'm trying to figure out how to support those color models using a RealityKit mesh. Apple's documentation (https://developer.apple.com/documentation/realitykit/modifying-realitykit-rendering-using-custom-materials) talks about per-vertex colors, but I haven't found a way to create a mesh that includes per-vertex colors other than using a texture map (which might be the correct solution). Can someone give me some pointers? A hedged sketch of the texture-map fallback follows this entry.
Replies: 6 · Boosts: 2 · Views: 1.8k · Activity: 3w
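Not an authoritative answer to the question above, just a sketch of the texture-map fallback the post mentions: bake the face or vertex colors into an image and point each vertex's UVs at it, since MeshDescriptor has no color attribute. The asset name "FaceColors", the coordinates, and the UV values are assumptions.

import RealityKit

// Build a single triangle whose color comes from a baked color texture
// addressed through per-vertex UVs (a stand-in for true per-vertex color).
func makeColoredTriangle() throws -> ModelEntity {
    var descriptor = MeshDescriptor(name: "triangle")
    descriptor.positions = MeshBuffers.Positions([SIMD3<Float>(0, 0, 0),
                                                  SIMD3<Float>(0.1, 0, 0),
                                                  SIMD3<Float>(0, 0.1, 0)])
    // Each vertex samples a texel of the baked color image.
    descriptor.textureCoordinates = MeshBuffers.TextureCoordinates([SIMD2<Float>(0, 0),
                                                                    SIMD2<Float>(1, 0),
                                                                    SIMD2<Float>(0, 1)])
    descriptor.primitives = .triangles([0, 1, 2])
    let mesh = try MeshResource.generate(from: [descriptor])

    var material = PhysicallyBasedMaterial()
    material.baseColor = .init(texture: .init(try TextureResource.load(named: "FaceColors")))
    return ModelEntity(mesh: mesh, materials: [material])
}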
GKAccessPoint triggerAccessPointWithState handler not invoked on iOS 26.0 and iOS 15.8.4
Incorrect/unexpected behavior: when calling [GKAccessPoint.shared triggerAccessPointWithState:GKGameCenterViewControllerStateAchievements handler:^{}] on a real device running the iOS 26 beta, the overlay appears as expected, but the handler block is never called. The handler also fails to fire on earlier iOS versions (tested on iOS 15.8.4). Steps to reproduce: authenticate GKLocalPlayer, call triggerAccessPointWithState:handler: with a block that logs or performs logic, and observe that the overlay appears but the block is not executed. Behavior: the UI appears correctly, but the handler is not invoked at all. Expected result: the handler should fire immediately after the dashboard is shown. Actual result: the handler is never called. Use case: since GKGameCenterViewController is deprecated we are moving to GKAccessPoint, but this issue is blocking the migration. Environment: iPhone 16 and iPhone 7; iOS 26.0 and iOS 15.8.4; Xcode 26.0 beta and Xcode 16.4. A minimal Swift repro sketch follows this entry.
Replies: 6 · Boosts: 0 · Views: 430 · Activity: Oct ’25
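A minimal Swift repro sketch of the call described above, assuming the local player authenticates successfully; on the affected OS versions the overlay reportedly appears but the trailing handler never runs.

import GameKit

func showAchievements() {
    GKLocalPlayer.local.authenticateHandler = { viewController, error in
        guard GKLocalPlayer.local.isAuthenticated else { return }
        GKAccessPoint.shared.trigger(state: .achievements) {
            // Expected to fire once the Game Center overlay is up.
            print("access point handler invoked")
        }
    }
}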
Image textures cause runtime crashes - what's the workaround?
I've had no issue calling image files in my .swift files, but they cause crashes when used in my .sks files. When I set a sprite's texture to an image in the inspector/editor bar, at runtime, when that sprite is loaded, I get the error: "Cannot get value with size 16. The type encoded as {CGRect={CGPoint=dd}{CGSize=dd}} is expected to be 32 bytes." From my research it has something to do with Apple switching from 32-bit to 64-bit machines. From ChatGPT: “SpriteKit under the hood uses NSKeyedUnarchiver to load your .sks file. That unarchiver decodes each archived property by reading a fixed-size blob of bytes and mapping it into a C struct. In your case it ran into a mismatch.” I am using a 64-bit machine to write my code and 64-bit simulators and physical devices, so there isn't a clear cause of the mismatch. My scenes play fine in Xcode 16's preview window and my code builds; it just crashes at runtime. When I don't use image-textured assets in the .sks file it works fine: it loads animated labels and plain color squares. I've been able to work around this for static things like a sprite with a background texture by writing code like this in a normal, non-game Swift file: if let scene = SKScene(fileNamed: "GameScene2") { let bg = SKSpriteNode(imageNamed: "YourBackgroundImage"); bg.position = CGPoint(x: scene.frame.midX, y: scene.frame.midY); bg.zPosition = -1; scene.addChild(bg) } The issue now is that I want to make a particle emitter and other non-static sprites, but my understanding of their properties isn't deep enough to create them without the editor. Also, setting an SKTexture in a Swift file causes the same runtime crash with the 16/32 error. Could you help me figure out how to fix the bug so I can use the editor again? Otherwise, could you help me write a workaround like the one I use for background images? I have a feeling the answer is in writing my own NSKeyedUnarchiver, but I don't know how to make sure it's called instead of the default one. I've already tried cleaning my code multiple times and deleting and re-adding sprite nodes. Thank you. A hedged sketch of a code-only particle emitter follows this entry.
Replies: 6 · Boosts: 1 · Views: 1.4k · Activity: 3w
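A hedged sketch of a particle emitter built entirely in code, sidestepping the .sks unarchiving crash; the "spark" image name and all parameter values are assumptions. The texture is pulled from SKSpriteNode(imageNamed:), the loading path the post says still works.

import SpriteKit

func makeSparkEmitter() -> SKEmitterNode {
    let emitter = SKEmitterNode()
    // Reuse the image-loading path that works for the background workaround.
    emitter.particleTexture = SKSpriteNode(imageNamed: "spark").texture
    emitter.particleBirthRate = 120
    emitter.particleLifetime = 1.5
    emitter.particleLifetimeRange = 0.5
    emitter.particlePositionRange = CGVector(dx: 20, dy: 20)
    emitter.emissionAngle = .pi / 2
    emitter.emissionAngleRange = .pi / 4
    emitter.particleSpeed = 80
    emitter.particleSpeedRange = 30
    emitter.particleAlphaSpeed = -0.6       // fade out over the particle's lifetime
    emitter.particleScale = 0.4
    emitter.particleScaleRange = 0.2
    emitter.particleColor = SKColor.orange
    emitter.particleColorBlendFactor = 1.0
    emitter.particleBlendMode = .add
    return emitter
}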
How to get GKMatch instance after accepting GKInvite?
In my SceneKit game I'm able to connect two players with GKMatchmakerViewController. Now I want to support the scenario where one of them disconnects and wants to reconnect. I tried to do this with this code: nonisolated public func match(_ match: GKMatch, player: GKPlayer, didChange state: GKPlayerConnectionState) { Task { @MainActor in switch state { case .connected: break case .disconnected, .unknown: let matchRequest = GKMatchRequest() matchRequest.recipients = [player] do { try await GKMatchmaker.shared().addPlayers(to: match, matchRequest: matchRequest) } catch { } @unknown default: break } } } nonisolated public func player(_ player: GKPlayer, didAccept invite: GKInvite) { guard let viewController = GKMatchmakerViewController(invite: invite) else { return } viewController.matchmakerDelegate = self present(viewController) } But after presenting the view controller with GKMatchmakerViewController(invite:), nothing else happens. I would expect matchmakerViewController(_:didFind:) to be called, or how else would I get an instance of GKMatch? Here is the code I use to reproduce the issue (attached project), followed by the reproduction steps: Run the attached project on an iPad and a Mac simultaneously. On both devices, tap the ship to connect to GameCenter. Create an automatched match by tapping the rightmost icon on both devices. When the two devices are matched, on the iPad close the dialog and tap the ship to disconnect from GameCenter. Wait some time until the Mac detects the disconnect and automatically sends an invitation to join again. When the notification arrives on the iPad, tap it, then tap the ship to connect to GameCenter again. The iPad receives the call player(_:didAccept:), but nothing else, so there's no way to get a GKMatch instance again. A hedged sketch of getting the GKMatch back after the invite follows this entry.
Replies: 6 · Boosts: 0 · Views: 746 · Activity: Apr ’25
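A hedged, iOS-flavored sketch of the missing piece: the GKMatch is expected to arrive through matchmakerViewController(_:didFind:) on the presented controller's delegate, and GKMatchmaker can also resolve the invite without UI. The delegate type below is a hypothetical helper, not the poster's class.

import GameKit
import UIKit

final class InviteFlowDelegate: NSObject, GKMatchmakerViewControllerDelegate {
    var onMatch: ((GKMatch) -> Void)?

    func matchmakerViewController(_ viewController: GKMatchmakerViewController, didFind match: GKMatch) {
        viewController.dismiss(animated: true)
        onMatch?(match)                     // hand the re-established GKMatch back to the game
    }

    func matchmakerViewControllerWasCancelled(_ viewController: GKMatchmakerViewController) {
        viewController.dismiss(animated: true)
    }

    func matchmakerViewController(_ viewController: GKMatchmakerViewController, didFailWithError error: Error) {
        viewController.dismiss(animated: true)
    }
}

// UI-less alternative: ask GKMatchmaker to turn the accepted invite into a match directly.
func join(invite: GKInvite, completion: @escaping (GKMatch?) -> Void) {
    GKMatchmaker.shared().match(for: invite) { match, error in
        completion(match)
    }
}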
Multiply exr lightmap in Reality Composer Pro Shader Graph
I’m trying to use EXR lightmaps to overlay baked lighting on top of a base texture in the RCP Shader Graph. When I multiply an EXR image set to Image(float) with an 8-bit base texture, the output becomes Image(float). I can’t connect that to the BaseColor input on the UnlitSurface node, since it only accepts Color3f. I expected to be able to use a Convert node between the Multiply node and the BaseColor input, but when I do that, the result becomes black and white instead of the expected outcome: the EXR multiplied with the base texture using a baseline value of 1, where values below 1 in the EXR would darken the base texture and values above 1 would brighten it. Is there any documentation on how to properly overlay a 32-bit EXR lightmap in the RCP Shader Graph, or is the black-and-white output from the Convert node a bug?
Replies: 6 · Boosts: 0 · Views: 586 · Activity: 3w
SceneKit app seriously hangs when run in fullscreen
I've been running my SceneKit game for many weeks in Xcode without performance issues. The game itself is finished, so I thought I could go on with publishing it on the App Store, but when archiving it in Xcode and running the archived app, I noticed that it seriously hangs. The hangs only seem to happen when I run the game in fullscreen mode. I tried disabling game mode, but the hangs still happen. Only when I run it in windowed mode does the game run smoothly. Instruments confirms that there are many serious hangs, but it also reports that CPU usage is quite low during those hangs, on average about 15%. From what I know, hangs happen when the main thread is busy, but how can that be when CPU usage is so low, and why does it only happen in fullscreen mode for release builds?
Replies: 6 · Boosts: 0 · Views: 1.1k · Activity: Dec ’24
Issues building Unity plug-in project: Cannot locate native library Apple.Core/Apple.GameKit for iOS
I'm having issues getting a well-built package from the Apple Unity Plug-in project. When building my game project in Unity the following error is printed to the console: [Apple.Core.AppleNativeLibraryUtility] Cannot locate a Debug or Release Apple.Core native library for iOS. Please ensure that the build invocation (build.py, xcodebuild, or Xcode) compiled cleanly and that the build was configured to support Debug on iOS. As far as I can tell the build did compile cleanly, but I might be missing something. If anyone can see what I'm doing wrong or has any insight it would be greatly appreciated. The setup is the following: macOS Tahoe 26 Beta, Xcode-beta Version 26.0 beta 3 (17A5276g), Unity Plug-in branch: 2025-beta1, Unity game project version: 2022.3.60f, M1 MacBook Pro. The built packages have been imported into the game project through the Unity Package Manager using the tarball option pointing to the built packages from the Unity Plug-in project. The Unity Plug-in project has been built using the build.py file with the following: python3 build.py -m iOS iPhoneSimulator -p Core GameKit CoreHaptics GameController -k all The output is available in the attached file. build-output.txt Here's an image of the NativeLibraries~ folder inside the built Apple.Core package.
Replies: 6 · Boosts: 1 · Views: 1k · Activity: Oct ’25
Utilizing Point Cloud data from `ObjectCaptureSession` in WWDC23
I am currently developing a mobile and server-side application using the new ObjectCaptureSession on iOS and PhotogrammetrySession on macOS. I have two questions regarding the newly updated APIs. From the WWDC23 session "Meet Object Capture for iOS", I know that the Object Capture API uses point cloud data captured by the iPhone LiDAR sensor. I want to know how to take the point cloud data captured by ObjectCaptureSession on iPhone and use it to create 3D models with PhotogrammetrySession on macOS. From the example code from WWDC21, I know that PhotogrammetrySession utilizes the depth maps from captured photos, embedded in the HEIC images, and uses that data to create a 3D asset on macOS. I would like to know if point cloud data is also embedded into the image to be used during 3D reconstruction and, if not, how else the point cloud data is supplied for reconstruction. Another question is: I know that point cloud data is returned as a result of a PhotogrammetrySession.Request. I would like to know if this point cloud data is the same set of data captured during ObjectCaptureSession from WWDC23 that is used to create ObjectCapturePointCloudView. Thank you to everyone for the help in advance. It's a real pleasure to be developing with all the updates to RealityKit and the Object Capture API. A hedged sketch of the macOS reconstruction side follows this entry.
Replies: 6 · Boosts: 0 · Views: 2.2k · Activity: Jul ’25
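A hedged sketch of the macOS reconstruction side, assuming a folder of captures from the device; the paths and detail level are placeholders. The .pointCloud request (available on recent releases) is the API-level way to get point-cloud output; whether it corresponds to the preview cloud shown by ObjectCapturePointCloudView is exactly the open question above.

import RealityKit

func reconstruct(from captures: URL, to modelURL: URL) async throws {
    let session = try PhotogrammetrySession(input: captures,
                                            configuration: PhotogrammetrySession.Configuration())
    try session.process(requests: [
        .modelFile(url: modelURL, detail: .medium),
        .pointCloud                       // point-cloud output request on newer OS releases
    ])

    for try await output in session.outputs {
        switch output {
        case .requestComplete(let request, let result):
            print("finished \(request): \(result)")
        case .requestError(let request, let error):
            print("failed \(request): \(error)")
        case .processingComplete:
            return
        default:
            break                         // progress and other informational outputs
        }
    }
}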
OS choosing performance state poorly for GPU use case
I am building a macOS desktop app (https://anukari.com) that is using Metal compute to do real-time audio/DSP processing, as I have a problem that is highly parallelizable and too computationally expensive for the CPU. However, with the way I am using the GPU, the OS never increases the power/performance state, even when my app is fully compute-limited. Because this is a real-time audio synthesis application, it's a huge problem to not be able to take advantage of the full clock speeds that the GPU is capable of, because the app can't keep up with real-time. I discovered this issue while profiling the app using Instruments' Metal tracing (and Game tracing) modes. In the profiling configuration under "Metal Application" there is a drop-down to select the "Performance State." If I run the application under Instruments with Performance State set to Maximum, it runs amazingly well, and all my problems go away. For comparison, when I run the app on its own, outside of Instruments, the expensive GPU computation it's doing takes around 2x as long to complete, meaning that the app performs half as well. I've done a ton of work to micro-optimize my Metal compute code, based on every scrap of information from the WWDC videos, etc. A problem I'm running into is that I think that the more efficient I make my code, the less it signals to the OS that I want high GPU clock speeds! I think part of why the OS is confused is that in most use cases, my computation can be done using only a small number of Metal threadgroups. I'm guessing that the OS heuristics see that only a small fraction of the GPU is saturated and fail to scale up the power/clock state. I'm not sure what to do here; I'm in a bit of a bind. One possibility is that I intentionally schedule busy work -- spin threadgroups just to waste energy and signal to the OS that I need higher clock speeds. This is obviously a really bad idea, but it might work. Is there any other (better) way for my app to signal to the OS that it is doing real-time latency-sensitive computation on the GPU and needs the clock speeds to be scaled up? Note that game mode is not really an option, as my app also runs as an AU plugin inside hosts like GarageBand, so it can't be made fullscreen, etc.
Replies: 6 · Boosts: 0 · Views: 898 · Activity: May ’25
GCDualSenseAdaptiveTrigger API set trigger mode not working
I'm trying to add support for the PS5 DualSense controller. When I try to use the API from here: https://developer.apple.com/documentation/gamecontroller/gcdualsenseadaptivetrigger?language=objc None of the API calls work; have I missed anything? The code is like this: if ( [ controller.extendedGamepad isKindOfClass:[ GCDualSenseGamepad class ] ] ) { GCDualSenseGamepad * dualSenseGamePad = ( GCDualSenseGamepad * )controller.extendedGamepad; auto funcSetEffectTrigger = []( TriggerEffectParams& params, GCDualSenseAdaptiveTrigger *trigger ) { if ( params.m_mode == TriggerEffectMode::Off ) { [ trigger setModeOff ]; NSLog(@"setModeOff trigger.mode:%d", trigger.mode ); } else if ( params.m_mode == TriggerEffectMode::Feedback ) { [ trigger setModeFeedbackWithStartPosition: 0.2f resistiveStrength: 0.5f ]; } else if ( params.m_mode == TriggerEffectMode::Weapon ) { [ trigger setModeWeaponWithStartPosition: 0.2f endPosition: 0.4f resistiveStrength: 0.5f ]; } else if ( params.m_mode == TriggerEffectMode::Vibration ) { [ trigger setModeVibrationWithStartPosition: position amplitude: amplitude frequency: frequency ]; } }; if ( L2 ) { funcSetEffectTrigger( params, dualSenseGamePad.leftTrigger ); } if ( R2 ) { funcSetEffectTrigger( params, dualSenseGamePad.rightTrigger ); } } I've also tried adding the "Game Controllers" capability to the target, but it's still not working. I can't find anything else in the documentation or forums, and I have no idea what I need to do. A Swift version for a quick sanity check follows this entry.
Replies: 5 · Boosts: 0 · Views: 672 · Activity: Jan ’25
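A Swift restatement of the Objective-C above, as a quick way to check whether the adaptive-trigger calls take effect on a connected DualSense at all; the parameter values are arbitrary.

import GameController

func applyTriggerEffect(on controller: GCController) {
    guard let dualSense = controller.extendedGamepad as? GCDualSenseGamepad else { return }

    let trigger = dualSense.rightTrigger          // GCDualSenseAdaptiveTrigger
    trigger.setModeWeapon(startPosition: 0.2, endPosition: 0.4, resistiveStrength: 0.5)
    print("trigger mode after call: \(trigger.mode.rawValue)")

    // Other modes, for reference:
    // trigger.setModeFeedback(startPosition: 0.2, resistiveStrength: 0.5)
    // trigger.setModeOff()
}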
RealityKit and USDZ: Winding Order Issue with Negatively Scaled Meshes
Hi all, I've encountered a potential issue with how the winding order of geometry is handled when its transformation involves negative scaling. I created a simple test asset, a single triangle, to demonstrate this. The triangle's vertices are defined in a counter-clockwise ("right-handed") winding order, and its transform has a negative scale on the X-axis. According to the OpenUSD specification, this negative determinant in the transformation matrix should effectively reverse the winding order of the geometry: However, any given gprim's local-to-world transformation can flip its effective orientation, when it contains an odd number of negative scales. This condition can be reliably detected using the (Jacobian) determinant of the local-to-world transform: if the determinant is less than zero, then the gprim's orientation has been flipped, and therefore one must apply the opposite handedness rule when computing its surface normals (or just flip the computed normals) for the purposes of hidden surface detection and lighting calculations. When I view the asset in tools like Blender or Preview on macOS, it behaves as expected: the triangle's effective orientation is flipped to CW. However, when the same asset is viewed in Reality Composer Pro or with QuickLook on iOS, its effective orientation remains CCW. In other words, the triangle faces the opposite direction. My questions for the community and Apple are: Is this behavior in RealityKit a known issue? If this is a known issue, is there official guidance for DCC tools on how to export USDZ assets to ensure they appear correctly in the Apple ecosystem? Any insights or recommendations would be greatly appreciated. A small sketch of the determinant check follows this entry.
Replies: 5 · Boosts: 0 · Views: 886 · Activity: 3w
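A small sketch of the determinant test quoted from the OpenUSD spec, as one might apply it on the app side: if the local-to-world matrix has a negative determinant, reverse each triangle's winding (or flip the normals) before rendering.

import simd

func effectiveIndices(_ indices: [UInt32], localToWorld m: float4x4) -> [UInt32] {
    // The determinant of the upper-left 3x3 tells us whether the orientation flipped.
    let linearPart = float3x3(SIMD3(m.columns.0.x, m.columns.0.y, m.columns.0.z),
                              SIMD3(m.columns.1.x, m.columns.1.y, m.columns.1.z),
                              SIMD3(m.columns.2.x, m.columns.2.y, m.columns.2.z))
    guard linearPart.determinant < 0 else { return indices }

    // Odd number of negative scales: swap two indices per triangle to reverse winding.
    var flipped = indices
    for i in stride(from: 0, to: flipped.count - 2, by: 3) {
        flipped.swapAt(i + 1, i + 2)
    }
    return flipped
}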
RealityKit SIMD3<Float> precision decreases with distance?
The farther away the center of a large entity is, the less accurate the positioning is? For example, I am changing only the y-axis position of an entity that is tens of meters long, but I notice x and z drifting slowly the farther away the center of the entity is. I would not expect x and z to move. It might be compounding rounding errors somewhere, or maybe the RealityKit engine is deciding not to be super precise about distant objects? Otherwise I just have a bug somewhere. A small illustration of Float precision follows this entry.
Replies: 5 · Boosts: 0 · Views: 562 · Activity: Mar ’25
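Not a statement about RealityKit internals, just a quick illustration of Float32 spacing, which is one plausible source of the drift described above: the representable steps get coarser the farther a coordinate sits from the origin.

// Print the spacing between adjacent Float values at a few magnitudes (in meters).
let magnitudes: [Float] = [1, 10, 100, 1_000, 10_000]
for magnitude in magnitudes {
    print("spacing near \(magnitude): \(magnitude.ulp)")
}
// Near 1 the step is about 1.2e-7, but near 10_000 it is already about 1e-3,
// so millimeter-scale error is plausible once transforms combine large offsets.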
playSoundFileNamed not working on Tahoe?
I have published a number of games that use SpriteKit for everything important. Since the release of macOS Tahoe, I've had a lot of end user reports saying that sound effects have stopped working in many (but not all) of my titles. I'm not doing anything unusual here – typical code is: sndGameOver = [SKAction playSoundFileNamed:@"Audio/GameOver.wav" waitForCompletion:YES]; Then at the appropriate time: [self runAction:sndGameOver]; Has anyone else encountered this? The code still works fine on previous operating systems, and appears to be fine on iOS too. Has something changed in macOS Tahoe? I'm at a bit of a loss. There's nothing obviously different between the titles that do work and the titles that don't. Suggestions welcomed! Thanks
Replies: 5 · Boosts: 2 · Views: 1.2k · Activity: 2w
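The same pattern in Swift, as a minimal check on an affected macOS Tahoe machine, plus an SKAudioNode fallback; the fallback is an assumption worth testing rather than a confirmed fix.

import SpriteKit

class SoundTestScene: SKScene {
    let sndGameOver = SKAction.playSoundFileNamed("Audio/GameOver.wav", waitForCompletion: true)

    func gameOver() {
        run(sndGameOver)

        // Possible fallback: AVAudioEngine-backed playback via SKAudioNode.
        let audioNode = SKAudioNode(fileNamed: "Audio/GameOver.wav")
        audioNode.autoplayLooped = false
        addChild(audioNode)
        audioNode.run(SKAction.play())
    }
}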
GameKit not working as expected in iOS 26.
I just upgraded my macOS, Xcode and Simulator all to the newest beta version 26. Then I found two issues when building my app with Xcode 26 and running it on simulator 26. First, the Game Center access point no longer shows up in the app. This is how it was configured in the past, and it still works on simulator 18.4: func authenticatePlayer() { GKAccessPoint.shared.location = .topTrailing self.localPlayer.authenticateHandler = { viewController, error in if let viewController = viewController { // can present Game Center login screen } else if self.localPlayer.isAuthenticated { // game can be started } else { // user didn't log in, continue the game without game center } } } Second, after the game ends, the leaderboard won't load. This is how it was implemented in the past; it's still working in simulator 18.4: struct GameCenterView: UIViewControllerRepresentable { @Environment(\.presentationMode) var presentationMode ... func makeUIViewController(context: Context) -> GKGameCenterViewController { let viewController = GKGameCenterViewController( leaderboardID: getLeaderBoardID(with: leaderBoardGameMode), playerScope: .global, timeScope: .allTime ) viewController.gameCenterDelegate = context.coordinator return viewController } func updateUIViewController(_ uiViewController: GKGameCenterViewController, context: Context) {} func makeCoordinator() -> Coordinator { Coordinator(self) } class Coordinator: NSObject, GKGameCenterControllerDelegate { let parent: GameCenterView init(_ parent: GameCenterView) { self.parent = parent } func gameCenterViewControllerDidFinish(_ gameCenterViewController: GKGameCenterViewController) { parent.presentationMode.wrappedValue.dismiss() } } }
Replies: 5 · Boosts: 2 · Views: 367 · Activity: Sep ’25
Showing a MTLTexture on an Entity in RealityKit
Is there any standard way of efficiently showing an MTLTexture on a RealityKit Entity? I can't find anything proper on how to, for example, generate a LowLevelTexture out of an MTLTexture. The closest match was this two-year-old thread. In the old SceneKit app, we would just do: guard let material = someNode.geometry?.materials.first else { return }; material.diffuse.contents = mtlTexture. Our flow is as follows (for visualizing the currently detected object): camera stream -> CoreML segmentation -> send the relevant part of the MLShapedArray tensor to a Metal compute shader that returns an MTLTexture -> show the resulting texture on a 3D object to the user. A hedged sketch of one possible route follows this entry.
Replies: 5 · Boosts: 0 · Views: 1k · Activity: Sep ’25
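A heavily hedged sketch of one route we understand to exist for this: a TextureResource.DrawableQueue whose drawables expose an MTLTexture that the compute output can be blitted into each frame. The descriptor values, the "Placeholder" resource name, and the assumption that both textures share a size and pixel format are all unverified here.

import RealityKit
import Metal

func attachLiveTexture(to entity: ModelEntity, commandQueue: MTLCommandQueue,
                       segmentationTexture: MTLTexture) throws {
    let descriptor = TextureResource.DrawableQueue.Descriptor(pixelFormat: segmentationTexture.pixelFormat,
                                                              width: segmentationTexture.width,
                                                              height: segmentationTexture.height,
                                                              usage: [.shaderRead, .renderTarget],
                                                              mipmapsMode: .none)
    let drawableQueue = try TextureResource.DrawableQueue(descriptor)

    // Start from any placeholder texture, then route its contents through the drawable queue.
    let resource = try TextureResource.load(named: "Placeholder")
    resource.replace(withDrawables: drawableQueue)

    var material = UnlitMaterial()
    material.color = .init(texture: .init(resource))
    entity.model?.materials = [material]

    // Per frame: copy the compute-shader output into the next drawable and present it.
    if let drawable = try? drawableQueue.nextDrawable(),
       let commandBuffer = commandQueue.makeCommandBuffer(),
       let blit = commandBuffer.makeBlitCommandEncoder() {
        blit.copy(from: segmentationTexture, to: drawable.texture)
        blit.endEncoding()
        commandBuffer.commit()
        drawable.present()
    }
}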
PSVR2 controllers don't report anything in snapshot
I typically read an extended gamepad capture() and get all the state, but the PSVR2 controllers seem to report nothing, so the stick and other buttons don't do anything in a built app. They register as left/right controllers. This is on visionOS 26, Xcode 26, etc. They work correctly in the main icon view, although they don't honor inverted vertical and horizontal scrolling. Both of the default scrolls just feel wrong: when I move left I want to scroll left, not right, and the same goes for up/down. A minimal polling sketch follows this entry.
Replies: 5 · Boosts: 1 · Views: 639 · Activity: Sep ’25
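A minimal sketch of the polling pattern described above, useful for comparing what the snapshot reports for the PSVR2 controllers versus another pad.

import GameController

func pollControllers() {
    for controller in GCController.controllers() {
        guard let pad = controller.extendedGamepad else { continue }
        let snapshot = pad.capture()        // frozen copy of the current input state
        print(controller.vendorName ?? "unknown",
              "left stick:", snapshot.leftThumbstick.xAxis.value, snapshot.leftThumbstick.yAxis.value,
              "A pressed:", snapshot.buttonA.isPressed)
    }
}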
Metal and Swift Concurrency
Hi, Introducing Swift Concurrency to my Metal app has been a bit challenging, as Swift Concurrency is limited by the cooperative thread pool. GPU work is obviously not CPU-bound and can block forward progress, especially when using waitUntilCompleted on the command buffer. For concurrent render work this has the potential of underutilizing the CPU and even creating deadlocks. My question is, what is the Metal team's general recommendation when it comes to concurrency? It seems to me that Dispatch or OperationQueues are still the preferred way for Metal-bound tasks in order to gain maximum performance? To integrate with Swift Concurrency, my idea is to use continuations that kick off render jobs via Dispatch or queues. Would this be the best solution to bridge async tasks with Metal work? Thanks! A hedged sketch of the continuation approach follows this entry.
Replies: 5 · Boosts: 0 · Views: 1.1k · Activity: Apr ’25
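A hedged sketch of the continuation idea floated above: encode wherever is convenient, but await completion through addCompletedHandler instead of blocking a cooperative-pool thread with waitUntilCompleted. Whether this beats a plain dispatch queue for a given workload is the open question.

import Metal

func runCompute(on queue: MTLCommandQueue, pipeline: MTLComputePipelineState) async throws {
    guard let commandBuffer = queue.makeCommandBuffer(),
          let encoder = commandBuffer.makeComputeCommandEncoder() else { return }
    encoder.setComputePipelineState(pipeline)
    // ... bind buffers and dispatch threadgroups here ...
    encoder.endEncoding()

    try await withCheckedThrowingContinuation { (continuation: CheckedContinuation<Void, Error>) in
        // Register the handler before committing so completion is never missed.
        commandBuffer.addCompletedHandler { buffer in
            if let error = buffer.error {
                continuation.resume(throwing: error)
            } else {
                continuation.resume()
            }
        }
        commandBuffer.commit()
    }
}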