Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

Posts under Graphics & Games topic

RealityKit, Attachments - not working
The simplest RealityView { content, attachments in ... } causes "Contextual closure expects 1 argument, but 2 were used in closure body". I have checked every example and I cannot understand why I get this error regardless of the content. Note: I have added Attachment(id: "test") to the attachments closure and get "Attachment not in scope". I have imported both RealityKit and SwiftUI.
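For reference, a minimal sketch of the two-argument form, assuming a visionOS-style target where the attachments overload of RealityView and the Attachment view are available (the id "test" follows the post; the Text content is illustrative):

import SwiftUI
import RealityKit

struct ImmersiveView: View {
    var body: some View {
        RealityView { content, attachments in
            // The make closure only receives (content, attachments) when
            // the trailing attachments: closure is supplied.
            if let panel = attachments.entity(for: "test") {
                content.add(panel)
            }
        } attachments: {
            Attachment(id: "test") {
                Text("Hello")
            }
        }
    }
}

If the build target's platform or deployment version doesn't support this overload, the compiler resolves RealityView to the content-only initializer, which would explain both the one-argument diagnostic and Attachment being out of scope.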
Replies: 2 · Boosts: 0 · Views: 176 · Activity: 2d
Metal 4 Argument Tables
I am puzzled by setAddress(_:attributeStride:index:) on MTL4ArgumentTable. Can anyone please explain what the attributeStride parameter is for? The docs say it is "The stride between attributes in the buffer," but why? Who uses this, and for what? On the C++ side, in the shaders, the stride is determined by the C++ type, as far as I know. What am I missing here? Thanks!
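For comparison, a hedged sketch of the pre-Metal 4 analogue. Since Metal 3, a vertex descriptor layout can declare its stride as dynamic and defer the actual value to bind time; the assumption here (not confirmed by the documentation quoted above) is that attributeStride on MTL4ArgumentTable serves the same purpose for buffers fetched through a vertex descriptor rather than through a typed pointer:

import Metal

func bindDynamicStrideVertexBuffer(encoder: MTLRenderCommandEncoder, buffer: MTLBuffer) {
    // The matching pipeline's vertex descriptor would declare
    //   layouts[0].stride = MTLBufferLayoutStrideDynamic
    // so the stride is unknown until the buffer is bound.
    encoder.setVertexBuffer(buffer,
                            offset: 0,
                            attributeStride: MemoryLayout<SIMD3<Float>>.stride,
                            index: 0)
}

When the shader reads through a typed C++ pointer the stride is indeed fixed by the type, so the parameter would only matter for dynamically strided layouts.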
Replies: 0 · Boosts: 0 · Views: 427 · Activity: 2w
MPSMatrixRandom SEGFAULTs when run in an async context
The following minimal snippet SEGFAULTs with SDK 26.0 and 26.1. It won't crash if I remove async from the enclosing function signature, but that's impractical in a real project.

import Metal
import MetalPerformanceShaders

let SEED = UInt64(0x0)
typealias T = Float16

/* Why run in an async context? Because of a global GPU object, and async
   makeMTLFunction and makeMTLComputePipelineState. Nevertheless, the bug
   can be triggered without using the global:
   @MainActor let myGPU = MyGPU() */

@main
struct CMDLine {
    static func main() async {
        let ptr = UnsafeMutablePointer<T>.allocate(capacity: 0)
        async let future: Void = randomFillOnGPU(ptr, count: 0)
        print("Main thread is playing around")
        await future
        print("Successfully reached the end.")
    }

    static func randomFillOnGPU(_ buf: UnsafeMutablePointer<T>, count destbufcount: Int) async {
        // let (device, queue) = await (myGPU.device, myGPU.commandqueue)
        let myGPU = MyGPU()
        let (device, queue) = (myGPU.device, myGPU.commandqueue)
        // Init MTLBuffer, async let makeFunction, makeComputePipelineState, etc.
        let tempDataType = MPSDataType.uInt32
        let randfiller = MPSMatrixRandomMTGP32(device: device,
                                               destinationDataType: tempDataType,
                                               seed: Int(bitPattern: UInt(SEED)))
        print("randomFillOnGPU: successfully created MPSMatrixRandom.")
        // try await computePipelineState
        // ^ Crashes before this could return,
        //   or in this minimal case, after randomFillOnGPU() returns.
        // make encoder, set pso, dispatch, commit...
    }
}

actor MyGPU {
    let device: MTLDevice
    let commandqueue: MTLCommandQueue

    init() {
        guard let dev: MTLDevice = MPSGetPreferredDevice(.skipRemovable),
              let cq = dev.makeCommandQueue(),
              dev.supportsFamily(.apple6) || dev.supportsFamily(.mac2)
        else {
            print("Unable to get Metal Device! Exiting")
            exit(EX_UNAVAILABLE)
        }
        print("Selected device: \(String(format: "%llX", dev.registryID))")
        self.device = dev
        self.commandqueue = cq
        print("myGPU: initialization complete.")
    }
}

See FB20916929. Apparently the Objective-C autorelease pool is releasing the wrong address during a context switch (across suspension points). I wonder why such an obvious case has not been caught before.
Replies: 0 · Boosts: 0 · Views: 63 · Activity: Nov ’25
Game Center Challenges and Activities are not appearing
Hi, I'm trying to add Game Center challenges and activities to an already-live game, but they are not appearing in-game for testing, in Game Center, or in the Games app. I know the game is set up with GameKit entitlements, since it is a live game with working leaderboards and achievements. I've updated to Tahoe beta 8, added a challenge and an activity in App Store Connect, added those to a new distribution, and added that distribution to 'Add for Review'. I'm using Unity and the Apple Unity plug-in. Not sure what other steps I'm missing. Thanks
Replies: 0 · Boosts: 0 · Views: 1.1k · Activity: Sep ’25
Float8 and Float16 "Reserved_Name__Do_not_use"
I am developing a macOS terminal app, running on an M4 Pro, and using Metal. I am not able to use float8 or float16; both report Variable has incomplete type 'float16' (aka '__Reserved_Name__Do_not_use_float16'). Based on the system I should be able to use these. Either it is because the target also compiles for Intel, where these types are not allowed, or it is something else; either way, I have not been able to figure out how to get past this. Is there a compiler setting I need to set to make this work? If so, which one, and what value does it need? I only want to run this on M-series processors on the latest version of the OS, so I am not interested in an Intel version or backward compatibility.
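If the host-side code can be Swift, a small hedged sketch of type spellings that sidestep the reserved C names (an illustration of alternatives, not a confirmed fix for the simd-header error above):

import simd

// 16-bit floating-point scalar; available in Swift on Apple silicon.
let h: Float16 = 1.5

// 8- and 16-wide float vectors, the Swift counterparts of the simd
// library's float8 / float16 vector types.
let v8 = SIMD8<Float>(repeating: 2.0)
let v16 = SIMD16<Float>(repeating: 3.0)

print(h, v8.sum(), v16.sum())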
Replies: 4 · Boosts: 0 · Views: 178 · Activity: Aug ’25
Metal useResource vs. MTLFence
Hello, I'm tracking down a bug where useResource doesn't seem to apply proper synchronization when a resource is produced by a render pass and then consumed by a compute pass, but when I use an MTLFence to signal and wait between the render and compute encoders, the artifact goes away. The resource is created with MTLHazardTrackingModeTracked, and useResource is called on the compute encoder after the render pass. Metal API validation doesn't report any warnings or errors. Am I misunderstanding the difference between the two APIs? I dug through the Metal documentation, and it looks like useResource should handle synchronization given that the resource has MTLHazardTrackingModeTracked; on the other hand, MTLFence should be used to ensure proper synchronization between command encoders. Can someone clarify the difference between the two APIs and when to use them?
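For reference, a minimal sketch of the fence-based pattern described above (names and pipeline setup are illustrative; the draw and dispatch calls are elided):

import Metal

func encodeRenderThenCompute(commandBuffer: MTLCommandBuffer,
                             renderPass: MTLRenderPassDescriptor,
                             sharedResource: MTLTexture,
                             fence: MTLFence) {
    // Render pass produces the resource.
    let render = commandBuffer.makeRenderCommandEncoder(descriptor: renderPass)!
    // ... set pipeline state and issue draws that write sharedResource ...
    render.updateFence(fence, after: .fragment) // signal once fragment-stage writes finish
    render.endEncoding()

    // Compute pass consumes it.
    let compute = commandBuffer.makeComputeCommandEncoder()!
    compute.waitForFence(fence)                 // wait for the render pass's signal
    compute.useResource(sharedResource, usage: .read)
    // ... set pipeline state and dispatch threads that read sharedResource ...
    compute.endEncoding()
}

With hazard tracking enabled (MTLHazardTrackingModeTracked) the fence should in principle be redundant between encoders on the same queue, which is what makes the artifact described above surprising.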
Replies: 3 · Boosts: 0 · Views: 146 · Activity: Jul ’25
Improving person segmentation and occlusion quality in RealityKit
I’m building an app that uses RealityKit and specifically ARConfiguration.FrameSemantics.personSegmentationWithDepth. The goal is to insert an AR object into the scene behind a person, and an additional AR object in front of the person, while being as photorealistic as possible. Through testing, I've noticed that the edges of the person segmentation mask often do not match the actual person well, and parts of the person are transparent, with the AR object bleeding through. It's sort of like a "bad green screen" effect, which I'd expect to see a little, but not to this extent. I've been testing on iPhone 16, iPhone 14 Pro, iPad Pro 12.9-inch (6th generation), and iPhone 12 Pro, with similar results across all devices. I'm wondering what else I can do to improve this, whether code changes, platform (like different iPhone models), or environment (like lighting, distance, etc.). Attaching some example screen grabs and a minimum reproducible code sample. Appreciate any insights!

import ARKit
import SwiftUI
import RealityKit

struct RealityViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        arView.environment.sceneUnderstanding.options.insert(.occlusion)
        arView.renderOptions.insert(.disableMotionBlur)
        arView.renderOptions.insert(.disableDepthOfField)

        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
            configuration.frameSemantics.insert(.personSegmentationWithDepth)
        }

        arView.session.run(configuration)
        arView.session.delegate = context.coordinator
        context.coordinator.arView = arView
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {} // required by UIViewRepresentable

    func makeCoordinator() -> Coordinator { Coordinator(self) }

    class Coordinator: NSObject, ARSessionDelegate {
        var parent: RealityViewContainer
        weak var arView: ARView?
        var floorAnchor: ARPlaneAnchor?

        init(_ parent: RealityViewContainer) {
            self.parent = parent
        }

        func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
            if let arView, floorAnchor == nil {
                for anchor in anchors {
                    if let horizontalPlaneAnchor = anchor as? ARPlaneAnchor,
                       horizontalPlaneAnchor.alignment == .horizontal,
                       horizontalPlaneAnchor.transform.columns.3.y < arView.cameraTransform.translation.y { // filter out ceiling
                        floorAnchor = horizontalPlaneAnchor
                        let backgroundEntity = BackgroundEntity() // custom entity type
                        let anchorEntity = AnchorEntity(anchor: horizontalPlaneAnchor)
                        anchorEntity.addChild(backgroundEntity)
                        let foregroundEntity = ForegroundEntity() // custom entity type
                        backgroundEntity.addChild(foregroundEntity)
                        arView.scene.addAnchor(anchorEntity)
                        arView.installGestures([.rotation, .translation], for: backgroundEntity)
                        break // stop after adding the first horizontal plane (floor)
                    }
                }
            }
        }
    }
}
Replies: 1 · Boosts: 0 · Views: 103 · Activity: May ’25
Is there a way to get all the turn-based matches that can be joined?
I am using GameKit from Unity (the Apple Unity plug-in) to implement a turn-based game. I want to build a UI in Unity that shows all the games I can join. I tried using var matches = await GKTurnBasedMatch.LoadMatches(); to get all the open matches, but it seems that I can only get the matches related to the current Apple account. Can you help me get all the matches?

Also, I used var match = await GKTurnBasedMatchmakerViewController.Request(request); to dismiss the Game Center interface and start a game (automatic matching, no one invited). Another device used var match = await GKTurnBasedMatch.Find(request); to find that game, but instead of finding it, it started a new game (automatic matching). Can you help me solve these problems?
Replies: 0 · Boosts: 0 · Views: 219 · Activity: Mar ’25
JPEG2000 (JP2) Decoding Works on iOS 16 but Fails on iOS 18
I am extracting a JPEG2000 (JP2) facial image from an NFC passport chip (ISO/IEC 19794-5) and attempting to create a UIImage from it. On iOS 16, the following code works fine:

import ImageIO
import UIKit

func getUIImage(from imageData: [UInt8]) -> UIImage? {
    let data = Data(imageData)
    guard let imageSource = CGImageSourceCreateWithData(data as CFData, nil),
          let cgImage = CGImageSourceCreateImageAtIndex(imageSource, 0, nil) else {
        print("Failed to decode JP2 image!")
        return nil
    }
    return UIImage(cgImage: cgImage)
}

However, on iOS 18, this fails with errors like:

initialize:1415: *** invalid JPEG2000 file ***
makeImagePlus:3752: *** ERROR: 'JP2 ' - failed to create image [-50]
CGImageSourceCreateImageAtIndex: *** ERROR: failed to create image [-59]

Questions:
- Did Apple remove or modify JPEG2000 support in iOS 18?
- Is there an official workaround for decoding JPEG2000 on iOS 18?
- Should I use Vision/Metal/Core Image instead?
- Is there a recommended way to convert JPEG2000 to JPEG/PNG before creating a UIImage?
- Are there any Apple-provided APIs that maintain backward compatibility for JPEG2000 decoding?

Additional info:
- The UInt8 array has a valid JPEG2000 header (0x00 0x00 0x00 0x0C 6A 50 ...).
- The image works on iOS 16 but fails on iOS 18.
- Tested on an iPhone running the iOS 18.0 beta.

Any insights on how to handle JPEG2000 decoding in iOS 18 would be greatly appreciated! 🚀
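One hedged diagnostic sketch that may help narrow this down (an assumption-labeled check, not a confirmed explanation of the iOS 18 behavior): ImageIO can report at runtime which source types it registers, so you can test whether the JPEG2000 UTI is still present on a given OS before attempting to decode.

import ImageIO

// Lists the decoder UTIs ImageIO registers on this OS release and
// checks for the JPEG2000 identifier.
func jp2DecoderAvailable() -> Bool {
    let ids = (CGImageSourceCopyTypeIdentifiers() as? [String]) ?? []
    return ids.contains("public.jpeg-2000")
}

print(jp2DecoderAvailable() ? "JP2 decoder registered" : "JP2 decoder missing")

If the UTI is still registered but decoding fails, the regression is more likely in the decoder itself than in format support being removed outright.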
Replies: 3 · Boosts: 0 · Views: 336 · Activity: Mar ’25
How to use imageblock_slice
Is there a working example of imageblock_slice with implicit layout somewhere? I get a compilation error when I write this:

imageblock_slice color_slice = img_blk.slice(frag->color);

Error:
No matching member function for call to 'slice'
candidate template ignored: couldn't infer template argument 'E'
candidate function template not viable: requires 2 arguments, but 1 was provided
Too few template arguments for class template 'imageblock_slice'

It seems the syntax has changed since the Imageblocks presentation (https://developer.apple.com/videos/play/tech-talks/603/). I tried supplying the struct type of the image block between <> but it still does not work.
Replies: 1 · Boosts: 0 · Views: 667 · Activity: Dec ’24
Comparing colors of two ModelEntities
I want to compare the colors of two model entities (spheres). How can I do it? The method I'm currently trying to apply is as follows:

if case let .color(controlColor) = controlMaterial.baseColor, controlColor == .green {
    // Flip target sphere colour
    if let targetMaterial = targetsphere.model?.materials.first as? SimpleMaterial,
       case let .color(targetColor) = targetMaterial.baseColor,
       targetColor == .blue {
        targetsphere.model?.materials = [SimpleMaterial(color: .green, isMetallic: false)] // Change to |1⟩
    } else {
        targetsphere.model?.materials = [SimpleMaterial(color: .blue, isMetallic: false)] // Change to |0⟩
    }
}

This property (baseColor) was deprecated in the 15.0 SDKs in favor of 'color', but I cannot compare the color values to each other. 👾
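A hedged alternative sketch, assuming a RealityKit version where SimpleMaterial exposes color (whose tint is a platform color) in place of the deprecated baseColor; the entity names and helper are illustrative:

import RealityKit
import UIKit

func colorsMatch(_ a: ModelEntity, _ b: ModelEntity) -> Bool {
    guard let m1 = a.model?.materials.first as? SimpleMaterial,
          let m2 = b.model?.materials.first as? SimpleMaterial else {
        return false
    }
    // SimpleMaterial.color.tint is a UIColor, which compares by value
    // via isEqual / ==.
    return m1.color.tint == m2.color.tint
}

Comparing against a constant (e.g. m1.color.tint == .green) works the same way, though UIColor equality is exact and colorspace-sensitive, so comparing colors you assigned yourself is the reliable case.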
Replies: 1 · Boosts: 0 · Views: 636 · Activity: Jan ’25
SceneView selective draw since concurrency
I have used SceneKit for several years, but recently I have a problem where a scene with fewer than 50 nodes is partially drawn, i.e., some nodes are drawn and some aren't, while scenes with more than 50 nodes always draw correctly. This seems to have happened since concurrency was introduced. (With respect to concurrency, I had been using DispatchQueue successfully before then.) Since all nodes (few or many) are constructed and implemented by the same functions, I'm baffled. When I print the node hierarchy, all nodes are present, whether few or many. SceneView() has the [.rendersContinually] option selected. Every node created (few or many) has .opacity = 1.0 and .isHidden = false. I haven't tried setting back the compiler version, as that is not a long-term solution, and I know the same code worked fine then.
Replies: 8 · Boosts: 0 · Views: 739 · Activity: Feb ’25
SceneKit: does SCNMorpher support SCNGeometry with SCNLevelOfDetail?
In my project, I have several nodes (SCNNode) with levels of detail (SCNLevelOfDetail) and everything works correctly, but when I add animation using morphing (SCNMorpher), the animation works correctly but without the levels of detail. Note: the entire scene is created in Autodesk 3D Studio Max and then exported in .ASE format. The goal is to make animations using morphing that keep their levels of detail. Does anyone know if SCNMorpher supports geometry with levels of detail? I appreciate any information about this case. Thanks everyone!!!

Part of the code I use to load geometries (SCNGeometry) with levels of detail (SCNLevelOfDetail) as morph targets (SCNMorpher):

node.morpher = [SCNMorpher new];
SCNGeometry *geometry = [self geometryWithMesh:mesh];
NSMutableArray<SCNLevelOfDetail *> *mutLevelOfDetail = [NSMutableArray arrayWithCapacity:self.mutLevelsOfDetail.count];
for (int i = 0; i < self.mutLevelsOfDetail.count; i++) {
    ASCGeomObject *geomObject = self.mutLevelsOfDetail[i];
    // Build one LOD geometry per level; named lodGeometry to avoid
    // shadowing the outer `geometry`.
    SCNGeometry *lodGeometry = [self geometryWithMesh:geomObject.mesh.mutMeshAnimation[i]];
    [mutLevelOfDetail addObject:[SCNLevelOfDetail levelOfDetailWithGeometry:lodGeometry
                                                         worldSpaceDistance:geomObject.worldSpaceDistance]];
}
geometry.levelsOfDetail = mutLevelOfDetail;
node.morpher.targets = [node.morpher.targets arrayByAddingObject:geometry];
Replies: 2 · Boosts: 0 · Views: 195 · Activity: Mar ’25
How can I get pixel coordinates in the fragment tile function?
In this video, tile fragment shading is recommended for image processing. In that example, the unpack function takes two arguments, one of which is RasterizerData. As I understand it, this is the data passed to us from the previous (vertex) stage of the graphics pipeline. However, the properties of MTLTileRenderPipelineDescriptor do not include an option for specifying a vertex function. Therefore, this render pass uses a mix of commands: first, a draw command is executed to obtain UV coordinates, and then threads are dispatched. My question is: without using a draw command, using only dispatch, how can I get pixel coordinates in a fragment tile function? For a kernel tile function, everything is clear.

typedef struct {
    float4 OPTexture       [[ color(0) ]];
    float4 IntermediateTex [[ color(1) ]];
} FragmentIO;

fragment FragmentIO Unpack(RasterizerData in [[ stage_in ]],
                           texture2d<float, access::sample> srcImageTexture [[ texture(0) ]])
{
    FragmentIO out;
    // ...
    // Run necessary per-pixel operations
    out.OPTexture = /* assign computed value */;
    out.IntermediateTex = /* assign computed value */;
    return out;
}
Replies: 1 · Boosts: 0 · Views: 168 · Activity: Mar ’25
Game Center save game data to iCloud
We are trying to implement saving and fetching game data to and from iCloud, but are having some problems. macOS: 15.3

Here is what I do: enable the Game Center and iCloud capabilities in Signing & Capabilities, pick iCloud Documents, and create and select a container.

Sample code:

void SaveDataToCloud(const void* buffer, unsigned int datasize, const char* name)
{
    if (!GKLocalPlayer.localPlayer.authenticated)
        return;
    NSData* data = [NSData dataWithBytes:buffer length:datasize];
    NSString* filename = [NSString stringWithUTF8String:name];
    [[GKLocalPlayer localPlayer] saveGameData:data
                                     withName:filename
                            completionHandler:^(GKSavedGame* _Nullable savedGame, NSError* _Nullable error) {
        if (error != nil) {
            NSLog(@"SaveDataToCloud error:%@", [error localizedDescription]);
        }
    }];
}

void FetchCloudSavedGameData()
{
    if (!GKLocalPlayer.localPlayer.authenticated)
        return;
    [[GKLocalPlayer localPlayer] fetchSavedGamesWithCompletionHandler:^(NSArray<GKSavedGame*>* _Nullable savedGames, NSError* _Nullable error) {
        if (error == nil) {
            for (GKSavedGame* item in savedGames) {
                [item loadDataWithCompletionHandler:^(NSData* _Nullable data, NSError* _Nullable error) {
                    if (error == nil) {
                        // handle data
                    } else {
                        NSLog(@"FetchCloudSavedGameData failed to load iCloud file: %@, error:%@",
                              item.name, [error localizedDescription]);
                    }
                }];
            }
        } else {
            NSLog(@"FetchCloudSavedGameData error:%@", [error localizedDescription]);
        }
    }];
}

Neither saveGameData nor fetchSavedGamesWithCompletionHandler reports an error. When debugging, the saveGameData completion handler gets a nil error and a valid savedGame, but when we reboot the game and use fetchSavedGamesWithCompletionHandler to fetch the data, we get nothing: no error is reported and savedGames has length 0. Following this page https://developer.apple.com/forums/thread/718541?answerId=825596022#825596022 we tried waiting 30 seconds after authentication before calling fetchSavedGamesWithCompletionHandler, and still got the same result.

Checked:
- Game Center and iCloud are enabled and logged in with the same account.
- iCloud has enough space to save.

So what's wrong with it?
Replies: 2 · Boosts: 0 · Views: 610 · Activity: Mar ’25
Unable to package in UE5.6
I'm new to the Mac, but definitely not to UE. On Windows, packaging is a long process, but it can be done. The documentation from Epic and from the internet is basically nonexistent on exactly how to package a project within UE on a Mac. I have Xcode installed, which makes sense; I agreed to the terms and installed the macOS components. I've been working on a project for several weeks now and want to package a test build for my friends to play on Windows. Now I just get this in the log:

UATHelper: Packaging (Mac): ERROR: Failed to finalize the .app with Xcode. Check the log for more information
UATHelper: Packaging (Mac): Trace written to file /Users/rileysleger/Library/Logs/Unreal Engine/LocalBuildLogs/UBA-ProjectNightTerror-Mac-Development.uba with size 12.6kb
UATHelper: Packaging (Mac): Total time in Unreal Build Accelerator local executor: 8.12 seconds
UATHelper: Packaging (Mac): Result: Failed (OtherCompilationError)
UATHelper: Packaging (Mac): Total execution time: 9.71 seconds
PackagingResults: Error: Failed to finalize the .app with Xcode. Check the log for more information
UATHelper: Packaging (Mac): Took 9.77s to run dotnet, ExitCode=6
UATHelper: Packaging (Mac): UnrealBuildTool failed. See log for more details. (/Users/rileysleger/Library/Logs/Unreal Engine/LocalBuildLogs/UBA-ProjectNightTerror-Mac-Development.txt)
UATHelper: Packaging (Mac): AutomationTool executed for 0h 0m 10s
UATHelper: Packaging (Mac): AutomationTool exiting with ExitCode=6 (6)
UATHelper: Packaging (Mac): RunUAT ERROR: AutomationTool was unable to run successfully. Exited with code: 6
PackagingResults: Error: AutomationTool was unable to run successfully. Exited with code: 6
PackagingResults: Error: Unknown Error

This makes absolutely no sense to me. Anyone have ideas?
Replies: 2 · Boosts: 0 · Views: 271 · Activity: Jul ’25