Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

Posts under Graphics & Games topic

Post · Replies · Boosts · Views · Activity

Metal-cpp-extensions isn't working inside frameworks
I am building a framework in C++ using metal-cpp, basically a small game engine, and I am also using the metal-cpp-extensions provided in LearnMetalCPP to make applications work. For one of my classes I needed to include AppKit.hpp in a public header file, so I moved it and its associated headers (NSApplication.hpp, NSMenu.hpp, etc.) from Project to Public in Build Phases > Headers. After that, it started giving me the error "cast of C pointer type 'void *' to Objective-C pointer type 'Class' requires a bridged cast" at several points in the AppKit headers. The errors don't appear when AppKit and its associated headers are Project headers, or when they are Private headers and no other headers import them. I expected that disabling Objective-C ARC and enabling "Use __bridge casts outside of ARC" in Build Settings would solve it, but it didn't budge. I also hoped the answer wouldn't involve actively changing the headers, but even when I put __bridge before the problematic casts, the compiler didn't recognize __bridge. How do I solve this? And why does it only happen with Public headers and not Project headers?
2 replies · 0 boosts · 798 views · Jan ’25
Metal: Non-uniform thread groups unsupported in Simulator? Is it?
My app runs compute shaders that use non-uniform threadgroup sizes. When I run the app in the debugger with a simulator target, it crashes on encoder.dispatchThreads and the error message is: "Dispatch Threads with Non-Uniform Threadgroup Size is not supported on this device." Earlier, the log output states: "Metal Shader Validation is unsupported for Simulator." However, when I stop the debugger and just run the app in the simulator without the debugger attached, the app runs fine and does not crash. The SwiftUI preview, which also triggers the compute shader when preparing data, likewise runs fine without a crash. I can run and debug on a real device with no problem; I just don't have all device sizes available. Is there anything I need to check in my lldb/simulator configuration? It obviously does work, so is it just that the debugger cannot deal with it? Any input would be welcome, as this really slows me down: I have to be extremely careful when debugging on the simulator. (A runtime capability check is sketched after this entry.)
2 replies · 0 boosts · 567 views · Mar ’25
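One way to keep the same code path safe on targets that lack the feature (a minimal sketch under assumptions, not taken from the poster's project; function and variable names are illustrative) is to check the GPU family before calling dispatchThreads and fall back to a uniform dispatchThreadgroups otherwise:

```swift
import Metal

// Sketch: dispatch a compute kernel, preferring non-uniform threadgroup sizes
// where the GPU family supports them, and falling back to a uniform dispatch
// elsewhere (for example, in the Simulator). In the fallback case the kernel
// must guard against out-of-bounds thread positions itself.
func dispatchCompute(encoder: MTLComputeCommandEncoder,
                     device: MTLDevice,
                     pipeline: MTLComputePipelineState,
                     gridSize: MTLSize) {
    let w = pipeline.threadExecutionWidth
    let h = pipeline.maxTotalThreadsPerThreadgroup / w
    let threadsPerGroup = MTLSize(width: w, height: h, depth: 1)

    if device.supportsFamily(.apple4) || device.supportsFamily(.mac2) {
        // Non-uniform threadgroup sizes are available on these GPU families.
        encoder.dispatchThreads(gridSize, threadsPerThreadgroup: threadsPerGroup)
    } else {
        // Round the grid up to whole threadgroups.
        let groups = MTLSize(width: (gridSize.width + w - 1) / w,
                             height: (gridSize.height + h - 1) / h,
                             depth: 1)
        encoder.dispatchThreadgroups(groups, threadsPerThreadgroup: threadsPerGroup)
    }
}
```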
Learn Metal
I am interested in learning the Metal framework for rendering development. However, most of Apple’s official documentation uses Objective-C code, so I am seeking guidance on whether it is more advantageous for me to focus solely on learning Swift to gain proficiency in Metal. (A small Swift snippet follows this entry for reference.)
2 replies · 0 boosts · 815 views · Jan ’25
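For context, and purely as a sketch (nothing here is taken from the post): the Metal API is fully exposed to Swift, so the Objective-C samples in the documentation map almost one-to-one onto Swift calls like these; the shaders themselves are written in the Metal Shading Language either way.

```swift
import Metal

// Minimal Metal setup from Swift: the same objects the Objective-C samples create.
guard let device = MTLCreateSystemDefaultDevice(),
      let commandQueue = device.makeCommandQueue() else {
    fatalError("Metal is not supported on this device")
}

// The default library holds the compiled .metal shaders bundled with the app.
let library = device.makeDefaultLibrary()
print("GPU:", device.name, "queue:", commandQueue.label ?? "unlabeled", "library:", library != nil)
```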
[tvOS] Reacting to button taps
I've just started working on my first SpriteKit game that will eventually run on both tvOS and iOS, and I'm looking at how to build a "button". So far I've got a custom node that looks like:

```swift
class MyButton: SKSpriteNode {
    ...
    #if os(tvOS)
    override var canBecomeFocused: Bool { true }
    override func didUpdateFocus(...) { ... }
    #endif
}
```

The above lets me nicely handle focus changes on tvOS, and now I'm looking at reacting to selecting the button. Searching around, all the articles/questions/posts are from 2015-2016, which is a LOOOONG time ago. Most of the guidance is to add a tap gesture recognizer in the owning scene and have the scene hand it off to the button. That seems pretty brittle, and I'd much prefer the button itself to be responsible for its own tap management. So I guess my question is whether I should just add a gesture recognizer to my custom button class. Is this inefficient if I end up having 7-8 buttons on the screen and each one has its own gesture recognizer? Somewhat related: all of the 10-year-old advice is that if we add recognizers to scenes, they need to be removed in the view controller; however, in the modern SwiftUI world my project doesn't even have a view controller (yet, anyway). What gesture-recognizer lifecycle management do I need in a SpriteKit scene that is presented within a SpriteKitView? Or is there a better way? I was kind of hoping that overriding pressesBegan() (or something similar) in my custom button might be triggered on tvOS, the way touchesBegan() lets me manage touches in the iOS variant of my app. Any pointers or suggestions would be gladly received. Thanks. (One focus-based routing sketch follows this entry.)
2 replies · 0 boosts · 693 views · Jan ’25
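One pattern that avoids a recognizer per button (a sketch under assumptions; MyButton and its activate() method are placeholders, not from the post): keep a single tap recognizer on the SKView, restrict it to the Siri Remote select press, and route it to whichever node currently has focus.

```swift
import SpriteKit
import UIKit

class GameScene: SKScene {
    #if os(tvOS)
    override func didMove(to view: SKView) {
        // One recognizer on the view, limited to the "select" press on the remote.
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleSelect))
        tap.allowedPressTypes = [NSNumber(value: UIPress.PressType.select.rawValue)]
        view.addGestureRecognizer(tap)
    }

    @objc private func handleSelect() {
        // The focus system already knows which node is focused; forward the press to it.
        guard let skView = self.view,
              let button = UIFocusSystem.focusSystem(for: skView)?.focusedItem as? MyButton else {
            return
        }
        button.activate()   // hypothetical method on the poster's MyButton class
    }
    #endif
}
```

Because the recognizer is owned by the SKView and installed in didMove(to:), there is no separate view-controller bookkeeping; if you swap scenes on the same SKView, you would remove it in willMove(from:).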
Resources for Retro style Games wanting 90 degree Window corners
I've been thinking of bringing some older games back to the modern Mac: rewriting the old titles in Swift while keeping the original data files, which assume a window with non-rounded, 90-degree corners. Many of these games need every bit of the window space of a square-cornered window. Can anyone point me at some useful workarounds, or is Apple simply deaf to the needs of this type of product? (One possible workaround is sketched after this entry.)
2 replies · 1 boost · 428 views · 6d
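One workaround that is sometimes suggested (a sketch offered as an assumption, not an Apple-documented answer to this question): a borderless NSWindow draws none of the rounded title-bar chrome, so its content fills a plain rectangle. Borderless windows refuse key status by default, so a small subclass opts back in for keyboard input, and you take over moving and closing yourself.

```swift
import AppKit

// A borderless window has square corners because it has no system window chrome.
final class SquareCornerWindow: NSWindow {
    override var canBecomeKey: Bool { true }    // accept keyboard focus despite .borderless
    override var canBecomeMain: Bool { true }
}

let window = SquareCornerWindow(contentRect: NSRect(x: 200, y: 200, width: 640, height: 480),
                                styleMask: [.borderless],
                                backing: .buffered,
                                defer: false)
window.isMovableByWindowBackground = true   // no title bar, so allow dragging by the background
window.makeKeyAndOrderFront(nil)
```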
RealityView postProcess effect depth texture
Hello, a question re: iOS RealityView postProcess. I've got a working postProcess kernel and I'd like to add some depth-based effects to it. Theoretically I should be able to just do:

```swift
encoder.setTexture(context.sourceDepthTexture, index: 1)
```

and then in the kernel:

```metal
texture2d<float, access::read> depthIn [[texture(1)]]
...
outTexture.write(depthIn.read(gid), gid);
```

But I consistently see all black rendered to the view. The postProcess shader works, so that's not the issue; it just seems to not be receiving actual depth information. (If I set a breakpoint at the encoder's setTexture step, I can preview the color texture of the scene, but the context's depthTexture looks like all NaN / blank.) I've looked at all the WWDC samples, but they use ARView for all the depth sample code, which has a different set of configuration options than RealityView. So far I haven't seen anywhere to explicitly tell RealityView to "include the depth information", so I'm not sure if I'm missing something there. It appears that there is indeed a depth texture being passed, but it looks blank. Is there a working example somewhere that we can reference?
2 replies · 0 boosts · 500 views · Nov ’25
SceneKit: Does SCNMorpher support SCNGeometry with SCNLevelOfDetail?
In my project I have several nodes (SCNNode) with levels of detail (SCNLevelOfDetail) and everything works correctly, but when I add animation using morphing (SCNMorpher), the animation works correctly but without the levels of detail. Note: the entire scene is created in Autodesk 3D Studio Max and then exported in .ASE format. The goal is to make animations using morphing that still have levels of detail. Does anyone know whether SCNMorpher supports geometry that has levels of detail? I appreciate any information about this case. Thanks everyone!!!

Part of the code I use to load geometries (SCNGeometry) with levels of detail (SCNLevelOfDetail) using morphing (SCNMorpher):

```objc
node.morpher = [SCNMorpher new];
SCNGeometry *geometry = [self geometryWithMesh:mesh];

NSMutableArray<SCNLevelOfDetail *> *mutLevelOfDetail =
    [NSMutableArray arrayWithCapacity:self.mutLevelsOfDetail.count];

for (int i = 0; i < self.mutLevelsOfDetail.count; i++) {
    ASCGeomObject *geomObject = self.mutLevelsOfDetail[i];
    // Build the lower-detail geometry for this LOD entry.
    SCNGeometry *lodGeometry = [self geometryWithMesh:geomObject.mesh.mutMeshAnimation[i]];
    [mutLevelOfDetail addObject:[SCNLevelOfDetail levelOfDetailWithGeometry:lodGeometry
                                                         worldSpaceDistance:geomObject.worldSpaceDistance]];
}

geometry.levelsOfDetail = mutLevelOfDetail;
node.morpher.targets = [node.morpher.targets arrayByAddingObject:geometry];
```
2 replies · 0 boosts · 191 views · Mar ’25
Game Center save game data to iCloud
We are trying to implement saving and fetching game data to and from iCloud, but it has some problems. macOS: 15.3. Here is what I do: enable the Game Center and iCloud capabilities in Signing & Capabilities, pick iCloud Documents, then create and select a container. Sample code:

```objc
void SaveDataToCloud( const void* buffer, unsigned int datasize, const char* name )
{
    if ( !GKLocalPlayer.localPlayer.authenticated )
        return;

    NSData* data = [NSData dataWithBytes:buffer length:datasize];
    NSString* filename = [NSString stringWithUTF8String:name];
    [[GKLocalPlayer localPlayer] saveGameData:data
                                     withName:filename
                            completionHandler:^(GKSavedGame * _Nullable savedGame, NSError * _Nullable error) {
        if ( error != nil ) {
            NSLog( @"SaveDataToCloud error:%@", [error localizedDescription] );
        }
    }];
}

void FetchCloudSavedGameData()
{
    if ( !GKLocalPlayer.localPlayer.authenticated )
        return;

    [[GKLocalPlayer localPlayer] fetchSavedGamesWithCompletionHandler:^(NSArray<GKSavedGame *> * _Nullable savedGames, NSError * _Nullable error) {
        if ( error == nil ) {
            for ( GKSavedGame *item in savedGames ) {
                [item loadDataWithCompletionHandler:^(NSData * _Nullable data, NSError * _Nullable error) {
                    if ( error == nil ) {
                        // handle data
                    } else {
                        NSLog( @"FetchCloudSavedGameData failed to load iCloud file: %@, error:%@", item.name, [error localizedDescription] );
                    }
                }];
            }
        } else {
            NSLog( @"FetchCloudSavedGameData error:%@", [error localizedDescription] );
        }
    }];
}
```

Neither saveGameData nor fetchSavedGamesWithCompletionHandler reports an error. When debugging, the saveGameData completion handler gets a nil error and a valid savedGame, but when we relaunch the game and call fetchSavedGamesWithCompletionHandler to fetch the data, we get nothing: no error is reported and savedGames has length 0. Following this page https://developer.apple.com/forums/thread/718541?answerId=825596022#825596022 we tried waiting 30 seconds after authentication before calling fetchSavedGamesWithCompletionHandler, with the same result. Checked: Game Center and iCloud are enabled and signed in with the same account, and iCloud has enough space to save. So what is going wrong?
2 replies · 0 boosts · 603 views · Mar ’25
4G TCP connection delay on iPhone 17 on mobile networks in China
Reproduction: same SIM card on 4G, same testing location, connected to the same server, debugging the game application in Xcode, and checking retransmissions and average round-trip time in the network profile.

iPhone 17, 4G off, Wi-Fi on: all of the above indicators are acceptable.
iPhone 17, 4G on, Wi-Fi off: many retransmissions and a very high average round-trip time.
iPhone 14-16, 4G on, Wi-Fi off: all of the above indicators are acceptable.

App: Unity3D project, .NET Framework 4.0, C# Socket.
Other: many developers on Chinese forums have reported the same issue.
2 replies · 0 boosts · 715 views · Oct ’25
Editor OK, but batch builds get: Cannot locate a Release or Debug Apple.GameKit native library for macOS
After running build.py -p Core GameKit and adding the tarballs to the Unity project in Assets/ExternalPackages, no packages seem to be found when running the build on our continuous-integration system. This was not the case when the project was opened in the Editor. It looks like, in Apple.Core, the ApplePluginEnvironment hasn't run the OnEditorUpdate function, so the _appleUnityPackages dictionary is empty. A change to ApplePlugInEnvironment.cs seemed to fix the issue:

```csharp
public static AppleNativeLibrary GetLibrary(string packageDisplayName, string appleBuildConfig, string applePlatform)
{
    // ?FIX?: If we're not in the editor, we might not have updated the package list.
    if (_appleUnityPackages.Count == 0 && _updateState == UpdateState.Initializing)
    {
        OnEditorUpdate(); // UpdateState.Initializing
        OnEditorUpdate(); // UpdateState.Updating
    }
    // ... remainder of the method unchanged
```

I'm not sure if this is something we're doing incorrectly; the documentation for the plug-in mostly covers building the package.
2 replies · 0 boosts · 691 views · Dec ’24
Game Center SignIn alert appears on macOS 26 (Tahoe) without capability or entitlement
Hello. On macOS 26 (Tahoe), when building a macOS app that includes GameKit code, calling GKLocalPlayer.local.authenticateHandler shows the "Sign In to Game Center" alert (e.g. didShowFullscreenSignIn), even if the app does not have the Game Center capability enabled or any related entitlement (com.apple.developer.game-center). The alert only appears when the user is not signed in to Game Center in System Settings. However, when testing the same code path in an iOS app built with macOS 26 (Tahoe), the alert does not appear unless the proper capability and entitlement are included. This behavior differs from macOS 15 (Sequoia) + Xcode 15.x: prior to the update, Game Center features did not work at all in the macOS app without the capability and entitlements.

Steps to reproduce:
1. Create a new macOS app target (App Sandbox enabled, no Game Center capability).
2. Add minimal GameKit code: GKLocalPlayer.local.authenticateHandler = { _, _, _ in }
3. Build the macOS app and run it on macOS 26 (Tahoe).
4. Ensure Game Center is signed out in System Settings.
5. Observe: the "Sign In to Game Center" alert appears automatically.

Expected behavior: when the Game Center capability and entitlement are not present, authenticateHandler should fail silently and no sign-in alert should appear.
Actual behavior: in the macOS app, the Game Center sign-in UI appears even without any Game Center capability or entitlement; in the iOS app, the alert does not appear.
Build configuration: both built under the same conditions (macOS 26 + Xcode 26).
Question: could you please confirm whether this behavior is an intentional change in macOS 26 or a bug affecting only macOS apps in the GameKit authentication flow? Thank you.
2 replies · 0 boosts · 352 views · Oct ’25
RealityKit migration
Hey there, I’m currently planning to use RealityKit in a new multiplatform app I’m building. Unfortunately, I noticed that watchOS is not supported by RealityKit, while SceneKit is being deprecated. However, I’d like to maintain the same codebase across platforms. What are my options?
2 replies · 0 boosts · 412 views · Oct ’25
Validation error when I try to upload a game to TestFlight
Hello everyone, and thank you for your time! I have an issue with uploading several icons to TestFlight. I'm using Game Maker Studio as the engine for the game. This is the error I'm getting, even when I try to use the old icons for the game that worked in the past. I tried transforming the icons using this site, https://makeappicon.com/, but I still get the same validation error. Can you help me fix the issue? Thank you so much!
2 replies · 0 boosts · 557 views · Oct ’25
GCControllerDidConnect notification not received in visionOS 2.0
I am unable to get the visionOS 2.0 simulator to receive the GCControllerDidConnect notification, and thus am unable to set up support for a gamepad. However, it works in visionOS 1.2. For visionOS 2.0 I've tried adding:

the .handlesGameControllerEvents(matching: .gamepad) attribute to the view
Supports Controller User Interaction to Info.plist
Supported game controller types -> Extended Gamepad to Info.plist

...but the notification still doesn't fire. It does fire when the code is run in the visionOS 1.2 simulator; both simulators have the Send Game Controller To Device option enabled. Here is the example code. It's based on the Xcode project template; the only files updated were ImmersiveView.swift and Info.plist, as detailed above:

```swift
import SwiftUI
import GameController
import RealityKit
import RealityKitContent

struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            // Add the initial RealityKit content
            if let immersiveContentEntity = try? await Entity(named: "Immersive", in: realityKitContentBundle) {
                content.add(immersiveContentEntity)
            }
            NotificationCenter.default.addObserver(
                forName: NSNotification.Name.GCControllerDidConnect,
                object: nil,
                queue: nil) { _ in
                print("Handling GCControllerDidConnect notification")
            }
        }
        .modify {
            if #available(visionOS 2.0, *) {
                $0.handlesGameControllerEvents(matching: .gamepad)
            } else {
                $0
            }
        }
    }
}

extension View {
    func modify<T: View>(@ViewBuilder _ modifier: (Self) -> T) -> some View {
        return modifier(self)
    }
}
```
2 replies · 1 boost · 936 views · Dec ’24
Can Game Mode be activated when a child (Java) process's window is fullscreened?
Imagine a native macOS app that acts as a "launcher" for a Java game.** For example, the "launcher" app might use the Swift Process API or a similar method to run the java command-line tool (let's assume the user has installed Java themselves) to run the game. I have seen How to Enable Game Mode. If the native launcher app's Info.plist has the following keys set:

LSApplicationCategoryType set to public.app-category.games
LSSupportsGameMode set to true (for macOS 26+)
GCSupportsGameMode set to true

the launcher itself can cause Game Mode to activate when the launcher is fullscreened. However, if the launcher opens a Java process that opens a window, and the Java window is then fullscreened, Game Mode doesn't seem to activate. In this case activating Game Mode for the launcher itself is unnecessary, but you'd expect Game Mode to activate when the actual game in the Java window is fullscreened. Is there a way to get Game Mode to activate in the latter case?

** The concrete case I'm thinking of is a third-party Minecraft Java Edition launcher, but the issue can also be demonstrated in a sample project (FB13786152). It seems like the official Minecraft launcher is able to do this, though it's not clear how. (Is its bundle identifier hardcoded in the OS to allow for this? Changing a sample app's bundle identifier to be the same as the official Minecraft launcher's gets the behavior I want, but obviously this is not a practical solution.)
2 replies · 0 boosts · 180 views · Jun ’25
metal-cpp syntax for MTL::Buffer float2 parameter
I'm trying to pass a buffer of float2 items from the CPU to the GPU. In the kernel I can declare a parameter for the buffer, for example device const float2* values. How do I specify float2 as the type for the MTL::Buffer? I managed to get the code to work by "cheating": defining a simple class that has the same data members as a float2, but there is probably a better way.

```cpp
class Coord_f {
public:
    float x{0.0f};
    float y{0.0f};
};
```

Then I allocate the buffer like this:

```cpp
NS::TransferPtr(device->newBuffer(n_elements * sizeof(Coord_f), MTL::ResourceStorageModeManaged))
```

The headers for metal-cpp do not appear to define vector types like float2, but I'm doubtless missing something. Thanks.
2 replies · 0 boosts · 648 views · Jan ’25
SceneKit
Hello there. Currently I’m attempting to create an interactive learning application with a 3D view. I’ve discovered the SceneKit framework, but I lack the necessary knowledge to load, animate, and move objects. Could someone kindly suggest some good articles or tutorials on this topic? (A tiny starter sketch follows this entry.)
2 replies · 0 boosts · 579 views · Feb ’25
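For orientation, here is a minimal sketch (the scene file and node names are placeholders, not from the post): loading a scene, adding a node, and moving it with an action covers the three things asked about.

```swift
import SceneKit

// Load a scene file if one exists, otherwise start from an empty scene.
let scene = SCNScene(named: "scene.scn") ?? SCNScene()

// Add a simple box node to the scene graph.
let box = SCNNode(geometry: SCNBox(width: 1, height: 1, length: 1, chamferRadius: 0))
scene.rootNode.addChildNode(box)

// Move the box 2 units along x over one second, then spin it forever.
let move = SCNAction.moveBy(x: 2, y: 0, z: 0, duration: 1.0)
let spin = SCNAction.repeatForever(SCNAction.rotateBy(x: 0, y: .pi, z: 0, duration: 2.0))
box.runAction(SCNAction.sequence([move, spin]))

// An SCNView (or SceneView in SwiftUI) would then display `scene`.
```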
Float64 (Double Precision) Support on MPS with PyTorch on Apple Silicon?
Hi everyone. This project uses PyTorch on an Apple Silicon Mac (M1/M2/etc.), and the goal is to use the MPS backend for GPU acceleration. However, the workflow depends on Float64 (double-precision) floating-point numbers for certain computations, and I hit the error "Cannot convert a MPS Tensor to float64 dtype as the MPS framework doesn't support float64. Please use float32 instead". It seems that the MPS backend doesn't currently support Float64 for direct GPU computation.

Questions for the community: Are there any known workarounds or best practices for handling Float64-dependent operations when using the MPS backend with PyTorch? For those working with high-precision tasks on Apple Silicon, what strategies are being used to balance performance with the need for Float64? Offloading to the CPU is an option, and I'd like to know whether there are specific techniques or libraries within the Apple ecosystem that could streamline this while keeping performance reasonable. Any insights, tips, or experiences would be appreciated. Thanks in advance, Jonaid. MacBook Pro M3 Max.
2 replies · 0 boosts · 350 views · Oct ’25
Error "this SDK is not supported by the compiler" when moving from Xcode 16.2 (16C5032a) to 16.3 (16E140)
In Xcode version 16.2 (16C5032a) I created:

One [ScenekitApp] [File/New/Project/iOS/Game/Game Technology: SceneKit].
Three custom frameworks [File/New/Project/iOS/Framework]: [ASCSource.framework, ASCCommon.framework and ASCCoreData.framework].

In all four projects, [Minimum Deployments=iOS 15.0], [Swift version=6.0] and [BUILD_LIBRARY_FOR_DISTRIBUTION=Yes]. I installed [Zip.framework] directly in [ASCSource.framework] without [pod init/pod install], and in the [General] tab under [Frameworks, Libraries and Embedded Content] set [Zip.framework Embed & Sign]. I installed the three frameworks [ASCSource.framework, ASCCommon.framework and ASCCoreData.framework] in [ScenekitApp], and everything works perfectly in Xcode version 16.2 (16C5032a).

I then updated Xcode 16.2 (16C5032a) to Xcode 16.3 (16E140), and some SDK errors were reported:

[Failed to build module 'ASCSource'; this SDK is not supported by the compiler (the SDK is built with 'Apple Swift version 6.0.3 effective-5.10 (swiftlang-6.0.3.1.10 clang-1600.0.30.1)', while this compiler is 'Apple Swift version 6.1 effective-5.10 (swiftlang-6.1.0.110.21 clang-1700.0.13.3)'). Please select a toolchain which matches the SDK.]
[Failed to build module 'Zip'; this SDK is not supported by the compiler (the SDK is built with 'Apple Swift version 6.0.3 effective-5.10 (swiftlang-6.0.3.1.10 clang-1600.0.30.1)', while this compiler is 'Apple Swift version 6.1 effective-5.10 (swiftlang-6.1.0.110.21 clang-1700.0.13.3)'). Please select a toolchain which matches the SDK.]

If I recompile all the frameworks in Xcode version 16.3 (16E140) it works, but I don't think that is a good solution. I found some discussions at links like the following, without success:
https://github.com/firebase/firebase-ios-sdk/issues/13727
https://stackoverflow.com/questions/70556401/swift-version-conflict-this-sdk-is-not-supported-by-the-compiler-please-select

Any help is welcome. Thanks everyone!!!
2 replies · 0 boosts · 298 views · Apr ’25