Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

All subtopics

Posts under the Graphics & Games topic. Each post below is followed by its stats: replies, boosts, views, and last activity.
SpriteKit framerate drop on iOS 26.0
Hello, I have noticed a performance drop in SpriteKit-based projects running on iOS 26.0 (23A341). Below is a SpriteKit scene used to test framerate on different devices:

    import SpriteKit
    import SwiftUI

    class BareboneScene: SKScene {
        override func didMove(to view: SKView) {
            size = view.bounds.size
            anchorPoint = CGPoint(x: 0.5, y: 0.5)
            backgroundColor = .darkGray

            let roundedSquare = SKShapeNode(rectOf: CGSize(width: 150, height: 75), cornerRadius: 12)
            roundedSquare.fillColor = .systemRed
            roundedSquare.strokeColor = .black
            roundedSquare.lineWidth = 3
            addChild(roundedSquare)

            let action = SKAction.rotate(byAngle: .pi, duration: 1)
            roundedSquare.run(.repeatForever(action))
        }
    }

    struct BareboneSceneView: View {
        var body: some View {
            SpriteView(
                scene: BareboneScene(),
                debugOptions: [.showsFPS]
            )
            .ignoresSafeArea()
        }
    }

    #Preview {
        BareboneSceneView()
    }

The scene is very simple, yet the framerate drops to ~40 fps, as shown by the Metal HUD. Tested on:

- iPhone 13, iOS 26.0: framerate drops to 40 fps. Sometimes it runs at near 60 fps, but if the screen is touched repeatedly, the framerate drops to 40-50 fps again.
- iPhone 11 Pro, iOS 26.0: ~40 fps.
- iPad 9th Gen, iOS 18.6.2: 60 fps, no issues.

See screenshots attached. These numbers were observed by me and members of our beloved SpriteKit Discord server. Thank you for your attention.
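One variable worth ruling out (a minimal diagnostic sketch, not a confirmed fix): SpriteView picks a default frame-rate target, and the system can choose a lower pace on its own. Pinning the target explicitly via SpriteView's preferredFramesPerSecond parameter helps separate "the system chose a 40 Hz pace" from "the render loop can't keep up":

    import SpriteKit
    import SwiftUI

    // Same repro as above, but with an explicit frame-rate target.
    // If the Metal HUD still reports ~40 fps here, the drop is likely a
    // real rendering regression rather than a frame-pacing choice.
    struct PinnedSceneView: View {
        var body: some View {
            SpriteView(
                scene: BareboneScene(),        // scene from the post above
                preferredFramesPerSecond: 60,
                debugOptions: [.showsFPS]
            )
            .ignoresSafeArea()
        }
    }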
Replies: 6 · Boosts: 4 · Views: 922 · Activity: 3d
Unable to compile Core Image filter on Xcode 26 due to missing Metal toolchain
I have a Core Image filter in my app that uses Metal. I cannot compile it because it complains that the executable tool metal is not available, but I have installed it in Xcode. If I go to the "Components" section of Xcode Settings, it shows as downloaded. And if I run the suggested command, it also shows as installed. Any advice?

Xcode Version: Version 26.0 beta (17A5241e)

Build output (showing all errors only):

    Build target Lessons of project StudyJapanese with configuration Light
    RuleScriptExecution /Users/chris/Library/Developer/Xcode/DerivedData/StudyJapanese-glbneyedpsgxhscqueifpekwaofk/Build/Intermediates.noindex/StudyJapanese.build/Light-iphonesimulator/Lessons.build/DerivedSources/OtsuThresholdKernel.ci.air /Users/chris/Code/SerpentiSei/Shared/iOS/CoreImage/OtsuThresholdKernel.ci.metal normal undefined_arch (in target 'Lessons' from project 'StudyJapanese')
        cd /Users/chris/Code/SerpentiSei/StudyJapanese
        /bin/sh -c xcrun\ metal\ -w\ -c\ -fcikernel\ \"\$\{INPUT_FILE_PATH\}\"\ -o\ \"\$\{SCRIPT_OUTPUT_FILE_0\}\"' '

    error: error: cannot execute tool 'metal' due to missing Metal Toolchain; use: xcodebuild -downloadComponent MetalToolchain
    /Users/chris/Code/SerpentiSei/StudyJapanese/error:1:1: cannot execute tool 'metal' due to missing Metal Toolchain; use: xcodebuild -downloadComponent MetalToolchain

    Build failed 6/9/25, 8:31 PM, 27.1 seconds

Result of xcodebuild -downloadComponent MetalToolchain (after switching to Xcode-beta.app with xcode-select):

    xcodebuild -downloadComponent MetalToolchain
    Beginning asset download...
    Downloaded asset to: /System/Library/AssetsV2/com_apple_MobileAsset_MetalToolchain/4d77809b60771042e514cfcf39662c6d1c195f7d.asset/AssetData/Restore/022-19457-035.dmg
    Done downloading: Metal Toolchain (17A5241c).

Screenshots from Xcode, result of "Copy Information":

    Metal Toolchain 26.0 [com.apple.MobileAsset.MetalToolchain: 17.0 (17A5241c)] (Installed)
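A couple of hedged diagnostics that may narrow this down (suggestions only, since the component reports as installed): confirm which developer directory is active and which metal binary, if any, xcrun actually resolves for the build:

    # Confirm the beta is the selected developer directory.
    xcode-select -p

    # Show which 'metal' tool the build system would invoke, if any.
    xcrun --find metal

    # Re-run the component download against the selected Xcode.
    xcodebuild -downloadComponent MetalToolchain

If xcrun resolves no metal tool even though Settings shows the toolchain as installed, that mismatch itself is worth including in a bug report.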
Replies: 23 · Boosts: 0 · Views: 1.7k · Activity: 1w
Error: apple/apple/game-porting-toolkit 1.1 did not build
My MacBook Air M1 has macOS Sonoma 14.3.1 installed, and I tried to install the game-porting-toolkit tonight. After the step that requires the command "brew -v install apple/apple/game-porting-toolkit", Terminal ran for minutes, but at the end this error appeared:

    Error: apple/apple/game-porting-toolkit 1.1 did not build.

I don't know anything about coding and software. Could someone please tell me what causes this error and how to fix it? I would appreciate your help!
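For context, a common cause of this failure on Apple Silicon (a sketch of the usual setup, assuming the standard Game Porting Toolkit instructions): the formula has to be built by an x86_64 (Rosetta) copy of Homebrew installed under /usr/local, not the native arm64 one. The usual sequence looks like:

    # Install Rosetta 2 so x86_64 tools can run on an M1.
    softwareupdate --install-rosetta --agree-to-license

    # Install a separate x86_64 Homebrew (it lands in /usr/local).
    arch -x86_64 /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"

    # Build the toolkit with that copy of brew.
    arch -x86_64 /usr/local/bin/brew install apple/apple/game-porting-toolkit

If the build still fails, the last lines of the log file that Homebrew prints usually state the real cause.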
Replies: 32 · Boosts: 8 · Views: 18k · Activity: Dec ’24
GKLocalPlayer.authenticateHandler not called on iOS 26 when Game Center auth overlay is shown
Hi, we're testing our app on iOS 26 and ran into strange behavior with GKLocalPlayer.local.authenticateHandler.

    GKLocalPlayer.local.authenticateHandler = { [weak self] viewController, error in
        // additional code
    }

What happens: When we assign authenticateHandler on iOS 26 and the user is not signed in to Game Center, the system shows a full-screen Game Center overlay asking the user to sign in. If the user taps Cancel, nothing further happens: the closure is not invoked again, so we don't receive an error or any callback, and the app never learns whether the auth was cancelled or failed. In previous iOS versions the closure was called (with viewController / error as appropriate) and the flow worked as expected.

What we tried:
- Verified authenticateHandler is being set.
- Checked GKLocalPlayer.local.isAuthenticated after the overlay dismisses; it's unchanged.
- Observed system logs: a com.apple.GameOverlayUI scene is created and later removed (so the auth overlay is shown by the system).
- Confirmed the same code works on earlier iOS versions.

Questions: Has anyone seen authenticateHandler not being invoked on iOS 26 when the Game Center auth overlay is presented? Could this be a behavioral change in iOS 26 (the overlay runs in a separate system process), or a bug? Any suggested workarounds to reliably detect that the user cancelled the sign-in (for example: listening for willResignActive / didBecomeActive, watching for a system overlay, or saving/presenting the viewController manually)?

Thanks in advance for any advice; we'd appreciate pointers or suggested diagnostics.
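A minimal sketch of the lifecycle-based workaround mentioned above (an assumption, not a confirmed fix): when the system auth overlay dismisses, the app becomes active again, so re-checking isAuthenticated at that point can approximate a "cancelled" signal:

    import GameKit
    import UIKit

    final class GameCenterAuth {
        private var observer: NSObjectProtocol?

        func start() {
            GKLocalPlayer.local.authenticateHandler = { viewController, error in
                // Normal path on earlier iOS versions.
            }

            // Fallback for iOS 26: if the app returns to the foreground and the
            // player is still unauthenticated, treat it as a cancelled sign-in.
            observer = NotificationCenter.default.addObserver(
                forName: UIApplication.didBecomeActiveNotification,
                object: nil,
                queue: .main
            ) { _ in
                if !GKLocalPlayer.local.isAuthenticated {
                    // Hypothetical policy: surface "sign-in not completed" UI here.
                }
            }
        }
    }

This only approximates cancellation (the notification also fires for unrelated foregrounding), so it's a stopgap rather than a replacement for the handler being invoked.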
Replies: 7 · Boosts: 2 · Views: 787 · Activity: 1w
RealityKit visionOS anchor to POV
Hi, is there a way in visionOS to anchor an entity to the POV via RealityKit? I need an entity that is always fixed to the 'camera'. I'm aware that this is discouraged from a design perspective as it can be visually distracting; in my case, though, I want to use it to attach a fixed collider entity, so that the camera can collide with objects in the scene.

Edit: ARView on iOS has a lot of very useful helper properties and functions like cameraTransform (https://developer.apple.com/documentation/realitykit/arview/cameratransform). How would I get this information on visionOS? RealityView's content does not seem to offer anything comparable. An example use case: I would like to add an entity to the scene at my user's eye level, essentially depending on their height. I found https://developer.apple.com/documentation/realitykit/realityrenderer, which has an activeCamera property, but so far it's unclear to me in which context RealityRenderer is used and how I could access it.

Appreciate any hints, thanks!
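For the "fixed to the camera" part, one approach worth trying (a sketch, assuming a head anchor fits the collider use case): visionOS supports AnchorEntity(.head), which keeps child entities fixed relative to the wearer's head:

    import RealityKit
    import SwiftUI

    struct HeadAnchoredView: View {
        var body: some View {
            RealityView { content in
                // Anchor that follows the wearer's head pose.
                let headAnchor = AnchorEntity(.head)

                // Hypothetical collider that travels with the POV.
                let collider = Entity()
                collider.components.set(
                    CollisionComponent(shapes: [.generateSphere(radius: 0.2)])
                )
                headAnchor.addChild(collider)

                content.add(headAnchor)
            }
        }
    }

For actually reading the pose (the eye-level use case), anchor transforms aren't exposed to the app by default on visionOS; as far as I know that requires ARKit's WorldTrackingProvider and queryDeviceAnchor(atTimestamp:), rather than anything on RealityView's content.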
Replies: 9 · Boosts: 6 · Views: 5.1k · Activity: Dec ’24
Performance regression: 3D rotation in volume rendering is not as smooth on iOS 18.2 as on iOS 18.1
We are a team of engineers working on an app intended to visualize medical images. The situations where the app is used involve time-critical decision making for acute clinical conditions, so stability and performance are of utmost importance and can directly help timely treatment action. The app uses multiple libraries and tools such as vtk, webgl, opengl, webkit, and gl-matrix.

The problem: when a 3D volume is rendered in the app and we try to rotate it, the rotation is slow, unresponsive, and laggy. Specifically, we have noticed that on iOS 18.1 the volume rotation is much smoother than on the latest iOS 18.2. Earlier we faced a somewhat similar issue with iOS 17, but it improved in iOS 18.1. This performance regression is affecting the user experience in our healthcare application. We have taken reference from the cornerstone.js code, and you can reproduce the issue using the following example: https://www.cornerstonejs.org/live-examples/volumeviewport3d

Steps to reproduce:
1. Load the above test example on an iPhone running iOS 18.2 using Safari.
2. Perform volume rendering using the provided dataset.
3. Measure the time taken by the volume for each rotate or drag action.
4. Repeat the same steps on an iPhone running iOS 18.1 for comparison.

Additional information:
Device models tested: iPhone 12, iPhone 13, iPhone 14
iOS versions with issue: 18.2, 18.3 (Beta)

I would appreciate any insights or suggestions on how to address this performance regression. If additional information is needed, please let me know. Thank you.
Replies: 4 · Boosts: 6 · Views: 939 · Activity: Feb ’25
macOS Tahoe Beta 4 disabled __asm keyword for Metal
Hi developers, I maintain a shipped app that uses string concatenation to construct Metal shaders and compile them on-device. Beta 4 seems to have disabled the __asm keyword, resulting in a compilation failure. The error is:

    v2/GEMMKernel.cpp:229: error: program_source:23:9: error: illegal string literal in 'asm'
    __asm("air.simdgroup_async_copy_1d.p3i8.p1i8");

The relevant code is available at https://github.com/liuliu/ccv/blob/unstable/lib/nnc/mfa/v2/GEMMHeaders.cpp#L30, although any __asm will trip this. Please give us guidance on whether this is a regression or something that will be enforced in the 26 release. Personally, I would consider this a bug, given it won't impact anything in "compiled" shaders. Thanks for your patience reading this!
Topic: Graphics & Games · SubTopic: Metal
Replies: 4 · Boosts: 6 · Views: 826 · Activity: Jul ’25
Scene with over 12,000 duplicate entities
I am creating a RealityKit scene that will contain over 12,000 duplicate cubes arranged in a circle (see image below). This is for some high-energy physics simulations I am doing. I accomplish this scene by creating a single cube and cloning it a bunch of times, so there is a single MeshResource and Material even though there are a lot of entities. I have confirmed this by checking with Swift's === operator. Even with this, the program is unworkably slow. Any suggestions or tricks that could help with this type of scene? Using a single geometry was the trick to getting SceneKit to work fast with geometries like this. I've been updating my software to RealityKit because I far prefer the structure of RealityKit over SceneKit.
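If the cubes never move relative to each other, one trick that may help (a sketch, not an official recommendation; cubeTransforms below is a hypothetical array holding the 12,000 placements): bake all instances into a single MeshResource, so the scene holds one entity and one draw call instead of 12,000 entities sharing one mesh:

    import RealityKit
    import simd

    // Merge many static cube instances into one mesh.
    func makeBatchedCubes(transforms: [simd_float4x4],
                          cubeSize: Float = 0.01) throws -> MeshResource {
        let h = cubeSize / 2
        // Corner positions of one cube, centered at the origin.
        let corners: [SIMD3<Float>] = [
            [-h, -h, -h], [ h, -h, -h], [ h,  h, -h], [-h,  h, -h],
            [-h, -h,  h], [ h, -h,  h], [ h,  h,  h], [-h,  h,  h],
        ]
        // Two counterclockwise (outward-facing) triangles per face.
        let faceIndices: [UInt32] = [
            0, 3, 2,  0, 2, 1,   // -z
            4, 5, 6,  4, 6, 7,   // +z
            0, 4, 7,  0, 7, 3,   // -x
            1, 2, 6,  1, 6, 5,   // +x
            0, 1, 5,  0, 5, 4,   // -y
            3, 6, 2,  3, 7, 6,   // +y
        ]

        var positions: [SIMD3<Float>] = []
        var indices: [UInt32] = []
        for (i, transform) in transforms.enumerated() {
            let base = UInt32(i * corners.count)
            for corner in corners {
                let p = transform * SIMD4<Float>(corner, 1)
                positions.append(SIMD3<Float>(p.x, p.y, p.z))
            }
            indices.append(contentsOf: faceIndices.map { base + $0 })
        }

        var descriptor = MeshDescriptor(name: "batchedCubes")
        descriptor.positions = MeshBuffer(positions)
        descriptor.primitives = .triangles(indices)
        // Normals omitted for brevity; add descriptor.normals for correct shading.
        return try MeshResource.generate(from: [descriptor])
    }

Usage would then be a single ModelEntity(mesh: try makeBatchedCubes(transforms: cubeTransforms), materials: [material]). The trade-off is that individual cubes can no longer be moved or hit-tested independently.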
Replies: 4 · Boosts: 0 · Views: 1.3k · Activity: Jan ’25
Invert a 3x3 matrix in a Metal compute kernel
I searched the Metal Shading Language Specification, Version 3.0, but I cannot see any function for inverting a matrix. Is there really no function in Metal for inverting a matrix? I often need to do this for linear equations and have so far resorted to writing the necessary function each time, mostly just copy-and-pasting code. inverse exists in SIMD and GLSL, so why not in Metal? It seems so unexpected that this function does not exist that I am almost certain I have just overlooked something obvious. I even tried 1 / M, to no avail.
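As far as I can tell there is indeed no builtin: MSL has no inverse for matrix types. For a 3x3, the adjugate route is compact enough to keep as a utility (a sketch; Metal matrices are column-major, and the caller should guard against a zero determinant):

    #include <metal_stdlib>
    using namespace metal;

    // Inverse of a 3x3 matrix via the adjugate: the rows of the adjugate
    // are cross products of the input's columns (m[i] is column i).
    inline float3x3 inverse3x3(float3x3 m) {
        float3 r0 = cross(m[1], m[2]);
        float3 r1 = cross(m[2], m[0]);
        float3 r2 = cross(m[0], m[1]);
        float det = dot(m[0], r0);      // assumes det != 0
        return transpose(float3x3(r0, r1, r2)) * (1.0f / det);
    }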
Replies: 2 · Boosts: 1 · Views: 1.9k · Activity: Feb ’25
EXC_BREAKPOINT in QuartzCore: crash in CA::Render::Image::new_image
We are seeing crashes in the Xcode organizer. So far we have not been able to reproduce them locally. They affect multiple app releases (some older ones built with Xcode 15.x, and newer ones built with Xcode 16.0), and they only affect iOS 18.5. Is there anything that changed in the latest iOS? It's hard to tell what exactly is causing this crash, because a symbolic breakpoint on CA::Render::Image::new_image(unsigned int, unsigned int, unsigned int, unsigned int, CGColorSpace*, void const*, unsigned long const*, void (*)(void const*, void*), void*) triggers all the time, but not necessarily with stack frames matching the crash report. Is this a known issue? Attachment: crash.crash. Thank you.
Replies: 0 · Boosts: 5 · Views: 242 · Activity: Jul ’25
Game Porting Toolkit installation problem
Hello, I'm trying to install it following these steps (https://www.applegamingwiki.com/wiki/Game_Porting_Toolkit), but I get an error with 'brew install apple/apple/game-porting-toolkit':

    ==> tar -xf crossover-sources-22.1.1.tar.gz --include=sources/clang/* --strip-components=2
    ==> cmake -G Ninja -DCMAKE_VERBOSE_MAKEFILE=Off -DCMAKE_MAKE_PROGRAM=ninja -DCMAKE_VERBOSE_MAKEFILE=On -DCMAKE_OSX_ARCHITECTUR
    Last 15 lines from /Users/rafael/Library/Logs/Homebrew/game-porting-toolkit-compiler/02.cmake:
    -DLLVM_INSTALL_TOOLCHAIN_ONLY=On -DLLVM_ENABLE_PROJECTS=clang /private/tmp/game-porting-toolkit-compiler-20250519-44600-qwrjgl/llvm
    CMake Error at CMakeLists.txt:3 (cmake_minimum_required):
      Compatibility with CMake < 3.5 has been removed from CMake.
      Update the VERSION argument <min> value. Or, use the <min>...<max> syntax
      to tell CMake that the project requires at least <min> but has been updated
      to work with policies introduced by <max> or earlier.
      Or, add -DCMAKE_POLICY_VERSION_MINIMUM=3.5 to try configuring anyway.
    -- Configuring incomplete, errors occurred!
    If reporting this issue please do so to (not Homebrew/* repositories): apple/apple

macOS 15.3.1

Thank you in advance. Regards
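The failing check is CMake's, not Homebrew's: CMake 4.x removed compatibility with projects that declare cmake_minimum_required below 3.5, which the bundled LLVM sources do. Two hedged options (assumptions based on the error text, not verified against this formula):

    # Option 1: CMake 4.x also honors the policy override as an environment
    # variable, though Homebrew filters much of the environment during builds.
    export CMAKE_POLICY_VERSION_MINIMUM=3.5
    brew install apple/apple/game-porting-toolkit

    # Option 2: build with a pre-4.0 CMake on PATH, which never runs this check.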
Replies: 2 · Boosts: 5 · Views: 291 · Activity: May ’25
PortalComponent – allow world content to peek out
Hello, I've been tinkering with PortalComponent on visionOS a bit, but noticed that the content of the WorldComponent is always clipped to the mesh geometry of whatever entities have the PortalComponent applied. Now I'm wondering if there is any way or trick to allow contents of the portal to peek out, similar to the Encounter Dinosaurs experience on Vision Pro (I assume it also uses PortalComponent?). I saw that PortalComponent has a clippingPlane property (https://developer.apple.com/documentation/realitykit/portalcomponent/clippingplane-swift.property), but so far I haven't been able to achieve a perceptible visual difference with it. If possible, I would like to avoid hacky tricks using duplicate meshes or similar to achieve this. Thanks for any hints!
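In case it helps, newer RealityKit releases add a portal-crossing mode alongside clipping (a sketch, assuming the visionOS 2 API; names worth double-checking against the docs): enable crossing on the portal, and tag the entities that are allowed through:

    import RealityKit

    // portalEntity carries the portal geometry; worldEntity hosts the WorldComponent.
    func enablePeekOut(portalEntity: Entity, worldEntity: Entity, dinosaur: Entity) {
        portalEntity.components.set(PortalComponent(
            target: worldEntity,
            clippingMode: .plane(.positiveZ),   // still clip at the portal plane...
            crossingMode: .plane(.positiveZ)    // ...but let tagged entities cross it
        ))

        // Entities that should peek out of the portal opt in explicitly.
        dinosaur.components.set(PortalCrossingComponent())
    }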
Replies: 5 · Boosts: 0 · Views: 1.5k · Activity: Dec ’24
Export Armatures from Blender to USDC for use in RealityKit
I'm an experienced SceneKit developer and I want to begin work on a new project using RealityKit, so I appreciated as timely the WWDC 2025 session "Bring your SceneKit project to RealityKit". However, now I am finding that:

- Blender does not properly support exporting armatures in usdc files, and usdc is really the only file format that should be used for creating 3D assets for RealityKit.
- The option of exporting from Blender to fbx or some other intermediate format, and then converting that to usdc, is a challenge. Apple's Reality Converter App, which supposedly can import and convert fbx files to usdc, is no longer available from Apple's website. An older copy I found at the Kodeco website requires Rosetta on Apple Silicon, and it does not in fact import fbx or anything else; I find it doesn't work at all.
- Apple's Reality Composer Pro, at least as far as I can tell, only supports importing usdc; it is not a file conversion tool.
- Alternatively, I am under the impression that Maya supports producing usdc files with armatures, but Maya costs over $2000 per year and I am skilled with Blender, so I believe strongly that I should be able to continue with Blender. Maya's expense and skill set simply shouldn't be a requirement for building RealityKit applications.

What are my options then, if any, to produce assets with armatures and armature-based animations using Blender, and then bring them into RealityKit?
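One pipeline that may still work (an assumption on my part; it depends on installing Apple's USD Python tools together with Autodesk's free FBX SDK and its Python bindings): the command-line usdzconvert can take fbx as input, and as far as I understand it carries skeletal data through to UsdSkel:

    # Hypothetical file names; requires Apple's USD tools + FBX SDK installed.
    usdzconvert CharacterWithArmature.fbx CharacterWithArmature.usdz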
Replies: 0 · Boosts: 4 · Views: 106 · Activity: Jun ’25
GameplayKit usage with Swift 6: Call to main actor-isolated instance method 'run' in a synchronous nonisolated context
Hi there, with a couple of other developers we have been busy migrating our SpriteKit games and frameworks to Swift 6. There is one issue we are unable to resolve, and it involves the interaction between SpriteKit and GameplayKit. A very small demo repo that clearly demonstrates the issue can be found here: https://github.com/AchrafKassioui/GameplayKitExplorer/blob/main/GameplayKitExplorer/Basic.swift

The relevant code is also pasted here:

    import SwiftUI
    import SpriteKit

    struct BasicView: View {
        var body: some View {
            SpriteView(scene: BasicScene())
                .ignoresSafeArea()
        }
    }

    #Preview {
        BasicView()
    }

    class BasicScene: SKScene {
        override func didMove(to view: SKView) {
            size = view.bounds.size
            anchorPoint = CGPoint(x: 0.5, y: 0.5)
            backgroundColor = .gray
            view.isMultipleTouchEnabled = true

            let entity = BasicEntity(color: .systemYellow, size: CGSize(width: 100, height: 100))
            if let renderComponent = entity.component(ofType: BasicRenderComponent.self) {
                addChild(renderComponent.sprite)
            }
        }
    }

    @MainActor
    class BasicEntity: GKEntity {
        init(color: SKColor, size: CGSize) {
            super.init()

            let renderComponent = BasicRenderComponent(color: color, size: size)
            addComponent(renderComponent)

            let animationComponent = BasicAnimationComponent()
            addComponent(animationComponent)
        }

        required init?(coder: NSCoder) {
            fatalError("init(coder:) has not been implemented")
        }
    }

    @MainActor
    class BasicRenderComponent: GKComponent {
        let sprite: SKSpriteNode

        init(color: SKColor, size: CGSize) {
            self.sprite = SKSpriteNode(texture: nil, color: color, size: size)
            super.init()
        }

        required init?(coder: NSCoder) {
            fatalError("init(coder:) has not been implemented")
        }
    }

    class BasicAnimationComponent: GKComponent {
        let action1 = SKAction.scale(to: 1.3, duration: 0.07)
        let action2 = SKAction.scale(to: 1, duration: 0.15)

        override init() {
            super.init()
        }

        override func didAddToEntity() {
            if let renderComponent = entity?.component(ofType: BasicRenderComponent.self) {
                renderComponent.sprite.run(SKAction.repeatForever(SKAction.sequence([action1, action2])))
            }
        }

        required init?(coder: NSCoder) {
            fatalError("init(coder:) has not been implemented")
        }
    }

As SKNode is designed to run on the MainActor, the BasicRenderComponent is attributed with @MainActor as well. This is needed, as this GKComponent is dedicated to encapsulating the node that is rendered to the scene. There is also a BasicAnimationComponent; this GKComponent is responsible for animating the rendered node. Obviously this is just an example, but when using GameplayKit in combination with SpriteKit it is very common that a GKComponent instance manipulates an SKNode referenced from another GKComponent instance, often via open func update(deltaTime seconds: TimeInterval) or, as in this example, inside didAddToEntity.

Now, the problem is that in the above example (and the same goes for update(deltaTime seconds: TimeInterval)) the method didAddToEntity is not isolated to the MainActor, as GKComponent is not either. This leads to the error "Call to main actor-isolated instance method 'run' in a synchronous nonisolated context", as indeed the compiler cannot infer that didAddToEntity is isolated to the MainActor. Marking BasicAnimationComponent as @MainActor does not help, as this isolation is not propagated to the methods inherited from the superclass. In fact, we tried a plethora of other options, but none resolved this issue. How should we proceed? As of now, this is really holding us back from migrating to Swift 6. Hope someone is able to help out here!
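One escape hatch worth evaluating (a sketch, not an official recommendation): SpriteKit and GameplayKit call these overrides on the main thread in practice, and MainActor.assumeIsolated lets the override assert that fact at runtime, giving the body main-actor isolation and trapping if the assumption is ever violated:

    import GameplayKit
    import SpriteKit

    class AssumingAnimationComponent: GKComponent {
        override func didAddToEntity() {
            // didAddToEntity is nonisolated on GKComponent; assert main-actor
            // isolation at runtime instead of inheriting it statically.
            MainActor.assumeIsolated {
                guard let renderComponent = entity?.component(ofType: BasicRenderComponent.self) else { return }
                renderComponent.sprite.run(.repeatForever(.sequence([
                    .scale(to: 1.3, duration: 0.07),
                    .scale(to: 1, duration: 0.15),
                ])))
            }
        }
    }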
Replies: 2 · Boosts: 4 · Views: 1.1k · Activity: Oct ’24
Synchronizing Physics in TableTopKit
I am working on adding synchronized physical properties to EntityEquipment in TableTopKit, allowing seamless coordination during GroupActivities sessions between players. Mapping EntityEquipment's state to a DieState is not an option, because it doesn't support custom collision shapes. I have also tried adding a PhysicsBodyComponent and CollisionComponent to EntityEquipment's Entity. However, the main issue is that the position of the EntityEquipment itself does not synchronize with the Entity's physics body, resulting in two separate instances of one object.

    struct PlayerPawn: EntityEquipment {
        let id: ID
        let entity: Entity
        var initialState: BaseEquipmentState

        init(id: ID, entity: Entity) {
            self.id = id

            let massProperties = PhysicsMassProperties(mass: 1.0)
            let material = PhysicsMaterialResource.generate(friction: 0.5, restitution: 0.5)
            let shape = ShapeResource.generateBox(size: [0.4, 0.2, 0.2])

            let physicsBody = PhysicsBodyComponent(massProperties: massProperties, material: material, mode: .dynamic)
            let collisionComponent = CollisionComponent(shapes: [shape])
            entity.components.set(physicsBody)
            entity.components.set(collisionComponent)

            self.entity = entity
            initialState = .init(parentID: .tableID, pose: .init(position: .init(), rotation: .zero), entity: self.entity)
        }
    }

I'd appreciate any guidance on the recommended approach to adding synchronized physical properties to EntityEquipment.
Replies: 2 · Boosts: 4 · Views: 1k · Activity: Dec ’24
Sample code for WWDC25 Session
Hi! I watched the WWDC25 session "Bring your SceneKit project to RealityKit" which seemed like a great resource for those of us transitioning from the now-deprecated SceneKit framework. The session mentioned that the full sample code for the project would be available to download, but I haven't been able to find it in the Code section of the video page or in the Sample Code Library. Has the sample code been released yet? Having the project code would make it much easier to follow along with the RealityKit changes shown in the video. Thanks again for the great session.
Replies: 4 · Boosts: 4 · Views: 247 · Activity: Jun ’25
BlendShapes don’t animate while playing animation in RealityKit
Hi everyone, I'm running into an issue with RealityKit when trying to animate BlendShapes (ShapeKeys) while a skeletal animation is playing. The model is a rigged character in .usdz format with both predefined skeletal animations and BlendShapes (exported from Blender). The problem: when I play any animation using entity.playAnimation(...), the BlendShapes stop responding. Calling setBlendShapes(...) still logs that weights are being updated, but the visual changes are not visible. The exact same blend shape animation works perfectly when no animation is playing. In SceneKit the same model works as expected: shape keys get animated during animation playback. In RealityKit, as soon as an animation starts, the shape keys don't animate anymore.

Here's a test project on GitHub that demonstrates the issue clearly: https://github.com/IAMTHEBURT/RealityKitWitnBlendShapesSample

The goal is to play facial expressions (like blinking or talking) while a body animation (like waving) is playing. Is this a known limitation in RealityKit? Or is there a recommended way to combine skeletal animations with real-time BlendShape updates? Thanks in advance for any insights.
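One direction that may be worth testing on recent OS versions (hedged: API names per my understanding of RealityKit's blend-shape support, to be verified against the docs): attach a BlendShapeWeightsComponent and drive the weights through it each frame rather than via a one-shot setter, so they keep being applied while the skeletal animation plays:

    import RealityKit

    // Attach once: build the weights mapping from the skinned mesh.
    func attachBlendShapeWeights(to model: ModelEntity) {
        guard let mesh = model.model?.mesh else { return }
        let mapping = BlendShapeWeightsMapping(meshResource: mesh)
        model.components.set(BlendShapeWeightsComponent(weightsMapping: mapping))
    }

The component's weight set would then be updated per frame (for example from a custom System) for blinking or talking while playAnimation drives the body. If those weights are also overridden during playback, that would point to a genuine limitation worth filing.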
Replies: 3 · Boosts: 3 · Views: 250 · Activity: Jul ’25
Showing a MTLTexture on an Entity in RealityKit
Is there any standard way of efficiently showing an MTLTexture on a RealityKit Entity? I can't find anything proper on how to, for example, generate a LowLevelTexture out of an MTLTexture; the closest match was a two-year-old thread. In the old SceneKit app, we would just do:

    guard let material = someNode.geometry?.materials.first else { return }
    material.diffuse.contents = mtlTexture

Our flow is as follows (for visualizing the currently detected object): camera stream -> CoreML segmentation -> send the relevant part of the MLShapedArray tensor to a MTLComputeShader that returns an MTLTexture -> show the resulting texture on a 3D object to the user.
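The LowLevelTexture route works the other way around (a sketch, assuming iOS 18 / visionOS 2 era RealityKit): instead of wrapping an existing MTLTexture, create the LowLevelTexture first, then obtain a writable MTLTexture view per command buffer and have the GPU write into it. The compute output is blitted here for simplicity:

    import Metal
    import RealityKit

    // Create the texture RealityKit will sample, plus its resource handle.
    func makeStreamedTexture(width: Int, height: Int) throws -> (LowLevelTexture, TextureResource) {
        let descriptor = LowLevelTexture.Descriptor(
            pixelFormat: .bgra8Unorm,
            width: width,
            height: height,
            textureUsage: [.shaderRead, .shaderWrite]
        )
        let lowLevel = try LowLevelTexture(descriptor: descriptor)
        let resource = try TextureResource(from: lowLevel)
        return (lowLevel, resource)
    }

    // Per frame: copy the compute shader's output into the LowLevelTexture.
    // Pixel formats and sizes of the two textures must match for the blit.
    func update(_ lowLevel: LowLevelTexture, from computeOutput: MTLTexture, queue: MTLCommandQueue) {
        guard let commandBuffer = queue.makeCommandBuffer(),
              let blit = commandBuffer.makeBlitCommandEncoder() else { return }
        let target = lowLevel.replace(using: commandBuffer)
        blit.copy(from: computeOutput, to: target)
        blit.endEncoding()
        commandBuffer.commit()
    }

The TextureResource is assigned to the material once (for example as an UnlitMaterial color texture), and subsequent GPU writes show up without touching the material again; a compute kernel could also write into the target texture directly, skipping the copy.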
Replies: 5 · Boosts: 0 · Views: 963 · Activity: 2w