I have a Core Image filter in my app that uses Metal. I cannot compile it because it complains that the executable tool metal is not available, but I have installed it in Xcode.
If I go to the "Components" section of Xcode Settings, it shows it as downloaded. And if I run the suggested command, it also shows it as installed. Any advice?
Xcode Version
Version 26.0 beta (17A5241e)
Build Output
Build target Lessons of project StudyJapanese with configuration Light
RuleScriptExecution /Users/chris/Library/Developer/Xcode/DerivedData/StudyJapanese-glbneyedpsgxhscqueifpekwaofk/Build/Intermediates.noindex/StudyJapanese.build/Light-iphonesimulator/Lessons.build/DerivedSources/OtsuThresholdKernel.ci.air /Users/chris/Code/SerpentiSei/Shared/iOS/CoreImage/OtsuThresholdKernel.ci.metal normal undefined_arch (in target 'Lessons' from project 'StudyJapanese')
cd /Users/chris/Code/SerpentiSei/StudyJapanese
/bin/sh -c xcrun\ metal\ -w\ -c\ -fcikernel\ \"\$\{INPUT_FILE_PATH\}\"\ -o\ \"\$\{SCRIPT_OUTPUT_FILE_0\}\"'
'
error: error: cannot execute tool 'metal' due to missing Metal Toolchain; use: xcodebuild -downloadComponent MetalToolchain
/Users/chris/Code/SerpentiSei/StudyJapanese/error:1:1: cannot execute tool 'metal' due to missing Metal Toolchain; use: xcodebuild -downloadComponent MetalToolchain
Build failed 6/9/25, 8:31 PM 27.1 seconds
Result of xcodebuild -downloadComponent MetalToolchain (after switching to Xcode-beta.app with xcode-select)
xcodebuild -downloadComponent MetalToolchain
Beginning asset download...
Downloaded asset to: /System/Library/AssetsV2/com_apple_MobileAsset_MetalToolchain/4d77809b60771042e514cfcf39662c6d1c195f7d.asset/AssetData/Restore/022-19457-035.dmg
Done downloading: Metal Toolchain (17A5241c).
Screenshots from Xcode
Result of "Copy Information"
Metal Toolchain 26.0 [com.apple.MobileAsset.MetalToolchain: 17.0 (17A5241c)] (Installed)
Hi — we’re testing our app on iOS 26 and ran into strange behavior with GKLocalPlayer.local.authenticateHandler.
GKLocalPlayer.local.authenticateHandler = { [weak self] viewController, error in
    // additional code
}
What happens:
When we assign authenticateHandler on iOS 26 and the user is not signed in to Game Center, the system shows a full-screen Game Center overlay asking the user to sign in.
If the user taps Cancel, nothing further happens — the closure is not invoked again, so we don’t receive an error or any callback. The app never learns whether the auth was cancelled or failed.
In previous iOS versions the closure was called (with viewController / error as appropriate) and the flow worked as expected.
What we tried:
Verified authenticateHandler is being set.
Checked GKLocalPlayer.local.isAuthenticated after the overlay dismisses — it’s unchanged.
Observed system logs: a com.apple.GameOverlayUI scene is created and later removed (so the auth overlay is shown by the system).
Confirmed the same code works on earlier iOS versions.
Question:
Has anyone seen authenticateHandler not being invoked on iOS 26 when the Game Center auth overlay is presented? Could this be a behavioral change in iOS 26 (overlay runs in a separate system process), or a bug? Any suggested workarounds to reliably detect that the user cancelled the sign-in (for example: listening for willResignActive / didBecomeActive, watching for a system overlay, or saving/presenting the viewController manually)?
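For reference, here is a rough sketch of that last fallback, assuming (unverified on iOS 26) that the system overlay resigns and restores our app's active state:

import GameKit
import UIKit

// Sketch: infer a cancelled sign-in by re-checking isAuthenticated when
// the app becomes active again after the system overlay goes away.
var becameActiveObserver: NSObjectProtocol?

func configureGameCenterAuth() {
    GKLocalPlayer.local.authenticateHandler = { viewController, error in
        // On earlier iOS versions this fires again after the overlay closes.
        if let error { print("Game Center auth error: \(error)") }
        if let viewController {
            print("System provided sign-in UI: \(viewController)")
            // Optionally keep a reference and present it manually.
        }
    }

    becameActiveObserver = NotificationCenter.default.addObserver(
        forName: UIApplication.didBecomeActiveNotification,
        object: nil, queue: .main
    ) { _ in
        if !GKLocalPlayer.local.isAuthenticated {
            // Overlay dismissed without signing in: treat as cancelled.
            print("Still unauthenticated after returning to foreground")
        }
    }
}

If the overlay runs out-of-process without ever touching our scene phase, this observer won't fire, which would itself be a useful data point.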
Thanks in advance for any advice; we'd appreciate pointers or suggested diagnostics.
My MacBook Air M1 is running macOS Sonoma 14.3.1, and I tried to install the game-porting-toolkit tonight. After the step that requires me to input the command "brew -v install apple/apple/game-porting-toolkit", Terminal ran for minutes. But at the end this error appeared: Error: apple/apple/game-porting-toolkit 1.1 did not build.
I don't know anything about coding and software. Could someone please tell me what caused this error and how to fix it? I would appreciate your help!
I've had no issue calling image files in my .swift files, but they are causing crashes when used in my .sks files. When I set a sprite texture to an image in the inspector/editor bar, at runtime when that sprite is called I get the error: "Cannot get value with size 16. The type encoded as {CGRect={CGPoint=dd}{CGSize=dd}} is expected to be 32 bytes." From my research it has something to do with Apple switching from 32-bit to 64-bit machines. From ChatGPT: "SpriteKit under the hood uses NSKeyedUnarchiver to load your .sks file. That unarchiver decodes each archived property by reading a fixed-size blob of bytes and mapping it into a C struct. In your case it ran into a mismatch." I am using a 64-bit machine to write my code, and 64-bit simulators and physical devices, so there isn't a clear cause of the mismatch. My scenes play fine in Xcode 16's preview window and my code builds; it just crashes at runtime.
When I don't use image textured assets in the .sks file it works fine. It loads animated labels and plain color squares. I've been able to work around this for static things, like a sprite with a background texture, by writing code like this in a normal (non-game) Swift file:
if let scene = SKScene(fileNamed: "GameScene2") {
    let bg = SKSpriteNode(imageNamed: "YourBackgroundImage")
    bg.position = CGPoint(x: scene.frame.midX, y: scene.frame.midY)
    bg.zPosition = -1
    scene.addChild(bg)
}
The issue now is I want to make a particle emitter and other non-static sprites, but my understanding of their properties isn't deep enough to create them without the editor. Also, setting an SKTexture in a Swift file causes the same runtime crash with the 16/32 error. Could you help me figure out how to fix the bug so I can use the editor again? Otherwise, could you help me figure out how to write a workaround like I do for background images? I have a feeling the answer is in writing my own NSKeyedUnarchiver, but I don't know how to make sure it's called instead of the default one. I've already tried cleaning my code multiple times and deleting and re-adding sprite nodes. Thank you.
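In the meantime, this is the direction I'm exploring for the emitter workaround: building the SKEmitterNode entirely in code, which skips the .sks unarchiving path. All values below are placeholders to tune, and since setting an SKTexture in code also triggered the crash for me, the particleTexture line is the part to test first:

import SpriteKit

// Build the emitter in code instead of loading a .sks file, skipping the
// NSKeyedUnarchiver decode path entirely.
func makeSparkEmitter() -> SKEmitterNode {
    let emitter = SKEmitterNode()
    emitter.particleTexture = SKTexture(imageNamed: "Spark") // hypothetical asset name
    emitter.particleBirthRate = 150
    emitter.particleLifetime = 1.5
    emitter.particleSpeed = 80
    emitter.particleSpeedRange = 40
    emitter.emissionAngleRange = .pi * 2   // emit in all directions
    emitter.particleAlphaSpeed = -0.7      // fade out over the lifetime
    emitter.particleScale = 0.3
    emitter.particleScaleRange = 0.15
    emitter.particleColor = .orange
    emitter.particleColorBlendFactor = 1.0
    emitter.particleBlendMode = .add
    return emitter
}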
Hello
XQuartz is an open-source effort to develop a version of the X.Org X Window System (https://www.xquartz.org/), widely used to bring graphical support to applications running in remote servers (usually via SSH).
Since macOS Tahoe, XQuartz fails to refresh properly on window resize (more info here https://github.com/XQuartz/XQuartz/issues/438#issuecomment-3371409500), leading to severe usability issues.
The XQuartz developers are already aware of the issue, but I’m wondering if there’s anything we can do at the OS level to resolve it and restore the usual behavior from before macOS Tahoe.
Thanks,
KiM
Hi, developers,
I maintain a shipped app that uses string concatenation to construct Metal shaders and compiles them on-device. Beta 4 seems to have disabled the __asm keyword, resulting in compilation failure.
The error is:
v2/GEMMKernel.cpp:229: error: program_source:23:9: error: illegal string literal in 'asm'
__asm("air.simdgroup_async_copy_1d.p3i8.p1i8");
The relevant code is available at https://github.com/liuliu/ccv/blob/unstable/lib/nnc/mfa/v2/GEMMHeaders.cpp#L30 although any __asm will trip this.
Please give us guidance on whether this is a regression or something that will be enforced in the 26 release. Personally, I would consider this a bug, given it won't impact shaders that are already compiled.
Thanks for your patience reading this!
We are a team of engineers working on an app intended to visualize medical images. The situations where the app is used involve time-critical decision making for acute clinical conditions. Stability and performance are of utmost importance and directly support timely treatment. The app we are developing uses multiple libraries and tools like vtk, webgl, opengl, webkit, gl-matrix, etc.
The problem can be described as follows: when a 3D volume is rendered in the app and we try to rotate it, the rotation is slow, unresponsive, and laggy. Specifically, we have noticed that on iOS 18.1 the volume rotation is much smoother compared to the latest iOS 18.2. Earlier, we faced a somewhat similar issue with iOS 17, but it improved in iOS 18.1. This performance regression is affecting the user experience in our healthcare application.
We have taken reference from the cornerstone.js code and you can reproduce the issue using the following example: https://www.cornerstonejs.org/live-examples/volumeviewport3d
Steps to Reproduce:
Load the above-mentioned test example in Safari on an iPhone running iOS 18.2.
Perform volume rendering using the provided dataset.
Measure the time taken for each rotate or drag action.
Repeat the same steps on an iPhone running iOS 18.1 for comparison.
Additional Information:
Device Models Tested:
iPhone 12, iPhone 13, iPhone 14
iOS Versions With Issue:
18.2
18.3 (Beta)
I would appreciate any insights or suggestions on how to address this performance regression. If additional information is needed, please let me know.
Thank you.
We are seeing crashes in the Xcode organizer. So far we have not been able to reproduce them locally. They affect multiple app releases (some older ones built with Xcode 15.x and newer ones built with Xcode 16.0). They only affect iOS 18.5.
Is there anything that changed in the latest iOS? It's hard to tell what exactly is causing this crash, because setting a symbolic breakpoint on CA::Render::Image::new_image(unsigned int, unsigned int, unsigned int, unsigned int, CGColorSpace*, void const*, unsigned long const*, void (*)(void const*, void*), void*) triggers this breakpoint all the time, but not necessarily with the preceding stack frames matching the crash report.
Is it a known issue?
crash.crash
Thank you.
I am creating a RealityKit scene that will contain over 12,000 duplicate cubes arranged in a circle (see image below). This is for some high-energy physics simulations I am doing. I accomplish this scene by creating a single cube and cloning it a bunch of times. So there is a single MeshResource and Material even though there are a lot of entities. I have confirmed this by checking with Swift's === operator. Even with this, the program is unworkably slow.
Any suggestions or tricks that could help with this type of scene?
Using a single geometry was the trick to getting SceneKit to work fast with geometries like this. I've been updating my software to RealityKit because I far prefer the structure of RealityKit over SceneKit.
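One avenue I've been exploring, for anyone else who hits this: MeshResource exposes explicit instancing through its Contents/Instance types, so the whole ring can in principle be one entity whose mesh references a single cube model 12,000 times, rather than 12,000 cloned entities. A rough sketch, with the caveat that I'm assuming the collection types accept a plain array:

import Foundation
import RealityKit
import simd

// Collapse the clones into one entity: a single cube model referenced by
// many MeshResource.Instance entries, each with its own transform.
func makeInstancedCubeRing(count: Int, radius: Float) throws -> ModelEntity {
    let cube = MeshResource.generateBox(size: 0.01)
    var contents = cube.contents

    // Reuse the model that generateBox created; every instance points at its id.
    guard let modelID = contents.models.map({ $0.id }).first else {
        fatalError("generated box unexpectedly has no model")
    }

    contents.instances = MeshInstanceCollection(
        (0..<count).map { i in
            let angle = Float(i) / Float(count) * 2 * .pi
            var transform = matrix_identity_float4x4
            transform.columns.3 = SIMD4(radius * cos(angle), 0, radius * sin(angle), 1)
            return MeshResource.Instance(id: "cube-\(i)", model: modelID, at: transform)
        }
    )

    let mesh = try MeshResource.generate(from: contents)
    return ModelEntity(mesh: mesh, materials: [SimpleMaterial()])
}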
Hello,
I'm trying to install it following these steps https://www.applegamingwiki.com/wiki/Game_Porting_Toolkit but I get an error with 'brew install apple/apple/game-porting-toolkit'
==> tar -xf crossover-sources-22.1.1.tar.gz --include=sources/clang/* --strip-components=2
==> cmake -G Ninja -DCMAKE_VERBOSE_MAKEFILE=Off -DCMAKE_MAKE_PROGRAM=ninja -DCMAKE_VERBOSE_MAKEFILE=On -DCMAKE_OSX_ARCHITECTUR
Last 15 lines from /Users/rafael/Library/Logs/Homebrew/game-porting-toolkit-compiler/02.cmake:
-DLLVM_INSTALL_TOOLCHAIN_ONLY=On
-DLLVM_ENABLE_PROJECTS=clang
/private/tmp/game-porting-toolkit-compiler-20250519-44600-qwrjgl/llvm
CMake Error at CMakeLists.txt:3 (cmake_minimum_required):
Compatibility with CMake < 3.5 has been removed from CMake.
Update the VERSION argument <min> value. Or, use the <min>...<max> syntax
to tell CMake that the project requires at least <min> but has been updated
to work with policies introduced by <max> or earlier.
Or, add -DCMAKE_POLICY_VERSION_MINIMUM=3.5 to try configuring anyway.
-- Configuring incomplete, errors occurred!
If reporting this issue please do so to (not Homebrew/* repositories):
apple/apple
macOS 15.3.1
Thank you in advance
Regards
Anybody have the link to the sample code for the “Bring your SceneKit project to RealityKit” session? I can’t see it anywhere in the Dev sample code pages.
Topic:
Graphics & Games
SubTopic:
SceneKit
I searched the Metal Shading Language Specification Version 3.0 document, however I cannot see any function for inverting a matrix. Is there really no function in Metal for inverting a matrix?
I often need to do this in linear equations and have so far resorted to writing the necessary function each time, most of the time just copy-and-pasting code.
inverse exists in SIMD and GLSL, so why not in Metal? It seems so unexpected that this function does not exist that I am almost certain I have just overlooked something obvious. I even tried 1 / M, to no avail.
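For what it's worth, the workaround I keep returning to: when the matrix is uniform per-dispatch, invert it on the CPU with simd (where inverse does exist) and pass the result into the kernel, so the shader never needs the function. A minimal Swift-side sketch:

import simd

// Invert on the CPU, then hand the result to the shader as a regular
// argument instead of computing the inverse in MSL.
let model = simd_float4x4([
    SIMD4<Float>(1, 0, 0, 0),
    SIMD4<Float>(0, 2, 0, 0),
    SIMD4<Float>(0, 0, 4, 0),
    SIMD4<Float>(1, 2, 3, 1)
])
var modelInverse = model.inverse   // same as simd_inverse(model)

// e.g. encoder.setBytes(&modelInverse, length: MemoryLayout<simd_float4x4>.stride, index: 2)

When the matrix genuinely varies per-thread, the fallback is still hand-writing the cofactor expansion in the shader, which is exactly the copy-and-paste I'd like to avoid.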
Hello,
I've been trying to leverage instanced rendering in RealityKit on visionOS but have not had success.
RealityKit states this is supported:
https://developer.apple.com/documentation/realitykit/validating-usd-files
https://developer.apple.com/videos/play/wwdc2021/10075/?time=1373
https://developer.apple.com/videos/play/wwdc2023/10099/?time=772
RealityKit Trace metrics
Validating instancing is working:
To test, I made a base visionOS app with an immersive space and replaced the entity with my test usdz file. I've been using the RealityKit Trace profiling template in Xcode Instruments, with the immersive space open and the volume closed. This gets consistent draw call results.
If I have a single sphere mesh with one material I get one draw call, but the number of draw calls grows linearly with mesh count no matter how my entity is configured.
What I've tried
Create a test scene in Blender and export with instancing enabled
Create a test scene in Reality Composer Pro using references
Author usda files by hand based on the OpenUSD spec
Programmatically create a MeshResource with Contents at runtime (see the sanity check sketched just after this list)
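For that last route, this is the check I run to see whether a loaded mesh actually kept shared models with multiple instances, or whether import expanded everything (assuming the contents accessor reflects the loaded state):

import RealityKit

// Walk the hierarchy and compare model count vs. instance count per mesh;
// instancing that survived import shows many instances over few models.
func logInstancing(in entity: Entity) {
    if let model = entity.components[ModelComponent.self] {
        let contents = model.mesh.contents
        let models = Array(contents.models).count
        let instances = Array(contents.instances).count
        print("\(entity.name): \(models) models, \(instances) instances")
    }
    for child in entity.children {
        logInstancing(in: child)
    }
}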
References
https://openusd.org/release/api/_usd__page__scenegraph_instancing.html
https://developer.apple.com/documentation/realitykit/meshresource
https://developer.apple.com/documentation/realitykit/meshresource/instance
Thank you
Hi! I watched the WWDC25 session "Bring your SceneKit project to RealityKit" which seemed like a great resource for those of us transitioning from the now-deprecated SceneKit framework. The session mentioned that the full sample code for the project would be available to download, but I haven't been able to find it in the Code section of the video page or in the Sample Code Library.
Has the sample code been released yet? Having the project code would make it much easier to follow along with the RealityKit changes shown in the video. Thanks again for the great session.
I have published a number of games that use SpriteKit for everything important. Since the release of macOS Tahoe, I've had a lot of end user reports saying that sound effects have stopped working in many (but not all) of my titles.
I'm not doing anything unusual here – typical code is:
sndGameOver = [SKAction playSoundFileNamed:@"Audio/GameOver.wav" waitForCompletion:YES];
Then at the appropriate time:
[self runAction:sndGameOver];
Has anyone else encountered this? The code still works fine on previous operating systems, and appears to be fine on iOS too. Has something changed in macOS Tahoe?
I'm at a bit of a loss. There's nothing obviously different between the titles that do work and the titles that don't.
Suggestions welcomed!
Thanks
I'm an experienced SceneKit developer and I want to begin work on a new project using RealityKit. So I appreciated as timely, the WWDC 2025 Session, "Bring your SceneKit project to RealityKit".
However, now I am finding that:
Blender does not properly support exporting armatures in usdc files, and usdc is really the only file format that should be used for creating 3D assets for RealityKit.
The option of exporting from Blender to fbx or some other intermediate format, and then converting that to usdc, is a challenge.
Apple's Reality Converter App, which supposedly can support importing and converting fbx files to usdc, is no longer available from Apple's website. And an older copy of it I found at the Kodeco website requires Rosetta on Apple Silicon. As well, this older copy does not in fact import fbx or anything else - I find it doesn't work at all.
Apple's Reality Composer Pro, at least as far as I can tell, only supports importing usdc - it is not a file conversion tool.
Alternatively, I am under the impression that Maya supports producing usdc files with armatures, but Maya costs over $2000 per year and I am skilled with Blender, so I believe strongly that I should be able to continue with Blender. Maya's expense and skillset simply shouldn't be a requirement for building RealityKit applications.
What are my options then, if any, to produce assets with armatures and armature based animations using Blender, and then bring them into RealityKit?
It's a Broadcast Extension issue: on the iOS 26.1 beta the extension never launches. After you tap “Start Broadcast” in the system picker, the countdown disappears after 3 s and no broadcast starts, so every live-streaming app (and every other non-system app that uses a Broadcast Extension) fails to go live; only the native Photos screen recording still works. Is this a known regression, or is a new entitlement required?
Hello!
Bear with me here, as there is a lot to explain!
I am working on implementing a Game Center high score leaderboard into my game. I have looked around for examples of how to properly implement this code, but have come up short on finding much material. Therefore, I have tried implementing it myself based off information I found on apples documentation.
Long story short, I am getting success printed when I update my score, but no scores are actually being posted (or at least no scores are showing up on the Game Center leaderboard when opened).
Before I show the code, one thing I have questioned is the fact that this game is still in development. In AppStoreConnect, the status of the leaderboard is "Not Live". Does this affect scores being posted?
Onto the code. I have created a GameCenter class which handles getting the leaderboards and posting scores to a specific leaderboard. I will post the code in whole, and will discuss below what is happening.
PLEASE VIEW ATTACHED TEXT TO SEE THE GAMECENTER CLASS!
GameCenter class - https://developer.apple.com/forums/content/attachment/0dd6dca8-8131-44c8-b928-77b3578bd970
In a different GameScene, once the game is over, I request to post a new high score to Game Center with this line of code:
GameCenter.shared.submitScore(id: GameCenterLeaderboards.HighScore.rawValue)
Now onto the logic of my code. For the longest time I struggled to figure out how to submit a score. I figured out that in Xcode 12, they deprecated a lot of functions that previously worked for me. Now it seems that we have to load all leaderboards (or the ones we want). That is the purpose behind the leaderboards private variable in the GameCenter class.
On the start up of the app, I call authenticate player. Once this callback is reached, I call loadLeaderboards which will load the leaderboards for each string id in an enum that I have elsewhere. Each of these leaderboards will be created as a Leaderboard object, and saved in the private leaderboard array. This is so I have access to these leaderboards later when I want to submit a score.
Once the game is over, I am calling submitScore with the leaderboard id I want to post to. Right now, I only have a high score, but in the future I may add a parameter to this with the value so it works for other leaderboards as well. Therefore, no value is passed in since I am pulling from local storage which holds the high score.
submitScore will get the leaderboard from the private leaderboard array that has the same id as the one passed in. Once I get the correct leaderboard, I submit a score to that leaderboard. Once the callback is hit, I receive the output "Successfully submitted score to leaderboard". This looks promising, except for the fact that no score is actually posted.
At startup, I am calling updatePlayerHighScore, which is not complete - but for the purpose of my point, retrieves the high score of the player from the leaderboard and is printing it out to the console. It is printing out (0), meaning that no score was posted.
The last thing I have questions about is the context when submitting a score. According to the documentation, this seems to just be metadata that GameCenter does not care about, but rather something the developer can use. Therefore, I think I can cross this off as causing the problem.
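For concreteness, stripped of my wrapper class, the submission I'm describing boils down to the newer GKLeaderboard class method (the leaderboard ID below is a placeholder):

import GameKit

// Minimal submission path (iOS 14+ API); the class method takes the
// leaderboard IDs directly, so no pre-loaded leaderboard objects are needed.
func submitHighScore(_ value: Int) {
    guard GKLocalPlayer.local.isAuthenticated else { return }
    GKLeaderboard.submitScore(
        value,
        context: 0,
        player: GKLocalPlayer.local,
        leaderboardIDs: ["com.example.highscore"] // placeholder ID
    ) { error in
        if let error {
            print("Score submission failed: \(error)")
        } else {
            print("Successfully submitted score to leaderboard")
        }
    }
}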
I believe I implemented this correctly, but for some reason, nothing is posting to the leaderboard. This was a lot, but I wanted to make sure I got all my thoughts down.
Any help on why this is NOT posting would be awesome! Thanks so much!
Mark
Hi everyone,
I’m running into an issue with RealityKit when trying to animate BlendShapes (ShapeKeys) while a skeletal animation is playing. The model is a rigged character in .usdz format with both predefined skeletal animations and BlendShapes (exported from Blender).
The problem: when I play any animation using entity.playAnimation(...), the BlendShapes stop responding. Calling setBlendShapes(...) still logs that weights are being updated, but no visual change occurs.
The exact same blend shape animation works perfectly when no animation is playing.
In SceneKit the same model works as expected: shape keys get animated during animation playback, but not in RealityKit.
Still, as soon as an animation starts, the shape keys don’t animate anymore.
Here’s the test project on GitHub that demonstrates the issue clearly:
https://github.com/IAMTHEBURT/RealityKitWitnBlendShapesSample
The goal is to play facial expressions (like blinking or talking) while a body animation (like waving) is playing.
Is this a known limitation in RealityKit? Or is there a recommended way to combine skeletal animations with real-time BlendShape updates?
Thanks in advance for any insights.
Hi everyone,
I'm developing an ARKit app using RealityKit and encountering an issue where a video displayed on a 3D plane shows up as a pink screen instead of the actual video content.
Here's a simplified version of my setup:
func createVideoScreen(video: AVPlayerItem, canvasWidth: Float, canvasHeight: Float, aspectRatio: Float, fitsWidth: Bool = true) -> ModelEntity {
    let width = fitsWidth ? canvasWidth : canvasHeight * aspectRatio
    let height = fitsWidth ? canvasWidth * (1 / aspectRatio) : canvasHeight
    let screenPlane = MeshResource.generatePlane(width: width, depth: height)
    let videoMaterial: Material = createVideoMaterial(videoItem: video)
    let videoScreenModel = ModelEntity(mesh: screenPlane, materials: [videoMaterial])
    return videoScreenModel
}

func createVideoMaterial(videoItem: AVPlayerItem) -> VideoMaterial {
    let player = AVPlayer(playerItem: videoItem)
    let videoMaterial = VideoMaterial(avPlayer: player)
    player.play()
    return videoMaterial
}
Despite following the standard process, the video plane renders pink. Has anyone encountered this before, or does anyone know what might be causing it?
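One diagnostic I plan to try, on the assumption that the pink frame means the material samples before the item is ready (or after it failed): hold off on play() until the item reports readyToPlay, and log any error instead of playing blindly.

import AVFoundation

// Start playback only once the item is ready, and surface a failure
// reason instead of silently rendering a placeholder frame.
var statusObservation: NSKeyValueObservation?

func playWhenReady(_ item: AVPlayerItem, on player: AVPlayer) {
    statusObservation = item.observe(\.status, options: [.initial, .new]) { item, _ in
        switch item.status {
        case .readyToPlay:
            player.play()
        case .failed:
            print("AVPlayerItem failed: \(item.error?.localizedDescription ?? "unknown error")")
        default:
            break
        }
    }
}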
Thanks in advance!