I've been getting intermittent failures in Xcode when compiling my app for multiple platforms, because it fails to compile a Metal shader:
The Metal Toolchain was not installed and could not compile the Metal source files. Download the Metal Toolchain from Xcode > Settings > Components and try again.
Sometimes if I re-run it, it works fine. Then I'll run it again, and it will fail.
If you tell me to file a feedback, please tell me what information would be useful and actionable, because this is all I have.
Metal
Render advanced 3D graphics and perform data-parallel computations using graphics processors using Metal.
Posts under Metal tag
JAX Metal shows 55x slower random number generation compared to NVIDIA CUDA on equivalent workloads. This makes Monte Carlo simulations and scientific computing impractical on Apple Silicon.
Performance Comparison
NVIDIA GPU: 0.475s for 12.6M random elements
M1 Max Metal: 26.3s for same workload
Performance gap: 55x slower
Environment
Apple M1 Max, 64GB RAM, macOS Sequoia Version 15.6.1
JAX 0.4.34, jax-metal latest
Backend: Metal
Reproduction Code
import time
import jax
import jax.numpy as jnp
from jax import random

key = random.PRNGKey(42)
start_time = time.time()
random_array = random.normal(key, (50000, 252))
# JAX dispatches asynchronously; block until the computation actually
# finishes before stopping the clock.
random_array.block_until_ready()
duration = time.time() - start_time
print(f"Duration: {duration:.3f}s")
I'm using the WebGPU abstraction library wgpu to build an app with compute shaders that compile to Metal (on macOS). In certain patterns where it uses a staging buffer for initial data, that data is simply missing from the GPU capture, breaking workflows such as shader debugging or inspecting the completed results in the final buffer.
I wrote up details, including a repro project and screenshots of the issue, at https://github.com/gfx-rs/wgpu/issues/8111. It seems like an Xcode bug. Any ideas? I'm happy to help investigate further if I can.
If I compile a compute kernel with a call to texture.read(), it fails with the following error: "Error Domain=AGXMetalG13X Code=3 "Encountered unlowered function call to air.get_read_sampler" UserInfo={NSLocalizedDescription=Encountered unlowered function call to air.get_read_sampler}."
This error occurs on both macOS and iOS 26 Beta 5, but not when running in a simulator or in a playground. It does not occur on a macOS Sequoia VM. It occurs whether I use the older Metal 3 or the new Metal 4 compilation method.
A workaround would be to use a sampler, but according to the feature tables, all platforms support reading from textures of all formats.
Below is a minimal example which produces the error:
let device = MTLCreateSystemDefaultDevice()!
let library = device.makeDefaultLibrary()!
let computeFunction = library.makeFunction(name: "compute_test")!
do {
    let pipeline = try device.makeComputePipelineState(function: computeFunction)
    debugPrint(pipeline)
} catch {
    debugPrint("Metal 3 failed with error:\n\(error)")
}
#include <metal_stdlib>
using namespace metal;

kernel void compute_test(uint2 gid [[thread_position_in_grid]],
                         texture2d<float, access::read> in [[texture(0)]],
                         texture2d<float, access::write> out [[texture(1)]]) {
    out.write(in.read(gid), gid);
}
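For reference, a minimal sketch of the sampler workaround mentioned above (the kernel name and the nearest-neighbor, pixel-coordinate sampler are illustrative choices, not a confirmed fix):

#include <metal_stdlib>
using namespace metal;

// Workaround sketch: fetch texels via an explicit sampler instead of read(),
// avoiding the air.get_read_sampler call that fails to lower.
kernel void compute_test_sampled(uint2 gid [[thread_position_in_grid]],
                                 texture2d<float, access::sample> in [[texture(0)]],
                                 texture2d<float, access::write> out [[texture(1)]]) {
    constexpr sampler s(coord::pixel, filter::nearest);
    out.write(in.sample(s, float2(gid) + 0.5), gid);
}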
I filed feedback FB19530049.
Failed fetching catalog for assetType (com.apple.MobileAsset.MetalToolchain), serverParameters ({
RequestedBuild = 17A5295f;
})
Domain: DVTDownloadsUtilitiesErrorDomain
Code: -1
User Info: {
DVTErrorCreationDateKey = "2025-08-08 07:59:24 +0000";
}
Failed fetching catalog for assetType (com.apple.MobileAsset.MetalToolchain), serverParameters ({
RequestedBuild = 17A5295f;
})
Domain: DVTDownloadsUtilitiesErrorDomain
Code: -1
Download failed due to not being able to find the host. (Catalog download for com.apple.MobileAsset.MetalToolchain)
Domain: com.apple.MobileAssetError.Download
Code: 59
User Info: {
checkConfiguration = 1;
}
System Information
macOS Version 26.0 (Build 25A5327h)
Xcode 26.0 (24198.5) (Build 17A5295f)
Timestamp: 2025-08-08T08:59:24+01:00
With Xcode 26.0 beta 5 (17A5295f) when I run the following command
xcodebuild -downloadComponent metalToolchain
I get the following error:
xcodebuild[48851:12478851] Writing error result bundle to /var/folders/b_/g67r_tl557z244g20ncr_qmsd9wrz1/T/ResultBundle_2025-07-08_11-10-0012.xcresult
xcodebuild: error: Failed fetching catalog for assetType (com.apple.MobileAsset.MetalToolchain), serverParameters ({
RequestedBuild = 17A5295f;
})
I can't install the toolchain from the Xcode GUI either. Does anyone know a workaround?
I'm unable to download the Metal toolchain with Xcode. When trying to do it via the command line, I get the following:
> xcodebuild -downloadComponent metalToolchain -exportPath /tmp/MyMetalExport/
Beginning asset download...
2025-08-06 19:46:19.983 xcodebuild[1395:22024] Writing error result bundle to /var/folders/48/1k1jfsxn56zcs4qr_719rc1w0000gn/T/ResultBundle_2025-06-08_19-46-0019.xcresult
xcodebuild: error: Failed fetching catalog for assetType (com.apple.MobileAsset.MetalToolchain), serverParameters ({
RequestedBuild = 17A5295f;
})
From Console.app:
DVTDownloadsFetchAssetCatalog() complete assetType (com.apple.MobileAsset.MetalToolchain), options: (MADownloadOptions allowsCellular: 0 resourceTimeout: 60 canUseCacheServer: 0 discretionary: 0 disableUI: 0 sessionId: (null) additionalServerParams:{ RequestedBuild = 17A5295f; } allowsExpensiveAccess:1 requiresPowerPluggedIn: 0 prefersInfraWiFi: 1 liveServerOnly: -1 DownloadAuthorizationHeader: not present analyticsData: not present allowDaemonConnectionRetries: 0), result: (59), catalogError: (Download failed due to not being able to find the host. (Catalog download for com.apple.MobileAsset.MetalToolchain))
Note that I am online. I've tried restarting my computer and Xcode and using a different network.
Hi there,
Is it possible to customize the Metal Performance HUD on Apple TV, similar to how it can be done on iPhone and iPad?
I'd like to see things like compiled shaders for my apps on tvOS.
We package for iOS using UE 5.5 on Windows, and UE automatically launches the Metal Developer Tools for Windows. I tried two versions, 5.3 and 4.4, and found that metal.exe sometimes hangs with no response, or its output is empty. The hangs can be mitigated by adding timeout retries, but retries don't seem to help with the empty output. The actual command line invoked is metal.exe -v --target=air64-apple-darwin18.7.0. UE relies on this output to determine the version number, and it crashes outright if it can't get one.
During our replication tests, we used a Python script to run multiple concurrent executions of that command (a sketch of the approach is below). On different Windows machines, it would get stuck at a certain concurrency level; some machines got stuck with only two concurrent tasks.
We would like to ask for solutions or available versions.
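For anyone trying to reproduce, a minimal sketch of that kind of script (the timeout, worker count, and iteration count are illustrative):

import subprocess
from concurrent.futures import ThreadPoolExecutor

CMD = ["metal.exe", "-v", "--target=air64-apple-darwin18.7.0"]

def run_once(i):
    """Run one metal.exe -v invocation and classify the outcome."""
    try:
        proc = subprocess.run(CMD, capture_output=True, text=True, timeout=30)
        output = (proc.stderr or proc.stdout).strip()  # clang-style tools print -v info to stderr
        return i, output if output else "EMPTY OUTPUT"
    except subprocess.TimeoutExpired:
        return i, "TIMEOUT (no response)"

# Launch several invocations concurrently, as UE's build step can.
with ThreadPoolExecutor(max_workers=8) as pool:
    for i, result in pool.map(run_once, range(16)):
        print(i, result)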
Hi!
I'm currently trying to render another XR scene in front of a RealityKit one.
Actually, I'm anchoring a plane to the head, with a shader that displays side-by-side images for the left and right eyes. By default the camera has a near plane, so I can't directly draw at z=0.
Is there a way to change the camera's near plane? Or is there a better solution for overlaying an image/texture per eye?
Ideally, I would layer some kind of CompositorLayer on RealityKit, but that's sadly not possible from what I know.
Thanks in advance and have a good day!
I have really enjoyed looking through the code and videos related to Metal 4. Currently, my interest is in updating a ReSTIR project to take advantage of more robust ways to refit acceleration structures and more powerful ways to access resources.
I am working in Swift and have encountered a couple of puzzles:
What is the accepted way to create a MTL4BufferRange to store indices and vertices?
How do I properly rewrite Swift code to build and compact an acceleration structure?
I do realize that this is all in beta and will happily look through code samples this fall. If other guidance is available earlier, that would be fabulous!
Thank you
Hello!
I'm developing a GPU (shader) language, where I aim to target multiple backends with a common frontend. I wanted to avoid having to round-trip through Metal source and instead go straight to IR, just as I do with SPIR-V, in order to have a fast and efficient compilation process.
I've been looking for a reference page where I can read about Metal's IR; as far as I'm aware one exists, but I can't seem to find it anywhere.
Furthermore, if such a reference is available, is there also a toolkit where I can run validation on the output IR, and perhaps even run optimizations, much like SPIRV-Tools for SPIR-V?
Any help would be appreciated!
Thanks,
Gustav
I would love to use Background GPU Access to do some video processing in the background.
However the documentation of BGContinuedProcessingTaskRequest.Resources.gpu clearly states:
Not all devices support background GPU use. For more information, see Performing long-running tasks on iOS and iPadOS.
Is there a list available of currently released devices that do (or don't) support GPU background usage? That would help to understand what part of our user base can use this feature. (And what hardware we need to test this on as developers.)
For example, it seems that it isn't supported on an iPad Pro M1 with the current iOS 26 beta. The simulators also seem not to support the background GPU resource. So it would be great to understand what hardware is capable of using this feature!
Hello everyone,
I must have missed something, but why isn't there a depthAttachmentPixelFormat on the new Metal 4 MTL4RenderPipelineDescriptor, unlike the old MTLRenderPipelineDescriptor?
So how do you set the depth pixel format?
Thanks in advance!
Hello, I'm tracking down a bug where useResource doesn't seem to apply proper synchronization when a resource is produced by a render pass and then consumed by a compute pass. When I use an MTLFence to signal and wait between the render and compute encoders, the artifact goes away.
The resource is created with MTLHazardTrackingModeTracked, and useResource is called on the compute encoder after the render pass. Metal API Validation doesn't report any warnings or errors.
Am I misunderstanding the difference between the two APIs? I dug through the Metal documentation, and it looks like useResource should handle synchronization given that the resource has MTLHazardTrackingModeTracked; on the other hand, MTLFence should be used to ensure proper synchronization between command encoders. Can someone clarify the difference between the two APIs and when to use each?
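For concreteness, this is the explicit-fence pattern that makes the artifact disappear; a minimal sketch, where device, commandBuffer, renderPassDescriptor, and resource stand in for my actual setup:

import Metal

let fence = device.makeFence()!

// Render pass: writes to the resource.
let renderEncoder = commandBuffer.makeRenderCommandEncoder(descriptor: renderPassDescriptor)!
// ... encode draws that write the resource ...
renderEncoder.updateFence(fence, after: .fragment)  // signal once fragment-stage writes finish
renderEncoder.endEncoding()

// Compute pass: reads the resource.
let computeEncoder = commandBuffer.makeComputeCommandEncoder()!
computeEncoder.waitForFence(fence)                  // wait before any compute work starts
computeEncoder.useResource(resource, usage: .read)
// ... encode dispatches that read the resource ...
computeEncoder.endEncoding()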
WWDC25: Combine Metal 4 machine learning and graphics
The session demonstrated a way to combine a neural network with the graphics pipeline directly in the shaders, using texture compression as an example. However, there is no mention of which ML technique is used to compress the texture.
Can anyone point me to some well-known model(s) for the particular use case shown in WWDC25?
Hello community,
After being unable to use the Metal Toolchain with Xcode Beta 1 and 2 even after following the workarounds, and now using Xcode Beta 3, I am still unable to use the downloaded Metal compiler on my device.
Building any project with a Metal file results in:
warning: Could not read serialized diagnostics file: error("Failed to open diagnostics file") (in target '<target>' from project '<project>')
error: error: cannot execute tool 'metal' due to missing Metal Toolchain; use: xcodebuild -downloadComponent MetalToolchain
Command CompileMetalFile failed with a nonzero exit code
Here is my build environment,
$ xcodebuild -version
Xcode 26.0
Build version 17A5276g
I have also checked the downloaded metal toolchain.
$ xcodebuild -downloadComponent metalToolchain -exportPath /tmp/MyMetalExport/
2025-07-13 13:16:17.293 xcodebuild[2153:34019] IDEDownloadableMetalToolchainCoordinator: Failed to remount the Metal Toolchain: The file “com.apple.MobileAsset.MetalToolchain-v17.1.5276.7.3KEJwX” couldn’t be opened because you don’t have permission to view it.
Beginning asset download...
2025-07-13 13:16:17.427 xcodebuild[2153:34022] IDEDownloadableMetalToolchainCoordinator: Failed to remount the Metal Toolchain: The file “com.apple.MobileAsset.MetalToolchain-v17.1.5276.7.3KEJwX” couldn’t be opened because you don’t have permission to view it.
Downloaded asset to: /System/Library/AssetsV2/com_apple_MobileAsset_MetalToolchain/47af11e2964f385d510c6a9d1a49c8165f334a51.asset/AssetData/Restore/022-19457-052.dmg
Beginning asset export...
Done exporting: /tmp/MyMetalExport/MetalToolchain-17A5276g.exportedBundle
Done downloading: Metal Toolchain 17A5276g.
The ExportMetadata.plist has a buildUpdateVersion of 17A5276g.
Any suggestions are greatly appreciated.
I have discovered that RemoteImmersiveSpace is limited to content conforming to the CompositorContent protocol, precluding direct use of RealityView. Consequently, I am interested in understanding the appropriate method for integrating CompositorContent within RemoteImmersiveSpace. Thanks.
Under Xcode 26 beta 3 I'm trying to build a project which uses Metal. I've installed the Metal Toolchain 26.0 under Settings -> Components, but when I start a build it fails during the "Prepare build" step with the following error (repeated many times):
stat(/var/run/com.apple.security.cryptexd/mnt/com.apple.MobileAsset.MetalToolchain-v17.1.5276.7.Pb9SLL/Metal.xctoolchain/usr/bin/clang): No such file or directory (2)
I've confirmed that there is in fact no 'clang' binary in that directory. I've tried using xcode-select to set the Xcode 26 Beta app as the active developer directory, and xcodebuild -version shows:
Xcode 26.0
Build version 17A5276g
Any ideas on other things to try?
Hi,
My application hits the crash backtrace below at a very low repro rate among public users; I don't see it correlate with a specific iOS version or iPhone model. The last line of my application's code in the trace is a call to the CAMetalLayer nextDrawable API.
I did some basic investigation and suspect it may relate to an invalid CAMetalLayer configuration, such as:
frame property w or h <= 0.0
bounds property w or h <= 0.0
drawableSize w or h <= 0.0 or w or h > max value (like 16384)
I'm not sure whether my thinking above is right. Could the UIView that my CAMetalLayer is attached to cause such a nextDrawable crash?
Thanks a lot
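Since the backtrace below asserts inside -[MTLTextureDescriptorInternal validateWithDevice:], degenerate drawable geometry seems a plausible trigger. A defensive check along these lines is what I'm considering (the function and the 16384 limit are illustrative; the actual maximum texture dimension depends on the GPU family):

import QuartzCore

// Sketch: skip the frame instead of asserting inside Metal when the
// layer's drawable size is degenerate or exceeds the texture size limit.
func safeNextDrawable(from layer: CAMetalLayer) -> CAMetalDrawable? {
    let size = layer.drawableSize
    let maxDimension: CGFloat = 16384
    guard size.width >= 1, size.height >= 1,
          size.width <= maxDimension, size.height <= maxDimension else {
        return nil
    }
    return layer.nextDrawable()
}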
Main Thread - Crashed
libsystem_kernel.dylib  __pthread_kill
libsystem_c.dylib  abort
libsystem_c.dylib  __assert_rtn
Metal  MTLReportFailure.cold.1
Metal  MTLReportFailure
Metal  _MTLMessageContextEnd
Metal  -[MTLTextureDescriptorInternal validateWithDevice:]
AGXMetalA13  0x245b1a000 + 4522096
QuartzCore  allocate_drawable_texture(id<MTLDevice>, __IOSurface*, unsigned int, unsigned int, MTLPixelFormat, unsigned long long, CAMetalLayerRotation, bool, NSString*, unsigned long)
QuartzCore  get_unused_drawable(_CAMetalLayerPrivate*, CAMetalLayerRotation, bool, bool)
QuartzCore  CAMetalLayerPrivateNextDrawableLocked(CAMetalLayer*, CAMetalDrawable**, unsigned long*)
QuartzCore  -[CAMetalLayer nextDrawable]
SpaceApp  -[MetalRender renderFrame:] MetalRenderer.mm:167
SpaceApp  -[FrameBuffer acceptFrame:] VideoRender.mm:173
QuartzCore  CA::Display::DisplayLinkItem::dispatch_(CA::SignPost::Interval<(CA::SignPost::CAEventCode)835322056>&)
QuartzCore  CA::Display::DisplayLink::dispatch_items(unsigned long long, unsigned long long, unsigned long long)
QuartzCore  CA::Display::DisplayLink::dispatch_deferred_display_links(unsigned int)
UIKitCore  _UIUpdateSequenceRun
UIKitCore  schedulerStepScheduledMainSection
UIKitCore  runloopSourceCallback
CoreFoundation  __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__
CoreFoundation  __CFRunLoopDoSource0
CoreFoundation  __CFRunLoopDoSources0
CoreFoundation  __CFRunLoopRun
CoreFoundation  CFRunLoopRunSpecific
GraphicsServices  GSEventRunModal
UIKitCore  -[UIApplication _run]
UIKitCore  UIApplicationMain