Render advanced 3D graphics and perform data-parallel computations on graphics processors using Metal.

Posts under Metal tag

144 Posts


Help Request! How to Render Models with SubMeshes Using Metal 4?
Hi, I'm a beginner with Metal 4 and Model I/O 🥺. I can render simple models with just one mesh, but when I try to render models with submeshes, nothing shows up on screen. Can anyone help me figure out how to properly render models with multiple submeshes? I think I'm not iterating through them correctly, or maybe I'm missing some buffer setup. Here's what I have so far: https://www.icloud.com.cn/iclouddrive/0a6x_NLwlWy-herPocExZ8g3Q#LoadModel
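For reference, a minimal sketch of the usual MetalKit submesh loop (not the poster's code, and using the Metal 3-style MTLRenderCommandEncoder rather than Metal 4's new encoder; the structure carries over). A common cause of an empty screen is issuing only one draw call instead of one indexed draw per submesh:

```swift
import MetalKit

// `meshes` and `encoder` are assumed to exist in the caller's render loop.
func draw(meshes: [MTKMesh], encoder: MTLRenderCommandEncoder) {
    for mesh in meshes {
        // Bind every vertex buffer the mesh carries (positions, normals, UVs, ...).
        for (index, buffer) in mesh.vertexBuffers.enumerated() {
            encoder.setVertexBuffer(buffer.buffer, offset: buffer.offset, index: index)
        }
        // A model with N submeshes needs N indexed draw calls, each with its
        // own index buffer slice and primitive type.
        for submesh in mesh.submeshes {
            encoder.drawIndexedPrimitives(type: submesh.primitiveType,
                                          indexCount: submesh.indexCount,
                                          indexType: submesh.indexType,
                                          indexBuffer: submesh.indexBuffer.buffer,
                                          indexBufferOffset: submesh.indexBuffer.offset)
        }
    }
}
```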
Replies: 1 · Boosts: 0 · Views: 262 · Nov ’25
Cannot load .mtlpackage to MTLLibrary
After watching the WWDC 2025 session "Combine Metal 4 machine learning and graphics", I decided to give it a shot and integrate the new MTL4MachineLearningCommandEncoder into my existing render pipeline. After a lot of trial and error, I managed to set up the pipeline and get the app to compile. However, I am now stuck on creating a MTLLibrary from a .mtlpackage. Here is the code I have to create a MTLLibrary, following the WWDC session (https://developer.apple.com/videos/play/wwdc2025/262/?time=550):

```swift
let coreMLFilePath = bundle.path(forResource: "my_model", ofType: "mtlpackage")!
let coreMLURL = URL(string: coreMLFilePath)!
do {
    let library = try metalDevice.makeLibrary(URL: coreMLURL)
} catch {
    print("error: \(error)")
}
```

With the above code, I am getting this error:

```
Error Domain=MTLLibraryErrorDomain Code=1 "Invalid metal package" UserInfo={NSLocalizedDescription=Invalid metal package}
```

What is the correct way to create a MTLLibrary from a .mtlpackage? Do I see this error because the .mtlpackage I am using is incorrect? How should I go about debugging this? I'd really appreciate some help, as I have been stuck on this for some time now. Thanks in advance!
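One detail worth checking (an editorial note, not from the session): URL(string:) on a bare filesystem path produces a URL with no file:// scheme, which Metal may reject before it ever inspects the package. A minimal sketch using a proper file URL, with the same assumed resource name:

```swift
import Metal

// "my_model.mtlpackage" is assumed to be bundled with the app.
guard let packageURL = Bundle.main.url(forResource: "my_model",
                                       withExtension: "mtlpackage") else {
    fatalError("my_model.mtlpackage not found in bundle")
}
let device = MTLCreateSystemDefaultDevice()!
do {
    // Bundle.url(forResource:) returns a file URL directly, so no
    // URL(string:) round-trip is needed.
    let library = try device.makeLibrary(URL: packageURL)
    print("Loaded library: \(library)")
} catch {
    print("Failed to load .mtlpackage: \(error)")
}
```

If the error persists with a genuine file URL, the package contents themselves are the next suspect.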
Replies: 0 · Boosts: 0 · Views: 227 · Nov ’25
Are there complete code examples available for “Combine Metal 4 machine learning and graphics”?
Hello, I recently watched the WWDC2025 session titled "Combine Metal 4 machine learning and graphics" (https://developer.apple.com/videos/play/wwdc2025/262/), and I'm very excited about the new Metal 4 features that integrate machine learning with graphics, such as neural ambient occlusion, shader-based ML inference, and the use of MTLTensor and MTL4MachineLearningCommandEncoder. While the session includes helpful code snippets and a compelling debug demo (e.g., the neural ambient occlusion example), the implementation details are not fully shown, and I haven't been able to find a complete, runnable sample project that demonstrates end-to-end integration of ML and rendering in Metal 4. Would Apple be able to provide a full, working example, such as an Xcode project, that shows how to:

- Export a model to an .mlpackage,
- Convert it to an .mtlpackage,
- Use MTL4MachineLearningCommandEncoder alongside render passes,
- Or embed small neural networks directly in shaders using Shader ML?

Having such a sample would greatly help developers like me adopt these powerful new capabilities correctly and efficiently. Thank you very much for your time and support! Best regards,
Replies: 4 · Boosts: 2 · Views: 928 · Nov ’25
no tensorflow-metal past tf 2.18?
Hi, we're on TensorFlow 2.20, which now (finally!) supports Python 3.13. tensorflow-metal still only supports TF 2.18, which is over a year old. When can we expect tensorflow-metal support for TF 2.20 (or later)? I bought a Mac thinking I would be able to get great performance from the M-series processors, but here I am using my CPU for my ML projects. If it's taking so long to release, why not open-source it so the community can keep it up to date? Cheers, Matt
Replies: 1 · Boosts: 1 · Views: 362 · Nov ’25
Unable to compile Core Image filter on Xcode 26 due to missing Metal toolchain
I have a Core Image filter in my app that uses Metal. I cannot compile it because it complains that the executable tool 'metal' is not available, even though I have installed it in Xcode. The "Components" section of Xcode Settings shows it as downloaded, and running the suggested command also reports it as installed. Any advice?

Xcode version: 26.0 beta (17A5241e)

Build output (showing all errors only):

```
Build target Lessons of project StudyJapanese with configuration Light
RuleScriptExecution /Users/chris/Library/Developer/Xcode/DerivedData/StudyJapanese-glbneyedpsgxhscqueifpekwaofk/Build/Intermediates.noindex/StudyJapanese.build/Light-iphonesimulator/Lessons.build/DerivedSources/OtsuThresholdKernel.ci.air /Users/chris/Code/SerpentiSei/Shared/iOS/CoreImage/OtsuThresholdKernel.ci.metal normal undefined_arch (in target 'Lessons' from project 'StudyJapanese')
    cd /Users/chris/Code/SerpentiSei/StudyJapanese
    /bin/sh -c xcrun\ metal\ -w\ -c\ -fcikernel\ \"${INPUT_FILE_PATH}\"\ -o\ \"${SCRIPT_OUTPUT_FILE_0}\"
error: error: cannot execute tool 'metal' due to missing Metal Toolchain; use: xcodebuild -downloadComponent MetalToolchain
/Users/chris/Code/SerpentiSei/StudyJapanese/error:1:1: cannot execute tool 'metal' due to missing Metal Toolchain; use: xcodebuild -downloadComponent MetalToolchain
Build failed  6/9/25, 8:31 PM  27.1 seconds
```

Result of xcodebuild -downloadComponent MetalToolchain (after switching to Xcode-beta.app with xcode-select):

```
xcodebuild -downloadComponent MetalToolchain
Beginning asset download...
Downloaded asset to: /System/Library/AssetsV2/com_apple_MobileAsset_MetalToolchain/4d77809b60771042e514cfcf39662c6d1c195f7d.asset/AssetData/Restore/022-19457-035.dmg
Done downloading: Metal Toolchain (17A5241c).
```

Result of "Copy Information" from the Xcode screenshots:

```
Metal Toolchain 26.0 [com.apple.MobileAsset.MetalToolchain: 17.0 (17A5241c)] (Installed)
```
Replies: 25 · Boosts: 0 · Views: 3.3k · Oct ’25
Xcode_26 not compiling Metal project
Hello, Xcode 26.0.1 (17A400) is missing some Metal components. When building a program that uses Metal, I get an unexpected error that terminates the build:

```
error: error: cannot execute tool 'metal' due to missing Metal Toolchain; use: xcodebuild -downloadComponent MetalToolchain
Command CompileMetalFile failed with a nonzero exit code
```

The suggested fix, "xcodebuild -downloadComponent MetalToolchain", does not work even with sudo. Has anyone found a workaround or managed to resolve the issue? Many thanks, Jean. (MacBook Air M4; macOS 26.0.1; Xcode 26.0.1)
Replies: 3 · Boosts: 2 · Views: 313 · Oct ’25
Metal recommendedMaxWorkingSetSize vs actual RAM on iPhone (LLM load fails)
Context: I'm deploying large language models on iPhone using llama.cpp. A new iPhone Air (12 GB RAM) reports a Metal MTLDevice.recommendedMaxWorkingSetSize of 8,192 MB, and my attempt to load Llama-2-13B Q4_K (~7.32 GB of weights) fails during model initialization.

Environment:
- Device: iPhone Air (12 GB RAM)
- iOS: 26
- Xcode: 26.0.1
- Build: llama.cpp with the Metal backend enabled
- App runs on device (not the Simulator)

What I'm seeing:
- MTLCreateSystemDefaultDevice().recommendedMaxWorkingSetSize == 8192 MiB
- Loading Llama-2-13B Q4_K (7.32 GB) fails to complete. Logs indicate memory pressure / allocation issues consistent with the 8 GB working-set guidance.
- Smaller models (e.g., 7B/8B with similar quantization) load and run; 8B Q4_K decodes at around 9 tokens/second.

Questions:
- Is 8,192 MB an expected recommendedMaxWorkingSetSize on a 12 GB iPhone? What values should I expect on other 2025 devices, including iPhone 17 (8 GB RAM) and iPhone 17 Pro (12 GB RAM)?
- Is it strictly enforced for Metal allocations (heaps/buffers), or advisory for best performance/eviction behavior?
- Can a process practically exceed it for long-lived buffers without immediate Jetsam risk?
- Any guidance for LLM scenarios near the limit?
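On the enforcement question: Apple documents recommendedMaxWorkingSetSize as an approximation of how much memory the device can use with good performance, i.e. advisory rather than a hard per-allocation cap, though sustained overshoot on iOS invites eviction or Jetsam. A minimal Swift sketch of a pre-load headroom check (the 7.32 GB figure is from the post; the 10% margin is an arbitrary assumption):

```swift
import Metal

// Check headroom against the advisory working-set size before committing
// to a large weight allocation.
let device = MTLCreateSystemDefaultDevice()!
let budget = device.recommendedMaxWorkingSetSize   // bytes; advisory, not a hard cap
let weightBytes: UInt64 = 7_320_000_000            // ~7.32 GB of Q4_K weights
let margin = budget / 10                           // leave ~10% headroom (assumption)

if weightBytes + margin > budget {
    print("Weights (\(weightBytes >> 20) MiB) are near or over the \(budget >> 20) MiB working set; expect eviction pressure or failed allocations.")
} else {
    print("Within budget; currently allocated: \(device.currentAllocatedSize >> 20) MiB")
}
```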
Replies: 0 · Boosts: 0 · Views: 490 · Oct ’25
realitytool requires Metal for this operation and it is not available in this build environment
Hello, I'm getting started with Xcode Cloud for my project. Since I upgraded to the macOS Sequoia beta, Xcode 16 refuses to archive builds for TestFlight. Somewhere very late in the build process I get the following error:

```
realitytool requires Metal for this operation and it is not available in this build environment
```

The log says this happens at: Compile Skybox urban.skybox. My project uses RealityKit. How can I fix this issue? Thanks!
Replies: 5 · Boosts: 5 · Views: 950 · Oct ’25
Help Configuring Unity for Immersive VR on Vision Pro with Pinch Teleport
How do I configure a Unity project for a fully immersive VR app on Apple Vision Pro using Metal rendering, and add a simple pinch-to-teleport-where-looking feature? I've tried the available samples and docs, but they don't cover this clearly (to me). So far, I've reviewed the Unity XR docs, Apple dev guides, and tutorials, but most emphasize spatial apps; Metal examples exist but don't include teleportation. Specifically:

- visionOS sample "XRI_SimpleRig": deploys to device/simulator, but no full immersion or teleport.
- XRI Toolkit sample "XR Origin Hands (XR Rig)": pinch gestures are detected, but not linked to movement.
- visionOS "XR Plugin" sample "Metal Sample URP": Metal setup works, but it's a static scene without locomotion.

I'm new to Unity XR development and would appreciate a simple, standalone scene or document focused only on the essentials for "teleport to gaze on pinch" in VR mode, with no extra features. I do have some experience with Unreal, WorldToolKit, Cosmo, etc. from the '90s, and I'm OK with code. Please include steps for:

- Setting up immersive VR (disabling spatial defaults if needed).
- Integrating pinch detection with ray-based teleport.
- Any config changes or basic scripts.

Project configuration:
- Unity Editor version: 6000.2.5f1.2588.7373 (Revision: 6000.2/staging 43d04cd1df69)
- Installed packages: Apple visionOS XR Plugin 2.3.1, AR Foundation 6.2.0, PolySpatial XR 2.3.1, XR Core Utilities 2.5.3, XR Hands 1.6.1, XR Interaction Toolkit 3.2.1, XR Legacy Input Helpers 2.1.12, XR Plugin Management 4.5.1
- Imported samples: Apple visionOS XR Plugin 2.3.1 (Metal Sample - URP), XR Hands 1.6.1, XR Interaction Toolkit 3.2.1 (Hands Interaction Demo, Starter Assets, visionOS)
- Build platform settings: Target: Apple visionOS; App Mode: Metal Rendering with Compositor Services; Selected Validation Profiles: visionOS Metal; Documentation: Enabled
- Xcode version: 26.01; visionOS SDK: 26
- Mac hardware: Apple M1 Max
- Target visionOS version: 20 or 26
- Test environment: Apple Vision Pro, visionOS 26.0.1 (23M341), Apple M1 Max

No errors in builds so far; just missing the desired functionality. Thanks for a complete response with actionable steps.
Replies: 0 · Boosts: 0 · Views: 241 · Oct ’25
Metal CIKernel instances with arbitrarily structured data arguments
Hi, the iOS 13 and macOS Catalina release notes say: "Metal CIKernel instances now support arguments with arbitrarily structured data." I've been trying to use this functionality in a CIKernel with mixed results. I'm particularly interested in passing data in the form of a dynamically sized array. It seems to work up to a certain size; beyond that threshold, the excess data is discarded and the kernel becomes unstable. I assume there is some kind of memory-alignment issue going on, but I've tried various types in my array and always get a similar result. I have not found any documentation or sample code regarding this. It would be great to know how this is intended to work and what the limitations are. There are two similar unanswered questions about data arguments in the forums, so I'm sure there are a few people out there with similar issues. Thanks! Michael
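For readers hitting the same wall, here is a minimal sketch of the pattern in question (kernel name, parameter layout, and inputs are all illustrative, not from the post): packing a dynamically sized Swift array into Data and passing it as a kernel argument. The size threshold and alignment behavior the post describes are exactly what remains undocumented.

```swift
import CoreImage

// Assumes `metallibData` holds a compiled .ci.metallib and `inputImage` is a
// CIImage. The Metal side would declare something like:
//   extern "C" float4 weightedMix(coreimage::sampler src,
//                                 constant float* weights, int count,
//                                 coreimage::destination dest) { ... }
func applyWeightedMix(metallibData: Data, inputImage: CIImage) throws -> CIImage? {
    // Pack a dynamically sized array into Data; this is the
    // "arbitrarily structured data" path from the release notes.
    let weights: [Float] = (0..<64).map { Float($0) / 64.0 }
    let weightData = weights.withUnsafeBufferPointer { Data(buffer: $0) }

    let kernel = try CIKernel(functionName: "weightedMix",
                              fromMetalLibraryData: metallibData)
    return kernel.apply(extent: inputImage.extent,
                        roiCallback: { _, rect in rect },
                        arguments: [inputImage, weightData, weights.count])
}
```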
Replies: 5 · Boosts: 0 · Views: 455 · Oct ’25
10-bit support in iPad Pro
Hi, I’m using the latest iPad Pro (13-inch), and I can see that Metal offers an rgb10a2unorm texture format for rendering. But when I render a grey ramp and measure the actual luminance, I get a pattern that I would expect from an 8-bit texture (see below). Before I start ripping apart all my code, is there anything else I need to do to convince iOS to render my texture in 10 bits? I already tried setting the pixelFormat of my CAMetalLayer to rgb10a2unorm, but that didn’t change anything.
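For comparison, a minimal sketch (assumed setup, not the poster's code) of a 10-bit drawable configuration. The pipeline's color attachment must match the layer format; whether the display pipeline preserves the extra bits through compositing and scanout is the open question in this post:

```swift
import QuartzCore
import Metal

// Request a 10-bit drawable and a matching render pipeline format.
let device = MTLCreateSystemDefaultDevice()!
let metalLayer = CAMetalLayer()
metalLayer.device = device
metalLayer.pixelFormat = .rgb10a2Unorm   // 10 bits per color channel, 2-bit alpha

let pipelineDescriptor = MTLRenderPipelineDescriptor()
// The attachment format must match the drawable's format; a mismatch
// fails pipeline validation rather than silently dropping precision.
pipelineDescriptor.colorAttachments[0].pixelFormat = metalLayer.pixelFormat
```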
Replies: 1 · Boosts: 0 · Views: 438 · Sep ’25
Metal fails to create PSO on AMD based GPUs
Hello, the shaders in our application are written in HLSL, and we rely on the Metal Shader Converter to convert DXIL to Metal IR. We ran into an issue that causes Metal pipeline state creation to fail when a vertex stage-in function is used on AMD GPUs. Here's the error reported by Metal in the Xcode output:

```
Compiler failed with XPC_ERROR_CONNECTION_INTERRUPTED
XPC_ERROR_CONNECTION_INTERRUPTED
MTLCompiler: Compilation failed with XPC_ERROR_CONNECTION_INTERRUPTED on 4 try. This error suggests an unexpected interruption in the connection. Possible reasons: a crash in the compiler service, termination by the OS due to resource constraints (e.g., jetsam), a timeout in the service, or an issue with IPC. Verify system stability and check the logs for more details.
Compiler failed with XPC_ERROR_CONNECTION_INVALID
XPC_ERROR_CONNECTION_INVALID
MTLCompiler: Compiler encountered XPC_ERROR_CONNECTION_INVALID: failed to check-in, peer may have been unloaded: mach_error=10000003 (is the OS shutting down or process jetsammed?)
Compilation failed due to an interrupted connection: XPC_ERROR_CONNECTION_INTERRUPTED. This error occurred after multiple retries.
```

This seems to indicate an internal compiler error. I have a minimal repro here: https://github.com/kcloudy0717/metal_pso_fail/tree/main; simply follow the instructions in the README.
Replies: 1 · Boosts: 0 · Views: 159 · Sep ’25
Xcode Cloud 26b7 Metal Compilation Failure
I've been getting intermittent failures on Xcode Cloud compiling my app on multiple platforms because it fails to compile a Metal shader:

```
The Metal Toolchain was not installed and could not compile the Metal source files. Download the Metal Toolchain from Xcode > Settings > Components and try again.
```

Sometimes if I re-run it, it works fine. Then I'll run it again, and it will fail. If you tell me to file a feedback, please tell me what information would be useful and actionable, because this is all I have.
Replies: 11 · Boosts: 7 · Views: 868 · Sep ’25
Metal Compositor Service & Persona (VisionOS)
Hello, I'm currently trying to make a collaborative app, but it only works in a RealityView; when I use a CompositorLayer as below, the Personas disappear.

```swift
ImmersiveSpace(id: "ImmersiveSpace-Metal") {
    CompositorLayer(configuration: MetalLayerConfiguration()) { layerRenderer in
        SpatialRenderer_InitAndRun(layerRenderer)
    }
}
```

Is there any potential solution to see Personas in a Metal view? Thanks in advance!
Replies: 2 · Boosts: 0 · Views: 755 · Sep ’25
Xcode 26 Beta 3: missing Metal Toolchain
Hello community, after being unable to use the Metal Toolchain with Xcode 26 beta 1 and 2 even after following the workarounds, I am still unable to use the downloaded Metal compiler on my device with beta 3. Building any project with a Metal file results in:

```
warning: Could not read serialized diagnostics file: error("Failed to open diagnostics file") (in target '<target>' from project '<project>')
error: error: cannot execute tool 'metal' due to missing Metal Toolchain; use: xcodebuild -downloadComponent MetalToolchain
Command CompileMetalFile failed with a nonzero exit code
```

Here is my build environment:

```
$ xcodebuild -version
Xcode 26.0
Build version 17A5276g
```

I have also checked the downloaded Metal toolchain:

```
$ xcodebuild -downloadComponent metalToolchain -exportPath /tmp/MyMetalExport/
2025-07-13 13:16:17.293 xcodebuild[2153:34019] IDEDownloadableMetalToolchainCoordinator: Failed to remount the Metal Toolchain: The file “com.apple.MobileAsset.MetalToolchain-v17.1.5276.7.3KEJwX” couldn’t be opened because you don’t have permission to view it.
Beginning asset download...
2025-07-13 13:16:17.427 xcodebuild[2153:34022] IDEDownloadableMetalToolchainCoordinator: Failed to remount the Metal Toolchain: The file “com.apple.MobileAsset.MetalToolchain-v17.1.5276.7.3KEJwX” couldn’t be opened because you don’t have permission to view it.
Downloaded asset to: /System/Library/AssetsV2/com_apple_MobileAsset_MetalToolchain/47af11e2964f385d510c6a9d1a49c8165f334a51.asset/AssetData/Restore/022-19457-052.dmg
Beginning asset export...
Done exporting: /tmp/MyMetalExport/MetalToolchain-17A5276g.exportedBundle
Done downloading: Metal Toolchain 17A5276g.
```

The ExportMetadata.plist has a buildUpdateVersion of 17A5276g. Any suggestions are greatly appreciated.
Replies: 1 · Boosts: 2 · Views: 663 · Sep ’25
Xcode 26 beta 3 - Metal toolchain installed but not working
Under Xcode 26 beta 3 I'm trying to build a project that uses Metal. I've installed the Metal Toolchain 26.0 under Settings -> Components, but when I start a build it fails during the "Prepare build" step with the following error (repeated many times):

```
stat(/var/run/com.apple.security.cryptexd/mnt/com.apple.MobileAsset.MetalToolchain-v17.1.5276.7.Pb9SLL/Metal.xctoolchain/usr/bin/clang): No such file or directory (2)
```

I've confirmed that there is in fact no 'clang' binary in that directory. I've tried using xcode-select to set the Xcode 26 beta app as the active developer directory, and xcodebuild -version shows:

```
Xcode 26.0
Build version 17A5276g
```

Any ideas on other things to try?
Replies: 6 · Boosts: 1 · Views: 684 · Sep ’25
Metal 4 & Acceleration Structures
I have really enjoyed looking through the code and videos related to Metal 4. Currently, my interest is in updating a ReSTIR project to take advantage of more robust ways to refit acceleration structures and more powerful ways to access resources. I am working in Swift and have encountered a couple of puzzles:

- What is the 'accepted' way to create a MTL4BufferRange to store indices and vertices?
- How do I properly rewrite Swift code to build and compact an acceleration structure?

I do realize that this is all in beta and will happily look through code samples this fall. If other guidance is available earlier, that would be fabulous! Thank you
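While waiting for Metal 4 samples, here is a minimal sketch of the Metal 3-style build-and-compact flow in Swift (the classic MTLAccelerationStructure API, not the new MTL4BufferRange path the post asks about), purely as a reference point:

```swift
import Metal

// Build a primitive acceleration structure from triangles, then compact it.
// `device`, `queue`, `vertexBuffer`, and `triangleCount` are assumed inputs.
func buildCompactedAS(device: MTLDevice, queue: MTLCommandQueue,
                      vertexBuffer: MTLBuffer, triangleCount: Int) -> MTLAccelerationStructure? {
    let geometry = MTLAccelerationStructureTriangleGeometryDescriptor()
    geometry.vertexBuffer = vertexBuffer
    geometry.vertexStride = MemoryLayout<SIMD3<Float>>.stride
    geometry.triangleCount = triangleCount

    let descriptor = MTLPrimitiveAccelerationStructureDescriptor()
    descriptor.geometryDescriptors = [geometry]

    // Query sizes, allocate the structure plus scratch space, and build.
    let sizes = device.accelerationStructureSizes(descriptor: descriptor)
    guard let accel = device.makeAccelerationStructure(size: sizes.accelerationStructureSize),
          let scratch = device.makeBuffer(length: sizes.buildScratchBufferSize,
                                          options: .storageModePrivate),
          let sizeBuf = device.makeBuffer(length: MemoryLayout<UInt32>.size,
                                          options: .storageModeShared),
          let cmd = queue.makeCommandBuffer(),
          let enc = cmd.makeAccelerationStructureCommandEncoder() else { return nil }

    enc.build(accelerationStructure: accel, descriptor: descriptor,
              scratchBuffer: scratch, scratchBufferOffset: 0)
    enc.writeCompactedSize(accelerationStructure: accel, buffer: sizeBuf, offset: 0)
    enc.endEncoding()
    cmd.commit()
    cmd.waitUntilCompleted()   // simplest way to read back the compacted size

    // Copy into a right-sized structure to reclaim memory.
    let compactedSize = Int(sizeBuf.contents().load(as: UInt32.self))
    guard let compacted = device.makeAccelerationStructure(size: compactedSize),
          let cmd2 = queue.makeCommandBuffer(),
          let enc2 = cmd2.makeAccelerationStructureCommandEncoder() else { return nil }
    enc2.copyAndCompact(sourceAccelerationStructure: accel,
                        destinationAccelerationStructure: compacted)
    enc2.endEncoding()
    cmd2.commit()
    return compacted
}
```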
Replies: 4 · Boosts: 0 · Views: 659 · Sep ’25
JAX Metal: Random Number Generation Performance Issue on M1 Max
JAX Metal shows 55x slower random number generation compared to NVIDIA CUDA on equivalent workloads. This makes Monte Carlo simulations and scientific computing impractical on Apple Silicon.

Performance comparison:
- NVIDIA GPU: 0.475 s for 12.6M random elements
- M1 Max (Metal): 26.3 s for the same workload
- Performance gap: 55x slower

Environment:
- Apple M1 Max, 64 GB RAM, macOS Sequoia 15.6.1
- JAX 0.4.34, jax-metal latest
- Backend: Metal

Reproduction code:

```python
import time
import jax
import jax.numpy as jnp
from jax import random

key = random.PRNGKey(42)
start_time = time.time()
random_array = random.normal(key, (50000, 252))
duration = time.time() - start_time
print(f"Duration: {duration:.3f}s")
```
Replies: 0 · Boosts: 0 · Views: 402 · Aug ’25
[visionOS] How to render side-by-side stereo video?
I want to render 3D/stereoscopic video in an Apple Vision Pro window using RealityKit/RealityView. The video is left-right stereo. The straightforward approach would be to spawn a quad and give it a custom Shader Graph material with a Camera Index Switch node, which chooses between the right texture and the left texture. https://i.sstatic.net/XawqjNcg.png The issue I have here is that I have to extract the video frames from my AVSampleBufferVideoRenderer. This should work OK, but not if I'm playing FairPlay content. So my question is: how do I render stereo FairPlay videos in a SwiftUI RealityView?
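For the non-DRM half of the question, a minimal sketch of the quad-plus-Shader-Graph approach the post outlines (the scene file, material path, parameter names, and texture names are all illustrative assumptions, not from the post):

```swift
import RealityKit

// Assumes a Reality Composer Pro scene "StereoScene.usda" containing a
// material with a Camera Index Switch node and two texture parameters.
func makeStereoQuad() async throws -> ModelEntity {
    var material = try await ShaderGraphMaterial(named: "/Root/StereoMaterial",
                                                 from: "StereoScene.usda")
    // Static placeholder textures; a real player would update these per frame.
    let left = try await TextureResource(named: "left_eye_frame")
    let right = try await TextureResource(named: "right_eye_frame")
    try material.setParameter(name: "leftTexture", value: .textureResource(left))
    try material.setParameter(name: "rightTexture", value: .textureResource(right))

    // The Camera Index Switch node inside the material selects the left or
    // right texture depending on which eye is being rendered.
    return ModelEntity(mesh: .generatePlane(width: 1.6, height: 0.9),
                       materials: [material])
}
```

The FairPlay half remains the hard part: protected sample buffers generally aren't readable into app-owned textures, which is exactly the constraint the post runs into.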
Replies: 1 · Boosts: 0 · Views: 690 · Aug ’25