Prototype and test your Metal apps in Simulator.
In Xcode 11, Simulator adds support for Metal development. You can write iOS and tvOS apps that use Metal and test them in Simulator, gaining the benefits of hardware acceleration on the Mac during development of your app. If you use frameworks built on top of Metal, such as SceneKit, Core Animation, and UIKit, you'll also see better performance when testing your apps in Simulator.
Simulator is best used to rapidly prototype and test application behavior, and to develop the basic rendering or compute capabilities of your app. You shouldn't use it to design your final Metal workflow. Design your app to run on actual hardware, and test on real devices to tune its performance.
Get the Default Device Object in Simulator
In Simulator, call the MTLCreateSystemDefaultDevice() function to get the default device object, just as you do when running on a device in iOS or tvOS. This returns a device object that connects to Simulator.
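In Swift, that call is a one-liner (a minimal sketch; the guard simply handles environments where Metal is unavailable):

```swift
import Metal

// MTLCreateSystemDefaultDevice() returns the default device object.
// In Simulator, this device translates Metal calls to the host Mac's GPU.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("Metal isn't available in this environment")
}
print(device.name)
```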
Treat Simulator as a Special Device
Simulator doesn't try to exactly simulate the GPU from the iOS or tvOS device you are simulating. For example, if you are simulating the iPhone XS, Simulator does not try to emulate the capabilities of an A12 GPU. Instead, Simulator translates any calls you make and directs them to the selected GPU on the host Mac.
Sometimes, this translation means that Simulator may support fewer features or different implementation limits than an actual Apple GPU. Simulator provides a device object with capabilities similar to an Apple family 2 GPU (MTLGPUFamily.apple2), as described in the Metal Feature Set Tables. However, you must test the MTLDevice object at runtime to determine exactly which features it supports and what its limits are.
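For example, you can query the family and limits at runtime rather than assuming them (a sketch; supportsFamily(_:) requires iOS 13, tvOS 13, or later):

```swift
import Metal

func logCapabilities(of device: MTLDevice) {
    // Check family support at runtime instead of hard-coding assumptions
    // about the simulated device's GPU.
    if device.supportsFamily(.apple2) {
        print("Apple family 2 features are available")
    }
    // Read implementation limits directly where Metal exposes them.
    print("Max threads per threadgroup: \(device.maxThreadsPerThreadgroup)")
}
```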
In some cases, described below, Metal does not provide an API that you can use to detect these differences at runtime. In those situations, conditionalize your app's behavior for Simulator, as shown in the following code:
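A minimal sketch of that kind of conditional uses Swift's targetEnvironment compilation condition:

```swift
#if targetEnvironment(simulator)
// Compiled only when building for Simulator.
let runningInSimulator = true
#else
let runningInSimulator = false
#endif

if runningInSimulator {
    // Take a fallback path for features Simulator doesn't support,
    // such as reading color attachments in a fragment shader.
}
```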
For more code examples, see Supporting Simulator in a Metal App.
When working with textures in Simulator, follow these recommendations:
If you want to create a texture that shares storage with a buffer (using makeTexture(descriptor:offset:bytesPerRow:)), you must create the buffer with a private (MTLStorageMode.private) storage mode. Call minimumLinearTextureAlignment(for:) to determine the alignment requirements for the texture data; the alignment requirements are different in Simulator. When setting the texture's usage property, you can't include renderTarget as one of the uses.
Create depth, stencil, and MSAA textures only with private storage modes.
Don't use a sample count of 2 for MSAA textures.
Use only unified depth and stencil texture formats. Simulator doesn't support separate depth and stencil formats.
Don't use the following pixel formats: MTLPixelFormat.r8Unorm_srgb, or any XR10 or YUV formats.
Don't render to textures with an sRGB pixel format. You can't write to sRGB textures in Simulator.
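The buffer-backed texture rules above can be sketched as follows (assuming an existing device and a 256×256 .r8Unorm linear texture; the function name is illustrative):

```swift
import Metal

func makeLinearTexture(on device: MTLDevice) -> MTLTexture? {
    let descriptor = MTLTextureDescriptor.texture2DDescriptor(
        pixelFormat: .r8Unorm, width: 256, height: 256, mipmapped: false)
    descriptor.storageMode = .private    // Simulator requires private storage
    descriptor.usage = [.shaderRead]     // renderTarget isn't allowed here

    // Ask the device for its row alignment; Simulator's differs from device.
    let alignment = device.minimumLinearTextureAlignment(for: .r8Unorm)
    let bytesPerRow = ((256 + alignment - 1) / alignment) * alignment

    guard let buffer = device.makeBuffer(
        length: bytesPerRow * 256, options: .storageModePrivate) else { return nil }
    return buffer.makeTexture(descriptor: descriptor, offset: 0,
                              bytesPerRow: bytesPerRow)
}
```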
Constant Buffer Limitations
If you use buffers that use the constant address space as arguments to your shaders, follow these recommendations when running in Simulator:
Don't create constant buffers larger than 64 kilobytes.
When you set arguments for the render or compute command, align constant buffer offsets to 256 bytes. Normally, iOS requires an alignment of 4 bytes. This difference means you might need to arrange your data differently when running in Simulator.
You cannot use more than 14 constant buffers as arguments to a render or compute pipeline.
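One way to handle the differing offset alignment is a small helper that rounds offsets up before you pass them to calls such as setVertexBuffer(_:offset:index:) (a sketch; the alignedOffset name is illustrative):

```swift
#if targetEnvironment(simulator)
let constantBufferAlignment = 256   // Simulator's constant buffer alignment
#else
let constantBufferAlignment = 4     // iOS device alignment
#endif

// Round an offset up to the next multiple of the required alignment.
func alignedOffset(_ offset: Int, to alignment: Int = constantBufferAlignment) -> Int {
    (offset + alignment - 1) / alignment * alignment
}
```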
Here are other places where you must do things differently when running in Simulator:
Create heaps only with private storage modes.
Simulator does not support programmable blending. When creating render pipelines, you can't read from color attachments in your fragment shader's arguments. As a result, you might have to perform additional render passes when running in Simulator.
You can't use Xcode to capture frames, analyze the memory footprint, or measure performance when running in Simulator.
Prototype Your App with Simulator
Consider the following guiding principles for how to make Simulator an effective part of your Metal app development process:
Use Simulator to prototype and iterate on your app's workflow and behavior. For example, when prototyping a game, you only care about how the game plays, not whether the pixels match what renders on device or if the game uses the same approach to render its content. Similarly, in other Metal apps, you might need to iterate on your app's user experience. Simulator lets you test app behavior without needing a device.
Don't use Simulator to design your iOS or tvOS rendering engine. Simulator's features differ significantly from those of modern Apple GPUs. To get the best performance and battery life on devices with Apple GPUs, you need to use Metal features that Simulator doesn't support. To develop, test, and profile those code paths, you'll need to run on a device.
Decide whether to provide long-term support for Simulator. Maintaining a separate Metal path for Simulator takes time and effort. A large game development team can have many game designers and engine developers. Supporting Simulator lets designers work in Simulator to perfect gameplay while engineers work with devices to design the game engine and tune its performance. On a smaller team, you might find that your time is better spent focusing on device support rather than devoting resources to keep your game running in Simulator.
For more information on Simulator, see Simulator Help. For more information on the differences between testing on device and testing in Simulator, see Differences Between Simulated and Physical Devices.