Is Using Metal Compute Shaders for Efficient Resource Copying to RealityKit the Best Approach for Streaming Data in Real-Time Rendering?

Hi Apple,

In visionOS, for real-time streaming of large 3D scenes, I plan to create Metal buffers and textures on multiple background threads and then, on the main thread, use a compute shader to copy those Metal resources into RealityKit resources, so that the main thread does as little work as possible. Since most of RealityKit's default APIs must run on the main actor (main thread), they are not well suited to streaming data. Is this approach the best way to handle streaming data and real-time rendering?
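
For concreteness, here is a rough sketch of the kind of hand-off I have in mind, assuming visionOS 2's LowLevelTexture API. The descriptor configuration and helper names are illustrative, and I've used a blit encoder in place of the compute pass just to keep the example short:

```swift
import Metal
import RealityKit

enum StreamingError: Error {
    case commandEncodingFailed
}

/// Copies a texture that was filled on a background thread into a
/// RealityKit-owned texture. The blit stands in for the compute pass.
func publishStreamedTexture(
    stagingTexture: MTLTexture,      // produced on a background thread
    commandQueue: MTLCommandQueue
) throws -> TextureResource {
    // Describe a RealityKit texture that Metal is allowed to write into.
    // (Descriptor fields shown here are illustrative.)
    var descriptor = LowLevelTexture.Descriptor()
    descriptor.pixelFormat = stagingTexture.pixelFormat
    descriptor.width = stagingTexture.width
    descriptor.height = stagingTexture.height
    descriptor.textureUsage = [.shaderRead, .shaderWrite]

    let lowLevelTexture = try LowLevelTexture(descriptor: descriptor)

    guard let commandBuffer = commandQueue.makeCommandBuffer(),
          let blit = commandBuffer.makeBlitCommandEncoder() else {
        throw StreamingError.commandEncodingFailed
    }

    // replace(using:) returns an MTLTexture that is valid for this command
    // buffer; GPU work that writes into it updates the RealityKit texture.
    let destination = lowLevelTexture.replace(using: commandBuffer)
    blit.copy(from: stagingTexture, to: destination)
    blit.endEncoding()
    commandBuffer.commit()

    // Wrap the low-level texture so it can be assigned to a material.
    return try TextureResource(from: lowLevelTexture)
}
```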

Thank you very much.
