Metal Ray Tracing, RealityKit, SwiftUI Problems

First of all, I apologize for such a general question, but my code is far too long to post. However, I have narrowed my problems down and am seeking advice on any solutions or workarounds that might help.

I recently updated my physics simulation code from using Metal Performance Shaders for ray tracing to using the (fairly) new Metal ray tracing routines directly. A few notes on my program:

  1. I perform the ray tracing entirely on a separate thread using compute kernels -- there is no rendering based on the ray-tracing results. The compute kernels are dispatched repeatedly, and each dispatch ends with a waitUntilCompleted() call (see the sketch after this list).
  2. The main thread runs a SwiftUI interface with some renderings that use RealityKit to display the scene (i.e., the vertices) and the rays that the Metal ray tracing traces through it.
  3. This is purely a Mac program with no iOS support.
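
For context, each simulation step dispatches the kernel roughly like this (a simplified sketch; the names queue, rayTracePipeline, and rayBuffer are placeholders, not my actual code):

```swift
import Metal

// Simplified per-step dispatch; placeholder names throughout.
func runStep(queue: MTLCommandQueue,
             rayTracePipeline: MTLComputePipelineState,
             rayBuffer: MTLBuffer,
             rayCount: Int) {
    guard let commandBuffer = queue.makeCommandBuffer(),
          let encoder = commandBuffer.makeComputeCommandEncoder() else { return }

    encoder.setComputePipelineState(rayTracePipeline)
    encoder.setBuffer(rayBuffer, offset: 0, index: 0)

    // One thread per ray; threadgroup sizing is simplified.
    let width = rayTracePipeline.threadExecutionWidth
    encoder.dispatchThreads(MTLSize(width: rayCount, height: 1, depth: 1),
                            threadsPerThreadgroup: MTLSize(width: width, height: 1, depth: 1))
    encoder.endEncoding()

    commandBuffer.commit()
    // Block this worker thread until the GPU finishes the step.
    commandBuffer.waitUntilCompleted()
}
```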

So, the problem is that there seems to be some conflict between the RealityKit rendering and the ray-tracing compute kernels: I get a "GPU Soft Fault" when I run the intersect command in Metal, and after this soft-fault error my ray-tracing results are completely bogus. I have found a workaround, which is to refit my acceleration structures semi-regularly, but it is inelegant and probably not sustainable. The problem gets worse the more I render in my RealityKit display (presented as a SwiftUI view), so I am now confident that the problem is some "collision" between the GPU resources needed by my program and by RealityKit. I have not been able to find any information on what a "GPU Soft Fault" actually is, although I suspect it is a memory violation.

I suspect that I need to use fences to cordon off my ray-tracing compute kernel from the other users of Metal (i.e., RealityKit), but I am not sure whether that is actually the issue or how to accomplish it.

Again, I apologize for the vague question, but I am really stuck. I have confirmed that every Metal buffer I pass to my compute kernel is correct; I verified this by reducing my object to a simple cube with only one instance of it. Something either corrupts the acceleration-structure data or makes it inaccessible at certain times when RealityKit needs to use the GPU.

Any advice would be appreciated. I have not submitted a bug report since I am still not sure whether this is just my lack of advanced knowledge of multiple actors requiring GPU use or whether there is something more serious here. Thanks in advance,

-Matt

One guess is that you're missing useResource: calls and therefore not making your resources resident. It may also be worth checking with the Metal validation tools (API validation and shader validation). As you suspected, regularly refitting the acceleration structures shouldn't be the fix for this; it may be that refitting them makes a buffer resource resident when it otherwise would not be.
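
For illustration, resources that your kernel reaches only indirectly (rather than through a direct setBuffer/setAccelerationStructure binding) need to be marked resident on the encoder. A minimal sketch with placeholder names:

```swift
// Resources bound directly (setBuffer, setAccelerationStructure, ...) are
// made resident automatically; anything reached only indirectly is not.
// `encoder` is an MTLComputeCommandEncoder; the resource names are placeholders.
encoder.useResource(indirectlyReferencedBuffer, usage: .read)

// Batch variant for several resources at once.
encoder.useResources(sceneResources, usage: .read)
```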

We can make a better diagnosis if you file a Feedback Assistant report with a minimal reproducing sample and share the Feedback Assistant report ID here.

Just an update in case this shows up in people's web searches. The original reply was correct that I was missing useResource calls. However, what I didn't realize until recently is that I needed to apply them to the acceleration structures listed in the instance acceleration structure's descriptor (all of them!); I had assumed only the acceleration structure itself needed its own useResource. I didn't see this in the documentation, but it does come up briefly in one of the developer videos. See the responses from the "Graphics and Games Engineer"s at: https://forums.developer.apple.com/forums/thread/732588 -- a sketch of the fix is below.
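
Roughly, the fix looks like this (a minimal sketch; encoder, instanceAccelStructure, and primitiveAccelStructures are placeholder names, not my actual code):

```swift
// Binding the instance (top-level) acceleration structure directly is not
// enough: the primitive acceleration structures it references are reached
// only indirectly and must be made resident explicitly.
encoder.setAccelerationStructure(instanceAccelStructure, bufferIndex: 1)

// Missing these useResource calls was the cause of my GPU soft faults.
for primitiveStructure in primitiveAccelStructures {
    encoder.useResource(primitiveStructure, usage: .read)
}
```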

Problem solved.

-Matt
