AUv3: multi-threaded audio rendering

I wondered: is there any notion in the current design of the Audio Unit v3 API of multi-threaded audio rendering? Since most systems nowadays provide multi-core CPUs (both on Mac and iOS), an AUv3 host implementation might want to use several audio threads for the individual AUs, e.g. to bounce (render) the final mix of a song as fast as possible. Is there any concept for that in the AUv3 API?


If yes, would that be limited to a host's in-process AU plug-ins (Mac only), or do App Extension based AUs support it as well (iOS and Mac)? Obviously, since an AU App Extension's isolated process and threads are managed by the OS's Audio Unit subsystem (not by the host), this would only work if Apple implemented support for such a multi-threaded rendering feature.


And if that is supported, how can an AU know whether render calls to its individual instances may arrive from different audio threads? Obviously, the AU implementation needs to prepare itself for one of the two scenarios (a single audio thread vs. concurrent audio threads) in order to optimize resources and ensure stability.
