(iOS) Issues with 'aumi' Audio Units and AVAudioEngine

Hi all,


I just filed bug report 39600781, regarding 'aumi' (kAudioUnitType_MIDIProcessor) audio units and AVAudioEngine. In short: on iOS, when 'aumi' audio units are attached to an AVAudioEngine, the engine never calls their renderBlock / internalRenderBlock, rendering them (pun 😝) useless.


I write this post because I know that 'aumi' audio units on iOS are undocumented, so some quirks are to be expected 🙂. But since some developers are already implementing them (I am beta testing their apps), I think it is important to raise awareness that 'aumi' audio units do not work with hosts that use AVAudioEngine. Note that hosts using AUGraph do work, because they explicitly call the renderBlock of each of their audio units. Yet since AUGraph is going to be deprecated... well, if "aumis" are to be officially introduced in the next iOS version, it is important to raise awareness of this particular issue.
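To illustrate how the problem shows up, here is a minimal sketch of the failing setup. The componentSubType and componentManufacturer codes below are hypothetical placeholders for a third-party 'aumi' plug-in; everything else is standard AVFoundation/AudioToolbox API:

```swift
import AVFoundation
import AudioToolbox

// Hypothetical description of a third-party 'aumi' plug-in.
// Only componentType is real; subtype and manufacturer are placeholders.
let desc = AudioComponentDescription(
    componentType: kAudioUnitType_MIDIProcessor,   // 'aumi'
    componentSubType: 0,                           // hypothetical
    componentManufacturer: 0,                      // hypothetical
    componentFlags: 0,
    componentFlagsMask: 0)

let engine = AVAudioEngine()

AVAudioUnit.instantiate(with: desc, options: []) { avUnit, error in
    guard let avUnit = avUnit else { return }
    engine.attach(avUnit)
    // A pure MIDI processor has no audio connection to make, and in
    // practice the engine never pulls it: once the engine starts, the
    // unit's renderBlock / internalRenderBlock is never invoked.
    try? engine.start()
}
```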


That's all. Thanks for reading 😉.

Replies

Looking at the example hosting code, I found that attaching a dormant player node to each aumi caused its render block to be called. That worked until recently; now this technique no longer works either.


Did you get any resolution with the bug report at all? Is there a workaround?

The bug report was closed, but I still have the same problem (renderBlock / internalRenderBlock is never called). And unfortunately, I know of no workaround for this issue 😟.

This is still a problem.


Has anyone found a good workaround for this by now?

My current workaround is to create a "fake" Music Device node (a simple AUAudioUnit subclass that just generates silence), connect that to the main mixer, and then connect each MIDI processor unit to it using AVAudioEngine's connectMIDI(_:to:format:block:). In the block parameter I can then set my AUMIDIOutputEventBlock to get the MIDI data of the MIDI processor.

This seems to work (though I have no idea about its stability yet; I only got it working about 30 minutes ago :-) ), but it is kind of bad, as I need to keep an otherwise unneeded node active, one per MIDI processor that I use (because connectMIDI is only many-to-one).
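A minimal sketch of this workaround, with one simplification: instead of a custom silence-generating AUAudioUnit subclass, a stock AVAudioUnitSampler stands in as the dormant destination (with no instrument loaded, it renders silence). `engine` and `midiProcessor` are assumed to be an AVAudioEngine and the attached 'aumi' AVAudioUnit:

```swift
import AVFoundation

// Sketch of the silent-node workaround. Assumes `engine` is an
// AVAudioEngine and `midiProcessor` is an attached 'aumi' AVAudioUnit.
func installWorkaround(engine: AVAudioEngine,
                       midiProcessor: AVAudioUnit) -> AVAudioUnitSampler {
    // One dormant node per MIDI processor (connectMIDI is many-to-one).
    // With no instrument loaded, the sampler outputs silence.
    let dummy = AVAudioUnitSampler()
    engine.attach(dummy)
    // Keep the dummy in the render graph so the engine keeps pulling it,
    // which in turn keeps the MIDI connection alive.
    engine.connect(dummy, to: engine.mainMixerNode, format: nil)

    // MIDI-only connection; the block intercepts the processor's output.
    engine.connectMIDI(midiProcessor, to: dummy, format: nil) {
        sampleTime, cable, length, bytes -> OSStatus in
        // Hypothetical handler: forward the MIDI bytes wherever needed.
        let data = Data(bytes: bytes, count: length)
        _ = data // e.g. hand off to your own MIDI routing
        return noErr
    }
    return dummy
}
```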

So: Does anyone have a better solution???
  • Hi Snarp,

    Would it be possible for you to send the code? I tried your solution, but it doesn't work, and I'm not sure at which step I'm doing something wrong.

    Thanks,

    Ali

Any update on this issue?

In my app (KeyStage), I'm manually calling the render blocks of all MIDI processor plug-ins in an AURenderObserverBlock, while letting AVAudioEngine handle all instrument and effect audio units. But I'm having lots of issues with this approach: EXC_BAD_ACCESS errors out of the blue, high DSP load for no apparent reason, etc.
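A rough sketch of that manual-rendering approach, under these assumptions: `engine` is a running AVAudioEngine, and `midiProcessors` holds the AUAudioUnits of the attached 'aumi' plug-ins, each already allocated via allocateRenderResources(). Note the caveat in the comments; capturing Swift collections in a render observer is not guaranteed realtime-safe, which may relate to the crashes described above:

```swift
import AVFoundation
import AudioToolbox

// Drive each MIDI processor once per render cycle of the main mixer by
// registering a render observer on its AUAudioUnit. Returns the observer
// token (needed later for removeRenderObserver(_:)).
func installManualRendering(engine: AVAudioEngine,
                            midiProcessors: [AUAudioUnit]) -> Int {
    let mainAU = engine.mainMixerNode.auAudioUnit
    return mainAU.token(byAddingRenderObserver: { _, timestamp, frameCount, _ in
        // Caution: this closure runs on the realtime render thread.
        // Capturing the array here is a simplification and may not be
        // realtime-safe in a production app.
        for au in midiProcessors {
            var actionFlags = AudioUnitRenderActionFlags()
            // A MIDI processor produces no audio, but renderBlock still
            // requires a valid (empty) output buffer list.
            var bufferList = AudioBufferList(mNumberBuffers: 0,
                                            mBuffers: AudioBuffer())
            _ = au.renderBlock(&actionFlags, timestamp, frameCount, 0,
                               &bufferList, nil)
        }
    })
}
```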

I can't believe that Apple keeps promoting the AVAudioEngine framework and still hasn't resolved this major issue.