AUv3 Audio Unit Extension MIDI Output Source

Hi,


I was very excited to see this:

https://developer.apple.com/videos/play/wwdc2017/501/


So now AU extensions can send MIDI output to the host.

But I didn't see any sample code for this part.

Could Apple post the sample code for the drum app shown in the presentation?

Thanks.

Replies

Looks like the AudioUnitV3Example sample code has been updated to include MIDI output from the AUs. Though at the moment it looks like it simply outputs the MIDI messages it receives.


As a side note, I'm wondering if this new feature allows us to build standalone MIDI FX audio units. Does anyone know?
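If it is possible, I'd expect the extension to advertise itself with the MIDI-processor component type. Here's a rough sketch of what I mean; kAudioUnitType_MIDIProcessor ('aumi') is the existing MIDI-processor type, but the subtype, manufacturer, class name, and display name below are made up, and I haven't verified which hosts will actually load an AUv3 unit of that type:

#import <AudioToolbox/AudioToolbox.h>

// Hypothetical helper; call once from the host app's setup code.
static void RegisterDemoMIDIProcessor(void) {
    // Component description for a MIDI-only (MIDI FX) audio unit.
    AudioComponentDescription desc = {
        .componentType         = kAudioUnitType_MIDIProcessor, // 'aumi'
        .componentSubType      = 'dmfx',                       // made-up subtype
        .componentManufacturer = 'Demo',                       // made-up manufacturer
        .componentFlags        = 0,
        .componentFlagsMask    = 0
    };
    // MyMIDIProcessorAudioUnit is a placeholder for your own AUAudioUnit subclass.
    [AUAudioUnit registerSubclass:[MyMIDIProcessorAudioUnit class]
           asComponentDescription:desc
                             name:@"Demo: MIDI FX"
                          version:1];
}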

Do you mean this snippet in InstrumentDemo.mm? Line 7?


- (AUInternalRenderBlock)internalRenderBlock {
    /*
       Capture in locals to avoid ObjC member lookups. If "self" is captured in
       render, we're doing it wrong.
    */
    __block InstrumentDSPKernel *state = &_kernel;
    __block AUMIDIOutputEventBlock midiOut = self.MIDIOutputEventBlock;

    return ^AUAudioUnitStatus(AudioUnitRenderActionFlags *actionFlags,
                              const AudioTimeStamp       *timestamp,
                              AVAudioFrameCount           frameCount,
                              NSInteger                   outputBusNumber,
                              AudioBufferList            *outputData,
                              const AURenderEvent        *realtimeEventListHead,
                              AURenderPullInputBlock      pullInputBlock) {

        _outputBusBuffer.prepareOutputBufferList(outputData, frameCount, true);
        state->setBuffers(outputData);
        state->processWithEvents(timestamp, frameCount, realtimeEventListHead, midiOut);

        return noErr;
    };
}

Yes, if you look inside the processWithEvents function you'll see it passes through all the incoming MIDI events.
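Roughly, processWithEvents slices the render buffer at event timestamps and hands each batch of simultaneous events to performAllSimultaneousEvents (shown further down), which is where the pass-through happens. This is paraphrased from memory rather than copied from the sample, so the details may differ:

void DSPKernel::processWithEvents(AudioTimeStamp const *timestamp, AUAudioFrameCount frameCount,
                                  AURenderEvent const *events, AUMIDIOutputEventBlock midiOut) {
    AUEventSampleTime now = AUEventSampleTime(timestamp->mSampleTime);
    AUAudioFrameCount framesRemaining = frameCount;
    AURenderEvent const *event = events;

    while (framesRemaining > 0) {
        // No more events: render the rest of the buffer and exit.
        if (event == nullptr) {
            process(framesRemaining, frameCount - framesRemaining);
            return;
        }

        // Render up to the next event's sample time. (Per the render contract the
        // event times fall inside this buffer; late events are handled immediately.)
        if (event->head.eventSampleTime > now) {
            AUAudioFrameCount framesThisSegment = AUAudioFrameCount(event->head.eventSampleTime - now);
            process(framesThisSegment, frameCount - framesRemaining);
            framesRemaining -= framesThisSegment;
            now += framesThisSegment;
        }

        // Deliver every event that falls on this sample time (MIDI pass-through included).
        performAllSimultaneousEvents(now, event, midiOut);
    }
}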

But isn't this for incoming MIDI events from the host to the AU?

Where is the part I should use for sending MIDI events to the host (from the AU)?

void DSPKernel::performAllSimultaneousEvents(AUEventSampleTime now, AURenderEvent const *&event, AUMIDIOutputEventBlock midiOut) {
    do {
        handleOneEvent(event);
        if (event->head.eventType == AURenderEventMIDI && midiOut)
        {
            midiOut(now, 0, event->MIDI.length, event->MIDI.data);
        }

        // Go to the next event in the linked list.
        event = event->head.next;

        // While an event is available and falls on (or before) this sample time.
    } while (event && event->head.eventSampleTime <= now);
}


Lines 4 to 7 are where it filters for incoming MIDI events and sends them back out to the host.

Ah, I see... thanks!

Still, an example would be nice!

Haha, fair enough... Yeah I'm all for more sample code and more documentation for Core Audio. This stuff is hard.

Ah, I see in InstrumentDemo.mm there is something like this for the host sending MIDI notes to the AU.


            cbytes[0] = 0xB0
            cbytes[1] = 123
            cbytes[2] = 0
            self.noteBlock(AUEventSampleTimeImmediate, 0, 3, cbytes)


I guess the same would likely be done with midiOut for sending MIDI from the AU to the host.
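Something along these lines, maybe. Just a guess, not from the sample; the 0x90/60/100 bytes (a note-on) are made up, and `now` stands for whatever AUEventSampleTime the kernel is currently processing:

// Hypothetical: emit a note-on from the AU back to the host through the
// AUMIDIOutputEventBlock captured as `midiOut` in internalRenderBlock.
if (midiOut) {
    uint8_t bytes[3];
    bytes[0] = 0x90;   // note on, channel 1
    bytes[1] = 60;     // middle C
    bytes[2] = 100;    // velocity
    midiOut(now, 0 /* cable */, 3, bytes);
}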


Would a Core Audio guy chime in, please?

For what it's worth, in the current incarnation of FilterDemo, the midiOut parameter has been removed and InstrumentDemo.mm no longer has the noteBlock.


You do get the MIDI output block in the AU, so you can do whatever you need with it.
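For anyone landing here later, my understanding of the current API (iOS 11 / macOS 10.13) is: the AU overrides MIDIOutputNames to advertise a MIDI output, the host installs a MIDIOutputEventBlock on the AUAudioUnit, and the AU hands that block to whatever runs on the render thread. A rough sketch; MyInstrumentAudioUnit and the kernel setter are placeholders, not names from the sample:

// MyInstrumentAudioUnit.mm (placeholder class name)

- (NSArray<NSString *> *)MIDIOutputNames {
    // Advertise one MIDI output so hosts know to attach a MIDIOutputEventBlock.
    return @[@"midiOut"];
}

- (BOOL)allocateRenderResourcesAndReturnError:(NSError **)outError {
    if (![super allocateRenderResourcesAndReturnError:outError]) {
        return NO;
    }
    // The host should have installed its block by now; it may be nil if the host
    // doesn't consume MIDI. Hand it to the DSP kernel (placeholder setter) so the
    // render thread can call it, or capture it in internalRenderBlock as shown above.
    _kernel.setMIDIOutputEventBlock(self.MIDIOutputEventBlock);
    return YES;
}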