renderBlock

Instance Property


The block that hosts use to ask the audio unit to render audio.


iOS, Mac Catalyst, tvOS
@property(readonly, nonatomic) AURenderBlock renderBlock;

macOS
@property(readonly, atomic) AURenderBlock renderBlock;


Before invoking an audio unit’s rendering functionality, a host should fetch this block and cache the result. The block can then be called from a realtime context without the possibility of blocking and causing an overload at the Core Audio HAL level.
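A minimal host-side sketch of this pattern (the class `MyHost` and its method names are illustrative, not part of the API): fetch renderBlock once during setup, store it, and call only the stored copy from the render thread.

```objc
#import <AudioToolbox/AudioToolbox.h>

@interface MyHost : NSObject {
    AURenderBlock _renderBlock; // cached once, before realtime rendering begins
}
@end

@implementation MyHost

// Call from a non-realtime context, for example right after
// allocateRenderResourcesAndReturnError: succeeds.
- (void)prepareToRenderWithUnit:(AUAudioUnit *)unit {
    _renderBlock = unit.renderBlock;
}

// Realtime render path: touches only the cached block,
// never the AUAudioUnit object itself.
- (AUAudioUnitStatus)renderInto:(AudioBufferList *)outputData
                    atTimestamp:(const AudioTimeStamp *)timestamp
                     frameCount:(AUAudioFrameCount)frameCount {
    AudioUnitRenderActionFlags flags = 0;
    return _renderBlock(&flags, timestamp, frameCount,
                        0 /* output bus */, outputData,
                        NULL /* no input pull block, e.g. a generator unit */);
}

@end
```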

This block will call a subclass’s internalRenderBlock implementation, providing all realtime events scheduled for the current render time interval, bracketed by calls to any render observers. Subclasses should override their internalRenderBlock implementation, not this property.
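A sketch of the subclass side, using a hypothetical generator subclass `MySilenceAudioUnit`: the override returns an AUInternalRenderBlock, which the renderBlock property then wraps with event delivery and observer notifications.

```objc
#import <AudioToolbox/AudioToolbox.h>
#include <string.h>

@interface MySilenceAudioUnit : AUAudioUnit
@end

@implementation MySilenceAudioUnit

- (AUInternalRenderBlock)internalRenderBlock {
    // The returned block runs on the realtime thread; it must not
    // allocate, take locks, or message Objective-C objects.
    return ^AUAudioUnitStatus(AudioUnitRenderActionFlags *actionFlags,
                              const AudioTimeStamp *timestamp,
                              AUAudioFrameCount frameCount,
                              NSInteger outputBusNumber,
                              AudioBufferList *outputData,
                              const AURenderEvent *realtimeEventListHead,
                              AURenderPullInputBlock pullInputBlock) {
        // Render silence: zero every output buffer for this cycle.
        for (UInt32 i = 0; i < outputData->mNumberBuffers; i++) {
            memset(outputData->mBuffers[i].mData, 0,
                   outputData->mBuffers[i].mDataByteSize);
        }
        return noErr;
    };
}

@end
```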

This version 3 property is bridged to the version 2 AudioUnitRender API.
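Because of that bridging, a version 2 host can drive the same unit through the C API instead. A fragment, assuming `audioUnit` is the wrapped version 2 AudioUnit instance and `timestamp`, `frameCount`, and `bufferList` are already prepared:

```objc
AudioUnitRenderActionFlags flags = 0;
OSStatus status = AudioUnitRender(audioUnit, &flags, &timestamp,
                                  0 /* output bus */, frameCount, &bufferList);
```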

See Also

Managing the Render Cycle


scheduleParameterBlock

The block that hosts use to schedule parameters.


maximumFramesToRender

The maximum number of frames that the audio unit can render at once.

- tokenByAddingRenderObserver:

Adds a block to be called on each render cycle.

- removeRenderObserver:

Removes an observer block previously added to the render cycle.