
Audio Unit Component Services Reference

Inheritance


Not Applicable

Conforms To


Not Applicable

Import Statement


Swift

import AudioUnit

Objective-C

@import AudioUnit;

Audio Unit Component Services provides the C interface for using audio units. An audio unit is a plug-in that processes or generates audio data. To find, open, and close audio units, you use a companion interface, Audio Component Services, described in Audio Component Services Reference.

An audio unit is uniquely identified by a triplet of codes known as type, subtype, and manufacturer ID. See the AudioComponentDescription structure in Audio Component Services Reference.
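For example, the following sketch (in C, using Audio Component Services) locates Apple's Remote I/O unit by its type, subtype, and manufacturer codes and opens an instance of it. The helper name MyCreateRemoteIOUnit is hypothetical, and error handling is reduced to a single check.

    #import <AudioUnit/AudioUnit.h>

    // Hypothetical helper: find and open an instance of Apple's Remote I/O unit.
    static AudioUnit MyCreateRemoteIOUnit(void) {
        // Describe the audio unit by its type, subtype, and manufacturer triplet.
        AudioComponentDescription description = {0};
        description.componentType         = kAudioUnitType_Output;
        description.componentSubType      = kAudioUnitSubType_RemoteIO;
        description.componentManufacturer = kAudioUnitManufacturer_Apple;

        // Find a matching component, then open an instance of it.
        AudioComponent component = AudioComponentFindNext(NULL, &description);
        AudioUnit ioUnit = NULL;
        OSStatus status = AudioComponentInstanceNew(component, &ioUnit);
        return (status == noErr) ? ioUnit : NULL;
    }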

Functions

  • Initializes an audio unit.

    Declaration

    Swift

    func AudioUnitInitialize(_ inUnit: AudioUnit) -> OSStatus

    Objective-C

    OSStatus AudioUnitInitialize ( AudioUnit inUnit );

    Parameters

    inUnit

    The audio unit to initialize.

    Return Value

    A result code.

    Discussion

    On successful initialization, the audio formats for input and output are valid and the audio unit is ready to render. During initialization, an audio unit allocates memory according to the maximum number of audio frames it can produce in response to a single render call.

    Usually, the state of an audio unit (such as its I/O formats and memory allocations) cannot be changed while an audio unit is initialized.
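    A minimal sketch of the resulting lifecycle, assuming ioUnit was opened as shown in the introduction and its properties have already been configured:

    // Configure properties first, then initialize so the unit can allocate
    // its render resources and validate its input and output formats.
    OSStatus status = AudioUnitInitialize(ioUnit);
    if (status == noErr) {
        // ... render for a while ...
        AudioUnitUninitialize(ioUnit);
        AudioComponentInstanceDispose(ioUnit);
    }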

    Import Statement

    Objective-C

    @import AudioUnit;

    Swift

    import AudioUnit

    Availability

    Available in iOS 2.0 and later.

  • Uninitializes an audio unit.

    Declaration

    Swift

    func AudioUnitUninitialize(_ inUnit: AudioUnit) -> OSStatus

    Objective-C

    OSStatus AudioUnitUninitialize ( AudioUnit inUnit );

    Parameters

    inUnit

    The audio unit that you want to uninitialize.

    Return Value

    A result code.

    Discussion

    Before you change an initialized audio unit’s processing characteristics, such as its input or output audio data format or its sample rate, you must first uninitialize it. Calling this function deallocates the audio unit’s resources.

    After calling this function, you can reconfigure the audio unit and then call AudioUnitInitialize to reinitialize it.
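    For example, the following hedged sketch changes the input stream format of an already-initialized unit. The specific format values are illustrative only, and ioUnit is assumed from the earlier examples.

    // Uninitialize before changing processing characteristics.
    AudioUnitUninitialize(ioUnit);

    // Reconfigure: here, interleaved stereo 32-bit float linear PCM at 44.1 kHz
    // on input element (bus) 0.
    AudioStreamBasicDescription format = {0};
    format.mSampleRate       = 44100.0;
    format.mFormatID         = kAudioFormatLinearPCM;
    format.mFormatFlags      = kAudioFormatFlagIsFloat | kAudioFormatFlagIsPacked;
    format.mChannelsPerFrame = 2;
    format.mBitsPerChannel   = 32;
    format.mFramesPerPacket  = 1;
    format.mBytesPerFrame    = sizeof(Float32) * format.mChannelsPerFrame;
    format.mBytesPerPacket   = format.mBytesPerFrame;

    AudioUnitSetProperty(ioUnit, kAudioUnitProperty_StreamFormat,
                         kAudioUnitScope_Input, 0,
                         &format, sizeof(format));

    // Reinitialize so the unit can reallocate resources for the new format.
    AudioUnitInitialize(ioUnit);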

    Import Statement

    Objective-C

    @import AudioUnit;

    Swift

    import AudioUnit

    Availability

    Available in iOS 2.0 and later.

  • Registers a callback to receive audio unit render notifications.

    Declaration

    Swift

    func AudioUnitAddRenderNotify(_ inUnit: AudioUnit, _ inProc: AURenderCallback, _ inProcUserData: UnsafeMutablePointer<Void>) -> OSStatus

    Objective-C

    OSStatus AudioUnitAddRenderNotify ( AudioUnit inUnit, AURenderCallback inProc, void *inProcUserData );

    Parameters

    inUnit

    The audio unit that you want to receive render notifications from.

    inProc

    The callback that you are registering.

    inProcUserData

    Custom data that you want to be sent to your callback. Use this, for example, to identify the render listener.

    Return Value

    A result code.

    Discussion

    The registered callback function is called both before the audio unit performs its render operation (when the pre-render bit of the render action flags is set) and after the audio unit has completed its render operation (when the post-render bit is set).

    The inProc and inProcUserData parameters are treated as a two-part identifier. To remove a render listener, you must pass both these values to the AudioUnitRemoveRenderNotify function.
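    A sketch of registering and later removing a render notification. MyRenderNotify is a callback you would define with the AURenderCallback signature shown in the Callbacks section, and myListenerContext is a hypothetical context pointer.

    // Called before and after every render operation of the audio unit.
    static OSStatus MyRenderNotify(void                        *inRefCon,
                                   AudioUnitRenderActionFlags  *ioActionFlags,
                                   const AudioTimeStamp        *inTimeStamp,
                                   UInt32                      inBusNumber,
                                   UInt32                      inNumberFrames,
                                   AudioBufferList             *ioData) {
        if (*ioActionFlags & kAudioUnitRenderAction_PostRender) {
            // Inspect or meter the freshly rendered audio in ioData here.
        }
        return noErr;
    }

    // Register: inProc and inProcUserData together identify this listener.
    AudioUnitAddRenderNotify(ioUnit, MyRenderNotify, myListenerContext);

    // Later, unregister by passing the same two values.
    AudioUnitRemoveRenderNotify(ioUnit, MyRenderNotify, myListenerContext);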

    Import Statement

    Objective-C

    @import AudioUnit;

    Swift

    import AudioUnit

    Availability

    Available in iOS 2.0 and later.

  • Unregisters a previously-registered render listener callback function.

    Declaration

    Swift

    func AudioUnitRemoveRenderNotify(_ inUnit: AudioUnit, _ inProc: AURenderCallback, _ inProcUserData: UnsafeMutablePointer<Void>) -> OSStatus

    Objective-C

    OSStatus AudioUnitRemoveRenderNotify ( AudioUnit inUnit, AURenderCallback inProc, void *inProcUserData );

    Parameters

    inUnit

    The audio unit that you no longer want to receive render notifications from.

    inProc

    The callback function that you previously registered and are now unregistering.

    inProcUserData

    The custom data that you provided when registering the callback function.

    Return Value

    A result code.

    Import Statement

    Objective-C

    @import AudioUnit;

    Swift

    import AudioUnit

    Availability

    Available in iOS 2.0 and later.

  • Initiates a rendering cycle for an audio unit.

    Declaration

    Swift

    func AudioUnitRender(_ inUnit: AudioUnit, _ ioActionFlags: UnsafeMutablePointer<AudioUnitRenderActionFlags>, _ inTimeStamp: UnsafePointer<AudioTimeStamp>, _ inOutputBusNumber: UInt32, _ inNumberFrames: UInt32, _ ioData: UnsafeMutablePointer<AudioBufferList>) -> OSStatus

    Objective-C

    OSStatus AudioUnitRender ( AudioUnit inUnit, AudioUnitRenderActionFlags *ioActionFlags, const AudioTimeStamp *inTimeStamp, UInt32 inOutputBusNumber, UInt32 inNumberFrames, AudioBufferList *ioData );

    Parameters

    inUnit

    The audio unit that you are asking to render.

    ioActionFlags

    Flags to configure the rendering operation.

    inTimeStamp

    The audio time stamp for the render operation. Each time stamp must contain a valid sample time that increases monotonically from the previous call to this function; that is, the sample time for the next call should equal this call’s sample time plus inNumberFrames.

    If sample time does not increase like this from one render call to the next, the audio unit interprets that as a discontinuity with the timeline it is rendering for.

    When rendering to multiple output buses, ensure that this value is the same for each bus. Using the same value allows an audio unit to determine that the rendering for each output bus is part of a single render operation.

    inOutputBusNumber

    The output bus to render for.

    inNumberFrames

    The number of audio sample frames to render.

    ioData

    On input, the audio buffer list that the audio unit is to render into. On output, the audio data that was rendered by the audio unit.

    The AudioBufferList that you provide on input must match the topology for the current audio format for the given bus. The buffer list can be either of these two variants:

    • If the mData pointers are non-null, the audio unit renders its output into those buffers

    • If the mData pointers are null, the audio unit can provide pointers to its own buffers. In this case, the audio unit must keep those buffers valid for the duration of the calling thread’s I/O cycle.

    Return Value

    A result code.
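    As an illustration, a hedged sketch of pulling one slice of audio from a unit into a caller-owned buffer. It assumes someUnit produces output on bus 0 configured for interleaved stereo Float32, and that sampleTime is a running counter you increment by frameCount after each call; both names are assumptions for this example.

    const UInt32 frameCount = 512;
    Float32 samples[512 * 2];               // interleaved stereo, caller-owned

    AudioBufferList bufferList;
    bufferList.mNumberBuffers = 1;
    bufferList.mBuffers[0].mNumberChannels = 2;
    bufferList.mBuffers[0].mDataByteSize   = frameCount * 2 * sizeof(Float32);
    bufferList.mBuffers[0].mData           = samples;

    // The sample time must advance by frameCount on each successive call.
    AudioTimeStamp timeStamp = {0};
    timeStamp.mSampleTime = sampleTime;
    timeStamp.mFlags      = kAudioTimeStampSampleTimeValid;

    AudioUnitRenderActionFlags flags = 0;
    OSStatus status = AudioUnitRender(someUnit, &flags, &timeStamp,
                                      0,             // output bus number
                                      frameCount, &bufferList);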

    Import Statement

    Objective-C

    @import AudioUnit;

    Swift

    import AudioUnit

    Availability

    Available in iOS 2.0 and later.

  • Resets an audio unit’s render state.

    Declaration

    Swift

    func AudioUnitReset(_ inUnit: AudioUnit, _ inScope: AudioUnitScope, _ inElement: AudioUnitElement) -> OSStatus

    Objective-C

    OSStatus AudioUnitReset ( AudioUnit inUnit, AudioUnitScope inScope, AudioUnitElement inElement );

    Parameters

    inUnit

    The audio unit whose render state you are resetting.

    inScope

    The audio unit scope, typically set to kAudioUnitScope_Global.

    inElement

    The audio unit element, typically set to 0.

    Return Value

    A result code.

    Discussion

    This function resets the render state of an audio unit. For example, for a delay or reverb audio unit, it clears all of the delay lines maintained within the unit. Typically, you call this function when an audio unit that was previously rendering is taken out of the render chain (for example, when its track is muted) and is later added back in (for example, when the track is unmuted). Reset the audio unit before adding it back to the render chain so that it does not produce stale audio from its delay lines.

    This function clears memory. It does not allocate or free memory resources.
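    For example, when a previously muted track that feeds a delay unit is unmuted, you might clear the unit’s state before reconnecting it (delayUnit is an assumed variable):

    // Clear any stale delay-line contents before the unit rejoins the chain.
    AudioUnitReset(delayUnit, kAudioUnitScope_Global, 0);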

    Import Statement

    Objective-C

    @import AudioUnit;

    Swift

    import AudioUnit

    Availability

    Available in iOS 2.0 and later.

  • Registers a callback to receive audio unit property change notifications.

    Declaration

    Swift

    func AudioUnitAddPropertyListener(_ inUnit: AudioUnit, _ inID: AudioUnitPropertyID, _ inProc: AudioUnitPropertyListenerProc, _ inProcUserData: UnsafeMutablePointer<Void>) -> OSStatus

    Objective-C

    OSStatus AudioUnitAddPropertyListener ( AudioUnit inUnit, AudioUnitPropertyID inID, AudioUnitPropertyListenerProc inProc, void *inProcUserData );

    Parameters

    inUnit

    The audio unit you want to receive property change notifications from.

    inID

    The identifier for the property that you want to monitor.

    inProc

    The callback that you are registering.

    inProcUserData

    Custom data that you want to be sent to your callback. Use this, for example, to identify the property listener.

    Return Value

    A result code.

    Discussion

    When an audio unit property value changes, a notification callback can be called by the audio unit to inform interested parties that this event has occurred. The notification is defined by the tuple of the inProc, inProcUserData, inID parameters.

    To unregister a callback, use the AudioUnitRemovePropertyListenerWithUserData function.
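    A sketch of monitoring one property, here the stream format; myListenerContext is a hypothetical context pointer.

    // Invoked by the audio unit whenever the monitored property changes.
    static void MyPropertyListener(void                 *inRefCon,
                                   AudioUnit            inUnit,
                                   AudioUnitPropertyID  inID,
                                   AudioUnitScope       inScope,
                                   AudioUnitElement     inElement) {
        // Re-read the property with AudioUnitGetProperty and react here.
    }

    // Register for changes to the stream format property.
    AudioUnitAddPropertyListener(ioUnit, kAudioUnitProperty_StreamFormat,
                                 MyPropertyListener, myListenerContext);

    // Later, unregister using the same inID, inProc, and inProcUserData values.
    AudioUnitRemovePropertyListenerWithUserData(ioUnit, kAudioUnitProperty_StreamFormat,
                                                MyPropertyListener, myListenerContext);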

    Import Statement

    Objective-C

    @import AudioUnit;

    Swift

    import AudioUnit

    Availability

    Available in iOS 2.0 and later.

  • Unregisters a previously-registered property listener callback function.

    Declaration

    Swift

    func AudioUnitRemovePropertyListenerWithUserData(_ inUnit: AudioUnit, _ inID: AudioUnitPropertyID, _ inProc: AudioUnitPropertyListenerProc, _ inProcUserData: UnsafeMutablePointer<Void>) -> OSStatus

    Objective-C

    OSStatus AudioUnitRemovePropertyListenerWithUserData ( AudioUnit inUnit, AudioUnitPropertyID inID, AudioUnitPropertyListenerProc inProc, void *inProcUserData );

    Parameters

    inUnit

    The audio unit that you no longer want to receive property change notifications from.

    inID

    The identifier for the property that you no longer want to monitor.

    inProc

    The callback function that you previously registered and are now unregistering.

    inProcUserData

    The custom data that you provided when registering the callback function.

    Return Value

    A result code.

    Import Statement

    Objective-C

    @import AudioUnit;

    Swift

    import AudioUnit

    Availability

    Available in iOS 2.0 and later.

  • Gets the value of an audio unit property.

    Declaration

    Swift

    func AudioUnitGetProperty(_ inUnit: AudioUnit, _ inID: AudioUnitPropertyID, _ inScope: AudioUnitScope, _ inElement: AudioUnitElement, _ outData: UnsafeMutablePointer<Void>, _ ioDataSize: UnsafeMutablePointer<UInt32>) -> OSStatus

    Objective-C

    OSStatus AudioUnitGetProperty ( AudioUnit inUnit, AudioUnitPropertyID inID, AudioUnitScope inScope, AudioUnitElement inElement, void *outData, UInt32 *ioDataSize );

    Parameters

    inUnit

    The audio unit that you want to get a property value from.

    inID

    The identifier for the property.

    inScope

    The audio unit scope for the property.

    inElement

    The audio unit element for the property.

    outData

    On successful output, the current value for the specified audio unit property. Set this parameter to NULL when calling this function if you only want to determine how much memory to allocate for a variable size property.

    ioDataSize

    On input, the expected size of the property value, as pointed to by the outData parameter. On output, the size of the data that was returned.

    Return Value

    A result code.

    Special Considerations

    Some Core Audio property values are C types and others are Core Foundation objects.

    If you call this function to retrieve a value that is a Core Foundation object, then this function—despite the use of “Get” in its name—duplicates the object. You are responsible for releasing the object, as described in The Create Rule in Memory Management Programming Guide for Core Foundation.
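    For a fixed-size property such as the stream format, a single call with a correctly sized buffer is enough; a hedged sketch:

    // Read the current stream format on output element 0.
    AudioStreamBasicDescription format;
    UInt32 size = sizeof(format);
    OSStatus status = AudioUnitGetProperty(ioUnit, kAudioUnitProperty_StreamFormat,
                                           kAudioUnitScope_Output, 0,
                                           &format, &size);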

    Import Statement

    Objective-C

    @import AudioUnit;

    Swift

    import AudioUnit

    Availability

    Available in iOS 2.0 and later.

  • Gets information about an audio unit property.

    Declaration

    Swift

    func AudioUnitGetPropertyInfo(_ inUnit: AudioUnit, _ inID: AudioUnitPropertyID, _ inScope: AudioUnitScope, _ inElement: AudioUnitElement, _ outDataSize: UnsafeMutablePointer<UInt32>, _ outWritable: UnsafeMutablePointer<Boolean>) -> OSStatus

    Objective-C

    OSStatus AudioUnitGetPropertyInfo ( AudioUnit inUnit, AudioUnitPropertyID inID, AudioUnitScope inScope, AudioUnitElement inElement, UInt32 *outDataSize, Boolean *outWritable );

    Parameters

    inUnit

    The audio unit that you want to get property information from.

    inID

    The identifier for the property.

    inScope

    The audio unit scope for the property.

    inElement

    The audio unit element for the property.

    outDataSize

    On successful output, the maximum size for the audio unit property. Can be NULL on input, in which case no value is returned.

    outWritable

    On successful output, a Boolean value indicating whether the property can be written to (YES/true) or not (NO/false). Can be NULL on input, in which case no value is returned.

    Return Value

    A result code.

    Discussion

    Some properties that have read/write access when an audio unit is uninitialized become read-only when the audio unit is initialized.
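    For variable-size properties, a common two-step pattern is to query the size first and then fetch the value. A sketch using the kAudioUnitProperty_ParameterList property, which returns an array of AudioUnitParameterID values (malloc and free require <stdlib.h>):

    // Step 1: ask how large the property value is.
    UInt32 size = 0;
    Boolean writable = 0;
    AudioUnitGetPropertyInfo(ioUnit, kAudioUnitProperty_ParameterList,
                             kAudioUnitScope_Global, 0, &size, &writable);

    // Step 2: allocate a buffer of that size and fetch the value.
    UInt32 parameterCount = size / sizeof(AudioUnitParameterID);
    AudioUnitParameterID *parameterIDs = (AudioUnitParameterID *)malloc(size);
    AudioUnitGetProperty(ioUnit, kAudioUnitProperty_ParameterList,
                         kAudioUnitScope_Global, 0, parameterIDs, &size);
    // ... use parameterIDs[0 .. parameterCount - 1] ...
    free(parameterIDs);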

    Import Statement

    Objective-C

    @import AudioUnit;

    Swift

    import AudioUnit

    Availability

    Available in iOS 2.0 and later.

  • Sets the value of an audio unit property.

    Declaration

    Swift

    func AudioUnitSetProperty(_ inUnit: AudioUnit, _ inID: AudioUnitPropertyID, _ inScope: AudioUnitScope, _ inElement: AudioUnitElement, _ inData: UnsafePointer<Void>, _ inDataSize: UInt32) -> OSStatus

    Objective-C

    OSStatus AudioUnitSetProperty ( AudioUnit inUnit, AudioUnitPropertyID inID, AudioUnitScope inScope, AudioUnitElement inElement, const void *inData, UInt32 inDataSize );

    Parameters

    inUnit

    The audio unit that you want to set a property value for.

    inID

    The audio unit property identifier.

    inScope

    The audio unit scope for the property.

    inElement

    The audio unit element for the property.

    inData

    The value that you want to apply to the property. May be NULL (see Discussion).

    Always pass property values by reference. For example, for a property value of type CFStringRef, pass it as &myCFString.

    inDataSize

    The size of the data you are providing in the inData parameter.

    Return Value

    A result code.

    Discussion

    To clear an audio unit property value, set the inData parameter to NULL and set the inDataSize parameter to 0. Clearing properties works only for those properties that do not have a default value.
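    For example, a hedged sketch that attaches a render input callback to input element 0 by setting the kAudioUnitProperty_SetRenderCallback property. MyInputCallback is the AURenderCallback sketched in the Callbacks section, and myPlayerContext is a hypothetical context pointer.

    AURenderCallbackStruct callback;
    callback.inputProc       = MyInputCallback;
    callback.inputProcRefCon = myPlayerContext;

    // Property values are always passed by reference, with their size.
    AudioUnitSetProperty(ioUnit, kAudioUnitProperty_SetRenderCallback,
                         kAudioUnitScope_Input, 0,
                         &callback, sizeof(callback));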

    Import Statement

    Objective-C

    @import AudioUnit;

    Swift

    import AudioUnit

    Availability

    Available in iOS 2.0 and later.

  • Gets the value of an audio unit parameter.

    Declaration

    Swift

    func AudioUnitGetParameter(_ inUnit: AudioUnit, _ inID: AudioUnitParameterID, _ inScope: AudioUnitScope, _ inElement: AudioUnitElement, _ outValue: UnsafeMutablePointer<AudioUnitParameterValue>) -> OSStatus

    Objective-C

    OSStatus AudioUnitGetParameter ( AudioUnit inUnit, AudioUnitParameterID inID, AudioUnitScope inScope, AudioUnitElement inElement, AudioUnitParameterValue *outValue );

    Parameters

    inUnit

    The audio unit that you want to get a parameter value from.

    inID

    The identifier for the parameter.

    inScope

    The audio unit scope for the parameter.

    inElement

    The audio unit element for the parameter.

    outValue

    On success, contains the current value for the specified audio unit parameter.

    Return Value

    A result code.

    Import Statement

    Objective-C

    @import AudioUnit;

    Swift

    import AudioUnit

    Availability

    Available in iOS 2.0 and later.

  • Sets the value of an audio unit parameter.

    Declaration

    Swift

    func AudioUnitSetParameter(_ inUnit: AudioUnit, _ inID: AudioUnitParameterID, _ inScope: AudioUnitScope, _ inElement: AudioUnitElement, _ inValue: AudioUnitParameterValue, _ inBufferOffsetInFrames: UInt32) -> OSStatus

    Objective-C

    OSStatus AudioUnitSetParameter ( AudioUnit inUnit, AudioUnitParameterID inID, AudioUnitScope inScope, AudioUnitElement inElement, AudioUnitParameterValue inValue, UInt32 inBufferOffsetInFrames );

    Parameters

    inUnit

    The audio unit that you want to set a parameter value for.

    inID

    The audio unit parameter identifier.

    inScope

    The audio unit scope for the parameter.

    inElement

    The audio unit element for the parameter.

    inValue

    The value that you want to apply to the parameter.

    inBufferOffsetInFrames

    Set this to 0. To schedule the setting of a parameter value, use the AudioUnitScheduleParameters function.

    Return Value

    A result code.
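    For example, a hedged sketch that sets the volume of input bus 0 on a multichannel mixer unit; mixerUnit is an assumed variable, and kMultiChannelMixerParam_Volume is documented in Audio Unit Parameters Reference.

    // Apply the new value immediately; inBufferOffsetInFrames must be 0.
    AudioUnitSetParameter(mixerUnit, kMultiChannelMixerParam_Volume,
                          kAudioUnitScope_Input, 0,
                          0.5, 0);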

    Import Statement

    Objective-C

    @import AudioUnit;

    Swift

    import AudioUnit

    Availability

    Available in iOS 2.0 and later.

  • Schedules changes to the value of an audio unit parameter.

    Declaration

    Swift

    func AudioUnitScheduleParameters(_ inUnit: AudioUnit, _ inParameterEvent: UnsafePointer<AudioUnitParameterEvent>, _ inNumParamEvents: UInt32) -> OSStatus

    Objective-C

    OSStatus AudioUnitScheduleParameters ( AudioUnit inUnit, const AudioUnitParameterEvent *inParameterEvent, UInt32 inNumParamEvents );

    Parameters

    inUnit

    The audio unit that you want to schedule parameter changes for.

    inParameterEvent

    One or more parameter events that you want to schedule.

    inNumParamEvents

    The number of audio unit parameter events represented in the inParameterEvent parameter.

    Return Value

    A result code.

    Discussion

    Use this function to schedule changes to the value of an audio unit parameter.

    • A so-called immediate audio unit parameter event takes place at a future time and involves an immediate change from one value to another.

    • A so-called ramped audio unit parameter event begins at a future time and proceeds linearly, over a specified number of audio samples, from a starting value to a final value.

    With a single call to this function, you can schedule multiple parameter events. All the events apply only to the current audio unit render call; the events are scheduled as a part of the pre-render notification callback.

    When scheduling an immediate parameter event, you provide a new value to be set at the specified sample buffer offset.

    When scheduling a ramped parameter event, the ramp must be scheduled in each audio unit render call for the duration of the ramp. Each successive schedule specifies the ramp’s progress within that render call.

    An audio unit parameter that accepts scheduled events indicates this through its AudioUnitParameterInfo structure.
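    A hedged sketch, written as if from inside a pre-render notification callback, that ramps a mixer input’s volume from silence to full over the current render slice. The unit and parameter are assumptions for illustration; whether a given parameter accepts scheduled events is indicated by its AudioUnitParameterInfo structure.

    AudioUnitParameterEvent event = {0};
    event.scope     = kAudioUnitScope_Input;
    event.element   = 0;
    event.parameter = kMultiChannelMixerParam_Volume;
    event.eventType = kParameterEvent_Ramped;
    event.eventValues.ramp.startBufferOffset = 0;
    event.eventValues.ramp.durationInFrames  = inNumberFrames;  // from the callback
    event.eventValues.ramp.startValue        = 0.0;
    event.eventValues.ramp.endValue          = 1.0;

    AudioUnitScheduleParameters(mixerUnit, &event, 1);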

    Import Statement

    Objective-C

    @import AudioUnit;

    Swift

    import AudioUnit

    Availability

    Available in iOS 2.0 and later.

Callbacks

  • Called by the system when an audio unit has provided a buffer of output samples.

    Declaration

    Swift

    typealias AUInputSamplesInOutputCallback = CFunctionPointer<((UnsafeMutablePointer<Void>, UnsafePointer<AudioTimeStamp>, Float64, Float64) -> Void)>

    Objective-C

    typedef void (*AUInputSamplesInOutputCallback) ( void *inRefCon, const AudioTimeStamp *inOutputTimeStamp, Float64 inInputSample, Float64 inNumberInputSamples );

    Parameters

    inRefCon

    Custom data that you provided when registering your callback with the audio unit.

    inOutputTimeStamp

    The time stamp that corresponds to the first sample of audio data produced in AudioUnitRender (its output data).

    inInputSample

    The sample number of the input that is represented in the first sample of that output time stamp.

    inNumberInputSamples

    The number of input samples that are represented in an output buffer.

    Return Value

    None. This callback does not return a value.

    Discussion

    When your application uses a varispeed or pitch-shifting audio unit, it may not be clear which input samples are represented in a buffer of output samples. This callback function addresses this issue by providing the input sample number corresponding to the first sample in an output buffer.

    Import Statement

    Objective-C

    @import AudioUnit;

    Swift

    import AudioUnit

    Availability

    Available in iOS 2.0 and later.

  • Called by the system when an audio unit requires input samples, or before and after a render operation.

    Declaration

    Swift

    typealias AURenderCallback = CFunctionPointer<((UnsafeMutablePointer<Void>, UnsafeMutablePointer<AudioUnitRenderActionFlags>, UnsafePointer<AudioTimeStamp>, UInt32, UInt32, UnsafeMutablePointer<AudioBufferList>) -> OSStatus)>

    Objective-C

    typedef OSStatus (*AURenderCallback) ( void *inRefCon, AudioUnitRenderActionFlags *ioActionFlags, const AudioTimeStamp *inTimeStamp, UInt32 inBusNumber, UInt32 inNumberFrames, AudioBufferList *ioData );

    Parameters

    inRefCon

    Custom data that you provided when registering your callback with the audio unit.

    ioActionFlags

    Flags that describe the context of this call; for example, whether it is a pre-render or post-render notification.

    inTimeStamp

    The timestamp associated with this call of audio unit render.

    inBusNumber

    The bus number associated with this call of audio unit render.

    inNumberFrames

    The number of sample frames that will be represented in the audio data in the provided ioData parameter.

    ioData

    The AudioBufferList that will be used to contain the rendered or provided audio data.

    Return Value

    A result code.

    Discussion

    You can use this callback function with both the audio unit render notification API (see the AudioUnitAddRenderNotify function) and the render input callback (see the kAudioUnitProperty_SetRenderCallback property).

    When registered as a render notification, this callback is invoked by the system before and after an audio unit’s render operations.

    When registered as a render operation input callback, it is invoked when an audio unit requires input samples for the input bus that the callback is attached to.
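    A hedged sketch of a render input callback that fills the requested buffers with a 440 Hz sine wave. It assumes a noninterleaved Float32 stream format and a 44.1 kHz sample rate (both assumptions), and <math.h> is required for sin.

    #include <math.h>

    static OSStatus MyInputCallback(void                        *inRefCon,
                                    AudioUnitRenderActionFlags  *ioActionFlags,
                                    const AudioTimeStamp        *inTimeStamp,
                                    UInt32                      inBusNumber,
                                    UInt32                      inNumberFrames,
                                    AudioBufferList             *ioData) {
        static double phase = 0.0;
        const double phaseIncrement = 2.0 * M_PI * 440.0 / 44100.0;  // assumed rate

        // Write the same sine ramp into every (noninterleaved) channel buffer.
        for (UInt32 buffer = 0; buffer < ioData->mNumberBuffers; ++buffer) {
            Float32 *out = (Float32 *)ioData->mBuffers[buffer].mData;
            for (UInt32 frame = 0; frame < inNumberFrames; ++frame) {
                out[frame] = (Float32)(0.25 * sin(phase + frame * phaseIncrement));
            }
        }
        phase += inNumberFrames * phaseIncrement;
        return noErr;
    }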

    Import Statement

    Objective-C

    @import AudioUnit;

    Swift

    import AudioUnit

    Availability

    Available in iOS 2.0 and later.

Data Types

  • The data type for a plug-in component that provides audio processing or audio data generation.

    Declaration

    Swift

    typealias AudioUnit = AudioComponentInstance

    Objective-C

    typedef AudioComponentInstance AudioUnit;

    Discussion

    The various types of audio units are described in the Audio Unit Types enumeration. The subtypes of audio units provided by Apple are described in Converter Audio Unit Subtypes, Effect Audio Unit Subtypes, Mixer Audio Unit Subtypes, and Input/Output Audio Unit Subtypes.

    Import Statement

    Objective-C

    @import AudioUnit;

    Swift

    import AudioUnit

    Availability

    Available in iOS 2.0 and later.

  • The data type for an audio unit element identifier.

    Declaration

    Swift

    typealias AudioUnitElement = UInt32

    Objective-C

    typedef UInt32 AudioUnitElement;

    Discussion

    An audio unit element is a discrete programmatic context that is nested within an audio unit scope (see AudioUnitScope). In the context of input and output scopes, elements serve as programmatic analogs of physical signal buses in hardware audio devices. Because of this analogy, the term “bus” is a common synonym for “element.”

    Elements are zero indexed. The Global scope always has exactly one element—element 0.
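    As an illustration of elements as buses, the Remote I/O unit (described under the input/output subtypes later in this document) uses element 0 for hardware output and element 1 for hardware input. A hedged sketch that enables the input bus uses the kAudioOutputUnitProperty_EnableIO property documented in Output Audio Unit Services Reference:

    // Enable recording on the Remote I/O unit's input element (bus 1).
    UInt32 enableInput = 1;
    AudioUnitSetProperty(ioUnit, kAudioOutputUnitProperty_EnableIO,
                         kAudioUnitScope_Input, 1,
                         &enableInput, sizeof(enableInput));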

    Import Statement

    Objective-C

    @import AudioUnit;

    Swift

    import AudioUnit

    Availability

    Available in iOS 2.0 and later.

  • The data type for an audio unit parameter identifier.

    Declaration

    Swift

    typealias AudioUnitParameterID = UInt32

    Objective-C

    typedef UInt32 AudioUnitParameterID;

    Discussion

    An audio unit parameter is an adjustable setting with a floating-point value. The parameters for Apple-supplied audio units are described in Audio Unit Parameters Reference. See also AudioUnitParameterValue, AudioUnitParameter.

    Import Statement

    Objective-C

    @import AudioUnit;

    Swift

    import AudioUnit

    Availability

    Available in iOS 2.0 and later.

  • The data type for an audio unit parameter value.

    Declaration

    Swift

    typealias AudioUnitParameterValue = Float32

    Objective-C

    typedef Float32 AudioUnitParameterValue;

    Discussion

    An audio unit parameter is an adjustable setting, such as gain. The parameters for Apple-supplied audio units are described in Audio Unit Parameters Reference.

    You can change a parameter value directly by calling the AudioUnitSetParameter function, or schedule a change by calling AudioUnitScheduleParameters. See also AudioUnitParameterID, AudioUnitParameter.

    Import Statement

    Objective-C

    @import AudioUnit;

    Swift

    import AudioUnit

    Availability

    Available in iOS 2.0 and later.

  • The data type for audio unit property keys.

    Declaration

    Swift

    typealias AudioUnitPropertyID = UInt32

    Objective-C

    typedef UInt32 AudioUnitPropertyID;

    Discussion

    An audio unit property is a key-value pair for a configuration setting, such as audio data stream format. The properties for Apple-supplied audio units are described in Audio Unit Properties Reference.

    Import Statement

    Objective-C

    @import AudioUnit;

    Swift

    import AudioUnit

    Availability

    Available in iOS 2.0 and later.

  • The data type for audio unit scope identifiers.

    Declaration

    Swift

    typealias AudioUnitScope = UInt32

    Objective-C

    typedef UInt32 AudioUnitScope;

    Discussion

    An audio unit scope is a discrete, nonnestable programmatic context for an audio unit. The scopes for audio units are described in the Audio Unit Scopes enumeration.

    Apple reserves audio unit scope identifiers from 0 through 1,024.

    Import Statement

    Objective-C

    @import AudioUnit;

    Swift

    import AudioUnit

    Availability

    Available in iOS 2.0 and later.

  • An adjustable audio unit attribute such as volume, pitch, or filter cutoff frequency.

    Declaration

    Swift

    struct AudioUnitParameter { var mAudioUnit: AudioUnit var mParameterID: AudioUnitParameterID var mScope: AudioUnitScope var mElement: AudioUnitElement init() init(mAudioUnit mAudioUnit: AudioUnit, mParameterID mParameterID: AudioUnitParameterID, mScope mScope: AudioUnitScope, mElement mElement: AudioUnitElement) }

    Objective-C

    struct AudioUnitParameter { AudioUnit mAudioUnit; AudioUnitParameterID mParameterID; AudioUnitScope mScope; AudioUnitElement mElement; }; typedef struct AudioUnitParameter AudioUnitParameter;

    Fields

    mAudioUnit

    The audio unit instance that the parameter applies to.

    mParameterID

    The audio unit parameter identifier.

    mScope

    The audio unit scope for the parameter.

    mElement

    The audio unit element for the parameter.

    Discussion

    This data structure is used by functions declared in the AudioToolbox/AudioUnitUtilities.h header file in OS X.

    An audio unit parameter is uniquely identified by the combination of its scope, element, and ID.

    Availability

    Available in iOS 2.0 and later.

  • A scheduled change to an audio unit parameter’s value.

    Declaration

    Swift

    struct AudioUnitParameterEvent { var scope: AudioUnitScope var element: AudioUnitElement var parameter: AudioUnitParameterID var eventType: AUParameterEventType init() }

    Objective-C

    struct AudioUnitParameterEvent {
       AudioUnitScope        scope;
       AudioUnitElement      element;
       AudioUnitParameterID  parameter;
       AUParameterEventType  eventType;
       union {
          struct {
             SInt32                   startBufferOffset;
             UInt32                   durationInFrames;
             AudioUnitParameterValue  startValue;
             AudioUnitParameterValue  endValue;
          } ramp;
          struct {
             UInt32                   bufferOffset;
             AudioUnitParameterValue  value;
          } immediate;
       } eventValues;
    };
    typedef struct AudioUnitParameterEvent AudioUnitParameterEvent;

    Fields

    scope

    The audio unit scope for the parameter.

    element

    The audio unit element for the parameter.

    parameter

    The audio unit parameter identifier.

    eventType

    The event type. See Audio Unit Parameter Event Types.

    startBufferOffset

    For a ramped parameter event, the sample time at which to begin the parameter value change.

    durationInFrames

    For a ramped parameter event, the number of frames over which the parameter value should linearly change from startValue to endValue.

    startValue

    For a ramped parameter event, the starting parameter value.

    endValue

    For a ramped parameter event, the ending parameter value.

    bufferOffset

    For an immediate parameter event, the sample time at which to change the parameter value.

    value

    For an immediate parameter event, the new parameter value.

    Discussion

    If the eventType field value is kParameterEvent_Immediate, use the immediate structure in the eventValues union. If the event type is kParameterEvent_Ramped, use the ramp structure in the eventValues union.

    Apply one or more AudioUnitParameterEvent events to an audio unit using the AudioUnitScheduleParameters function.

    Availability

    Available in iOS 2.0 and later.

  • A key-value pair that declares an attribute or behavior for an audio unit.

    Declaration

    Swift

    struct AudioUnitProperty { var mAudioUnit: AudioUnit var mPropertyID: AudioUnitPropertyID var mScope: AudioUnitScope var mElement: AudioUnitElement init() init(mAudioUnit mAudioUnit: AudioUnit, mPropertyID mPropertyID: AudioUnitPropertyID, mScope mScope: AudioUnitScope, mElement mElement: AudioUnitElement) }

    Objective-C

    struct AudioUnitProperty { AudioUnit mAudioUnit; AudioUnitPropertyID mPropertyID; AudioUnitScope mScope; AudioUnitElement mElement; }; typedef struct AudioUnitProperty AudioUnitProperty;

    Fields

    mAudioUnit

    The audio unit instance that the parameter applies to.

    mPropertyID

    The audio unit property identifier.

    mScope

    The audio unit scope for the property.

    mElement

    The audio unit element for the property.

    Availability

    Available in iOS 2.0 and later.

Constants

  • The defined types of audio processing plug-ins known as audio units.

    Declaration

    Swift

    var kAudioUnitType_Output: Int { get }
    var kAudioUnitType_MusicDevice: Int { get }
    var kAudioUnitType_MusicEffect: Int { get }
    var kAudioUnitType_FormatConverter: Int { get }
    var kAudioUnitType_Effect: Int { get }
    var kAudioUnitType_Mixer: Int { get }
    var kAudioUnitType_Panner: Int { get }
    var kAudioUnitType_OfflineEffect: Int { get }
    var kAudioUnitType_Generator: Int { get }

    Objective-C

    enum {
       kAudioUnitType_Output          = 'auou',
       kAudioUnitType_MusicDevice     = 'aumu',
       kAudioUnitType_MusicEffect     = 'aumf',
       kAudioUnitType_FormatConverter = 'aufc',
       kAudioUnitType_Effect          = 'aufx',
       kAudioUnitType_Mixer           = 'aumx',
       kAudioUnitType_Panner          = 'aupn',
       kAudioUnitType_OfflineEffect   = 'auol',
       kAudioUnitType_Generator       = 'augn',
    };

    Constants

    • kAudioUnitType_Output

      kAudioUnitType_Output

      An output unit provides input, output, or both input and output simultaneously. It can be used as the head of an audio unit processing graph.

      Available in iOS 2.0 and later.

    • kAudioUnitType_MusicDevice

      kAudioUnitType_MusicDevice

      An instrument unit can be used as a software musical instrument, such as a sampler or synthesizer. It responds to MIDI (Musical Instrument Digital Interface) control signals and can create notes.

      Available in iOS 2.0 and later.

    • kAudioUnitType_MusicEffect

      kAudioUnitType_MusicEffect

      An effect unit that can respond to MIDI control messages, typically through a mapping of MIDI messages to parameters of the audio unit’s DSP algorithm.

      Available in iOS 2.0 and later.

    • kAudioUnitType_FormatConverter

      kAudioUnitType_FormatConverter

      A format converter unit can transform audio formats, such as performing sample rate conversion. A format converter is also appropriate for deferred rendering and for effects such as varispeed. A format converter unit can ask for as much or as little audio input as it needs to produce a given output, while still completing its rendering within the time represented by the output buffer. For effect-like format converters, such as pitch shifters, it is common to provide both a realtime and an offline version. OS X, for example, includes Time-Pitch and Varispeed audio units in both realtime and offline versions.

      Available in iOS 2.0 and later.

    • kAudioUnitType_Effect

      kAudioUnitType_Effect

      An effect unit repeatedly processes a number of audio input samples to produce the same number of audio output samples. Most commonly, an effect unit has a single input and a single output. Some effects take side-chain inputs as well. Effect units can be run offline, such as to process a file without playing it, but are expected to run in realtime.

      Available in iOS 2.0 and later.

    • kAudioUnitType_Mixer

      kAudioUnitType_Mixer

      A mixer unit takes a number of input channels and mixes them to provide one or more output channels. For example, the kAudioUnitSubType_StereoMixer audio unit in OS X takes multiple mono or stereo inputs and produces a single stereo output.

      Available in iOS 2.0 and later.

    • kAudioUnitType_Panner

      kAudioUnitType_Panner

      A panner unit is a specialized effect unit that distributes one or more channels in a single input to one or more channels in a single output. Panner units must support a set of standard audio unit parameters that specify panning coordinates.

      Available in iOS 2.0 and later.

    • kAudioUnitType_OfflineEffect

      kAudioUnitType_OfflineEffect

      An offline effect unit provides digital signal processing of a sort that cannot proceed in realtime. For example, level normalization requires examination of an entire sound, beginning to end, before the normalization factor can be calculated. As such, offline effect units also have a notion of a priming stage that can be performed before the actual rendering/processing phase is executed.

      Available in iOS 2.0 and later.

    • kAudioUnitType_Generator

      kAudioUnitType_Generator

      A generator unit provides audio output but has no audio input. This audio unit type is appropriate for a tone generator. Unlike an instrument unit, a generator unit does not have a control input.

      Available in iOS 2.0 and later.

  • The Apple audio unit manufacturer code.

    Declaration

    Swift

    var kAudioUnitManufacturer_Apple: Int { get }

    Objective-C

    enum { kAudioUnitManufacturer_Apple = 'appl' };

    Constants

    • kAudioUnitManufacturer_Apple

      kAudioUnitManufacturer_Apple

      The unique manufacturer code that identifies audio units provided by Apple Inc.

      Available in iOS 2.0 and later.

  • Audio data format converter audio unit subtypes for audio units provided by Apple.

    Declaration

    Swift

    var kAudioUnitSubType_AUConverter: Int { get }
    var kAudioUnitSubType_NewTimePitch: Int { get }
    var kAudioUnitSubType_DeferredRenderer: Int { get }
    var kAudioUnitSubType_Splitter: Int { get }
    var kAudioUnitSubType_Merger: Int { get }
    var kAudioUnitSubType_Varispeed: Int { get }
    var kAudioUnitSubType_AUiPodTime: Int { get }
    var kAudioUnitSubType_AUiPodTimeOther: Int { get }

    Objective-C

    enum {
       kAudioUnitSubType_AUConverter      = 'conv',
       kAudioUnitSubType_NewTimePitch     = 'nutp',
       kAudioUnitSubType_TimePitch        = 'tmpt',
       kAudioUnitSubType_DeferredRenderer = 'defr',
       kAudioUnitSubType_Splitter         = 'splt',
       kAudioUnitSubType_Merger           = 'merg',
       kAudioUnitSubType_Varispeed        = 'vari',
       kAudioUnitSubType_AUiPodTime       = 'iptm',
       kAudioUnitSubType_AUiPodTimeOther  = 'ipto'
    };

    Constants

    • kAudioUnitSubType_AUConverter

      kAudioUnitSubType_AUConverter

      An audio unit that uses an audio converter to do linear PCM conversions, such as changes to sample rate, bit depth, or interleaving.

      Available in iOS 2.0 and later.

    • kAudioUnitSubType_NewTimePitch

      kAudioUnitSubType_NewTimePitch

      An audio unit that can be used to have independent control of both playback rate and pitch. In OS X it provides a generic view, so it can be used in both a UI and programmatic context. It also comes in an offline version for processing audio files.

      Available in iOS 6.0 and later.

    • kAudioUnitSubType_TimePitch

      kAudioUnitSubType_TimePitch

      An audio unit that can provide independent control of playback rate and pitch. This subtype provides a generic view, making it suitable for UI and programmatic context. OS X provides realtime and offline audio units of this subtype.

      Available in iOS 2.0 through iOS 2.0.

    • kAudioUnitSubType_DeferredRenderer

      kAudioUnitSubType_DeferredRenderer

      An audio unit that acquires audio input from a separate thread than the thread on which its render method is called. You can use this subtype to introduce multiple threads into an audio unit processing graph. There is a delay, equal to the buffer size, introduced between the audio input and output.

      Available in iOS 6.0 and later.

    • kAudioUnitSubType_Splitter

      kAudioUnitSubType_Splitter

      An audio unit with one input bus and two output buses. The audio unit duplicates the input signal to each of its two output buses.

      Available in iOS 6.0 and later.

    • kAudioUnitSubType_Merger

      kAudioUnitSubType_Merger

      An audio unit with two input buses and one output bus. The audio unit merges the two input signals to the single output.

      Available in iOS 6.0 and later.

    • kAudioUnitSubType_Varispeed

      kAudioUnitSubType_Varispeed

      An audio unit that can control playback rate. As the playback rate increases, so does pitch. This subtype provides a generic view, making it suitable for UI and programmatic context. OS X provides realtime and offline audio units of this subtype.

      Available in iOS 5.0 and later.

    • kAudioUnitSubType_AUiPodTime

      kAudioUnitSubType_AUiPodTime

      An iPhone OS audio unit that provides simple, limited control over playback rate and time.

      Available in iOS 2.1 and later.

    • kAudioUnitSubType_AUiPodTimeOther

      kAudioUnitSubType_AUiPodTimeOther

      Available in iOS 5.0 and later.

  • Effect (digital signal processing) audio unit subtypes for audio units provided by Apple.

    Declaration

    Swift

    var kAudioUnitSubType_PeakLimiter: Int { get }
    var kAudioUnitSubType_DynamicsProcessor: Int { get }
    var kAudioUnitSubType_Reverb2: Int { get }
    var kAudioUnitSubType_LowPassFilter: Int { get }
    var kAudioUnitSubType_HighPassFilter: Int { get }
    var kAudioUnitSubType_BandPassFilter: Int { get }
    var kAudioUnitSubType_HighShelfFilter: Int { get }
    var kAudioUnitSubType_LowShelfFilter: Int { get }
    var kAudioUnitSubType_ParametricEQ: Int { get }
    var kAudioUnitSubType_Delay: Int { get }
    var kAudioUnitSubType_SampleDelay: Int { get }
    var kAudioUnitSubType_Distortion: Int { get }
    var kAudioUnitSubType_AUiPodEQ: Int { get }
    var kAudioUnitSubType_NBandEQ: Int { get }

    Objective-C

    enum {
       kAudioUnitSubType_PeakLimiter       = 'lmtr',
       kAudioUnitSubType_DynamicsProcessor = 'dcmp',
       kAudioUnitSubType_Reverb2           = 'rvb2',
       kAudioUnitSubType_LowPassFilter     = 'lpas',
       kAudioUnitSubType_HighPassFilter    = 'hpas',
       kAudioUnitSubType_BandPassFilter    = 'bpas',
       kAudioUnitSubType_HighShelfFilter   = 'hshf',
       kAudioUnitSubType_LowShelfFilter    = 'lshf',
       kAudioUnitSubType_ParametricEQ      = 'pmeq',
       kAudioUnitSubType_Delay             = 'dely',
       kAudioUnitSubType_SampleDelay       = 'sdly',
       kAudioUnitSubType_Distortion        = 'dist',
       kAudioUnitSubType_AUiPodEQ          = 'ipeq',
       kAudioUnitSubType_NBandEQ           = 'nbeq'
    };

    Constants

    • kAudioUnitSubType_PeakLimiter

      kAudioUnitSubType_PeakLimiter

      An audio unit that enforces an upper dynamic limit on an audio signal.

      Available in iOS 5.0 and later.

    • kAudioUnitSubType_DynamicsProcessor

      kAudioUnitSubType_DynamicsProcessor

      An audio unit that provides dynamic compression or expansion.

      Available in iOS 5.0 and later.

    • kAudioUnitSubType_Reverb2

      kAudioUnitSubType_Reverb2

      A reverb unit for iOS.

      Available in iOS 5.0 and later.

    • kAudioUnitSubType_LowPassFilter

      kAudioUnitSubType_LowPassFilter

      An audio unit that passes frequencies below a specified cutoff frequency, and blocks frequencies above that cutoff frequency.

      Available in iOS 5.0 and later.

    • kAudioUnitSubType_HighPassFilter

      kAudioUnitSubType_HighPassFilter

      An audio unit that passes frequencies above a specified cutoff frequency, and blocks frequencies below that cutoff frequency.

      Available in iOS 5.0 and later.

    • kAudioUnitSubType_BandPassFilter

      kAudioUnitSubType_BandPassFilter

      An audio unit that passes frequencies between specified upper and lower cutoff frequencies, and blocks frequencies outside that band.

      Available in iOS 5.0 and later.

    • kAudioUnitSubType_HighShelfFilter

      kAudioUnitSubType_HighShelfFilter

      An audio unit suitable for implementing a treble control in an audio playback or recording system.

      Available in iOS 5.0 and later.

    • kAudioUnitSubType_LowShelfFilter

      kAudioUnitSubType_LowShelfFilter

      An audio unit suitable for implementing a bass control in an audio playback or recording system.

      Available in iOS 5.0 and later.

    • kAudioUnitSubType_ParametricEQ

      kAudioUnitSubType_ParametricEQ

      An audio unit that provides a filter whose center frequency, boost/cut level, and Q can be adjusted.

      Available in iOS 5.0 and later.

    • kAudioUnitSubType_Delay

      kAudioUnitSubType_Delay

      An audio unit that introduces a time delay to a signal.

      Available in iOS 6.0 and later.

    • kAudioUnitSubType_SampleDelay

      kAudioUnitSubType_SampleDelay

      An audio unit that provides a time delay for a specified number of samples.

      Available in iOS 8.0 and later.

    • kAudioUnitSubType_Distortion

      kAudioUnitSubType_Distortion

      An audio unit that provides a distortion effect.

      Available in iOS 5.0 and later.

    • kAudioUnitSubType_AUiPodEQ

      kAudioUnitSubType_AUiPodEQ

      An audio unit that provides a graphic equalizer in iPhone OS.

      Available in iOS 2.0 and later.

    • kAudioUnitSubType_NBandEQ

      kAudioUnitSubType_NBandEQ

      A multi-band equalizer with specifiable filter type for each band.

      Available in iOS 5.0 and later.

  • Audio mixing audio unit subtypes for audio units provided by Apple.

    Declaration

    Swift

    var kAudioUnitSubType_MultiChannelMixer: Int { get } var kAudioUnitSubType_MatrixMixer: Int { get } var kAudioUnitSubType_AU3DMixerEmbedded: Int { get }

    Objective-C

    enum { kAudioUnitSubType_MultiChannelMixer = 'mcmx', kAudioUnitSubType_MatrixMixer = 'mxmx', kAudioUnitSubType_AU3DMixerEmbedded = '3dem', };

    Constants

    • kAudioUnitSubType_MultiChannelMixer

      kAudioUnitSubType_MultiChannelMixer

      An audio unit that can have any number of input buses, with any number of channels on each input bus, and one output bus. In OS X, the output bus can have any number of channels. In iPhone OS, the output bus always has two channels.

      Available in iOS 2.0 and later.

    • kAudioUnitSubType_MatrixMixer

      kAudioUnitSubType_MatrixMixer

      An audio unit that can have any number of input and output buses with any number of channels on each bus. You configure the mix using a matrix of channels with a separate input level control for each channel. The audio unit also provides individual level control for each input-channel-to-output-channel combination, as well as level control for each output channel. Finally, the audio unit provides a global level control for the matrix as a whole.

      Available in iOS 6.0 and later.

    • kAudioUnitSubType_AU3DMixerEmbedded

      kAudioUnitSubType_AU3DMixerEmbedded

      An audio unit in iPhone OS that is a simplified version of the OS X kAudioUnitSubType_3DMixer audio unit. It can have any number of input buses and one output bus. Each input bus can be stereo or mono. The output bus is stereo.

      Available in iOS 2.0 and later.

  • Audio units that serve as sound sources.

    Declaration

    Swift

    var kAudioUnitSubType_ScheduledSoundPlayer: Int { get } var kAudioUnitSubType_AudioFilePlayer: Int { get }

    Objective-C

    enum { kAudioUnitSubType_ScheduledSoundPlayer = 'sspl', kAudioUnitSubType_AudioFilePlayer = 'afpl', };

    Constants

    • kAudioUnitSubType_ScheduledSoundPlayer

      kAudioUnitSubType_ScheduledSoundPlayer

      A generator unit that can be used to schedule slices of audio to be played at specified times. The audio is scheduled using the time stamps for the render operation and can be scheduled from any thread.

      Available in iOS 5.0 and later.

    • kAudioUnitSubType_AudioFilePlayer

      kAudioUnitSubType_AudioFilePlayer

      A generator unit that is used to play a file. In OS X it presents a custom UI so can be used in a UI context as well as in a programmatic context.

      Available in iOS 5.0 and later.

  • Audio units that can be played as musical instruments via MIDI control.

    Declaration

    Swift

    var kAudioUnitSubType_Sampler: Int { get }

    Objective-C

    enum { kAudioUnitSubType_Sampler = 'samp' };

    Constants

    • kAudioUnitSubType_Sampler

      kAudioUnitSubType_Sampler

      A monotimbral instrument unit that functions as a sampler-synthesizer and supports full interactive editing of its state.

      Available in iOS 5.0 and later.

  • Input/output audio unit subtypes for audio units provided by Apple.

    Declaration

    Swift

    var kAudioUnitSubType_GenericOutput: Int { get } var kAudioUnitSubType_RemoteIO: Int { get } var kAudioUnitSubType_VoiceProcessingIO: Int { get }

    Objective-C

    enum { kAudioUnitSubType_GenericOutput = 'genr', kAudioUnitSubType_RemoteIO = 'rioc', kAudioUnitSubType_VoiceProcessingIO = 'vpio' };

    Constants

    • kAudioUnitSubType_GenericOutput

      kAudioUnitSubType_GenericOutput

      An audio unit that responds to start/stop calls and provides basic services for converting to and from linear PCM formats.

      Available in iOS 2.0 and later.

    • kAudioUnitSubType_RemoteIO

      kAudioUnitSubType_RemoteIO

      An audio unit that interfaces to the audio inputs and outputs of iPhone OS devices. Bus 0 provides output to hardware and bus 1 accepts input from hardware. Called an I/O audio unit or sometimes a Remote I/O audio unit.

      Available in iOS 2.0 and later.

    • kAudioUnitSubType_VoiceProcessingIO

      kAudioUnitSubType_VoiceProcessingIO

      An audio unit that interfaces to the audio inputs and outputs of iPhone OS devices and provides voice processing features. Bus 0 provides output to hardware and bus 1 accepts input from hardware. See the Voice-Processing I/O Audio Unit Properties enumeration for the identifiers for this audio unit’s properties.

      Available in iOS 3.0 and later.

  • Audio unit parameter event types.

    Declaration

    Swift

    typealias AUParameterEventType = UInt32

    Objective-C

    enum { kParameterEvent_Immediate = 1, kParameterEvent_Ramped = 2 }; typedef UInt32 AUParameterEventType;

    Constants

    • kParameterEvent_Immediate

      kParameterEvent_Immediate

      An immediate change from the parameter’s previous value to a new value.

      Available in iOS 2.0 and later.

    • kParameterEvent_Ramped

      kParameterEvent_Ramped

      A gradual change from the parameter’s previous value to a new value, applied linearly over a specified period of time.

      Available in iOS 2.0 and later.

    Import Statement

    Objective-C

    @import AudioUnit;

    Swift

    import AudioUnit

    Availability

    Available in iOS 2.0 and later.

  • Flags for configuring audio unit rendering.

    Declaration

    Swift

    typealias AudioUnitRenderActionFlags = UInt32

    Objective-C

    enum {
       kAudioUnitRenderAction_PreRender            = (1 << 2),
       kAudioUnitRenderAction_PostRender           = (1 << 3),
       kAudioUnitRenderAction_OutputIsSilence      = (1 << 4),
       kAudioOfflineUnitRenderAction_Preflight     = (1 << 5),
       kAudioOfflineUnitRenderAction_Render        = (1 << 6),
       kAudioOfflineUnitRenderAction_Complete      = (1 << 7),
       kAudioUnitRenderAction_PostRenderError      = (1 << 8),
       kAudioUnitRenderAction_DoNotCheckRenderArgs = (1 << 9)
    };
    typedef UInt32 AudioUnitRenderActionFlags;

    Constants

    • kAudioUnitRenderAction_PreRender

      kAudioUnitRenderAction_PreRender

      Set on a render notification callback, which can be invoked either before or after an audio unit’s render operation. If this flag is set, the callback is being invoked before the render operation is performed.

      Available in iOS 2.0 and later.

    • kAudioUnitRenderAction_PostRender

      kAudioUnitRenderAction_PostRender

      Set on a render notification callback, which can be invoked either before or after an audio unit’s render operation. If this flag is set, the callback is being invoked after the render operation has completed.

      Available in iOS 2.0 and later.

    • kAudioUnitRenderAction_OutputIsSilence

      kAudioUnitRenderAction_OutputIsSilence

      This flag can be set in a render input callback, or in the audio unit’s render operation itself, to indicate that the rendered buffer contains only silence. The caller can then use it as a hint about whether the buffer needs to be processed.

      Available in iOS 2.0 and later.

    • kAudioOfflineUnitRenderAction_Preflight

      kAudioOfflineUnitRenderAction_Preflight

      Used with offline audio units (of type 'auol'). This flag is set while an offline unit is being preflighted, a stage performed before the actual offline rendering actions take place. Preflighting is used when the offline process requires it; for example, an offline unit that normalizes an audio file must see all of the audio data before it can compute its normalization factor.

      Available in iOS 2.0 and later.

    • kAudioOfflineUnitRenderAction_Render

      kAudioOfflineUnitRenderAction_Render

      Once an offline unit has been successfully preflighted, it is put into its render mode. This flag indicates to the audio unit that it is now in that state and should perform its processing on the input data.

      Available in iOS 2.0 and later.

    • kAudioOfflineUnitRenderAction_Complete

      kAudioOfflineUnitRenderAction_Complete

      This flag is set when an offline unit has completed either its preflight or its render operation.

      Available in iOS 2.0 and later.

    • kAudioUnitRenderAction_PostRenderError

      kAudioUnitRenderAction_PostRenderError

      If this flag is set on the post-render call, an error was returned by the audio unit’s render operation. In this case, the error can be retrieved through the lastRenderError property, and the audio data in ioData handed to the post-render notification is invalid.

      Available in iOS 2.0 and later.

    • kAudioUnitRenderAction_DoNotCheckRenderArgs

      kAudioUnitRenderAction_DoNotCheckRenderArgs

      If this flag is set, the checks normally performed on the arguments provided to a render call are skipped. This can save computation time in situations where you are sure you are providing the correct arguments and structures to the various render calls.

      Available in iOS 4.0 and later.

    Discussion

    These flags can be set in a callback during an audio unit render operation, from either a render notification callback or the render input callback.

    Import Statement

    Objective-C

    @import AudioUnit;

    Swift

    import AudioUnit

    Availability

    Available in iOS 2.0 and later.

  • General audio unit component selectors that correspond to functions in the audio unit API.

    Declaration

    Swift

    var kAudioUnitRange: Int { get }
    var kAudioUnitInitializeSelect: Int { get }
    var kAudioUnitUninitializeSelect: Int { get }
    var kAudioUnitGetPropertyInfoSelect: Int { get }
    var kAudioUnitGetPropertySelect: Int { get }
    var kAudioUnitSetPropertySelect: Int { get }
    var kAudioUnitAddPropertyListenerSelect: Int { get }
    var kAudioUnitRemovePropertyListenerSelect: Int { get }
    var kAudioUnitRemovePropertyListenerWithUserDataSelect: Int { get }
    var kAudioUnitAddRenderNotifySelect: Int { get }
    var kAudioUnitRemoveRenderNotifySelect: Int { get }
    var kAudioUnitGetParameterSelect: Int { get }
    var kAudioUnitSetParameterSelect: Int { get }
    var kAudioUnitScheduleParametersSelect: Int { get }
    var kAudioUnitRenderSelect: Int { get }
    var kAudioUnitResetSelect: Int { get }
    var kAudioUnitComplexRenderSelect: Int { get }
    var kAudioUnitProcessSelect: Int { get }
    var kAudioUnitProcessMultipleSelect: Int { get }

    Objective-C

    enum {
       kAudioUnitRange                                    = 0x0000,
       kAudioUnitInitializeSelect                         = 0x0001,
       kAudioUnitUninitializeSelect                       = 0x0002,
       kAudioUnitGetPropertyInfoSelect                    = 0x0003,
       kAudioUnitGetPropertySelect                        = 0x0004,
       kAudioUnitSetPropertySelect                        = 0x0005,
       kAudioUnitAddPropertyListenerSelect                = 0x000A,
       kAudioUnitRemovePropertyListenerSelect             = 0x000B,
       kAudioUnitRemovePropertyListenerWithUserDataSelect = 0x0012,
       kAudioUnitAddRenderNotifySelect                    = 0x000F,
       kAudioUnitRemoveRenderNotifySelect                 = 0x0010,
       kAudioUnitGetParameterSelect                       = 0x0006,
       kAudioUnitSetParameterSelect                       = 0x0007,
       kAudioUnitScheduleParametersSelect                 = 0x0011,
       kAudioUnitRenderSelect                             = 0x000E,
       kAudioUnitResetSelect                              = 0x0009,
       kAudioUnitComplexRenderSelect                      = 0x0013,
       kAudioUnitProcessSelect                            = 0x0014,
       kAudioUnitProcessMultipleSelect                    = 0x0015
    };

    Constants

    • kAudioUnitRange

      kAudioUnitRange

      The start of the numerical range for general audio unit function selectors.

      Available in iOS 2.0 and later.

    • kAudioUnitInitializeSelect

      kAudioUnitInitializeSelect

      Used by the system to initialize an audio unit when you call the AudioUnitInitialize function.

      Available in iOS 2.0 and later.

    • kAudioUnitUninitializeSelect

      kAudioUnitUninitializeSelect

      Used by the system to uninitialize an audio unit when you call the AudioUnitUninitialize function.

      Available in iOS 2.0 and later.

    • kAudioUnitGetPropertyInfoSelect

      kAudioUnitGetPropertyInfoSelect

      Used by the system to get property information from an audio unit when you call the AudioUnitGetPropertyInfo function.

      Available in iOS 2.0 and later.

    • kAudioUnitGetPropertySelect

      kAudioUnitGetPropertySelect

      Used by the system to get a property value from an audio unit when you call the AudioUnitGetProperty function.

      Available in iOS 2.0 and later.

    • kAudioUnitSetPropertySelect

      kAudioUnitSetPropertySelect

      Used by the system to set an audio unit property value when you call the AudioUnitSetProperty function.

      Available in iOS 2.0 and later.

    • kAudioUnitAddPropertyListenerSelect

      kAudioUnitAddPropertyListenerSelect

      Used by the system to register a property listener callback function for an audio unit when you call the AudioUnitAddPropertyListener function.

      Available in iOS 2.0 and later.

    • kAudioUnitRemovePropertyListenerSelect

      kAudioUnitRemovePropertyListenerSelect

      Used by the system to unregister a property listener callback function from an audio unit when you call the AudioUnitRemovePropertyListener function.

      Available in iOS 2.0 and later.

    • kAudioUnitRemovePropertyListenerWithUserDataSelect

      kAudioUnitRemovePropertyListenerWithUserDataSelect

      Used by the system to unregister a property listener callback function, associated with specified user data, from an audio unit when you call the AudioUnitRemovePropertyListenerWithUserData function.

      Available in iOS 2.0 and later.

    • kAudioUnitAddRenderNotifySelect

      kAudioUnitAddRenderNotifySelect

      Used by the system to register a render notification callback function for an audio unit when you call the AudioUnitAddRenderNotify function.

      Available in iOS 2.0 and later.

    • kAudioUnitRemoveRenderNotifySelect

      kAudioUnitRemoveRenderNotifySelect

      Used by the system to unregister a render notification callback function from an audio unit when you call the AudioUnitRemoveRenderNotify function.

      Available in iOS 2.0 and later.

    • kAudioUnitGetParameterSelect

      kAudioUnitGetParameterSelect

      Used by the system to get the current value of an audio unit parameter when you call the AudioUnitGetParameter function.

      Available in iOS 2.0 and later.

    • kAudioUnitSetParameterSelect

      kAudioUnitSetParameterSelect

      Used by the system to set the value of an audio unit parameter when you call the AudioUnitSetParameter function.

      Available in iOS 2.0 and later.

    • kAudioUnitScheduleParametersSelect

      kAudioUnitScheduleParametersSelect

      Used by the system to schedule an audio unit parameter value change when you call the AudioUnitScheduleParameters function.

      Available in iOS 2.0 and later.

    • kAudioUnitRenderSelect

      kAudioUnitRenderSelect

      Used by the system to invoke audio rendering by an audio unit when you call the AudioUnitRender function.

      Available in iOS 2.0 and later.

    • kAudioUnitResetSelect

      kAudioUnitResetSelect

      Used by the system to reset an audio unit when you call the AudioUnitReset function.

      Available in iOS 2.0 and later.

    • kAudioUnitComplexRenderSelect

      kAudioUnitComplexRenderSelect

      Available in iOS 4.0 and later.

    • kAudioUnitProcessSelect

      kAudioUnitProcessSelect

      Available in iOS 4.0 and later.

    • kAudioUnitProcessMultipleSelect

      kAudioUnitProcessMultipleSelect

      Available in iOS 5.0 and later.

    Discussion

    For audio unit component function selectors that apply to I/O audio units, see Output Audio Unit Services Reference.

Result Codes

This table lists the result codes defined for Audio Unit Component Services.