AudioUnit


Create audio unit extensions and add sophisticated audio manipulation and processing capabilities to your app using AudioUnit.


Posts under AudioUnit tag

48 Posts
Not yet solved · 1 reply · 389 views
Hi, I've released an open-source AUv3 MIDI processor plugin for iOS and macOS that records and plays MIDI messages in a sample-accurate fashion and never applies any quantization. I've tested this plugin with 120 beta testers and everything seemed to work fine. However, now that I've released it, there seems to be a problem in Logic Pro X on some Mac computers with MIDI FX processor plugins that use Catalyst. You can find my plugin here: http://uwyn.com/mtr/ ... and the source code here: https://github.com/gbevin/MIDITapeRecorder When I trace the AUv3 instantiation, I see Logic Pro X obtaining the internalRenderBlock several times, but then never calling it. This means there's no render callback and no MIDI parameter events are ever received. I've talked to the developer of ZOA, which is also a MIDI processor plugin using Catalyst, and he's running into exactly the same problem: https://www.audiosymmetric.com/zoa.html Another developer who's working on a MIDI processor plugin has also been trying to track this down for weeks. When I test this on my M1 Max MacBook Pro, internalRenderBlock is always called; however, on my M1 MacBook Air and Intel 2019 MacBook Pro, it is never called. Any thoughts or ideas to work around this would be really helpful. Thanks!
Not yet solved · 3 replies · 857 views
Hello, my app crashes on the new macOS 12.x system; it works well on macOS 11 Big Sur. I'm developing an audio app on macOS using AudioUnit, and it sometimes crashes when I switch devices. The relevant API is: AudioUnitSetProperty(audio_unit, kAudioOutputUnitProperty_CurrentDevice, kAudioUnitScope_Global, kAudioUnitOutputBus, &rnd_id, sizeof(rnd_id)); This has troubled me for a month; I can't find the reason or any useful info. Any help would be appreciated. The crash log is:

OS Version: macOS 12.1 (21C51)
Report Version: 12
Bridge OS Version: 6.1 (19P647)
Crashed Thread: 43 schedule-thread
Exception Type: EXC_BAD_ACCESS (SIGSEGV)
Exception Codes: KERN_INVALID_ADDRESS at 0x00000a14f8969188
Exception Codes: 0x0000000000000001, 0x00000a14f8969188
Exception Note: EXC_CORPSE_NOTIFY
Application Specific Information: objc_msgSend() selector name: copy
Thread 43 Crashed:: schedule-thread
0 libobjc.A.dylib 0x7ff815ef405d objc_msgSend + 29
1 CoreAudio 0x7ff817a237b9 HALC_ShellDevice::_GetPropertyData(unsigned int, AudioObjectPropertyAddress const&, unsigned int, void const*, unsigned int, unsigned int&, void*, unsigned int&, AudioObjectPropertyAddress&, bool&) const + 1133
2 CoreAudio 0x7ff817c57b81 invocation function for block in HALC_ShellObject::GetPropertyData(unsigned int, AudioObjectPropertyAddress const&, unsigned int, void const*, unsigned int, unsigned int&, void*) const + 107
3 CoreAudio 0x7ff817e8a606 HALB_CommandGate::ExecuteCommand(void () block_pointer) const + 98
4 CoreAudio 0x7ff817c56a98 HALC_ShellObject::GetPropertyData(unsigned int, AudioObjectPropertyAddress const&, unsigned int, void const*, unsigned int, unsigned int&, void*) const + 376
5 CoreAudio 0x7ff817b04235 HAL_HardwarePlugIn_ObjectGetPropertyData(AudioHardwarePlugInInterface**, unsigned int, AudioObjectPropertyAddress const*, unsigned int, void const*, unsigned int*, void*) + 349
6 CoreAudio 0x7ff817c16109 HALPlugIn::ObjectGetPropertyData(HALObject const&, AudioObjectPropertyAddress const&, unsigned int, void const*, unsigned int&, void*) const + 59
7 CoreAudio 0x7ff817bd2f5d HALObject::GetPropertyData(AudioObjectPropertyAddress const&, unsigned int, void const*, unsigned int&, void*) const + 461
8 CoreAudio 0x7ff817f2ffca HALDevice::GetPropertyData(AudioObjectPropertyAddress const&, unsigned int, void const*, unsigned int&, void*) const + 644
9 CoreAudio 0x7ff8179809ab AudioObjectGetPropertyData + 275
10 AudioDSP 0x134b6df19 0x13490b000 + 2502425
11 AudioDSP 0x134b69776 0x13490b000 + 2484086
12 AudioDSP 0x134cc4eb4 0x13490b000 + 3907252
13 AudioDSP 0x134cc5f56 0x13490b000 + 3911510
14 AudioDSP 0x134e17b2c 0x13490b000 + 5294892
15 AudioDSP 0x134e0f4d1 0x13490b000 + 5260497
Not yet solved · 0 replies · 369 views
Hi, I have a problem with an AU host (based on Audio Toolbox/Core Audio, not AVFoundation) when running on macOS 11 or later on Apple Silicon: it crashes after some operations in the GUI. The weird thing is, it crashes in the IOThread. Could this be caused by some inappropriate operation in the GUI (e.g. outside the main thread) that affects the IOThread? Sounds quite improbable to me, and I did not find anything suspicious in the code. There are two logs in the debugger: [AUHostingService Client] connection interrupted. rt_sender::signal_wait failed: 89 ... And here is the crash log: Crash log: ... Thanks, Tomas
Not yet solved · 2 replies · 493 views
Hi. An update completed today and the Magic Mouse is no longer scrolling. Accessibility, Sound, and Keyboard in Preferences can't be accessed. AirPods Pro can't be connected, nor can any other device for audio. Photos is not syncing. Keyboard audio regulation is switched off, unless on a wire. Really? Yes, and nothing changed with 12.1.
Not yet solved · 0 replies · 185 views
I have an audio unit whose parameters I can retrieve and save in Logic Pro X. The values are boolean, float, and integer. What I want to do is be able to save a string value, just like what Logic Pro X is doing with their MIDI FX plugin Scripter. Is this possible?
Not yet solved · 0 replies · 233 views
Hi. Not sure if this is by design or not, but whenever I connect a Bluetooth device to the app, the AU callback stops producing frames for several seconds until the device is connected. I'm building a recording app that uses AVAssetWriter with fragmented segments (HLS buffers). When the callback freezes it's supposed to create a gap in the audio, but for some reason the segment that is created does not contain an audio gap, and the audio just "jumps" in timestamps.
Not yet solved · 0 replies · 324 views
I’m developing a voice communication app for the iPad with both playback and recording, using an AudioUnit of type kAudioUnitSubType_VoiceProcessingIO to get echo cancellation. When playing audio before initializing the recording audio unit, the volume is high. But if I play the audio after initializing the audio unit, or when switching to RemoteIO and then back to VoiceProcessingIO, the playback volume is low. It seems like a bug in iOS; is there any solution or workaround for this? Searching the net I only found this post without any solution: https://developer.apple.com/forums/thread/671836
Not yet solved · 0 replies · 277 views
When using the VoiceProcessingIO audio unit with the voiceChat audio session mode to get echo cancellation, I can't play audio in stereo; it only allows mono audio. How can I enable stereo playback with echo cancellation? Is this some kind of limitation? It isn't mentioned anywhere in the documentation.
Not yet solved · 0 replies · 260 views
Could someone explain what 'MALLOC_NANO' is? A snippet from the crash report it was contained in is here:

Crashed Thread: 50 myThread 0x1722ac000 - 0x17244ffff
Exception Type: EXC_BAD_ACCESS (SIGBUS)
Exception Codes: KERN_PROTECTION_FAILURE at 0x0000600015c9e340
Exception Note: EXC_CORPSE_NOTIFY
Termination Signal: Bus error: 10
Termination Reason: Namespace SIGNAL, Code 0xa
Terminating Process: exc handler [35195]
VM Regions Near 0x600015c9e340:
MALLOC_NANO 600008000000-600010000000 [128.0M] rw-/rwx SM=PRV
--> MALLOC_NANO 600010000000-600018000000 [128.0M] rw-/rwx SM=SHM
MALLOC_NANO 600018000000-600020000000 [128.0M] rw-/rwx SM=PRV

I'm guessing the root cause of this crash is using a deallocated object, but I'd really like to know more about MALLOC_NANO, as I couldn't find much information elsewhere. Much appreciated!
Not yet solved · 0 replies · 277 views
I know the VoiceProcessingIO audio unit will create an aggregate audio device. But I get the error kAudioUnitErr_InvalidProperty (-10789) when getting the kAudioOutputUnitProperty_OSWorkgroup property on recent macOS Monterey 12.2.1 or Big Sur 11.6.4:

os_workgroup_t workgroup = NULL;
UInt32 sSize = sizeof(os_workgroup_t);
OSStatus sStatus = AudioUnitGetProperty(mAudioUnit, kAudioOutputUnitProperty_OSWorkgroup, kAudioUnitScope_Global, 1, &workgroup, &sSize);
if (sStatus != noErr) {
    NSLog(@"Error %d", (int)sStatus);
}

The same code works fine on iOS 15.3.1 but not on macOS. Any hints to resolve this issue?
Not yet solved · 0 replies · 289 views
I'm trying to create a simple pure MIDI Audio Unit (AUv3) that could act as a pipe between, for example, an AVMusicTrack (played by an AVAudioSequencer) and an AVAudioUnitSampler. I used the default audio extension template generated by Xcode 13.2.1 and modified just a few things. My audio unit has type kAudioUnitType_MIDIProcessor (aumi); from what I read it's the right candidate and the only one I can connect to a sampler. Otherwise the app will crash with the message: Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: graphNodeSrc->IsMIDIProcessorNode()'. Maybe I'm missing something here, any suggestion? I overrode the handleMIDIEvent() function in my AU's DSP kernel just to print something when I receive MIDI events:

void handleMIDIEvent(const AUMIDIEvent &midiEvent) override {
    cout << "MIDI Event" << endl;
}

I declared MIDIOutputNames properly. My goal is to follow the MIDI playing context from this AU and edit some note messages depending on the context. My host provides AUMIDIOutputEventBlock, AUHostTransportStateBlock and AUHostMusicalContextBlock so that the AU can read the state and context and output MIDI to the host. If I make my AU a kAudioUnitType_MusicDevice (aumu), I do receive note events from the music track (even though I then cannot connect my AU to the sampler, as it's not a MIDI processor). But as a MIDI processor, I don't. Any clue why this is?
Not yet solved · 0 replies · 244 views
How can you add a live audio player in Xcode where users will have an interactive UI to control the audio, and playback keeps going even if they exit the app or turn their device's screen off? Is there a framework or API that will work for this? Thanks! Really need help with this…. 🤩 I have looked everywhere and haven't found something that works….
Not yet solved · 0 replies · 187 views
Hi, I've developed an AU plugin which has an issue when input monitoring is turned on: the output is noisy when my plugin is processing audio and input monitoring is on. There is no noise if either one of these is turned off. Any idea what changes in Logic when input monitoring is turned on that might have an effect on an AU plugin?
Not yet solved · 0 replies · 244 views
AudioComponentDescription desc = {kAudioUnitType_Output, kAudioUnitSubType_VoiceProcessingIO, kAudioUnitManufacturer_Apple, 0, 0};
AudioComponent comp = AudioComponentFindNext(NULL, &desc);
OSStatus error = AudioComponentInstanceNew(comp, &myAudioUnit);

In a special case the returned error value is -1. I searched https://www.osstatus.com/ but didn't get any relevant info. My question is: what's the meaning of -1 in this case? Is myAudioUnit a nullptr at that point?
Not yet solved · 1 reply · 283 views
I'm writing a macOS audio unit hosting app using the AVAudioUnit and AUAudioUnit APIs. I'm trying to use the NSView cacheDisplay(in:to:) function to capture an image of a plugin's view:

func viewToImage(viewToCapture: NSView) -> NSImage? {
    var image: NSImage? = nil
    if let rep = viewToCapture.bitmapImageRepForCachingDisplay(in: viewToCapture.bounds) {
        viewToCapture.cacheDisplay(in: viewToCapture.bounds, to: rep)
        image = NSImage(size: viewToCapture.bounds.size)
        image!.addRepresentation(rep)
    }
    return image
}

This works OK when a plugin is instantiated using the .loadInProcess option. If the plugin is instantiated using the .loadOutOfProcess option, the resulting bitmapImageRep is blank. I'd much rather be loading plugins out-of-process for the enhanced stability. Is there any trick I'm missing to be able to capture the contents of the NSView from an out-of-process audio unit?
Not yet solved · 0 replies · 262 views
When I make a call within our VoIP application (iPadOS app on macOS, M1 MBP 16"), all is fine. If I make a call with headphones plugged in, all is fine. If I unplug the headphones during the call, the whole audio just stops working. If I hang up and make the call again, audio is there with no problems. On iPhone and iPad it works correctly. Where could the problem be?

HALC_ShellDevice::CreateIOContextDescription: failed to get a description from the server
HALC_ProxyIOContext::IOWorkLoop: the server failed to start, Error: 0x6E6F7065
HALC_ProxyIOContext::IOWorkLoop: the server failed to start, Error: 0x6E6F7065
HALC_ProxyIOContext::IOWorkLoop: the server failed to start, Error: 0x6E6F7065
AudioObjectGetPropertyDataSize: no object with given ID 73
AudioObjectHasProperty: no object with given ID 66
AudioObjectHasProperty: no object with given ID 66
AudioObjectHasProperty: no object with given ID 0
[auvp] AUVPAggregate.cpp:4413 Failed to get current tap stream physical format, err=2003332927
AudioObjectGetPropertyDataSize: no object with given ID 66
AudioObjectGetPropertyData: no object with given ID 66
AudioObjectHasProperty: no object with given ID 66
AudioObjectRemovePropertyListener: no object with given ID 66
AudioObjectHasProperty: no object with given ID 73
AudioObjectHasProperty: no object with given ID 66
AudioObjectGetPropertyDataSize: no object with given ID 73
AudioObjectHasProperty: no object with given ID 66
AudioObjectHasProperty: no object with given ID 66
HALC_ProxySystem::GetObjectInfo: got an error from the server, Error: 560947818 (!obj)
AudioObjectHasProperty: no object with given ID 73
AudioObjectHasProperty: no object with given ID 66
HALC_ShellObject::HasProperty: there is no proxy object
AudioObjectHasProperty: no object with given ID 73
AudioObjectHasProperty: no object with given ID 73
[auvp] AUVPAggregate.cpp:6912 error 2003332927 getting input sample rate
AudioObjectHasProperty: no object with given ID 73
[auvp] AUVPAggregate.cpp:6922 error 2003332927 getting input latency
AudioObjectHasProperty: no object with given ID 73
[auvp] AUVPAggregate.cpp:6932 error 2003332927 getting input safety offset
AudioObjectHasProperty: no object with given ID 66
[auvp] AUVPAggregate.cpp:6944 error 2003332927 getting tap stream input latency
AudioObjectHasProperty: no object with given ID 66
[auvp] AUVPAggregate.cpp:6954 error 2003332927 getting tap stream input safety offset
AudioObjectHasProperty: no object with given ID 73
[auvp] AUVPAggregate.cpp:6965 error 2003332927 getting output sample rate
AudioObjectHasProperty: no object with given ID 66
[auvp] AUVPAggregate.cpp:6975 error 2003332927 getting output latency
AudioObjectHasProperty: no object with given ID 66
AudioDeviceDuck: no device with given ID
[auvp] AUVPAggregate.cpp:6985 error 2003332927 getting output safety offset
AudioObjectHasProperty: no object with given ID 73
AudioObjectHasProperty: no object with given ID 73
AudioObjectHasProperty: no object with given ID 73
AudioObjectHasProperty: no object with given ID 66
AudioObjectsPublishedAndDied: no such owning object
AudioObjectsPublishedAndDied: no such owning object
AudioObjectHasProperty: no object with given ID 73
AudioObjectHasProperty: no object with given ID 66
AudioObjectHasProperty: no object with given ID 73
[auvp] AUVPAggregate.cpp:6912 error 2003332927 getting input sample rate
AudioObjectHasProperty: no object with given ID 73
[auvp] AUVPAggregate.cpp:6922 error 2003332927 getting input latency
AudioObjectHasProperty: no object with given ID 73
[auvp] AUVPAggregate.cpp:6932 error 2003332927 getting input safety offset
AudioObjectHasProperty: no object with given ID 66
[auvp] AUVPAggregate.cpp:6944 error 2003332927 getting tap stream input latency
AudioObjectHasProperty: no object with given ID 66
[auvp] AUVPAggregate.cpp:6954 error 2003332927 getting tap stream input safety offset
AudioObjectHasProperty: no object with given ID 73
[auvp] AUVPAggregate.cpp:6965 error 2003332927 getting output sample rate
AudioObjectHasProperty: no object with given ID 66
[auvp] AUVPAggregate.cpp:6975 error 2003332927 getting output latency
AudioObjectHasProperty: no object with given ID 66
[auvp] AUVPAggregate.cpp:6985 error 2003332927 getting output safety offset
AudioObjectHasProperty: no object with given ID 73
AudioObjectHasProperty: no object with given ID 73
AudioObjectPropertiesChanged: no such object
[auvp] AUVPAggregate.cpp:2799 AggCompChanged wait failed
AudioObjectHasProperty: no object with given ID 73
AudioObjectHasProperty: no object with given ID 73
AudioObjectSetPropertyData: no object with given ID 73
[auvp] AUVPUtilities.cpp:472 SetDeviceMuteState(73) false: (err=560947818)
AudioObjectHasProperty: no object with given ID 73
AudioObjectHasProperty: no object with given ID 73
[auvp] AUVPUtilities.cpp:560 SetCFNumberValueForKeyInDescriptionDictionary(73); doesn't support 'cdes'
AudioObjectHasProperty: no object with given ID 73
AudioObjectHasProperty: no object with given ID 73
AudioObjectHasProperty: no object with given ID 73
AudioObjectHasProperty: no object with given ID 73
AudioObjectGetPropertyDataSize: no object with given ID 73
AudioObjectHasProperty: no object with given ID 66
AudioObjectHasProperty: no object with given ID 66
AudioObjectHasProperty: no object with given ID 66
[auvp] AUVPUtilities.cpp:560 SetCFNumberValueForKeyInDescriptionDictionary(66); doesn't support 'cdes'
AudioObjectHasProperty: no object with given ID 66
AudioObjectHasProperty: no object with given ID 66
[auvp] AUVPAggregate.cpp:3523 VP block error num input channels is unexpected (err=-66784)
[vp] vpStrategyManager.mm:358 Error code 2003332927 reported at GetPropertyInfo
[vp] vpStrategyManager.mm:358 Error code 2003332927 reported at GetPropertyInfo
HALC_ProxySystem::GetObjectInfo: got an error from the server, Error: 560947818 (!obj)
HALC_ShellDevice::RebuildStreamLists: there is no device
[vp] vpStrategyManager.mm:358 Error code 2003332927 reported at GetPropertyInfo
[vp] vpStrategyManager.mm:358 Error code 2003332927 reported at GetPropertyInfo
Not yet solved · 1 reply · 220 views
In our audio unit host, we're seeing the following error thrown when keypresses are dispatched to an audio unit view. Does anyone know what it means and, more importantly, what might be causing it? I can only guess that perhaps an object has been released or not retained properly somewhere, but I cannot find a solution. This is the error message when caught in the debugger: Thread 1: "assertion failed: '<AUAudioUnitRemoteViewController: 0x600002b00c60> does not conform to AUAudioUnitViewControllerHostInterface' in -[NSRemoteView ensureClientExportedObject] on line 7080 of file /AppleInternal/Library/BuildRoots/66382bca-8bca-11ec-aade-6613bcf0e2ee/Library/Caches/com.apple.xbs/Sources/ViewBridge/NSRemoteView.m"
Not yet solved · 0 replies · 181 views
Hi, When debugging an iOS AUv3 extension in GarageBand (or in other host apps) on macOS Monterey running on an M1 Mac, there is a large number of warnings that read: WARNING: SPI usage of '-[UINSWindow uiWindows]' is being shimmed. This will break in the future. Please file a radar requesting API for what you are trying to do. I noticed that immediately after such a warning, the view's hitTest is called with a UIHoverEvent, and indeed moving the mouse results in more log spam, although the very first occurrence of the warning may have a different root cause. Using a symbolic breakpoint wasn't helpful in revealing further information. Note that I'm launching the AUv3 extension in "Designed for iPad" mode, and the project language is Objective-C. I was able to reproduce the issue with even an empty project, adding nothing beyond Xcode's built-in appex template for iOS Audio Units. The issue doesn't seem to happen with the Apple sample code "CreatingCustomAudioEffects", which is a Swift project. I also compared and matched storyboard settings between that project and my own, but to no avail. Any pointers would be highly appreciated. Thanks.