AudioUnit


Create audio unit extensions and add sophisticated audio manipulation and processing capabilities to your app using AudioUnit.


Posts under AudioUnit tag

48 Posts
Post not yet marked as solved
0 Replies
324 Views
I’m developing a voice communication app for iPad with both playback and recording, using an AudioUnit of type kAudioUnitSubType_VoiceProcessingIO to get echo cancellation. When playing audio before initializing the recording audio unit, the volume is high. But if I play audio after initializing the audio unit, or after switching to RemoteIO and then back to VPIO, the playback volume is low. It seems like a bug in iOS. Is there any solution or workaround for this? Searching the net, I only found this post, without any solution: https://developer.apple.com/forums/thread/671836
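Not an answer to the volume bug itself, but for anyone reproducing it: the two I/O units being switched between differ only in their component subtype. Below is a plain-C sketch of the two configurations involved; `four_cc` and `struct io_config` are illustrative mirrors of the AudioToolbox four-character codes and AudioComponentDescription, not the real API.

```c
#include <assert.h>
#include <stdint.h>

/* Illustrative helper: build a four-character code the way the
 * AudioToolbox headers do ('vpio', 'rioc', ...). */
static uint32_t four_cc(const char s[4]) {
    return ((uint32_t)(uint8_t)s[0] << 24) | ((uint32_t)(uint8_t)s[1] << 16) |
           ((uint32_t)(uint8_t)s[2] << 8)  |  (uint32_t)(uint8_t)s[3];
}

/* Plain-C mirror of AudioComponentDescription, for illustration only. */
struct io_config {
    uint32_t type;          /* kAudioUnitType_Output: 'auou' */
    uint32_t sub_type;      /* 'vpio' or 'rioc' */
    uint32_t manufacturer;  /* kAudioUnitManufacturer_Apple: 'appl' */
};

/* VPIO inserts echo cancellation plus its own gain processing on the
 * playback path; RemoteIO is plain I/O. Switching between them means
 * disposing one unit and creating the other with a different subtype. */
static struct io_config make_vpio(void) {
    struct io_config c = { four_cc("auou"), four_cc("vpio"), four_cc("appl") };
    return c;
}
static struct io_config make_remote_io(void) {
    struct io_config c = { four_cc("auou"), four_cc("rioc"), four_cc("appl") };
    return c;
}
```

The extra gain/AGC stage in the VPIO path is where the reduced playback volume is observed, which is why the problem only appears once the voice-processing unit has been initialized.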
Post not yet marked as solved
0 Replies
233 Views
Hi, Not sure if this is by design or not, but whenever I connect a Bluetooth device while the app is running, the AU callback stops producing frames for several seconds until the device finishes connecting. I'm building a recording app that uses AVAssetWriter with fragmented segments (HLS buffers). When the callback freezes, it should create a gap in the audio, but for some reason the segment that is created does not contain an audio gap; the audio just "jumps" in timestamps.
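If the callback really does stall, one hedged workaround is to measure the skipped interval from the render timestamps and write that much silence before the next buffer, so the written segments stay contiguous. A framework-free sketch; the function name and the idea of tracking the expected next sample time yourself are mine, not AVFoundation API:

```c
#include <assert.h>

/* Sketch: derive how many frames of silence a stalled render callback
 * skipped. In the real callback, buffer_start_sample_time would come
 * from the AudioTimeStamp passed to the render proc, and
 * expected_next_sample_time is the app's own running counter
 * (previous start time + previous frame count). */
static long missing_frames(double expected_next_sample_time,
                           double buffer_start_sample_time) {
    double gap = buffer_start_sample_time - expected_next_sample_time;
    /* A positive gap means the callback stalled (e.g. while the
     * Bluetooth route was being established); appending that many
     * silent frames keeps the AVAssetWriter segment timestamps
     * contiguous instead of "jumping". */
    return gap > 0 ? (long)gap : 0;
}
```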
Post not yet marked as solved
0 Replies
185 Views
I have an audio unit whose parameters I can retrieve and save in Logic Pro X. The values are boolean, float and integer. What I want is to be able to save a string value, just like Logic Pro X does with their MIDI-FX plugin Scripter. Is this possible?
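As far as I know, the usual route is not a string-typed parameter but the preset state itself: AU state is a property-list dictionary (kAudioUnitProperty_ClassInfo for AUv2, the fullState property for AUv3), and plists can hold strings as well as numbers. A framework-free C model of the idea, with illustrative names (`preset_state`, `script`):

```c
#include <assert.h>
#include <string.h>

/* Sketch: the string travels alongside the numeric parameters in the
 * saved state, rather than being a parameter itself. This struct
 * stands in for the plist dictionary the host persists. */
struct preset_state {
    float gain;        /* an ordinary float parameter   */
    int   bypass;      /* an ordinary boolean parameter */
    char  script[256]; /* the string you want persisted */
};

static void state_set_script(struct preset_state *s, const char *text) {
    /* Bounded copy, always NUL-terminated. */
    strncpy(s->script, text, sizeof s->script - 1);
    s->script[sizeof s->script - 1] = '\0';
}
```

This is presumably also how Scripter keeps its script text with the channel strip setting, though I can't confirm Logic's internals.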
Post not yet marked as solved
2 Replies
493 Views
Hi, update completed today and the Magic Mouse is no longer scrolling. The Accessibility, Sound and Keyboard panes in Preferences can't be accessed. AirPods Pro can't be connected, nor can any other audio device. Photos is not syncing. Keyboard audio regulation is switched off unless on a wire. Really? Yes, nothing changed with 12.1.
Post not yet marked as solved
0 Replies
369 Views
Hi, I have a problem with an AU host (based on Audio Toolbox/Core Audio, not AVFoundation) when running on macOS 11 or later on Apple Silicon: it crashes after some operations in the GUI. The weird thing is, it crashes in the IOThread. Could this be caused by some inappropriate operation in the GUI (e.g. outside the main thread) that affects the IOThread? Sounds quite improbable to me, and I did not find anything suspicious in the code. There are two logs in the debugger:

    [AUHostingService Client] connection interrupted.
    rt_sender::signal_wait failed: 89
    ...

And here is the crash log:

    Crash log: ...

Thanks, Tomas
Post not yet marked as solved
3 Replies
857 Views
Hello, my app crashed on the new macOS 12.x system; it works well on macOS 11 Big Sur. I'm developing an audio app on macOS using AudioUnit, and it sometimes crashes when I switch devices. The relevant API is:

    AudioUnitSetProperty(audio_unit, kAudioOutputUnitProperty_CurrentDevice,
                         kAudioUnitScope_Global, kAudioUnitOutputBus,
                         &rnd_id, sizeof(rnd_id));

This has troubled me for a month; I can't find the reason or any useful info. Any help will be appreciated. The crash log is:

    OS Version:        macOS 12.1 (21C51)
    Report Version:    12
    Bridge OS Version: 6.1 (19P647)
    Crashed Thread:    43 schedule-thread

    Exception Type:    EXC_BAD_ACCESS (SIGSEGV)
    Exception Codes:   KERN_INVALID_ADDRESS at 0x00000a14f8969188
    Exception Codes:   0x0000000000000001, 0x00000a14f8969188
    Exception Note:    EXC_CORPSE_NOTIFY

    Application Specific Information:
    objc_msgSend() selector name: copy

    Thread 43 Crashed:: schedule-thread
    0  libobjc.A.dylib   0x7ff815ef405d objc_msgSend + 29
    1  CoreAudio         0x7ff817a237b9 HALC_ShellDevice::_GetPropertyData(unsigned int, AudioObjectPropertyAddress const&, unsigned int, void const*, unsigned int, unsigned int&, void*, unsigned int&, AudioObjectPropertyAddress&, bool&) const + 1133
    2  CoreAudio         0x7ff817c57b81 invocation function for block in HALC_ShellObject::GetPropertyData(unsigned int, AudioObjectPropertyAddress const&, unsigned int, void const*, unsigned int, unsigned int&, void*) const + 107
    3  CoreAudio         0x7ff817e8a606 HALB_CommandGate::ExecuteCommand(void () block_pointer) const + 98
    4  CoreAudio         0x7ff817c56a98 HALC_ShellObject::GetPropertyData(unsigned int, AudioObjectPropertyAddress const&, unsigned int, void const*, unsigned int, unsigned int&, void*) const + 376
    5  CoreAudio         0x7ff817b04235 HAL_HardwarePlugIn_ObjectGetPropertyData(AudioHardwarePlugInInterface**, unsigned int, AudioObjectPropertyAddress const*, unsigned int, void const*, unsigned int*, void*) + 349
    6  CoreAudio         0x7ff817c16109 HALPlugIn::ObjectGetPropertyData(HALObject const&, AudioObjectPropertyAddress const&, unsigned int, void const*, unsigned int&, void*) const + 59
    7  CoreAudio         0x7ff817bd2f5d HALObject::GetPropertyData(AudioObjectPropertyAddress const&, unsigned int, void const*, unsigned int&, void*) const + 461
    8  CoreAudio         0x7ff817f2ffca HALDevice::GetPropertyData(AudioObjectPropertyAddress const&, unsigned int, void const*, unsigned int&, void*) const + 644
    9  CoreAudio         0x7ff8179809ab AudioObjectGetPropertyData + 275
    10 AudioDSP          0x134b6df19 0x13490b000 + 2502425
    11 AudioDSP          0x134b69776 0x13490b000 + 2484086
    12 AudioDSP          0x134cc4eb4 0x13490b000 + 3907252
    13 AudioDSP          0x134cc5f56 0x13490b000 + 3911510
    14 AudioDSP          0x134e17b2c 0x13490b000 + 5294892
    15 AudioDSP          0x134e0f4d1 0x13490b000 + 5260497
Post not yet marked as solved
1 Replies
389 Views
Hi, I've released an open-source AUv3 MIDI processor plugin for iOS and macOS that records and plays MIDI messages in a sample-accurate fashion and never applies any quantization. I tested this plugin with 120 beta testers and everything seemed to work fine. However, now that I've released it, there seems to be a problem in Logic Pro X on some Mac computers with MIDI FX processor plugins that are using Catalyst. You can find my plugin here: http://uwyn.com/mtr/ ... and the source code here: https://github.com/gbevin/MIDITapeRecorder When I trace the AUv3 instantiation, I see Logic Pro X obtaining the internalRenderBlock several times, but then never calling it. This means there's no render callback and no MIDI parameter events are ever received. I've talked to the developer of ZOA, which is also a MIDI processor plugin using Catalyst, and he's running into exactly the same problem: https://www.audiosymmetric.com/zoa.html Another developer that's working on a MIDI processor plugin has also been trying to track this down for weeks. When I test this on my M1 Max MacBook Pro, internalRenderBlock is always called; however, on my M1 MacBook Air and Intel 2019 MacBook Pro it is never called. Any thoughts or ideas to work around this would be really helpful. Thanks!
Post not yet marked as solved
1 Replies
453 Views
Curious if there is a sound way for an AUv3 component to identify how many other instances of it are running on a device. For instance, if GarageBand has 4 tracks and all of the tracks use the same AUv3 component, is there a reliable way for each one to obtain a unique index value? Thanks!
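As far as I know there is no documented API for counting sibling instances across processes, but instances that end up in the same extension process can hand out unique indices from shared state. A hedged plain-C sketch, with illustrative names:

```c
#include <assert.h>
#include <pthread.h>

/* Process-wide shared state: every instance loaded into this process
 * claims the next index under a lock. */
static pthread_mutex_t g_instance_lock = PTHREAD_MUTEX_INITIALIZER;
static int g_next_instance_index = 0;

/* Call once per instance (e.g. at initialization) to get a unique,
 * monotonically increasing index within this process. */
static int claim_instance_index(void) {
    pthread_mutex_lock(&g_instance_lock);
    int idx = g_next_instance_index++;
    pthread_mutex_unlock(&g_instance_lock);
    return idx;
}
```

Caveat: hosts are free to load each instance into a separate extension process, in which case every instance would see index 0, so this can only be best-effort.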
Post not yet marked as solved
0 Replies
405 Views
Hi, wondering if anyone has found a solution to the automatic volume reduction on the host computer when using the native macOS Screen Sharing application. The volume reduction makes it nearly impossible to comfortably continue working on the host computer when there is any audio involved. Is there a way to bypass this function? It seems to be the same native function that FaceTime uses to reduce the system audio volume to give priority to the application. Please help save my speakers! Thanks.
Post not yet marked as solved
0 Replies
279 Views
I created a multi-timbral instrument application based on multiple AVAudioUnitSampler instances (one per MIDI channel), wrapped in a custom AVSampler class. I want to expose it also as an AUv3. I followed some articles and samples: I put the view controller and other classes in a framework target, and created an AudioUnit extension target (with a dummy/empty class file, as I have no implementation to provide). In the extension's Info.plist (NSExtensionAttributes) I added AudioComponentBundle (points to the AUFramework) and an AudioComponents item with factoryFunction (points to $(PRODUCT_MODULE_NAME).MultiSamplerViewController), type aumu. I also added NSExtensionPrincipalClass, pointing to AUFramework.MultiSamplerViewController. In the shared MultiSamplerViewController I implemented:

    - (AUAudioUnit *)createAudioUnitWithComponentDescription:(AudioComponentDescription)desc error:(NSError **)error {
        return [[[multiSampler engine] outputNode] AUAudioUnit];
    }

It also contains an - (id)initWithCoder:(NSCoder *)decoder method that instantiates the wrapping MultiSampler and starts an enclosed MidiManager. The host application target runs fine; however, the AU extension plugin isn't listed in GarageBand (even after running the host application once). The target platform is iPad. I added code to load the appex plugin bundle, but it doesn't seem to be enough to register the plugin. Also, I cannot use AUAudioUnit's registerSubclass, as I have no concrete AU implementation class (could I pass [[[multiSampler engine] outputNode] AUAudioUnit]?). I'm in the same configuration as an application built on the AudioKit framework (which originally wrapped AVAudioUnitSampler and now uses a custom implementation).
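For comparison, a typical AUv3 extension Info.plist fragment looks roughly like the following. All identifier values here are placeholders, and the exact keys should be checked against a working Apple sample; in particular NSExtensionPointIdentifier must select the UI variant of the extension point when the principal class is a view controller:

```xml
<key>NSExtension</key>
<dict>
  <key>NSExtensionAttributes</key>
  <dict>
    <key>AudioComponentBundle</key>
    <string>com.example.AUFramework</string> <!-- placeholder -->
    <key>AudioComponents</key>
    <array>
      <dict>
        <key>type</key><string>aumu</string>
        <key>subtype</key><string>msmp</string>       <!-- placeholder -->
        <key>manufacturer</key><string>Demo</string>  <!-- placeholder -->
        <key>name</key><string>Demo: MultiSampler</string>
        <key>description</key><string>Multi-timbral sampler</string>
        <key>version</key><integer>1</integer>
        <key>factoryFunction</key>
        <string>$(PRODUCT_MODULE_NAME).MultiSamplerViewController</string>
        <key>tags</key><array><string>Instrument</string></array>
      </dict>
    </array>
  </dict>
  <key>NSExtensionPointIdentifier</key>
  <string>com.apple.AudioUnit-UI</string>
  <key>NSExtensionPrincipalClass</key>
  <string>AUFramework.MultiSamplerViewController</string>
</dict>
```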
Post not yet marked as solved
1 Replies
1.6k Views
I am a developer at Tencent. We found that after AirPods upgraded to the new firmware version 4A400, some AirPods microphones may produce abnormal sound, especially on iPhones running iOS 13. Specifically: the sound captured by the AirPods microphones intermittently exhibits noise, broken sound, pitch shift and tremor, and clarity and intelligibility are poor. Our users have reported many times that when using Tencent Meeting for video calls, the others can't hear their voice, so we tested many iPhones and AirPods and found that they all have some problems. Our specific test results:

1. iPhone 11 Pro Max / iOS 13.6.1 / AirPods 2: the sound periodically exhibits noise and tremor, about every 20 s; it sounds uncomfortable, and the effect is the same when testing with the system phone app, FaceTime, WeChat calls and Zoom.
2. Same iPhone as (1), with AirPods Pro instead: test results exactly the same as (1).
3. Same AirPods Pro as (2), with an iPhone X / iOS 13.6: occasional discontinuities in the sound, and crackling noises can be heard.
4. Same AirPods Pro as (2), with an iPhone Xs / iOS 13.7: continuous noise and pitch shifting at the beginning; after a few minutes of speaking it returned to normal, and the sound then remained normal.
5. Same AirPods Pro as (2), with an iPhone 12 Pro Max / iOS 15.0.2: the sound is completely normal.

Why do the same AirPods perform so differently on different iPhones, with the same phenomenon across various VoIP apps, and why do all the problems appear on iOS 13? Does the 4A400 firmware have compatibility issues with iOS 13? We noticed that the previous AirPods hardware sampling rate was 16 kHz, but after upgrading to 4A400 the hardware sampling rate changed to 24 kHz. Is the above noise related to this change of hardware sampling rate? Do I need to modify the Audio Unit parameters to solve these problems? Our app has a very large base of personal and corporate users; when they find a problem with the sound they give us feedback, which puts more pressure on us. We hope to get a reply from Apple or other developers. Thank you!
Post not yet marked as solved
5 Replies
468 Views
I have been trying for the last day and a half to get an AU .component bundle notarised, with no success. The bundle in question is an AU implementation of a .vst3 plugin; I also have a bundle for the VST3 version, which notarises without any issue. The AU implementation is achieved with the Steinberg AU wrapper library, which implements an AU (v2) component that loads the VST3 plugin implementation contained in a .vst3 bundle under the Resources folder of the AU component bundle. I code sign and package the bundle with:

    codesign --force --options runtime --timestamp --sign "Developer ID Application: ***" HBDynamicsAU.component/Contents/Resources/plugin.vst3/Contents/MacOS/HBDynamicsVST
    codesign --force --options runtime --timestamp --sign "Developer ID Application: ***" HBDynamicsAU.component/Contents/MacOS/HBDynamicsAU
    codesign --force --timestamp --sign "Developer ID Application: ***" HBDynamicsAU.component
    productbuild --component HBDynamicsAU.component /Library/Audio/Plug-Ins/Components --root ./presets/AU /Library/Audio/Presets/HarBal/Dynamics/ --timestamp --sign "Developer ID Installer: ***" HBDynamicsAU-1.0.2.intel.64.pkg

Checking the output of the signing process with:

    pkgutil --check-signature HBDynamicsAU-1.0.2.intel.64.pkg

yields:

    Package "HBDynamicsAU-1.0.2.intel.64.pkg":
    Status: signed by a certificate trusted by Mac OS X
    Certificate Chain:
    1. Developer ID Installer: Paavo Jumppanen (***)
       SHA1 fingerprint: B8 BD FF DC 43 1A 6B 25 BE 39 21 F2 B5 D1 3F C2 D7 B6 0B 1F
    2. Developer ID Certification Authority
       SHA1 fingerprint: 3B 16 6C 3B 7D C4 B7 51 C9 FE 2A FA B9 13 56 41 E3 88 E1 86
    3. Apple Root CA
       SHA1 fingerprint: 61 1E 5B 66 2C 59 3A 08 FF 58 D1 4A E2 24 52 D1 98 DF 6C 60

which looks fine to me. Also:

    spctl -a -vvv -t install HBDynamicsAU-1.0.2.intel.64.pkg

yields:

    HBDynamicsAU-1.0.2.intel.64.pkg: accepted
    source=Developer ID
    origin=Developer ID Installer: ***

which again looks fine. Now if I notarise it, I get the following outcome in the log:

    logFormatVersion  1
    jobId             "9127060e-ea60-4044-82e8-ba6a7cd234c6"
    status            "Invalid"
    statusSummary     "Archive contains critical validation errors"
    statusCode        4000
    archiveFilename   "HBDynamicsAU-1.0.2.intel.64.pkg"
    uploadDate        "2021-10-04T03:46:59Z"
    sha256            "17dc1ba78e55349501913ee31648a49850aa996d0c822131cf7625096f5d827c"
    ticketContents    null
    issues
      0
        severity      "error"
        code          null
        path          "HBDynamicsAU-1.0.2.intel.64.pkg/com.har_bal.HarBal.dynamics_1.0.2.au.pkg Contents/Payload/Library/Audio/Plug-Ins/Components/HBDynamicsAU.component/Contents/MacOS/HBDynamicsAU"
        message       "The signature of the binary is invalid."
        docUrl        null
        architecture  "x86_64"

The file it is complaining about is the AU wrapper dynamic library, HBDynamicsAU. If I now check that in my bundle (i.e. before it was packaged), I get:

    codesign --verify --verbose -r- HBDynamicsAU.component/Contents/MacOS/HBDynamicsAU
    HBDynamicsAU.component/Contents/MacOS/HBDynamicsAU: valid on disk
    HBDynamicsAU.component/Contents/MacOS/HBDynamicsAU: satisfies its Designated Requirement

so it looks OK. To check that nothing is wrong with the package, I installed the plugin by double clicking the .pkg file in Finder, then ran the codesign check on the installed plugin:

    codesign --verify --verbose -r- /Library/Audio/Plug-Ins/Components/HBDynamicsAU.component/Contents/MacOS/HBDynamicsAU
    /Library/Audio/Plug-Ins/Components/HBDynamicsAU.component/Contents/MacOS/HBDynamicsAU: valid on disk
    /Library/Audio/Plug-Ins/Components/HBDynamicsAU.component/Contents/MacOS/HBDynamicsAU: satisfies its Designated Requirement

which looks perfectly fine as well. So what gives? Why does it fail notarisation when every check you are told to run to debug issues says it is OK? This looks like a bug in the process, but I have no idea. The limited information returned through the log is not enough to isolate the problem. How do I fix this, and indeed, can I fix this?

regards, Paavo.

PS: To check that it wasn't a problem with the actual dylib, I swapped the libraries around in the package so the VST3 one was where the AU one was and vice versa, and ran the notarisation again. It again said the library under HBDynamicsAU.component/Contents/MacOS was the problem (which is now the VST one, because I swapped them), so it clearly has nothing to do with the library itself.
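One thing that may be worth checking, offered as a guess rather than a diagnosis: signing a bundle path re-signs the bundle's main executable, so a final codesign pass on HBDynamicsAU.component that omits --options runtime replaces the hardened-runtime signature the earlier pass put on HBDynamicsAU, and notarisation requires the hardened runtime on executables. A sketch of signing with hardening kept on every pass:

```shell
# Sketch: keep --options runtime on every pass, including the final
# bundle signing, since signing the bundle re-signs its main executable.
codesign --force --options runtime --timestamp \
    --sign "Developer ID Application: ***" \
    HBDynamicsAU.component/Contents/Resources/plugin.vst3/Contents/MacOS/HBDynamicsVST
codesign --force --options runtime --timestamp \
    --sign "Developer ID Application: ***" \
    HBDynamicsAU.component
# Strict local verification before packaging:
codesign --verify --strict --verbose=2 HBDynamicsAU.component
```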
Post not yet marked as solved
0 Replies
481 Views
The voice chat in our game has the following usage scenario: when the player is not wearing a headset or Bluetooth device, we use VPIO mode to turn on the microphone; conversely, when the player is wearing a headset or Bluetooth device, RemoteIO mode is used. So we switch between VPIO and RemoteIO.

When we only use RemoteIO (the player has been wearing headphones or Bluetooth since the game started), the output volume does not change before and after the microphone is switched on. When we only use VPIO mode (the player has not been wearing headphones or Bluetooth since the game started), the output volume drops a little after turning on the microphone (because turning on the microphone puts the phone into VPIO and echo cancellation is enabled). The output volume in both of these scenarios is normal.

However, when we switch between VPIO and RemoteIO (the player didn't wear headphones or Bluetooth at the beginning, and put them on in the middle of the game), we encounter a problem: as long as VPIO mode has been used before RemoteIO mode, the output volume in RemoteIO mode after turning on the microphone is the same as in VPIO mode (normally the output volume in RemoteIO should be greater than in VPIO mode); turning the microphone off makes the output volume normal again.

What confuses me: my understanding is that with RemoteIO the phone should not apply any suppression-like speech algorithms, so when we only use RemoteIO the output volume does not change. With VPIO, the phone applies echo cancellation, and perhaps dynamic compression and gain processing; the output volume is reduced to better handle the echo. That behavior is normal. However, when I switch between VPIO and RemoteIO, it seems that even after the VPIO resources are released, some of the VPIO processing is still retained (perhaps dynamic compression or a gain algorithm), so the output volume under RemoteIO ends up the same as under VPIO. This only happens on iOS 14; previous versions are normal (whenever you enter RemoteIO mode, the volume does not change).

I want to know: on iOS 14, is this behavior (switching between VPIO and RemoteIO causing the RemoteIO volume to decrease) normal? If it is not normal, how can we solve it?
Post not yet marked as solved
0 Replies
329 Views
When I run my AUv3 synth inside a host on an iPad, under certain conditions I receive repeated MIDI events. I'm still figuring out what the exact trigger is; it could be a CPU overload, not sure yet. Thought I'd ask here if anyone else has ideas as to what might be going on. After inspecting the passed-in variable AURenderEvent* realtimeEventListHead, it looks like it contains MIDI events that were already handled in previous callbacks. These MIDI events have timestamps that are older than the passed-in AudioTimeStamp* timestamp. I sometimes receive the same events about 8 times in a row, i.e. in 8 render callbacks, and these events all have the same timestamp. So I'm not sure why I'm receiving them again. Could the system be assuming they weren't handled and be sending them again? I'm on iOS 15. (Not sure if this also happens on iOS 14; I don't have an iOS 14 device to test on.) I reproduced this issue both in AUM and GarageBand.
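Until the root cause is clear, a defensive filter that ignores events at or before the last handled sample time would at least stop the double-triggering. A framework-free sketch with illustrative names; in the real render block the time would be the AUEventSampleTime on each AURenderEvent:

```c
#include <assert.h>
#include <stdint.h>

/* Tracks the newest event sample time this instance has processed. */
struct midi_filter { int64_t last_handled; };

#define MIDI_FILTER_INIT { INT64_MIN }

/* Returns 1 if the event is new, 0 if it repeats an earlier callback.
 * Caution: distinct events (e.g. the notes of a chord) can legitimately
 * share a timestamp, so real code should compare event contents too. */
static int midi_filter_should_handle(struct midi_filter *f, int64_t t) {
    if (t <= f->last_handled) return 0;
    f->last_handled = t;
    return 1;
}
```

Because simultaneous events share a timestamp, production code would only arm a filter like this once a repeat has actually been detected, or key it on the full event payload rather than the time alone.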
Post not yet marked as solved
0 Replies
299 Views
My company develops an audio plugin that downloads a dylib file from our server in order to ensure that the plugin is always up-to-date. The dylib file is signed and hardened, and this process has worked correctly for years. We've installed the OSX 12 Beta, and the software continues to work properly, except in GarageBand. GarageBand refuses to load the dylib, saying the file "cannot be opened because Apple cannot check it for malicious software." This dylib file cannot be installed via a stapled package. Is there a mechanism by which we can staple an individual dylib file? We've tried adding the dylib to a .zip file and uploading it to the notarization service. The file is approved, but the staple tool apparently won't staple a single dylib file, expecting it instead to be part of a package. Is there any way around this? Will GarageBand for OSX 12 be repaired? The plugin works correctly in Logic on OSX 12. Thanks!
Post marked as solved
1 Replies
605 Views
Hey folks, I've been able to build and run the 'StarterAudioUnitExample' project provided by Apple in Xcode 12.5.1, and run and load it in Logic 10.6.3 on macOS 11.5.2. However, when trying to recreate the same plugin from a blank project, I'm having trouble with auval or Logic actually instantiating and loading the component. See the auval output below:

    validating Audio Unit Tremelo AUv2 by DAVE:

    AU Validation Tool
    Version: 1.8.0
    Copyright 2003-2019, Apple Inc. All Rights Reserved.
    Specify -h (-help) for command options

    VALIDATING AUDIO UNIT: 'aufx' - 'trem' - 'DAVE'

    Manufacturer String: DAVE
    AudioUnit Name: Tremelo AUv2
    Component Version: 1.0.0 (0x10000)
    PASS

    TESTING OPEN TIMES:
    COLD:
    FATAL ERROR: OpenAComponent: result: -1,0xFFFFFFFF

    validation result: couldn’t be opened

Does anyone, hopefully someone from Apple, know what the error code <FATAL ERROR: OpenAComponent: result: -1,0xFFFFFFFF> actually refers to? I've been Googling for hours, and nothing I have found has worked for me so far. Also, here is the Info.plist file too: Anyone that could help steer me in the right direction? Thanks, Dave
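For debugging problems like this, re-running the validation directly and listing what the system has actually registered can help narrow down whether it is a registration issue or an open-time failure. Commands as I understand auval's usage:

```shell
# Validate the specific component verbosely:
auval -v aufx trem DAVE
# List every registered Audio Unit, to confirm the component's
# registration is visible to the system at all:
auval -a
```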
Post not yet marked as solved
0 Replies
332 Views
I have the bare bones of an AUv3 plug-in. I have used UIHostingController to allow me to add a SwiftUI view, and from the SwiftUI view I have added a SpriteKit scene which I want to use to start building an interface for the AU. It's a very basic GameScene for now. If I run one instance of it in Logic, it's OK. If I add another instance, it crashes with a Thread 1: EXC_BAD_INSTRUCTION (code=EXC_I386_INVOP, subcode=0x0) error. This seems to come down to how I declare a SpriteNode variable. If I declare it within the GameScene as a "let", then it's fine. If I declare it outside the class (see the commented-out code), then it will crash with two instances. It would be useful to me later on for this to be a public variable, as I want to perform functions on it outside of the class. Is there a better way to declare the SpriteNode variable to make it stable?

    import SpriteKit

    //public var ball = SKShapeNode(circleOfRadius: 30)
    var ball = SKShapeNode(circleOfRadius: 30)

    class GameScene: SKScene {

        let ball = SKShapeNode(circleOfRadius: 30)

        override func didMove(to view: SKView) {
            //physicsBody = SKPhysicsBody(edgeLoopFrom: frame)
            ball.position = CGPoint(x: 200, y: 200)
            ball.fillColor = .lightGray
            self.addChild(ball)
        }
    }
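The crash pattern is consistent with shared mutable state: a file-scope variable is one object shared by every plugin instance loaded into the host process. A framework-free C sketch of the distinction, where `node` and `scene` merely stand in for SKShapeNode and GameScene:

```c
#include <assert.h>

struct node { double x, y; };

/* Shared by ALL instances in the process -- the problematic pattern:
 * two plugin instances end up mutating (and re-parenting) one node. */
static struct node g_ball;

/* Per-instance state -- the safe pattern: each scene owns its node. */
struct scene { struct node ball; };

static void scene_layout(struct scene *s) {
    s->ball.x = 200;
    s->ball.y = 200;
}
```

Declaring the node as an instance property (Swift `let ball` inside the class, marked `public` if needed) keeps one node per scene while still letting code outside the class reach it through a scene reference, which covers the "perform functions on it outside the class" requirement without the shared global.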
Post marked as solved
1 Replies
532 Views
I have just begun building plugins for Logic using the AUv3 format. After a few teething problems, to say the least, I have a basic plug-in working (integrated with SwiftUI, which is handy), but the install and validation process is still buggy and bugging me! Does the standalone app which Xcode generates to create the plugin always have to be running separately from Logic for the AUv3 to be available? Is there no way to have it as a permanently available plugin without running that? If anyone has any links to a decent tutorial, please share; there are very few I can find on YouTube or anywhere else, and the Apple tutorials and examples aren't great.
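In my understanding, the containing app only needs to be launched once so the system registers the embedded .appex; it does not have to keep running while Logic uses the plugin. On macOS, the registration can be inspected (and nudged) with pluginkit; the .appex path below is a placeholder:

```shell
# List registered Audio Unit UI extensions:
pluginkit -m -p com.apple.AudioUnit-UI
# Force-register a specific extension (placeholder path):
pluginkit -a /path/to/MyAUv3.appex
```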
Post not yet marked as solved
0 Replies
412 Views
AudioUnit is kAudioUnitSubType_VoiceProcessingIO.

    ExceptionHandling        0x7fff3bde6f31 -[NSExceptionHandler _handleException:mask:] + 364
    ExceptionHandling        0x7fff3bde6cac NSExceptionHandlerUncaughtSignalHandler + 35
    libsystem_platform.dylib 0x7fff203bad7d _sigtramp + 29
    0x0000000000000000       0x7000043de580 0x0 + 123145373476224
    CoreAudio                0x7fff220a1ee9 _ZN9HALDevice4DuckEfPK14AudioTimeStampf + 921
    CoreAudio                0x7fff21c08598 AudioDeviceDuck + 843
    AudioDSP                 0x13ed3161f _Z14DuckOtherAudiojff + 51
    AudioDSP                 0x13ee64a3b _ZN16AUVoiceProcessor22DestroyAggregateDeviceEv + 829
    AudioDSP                 0x13ee65d59 _ZN16AUVoiceProcessorD2Ev + 415
    AudioDSP                 0x13ef43342 _ZN13ComponentBase8AP_CloseEPv + 30
    AudioToolboxCore         0x7fff217b5c8c _ZN19APComponentInstance15disposeInstanceEv + 40
    AudioToolboxCore         0x7fff218b92ef AudioComponentInstanceDispose + 40