Core Audio Kit


Add user interfaces to Cocoa audio units using Core Audio Kit.


Posts under Core Audio Kit tag

11 Posts
Post not yet marked as solved
0 Replies
328 Views
I have a music player that can save and restore AU parameters using the kAudioUnitProperty_ClassInfo property. For non-Apple AUs this works fine. But for any of the Apple units, the class info can be set only the first time after the audio graph is built. Subsequent sets of the property do not stick, even though the OSStatus code is 0 upon return. This previously worked fine, but at some point (I'm not sure when) the Apple-provided AUs changed their behavior, and this is now causing me problems. Can anyone help shed light on this? Thanks in advance for the help. Jeff Frey
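For readers unfamiliar with this pattern, a minimal sketch of saving and restoring state via kAudioUnitProperty_ClassInfo follows. This is an illustration, not the poster's code; `unit` is assumed to be a valid, initialized AudioUnit from an AUGraph, and error handling is reduced to status checks.

```swift
import AudioToolbox

// Sketch: capture an Audio Unit's full state as a property list.
func saveClassInfo(of unit: AudioUnit) -> CFPropertyList? {
    var classInfo: CFPropertyList?
    var size = UInt32(MemoryLayout<CFPropertyList?>.size)
    let status = AudioUnitGetProperty(unit,
                                      kAudioUnitProperty_ClassInfo,
                                      kAudioUnitScope_Global,
                                      0,
                                      &classInfo,
                                      &size)
    return status == noErr ? classInfo : nil
}

// Sketch: push a previously saved property list back into the unit.
// The poster reports this returning noErr yet not taking effect on
// Apple-provided units after the first set.
func restoreClassInfo(_ info: CFPropertyList, to unit: AudioUnit) -> OSStatus {
    var info: CFPropertyList? = info
    return AudioUnitSetProperty(unit,
                                kAudioUnitProperty_ClassInfo,
                                kAudioUnitScope_Global,
                                0,
                                &info,
                                UInt32(MemoryLayout<CFPropertyList?>.size))
}
```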
Post not yet marked as solved
2 Replies
651 Views
I have some visualisation-heavy AUv3s, and the goal is not to perform graphics-intensive tasks while the plugin window is not open inside the host app (such as Logic Pro). On iOS this is easily accomplished with the viewWillAppear and related overrides. But on macOS, these overrides do not seem to be called every time the user opens or closes the plugin window in the host application. I did try some alternate methods, like traversing the view/controller hierarchy or making use of the window property, to no avail. What substitute mechanism could I use to determine the visibility status of an AUv3 on macOS? Thanks in advance, Zoltan
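One possible workaround (a sketch only, not a confirmed fix) is to track visibility from the AU view itself rather than the view controller: `viewDidMoveToWindow` fires when the host attaches or detaches the view, and window occlusion notifications cover the window being hidden. The `visibilityChanged` callback name is an assumption for illustration.

```swift
import AppKit

// Sketch: an AUv3 view that reports its own visibility on macOS,
// since NSViewController appearance callbacks may not fire reliably
// when hosted out of process.
final class VisibilityTrackingView: NSView {
    var visibilityChanged: ((Bool) -> Void)?
    private var occlusionObserver: NSObjectProtocol?

    override func viewDidMoveToWindow() {
        super.viewDidMoveToWindow()
        if let observer = occlusionObserver {
            NotificationCenter.default.removeObserver(observer)
            occlusionObserver = nil
        }
        guard let window = self.window else {
            visibilityChanged?(false)   // detached from any window: treat as hidden
            return
        }
        visibilityChanged?(window.occlusionState.contains(.visible))
        occlusionObserver = NotificationCenter.default.addObserver(
            forName: NSWindow.didChangeOcclusionStateNotification,
            object: window,
            queue: .main) { [weak self] note in
                guard let window = note.object as? NSWindow else { return }
                self?.visibilityChanged?(window.occlusionState.contains(.visible))
            }
    }
}
```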
Post not yet marked as solved
1 Reply
477 Views
I'm trying to build an audio unit that can load another one. The first step, listing and getting the components, only works in the example code when the audio unit is loaded by its accompanying main app. But when loaded inside Logic Pro, for example, the listed components are limited to Apple-manufactured ones. On another forum, although not for an app extension, someone indicated that the solution was to enable Inter-App Audio, which is now deprecated. I have tried all three component-listing methods of AVAudioUnitComponentManager. Please advise. Many thanks, Zoltan
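For context, the three AVAudioUnitComponentManager query styles the post refers to can be sketched as below (an illustration, not the poster's code); whichever is used, the results are subject to the extension's sandbox context, which appears to be the crux of the question.

```swift
import AVFoundation

let manager = AVAudioUnitComponentManager.shared()

// 1. Query by AudioComponentDescription (zeroed fields act as wildcards).
var anyEffect = AudioComponentDescription()
anyEffect.componentType = kAudioUnitType_Effect
let byDescription = manager.components(matching: anyEffect)

// 2. Query by NSPredicate over AVAudioUnitComponent properties.
let byPredicate = manager.components(
    matching: NSPredicate(format: "typeName == %@", AVAudioUnitTypeEffect))

// 3. Query with a test block.
let byBlock = manager.components { component, _ in
    component.typeName == AVAudioUnitTypeEffect
}

for component in byDescription {
    print(component.manufacturerName, component.name)
}
```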
Post not yet marked as solved
0 Replies
474 Views
Some users reported issues when using our audio units with recent versions of Logic Pro. After investigation, it seems there is an issue where a modal dialog won't appear if the view is not open in Logic with AUHostingServices. Trying with a JUCE example plugin on the latest version, it also fails to run a modal while the view is not open. The system call results in an NSModalResponseAbort return code. I'm nearly sure this is a recent regression in AUHostingServices, but maybe it has always been there. Could you help us find a solution, or at least a workaround? Thanks in advance, Best regards
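One avenue worth exploring (a hedged sketch, not a verified workaround) is to avoid app-modal presentation entirely and attach the alert as a sheet to the plug-in view's window when one exists, deferring it otherwise. Whether this sidesteps the reported NSModalResponseAbort behavior under AUHostingServices is untested here.

```swift
import AppKit

// Sketch: prefer a window-attached sheet over an app-modal runModal() call.
func presentAlert(from view: NSView, message: String) {
    let alert = NSAlert()
    alert.messageText = message
    if let window = view.window {
        // Sheet presentation is scoped to the host-provided window.
        alert.beginSheetModal(for: window) { _ in }
    } else {
        // No window means the AU view is not open in the host; defer the
        // alert until the view is attached rather than going app-modal.
    }
}
```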
Post not yet marked as solved
0 Replies
973 Views
I have an app that supports AUv3 and can output both audio and MIDI information. It has the Audio Unit type set to "aumu", which is a 'music generator'. This has worked well in iOS hosts until now (like AUM or apeMatrix), because they allow the selection of this type of Audio Unit both as an audio generator and as a MIDI sender. So someone who wants to use it as an audio generator can, same as someone who only wants to use it as a MIDI FX. Logic Pro on the iPad has changed this, since it does not show the unit under the MIDI FX selection. Is there any configuration to bypass this, or do I have to change to a MIDI FX AUv3 ('aumi') for it to show up there?
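For reference, the component type being discussed lives in the extension's Info.plist under NSExtensionAttributes, and it is the value hosts filter on. A hypothetical fragment (all names and four-character codes here are placeholders, not the poster's actual values) showing where "aumu" would become "aumi":

```xml
<key>NSExtension</key>
<dict>
    <key>NSExtensionAttributes</key>
    <dict>
        <key>AudioComponents</key>
        <array>
            <dict>
                <!-- "aumu" = music generator; "aumi" = MIDI processor (MIDI FX) -->
                <key>type</key>
                <string>aumu</string>
                <key>subtype</key>
                <string>demo</string>
                <key>manufacturer</key>
                <string>Demo</string>
                <key>name</key>
                <string>Demo: Generator</string>
                <key>version</key>
                <integer>1</integer>
            </dict>
        </array>
    </dict>
    <key>NSExtensionPointIdentifier</key>
    <string>com.apple.AudioUnit-UI</string>
    <key>NSExtensionPrincipalClass</key>
    <string>AudioUnitViewController</string>
</dict>
```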
Post not yet marked as solved
3 Replies
951 Views
Back at the start of January this year we filed a bug report that still hasn't been acknowledged. Before we file a code-level support ticket, we would love to know if anyone else out there has experienced anything similar. We have read the documentation (repeatedly) and searched the web over, and still found no solution; this issue looks like it could be a bug in the system (or in our code) rather than correct behaviour.

The app is a host for a v3 audio unit which is itself a workstation that can host other audio units. The host and audio unit are both working well in all areas other than this issue. Note: this is not running on Catalyst; it is a native macOS app (not iOS).

The problem is that when an AUv3 is hosted out of process (on the Mac) and then goes to fetch a list of all the other available installed audio units, the list returned by the system does not contain any of the other v3 audio units on the system. It only contains v2. We see this issue when we load our AU out of process in our own bespoke host, and also when it loads into Logic Pro, which gives no choice but to load out of process. This means that, as it stands, when we release the app our users will have limited functionality in Logic Pro, and possibly by then other hosts too.

In our own host we can load our hosting AU in process, and then it can find and use all the available units, both v2 and v3. So no issue there; but sadly, when loaded into the only other host that can do anything like this (Logic Pro at the time of posting), it won't be able to use v3 units, which is quite a serious limitation.

SUMMARY: A v3 audio unit is hosted out of process. The audio unit fetches the list of available audio units on the system. v3 audio units are not provided in the list; only v2 are presented.

EXPECTED: In some ways this seems to be the opposite of the behaviour we would expect. We would expect to see and host ALL the other units installed on the system. "Out of process" suggests the safer of the two options, so this looks like it could be related to some kind of sandboxing issue. But sadly we cannot work out a solution, hence this report. Is Quinn "The Eskimo!" still around? Can you help us out here?
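For readers reproducing this, the in-process configuration the poster describes (the one where v3 units do appear) can be requested in a bespoke macOS host roughly as follows. This is a sketch under assumptions: the component description fields are placeholders, and in-process loading of extensions has its own platform restrictions.

```swift
import AVFoundation

// Sketch: instantiate an AUv3 in process in a custom macOS host.
var desc = AudioComponentDescription()
desc.componentType = kAudioUnitType_MusicDevice
desc.componentSubType = 0      // placeholder: your unit's subtype code
desc.componentManufacturer = 0 // placeholder: your manufacturer code

AVAudioUnit.instantiate(with: desc, options: .loadInProcess) { avUnit, error in
    if let avUnit = avUnit {
        // In this configuration the hosted AU reportedly sees both
        // v2 and v3 units when it enumerates components.
        print("Loaded in-process:", avUnit.name)
    } else {
        print("Instantiation failed:", error?.localizedDescription ?? "unknown")
    }
}
```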
Post not yet marked as solved
0 Replies
703 Views
If I set the kAUVoiceIOProperty_BypassVoiceProcessing property to true on the voice-processing audio unit, it appears that the microphone mode setting in Control Center still takes effect. For example, if the user has selected Voice Isolation mode, the audio unit will still function in Voice Isolation mode. However, on the iPhone 14 the microphone mode is not displayed in Control Center when kAUVoiceIOProperty_BypassVoiceProcessing is set to true, unlike on other iPhone models such as the iPhone 11 and iPhone 12. As a result, after setting kAUVoiceIOProperty_BypassVoiceProcessing to true, the voice-processing unit continues to use the last selected microphone mode for my app, and it cannot be changed on the iPhone 14. I have the following questions about this: Is it intended that microphone modes in Control Center take precedence over the kAUVoiceIOProperty_BypassVoiceProcessing setting on the audio unit? My app has a music mode that captures music through the mic; is there a way to avoid all voice processing, and even bypass the mic modes in Control Center? Why is the microphone mode not displayed in Control Center on the iPhone 14 Pro when kAUVoiceIOProperty_BypassVoiceProcessing is set to true on the voice-processing audio unit, while other iPhone models display it?
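For context, the property in question is set like this (a minimal sketch; `voiceUnit` is assumed to be an initialized VoiceProcessingIO AudioUnit, and this illustrates the call the post describes, not a fix for the behavior it reports):

```swift
import AudioToolbox

// Sketch: toggle voice-processing bypass on a VoiceProcessingIO unit.
func setBypassVoiceProcessing(_ bypass: Bool, on voiceUnit: AudioUnit) -> OSStatus {
    var flag: UInt32 = bypass ? 1 : 0
    return AudioUnitSetProperty(voiceUnit,
                                kAUVoiceIOProperty_BypassVoiceProcessing,
                                kAudioUnitScope_Global,
                                0,
                                &flag,
                                UInt32(MemoryLayout<UInt32>.size))
}
```

For a music-capture mode, one commonly suggested alternative is to avoid the voice-processing I/O unit entirely and capture through a plain input path (e.g. AVAudioEngine's input node without voice processing enabled), though whether that fully sidesteps Control Center microphone modes on every model is exactly what the post is asking.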
Post marked as solved
33 Replies
13k Views
Hi all, Apple dropping ongoing development for FireWire devices that were supported with the Core Audio driver standard is a catastrophe for a lot of struggling musicians who need both to keep up to date with the security updates that come with new OS releases, and to continue utilising their hard-earned investments in very expensive and still pristine audio devices that have been reduced to e-waste by Apple's seemingly tone-deaf ignorance of the cries for ongoing support. I have one of said audio devices, and I'd like to keep using it while keeping my 2019 Intel MacBook Pro up to date with the latest security updates and OS features.

Probably not the first time you gurus have had someone make the logical leap leading to a request like this, but I was wondering about the possibility of shoe-horning the code used in previous versions of macOS, which allowed the Mac to speak with the audio features of such devices, into the Ventura version of the OS. Would it be possible? Would it involve a lot of work? I don't think I'd be the only person willing to pay for a third-party application or utility that restored this functionality. There have to be hundreds of thousands of people who would be happy to spare some cash to stop their multi-thousand-dollar investment in gear being so thoughtlessly resigned to the scrap heap. Any comments or layman-friendly explanations as to why this couldn't happen would be gratefully received! Thanks, em