I think you're supposed to use AUv3 Audio Units instead, and let the UI be loaded into the AU hosting app. Although you wrote my go-to AUv3 sample code (auv3test5), so I guess there's some reason why AUv3 is not a solution to your problem.
I have one iOS app that is an iPad touchscreen virtual musical instrument, and another app that is an audio visualizer (a 36th-octave spectrum analyzer). I used to use Inter-App Audio to pipe audio samples from the first app to either the second app or to GarageBand for recording, but Inter-App Audio has been deprecated. My question is: what is Apple's suggested method (APIs, etc.) to get audio samples generated in one UI app to play and display in another UI app (say on an iPad, where both apps can be on the display) in near real time for performance, without using Inter-App Audio?
My (Mac) application is latency sensitive, so I let the AudioServer know (via kAudioDevicePropertyIOCycleUsage) how late it can call my thread to provide output for the device. So far, I've done this by benchmarking a worst-case workload when setting up my IOProc func (see here - https://github.com/q-p/SoundPusher/blob/master/SoundPusher/DigitalOutputContext.cpp#L97 if you're curious). How would I do this now with potentially asymmetric cores? I would like my benchmark to be called under the same performance characteristics as the real output case, but without actually having a real deadline or having to produce real output.
I'll try to explain again: my app forwards audio from a (virtual) input device to a (real) output device, a process driven by the IOProc of the output device. Since I want to incur as little latency as possible (i.e. use the most up-to-date data from the input device), I want my IOProc to be scheduled as late as possible while still hitting the deadline (see the documentation for kAudioDevicePropertyIOCycleUsage - https://developer.apple.com/documentation/coreaudio/kaudiodevicepropertyiocycleusage - the header docs are more helpful; the web documentation seems useless). But I do have a bit of work to do on the data before it's ready, so I need to figure out what I can set kAudioDevicePropertyIOCycleUsage to and still hit the deadline (per the documentation, with the default IOCycleUsage you are scheduled for the entire duration of the audio cycle). So far, I've been benchmarking how long my audio processing takes (on some synthetic input data) before starting the IOProc, and then multiplied that by a safety factor.
I am not really following what exactly you are trying to do, but here are some comments on your post: IOProcs run on the HAL real-time thread, which is associated with an audio workgroup in the OS. This has a performance impact on the system, such as always running threads on performance cores and higher power consumption. If you want to run other threads concurrently with the same priority and performance/power properties as the HAL thread, please use the new audio workgroup APIs - https://developer.apple.com/videos/play/wwdc2020/10224/.
Some testing on actual hardware seems to indicate that the IOProcs (which default to the workgroup of their respective device) end up on the performance cores. It would help if I could force the scheduler to run my benchmark either on the same cores the IOProcs will run on, *or* on the efficiency cores (that way I'd have increased latency, but at least it wouldn't matter where the actual IOProcs are scheduled - I'd always have enough leeway). But as far as I'm aware, macOS doesn't have any core-affinity APIs. As my benchmark is triggered in response to a UI interaction, I'd assume it runs with high QoS (user-interactive), so currently it's probably also on the performance cores...
Hi. Just watched Doug's session on Audio Workgroups. The methods used to access the new API include support for AUv2 Audio Units. Is this an indication that AUv2 Audio Units (and Inter-App Audio) are still going to be supported going forward? Thanks!
Thanks @NikoloziApps. We'll have to wait and see. My app uses IAA and is still working on the current iOS 14 beta, so it's safe for a while longer!
Thanks! Any idea whether joining threads like this would be backward compatible with, say, iOS 12 or 13 underneath?
Just saw the Audio Workgroups WWDC20 video. Thanks for bringing this up, as it reduces uncertainty for most real-time audio apps. I'm wondering about the availability of workgroups: the speaker mentions Fall 2020. Is this already part of iOS 14? Is it an iOS 14+-only feature, or is there backward compatibility? Thanks in advance, Arshia Cont
On macOS, AUv2 is still going strong, and in fact hardly any DAWs support AUv3 (on Mac), unfortunately. So I think AUv2s will be around for a long time. Pretty sure IAA is going away, though.
Yes, it is already part of iOS 14 and macOS 11. If you search for Workgroup Management and look at the APIs you'll see it under Availability.