Posts

Post marked as solved
4 Replies
2.8k Views
I've looked around the documentation and searched a bit on Google, and I'm curious whether it's possible to have a sidechain input using the v3 Audio Units API on both iOS and OS X. I would like to be able to instantiate a plug-in on one track and select another track to modulate the Audio Unit effect on the first track. If so, where might I find documentation on this?
Posted by bbrysiuk. Last updated .
Post not yet marked as solved
1 Reply
667 Views
I'm having an interesting issue with the frames requested and produced when sample rate conversion is happening on iOS. When I downsample from 48000 Hz to 44100 Hz, every 5th interrupt requests 236 frames instead of 235, as expected, since 256 * (44100/48000) = 235.2. However, the interrupt at which the input produces 236 frames is out of sync with the output by one interrupt. That is, the output will request 235 frames, but when AudioUnitRender is called on the input it produces 236; then on the next interrupt the output requests 236 but only 235 are produced.

I've currently worked around the issue by inserting a buffer that saves the extra frame from the first interrupt and prepends it to the second. However, this merely treats the symptom and not the actual cause.

I'm using a modified version of The Amazing Audio Engine 2, to which I added the stream format specifications on the remoteIO. It has a specified InputCallback that just grabs the input timestamp and stores it; however, even if this input callback doesn't exist, the issue still occurs. Unfortunately, the offending code is mixed in with some proprietary code, so I'm starting with this question in case someone has experienced this and knows why it's happening. I'm trying to create a simple project that replicates the issue so I can include code.
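For anyone trying to reproduce the arithmetic: the pattern of 235- and 236-frame interrupts, and the carry-over buffer workaround, can be sketched outside Core Audio. This is a minimal Python simulation, not actual remoteIO code; the names (`frames_for_interrupt`, `CarryBuffer`) are illustrative, and the "input runs one interrupt early" behavior is modeled as described in the post.

```python
def frames_for_interrupt(n: int) -> int:
    """Frames needed at interrupt n when a 256-frame, 48 kHz interrupt
    is converted to 44.1 kHz: 256 * 44100 / 48000 = 235.2 on average,
    so four interrupts of 235 are followed by one of 236.
    Integer math avoids floating-point drift in the cumulative total."""
    def total(k: int) -> int:
        return k * 256 * 44100 // 48000
    return total(n + 1) - total(n)


class CarryBuffer:
    """The workaround from the post: buffer the extra frame from one
    interrupt and prepend it to the next."""

    def __init__(self) -> None:
        self.frames: list[float] = []

    def push(self, samples: list[float]) -> None:
        self.frames.extend(samples)

    def pop(self, n: int) -> list[float]:
        out, self.frames = self.frames[:n], self.frames[n:]
        return out


if __name__ == "__main__":
    buf = CarryBuffer()
    produced = consumed = 0
    for n in range(20):
        requested = frames_for_interrupt(n)      # what the output asks for
        delivered = frames_for_interrupt(n + 1)  # input runs one interrupt early
        buf.push([0.0] * delivered)              # dummy audio data
        produced += delivered
        out = buf.pop(requested)
        assert len(out) == requested             # the carry-over always covers it
        consumed += len(out)
    print(produced, consumed, len(buf.frames))
```

Running the loop shows totals staying in step: because every request is at least 235 frames and the input is only ever one frame ahead, the buffer never underflows, which is consistent with the post's observation that this hides the symptom without explaining why the input/output interrupts are offset by one.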
Posted by bbrysiuk. Last updated .