Hi,
I'm using AVAudioPlayer to play audio files at the same time as an AVAudioEngine is running (to process and record from the microphone).
Is there any way to get frame-accurate sync between these two APIs?
I'm using play(atTime:) to start my AVAudioPlayer instance at a known deviceCurrentTime value. Meanwhile, I've installed a tap on the microphone with AVAudioEngine. In the tap callback, I'm getting an AVAudioTime which, as I understand it (and please correct me if I'm wrong!), is a timestamp that corresponds to the node's lastRenderTime, and is basically a reference point on a timeline of samples common to all nodes in a running AVAudioEngine instance.
Is there any way to calculate the deviceCurrentTime that corresponds to AVAudioEngine's lastRenderTime? I guess the question boils down to whether both timelines depend on the same lower-level clock, rather than each API imposing its own arbitrary zero point.
So far it looks like this is impossible, and the way forward would be to move playback into the same AVAudioEngine I'm using to tap the microphone, but maybe I'm missing something?
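For context, here's a rough sketch of the kind of setup I'm describing (the file URL, buffer size, and 0.5 s lead time are just placeholders):

import AVFoundation

// Rough sketch of the setup described above; the file URL, buffer size,
// and 0.5 s lead time are placeholders.
let fileURL = URL(fileURLWithPath: "/path/to/audio.caf")   // placeholder path
let engine = AVAudioEngine()
let player = try! AVAudioPlayer(contentsOf: fileURL)

// Tap the microphone; `when` is an AVAudioTime stamped on the engine's render timeline.
let input = engine.inputNode
input.installTap(onBus: 0, bufferSize: 4096, format: input.outputFormat(forBus: 0)) { buffer, when in
    // Process / record the incoming buffer here.
}
try! engine.start()

// Start the player at a known point on its own deviceCurrentTime timeline.
_ = player.prepareToPlay()
let playerStartTime = player.deviceCurrentTime + 0.5
_ = player.play(atTime: playerStartTime)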
Many thanks!
Yes. AVAudioPlayer's deviceCurrentTime returns this:
double(CAHostTimeBase::GetCurrentTime()) * CAHostTimeBase::GetInverseFrequency()
The CAHostTimeBase utility class is discussed here: Audio Host Time On iOS
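In other words, deviceCurrentTime is just the mach host clock expressed in seconds. Assuming that's the case, you should be able to put the tap's AVAudioTime on the same timeline with AVAudioTime.seconds(forHostTime:), which does the same host-ticks-to-seconds conversion. A rough sketch (where playerStartTime is whatever value you previously passed to play(atTime:), and the bus and buffer size are placeholders):

import AVFoundation

// Rough sketch, assuming (per the above) that deviceCurrentTime is simply mach host
// time converted to seconds. `playerStartTime` is whatever value was passed to
// play(atTime:).
func installSyncTap(on engine: AVAudioEngine, playerStartTime: TimeInterval) {
    let input = engine.inputNode
    input.installTap(onBus: 0, bufferSize: 4096, format: input.outputFormat(forBus: 0)) { buffer, when in
        guard when.isHostTimeValid else { return }

        // Host ticks -> seconds: the same timeline deviceCurrentTime is expressed in.
        let tapTimeSeconds = AVAudioTime.seconds(forHostTime: when.hostTime)

        // How far after the scheduled player start this microphone buffer was stamped.
        let offset = tapTimeSeconds - playerStartTime
        print("mic buffer timestamp is \(offset) s after the scheduled player start")
    }
}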