AVAudioEngine


Use a group of connected audio node objects to generate and process audio signals and perform audio input and output.

Posts under AVAudioEngine tag

48 Posts


Understanding AVAudioTime in AVAudioNodeTapBlock? Is there a way to get time relative to a scheduled Buffer?
I'm using AVAudioEngine to play AVAudioPCMBuffers, and I'd like to synchronize some events with playback. For example, if the audio's frame position is >= some point and < some other point, trigger some code. So I'm looking at:

```objc
- (void)installTapOnBus:(AVAudioNodeBus)bus
             bufferSize:(AVAudioFrameCount)bufferSize
                 format:(AVAudioFormat * __nullable)format
                  block:(AVAudioNodeTapBlock)tapBlock;
```

I already have the frame positions calculated (predetermined before the audio is scheduled; I already made all the necessary computations), so I just need to fire code at certain points during playback:

```objc
[playerNode installTapOnBus:bus
                 bufferSize:bufferSize
                     format:format
                      block:^(AVAudioPCMBuffer * _Nonnull buffer, AVAudioTime * _Nonnull when) {
    // Inspect current audio here and fire...
}];

[playerNode scheduleBuffer:fullbuffer
                    atTime:startTime
                   options:0
    completionCallbackType:AVAudioPlayerNodeCompletionDataPlayedBack
         completionHandler:^(AVAudioPlayerNodeCompletionCallbackType callbackType) {
    // Some code is here; not important to this question.
}];
```

The problem I'm having is figuring out where in the full buffer I am within the tap block. The tap block passes chunks, not the full audio buffer. I tried using the block's when parameter to calculate the frame position relative to the entire audio, but I have been unsuccessful so far. I'm assuming the when parameter is relative to the buffer passed to the tap block, not the entire audio buffer I scheduled. Not installing a tap and just using a timer before scheduling my fullBuffer has given me good results, but I'd rather avoid a timer if possible and use sample time.
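For reference, a common way to get a playback-relative position inside the tap (a minimal sketch under assumptions, not from the original post) is AVAudioPlayerNode's playerTime(forNodeTime:), which maps the tap's when from node time onto the player's own sample timeline. The bookkeeping names below (framePositionOfScheduledBufferStart, triggerStart, triggerEnd) are hypothetical values you would track yourself:

```swift
import AVFoundation

let engine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()
engine.attach(playerNode)
engine.connect(playerNode, to: engine.mainMixerNode, format: nil)

// Hypothetical bookkeeping: the frame on the player's timeline where the
// scheduled buffer starts, and the frame range that should trigger the event.
let framePositionOfScheduledBufferStart: AVAudioFramePosition = 0
let triggerStart: AVAudioFramePosition = 44_100
let triggerEnd: AVAudioFramePosition = 88_200

playerNode.installTap(onBus: 0, bufferSize: 4096, format: nil) { [weak playerNode] _, when in
    // `when` is in node (engine) time; playerTime(forNodeTime:) maps it onto
    // the player's own timeline, where sampleTime counts frames played.
    guard let playerNode,
          let playerTime = playerNode.playerTime(forNodeTime: when),
          playerTime.isSampleTimeValid else { return }

    let framePosition = playerTime.sampleTime - framePositionOfScheduledBufferStart
    if framePosition >= triggerStart && framePosition < triggerEnd {
        // Fire the event tied to this region of the scheduled audio.
    }
}
```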
3 replies · 0 boosts · 1.5k views · Dec ’24
AVAudioEngineConfigurationChange Clearing AVPlayerNode
Hi all, I am working on an app where I have live prompts playing, in addition to a voice channel that sometimes becomes active. Right now I am using two different AVAudioSession configurations, so that we only switch to a mic-enabled mode when we actually need input from the mic. These are defined below.

When just using the device hardware, everything works as expected: the modes change and playback continues as needed. However, when using Bluetooth devices such as AirPods, where the switch from A2DP to HFP is needed, I get an AVAudioEngineConfigurationChange notification. In response I tear down the engine and create a new one with the same two player nodes. This works fine and there are no crashes, except that all the audio I had scheduled on a player node has now been cleared. All the completion blocks marked with .dataPlayedBack return the second this event happens, leaving me in a state where I have a valid engine setup again but no idea what actually played, or was errantly marked as such.

Is this the expected behavior when getting a configuration change notification?

Adding some information about my audio graph for context. When getting this event I disconnect all parts of the graph, and do the same to the new engine:

```swift
private var inputEngine: AVAudioEngine
private var audioEngine: AVAudioEngine
private let voicePlayerNode: AVAudioPlayerNode
private let promptPlayerNode: AVAudioPlayerNode

audioEngine.attach(voicePlayerNode)
audioEngine.attach(promptPlayerNode)

audioEngine.connect(
    voicePlayerNode,
    to: audioEngine.mainMixerNode,
    format: voiceNodeFormat
)

audioEngine.connect(
    promptPlayerNode,
    to: audioEngine.mainMixerNode,
    format: nil
)
```

An example of how I am scheduling playback, and where that completion fires even if it didn't actually play:

```swift
private func scheduleVoicePlayback(_ id: AudioPlaybackSample.Id, buffer: AVAudioPCMBuffer) async throws {
    guard !voicePlayerQueue.samples.contains(where: { $0 == id }) else {
        return
    }

    seprateQueue.append(buffer)

    if !isVoicePlaying {
        activateAudioSession()
    }

    voicePlayerQueue.samples.append(id)

    if !voicePlayerNode.isPlaying {
        voicePlayerNode.play()
    }

    if let convertedBuffer = buffer.convert(to: voiceNodeFormat) {
        await voicePlayerNode.scheduleBuffer(convertedBuffer, completionCallbackType: .dataPlayedBack)
    } else {
        throw AudioPlaybackError.failedToConvert
    }

    voiceSampleHasBeenPlayed(id)
}
```

And lastly, my audio session configuration, if it's useful:

```swift
extension AVAudioSession {
    static func setDefaultCategory() {
        do {
            try sharedInstance().setCategory(
                .playback,
                options: [
                    .duckOthers,
                    .interruptSpokenAudioAndMixWithOthers
                ]
            )
        } catch {
            print("Failed to set default category? \(error.localizedDescription)")
        }
    }

    static func setVoiceChatCategory() {
        do {
            try sharedInstance().setCategory(
                .playAndRecord,
                options: [
                    .defaultToSpeaker,
                    .allowBluetooth,
                    .allowBluetoothA2DP,
                    .duckOthers,
                    .interruptSpokenAudioAndMixWithOthers
                ]
            )
        } catch {
            print("Failed to set category? \(error.localizedDescription)")
        }
    }
}
```
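Not an official answer, but one defensive pattern for this situation (a sketch under assumptions, using the hypothetical helpers rebuildEngine and reschedule) is to keep your own record of in-flight samples and treat everything still pending at the moment of the notification as unplayed, rather than trusting the .dataPlayedBack callbacks that fire during teardown:

```swift
import AVFoundation

final class EngineController {
    private var engine = AVAudioEngine()
    // IDs scheduled but not yet confirmed played. On a configuration change,
    // everything still here is assumed to need re-scheduling.
    private var inFlight: Set<UUID> = []
    private var observer: NSObjectProtocol?

    init() {
        observer = NotificationCenter.default.addObserver(
            forName: .AVAudioEngineConfigurationChange,
            object: engine,
            queue: .main
        ) { [weak self] _ in
            guard let self else { return }
            // The engine has stopped; pending buffers were dropped and their
            // completion handlers fired immediately. Rebuild and re-schedule
            // from our own record instead of trusting those callbacks.
            let pending = self.inFlight
            self.rebuildEngine()
            self.reschedule(pending)
        }
    }

    private func rebuildEngine() { /* reattach nodes, reconnect, restart */ }
    private func reschedule(_ ids: Set<UUID>) { /* re-schedule buffers for ids */ }
}
```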
1 reply · 0 boosts · 674 views · Dec ’24
iOS audio crackling issue when sending audio data to a UDP server and playing it back
I am experiencing an issue while recording audio using AVAudioEngine with the installTap method. I convert the AVAudioPCMBuffer to Data and send it to a UDP server. However, when I receive the Data and play it back, there is a continuous crackling noise during playback. I am sending the audio data by creating packets with the library https://github.com/mindAndroid/swift-rtp. Please help me resolve this issue. I have attached the code reference that I am currently using (ViewController.swift). Thank you.
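Without seeing the attached code, one frequent cause of crackling after a network round-trip is serializing a different number of bytes than the buffer actually holds (frameCapacity vs. frameLength), or an interleaving mismatch between sender and receiver. As a hedged sketch, this is one way to serialize exactly frameLength frames of a deinterleaved Float32 tap buffer as interleaved bytes:

```swift
import AVFoundation

// Serialize exactly `frameLength` frames of a deinterleaved Float32 buffer
// into interleaved bytes. The receiver must reconstruct with the same
// sample rate, channel count, and layout.
func data(from buffer: AVAudioPCMBuffer) -> Data? {
    guard let channels = buffer.floatChannelData else { return nil }
    let channelCount = Int(buffer.format.channelCount)
    let frames = Int(buffer.frameLength)

    var samples = [Float](repeating: 0, count: frames * channelCount)
    for frame in 0..<frames {
        for channel in 0..<channelCount {
            samples[frame * channelCount + channel] = channels[channel][frame]
        }
    }
    return samples.withUnsafeBufferPointer { Data(buffer: $0) }
}
```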
0 replies · 0 boosts · 545 views · Nov ’24
Connect 2 mono nodes as L/R input for a stereo node
Hello, I'm fairly new to AVAudioEngine and I'm trying to connect 2 mono nodes as left/right input to a stereo node. I was successful in splitting the input audio into 2 mono nodes using AVAudioConnectionPoint and channelMap, but I can't figure out how to connect them back to a stereo node. I'll post the code I have so far. The use case is that I'm trying to process the left/right channels with separate audio units. Any ideas?

```swift
let monoFormat = AVAudioFormat(standardFormatWithSampleRate: nativeFormat.sampleRate, channels: 1)!

let leftInputMixer = AVAudioMixerNode()
let rightInputMixer = AVAudioMixerNode()
let leftOutputMixer = AVAudioMixerNode()
let rightOutputMixer = AVAudioMixerNode()
let channelMixer = AVAudioMixerNode()

[leftInputMixer, rightInputMixer, leftOutputMixer, rightOutputMixer, channelMixer].forEach { engine.attach($0) }

let leftConnectionR = AVAudioConnectionPoint(node: leftInputMixer, bus: 0)
let rightConnectionR = AVAudioConnectionPoint(node: rightInputMixer, bus: 0)

plugin.leftInputMixer = leftInputMixer
plugin.rightInputMixer = rightInputMixer
plugin.leftOutputMixer = leftOutputMixer
plugin.rightOutputMixer = rightOutputMixer
plugin.channelMixer = channelMixer

leftInputMixer.auAudioUnit.channelMap = [0]
rightInputMixer.auAudioUnit.channelMap = [1]

engine.connect(previousNode, to: [leftConnectionR, rightConnectionR], fromBus: 0, format: monoFormat)

// Process right channel, pass through left channel
engine.connect(rightInputMixer, to: plugin.audioUnit, format: monoFormat)
engine.connect(plugin.audioUnit, to: rightOutputMixer, format: monoFormat)
engine.connect(leftInputMixer, to: leftOutputMixer, format: monoFormat)

// Mix back to stereo?
engine.connect(leftOutputMixer, to: channelMixer, format: stereoFormat)
engine.connect(rightOutputMixer, to: channelMixer, format: stereoFormat)
```
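One possibility (a hedged sketch continuing the snippet above and reusing its names; not a verified answer): AVAudioMixerNode adopts the AVAudioMixing protocol, so each mono branch can be hard-panned where it feeds the stereo mixer, letting the mixer place each input in one channel:

```swift
// Hard-pan each mono branch where it feeds the stereo mixer. `pan` comes
// from the AVAudioMixing protocol, which AVAudioMixerNode adopts.
leftOutputMixer.pan = -1.0   // fully left
rightOutputMixer.pan = 1.0   // fully right

engine.connect(leftOutputMixer, to: channelMixer, format: monoFormat)
engine.connect(rightOutputMixer, to: channelMixer, format: monoFormat)

// The mixer's output can then carry a stereo format downstream.
engine.connect(channelMixer, to: engine.mainMixerNode, format: stereoFormat)
```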
1 reply · 0 boosts · 530 views · Nov ’24
AVAudioPlayerNode scheduleBuffer leaks memory
I'm building a streaming app on visionOS that plays sound from audio buffers each frame. The audio format has a sample rate of 48,000 Hz, and each buffer has 480 samples. I noticed that when calling

```swift
audioPlayerNode.scheduleBuffer(audioBuffer)
```

memory keeps increasing at about 0.1 MB per second, and at around 4 minutes the node seems to be full of buffers and does a hard reset, at which point the audio stops temporarily with a memory change (see attached screenshot). However, if I call

```swift
audioPlayerNode.scheduleBuffer(audioBuffer, at: nil, options: .interrupts)
```

the memory leak issue is gone, but the audio is broken (it sounds like it's been shortened). Below is the full code snippet. Does anyone know how to fix it?

```swift
@Observable
final class MyAudioPlayer {
    private var audioEngine: AVAudioEngine = .init()
    private var audioPlayerNode: AVAudioPlayerNode = .init()
    private var audioFormat: AVAudioFormat?

    init() {
        audioEngine.attach(audioPlayerNode)
        audioEngine.connect(audioPlayerNode, to: audioEngine.mainMixerNode, format: nil)
        try? AVAudioSession.sharedInstance().setCategory(.playback, mode: .default)
        try? AVAudioSession.sharedInstance().setActive(true)
        audioEngine.prepare()
        try? audioEngine.start()
        audioPlayerNode.play()
    }

    // more code...

    /// Callback every frame
    private func audioFrameCallback_Non_Interleaved(buf: UnsafeMutablePointer<Float>?, samples: Int) {
        guard let buf,
              let format = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 48000, channels: 2, interleaved: false),
              let audioBuffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: AVAudioFrameCount(samples))
        else { return }

        audioBuffer.frameLength = AVAudioFrameCount(samples)

        if let data = audioBuffer.floatChannelData {
            for channel in 0 ..< Int(format.channelCount) {
                for frame in 0 ..< Int(audioBuffer.frameLength) {
                    data[channel][frame] = buf[frame * Int(format.channelCount) + channel]
                }
            }
        }

        // memory leak here
        audioPlayerNode.scheduleBuffer(audioBuffer)
    }
}
```
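Scheduling one 10 ms buffer per frame with no backpressure lets the player's queue grow without bound, which matches the steady memory climb described. As a hedged sketch (an assumption, not a confirmed fix for this report): cap the number of in-flight buffers by pairing each schedule call with a completion handler, and drop the newest frame once the cap is hit:

```swift
import AVFoundation

// Cap in-flight buffers: each schedule increments a counter and the
// .dataPlayedBack completion decrements it. At the cap, the newest frame
// is dropped instead of letting the queue grow forever.
final class BoundedScheduler {
    private let node: AVAudioPlayerNode
    private let maxInFlight = 8
    private var inFlight = 0
    private let lock = NSLock()

    init(node: AVAudioPlayerNode) { self.node = node }

    func schedule(_ buffer: AVAudioPCMBuffer) {
        lock.lock()
        guard inFlight < maxInFlight else {
            lock.unlock()
            return // drop: downstream is already holding enough audio
        }
        inFlight += 1
        lock.unlock()

        node.scheduleBuffer(buffer, at: nil, options: [], completionCallbackType: .dataPlayedBack) { [weak self] _ in
            guard let self else { return }
            self.lock.lock()
            self.inFlight -= 1
            self.lock.unlock()
        }
    }
}
```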
1 reply · 0 boosts · 738 views · Nov ’24
AVAudioSession gets interrupted when closing a window
I have a visionOS app that plays audio using AVAudioEngine and presents both a window and an immersive space. If I close the window, the audio session gets interrupted, and attempting to restart the session and audio engine has no effect. I need to dismiss the app and then reopen it, which reopens the main window, for audio to start playing again. This happens in all visionOS 2 betas. Note that I have background audio enabled for my app.
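For comparison, this is the usual interruption-handling shape on the session (a sketch only; given the report, it may not recover this particular visionOS case):

```swift
import AVFoundation

let engine = AVAudioEngine() // stands in for the app's engine

// Observe session interruptions and try to restart on .ended.
NotificationCenter.default.addObserver(
    forName: AVAudioSession.interruptionNotification,
    object: AVAudioSession.sharedInstance(),
    queue: .main
) { notification in
    guard let info = notification.userInfo,
          let rawType = info[AVAudioSessionInterruptionTypeKey] as? UInt,
          let type = AVAudioSession.InterruptionType(rawValue: rawType) else { return }

    switch type {
    case .began:
        engine.pause()
    case .ended:
        let rawOptions = info[AVAudioSessionInterruptionOptionKey] as? UInt ?? 0
        let options = AVAudioSession.InterruptionOptions(rawValue: rawOptions)
        if options.contains(.shouldResume) {
            try? AVAudioSession.sharedInstance().setActive(true)
            try? engine.start()
        }
    @unknown default:
        break
    }
}
```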
3 replies · 1 boost · 834 views · Nov ’24
AVAudioPlayerNode can't play interleaved AVAudioPCMBuffer
I'm building a streaming app on visionOS that plays sound from audio buffers each frame. The source audio buffer has 2 channels and is in a Float32 interleaved format. However, when I set up the AVAudioFormat with interleaved set to true, the app crashes with a memory issue:

AURemoteIO::IOThread (35): EXC_BAD_ACCESS (code=1, address=0x3)

But if I set up the AVAudioFormat with interleaved set to false and manually fill the AVAudioPCMBuffer, it plays audio as expected. Could you please help me fix it? Below is the code snippet.

```swift
@Observable
final class MyAudioPlayer {
    private var audioEngine: AVAudioEngine = .init()
    private var audioPlayerNode: AVAudioPlayerNode = .init()
    private var audioFormat: AVAudioFormat?

    init() {
        audioEngine.attach(audioPlayerNode)
        audioEngine.connect(audioPlayerNode, to: audioEngine.mainMixerNode, format: nil)
        try? AVAudioSession.sharedInstance().setCategory(.playback, mode: .default)
        try? AVAudioSession.sharedInstance().setActive(true)
        audioEngine.prepare()
        try? audioEngine.start()
        audioPlayerNode.play()
    }

    // more code...

    /// This crashes
    private func audioFrameCallback_Interleaved(buf: UnsafeMutablePointer<Float>?, samples: Int) {
        guard let buf,
              let format = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 480000, channels: 2, interleaved: true),
              let audioBuffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: AVAudioFrameCount(samples))
        else { return }

        audioBuffer.frameLength = AVAudioFrameCount(samples)

        if let data = audioBuffer.floatChannelData?[0] {
            data.update(from: buf, count: samples * Int(format.channelCount))
        }

        audioPlayerNode.scheduleBuffer(audioBuffer)
    }

    /// This works
    private func audioFrameCallback_Non_Interleaved(buf: UnsafeMutablePointer<Float>?, samples: Int) {
        guard let buf,
              let format = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 480000, channels: 2, interleaved: false),
              let audioBuffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: AVAudioFrameCount(samples))
        else { return }

        audioBuffer.frameLength = AVAudioFrameCount(samples)

        if let data = audioBuffer.floatChannelData {
            for channel in 0 ..< Int(format.channelCount) {
                for frame in 0 ..< Int(audioBuffer.frameLength) {
                    data[channel][frame] = buf[frame * Int(format.channelCount) + channel]
                }
            }
        }

        audioPlayerNode.scheduleBuffer(audioBuffer)
    }
}
```
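One possible explanation (an assumption, not a confirmed diagnosis): the node is connected with format: nil, i.e. the mixer's standard deinterleaved format, and AVAudioPlayerNode expects scheduled buffers to match the format of its output connection, so handing it interleaved buffers can fault in the render thread. A hedged sketch of bridging with AVAudioConverter instead of filling the buffer by hand (formats here assume 48 kHz stereo):

```swift
import AVFoundation

// Convert an interleaved Float32 buffer to the deinterleaved format the
// player node was connected with. Make the converter once and reuse it.
let interleavedFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                      sampleRate: 48_000, channels: 2, interleaved: true)!
let nodeFormat = AVAudioFormat(standardFormatWithSampleRate: 48_000, channels: 2)!
let converter = AVAudioConverter(from: interleavedFormat, to: nodeFormat)!

func deinterleaved(_ source: AVAudioPCMBuffer) -> AVAudioPCMBuffer? {
    guard let out = AVAudioPCMBuffer(pcmFormat: nodeFormat,
                                     frameCapacity: source.frameLength) else { return nil }
    var consumed = false
    var conversionError: NSError?
    let status = converter.convert(to: out, error: &conversionError) { _, outStatus in
        // Hand the converter the single input buffer exactly once.
        if consumed {
            outStatus.pointee = .noDataNow
            return nil
        }
        consumed = true
        outStatus.pointee = .haveData
        return source
    }
    return (status != .error && conversionError == nil) ? out : nil
}
```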
1 reply · 0 boosts · 686 views · Nov ’24
AVAudioEngine Dolby Atmos
Hi! I have a music app using AVAudioEngine. Right now, I have it set up to play multichannel tracks and show "Multichannel" in the volume controls. However, I am unable to figure out how to get it to use Dolby Atmos. Is there something that needs to be enabled? Is it even possible with AVAudioEngine? I saw some apps that are able to play with Dolby Atmos, but they don't have an EQ feature, so I'm guessing they are not using AVAudioEngine.
2 replies · 0 boosts · 962 views · Nov ’24