Microphone crashing when changing input source

I am using the microphone in my app, and I have found a lot of issues that crash the app when the input source changes. The error was always something like:

Code Block
*** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: format.sampleRate == hwFormat.sampleRate'

Now I have worked around this issue by forcing the input source to always be the built-in microphone and by creating the AVFoundation objects at different moments. But I am still not confident about this solution. For example, I am still seeing some inconsistencies while mirroring the device screen to macOS through QuickTime. So my questions are:
  1. How can I prevent app crashes from this kind of error?

  2. I saw a lot of similar errors triggered just by changing a property of the format. How can I avoid these kinds of errors? Is there a Swift way to handle the error instead of using an exception catcher from Objective-C?

Replies

You may try to obtain the hardware sample rate like this:

Code Block
let input = avAudioEngine.inputNode
let sampleRate = input.inputFormat(forBus: 0).sampleRate   // typically 48000 for the built-in mic
let channelCount = input.inputFormat(forBus: 0).channelCount // typically 1

Then you can create your AVAudioFormat as follows:

Code Block
let avAudioFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: sampleRate, channels: channelCount, interleaved: false)
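Putting the two snippets together, here is a minimal sketch (only the AVFoundation APIs are real; the variable names are mine) that derives the tap format from the hardware format instead of hard-coding a sample rate, so the two can never disagree on sampleRate, which is the condition in the crash log:

```swift
import AVFoundation

// Sketch: always derive the tap format from the current hardware format.
let engine = AVAudioEngine()
let input = engine.inputNode

let hwFormat = input.inputFormat(forBus: 0)   // whatever the current route provides
let tapFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                              sampleRate: hwFormat.sampleRate,
                              channels: hwFormat.channelCount,
                              interleaved: false)

input.installTap(onBus: 0, bufferSize: 4096, format: tapFormat) { buffer, _ in
    // process the buffer
}
```

Note that passing `nil` as the tap format also makes AVAudioEngine use the node's output format, which is another way to avoid the mismatch.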




Here is a link to an answer on Stack Overflow: Change Audio input programmatically doesn't change Audio engine input AVFoundation

You crashed because after the input source changed, let input = avAudioEngine.inputNode did not update to the new input source. Therefore, just re-create the AVAudioEngine:

audioEngine = AVAudioEngine()
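A minimal sketch of that re-creation step, assuming the tear-down order (remove tap, stop, rebuild, reinstall tap) rather than quoting it from anywhere official:

```swift
import AVFoundation

// Sketch: rebuild the engine so inputNode reflects the new input source.
final class EngineRebuilder {
    var audioEngine = AVAudioEngine()

    func rebuildEngine() {
        audioEngine.inputNode.removeTap(onBus: 0)   // drop the tap tied to the old input
        audioEngine.stop()

        audioEngine = AVAudioEngine()               // fresh engine picks up the new route
        let format = audioEngine.inputNode.inputFormat(forBus: 0)
        audioEngine.inputNode.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
            // process the buffer
        }
        audioEngine.prepare()
        try? audioEngine.start()
    }
}
```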

So now you have to understand:

  • First, what does installTap do when you call it on inputNode?

Ans: it just listens to the node's buffer. But before installation we have to tell it which bus to install the tap on, what bufferSize will come from the inputNode, and what the audioFormat of the inputNode's input will be. So whenever you want to change the input microphone, there is a proper, safe flow which has to be followed.

class AudioProcessor
{
     static let sharedInstance = AudioProcessor()

     // Audio resources
     var audioEngine: AVAudioEngine!
     var inputNode: AVAudioInputNode!
     var formatIn: AVAudioFormat

     private init()
     {
          // AudioResources is assumed to be the app-wide holder of the engine
          self.audioEngine = AudioResources.sharedInstance.audioEngine
          self.inputNode = self.audioEngine.inputNode
          self.formatIn = self.inputNode.inputFormat(forBus: 0)
     }

     func pause()
     {
          self.audioEngine.inputNode.removeTap(onBus: 0)
          self.audioEngine.stop()
     }

     func resume()
     {
          if !self.audioEngine.isRunning
          {
               self.installTap()
               self.audioEngine.prepare()
               self.audioEngine.inputNode.volume = 1.0
               do
               {
                    try self.audioEngine.start()
               }
               catch
               {
                    print("Failed to start audio engine: \(error)")
               }
          }
     }

     func installTap()
     {
          // Re-read the engine, input node and hardware format so the tap
          // always matches the current input source
          self.audioEngine = AudioResources.sharedInstance.audioEngine
          self.inputNode = self.audioEngine.inputNode
          self.formatIn = self.inputNode.inputFormat(forBus: 0)
          self.inputNode.installTap(
               onBus: 0,
               bufferSize: 4096,
               format: self.formatIn
          )
          { (buffer, _) in
               // Do whatever you want to do with the buffer
          }
     }

}

//After that, add notification observers in your view controller

NotificationCenter.default.addObserver(self, selector: #selector(self.handleRouteChange(_:)), name: AVAudioSession.routeChangeNotification, object: nil)


@objc func handleRouteChange(_ notification: Notification) {
        let devices = AVAudioSession.sharedInstance().availableInputs ?? []
        let currentDevice = AVAudioSession.sharedInstance().currentRoute.inputs.first?.portName ?? ""

        if devices.contains(where: { $0.portName == currentDevice }) {
             AudioProcessor.sharedInstance.pause()
             AudioProcessor.sharedInstance.resume()
        }
    }
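On the original question 2: the crash is an Objective-C NSException thrown by Core Audio, which Swift's do/try/catch cannot intercept, so the practical Swift-side approach is to validate the hardware format before installing the tap rather than trying to catch the exception. A defensive sketch (the guard conditions are my assumption about when the format is unusable, not a documented API contract):

```swift
import AVFoundation

// Sketch: refuse to install a tap when the hardware format looks invalid.
// A sample rate of 0 can occur when the current route has no usable input.
func installTapSafely(on engine: AVAudioEngine,
                      block: @escaping AVAudioNodeTapBlock) -> Bool {
    let hwFormat = engine.inputNode.inputFormat(forBus: 0)
    guard hwFormat.sampleRate > 0, hwFormat.channelCount > 0 else {
        return false    // no usable input; installing a tap now risks the crash
    }
    engine.inputNode.installTap(onBus: 0, bufferSize: 4096,
                                format: hwFormat, block: block)
    return true
}
```

Returning a Bool lets the caller fall back gracefully (for example, skip recording) instead of crashing when the route is in a transient state, such as during QuickTime screen mirroring.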