AVCaptureSession runtime error -11800 / 'what' on startRunning() with audio input — what's holding the HAL?

AVCaptureSession.startRunning() triggers AVCaptureSessionRuntimeErrorNotification with AVError.unknown (-11800), underlying OSStatus 2003329396 → fourCC 'what', every cold launch, but only when an audio AVCaptureDeviceInput is attached. Removing only the audio input makes the error disappear. Same code in a fresh project records audio fine — bug only appears in this app's binary.

AVAudioApplication.shared.recordPermission == .granted. Info.plist has NSMicrophoneUsageDescription. No interruption notifications fire.

Test device: iPhone 16 Pro, iOS 26.4.2. iOS deployment target 17.1.

Minimal reproducer

import AVFoundation

let session = AVCaptureSession()
session.beginConfiguration()

let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back)!
session.addInput(try! AVCaptureDeviceInput(device: camera))

// Removing ONLY this line makes the error disappear:
let mic = AVCaptureDevice.default(for: .audio)!
session.addInput(try! AVCaptureDeviceInput(device: mic))

session.addOutput(AVCaptureMovieFileOutput())
session.addOutput(AVCapturePhotoOutput())
session.commitConfiguration()

NotificationCenter.default.addObserver(
    forName: .AVCaptureSessionRuntimeError, object: session, queue: nil
) { print($0.userInfo ?? [:]) }

session.startRunning() // -11800 / 'what' fires within ~2 sec

Observed state at error time

AVError.unknown (-11800)
underlyingError = NSError(NSOSStatusErrorDomain, 2003329396)
  userInfo[AVErrorFourCharCode] = 'what'

captureSession.isRunning   = false   ← never came up
captureSession.isInterrupted = false
captureSession.preset      = .high
captureSession.inputs      = [Back Triple Camera, iPhone Microphone]

AVAudioSession.sharedInstance():
  category               = .playAndRecord
  mode                   = .videoRecording
  sampleRate             = 48000.0
  isInputAvailable       = true
  isOtherAudioPlaying    = false
  availableInputs        = [MicrophoneBuiltIn]   (no BT/Continuity/AirPods)

  currentRoute.inputs    = []                   ← EMPTY
  currentRoute.outputs   = [Speaker|Speaker]

2003329396 = 0x77686174 = 'what'. From a few SO threads this maps to AURemoteIO::StartIO returning a HAL-bring-up failure.
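For anyone else staring at a raw OSStatus: a small pure-Swift helper to decode it into its fourCC, which is how I spotted 'what' (and later '!rec') in the first place:

```swift
import Foundation

// Decode a 32-bit OSStatus into its four-character code.
// Falls back to the decimal value if any byte isn't printable ASCII.
func fourCC(_ status: Int32) -> String {
    let bytes = withUnsafeBytes(of: status.bigEndian) { Array($0) }
    guard bytes.allSatisfy({ (0x20...0x7e).contains($0) }) else { return String(status) }
    return String(bytes: bytes, encoding: .ascii)!
}

print(fourCC(2003329396)) // "what"
print(fourCC(0x21726563)) // "!rec"
```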

The smoking gun: currentRoute.inputs is empty even though availableInputs contains the built-in mic, isInputAvailable is true, the category is .playAndRecord, and isOtherAudioPlaying is false. The HAL never routes the mic into the session, then 'what' follows. Nothing observable from AVAudioSession indicates a competing client.

Environment / SDKs linked

Firebase (SPM: Crashlytics, Performance, Messaging, Analytics, AppCheck, RemoteConfig, DynamicLinks), FBSDK, Kingfisher, MetalPetal. Multiple Google ad mediation pods present, but their audio session takeover is already disabled (audioVideoManager.isAudioSessionApplicationManaged = true, IMSdk.shouldAutoManageAVAudioSession(false)).

What I've ruled out (all still produce 'what')

Audio session config: .playAndRecord/.videoRecording, .playAndRecord/.default, .record/.measurement, .record/.default. With/without .defaultToSpeaker, .allowBluetooth, .allowBluetoothA2DP, .mixWithOthers. setActive(true) before vs. after attaching audio input. setPreferredInput(builtInMic) (verified accepted). 200ms Thread.sleep between setActive(true) and startRunning(). Setting usesApplicationAudioSession = false swaps the fourCC to '!rec' but produces the same outcome.
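For reference, the fully manual variant of the above looks like this (a sketch of one permutation; the property and API names are real AVFoundation, but the ordering of setActive vs. input attachment is exactly what I was varying):

```swift
import AVFoundation

let audioSession = AVAudioSession.sharedInstance()
try audioSession.setCategory(.playAndRecord, mode: .videoRecording,
                             options: [.defaultToSpeaker, .allowBluetooth])
// Pin the built-in mic before activation so the HAL has an explicit input.
if let builtIn = audioSession.availableInputs?.first(where: { $0.portType == .builtInMic }) {
    try audioSession.setPreferredInput(builtIn)
}
try audioSession.setActive(true)

let session = AVCaptureSession()
// Stop AVCaptureSession from reconfiguring the shared audio session itself.
session.automaticallyConfiguresApplicationAudioSession = false
```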

Topology: sessionPreset = .high / .hd1920x1080 / .hd1280x720 / .medium. Camera = .builtInTripleCamera / .builtInDualWideCamera / .builtInWideAngleCamera. AVCam-style always-attached graph. Setting sessionPreset before vs. after adding inputs.

Threading: All session mutations on a single dedicated DispatchQueue (vs. Swift actor). 1× and 2× full stopRunning()+startRunning() recovery cycles ("do it twice" pattern) — both re-fail with 'what'.
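For completeness, the dedicated-queue shape I'm using, so nobody flags main-thread mutation as the culprit:

```swift
import AVFoundation

final class CaptureController {
    let session = AVCaptureSession()
    // Every session mutation and start/stop goes through this one serial queue.
    private let sessionQueue = DispatchQueue(label: "capture.session.queue")

    func start() {
        sessionQueue.async { [self] in
            guard !session.isRunning else { return }
            session.startRunning() // blocking call; never on main
        }
    }

    func stop() {
        sessionQueue.async { [self] in
            guard session.isRunning else { return }
            session.stopRunning()
        }
    }
}
```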

SDK takeover prevention: GoogleMobileAdsMediation pods (Vungle, Mintegral, Pangle, Unity, InMobi), Google-Mobile-Ads-SDK, MediaPipeTasksVision removed via full pod uninstall + clean build — 'what' persists.

Notifications during the failure window:

  • 3 × AVAudioSession.routeChangeNotification reason categoryChange before the error fires, even though category stays .playAndRecord/.videoRecording. Disabling automaticallyConfiguresApplicationAudioSession drops this to 1, but the runtime error still fires.
  • No AVAudioSession.interruptionNotification.
  • No AVCaptureSessionWasInterruptedNotification.
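This is a sketch of the route-change logging that produced the trace above, in case anyone wants to compare notes on the categoryChange events (the previous-route key is what shows the input disappearing):

```swift
import AVFoundation

NotificationCenter.default.addObserver(
    forName: AVAudioSession.routeChangeNotification,
    object: AVAudioSession.sharedInstance(), queue: .main
) { note in
    let info = note.userInfo ?? [:]
    let reasonRaw = info[AVAudioSessionRouteChangeReasonKey] as? UInt ?? 0
    let reason = AVAudioSession.RouteChangeReason(rawValue: reasonRaw)
    let previous = info[AVAudioSessionRouteChangePreviousRouteKey]
        as? AVAudioSessionRouteDescription
    print("route change:", reason.map(String.init(describing:)) ?? "unknown",
          "| prev inputs:", previous?.inputs.map(\.portName) ?? [],
          "| now inputs:",
          AVAudioSession.sharedInstance().currentRoute.inputs.map(\.portName))
}
```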

Symbol audit

otool -L and nm of the bundle confirm none of the linked frameworks reference AVAudioRecorder, AudioComponentInstanceNew, AURemoteIO, or AudioUnitInitialize in their symbol tables. Only the app's own files reference any audio API. Yet adding AVCaptureDeviceInput(.audio) reproduces 100% in this binary and 0% in a fresh project.

My questions

  1. Who is most likely holding the audio HAL in a process where no linked framework references the AudioUnit / HAL APIs directly? Are there framework load-time audio initializations that don't show up in symbol tables (e.g., dynamic dlopen, CFBundleLoadExecutable) that could grab the HAL?
  2. Is there an os_log subsystem / category that surfaces the underlying AURemoteIO::StartIO failure reason at runtime? com.apple.coreaudio shows 'what' but not the originating cause.
  3. currentRoute.inputs is empty at error time even though availableInputs = [MicrophoneBuiltIn], isInputAvailable = true, and the category is .playAndRecord. What does an empty input route under those conditions imply, and what other system-level holders could be preventing the HAL from routing the mic in?
  4. Has anyone seen 'what' resolve with a device reboot, an iOS update, or by removing a specific framework?

Happy to share a sysdiagnose. Thanks!
