iOS 26.5 SIGKILLs audio-recording app at ~50s of background despite UIBackgroundModes: audio - what is the supported API path?

Hi, hoping for guidance on what has become a long-running bug for our app.

The problem

We have a transcription app on an iPhone 17 Pro Max running iOS 26.5. The recording flow uses AVAudioEngine.installTap(onBus:) to capture PCM into a JS bridge for streaming to a remote transcription service. A parallel AVAudioRecorder writes the same audio to disk as a backup.

When the user starts a recording and locks the phone, iOS terminates our process with SIGKILL at approximately 50 seconds of continuous background time, despite:

  • UIBackgroundModes includes audio (verified in shipping IPA's Info.plist)
  • AVAudioSession.setCategory(.playAndRecord, mode: .default) is active
  • AVAudioEngine is running with installTap producing PCM buffers right up to the moment of death
  • UIApplication.backgroundTimeRemaining returns Double.greatestFiniteMagnitude at applicationDidEnterBackground (verified in our event log)
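For reference, the setup behind the bullets above looks roughly like this - a minimal sketch, where handlePCMBuffer is a placeholder for our JS-bridge callback:

```swift
import AVFoundation

// Minimal sketch of the recording setup described above.
// handlePCMBuffer(_:) is a placeholder for our JS-bridge callback.
func startRecordingPipeline(engine: AVAudioEngine,
                            handlePCMBuffer: @escaping (AVAudioPCMBuffer) -> Void) throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default, options: [])
    try session.setActive(true)

    let input = engine.inputNode
    let format = input.outputFormat(forBus: 0)
    input.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
        handlePCMBuffer(buffer)   // PCM keeps flowing right up to the SIGKILL
    }
    engine.prepare()
    try engine.start()
}
```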

No AVAudioSession.interruptionNotification is delivered before the kill. iOS terminates the process cleanly with no warning event to our observer.
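For completeness, this is roughly how we observe interruptions (a sketch; logEvent is a placeholder for our event-log writer). The handler never fires before the kill:

```swift
import AVFoundation

// Sketch of our interruption observer; it is never invoked before the SIGKILL.
// logEvent(_:) is a placeholder for our on-disk event logger.
let interruptionObserver = NotificationCenter.default.addObserver(
    forName: AVAudioSession.interruptionNotification,
    object: AVAudioSession.sharedInstance(),
    queue: .main
) { note in
    guard let raw = note.userInfo?[AVAudioSessionInterruptionTypeKey] as? UInt,
          let type = AVAudioSession.InterruptionType(rawValue: raw) else { return }
    logEvent("audio-interruption: \(type == .began ? "began" : "ended")")
}
```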

Evidence

Our Swift observer module writes an event log to disk on every system event. On relaunch we ship it to our crash reporter. Excerpt from a recent kill on iOS 26.5 / build 2.1.32:

T=0.000s    session-start (engineRunning: true)
T=57.199s   app-will-resign-active (bufferCallbackCount: 22)
T=58.913s   app-did-enter-background (backgroundTimeRemaining: infinity, bufferCallbackCount: 39)
            [no further audio events captured]
            [Swift heartbeat written every 5s for next ~46 seconds]
T~105s      Process SIGKILLed (heartbeat last-alive: 09:31:01.597Z)

Background time before the kill: ~46 seconds. engineRunning was still true and bufferCallbackCount was still incrementing when the event log stopped capturing - the audio engine was alive and feeding buffers when iOS terminated us.
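The heartbeat in the log above is nothing exotic - a sketch, with eventLogURL standing in for our on-disk log file:

```swift
import Foundation

// Sketch of the 5-second heartbeat that produced the "last-alive" entries above.
// eventLogURL is a placeholder for our on-disk event log.
func startHeartbeat(to eventLogURL: URL) -> Timer {
    return Timer.scheduledTimer(withTimeInterval: 5, repeats: true) { _ in
        let stamp = ISO8601DateFormatter().string(from: Date())
        let line = "heartbeat last-alive: \(stamp)\n"
        if let data = line.data(using: .utf8),
           let handle = try? FileHandle(forWritingTo: eventLogURL) {
            handle.seekToEndOfFile()
            handle.write(data)
            try? handle.close()   // flushed to disk so it survives the SIGKILL
        }
    }
}
```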

What we've tried (35 documented attempts)

Hopefully not all of these are relevant, but listing for completeness:

  • Various AVAudioSession category/mode/options combinations (Default, Measurement, VoiceChat, .mixWithOthers, .defaultToSpeaker, .allowBluetoothHFP)
  • Parallel AVAudioRecorder writing a .caf file as a "real recording app" signal
  • SFSpeechRecognizer with requiresOnDeviceRecognition = true consuming PCM in-process (50s request rotation)
  • BGContinuedProcessingTask with Progress.completedUnitCount reporting monotonic progress every 5 seconds
  • Live Activity (ActivityKit) with NSSupportsLiveActivitiesFrequentUpdates = true
  • Live Activity update pushes via APNs (confirmed wake widget extension only, not host)
  • Silent device-token APNs background pushes (confirmed iOS ~5/day rate limit)
  • CallKit fake call (CXProvider + CXCallController) - works but creates the green pill UI which our product can't ship
  • WebRTC peer connection with active media stream (via react-native-webrtc loopback)
  • UIBackgroundModes: voip declaration (without CallKit)
  • beginBackgroundTask + engine bounce (Apple's own guidance says don't, our test confirmed it's actively harmful)
  • CLLocationManager background updates

Every variant dies at ~50s of background time; none survives.
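As one concrete example, the on-device SFSpeechRecognizer attempt (third bullet above) looked roughly like this - a sketch, with the ~50s request rotation omitted for brevity:

```swift
import AVFoundation
import Speech

// Sketch of the on-device SFSpeechRecognizer consumer (third attempt above).
// The ~50s request rotation is omitted for brevity.
func startOnDeviceRecognition(feeding engine: AVAudioEngine) -> SFSpeechAudioBufferRecognitionRequest? {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.supportsOnDeviceRecognition else { return nil }

    let request = SFSpeechAudioBufferRecognitionRequest()
    request.requiresOnDeviceRecognition = true

    recognizer.recognitionTask(with: request) { result, error in
        // Results are discarded; the point was only to keep PCM
        // consumed in-process as a "real recording app" signal.
        if let error { print("recognition error: \(error)") }
        _ = result
    }

    let input = engine.inputNode
    input.installTap(onBus: 0, bufferSize: 4096,
                     format: input.outputFormat(forBus: 0)) { buffer, _ in
        request.append(buffer)
    }
    return request
}
```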

What works on the same device

Three App Store transcription apps survive indefinite background recording on our exact device + iOS version. We have inspected their IPAs (Mach-O LC_LOAD_DYLIB analysis + embedded entitlement extraction):

Otter (com.aisense.otter) - UIBackgroundModes: audio + fetch + processing + remote-notification. Uses OneSignal-driven Live Activity push tokens + NotificationServiceExtension. No CallKit, PushKit, or WebRTC.

Granola (com.granola.ios-prod) - has UIBackgroundModes: voip, but the voip entry is for their separate outbound-phone-call feature (TwilioVoice + CallKit, lives in their PhoneCalls.framework). The recording path uses ONLY AVAudioRecorder + PlayAndRecord + ModeDefault + a Live Activity with frequentPushesEnabled. Zero PushKit anywhere in the bundle.

Transcribe Speech to Text by DENIVIP (ru.denivip.transcribe) - the smallest API surface: UIBackgroundModes: audio + remote-notification only. AVAudioEngine + .playAndRecord + .default + SFSpeechRecognizer consuming PCM. No CallKit, PushKit, BGTask, Live Activity, WebRTC, or VoIP.

Three apps, three different mechanisms, all working. We've implemented bits of all three approaches in our app and still die at 50s.

Apple Voice Memos (system app, private entitlements) also survives indefinite recording on the same device.

Questions

  1. What is the supported API path for indefinite background microphone-only recording on iOS 26.5? Voice Memos and competitor apps clearly accomplish this - what's the missing piece?

  2. Why does UIApplication.backgroundTimeRemaining return Double.greatestFiniteMagnitude at applicationDidEnterBackground when the process is terminated ~50 seconds later? Has the meaning of this property changed in iOS 26?

  3. What causes the iOS 26 process scheduler to revoke the audio-mode background runtime classification? No AVAudioSession.interruptionNotification is delivered before SIGKILL. Where can we observe the classification change?

  4. Does iOS 26 distinguish "audio recording with no audible output" from "audio recording with audible output (e.g. a media playback session)"? If so, what is the supported API to register as a recording-only background-audio app?

  5. Does BGContinuedProcessingTask (new in iOS 26) actually extend background CPU time for an app that is also using UIBackgroundModes: audio and an active AVAudioSession? Or is it for finish-what-you-started bursts only (per WWDC 2025 session 227)?

Any guidance - even pointers to specific WWDC sessions, sample code, or technotes - would be hugely appreciated. We've spent ~40+ hours on this and want to know what the supported path looks like in iOS 26.

Happy to share more event-log data, IPA inspection notes, or build a focused Xcode reproduction if helpful.

Thanks!
