Hi,
After updating to iOS 26, our app is facing playback failures with AVPlayer. The same code and streams work fine on iOS 18 and earlier.
Error - Domain[CoreMediaErrorDomain]:Code[-15628]:Desc[The operation couldn’t be completed.]:Underlying Error Domain[(null)]:Code[0]:Desc[(null)]
Environment:
iOS version: iOS 26
React Native: 0.69
Video library: react-native-video (AVPlayer under the hood)
Stream type: HLS (m3u8) with segment (.ts) files
Observed behaviour:
Playback starts normally on iOS 26.
The stream then fails at runtime after a few seconds or minutes (never on first load).
Network logs show 307 redirects on some segment requests. After this, AVPlayer throws the above error.
Playback fails intermittently on slow/unstable networks.
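To aid debugging, here is the diagnostic we are adding (standard AVFoundation error-log APIs; player stands for the AVPlayer instance that react-native-video manages internally):

import AVFoundation

// Dump the HLS error log on each new entry; for CoreMediaErrorDomain failures
// this usually includes the failing segment URI and the server status code,
// which should show whether the 307 redirects are the trigger.
let observer = NotificationCenter.default.addObserver(
    forName: .AVPlayerItemNewErrorLogEntry,
    object: player.currentItem,
    queue: .main
) { notification in
    guard let item = notification.object as? AVPlayerItem,
          let event = item.errorLog()?.events.last else { return }
    print("HLS error \(event.errorStatusCode) in \(event.errorDomain), uri: \(event.uri ?? "-")")
}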
In our Apple TV application, we use the native AVPlayer for live playback. Until tvOS 17.6, and throughout the tvOS 18 beta, the Pause/Resume feature worked as expected, allowing us to pause live playback. After updating to tvOS 18.1, however, the pause functionality no longer works.
The same app still works fine on tvOS 17, but on tvOS 18.1 attempting to pause live playback has no effect. We reviewed the tvOS 18 release notes but couldn't find any relevant changes or deprecations related to AVPlayer or live playback behavior.
Has there been any change in the handling of live playback or the Pause/Resume functionality in tvOS 18.1? Any guidance or suggestions to address this issue would be greatly appreciated.
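For reference, a check along these lines (standard AVFoundation; the 30-second threshold is just an example) confirms whether tvOS 18.1 still reports a seekable DVR window for the stream, without which pausing live content cannot work:

import AVFoundation

// Pausing a live stream only makes sense when the item reports a seekable
// (DVR) window; with no window, pause having no effect is the expected
// behavior rather than a bug.
func canPauseLive(_ player: AVPlayer) -> Bool {
    guard let range = player.currentItem?.seekableTimeRanges.last?.timeRangeValue else {
        return false
    }
    return range.duration.seconds > 30 // example threshold for "has a DVR window"
}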
Thank you!
Hi,
I have a use case where I'd like to handle and prevent automatic retries whenever certain errors occur during FairPlay content key requests.
Here's the current flow:
1. The FairPlay certificate is requested and obtained from my server.
2. makeStreamingContentKeyRequestData is called on the keyRequest.
3. The license server returns a 403 along with a response body containing JSON with a detailed code and message.
4. The error is caught and handled properly by calling AVContentKeyRequest.processContentKeyResponseError.
5. The AVContentKeySession automatically retries up to 8 times by providing a new key request through public func contentKeySession(_ session: AVContentKeySession, didProvide keyRequest: AVContentKeyRequest).
6. My license server gets hit with 8 requests that will always result in a 403; these retries are useless.
My custom error is successfully caught later down the line through AVPlayerItem.observe(\.status), which is great.
The thing is, I'd like to catch the 403 error and prevent any retry from being made at step 5, ideally through
public func contentKeySession(_ session: AVContentKeySession, contentKeyRequest keyRequest: AVContentKeyRequest, didFailWithError err: Error)
I've looked for quite a while and just can't seem to find any way of achieving this. Is this not supported at all?
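In the meantime, the workaround I'm considering (a sketch, not a documented API; MyDRMError and the caching logic are my own) is to remember key identifiers that already failed with a non-retryable status and fail the retried requests immediately, so the license server is only hit once:

import AVFoundation

enum MyDRMError: Error { case forbidden } // hypothetical error type

final class KeyDelegate: NSObject, AVContentKeySessionDelegate {
    private var failedKeyIDs = Set<String>()

    func contentKeySession(_ session: AVContentKeySession,
                           didProvide keyRequest: AVContentKeyRequest) {
        guard let keyID = keyRequest.identifier as? String else { return }
        if failedKeyIDs.contains(keyID) {
            // Short-circuit the automatic retry: report the cached failure
            // without contacting the license server again.
            keyRequest.processContentKeyResponseError(MyDRMError.forbidden)
            return
        }
        // ...normal certificate/SPC/license exchange; on a 403 from the server:
        // failedKeyIDs.insert(keyID)
        // keyRequest.processContentKeyResponseError(MyDRMError.forbidden)
    }
}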
I'm seeing a crash in an app that plays videos when the user activates closed captions.
I was able to reproduce the issue in an empty project. The crash happens when the AVPlayerLayer is used to instantiate an AVPictureInPictureController.
This is the example project in which I reproduced the crash:
import SwiftUI
import AVFoundation

struct ContentView: View {
var body: some View {
VStack {
VideoPlaylistView()
}
.frame(maxWidth: .infinity, maxHeight: .infinity)
.background(Color.black.ignoresSafeArea())
}
}
class VideoPlaylistViewModel: ObservableObject {
// Test with other videos
var player: AVPlayer? = AVPlayer(url: URL(string:"https://d2ufudlfb4rsg4.cloudfront.net/newsnation/WIpkLz23h/adaptive/WIpkLz23h_master.m3u8")!)
}
struct VideoPlaylistView: View {
@StateObject var viewModel = VideoPlaylistViewModel()
var body: some View {
ScrollView {
VideoCellView(player: viewModel.player)
.onAppear {
viewModel.player?.play()
}
}
.scrollTargetBehavior(.paging)
.ignoresSafeArea()
}
}
struct VideoCellView: View {
let player: AVPlayer?
@State var isCCEnabled: Bool = false
var body: some View {
ZStack {
PlayerView(player: player)
.accessibilityIdentifier("Player View")
}
.containerRelativeFrame([.horizontal, .vertical])
.overlay(alignment: .bottom) {
Button {
player?.currentItem?.asset.loadMediaSelectionGroup(for: .legible) { group, error in
    // The completion handler isn't guaranteed to run on the main thread,
    // so hop back before mutating view state.
    DispatchQueue.main.async {
        if let group {
            let option = !isCCEnabled ? group.options.first : nil
            player?.currentItem?.select(option, in: group)
            isCCEnabled.toggle()
        }
    }
}
} label: {
Text("Close Captions")
.font(.subheadline)
.foregroundStyle(isCCEnabled ? .red : .primary)
.buttonStyle(.bordered)
.padding(8)
.background(Color.blue.opacity(0.75))
}
.padding(.bottom, 48)
.accessibilityIdentifier("Button Close Captions")
}
}
}
import Foundation
import UIKit
import SwiftUI
import AVFoundation
import AVKit
struct PlayerView: UIViewRepresentable {
let player: AVPlayer?
func updateUIView(_ uiView: UIView, context: UIViewRepresentableContext<PlayerView>) {
}
func makeUIView(context: Context) -> UIView {
let view = PlayerUIView()
view.playerLayer.player = player
view.layer.addSublayer(view.playerLayer)
view.layer.backgroundColor = UIColor.red.cgColor
view.pipController = AVPictureInPictureController(playerLayer: view.playerLayer)
view.pipController?.requiresLinearPlayback = true
view.pipController?.canStartPictureInPictureAutomaticallyFromInline = true
view.pipController?.delegate = view
return view
}
}
class PlayerUIView: UIView, AVPictureInPictureControllerDelegate {
let playerLayer = AVPlayerLayer()
var pipController: AVPictureInPictureController?
override init(frame: CGRect) {
super.init(frame: frame)
}
required init?(coder: NSCoder) {
fatalError("init(coder:) has not been implemented")
}
override func layoutSubviews() {
super.layoutSubviews()
playerLayer.frame = bounds
playerLayer.backgroundColor = UIColor.green.cgColor
}
func pictureInPictureController(_ pictureInPictureController: AVPictureInPictureController, failedToStartPictureInPictureWithError error: any Error) {
print("Error starting Picture in Picture: \(error.localizedDescription)")
}
}
class AppDelegate: NSObject, UIApplicationDelegate {
func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey : Any]? = nil) -> Bool {
let audioSession = AVAudioSession.sharedInstance()
do {
try audioSession.setCategory(.playback, mode: .moviePlayback)
try audioSession.setActive(true)
} catch {
print("ERR: \(error.localizedDescription)")
}
return true
}
}
UI test that reproduces the crash:
import XCTest

final class VideoPlaylistSampleUITests: XCTestCase {
func testCrashiOS26ToggleCloseCaptions() throws {
let app = XCUIApplication()
app.launch()
let videoPlayer = app.otherElements["Player View"]
XCTAssertTrue(videoPlayer.waitForExistence(timeout: 30))
let closeCaptionButton = app.buttons["Button Close Captions"]
for _ in 0..<2000 {
closeCaptionButton.tap()
}
}
}
We are getting reports from customers that they are unable to play videos in our app after updating their phones to iOS 18.3.1.
(Further checking indicates that it happens on all iOS 18 versions; it suddenly started occurring on February 18th, 2025.)
When checking logs, we see that playback is failing due to CoreMediaErrorDomain error -42709.
This is an undocumented error code, so we do not know the cause of the playback issue.
Does anyone know what this error code means and how the app should handle it?
Reported as FB16638501.
Hi everyone,
We're currently developing a music-based app using MusicKit, and we recently noticed that the iOS 26 beta introduces a new "Automix" feature in the Apple Music app. This enables seamless, DJ-style transitions between songs, beyond the standard crossfade functionality.
We’re trying to understand:
Will this Automix feature be accessible to third-party apps that use MusicKit?
If not available in the initial iOS 26 release, is there a plan to expose it through public APIs in a future update?
Is there any technical documentation, WWDC session, or roadmap info regarding Automix support via MusicKit?
This functionality would be a significant enhancement for our app, especially for intelligent audio transitions and curated playlists.
Thanks.
I have an HDR10+ encoded video that plays back on the Apple Vision Pro when loaded as a .mov. But when that video is encoded using the latest (1.23b) Apple HLS tools to generate an fMP4, the resulting m3u8 cannot be played back on the Apple Vision Pro; I only get a "Cannot Open" error.
To generate the m3u8, I'm just calling mediafilesegmenter (with -iso-fragmented) and then variantplaylistcreator. This completes with no errors, and the m3u8 plays back on the Mac in VLC, but not on the Apple Vision Pro.
The relevant part of the m3u8 is:
#EXT-X-STREAM-INF:AVERAGE-BANDWIDTH=40022507,BANDWIDTH=48883974,VIDEO-RANGE=PQ,CODECS="ec-3,hvc1.1.60000000.L180.B0",RESOLUTION=4096x4096,FRAME-RATE=24.000,CLOSED-CAPTIONS=NONE,AUDIO="audio1",REQ-VIDEO-LAYOUT="CH-STEREO"
{{url}}
Has anyone been able to use the HLS tools to generate fMP4s of MV-HEVC videos with HDR10?
Hi,
I have been working on a project that lets users listen to their favorite music through a streaming service, which so far has been Spotify. The app had a programmable 3D/2D interface with the ability to connect to devices in your home and have them react to music. As of September 2024, Spotify decommissioned their Audio Analysis API. I have seen other posts mention playing Apple Music through AVFoundation, which would break DRM and so is not supported. However, the Spotify Audio Analysis API did not allow for a full frequency reconstruction either: it is entirely temporal data on beats, kicks, loudness, and timbre changes, which are themselves operators on the spectral data from the FFT. It would be very useful for the developer community to get this capability, and it would probably make Apple Music a lot more popular among developers and those who use their apps.
Would love to hear your thoughts about this and Happy New Year!
We moved to another streaming service and need to deliver an ASK, a .pem key, and a CRT to enable DRM. The issue is that we no longer have that information.
The most logical step would be to revoke the current certificate and create a new one. Unfortunately, there is no revoke button for FairPlay Streaming certificates.
We asked Developer Support, who weren't able to help. We then made a revocation request as described in article 2.7 of the Apple Developer Program License Agreement; they can only do this when the certificate is compromised.
So now we are stuck. Anyone out there who had the same issue and found a solution?
Your help is much appreciated.
Hello,
Our users have started to see a new fatal AVPlayer error during playback starting with iOS/tvOS 18.0. The error is defined as "CoreMediaErrorDomain Code=-15486".
We have not been able to reproduce this issue locally within our development team.
Is there any documentation on the cause of this error or steps to recover from this error?
Thank you,
Howard
Hello, our application is unable to output FairPlay-protected content over HDMI to a TV via the official Lightning HDMI AV Adapter. Checking the mediaplaybackd console log, we found that a CoreMediaErrorDomain Code=-19156 error is raised, but we cannot find what this error code means.
default 11:18:15.121584+0800 mediaplaybackd keyboss ckb_customURLReadCallback: 0x7fa62f800 60/0 customURLReqID 4 isComplete 1 err -19156 error <private> (0) dokeyCallbacksExist 0
default 11:18:15.121670+0800 mediaplaybackd keyboss ckb_processErrorForRequest: 0x7fa62f800 60/0 handler 4 err 0
default 11:18:15.121752+0800 mediaplaybackd <<<< FigCustomURLHandling >>>> curll_cancelRequestOnQueue: 0x7fa031360: requestID: 4
default 11:18:15.121932+0800 mediaplaybackd keyboss ckb_transitionRequestToTerminalState: 0x7fa62f800 60/0 reqFin err Error Domain=CoreMediaErrorDomain Code=-19156 (-19156) dokeyCallbacksExist 0
default 11:18:15.122025+0800 mediaplaybackd keyboss ckb_transitionRequestToTerminalState: 0x7fa62f800 60/0 retry
default 11:18:15.123195+0800 mediaplaybackd <<<< FigCPECryptorPKD >>>> PostKeyRequestErrorOccurred: 0x7fab7be80 029592C2-093D-400D-B57F-7AB06CC292D1 key request error: Error Domain=CoreMediaErrorDomain Code=-19160 (-19160)
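One avenue we are exploring on our own (a guess, not a confirmed cause of -19156) is whether the content key's HDCP requirement is unmet on the HDMI path; AVPlayer flags exactly that condition, as in this sketch:

import AVFoundation

// If the content key demands output protection that the external display
// can't satisfy, AVPlayer obscures the output and raises this flag instead
// of surfacing a descriptive error.
func watchExternalProtection(on player: AVPlayer) -> NSKeyValueObservation {
    player.observe(\.isOutputObscuredDueToInsufficientExternalProtection,
                   options: [.initial, .new]) { player, _ in
        if player.isOutputObscuredDueToInsufficientExternalProtection {
            print("Output obscured: the external display fails the key's protection requirement")
        }
    }
}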
Hi everyone,
After updating my Apple TV HD (model A1625) to tvOS 26, I've noticed a significant spike in CPU usage, up to 3× higher than before the update: it goes from around 40% to 120%.
Model: Apple TV HD (A1625)
tvOS version: 26 (stable release) and the 26.1 beta
The app downgrades the stream due to lack of CPU power
If anyone else is experiencing this, please share your findings or workarounds.
Would love to hear from Apple engineers or other developers if this is a known regression or if there’s a recommended fix.
Thanks!
Hi,
It seems like it's pretty easy to consume HTTP Live Streaming content in an iOS app. Unfortunately, I need to consume media from an RTSP server. It seems to me that this is a very similar thing, and that all of the underpinnings for doing it ought to be present in iOS, but I'm having a devil of a time figuring out how to make it work without doing a lot of programming.
For starters, I know that there are web-based services that can consume an RTSP stream and rebroadcast it as an HTTP Live Stream that can be easily consumed by the media players in iOS. This won't work for me because my application needs to function in an environment where there is no internet access (it's on a private Wifi network where the only other thing on the network is the device that is serving the RTSP stream).
Having read everything I can get my hands on and exploring third-party and open-source solutions, I've compiled the following list of ideas:
1. Using an iOS build of the open-source ffmpeg library, which supports RTSP, I've come up with a test app that can receive the RTSP packets, decode them, create UIImages out of the frames, and display those frames on-screen. This provides a crude player, but performance is poor, most likely because ffmpeg can't take advantage of any hardware acceleration. It also doesn't provide me with any way to integrate the video stream into AVFoundation, so I'm on my own as far as saving the stream to a file, transcoding it, etc.
2. I know that the AVURLAsset class doesn't directly support the RTSP scheme. Since I have access to the undecoded RTSP packets via ffmpeg, I've thought it should be possible to implement RTSP support myself via a custom NSURLProtocol, essentially fooling AVFoundation into reading those packets as if they originated in a file. I'm not sure if this would work, since the raw packets coming from the RTSP server might lack the headers that would otherwise be present in data being read from a file. I'm not even sure if AVFoundation would recognize my custom protocol.
3. If a protocol doesn't work, I've considered that I might be able to implement my own local HTTP Live Streaming server that converts the RTSP packets into an HTTP stream that the media players can read. This sounds like a terribly convoluted solution to the problem, at best, and very difficult at worst.
4. Going back to solution (1), if I could speed up the decoding by using some iOS CoreVideo function instead of ffmpeg, this solution might be okay. However, I can't find any documentation for CoreVideo on iOS (Apple only documents it for OS X). (See the sketch in the P.S. below.)
5. I'm certainly willing to license a third-party solution if it works well and provides good performance. Unfortunately, everything I've found so far is pretty crummy and mostly just leverages ffmpeg and/or VLC.
What is most disappointing to me is that nobody seems to be able or willing to provide a solution that neatly integrates with AVFoundation. I really want to make my RTSP stream available as an AVAsset so I can use it with AVFoundation players and other classes -- I don't want to build an app that relies on custom third-party code for everything.
Any ideas, tips, advice would be greatly appreciated.
Thanks,
Frank
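P.S. To make idea (4) concrete, here's the direction I have in mind, assuming the H.264 NAL units from the RTSP stream can be repackaged into CMSampleBuffers (format description built from the stream's SPS/PPS): AVSampleBufferDisplayLayer decodes in hardware and displays the frames directly, skipping both ffmpeg's software decoder and the UIImage round-trip.

import AVFoundation
import UIKit

// A view backed by AVSampleBufferDisplayLayer; feed it compressed H.264
// samples and it decodes in hardware. Packetizing the RTSP NAL units into
// length-prefixed CMSampleBuffers is left to the networking layer.
final class RTSPVideoView: UIView {
    override class var layerClass: AnyClass { AVSampleBufferDisplayLayer.self }
    private var displayLayer: AVSampleBufferDisplayLayer {
        layer as! AVSampleBufferDisplayLayer
    }

    func enqueue(_ sampleBuffer: CMSampleBuffer) {
        if displayLayer.isReadyForMoreMediaData {
            displayLayer.enqueue(sampleBuffer)
        }
    }
}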
I use AVPlayer to play HLS video successfully on macOS Sonoma, but I encountered this error on macOS Sequoia. Please help me:
Error Domain=AVFoundationErrorDomain Code=-11833 ‘Cannot Decode’ UserInfo={NSUnderlyingError=0x600001e57330 {Error Domain=CoreMediaErrorDomain Code=-12906 ‘(null)’}, NSLocalizedFailureReason=The decoder required for this media cannot be found., AVErrorMediaTypeKey=vide, NSLocalizedDescription=Cannot Decode}
Thanks!
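P.S. One quick check, in case it helps (a guess at the cause, since -12906 reports a missing decoder): confirm the Mac actually has a decoder for the codecs the stream's variants use.

import VideoToolbox

// -12906 pairs with "the decoder required for this media cannot be found",
// so verify hardware decode support for the codecs in the HLS variants.
print("HEVC hardware decode:", VTIsHardwareDecodeSupported(kCMVideoCodecType_HEVC))
print("H.264 hardware decode:", VTIsHardwareDecodeSupported(kCMVideoCodecType_H264))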
Hi,
I have an iOS app that uses FairPlay DRM to play videos. The iOS app allows offline downloads of the videos, so we obtain a persistent FairPlay license. In the iOS app everything works fine.
We have now built the same app for Mac Catalyst. In the Mac Catalyst app we are not able to play the video and get error code -42650.
We are able to get the persistent license from the server, but when we play the video with that license we get the error. Below are the logs:
2024-12-06 22:05:48.911266+0530 0x4dffe2 Default 0x0 85505 0 teachonline: (MediaToolbox) [com.apple.coremedia:] <<<< FigPKDKeyManager >>>> keyManager_processOfflineKeyInternal: 0x600000322000 160D4519-C60B-4FD0-B69A-20B2A4597017 created decrypt context:0x0 with offline key; updated offline key:0x0 err:-42650
2024-12-06 22:05:48.911369+0530 0x4dffe2 Default 0x0 85505 0 teachonline: (MediaToolbox) [com.apple.coremedia:player] <<<< FigStreamPlayer >>>> fpfs_ensureDecryptorHasStarted: [0x7fc44e4dc520|P/NW] <0x7fc44fa44000|I/SRA.01>: track 1 latching decryptorFailure -42650
85505 0 teachonline: (MediaToolbox) [com.apple.coremedia:player] <<<< FigStreamPlayer >>>> fpfs_StopPlayingItem: [0x7fc44e4dc520|P/NW] <0x7fc44fa44000|I/SRA.01>: Pausing, err=Error Domain=CoreMediaErrorDomain Code=-42650 "(null)"
I have copied only the lines which has errors. You can download the full logs from https://drive.google.com/file/d/1feb9pKZERUr--PMt6m-6IrO_mDvoFbjO/view?usp=sharing
Can you please help me fix this issue?
I'm developing an iOS radio app that plays various HLS streams. The challenge is that some stations broadcast HLS streams containing both audio and video (example: https://svs.itworkscdn.net/smcwatarlive/smcwatar/chunks.m3u8), but I want to:
Extract and play only the audio track
Support AirPlay for audio-only streaming
Minimize data usage by not downloading video content
Technical Details:
iOS 17+
Swift 5.9
Using AVFoundation for playback
Current implementation uses AVPlayer with AVPlayerItem
Current Code Structure:
import AVFoundation
import Combine

class StreamPlayer: ObservableObject {
    @Published var isPlaying = false
    private var player: AVPlayer?
    private var playerItem: AVPlayerItem?

    func playStream(url: URL) {
        let asset = AVURLAsset(url: url)
        playerItem = AVPlayerItem(asset: asset)
        player = AVPlayer(playerItem: playerItem)
        player?.play()
    }
}
Stream Analysis:
When analyzing the video stream using FFmpeg:
Input #0, hls, from 'https://svs.itworkscdn.net/smcwatarlive/smcwatar/chunks.m3u8':
Stream #0:0: Video: h264, yuv420p(tv, bt709), 1920x1080 [SAR 1:1 DAR 16:9], 25 fps
Stream #0:1: Audio: aac, 44100 Hz, stereo, fltp
Attempted Solutions:
Using MobileFFmpeg:
let command = [
"-i", streamUrl,
"-vn",
"-acodec", "aac",
"-ac", "2",
"-ar", "44100",
"-b:a", "128k",
"-f", "mpegts",
"udp://127.0.0.1:12345"
].joined(separator: " ")
ffmpegProcess = MobileFFmpeg.execute(command)
Issue: While FFmpeg successfully extracts audio, playback through AVPlayer doesn't work reliably.
Tried using HLS output:
let command = [
"-i", streamUrl,
"-vn",
"-acodec", "aac",
"-ac", "2",
"-ar", "44100",
"-b:a", "128k",
"-f", "hls",
"-hls_time", "2",
"-hls_list_size", "3",
outputUrl.path
]
Issue: Creates temporary files but faces synchronization issues with live streams.
Requirements:
Real-time audio extraction from HLS stream
Maintain live streaming capabilities
Full AirPlay support
Minimal data usage (avoid downloading video content)
Handle network interruptions gracefully
Questions:
What's the most efficient way to extract only audio from an HLS stream in real-time?
Is there a way to tell AVPlayer to ignore video tracks completely?
Are there better alternatives to FFmpeg for this specific use case?
What's the recommended approach for handling AirPlay with modified streams?
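Regarding question 2 above, the closest thing I've found (a sketch; note the caveat in the comments) is disabling the video tracks on the AVPlayerItem:

import AVFoundation

// Disabling a video track stops decode and display. Caveat: with muxed .ts
// segments (audio and video in the same files) the video bytes are still
// downloaded; only a separate audio rendition in the master playlist avoids
// that. Tracks populate asynchronously, so call this once the item is
// .readyToPlay.
func disableVideoTracks(on item: AVPlayerItem) {
    for track in item.tracks where track.assetTrack?.mediaType == .video {
        track.isEnabled = false
    }
}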
Any guidance or alternative approaches would be greatly appreciated. Thank you!
Hi Apple Team,
We integrated FairPlay Streaming Server SDK v3 into our MDRM platform in 2017; the system has been stable and untouched since. As you know, both Widevine and PlayReady require their server SDKs to be upgraded regularly. We want to know whether Apple imposes similar requirements for upgrading the FPS SDK, or whether we may continue using the old one without any updates.
Thanks for your support!
Hello All,
I am looking for assistance with our FairPlay Streaming (FPS) certificates. We are in the process of migrating to a new video streaming vendor and need to create a new FPS certificate using SDK 4. However, we have reached the limit of allowed FPS certificates in our account and cannot create a new one.
Issue Details:
• We currently have two FPS certificates active in our developer account.
• One of these was created using SDK 5, but our new vendor (Mux) requires an FPS certificate based on SDK 4.
• Since Apple does not allow deleting FPS certificates from the developer portal, we are unable to create a new SDK 4 certificate.
• We kindly request Apple to revoke one of our existing FPS certificates to allow us to generate a new SDK 4 certificate.
Request:
We would greatly appreciate your assistance in revoking one of our existing FPS certificates so that we can proceed with creating a new SDK 4 certificate for our vendor integration.
Thank you for your support.
Hello,
I am developing a video streaming service that uses FairPlay. Since around February 20th, we have started receiving reports of CoreMediaErrorDomain -42709 errors.
Unfortunately, there is no documentation from Apple that explains what this error means, so we are not sure how to address or fix the issue.
Most of the users who reported this error are using iOS 18.2.1 and iOS 18.3.1.
Could you please advise on what we should check or how we might resolve this error?
Hello Apple Developer Community,
I am trying to play an HLS stream using the React Native Video player (it uses AVPlayer under the hood). I can play the stream smoothly, but in some cases the player cannot play the stream properly.
Behaviour:
react-native-video: I am getting the error below.
Error details from the react-native-video player:
Error Code: -12971
Domain: CoreMediaErrorDomain
Localised Description: The operation couldn’t be completed. (CoreMediaErrorDomain error -12971.)
Target: 2457
The error does not provide a specific failure reason or recovery suggestion, which makes troubleshooting challenging.
AVPlayer in a native iOS project: video playback stopped after playing for a few seconds.
AVPlayer configuration:
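// Note: a 1-second forward buffer is very aggressive for HLS; the default
// value of 0 lets AVPlayer choose, and small values can contribute to stalls
// on slow networks.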
player.currentItem?.preferredForwardBufferDuration = 1
player.automaticallyWaitsToMinimizeStalling = true
N.B.: The same buffer duration is working perfectly for others.
Stream properties:
video resolution: 1280 x 720
I have attached an overview report generated from MediaStreamValidator.
I would appreciate any insights or suggestions on how to address this error. Has anyone in the community experienced a similar issue or have any advice on potential solutions?
Thank you for your help!
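Edit: one more data point that may help anyone investigating. The item's access log records stalls and bitrate switches leading up to the failure; a minimal sketch (player stands for our AVPlayer instance):

import AVFoundation

// Each access-log event summarizes a stretch of playback; repeated stalls and
// a falling indicated bitrate just before the -12971 would point to network
// starvation rather than a malformed stream.
if let events = player.currentItem?.accessLog()?.events {
    for event in events {
        print("stalls: \(event.numberOfStalls)",
              "indicated: \(event.indicatedBitrate)",
              "observed: \(event.observedBitrate)")
    }
}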