Streaming


Deep dive into the technical specifications that influence seamless playback for streaming services, including bitrates, codecs, and caching mechanisms.


Posts under Streaming subtopic


AVFoundationErrorDomain Code=-11819 in AVPlayer – Causes & Fixes?
We are encountering an issue where AVPlayer throws the error:

Error Domain=AVFoundationErrorDomain Code=-11819 "Cannot Complete Action" > Underlying Error Domain[(null)]:Code[0]:Desc[(null)]

The error occurs intermittently during video playback, especially after extended usage or when switching between different streams. We see Error 11819 (AVFoundationErrorDomain) in the Conviva platform for some of our users, but we have not been able to reproduce it so far, and we need help determining the root cause and/or best practices to prevent it. Some questions we have:
What typically triggers this error?
Could it be related to memory/resource constraints, network instability, or backgrounding?
Are there any recommended ways to handle or recover from this error gracefully?
Any insights or guidance would be greatly appreciated. Thanks!
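Editor's note: code -11819 corresponds to AVError.mediaServicesWereReset, which AVFoundation raises when the system media-services daemon is torn down and restarted; Apple's documented guidance for that case is to discard all existing AVFoundation objects and rebuild them. Below is a minimal sketch of that recovery pattern. The notification names are stock AVFoundation API; the surrounding class and the resume logic are illustrative.

import AVFoundation

final class PlayerRecovery: NSObject {
    private var player: AVPlayer?
    private var lastURL: URL? // remembered so we can resume after a reset

    override init() {
        super.init()
        // mediaserverd can be killed and restarted by the system; AVFoundation
        // surfaces this as AVError.mediaServicesWereReset (-11819).
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(mediaServicesWereReset),
            name: AVAudioSession.mediaServicesWereResetNotification,
            object: AVAudioSession.sharedInstance())
    }

    func play(url: URL) {
        lastURL = url
        let item = AVPlayerItem(url: url)
        // Observe per-item failures (e.g. stalls that end in an error).
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(itemFailed(_:)),
            name: .AVPlayerItemFailedToPlayToEndTime,
            object: item)
        player = AVPlayer(playerItem: item)
        player?.play()
    }

    @objc private func itemFailed(_ note: Notification) {
        let error = note.userInfo?[AVPlayerItemFailedToPlayToEndTimeErrorKey] as? NSError
        print("Playback failed:", error ?? "unknown")
    }

    @objc private func mediaServicesWereReset() {
        // All existing AVPlayer/AVPlayerItem instances are invalid after a
        // reset; recreate them rather than reusing them.
        if let url = lastURL { play(url: url) }
    }
}

A reset under memory pressure would also be consistent with the error correlating with extended usage and frequent stream switching.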
Replies: 0 · Boosts: 0 · Views: 128 · Activity: Apr ’25
Assistance Needed: CoreMediaErrorDomain Error -12971
Hello Apple Developer Community,
I am trying to play an HLS stream using the React Native Video player (which uses AVPlayer underneath). I can play the stream smoothly, but in some cases the player cannot play the stream properly.
Behaviour with react-native-video — I am getting the error below:
Error Code: -12971
Domain: CoreMediaErrorDomain
Localised Description: The operation couldn’t be completed. (CoreMediaErrorDomain error -12971.)
Target: 2457
The error does not provide a specific failure reason or recovery suggestion, which makes troubleshooting challenging.
Behaviour with AVPlayer in a native iOS project: video playback stopped after playing a few seconds.
AVPlayer configuration:
player.currentItem?.preferredForwardBufferDuration = 1
player.automaticallyWaitsToMinimizeStalling = true
N.B.: The same buffer duration is working perfectly for others.
Stream properties: video resolution: 1280 x 720.
I have attached an overview report generated from MediaStreamValidator. I would appreciate any insights or suggestions on how to address this error. Has anyone in the community experienced a similar issue, or does anyone have advice on potential solutions? Thank you for your help!
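Editor's note: when a CoreMediaErrorDomain code comes back with no failure reason, the per-item error log usually carries more detail, including the URI of the playlist or segment that failed. A small helper using only stock AVFoundation:

import AVFoundation

// Dump CoreMedia error details that AVPlayer does not surface in the NSError
// itself; errorComment and uri often identify the failing playlist/segment.
func dumpErrorLog(for item: AVPlayerItem) {
    guard let log = item.errorLog() else { return }
    for event in log.events {
        print("domain:", event.errorDomain,
              "status:", event.errorStatusCode,
              "comment:", event.errorComment ?? "n/a",
              "uri:", event.uri ?? "n/a")
    }
}

The companion accessLog() on AVPlayerItem is similarly useful for spotting bitrate switches and stalls around the moment playback stops.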
Replies: 0 · Boosts: 1 · Views: 125 · Activity: Apr ’25
Clarification of use of `AVAssetDownloadConfiguration` and `AVAssetDownloadTask` to persist MP3 Audio downloads/streams
Hello! I have been following the UsingAVFoundationToPlayAndPersistHTTPLiveStreams sample code in order to test persisting streams to disk. In addition to support for m3u8, I have noticed in testing that this also seems to work for MP3 audio, simply by changing the plist entries to point to remote URLs with audio/mpeg content. Is this expected, or are there caveats that I should be aware of? Thank you!
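Editor's note: for context, here is a minimal sketch of the download flow the sample code wraps. AVAssetDownloadURLSession, AVAssetDownloadConfiguration, and the willDownloadTo callback are stock AVFoundation (iOS 15+); the session identifier, title, and URL are placeholders, and whether progressive MP3 is officially supported by this path is exactly the open question in the post.

import AVFoundation

final class Downloader: NSObject, AVAssetDownloadDelegate {
    private var session: AVAssetDownloadURLSession!

    override init() {
        super.init()
        // Asset downloads must run on a background session.
        let config = URLSessionConfiguration.background(withIdentifier: "audio-downloads")
        session = AVAssetDownloadURLSession(configuration: config,
                                            assetDownloadDelegate: self,
                                            delegateQueue: .main)
    }

    func download(url: URL) {
        let asset = AVURLAsset(url: url) // e.g. a remote audio/mpeg URL
        let downloadConfig = AVAssetDownloadConfiguration(asset: asset, title: "Episode")
        session.makeAssetDownloadTask(downloadConfiguration: downloadConfig).resume()
    }

    // Persist this location (e.g. in user defaults); the file itself is
    // managed by the system under the app's Library directory.
    func urlSession(_ session: URLSession,
                    assetDownloadTask: AVAssetDownloadTask,
                    willDownloadTo location: URL) {
        print("Downloading to:", location)
    }
}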
Replies: 0 · Boosts: 0 · Views: 61 · Activity: Apr ’25
Apple Music web player will not work in a WKWebView web browser or an Electron Chromium browser.
Hello, I'm trying to create a web browser, but currently, when signed into the Apple Music web player, I get the following message when I attempt to play on any version of my web browser: "Not available on the web. You can listen to this in the Apple Music app." Is there a way to set up DRM (assuming this is the issue) with Apple to allow my web browser to play this content? I believe Apple TV is also affected. Thank you ahead of time.
Replies: 0 · Boosts: 0 · Views: 128 · Activity: May ’25
Can I Fade Out Track Volume Before End Using ApplicationMusicPlayer?
I’m building a music app using Apple Music streaming via ApplicationMusicPlayer. My goal is to decrease the volume of the current song during the last 10 seconds, and when the next track begins, restore the volume to its normal level. I know that ApplicationMusicPlayer doesn’t expose a volume API, and I want to avoid triggering the system volume HUD.
✅ Using Apple Music streaming (not local files)
❓ Is it possible to implement per-track fade-out/fade-in logic with ApplicationMusicPlayer?
Appreciate any clarification or official guidance!
Replies: 0 · Boosts: 0 · Views: 70 · Activity: Jun ’25
Unable to hear audio via speaker on a real iOS device
This is my native module code implementation. I'm getting a base64-encoded string from the server and passing it to my native PCM player module to play audio.

App.tsx:

PcmPlayer.writeChunk(e.data);

PcmPlayer.swift:

import AVFoundation

@objc(PcmPlayer)
class PcmPlayer: RCTEventEmitter {
    private var engine: AVAudioEngine?
    private var playerNode: AVAudioPlayerNode?
    private var format: AVAudioFormat?
    private var bufferQueue = [Data]()
    private var isPlaying = false
    private var hasEnded = false
    private var scheduledBufferCount = 0
    private let minBufferBytes = 50000
    private let pcmQueue = DispatchQueue(label: "pcm.queue")

    override init() {
        super.init()
    }

    override func supportedEvents() -> [String]! {
        return ["onStatus", "onMessage"]
    }

    @objc(initPlayer:channels:bitsPerSample:)
    func initPlayer(_ sampleRate: NSNumber, channels: NSNumber, bitsPerSample: NSNumber) {
        pcmQueue.async {
            self.stopInternal()
            let session = AVAudioSession.sharedInstance()
            do {
                try session.setCategory(.playback, mode: .default, options: [])
                try session.setActive(true, options: .notifyOthersOnDeactivation)
                try session.setMode(.default)
                print("🔈 Audio session active. Output route:", session.currentRoute.outputs)
            } catch {
                print("❌ Audio session setup failed:", error)
                return
            }
            self.engine = AVAudioEngine()
            self.playerNode = AVAudioPlayerNode()
            guard let engine = self.engine, let playerNode = self.playerNode else {
                print("❌ Engine or playerNode is nil")
                return
            }
            engine.attach(playerNode)
            self.format = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                        sampleRate: sampleRate.doubleValue,
                                        channels: AVAudioChannelCount(channels.uintValue),
                                        interleaved: false)
            guard let format = self.format else {
                print("❌ Failed to create AVAudioFormat")
                return
            }
            engine.connect(playerNode, to: engine.mainMixerNode, format: format)
            do {
                try engine.start()
                playerNode.play()
                engine.mainMixerNode.outputVolume = 1.0
                print("✅ AVAudioEngine started with format:", format)
            } catch {
                print("❌ Engine start failed:", error)
            }
            self.hasEnded = false
        }
    }

    @objc(writeChunk:)
    func writeChunk(_ base64Pcm: String) {
        pcmQueue.async {
            guard base64Pcm.count >= 10 else {
                print("⚠️ Skipping short base64 string")
                return
            }
            var padded = base64Pcm
            let mod4 = base64Pcm.count % 4
            if mod4 > 0 {
                padded += String(repeating: "=", count: 4 - mod4)
            }
            guard let data = Data(base64Encoded: padded, options: .ignoreUnknownCharacters) else {
                print("❌ Failed to decode base64")
                return
            }
            self.bufferQueue.append(data)
            print("📥 Received PCM chunk (\(data.count) bytes)")
            print("📥 writeChunk called. isPlaying=\(self.isPlaying), bufferQueue.count=\(self.bufferQueue.count)")
            if !self.isPlaying {
                self.isPlaying = true
                self.waitForBufferAndStartPlayback()
            } else if self.scheduledBufferCount == 0 {
                self.isPlaying = true
                self.waitForBufferAndStartPlayback()
            }
        }
    }

    private func waitForBufferAndStartPlayback() {
        DispatchQueue.global().async {
            while self.queueSize() < self.minBufferBytes && !self.hasEnded {
                Thread.sleep(forTimeInterval: 0.01)
            }
            self.writeLoop()
        }
    }

    private func writeLoop() {
        DispatchQueue.global().async {
            writeLoop: while self.isPlaying {
                if self.bufferQueue.isEmpty {
                    for _ in 0..<100 {
                        Thread.sleep(forTimeInterval: 0.01)
                        if !self.bufferQueue.isEmpty { break }
                    }
                    if self.bufferQueue.isEmpty {
                        print("🔇 No more data to play after waiting")
                        self.isPlaying = false
                        break writeLoop
                    }
                }
                var data: Data?
                self.pcmQueue.sync {
                    if !self.bufferQueue.isEmpty {
                        data = self.bufferQueue.removeFirst()
                    }
                }
                guard let chunk = data else {
                    print("⚠️ No data to process")
                    continue
                }
                if let buffer = self.pcmBufferFromData(chunk) {
                    self.scheduledBufferCount += 1
                    self.playerNode?.scheduleBuffer(buffer, completionHandler: {
                        self.pcmQueue.async {
                            self.scheduledBufferCount -= 1
                            if self.bufferQueue.isEmpty && self.scheduledBufferCount == 0 {
                                print("ℹ️ Playback idle - waiting for more data")
                                self.isPlaying = false
                            }
                        }
                    })
                }
            }
        }
    }

    private func pcmBufferFromData(_ data: Data) -> AVAudioPCMBuffer? {
        guard let format = self.format else { return nil }
        let frameCount = UInt32(data.count / 2)
        guard let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount) else {
            print("❌ Failed to create AVAudioPCMBuffer")
            return nil
        }
        buffer.frameLength = frameCount
        guard let floatChannelData = buffer.floatChannelData?[0] else {
            print("❌ floatChannelData is nil")
            return nil
        }
        data.withUnsafeBytes { (rawBuffer: UnsafeRawBufferPointer) in
            let int16Buffer = rawBuffer.bindMemory(to: Int16.self)
            let count = min(int16Buffer.count, Int(frameCount))
            for i in 0..<count {
                floatChannelData[i] = Float32(int16Buffer[i]) / Float32(Int16.max)
            }
        }
        return buffer
    }

    @objc(stopPlayer)
    func stopPlayer() {
        pcmQueue.async {
            self.stopInternal()
        }
    }

    private func stopInternal() {
        print("🛑 stopInternal called")
        self.playerNode?.stop()
        self.engine?.stop()
        self.engine?.reset()
        self.playerNode = nil
        self.engine = nil
        self.format = nil
        self.bufferQueue.removeAll()
        self.isPlaying = false
        self.hasEnded = true
        self.scheduledBufferCount = 0
    }

    @objc(canWrite:rejecter:)
    func canWrite(_ resolve: @escaping RCTPromiseResolveBlock, rejecter reject: RCTPromiseRejectBlock) {
        pcmQueue.async {
            resolve(self.bufferQueue.count < 20)
        }
    }

    @objc(flushPlayer:rejecter:)
    func flushPlayer(_ resolve: @escaping RCTPromiseResolveBlock, rejecter reject: RCTPromiseRejectBlock) {
        pcmQueue.async {
            self.bufferQueue.removeAll()
            resolve(nil)
        }
    }

    @objc static override func requiresMainQueueSetup() -> Bool {
        return false
    }

    private func queueSize() -> Int {
        return pcmQueue.sync {
            return self.bufferQueue.reduce(0) { $0 + $1.count }
        }
    }
}

I can't hear any audio on my real iOS device, although it works fine on the simulator.
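Editor's note: one hedged diagnostic angle for "silent on a real device but fine in the simulator" is to confirm where the session is actually routing and whether the hardware sample rate and device volume match what the stream assumes, since the simulator is far more forgiving about both. This helper uses only stock AVAudioSession properties; the expectedSampleRate parameter is whatever rate initPlayer was called with.

import AVFoundation

// Log route, hardware rate, and volume at the moment playback should start.
func logAudioRouteAndRates(expectedSampleRate: Double) {
    let session = AVAudioSession.sharedInstance()
    print("Outputs:", session.currentRoute.outputs.map(\.portType.rawValue))
    print("Hardware sample rate:", session.sampleRate,
          "expected:", expectedSampleRate)
    print("Output volume:", session.outputVolume) // 0.0 means the device volume is turned down
}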
Replies: 0 · Boosts: 0 · Views: 181 · Activity: Jul ’25
AVKit - PiP with AVSampleBufferDisplayLayer Error
AVPictureInPictureControllerContentSource *contentSource = [[AVPictureInPictureControllerContentSource alloc] initWithSampleBufferDisplayLayer:self.renderView.sampleBufferDisplayLayer playbackDelegate:self];
AVPictureInPictureController *pictureInPictureController = [[AVPictureInPictureController alloc] initWithContentSource:contentSource];
pictureInPictureController.delegate = self;

- (void)pictureInPictureController:(AVPictureInPictureController *)pictureInPictureController failedToStartPictureInPictureWithError:(NSError *)error {
    // error: NSError * domain: @"PGPegasusErrorDomain" - code: -1003 0x00000002819fe3a0
}

When I first start PiP playback, the failedToStartPictureInPictureWithError delegate method above fires with PGPegasusErrorDomain code -1003. Why? The second start works fine.
Replies: 0 · Boosts: 0 · Views: 144 · Activity: Jul ’25
EXT-X-DISCONTINUITY misalignment
We encounter an issue with AVPlayer in the case of EXT-X-DISCONTINUITY misalignment between audio and video, produced after insertion of gaps. The initial objective is to introduce an EXT-X-DISCONTINUITY in the audio playlist after some missing segments (EXT-X-GAP) whose durations are aligned to the video segment durations, in order to handle irregular audio durations. Please find below an example of the corresponding video and audio playlists:

video:

#EXTM3U
#EXT-X-VERSION:7
#EXT-X-MEDIA-SEQUENCE:872524632
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-TARGETDURATION:2
#USP-X-TIMESTAMP-MAP:MPEGTS=7096045027,LOCAL=2025-05-09T12:38:32.369100Z
#EXT-X-MAP:URI="hls/StreamingBasic-video=979200.m4s"
#EXT-X-PROGRAM-DATE-TIME:2025-05-09T12:38:32.369111Z
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524632.m4s
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524633.m4s
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524634.m4s
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524635.m4s
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524636.m4s
## Media sequence discontinuity
#EXT-X-GAP
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524637.m4s
## Media sequence discontinuity
#EXT-X-GAP
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524638.m4s
#EXT-X-PROGRAM-DATE-TIME:2025-05-09T12:38:46.383111Z
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524639.m4s
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524640.m4s

audio:

#EXTM3U
#EXT-X-VERSION:7
#EXT-X-MEDIA-SEQUENCE:872524632
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-TARGETDURATION:2
#USP-X-TIMESTAMP-MAP:MPEGTS=7096045867,LOCAL=2025-05-09T12:38:32.378400Z
#EXT-X-MAP:URI="hls/StreamingBasic-audio_99500_eng=98800.m4s"
#EXT-X-PROGRAM-DATE-TIME:2025-05-09T12:38:32.378444Z
#EXTINF:2.0053, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524632.m4s
#EXTINF:2.0053, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524633.m4s
#EXTINF:2.0053, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524634.m4s
#EXTINF:1.984, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524635.m4s
#EXTINF:2.0053, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524636.m4s
## Media sequence discontinuity
#EXT-X-GAP
#EXTINF:2.002, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524637.m4s
## Media sequence discontinuity
#EXT-X-GAP
#EXTINF:2.002, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524638.m4s
#EXT-X-DISCONTINUITY
#EXT-X-PROGRAM-DATE-TIME:2025-05-09T12:38:46.778444Z
#EXTINF:1.6213, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524639.m4s
#EXTINF:2.0053, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524640.m4s

In this case playback is broken with AVPlayer. Is this conformant to HTTP Live Streaming? Is it an AVPlayer bug? What are the guidelines for handling such gaps?
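Editor's note: my reading of RFC 8216 is that a player keeps renditions synchronized via the discontinuity sequence number, so a discontinuity that appears in the audio playlist but not in the video playlist leaves the two renditions with different discontinuity sequence numbers from that point onward; that mismatch alone is a plausible cause of the broken playback. A hedged sketch of the aligned alternative, with the video playlist carrying a matching tag at the same media sequence position (this is an illustration, not verified against AVPlayer):

## Media sequence discontinuity
#EXT-X-GAP
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524638.m4s
#EXT-X-DISCONTINUITY
#EXT-X-PROGRAM-DATE-TIME:2025-05-09T12:38:46.383111Z
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524639.m4s

The other option that keeps the playlists consistent is to drop the discontinuity from the audio playlist entirely; either way, both renditions advance their discontinuity sequence in lockstep.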
Replies: 0 · Boosts: 0 · Views: 154 · Activity: Jul ’25
Instagram video embed in WKWebView freezes on start on iOS18.5
Hi everyone! Here's what I observed so far:
On device it's reproducible on iOS/iPadOS 18.5, but works on iPadOS 17.7.
On the iPhone 16 iOS 18.5 simulator that I was using extensively for development, it was reproducible until I reset content and settings.
On the iPhone 16 iOS 18.4 simulator, which was also used a lot during development, it still always works, so I tend to think it's an 18.5 issue.
Setting config.websiteDataStore = .nonPersistent() doesn't help.
Cleaning WKWebsiteDataStore doesn't help.
It works fine when using the direct URL from the embed code (see the code below).
Can someone provide some insight on how this could be fixed? Here's the code:

import SwiftUI
import WebKit

@main
struct IGVideoApp: App {
    var body: some Scene {
        WindowGroup {
            WebView()
        }
    }
}

private struct WebView: UIViewRepresentable {
    func makeUIView(context: Context) -> WKWebView {
        let config = WKWebViewConfiguration()
        config.allowsInlineMediaPlayback = true
        return .init(frame: .zero, configuration: config)
    }

    func updateUIView(_ uiView: WKWebView, context: Context) {
        let urlString = "https://www.instagram.com/reel/DKHFOGct3z7/?utm_source=ig_embed&amp;utm_campaign=loading"
        /// It works when loading from the data-instgrm-permalink URL directly
        // uiView.load(.init(url: .init(string: "\(urlString)")!))
        /// It doesn't work with embedding.
        /// Note: the embed code (<blockquote>...</blockquote>) is taken from my
        /// Instagram post (https://www.instagram.com/p/DKHFOGct3z7/) and stripped
        /// down. The urlString was also extracted to demonstrate direct loading.
        let string = """
        <!doctype html>
        <meta charset="utf-8" />
        <meta name="viewport" content="width=device-width, initial-scale=1">
        <html>
          <head />
          <body style="background-color:black; margin:0px">
            <blockquote class="instagram-media" data-instgrm-captioned data-instgrm-version="14" data-instgrm-permalink="\(urlString)">
            </blockquote>
            <script async src="https://www.instagram.com/embed.js"></script>
          </body>
        </html>
        """
        uiView.loadHTMLString(string, baseURL: .init(string: "https://www.instagram.com"))
    }
}
Replies: 0 · Boosts: 0 · Views: 238 · Activity: Jul ’25
CoreMediaError with Lightning HDMI output on FairPlay content
Hello, our application is unable to output FairPlay-protected content over HDMI to a TV via the official Lightning HDMI AV Adapter. Checking the console log of mediaplaybackd, we found that a CoreMediaErrorDomain Code=-19156 is raised, but we are unable to find out what this error code means.

default 11:18:15.121584+0800 mediaplaybackd keyboss ckb_customURLReadCallback: 0x7fa62f800 60/0 customURLReqID 4 isComplete 1 err -19156 error <private> (0) dokeyCallbacksExist 0
default 11:18:15.121670+0800 mediaplaybackd keyboss ckb_processErrorForRequest: 0x7fa62f800 60/0 handler 4 err 0
default 11:18:15.121752+0800 mediaplaybackd <<<< FigCustomURLHandling >>>> curll_cancelRequestOnQueue: 0x7fa031360: requestID: 4
default 11:18:15.121932+0800 mediaplaybackd keyboss ckb_transitionRequestToTerminalState: 0x7fa62f800 60/0 reqFin err Error Domain=CoreMediaErrorDomain Code=-19156 (-19156) dokeyCallbacksExist 0
default 11:18:15.122025+0800 mediaplaybackd keyboss ckb_transitionRequestToTerminalState: 0x7fa62f800 60/0 retry
default 11:18:15.123195+0800 mediaplaybackd <<<< FigCPECryptorPKD >>>> PostKeyRequestErrorOccurred: 0x7fab7be80 029592C2-093D-400D-B57F-7AB06CC292D1 key request error: Error Domain=CoreMediaErrorDomain Code=-19160 (-19160)
Replies: 0 · Boosts: 2 · Views: 119 · Activity: Jul ’25
AVAssetResourceLoaderDelegate and CoreMediaErrorDomain -12881 When Playing HLS Audio
I am developing an app that plays HLS audio. When using AVPlayerItem with AVURLAsset, can AVAssetResourceLoaderDelegate correctly handle HLS segments? My goal is to use AVAssetResourceLoaderDelegate to add authentication HTTP headers when accessing HLS .m3u8 and .ts files. I can successfully download the files, but playback fails with errors. Specifically, I am observing the following cases:
Case A — AVAssetResourceLoaderDelegate is canceled, and CoreMediaErrorDomain -12881 occurs:
1. In NSURLConnectionDataDelegate's didReceiveResponse method, set contentInformationRequest.
2. In didReceiveData, call dataRequest respondWithData.
3. resourceLoader didCancelLoadingRequest is called.
4. CoreMediaErrorDomain -12881 occurs.
Case B — CoreMediaErrorDomain -12881 occurs:
1. In NSURLConnectionDataDelegate's didReceiveResponse method, set contentInformationRequest.
2. In connection didReceiveData, buffer all received data until the end.
3. In connectionDidFinishLoading, pass the buffered data to respondWithData.
4. Call loadingRequest finishLoading.
5. CoreMediaErrorDomain -12881 occurs.
In both cases, dataRequest.requestsAllDataToEndOfResource is YES. For this use case, I am not using AVURLAssetHTTPHeaderFieldsKey because I need to apply the most up-to-date authentication data at the moment each file is accessed. I would appreciate any advice or suggestions you might have. Thank you in advance!
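Editor's note: two caveats worth stating plainly. AVAssetResourceLoader only consults the delegate for URLs whose scheme it cannot load itself, so plain http/https requests bypass it, and there are long-standing reports that feeding actual HLS media-segment data through the delegate is unsupported (a local authenticating HTTP proxy is the common fallback). For the playlist case, -12881 is frequently tied to a contentInformationRequest that does not match the bytes delivered. A Swift sketch of that step; the custom-scheme rewrite and the Authorization header are illustrative assumptions:

import AVFoundation
import UniformTypeIdentifiers

final class AuthResourceLoader: NSObject, AVAssetResourceLoaderDelegate {
    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        guard let customURL = loadingRequest.request.url,
              var comps = URLComponents(url: customURL, resolvingAgainstBaseURL: false) else {
            return false
        }
        comps.scheme = "https" // undo the custom scheme that routed the request here

        var request = URLRequest(url: comps.url!)
        request.setValue("Bearer <fresh-token>", forHTTPHeaderField: "Authorization") // hypothetical auth

        URLSession.shared.dataTask(with: request) { data, response, _ in
            guard let data, let http = response as? HTTPURLResponse else {
                loadingRequest.finishLoading(with: NSError(domain: NSURLErrorDomain,
                                                           code: NSURLErrorBadServerResponse,
                                                           userInfo: nil))
                return
            }
            if let info = loadingRequest.contentInformationRequest {
                // A nil or wrong contentType here is one known route to -12881:
                // describe the resource before finishing the request.
                info.contentType = UTType(mimeType: http.mimeType ?? "")?.identifier
                info.contentLength = Int64(data.count)
                info.isByteRangeAccessSupported = false
            }
            loadingRequest.dataRequest?.respond(with: data)
            loadingRequest.finishLoading()
        }.resume()
        return true
    }
}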
Replies: 0 · Boosts: 1 · Views: 128 · Activity: Aug ’25
New FairPlay Keys
Hello, My company has an in-store app with FPS SDK 4.x (1024) keys. We've handed those keys over to a trusted third-party and we do not have them. We've been in-store for several years. The person that created the keys in our organization mistakenly stored them encrypted to our third-party's PGP keys, so we cannot decrypt them, and the third party also has no mechanism to provide us with the keys even though it is in their runtime environment. They only have secure mechanisms for us to upload keys onto their servers. We are trying to migrate to a different third-party DRM provider, and would like to obtain new keys. Unfortunately, the developer portal won't let me create new keys, saying that we have exceeded the number of keys allowed, which I assume is one. Additionally, the new DRM provider can only support SDK 4.x keys, and it appears that we can only request SDK 5.x keys on the Apple Developer portal, as the SDK 4.0 option is grayed out. Regardless, it seems that we are not able to request any keys. We've submitted a request to the support e-mail address and received an automated e-mail that the response should take a few days, but may take longer on occasion. It's now been a month. The e-mail says that the reply address is not monitored. Is there any way we can accelerate this? Thank you, Carlos
Replies: 0 · Boosts: 1 · Views: 230 · Activity: Aug ’25
Crash iOS 26.0: [__NSSingleObjectArrayI selectedMediaOptionInMediaSelectionGroup:]: unrecognized selector sent to instance
I'm seeing a crash in an app that plays videos when the user activates closed captions. I was able to reproduce the issue in an empty project. The crash happens when the AVPlayerLayer is used to instantiate an AVPictureInPictureController. This is the example project where I reproduced the crash:

struct ContentView: View {
    var body: some View {
        VStack {
            VideoPlaylistView()
        }
        .frame(maxWidth: .infinity, maxHeight: .infinity)
        .background(Color.black.ignoresSafeArea())
    }
}

class VideoPlaylistViewModel: ObservableObject {
    // Test with other videos
    var player: AVPlayer? = AVPlayer(url: URL(string: "https://d2ufudlfb4rsg4.cloudfront.net/newsnation/WIpkLz23h/adaptive/WIpkLz23h_master.m3u8")!)
}

struct VideoPlaylistView: View {
    @StateObject var viewModel = VideoPlaylistViewModel()

    var body: some View {
        ScrollView {
            VideoCellView(player: viewModel.player)
                .onAppear {
                    viewModel.player?.play()
                }
        }
        .scrollTargetBehavior(.paging)
        .ignoresSafeArea()
    }
}

struct VideoCellView: View {
    let player: AVPlayer?
    @State var isCCEnabled: Bool = false

    var body: some View {
        ZStack {
            PlayerView(player: player)
                .accessibilityIdentifier("Player View")
        }
        .containerRelativeFrame([.horizontal, .vertical])
        .overlay(alignment: .bottom) {
            Button {
                player?.currentItem?.asset.loadMediaSelectionGroup(for: .legible) { group, error in
                    if let group {
                        let option = !isCCEnabled ? group.options.first : nil
                        player?.currentItem?.select(option, in: group)
                        isCCEnabled.toggle()
                    }
                }
            } label: {
                Text("Close Captions")
                    .font(.subheadline)
                    .foregroundStyle(isCCEnabled ? .red : .primary)
                    .buttonStyle(.bordered)
                    .padding(8)
                    .background(Color.blue.opacity(0.75))
            }
            .padding(.bottom, 48)
            .accessibilityIdentifier("Button Close Captions")
        }
    }
}

import Foundation
import UIKit
import SwiftUI
import AVFoundation
import AVKit

struct PlayerView: UIViewRepresentable {
    let player: AVPlayer?

    func updateUIView(_ uiView: UIView, context: UIViewRepresentableContext<PlayerView>) {
    }

    func makeUIView(context: Context) -> UIView {
        let view = PlayerUIView()
        view.playerLayer.player = player
        view.layer.addSublayer(view.playerLayer)
        view.layer.backgroundColor = UIColor.red.cgColor
        view.pipController = AVPictureInPictureController(playerLayer: view.playerLayer)
        view.pipController?.requiresLinearPlayback = true
        view.pipController?.canStartPictureInPictureAutomaticallyFromInline = true
        view.pipController?.delegate = view
        return view
    }
}

class PlayerUIView: UIView, AVPictureInPictureControllerDelegate {
    let playerLayer = AVPlayerLayer()
    var pipController: AVPictureInPictureController?

    override init(frame: CGRect) {
        super.init(frame: frame)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func layoutSubviews() {
        super.layoutSubviews()
        playerLayer.frame = bounds
        playerLayer.backgroundColor = UIColor.green.cgColor
    }

    func pictureInPictureController(_ pictureInPictureController: AVPictureInPictureController, failedToStartPictureInPictureWithError error: any Error) {
        print("Error starting Picture in Picture: \(error.localizedDescription)")
    }
}

class AppDelegate: NSObject, UIApplicationDelegate {
    func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]? = nil) -> Bool {
        let audioSession = AVAudioSession.sharedInstance()
        do {
            try audioSession.setCategory(.playback, mode: .moviePlayback)
            try audioSession.setActive(true)
        } catch {
            print("ERR: \(error.localizedDescription)")
        }
        return true
    }
}

UITest to make the app crash:

final class VideoPlaylistSampleUITests: XCTestCase {
    func testCrashiOS26ToggleCloseCaptions() throws {
        let app = XCUIApplication()
        app.launch()

        let videoPlayer = app.otherElements["Player View"]
        XCTAssertTrue(videoPlayer.waitForExistence(timeout: 30))

        let closeCaptionButton = app.buttons["Button Close Captions"]
        for _ in 0..<2000 {
            closeCaptionButton.tap()
        }
    }
}
Replies: 0 · Boosts: 5 · Views: 273 · Activity: Sep ’25
Getting CoreMediaErrorDomain -15628 playback failure in iOS 26 (AVPlayer, HLS stream)
Hi,
After updating to iOS 26, our app is experiencing playback failures with AVPlayer. The same code and streams work fine on iOS 18 and earlier.
Error: Domain [CoreMediaErrorDomain] Code [-15628] Description [The operation couldn’t be completed.]
Underlying Error: Domain [(null)] Code [0] Description [(null)]
Environment: iOS 26
Stream type: HLS (m3u8) with segment (.ts) files
Observed behaviour: we don’t have concrete steps to reproduce the issue, but so far we have observed that this error tends to occur under low network conditions.
Replies: 0 · Boosts: 1 · Views: 177 · Activity: Sep ’25
Playing FairPlay encrypted content works fine on iOS 17 but won't play on iOS 26
For devices that are still on iOS 17, playing FairPlay-encrypted content still works fine. On devices that I've upgraded to iOS 26, playing the same content in the same app no longer works. I can advance and see the stream frames by tapping +10 scrubbing, so I know that the content is being decrypted, but tapping the play button of AVPlayer for an AVPlayerItem now does nothing on iOS 26. Is this a breaking change, or is there a stricter requirement that I now have to implement?
Replies: 0 · Boosts: 0 · Views: 105 · Activity: Oct ’25
MusicKit broken in simulator with current tools? Can't get token.
Just updated my computer, phone, and dev tools to the latest versions of everything. Now when I run my app in a previously-working simulator (iPhone 16 with iOS 18.5) I get:

Failed retrieving MusicKit tokens: fetching the developer token is not supported in the simulator when running on this version of macOS; please upgrade your Mac to macOS Ventura.

Also:

<ICCloudServiceStatusMonitor: 0x600003320e60>: Invoking 1 completion handler for MusicKit tokens. error=<ICError.DeveloperTokenFetchingFailed (-8200) "Failed to fetch media token from <AMSMediaTokenService: 0x6000029049a0>." { underlyingErrors: [
  <AMSErrorDomain.300 "Token request encoding failed The token request encoder finished with an error." { userInfo: { AMSDescription : "Token request encoding failed", AMSFailureReason : "The token request encoder finished with an error." }; underlyingErrors: [
    <AMSErrorDomain.5 "Anisette Failed Platform not supported" { userInfo: { AMSDescription : "Anisette Failed", AMSFailureReason : "Platform not supported" };

Anybody know what gives here? The Ventura message is absurd because I'm on Tahoe 26.1. The same code works on a physical phone running iOS 26.
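Editor's note: until the simulator token fetch works again, one workaround is to fence MusicKit calls off from simulator builds so the rest of the app stays testable. A minimal sketch; MusicAuthorization.request() is real MusicKit API, and the wrapper is illustrative:

import MusicKit

// Skip MusicKit in the simulator, where developer-token fetching currently
// fails with the error shown above; exercise this path on hardware instead.
func requestMusicAuthorizationIfSupported() async -> MusicAuthorization.Status? {
    #if targetEnvironment(simulator)
    return nil // token fetch is broken here; avoid the misleading Ventura message
    #else
    return await MusicAuthorization.request()
    #endif
}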
Replies: 0 · Boosts: 0 · Views: 82 · Activity: 1w