Streaming

Deep dive into the technical specifications that influence seamless playback for streaming services, including bitrates, codecs, and caching mechanisms.

Posts under Streaming subtopic

CoreMediaError with Lightning HDMI output on FairPlay content
Hello, our application is unable to output FairPlay-protected content over HDMI to a TV via the official Lightning Digital AV Adapter. Checking the console log for mediaplaybackd, we found that a CoreMediaErrorDomain Code=-19156 error is raised, but we have been unable to find out what this error code means.

default 11:18:15.121584+0800 mediaplaybackd keyboss ckb_customURLReadCallback: 0x7fa62f800 60/0 customURLReqID 4 isComplete 1 err -19156 error <private> (0) dokeyCallbacksExist 0
default 11:18:15.121670+0800 mediaplaybackd keyboss ckb_processErrorForRequest: 0x7fa62f800 60/0 handler 4 err 0
default 11:18:15.121752+0800 mediaplaybackd <<<< FigCustomURLHandling >>>> curll_cancelRequestOnQueue: 0x7fa031360: requestID: 4
default 11:18:15.121932+0800 mediaplaybackd keyboss ckb_transitionRequestToTerminalState: 0x7fa62f800 60/0 reqFin err Error Domain=CoreMediaErrorDomain Code=-19156 (-19156) dokeyCallbacksExist 0
default 11:18:15.122025+0800 mediaplaybackd keyboss ckb_transitionRequestToTerminalState: 0x7fa62f800 60/0 retry
default 11:18:15.123195+0800 mediaplaybackd <<<< FigCPECryptorPKD >>>> PostKeyRequestErrorOccurred: 0x7fab7be80 029592C2-093D-400D-B57F-7AB06CC292D1 key request error: Error Domain=CoreMediaErrorDomain Code=-19160 (-19160)
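One avenue worth checking for symptoms like this (a sketch, not from the post): AVPlayer reports when it blanks output because the external display chain does not satisfy the protection level required by the content key, a common failure mode with HDMI adapters and FairPlay. Assuming `player` is the AVPlayer involved:

    import AVFoundation

    // Sketch: observe whether output is being obscured because the external
    // display (here, the Lightning adapter path) fails the required
    // protection level negotiated for the content key.
    var obscuredObservation: NSKeyValueObservation?

    func observeExternalProtection(on player: AVPlayer) {
        obscuredObservation = player.observe(
            \.isOutputObscuredDueToInsufficientExternalProtection,
            options: [.initial, .new]
        ) { player, _ in
            if player.isOutputObscuredDueToInsufficientExternalProtection {
                // The key requires stronger output protection than the
                // adapter/display negotiated.
                print("External output obscured: insufficient protection")
            }
        }
    }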
Replies: 0 · Boosts: 2 · Views: 142 · Activity: Jul ’25
AVAssetResourceLoaderDelegate and CoreMediaErrorDomain -12881 When Playing HLS Audio
I am developing an app that plays HLS audio. When using AVPlayerItem with AVURLAsset, can AVAssetResourceLoaderDelegate correctly handle HLS segments? My goal is to use AVAssetResourceLoaderDelegate to add authentication HTTP headers when accessing HLS .m3u8 and .ts files. I can successfully download the files, but playback fails with errors. Specifically, I am observing the following two cases:

A. AVAssetResourceLoaderDelegate is canceled, and CoreMediaErrorDomain -12881 occurs:
1. In NSURLConnectionDataDelegate’s didReceiveResponse method, set contentInformationRequest.
2. In didReceiveData, call dataRequest respondWithData.
3. resourceLoader didCancelLoadingRequest is called.
4. CoreMediaErrorDomain -12881 occurs.

B. CoreMediaErrorDomain -12881 occurs:
1. In NSURLConnectionDataDelegate’s didReceiveResponse method, set contentInformationRequest.
2. In connection didReceiveData, buffer all received data until the end.
3. In connectionDidFinishLoading, pass the buffered data to respondWithData.
4. Call loadingRequest finishLoading.
5. CoreMediaErrorDomain -12881 occurs.

In both cases, dataRequest.requestsAllDataToEndOfResource is YES. For this use case, I am not using AVURLAssetHTTPHeaderFieldsKey because I need to apply the most up-to-date authentication data at the moment each file is accessed. I would appreciate any advice or suggestions you might have. Thank you in advance!
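Not part of the original post: for reference, here is roughly what such a delegate can look like built on URLSession instead of the deprecated NSURLConnection. The custom scheme, currentAuthToken(), and the UTType mapping are illustrative assumptions; one commonly suspected source of failures like this is content information (contentType/contentLength) that does not match the actual response, so the sketch derives both from the HTTP response.

    import AVFoundation
    import UniformTypeIdentifiers

    // Sketch: an AVAssetResourceLoaderDelegate that injects a fresh auth
    // header per request. Assumes playlist/segment URLs were rewritten to a
    // custom scheme ("auth-hls") so AVFoundation routes them through the
    // delegate; currentAuthToken() is a hypothetical helper.
    final class AuthHeaderResourceLoader: NSObject, AVAssetResourceLoaderDelegate {
        func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                            shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
            guard let url = loadingRequest.request.url,
                  var comps = URLComponents(url: url, resolvingAgainstBaseURL: false),
                  comps.scheme == "auth-hls" else {
                return false
            }
            comps.scheme = "https"  // restore the real scheme
            var request = URLRequest(url: comps.url!)
            request.setValue(currentAuthToken(), forHTTPHeaderField: "Authorization")

            URLSession.shared.dataTask(with: request) { data, response, error in
                if let error {
                    loadingRequest.finishLoading(with: error)
                    return
                }
                // Fill the content information from the real HTTP response.
                if let http = response as? HTTPURLResponse,
                   let info = loadingRequest.contentInformationRequest {
                    info.contentLength = http.expectedContentLength
                    info.isByteRangeAccessSupported = false
                    if let mime = http.mimeType,
                       let uti = UTType(mimeType: mime) {
                        info.contentType = uti.identifier  // expects a UTI, not a MIME type
                    }
                }
                if let data {
                    loadingRequest.dataRequest?.respond(with: data)
                }
                loadingRequest.finishLoading()
            }.resume()
            return true  // the request is satisfied asynchronously
        }

        private func currentAuthToken() -> String {
            // Hypothetical: fetch the most recent credentials here.
            return "Bearer <token>"
        }
    }

Note that AVFoundation only routes requests with the custom scheme through the delegate; if the playlist references plain https segment URLs, those requests bypass the delegate entirely.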
Replies: 0 · Boosts: 1 · Views: 149 · Activity: Aug ’25
New FairPlay Keys
Hello, my company has an in-store app with FPS SDK 4.x (1024) keys. We've handed those keys over to a trusted third party, and we do not have them ourselves. We've been in stores for several years. The person who created the keys in our organization mistakenly stored them encrypted to our third party's PGP keys, so we cannot decrypt them, and the third party also has no mechanism to provide us with the keys even though they are in its runtime environment. They only have secure mechanisms for us to upload keys onto their servers. We are trying to migrate to a different third-party DRM provider and would like to obtain new keys. Unfortunately, the developer portal won't let me create new keys, saying that we have exceeded the number of keys allowed, which I assume is one. Additionally, the new DRM provider can only support SDK 4.x keys, and it appears that we can only request SDK 5.x keys on the Apple Developer portal, as the SDK 4.0 option is grayed out. Regardless, it seems that we are not able to request any keys. We've submitted a request to the support e-mail address and received an automated reply saying that the response should take a few days but may occasionally take longer. It's now been a month, and the e-mail says that the reply address is not monitored. Is there any way we can accelerate this? Thank you, Carlos
Replies: 0 · Boosts: 1 · Views: 269 · Activity: Aug ’25
Crash iOS 26.0: [__NSSingleObjectArrayI selectedMediaOptionInMediaSelectionGroup:]: unrecognized selector sent to instance
I'm having a crash in an app that plays videos when the user activates closed captions. I was able to replicate the issue in an empty project. The crash happens when the AVPlayerLayer is used to instantiate an AVPictureInPictureController.

This is the example project where I reproduced the crash:

struct ContentView: View {
    var body: some View {
        VStack {
            VideoPlaylistView()
        }
        .frame(maxWidth: .infinity, maxHeight: .infinity)
        .background(Color.black.ignoresSafeArea())
    }
}

class VideoPlaylistViewModel: ObservableObject {
    // Test with other videos
    var player: AVPlayer? = AVPlayer(url: URL(string: "https://d2ufudlfb4rsg4.cloudfront.net/newsnation/WIpkLz23h/adaptive/WIpkLz23h_master.m3u8")!)
}

struct VideoPlaylistView: View {
    @StateObject var viewModel = VideoPlaylistViewModel()

    var body: some View {
        ScrollView {
            VideoCellView(player: viewModel.player)
                .onAppear {
                    viewModel.player?.play()
                }
        }
        .scrollTargetBehavior(.paging)
        .ignoresSafeArea()
    }
}

struct VideoCellView: View {
    let player: AVPlayer?
    @State var isCCEnabled: Bool = false

    var body: some View {
        ZStack {
            PlayerView(player: player)
                .accessibilityIdentifier("Player View")
        }
        .containerRelativeFrame([.horizontal, .vertical])
        .overlay(alignment: .bottom) {
            Button {
                player?.currentItem?.asset.loadMediaSelectionGroup(for: .legible) { group, error in
                    if let group {
                        let option = !isCCEnabled ? group.options.first : nil
                        player?.currentItem?.select(option, in: group)
                        isCCEnabled.toggle()
                    }
                }
            } label: {
                Text("Close Captions")
                    .font(.subheadline)
                    .foregroundStyle(isCCEnabled ? .red : .primary)
                    .buttonStyle(.bordered)
                    .padding(8)
                    .background(Color.blue.opacity(0.75))
            }
            .padding(.bottom, 48)
            .accessibilityIdentifier("Button Close Captions")
        }
    }
}

import Foundation
import UIKit
import SwiftUI
import AVFoundation
import AVKit

struct PlayerView: UIViewRepresentable {
    let player: AVPlayer?

    func updateUIView(_ uiView: UIView, context: UIViewRepresentableContext<PlayerView>) {
    }

    func makeUIView(context: Context) -> UIView {
        let view = PlayerUIView()
        view.playerLayer.player = player
        view.layer.addSublayer(view.playerLayer)
        view.layer.backgroundColor = UIColor.red.cgColor
        view.pipController = AVPictureInPictureController(playerLayer: view.playerLayer)
        view.pipController?.requiresLinearPlayback = true
        view.pipController?.canStartPictureInPictureAutomaticallyFromInline = true
        view.pipController?.delegate = view
        return view
    }
}

class PlayerUIView: UIView, AVPictureInPictureControllerDelegate {
    let playerLayer = AVPlayerLayer()
    var pipController: AVPictureInPictureController?

    override init(frame: CGRect) {
        super.init(frame: frame)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func layoutSubviews() {
        super.layoutSubviews()
        playerLayer.frame = bounds
        playerLayer.backgroundColor = UIColor.green.cgColor
    }

    func pictureInPictureController(_ pictureInPictureController: AVPictureInPictureController, failedToStartPictureInPictureWithError error: any Error) {
        print("Error starting Picture in Picture: \(error.localizedDescription)")
    }
}

class AppDelegate: NSObject, UIApplicationDelegate {
    func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]? = nil) -> Bool {
        let audioSession = AVAudioSession.sharedInstance()
        do {
            try audioSession.setCategory(.playback, mode: .moviePlayback)
            try audioSession.setActive(true)
        } catch {
            print("ERR: \(error.localizedDescription)")
        }
        return true
    }
}

UITest to make the app crash:

final class VideoPlaylistSampleUITests: XCTestCase {
    func testCrashiOS26ToggleCloseCaptions() throws {
        let app = XCUIApplication()
        app.launch()
        let videoPlayer = app.otherElements["Player View"]
        XCTAssertTrue(videoPlayer.waitForExistence(timeout: 30))
        let closeCaptionButton = app.buttons["Button Close Captions"]
        for _ in 0..<2000 {
            closeCaptionButton.tap()
        }
    }
}
Replies: 0 · Boosts: 5 · Views: 312 · Activity: Sep ’25
Getting CoreMediaErrorDomain -15628 playback failure in iOS 26 (AVPlayer, HLS stream)
Hi, after updating to iOS 26, our app is experiencing playback failures with AVPlayer. The same code and streams work fine on iOS 18 and earlier.

Error:
Domain [CoreMediaErrorDomain] Code [-15628] Description [The operation couldn’t be completed.]
Underlying Error: Domain [(null)] Code [0] Description [(null)]

Environment:
iOS version: iOS 26
Stream type: HLS (m3u8) with segment (.ts) files

Observed behaviour: We don’t have concrete steps to reproduce the issue, but so far we have observed that this error tends to occur under poor network conditions.
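Not from the post, but for intermittent HLS failures like this, AVPlayerItem's error log usually carries more context than the top-level -15628 (per-segment error status, domain, and URI). A minimal sketch, assuming `playerItem` is the failing item:

    import AVFoundation

    // Sketch: dump AVPlayerItem's error log whenever a new entry arrives,
    // to see which playlist/segment request actually failed and why.
    func watchErrorLog(for playerItem: AVPlayerItem) {
        NotificationCenter.default.addObserver(
            forName: AVPlayerItem.newErrorLogEntryNotification,
            object: playerItem,
            queue: .main
        ) { notification in
            guard let item = notification.object as? AVPlayerItem,
                  let log = item.errorLog() else { return }
            for event in log.events {
                print("HLS error: status \(event.errorStatusCode), " +
                      "domain \(event.errorDomain), " +
                      "comment \(event.errorComment ?? "-"), " +
                      "uri \(event.uri ?? "-")")
            }
        }
    }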
Replies: 0 · Boosts: 4 · Views: 327 · Activity: Sep ’25
Playing FairPlay encrypted content works fine on iOS 17 but won't play on iOS 26
For devices that are still on iOS 17, playing FairPlay encrypted content still works fine. For devices that I've upgraded to iOS 26, playing the same content in the same app no longer works. I can advance and see the stream frames by tapping +10 or scrubbing, so I know that the content is being decrypted, but tapping the play button of AVPlayer for an AVPlayerItem now does nothing on iOS 26. Is this a breaking change, or is there a stricter requirement that I now have to implement?
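A diagnostic sketch (not from the post): when tapping play appears to do nothing, AVPlayer's timeControlStatus and reasonForWaitingToPlay distinguish "paused with an error", "stalled waiting on buffer or keys", and "actually playing". Assuming `player` is the AVPlayer in question:

    import AVFoundation

    // Sketch: log why playback isn't progressing after play() is called.
    var statusObservation: NSKeyValueObservation?

    func observePlaybackStalls(on player: AVPlayer) {
        statusObservation = player.observe(
            \.timeControlStatus, options: [.initial, .new]
        ) { player, _ in
            switch player.timeControlStatus {
            case .paused:
                print("paused; item error: \(String(describing: player.currentItem?.error))")
            case .waitingToPlayAtSpecifiedRate:
                print("waiting: \(String(describing: player.reasonForWaitingToPlay))")
            case .playing:
                print("playing")
            @unknown default:
                break
            }
        }
    }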
Replies: 0 · Boosts: 0 · Views: 161 · Activity: Oct ’25
MusicKit broken in simulator with current tools? Can't get token.
Just updated my computer, phone, and dev tools to the latest versions of everything. Now when I run my app in a previously working simulator (iPhone 16 w/ iOS 18.5) I get:

Failed retrieving MusicKit tokens: fetching the developer token is not supported in the simulator when running on this version of macOS; please upgrade your Mac to macOS Ventura.

Also:

<ICCloudServiceStatusMonitor: 0x600003320e60>: Invoking 1 completion handler for MusicKit tokens.
error=<ICError.DeveloperTokenFetchingFailed (-8200) "Failed to fetch media token from <AMSMediaTokenService: 0x6000029049a0>." {
  underlyingErrors: [
    <AMSErrorDomain.300 "Token request encoding failed The token request encoder finished with an error." {
      userInfo: { AMSDescription: "Token request encoding failed", AMSFailureReason: "The token request encoder finished with an error." };
      underlyingErrors: [
        <AMSErrorDomain.5 "Anisette Failed Platform not supported" {
          userInfo: { AMSDescription: "Anisette Failed", AMSFailureReason: "Platform not supported" };

Anybody know what gives here? The Ventura message is absurd because I'm on Tahoe 26.1. The same code works on a physical phone running iOS 26.
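A common workaround pattern (a sketch, not a fix for the tooling issue described above) is to gate the token fetch so MusicKit is only exercised on a physical device:

    // Sketch: skip the developer-token fetch in the simulator, where the
    // token service is failing, and exercise MusicKit only on device.
    func shouldRequestMusicKitToken() -> Bool {
        #if targetEnvironment(simulator)
        return false
        #else
        return true
        #endif
    }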
Replies: 0 · Boosts: 0 · Views: 150 · Activity: Nov ’25
Playing Apple FairPlay encrypted content on iOS 26
On iOS 17 we've had no problem playing Apple FairPlay encrypted content with keys delivered from our key server, running first on FairPlay Streaming Server SDK 5.1 and subsequently on FairPlay Streaming Server SDK 26. The app is built and deployed using Xcode Version 26.1.1 (17B100) with no changes to the code, and - as expected - the content continued to be successfully decrypted and played (so far so good). However, as soon as a device was updated to iOS 26, that device would no longer play the encrypted content. Devices remaining on iOS 17 continue to work normally, and the debugging logs are a sanity check that proves that. Is anyone else experiencing this issue? Here's the code (you should be able to drop it into a fresh iOS Xcode project and provide a server URL, content URL, and certificate).
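Since the attached code did not survive the forum scrape, here is a minimal sketch of the usual client-side AVContentKeySession flow for FairPlay, under the assumption of a simple SPC-in/CKC-out license endpoint; keyServerURL, the certificate plumbing, and the error handling are illustrative, not the poster's code:

    import AVFoundation

    // Minimal FairPlay key-delivery sketch. Assumes the key server accepts
    // a raw SPC POST and returns a raw CKC; real servers often wrap both.
    final class FairPlayKeyDelegate: NSObject, AVContentKeySessionDelegate {
        let keyServerURL = URL(string: "https://example.com/fps/license")! // illustrative
        let appCertificate: Data // FPS application certificate from Apple

        init(appCertificate: Data) {
            self.appCertificate = appCertificate
        }

        func contentKeySession(_ session: AVContentKeySession,
                               didProvide keyRequest: AVContentKeyRequest) {
            // HLS key URIs typically look like "skd://<asset-id>".
            guard let skdURL = keyRequest.identifier as? String,
                  let assetID = skdURL.data(using: .utf8) else { return }

            keyRequest.makeStreamingContentKeyRequestData(forApp: appCertificate,
                                                          contentIdentifier: assetID,
                                                          options: nil) { spc, error in
                if let error {
                    keyRequest.processContentKeyResponseError(error)
                    return
                }
                guard let spc else { return }
                var request = URLRequest(url: self.keyServerURL)
                request.httpMethod = "POST"
                request.httpBody = spc
                URLSession.shared.dataTask(with: request) { ckc, _, _ in
                    guard let ckc else { return }
                    let response = AVContentKeyResponse(fairPlayStreamingKeyResponseData: ckc)
                    keyRequest.processContentKeyResponse(response)
                }.resume()
            }
        }
    }

    // Usage: create the session once and attach the asset before loading it.
    // let session = AVContentKeySession(keySystem: .fairPlayStreaming)
    // session.setDelegate(delegate, queue: DispatchQueue(label: "fps.keys"))
    // session.addContentKeyRecipient(asset)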
Replies: 0 · Boosts: 1 · Views: 194 · Activity: Dec ’25
Are there known cases where DepthData is empty while Face ID is working?
We are experiencing an issue related to DepthData from the TrueDepth camera on a specific device. On December 1, we tested with the complainant's device, an iPhone 14 on iOS 26.0.1, and observed that the depth image is received with empty values. However, the same implementation works normally on iPhone 17 Pro Max (iOS 26.1) and iPhone 13 Pro Max (iOS 26.0.1), where depth data is delivered correctly.

In the problematic case:
- The TrueDepth camera is active
- Face ID works normally
- The app receives a DepthData object, but all values are empty (0), not nil

Because the DepthData object is not nil, it is difficult to detect the issue through software fallback handling. We developed the feature with reference to the following Apple sample: https://developer.apple.com/documentation/AVFoundation/streaming-depth-data-from-the-truedepth-camera

We would like to ask: Are there known cases where Face ID functions normally but DepthData from the TrueDepth camera is returned as empty values? If so, is there a recommended approach for identifying or handling this situation? Any guidance from Apple engineers or the community would be greatly appreciated. Thank you.
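One way to make the failure detectable (a sketch, not from the post): scan the delivered depth map and treat an all-zero buffer as "no depth", assuming the DepthFloat32 format used in Apple's TrueDepth sample:

    import AVFoundation

    // Sketch: detect the "depth map present but all zeros" case so the app
    // can fall back gracefully instead of trusting a non-nil AVDepthData.
    func depthMapIsAllZero(_ depthData: AVDepthData) -> Bool {
        let converted = depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
        let buffer = converted.depthDataMap
        CVPixelBufferLockBaseAddress(buffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(buffer, .readOnly) }

        let width = CVPixelBufferGetWidth(buffer)
        let height = CVPixelBufferGetHeight(buffer)
        let rowBytes = CVPixelBufferGetBytesPerRow(buffer)
        guard let base = CVPixelBufferGetBaseAddress(buffer) else { return true }

        for y in 0..<height {
            let row = base.advanced(by: y * rowBytes).assumingMemoryBound(to: Float32.self)
            for x in 0..<width where row[x] != 0 && row[x].isFinite {
                return false  // found at least one real depth sample
            }
        }
        return true
    }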
Replies: 0 · Boosts: 0 · Views: 171 · Activity: Dec ’25
CoreMediaErrorDomain error -42681
We are trying to port our code to Apple TV on tvOS 17.6, and while running the sample we are getting error CoreMediaErrorDomain error -42681. We understand that this error occurs when the FairPlay license (CKC) returned by the server contains incompatible or malformed version information that the iOS/tvOS FairPlay CDM cannot parse. Can you please specify which FairPlay version number tvOS 17.6 expects, and which fields are mandatory in the FPS version metadata?
Replies: 0 · Boosts: 0 · Views: 253 · Activity: 3w
Live streaming issue with RTSP
We have the application 'ADS Smart', a companion application for our ADS Dashcam. We offer a feature that lets users stream live footage from the dashcam cameras through the app. Currently, we are experiencing a delay of 30+ seconds before the live stream appears, i.e. the first frame of the live footage takes around 30+ seconds to display in the app. We are using the MobileVLCKit library to stream the videos in the app.

The current flow of the code:

1. Flutter triggers the native playback via a method channel. The Dart side calls the iOS method channel <identifier_name>/ts_player with method playTSFromURL, passing:
- url (e.g. rtsp://... for live)
- playerId
- viewId (stable ID used to host native UI)
- showControls
- optional localIp

2. AppDelegate receives the call and prepares networking. Entry point: the AppDelegate.tsChannel handler for "playTSFromURL" in AppDelegate.swift. It resolves the Wi-Fi interface and local IP if possible:
- Sets VLC_SOURCE_ADDRESS to the Wi-Fi IP (when available) to prefer Wi-Fi for the stream.
- Uses NWPathMonitor and direct interface inspection to find the Wi-Fi interface (e.g., en0) and IP.
- Kicks off best-effort route priming to the dashcam IP/ports (non-blocking); see establishWiFiRoutePriority.

3. AppDelegate chooses the right player implementation. createPlayerForURL(_:) decides:
- RTSP (rtsp://...) --> use the VLCKit-backed player (class TSStreamPlayer, which provides a VLC-backed video player for iOS, handling playback of Transport Stream (TS) URLs with strict main-thread UI updates, view safety, and stream management, using MobileVLCKit).
- .ts files --> use the VLCKit-based player for playing already-recorded videos in the app.
If the selected player supports extras (e.g. TSStreamPlayerExtras), it sets:
- the local IP (if resolved)
- the Wi-Fi interface name

4. AppDelegate creates the native 'platform view' container and overlay. platformView(for:parent:showControls:):
- Creates a container UIView attached to the Flutter root view.
- Adds a dedicated child videoHost[viewId], the host UIView for rendering video.
- If showControls == true, adds TSPlayerControlsOverlay over the video and wires overlay callbacks back to Flutter via controlChannel (/player_controls).
- If showControls == false, adds a minimal back button and wires it to onGalleryBack.

5. The player starts playback inside the host view. Calls player.playTSFromURL(urlString, in: host) { success, error in ... } on the main thread. For RTSP/TS streams this is handled by TSStreamPlayer (VLCKit).

6. Success/failure is reported back to Flutter. The completion closure invoked in step 5 returns true on first real playback or an error message on failure. The method channel result responds:
- true --> Flutter knows playback started
- FlutterError --> Flutter can show an error

7. Stopping and cleanup. "stop" on tsChannel stops and disposes the player(s). "removePlatformView" removes the overlay, back button, host, and container, and disposes any remaining players.

I am attaching the logs of the app while running: ios.txt

The actual issue is that when the iOS device is connected to the dashcam's Wi-Fi, iOS uses Mobile Data for the app's live-streaming traffic even though Wi-Fi is the main communication channel. The iOS device takes approximately 30 seconds to display the first frame of live footage in the app. Despite being connected to the dashcam's Wi-Fi, the iOS device only settles on the Wi-Fi interface (en0) after multiple attempts, causing the live footage to appear in the app only after this delay.

So, how can we set up the configuration to display the live footage from the dashcam cameras within just 2 to 3 seconds on the iOS device?
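Two levers that are commonly used against exactly these symptoms (a sketch under assumptions, not the app's actual code): cutting libVLC's network caching and forcing RTP over TCP to reduce startup probing, and pinning preliminary control connections to the Wi-Fi interface with the Network framework so they fail fast instead of silently routing over cellular. The option values, host, and port below are illustrative:

    import MobileVLCKit
    import Network

    // Sketch: lower-latency VLCMedia. network-caching and rtsp-tcp are
    // standard libVLC options passed via VLCMedia.addOptions; the 300 ms
    // value is an illustrative starting point, not a recommendation.
    func makeLowLatencyMedia(url: URL) -> VLCMedia {
        let media = VLCMedia(url: url)
        media.addOptions([
            "network-caching": 300, // ms of network buffering
            "rtsp-tcp": true        // force RTP over TCP; avoids UDP timeouts/fallback
        ])
        return media
    }

    // Sketch: require the Wi-Fi interface for control traffic so a bad
    // route surfaces immediately rather than after a 30 s cellular detour.
    func openWiFiOnlyConnection(to dashcamHost: NWEndpoint.Host,
                                port: NWEndpoint.Port) -> NWConnection {
        let params = NWParameters.tcp
        params.requiredInterfaceType = .wifi
        params.prohibitedInterfaceTypes = [.cellular]
        let connection = NWConnection(host: dashcamHost, port: port, using: params)
        connection.start(queue: .main)
        return connection
    }

Binding VLC's source address is what the VLC_SOURCE_ADDRESS approach above attempts; the NWParameters variant at least makes a wrong route visible immediately instead of after a timeout.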
Replies: 0 · Boosts: 0 · Views: 167 · Activity: 2w
SCK/replayd behaviors and delays
We're troubleshooting SCK issues. They occur in a relatively small share of sessions, but the lack of context, and our inability to advise the customer on how to make the behavior more predictable and reliable, is problematic. Generally, there are two distinct issues, which may or may not have the same root cause:

1. Failure to establish the SCK session. This usually manifests within the app as the SCShareableContent.getWithCompletionHandler call either never invoking the completion handler, or taking a prohibitively long time (we usually give it 3-10 seconds before giving up). In the system log it may look like this: (log omitted - suspecting it triggers the content filter) Note the 6-second delay to completion of fetchShareableContentWithOption (normally it's a 30-40 ms operation).

2. Sometimes we'd see the stream established, but some minutes (or even hours) into the recording we'd stop receiving frames.

Both scenarios are likely to occur when disk space is low, with a reliable repro of issue #2 below 8 GB of free space (in that case, we've seen replayd silently dropping the session, without ever notifying the client ... improving the API could go a long way there). However, among recent occurrences, while most machines had less than 100 GB available, we've seen it on machines with as much as 500 GB free. Unfortunately, it's almost never reproducible in the dev environment, so we have to rely on diagnostics we're able to collect in the field -- which have turned up nothing obvious yet. I'd like to understand the root cause of both scenarios better, and/or what specific framework behaviors can cause them.
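Given that the completion handler sometimes never fires, a guard like the following (a sketch; the 5-second budget is arbitrary) at least converts a hung shareable-content fetch into a detectable, reportable event:

    import ScreenCaptureKit

    // Sketch: wrap the shareable-content fetch with an explicit timeout so
    // a hung replayd call surfaces as a clear failure instead of a
    // never-invoked completion handler.
    enum ShareableContentError: Error { case timedOut }

    func fetchShareableContent(timeout: TimeInterval = 5,
                               completion: @escaping (Result<SCShareableContent, Error>) -> Void) {
        var finished = false
        let lock = NSLock()

        DispatchQueue.main.asyncAfter(deadline: .now() + timeout) {
            lock.lock(); defer { lock.unlock() }
            guard !finished else { return }
            finished = true
            completion(.failure(ShareableContentError.timedOut))
        }

        SCShareableContent.getWithCompletionHandler { content, error in
            lock.lock(); defer { lock.unlock() }
            guard !finished else { return }  // timeout already fired
            finished = true
            if let content {
                completion(.success(content))
            } else {
                completion(.failure(error ?? ShareableContentError.timedOut))
            }
        }
    }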
Replies: 0 · Boosts: 0 · Views: 434 · Activity: 1w