Streaming

Deep dive into the technical specifications that influence seamless playback for streaming services, including bitrates, codecs, and caching mechanisms.

Posts under Streaming subtopic

Each post below is listed with its reply count, boost count, view count, and most recent activity.

SCK/replayd behaviors and delays
We're troubleshooting ScreenCaptureKit (SCK) issues. They occur in a relatively small share of sessions, but the lack of context, and of any way to advise the customer on how to make the behavior more predictable and reliable, is problematic. Broadly, there are two distinct issues, which may or may not share a root cause: (1) Failure to establish an SCK session. This usually manifests in the app as the SCShareableContent.getWithCompletionHandler call either never invoking the completion handler or taking a prohibitively long time (we usually give it 3-10 seconds before giving up). In the system log it may look like this: (log omitted - suspecting it triggers the content filter). Note the 6-second delay before fetchShareableContentWithOption completes (normally a 30-40 ms operation). (2) Sometimes the stream is established, but some minutes (or even hours) into the recording we stop receiving frames. Both scenarios are more likely when disk space is low, with a reliable repro of issue #2 below 8 GB of free space (in that case, we've seen replayd silently dropping the session without ever notifying the client ... an improved API could go a long way there). However, among recent occurrences, while most machines had less than 100 GB available, we've also seen it on machines with as much as 500 GB free. Unfortunately, it's almost never reproducible in our dev environment, so we have to rely on the diagnostics we can collect in the field, which have shown nothing obvious yet. I'd like to better understand the root cause of both scenarios and/or which specific framework behaviors can cause them.
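A minimal Swift sketch of the kind of client-side guard described above: wrapping SCShareableContent.getWithCompletionHandler in a timeout so a hung replayd does not block the app indefinitely. The 5-second default and the "SCKFetch" error domain are illustrative assumptions, not part of ScreenCaptureKit.

import Foundation
import ScreenCaptureKit

/// Hypothetical helper: fetch shareable content, but give up after `timeout` seconds.
func fetchShareableContent(timeout: TimeInterval = 5,
                           completion: @escaping (Result<SCShareableContent, Error>) -> Void) {
    let lock = NSLock()
    var finished = false

    // Deliver a result at most once, whichever path wins the race.
    let finish: (Result<SCShareableContent, Error>) -> Void = { result in
        lock.lock(); defer { lock.unlock() }
        if finished { return }
        finished = true
        completion(result)
    }

    // Normally completes in tens of milliseconds; in the problem case it may never return.
    SCShareableContent.getWithCompletionHandler { content, error in
        if let content {
            finish(.success(content))
        } else {
            let fallback = NSError(domain: "SCKFetch", code: -1, userInfo: nil)  // illustrative domain
            finish(.failure(error ?? fallback))
        }
    }

    // Client-side deadline so the app can surface an error instead of hanging.
    DispatchQueue.global().asyncAfter(deadline: .now() + timeout) {
        finish(.failure(NSError(domain: "SCKFetch", code: -2, userInfo: [
            NSLocalizedDescriptionKey: "Timed out waiting for shareable content"
        ])))
    }
}

This does not address the root cause, but it lets the app fail fast and collect diagnostics instead of waiting indefinitely.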
Replies: 0 | Boosts: 0 | Views: 433 | Activity: 6d
Why doesn’t AVPlayer / AVFoundation support MPEG-DASH (MPD)? Any public rationale?
Hi, I understand that AVPlayer/AVFoundation doesn’t natively play MPEG-DASH manifests (.mpd) today, while HLS is supported and widely documented by Apple. I’m not asking for roadmap commitments, but I’d like to understand whether there is any publicly documented rationale for not supporting DASH/MPD in AVFoundation (e.g., technical constraints, platform integration, DRM ecosystem, power/performance considerations, etc.). Questions: Is there any Apple statement / documentation explaining why DASH (MPD) isn’t supported in AVFoundation? Is Apple’s recommended approach still “provide HLS for Apple clients” (potentially sharing CMAF segments and generating separate manifests)? If there’s no public rationale, is filing Feedback Assistant the best channel for requesting MPD playback support? Thanks!
Replies: 1 | Boosts: 1 | Views: 319 | Activity: 1w
Inquiries about AVPlayer's ABR switching logic and control APIs for HLS/LL-HLS
Hello, I am developing a custom player SDK using AVPlayer to support HLS and LL-HLS live streaming. I have some questions about the internal logic of AVPlayer regarding ABR, as this information is not explicitly covered in the documentation. ABR Switching Logic: Does AVPlayer trigger bitrate switching primarily based on stall occurrences (buffer starvation)? I am curious if the switching logic is reactive to stalls or if it proactively switches to prevent them based on throughput estimation. Developer Controls for ABR: To influence or control the ABR selection, are preferredPeakBitRate and preferredForwardBufferDuration the only properties available to developers? Are there any other recommended APIs to assist with ABR decisions? Thank you for your help.
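Regarding the second question, a short Swift sketch of the two properties mentioned in the post (both are real AVPlayerItem properties), plus the related cap for expensive networks. The URL and numeric values are illustrative placeholders; none of these force a specific variant, they only cap or hint.

import AVFoundation

let item = AVPlayerItem(url: URL(string: "https://example.com/live/master.m3u8")!)  // placeholder URL

// Cap the variants ABR may choose; 0 (the default) means no cap.
item.preferredPeakBitRate = 4_000_000          // bits per second (illustrative value)

// Hint at how much media to buffer ahead; 0 lets AVPlayer decide.
item.preferredForwardBufferDuration = 6        // seconds (illustrative value)

// A separate cap that applies only over expensive (cellular/hotspot) networks.
if #available(iOS 15.0, *) {
    item.preferredPeakBitRateForExpensiveNetworks = 2_000_000
}

let player = AVPlayer(playerItem: item)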
Replies: 2 | Boosts: 0 | Views: 236 | Activity: 1w
Live streaming issue over RTSP
We have the application 'ADS Smart', a companion application for our ADS Dashcam. We offer a feature that lets users stream live footage from the dashcam cameras through the app. Currently we are seeing a delay of 30+ seconds before the live stream appears, i.e. the first frame of the live footage takes around 30+ seconds to display in the app. We are using the MobileVLCKit library to play the streams. The current flow of the code:
1. Flutter triggers native playback via a method channel. The Dart side calls the iOS method channel <identifier_name>/ts_player with method playTSFromURL, passing: url (e.g. rtsp://... for live), playerId, viewId (a stable ID used to host native UI), showControls, and an optional localIp.
2. AppDelegate receives the call and prepares networking. Entry point: the AppDelegate.tsChannel handler for "playTSFromURL" in AppDelegate.swift. It resolves the Wi-Fi interface and local IP if possible: it sets VLC_SOURCE_ADDRESS to the Wi-Fi IP (when available) to prefer Wi-Fi for the stream, uses NWPathMonitor and direct interface inspection to find the Wi-Fi interface (e.g. en0) and IP, and kicks off best-effort route priming to the dashcam IP/ports (non-blocking), see establishWiFiRoutePriority.
3. AppDelegate chooses the right player implementation. createPlayerForURL(_:) decides: RTSP URLs (rtsp://...) use the VLCKit-backed player (class TSStreamPlayer, which provides a VLC-backed video player for iOS, handling playback of Transport Stream (TS) URLs with strict main-thread UI updates, view safety, and stream management, using MobileVLCKit); .ts files use the VLCKit-based player for playing already recorded videos in the app. If the selected player supports extras (e.g. TSStreamPlayerExtras), it sets the local IP (if resolved) and the Wi-Fi interface name.
4. AppDelegate creates the native platform-view container and overlay. platformView(for:parent:showControls:) creates a container UIView attached to the Flutter root view and adds a dedicated child videoHost[viewId], the host UIView for rendering video. If showControls == true, it adds TSPlayerControlsOverlay over the video and wires overlay callbacks back to Flutter via controlChannel (/player_controls). If showControls == false, it adds a minimal back button and wires it to onGalleryBack.
5. The player starts playback inside the host view. player.playTSFromURL(urlString, in: host) { success, error in ... } is called on the main thread. For RTSP/TS streams this is handled by TSStreamPlayer (VLCKit).
6. Success/failure is reported back to Flutter. The completion closure invoked in step 5 returns true on first real playback or an error message on failure. The method channel result responds with true, so Flutter knows playback started, or with a FlutterError, so Flutter can show an error.
7. Stopping and cleanup. "stop" on tsChannel stops and disposes the player(s). "removePlatformView" removes the overlay, back button, host, and container, and disposes any remaining players.
I am attaching the logs of the app while running. The actual issue: when the iOS device is connected to the dashcam's Wi-Fi, iOS uses mobile data for the app's live-streaming traffic even though Wi-Fi is the intended communication channel, and the device takes approximately 30 seconds to display the first frame of live footage. Despite being connected to the dashcam's Wi-Fi, the device only resolves the Wi-Fi interface (en0) after multiple attempts, causing the live footage to appear in the app only after this delay.
So how can we configure things so that the live footage from the dashcam cameras is displayed on the iOS device within 2 to 3 seconds? ios.txt
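Since the delay appears to come from the stream going out over cellular until the Wi-Fi interface is resolved, one thing worth trying is waiting briefly for a Wi-Fi-constrained path before starting playback. A minimal Swift sketch using NWPathMonitor; the startPlayback callback and the 3-second fallback are illustrative assumptions, not part of the app described above.

import Foundation
import Network

// Hypothetical helper: wait until a Wi-Fi path is available (or a short deadline
// passes), then hand the resolved interface name to the player setup.
func waitForWiFi(deadline: TimeInterval = 3, then startPlayback: @escaping (String?) -> Void) {
    let monitor = NWPathMonitor(requiredInterfaceType: .wifi)
    let queue = DispatchQueue(label: "wifi.monitor")
    var started = false

    monitor.pathUpdateHandler = { path in
        guard path.status == .satisfied, !started else { return }
        started = true
        // First Wi-Fi interface on the path, e.g. "en0".
        let name = path.availableInterfaces.first(where: { $0.type == .wifi })?.name
        monitor.cancel()
        DispatchQueue.main.async { startPlayback(name) }
    }
    monitor.start(queue: queue)

    // Don't wait forever: fall back after `deadline` seconds and let the player use the default route.
    queue.asyncAfter(deadline: .now() + deadline) {
        guard !started else { return }
        started = true
        monitor.cancel()
        DispatchQueue.main.async { startPlayback(nil) }
    }
}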
Replies: 0 | Boosts: 0 | Views: 166 | Activity: 1w
Broadcast Upload Extension does not work on Apple Vision Pro, throws "getMXSessionProperty unsupported"
I am working on a screen-recording feature for Apple Vision Pro. When I use a Broadcast Upload Extension and tap the record button, the Xcode console shows this error: <<<< FigAudioSession(AV) >>>> audioSessionAVAudioSession_CopyMXSessionProperty signalled err=-19224 (kFigAudioSessionError_UnsupportedOperation) (getMXSessionProperty unsupported) at FigAudioSession_AVAudioSession.m:606. We create and configure the project as follows: create an Apple Vision project; create a Broadcast Upload Extension target; add an App Group for the project target and the extension target, both using the same identifier; add the "Main Camera Access" and "Passthrough in Screen Capture" capabilities for all targets; add "NSScreenCaptureUsageDescription" and "NSMicrophoneUsageDescription" to the Info.plist; add a record button to the view; run a debug build on an Apple Vision Pro device. After tapping the record button, the error above is logged.
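For context, a minimal Swift sketch of the ReplayKit entry point a Broadcast Upload Extension provides. This is the standard skeleton only, not a fix for the audio-session error above.

import ReplayKit
import CoreMedia

// Principal class of the Broadcast Upload Extension target.
class SampleHandler: RPBroadcastSampleHandler {

    override func broadcastStarted(withSetupInfo setupInfo: [String: NSObject]?) {
        // Broadcast began; set up encoders/uploaders here.
    }

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                      with sampleBufferType: RPSampleBufferType) {
        switch sampleBufferType {
        case .video:
            // Handle captured video frames.
            break
        case .audioApp, .audioMic:
            // Handle app/mic audio; the error above is logged around audio-session setup.
            break
        @unknown default:
            break
        }
    }

    override func broadcastFinished() {
        // Clean up when the user stops the broadcast.
    }
}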
Replies: 1 | Boosts: 0 | Views: 510 | Activity: 2w
CoreMediaErrorDomain -15628 playback failure in iOS 26 (React Native / AVPlayer, HLS stream)
Hi, After updating to iOS 26, our app is seeing playback failures with AVPlayer. The same code and streams work fine on iOS 18 and earlier.
Error - Domain[CoreMediaErrorDomain]:Code[-15628]:Desc[The operation couldn’t be completed.]:Underlying Error Domain[(null)]:Code[0]:Desc[(null)]
Environment: iOS 26; React Native 0.69; video library react-native-video (AVPlayer under the hood); stream type HLS (m3u8) with .ts segments.
Observed behaviour: playback works initially on iOS 26, then the stream fails at runtime after a few seconds or minutes (not on first load). Network logs show 307 redirects on some segment requests, after which AVPlayer throws the above error. Playback fails intermittently on slow or unstable networks.
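A small Swift sketch of one way to capture more detail when this happens: listening for the failed-to-play notification and dumping the item's error log, which often records the failing segment URI and server status (e.g. the 307s). The notification and log APIs are standard AVFoundation; the print-based handling is just for illustration.

import AVFoundation

final class PlaybackFailureObserver {
    private var token: NSObjectProtocol?

    func observe(_ item: AVPlayerItem) {
        token = NotificationCenter.default.addObserver(
            forName: .AVPlayerItemFailedToPlayToEndTime,
            object: item,
            queue: .main
        ) { note in
            // The underlying error (e.g. CoreMediaErrorDomain -15628) is attached to the notification.
            if let error = note.userInfo?[AVPlayerItemFailedToPlayToEndTimeErrorKey] as? Error {
                print("Playback failed: \(error)")
            }
            // The error log records per-request failures: domain, status code, and URI.
            if let log = item.errorLog() {
                for event in log.events {
                    print("errorLog:", event.errorDomain, event.errorStatusCode, event.uri ?? "-")
                }
            }
        }
    }

    deinit {
        if let token { NotificationCenter.default.removeObserver(token) }
    }
}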
Replies: 6 | Boosts: 18 | Views: 1.4k | Activity: 3w
CoreMediaErrorDomain error -42681
We are trying to port our code to Apple TV on tvOS 17.6. While running the sample we are getting the error CoreMediaErrorDomain -42681. We understand this error occurs when the FairPlay license (CKC) returned by the server contains incompatible or malformed version information that the iOS/tvOS FairPlay CDM cannot parse. Can you please specify which FairPlay version number tvOS 17.6 expects, and which fields are mandatory in the FPS version metadata?
Replies: 0 | Boosts: 0 | Views: 251 | Activity: 3w
Disconnect from AirPlay device programmatically
Hello there, I'm trying to implement a feature that uses AirPlay with Apple TV. I want to disconnect from the device programmatically when something happens; by "something" I mean a situation where the user wants to stop broadcasting (for example, closing the PiP window on their phone). I use this snippet: try audioSession.setCategory(.playAndRecord, options: .defaultToSpeaker) try audioSession.setActive(true, options: .notifyOthersOnDeactivation) It works sometimes but not always (it works on iOS 18 but not on iOS 17 or ...). So I thought it was a bug and created a ticket in Feedback Assistant (FB21220013). Support told me to write a post on the forum.
Replies: 2 | Boosts: 0 | Views: 242 | Activity: Dec ’25
Query regarding migrating from FairPlay Streaming SDK version 4.5.4 to version 26
Our license service is based on version 4.5.4, and we make use of the sample .c/.h files for building the license service. We are told that version 4.5.4 is going to be deprecated in 2026 and that we should migrate to the latest SDK, version 26. When we explored the SDK, we noticed that only Python- and Swift-based SDKs are provided. Does Apple also provide a C/C++-based SDK, as that would be easier for us to integrate? If yes, please share the SDK package and a sample license service solution.
Replies: 1 | Boosts: 0 | Views: 123 | Activity: Dec ’25
Are there known cases where DepthData is empty while Face ID is working?
We are experiencing an issue related to DepthData from the TrueDepth camera on a specific device. On December 1, we tested with the complainant's device (iPhone 14, iOS 26.0.1) and observed that the depth image is received with empty values. However, the same implementation works normally on iPhone 17 Pro Max (iOS 26.1) and iPhone 13 Pro Max (iOS 26.0.1), where depth data is delivered correctly. In the problematic case: the TrueDepth camera is active; Face ID works normally; the app receives a DepthData object, but all values are empty (0), not nil. Because the DepthData object is not nil, it is difficult to detect the issue through software fallback handling. We developed the feature with reference to the following Apple sample: https://developer.apple.com/documentation/AVFoundation/streaming-depth-data-from-the-truedepth-camera We would like to ask: Are there known cases where Face ID functions normally but DepthData from the TrueDepth camera is returned with empty values? If so, is there a recommended approach for identifying or handling this situation? Any guidance from Apple engineers or the community would be greatly appreciated. Thank you.
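As a stop-gap for the fallback-handling problem mentioned above, one possible check is to sample the depth map and treat an all-zero buffer as "no usable depth". A Swift sketch converting to a 32-bit float map first; the helper name, sampling stride, and emptiness criterion are illustrative assumptions.

import AVFoundation

// Hypothetical check: returns true if the depth map appears to contain only zeros.
func depthMapLooksEmpty(_ depthData: AVDepthData) -> Bool {
    // Convert to Float32 so the buffer layout is predictable.
    let converted = depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
    let map = converted.depthDataMap

    CVPixelBufferLockBaseAddress(map, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(map, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(map) else { return true }
    let width = CVPixelBufferGetWidth(map)
    let height = CVPixelBufferGetHeight(map)
    let rowBytes = CVPixelBufferGetBytesPerRow(map)

    // Sample every 8th pixel of every 8th row; enough to detect an all-zero frame cheaply.
    for y in stride(from: 0, to: height, by: 8) {
        let row = base.advanced(by: y * rowBytes).assumingMemoryBound(to: Float32.self)
        for x in stride(from: 0, to: width, by: 8) where row[x] != 0 && row[x].isFinite {
            return false    // found a real depth value
        }
    }
    return true
}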
Replies: 0 | Boosts: 0 | Views: 171 | Activity: Dec ’25
Playing Apple Fairplay encrypted content on iOS 26
For iOS 17 we've had no problem playing Apple FairPlay encrypted content with keys delivered from our key server, running first on FairPlay Streaming Server SDK 5.1 and subsequently on FairPlay Streaming Server SDK 26. The app is built and deployed using Xcode Version 26.1.1 (17B100) with no changes to the code, and, as expected, the content continued to be successfully decrypted and played (so far so good). However, as soon as a device was updated to iOS 26, that device would no longer play the encrypted content. Devices remaining on iOS 17 continue to work normally, and the debugging logs are a sanity check that confirms this. Is anyone else experiencing this issue? Here's the code (you should be able to drop it into a fresh iOS Xcode project and provide a server URL, content URL and certificate).
Replies: 0 | Boosts: 1 | Views: 193 | Activity: Dec ’25
HLS CMAF playlist with multiple DRM changes
Hi. Is it possible to have a playlist that starts with a stream in the clear, then switches to a DRM-encrypted period, and then switches back to clear? Can I just do the following (I've removed the video segment lines; I'm only interested in the parts where I want to signal the new DRM region)?
#EXT-X-MAP:URI="video_2_10000000_t17586401730000000_init.mp4"
#EXT-X-KEY:METHOD=NONE
...
#EXT-X-MAP:URI="video_2_10000000_t17587374640000000_init.mp4"
#EXT-X-KEY:METHOD=SAMPLE-AES,URI="skd://5df0b36ac4bb4d0ff954a73b502ac332",KEYFORMAT="com.apple.streamingkeydelivery",KEYFORMATVERSIONS="1"
...
#EXT-X-MAP:URI="video_2_10000000_t17587376740000000_init.mp4?"
#EXT-X-KEY:METHOD=NONE
Should I insert discontinuity tags or something else? Right now what I observe is some audio drops when I try this.
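A sketch of one arrangement commonly used when the encryption state (and the init segment) changes mid-playlist: marking each transition with EXT-X-DISCONTINUITY, since a new #EXT-X-MAP implies a change in packaging parameters. Segment names and durations are illustrative placeholders, and this is not a confirmed fix for the audio drops; it is only one arrangement to test.

#EXTM3U
#EXT-X-VERSION:7
#EXT-X-TARGETDURATION:6

#EXT-X-MAP:URI="clear_init.mp4"
#EXT-X-KEY:METHOD=NONE
#EXTINF:6.0,
clear_001.m4s

#EXT-X-DISCONTINUITY
#EXT-X-MAP:URI="drm_init.mp4"
#EXT-X-KEY:METHOD=SAMPLE-AES,URI="skd://example-key-id",KEYFORMAT="com.apple.streamingkeydelivery",KEYFORMATVERSIONS="1"
#EXTINF:6.0,
encrypted_001.m4s

#EXT-X-DISCONTINUITY
#EXT-X-MAP:URI="clear_init2.mp4"
#EXT-X-KEY:METHOD=NONE
#EXTINF:6.0,
clear_002.m4s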
Replies: 1 | Boosts: 0 | Views: 296 | Activity: Dec ’25
AVPlayer: escaped characters displayed wrong from WebVTT subtitles (HLS)
Quotes are displayed incorrectly in subtitles shown by AVPlayerViewController when streaming VOD content using HLS: a single quote ' (escaped as &apos; per the VTT specification) is displayed as "apos;", and a double quote " (escaped as &quot;) is displayed as "quot;". The same stream works fine in VLC, which shows the quotes correctly in subtitles. The subtitle VTT files are served with Content-Type: text/vtt and look like:
WEBVTT
X-TIMESTAMP-MAP=LOCAL:490014:06:04.000,MPEGTS:158764568760056
example line:
490014:05:46.000 --> 490014:05:50.440 align:start line:83% position:14%
lære dig endnu bedre at kende."
and the playlist has:
#EXT-X-MEDIA:TYPE=SUBTITLES,GROUP-ID="subs",LANGUAGE="da",NAME="Dansk",AUTOSELECT=YES,CHARACTERISTICS="public.accessibility.transcribes-spoken-dialog,public.accessibility.describes-music-and-sound",URI="subs/dan_5/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=780000,CODECS="mp4a.40.5,avc1.42c01e",RESOLUTION=256x144,AUDIO="audio-aac",SUBTITLES="subs"
Adding 'wvtt' to the CODECS list in the playlist does not make a difference. Is this a known bug? Is there a workaround? I guess AVResourceLoaderDelegate could be used to intercept and rewrite the subtitle files, but that seems like quite a hack and not really what it is intended for.
Replies: 2 | Boosts: 0 | Views: 144 | Activity: Dec ’25
When does AVPlayer exclude _HLS_part query param in LL-HLS requests?
Hello, I'm investigating an issue with LL-HLS playback using AVPlayer, specifically during DVR Live seeking (seeking to a past time). I noticed that in certain seeking scenarios, AVPlayer sends a Blocking Playlist Reload request that includes the _HLS_msn parameter but is missing the _HLS_part parameter. While I understand this is compliant with the HLS spec, I would like to know the specific criteria AVPlayer uses to decide when to drop the _HLS_part parameter. Does AVPlayer intentionally omit the part info when it determines that loading a specific partial segment is unnecessary during a seek operation? Clarification on this behavior would help us greatly in debugging our stream delivery. Thanks in advance.
Replies: 1 | Boosts: 0 | Views: 123 | Activity: Dec ’25
Will the new Automix feature in iOS 26 be available for third-party apps using MusicKit?
Hi everyone, We’re currently developing a music-based app using MusicKit, and we recently noticed that iOS 26 beta introduces a new “Automix” feature in the Apple Music app. This enables seamless DJ-style transitions between songs—beyond the standard crossfade functionality. We’re trying to understand: Will this Automix feature be accessible to third-party apps that use MusicKit? If not available in the initial iOS 26 release, is there a plan to expose it through public APIs in a future update? Is there any technical documentation, WWDC session, or roadmap info regarding Automix support via MusicKit? This functionality would be a significant enhancement for our app, especially for intelligent audio transitions and curated playlists. Thanks.
Replies: 2 | Boosts: 3 | Views: 731 | Activity: Nov ’25
Issues with the AVMetricEventStreamPublisher example code from "Discover media performance metrics in AVFoundation"
Hi everyone! I've been working with AVFoundation and trying to use AVMetricEventStreamPublisher to discover media performance metrics, as described in the Apple documentation. https://developer.apple.com/cn/videos/play/wwdc2024/10113/?time=508 However, when following the example code, I'm not getting the expected results. The performance metrics for both audio and video don't seem to be captured properly. Has anyone successfully used this example code? If so, could you share your experience or any solutions you've found? Any tips or insights would be greatly appreciated. Thanks in advance! P.S. the example code:
AVPlayerItem *item = ...
AVMetricEventStream *eventStream = [AVMetricEventStream eventStream];
id subscriber = [[MyMetricSubscriber alloc] init];
[eventStream setSubscriber:subscriber queue:mySerialQueue];
[eventStream subscribeToMetricEvent:[AVMetricPlayerItemLikelyToKeepUpEvent class]];
[eventStream subscribeToMetricEvent:[AVMetricPlayerItemPlaybackSummaryEvent class]];
[eventStream addPublisher:item];
Replies: 2 | Boosts: 0 | Views: 155 | Activity: Nov ’25
MusicKit broken in simulator with current tools? Can't get token.
Just updated my computer, phone, and dev tools to the latest versions of everything. Now when I run my app in a previously-working simulator (iPhone 16 w. iOS 18.5) I get: Failed retrieving MusicKit tokens: fetching the developer token is not supported in the simulator when running on this version of macOS; please upgrade your Mac to macOS Ventura. Also: <ICCloudServiceStatusMonitor: 0x600003320e60>: Invoking 1 completion handler for MusicKit tokens. error=<ICError.DeveloperTokenFetchingFailed (-8200) "Failed to fetch media token from <AMSMediaTokenService: 0x6000029049a0>." { underlyingErrors: [ <AMSErrorDomain.300 "Token request encoding failed The token request encoder finished with an error." { userInfo: { AMSDescription : "Token request encoding failed", AMSFailureReason : "The token request encoder finished with an error." }; underlyingErrors: [ <AMSErrorDomain.5 "Anisette Failed Platform not supported" { userInfo: { AMSDescription : "Anisette Failed", AMSFailureReason : "Platform not supported" }; Anybody know what gives here? The Ventura message is absurd because I'm on Tahoe 26.1. The same code works on a physical phone running iOS 26.
Replies: 0 | Boosts: 0 | Views: 149 | Activity: Nov ’25
Feature / Workaround wanted: Seamless, Automated AirPlay Screen Streaming on visionOS for Demos
Hello Apple team and developer community, I am preparing a visionOS app for a fair environment, where we want to automatically stream the current experience to a nearby monitor via AirPlay, without requiring guests or staff to manually interact with the Control Center or AirPlay pickers all the time. The goal is to provide a smooth, frictionless setup so attendees can focus on the demo, not the configuration. Feature Request: A supported API or method to programmatically start/stop AirPlay video streaming (mirroring or external playback) from within a visionOS app, allowing the current experience to be instantly displayed on an external monitor or Apple TV for the audience. Context & Rationale: In a trade fair or exhibition setting, rapid guest turnaround and minimal staff intervention are crucial. Having to manually guide each visitor through AirPlay setup is impractical. As I understood, AVRoutePickerView can be used for this on iOS/macOS, but this is not available in visionOS. Enabling similar automated streaming on visionOS would make the device far more suitable for live demos and public showcases. Questions: Are there any supported workarounds or best practices for enabling automated screen streaming or AirPlay initiation on visionOS in public demo environments that I missed? Is Apple considering adding programmatic AirPlay control or accessibility features to support such use cases in future visionOS releases? Thank you for considering this request! If there are recommended patterns, entitlements, or accessibility solutions we could explore for trade fair scenarios, your guidance would be greatly appreciated. Best regards, Julian Zürn - IPI, HS Kempten
Replies: 2 | Boosts: 0 | Views: 368 | Activity: Nov ’25
Low-Latency HLS stream playback issue on iOS 26
We have a Low-Latency HLS stream. On iOS 26, even though the available bandwidth is sufficient, AVPlayer (used via AVPlayerViewController) still selects a low-bandwidth variant (e.g., RESOLUTION=640x360) for playback instead of a higher-bandwidth one (e.g., RESOLUTION=1920x1080). This works fine on iOS 18 and earlier. What could be the solution to this issue?
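When diagnosing this kind of variant-selection issue, it can help to log the player item's access log, which records the variant bitrate AVPlayer selected (indicatedBitrate) and the throughput it measured (observedBitrate). A brief Swift sketch; the notification-driven print logging is illustrative, not a fix.

import AVFoundation

// Log each new access-log entry so variant switches and measured throughput are visible.
func logVariantSelection(for item: AVPlayerItem) -> NSObjectProtocol {
    NotificationCenter.default.addObserver(
        forName: .AVPlayerItemNewAccessLogEntry,
        object: item,
        queue: .main
    ) { _ in
        guard let event = item.accessLog()?.events.last else { return }
        // indicatedBitrate is the BANDWIDTH of the variant AVPlayer chose;
        // observedBitrate is the throughput it measured over the network.
        print("indicatedBitrate:", event.indicatedBitrate,
              "observedBitrate:", event.observedBitrate,
              "stalls:", event.numberOfStalls,
              "droppedFrames:", event.numberOfDroppedVideoFrames)
    }
}

If the log shows high observed throughput while indicatedBitrate stays at the 640x360 variant's BANDWIDTH, that is useful evidence to attach to a Feedback Assistant report.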
Replies: 1 | Boosts: 0 | Views: 207 | Activity: Nov ’25