Hello, I am developing a custom player SDK using AVPlayer to support HLS and LL-HLS live streaming. I have some questions about the internal logic of AVPlayer regarding ABR, as this information is not explicitly covered in the documentation.
ABR Switching Logic: Does AVPlayer trigger bitrate switching primarily based on stall occurrences (buffer starvation)? I am curious if the switching logic is reactive to stalls or if it proactively switches to prevent them based on throughput estimation.
Developer Controls for ABR: To influence or control the ABR selection, are preferredPeakBitRate and preferredForwardBufferDuration the only properties available to developers? Are there any other recommended APIs to assist with ABR decisions?
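For reference, this is roughly how I apply the two properties mentioned above today. A minimal sketch: the URL and the numeric values are placeholders rather than recommendations, and preferredMaximumResolution is simply another knob I am aware of, not necessarily an ABR control in the sense asked here.

import AVFoundation
import CoreGraphics

// Minimal sketch: placeholder URL and values, not tuned recommendations.
let url = URL(string: "https://example.com/live/master.m3u8")!
let item = AVPlayerItem(url: url)

// Upper bound on the variant bit rate the player should select
// (bits per second); 0 means no limit, let the player decide.
item.preferredPeakBitRate = 4_000_000

// Hint for how many seconds of media to buffer ahead; 0 means automatic.
item.preferredForwardBufferDuration = 10

// Another related knob: cap the variant resolution (a zero size means no cap).
item.preferredMaximumResolution = CGSize(width: 1920, height: 1080)

let player = AVPlayer(playerItem: item)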
Thank you for your help.
Hi,
I understand that AVPlayer/AVFoundation doesn’t natively play MPEG-DASH manifests (.mpd) today, while HLS is supported and widely documented by Apple.
I’m not asking for roadmap commitments, but I’d like to understand whether there is any publicly documented rationale for not supporting DASH/MPD in AVFoundation (e.g., technical constraints, platform integration, DRM ecosystem, power/performance considerations, etc.).
Questions:
Is there any Apple statement / documentation explaining why DASH (MPD) isn’t supported in AVFoundation?
Is Apple’s recommended approach still “provide HLS for Apple clients” (potentially sharing CMAF segments and generating separate manifests)?
If there’s no public rationale, is filing Feedback Assistant the best channel for requesting MPD playback support?
Thanks!
We're troubleshooting SCK issues. They occur with a relatively small amount of sessions, but lack of context and/or ability to advise the customer on how they could make behavior more predictable and reliable is problematic.
Generally, there are two distinct issues, which may or may not share the same root cause:
Failure to establish an SCK session. This usually manifests in the app as the SCShareableContent.getWithCompletionHandler call either never invoking its completion handler or taking a prohibitively long time (we usually give it 3-10 seconds before giving up; a sketch of the timeout guard we use is at the end of this post). In the system log it may look like this:
(log omitted - suspecting it triggers the content filter)
Note the 6-second delay before fetchShareableContentWithOption completes (normally it's a 30-40 ms operation).
Sometimes we see the stream established, but some minutes (or even hours) into the recording we stop receiving frames.
Both scenarios are more likely to occur when disk space is low, with a reliable repro of problem #2 below 8 GB of free space (in that case, we've seen replayd silently drop the session without ever notifying the client; improving the API could go a long way there). However, among recent occurrences, while most machines had less than 100 GB available, we've seen it on machines with as much as 500 GB free.
Unfortunately, it's almost never reproducible in our dev environment, so we have to rely on diagnostics we're able to collect in the field, which have turned up nothing obvious yet.
I'd like to better understand the root cause of both scenarios and/or which specific framework behaviors can cause them.
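For reference, the timeout guard mentioned above looks roughly like the sketch below. The 5-second limit and the error type are illustrative, not what we ship verbatim.

import Foundation
import ScreenCaptureKit

enum ShareableContentError: Error {
    case timedOut
}

// Wrap SCShareableContent.getWithCompletionHandler so the caller always hears
// back, even when the system never invokes the completion handler.
func fetchShareableContent(timeout: TimeInterval = 5,
                           completion: @escaping (Result<SCShareableContent, Error>) -> Void) {
    var finished = false
    let lock = NSLock()

    // Give up if the completion handler hasn't fired within `timeout` seconds.
    DispatchQueue.global().asyncAfter(deadline: .now() + timeout) {
        lock.lock(); defer { lock.unlock() }
        guard !finished else { return }
        finished = true
        completion(.failure(ShareableContentError.timedOut))
    }

    SCShareableContent.getWithCompletionHandler { content, error in
        lock.lock(); defer { lock.unlock() }
        guard !finished else { return }
        finished = true
        if let content {
            completion(.success(content))
        } else {
            completion(.failure(error ?? ShareableContentError.timedOut))
        }
    }
}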
I want to develop an app for real-time spatial video streaming from one Apple Vision Pro to another Apple Vision Pro, with playback on the receiving device (e.g. as MV-HEVC). Is this possible? If so, how can it be done?
I am trying to build the server for testing on Linux (AlmaLinux 9 VM):
NAME="AlmaLinux"
VERSION="9.7 (Moss Jungle Cat)"
ID="almalinux"
ID_LIKE="rhel centos fedora"
VERSION_ID="9.7"
PLATFORM_ID="platform:el9"
PRETTY_NAME="AlmaLinux 9.7 (Moss Jungle Cat)"
ANSI_COLOR="0;34"
[azuki@AlmaDevVM ~]$ uname -m
x86_64
I have tried the following steps:
Before starting, I ensured that Swift 6 was installed. I referred to https://www.swift.org/install/ for instructions.
Build the library
In Terminal, I used the following commands to compile the Swift library:
cd Development/Key_Server_Module/Swift
swift build -Xbuild-tools-swiftc -DTEST_CREDENTIALS
After building the library, I ran the test cases to ensure the library behaves as expected. All unit tests pass with the development credentials.
• Since I was using an x86_64 machine:
export LD_LIBRARY_PATH=./Sources/prebuilt/x86_64-unknown-linux-gnu/
Run all tests:
swift test -Xbuild-tools-swiftc -DTEST_CREDENTIALS --disable-swift-testing
Build the server
Build the server: Apache
Before starting, I ensured the following:
a. Installed Apache HTTPD and the dev tools, using the following command:
yum install httpd httpd-devel redhat-rpm-config
b. After this, integrated the Swift library built above into the Apache server environment. Used the following command to build the server module with apxs:
• Since I was using an x86_64 machine:
apxs -i -a -c \
  -Wl,-L${PWD}/.build/x86_64-unknown-linux-gnu/debug/ \
  -Wl,-lswift_fpssdk \
  -Wl,-L${PWD}/Sources/prebuilt/x86_64-unknown-linux-gnu -lfpscrypto \
  -Wl,-R${PWD}/.build/x86_64-unknown-linux-gnu/debug \
  server_setup/mod_fps.c
c. Next, copied the dependent libraries to the Apache modules folder using these commands:
• If using an x86_64 machine:
cp Sources/prebuilt/x86_64-unknown-linux-gnu/libfpscrypto.so /usr/lib64/httpd/modules/libfpscrypto.so
cp .build/x86_64-unknown-linux-gnu/debug/libswift_fpssdk.so /usr/lib64/httpd/modules/libswift_fpssdk.so
d. Configuring Apache HTTPD
Configured Apache HTTPD by adding the module and handler to the Apache HTTPD configuration (/etc/httpd/conf/httpd.conf). Note that the apxs command in the previous step may have automatically added the LoadModule line.
Listen 8080
LoadFile /usr/lib64/httpd/modules/libfpscrypto.so
LoadFile /usr/lib64/httpd/modules/libswift_fpssdk.so
LoadModule fps_module /usr/lib64/httpd/modules/mod_fps.so
<Location "/fps">
SetHandler fps_handler
</Location>
Copy the credentials to the Apache modules folder.
cp -r ../credentials /usr/lib64/httpd/modules/
export FPS_CERT_PATH=/usr/lib64/httpd/modules/credentials/test_certificates.json
e. Run your server
You can run the Apache HTTPD server with the configured module by using the following command:
httpd -D FOREGROUND
No issues seen up to this step.
Get SDK version
[azuki@AlmaDevVM Key_Server_Module]$ curl localhost:8080/fps/v
26.0.0
But when I try to generate a license:
[azuki@AlmaDevVM Key_Server_Module]$ curl -d ../Test_Inputs/iOS/spc_ios_hd_lease_2048.json localhost:8080/fps
{"fairplay-streaming-response":{"create-ckc":[{"id":1,"status":-42601}]}}
Can you please suggest what I might be missing here?
The ASk is used by the KSM to derive the dASk, which is then used to decrypt the SK...R1.
If the only thing we give the client is the certificate, how does it encrypt the SK...R1 so that the server is able to process it?
It would be nice to know if this works in general, because I've been getting questions about it and can't provide a helpful answer.
Thanks in advance.
Hello,
I am currently developing a video player using a custom AVPlayer-based SDK and testing LL-HLS live streaming.
I encountered a specific error, CoreMediaErrorDomain -15418, during playback. I have searched through the official documentation and the forums, but I could not find any information regarding this error code.
I would like to inquire about the following:
Description & Cause: What does the error code -15418 specifically represent in the context of CoreMedia and LL-HLS?
Severity: Is this a critical error that halts playback, or is it merely a warning?
Environment Details:
iOS Version: iOS 26.2
Device: iPhone 15 Pro Max
Stream Type: LL-HLS (Low-Latency HLS)
Impact: Quality drops
Any insights or references to documentation would be greatly appreciated.
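For context, this is roughly how I am capturing the code today. A minimal sketch, assuming the error surfaces in the item's error log rather than only as a terminal playback failure.

import AVFoundation

// Observe new error log entries and read the domain/status of each event.
func observeErrorLog(for item: AVPlayerItem) -> NSObjectProtocol {
    return NotificationCenter.default.addObserver(
        forName: AVPlayerItem.newErrorLogEntryNotification,
        object: item,
        queue: .main
    ) { _ in
        guard let events = item.errorLog()?.events else { return }
        for event in events {
            // In the case above I am looking for errorDomain == "CoreMediaErrorDomain"
            // and errorStatusCode == -15418.
            print("domain: \(event.errorDomain), code: \(event.errorStatusCode), comment: \(event.errorComment ?? "-")")
        }
    }
}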
Thank you.
Hello,
I have a problem generating a 2048-bit FairPlay Streaming certificate.
I tried generating an SDK v26.x certificate in two ways.
(1) Use existing certificate
(2) Create new certificate
However, in both cases Apple gives me a certificate bundle containing a 1024-bit certificate.
(fps_certificate.bin)
I uploaded a 2048-bit CSR when creating the certificate.
Just to note, I created an SDK v4.x certificate a few years ago.
Has anyone bumped into the same issue?
Or am I missing something?
Hi
We’re updating our KSM to support SPC v2/v3 and currently operate with both legacy SDK4 credentials (ASK + 1024 cert) and SDK26 credentials (certificate bundle + provisioning data + 1024/2048 keys).
Our client apps run across a wide range of iOS/tvOS versions, so we want to follow Apple’s recommended client strategy for certificate selection. The docs describe SHA‑1 vs SHA‑256 in the SPC header, but do not specify which OS versions should use SDK4 vs SDK26 credentials.
Could you clarify:
Is there an official minimum iOS/tvOS version where you recommend SDK26 credentials for client apps?
For older OS versions (e.g. iOS 15), is SDK4 still the recommended choice for client apps?
Are there any official migration guidelines for client apps moving from SDK4 to SDK26 credentials?
Thanks in advance.
Hello,
I have implemented Low-Latency Frame Interpolation using the VTFrameProcessor framework, based on the sample code from https://developer.apple.com/kr/videos/play/wwdc2025/300. It is currently working well for both LIVE and VOD streams.
However, I have a few questions regarding the lifecycle management and synchronization of this feature:
1. Common Questions (Applicable to both Frame Interpolation & Super Resolution)
1.1 Dynamic Toggling
Do you recommend enabling/disabling these features dynamically during playback?
Or is it better practice to configure them only during the initial setup/preparation phase?
If dynamic toggling is supported, are there any recommended patterns for managing VTFrameProcessor session lifecycle (e.g., startSession / endSession timing)?
1.2 Synchronization Method
I am currently using CADisplayLink to fetch frames from AVPlayerItemVideoOutput and perform processing (a sketch of this loop follows this subsection).
Is CADisplayLink the recommended approach for real-time frame acquisition with VTFrameProcessor?
If the feature needs to be toggled on/off during active playback, are there any concerns or alternative approaches you would recommend?
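For reference, the acquisition loop mentioned in 1.2 looks roughly like the sketch below. The pixel format and the hand-off to VTFrameProcessor are placeholders, not recommendations.

import AVFoundation
import UIKit

// Minimal sketch of a CADisplayLink-driven tap on AVPlayerItemVideoOutput.
final class FrameTap {
    private let videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
    ])
    private var displayLink: CADisplayLink?

    func attach(to item: AVPlayerItem) {
        item.add(videoOutput)
        let link = CADisplayLink(target: self, selector: #selector(tick(_:)))
        link.add(to: .main, forMode: .common)
        displayLink = link
    }

    func detach(from item: AVPlayerItem) {
        displayLink?.invalidate()
        displayLink = nil
        item.remove(videoOutput)
    }

    @objc private func tick(_ link: CADisplayLink) {
        // Ask the output whether a new pixel buffer is available for the
        // current host time, then copy it out for processing.
        let itemTime = videoOutput.itemTime(forHostTime: CACurrentMediaTime())
        guard videoOutput.hasNewPixelBuffer(forItemTime: itemTime),
              let pixelBuffer = videoOutput.copyPixelBuffer(forItemTime: itemTime,
                                                            itemTimeForDisplay: nil) else { return }
        // Hand `pixelBuffer` to the VTFrameProcessor session here.
        _ = pixelBuffer
    }
}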
1.3 Supported Resolution/Quality Range
What are the minimum and maximum video resolutions supported for each feature?
Are there any aspect ratio restrictions (e.g., does it support 1:1 square videos)?
Is there a recommended resolution range for optimal performance and quality?
2. Frame Interpolation Specific Questions
2.1 LIVE Stream Support
Is Low-Latency Frame Interpolation suitable for LIVE streaming scenarios where latency is critical?
Are there any special considerations for LIVE vs VOD?
3. Super Resolution Specific Questions
3.1 Adaptive Bitrate (ABR) Stream Support
In ABR (HLS/DASH) streams, the video resolution can change dynamically during playback.
Is VTLowLatencySuperResolutionScaler compatible with ABR streams where resolution changes mid-playback?
If resolution changes occur, should I recreate the VTLowLatencySuperResolutionScalerConfiguration and restart the session, or does the API handle this automatically?
3.2 Small/Square Resolution Issue
I observed that 144x144 (1:1 square) videos fail with error:
"VTFrameProcessorErrorDomain Code=-19730: processWithSourceFrame within VCPFrameSuperResolutionProcessor failed"
However, 480x270 (16:9) videos work correctly.
minimumDimensions reports 96x96, but 144x144 still fails. Is there an undocumented restriction on aspect ratio or a practical minimum resolution?
3.3 Scale Factor Selection
supportedScaleFactors returns [2.0, 4.0] for most resolutions.
Is there a recommended scale factor for balancing quality and performance?
Are there scenarios where 4.0x should be avoided?
The documentation on this specific topic seems limited, so I would appreciate any insights or advice.
Thank you.
According to the documentation (https://developer.apple.com/documentation/avfoundation/avcontentkeyrequest/originatingrecipient?changes=_3&language=objc), starting with iOS 18.4 I can get the AVContentKeyRecipient from an AVContentKeyRequest. But when I try to get it, I get a crash. What could be the issue?
I want to note that I add the asset to the AVContentKeySession using the addContentKeyRecipient method (https://developer.apple.com/documentation/avfoundation/avcontentkeysession/addcontentkeyrecipient(_:)?changes=_3&language=objc).
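For what it's worth, the sketch below shows the defensive pattern I would expect around that property. It assumes the crash comes from reading the property on a runtime older than iOS 18.4, which is only a guess on my part.

import AVFoundation

func inspectOrigin(of keyRequest: AVContentKeyRequest) {
    if #available(iOS 18.4, *) {
        // Gate the access behind an availability check; on earlier runtimes the
        // accessor does not exist, which is one plausible source of a crash.
        let recipient = keyRequest.originatingRecipient
        print("Originating recipient: \(String(describing: recipient))")
    } else {
        // On earlier OS versions, fall back to whatever bookkeeping you keep for
        // the recipients passed to addContentKeyRecipient(_:).
    }
}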
Hello,
I am developing a custom player SDK based on AVPlayer.
While testing LL-HLS streams, I intermittently encounter the following error: Error Domain=CoreMediaErrorDomain Code=-12880
Since I cannot find documentation for this specific code, could you please clarify its meaning?
Specifically, I would like to know if this is a critical error that disrupts playback, or if it is just a warning that can be safely ignored.
Any insights would be appreciated.
Thank you.
Hello,
I’m using a valid certificate bundle generated with SDK 26 (combined RSA‑1024 + RSA‑2048).
However, all my devices currently still generate SPC v2 during playback, including my iPhone 16 under iOS 26.2.
Apple staff mentioned that future iOS versions will send SPC v3 when using an SDK 26 certificate bundle.
Could you please clarify:
Which iOS/macOS versions will first support SPC v3?
Are there any additional client‑side requirements (Safari version, playback APIs, headers, etc.) to trigger SPC v3?
Is there any way to test SPC v3 today, e.g., using beta builds?
Thank you!