Our license service is based on version 4.5.4, and we use the sample .c/.h files to build it.
We have been told that version 4.5.4 will be deprecated in 2026 and that we should migrate to the latest SDK, version 26.
When we explored the SDK, we noticed that only Python- and Swift-based implementations are provided.
Does Apple also provide a C/C++-based SDK? That would be easier for us to integrate.
If so, please share the SDK package and a sample license service solution.
Streaming
Deep dive into the technical specifications that influence seamless playback for streaming services, including bitrates, codecs, and caching mechanisms.
We are experiencing an issue related to DepthData from the TrueDepth camera on a specific device.
On December 1, we tested with the affected user's device (iPhone 14, iOS 26.0.1) and observed that the depth image is received with empty values.
However, the same implementation works normally on iPhone 17 Pro Max (iOS 26.1) and iPhone 13 Pro Max (iOS 26.0.1), where depth data is delivered correctly.
In the problematic case:
TrueDepth camera is active
Face ID works normally
The app receives a DepthData object, but all values are empty (0), not nil
Because the DepthData object is not nil, it is difficult to detect the issue and fall back in software; a sketch of the check we are considering is below.
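As a stopgap, we are considering scanning the depth map ourselves to detect the all-zero case. This is only a rough sketch of our own check (it assumes converting the map to 32-bit float; it is not an official API for detecting the problem):
import AVFoundation
import CoreVideo

// Returns true when every sample in the AVDepthData map is zero.
// Hypothetical fallback check written by us, not an Apple-provided API.
func depthMapIsEmpty(_ depthData: AVDepthData) -> Bool {
    // Normalize to 32-bit float so the samples can be read uniformly.
    let converted = depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
    let map = converted.depthDataMap

    CVPixelBufferLockBaseAddress(map, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(map, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(map) else { return true }
    let width = CVPixelBufferGetWidth(map)
    let height = CVPixelBufferGetHeight(map)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(map)

    for row in 0..<height {
        let rowPtr = base.advanced(by: row * bytesPerRow)
                         .assumingMemoryBound(to: Float32.self)
        for col in 0..<width where rowPtr[col] != 0 {
            return false   // at least one non-zero depth sample found
        }
    }
    return true
}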
We developed the feature with reference to the following Apple sample:
https://developer.apple.com/documentation/AVFoundation/streaming-depth-data-from-the-truedepth-camera
We would like to ask:
Are there known cases where Face ID functions normally but DepthData from the TrueDepth camera is returned as empty values?
If so, is there a recommended approach for identifying or handling this situation?
Any guidance from Apple engineers or the community would be greatly appreciated.
Thank you.
On iOS 17 we've had no problem playing Apple FairPlay-encrypted content with keys delivered from our key server, first running FairPlay Streaming Server SDK 5.1 and subsequently FairPlay Streaming Server SDK 26. It is built and deployed using Xcode 26.1.1 (17B100) with no changes to the code, and, as expected, the content continued to be decrypted and played successfully (so far so good). However, as soon as a device was updated to iOS 26, that device would no longer play the encrypted content.
Devices remaining on iOS 17 continue to work normally, and the debugging logs confirm that. Is anyone else experiencing this issue?
Here's the code (you should be able to drop it into a fresh iOS Xcode project and provide a server URL, content URL, and certificate).
Hi
Is it possible to have a playlist where the stream starts in the clear, then a DRM-encrypted period begins, and then encryption is turned off again?
Can I just do the following (I've removed the video segment lines; I'm only interested in the parts where I want to signal the new DRM region)?
#EXT-X-MAP:URI="video_2_10000000_t17586401730000000_init.mp4"
#EXT-X-KEY:METHOD=NONE
...
#EXT-X-MAP:URI="video_2_10000000_t17587374640000000_init.mp4"
#EXT-X-KEY:METHOD=SAMPLE-AES,URI="skd://5df0b36ac4bb4d0ff954a73b502ac332",KEYFORMAT="com.apple.streamingkeydelivery",KEYFORMATVERSIONS="1"
...
#EXT-X-MAP:URI="video_2_10000000_t17587376740000000_init.mp4?"
#EXT-X-KEY:METHOD=NONE
Should I insert discontinuity tags or something else?
Right now, what I observe is some audio drops when I try this. A sketch of the variant with discontinuity tags that I'm considering is below.
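One variant I'm considering, purely as a sketch, adds a discontinuity tag at each key change (same placeholder init-segment URIs as above, segment lines still omitted):
#EXT-X-MAP:URI="video_2_10000000_t17586401730000000_init.mp4"
#EXT-X-KEY:METHOD=NONE
...
#EXT-X-DISCONTINUITY
#EXT-X-MAP:URI="video_2_10000000_t17587374640000000_init.mp4"
#EXT-X-KEY:METHOD=SAMPLE-AES,URI="skd://5df0b36ac4bb4d0ff954a73b502ac332",KEYFORMAT="com.apple.streamingkeydelivery",KEYFORMATVERSIONS="1"
...
#EXT-X-DISCONTINUITY
#EXT-X-MAP:URI="video_2_10000000_t17587376740000000_init.mp4?"
#EXT-X-KEY:METHOD=NONE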
Quotes are displayed incorrectly in subtitles rendered by AVPlayerViewController when streaming VOD content using HLS.
A single quote ' (escaped as &apos; following the WebVTT specification) is displayed as "apos;".
A double quote " (escaped as &quot;) is displayed as "quot;".
The same stream works fine in the VLC player, which shows the quotes correctly in its subtitles.
The subtitle VTT files use:
Content-Type: text/vtt
WEBVTT
X-TIMESTAMP-MAP=LOCAL:490014:06:04.000,MPEGTS:158764568760056
example line:
490014:05:46.000 --> 490014:05:50.440 align:start line:83% position:14%
lære dig endnu bedre at kende."
and the playlist has:
#EXT-X-MEDIA:TYPE=SUBTITLES,GROUP-ID="subs",LANGUAGE="da",NAME="Dansk",AUTOSELECT=YES,CHARACTERISTICS="public.accessibility.transcribes-spoken-dialog,public.accessibility.describes-music-and-sound",URI="subs/dan_5/playlist.m3u8"
#EXT-X-STREAM-INF:BANDWIDTH=780000,CODECS="mp4a.40.5,avc1.42c01e",RESOLUTION=256x144,AUDIO="audio-aac",SUBTITLES="subs"
Adding 'wvtt' to the CODECS list in the playlist does not make a difference.
Is this a known bug? Is there a workaround?
I guess AVAssetResourceLoaderDelegate could be used to intercept and rewrite the subtitle files, but that seems like quite a hack and not really what it is intended for; a rough sketch of what I mean is below.
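For reference, this is roughly the interception I had in mind. It is only a sketch: it assumes the subtitle playlist/segment URLs are rewritten to a made-up fixvtt:// scheme so that AVPlayer actually calls the delegate, and a complete implementation would also fill in the contentInformationRequest.
import AVFoundation

// Hypothetical workaround sketch: fetch the real VTT over HTTPS, rewrite the
// character references AVPlayerViewController renders literally, and hand the
// fixed data back to the loading request.
final class VTTFixupLoader: NSObject, AVAssetResourceLoaderDelegate {
    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        guard let customURL = loadingRequest.request.url,
              var components = URLComponents(url: customURL, resolvingAgainstBaseURL: false) else {
            return false
        }
        components.scheme = "https"   // map the made-up fixvtt:// scheme back to the real one
        guard let realURL = components.url else { return false }

        URLSession.shared.dataTask(with: realURL) { data, _, error in
            guard var text = data.flatMap({ String(data: $0, encoding: .utf8) }) else {
                loadingRequest.finishLoading(with: error)
                return
            }
            // Undo the escaping that is currently shown literally in the subtitles.
            text = text.replacingOccurrences(of: "&apos;", with: "'")
                       .replacingOccurrences(of: "&quot;", with: "\"")
            loadingRequest.dataRequest?.respond(with: Data(text.utf8))
            loadingRequest.finishLoading()
        }.resume()
        return true
    }
}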
Hello,
I'm investigating an issue with LL-HLS playback using AVPlayer, specifically during DVR Live seeking (seeking to a past time).
I noticed that in certain seeking scenarios, AVPlayer sends a Blocking Playlist Reload request that includes the _HLS_msn parameter but is missing the _HLS_part parameter.
While I understand this is compliant with the HLS spec, I would like to know the specific criteria AVPlayer uses to decide when to drop the _HLS_part parameter. Does AVPlayer intentionally omit the part info when it determines that loading a specific partial segment is unnecessary during a seek operation?
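For example (with made-up sequence numbers and path), a typical blocking reload we see looks like the first request below, while in the seek case we only see the second:
GET /lowLatency/variant.m3u8?_HLS_msn=273&_HLS_part=2
GET /lowLatency/variant.m3u8?_HLS_msn=273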
Clarification on this behavior would help us greatly in debugging our stream delivery.
Thanks in advance.
Hi,
After updating to iOS 26, our app is facing playback failures with AVPlayer. The same code and streams work fine on iOS 18 and earlier.
Error - Domain[CoreMediaErrorDomain]:Code[-15628]:Desc[The operation couldn’t be completed.]:Underlying Error Domain[(null)]:Code[0]:Desc[(null)]
Environment:
iOS version: iOS 26
React Native: 0.69
Video library: react-native-video (AVPlayer under the hood)
Stream type: HLS (m3u8) with segment (.ts) files
Observed behaviour:
Playback starts normally on iOS 26.
The stream then fails at runtime after a few seconds or minutes (not on first load).
Network logs show 307 redirects on some segment requests. After this, AVPlayer throws the above error.
Playback fails intermittently on slow/unstable networks.
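In the meantime, we are trying to pull more detail out of AVPlayer's error log around those failures. This is only a sketch and assumes we can reach the underlying AVPlayerItem that react-native-video drives (how it is exposed may vary by library version):
import AVFoundation

// Sketch: log every new error-log entry for the item so the CoreMedia -15628
// failures can be correlated with the segment URI (e.g. the 307-redirected ones).
final class PlayerErrorLogDumper {
    private var token: NSObjectProtocol?

    func attach(to playerItem: AVPlayerItem) {
        token = NotificationCenter.default.addObserver(
            forName: .AVPlayerItemNewErrorLogEntry,
            object: playerItem,
            queue: .main
        ) { _ in
            guard let events = playerItem.errorLog()?.events else { return }
            for event in events {
                print("[\(event.errorDomain) \(event.errorStatusCode)]",
                      event.errorComment ?? "no comment",
                      "uri:", event.uri ?? "unknown")
            }
        }
    }

    deinit {
        if let token { NotificationCenter.default.removeObserver(token) }
    }
}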
I am working on a screen-recording feature on Apple Vision Pro. When I use a broadcast upload extension and tap the record button, the Xcode console logs the following error:
<<<< FigAudioSession(AV) >>>> audioSessionAVAudioSession_CopyMXSessionProperty signalled err=-19224 (kFigAudioSessionError_UnsupportedOperation) (getMXSessionProperty unsupported) at FigAudioSession_AVAudioSession.m:606
We create and configure the project as follows:
Create an Apple Vision Pro project.
Create a Broadcast Upload Extension target.
Add an App Group for the project target and the extension target, both using the same identifier.
Add the "Main Camera Access" and "Passthrough in Screen Capture" capabilities for all targets.
Add NSScreenCaptureUsageDescription and NSMicrophoneUsageDescription to the Info.plist.
Add a record button to the view.
Run a debug build on the Apple Vision Pro device; after tapping the record button, the error above is logged.
Hi everyone,
We’re currently developing a music-based app using MusicKit, and we recently noticed that iOS 26 beta introduces a new “Automix” feature in the Apple Music app. This enables seamless DJ-style transitions between songs—beyond the standard crossfade functionality.
We’re trying to understand:
Will this Automix feature be accessible to third-party apps that use MusicKit?
If not available in the initial iOS 26 release, is there a plan to expose it through public APIs in a future update?
Is there any technical documentation, WWDC session, or roadmap info regarding Automix support via MusicKit?
This functionality would be a significant enhancement for our app, especially for intelligent audio transitions and curated playlists.
Thanks.
Macs do not support Multi-Stream Transport (MST), which prevents using a single DisplayPort or USB-C port to daisy-chain multiple external monitors in extended display mode. As a result, virtual multi-display modes do not work correctly on the Mac.
Hi everyone! I’ve been working with AVFoundation and trying to use the AVMetricEventStreamPublisher to discover media performance metrics, as described in the Apple documentation.
https://developer.apple.com/cn/videos/play/wwdc2024/10113/?time=508
However, when following the example code, I’m not getting the expected results. The performance metrics for both audio and video don’t seem to be captured properly.
Has anyone successfully used this example code? If so, could you share your experience or any solutions you’ve found? Any tips or insights would be greatly appreciated. Thanks in advance!
P.S. Here is the example code:
AVPlayerItem *item = ...
// Create the metric event stream and attach a subscriber on a serial queue.
AVMetricEventStream *eventStream = [AVMetricEventStream eventStream];
id subscriber = [[MyMetricSubscriber alloc] init];
[eventStream setSubscriber:subscriber queue:mySerialQueue];
// Subscribe to the specific metric event classes of interest.
[eventStream subscribeToMetricEvent:[AVMetricPlayerItemLikelyToKeepUpEvent class]];
[eventStream subscribeToMetricEvent:[AVMetricPlayerItemPlaybackSummaryEvent class]];
// Register the player item as a publisher of metric events.
[eventStream addPublisher:item];
Just updated my computer, phone, and dev tools to the latest versions of everything. Now when I run my app in a previously-working simulator (iPhone 16 w. iOS 18.5) I get:
Failed retrieving MusicKit tokens: fetching the developer token is not supported in the simulator when running on this version of macOS; please upgrade your Mac to macOS Ventura.
Also:
<ICCloudServiceStatusMonitor: 0x600003320e60>: Invoking 1 completion handler for MusicKit tokens. error=<ICError.DeveloperTokenFetchingFailed (-8200) "Failed to fetch media token from <AMSMediaTokenService: 0x6000029049a0>." { underlyingErrors: [ <AMSErrorDomain.300 "Token request encoding failed The token request encoder finished with an error." { userInfo: { AMSDescription : "Token request encoding failed", AMSFailureReason : "The token request encoder finished with an error." }; underlyingErrors: [ <AMSErrorDomain.5 "Anisette Failed Platform not supported" { userInfo: { AMSDescription : "Anisette Failed", AMSFailureReason : "Platform not supported" };
Anybody know what gives here? The Ventura message is absurd because I'm on Tahoe 26.1. The same code works on a physical phone running iOS 26.
Hello Apple team and developer community,
I am preparing a visionOS app for a fair environment, where we want to automatically stream the current experience to a nearby monitor via AirPlay, without requiring guests or staff to manually interact with the Control Center or AirPlay pickers all the time.
The goal is to provide a smooth, frictionless setup so attendees can focus on the demo, not the configuration.
Feature Request:
A supported API or method to programmatically start/stop AirPlay video streaming (mirroring or external playback) from within a visionOS app, allowing the current experience to be instantly displayed on an external monitor or Apple TV for the audience.
Context & Rationale:
In a trade fair or exhibition setting, rapid guest turnaround and minimal staff intervention are crucial. Having to manually guide each visitor through AirPlay setup is impractical.
As I understand it, AVRoutePickerView can be used for this on iOS and macOS, but it is not available on visionOS. Enabling similar automated streaming on visionOS would make the device far more suitable for live demos and public showcases.
Questions:
Are there any supported workarounds or best practices for enabling automated screen streaming or AirPlay initiation on visionOS in public demo environments that I missed?
Is Apple considering adding programmatic AirPlay control or accessibility features to support such use cases in future visionOS releases?
Thank you for considering this request! If there are recommended patterns, entitlements, or accessibility solutions we could explore for trade fair scenarios, your guidance would be greatly appreciated.
Best regards,
Julian Zürn - IPI, HS Kempten
We have a Low-Latency HLS stream, and on iOS 26, even though the bandwidth is sufficient, it still selects a low-bandwidth resolution (e.g., RESOLUTION=640x360) for playback instead of using a higher-bandwidth resolution (e.g., RESOLUTION=1920x1080) when using AVPlayerViewController with AVPlayer.
This works fine on iOS 18 and earlier versions. What could be the solution to this issue?
Hello. We have a video streaming app with HLS VOD content; we supply 1080p and 4K content to users. Users were able to watch 1080p content before tvOS 26, but they can no longer watch it after updating to tvOS 26. We have not changed anything on the HLS playlist side or in the application version. The problem only occurs on the Apple TV HD (4th generation, A1625) running tvOS 26; there is no problem on newer Apple TV devices. Could you help us resolve this? Thanks in advance.
Hi, I submitted the FairPlay Streaming Credentials Approval request, but it's been 15 days and I haven't received a response yet. Do you happen to know how long they usually take to reply to these requests?
Hi,
While I don't normally use FairPlay, I received this email, which is so strangely worded that I am wondering whether it makes sense to people who do know FairPlay, or whether it contains a typo.
It seems to say that you can only generate certificates for SDK 4, yet also that SDK 4 is no longer supported?
(Also, "will not expire" is imprecise phrasing; the certificates presumably will expire eventually. I think it meant to say that they remain valid / are not invalidated.)
Since iOS 18 and tvOS 18, CMCD can be sent automatically by AVPlayer (https://developer.apple.com/streaming/Whats-new-HLS.pdf).
However, after enabling CMCD, our streams occasionally fail with the following error: CoreMediaErrorDomain Error -17383
This issue appears to affect only DRM-protected (FairPlay) streams so far.
We activate CMCD via the resource loader of an AVURLAsset, before assigning the item to an AVPlayer.
Unfortunately, we haven’t found a reliable way to reproduce the issue, and we’ve been unable to gather any useful diagnostic information.
Has anyone else observed this behavior when enabling CMCD on FairPlay streams?
Hello All,
I am looking for assistance with our FairPlay Streaming (FPS) certificates. We are in the process of migrating to a new video streaming vendor and need to create a new FPS certificate using SDK 4. However, we have reached the limit of allowed FPS certificates in our account and cannot create a new one.
Issue Details:
• We currently have two FPS certificates active in our developer account.
• One of these was created using SDK 5, but our new vendor (Mux) requires an FPS certificate based on SDK 4.
• Since Apple does not allow deleting FPS certificates from the developer portal, we are unable to create a new SDK 4 certificate.
• We kindly request Apple to revoke one of our existing FPS certificates to allow us to generate a new SDK 4 certificate.
Request:
We would greatly appreciate your assistance in revoking or deleting one of our existing FPS certificates so that we can proceed with creating a new SDK 4 certificate for our vendor integration.
Thank you for your support.
The operation couldn’t be completed. (CoreMediaErrorDomain error -19156.)