Hello,
Our users have started to see a new fatal AVPlayer error during playback starting with iOS/tvOS 18.0. The error is reported as "CoreMediaErrorDomain Code=-15486".
We have not been able to reproduce this issue locally within our development team.
Is there any documentation on the cause of this error or steps to recover from this error?
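For reference, this is roughly how we surface the failure today (a simplified sketch; the Combine-based observation and the logging are illustrative, not our production code):
import AVFoundation
import Combine

// Watch for the item entering the .failed state, then log the underlying error
// (e.g. CoreMediaErrorDomain -15486) together with the item's error log events.
var cancellables = Set<AnyCancellable>()

func observeFailures(of item: AVPlayerItem) {
    item.publisher(for: \.status)
        .filter { $0 == .failed }
        .sink { _ in
            print("playback failed: \(String(describing: item.error))")
            if let log = item.errorLog() {
                for event in log.events {
                    print("error log: \(event.errorDomain) \(event.errorStatusCode) \(event.errorComment ?? "")")
                }
            }
        }
        .store(in: &cancellables)
}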
Thank you,
Howard
Hello, Apple video engineers.
According to the official documentation, HLS is built on HTTP and traditionally ran on top of TCP. However, with the introduction of HTTP/3, which uses QUIC (runs on top of UDP), I would like to clarify the following:
Has the official HLS specification changed in a way that allows it to be considered UDP-based when using HTTP/3? And is it fair to say that HLS supports UDP since the transport can go over HTTP/3 and QUIC?
Would it be more accurate to say that HLS remains HTTP-dependent, and the transport protocol (TCP or QUIC) only determines how HTTP requests are delivered?
My thoughts: since HTTP/3 uses QUIC running over UDP, we still can't say that HLS supports UDP in the classical sense in which it is used in RTP, RTSP, or SRT.
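To make my point concrete, here is a minimal sketch (placeholder URL) of fetching a playlist: it is an ordinary HTTP request, and the HTTP/3/QUIC part is only a transport hint at the HTTP layer, not something the HLS specification itself defines:
import Foundation

// Fetching an HLS playlist is just an HTTP request; whether it travels over TCP
// (HTTP/1.1, HTTP/2) or QUIC (HTTP/3) is decided by the HTTP stack, not by HLS.
func fetchPlaylist() async throws {
    var request = URLRequest(url: URL(string: "https://example.com/master.m3u8")!) // placeholder URL
    request.assumesHTTP3Capable = true // hint that the origin supports HTTP/3 without waiting for Alt-Svc discovery
    let (data, response) = try await URLSession.shared.data(for: request)
    print("playlist bytes: \(data.count), status: \((response as? HTTPURLResponse)?.statusCode ?? -1)")
}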
Our team conducted security testing and found one vulnerability in FairPlay license acquisition.
Our QA engineer manually changed the device's system date and time (setting it 4 days into the future) and was able to successfully obtain a license response and initiate playback on an iOS device. However, on an Android device, the license acquisition failed.
Can you please tell us if Time Manipulation Detection is available in the FairPlay Streaming SDK?
I would like to look at AVMetricEvent data during video playback, so I have added this code to a test video player app:
let playerItem: AVPlayerItem = ...
let allMetrics = playerItem.allMetrics()
Task.init {
    print("metrics task started")
    do {
        for try await metricEvent in allMetrics {
            print("metric event: \(metricEvent.description)")
        }
    } catch {
        print("unexpected metric iterator error \(error)")
    }
}
Running this in the Simulator on iPhone 16 Pro (18.0) does not result in any "metric event" diagnostic messages being printed while the video associated with this AVPlayerItem is playing. Only the "metrics task started" diagnostic message is seen.
What am I doing wrong that prevents metric data from being received?
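For completeness, here is the fuller context in which I create the item and start the task (simplified; the URL is a placeholder and the player is retained for the lifetime of the test):
import AVFoundation

// Test setup: create the item, start the metrics task before playback, keep the player alive.
let url = URL(string: "https://example.com/test.m3u8")! // placeholder URL
let playerItem = AVPlayerItem(url: url)
let player = AVPlayer(playerItem: playerItem)

let metricsTask = Task {
    print("metrics task started")
    do {
        for try await metricEvent in playerItem.allMetrics() {
            print("metric event: \(metricEvent.description)")
        }
    } catch {
        print("unexpected metric iterator error \(error)")
    }
}

player.play()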
No external cameras show up in the app on visionOS. We use this sample code as a basis for our tests: https://developer.apple.com/documentation/visionos/displaying-video-from-connected-devices
We also received the needed entitlement from Apple, but none of the cameras we have tried so far shows up on visionOS.
We tried the following devices and hubs:
Insta360 X4
Somikon Endoscope Camera: USB HD Endoscope Camera
EMEET Full HD Webcam - C960
BENFEI Video/Audio Capture Card, 4K HDMI auf USB C/A
Logitech C920 HD PRO Webcam
Anker PowerConf C200
Insta360 GO 3S
Anker 341 USB-C Hub
UGREEN Revodok Pro 10Gbps USB-C Hub
All Vision Pro devices we tried are running visionOS 2.3. When trying the same code on an iPad, we can actually use external cameras.
Steps to reproduce:
Start the app on a Vision Pro device and connect an external camera. The connected camera does not show up in the dropdown.
Development environment:
Xcode 16.2, macOS 15.3
Run-time configuration:
iOS 18.3, visionOS 2.3
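For reference, this is roughly how our test enumerates cameras (a simplified sketch based on the sample project; we assume the .external device type here, as it is used on iPadOS):
import AVFoundation

// List external (USB/UVC) cameras; on iPad this returns the connected devices,
// on Vision Pro the list stays empty for us.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.external],
    mediaType: .video,
    position: .unspecified
)
print("external cameras: \(discovery.devices.map(\.localizedName))")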
I have an SCStreamDelegate for capturing frames from applications. On recent point releases of macOS Sonoma, I've noticed that the stream is being cancelled with no user action being taken. I started trying to debug it, and when my error handler is called, the error parameter being passed is null:
func stream(_ stream: SCStream, didStopWithError error: Error) {
    /* debugger shows this and segfaults if I try to print "\(error)"
       error (Error)
       > error = (Builtin.RawPointer) 0x0
    */
}
From what I can tell, error should be a valid NSError so I can check the error code, based on similar code I've seen, for example, in OBS (https://github.com/obsproject/obs-studio/blob/265239d4174f8d291b0de437088c5b78f8e27687/plugins/mac-capture/mac-sck-common.m#L29).
Usually when this happens, the menu bar icon for screen sharing (where I would click to change the shared window, etc.) stays there even after my app has closed and no apps are doing any screen sharing.
Has anyone come across this before? Am I misinterpreting what the debugger is saying about the error parameter?
I'm running macOS 14.7.3, but I just updated from 14.7.2 earlier and had basically the same issue on both macOS versions.
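For what it's worth, this is the kind of check I was hoping to do, following the OBS approach (a sketch; the restart decision is just an example):
func stream(_ stream: SCStream, didStopWithError error: Error) {
    // Bridge to NSError to inspect the domain and code, as the OBS handler does.
    let nsError = error as NSError
    print("stream stopped: domain=\(nsError.domain) code=\(nsError.code)")
    // Based on the code, decide whether to tear down or attempt to restart the stream.
}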
I am developing an app to stream and download DRM-protected HLS videos based on the official "FairPlay Streaming Server SDK".
When I play the downloaded video, it requests .ts or .aac segments from the server, even though I have passed the path of the downloaded file to AVURLAsset.
As a result, playback fails when the device is offline, such as in airplane mode.
This behavior depends on the playback time of the video and occurs when trying to download and play a video with a playback time of 19 hours or more.
It did not occur for videos with a playback time of 18 hours.
The environment we checked is iOS 18.3.
The workaround at this time is to limit the video playback time to 18 hours, but if possible, we would like to allow offline playback of downloaded videos longer than 19 hours.
Does anyone have any information or a solution to this problem, for example if you have experienced this kind of issue yourself, or if you know that content longer than 19 hours simply cannot be played offline?
// load
let path = ".../***.movpkg" // Path of the downloaded file
videoAsset = AVURLAsset(url: URL(fileURLWithPath: path))
playerItem = AVPlayerItem(asset: videoAsset!)
player.replaceCurrentItem(with: playerItem)
// isPlayableOffline
print("videoAsset.assetCache.isPlayableOffline = \(videoAsset!.assetCache?.isPlayableOffline ?? false)") // true
I want my iOS app to be able to use USB client mode to send LiDAR data and camera frames to another device. What are my options for doing this? I've found IOUSBHost for host mode, but I want to see all my options. The device I want driving the bus is a Meta Quest 3, which, for clarity, is essentially an Android device. The iPhone is to be used as a sensor hub, sending data to the Quest 3 for further processing.
As a workaround, I could let the iPhone drive the bus and have the Quest 3 use Android's accessory mode, which lets other devices drive the USB bus. But there are more USB devices I want to attach for my project, and doing so would make that more difficult. I want to avoid it.
Hi folks,
When doing HLS v6 live streaming with fMP4 chunks, we noticed that when the encoder timestamps drift slightly and an #EXT-X-DISCONTINUITY tag is created in either the audio or the video playlist (in an ABR setup), the tag is not handled correctly by the player, leading to broken playback with a black screen or no audio (depending on which playlist the tag appears in).
We noticed that this is often the case when the number of tags differs between the playlists (e.g. the audio playlist containing 1 tag while the video playlist contains 2 tags results in a black screen with audio).
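For illustration, this is the simplified shape of the playlists when it happens (a synthetic example, not our actual playlists):
# audio media playlist excerpt (1 discontinuity)
#EXTINF:6.006,
audio_100.m4s
#EXT-X-DISCONTINUITY
#EXTINF:6.006,
audio_101.m4s

# video media playlist excerpt (2 discontinuities)
#EXTINF:6.006,
video_100.m4s
#EXT-X-DISCONTINUITY
#EXTINF:6.006,
video_101.m4s
#EXT-X-DISCONTINUITY
#EXTINF:6.006,
video_102.m4s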
Using the same "broken" source with the Shaka player instead does not break playback at all.
Is there any possible (or upcoming) fix for AVPlayer?
Hello, we have an HLS streaming app on Apple TV. Our streams are DRM protected. We have a problem with streams when the source device is turned off. For example, a user starts watching our DRM-protected HLS content. After some time, the user turns off the device (it can be a monitor or a TV connected via HDMI). Our app does not detect that the HDMI source device has been turned off. Is there any way in Swift to detect that a connected HDMI device has been turned off?
In our Apple TV application, we use the native AVPlayer for live playback functionality. Until tvOS 17.6 and during the tvOS 18 beta, the Pause/Resume feature worked as expected, allowing us to pause live playback. However, after updating to tvOS 18.1, the pause functionality no longer works.
The same app still works fine on tvOS 17, but on tvOS 18, attempting to pause live playback has no effect. We reviewed the tvOS 18 release notes but couldn't find any relevant changes or deprecations related to AVPlayer or live playback behavior.
Has there been any change in the handling of live playback or the Pause/Resume functionality in tvOS 18.1? Any guidance or suggestions to address this issue would be greatly appreciated.
Thank you!
We are moving to another streaming service and need to deliver an ASK, a .pem key, and a .crt to enable DRM. The issue is that we no longer have that information.
The most logical step would be to revoke the current certificate and create a new one. Unfortunately, for FairPlay Streaming certificates there is no revoke button.
We asked Developer Support, who weren't able to help. We then made a request to revoke as described in article 2.7 of the Apple Developer Program License Agreement. They can only do this when the certificate is compromised.
So now we are stuck. Is there anyone out there who has had the same issue and found a solution?
Your help is much appreciated.
Hello!
Just curious whether AVAssetWriter append() randomly fails for anyone else? It seems to happen when live streaming (encoding with VTCompressionSession) and recording (with AVAssetWriter) at the same time.
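For context, the appends are guarded roughly like this (a simplified sketch; the writer and input setup, and the encode path feeding it, are omitted):
import AVFoundation

final class Recorder {
    let writer: AVAssetWriter
    let writerInput: AVAssetWriterInput

    init(writer: AVAssetWriter, writerInput: AVAssetWriterInput) {
        self.writer = writer
        self.writerInput = writerInput
    }

    // Called for each encoded CMSampleBuffer coming out of the capture/encode path.
    func append(_ sampleBuffer: CMSampleBuffer) {
        guard writer.status == .writing, writerInput.isReadyForMoreMediaData else {
            // If status is .failed, writer.error explains why subsequent appends fail.
            print("skipping append, status: \(writer.status.rawValue), error: \(String(describing: writer.error))")
            return
        }
        if !writerInput.append(sampleBuffer) {
            print("append returned false: \(String(describing: writer.error))")
        }
    }
}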
We are having issues with ScreenCaptureKit. Our use case is the following:
We have multiple applications that each start a stream capture using ScreenCaptureKit. It works fine when just one application is streaming, but when starting multiple streams continuously, all streams stop or crash, without ScreenCaptureKit reporting an error back.
Restarting replayd for the user allows us to start streaming again, provided the streaming applications are restarted too.
We have built a small test program that we have tested on different Macs, running different versions of macOS, with identical results. The test program just calls the 'getShareableContentExcludingDesktopWindows' function multiple times, since this was the simplest way to show the problem.
Our test setup is the following:
Mac Mini M2 macOS 13.6
MacBook Pro M3 macOS 14.4
MacBook Pro M3 macOS 15.1
Code: main.m
#import <Foundation/Foundation.h>
#import <ScreenCaptureKit/ScreenCaptureKit.h>

@interface Runner : NSObject
@property(atomic, assign) BOOL keepRunning;
@property(atomic, retain) SCShareableContent* availableWindows;
-(void) updateAvailableWindows;
-(void) notif:(NSNotification*) aNotif;
@end

int main(int argc, const char * argv[]) {
    @autoreleasepool {
        NSRunLoop* loo = [NSRunLoop mainRunLoop];
        Runner* r = [[Runner alloc] init];
        [r updateAvailableWindows];
        while (r.keepRunning)
            [loo runUntilDate:[NSDate distantFuture]];
        NSLog(@"Program exit");
    }
    return 0;
}

@implementation Runner

-(instancetype) init
{
    self = [super init];
    if(self){
        _keepRunning = YES;
        [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(notif:) name:@"windowWasNotFound" object:nil];
        [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(notif:) name:@"windowWasFound" object:nil];
    }
    return self;
}

-(void) updateAvailableWindows
{
    @autoreleasepool {
        NSLog(@"begin");
        [SCShareableContent getShareableContentExcludingDesktopWindows:NO onScreenWindowsOnly:YES completionHandler:^(SCShareableContent* content, NSError* error){
            NSLog(@"running");
            if(!error){
                self.availableWindows = content;
                for (__unused SCWindow* aWin in self.availableWindows.windows) {
                    [NSThread sleepForTimeInterval:0.01];
                }
                [[NSNotificationCenter defaultCenter] postNotificationName:@"windowWasFound" object:self.availableWindows];
            }
            else{
                [[NSNotificationCenter defaultCenter] postNotificationName:@"windowWasNotFound" object:nil];
            }
        }];
    }
}

-(void) notif:(NSNotification*) aNotif
{
    if(aNotif.object)
        [self performSelectorOnMainThread:@selector(updateAvailableWindows) withObject:nil waitUntilDone:NO];
    else{
        NSLog(@"Not Found");
        self.keepRunning = NO;
    }
}

@end
How to replicate:
Compile and run the program from multiple Terminal windows (Terminal must be granted screen recording permission) and notice that the output stops when replayd stops responding (our assumption). Restarting the application does nothing unless replayd is also restarted.
Running the application in Xcode gives the following error:
[ERROR] -[RPDaemonProxy fetchShareableContentWithOption:windowID:currentProcess:withCompletionHandler:]_block_invoke:902 error: 4097
We have not been able to detect this error in our applications, and since the only workaround is restarting replayd and the applications, being able to catch the error would help us.
It seems that 5G Network Slicing is currently only available for URLRequest / URLSession / low-level networking and not for AVPlayer. Is it possible to have the app use the default slice when it starts and only enable Network Slicing when streaming video via AVPlayer?
In our Apple TV application, we are using the native AVPlayer for live playback functionality. During live restart playback, we intermittently encounter an error when the playback timeline approaches the actual live event end time.
Error:
The operation couldn’t be completed. (CoreMediaErrorDomain error -16839 - Unable to get playlist before long download timer) / Failure reason:
Scenario:
The live event is scheduled from 7:00 AM to 8:00 AM.
Restart playback begins at 7:20 AM, allowing the user to watch the event from the start while the live stream continues in real-time.
As the restart playback timeline approaches the actual event end time (8:00 AM), AVPlayer displays an error, and playback continues in the background.
I am reaching out regarding an issue with my Apple FairPlay Streaming Certificate. To generate the certificate signing request (CSR), I used the following OpenSSL commands:
openssl genrsa -out private_key.pem 1024
openssl req -new -key private_key.pem -out request.csr
However, according to the guide provided by Apple and instructions from my DRM provider, I should have used:
openssl genrsa -aes256 -out privatekey.pem 1024
openssl req -new -sha1 -key privatekey.pem -out certreq.csr -subj "/CN=SubjectName /OU=OrganizationalUnit /O=Organization /C=US"
I suspect this discrepancy might be causing the issue with my FairPlay certificate. After obtaining the fairplay.cer file and importing it into Keychain Access, I noticed the following:
When I expand the certificate in Keychain Access, I can only see a public key and no private key.
As a result, I am unable to export the certificate as a .p12 file, as this option is disabled.
As per my DRM provider's instructions, I need to export the certificate along with the corresponding private key as a .p12 file with a password. Since the private key is not visible in Keychain Access, I am unable to proceed further.
I have read the FairPlay Streaming Overview but could not find any reasons as to why this issue is occurring or guidance on the procedure to revoke a certificate.
Additionally, I came across the terms and conditions which mentioned reaching out to product-security at Apple for assistance in revoking corrupt certificates. However, despite reaching out, I have not received a response.
Any help on how to proceed will be great!
ApplicationMusicPlayer with a queue created from a playlist crashes with random occurrence shortly after skipping back or forth using the controls embedded in the notification, with the following error in the console log: applicationController: xpc service connection interrupted.
I've noticed that the issue occurs more frequently the shorter the time between skipping entries is. Since ApplicationMusicPlayer runs in a remote process, the main app does not crash, but the music stops playing without any exception, and the playback controls return to an uninitialized state.
Here is how I'm initializing the queue:
let entries = try await playlist
    .with(.entries).entries!
    .map { ApplicationMusicPlayer.Queue.Entry($0) }
ApplicationMusicPlayer.shared.queue = .init(
    entries, startingAt: entries.last
)
Please give me some tips on how to solve this.
EDIT:
The issue does not occur when navigating quickly through the station.
Hi,
I have been working on a project that enables users to listen to their favorite music through a streaming service, which so far has been Spotify. The app has a programmable 3D/2D interface with the ability to connect to devices in your home and have them react to the music. As of September 2024, Spotify decommissioned their Audio Analysis API. I have seen other posts mention playing Apple Music through AVFoundation, which would break DRM and so is not supported. However, the Spotify Audio Analysis API did not allow full frequency reconstruction either: it provided purely temporal data on beats, kicks, loudness, and timbre changes, which are themselves derived from the spectral data of the FFT. It would be very useful for the developer community to get an equivalent capability for Apple Music, and it would probably make Apple Music a lot more popular among developers and the people who use their apps.
Would love to hear your thoughts about this and Happy New Year!
Can someone please tell me how I can update the cookies of a previously set m3u8 video in AVPlayer without creating a new AVURLAsset and replacing the AVPlayer's current item with it?
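For reference, this is roughly how the cookies are set today when the asset is first created (a sketch; the URL and cookies are placeholders), and recreating the asset like this on every cookie refresh is exactly what I'd like to avoid:
import AVFoundation

// Cookies are injected at asset-creation time via AVURLAssetHTTPCookiesKey,
// which is why refreshing them appears to require a brand-new AVURLAsset.
let player = AVPlayer()
let cookies = HTTPCookieStorage.shared.cookies ?? []
let asset = AVURLAsset(
    url: URL(string: "https://example.com/stream.m3u8")!, // placeholder URL
    options: [AVURLAssetHTTPCookiesKey: cookies]
)
player.replaceCurrentItem(with: AVPlayerItem(asset: asset))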