Streaming

Deep dive into the technical specifications that influence seamless playback for streaming services, including bitrates, codecs, and caching mechanisms.

Posts under Streaming subtopic

Each post below is listed with its reply count, boosts, views, and the month of its most recent activity.

CoreMediaErrorDomain error -42709
We are getting reports from customers that they are not able to play videos in our app after updating their phones to iOS 18.3.1. (Further checking indicates that it happens on all iOS 18 versions; it suddenly started occurring on February 18th, 2025.) When checking logs we see that playback is failing with CoreMediaErrorDomain error -42709. This is an undocumented error code, so we do not know the cause of the playback issue. Does anyone know what this error code means and how the app should handle it? Reported as FB16638501.
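The code itself is undocumented, but a hedged diagnostic sketch like the one below can at least surface the HLS error-log entries that usually accompany such a failure. The function name and the returned observer token are placeholders; only the AVFoundation calls themselves are real API.

import AVFoundation

// Diagnostic sketch: observe the playback-failure notification and dump the
// HLS error log, which often carries the underlying CoreMedia status code.
func observePlaybackFailures(for item: AVPlayerItem) -> NSObjectProtocol {
    NotificationCenter.default.addObserver(
        forName: .AVPlayerItemFailedToPlayToEndTime,
        object: item,
        queue: .main
    ) { note in
        let error = note.userInfo?[AVPlayerItemFailedToPlayToEndTimeErrorKey] as? NSError
        print("failed to play to end: \(String(describing: error))")

        for event in item.errorLog()?.events ?? [] {
            print("errorLog: \(event.errorDomain) \(event.errorStatusCode) \(event.errorComment ?? "")")
        }
    }
}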
Replies: 10 · Boosts: 3 · Views: 1.9k · Last activity: Mar ’25
AVPlayer live playback pause is not working on tvOS 18
In our Apple TV application, we use the native AVPlayer for live playback functionality. Until tvOS 17.6 and during the tvOS 18 beta, the Pause/Resume feature worked as expected, allowing us to pause live playback. However, after updating to tvOS 18.1, the pause functionality no longer works. The same app still works fine on tvOS 17, but on tvOS 18, attempting to pause live playback has no effect. We reviewed the tvOS 18 release notes but couldn't find any relevant changes or deprecations related to AVPlayer or live playback behavior. Has there been any change in the handling of live playback or the Pause/Resume functionality in tvOS 18.1? Any guidance or suggestions to address this issue would be greatly appreciated. Thank you!
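As a starting point for debugging, a small sketch like this (the function and variable names are placeholders) logs what AVPlayer reports immediately after pause() is called, which helps distinguish a rejected pause from one that is silently resumed.

import AVKit
import AVFoundation

// Debugging sketch: call pause(), then inspect the player state a moment later.
func debugLivePause(player: AVPlayer, playerViewController: AVPlayerViewController) {
    player.pause()

    DispatchQueue.main.asyncAfter(deadline: .now() + 1) {
        // .paused is expected; .playing or .waitingToPlayAtSpecifiedRate means
        // something resumed playback behind your back.
        print("timeControlStatus: \(player.timeControlStatus.rawValue)")
        print("reasonForWaitingToPlay: \(String(describing: player.reasonForWaitingToPlay))")

        // A stream with no seekable range, or a linear-only presentation,
        // cannot be paused meaningfully.
        print("seekableTimeRanges: \(player.currentItem?.seekableTimeRanges ?? [])")
        print("requiresLinearPlayback: \(playerViewController.requiresLinearPlayback)")
    }
}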
Replies: 6 · Boosts: 9 · Views: 722 · Last activity: Feb ’25
Receiving timed metadata from AVPlayerItemMetadataOutput stops on iOS 18 only
Case-ID: 9391388. Our application uses timed metadata as part of a rating control system. We noticed a problem in production, and diagnosis shows that we stop receiving timed metadata on iOS 18 only. Our live streams are primed with metadata at least once per second, but we are seeing extended gaps in receiving this content, in excess of 10 minutes. We have also observed that this happens more as the player climbs the bitrate ladder, and it doesn't happen if we cap to a low resolution, e.g. a preferredMaximumResolution of 768x432. Furthermore, if we throttle network conditions after we stop receiving metadata, then we start receiving it again. Following is a simple example that demonstrates the behaviour above. Unfortunately I cannot share the live stream endpoint which is primed with metadata publicly, but I can provide it privately to Apple to reproduce the problem.

import UIKit
import AVKit

class ViewController: UIViewController, AVPlayerItemMetadataOutputPushDelegate {
    var player: AVPlayer?
    var itemMetadataOutput: AVPlayerItemMetadataOutput?

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        guard let url = URL(string: "endpoint redacted") else { return }
        let player = AVPlayer(url: url)
        let controller = AVPlayerViewController()
        controller.player = player
        self.player = player
        present(controller, animated: true) {
            player.play()
            let currentItem = player.currentItem
            let itemMetadataOutput = AVPlayerItemMetadataOutput(identifiers: nil)
            self.itemMetadataOutput = itemMetadataOutput
            self.itemMetadataOutput?.setDelegate(self, queue: .main)
            currentItem?.add(itemMetadataOutput)
        }
    }

    public func metadataOutput(_ output: AVPlayerItemMetadataOutput, didOutputTimedMetadataGroups groups: [AVTimedMetadataGroup], from track: AVPlayerItemTrack?) {
        print("received metadata \(Date())")
    }
}
Replies: 5 · Boosts: 4 · Views: 873 · Last activity: Oct ’24
CallKit breaks web based MediaStreams
We're integrating a web based group calling application within a native iOS application and finding that every time a CallKit session gets fully established, the web based media streams break, rendering as gray with no audio. Up to iOS 18 we worked around it by not fulfilling the call start action, but that's no longer an option, as the audio stopped getting automatically redirected to the speaker. We would now need the CXProvider's didActivateAudioSession callback, but that would break the video. The sample project loads a simple webpage in a WKWebView which contains a video tag streaming the media from the device's camera. At the same time it sets up a new CallKit session by requesting and fulfilling a CXStartCallAction transaction. You will notice that the media doesn't render and, if you follow the warnings we left, you will find that not fulfilling the CXStartCallAction fixes it. Unfortunately that's not a workaround we can use, as we need the CXProvider delegate to inform us about audio session changes so we can redirect the audio to the speaker (so the proximity sensor doesn't activate and locking the screen doesn't end the call). Any insights or workarounds would be greatly appreciated.
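For context, a minimal sketch of the CXStartCallAction flow the post describes might look like this; the class name and call handle are hypothetical, and only the CallKit and AVAudioSession calls themselves are real API.

import CallKit
import AVFoundation

// Sketch of starting a CallKit session and routing audio to the speaker.
final class WebCallManager: NSObject, CXProviderDelegate {
    private let provider = CXProvider(configuration: CXProviderConfiguration())
    private let callController = CXCallController()

    func startCall() {
        provider.setDelegate(self, queue: nil)
        let handle = CXHandle(type: .generic, value: "web-call")
        let start = CXStartCallAction(call: UUID(), handle: handle)
        callController.request(CXTransaction(action: start)) { error in
            if let error = error { print("start call failed: \(error)") }
        }
    }

    func provider(_ provider: CXProvider, perform action: CXStartCallAction) {
        // Fulfilling here is what reportedly breaks the WKWebView MediaStream.
        action.fulfill()
    }

    func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
        // Route audio to the speaker so the proximity sensor stays off.
        try? audioSession.overrideOutputAudioPort(.speaker)
    }

    func providerDidReset(_ provider: CXProvider) {}
}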
Replies: 4 · Boosts: 1 · Views: 923 · Last activity: Nov ’24
Multiview HLS with HDR
I have an HDR10+ encoded video that plays back on the Apple Vision Pro when loaded as a .mov, but when that video is encoded using the latest (1.23b) Apple HLS tools to generate an fMP4, the resulting m3u8 cannot be played back on the Apple Vision Pro and I only get a "Cannot Open" error. To generate the m3u8, I'm just calling mediafilesegmenter (with -iso-fragmented) and then variantplaylistcreator. This completes with no errors, and the m3u8 will play back on the Mac using VLC, but not on the Apple Vision Pro. The relevant part of the m3u8 is:

#EXT-X-STREAM-INF:AVERAGE-BANDWIDTH=40022507,BANDWIDTH=48883974,VIDEO-RANGE=PQ,CODECS="ec-3,hvc1.1.60000000.L180.B0",RESOLUTION=4096x4096,FRAME-RATE=24.000,CLOSED-CAPTIONS=NONE,AUDIO="audio1",REQ-VIDEO-LAYOUT="CH-STEREO"
{{url}}

Has anyone been able to use the HLS tools to generate fMP4s of MV-HEVC videos with HDR10?
Replies: 3 · Boosts: 2 · Views: 1k · Last activity: Feb ’25
iOS 18: ApplicationMusicPlayer.Queue breaks if one or more songs are in the Library
Hi there, after upgrading to iOS 18 I noticed that ApplicationMusicPlayer.Queue behavior has broken if at least one song that is added to the queue is also in the Apple Music Library on the device. The resulting behavior is that the queue does not accept all the items, and only items that are in the Library are playable in the queue. The expected behavior, and the previous behavior on iOS 17, was that all the items would be added to the queue successfully. I confirmed this on a separate test device running iOS 17.7. The items added are all being fetched via MusicCatalogResourceRequest<Song>, so I would expect that a requested song being present in the library would have no effect.
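A hedged sketch of the flow described here, assuming catalog song IDs are already known; the function name and IDs are placeholders, only the MusicKit calls are real API.

import MusicKit

// Fetch catalog songs by ID and hand them to ApplicationMusicPlayer.
func playCatalogSongs(ids: [MusicItemID]) async throws {
    let request = MusicCatalogResourceRequest<Song>(matching: \.id, memberOf: ids)
    let response = try await request.response()

    let player = ApplicationMusicPlayer.shared
    // On iOS 17 every fetched song was queued; on iOS 18 the reporter sees
    // only library-resident songs remaining playable in this queue.
    player.queue = ApplicationMusicPlayer.Queue(for: response.items)
    try await player.play()
}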
Replies: 3 · Boosts: 0 · Views: 820 · Last activity: Oct ’24
I cannot revoke an Apple FairPlay Streaming certificate
We are moving to another streaming service and need to deliver an ASK, a .pem & key, and a CRT to enable DRM. The issue is that we don't have that information anymore. The most logical option would be to revoke the current certificate and create a new one. Unfortunately, for FairPlay Streaming certificates there is no revoke button. We asked developer support, who isn't able to help. We then made a request to revoke as described in article 2.7 of the Apple Developer Program License Agreement; they can only do this when the certificate is compromised. So now we are stuck. Anyone out there who had the same issue and found a solution? Your help is much appreciated.
Replies: 3 · Boosts: 2 · Views: 601 · Last activity: Jan ’25
SCStreamDelegate stream:didStopWithError receiving null pointer for error
I have an SCStreamDelegate for capturing frames from applications. On recent point releases of macOS Sonoma, I've noticed that the stream is being cancelled with no user action being taken. I started trying to debug it, and when my on-error method is called, the error parameter being passed is null:

func stream(_ stream: SCStream, didStopWithError error: Error) {
    // The debugger shows the following and segfaults if I try to print "\(error)":
    // error (Error) > error = (Builtin.RawPointer) 0x0
}

From what I can tell, error should be a valid NSError so I can check the error code, based on similar code I've seen in, for example, OBS (https://github.com/obsproject/obs-studio/blob/265239d4174f8d291b0de437088c5b78f8e27687/plugins/mac-capture/mac-sck-common.m#L29). Usually when this happens, the menu bar icon for screen sharing (where I would click to change the shared window, etc.) stays there even after my app has closed and no apps are doing any sharing. Has anyone come across this before? Am I misinterpreting what the debugger is saying about the error parameter? I'm running macOS 14.7.3, but I just updated from 14.7.2 earlier and had basically the same issue on both macOS versions.
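One hedged way to inspect the error when it is valid is to bridge it to NSError before formatting it; the cast to SCStreamError is an assumption about the SDK's Swift overlay. If the underlying pointer really is nil, as the debugger output suggests, no Swift-side check can fully guard against it; that part is a framework bug to report.

import ScreenCaptureKit

// Sketch: log the bridged NSError fields, then optionally switch on SCStreamError.
final class CaptureDelegate: NSObject, SCStreamDelegate {
    func stream(_ stream: SCStream, didStopWithError error: Error) {
        let nsError = error as NSError
        print("stream stopped: domain=\(nsError.domain) code=\(nsError.code)")

        if let scError = error as? SCStreamError {
            print("SCStreamError code: \(scError.code)")
        }
    }
}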
Replies: 3 · Boosts: 0 · Views: 499 · Last activity: Mar ’25
Issue with Low-Latency HLS Playback Using AVAssetResourceLoaderDelegate
Hi, I am writing to seek any help or workaround regarding an issue I have encountered while implementing Low-Latency HLS playback using the AVAssetResourceLoaderDelegate. I have been successfully loading playlists during HLS live playback using the AVAssetResourceLoaderDelegate. However, after introducing Low-Latency HLS, I have run into a problem. When the AVPlayer loads low-latency content playback natively, everything works fine. But when I use the delegate for loading, I encounter the following error from AVPlayer's status observer: CoreMediaErrorDomain -15410 Low Latency: Server must support http2 ECN and SACK It seems there is no problem since playback does not stop, but there is a very critical part missing. The playback does not achieve the expected low latency and behaves similarly to standard HLS. Additionally, this behavior only occurs on iOS 16 devices and simulators. On iOS 17 simulators and devices, the error message does not appear, and the latency remains low as expected. Therefore, I suspect that there might be some misjudgment in the verification process within the internal implementation of AVPlayer. Since our app needs to support iOS 16, I would appreciate any solutions, methods to try, or workarounds that you could share regarding this issue. Thank you.
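For reference, a minimal wiring sketch of the delegate-based loading described above; the class name and queue label are placeholders, and only the AVAssetResourceLoader API itself is real.

import AVFoundation

// Sketch: intercept playlist loading via AVAssetResourceLoaderDelegate.
final class PlaylistLoader: NSObject, AVAssetResourceLoaderDelegate {
    private let queue = DispatchQueue(label: "playlist.loader")

    func makePlayer(manifestURL: URL) -> AVPlayer {
        let asset = AVURLAsset(url: manifestURL)   // e.g. a custom-scheme URL
        // The resource loader does not retain its delegate, so keep this
        // object alive for the lifetime of playback.
        asset.resourceLoader.setDelegate(self, queue: queue)
        return AVPlayer(playerItem: AVPlayerItem(asset: asset))
    }

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        // Fetch the playlist yourself, then call
        // loadingRequest.dataRequest?.respond(with: data) and
        // loadingRequest.finishLoading(). With LL-HLS, the reporter sees
        // CoreMediaErrorDomain -15410 on iOS 16 once playlists go through here.
        return true
    }
}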
Replies: 2 · Boosts: 0 · Views: 808 · Last activity: Oct ’24
macOS Sequoia 'Cannot Decode' HLS Video
I use AVPlayer to play HLS video successfully on macOS Sonoma, but I encountered this error on macOS Sequoia. Please help me: Error Domain=AVFoundationErrorDomain Code=-11833 ‘Cannot Decode’ UserInfo={NSUnderlyingError=0x600001e57330 {Error Domain=CoreMediaErrorDomain Code=-12906 ‘(null)’}, NSLocalizedFailureReason=The decoder required for this media cannot be found., AVErrorMediaTypeKey=vide, NSLocalizedDescription=Cannot Decode} Thanks!
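A hedged check, not a fix: error -11833 means no decoder was found for the media. The sketch below only reports whether hardware HEVC decode is available on the Mac in question; the stream's exact codec parameters may still be unsupported. The function name is a placeholder.

import VideoToolbox

// Report whether this Mac has a hardware HEVC decoder.
func hasHardwareHEVCDecoder() -> Bool {
    VTIsHardwareDecodeSupported(kCMVideoCodecType_HEVC)
}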
Replies: 2 · Boosts: 1 · Views: 601 · Last activity: Mar ’25
Get channels count for an audio stream in AVPlayer
In an m3u8 manifest, audio EXT-X-MEDIA tags usually contain a CHANNELS attribute with the audio channel count, like so:

#EXT-X-MEDIA:TYPE=AUDIO,URI="audio_clear_eng_stereo.m3u8",GROUP-ID="default-audio-group",LANGUAGE="en",NAME="stream_5",AUTOSELECT=YES,CHANNELS="2"

Is it possible to get this info from AVPlayer, AVMediaSelectionOption, or some related API?
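One hedged possibility, for iOS 15 and later, is to read the channel count from AVAssetVariant's audio attributes rather than from AVMediaSelectionOption directly; note that channelCount on the rendition-specific attributes is an assumption about newer SDKs and should be verified against your deployment target.

import AVFoundation

// Sketch: map each audio rendition to a channel count via the asset's variants.
func logAudioChannelCounts(for asset: AVURLAsset) async throws {
    let variants = try await asset.load(.variants)
    guard let group = try await asset.loadMediaSelectionGroup(for: .audible) else { return }

    for variant in variants {
        for option in group.options {
            if let channels = variant.audioAttributes?
                .renditionSpecificAttributes(for: option)?.channelCount {
                print("\(option.displayName): \(channels) channel(s)")
            }
        }
    }
}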
Replies: 2 · Boosts: 0 · Views: 576 · Last activity: Dec ’24
ScreenCaptureKit crash
We are having issues with ScreenCaptureKit. Our use case is the following: We have multiple applications that each start a stream capture using ScreenCaptureKit. It works fine when just one application is streaming, but when starting multiple streams continuously, all streams stop or crash without ScreenCaptureKit reporting an error back. Restarting replayd for the user will allow us to start streaming again, if the streaming applications are restarted too. We have built a small test program that we have tested on different Macs, running different versions of macOS, with identical results. The test program just calls the getShareableContentExcludingDesktopWindows function multiple times, since this was the simplest way to show the problem. Our test setup is the following: Mac mini M2 (macOS 13.6), MacBook Pro M3 (macOS 14.4), MacBook Pro M3 (macOS 15.1).

Code (main.m):

#import <Foundation/Foundation.h>
#import <ScreenCaptureKit/ScreenCaptureKit.h>

@interface Runner : NSObject
@property(atomic, assign) BOOL keepRunning;
@property(atomic, retain) SCShareableContent* availableWindows;
-(void) updateAvailableWindows;
-(void) notif:(NSNotification*) aNotif;
@end

int main(int argc, const char * argv[]) {
    @autoreleasepool {
        NSRunLoop* loo = [NSRunLoop mainRunLoop];
        Runner* r = [[Runner alloc] init];
        [r updateAvailableWindows];
        while (r.keepRunning)
            [loo runUntilDate:[NSDate distantFuture]];
        NSLog(@"Program exit");
    }
    return 0;
}

@implementation Runner

-(instancetype) init {
    self = [super init];
    if (self) {
        _keepRunning = YES;
        [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(notif:) name:@"windowWasNotFound" object:nil];
        [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(notif:) name:@"windowWasFound" object:nil];
    }
    return self;
}

-(void) updateAvailableWindows {
    @autoreleasepool {
        NSLog(@"begin");
        [SCShareableContent getShareableContentExcludingDesktopWindows:NO
                                                   onScreenWindowsOnly:YES
                                                     completionHandler:^(SCShareableContent* content, NSError* error) {
            NSLog(@"running");
            if (!error) {
                self.availableWindows = content;
                for (__unused SCWindow* aWin in self.availableWindows.windows) {
                    [NSThread sleepForTimeInterval:0.01];
                }
                [[NSNotificationCenter defaultCenter] postNotificationName:@"windowWasFound" object:self.availableWindows];
            }
            else {
                [[NSNotificationCenter defaultCenter] postNotificationName:@"windowWasNotFound" object:nil];
            }
        }];
    }
}

-(void) notif:(NSNotification*) aNotif {
    if (aNotif.object)
        [self performSelectorOnMainThread:@selector(updateAvailableWindows) withObject:nil waitUntilDone:NO];
    else {
        NSLog(@"Not Found");
        self.keepRunning = NO;
    }
}

@end

How to replicate: Compile and run the program from multiple terminal windows (Terminal must be granted screen recording permission) and notice that the output stops when replayd stops responding (our assumption). Restarting the application does nothing until replayd is also restarted. Running the application in Xcode gives the following error:

[ERROR] -[RPDaemonProxy fetchShareableContentWithOption:windowID:currentProcess:withCompletionHandler:]_block_invoke:902 error: 4097

This error is not something we have been able to detect in our applications, and since the only workaround is restarting replayd and the applications, catching the error would help us.
Replies: 2 · Boosts: 0 · Views: 598 · Last activity: Jan ’25
Problem with UVC Device Access on visionOS
No external cameras show up in the app on visionOS. We use this sample code as a basis for our tests: https://developer.apple.com/documentation/visionos/displaying-video-from-connected-devices We also received the needed entitlement from Apple, but every camera we tried so far does not show up on visionOS. We tried the following devices and hubs: Insta360 X4; Somikon Endoscope Camera (USB HD Endoscope Camera); EMEET Full HD Webcam C960; BENFEI Video/Audio Capture Card (4K HDMI to USB C/A); Logitech C920 HD PRO Webcam; Anker PowerConf C200; Insta360 GO 3S; Anker 341 USB-C Hub; UGREEN Revodok Pro 10Gbps USB-C Hub. All Vision Pro devices we tried run visionOS 2.3. When trying the same code on iPad we can actually use external cameras. Steps to reproduce: Start the app on a Vision Pro device and connect an external camera; the connected camera does not show up in the dropdown. Development environment: Xcode 16.2, macOS 15.3. Run-time configuration: iOS 18.3, visionOS 2.3.
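For reference, the linked sample is built around an AVCaptureDevice discovery session along the lines of the sketch below; on the reporter's visionOS 2.3 devices this reportedly returns nothing. The function name is a placeholder.

import AVFoundation

// Sketch: list external (UVC) cameras by localized name.
func listExternalCameras() -> [String] {
    let session = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.external],
        mediaType: .video,
        position: .unspecified
    )
    return session.devices.map(\.localizedName)
}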
Replies: 2 · Boosts: 0 · Views: 578 · Last activity: Feb ’25
How to receive AVMetricEvent performance data?
I would like to look at AVMetricEvent data during video playback, so I have added this code to a test video player app:

let playerItem: AVPlayerItem = ...
let allMetrics = playerItem.allMetrics()

Task.init {
    print("metrics task started")
    do {
        for try await metricEvent in allMetrics {
            print("metric event: \(metricEvent.description)")
        }
    } catch {
        print("unexpected metric iterator error \(error)")
    }
}

Running this in Simulator on iPhone 16 Pro (18.0) does not result in any "metric event" diagnostic messages being printed when the video associated with this AVPlayerItem is playing. Only the "metrics task started" diagnostic message is seen. What am I doing wrong that prevents metric data from being received?
Replies: 2 · Boosts: 0 · Views: 359 · Last activity: Feb ’25
FairPlay-Protected HLS Files Not Transferred via Quick Start
I have an iOS app that downloads HLS files, which are protected by FairPlay. These files are stored locally, and their locations are managed using Core Data. When playing these tracks, I use AVURLAsset to access the stored file paths. Recently, a client upgraded to a new iPhone and used Quick Start to transfer data from his old device. While all other app data was successfully transferred, including Core Data records and UserDefaults, the actual HLS files were missing. As a result, the app retained metadata about the downloaded content, but the files themselves were gone, causing playback failures. Does Quick Start exclude certain types of locally stored files, especially DRM-protected HLS downloads, or is the issue related to how FairPlay-protected content is handled during the transfer of locally stored files?
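Somewhat orthogonal to the transfer question, Apple's guidance for downloaded HLS content is to persist bookmark data rather than raw file paths, since on-disk locations are not guaranteed to stay stable. A minimal sketch, with placeholder function names:

import Foundation

// Persist the downloaded asset's location as bookmark data instead of a path.
func persistDownloadLocation(_ location: URL) throws -> Data {
    try location.bookmarkData()
}

// Resolve the bookmark back to a URL before creating the AVURLAsset.
func resolveDownloadLocation(from bookmark: Data) throws -> URL {
    var isStale = false
    let url = try URL(resolvingBookmarkData: bookmark, bookmarkDataIsStale: &isStale)
    if isStale {
        // Re-save the bookmark, or re-download if the file is actually gone.
    }
    return url
}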
Replies: 2 · Boosts: 0 · Views: 114 · Last activity: Mar ’25