Streaming


Deep dive into the technical specifications that influence seamless playback for streaming services, including bitrates, codecs, and caching mechanisms.


Posts under the Streaming subtopic

Each entry below lists the post, followed by its reply, boost, and view counts and the date of the latest activity.

FairPlay error on Mac Catalyst
Hi, I have an iOS app that uses FairPlay DRM to play videos. The iOS app allows offline download of the videos, so we obtain a persistent FairPlay license. On iOS everything works fine. We have now built the same app for Mac Catalyst, and in the Mac Catalyst app we cannot play the video; we get error code -42650. We are able to get the persistent license from the server, but when we play the video with that license we get the error. Below are the relevant log lines:

    2024-12-06 22:05:48.911266+0530 0x4dffe2 Default 0x0 85505 0 teachonline: (MediaToolbox) [com.apple.coremedia:] <<<< FigPKDKeyManager >>>> keyManager_processOfflineKeyInternal: 0x600000322000 160D4519-C60B-4FD0-B69A-20B2A4597017 created decrypt context:0x0 with offline key; updated offline key:0x0 err:-42650
    2024-12-06 22:05:48.911369+0530 0x4dffe2 Default 0x0 85505 0 teachonline: (MediaToolbox) [com.apple.coremedia:player] <<<< FigStreamPlayer >>>> fpfs_ensureDecryptorHasStarted: [0x7fc44e4dc520|P/NW] <0x7fc44fa44000|I/SRA.01>: track 1 latching decryptorFailure -42650
    85505 0 teachonline: (MediaToolbox) [com.apple.coremedia:player] <<<< FigStreamPlayer >>>> fpfs_StopPlayingItem: [0x7fc44e4dc520|P/NW] <0x7fc44fa44000|I/SRA.01>: Pausing, err=Error Domain=CoreMediaErrorDomain Code=-42650 "(null)"

I have copied only the lines that contain errors. You can download the full logs from https://drive.google.com/file/d/1feb9pKZERUr--PMt6m-6IrO_mDvoFbjO/view?usp=sharing Can you please help me fix this issue?
Replies: 1 · Boosts: 0 · Views: 617 · Latest activity: Dec ’24
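The post above does not show its key-delivery code, so the following is only a minimal sketch of the persistable-key path through AVContentKeySession, with a placeholder key-server closure; logging each step can help narrow down whether -42650 is raised while the offline key is requested or only when the decrypt context is created on Mac Catalyst.

    import AVFoundation

    final class OfflineKeyDelegate: NSObject, AVContentKeySessionDelegate {
        // Hypothetical key-server round trip; replace with the real SPC/CKC exchange.
        var loadCKC: (Data) -> Data? = { _ in nil }

        func contentKeySession(_ session: AVContentKeySession,
                               didProvide keyRequest: AVContentKeyRequest) {
            do {
                // Ask AVFoundation to upgrade this request to a persistable one.
                // If this throws on Mac Catalyst but not on iOS, the offline-key
                // step is what is failing, not playback itself.
                try keyRequest.respondByRequestingPersistableContentKeyRequestAndReportError()
            } catch {
                print("persistable key request refused: \(error)")
            }
        }

        func contentKeySession(_ session: AVContentKeySession,
                               didProvide keyRequest: AVPersistableContentKeyRequest) {
            let appCert = Data() // FairPlay application certificate goes here
            keyRequest.makeStreamingContentKeyRequestData(forApp: appCert,
                                                          contentIdentifier: nil,
                                                          options: nil) { spc, error in
                guard let spc, let ckc = self.loadCKC(spc) else {
                    print("SPC/CKC exchange failed: \(String(describing: error))")
                    return
                }
                do {
                    // Convert the CKC into a persistable key and hand it to the request.
                    let persistableKey = try keyRequest.persistableContentKey(fromKeyVendorResponse: ckc)
                    // ...store persistableKey on disk for later offline playback...
                    keyRequest.processContentKeyResponse(
                        AVContentKeyResponse(fairPlayStreamingKeyResponseData: persistableKey))
                } catch {
                    print("could not create persistable key: \(error)")
                }
            }
        }
    }

For playback from a stored key, the equivalent check is whether an AVContentKeyResponse built with fairPlayStreamingKeyResponseData from the saved key data is accepted before the item starts playing.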
mediastreamvalidator error: Invalid URL
I am trying to validate a low-latency HLS fragmented-MP4 setup with mediastreamvalidator, and I get the following error:

    Error: Invalid URL
    Detail: '(null)' is not a valid URL
    Source: mediaplaylistURL.m3u8 - segmentURL.mp4

The mediastreamvalidator version is 1.23.14. What does that error mean?
Replies: 1 · Boosts: 0 · Views: 373 · Latest activity: Dec ’24
Use AVPlayer for multiple videos
I'm developing a tutorial-style tvOS app with multiple videos. The examples I've seen so far deal with only one video: the player and source URL are defined before the body view,

    let avPlayer = AVPlayer(url: URL(string: "https://domain.com/.../.../video.mp4")!)

and then in the body view the video is displayed,

    VideoPlayer(player: avPlayer)

which allows options such as stop/start etc. When I try something similar with a video title passed into this view, I can't define the player with this title variable:

    var vTitle: String
    var avPlayer = AVPlayer(url: URL(string: "https://domain.com/.../.../" + vTitle + ".mp4")!)

    var body: some View {

I get an error that vTitle can't be used in the URL above the body view. Any thoughts or suggestions? Thanks
Replies: 1 · Boosts: 0 · Views: 757 · Latest activity: Dec ’24
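A minimal sketch of one way around the compile error in the post above: a stored property initializer cannot read another property such as vTitle, so the player has to be created once the title is available, for example in onAppear (the view name and URL below are placeholders).

    import SwiftUI
    import AVKit

    struct TutorialVideoView: View {
        let vTitle: String                 // passed in by the parent view
        @State private var avPlayer: AVPlayer?

        var body: some View {
            VideoPlayer(player: avPlayer)
                .onAppear {
                    // Property initializers run before vTitle exists, so build the
                    // player here (an init() that sets a @State wrapper also works).
                    if avPlayer == nil,
                       let url = URL(string: "https://domain.com/videos/\(vTitle).mp4") {
                        avPlayer = AVPlayer(url: url)
                    }
                }
        }
    }

Called as TutorialVideoView(vTitle: "intro"), each view instance then owns its own player and keeps the usual start/stop controls.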
ScreenCaptureKit crash
We are having issues with ScreenCaptureKit. Our use case is the following: we have multiple applications that each start a stream capture using ScreenCaptureKit. It works fine when just one application is streaming, but when starting multiple streams continuously, all streams stop or crash without ScreenCaptureKit reporting an error back. Restarting replayd for the user lets us start streaming again, if the streaming applications are restarted too.

We have built a small test program and tested it on different Macs, running different versions of macOS, with identical results. The test program just calls the getShareableContentExcludingDesktopWindows function multiple times, since this was the simplest way to show the problem.

Our test setup is the following:

    Mac Mini M2, macOS 13.6
    MacBook Pro M3, macOS 14.4
    MacBook Pro M3, macOS 15.1

Code (main.m):

    #import <Foundation/Foundation.h>
    #import <ScreenCaptureKit/ScreenCaptureKit.h>

    @interface Runner : NSObject
    @property(atomic, assign) BOOL keepRunning;
    @property(atomic, retain) SCShareableContent* availableWindows;
    -(void) updateAvailableWindows;
    -(void) notif:(NSNotification*) aNotif;
    @end

    int main(int argc, const char * argv[]) {
        @autoreleasepool {
            NSRunLoop* loo = [NSRunLoop mainRunLoop];
            Runner* r = [[Runner alloc] init];
            [r updateAvailableWindows];
            while (r.keepRunning)
                [loo runUntilDate:[NSDate distantFuture]];
            NSLog(@"Program exit");
        }
        return 0;
    }

    @implementation Runner

    -(instancetype) init {
        self = [super init];
        if(self){
            _keepRunning = YES;
            [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(notif:) name:@"windowWasNotFound" object:nil];
            [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(notif:) name:@"windowWasFound" object:nil];
        }
        return self;
    }

    -(void) updateAvailableWindows {
        @autoreleasepool {
            NSLog(@"begin");
            [SCShareableContent getShareableContentExcludingDesktopWindows:NO
                                                       onScreenWindowsOnly:YES
                                                         completionHandler:^(SCShareableContent* content, NSError* error){
                NSLog(@"running");
                if(!error){
                    self.availableWindows = content;
                    for (__unused SCWindow* aWin in self.availableWindows.windows) {
                        [NSThread sleepForTimeInterval:0.01];
                    }
                    [[NSNotificationCenter defaultCenter] postNotificationName:@"windowWasFound" object:self.availableWindows];
                }
                else{
                    [[NSNotificationCenter defaultCenter] postNotificationName:@"windowWasNotFound" object:nil];
                }
            }];
        }
    }

    -(void) notif:(NSNotification*) aNotif {
        if(aNotif.object)
            [self performSelectorOnMainThread:@selector(updateAvailableWindows) withObject:nil waitUntilDone:NO];
        else{
            NSLog(@"Not Found");
            self.keepRunning = NO;
        }
    }

    @end

How to replicate: compile and run the program from multiple terminal windows (the terminal must be granted screen-recording permission) and notice that the output stops when replayd stops responding (our assumption). Restarting the application does nothing until replayd is also restarted. Running the application in Xcode gives the following error:

    [ERROR] -[RPDaemonProxy fetchShareableContentWithOption:windowID:currentProcess:withCompletionHandler:]_block_invoke:902 error: 4097

This error is not something we have been able to detect in our applications, and since the only workaround is restarting replayd and the applications, catching the error would help us.
Replies: 2 · Boosts: 0 · Views: 598 · Latest activity: Jan ’25
SCStreamDelegate stream:didStopWithError receiving null pointer for error
I have an SCStreamDelegate for capturing frames from applications. On recent point releases of macOS Sonoma, I've noticed that the stream is being cancelled with no user action being taken. I started trying to debug it, and when my on-error method is called, the error parameter being passed is null:

    func stream(_ stream: SCStream, didStopWithError error: Error) {
        /* debugger shows this and segfaults if I try to print "\(error)"
           error (Error)
           > error = (Builtin.RawPointer) 0x0 */

From what I can tell, error should be a valid NSError so I can check the error code, based on similar code I've seen in, for example, OBS (https://github.com/obsproject/obs-studio/blob/265239d4174f8d291b0de437088c5b78f8e27687/plugins/mac-capture/mac-sck-common.m#L29). Usually when this happens, the menu-bar icon for screen sharing (where I would click to change the shared window, etc.) stays there even after my app has closed and no apps are doing any sharing. Has anyone come across this before? Am I misinterpreting what the debugger is saying about the error parameter? I'm running macOS 14.7.3, but I just updated from 14.7.2 earlier and had basically the same issue on both macOS versions.
Replies: 3 · Boosts: 0 · Views: 499 · Latest activity: Mar ’25
Problem with UVC Device Access on visionOS
No external cameras show up in the app on visionOS. We use this sample code as a basis for our tests: https://developer.apple.com/documentation/visionos/displaying-video-from-connected-devices We also received the needed entitlement from Apple, but every camera we have tried so far does not show up on visionOS. We tried the following devices and hubs:

    Insta360 X4
    Somikon Endoscope Camera: USB HD Endoscope Camera
    EMEET Full HD Webcam - C960
    BENFEI Video/Audio Capture Card, 4K HDMI to USB C/A
    Logitech C920 HD PRO Webcam
    Anker PowerConf C200
    Insta360 GO 3S
    Anker 341 USB-C Hub
    UGREEN Revodok Pro 10Gbps USB-C Hub

All Vision Pro devices we tried run visionOS 2.3. When trying the same code on iPad we can actually use external cameras.

Steps to reproduce: start the app on a Vision Pro device and connect an external camera. The connected camera does not show up in the dropdown.

Development environment: Xcode 16.2, macOS 15.3. Run-time configuration: iOS 18.3, visionOS 2.3.
Replies: 2 · Boosts: 0 · Views: 578 · Latest activity: Feb ’25
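As a sanity check for the situation above, a minimal discovery probe (independent of the Apple sample's UI) can show whether AVFoundation enumerates any external camera at all once the UVC entitlement is in the signed build; the function name below is only an illustration.

    import AVFoundation

    func logExternalCameras() {
        // External (UVC) cameras are exposed through the .external device type.
        let discovery = AVCaptureDevice.DiscoverySession(
            deviceTypes: [.external],
            mediaType: .video,
            position: .unspecified
        )
        if discovery.devices.isEmpty {
            print("AVFoundation reports no external cameras")
        }
        for device in discovery.devices {
            print("external camera: \(device.localizedName) (\(device.uniqueID))")
        }
    }

Watching for the AVCaptureDeviceWasConnected notification while plugging the camera in can additionally distinguish "never enumerated" from "connected and then dropped".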
How to receive AVMetricEvent performance data?
I would like to look at AVMetricEvent data during video playback, so I have added this code to a test video player app:

    let playerItem: AVPlayerItem = ...
    let allMetrics = playerItem.allMetrics()
    Task.init {
        print("metrics task started")
        do {
            for try await metricEvent in allMetrics {
                print("metric event: \(metricEvent.description)")
            }
        } catch {
            print("unexpected metric iterator error \(error)")
        }
    }

Running this in the Simulator on iPhone 16 Pro (18.0) does not result in any "metric event" diagnostic messages being printed while the video associated with this AVPlayerItem is playing. Only the "metrics task started" diagnostic message is seen. What am I doing wrong that prevents metric data from being received?
Replies: 2 · Boosts: 0 · Views: 359 · Latest activity: Feb ’25
FairPlay-Protected HLS Files Not Transferred via Quick Start
I have an iOS app that downloads HLS files, which are protected by FairPlay. These files are stored locally, and their locations are managed using Core Data. When playing these tracks, I use AVURLAsset to access the stored file paths. Recently, a client upgraded to a new iPhone and used Quick Start to transfer data from his old device. While all other app data was successfully transferred, including Core Data records and UserDefaults, the actual HLS files were missing. As a result, the app retained metadata about the downloaded content, but the files themselves were gone, causing playback failures. Does Quick Start exclude certain types of locally stored files, especially DRM-protected HLS downloads, or is the issue related to how FairPlay-protected content is handled during the transfer of locally stored files?
Replies: 1 · Boosts: 0 · Views: 115 · Latest activity: Mar ’25
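One way to confirm what the post above suspects, sketched under the assumption that the app persists each download location (for example as bookmark data) alongside its Core Data record: on first launch after a device transfer, check whether the stored location still resolves to an existing file before handing it to AVURLAsset, and fall back to re-downloading if it does not. The function name here is a placeholder.

    import AVFoundation

    /// Returns a playable asset for a stored download location, or nil if the
    /// underlying HLS package did not survive a device-to-device transfer.
    func restoredAsset(fromBookmark bookmarkData: Data) -> AVURLAsset? {
        var isStale = false
        guard let url = try? URL(resolvingBookmarkData: bookmarkData,
                                 options: [],
                                 relativeTo: nil,
                                 bookmarkDataIsStale: &isStale),
              FileManager.default.fileExists(atPath: url.path) else {
            // Metadata survived (Core Data, UserDefaults) but the files are gone:
            // schedule a re-download instead of giving AVPlayer a dead path.
            return nil
        }
        return AVURLAsset(url: url)
    }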
Playback Issues for DRM content when sending CMCD
Since iOS 18 and tvOS 18, CMCD can be sent automatically by AVPlayer (https://developer.apple.com/streaming/Whats-new-HLS.pdf). However, after enabling CMCD, our streams occasionally fail with the following error:

    CoreMediaErrorDomain error -17383

This issue appears to affect only DRM-protected (FairPlay) streams so far. We activate CMCD via the resource loader of an AVURLAsset, before assigning the item to an AVPlayer. Unfortunately, we haven't found a reliable way to reproduce the issue, and we've been unable to gather any useful diagnostic information. Has anyone else observed this behavior when enabling CMCD on FairPlay streams?
Replies: 2 · Boosts: 0 · Views: 157 · Latest activity: Mar ’25
Support for encrypted MP4 on Apple devices
Hello, hope everybody is doing well. I have some reels content (9:16 aspect ratio) that I want to play back on iOS phones. My question: does AVPlayer support out-of-the-box playback of encrypted MP4? Please note, this is not HLS fMP4, but unfragmented MP4 content. If it is supported, which encryption algorithm should be used? Please let me know.
Replies: 1 · Boosts: 0 · Views: 79 · Latest activity: Mar ’25
Issues with "AVMetricEventStreamPublisher Discover Media Performance Metrics in AVFoundation" Example Code
Hi everyone! I've been working with AVFoundation and trying to use AVMetricEventStreamPublisher to discover media performance metrics, as described in the Apple documentation: https://developer.apple.com/cn/videos/play/wwdc2024/10113/?time=508 However, when following the example code, I'm not getting the expected results: the performance metrics for both audio and video don't seem to be captured properly. Has anyone successfully used this example code? If so, could you share your experience or any solutions you've found? Any tips or insights would be greatly appreciated. Thanks in advance!

P.S. The example code:

    AVPlayerItem *item = ...;
    AVMetricEventStream *eventStream = [AVMetricEventStream eventStream];
    id subscriber = [[MyMetricSubscriber alloc] init];
    [eventStream setSubscriber:subscriber queue:mySerialQueue];
    [eventStream subscribeToMetricEvent:[AVMetricPlayerItemLikelyToKeepUpEvent class]];
    [eventStream subscribeToMetricEvent:[AVMetricPlayerItemPlaybackSummaryEvent class]];
    [eventStream addPublisher:item];
Replies: 1 · Boosts: 0 · Views: 63 · Latest activity: Apr ’25
AVAssetWriterInputTaggedPixelBufferGroupAdaptor Hanging With Tagged Buffers
We've successfully implemented an AVAssetWriter to produce HLS streams (all code is Objective-C++ for interop with an existing codebase) but are struggling to extend the operation to use tagged buffers. We're starting to wonder whether the tagged buffers required for an MV-HEVC signal are fully supported when producing HLS segments in a live-stream setting.

We generate a live stream of data using something like:

    UTType *t = [UTType typeWithIdentifier:AVFileTypeMPEG4];
    m_writter = [[AVAssetWriter alloc] initWithContentType:t];

    // - videoHint describes HEVC and width/height
    // - m_videoConfig includes compression settings and, when using MV-HEVC,
    //   the correct keys are added (i.e. kVTCompressionPropertyKey_MVHEVCVideoLayerIDs).
    //   The app was throwing an exception without these, which was
    //   useful to know when we got the configuration right.
    m_video = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo
                                             outputSettings:m_videoConfig
                                           sourceFormatHint:videoHint];

For either path we're producing CVPixelBufferRefs that contain the raw pixel information (i.e. 32BGRA), so we use an adapter to make that as simple as possible. If we use a single view and an AVAssetWriterInputPixelBufferAdaptor, things work out very well: we produce segments and the delegate is called. However, if we use the AVAssetWriterInputTaggedPixelBufferGroupAdaptor as exampled in the SideBySideToMVHEVC demo project, things go poorly.

We create the tagged buffers with something like:

    CMTagCollectionRef collections[2];

    CMTag leftTags[] = {
        CMTagMakeWithSInt64Value(kCMTagCategory_VideoLayerID, (int64_t)0),
        CMTagMakeWithSInt64Value(kCMTagCategory_StereoView, kCMStereoView_LeftEye)
    };
    CMTagCollectionCreate(kCFAllocatorDefault, leftTags, 2, &(collections[0]));

    CMTag rightTags[] = {
        CMTagMakeWithSInt64Value(kCMTagCategory_VideoLayerID, (int64_t)1),
        CMTagMakeWithSInt64Value(kCMTagCategory_StereoView, kCMStereoView_RightEye)
    };
    CMTagCollectionCreate(kCFAllocatorDefault, rightTags, 2, &(collections[1]));

    CFArrayRef tagCollections = CFArrayCreate(kCFAllocatorDefault, (const void **)collections, 2, &kCFTypeArrayCallBacks);

    CVPixelBufferRef buffers[] = {*b, *alt};
    CFArrayRef b = CFArrayCreate(kCFAllocatorDefault, (const void **)buffers, 2, &kCFTypeArrayCallBacks);

    CMTaggedBufferGroupRef bufferGroup;
    OSStatus res = CMTaggedBufferGroupCreate(kCFAllocatorDefault, tagCollections, b, &bufferGroup);

Perhaps there's something about this Objective-C code that I've buggered up? Hopefully! Anyway, when I submit this tagged buffer group to the adaptor:

    if (![mvVideoAdapter appendTaggedPixelBufferGroup:bufferGroup withPresentationTime:pts]) {
        // report error...
    }

appending does not raise any errors - eventually it just hangs and we never return from the call.

Real issue: so either the delegate assigned to the AVAssetWriter doesn't fire its assetWriter callback, which should produce the segments, or the adapter hangs on appendTaggedPixelBufferGroup before a segment is ready to be completed (but succeeds for a number of buffer groups before this happens). This is the same delegate class that's assigned to the non-multi-view code path when MV-HEVC is turned off, which works perfectly.
Replies: 1 · Boosts: 0 · Views: 62 · Latest activity: Apr ’25
CoreMediaErrorDomain -12888: Bandwidth down-stepping when using 2sec segment duration
The problem: when playing an HLS live stream with AVPlayer on iOS/tvOS, the player first chooses the highest bandwidth, then slowly steps down to the lowest (within 1-3 minutes), eventually steps up again, and then repeats the step-down. The AVPlayer error log reports events like:

    errorStatusCode: -12888, errorDomain: Optional("CoreMediaErrorDomain"), errorComment: Optional("The operation couldn't be completed. (CoreMediaErrorDomain error -12888 - Playlist File unchanged for longer than 1.5 * target duration

We use standard segments in CMAF format with a 2-second duration:

    #EXTM3U
    #EXT-X-VERSION:6
    #EXT-X-TARGETDURATION:2
    #EXT-X-MEDIA-SEQUENCE:147065903
    #EXT-X-MAP:URI="video_1_4660000_init.mp4?device_profile=cmaf_cbcs_verimatrix_cei%26seg_size=2%26cmaf=2"
    #EXT-X-PROGRAM-DATE-TIME:2025-04-30T12:51:07
    #EXTINF:2.000,
    video_1_4660000_t17460174670001555.mp4?device_profile=cmaf_cbcs_verimatrix_cei%26seg_size=2%26cmaf=2
    #EXTINF:2.000,
    video_1_4660000_t17460174690001555.mp4?device_profile=cmaf_cbcs_verimatrix_cei%26seg_size=2%26cmaf=2
    #EXTINF:2.000,
    video_1_4660000_t17460174710001555.mp4?device_profile=cmaf_cbcs_verimatrix_cei%26seg_size=2%26cmaf=2

When using 6-second segments the player stays stable at the highest bandwidth. Is there a way to avoid this error, either in AVPlayer or in the HLS configuration?
Replies: 2 · Boosts: 0 · Views: 179 · Latest activity: May ’25
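A small observation sketch related to the post above: the -12888 entries come from the item's error log, so subscribing to new error-log entries during playback makes it possible to correlate each "Playlist File unchanged" event with the moment the bitrate steps down. This is generic AVPlayerItem error-log plumbing, not a CMAF-specific fix.

    import AVFoundation

    func watchErrorLog(of item: AVPlayerItem) -> NSObjectProtocol {
        // Fires every time AVPlayer appends an entry to the item's error log.
        return NotificationCenter.default.addObserver(
            forName: .AVPlayerItemNewErrorLogEntry,
            object: item,
            queue: .main
        ) { _ in
            guard let event = item.errorLog()?.events.last else { return }
            print("HLS error \(event.errorStatusCode) (\(event.errorDomain)):",
                  event.errorComment ?? "-",
                  "uri:", event.uri ?? "-")
        }
    }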
FPS Certificate Re-issuance and Validity of Existing Certificate
Keywords: FairPlay, FPS Certificate, DRM, FairPlay Streaming, license server

Hi all, we are currently using FairPlay Streaming in production and already have an FPS certificate in place. However, the passphrase for the existing FPS certificate has unfortunately been lost. We are now considering reissuing a new FPS certificate, and I would like to confirm a few points before proceeding:

1️⃣ If we reissue a new FPS certificate, will the existing certificate be automatically revoked? Or will it remain valid until its original expiration date?
2️⃣ Is it possible to have both the newly issued and the existing certificates valid at the same time? In other words, can we serve DRM licenses using either certificate depending on the packaging or client?
3️⃣ Are there any caveats or best practices we should be aware of when reissuing an FPS certificate? For example, would existing packaged content become unplayable, or would CDN/packaging-server configurations need to be updated carefully?

Since this affects our production environment, we would like to minimize any service disruption or compatibility issues. Unfortunately, when we contacted Apple support directly, we were advised to post this question here in the Forums for additional guidance. Any advice or experiences would be greatly appreciated! Thank you in advance.
Replies: 1 · Boosts: 0 · Views: 119 · Latest activity: Jun ’25
macOS Sonoma 'Cannot Decode' HLS Video
I use AVPlayer to play HLS video successfully on macOS Sonoma, but I encountered this error on macOS Sequoia. Please help me:

    Error Domain=AVFoundationErrorDomain Code=-11833 ‘Cannot Decode’ UserInfo={NSUnderlyingError=0x600001e57330 {Error Domain=CoreMediaErrorDomain Code=-12906 ‘(null)’}, NSLocalizedFailureReason=The decoder required for this media cannot be found., AVErrorMediaTypeKey=vide, NSLocalizedDescription=Cannot Decode}

Thanks!
Replies: 2 · Boosts: 1 · Views: 602 · Latest activity: Mar ’25
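A small diagnostic sketch for failures like the one above: when AVPlayer surfaces AVFoundationErrorDomain -11833, the item's error and error log usually carry the underlying CoreMedia code and the URI of the rendition that could not be decoded, which helps identify the codec or variant involved. The function name is a placeholder.

    import AVFoundation

    /// Prints whatever the failed item recorded, including nested CoreMedia errors.
    func dumpPlaybackFailure(for item: AVPlayerItem) {
        if let error = item.error {
            let nsError = error as NSError
            print("item error:", nsError.domain, nsError.code, nsError.localizedDescription)
            if let underlying = nsError.userInfo[NSUnderlyingErrorKey] as? NSError {
                print("underlying:", underlying.domain, underlying.code)
            }
        }
        for event in item.errorLog()?.events ?? [] {
            print("log:", event.errorStatusCode, event.errorDomain, event.errorComment ?? "-")
        }
    }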