Streaming


Deep dive into the technical specifications that influence seamless playback for streaming services, including bitrates, codecs, and caching mechanisms.


Posts under the Streaming subtopic


iOS: Want to use USB client mode to send data.
I want my iOS app to be able to use USB client mode to send LiDAR data and camera frames to another device. What are my options for doing this? I've found IOUSBHost for host mode, but I want to see all my options. The device I want driving the bus is a Meta Quest 3, which, for clarity, is essentially an Android device. The iPhone would act as a sensor hub, sending data to the Quest 3 for further processing. As a workaround, I could let the iPhone drive the bus and have the Quest 3 use Android's accessory mode, which lets other devices drive the USB bus. But I want to attach more USB devices for this project, and that approach makes doing so more difficult, so I'd like to avoid it.
Replies: 0 · Boosts: 0 · Views: 294 · Jan ’25

EXT-X-DISCONTINUITY not properly handled in native iOS and Safari player (AVPlayer), broken playback.
Hi folks, when doing HLS v6 live streaming with fMP4 chunks, we noticed that when the encoder timestamps drift slightly and an #EXT-X-DISCONTINUITY tag is created in either the audio or video playlist (in an ABR setup), the tag is not correctly handled by the player, leading to broken playback with a black screen or no audio (depending on which playlist the tag appears in). We noticed this is often the case when the number of tags differs between the playlists (e.g. if the audio playlist contains 1 tag and the video playlist contains 2, the result is a black screen with audio). Playing the same "broken" source with the Shaka player instead doesn't break playback at all. Is there any possible (or upcoming) fix for AVPlayer?
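For illustration, a hypothetical pair of media playlists showing the kind of tag mismatch described above (URIs and durations are invented, and EXT-X-MAP and other live tags are omitted for brevity):

```
# Video media playlist: two discontinuities
#EXTM3U
#EXT-X-VERSION:6
#EXT-X-TARGETDURATION:4
#EXTINF:4.000,
video_100.m4s
#EXT-X-DISCONTINUITY
#EXTINF:4.000,
video_101.m4s
#EXT-X-DISCONTINUITY
#EXTINF:4.000,
video_102.m4s

# Audio media playlist: only one discontinuity over the same span
#EXTM3U
#EXT-X-VERSION:6
#EXT-X-TARGETDURATION:4
#EXTINF:4.000,
audio_100.m4s
#EXT-X-DISCONTINUITY
#EXTINF:4.000,
audio_101.m4s
#EXTINF:4.000,
audio_102.m4s
```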
Replies: 1 · Boosts: 0 · Views: 543 · Jan ’25

Detecting when an HDMI-connected device is turned off on Apple TV
Hello, we have an HLS streaming app on Apple TV. Our streams are DRM protected. We have a problem when the output device is turned off. For example, a user starts watching our DRM-protected HLS content; after some time, the user turns off the HDMI-connected device (a monitor or TV). Our app does not detect that the HDMI-connected device has been turned off. Is there any way, in Swift, to detect that the HDMI-connected device is turned off?
Replies: 0 · Boosts: 0 · Views: 380 · Jan ’25

I cannot revoke my Apple FairPlay Streaming certificate
We moved to another streaming service and need to deliver an ASK, a .pem key, and a CRT to enable DRM. The issue is that we no longer have that information. The most logical step would be to revoke the current certificate and create a new one. Unfortunately, there is no revoke button for FairPlay Streaming certificates. We asked Developer Support, who weren't able to help. We then made a revocation request as described in article 2.7 of the Apple Developer Program License Agreement; they can only do this when the certificate is compromised. So now we are stuck. Is there anyone out there who had the same issue and found a solution? Your help is much appreciated.
Replies: 3 · Boosts: 2 · Views: 601 · Jan ’25

ScreenCaptureKit crash
We are having issues with ScreenCaptureKit. Our use case is the following: we have multiple applications that each start a stream capture using ScreenCaptureKit. It works fine when just one application is streaming, but when multiple streams are started in quick succession, all streams stop or crash without ScreenCaptureKit reporting an error back. Restarting replayd for the user lets us start streaming again, provided the streaming applications are restarted too. We have built a small test program and run it on different Macs with different versions of macOS, with identical results. The test program just calls the getShareableContentExcludingDesktopWindows function repeatedly, since this was the simplest way to show the problem. Our test setup:

Mac mini M2, macOS 13.6
MacBook Pro M3, macOS 14.4
MacBook Pro M3, macOS 15.1

Code (main.m):

```objc
#import <Foundation/Foundation.h>
#import <ScreenCaptureKit/ScreenCaptureKit.h>

@interface Runner : NSObject
@property(atomic, assign) BOOL keepRunning;
@property(atomic, retain) SCShareableContent *availableWindows;
- (void)updateAvailableWindows;
- (void)notif:(NSNotification *)aNotif;
@end

int main(int argc, const char *argv[]) {
    @autoreleasepool {
        NSRunLoop *loo = [NSRunLoop mainRunLoop];
        Runner *r = [[Runner alloc] init];
        [r updateAvailableWindows];
        while (r.keepRunning)
            [loo runUntilDate:[NSDate distantFuture]];
        NSLog(@"Program exit");
    }
    return 0;
}

@implementation Runner
- (instancetype)init {
    self = [super init];
    if (self) {
        _keepRunning = YES;
        [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(notif:) name:@"windowWasNotFound" object:nil];
        [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(notif:) name:@"windowWasFound" object:nil];
    }
    return self;
}

- (void)updateAvailableWindows {
    @autoreleasepool {
        NSLog(@"begin");
        [SCShareableContent getShareableContentExcludingDesktopWindows:NO
                                                   onScreenWindowsOnly:YES
                                                     completionHandler:^(SCShareableContent *content, NSError *error) {
            NSLog(@"running");
            if (!error) {
                self.availableWindows = content;
                for (__unused SCWindow *aWin in self.availableWindows.windows) {
                    [NSThread sleepForTimeInterval:0.01];
                }
                [[NSNotificationCenter defaultCenter] postNotificationName:@"windowWasFound" object:self.availableWindows];
            } else {
                [[NSNotificationCenter defaultCenter] postNotificationName:@"windowWasNotFound" object:nil];
            }
        }];
    }
}

- (void)notif:(NSNotification *)aNotif {
    if (aNotif.object)
        [self performSelectorOnMainThread:@selector(updateAvailableWindows) withObject:nil waitUntilDone:NO];
    else {
        NSLog(@"Not Found");
        self.keepRunning = NO;
    }
}
@end
```

How to replicate: compile and run the program from multiple terminal windows (the terminal must be granted screen recording permission) and notice that the output stops when replayd stops responding (our assumption). Restarting the application does nothing until replayd is also restarted. Running the application in Xcode gives the following error:

```
[ERROR] -[RPDaemonProxy fetchShareableContentWithOption:windowID:currentProcess:withCompletionHandler:]_block_invoke:902 error: 4097
```

This error is not something we have been able to detect in our applications, and since the only workaround is restarting replayd and the applications, catching the error would help us.
Replies: 2 · Boosts: 0 · Views: 598 · Jan ’25

"The operation couldn’t be completed" error during live restart playback in native AVPlayer
In our Apple TV application, we use the native AVPlayer for live playback. During live restart playback, we intermittently encounter an error when the playback timeline approaches the actual live event end time.

Error: The operation couldn’t be completed. (CoreMediaErrorDomain error -16839 - Unable to get playlist before long download timer) / Failure reason:

Scenario: the live event is scheduled from 7:00 AM to 8:00 AM. Restart playback begins at 7:20 AM, allowing the user to watch the event from the start while the live stream continues in real time. As the restart playback timeline approaches the actual event end time (8:00 AM), AVPlayer raises this error while playback continues in the background.
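For diagnosing this class of failure, here is a minimal sketch (assuming an existing playerItem) that surfaces AVPlayer's HLS error log, which typically carries the CoreMediaErrorDomain status code and the URI of the playlist request that failed:

```swift
import AVFoundation

// Observe new entries in the player item's error log; each event
// carries the status code (e.g. -16839), domain, and a comment.
NotificationCenter.default.addObserver(
    forName: .AVPlayerItemNewErrorLogEntry,
    object: playerItem,
    queue: .main
) { notification in
    guard let item = notification.object as? AVPlayerItem,
          let event = item.errorLog()?.events.last else { return }
    print("HLS error \(event.errorStatusCode) in \(event.errorDomain): " +
          "\(event.errorComment ?? "no comment") uri=\(event.uri ?? "-")")
}
```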
Replies: 1 · Boosts: 0 · Views: 740 · Jan ’25

Unable to generate .p12 file from fairplay.cer
I am reaching out regarding an issue with my Apple FairPlay Streaming certificate. To generate the certificate signing request (CSR), I used the following OpenSSL commands:

```
openssl genrsa -out private_key.pem 1024
openssl req -new -key private_key.pem -out request.csr
```

However, according to the guide provided by Apple and instructions from my DRM provider, I should have used:

```
openssl genrsa -aes256 -out privatekey.pem 1024
openssl req -new -sha1 -key privatekey.pem -out certreq.csr -subj "/CN=SubjectName /OU=OrganizationalUnit /O=Organization /C=US"
```

I suspect this discrepancy might be causing the issue with my FairPlay certificate. After obtaining the fairplay.cer file and importing it into Keychain Access, I noticed the following: when I expand the certificate in Keychain Access, I can only see a public key and no private key. As a result, I am unable to export the certificate as a .p12 file; that option is disabled. Per my DRM provider's instructions, I need to export the certificate together with the corresponding private key as a password-protected .p12 file. Since the private key is not visible in Keychain Access, I am unable to proceed. I have read the FairPlay Streaming overview but could not find why this issue occurs or any guidance on how to revoke a certificate. I also came across the terms and conditions, which mention reaching out to product-security at Apple for help revoking corrupt certificates; despite reaching out, I have not received a response. Any help on how to proceed would be great!
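Worth noting: Keychain Access only associates a private key with a certificate when the key lives in the keychain, and a CSR generated with OpenSSL leaves the key on disk instead. Assuming the private_key.pem from the first set of commands above still exists, a sketch of assembling the .p12 entirely with OpenSSL (skip the first step if the .cer is already PEM-encoded):

```
# Convert the DER-encoded certificate from Apple to PEM
openssl x509 -inform DER -in fairplay.cer -out fairplay.pem

# Bundle certificate + original private key into a password-protected .p12
openssl pkcs12 -export -inkey private_key.pem -in fairplay.pem -out fairplay.p12
```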
Replies: 0 · Boosts: 0 · Views: 491 · Jan ’25

ApplicationMusicPlayer stops with error after skipping quickly over playlist entries
ApplicationMusicPlayer, with a queue created from a playlist, crashes with random occurrence shortly after skipping back or forth using the controls embedded in the notification, with this error in the console log:

```
applicationController: xpc service connection interrupted
```

I've noticed that the issue occurs more frequently the shorter the time between skips. Since ApplicationMusicPlayer runs in a remote process, the main app does not crash, but the music stops playing without any exception and the playback controls revert to an uninitialized state. Here is how I'm initializing the queue:

```swift
let entries = playlist
    .with(.entries).entries!
    .map { ApplicationMusicPlayer.Queue.Entry($0) }

ApplicationMusicPlayer.shared.queue = .init(
    entries,
    startingAt: entries.last
)
```

Please give me some tips on how to solve this.

EDIT: The issue does not occur when navigating quickly through a station.
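Not a fix for the XPC interruption itself, but given the observation above that faster skipping makes it more frequent, one mitigation to try is coalescing rapid skips so the remote player process sees only the last request. A sketch (the SkipDebouncer type and 300 ms window are invented for illustration):

```swift
import MusicKit

// Coalesce bursts of skip requests so the remote player process
// receives a single skip instead of one per tap.
final class SkipDebouncer {
    private var pending: Task<Void, Never>?

    func skipToNext() {
        pending?.cancel()
        pending = Task {
            try? await Task.sleep(nanoseconds: 300_000_000) // 300 ms quiet window
            guard !Task.isCancelled else { return }
            try? await ApplicationMusicPlayer.shared.skipToNextEntry()
        }
    }
}
```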
Replies: 1 · Boosts: 0 · Views: 486 · Jan ’25

[Request] Support for Spotify-like Audio Analysis API for Apple Music.
Hi, I have been working on a project that enables users to listen to their favorite music through a streaming service, which so far was Spotify. The app has a programmable 3D/2D interface and can connect to devices in your home and have them react to music. As of September 2024, Spotify decommissioned their Audio Analysis API. I have seen other posts mention playing Apple Music through AVFoundation, which would break DRM, so it's not supported. However, the Spotify Audio Analysis API did not allow full frequency reconstruction: it was entirely temporal data on beats, kicks, loudness, and timbre changes, which are themselves operators on the spectral data from the FFT. It would be very useful for the developer community to get this ability, and it would probably make Apple Music more popular among developers and those who use their apps. Would love to hear your thoughts about this, and Happy New Year!
Replies: 0 · Boosts: 1 · Views: 586 · Dec ’24

Unknown error -12881 when using AVAssetResourceLoader
We need to change a cookie every 120 seconds while playing. In AVPlayer we can't modify cookies after initialization, so we followed the approach of using a resource loader delegate to pass the cookie as a header value. What I notice is that the playlist file (.m3u8) is downloaded correctly, and some chunks of the video file are downloaded too. I know the .ts file is downloaded because I can see the GET request completing on the web server with status 200. I also set a breakpoint at the following line:

```swift
loadingRequest.dataRequest?.respond(with: data)
```

and immediately got this error from the AVPlayer status: "The operation could not be completed. An unknown error occurred (-12881)" from Core Media. I need confirmation on why I am unable to load HLS using the resource loader, and whether it is possible to update the cookie value continuously while playing in AVPlayer.

```swift
override func viewDidLoad() {
    super.viewDidLoad()

    let urlString = "localhost://demo.unified-streaming.com/k8s/features/stable/video/tears-of-steel/tears-of-steel.ism/.m3u8"
    guard let url = URL(string: urlString) else {
        print("Invalid URL")
        return
    }

    // Create cookie to prepare the player asset
    let cookie = HTTPCookie(properties: [
        .name: "dazn-token",
        .value: "cookie value",
        .domain: url.host() ?? "",
        .path: "/",
        .discard: true
    ])

    // Create cookie key to set on the AVURLAsset
    let options = [AVURLAssetHTTPCookiesKey: [cookie]]
    let asset = AVURLAsset(url: url, options: options)

    proxy = ReverseProxyResourceLoader()
    proxy?.cookie = "exampleCookie"

    // Set resource loader delegate to monitor the chunks
    asset.resourceLoader.setDelegate(proxy, queue: DispatchQueue.global())

    // Load asset keys asynchronously (e.g., "playable")
    let keys = ["playable"]

    // Initialize the AVPlayer with the URL
    let playerItem = AVPlayerItem(asset: asset)
    self.player = AVPlayer(playerItem: playerItem)

    playerItem.addObserver(self, forKeyPath: "status", options: [.new, .initial], context: nil)
    // Observe 'error' property (if needed)
    playerItem.addObserver(self, forKeyPath: "error", options: [.new], context: nil)

    let contentKeySessionDelegate = ContentKeyDelegate()
    // Initialize AVContentKeySession
    let contentKeySession = AVContentKeySession(keySystem: .clearKey)
    self.contentKeySession = contentKeySession
    contentKeySession.setDelegate(contentKeySessionDelegate, queue: DispatchQueue.main)

    // Associate the asset with the content key session
    contentKeySession.addContentKeyRecipient(asset)

    // Create a layer for the AVPlayer and add it to the view
    playerLayer = AVPlayerLayer(player: player)
    playerLayer?.frame = view.bounds
    playerLayer?.videoGravity = .resizeAspect
    if let playerLayer = playerLayer {
        view.layer.addSublayer(playerLayer)
    }

    NotificationCenter.default.addObserver(
        self,
        selector: #selector(playerDidFinishPlaying),
        name: .AVPlayerItemDidPlayToEndTime,
        object: player?.currentItem
    )

    // Start playback
    player?.play()
}

// Update cookie whenever needed
func updateCookie() {
    proxy?.cookie = "update exampleCookie"
}

@objc private func playerDidFinishPlaying(notification: Notification) {
    print("Playback finished!")
    // Optionally, handle end-of-playback actions here
}
```

```swift
//
//  ReverseProxyResourceLoader.swift
//  HLSDemo
//
//  Created by Gajje.Venkatarao on 12/12/24.
//

import Foundation
import AVKit
import AVFoundation

class ReverseProxyResourceLoader: NSObject, AVAssetResourceLoaderDelegate {
    var cookie = ""

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader, shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        resourceLoader.preloadsEligibleContentKeys = true

        guard let interceptedURL = loadingRequest.request.url else {
            loadingRequest.finishLoading(with: NSError(domain: "ReverseProxy", code: -1, userInfo: [NSLocalizedDescriptionKey: "Invalid URL"]))
            return false
        }

        if interceptedURL.scheme == "skd" {
            print("Token updated Cookie:", interceptedURL)
            return false
        }

        var components = URLComponents(url: interceptedURL, resolvingAgainstBaseURL: false)
        components?.scheme = "https" // Replace with the original scheme

        guard let originalURL = components?.url else {
            loadingRequest.finishLoading(with: NSError(domain: "ReverseProxy", code: -1, userInfo: [NSLocalizedDescriptionKey: "Failed to map URL"]))
            return false
        }

        var request = URLRequest(url: originalURL)
        request.httpMethod = "GET"

        if let storeCookie = HTTPCookie(properties: [
            .name: "dazn-token",
            .value: cookie,
            .domain: originalURL.host ?? "",
            .path: "/",
            .discard: true
        ]) {
            HTTPCookieStorage.shared.setCookie(storeCookie)
        }

        let headers = loadingRequest.request.allHTTPHeaderFields ?? [:]
        for (key, value) in headers {
            request.addValue(value, forHTTPHeaderField: key)
        }
        request.addValue(cookie, forHTTPHeaderField: "Cookie")

        URLSession.shared.configuration.httpShouldSetCookies = true
        request.httpShouldHandleCookies = true

        let task = URLSession.shared.dataTask(with: originalURL) { data, response, error in
            if let error = error {
                print("Error Received:", error)
                loadingRequest.finishLoading(with: error)
                return
            }
            print(originalURL)
            guard let data = data, let url = response?.url else {
                loadingRequest.finishLoading(with: NSError(domain: "ReverseProxy", code: -1, userInfo: [NSLocalizedDescriptionKey: "No data received"]))
                return
            }
            loadingRequest.dataRequest?.respond(with: data)
            loadingRequest.finishLoading()
        }
        task.resume()
        return true
    }
}
```

Example project
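One detail worth checking in the delegate above: the data task is created with originalURL rather than with the request that was just populated with the copied headers and the Cookie header, so those headers are never actually sent. A sketch of the presumably intended call:

```swift
// Send the configured URLRequest (which carries the Cookie header),
// not the bare URL, which drops all custom headers.
let task = URLSession.shared.dataTask(with: request) { data, response, error in
    // ... same completion handler as above ...
}
task.resume()
```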
Replies: 0 · Boosts: 0 · Views: 511 · Dec ’24

Get channels count for an audio stream in AVPlayer
In an m3u8 manifest, audio EXT-X-MEDIA tags usually contain a CHANNELS attribute carrying the audio channel count, like so:

```
#EXT-X-MEDIA:TYPE=AUDIO,URI="audio_clear_eng_stereo.m3u8",GROUP-ID="default-audio-group",LANGUAGE="en",NAME="stream_5",AUTOSELECT=YES,CHANNELS="2"
```

Is it possible to get this info from AVPlayer, AVMediaSelectionOption, or some related API?
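Newer SDKs appear to surface this through AVAssetVariant's audio attributes (the channelCount property on the rendition-specific attributes; availability is roughly iOS/tvOS 17, so treat this sketch as an assumption to verify):

```swift
import AVFoundation

// Log the channel count advertised for each audio rendition
// of each variant in an HLS asset.
func logAudioChannelCounts(for asset: AVURLAsset) async throws {
    let variants = try await asset.load(.variants)
    guard let group = try await asset.loadMediaSelectionGroup(for: .audible) else { return }

    for variant in variants {
        for option in group.options {
            if let count = variant.audioAttributes?
                .renditionSpecificAttributes(for: option)?
                .channelCount {
                print("\(option.displayName): \(count) channel(s)")
            }
        }
    }
}
```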
Replies: 2 · Boosts: 0 · Views: 576 · Dec ’24

Use AVPlayer for multiple videos
I'm developing a tutorial-style tvOS app with multiple videos. The examples I've seen so far deal with only one video: defining the player and source URL before the body view,

```swift
let avPlayer = AVPlayer(url: URL(string: "https://domain.com/.../.../video.mp4")!)
```

and then displaying the video in the body view,

```swift
VideoPlayer(player: avPlayer)
```

which allows options such as stop/start etc. When I try something similar with a video title passed into the view, I can't define the player with this title variable:

```swift
var vTitle: String
var avPlayer = AVPlayer(url: URL(string: "https://domain.com/.../.../" + vTitle + ".mp4")!)

var body: some View {
```

I get an error that vTitle can't be used in the URL above the body view. Any thoughts or suggestions? Thanks
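The underlying issue is that a stored property's default value cannot reference another instance property, since self is not available at that point. A minimal sketch of the usual workaround, constructing the player in the view's initializer (the view name and URL pattern are illustrative):

```swift
import SwiftUI
import AVKit

struct TutorialVideoView: View {
    let vTitle: String
    @State private var avPlayer: AVPlayer

    init(vTitle: String) {
        self.vTitle = vTitle
        // Build the URL from the title here, where vTitle is in scope.
        let url = URL(string: "https://domain.com/videos/\(vTitle).mp4")!
        _avPlayer = State(initialValue: AVPlayer(url: url))
    }

    var body: some View {
        VideoPlayer(player: avPlayer)
    }
}
```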
Replies: 1 · Boosts: 0 · Views: 757 · Dec ’24

iOS Radio App: Need to extract and stream audio-only from HLS streams with video content
I'm developing an iOS radio app that plays various HLS streams. The challenge is that some stations broadcast HLS streams containing both audio and video (example: https://svs.itworkscdn.net/smcwatarlive/smcwatar/chunks.m3u8), but I want to:

Extract and play only the audio track
Support AirPlay for audio-only streaming
Minimize data usage by not downloading video content

Technical details: iOS 17+, Swift 5.9, using AVFoundation for playback. The current implementation uses AVPlayer with AVPlayerItem.

Current code structure:

```swift
class StreamPlayer: ObservableObject {
    @Published var isPlaying = false
    private var player: AVPlayer?
    private var playerItem: AVPlayerItem?

    func playStream(url: URL) {
        let asset = AVURLAsset(url: url)
        playerItem = AVPlayerItem(asset: asset)
        player = AVPlayer(playerItem: playerItem)
        player?.play()
    }
}
```

Stream analysis: when analyzing the stream with FFmpeg:

```
Input #0, hls, from 'https://svs.itworkscdn.net/smcwatarlive/smcwatar/chunks.m3u8':
  Stream #0:0: Video: h264, yuv420p(tv, bt709), 1920x1080 [SAR 1:1 DAR 16:9], 25 fps
  Stream #0:1: Audio: aac, 44100 Hz, stereo, fltp
```

Attempted solutions. Using MobileFFmpeg:

```swift
let command = [
    "-i", streamUrl,
    "-vn",
    "-acodec", "aac",
    "-ac", "2",
    "-ar", "44100",
    "-b:a", "128k",
    "-f", "mpegts",
    "udp://127.0.0.1:12345"
].joined(separator: " ")

ffmpegProcess = MobileFFmpeg.execute(command)
```

Issue: while FFmpeg successfully extracts audio, playback through AVPlayer doesn't work reliably.

Tried using HLS output:

```swift
let command = [
    "-i", streamUrl,
    "-vn",
    "-acodec", "aac",
    "-ac", "2",
    "-ar", "44100",
    "-b:a", "128k",
    "-f", "hls",
    "-hls_time", "2",
    "-hls_list_size", "3",
    outputUrl.path
]
```

Issue: creates temporary files but faces synchronization issues with live streams.

Requirements:

Real-time audio extraction from the HLS stream
Maintain live streaming capabilities
Full AirPlay support
Minimal data usage (avoid downloading video content)
Handle network interruptions gracefully

Questions:

What's the most efficient way to extract only audio from an HLS stream in real time?
Is there a way to tell AVPlayer to ignore video tracks completely?
Are there better alternatives to FFmpeg for this specific use case?
What's the recommended approach for handling AirPlay with modified streams?

Any guidance or alternative approaches would be greatly appreciated. Thank you!
Replies: 1 · Boosts: 1 · Views: 440 · Dec ’24

Mediastreamvalidator Error: Invalid URL
I'm trying to validate a low-latency HLS fragmented MP4 setup with mediastreamvalidator. I get the following error:

```
Error: Invalid URL
Detail: '(null)' is not a valid URL
Source: mediaplaylistURL.m3u8 - segmentURL.mp4
```

The mediastreamvalidator version is 1.23.14. What does this error mean?
Replies: 1 · Boosts: 0 · Views: 373 · Dec ’24

Decode HLS livestream with VideoToolbox
Hi, I'm trying to decode an HLS livestream with VideoToolbox. The CMSampleBuffer is successfully created (OSStatus == noErr). When I enqueue the CMSampleBuffer to an AVSampleBufferDisplayLayer, the view doesn't display anything and the status of the AVSampleBufferDisplayLayer is 1 (rendering). When I use a VTDecompressionSession to convert the CMSampleBuffer to a CVPixelBuffer, the VTDecompressionOutputCallback returns -8969 (bad data error). What do I need to fix in my code? Am I parsing the data from the segment incorrectly when building the CMSampleBuffer?

```swift
let segmentData = try await downloadSegment(from: segment.url)
let (sps, pps, idr) = try parseH264FromTSSegment(tsData: segmentData)
if self.formatDescription == nil {
    self.formatDescription = try CMFormatDescription(h264ParameterSets: [sps, pps])
}
if let sampleBuffer = try createSampleBuffer(from: idr, segment: segment) {
    try self.decodeSampleBuffer(sampleBuffer)
}

func parseH264FromTSSegment(tsData: Data) throws -> (sps: Data, pps: Data, idr: Data) {
    let tsSize = 188
    var pesData = Data()
    for i in stride(from: 0, to: tsData.count, by: tsSize) {
        let tsPacket = tsData.subdata(in: i..<min(i + tsSize, tsData.count))
        guard let payload = extractPayloadFromTSPacket(tsPacket) else { continue }
        pesData.append(payload)
    }

    let nalUnits = parseNalUnits(from: pesData)

    var sps: Data?
    var pps: Data?
    var idr: Data?

    for nalUnit in nalUnits {
        guard let firstByte = nalUnit.first else { continue }
        let nalType = firstByte & 0x1F
        switch nalType {
        case 7: // SPS
            sps = nalUnit
        case 8: // PPS
            pps = nalUnit
        case 5: // IDR
            idr = nalUnit
        default:
            break
        }
        if sps != nil, pps != nil, idr != nil { break }
    }

    guard let validSPS = sps, let validPPS = pps, let validIDR = idr else {
        throw NSError()
    }
    return (validSPS, validPPS, validIDR)
}

func extractPayloadFromTSPacket(_ tsPacket: Data) -> Data? {
    let syncByte: UInt8 = 0x47
    guard tsPacket.count == 188, tsPacket[0] == syncByte else { return nil }

    let payloadStart = (tsPacket[1] & 0x40) != 0
    let adaptationFieldControl = (tsPacket[3] & 0x30) >> 4
    var payloadOffset = 4

    if adaptationFieldControl == 2 || adaptationFieldControl == 3 {
        let adaptationFieldLength = Int(tsPacket[4])
        payloadOffset += 1 + adaptationFieldLength
    }

    guard adaptationFieldControl == 1 || adaptationFieldControl == 3 else { return nil }

    let payload = tsPacket.subdata(in: payloadOffset..<tsPacket.count)
    return payloadStart ? payload : nil
}

func parseNalUnits(from h264Data: Data) -> [Data] {
    let startCode = Data([0x00, 0x00, 0x00, 0x01])
    var nalUnits: [Data] = []
    var searchRange = h264Data.startIndex..<h264Data.endIndex

    while let range = h264Data.range(of: startCode, options: [], in: searchRange) {
        let nextStart = h264Data.range(of: startCode, options: [], in: range.upperBound..<h264Data.endIndex)?.lowerBound ?? h264Data.endIndex
        let nalUnit = h264Data.subdata(in: range.upperBound..<nextStart)
        nalUnits.append(nalUnit)
        searchRange = nextStart..<h264Data.endIndex
    }
    return nalUnits
}

private func createSampleBuffer(from data: Data, segment: HLSSegment) throws -> CMSampleBuffer? {
    var blockBuffer: CMBlockBuffer?
    let alignedData = UnsafeMutableRawPointer.allocate(byteCount: data.count, alignment: MemoryLayout<UInt8>.alignment)
    data.copyBytes(to: alignedData.assumingMemoryBound(to: UInt8.self), count: data.count)

    let blockStatus = CMBlockBufferCreateWithMemoryBlock(
        allocator: kCFAllocatorDefault,
        memoryBlock: alignedData,
        blockLength: data.count,
        blockAllocator: nil,
        customBlockSource: nil,
        offsetToData: 0,
        dataLength: data.count,
        flags: 0,
        blockBufferOut: &blockBuffer
    )
    guard blockStatus == kCMBlockBufferNoErr, let validBlockBuffer = blockBuffer else {
        alignedData.deallocate()
        throw NSError()
    }

    var sampleBuffer: CMSampleBuffer?
    var timing = [calculateTiming(for: segment)]
    var sampleSizes = [data.count]

    let sampleStatus = CMSampleBufferCreate(
        allocator: kCFAllocatorDefault,
        dataBuffer: validBlockBuffer,
        dataReady: true,
        makeDataReadyCallback: nil,
        refcon: nil,
        formatDescription: formatDescription,
        sampleCount: 1,
        sampleTimingEntryCount: 1,
        sampleTimingArray: &timing,
        sampleSizeEntryCount: sampleSizes.count,
        sampleSizeArray: &sampleSizes,
        sampleBufferOut: &sampleBuffer
    )
    guard sampleStatus == noErr else {
        alignedData.deallocate()
        throw NSError()
    }
    return sampleBuffer
}

private func decodeSampleBuffer(_ sampleBuffer: CMSampleBuffer) throws {
    guard let formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer) else {
        throw NSError()
    }
    if decompressionSession == nil {
        try setupDecompressionSession(formatDescription: formatDescription)
    }
    guard let session = decompressionSession else {
        throw NSError()
    }

    let flags: VTDecodeFrameFlags = [._EnableAsynchronousDecompression, ._EnableTemporalProcessing]
    var flagOut = VTDecodeInfoFlags()

    let status = VTDecompressionSessionDecodeFrame(
        session,
        sampleBuffer: sampleBuffer,
        flags: flags,
        frameRefcon: nil,
        infoFlagsOut: nil)
    if status != noErr {
        throw NSError()
    }
}

private func setupDecompressionSession(formatDescription: CMFormatDescription) throws {
    self.formatDescription = formatDescription
    if let session = decompressionSession {
        VTDecompressionSessionInvalidate(session)
        self.decompressionSession = nil
    }

    var decompressionSession: VTDecompressionSession?
    var callback = VTDecompressionOutputCallbackRecord(
        decompressionOutputCallback: decompressionOutputCallback,
        decompressionOutputRefCon: Unmanaged.passUnretained(self).toOpaque())

    let status = VTDecompressionSessionCreate(
        allocator: kCFAllocatorDefault,
        formatDescription: formatDescription,
        decoderSpecification: nil,
        imageBufferAttributes: nil,
        outputCallback: &callback,
        decompressionSessionOut: &decompressionSession
    )
    if status != noErr {
        throw NSError()
    }
    self.decompressionSession = decompressionSession
}

let decompressionOutputCallback: VTDecompressionOutputCallback = { (
    decompressionOutputRefCon,
    sourceFrameRefCon,
    status,
    infoFlags,
    imageBuffer,
    presentationTimeStamp,
    presentationDuration
) in
    guard status == noErr else {
        print("Callback: \(status)")
        return
    }
    if let imageBuffer = imageBuffer {
    }
}
```
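One plausible cause of the -8969 worth flagging: a format description created from H.264 parameter sets implies AVCC sample layout, where every NAL unit is preceded by a 4-byte big-endian length, while the code above copies the raw Annex-B payload into the block buffer. A minimal conversion sketch (assuming a 4-byte NAL length header):

```swift
import Foundation

// Prefix a raw NAL unit with its 4-byte big-endian length, as
// expected by decoders fed via a CMFormatDescription built from
// h264ParameterSets (AVCC layout, not Annex-B start codes).
func avccSample(from nalUnit: Data) -> Data {
    var length = UInt32(nalUnit.count).bigEndian
    var sample = Data(bytes: &length, count: MemoryLayout<UInt32>.size)
    sample.append(nalUnit)
    return sample
}
```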
Replies: 1 · Boosts: 0 · Views: 641 · Dec ’24

HLS CMAF/fMP4 CENC CBCS pattern encryption
Hello, I'm writing a program to create CMAF-compliant HLS files with encryption. I have a copy of ISO/IEC 23001-7:2023 and am attempting to follow the spec. I am using the 1:9 pattern encryption with CBCS, so for every 16 bytes of encrypted NAL unit data (of types 1 and 5), there are 144 bytes of clear data. When testing my output in Safari with 'identity' keys (Quickly Diagnosing Content Key and IV Issues), Safari requests the identity key from my test server and the first few bytes of the CMAF renditions, but will not play, and the console gives away no clues to the error. I am setting the subsample bytes-of-clear/protected data in the senc boxes. What I'm not sure of is whether HLS/Safari/iOS acknowledges the senc/saiz/saio boxes of the MP4. Other third-party packagers, such as Bento4, suggest that they do not: "those clients ignore the explicit encryption layout metadata found in saio/saiz boxes, and instead rely purely on the video slice header size to determine the portions of the sample that is encrypted". So now I'm fairly sure I need to determine the video slice header size and apply the protected blocks from that point on. My question is: is that all there is to it? And is there a better way to debug my output? mediastreamvalidator will only work against unencrypted variants (which I'm outputting okay). Thanks in advance!
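For reference, a sketch of how the 1:9 CBCS pattern maps onto a single NAL unit once the clear prefix (slice header plus any additional clear bytes) is known; clearPrefixLength is an assumed input, not something this snippet derives:

```swift
import Foundation

// For CBCS 1:9 pattern encryption, the region after the clear prefix
// is processed in 160-byte stripes: the first 16-byte block of each
// stripe is encrypted and the remaining 144 bytes stay clear. Any
// trailing partial block is also left clear.
func encryptedRanges(nalLength: Int, clearPrefixLength: Int) -> [Range<Int>] {
    var ranges: [Range<Int>] = []
    var offset = clearPrefixLength
    while offset + 16 <= nalLength {
        ranges.append(offset..<offset + 16)  // 1 encrypted block
        offset += 160                        // + 9 clear blocks
    }
    return ranges
}
```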
Replies: 0 · Boosts: 0 · Views: 606 · Dec ’24

Mediastreamvalidator Error: Invalid URL
I have a low-latency HLS with fragmented MP4 setup. When I try to validate it with the mediastreamvalidator tool, it gives the following error:

```
Detail: '(null)' is not a valid URL
Source: media playlist URL - segment URL in that playlist
```

What does this error mean?
Replies: 0 · Boosts: 0 · Views: 286 · Dec ’24