Streaming


Deep dive into the technical specifications that influence seamless playback for streaming services, including bitrates, codecs, and caching mechanisms.


Posts under the Streaming subtopic


CoreMediaErrorDomain -12888: Bandwidth down-stepping when using 2sec segment duration
The problem: when using an HLS live stream with AVPlayer on iOS/tvOS, the player first chooses the highest bandwidth, then slowly steps down to the lowest (within 1–3 minutes), eventually steps up again, and then repeats the cycle. The AVPlayer error log sends events:

errorStatusCode: -12888, errorDomain: Optional("CoreMediaErrorDomain"), errorComment: Optional("The operation couldn't be completed. (CoreMediaErrorDomain error -12888 - Playlist File unchanged for longer than 1.5 * target duration)")

We use standard segments in CMAF format, 2 sec duration:

#EXTM3U
#EXT-X-VERSION:6
#EXT-X-TARGETDURATION:2
#EXT-X-MEDIA-SEQUENCE:147065903
#EXT-X-MAP:URI="video_1_4660000_init.mp4?device_profile=cmaf_cbcs_verimatrix_cei%26seg_size=2%26cmaf=2"
#EXT-X-PROGRAM-DATE-TIME:2025-04-30T12:51:07
#EXTINF:2.000,
video_1_4660000_t17460174670001555.mp4?device_profile=cmaf_cbcs_verimatrix_cei%26seg_size=2%26cmaf=2
#EXTINF:2.000,
video_1_4660000_t17460174690001555.mp4?device_profile=cmaf_cbcs_verimatrix_cei%26seg_size=2%26cmaf=2
#EXTINF:2.000,
video_1_4660000_t17460174710001555.mp4?device_profile=cmaf_cbcs_verimatrix_cei%26seg_size=2%26cmaf=2

When using 6 sec segments the player stays stable at the highest bandwidth. Is there a way to avoid this error via AVPlayer or HLS configuration?
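For monitoring, these -12888 events can at least be captured as they arrive, so they can be correlated with the bitrate down-steps; a minimal sketch using AVPlayerItem's error log (the function name is an assumption, and the returned observer token must be retained by the caller):

import AVFoundation

// Print each new error-log event (e.g. CoreMediaErrorDomain -12888) as
// AVPlayer appends it, to line the events up with quality switches.
func observeErrorLog(of item: AVPlayerItem) -> NSObjectProtocol {
    NotificationCenter.default.addObserver(
        forName: AVPlayerItem.newErrorLogEntryNotification,
        object: item,
        queue: .main
    ) { _ in
        guard let event = item.errorLog()?.events.last else { return }
        print("errorStatusCode: \(event.errorStatusCode), domain: \(event.errorDomain), comment: \(event.errorComment ?? "-")")
    }
}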
2 replies · 0 boosts · 217 views · May ’25
FPS Certificate Re-issuance and Validity of Existing Certificate
Keywords: FairPlay, FPS Certificate, DRM, FairPlay Streaming, license server

Hi all,

We are currently using FairPlay Streaming in production and already have an FPS certificate in place. However, the passphrase for the existing FPS certificate has unfortunately been lost. We are now considering reissuing a new FPS certificate, and I would like to confirm a few points before proceeding:

1. If we reissue a new FPS certificate, will the existing certificate be automatically revoked? Or will it remain valid until its original expiration date?
2. Is it possible to have both the newly issued and the existing certificates valid at the same time? In other words, can we serve DRM licenses using either certificate depending on the packaging or client?
3. Are there any caveats or best practices we should be aware of when reissuing an FPS certificate? For example, would existing packaged content become unplayable, or would CDN/packaging server configurations need to be updated carefully?

Since this affects our production environment, we would like to minimize any service disruption or compatibility issues. Unfortunately, when we contacted Apple support directly, we were advised to post this question here in the Forums for additional guidance.

Any advice or experiences would be greatly appreciated! Thank you in advance.
1 reply · 0 boosts · 147 views · Jun ’25
Feature / Workaround wanted: Seamless, Automated AirPlay Screen Streaming on visionOS for Demos
Hello Apple team and developer community,

I am preparing a visionOS app for a fair environment, where we want to automatically stream the current experience to a nearby monitor via AirPlay, without requiring guests or staff to manually interact with the Control Center or AirPlay pickers all the time. The goal is to provide a smooth, frictionless setup so attendees can focus on the demo, not the configuration.

Feature request: a supported API or method to programmatically start/stop AirPlay video streaming (mirroring or external playback) from within a visionOS app, allowing the current experience to be instantly displayed on an external monitor or Apple TV for the audience.

Context and rationale: in a trade fair or exhibition setting, rapid guest turnaround and minimal staff intervention are crucial. Having to manually guide each visitor through AirPlay setup is impractical. As I understand it, AVRoutePickerView can be used for this on iOS/macOS, but it is not available on visionOS (see the sketch of that approach below). Enabling similar automated streaming on visionOS would make the device far more suitable for live demos and public showcases.

Questions:
- Are there any supported workarounds or best practices for enabling automated screen streaming or AirPlay initiation on visionOS in public demo environments that I missed?
- Is Apple considering adding programmatic AirPlay control or accessibility features to support such use cases in future visionOS releases?

Thank you for considering this request! If there are recommended patterns, entitlements, or accessibility solutions we could explore for trade fair scenarios, your guidance would be greatly appreciated.

Best regards, Julian Zürn - IPI, HS Kempten
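For reference, a minimal sketch of the iOS/macOS route-picker approach the post mentions; note it still requires a user tap, and there is no public API to start AirPlay without interaction (the containerView parameter is an assumption):

import AVKit
import UIKit

// The iOS approach referenced above: a system-provided button that lets
// the user choose an AirPlay route. It has no visionOS counterpart.
func addRoutePicker(to containerView: UIView) {
    let picker = AVRoutePickerView(frame: CGRect(x: 0, y: 0, width: 44, height: 44))
    picker.prioritizesVideoDevices = true // surface video-capable routes first
    containerView.addSubview(picker)
}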
2 replies · 0 boosts · 302 views · 1w
Fairplay 4k streams decode errors - no video, audio only
A 4K HLS live stream is not displaying any video, but audio streams fine. I'm getting the following errors in the console, which show that it fails to decode every frame. Can I get some help as to what these error codes refer to and why decoding would fail?

08:30:42.675879-0800 videocodecd AppleAVD: AppleAVDDecodeFrameInternal(): avdDec - Frame# 3588, DecodeFrame failed with error: 0x196
08:30:42.675908-0800 videocodecd AppleAVD: AppleAVDDisplayCallback(): Asking fig to drop frame # 3588 with err -12909 - internalStatus: 315
08:30:42.697412-0800 videocodecd AppleAVD: AppleAVDDecodeFrameResponse(): Frame# 3589 DecodeFrame failed with error 0x00000196
08:30:42.697876-0800 videocodecd AppleAVD: AppleAVDDecodeFrameInternal(): failed - error: 406
0 replies · 0 boosts · 405 views · Dec ’24
Get channels count for an audio stream in AVPlayer
In an m3u8 manifest, audio EXT-X-MEDIA tags usually contain a CHANNELS attribute holding the audio channel count, like so:

#EXT-X-MEDIA:TYPE=AUDIO,URI="audio_clear_eng_stereo.m3u8",GROUP-ID="default-audio-group",LANGUAGE="en",NAME="stream_5",AUTOSELECT=YES,CHANNELS="2"

Is it possible to get this info from AVPlayer, AVMediaSelectionOption, or some related API?
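The manifest attribute itself isn't exposed, but for the track that is actually playing, the channel count can often be read from its audio format description; a hedged sketch (assetTrack may not be populated for every HLS item, so treat a nil result as "unknown"):

import AVFoundation
import CoreMedia

// Read mChannelsPerFrame from the current item's audio track. This reflects
// what is being decoded, not the manifest's CHANNELS value.
func currentAudioChannelCount(of item: AVPlayerItem) -> Int? {
    for track in item.tracks {
        guard let assetTrack = track.assetTrack, assetTrack.mediaType == .audio else { continue }
        let descriptions = assetTrack.formatDescriptions as! [CMFormatDescription]
        for desc in descriptions {
            if let asbd = CMAudioFormatDescriptionGetStreamBasicDescription(desc) {
                return Int(asbd.pointee.mChannelsPerFrame)
            }
        }
    }
    return nil
}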
2 replies · 0 boosts · 616 views · Dec ’24
Decode HLS livestream with VideoToolbox
Hi, I'm trying to decode an HLS livestream with VideoToolbox. The CMSampleBuffer is successfully created (OSStatus == noErr). When I enqueue the CMSampleBuffer to an AVSampleBufferDisplayLayer, the view doesn't display anything, and the status of the AVSampleBufferDisplayLayer is 1 (rendering). When I use a VTDecompressionSession to convert the CMSampleBuffer to a CVPixelBuffer, the VTDecompressionOutputCallback returns -8969 (bad data error). What do I need to fix in my code? Do I incorrectly parse the data from the segment for the CMSampleBuffer?

let segmentData = try await downloadSegment(from: segment.url)
let (sps, pps, idr) = try parseH264FromTSSegment(tsData: segmentData)
if self.formatDescription == nil {
    self.formatDescription = try CMFormatDescription(h264ParameterSets: [sps, pps])
}
if let sampleBuffer = try createSampleBuffer(from: idr, segment: segment) {
    try self.decodeSampleBuffer(sampleBuffer)
}

func parseH264FromTSSegment(tsData: Data) throws -> (sps: Data, pps: Data, idr: Data) {
    let tsSize = 188
    var pesData = Data()
    for i in stride(from: 0, to: tsData.count, by: tsSize) {
        let tsPacket = tsData.subdata(in: i..<min(i + tsSize, tsData.count))
        guard let payload = extractPayloadFromTSPacket(tsPacket) else { continue }
        pesData.append(payload)
    }
    let nalUnits = parseNalUnits(from: pesData)
    var sps: Data?
    var pps: Data?
    var idr: Data?
    for nalUnit in nalUnits {
        guard let firstByte = nalUnit.first else { continue }
        let nalType = firstByte & 0x1F
        switch nalType {
        case 7: // SPS
            sps = nalUnit
        case 8: // PPS
            pps = nalUnit
        case 5: // IDR
            idr = nalUnit
        default:
            break
        }
        if sps != nil, pps != nil, idr != nil { break }
    }
    guard let validSPS = sps, let validPPS = pps, let validIDR = idr else {
        throw NSError()
    }
    return (validSPS, validPPS, validIDR)
}

func extractPayloadFromTSPacket(_ tsPacket: Data) -> Data? {
    let syncByte: UInt8 = 0x47
    guard tsPacket.count == 188, tsPacket[0] == syncByte else { return nil }
    let payloadStart = (tsPacket[1] & 0x40) != 0
    let adaptationFieldControl = (tsPacket[3] & 0x30) >> 4
    var payloadOffset = 4
    if adaptationFieldControl == 2 || adaptationFieldControl == 3 {
        let adaptationFieldLength = Int(tsPacket[4])
        payloadOffset += 1 + adaptationFieldLength
    }
    guard adaptationFieldControl == 1 || adaptationFieldControl == 3 else { return nil }
    let payload = tsPacket.subdata(in: payloadOffset..<tsPacket.count)
    return payloadStart ? payload : nil
}

func parseNalUnits(from h264Data: Data) -> [Data] {
    let startCode = Data([0x00, 0x00, 0x00, 0x01])
    var nalUnits: [Data] = []
    var searchRange = h264Data.startIndex..<h264Data.endIndex
    while let range = h264Data.range(of: startCode, options: [], in: searchRange) {
        let nextStart = h264Data.range(of: startCode, options: [], in: range.upperBound..<h264Data.endIndex)?.lowerBound ?? h264Data.endIndex
        let nalUnit = h264Data.subdata(in: range.upperBound..<nextStart)
        nalUnits.append(nalUnit)
        searchRange = nextStart..<h264Data.endIndex
    }
    return nalUnits
}

private func createSampleBuffer(from data: Data, segment: HLSSegment) throws -> CMSampleBuffer? {
    var blockBuffer: CMBlockBuffer?
    let alignedData = UnsafeMutableRawPointer.allocate(byteCount: data.count, alignment: MemoryLayout<UInt8>.alignment)
    data.copyBytes(to: alignedData.assumingMemoryBound(to: UInt8.self), count: data.count)
    let blockStatus = CMBlockBufferCreateWithMemoryBlock(
        allocator: kCFAllocatorDefault,
        memoryBlock: alignedData,
        blockLength: data.count,
        blockAllocator: nil,
        customBlockSource: nil,
        offsetToData: 0,
        dataLength: data.count,
        flags: 0,
        blockBufferOut: &blockBuffer
    )
    guard blockStatus == kCMBlockBufferNoErr, let validBlockBuffer = blockBuffer else {
        alignedData.deallocate()
        throw NSError()
    }
    var sampleBuffer: CMSampleBuffer?
    var timing = [calculateTiming(for: segment)]
    var sampleSizes = [data.count]
    let sampleStatus = CMSampleBufferCreate(
        allocator: kCFAllocatorDefault,
        dataBuffer: validBlockBuffer,
        dataReady: true,
        makeDataReadyCallback: nil,
        refcon: nil,
        formatDescription: formatDescription,
        sampleCount: 1,
        sampleTimingEntryCount: 1,
        sampleTimingArray: &timing,
        sampleSizeEntryCount: sampleSizes.count,
        sampleSizeArray: &sampleSizes,
        sampleBufferOut: &sampleBuffer
    )
    guard sampleStatus == noErr else {
        alignedData.deallocate()
        throw NSError()
    }
    return sampleBuffer
}

private func decodeSampleBuffer(_ sampleBuffer: CMSampleBuffer) throws {
    guard let formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer) else {
        throw NSError()
    }
    if decompressionSession == nil {
        try setupDecompressionSession(formatDescription: formatDescription)
    }
    guard let session = decompressionSession else { throw NSError() }
    let flags: VTDecodeFrameFlags = [._EnableAsynchronousDecompression, ._EnableTemporalProcessing]
    var flagOut = VTDecodeInfoFlags()
    let status = VTDecompressionSessionDecodeFrame(
        session,
        sampleBuffer: sampleBuffer,
        flags: flags,
        frameRefcon: nil,
        infoFlagsOut: nil)
    if status != noErr { throw NSError() }
}

private func setupDecompressionSession(formatDescription: CMFormatDescription) throws {
    self.formatDescription = formatDescription
    if let session = decompressionSession {
        VTDecompressionSessionInvalidate(session)
        self.decompressionSession = nil
    }
    var decompressionSession: VTDecompressionSession?
    var callback = VTDecompressionOutputCallbackRecord(
        decompressionOutputCallback: decompressionOutputCallback,
        decompressionOutputRefCon: Unmanaged.passUnretained(self).toOpaque())
    let status = VTDecompressionSessionCreate(
        allocator: kCFAllocatorDefault,
        formatDescription: formatDescription,
        decoderSpecification: nil,
        imageBufferAttributes: nil,
        outputCallback: &callback,
        decompressionSessionOut: &decompressionSession
    )
    if status != noErr { throw NSError() }
    self.decompressionSession = decompressionSession
}

let decompressionOutputCallback: VTDecompressionOutputCallback = { (
    decompressionOutputRefCon, sourceFrameRefCon, status, infoFlags,
    imageBuffer, presentationTimeStamp, presentationDuration
) in
    guard status == noErr else {
        print("Callback: \(status)")
        return
    }
    if let imageBuffer = imageBuffer {
    }
}
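One detail worth checking, offered as a hedged guess rather than a confirmed fix: CMFormatDescription(h264ParameterSets:) implies AVCC (length-prefixed) sample data, while parseNalUnits returns raw Annex B payloads, so the IDR data may need a 4-byte length prefix before it goes into the block buffer. A minimal sketch of that conversion (the helper name is hypothetical):

// Hypothetical helper: prepend a 4-byte big-endian length to a NAL unit
// (start code already stripped) to produce the AVCC layout that a format
// description built from h264ParameterSets expects.
func avccData(fromNalUnit nalUnit: Data) -> Data {
    var length = UInt32(nalUnit.count).bigEndian
    var avcc = Data(bytes: &length, count: MemoryLayout<UInt32>.size)
    avcc.append(nalUnit)
    return avcc
}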
1 reply · 0 boosts · 661 views · Dec ’24
Mediastreamvalidator Error: Invalid URL
I have a low-latency HLS setup with fragmented MP4. When I try to validate it with the mediastreamvalidator tool, it gives the following error:

Detail: '(null)' is not a valid URL
Source: media playlist URL - segment URL in that playlist

What does that error mean?
0 replies · 0 boosts · 299 views · Dec ’24
HLS CMAF/fMP4 CENC CBCS pattern encryption
Hello, I'm writing a program to create CMAF-compliant HLS files, with encryption. I have a copy of ISO/IEC 23001-7:2023 to attempt to follow the spec. I am following the 1:9 pattern encryption using CBCS, so for every 16 bytes of encrypted NAL unit data (of types 1 and 5), there are 144 bytes of clear data.

When testing my output in Safari with 'identity' keys ("Quickly Diagnosing Content Key and IV Issues"), Safari will request the identity key from my test server and the first few bytes of the CMAF renditions, but will not play, and the console gives away no clues to the error. I am setting the subsample bytes-of-clear/protected data in the senc boxes.

What I'm not sure of is whether HLS/Safari/iOS acknowledges the senc/saiz/saio boxes of the MP4. Other third-party packagers, such as Bento4, suggest that they do not: "those clients ignore the explicit encryption layout metadata found in saio/saiz boxes, and instead rely purely on the video slice header size to determine the portions of the sample that is encrypted."

So now I'm fairly sure I need to decipher the video slice header size and apply the protected blocks from that point on. My question is: is that all there is to it? And is there a better way to debug my output? mediastreamvalidator will only work against unencrypted variants (which I'm outputting okay). Thanks in advance!
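For concreteness, a sketch of the 1:9 cbcs stripe described above, assuming the protected range (everything past the slice header) has already been located; the actual AES-CBC step is left to a caller-supplied closure since key/IV handling is out of scope here:

// Encrypt the first 16-byte block of every 160-byte stripe (1 encrypted
// block : 9 clear blocks), leaving any partial trailing block clear.
func applyCbcsPattern(to sample: inout Data,
                      protectedRange: Range<Int>,
                      encryptBlock: (inout Data, Range<Int>) -> Void) {
    let encryptedBlockSize = 16
    let patternLength = encryptedBlockSize * 10 // 16 encrypted + 144 clear bytes
    var offset = protectedRange.lowerBound
    while offset + encryptedBlockSize <= protectedRange.upperBound {
        encryptBlock(&sample, offset..<(offset + encryptedBlockSize))
        offset += patternLength
    }
}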
0 replies · 0 boosts · 631 views · Dec ’24
Unknown error -12881 when using AVAssetResourceLoader
We need to change the cookie every 120 seconds while playing. In AVPlayer we can't modify the cookie after initialization, so we followed the approach of using a resource loader delegate to pass the cookie as a header value.

What I notice is that the playlist file (.m3u8) gets downloaded correctly, and some chunks of the media file (.m4a) also get downloaded. I know the .ts file is downloaded because I can see the GET request completing on the web server with status 200. I also set a breakpoint at the following line:

loadingRequest.dataRequest?.respond(with: data)

and immediately got an error from AVPlayer: "The operation could not be completed. An unknown error occurred (-12881)" from Core Media. I need confirmation on why I am unable to load HLS using the resource loader. Is it possible to update the cookie value continuously while playing in AVPlayer?

override func viewDidLoad() {
    super.viewDidLoad()

    let urlString = "localhost://demo.unified-streaming.com/k8s/features/stable/video/tears-of-steel/tears-of-steel.ism/.m3u8"
    guard let url = URL(string: urlString) else {
        print("Invalid URL")
        return
    }

    // Create cookie to prepare for the player asset
    let cookie = HTTPCookie(properties: [
        .name: "dazn-token",
        .value: "cookie value",
        .domain: url.host() ?? "",
        .path: "/",
        .discard: true
    ])

    // Create cookie key to set on the AVURLAsset
    let options = [AVURLAssetHTTPCookiesKey: [cookie]]
    let asset = AVURLAsset(url: url, options: options)

    proxy = ReverseProxyResourceLoader()
    proxy?.cookie = "exampleCookie"

    // Set resource loader delegate to monitor the chunks
    asset.resourceLoader.setDelegate(proxy, queue: DispatchQueue.global())

    // Load asset keys asynchronously (e.g., "playable")
    let keys = ["playable"]

    // Initialize the AVPlayer with the URL
    let playerItem = AVPlayerItem(asset: asset)
    self.player = AVPlayer(playerItem: playerItem)

    playerItem.addObserver(self, forKeyPath: "status", options: [.new, .initial], context: nil)
    // Observe 'error' property (if needed)
    playerItem.addObserver(self, forKeyPath: "error", options: [.new], context: nil)

    let contentKeySessionDelegate = ContentKeyDelegate()

    // Initialize AVContentKeySession
    let contentKeySession = AVContentKeySession(keySystem: .clearKey)
    self.contentKeySession = contentKeySession
    contentKeySession.setDelegate(contentKeySessionDelegate, queue: DispatchQueue.main)

    // Associate the asset with the content key session
    contentKeySession.addContentKeyRecipient(asset)

    // Create a layer for the AVPlayer and add it to the view
    playerLayer = AVPlayerLayer(player: player)
    playerLayer?.frame = view.bounds
    playerLayer?.videoGravity = .resizeAspect
    if let playerLayer = playerLayer {
        view.layer.addSublayer(playerLayer)
    }

    NotificationCenter.default.addObserver(
        self,
        selector: #selector(playerDidFinishPlaying),
        name: .AVPlayerItemDidPlayToEndTime,
        object: player?.currentItem
    )

    // Start playback
    player?.play()
}

// Update cookie whenever needed
func updateCookie() {
    proxy?.cookie = "update exampleCookie"
}

@objc private func playerDidFinishPlaying(notification: Notification) {
    print("Playback finished!")
    // Optionally, handle end-of-playback actions here
}

//
//  ReverseProxyResourceLoader.swift
//  HLSDemo
//
//  Created by Gajje.Venkatarao on 12/12/24.
//

import Foundation
import AVKit
import AVFoundation

class ReverseProxyResourceLoader: NSObject, AVAssetResourceLoaderDelegate {
    var cookie = ""

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader, shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        resourceLoader.preloadsEligibleContentKeys = true

        guard let interceptedURL = loadingRequest.request.url else {
            loadingRequest.finishLoading(with: NSError(domain: "ReverseProxy", code: -1, userInfo: [NSLocalizedDescriptionKey: "Invalid URL"]))
            return false
        }

        if interceptedURL.scheme == "skd" {
            print("Token updated Cookie:", interceptedURL)
            return false
        }

        var components = URLComponents(url: interceptedURL, resolvingAgainstBaseURL: false)
        components?.scheme = "https" // Replace with the original scheme
        guard let originalURL = components?.url else {
            loadingRequest.finishLoading(with: NSError(domain: "ReverseProxy", code: -1, userInfo: [NSLocalizedDescriptionKey: "Failed to map URL"]))
            return false
        }

        var request = URLRequest(url: originalURL)
        request.httpMethod = "GET"

        if let storeCookie = HTTPCookie(properties: [
            .name: "dazn-token",
            .value: cookie,
            .domain: originalURL.host ?? "",
            .path: "/",
            .discard: true
        ]) {
            HTTPCookieStorage.shared.setCookie(storeCookie)
        }

        let headers = loadingRequest.request.allHTTPHeaderFields ?? [:]
        for (key, value) in headers {
            request.addValue(value, forHTTPHeaderField: key)
        }
        request.addValue(cookie, forHTTPHeaderField: "Cookie")

        URLSession.shared.configuration.httpShouldSetCookies = true
        request.httpShouldHandleCookies = true

        let task = URLSession.shared.dataTask(with: originalURL) { data, response, error in
            if let error = error {
                print("Error Received:", error)
                loadingRequest.finishLoading(with: error)
                return
            }
            print(originalURL)
            guard let data = data, let url = response?.url else {
                loadingRequest.finishLoading(with: NSError(domain: "ReverseProxy", code: -1, userInfo: [NSLocalizedDescriptionKey: "No data received"]))
                return
            }
            loadingRequest.dataRequest?.respond(with: data)
            loadingRequest.finishLoading()
        }
        task.resume()
        return true
    }
}

Example project
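One step the delegate above never performs is filling in the loading request's content information before responding with data; whether this is the cause of -12881 here is not confirmed, but it is a common omission. A hedged sketch, assuming an HTTPURLResponse is in hand and the resource is a playlist:

import AVFoundation
import Foundation

// Describe the resource to AVFoundation before calling respond(with:);
// requests whose content info is never populated often fail.
func fillContentInfo(_ loadingRequest: AVAssetResourceLoadingRequest, response: HTTPURLResponse) {
    guard let info = loadingRequest.contentInformationRequest else { return }
    info.contentType = "public.m3u-playlist" // UTI assumed for an HLS playlist
    info.contentLength = response.expectedContentLength
    info.isByteRangeAccessSupported = true
}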
0 replies · 0 boosts · 577 views · Dec ’24
ApplicationMusicPlayer stops with error after skipping quickly over playlist entries
ApplicationMusicPlayer with a queue created from a playlist crashes at random shortly after skipping back or forth using the controls embedded in the notification, with this error in the console log:

applicationController: xpc service connection interrupted

I've noticed that the issue occurs more frequently the shorter the time between skipping entries. Since ApplicationMusicPlayer runs in a remote process, the main app does not crash, but the music stops playing without any exception, and the playback controls become uninitialized. Here is how I'm initiating the queue:

let entries = playlist
    .with(.entries).entries!
    .map { ApplicationMusicPlayer.Queue.Entry($0) }

ApplicationMusicPlayer.shared.queue = .init(
    entries,
    startingAt: entries.last
)

Please give me some tips on how to solve this.

EDIT: The issue does not occur when navigating quickly through the station.
1 reply · 0 boosts · 496 views · Jan ’25
Unable to generate .p12 file from fairplay.cer
I am reaching out regarding an issue with my Apple FairPlay Streaming certificate. To generate the certificate signing request (CSR), I used the following OpenSSL commands:

openssl genrsa -out private_key.pem 1024
openssl req -new -key private_key.pem -out request.csr

However, according to the guide provided by Apple and instructions from my DRM provider, I should have used:

openssl genrsa -aes256 -out privatekey.pem 1024
openssl req -new -sha1 -key privatekey.pem -out certreq.csr -subj "/CN=SubjectName /OU=OrganizationalUnit /O=Organization /C=US"

I suspect this discrepancy might be causing the issue with my FairPlay certificate. After obtaining the fairplay.cer file and importing it into Keychain Access, I noticed the following:

- When I expand the certificate in Keychain Access, I can only see a public key and no private key.
- As a result, I am unable to export the certificate as a .p12 file, as this option is disabled.

As per my DRM provider's instructions, I need to export the certificate along with the corresponding private key as a .p12 file with a password. Since the private key is not visible in Keychain Access, I am unable to proceed further.

I have read the FairPlay Streaming Overview but could not find any reason why this issue is occurring, or guidance on the procedure to revoke a certificate. Additionally, I came across the terms and conditions, which mentioned reaching out to product-security at Apple for assistance in revoking corrupt certificates. However, despite reaching out, I have not received a response. Any help on how to proceed will be great!
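One observation, hedged: since the key pair was generated with OpenSSL rather than Keychain Access, the private key never entered the keychain at all; it lives only in private_key.pem. If that file still exists, the .p12 can be assembled directly with OpenSSL instead of Keychain Access; a sketch, assuming fairplay.cer is DER-encoded and the file names above:

# Convert the issued certificate to PEM, then bundle it with the private key.
openssl x509 -inform der -in fairplay.cer -out fairplay.pem
openssl pkcs12 -export -out fairplay.p12 -inkey private_key.pem -in fairplay.pem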
0 replies · 0 boosts · 512 views · Jan ’25
The operation couldn’t be completed error during live restart playback in native AV player
In our Apple TV application, we use the native AVPlayer for live playback. During live restart playback, we intermittently encounter an error when the playback timeline approaches the actual live event end time.

Error: The operation couldn't be completed. (CoreMediaErrorDomain error -16839 - Unable to get playlist before long download timer)

Scenario:
- The live event is scheduled from 7:00 AM to 8:00 AM.
- Restart playback begins at 7:20 AM, allowing the user to watch the event from the start while the live stream continues in real time.
- As the restart playback timeline approaches the actual event end time (8:00 AM), AVPlayer displays the error, and playback continues in the background.
1 reply · 0 boosts · 782 views · Jan ’25
Apple TV HDMI Connected device turn off detection problem via HDMI
Hello, we have an HLS streaming app on Apple TV. Our streams are DRM protected. We have a problem when the display device is turned off. For example, a user starts watching our DRM-protected HLS content. After some time, the user turns off the display (it can be a monitor or a TV connected via HDMI). Our app does not detect that the HDMI-connected device has been turned off. Is there any way in Swift to detect that the HDMI-connected device has been turned off?
0 replies · 0 boosts · 405 views · Jan ’25
EXT-X-DISCONTINUITY not properly handled in native iOS and Safari player (AVPlayer), broken playback.
Hi folks,

When doing HLS v6 live streaming with fMP4 chunks, we noticed that when the encoder timestamps slightly drift and an #EXT-X-DISCONTINUITY tag is created in either the audio or the video playlist (in an ABR setup), the tag is not correctly handled by the player, leading to broken playback with a black screen or no audio (depending on which playlist the tag appears in). We noticed this often happens when the tag counts differ between the playlists (e.g. the audio playlist contains 1 tag and the video playlist contains 2 tags, which results in a black screen with audio; see the sketch below). Playing the same "broken" source in Shaka Player instead doesn't break playback at all. Is there a possible fix (existing or upcoming) for AVPlayer?
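For illustration only, a sketch of the mismatch described above, with invented segment names: the audio playlist carries one discontinuity while the video playlist carries two, the shape reported to reproduce the black screen with audio.

audio playlist:
#EXTINF:4.000,
a_100.aac
#EXT-X-DISCONTINUITY
#EXTINF:4.000,
a_101.aac

video playlist:
#EXTINF:4.000,
v_100.mp4
#EXT-X-DISCONTINUITY
#EXTINF:4.000,
v_101.mp4
#EXT-X-DISCONTINUITY
#EXTINF:4.000,
v_102.mp4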
1 reply · 0 boosts · 586 views · Jan ’25
iOS: Want to use USB client mode to send data.
I want my iOS app to be able to use USB client mode to send LiDAR data and camera frames to another device. What are my options for doing this? I've found IOUSBHost for host mode, but I want to see all my options. The device I want driving the bus is a Meta Quest 3 (which, I mention for the sake of clarity, is inherently an Android device). The iPhone is to be used as a sensor hub, sending data to the Quest 3 for further processing. As a workaround, I could let the iPhone drive the bus and have the Quest 3 use Android's accessory mode, which lets other devices drive the USB bus. But there are more USB devices I want to attach for my project, and that approach makes doing so more difficult, so I want to avoid it.
0 replies · 0 boosts · 305 views · Jan ’25
Why do I get .ts and .aac requests to the server when playing a downloaded HLS?
I am developing an app to stream and download DRM-protected HLS videos based on the official FairPlay Streaming Server SDK. When I play a downloaded video, the player requests .ts or .aac segments from the server, even though I have passed the path of the downloaded video to AVURLAsset. As a result, playback fails when the device is offline, such as in airplane mode. This behavior depends on the video's playback duration: it occurs when downloading and playing a video 19 hours or longer, and did not occur for videos of 18 hours. The environment we checked is iOS 18.3. Our workaround for now is to limit playback duration to 18 hours, but if possible, we would like to allow offline playback of videos longer than 19 hours.

Does anyone have any information or a solution to this problem, for example if you have experienced this type of issue, or if you know whether content longer than 19 hours cannot be played offline?

// load
let path = URL(fileURLWithPath: ".../xxx.movpkg") // Path of the downloaded file
videoAsset = AVURLAsset(url: path)
playerItem = AVPlayerItem(asset: videoAsset!)
player.replaceCurrentItem(with: playerItem)

// isPlayableOffline
print("videoAsset.assetCache.isPlayableOffline = \(videoAsset.assetCache.isPlayableOffline)") // true
0 replies · 0 boosts · 405 views · Feb ’25
[Fairplay Streaming] Time Manipulation Detection DRM License Servers
Our team conducted security testing and found a vulnerability in FairPlay license acquisition. Our QA engineer manually changed the device's system date and time (setting it 4 days into the future) and was still able to obtain a license response and initiate playback on an iOS device. On an Android device, however, the license acquisition failed. Can you please tell us whether time-manipulation detection is available in the FairPlay Streaming SDK?
0 replies · 0 boosts · 410 views · Feb ’25