Hi folks,
When doing HLS v6 live streaming with fMP4 chunks, we noticed that when the encoder timestamps drift slightly and an #EXT-X-DISCONTINUITY tag is inserted into either the audio or the video playlist (in an ABR setup), the tag is not handled correctly by the player, leading to broken playback with a black screen or missing audio (depending on which playlist the tag appears in).
We noticed that this often happens when the number of tags differs between the playlists (e.g., if the audio playlist contains one tag and the video playlist contains two, the result is a black screen with audio).
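For illustration, a sketch of the mismatch (segment names and durations are made up): the audio playlist ends up with one discontinuity while the video playlist has two.

audio.m3u8:
#EXTM3U
#EXT-X-VERSION:6
#EXT-X-TARGETDURATION:4
#EXT-X-MAP:URI="audio-init.mp4"
#EXTINF:4.000,
audio-100.m4s
#EXT-X-DISCONTINUITY
#EXTINF:4.000,
audio-101.m4s

video.m3u8:
#EXTM3U
#EXT-X-VERSION:6
#EXT-X-TARGETDURATION:4
#EXT-X-MAP:URI="video-init.mp4"
#EXTINF:4.000,
video-100.m4s
#EXT-X-DISCONTINUITY
#EXTINF:4.000,
video-101.m4s
#EXT-X-DISCONTINUITY
#EXTINF:4.000,
video-102.m4s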
Playing the same "broken" source with Shaka Player instead does not break playback at all.
Is there any fix, existing or upcoming, for AVPlayer?
I want my iOS app to be able to use USB client mode to send LiDAR data and camera frames to another device. What are my options for doing this? I've found IOUSBHost for host mode, but I want to see all my options. The device I want driving the bus is a Meta Quest 3, which, I mention for the sake of clarity, is inherently an Android device. The iPhone is to be used as a sensor hub, sending data to the Quest 3 for further processing.
As a workaround, I could let the iPhone drive the bus and have the Quest 3 use Android's accessory mode, which lets another device drive the USB bus. But there are more USB devices I want to attach for my project, and this approach makes that more difficult, so I want to avoid it.
I am developing an app to stream and download DRM-protected HLS videos based on the official "FairPlay Streaming Server SDK".
When I play the downloaded video, it requests .ts or .aac segments from the server, even though I passed the path of the downloaded video to AVURLAsset.
As a result, playback fails when the device is offline, such as in airplane mode.
This behavior depends on the playback time of the video and occurs when trying to download and play a video with a playback time of 19 hours or more.
It did not occur for videos with a playback time of 18 hours.
The environment we checked is iOS 18.3.
The solution at this time is to limit the video playback time to 18 hours, but if possible, we would like to allow download playback of videos longer than 19 hours.
Does anyone have any information about, or a solution to, this problem? For example, have you experienced this behavior, or do you know whether content longer than 19 hours simply cannot be played offline?
// load
let url = URL(fileURLWithPath: ".../***.movpkg") // Path of the downloaded file (elided); AVURLAsset needs a URL, not a String
let videoAsset = AVURLAsset(url: url)
let playerItem = AVPlayerItem(asset: videoAsset)
player.replaceCurrentItem(with: playerItem)
// isPlayableOffline (assetCache is optional)
print("videoAsset.assetCache?.isPlayableOffline = \(videoAsset.assetCache?.isPlayableOffline ?? false)") // prints true
Our team conducted security testing and found one vulnerability with FairPlay license acquisition.
Our QA engineer manually changed the device's system date and time (setting it 4 days into the future) and was able to successfully obtain a license response and initiate playback on an iOS device. However, on an Android device, the license acquisition failed.
Can you please tell us whether Time Manipulation Detection is available in the FairPlay SDK?
Hello, Apple video engineers.
According to the official documentation, HLS is built on HTTP and traditionally ran on top of TCP. However, with the introduction of HTTP/3, which uses QUIC (running on top of UDP), I would like to clarify the following:
Has the official HLS specification changed in a way that allows it to be considered UDP-based when using HTTP/3? And is it fair to say that HLS supports UDP since the transport can go over HTTP/3 and QUIC?
Would it be more accurate to say that HLS remains HTTP-dependent, and the transport protocol (TCP or QUIC) only determines how HTTP requests are delivered?
My thoughts: since HTTP/3 uses QUIC, which runs over UDP, we still can't say that HLS supports UDP in the classical sense, the way the term applies to RTP, RTSP, or SRT.
I can't play video content with HEVC and DRM. Tested HEVC only: OK. Tested DRM+AVC: OK.
Tested two players (Clappr/Stevie and BitMovin).
Master, variants, and EXT-X-MAPs download OK, DRM keys OK, and then, for instance with BitMovin Player:
[BMP] [Player] [Error] Event: SourceError, Data: {"code":2001,"data":{"message":"The operation couldn’t be completed. (CoreMediaErrorDomain error -12927.)","code":-12927},"message":"Source Error. The operation couldn’t be completed. (CoreMediaErrorDomain error -12927.)","timestamp":1740320663.4505711,"type":"onSourceError"} code: 2001 [Data code: -12927, message: The operation couldn’t be completed. (CoreMediaErrorDomain error -12927.), underlying error: Error Domain=CoreMediaErrorDomain Code=-12927 "(null)"]
4k-master.m3u8.txt
4k.m3u8.txt
4k-audio.m3u8.txt
Hello All,
I am looking for assistance with our FairPlay Streaming (FPS) certificates. We are in the process of migrating to a new video streaming vendor and need to create a new FPS certificate using SDK 4. However, we have reached the limit of allowed FPS certificates in our account and cannot create a new one.
Issue Details:
• We currently have two FPS certificates active in our developer account.
• One of these was created using SDK 5, but our new vendor (Mux) requires an FPS certificate based on SDK 4.
• Since Apple does not allow deleting FPS certificates from the developer portal, we are unable to create a new SDK 4 certificate.
• We kindly request Apple to revoke one of our existing FPS certificates to allow us to generate a new SDK 4 certificate.
Request:
We would greatly appreciate assistance with revoking or deleting one of our existing FPS certificates so that we can proceed with creating a new SDK 4 certificate for our vendor integration.
Thank you for your support.
I am playing protected HLS streams whose authorization token expires every 3 minutes. I am trying to handle this with AVAssetResourceLoaderDelegate. I can refresh the token and keep playing, but the problem is that mid-session the player stalls for a short time, about one second.
Here's my code:
import AVFoundation

class APLCustomAVARLDelegate: NSObject, AVAssetResourceLoaderDelegate {
    static let httpsScheme = "https"
    static let redirectErrorCode = 302
    static let badRequestErrorCode = 400

    private var token: String?
    private var retryDictionary = [String: Int]()
    private let maxRetries = 3

    private func schemeSupported(_ scheme: String) -> Bool {
        let supported = isHttpsSchemeValid(scheme) // fixed: was `ishttpSchemeValid`, which is not defined
        print("Scheme '\(scheme)' supported: \(supported)")
        return supported
    }

    private func reportError(loadingRequest: AVAssetResourceLoadingRequest, error: Int) {
        let nsError = NSError(domain: NSURLErrorDomain, code: error, userInfo: nil)
        print("Reporting error: \(nsError)")
        loadingRequest.finishLoading(with: nsError)
    }

    // Handle token renewal requests to prevent playback stalls
    func resourceLoader(_ resourceLoader: AVAssetResourceLoader, shouldWaitForRenewalOfRequestedResource renewalRequest: AVAssetResourceRenewalRequest) -> Bool {
        print("Resource renewal requested for URL: \(renewalRequest.request.url?.absoluteString ?? "unknown URL")")
        // Handle renewal the same way we handle initial requests
        guard let scheme = renewalRequest.request.url?.scheme else {
            print("No scheme found in the renewal URL.")
            return false
        }
        if isHttpsSchemeValid(scheme) {
            return handleHttpsRequest(renewalRequest)
        }
        print("Scheme not supported for renewal.")
        return false
    }

    private func isHttpsSchemeValid(_ scheme: String) -> Bool {
        let isValid = scheme == APLCustomAVARLDelegate.httpsScheme
        print("https scheme '\(scheme)' valid: \(isValid)")
        return isValid
    }

    private func generateHttpsURL(sourceURL: URL) -> URL? {
        // If you need to modify the URL, do it here
        // Currently this just returns the same URL
        let urlString = sourceURL.absoluteString
        print("Generated HTTPS URL: \(urlString)")
        return URL(string: urlString)
    }

    private func handleHttpsRequest(_ loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        print("Handling HTTPS request.")
        // `var` bindings are not allowed in `guard let`, so bind with `let` and copy to a mutable variable below
        guard let sourceURL = loadingRequest.request.url,
              let generatedURL = generateHttpsURL(sourceURL: sourceURL) else {
            print("Failed to generate HTTPS URL.")
            reportError(loadingRequest: loadingRequest, error: APLCustomAVARLDelegate.badRequestErrorCode)
            return true
        }
        var redirectURL = generatedURL
        // Track retry attempts with a dictionary keyed by request URL
        let urlString = sourceURL.absoluteString
        let currentRetries = retryDictionary[urlString] ?? 0
        if currentRetries < maxRetries {
            retryDictionary[urlString] = currentRetries + 1
        } else {
            // Too many retries, report a more specific error
            reportError(loadingRequest: loadingRequest, error: NSURLErrorTimedOut)
            retryDictionary.removeValue(forKey: urlString)
            return true
        }
        if var urlComponents = URLComponents(url: redirectURL, resolvingAgainstBaseURL: false) {
            var queryItems = urlComponents.queryItems ?? []
            // Generate a fresh token each time
            let freshToken = AESTimeBaseEncription.secureEncryptSecretText()
            // Check if the token already exists
            if let existingTokenIndex = queryItems.firstIndex(where: { $0.name == "token" }) {
                // Update the existing token
                queryItems[existingTokenIndex].value = freshToken
            } else {
                // Add the token if it doesn't exist
                queryItems.append(URLQueryItem(name: "token", value: freshToken))
            }
            urlComponents.queryItems = queryItems
            redirectURL = urlComponents.url!
        }
        let redirectRequest = URLRequest(url: redirectURL)
        let response = HTTPURLResponse(url: redirectURL, statusCode: APLCustomAVARLDelegate.redirectErrorCode, httpVersion: nil, headerFields: nil)
        print("Redirecting HTTPS to URL: \(redirectURL)")
        loadingRequest.redirect = redirectRequest
        loadingRequest.response = response
        loadingRequest.finishLoading()
        // If successful, reset the retry counter
        if retryDictionary[urlString] == maxRetries {
            retryDictionary.removeValue(forKey: urlString)
        }
        return true
    }
}
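For completeness, the delegate is attached to the asset's resource loader along these lines (a sketch; manifestURL and the queue label are illustrative):

let asset = AVURLAsset(url: manifestURL)
let delegate = APLCustomAVARLDelegate() // keep a strong reference; the resource loader holds its delegate weakly
asset.resourceLoader.setDelegate(delegate, queue: DispatchQueue(label: "com.example.resource-loader"))
let playerItem = AVPlayerItem(asset: asset)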
Hello! I am trying to determine the best approach with AVPlayer for implementing auto-play, that is, playback that automatically starts without user initiation. Ideally this would work for both local and streaming audio.
My current approach uses KVO, waiting for the AVPlayerItem's status to become readyToPlay (a sketch of this appears after the snippets below), but I was wondering if there is a better property or state to use, or, alternatively, whether this use case may already be handled when automaticallyWaitsToMinimizeStalling is true, so that I could simply write:
player.replaceCurrentItem(with: AVPlayerItem(url: streamingUrl))
player.rate = 1
or
let playerItem = AVPlayerItem(url: streamingUrl)
player = AVPlayer(playerItem: playerItem)
player.rate = 1
and expect the item to be auto-played when ready.
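For reference, the KVO approach mentioned above looks roughly like this (a minimal sketch; player and streamingUrl are assumed to exist already):

import AVFoundation

let item = AVPlayerItem(url: streamingUrl)
var statusObservation: NSKeyValueObservation?
statusObservation = item.observe(\.status, options: [.initial, .new]) { item, _ in
    if item.status == .readyToPlay {
        player.play() // start playback as soon as the item is ready
    }
}
player.replaceCurrentItem(with: item)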
In the context of user-initiated playback, I've typically seen code that makes a button's enabled state contingent on player.currentItem.duration, e.g. in AVFoundationSimplePlayer-iOS. On the other hand, AVAutoWait, which utilizes automaticallyWaitsToMinimizeStalling, does not seem to do this.
As a side note, I am not using an AVQueuePlayer.
Hello,
I am developing a video streaming service that uses FairPlay. Since around February 20th, we have been receiving reports of CoreMediaErrorDomain -42709 errors.
Unfortunately, there is no documentation from Apple that explains what this error means, so we are not sure how to address or fix the issue.
Most of the users who reported this error are using iOS 18.2.1 and iOS 18.3.1.
Could you please advise on what we should check or how we might resolve this error?
We noticed that the behavior at FairPlay license expiration changed from iOS 16.x to some version of iOS 17 and to the latest iOS 18.3.2 and Safari.
On iOS 16.x, video playback stops when the license expires, but on iOS 17.x and later the video continues with no audio and no error fired.
On the latest Safari, both video and audio continue.
Has anything changed in the latest FairPlay, and how should we adapt to this on the license server side?
Thanks
It has been quite some time since I requested the Apple FPS package, yet I haven't received it. I haven't received any email either. Is there a developer support inquiry center where I can check the status of the process? Alternatively, could you share approximately how long it took for you to receive a response email?
In our logging tools (Firebase) I see a lot of errors reported when users are playing content and the app transitions to the background. An AVPlayerItemFailedToPlayToEndTime notification is fired with an error containing codes like -1102 and 1852797029, which seem to correspond to NSURLErrorNoPermissionsToReadFile and kCMIOHardwareIllegalOperationError respectively. To me, it looks like these might have something to do with caching logic.
The items being played are HLS streams, and we use AVAssetDownloadTask to make any streamed content available offline. Our setup is similar to the sample provided here: https://developer.apple.com/documentation/avfoundation/using-avfoundation-to-play-and-persist-http-live-streams. Whenever an item is selected for playback, the app checks whether a cached version is available; if so, it gets the URL to the stored file like the localAssetForStream() method in the example, or it gets the asset from a currently running AVAssetDownloadTask for the item, or else it starts a new AVAssetDownloadTask and returns an AVAsset from that task to play.
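That selection flow looks roughly like this (a hedged sketch; Stream, localAssetForStream, activeDownloadTasks, and startDownloadTask are illustrative stand-ins, not our exact code):

import AVFoundation

func assetForPlayback(stream: Stream) -> AVURLAsset {
    // 1. Prefer a copy already persisted to disk.
    if let localAsset = localAssetForStream(withName: stream.name) {
        return localAsset
    }
    // 2. Otherwise reuse the asset owned by an in-flight download task.
    if let task = activeDownloadTasks[stream.name] {
        return task.urlAsset
    }
    // 3. Otherwise start a new download and play from its asset while it proceeds.
    let asset = AVURLAsset(url: stream.playlistURL)
    startDownloadTask(for: asset, named: stream.name)
    return asset
}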
This seems to work fine, and I can't reproduce the issues our users and our logging tools are reporting.
Is there some case I am missing where AVAssetDownloadTask and associated AVAssets might become unreadable when the app transitions to the background? Or do these errors indicate a different problem entirely?
A few months ago, I had the opportunity to receive a 2018 iMac, and I’ve been using it to create content for my social media. I was truly impressed by the power of its processors. Even with this older model, I’ve been able to grow my presence online—something I couldn’t achieve with newer computers from other brands that I previously purchased.
I would love to become a promoter of your brand in the gaming world. All I ask for is technological support with more recent equipment and a minimal payment for collaborating with you. I am genuinely interested in being part of your company and leveraging the potential and reputation of Apple to reach even greater heights.
I am playing FairPlay + Multi-Key content (fMP4) in the Safari browser.
I want to distinguish between SD and HD video quality, playing HD when HDCP is supported and SD when it is not.
I have already confirmed that HDCP support is the default, and that a black screen is output in non-HDCP environments.
What I want is to improve the user experience by appropriately switching to SD/HD depending on HDCP support when playing DRM content.
Question: Is there an API or function that can detect HDCP support in Safari through JavaScript or other methods? Or is there a way to indirectly guess it?
I'm building a professional camera app where users can customize the video recording format and color grading. In the func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) method, I handle video frames and use Metal for real-time color grading. This works well when device.activeColorSpace is sRGB or P3, and the results are great.
However, when the color space is HLG_BT2020 or appleLog, the MTKTextureLoader.newTexture(cgImage: cgImage, options: options) method throws an error. After researching, I found that in these color spaces a video frame converted to CGImage has more than 8 bits per channel, which causes the texture creation to fail. I tried converting the CGImage to a lower bit depth, which let the texture be created, but the final output image is garbled and not as expected. Is there a solution to this issue?
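For context, the pipeline in question looks roughly like this (a minimal sketch; ciContext and textureLoader stand in for the real properties):

import AVFoundation
import CoreImage
import MetalKit

func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else { return }
    // Works for 8-bit sRGB/P3 frames; throws for HLG_BT2020 / Apple Log,
    // where the resulting CGImage has more than 8 bits per channel.
    let texture = try? textureLoader.newTexture(cgImage: cgImage, options: nil)
    // ... real-time Metal color-grading pass using `texture` ...
}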
We are encountering an issue where AVPlayer throws the error:
Error Domain=AVFoundationErrorDomain Code=-11819 "Cannot Complete Action" > Underlying Error Domain[(null)]:Code[0]:Desc[(null)]
This error seems to occur intermittently during video playback, especially after extended usage or when switching between different streams. We observe error -11819 (AVFoundationErrorDomain) in the Conviva platform; some of our users experience it, but we couldn't reproduce it so far, and we need support to determine the root cause and/or best practices to prevent it.
Some questions we have:
What typically triggers this error?
Could it be related to memory/resource constraints, network instability, or backgrounding?
Are there any recommended ways to handle or recover from this error gracefully?
Any insights or guidance would be greatly appreciated. Thanks!
Hello! I have been following the UsingAVFoundationToPlayAndPersistHTTPLiveStreams sample code in order to test persisting streams to disk. In addition to support for m3u8, I have noticed in testing that this also seems to work for MP3 audio, simply by changing the plist entries to point to remote URLs with audio/mpeg content. Is this expected, or are there caveats that I should be aware of?
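Concretely, the download is started the same way as in the sample, roughly like this (a sketch; the session identifier, URL, title, and delegate are illustrative):

import AVFoundation

let config = URLSessionConfiguration.background(withIdentifier: "audio-downloader")
let session = AVAssetDownloadURLSession(configuration: config,
                                        assetDownloadDelegate: downloadDelegate,
                                        delegateQueue: .main)
let asset = AVURLAsset(url: URL(string: "https://example.com/episode.mp3")!) // audio/mpeg content
let task = session.makeAssetDownloadTask(asset: asset,
                                         assetTitle: "Episode",
                                         assetArtworkData: nil,
                                         options: nil)
task?.resume()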
Thank you!
The app registers a periodic time observer to the AVPlayer when the playback starts and it works fine. When switching to AirPlay during playback, the periodic time observation continues working as expected.
However, when switching back to local playback, the periodic time observer does not fire anymore until a seek is performed. The app removes the periodic time observer only when the playback stops.
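For reference, the observer is registered roughly like this (a minimal sketch; updateProgress is an illustrative stand-in):

import AVFoundation

// Registered when playback starts; the token is kept so the observer can be removed when playback stops.
let interval = CMTime(seconds: 1, preferredTimescale: 600)
timeObserverToken = player.addPeriodicTimeObserver(forInterval: interval, queue: .main) { time in
    updateProgress(with: time) // stops firing after returning from AirPlay until a seek is performed
}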
I can see that when switching back to local playback, the timeControlStatus successively changes
to .waitingToPlayAtSpecifiedRate (reason: .evaluatingBufferingRate)
then to .waitingToPlayAtSpecifiedRate (reason: .toMinimizeStalls)
and finally to .playing
But the time observation does not work anymore.
Also, the issue is systematic with Live and VOD streams providing a program date (with HLS property #EXT-X-PROGRAM-DATE-TIME), with or without any DRM, and is never reproduced with other VOD streams.
I did watch WWDC 2019 Session 716 and understand that an active audio session is key to unlocking low‑level networking on watchOS. I’m configuring my audio session and engine as follows:
import AVFoundation
import Network

// Excerpted methods; properties such as audioEngine, audioPlayerNode, audioFormat,
// connection, and webSocketTask are declared elsewhere in the class.

private func configureAudioSession(completion: @escaping (Bool) -> Void) {
    let audioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setCategory(.playAndRecord, mode: .voiceChat, options: [])
        try audioSession.setActive(true, options: .notifyOthersOnDeactivation)
        // Retrieve sample rate and configure the audio format.
        let sampleRate = audioSession.sampleRate
        print("Active hardware sample rate: \(sampleRate)")
        audioFormat = AVAudioFormat(standardFormatWithSampleRate: sampleRate, channels: 1)
        // Configure the audio engine.
        audioInputNode = audioEngine.inputNode
        audioEngine.attach(audioPlayerNode)
        audioEngine.connect(audioPlayerNode, to: audioEngine.mainMixerNode, format: audioFormat)
        try audioEngine.start()
        completion(true)
    } catch {
        print("Error configuring audio session: \(error.localizedDescription)")
        completion(false)
    }
}

private func setupUDPConnection() {
    let parameters = NWParameters.udp
    parameters.includePeerToPeer = true
    connection = NWConnection(host: "***.***.xxxxx.***", port: 0000, using: parameters)
    setupNWConnectionHandlers()
}

private func setupTCPConnection() {
    let parameters = NWParameters.tcp
    connection = NWConnection(host: "***.***.xxxxx.***", port: 0000, using: parameters)
    setupNWConnectionHandlers()
}

private func setupWebSocketConnection() {
    guard let url = URL(string: "ws://***.***.xxxxx.***:0000") else {
        print("Invalid WebSocket URL")
        return
    }
    let session = URLSession(configuration: .default)
    webSocketTask = session.webSocketTask(with: url)
    webSocketTask?.resume()
    print("WebSocket connection initiated")
    sendAudioToServer()
    receiveDataFromServer()
    sendWebSocketPing(after: 0.6)
}

private func setupNWConnectionHandlers() {
    connection?.stateUpdateHandler = { [weak self] state in
        DispatchQueue.main.async {
            switch state {
            case .ready:
                print("Connected (NWConnection)")
                self?.isConnected = true
                self?.failToConnect = false
                self?.receiveDataFromServer()
                self?.sendAudioToServer()
            case .waiting(let error), .failed(let error):
                print("Connection error: \(error.localizedDescription)")
                DispatchQueue.main.asyncAfter(deadline: .now() + 2) {
                    self?.setupNetwork()
                }
            case .cancelled:
                print("NWConnection cancelled")
                self?.isConnected = false
            default:
                break
            }
        }
    }
    connection?.start(queue: .main)
}
Duplex in this context refers to two-way audio transmission: simultaneously recording and sending audio while also receiving and playing back incoming audio, similar to a VoIP/SIP call.
The setup works fine on the simulator, which suggests that the core logic is correct. However, since the simulator doesn't fully replicate watchOS hardware behavior, especially for audio sessions and networking, issues might arise when running on a real device.
The problem likely lies in the Watch's actual hardware limitations, permission constraints, or specific audio session configurations.
I am reaching out to seek further assistance regarding the challenges I've been experiencing with establishing UDP, TCP, and WebSocket connections on watchOS using NWConnection for duplex audio streaming. Despite implementing the recommendations provided earlier, I am still encountering difficulties.
From what I can see, your implementation is focused on streaming audio playback with the server. In my case, I'm looking for a slightly different approach: I want to capture audio and send buffers of a specific size to the server while playing audio simultaneously, essentially achieving full-duplex streaming similar to a VoIP call. Additionally, I'd like to ensure that if no external audio route is connected, the Apple Watch speaker is used by default. Any thoughts or insights on adapting this setup for those requirements would be very welcome.