Hello, Apple video engineers.
According to the official documentation, HLS is built on HTTP and has traditionally run on top of TCP. However, with the introduction of HTTP/3, which uses QUIC (running on top of UDP), I would like to clarify the following:
Has the official HLS specification changed in a way that allows it to be considered UDP-based when using HTTP/3? And is it fair to say that HLS supports UDP, since the transport can go over HTTP/3 and QUIC?
Or would it be more accurate to say that HLS remains HTTP-dependent, and the transport protocol (TCP or QUIC) only determines how HTTP requests are delivered?
My thoughts: since HTTP/3 uses QUIC running over UDP, we still can't say that HLS supports UDP in the classical sense in which UDP is used by RTP, RTSP, or SRT.
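One way to ground this: the HTTP version actually negotiated for a given request can be observed through URLSession task metrics. A minimal sketch (assuming a plain URLSession fetch of a playlist URL; this is only illustrative and is not how AVPlayer fetches segments internally):

import Foundation

// Prints which protocol carried each transaction:
// "h3" means HTTP/3 over QUIC/UDP, "h2" or "http/1.1" mean TCP.
final class ProtocolProbe: NSObject, URLSessionTaskDelegate {
    func urlSession(_ session: URLSession, task: URLSessionTask,
                    didFinishCollecting metrics: URLSessionTaskMetrics) {
        for transaction in metrics.transactionMetrics {
            print(transaction.request.url?.absoluteString ?? "?",
                  "served via", transaction.networkProtocolName ?? "unknown")
        }
    }
}

let probe = ProtocolProbe()
let session = URLSession(configuration: .default, delegate: probe, delegateQueue: nil)
// Placeholder playlist URL for illustration only.
session.dataTask(with: URL(string: "https://example.com/master.m3u8")!).resume()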
HTTP Live Streaming
Send audio and video over HTTP from an ordinary web server for playback on Mac, iOS, and tvOS devices using HTTP Live Streaming (HLS).
I am using the Video.js player to play my HLS m3u8 proxies. There was no issue on iPadOS 16, but after upgrading to iPadOS 18 I started getting this warning:
[Warning] VIDEOJS: – "WARN:"
"Problem encountered with playlist 0-https://coludFront.m3u8. Trying again since it is the only playlist."
After this warning is logged, the player just freezes and the source is never loaded into the player.
This issue does not occur in Safari on macOS or on an iPhone running the latest OS.
I am developing an app to stream and download DRM protected HLS videos based on the official “FairPlay Streaming Server SDK”.
When I play the downloaded video, it requests .ts or .aac segments from the server, even though I passed the path of the downloaded video to AVURLAsset.
As a result, playback fails when the device is offline, such as in airplane mode.
This behavior depends on the playback time of the video and occurs when trying to download and play a video with a playback time of 19 hours or more.
It did not occur for videos with a playback time of 18 hours.
The environment we checked is iOS 18.3.
Our workaround for now is to limit video length to 18 hours, but if possible we would like to support offline playback of videos longer than 19 hours.
Does anyone have any information or a solution to this problem, for example if you have experienced this behavior, or if you know that content longer than 19 hours simply cannot be played offline?
// Load the downloaded asset
let path = ".../***.movpkg" // Path of the downloaded file
let url = URL(fileURLWithPath: path)
videoAsset = AVURLAsset(url: url)
playerItem = AVPlayerItem(asset: videoAsset!)
player.replaceCurrentItem(with: playerItem)

// isPlayableOffline
print("videoAsset.assetCache.isPlayableOffline = \(videoAsset?.assetCache?.isPlayableOffline ?? false)") // true
Hi folks,
When doing HLS v6 live streaming with fMP4 chunks, we noticed that when the encoder timestamps drift slightly and an #EXT-X-DISCONTINUITY tag is created in either the audio or the video playlist (in an ABR setup), the tag is not handled correctly by the player, leading to broken playback with a black screen or no audio (depending on which playlist the tag appears in).
We noticed this is often the case when the number of tags differs between the playlists (e.g. if the audio playlist contains 1 tag and the video playlist contains 2, the result is a black screen with audio).
Playing the same "broken" source in Shaka Player does not break playback at all.
Is there any possible fix (current or upcoming) for AVPlayer?
Hello, we have an HLS streaming app on Apple TV, and our streams are DRM protected. We have a problem when the display device is turned off: for example, a user starts watching our DRM-protected HLS content and, after some time, turns off the monitor or TV that the Apple TV is connected to over HDMI. Our app has no way of knowing that the HDMI-connected device has been turned off. Is there any way in Swift to detect that the HDMI-connected device is turned off?
In our Apple TV application, we use the native AVPlayer for live playback functionality. Until tvOS 17.6 and during the tvOS 18 beta, the Pause/Resume feature worked as expected, allowing us to pause live playback. However, after updating to tvOS 18.1, the pause functionality no longer works.
The same app still works fine on tvOS 17, but on tvOS 18, attempting to pause live playback has no effect. We reviewed the tvOS 18 release notes but couldn't find any relevant changes or deprecations related to AVPlayer or live playback behavior.
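For anyone trying to reproduce this, a minimal diagnostic sketch (assuming `player` is the AVPlayer attached to the live item) of what can be checked right after calling pause:

player.pause()
DispatchQueue.main.asyncAfter(deadline: .now() + 1) {
    // Expected (and seen on tvOS 17.6): rate 0 and .paused; on tvOS 18.1 playback keeps going.
    print("rate:", player.rate)
    print("timeControlStatus:", player.timeControlStatus.rawValue)
    if player.timeControlStatus == .waitingToPlayAtSpecifiedRate {
        print("waiting reason:", player.reasonForWaitingToPlay?.rawValue ?? "none")
    }
}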
Has there been any change in the handling of live playback or the Pause/Resume functionality in tvOS 18.1? Any guidance or suggestions to address this issue would be greatly appreciated.
Thank you!
In our Apple TV application, we are using the native AVPlayer for live playback functionality. During live restart playback, we intermittently encounter an error when the playback timeline approaches the actual live event end time.
Error:
The operation couldn’t be completed. (CoreMediaErrorDomain error -16839 - Unable to get playlist before long download timer) / Failure reason:
Scenario:
The live event is scheduled from 7:00 AM to 8:00 AM.
Restart playback begins at 7:20 AM, allowing the user to watch the event from the start while the live stream continues in real-time.
As the restart playback timeline approaches the actual event end time (8:00 AM), AVPlayer displays an error, and playback continues in the background.
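For anyone trying to capture more detail on this, a small sketch of reading the underlying playlist error from the item's error log (assuming `playerItem` is the current AVPlayerItem for the restart stream):

NotificationCenter.default.addObserver(forName: .AVPlayerItemNewErrorLogEntry,
                                       object: playerItem,
                                       queue: .main) { _ in
    guard let event = playerItem.errorLog()?.events.last else { return }
    // For this issue we see errorStatusCode -16839 in CoreMediaErrorDomain.
    print("HLS error \(event.errorStatusCode) in \(event.errorDomain):",
          event.errorComment ?? "", "URI:", event.uri ?? "")
}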
Hi there, we have found a problem: after switching audio tracks multiple times while playing an HLS stream, there are a few seconds with no sound after the switch. Is there a solution to this problem?
Can someone please tell me how I can update the cookies of a previously set m3u8 video in AVPlayer without creating a new AVURLAsset and replacing the player's current item with it?
We need to change the cookie every 120 seconds while playing. Since AVPlayer does not let us modify cookies after initialization, we followed the approach of using a resource loader delegate to pass the cookie as a header value.
What I notice is that the playlist file (.m3u8) gets downloaded correctly, and some chunks of the media file (.m4a) are downloaded as well. I know the .ts file is downloaded because I can see the GET request completing on the web server with status 200. I also set a breakpoint at the following line:
loadingRequest.dataRequest?.respond(with: data)
Immediately after this, I got the following error from the AVPlayer status:
"The operation could not be completed. An unknown error occurred (-12881) From core media"
I need confirmation on why I am unable to load HLS using the resource loader.
Is it possible to update the cookie value while playback continues in AVPlayer?
override func viewDidLoad() {
    super.viewDidLoad()
    // Do any additional setup after loading the view.
    let urlString = "localhost://demo.unified-streaming.com/k8s/features/stable/video/tears-of-steel/tears-of-steel.ism/.m3u8"
    guard let url = URL(string: urlString) else {
        print("Invalid URL")
        return
    }

    // Create a cookie to prepare the player asset
    guard let cookie = HTTPCookie(properties: [
        .name: "dazn-token",
        .value: "cookie value",
        .domain: url.host() ?? "",
        .path: "/",
        .discard: true
    ]) else {
        print("Failed to create cookie")
        return
    }

    // Create the cookie option to set on the AVURLAsset
    let options = [AVURLAssetHTTPCookiesKey: [cookie]]
    let asset = AVURLAsset(url: url, options: options)

    proxy = ReverseProxyResourceLoader()
    proxy?.cookie = "exampleCookie"
    // Set the resource loader delegate to monitor the chunks
    asset.resourceLoader.setDelegate(proxy, queue: DispatchQueue.global())

    // Load asset keys asynchronously (e.g., "playable")
    let keys = ["playable"]

    // Initialize the AVPlayer with the asset
    let playerItem = AVPlayerItem(asset: asset)
    self.player = AVPlayer(playerItem: playerItem)
    playerItem.addObserver(self, forKeyPath: "status", options: [.new, .initial], context: nil)
    // Observe the 'error' property (if needed)
    playerItem.addObserver(self, forKeyPath: "error", options: [.new], context: nil)

    let contentKeySessionDelegate = ContentKeyDelegate()
    // Initialize AVContentKeySession
    let contentKeySession = AVContentKeySession(keySystem: .clearKey)
    self.contentKeySession = contentKeySession
    contentKeySession.setDelegate(contentKeySessionDelegate, queue: DispatchQueue.main)
    // Associate the asset with the content key session
    contentKeySession.addContentKeyRecipient(asset)

    // Create a layer for the AVPlayer and add it to the view
    playerLayer = AVPlayerLayer(player: player)
    playerLayer?.frame = view.bounds
    playerLayer?.videoGravity = .resizeAspect
    if let playerLayer = playerLayer {
        view.layer.addSublayer(playerLayer)
    }

    NotificationCenter.default.addObserver(
        self,
        selector: #selector(playerDidFinishPlaying),
        name: .AVPlayerItemDidPlayToEndTime,
        object: player?.currentItem
    )

    // Start playback
    player?.play()
}

// Update the cookie whenever needed
func updateCookie() {
    proxy?.cookie = "update exampleCookie"
}

@objc private func playerDidFinishPlaying(notification: Notification) {
    print("Playback finished!")
    // Optionally, handle end-of-playback actions here
}
//
// ReverseProxyResourceLoader.swift
// HLSDemo
//
// Created by Gajje.Venkatarao on 12/12/24.
//
import Foundation
import AVKit
import AVFoundation

class ReverseProxyResourceLoader: NSObject, AVAssetResourceLoaderDelegate {

    var cookie = ""

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader, shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        resourceLoader.preloadsEligibleContentKeys = true

        guard let interceptedURL = loadingRequest.request.url else {
            loadingRequest.finishLoading(with: NSError(domain: "ReverseProxy", code: -1, userInfo: [NSLocalizedDescriptionKey: "Invalid URL"]))
            return false
        }

        if interceptedURL.scheme == "skd" {
            print("Token updated Cookie:", interceptedURL)
            return false
        }

        // Map the intercepted URL back to the original https scheme.
        var components = URLComponents(url: interceptedURL, resolvingAgainstBaseURL: false)
        components?.scheme = "https"
        guard let originalURL = components?.url else {
            loadingRequest.finishLoading(with: NSError(domain: "ReverseProxy", code: -1, userInfo: [NSLocalizedDescriptionKey: "Failed to map URL"]))
            return false
        }

        var request = URLRequest(url: originalURL)
        request.httpMethod = "GET"

        if let storedCookie = HTTPCookie(properties: [
            .name: "dazn-token",
            .value: cookie,
            .domain: originalURL.host ?? "",
            .path: "/",
            .discard: true
        ]) {
            HTTPCookieStorage.shared.setCookie(storedCookie)
        }

        // Forward the original headers plus the current cookie value.
        let headers = loadingRequest.request.allHTTPHeaderFields ?? [:]
        for (key, value) in headers {
            request.addValue(value, forHTTPHeaderField: key)
        }
        request.addValue(cookie, forHTTPHeaderField: "Cookie")
        request.httpShouldHandleCookies = true

        // Use the prepared request so the Cookie header is actually sent.
        let task = URLSession.shared.dataTask(with: request) { data, response, error in
            if let error = error {
                print("Error Received:", error)
                loadingRequest.finishLoading(with: error)
                return
            }
            print(originalURL)
            guard let data = data else {
                loadingRequest.finishLoading(with: NSError(domain: "ReverseProxy", code: -1, userInfo: [NSLocalizedDescriptionKey: "No data received"]))
                return
            }
            loadingRequest.dataRequest?.respond(with: data)
            loadingRequest.finishLoading()
        }
        task.resume()
        return true
    }
}
Example project
I'm working on an iOS app that uses JWPlayer to stream an HLS video. However, the stream stops and doesn't restart, on both simulators and real devices. The output provides the following information:
• transportType: HTTP Live Stream
• mediaType: HTTP Live Stream
• BundleID:
• name: MEDIA_PLAYBACK_STALL
• interfaceType: Wifi
Additionally, the log shows:
(16830): Media file not received in 21s
Can anyone guide me on this issue?
Is this technical solution reasonable for WKWebView cross-domain issues?
Hi all,
My project uses WKWebView to load an offline package (.html/.css/.js files) and also requests some resources from a remote server to update pages. So there is a cross-domain problem between local files (file://***) and the remote domain (https://***). Is the following technical solution reasonable to fix this problem?
1. Create a custom URLSchemeHandler that conforms to WKURLSchemeHandler
2. Unify local file and remote domain requests into https requests
3. Hook WKWebView https requests
4. Implement the WKURLSchemeHandler delegate method
- (void)webView:(WKWebView *)webView startURLSchemeTask:(id<WKURLSchemeTask>)urlSchemeTask {
    NSURL *url = urlSchemeTask.request.URL;
    if ([url.pathExtension isEqualToString:@"html"]) {
        NSData *data = [[NSData alloc] initWithContentsOfFile:localFilePath];
        NSMutableDictionary *resHeader = [NSMutableDictionary new];
        [resHeader setValue:@"*" forKey:@"Access-Control-Allow-Origin"];
        [resHeader setValue:@"text/html; charset=UTF-8" forKey:@"Content-Type"];
        NSHTTPURLResponse *response = [[NSHTTPURLResponse alloc] initWithURL:url
                                                                  statusCode:200
                                                                 HTTPVersion:@"HTTP/1.1"
                                                                headerFields:resHeader];
        [urlSchemeTask didReceiveResponse:response];
        [urlSchemeTask didReceiveData:data];
        [urlSchemeTask didFinish];
    } else {
        NSURLSession *defaultSession = [NSURLSession sharedSession];
        NSURLSessionTask *dataTask = [defaultSession dataTaskWithRequest:urlSchemeTask.request
                                                       completionHandler:^(NSData * _Nullable data, NSURLResponse * _Nullable response, NSError * _Nullable error) {
            [urlSchemeTask didReceiveResponse:response];
            [urlSchemeTask didReceiveData:data];
            [urlSchemeTask didFinish];
        }];
        [dataTask resume];
    }
}
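For completeness, a minimal sketch (in Swift, for brevity) of how such a handler would be registered. `MySchemeHandler` is a hypothetical Swift counterpart of the handler above, and the scheme name is an assumption, since WebKit only accepts custom handlers for schemes it does not handle natively:

import WebKit

let configuration = WKWebViewConfiguration()
// "app-https" is a placeholder custom scheme; "https" itself cannot be registered here.
configuration.setURLSchemeHandler(MySchemeHandler(), forURLScheme: "app-https")
let webView = WKWebView(frame: .zero, configuration: configuration)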
Is this technical solution reasonable? and is there any issues that I haven't considered?
Sincerely,
Looking forward to your reply
Hello,
I'm writing a program to create CMAF compliant HLS files, with encryption.
I have a copy of ISO/IEC 23001-7:2023 and am attempting to follow the spec.
I am following the 1:9 pattern encryption using CBCS, so for every 16 bytes of encrypted NAL unit data (of types 1 and 5), there are 144 bytes of clear data.
When testing my output in Safari with 'identity' keys (per "Quickly Diagnosing Content Key and IV Issues"), Safari will request the identity key from my test server and the first few bytes of the CMAF renditions, but will not play, and the console gives no clues to the error.
I am setting the subsample bytes-of-clear/protected data in the senc boxes. What I'm not sure of is whether HLS/Safari/iOS acknowledges the senc/saiz/saio boxes of the MP4. Other third-party packagers, such as Bento4, suggest that they do not:
"those clients ignore the explicit encryption layout metadata found in saio/saiz boxes, and instead rely purely on the video slice header size to determine the portions of the sample that is encrypted"
So now I'm fairly sure I need to decipher the video slice header size, and apply the protected blocks from that point on.
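To make the layout concrete, here is a small sketch (illustrative only; the tricky part, as noted, is deriving the clear prefix from the slice header) of which byte ranges the 1:9 cbcs pattern would leave encrypted within a protected NAL unit:

// Given a NAL unit's total length and the number of leading bytes left clear
// (slice header etc.), return the byte ranges the 1:9 pattern encrypts:
// 16 encrypted bytes, then 144 clear bytes, repeating; any trailing partial
// 16-byte block stays clear.
func cbcsEncryptedRanges(nalLength: Int, clearPrefix: Int) -> [Range<Int>] {
    var ranges: [Range<Int>] = []
    var offset = clearPrefix
    while offset + 16 <= nalLength {
        ranges.append(offset..<offset + 16)
        offset += 16 + 144
    }
    return ranges
}

// Example: a 1,000-byte NAL with a 32-byte clear prefix
// -> encrypted ranges 32..<48, 192..<208, 352..<368, ...
print(cbcsEncryptedRanges(nalLength: 1000, clearPrefix: 32))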
My question is, is that all there is to it? And is there a better way to debug my output? mediastreamvalidator will only work against unencrypted variants (which I'm outputting okay).
Thanks in advance!
Hi, I'm trying to decode an HLS livestream with VideoToolbox. The CMSampleBuffer is created successfully (OSStatus == noErr), but when I enqueue it on an AVSampleBufferDisplayLayer the view doesn't display anything, and the status of the AVSampleBufferDisplayLayer is 1 (rendering).
When I use a VTDecompressionSession to convert the CMSampleBuffer to a CVPixelBuffer, the VTDecompressionOutputCallback returns -8969 (bad data error).
What do I need to fix in my code? Do I incorrectly parse the data from the segment for the CMSampleBuffer?
let segmentData = try await downloadSegment(from: segment.url)
let (sps, pps, idr) = try parseH264FromTSSegment(tsData: segmentData)
if self.formatDescription == nil {
    self.formatDescription = try CMFormatDescription(h264ParameterSets: [sps, pps])
}
if let sampleBuffer = try createSampleBuffer(from: idr, segment: segment) {
    try self.decodeSampleBuffer(sampleBuffer)
}
func parseH264FromTSSegment(tsData: Data) throws -> (sps: Data, pps: Data, idr: Data) {
    let tsSize = 188
    var pesData = Data()
    for i in stride(from: 0, to: tsData.count, by: tsSize) {
        let tsPacket = tsData.subdata(in: i..<min(i + tsSize, tsData.count))
        guard let payload = extractPayloadFromTSPacket(tsPacket) else { continue }
        pesData.append(payload)
    }
    let nalUnits = parseNalUnits(from: pesData)
    var sps: Data?
    var pps: Data?
    var idr: Data?
    for nalUnit in nalUnits {
        guard let firstByte = nalUnit.first else { continue }
        let nalType = firstByte & 0x1F
        switch nalType {
        case 7: // SPS
            sps = nalUnit
        case 8: // PPS
            pps = nalUnit
        case 5: // IDR
            idr = nalUnit
        default:
            break
        }
        if sps != nil, pps != nil, idr != nil {
            break
        }
    }
    guard let validSPS = sps, let validPPS = pps, let validIDR = idr else {
        throw NSError()
    }
    return (validSPS, validPPS, validIDR)
}
func extractPayloadFromTSPacket(_ tsPacket: Data) -> Data? {
    let syncByte: UInt8 = 0x47
    guard tsPacket.count == 188, tsPacket[0] == syncByte else {
        return nil
    }
    let payloadStart = (tsPacket[1] & 0x40) != 0
    let adaptationFieldControl = (tsPacket[3] & 0x30) >> 4
    var payloadOffset = 4
    if adaptationFieldControl == 2 || adaptationFieldControl == 3 {
        let adaptationFieldLength = Int(tsPacket[4])
        payloadOffset += 1 + adaptationFieldLength
    }
    guard adaptationFieldControl == 1 || adaptationFieldControl == 3 else {
        return nil
    }
    let payload = tsPacket.subdata(in: payloadOffset..<tsPacket.count)
    return payloadStart ? payload : nil
}
func parseNalUnits(from h264Data: Data) -> [Data] {
    let startCode = Data([0x00, 0x00, 0x00, 0x01])
    var nalUnits: [Data] = []
    var searchRange = h264Data.startIndex..<h264Data.endIndex
    while let range = h264Data.range(of: startCode, options: [], in: searchRange) {
        let nextStart = h264Data.range(of: startCode, options: [], in: range.upperBound..<h264Data.endIndex)?.lowerBound ?? h264Data.endIndex
        let nalUnit = h264Data.subdata(in: range.upperBound..<nextStart)
        nalUnits.append(nalUnit)
        searchRange = nextStart..<h264Data.endIndex
    }
    return nalUnits
}
private func createSampleBuffer(from data: Data, segment: HLSSegment) throws -> CMSampleBuffer? {
    var blockBuffer: CMBlockBuffer?
    let alignedData = UnsafeMutableRawPointer.allocate(byteCount: data.count, alignment: MemoryLayout<UInt8>.alignment)
    data.copyBytes(to: alignedData.assumingMemoryBound(to: UInt8.self), count: data.count)

    let blockStatus = CMBlockBufferCreateWithMemoryBlock(
        allocator: kCFAllocatorDefault,
        memoryBlock: alignedData,
        blockLength: data.count,
        blockAllocator: nil,
        customBlockSource: nil,
        offsetToData: 0,
        dataLength: data.count,
        flags: 0,
        blockBufferOut: &blockBuffer
    )
    guard blockStatus == kCMBlockBufferNoErr, let validBlockBuffer = blockBuffer else {
        alignedData.deallocate()
        throw NSError()
    }

    var sampleBuffer: CMSampleBuffer?
    var timing = [calculateTiming(for: segment)]
    var sampleSizes = [data.count]

    let sampleStatus = CMSampleBufferCreate(
        allocator: kCFAllocatorDefault,
        dataBuffer: validBlockBuffer,
        dataReady: true,
        makeDataReadyCallback: nil,
        refcon: nil,
        formatDescription: formatDescription,
        sampleCount: 1,
        sampleTimingEntryCount: 1,
        sampleTimingArray: &timing,
        sampleSizeEntryCount: sampleSizes.count,
        sampleSizeArray: &sampleSizes,
        sampleBufferOut: &sampleBuffer
    )
    guard sampleStatus == noErr else {
        alignedData.deallocate()
        throw NSError()
    }
    return sampleBuffer
}
private func decodeSampleBuffer(_ sampleBuffer: CMSampleBuffer) throws {
    guard let formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer) else {
        throw NSError()
    }
    if decompressionSession == nil {
        try setupDecompressionSession(formatDescription: formatDescription)
    }
    guard let session = decompressionSession else {
        throw NSError()
    }

    let flags: VTDecodeFrameFlags = [._EnableAsynchronousDecompression, ._EnableTemporalProcessing]
    var flagOut = VTDecodeInfoFlags()
    let status = VTDecompressionSessionDecodeFrame(
        session,
        sampleBuffer: sampleBuffer,
        flags: flags,
        frameRefcon: nil,
        infoFlagsOut: nil)
    if status != noErr {
        throw NSError()
    }
}
private func setupDecompressionSession(formatDescription: CMFormatDescription) throws {
    self.formatDescription = formatDescription
    if let session = decompressionSession {
        VTDecompressionSessionInvalidate(session)
        self.decompressionSession = nil
    }

    var decompressionSession: VTDecompressionSession?
    var callback = VTDecompressionOutputCallbackRecord(
        decompressionOutputCallback: decompressionOutputCallback,
        decompressionOutputRefCon: Unmanaged.passUnretained(self).toOpaque())

    let status = VTDecompressionSessionCreate(
        allocator: kCFAllocatorDefault,
        formatDescription: formatDescription,
        decoderSpecification: nil,
        imageBufferAttributes: nil,
        outputCallback: &callback,
        decompressionSessionOut: &decompressionSession
    )
    if status != noErr {
        throw NSError()
    }
    self.decompressionSession = decompressionSession
}
let decompressionOutputCallback: VTDecompressionOutputCallback = { (
    decompressionOutputRefCon,
    sourceFrameRefCon,
    status,
    infoFlags,
    imageBuffer,
    presentationTimeStamp,
    presentationDuration
) in
    guard status == noErr else {
        print("Callback: \(status)")
        return
    }
    if let imageBuffer = imageBuffer {
    }
}
In an m3u8 manifest, audio EXT-X-MEDIA tags usually contain a CHANNELS attribute with the audio channel count, like so:
#EXT-X-MEDIA:TYPE=AUDIO,URI="audio_clear_eng_stereo.m3u8",GROUP-ID="default-audio-group",LANGUAGE="en",NAME="stream_5",AUTOSELECT=YES,CHANNELS="2"
Is it possible to get this info from AVPlayer, AVMediaSelectionOption or some related API?
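One avenue that might cover this (I have not confirmed it against every stream) is the AVAssetVariant API introduced in iOS/tvOS 15, which exposes per-rendition audio attributes. A sketch:

import AVFoundation

// Print the declared channel count for each audible rendition of an HLS asset.
func printAudioChannelCounts(for asset: AVURLAsset) async throws {
    let variants = try await asset.load(.variants)
    guard let group = try await asset.loadMediaSelectionGroup(for: .audible) else { return }
    for variant in variants {
        guard let audio = variant.audioAttributes else { continue }
        for option in group.options {
            if let attributes = audio.renditionSpecificAttributes(for: option) {
                print(option.displayName, "channels:", attributes.channelCount as Any)
            }
        }
    }
}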
Our company's application uses CallKit for livestreaming, but CallKit is not available in China, which led Apple to not approve the app. Can anyone help with what I can do? The live stream is one of the key features of the app. Alternatively, is it possible to restrict downloads from China?
Hi, I'm working on an app with an infinite scrollable video feed, similar to TikTok or Instagram Reels. I initially thought it would be a good idea to cache videos in the file system, but after reading this post it seems that caching videos on the file system is not recommended: https://forums.developer.apple.com/forums/thread/649810#:~:text=If%20the%20videos%20can%20be%20reasonably%20cached%20in%20RAM%20then%20we%20would%20recommend%20that.%20Regularly%20caching%20video%20to%20disk%20contributes%20to%20NAND%20wear
The reason I am hesitant to cache videos in memory is that this adds up pretty quickly and increases memory pressure for my app.
After seeing the amount of Documents & Data storage that Instagram uses, it's obvious they are caching videos on the file system. So I was wondering: what is the current best practice for caching in these kinds of apps?
I tried configuring preferredForwardBufferDuration on devices using 4G and Wi-Fi, and in those cases AVPlayer buffers according to the configured duration. However, when the device is connected to a 5G network, the configured value no longer has any effect.
For example, if I set preferredForwardBufferDuration to 30 seconds, AVPlayer preloads a buffer of over 100 seconds. I'm not sure how to resolve this, and it's causing issues for my system.
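For reference, this is roughly how the preference is set and measured (a minimal sketch; `streamURL` is a placeholder):

let item = AVPlayerItem(url: streamURL)
item.preferredForwardBufferDuration = 30 // seconds we ask AVPlayer to keep buffered

// Compare the preference with what is actually buffered; keep `observation` alive.
let observation = item.observe(\.loadedTimeRanges, options: [.new]) { item, _ in
    let buffered = item.loadedTimeRanges
        .map { $0.timeRangeValue.duration.seconds }
        .reduce(0, +)
    print("Total buffered (s):", buffered) // exceeds 100 s on 5G despite the 30 s preference
}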
We are experiencing an issue with our HLS MPEG-TS streams on Apple devices, where the AVPlayer in our iOS app and in Safari jumps back to the start when the player automatically changes quality. This occurs despite the stream still indicating that it is live, and there is no change in the seek bar. After testing our streams with the Apple HLS Validator, the only problem reported was a "Measured peak bitrate compared to multivariant playlist declared value exceeds error tolerance" error.
This playback bug does not occur in Chrome or in our Android app. Has anyone else experienced similar issues with AVPlayer?
The media services used for HLS streaming in an AVPlayer seem to crash if your segments are too large.
Anything over 20 Mbps seems to cause a crash. I have also tried reducing the segment length to 1 second, but it didn't help.
I am remuxing Dolby Vision and HDR video and want to avoid transcoding and losing any metadata, but the segments are too large.
Is there a workaround for this? Otherwise it seems AVFoundation is not suited to high-bitrate HLS, and I should be using MPV or similar.