HTTP Live Streaming


Send audio and video over HTTP from an ordinary web server for playback on Mac, iOS, and tvOS devices using HTTP Live Streaming (HLS).

HTTP Live Streaming Documentation

Posts under HTTP Live Streaming tag

108 Posts
Post not yet marked as solved
0 Replies
41 Views
Hi all, I am using Azure Media Services and a CDN, and I have set up rules on the CDN side that only allow traffic from my domain. I find this works well on a Windows PC and a MacBook, but not on an iPhone or iPad. Investigating further, I captured a network trace and found that the requests didn't have an 'Origin' header, so they were denied by the CDN rules. This can be observed using either Chrome or Safari on iPhone and iPad. Has anyone seen this issue before? Is it related to the iOS system or to AVPlayer? Is there any workaround for this issue?
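For the in-app AVPlayer case, one workaround people commonly share is passing extra HTTP headers via AVURLAsset options. This relies on the "AVURLAssetHTTPHeaderFieldsKey" key, which is not part of the public API, so treat it as a sketch rather than a supported solution; the URLs below are placeholders:

```swift
import AVFoundation

// Unofficial workaround sketch: attach an Origin header to the asset's
// HTTP requests. "AVURLAssetHTTPHeaderFieldsKey" is undocumented and may
// not be applied to every segment request; the URLs are placeholders.
let headers = ["Origin": "https://example.com"]
let asset = AVURLAsset(
    url: URL(string: "https://example.com/stream/master.m3u8")!,
    options: ["AVURLAssetHTTPHeaderFieldsKey": headers]
)
let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))
```

This does not help for page playback in Safari or Chrome, where the browser decides when to send an Origin header; there, relaxing the CDN rule for Origin-less requests may be the only option.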
Post not yet marked as solved
0 Replies
60 Views
Hello all, I am currently developing a video streaming site and am working on the video player. The site will serve video content using HLS, given that this will be quite sizable, long-form video content. Since the main functionality of the site is watching streaming video, when a user clicks on a video and the player page opens, the desire is to have the video start playing automatically with sound. Safari's autoplay policy rejects attempts to autoplay video content with audio unless the audio starts muted; otherwise, the user needs to interact with the player to start playback with audio unmuted. Given that the main purpose of the site is to watch these videos, it would be hugely detrimental to require the user to interact with the player either to start playing the video or to unmute it. Many sites such as YouTube, Twitch, and others obviously have autoplay functionality that works as I would expect for a video-centric site in Safari. Given their size, I imagine they have some agreement with the Safari team to have their sites pre-authorized for autoplay with audio. How do I go about getting this? Hoping someone can share information that would help me configure things correctly so that my Safari users won't have to manually start playback or turn on their audio to enjoy the videos the same way Chrome and Firefox users will automatically. Thanks!
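Short of any pre-authorization, one widely used pattern, sketched here as a hypothetical `autoplayWithFallback` helper, is to attempt unmuted playback first and fall back to muted autoplay if the browser rejects the `play()` promise:

```javascript
// Hypothetical helper: try unmuted autoplay first; if the browser rejects
// the play() promise (as Safari does without a user gesture), retry muted.
async function autoplayWithFallback(video) {
  try {
    video.muted = false;
    await video.play();
    return "unmuted";
  } catch (err) {
    video.muted = true;
    await video.play();
    return "muted";
  }
}
```

A muted fallback at least lets playback begin immediately; the UI can then offer a single tap to unmute.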
Post not yet marked as solved
0 Replies
56 Views
Hi everybody, I think I have encountered a problem when selecting a new audible media group in HLS. I will explain the case below. The HLS manifest contains 2-3 audio tracks, which are parsed correctly by AVFoundation, giving rise to the different audible media groups. When we select a new audio track, we do it as the documentation says:

```swift
guard let item = currentPlayerItem,
      let mediaSelectionGroup = item.asset.mediaSelectionGroup(forMediaCharacteristic: .audible) else { return }
let selected = mediaSelectionGroup.options[index]
item.select(selected, in: mediaSelectionGroup)
```

So far everything works correctly; the problem appears when switching audio tracks constantly and repeatedly. If we do this, playback stays for a few seconds without any sound; it seems to be muted or lagging. After investigating in Charles, we see that when repeating the process mentioned above, the audio track is requested correctly and the TS segments are returned correctly, but until the video is requested again and the next video TS segment is returned, the audio is not played. I attach a screenshot from Charles so that it can be seen clearly: in the selected part, you can see that the new audio track is requested, and until AVPlayer asks again for the bitrate update, the audio does not play. What could be happening? Is there anything in AVFoundation that we can change so that it doesn't happen? Thanks in advance, Luis Martínez
Post not yet marked as solved
0 Replies
153 Views
I used preferredForwardBufferDuration, but even so, after 2-3 minutes I am unable to pause the video; it catches up to live if the pause lasts more than 2-3 minutes. Example:

```objc
[self.player.currentItem setPreferredForwardBufferDuration:60];
self.player.automaticallyWaitsToMinimizeStalling = YES;
self.player.currentItem.canUseNetworkResourcesForLiveStreamingWhilePaused = NO;
```

How do I stop the AVPlayer from buffering while in the paused state? How do I clear the AVPlayer buffer during pause? Also, seekToTime is not working properly for a live channel; I want to seek just one or two seconds back, but if AVPlayer is unable to seek to the earlier position, it jumps to a future position. The manifest provided to AVPlayer:

```
#EXT-X-VERSION:4
#EXT-X-TARGETDURATION:1
#EXT-X-MEDIA-SEQUENCE:1
#EXT-X-KEY:METHOD=AES-
http://ServerAddress/f1655359642.ts
#EXT-X-KEY:METHOD=AES-
http://ServerAddress/f1655359643.ts
#EXT-X-KEY:METHOD=AES-
#EXTINF:0.984,
http://ServerAddress/f1655359644.ts
```
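On the seek question, a zero-tolerance seek prevents AVPlayer from snapping to a later position. A minimal sketch, assuming `player` is an AVPlayer on a live item whose seekable range covers the target time:

```swift
import AVFoundation

// Sketch: seek exactly `seconds` back with zero tolerance so AVPlayer
// cannot snap forward toward the live edge. Assumes the seekable range
// of the live item still covers the target time.
func seekBack(_ player: AVPlayer, seconds: Double = 2.0) {
    let target = CMTimeSubtract(player.currentTime(),
                                CMTime(seconds: seconds, preferredTimescale: 600))
    player.seek(to: target, toleranceBefore: .zero, toleranceAfter: .zero)
}
```

With a one-second target duration and a short sliding window, the seekable range may simply no longer contain the target, in which case the player clamps to the nearest reachable time.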
Post not yet marked as solved
0 Replies
125 Views
Hello, I have an HLS manifest with multiple audio and multiple subtitle tracks. I use AVPlayer on iOS to play this HLS asset. In the iOS app I have the freedom to display a selective subtitle and audio track list to the user, but when the user uses AirPlay on iOS, all the subtitle and audio tracks available in the HLS manifest are displayed to the user on the Apple TV (or any AirPlay-compatible device). Let me explain with one example. Say the HLS manifest has the following tracks:
English, Spanish & German audio tracks
English, Italian, Spanish & German subtitle tracks
When the user uses AirPlay, I want to hide German and display only the following tracks on the Apple TV:
English & Spanish audio tracks
English, Italian & Spanish subtitle tracks
The primary intent here is that the user should not be able to see German anywhere on the target AirPlay device. How do I achieve this in code? Note: it's difficult to modify the source HLS manifest file.
Post not yet marked as solved
2 Replies
170 Views
Hi all, I'm streaming HEVC 1080p/2160p at 25/50 frames to an Apple TV 4K 1st gen. In both cases the device can display the content without major issues, even HEVC 2160p50, which is very intense on the CPU, so the device is clearly capable of handling it. The problem begins as soon as I enable FairPlay DRM: the Apple TV shows sporadic video freezing, and the Apple TV console logs show "dropped frames" as the main reason for it. The bigger the resolution and frame rate, the more often the video freezes. Even with HEVC 720p50 there are some occurrences; I did not have any with 576p50. It seems to be an issue related to the pure CPU cost of decoding and decrypting the content, but AFAIK the Apple TV has an independent chipset just for decryption. Is this a known issue? Any ideas what it could be? Why would enabling DRM/FairPlay affect performance that much? Looking forward to the replies. Thanks!
Post not yet marked as solved
3 Replies
327 Views
Hello, I am a Flip.Shop developer. Our site is having a problem displaying a video whose size adjusts dynamically to the width and height of the parent component. Please visit this page: https://flip.shop and scroll through a few posts. On all normal browsers (Chrome/Firefox) the video loads nicely, but on Safari 15.5 (desktop and mobile) you can see a flicker for a while: the first frame of the video doesn't adjust to the size. The video component looks like this:

```html
<video preload="auto" loop="" playsinline="" webkit-playsinline="" x5-playsinline=""
       src="blob:https://flip.shop/1039dfe6-a1f4-4f80-822b-250665225c68"
       autoplay="" style="width: 100%; height: 100%;">
</video>
```

and the CSS of the parent component looks like this:

```css
width: 100%;
height: 100%;
position: relative;
& video {
  object-fit: cover;
}
```

Is there any solution to prevent the video from going crazy in the Safari browser? The only workaround that seems to work is to show a poster covering the video until the video plays. Unfortunately, there is no event to inform us that Safari is freaking out, so this poster has to be removed after a 300-500 millisecond timeout connected to the "play" event, which significantly delays the display of this video (on Safari).
Post marked as solved
1 Reply
204 Views
I have a downchannel half-connection to the Amazon Alexa cloud: https://developer.amazon.com/en-US/docs/alexa/alexa-voice-service/manage-http2-connection.html#create Now, if the app enters the background, my connection gets broken. Even if I put it in a background task, I cannot maintain the connection with pings. How do I maintain this HTTP persistent connection when the app is in the background, please?
Post not yet marked as solved
0 Replies
180 Views
Hi everyone, I am having a problem with AVPlayer when I try to play some videos. The video starts for a few seconds, but immediately after I see a black screen, and in the console there are the following errors:

```
https://...manifest.m3u8
-12642 "CoreMediaErrorDomain" "The operation could not be completed. (CoreMediaErrorDomain error -12642 - No matching mediaFile found from playlist)"
-12880 "CoreMediaErrorDomain" "Can not proceed after removing variants"
```

The strange thing is that if I try to play the same video on multiple devices, it works on some and not on others. For example, it works on an iPhone 5SE but not on an iPad Pro 11" 2nd gen. or an iPhone 11. I've tried searching around to figure out what may be causing the problem, but there doesn't seem to be a clear solution. Has anyone had a similar problem? Do you have any ideas about the reason for it?
Post marked as solved
2 Replies
323 Views
In our tvOS app we have to inject a tiny bit of data into the master manifest and leave the rest as is. The idea I was trying to implement is intercepting the master manifest request with an AVAssetResourceLoaderDelegate and just redirecting all subsequent requests, so AVKit can handle them on its own. In order to actually mimic the original requests, I made a copy of what is in AVAssetResourceLoadingRequest and adjusted only the parts required:

```swift
override func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                             shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
    guard let url = loadingRequest.request.url, url.scheme == Self.assetScheme else {
        return super.resourceLoader(resourceLoader, shouldWaitForLoadingOfRequestedResource: loadingRequest)
    }
    guard var urlComponents = URLComponents(url: url, resolvingAgainstBaseURL: false) else {
        LOG.error("Could not obtain url components from resource request: \(loadingRequest.request)")
        return super.resourceLoader(resourceLoader, shouldWaitForLoadingOfRequestedResource: loadingRequest)
    }
    urlComponents.scheme = "https"
    guard let assetURL = try? urlComponents.asURL() else {
        LOG.error("Could not make url from URL components \(urlComponents)")
        return super.resourceLoader(resourceLoader, shouldWaitForLoadingOfRequestedResource: loadingRequest)
    }
    let assetURLRequest = (loadingRequest.request as NSURLRequest).mutableCopy() as? NSMutableURLRequest
    assetURLRequest?.url = assetURL
    guard let taskRequest = assetURLRequest?.copy() as? URLRequest else {
        LOG.error("Could not convert url request \(String(describing: assetURLRequest))")
        return super.resourceLoader(resourceLoader, shouldWaitForLoadingOfRequestedResource: loadingRequest)
    }
    if url == masterManifestURL {
        // ...custom logic comes here...
    } else {
        loadingRequest.response = HTTPURLResponse(url: assetURL, statusCode: 302, httpVersion: "HTTP/1.1", headerFields: nil)
        loadingRequest.redirect = taskRequest
        loadingRequest.finishLoading()
    }
    return true
}
```

That works just fine, but only for VOD assets. For linear/live assets, however, only the first bunch of data is loaded; when it ends, the player does not request the next part of the sliding window, and it hangs loading. I believe the problem is somewhere in the custom logic, so I have listed it separately:

```swift
urlSession.dataTask(with: taskRequest) { [weak self] data, response, error in
    loadingRequest.response = response
    if let data = data, let dataRequest = loadingRequest.dataRequest, let self = self,
       let manifestString = String(data: data, encoding: .utf8) {
        let adjustedManifestString = self.adjustAudioMetadataForManifest(manifestString)
        if let adjustedData = adjustedManifestString.data(using: .utf8) {
            dataRequest.respond(with: adjustedData)
        } else {
            LOG.error("Could not complement audio labels in master manifest")
            dataRequest.respond(with: data)
        }
    }
    if let error = error {
        loadingRequest.finishLoading(with: error)
    } else {
        loadingRequest.finishLoading()
    }
}.resume()
```

I noticed that unlike the Apple player, the custom resource loader's requests have different encoding headers. It also was not clear whether data length and offset are more crucial for linear than for VOD, so I added these headers as well:

```swift
if let dataReq = loadingRequest.dataRequest, !dataReq.requestsAllDataToEndOfResource {
    let offsetEnd = dataReq.requestedOffset + Int64(dataReq.requestedLength) - 1
    assetURLRequest?.addValue("bytes=\(dataReq.requestedOffset)-\(offsetEnd)", forHTTPHeaderField: "Range")
}
if loadingRequest.contentInformationRequest != nil {
    assetURLRequest?.setValue("identity", forHTTPHeaderField: "Accept-Encoding")
}
```

I also fulfilled the contentInformationRequest, which I had forgotten originally (though it still worked for VOD):

```swift
if let contentInformationRequest = loadingRequest.contentInformationRequest {
    contentInformationRequest.contentLength = Int64.max
    if let mimeType = response?.mimeType {
        let utiType = UTTypeCreatePreferredIdentifierForTag(kUTTagClassMIMEType, mimeType as CFString, nil)
        contentInformationRequest.contentType = utiType?.takeRetainedValue() as? String
    }
    contentInformationRequest.isByteRangeAccessSupported = true
}
```

and finally adjusted the data response with the requested length:

```swift
if dataRequest.requestsAllDataToEndOfResource {
    dataRequest.respond(with: dataToRespond)
} else {
    let offsetStart = Int(dataRequest.requestedOffset)
    let offsetEnd = offsetStart + dataRequest.requestedLength
    dataRequest.respond(with: dataToRespond[offsetStart ..< offsetEnd])
}
```

All those adjustments happen only for the master manifest request; I just redirect all other requests to the proper URL with the https scheme. Unfortunately, none of the adjustments seem to make any difference: everything works equally well with VOD assets but doesn't allow linear assets to play beyond the very first video segment (the sliding window just hangs). Is there some documentation on how to properly write a custom resource loader for a linear/live asset that I can refer to in order to make it work?
Post not yet marked as solved
0 Replies
184 Views
Hi Team, I am able to use AVAssetDownloadTask for downloading HLS content with pause, resume, and cancel functionality. However, there is one scenario remaining: the manifest URLs are signed and expire after a few hours. I need to add support for resuming after the manifest URL has expired but before the download completes. I do not want to restart the download; instead, I want to resume it with a new HLS manifest URL.
Post not yet marked as solved
1 Reply
157 Views
Hi, I ran mediastreamvalidator and hlsreport for the first time, and while the "issues" (MUST and SHOULD) sentences are perfectly understandable, I was wondering about the meaning of the two tables at the end of the report. In a "Discontinuity Information" section there are two tables, called "discontinuity duration table" and "start time table". Is there any documentation on those? Are they a symptom of an issue with the CMAF MP4 files created by the packager I use? Thanks for your help.
Post not yet marked as solved
1 Reply
179 Views
Hey there, I'm new here but I could really use some help. I've been trying and searching around for a long time now but couldn't get there. I want to create an HLS audio-only stream (without any video files) with multiple audio selectors to switch between languages. Is it possible, and how?
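It is possible: HLS expresses alternate audio as EXT-X-MEDIA renditions sharing a GROUP-ID, and an audio-only variant stream can point at one of them. A minimal multivariant-playlist sketch (the URIs, bandwidth, and codec string are placeholders, assuming AAC-LC audio):

```
#EXTM3U
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio",LANGUAGE="en",NAME="English",DEFAULT=YES,AUTOSELECT=YES,URI="audio_en.m3u8"
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio",LANGUAGE="de",NAME="Deutsch",DEFAULT=NO,AUTOSELECT=YES,URI="audio_de.m3u8"
#EXT-X-STREAM-INF:BANDWIDTH=144000,CODECS="mp4a.40.2",AUDIO="audio"
audio_en.m3u8
```

Each referenced media playlist then lists the audio segments for that language; the player's language menu is driven by the NAME and LANGUAGE attributes.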
Post not yet marked as solved
1 Reply
252 Views
I am looking for some guidance on an authentication issue related to AirPlay sessions for HLS streams. Our app currently uses token authentication for our HLS streams, but for AirPlay sessions we only authenticate on the master manifest, since the user has already authenticated via the app. However, this does leave open the potential for someone to fake an AirPlay User-Agent and request rendition manifests and segments without authentication. We aren't able to perform token authentication on the AirPlay requests, since we can't pass the custom header with the token across the AirPlay session boundary. Support showed me a potential workaround using AVAssetResourceLoader, but that would not work in my case, as I don't have the ability to make changes in the iOS app. So that leads me to trying to solve this issue at the CDN level. What I would like to do is verify that requests are coming from valid AirPlay devices/sessions by checking for headers that are included specifically for an AirPlay session. Searching online led me to two possible headers, "X-Apple-Device-ID" and "X-Apple-Session-ID", but I have not been able to see them when checking on the CDN. Is there any documentation on default/standard headers that would/should appear in AirPlay requests? Thanks
Post not yet marked as solved
1 Reply
292 Views
Our custom software takes RTSP from CCTV cameras and turns it into an HLS stream that we play through the Clappr player. One particular model of camera won't play on iOS devices; it is just all black when trying to play it on iOS (all versions of iOS). It still works great on PC, Mac, and Android. Our HLS stream is video only; there is no audio. The problem is not with the player, because I tested other players and they do the same thing on iOS: a black screen, or at best they freeze on the first picture and never move. Any advice on what the problem may be? Thank you so much!
Post not yet marked as solved
0 Replies
182 Views
A simple AVPlayer sample in Swift for iOS 15.4.1, with an interstitial specified via an EXT-X-DATERANGE tag. The interstitial is displayed as expected, but no notifications are generated for either AVPlayerInterstitialEventMonitor.currentEventDidChangeNotification or .eventsDidChangeNotification. Tested on both a simulator and a device. Suggestions?
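For comparison, a minimal observation sketch (the stream URL is a placeholder). One assumption worth checking is that the AVPlayerInterstitialEventMonitor is created before playback starts and kept strongly referenced, since a deallocated monitor posts nothing:

```swift
import AVFoundation

// Minimal sketch: create the monitor up front and retain it for the life
// of playback, then observe its notifications. The URL is a placeholder.
final class InterstitialObserver {
    let player: AVPlayer
    let monitor: AVPlayerInterstitialEventMonitor

    init(url: URL) {
        player = AVPlayer(url: url)
        monitor = AVPlayerInterstitialEventMonitor(primaryPlayer: player)
        NotificationCenter.default.addObserver(
            forName: AVPlayerInterstitialEventMonitor.eventsDidChangeNotification,
            object: monitor, queue: .main
        ) { [weak self] _ in
            guard let self = self else { return }
            print("interstitial events: \(self.monitor.events)")
        }
    }
}
```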
Post not yet marked as solved
1 Reply
244 Views
We use a hidden text track and the cuechange event to sync timed events on our pages with the live HLS stream we are producing. It fires correctly on iPhone 8, 11, and 12, but will not fire when using an iPhone 13 Pro. We see the 13 Pro detect the text track by listening for the addtrack event, but the cuechange event never fires. Is this a known bug? Or is there some special syntax or encoding for the stream that is required for the iPhone 13?
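For reference, the wiring described above looks roughly like this, sketched as a hypothetical `watchCues` helper where `track` is the TextTrack obtained from the addtrack event:

```javascript
// Hypothetical helper: forward cuechange events to a callback with the
// currently active cues (activeCues can be null on some engines).
function watchCues(track, onCue) {
  track.addEventListener("cuechange", () => {
    const cues = track.activeCues ? Array.from(track.activeCues) : [];
    onCue(cues);
  });
}
```

Because it only uses the EventTarget interface, the same helper works against any object that dispatches cuechange, which makes the handler itself easy to rule out as the culprit.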
Post not yet marked as solved
0 Replies
176 Views
Hi, I am trying to publish parts or the whole of my iCloud on a website. I have been looking for days to find a way to do that. I am hosting the website from a Mac mini with macOS Server. The easiest way I see is with CloudKit. Does anyone have an idea? In the end, something like https://www.icloud.com would be exactly what I want. But how? XD PS: I am open to everything. It can be done with an app (Xcode), or with server stuff that I have no idea about. Everything.