HTTP Live Streaming

Send audio and video over HTTP from an ordinary web server for playback on Mac, iOS, and tvOS devices using HTTP Live Streaming (HLS).

HTTP Live Streaming Documentation

Posts under HTTP Live Streaming tag

108 Posts
Post not yet marked as solved
0 Replies
41 Views
Hi all, I am using Azure Media Services and CDN, and I have set up some rules on the CDN side that only allow traffic from my domain. I find this works well on a Windows PC and a MacBook, but not on an iPhone or iPad. For further investigation, I captured a network trace and found that the request didn't include the 'Origin' header, so the request is denied by the CDN rules. This can be observed using either the Chrome or the Safari browser on iPhones and iPads. Has anyone seen this issue before? Is it related to the iOS system or to AVPlayer? Is there any workaround for this issue?
Posted by stonehan. Last updated.
Post not yet marked as solved
0 Replies
60 Views
Hello all, I am currently developing a video streaming site and am working on the video player. The site will serve video content using HLS, given that this will be quite sizable, long-form video content. Since the main functionality of the site is watching streaming video, when a user clicks on a video and the player page opens, the desire is to have the video start playing automatically with sound.

Safari's autoplay policy rejects attempts to autoplay video content with audio unless the audio starts muted; otherwise, the user needs to interact with the player in order to start playback with audio unmuted. Given that the main purpose of the site is to watch these videos, it would be hugely detrimental to require the user to interact with the player to either start playing the video or to unmute it.

Many sites such as YouTube, Twitch, and others obviously have autoplay functionality that works as I would expect for a video-centric site in Safari. Given their size, I imagine they have some agreement with the Safari team to have their sites pre-authorized for autoplay with audio. How do I go about doing this? I'm hoping someone can share information that would help me configure things correctly, so that my Safari users won't have to manually start playback or turn on audio in order to enjoy the videos the same way Chrome and Firefox users will automatically. Thanks!
Posted. Last updated.
Post not yet marked as solved
0 Replies
56 Views
Hi everybody, I think I have encountered a problem when selecting a new audible media group in HLS. I will explain the specific case below. The HLS manifest contains 2-3 audio tracks, which are parsed correctly by AVFoundation, giving rise to the different audible media groups. When we select a new audio track, we do it as the documentation says:

guard let item = currentPlayerItem,
      let mediaSelectionGroup = item.asset.mediaSelectionGroup(forMediaCharacteristic: .audible) else { return }
let selected = mediaSelectionGroup.options[index]
item.select(selected, in: mediaSelectionGroup)

So far everything works correctly; the problem appears when switching audio tracks constantly and repeatedly. If we do this, playback stays for a few seconds without any sound; it seems to be muted or lagging. After investigating in Charles, we see that when repeating the process above, the new audio track is requested correctly and its TS segments are returned correctly, but the audio does not play until the video is requested again and the next video TS segment is returned. I attach a screenshot of Charles so that this can be seen clearly: in the selected part, you can see that the new audio track is requested, and until AVPlayer asks again for the bitrate update, the audio does not play. What could be happening? Is there anything in AVFoundation we can change so that it doesn't happen? Thanks in advance, Luis Martínez
Posted by LuisA3. Last updated.
Post not yet marked as solved
0 Replies
50 Views
Hello, we are trying to set up a web stream server with MPEG-2 (H.262) video in the HLS container. Is there a way to play MPEG-2 codec video on iPhone Safari? Take care
Posted by uguru. Last updated.
Post not yet marked as solved
1 Reply
309 Views
We are playing an HLS stream with AVPlayer and trying to read the HLS manifest. We are able to detect the majority of the tags; however, the player is not detecting the EXT-X-DATERANGE tag that carries a DURATION attribute, i.e.:

#EXT-X-DATERANGE:ID="aba74c45-e963-45bf-8171-1f910c33f64a",DURATION=32.44

whereas the other EXT-X-DATERANGE tag is detected at the beginning of the manifest:

#EXT-X-DATERANGE:ID="aba74c45-e963-45bf-8171-1f910c33f64a",START-DATE="2022-03-10T13:18:15.179Z",PLANNED-DURATION=15,X-AD-ID="9858"
#EXT-X-DISCONTINUITY

We are using AVPlayer's metadata collector delegate method to detect the metadata:

func metadataCollector(_ metadataCollector: AVPlayerItemMetadataCollector,
                       didCollect metadataGroups: [AVDateRangeMetadataGroup],
                       indexesOfNewGroups: IndexSet,
                       indexesOfModifiedGroups: IndexSet) {}

We are not able to detect the EXT-X-DATERANGE tag with the DURATION attribute using the delegate above. Any help appreciated.
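If the metadata collector never surfaces that tag, one possible fallback (my own sketch, not the poster's code and not an AVFoundation mechanism) is to fetch the playlist yourself and parse the DATERANGE attribute list directly. A minimal Swift sketch; the attribute grammar here only covers quoted strings and plain unquoted values:

```swift
import Foundation

// Parse the attribute list of an #EXT-X-DATERANGE tag into a dictionary.
// Handles quoted-string and unquoted (decimal/enum) attribute values.
func parseDateRangeAttributes(_ line: String) -> [String: String]? {
    let prefix = "#EXT-X-DATERANGE:"
    guard line.hasPrefix(prefix) else { return nil }
    let body = String(line.dropFirst(prefix.count))
    var attrs: [String: String] = [:]
    var key = "", value = "", inQuotes = false, parsingKey = true
    for ch in body {
        switch ch {
        case "\"":
            inQuotes.toggle()
        case "=" where parsingKey:
            parsingKey = false
        case "," where !inQuotes:
            // End of one ATTRIBUTE=value pair; start the next.
            attrs[key] = value
            key = ""; value = ""; parsingKey = true
        default:
            if parsingKey { key.append(ch) } else { value.append(ch) }
        }
    }
    if !key.isEmpty { attrs[key] = value }
    return attrs
}
```

Calling it on the tag from the question, `parseDateRangeAttributes(#"#EXT-X-DATERANGE:ID="aba74c45-e963-45bf-8171-1f910c33f64a",DURATION=32.44"#)`, yields a dictionary with the ID and the DURATION string, which can then be converted with `Double(...)`.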
Posted. Last updated.
Post not yet marked as solved
0 Replies
153 Views
We used preferredForwardBufferDuration, but after 2-3 minutes we are unable to keep the video paused; it catches up to live if the pause lasts more than 2-3 minutes. Example:

[self.player.currentItem setPreferredForwardBufferDuration:60];
self.player.automaticallyWaitsToMinimizeStalling = YES;
self.player.currentItem.canUseNetworkResourcesForLiveStreamingWhilePaused = NO;

How do we stop the AVPlayer buffering while in the paused state? How do we clear the AVPlayer buffer during pause? Also, seekToTime is not working properly for a live channel: we want to seek just one or two seconds back, but with seekToTime, if AVPlayer is unable to seek to the earlier position, it jumps to a future position. The manifest provided to AVPlayer:

#EXT-X-VERSION:4
#EXT-X-TARGETDURATION:1
#EXT-X-MEDIA-SEQUENCE:1
#EXT-X-KEY:METHOD=AES-
http://ServerAddress/f1655359642.ts
#EXT-X-KEY:METHOD=AES-
http://ServerAddress/f1655359643.ts
#EXT-X-KEY:METHOD=AES-
#EXTINF:0.984,
http://ServerAddress/f1655359644.ts
Posted by Commscope. Last updated.
Post not yet marked as solved
0 Replies
125 Views
Hello, I have an HLS manifest with multiple audio and multiple subtitle tracks, and I use AVPlayer on iOS to play this HLS asset. In the iOS app I have the freedom to display a selective subtitle and audio track list to the user, but when the user uses AirPlay, all the subtitle and audio tracks available in the HLS manifest are displayed to the user on the Apple TV (or any AirPlay-compatible device).

Let me explain with one example. Say the HLS manifest has the following tracks: English, Spanish, and German audio tracks; English, Italian, Spanish, and German subtitle tracks. When the user uses AirPlay, I want to hide German and display only the following tracks on the Apple TV: English and Spanish audio tracks; English, Italian, and Spanish subtitle tracks.

The primary intent here is that the user should not be able to see German anywhere on the target AirPlay device. How can this be achieved in code? Note: it's difficult to modify the source HLS manifest file.
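Since AirPlay builds its track list from the manifest itself, one commonly discussed workaround, given that the origin manifest can't be changed, is to serve the multivariant playlist through a proxy (or a resource loader) that strips the unwanted renditions before the player ever sees them. A minimal sketch of just the filtering step; the proxy plumbing, and the assumption that every rendition line carries a LANGUAGE attribute, are mine:

```swift
import Foundation

// Remove #EXT-X-MEDIA renditions whose LANGUAGE attribute is in `blocked`.
// All other lines (variants, comments, URIs) pass through untouched.
func stripRenditions(from manifest: String, blockedLanguages blocked: Set<String>) -> String {
    manifest
        .components(separatedBy: "\n")
        .filter { line in
            guard line.hasPrefix("#EXT-X-MEDIA:") else { return true }
            // Keep the rendition unless it declares a blocked LANGUAGE="xx".
            return !blocked.contains { line.contains("LANGUAGE=\"\($0)\"") }
        }
        .joined(separator: "\n")
}
```

For the example in the question, `stripRenditions(from: manifest, blockedLanguages: ["de"])` would drop the German audio and subtitle renditions while leaving the variant streams intact. Note that if a variant's AUDIO/SUBTITLES group would end up empty, a full implementation would also need to adjust those attributes.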
Posted by sudipta_s. Last updated.
Post not yet marked as solved
3 Replies
327 Views
Hello, I am a Flip.Shop developer. Our site is having a problem displaying a video whose size adjusts dynamically to the width and height of the parent component. Please visit this page: https://flip.shop and scroll through a few posts. On all normal browsers (Chrome/Firefox) the video loads nicely, but on Safari 15.5 (desktop and mobile) you can see a flicker for a while: the first frame of the video can't adjust to the size. The video component looks like this:

<video preload="auto" loop="" playsinline="" webkit-playsinline="" x5-playsinline=""
       src="blob:https://flip.shop/1039dfe6-a1f4-4f80-822b-250665225c68"
       autoplay="" style="width: 100%; height: 100%;">
</video>

and the CSS of the parent component looks like this:

width: 100%;
height: 100%;
position: relative;
& video {
    object-fit: cover;
}

Is there any solution to prevent the video from going crazy in the Safari browser? The only workaround that seems to work is to show a poster covering the video until the video plays. Unfortunately, there is no event to inform us that Safari is misbehaving, so this poster has to be removed after a 300-500 millisecond timeout connected to the "play" event, which significantly delays the display of the video (on Safari).
Posted. Last updated.
Post not yet marked as solved
2 Replies
170 Views
Hi all, I'm streaming HEVC 1080p/2160p at 25/50 frames per second to a 1st-generation Apple TV 4K. In both cases the device can display the content without major issues, even HEVC 2160p50, which is very intense on the CPU, so the device is very capable of handling it. The problem begins as soon as I enable FairPlay DRM: the Apple TV shows some sporadic video freezing, and the Apple TV console logs show "dropped frames" as the main reason for it. The higher the resolution and frame rate, the more often the video freezes. Even at HEVC 720p50 there are some occurrences; there were none at 576p50. It seems to be an issue related to the pure CPU cost of decoding and decrypting the content, but as far as I know the Apple TV has an independent chipset just for decryption. Is this a known issue? Any ideas what it could be? Why could enabling DRM/FairPlay affect performance that much? Looking forward to the replies. Thanks!
Posted. Last updated.
Post not yet marked as solved
1 Reply
282 Views
When playing several short HLS clips using AVPlayer connected to a TV via Apple's Lightning-to-HDMI adapter (A1438), we often fail with these unknown errors: CoreMediaErrorDomain -12034 and CoreMediaErrorDomain -12158. Does anyone have a clue what the errors mean? Environment: iPhone 8, iOS 15.4, Lightning-to-HDMI adapter (A1438).
Posted by anders.u. Last updated.
Post not yet marked as solved
5 Replies
1.3k Views
I have some HLS videos that are encrypted with a key for streaming playback in my app. So far, my testing indicates that they work on device all the way through, but in the Simulator only up to iOS 14.5. In iOS 15 simulators, I get a crash that appears to be within private APIs, with this copied from the stack:

#0 0x0000000110584e8c in segPumpRequestCustomURLForCryptKey ()

Is anyone else getting this crash?
Posted by mredig. Last updated.
Post not yet marked as solved
1 Reply
292 Views
Our custom software takes RTSP from CCTV cameras and turns it into an HLS stream that we play through the calppr player. One particular model of camera won't play on iOS devices: it is just all black when trying to play it on iOS (all versions of iOS). It still works great on PC, Mac, and Android. Our HLS stream is video only; there is no audio. The problem is not with the player, because I tested other players and they do the same thing on iOS: a black screen, or at best they freeze on the first picture and never move. Any advice on what the problem may be? Thank you so much!
Posted by kmax1940. Last updated.
Post not yet marked as solved
1 Reply
252 Views
I am looking for some guidance on an authentication issue related to AirPlay sessions for HLS streams. Our app currently uses token authentication for our HLS streams, but for AirPlay sessions we only authenticate on the master manifest, since the user has already authenticated via the app. However, this does leave open the potential for someone to fake an AirPlay User-Agent and request rendition manifests and segments without authentication. We can't perform token authentication on the AirPlay requests, since we can't pass the custom header with the token across the AirPlay session boundary.

Support showed me a potential workaround using AVAssetResourceLoader, but that would not work in my case, as I don't have the ability to make changes in the iOS app. So that leads me to trying to solve this issue at the CDN level. What I would like to do is verify that requests are coming from valid AirPlay devices/sessions by checking for headers that are included specifically in an AirPlay session. Searching online led me to two possible headers, X-Apple-Device-ID and X-Apple-Session-ID, but I have not been able to see them when checking on the CDN. Is there any documentation on default/standard headers that would or should appear in AirPlay requests? Thanks
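Since custom headers don't survive the AirPlay hop, one alternative sometimes used at the CDN level (a sketch of my own, not an Apple-documented mechanism) is to move the credential into the URLs themselves: when serving the authenticated master manifest, append a short-lived signed token as a query parameter to every rendition URI, and have the CDN validate that parameter instead of a header. The rewriting step might look like this; the parameter name "token" and the signing scheme are assumptions:

```swift
import Foundation

// Append a signed token as a query parameter to every non-comment URI line
// of a playlist, so downstream requests stay authenticated without headers.
func signPlaylistURIs(_ playlist: String, token: String) -> String {
    playlist
        .components(separatedBy: "\n")
        .map { line -> String in
            let trimmed = line.trimmingCharacters(in: .whitespaces)
            // Tags and blank lines pass through; only URI lines are rewritten.
            guard !trimmed.isEmpty, !trimmed.hasPrefix("#") else { return line }
            let sep = trimmed.contains("?") ? "&" : "?"
            return line + sep + "token=" + token
        }
        .joined(separator: "\n")
}
```

A full implementation would also need to rewrite URI="..." attributes inside #EXT-X-MEDIA and similar tags; the sketch only handles plain URI lines.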
Posted. Last updated.
Post not yet marked as solved
1 Reply
157 Views
Hi, I ran mediastreamvalidator and hlsreport for the first time, and while the "issues" (MUST and SHOULD) sentences are perfectly understandable, I was wondering about the meaning of the two tables at the end of the report. In a "Discontinuity Information" section there are two tables called "discontinuity duration table" and "start time table". Is there any documentation on that? Is it a symptom of an issue with the CMAF MP4 file created by the packager I use? Thanks for your help
Posted. Last updated.
Post not yet marked as solved
1 Reply
179 Views
Hey there, I'm new here but I could really use some help. I've been trying and searching around for a long time now but couldn't get there. I want to create an HLS audio-only stream (without any video files) but with multiple audio selectors to switch between languages. Is it possible, and how?
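This is possible: an audio-only multivariant playlist can declare several EXT-X-MEDIA audio renditions in one group, plus a variant stream that references only that group and carries no video. A minimal hand-written sketch; the URIs, codec string, and bitrate are placeholders, and pointing the variant's URI at the default rendition's playlist is a common pattern for audio-only streams:

```
#EXTM3U
#EXT-X-VERSION:6
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="aud",LANGUAGE="en",NAME="English",DEFAULT=YES,AUTOSELECT=YES,URI="audio_en.m3u8"
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="aud",LANGUAGE="de",NAME="Deutsch",DEFAULT=NO,AUTOSELECT=YES,URI="audio_de.m3u8"
#EXT-X-STREAM-INF:BANDWIDTH=160000,CODECS="mp4a.40.2",AUDIO="aud"
audio_en.m3u8
```

Players that support rendition groups (AVPlayer among them) will then expose English and Deutsch as selectable audio tracks.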
Posted by mrmoon97. Last updated.
Post marked as solved
2 Replies
323 Views
In our tvOS app we have to inject a tiny bit of data into the master manifest and leave the rest as is. The idea I was trying to implement is intercepting the master manifest request with AVAssetResourceLoaderDelegate and just redirecting all subsequent requests, so AVKit can handle them on its own. In order to mimic the original requests, I made a copy of what is in AVAssetResourceLoadingRequest and adjusted only the parts required:

override func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                             shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
    guard let url = loadingRequest.request.url, url.scheme == Self.assetScheme else {
        return super.resourceLoader(resourceLoader, shouldWaitForLoadingOfRequestedResource: loadingRequest)
    }
    guard var urlComponents = URLComponents(url: url, resolvingAgainstBaseURL: false) else {
        LOG.error("Could not obtain url components from resource request: \(loadingRequest.request)")
        return super.resourceLoader(resourceLoader, shouldWaitForLoadingOfRequestedResource: loadingRequest)
    }
    urlComponents.scheme = "https"
    guard let assetURL = try? urlComponents.asURL() else {
        LOG.error("Could not make url from URL components \(urlComponents)")
        return super.resourceLoader(resourceLoader, shouldWaitForLoadingOfRequestedResource: loadingRequest)
    }
    let assetURLRequest = (loadingRequest.request as NSURLRequest).mutableCopy() as? NSMutableURLRequest
    assetURLRequest?.url = assetURL
    guard let taskRequest = assetURLRequest?.copy() as? URLRequest else {
        LOG.error("Could not convert url request \(String(describing: assetURLRequest))")
        return super.resourceLoader(resourceLoader, shouldWaitForLoadingOfRequestedResource: loadingRequest)
    }
    if url == masterManifestURL {
        // ...custom logic comes here...
    } else {
        loadingRequest.response = HTTPURLResponse(url: assetURL, statusCode: 302, httpVersion: "HTTP/1.1", headerFields: nil)
        loadingRequest.redirect = taskRequest
        loadingRequest.finishLoading()
    }
    return true
}

That works just fine, but only for VOD assets. For linear/live assets, however, only the first bunch of data is loaded; when it ends, the player does not request the next part of the sliding window and hangs loading. I believe the problem is somewhere in the custom logic, so I will list it separately:

urlSession.dataTask(with: taskRequest) { [weak self] data, response, error in
    loadingRequest.response = response
    if let data = data, let dataRequest = loadingRequest.dataRequest, let self = self,
       let manifestString = String(data: data, encoding: .utf8) {
        let adjustedManifestString = self.adjustAudioMetadataForManifest(manifestString)
        if let adjustedData = adjustedManifestString.data(using: .utf8) {
            dataRequest.respond(with: adjustedData)
        } else {
            LOG.error("Could not complement audio labels in master manifest")
            dataRequest.respond(with: data)
        }
    }
    if let error = error {
        loadingRequest.finishLoading(with: error)
    } else {
        loadingRequest.finishLoading()
    }
}.resume()

I noticed that unlike the Apple player, the custom resource loader requests have different encoding headers. It also was not clear whether data length and offset are more crucial for linear than for VOD, so I added this header as well:

if let dataReq = loadingRequest.dataRequest, !dataReq.requestsAllDataToEndOfResource {
    let offsetEnd = dataReq.requestedOffset + dataReq.requestedLength - 1
    assetURLRequest?.addValue("bytes=\(dataReq.requestedOffset)-\(offsetEnd)", forHTTPHeaderField: "Range")
}
if loadingRequest.contentInformationRequest != nil {
    assetURLRequest?.setValue("identity", forHTTPHeaderField: "Accept-Encoding")
}

I also fulfilled contentInformationRequest, which I originally forgot (but it still worked for VOD):

if let contentInformationRequest = loadingRequest.contentInformationRequest {
    contentInformationRequest.contentLength = Int64.max
    if let mimeType = response?.mimeType {
        let utiType = UTTypeCreatePreferredIdentifierForTag(kUTTagClassMIMEType, mimeType as CFString, nil)
        contentInformationRequest.contentType = utiType?.takeRetainedValue() as? String
    }
    contentInformationRequest.isByteRangeAccessSupported = true
}

and finally adjusted the data response to the requested length:

if dataRequest.requestsAllDataToEndOfResource {
    dataRequest.respond(with: dataToRespond)
} else {
    let offsetStart = Int(dataRequest.requestedOffset)
    let offsetEnd = Int(dataRequest.requestedOffset + dataRequest.requestedLength)
    dataRequest.respond(with: dataToRespond[offsetStart ..< offsetEnd])
}

All those adjustments happen only for the master manifest request; I just redirect all other requests to the proper URL with the https scheme. Unfortunately, the adjustments don't seem to make any difference: everything works equally well with VOD assets but does not allow linear assets to play beyond the very first video segment (the sliding window just hangs). Is there some documentation on how to properly implement a custom resource loader for a linear/live asset that I can refer to in order to make it work?
Posted. Last updated.
Post marked as solved
1 Reply
204 Views
I have a downchannel half-connection to the Amazon Alexa cloud: https://developer.amazon.com/en-US/docs/alexa/alexa-voice-service/manage-http2-connection.html#create Now, if the app enters the background, my connection is broken. Even if I put it in a background task, I cannot maintain the connection through pings. How do I maintain this persistent HTTP connection while the app is in the background, please?
Posted by tongxingx. Last updated.
Post not yet marked as solved
1 Reply
774 Views
We have a website which allows users to upload image, video, and audio files. They are then able to watch them via streaming; we deliver these as .m3u8 files. In Chrome and Safari, video and audio work fine, and the same goes for Android phones. On iPhone, videos play, but audio shows the play button with a slash through it. Here is a jsfiddle to try on an iPhone or simulator; you will see that the second one, the video, will play, while the audio file will not: https://jsfiddle.net/59ryjLub/1/ Note that neither will play on the web unless something like hls.js is used, but iOS devices are supposed to have native HLS support, and this is what seems to be broken for audio files. What can I do to get the audio files to play on iPhone mobile web? Please advise!
Posted by airmajd. Last updated.