HTTP Live Streaming


Send audio and video over HTTP from an ordinary web server for playback on Mac, iOS, and tvOS devices using HTTP Live Streaming (HLS).

HTTP Live Streaming Documentation

Posts under HTTP Live Streaming tag

108 Posts
Post not yet marked as solved
0 Replies
180 Views
Hi everyone, I am having a problem with AVPlayer when I try to play some videos. The video starts for a few seconds, but immediately after I see a black screen, and the console shows the following errors: https://...manifest.m3u8 -12642 "CoreMediaErrorDomain" "The operation could not be completed. (CoreMediaErrorDomain error -12642 - No matching mediaFile found from playlist)" -12880 "CoreMediaErrorDomain" "Can not proceed after removing variants". The strange thing is that when I play the same video on multiple devices, it works on some and not on others: for example, it works on an iPhone 5SE but not on an iPad Pro 11'' (2nd gen.) or an iPhone 11. I've tried searching around to figure out what may be causing the problem, but there doesn't seem to be a clear solution. Has anyone had a similar problem? Do you have any ideas about the cause?
Posted by LuxLux.
Post not yet marked as solved
1 Reply
442 Views
I've built a web app that uses WebRTC to allow people to video chat in the browser and not be forced to download an app. However, it would be really helpful if iOS users could stream their video using their native camera app and not just the RTC element in the browser. Is this possible? I've found a way to open the native camera app using this HTML: <input type="file" accept="video/*" capture="environment"> However, this only allows the user to upload their video and not stream it.
Post not yet marked as solved
1 Reply
244 Views
We use a hidden text track and the cuechange event to sync timed events on our pages with the live HLS stream we are producing. It fires correctly on the iPhone 8, 11, and 12, but will not fire on an iPhone 13 Pro. We see the 13 Pro detect the text track by listening for the addtrack event, but the cuechange event never fires. Is this a known bug? Or is there some special syntax or encoding required in the stream for the iPhone 13?
Posted by aclaasen.
Post not yet marked as solved
0 Replies
184 Views
Hi Team, I am able to use AVAssetDownloadTask for downloading HLS content with pause, resume, and cancel functionality. However, there is one scenario remaining: the manifest URLs are signed and expire after a few hours. I need to add support for resuming when the manifest URL expires before the download completes. I do not want to restart the download; instead, I want to resume it with a new HLS manifest URL.
Post not yet marked as solved
0 Replies
182 Views
Simple AVPlayer sample in Swift for iOS 15.4.1, with an interstitial specified via the EXT-X-DATERANGE tag. The interstitial is displayed as expected, but no notifications are generated for either AVPlayerInterstitialEventMonitor.currentEventDidChangeNotification or .eventsDidChangeNotification. Tested on both a simulator and a device. Suggestions?
Post not yet marked as solved
11 Replies
3k Views
Does AppleTV 4K support HLG in any configuration? When I connect my AppleTV 4K to a Samsung TV with HLG support via HDMI, AVPlayer.availableHDRModes.contains(.hlg) returns false. However, AVPlayer.availableHDRModes.contains(.hdr10) returns true. https://support.apple.com/en-us/HT208074 only mentions HDR10 and Dolby Vision support. Is there any way to play HLG video on AppleTV 4K, similar to how it works on newer iPhones and iPads?
Posted by _tom_.
Post not yet marked as solved
1 Reply
1.3k Views
It appears that, on initialization, an AVURLAsset has a copy of the cookies from HTTPCookieStorage.shared.cookies, unless otherwise specified with the options parameter. This array of HTTPCookie is merely a copy of iOS's cookie store at the time of initialization. If the OS's cookie store updates, the player/asset does not begin to use the most up-to-date cookies, instead using its original copy. How can I go about updating the player's/asset's cookie store to the current, most up-to-date, cookie store?
Post not yet marked as solved
0 Replies
176 Views
Hi, I am trying to publish parts of, or my whole, iCloud on a website, and I have been looking for days to find a way to do it. I am hosting the website from a Mac mini with macOS Server. The easiest way I see is with CloudKit. Does anyone have an idea? In the end, something like https://www.icloud.com would be exactly what I want. But how? PS: I am open to anything. It can be done with an app (Xcode) or with server-side technology I know nothing about.
Post marked as solved
3 Replies
3.8k Views
Both standard mp4 files and streaming HLS files are experiencing substantial playback and rendering issues on iOS 15. This includes: Safari immediately crashes; video displays only black (occasionally audio can be heard); video is frozen on the first frame despite the time updating; substantial load times (10+ seconds) where playback should be immediate. GPU Process: Media has been disabled, yet the issues persist. Safari immediately crashes with GPU Process: WebGL enabled. These videos are being rendered via WebGL (three.js). None of these issues were present on iOS 14. I'm on an iPad Pro 12.9 (2020).
Posted by VILLMER.
Post not yet marked as solved
0 Replies
202 Views
Hello, we're developing a video streaming service. We can't get the video to stream on AirPlay 2-enabled TVs; there is only a loader visible on the TV. We can only get the sound to stream, to AirPlay 2 speakers. The Media Stream Validator and the HLS report on the playlists show that the video segments are recognized but cannot be processed: "Processed 0 of 600 segments", etc. We're at a loss on how to debug this. Is there any way to get access to AirPlay 2 logs to see what isn't working?
Posted by vaike.
Post not yet marked as solved
0 Replies
157 Views
Hello, we are experiencing a quality drop when streaming via HTTP Live Streaming on iOS 15. The file behaves as expected on iOS 14 and 13. Is this a known issue? If so, when can we expect a fix, or is there a workaround?
Post not yet marked as solved
0 Replies
248 Views
Hi, I have an app that uses AVPlayer to stream and play videos (HLS), but I'm struggling to find a way to do the same with fmp4. This is what I use to play my HLS stream; I tried simply replacing the URL with the fmp4 one, but it does not work.

private func connect() {
    let stringUrl = "https://wolverine.raywenderlich.com/content/ios/tutorials/video_streaming/foxVillage.m3u8"
    let url = URL(string: stringUrl)!
    let asset = AVURLAsset(url: url)
    let item = AVPlayerItem(asset: asset)
    if #available(iOS 10.0, *) {
        item.preferredForwardBufferDuration = Double(50000) / 1000
    }
    if #available(iOS 13.0, *) {
        item.automaticallyPreservesTimeOffsetFromLive = true
    }
    self.player = AVPlayer(playerItem: item)
    let playerLayer = AVPlayerLayer(player: self.player)
    playerLayer.frame = self.playerView.bounds
    playerLayer.videoGravity = .resizeAspect
    self.playerView.layer.addSublayer(playerLayer)
    self.videoLayer = playerLayer
    self.videoLayer?.frame = self.playerView.bounds
    player?.play()
}

I haven't had any luck looking for a possible solution, and I'm out of ideas. I'd be really grateful if anyone could point me in a good direction.
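A note on the fmp4 question above: AVPlayer does not play a bare fragmented-MP4 URL directly; it expects an HLS playlist that declares the initialization segment with EXT-X-MAP and lists the fMP4 media segments. A minimal sketch of such a playlist (all file names here are hypothetical placeholders):

```
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:0
#EXT-X-MAP:URI="init.mp4"
#EXTINF:6.000000,
segment0.m4s
#EXTINF:6.000000,
segment1.m4s
#EXT-X-ENDLIST
```

Pointing the existing AVURLAsset code at a playlist of this shape, rather than at the .mp4 file itself, is the usual approach.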
Post not yet marked as solved
2 Replies
268 Views
We are continuously getting playback failures for a live stream, with error logs such as:
// Error 1
Segment exceeds specified bandwidth for variant || The operation couldn’t be completed. (CoreMediaErrorDomain error -12889.)
// Error 2
The operation couldn’t be completed. (CoreMediaErrorDomain error -12888 - Playlist File unchanged for longer than 1.5 * target duration)
// Error 3
Segment exceeds specified bandwidth for variant
If we could find out what the -16042 error actually signifies, that would be great. P.S.: We can provide any details about the stream if required.
Posted by vthakur93.
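For reference on the -12888 error above, the staleness rule it describes is simple arithmetic: a live playlist must change at least once every 1.5 × its target duration, or the client treats the stream as stalled. A minimal sketch of that check (the helper name is hypothetical):

```swift
import Foundation

// Rule behind CoreMediaErrorDomain -12888: a live playlist that has not
// changed for longer than 1.5 x its target duration is treated as stalled.
func playlistIsStale(targetDuration: TimeInterval,
                     secondsSinceLastChange: TimeInterval) -> Bool {
    return secondsSinceLastChange > 1.5 * targetDuration
}
```

So with a 10-second target duration, the playlist must be refreshed on the origin within 15 seconds of its last change.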
Post not yet marked as solved
0 Replies
666 Views
Dear Apple experts, our project uses the libUSB library to interact with a USB-based camera device. Our application works fine on macOS Mojave (10.14.6). When the new macOS 12 beta became available, we tested our code, but when we try to claim the interface via the "CreateInterfaceIterator" API, we get a "kIOReturnExclusiveAccess" error code and ultimately our application fails. The failure is observed in both libUSB versions 1.0.23 and 1.0.24. Could you help us by explaining whether there is a change in the new OS with respect to access to USB devices?
Posted by Akshit04.
Post not yet marked as solved
0 Replies
199 Views
I would like to implement an OTT service with an Apple HLS playback URL, and I wonder what the best practice is for serving HLS to the player: (1) use two separate endpoints for the playlist and the segments (so the playlist is not cached at the middle layer, even though segments can be cached), or (2) use the same endpoint for both the playlist and the segments (again with the playlist not cached at the middle layer, even though segments can be cached). I suspect scenario 1 can cause problems on the player side, making the video and audio streams unsynchronized or the stream less smooth compared with scenario 2. Can somebody give me a suggestion, please? Thank you.
Posted by jungjaid.
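On the caching question above: either endpoint layout can work; what usually matters is that the CDN or middle layer applies different cache lifetimes per content type, since live playlists mutate constantly while segments are immutable once published. A sketch of that idea in an nginx-style origin config (the paths and values are illustrative assumptions, not a recommendation):

```
# Playlists change on every segment boundary: cache briefly or not at all.
location ~ \.m3u8$ {
    add_header Cache-Control "max-age=2";
}
# Segments never change once written: cache aggressively.
location ~ \.(ts|m4s|mp4)$ {
    add_header Cache-Control "max-age=86400";
}
```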
Post not yet marked as solved
2 Replies
715 Views
I'm trying to use the sample code associated with the talk "Author fragmented MPEG-4 content with AVAssetWriter", which can be found here. It works well when I run it on macOS, but after adapting it to run on iOS (basically moving the code in the main file to a view controller), it doesn't work. The problem is that the function assetWriter(_:didOutputSegmentData:segmentType:segmentReport:) is never called for the last segment. On macOS, the last segment is reported after calling AVAssetWriter.finishWriting(completionHandler:) but before the completionHandler block is invoked. On iOS, nothing happens at that point. Is there anything I can do on my side to fix this problem? Thanks in advance!
Posted by rlaguilar.
Post not yet marked as solved
1 Reply
572 Views
AVPlayer gets the list of URLs from the m3u8 file. I need to add a query string at the end of each URL. Is there any option in AVPlayer to do this? Example HLS URL: http://example.com/hls.m3u8

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.417000,ts
#EXTINF:10.417000,ts
#EXTINF:9.470000,ts
#EXTINF:10.417000,ts
#EXTINF:9.470000,ts
#EXTINF:10.417000,ts
#EXTINF:9.470000,ts
#EXTINF:3.840611,ts
#EXT-X-ENDLIST

AVPlayer tries to download http://example.com/1.ts, but I want AVPlayer to append "?st=2020-09-01T13%3A59%3A03Z&se=2020-09-02T13%3A59%3A03Z&sp=rl&sv=2018-03-28&sr=b&sig=Pua9sv8mgvPF6gNwuBSghdEq%2BefMFmwBuyUdjCetmw4%3D", so that it requests http://example.com/1.ts?st=2020-09-01T13%3A59%3A03Z&se=2020-09-02T13%3A59%3A03Z&sp=rl&sv=2018-03-28&sr=b&sig=Pua9sv8mgvPF6gNwuBSghdEq%2BefMFmwBuyUdjCetmw4%3D instead of http://example.com/1.ts.
Posted by nshakeeb.
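On the query-string question above: AVPlayer itself exposes no option to rewrite segment URLs taken from an m3u8; the usual workarounds are rewriting the playlist server-side or intercepting requests with an AVAssetResourceLoaderDelegate. Either way, the URL manipulation itself can be sketched like this (the helper name is hypothetical):

```swift
import Foundation

// Append an already-percent-encoded token (e.g. a signed-URL query string)
// to a URL, preserving any query string the URL already has.
func appendingQuery(_ url: URL, token: String) -> URL {
    var components = URLComponents(url: url, resolvingAgainstBaseURL: false)!
    if let existing = components.percentEncodedQuery, !existing.isEmpty {
        components.percentEncodedQuery = existing + "&" + token
    } else {
        components.percentEncodedQuery = token
    }
    return components.url!
}
```

For example, applying this to http://example.com/1.ts with the token "st=2020-09-01T13%3A59%3A03Z" yields http://example.com/1.ts?st=2020-09-01T13%3A59%3A03Z.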
Post marked as solved
3 Replies
519 Views
I've been maintaining adaptive HLS+FairPlay streams with audio and video for years, and my implementation works great. However, I'm now also implementing captions/subtitles, and I'm having trouble with the latter. I'm able to generate my HLS streams with WebVTT subtitles, and they work great. But as soon as I encrypt the streams, Apple players stop working (they stall forever), even though my FairPlay implementation works perfectly when no subtitles are involved. I'm not encrypting the WebVTT chunks: they travel as plain text, as stated in Apple's guidelines. I believe this may be the issue: encrypted A/V streams with an unencrypted subtitles stream. However, encrypting plain-text subtitles with SAMPLE-AES makes no sense to me, and so far I have been unable to find a single HLS example online with subtitles that also has FairPlay encryption; the documents I have about FairPlay also say nothing about this. I've also tried applying CEA-608 closed captions in the video stream, and this actually works great with FairPlay. But CEA-608 has its own issues, so I would like to migrate to WebVTT, which also works great, except when FairPlay is involved. I understand that Apple also says I could use TTML (IMSC1) inside fMP4, which I suspect may be SAMPLE-AES encryptable. However, given my customers' use cases, I need to use the TS format for HLS, so I can't use fMP4. With all this in mind, does anybody know how to properly configure HLS+FairPlay with a plain-text WebVTT subtitles stream? Please note this is about live streaming, not VOD or offline playback. Thanks.
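On the FairPlay + WebVTT question above: one layout that is at least structurally valid is a master playlist whose A/V variants are SAMPLE-AES encrypted while the subtitles media playlist simply carries no EXT-X-KEY tag, so the WebVTT chunks stay plain text. A sketch (the URIs and attribute values are hypothetical):

```
#EXTM3U
#EXT-X-MEDIA:TYPE=SUBTITLES,GROUP-ID="subs",NAME="English",LANGUAGE="en",DEFAULT=YES,AUTOSELECT=YES,URI="subs_en.m3u8"
#EXT-X-STREAM-INF:BANDWIDTH=2000000,CODECS="avc1.64001f,mp4a.40.2",SUBTITLES="subs"
video_2m.m3u8
```

Here subs_en.m3u8 would list the .vtt segments with no EXT-X-KEY line at all, while video_2m.m3u8 declares EXT-X-KEY:METHOD=SAMPLE-AES as usual.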
Post not yet marked as solved
5 Replies
1.7k Views
We have live HLS streams with separate AAC audio for multi-language tracks. Previously we had 6-second segments, which led to a small variation in audio segment length, although this played back everywhere on iOS 14/tvOS 14. On iOS 15/tvOS 15 these streams fail to play unless we align the audio and video segment lengths, which in turn requires us to run 4- or 8-second segments. To get back to 6, we would need to sample at a much higher rate (192kHz), as opposed to the 96kHz we have working today or the 48kHz we had working previously. Has anyone else spotted this?
Posted by imbimp.