HTTP Live Streaming


Send audio and video over HTTP from an ordinary web server for playback on Mac, iOS, and tvOS devices using HTTP Live Streaming (HLS).

HTTP Live Streaming Documentation

Posts under HTTP Live Streaming tag

92 Posts
Post not yet marked as solved
1 Reply
1.1k Views
Hi, we are getting big spikes of errors on very few programs in some live channels: [-16012:CoreMediaErrorDomain] [Error Domain=CoreMediaErrorDomain Code=-16012 "(null)"]. When this error occurs, AVPlayer stops and users have to restart playback. This is the error we are getting, and it happens in some live programs. We have the same setup and use the same transcoders, etc., in all programs. Mostly we have a very low error rate in the player for live programs, but with this error the error rate can climb to 80% of users, affecting pretty much all users on Apple devices. Does anyone know what this error actually means? What is the context, and what is the reason behind it? It seems this may be related to subtitles, as it occurs only when subtitles are enabled. (The subtitles are not embedded in the stream; they are teletext.) We tried to find information in Apple documentation and online, but unfortunately nothing could be found.
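One way to get more context on errors like this is to read the player item's HLS error log whenever a new entry is posted; each entry carries the error domain, status code, a comment, and the URI that failed, which can show whether the failing requests are the subtitle playlists. Below is a minimal sketch, assuming an AVPlayer-based client; the class name and logging format are illustrative.

```swift
import AVFoundation

// Minimal sketch: log each new HLS error-log entry for an AVPlayerItem.
final class HLSErrorLogObserver {
    private var token: NSObjectProtocol?

    func attach(to item: AVPlayerItem) {
        token = NotificationCenter.default.addObserver(
            forName: .AVPlayerItemNewErrorLogEntry,
            object: item,
            queue: .main
        ) { [weak item] _ in
            guard let event = item?.errorLog()?.events.last else { return }
            // errorDomain/errorStatusCode surface values such as CoreMediaErrorDomain -16012;
            // errorComment and uri often point at the failing rendition (e.g. a subtitle playlist).
            print("HLS error:", event.errorDomain, event.errorStatusCode,
                  event.errorComment ?? "-", event.uri ?? "-")
        }
    }

    deinit {
        if let token { NotificationCenter.default.removeObserver(token) }
    }
}
```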
Post not yet marked as solved
0 Replies
532 Views
I am simply trying to visualize the audio of m3u8 files from my Amazon S3 server while they are playing in my AVPlayer. The basic context of my AVPlayer is that I first create an AVURLAsset and attach the cookies required to access it, and then create the AVPlayerItem. I then set up the AVPlayer, play it, and set up an observer so that I can handle some logic when the audio file is done playing. I just want to be able to get the power data of the streamed file in real time, or close to real time. I looked into audioTapProcessor, but it does not appear to work with streamed files. I then looked into AVAudioEngine, but that also does not work well with streams. Is there any solution to this?
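For reference, here is a compact sketch of the setup described above, with a placeholder URL and an empty cookie list standing in for the real credentials. It does not solve the metering problem (as noted, neither audioTapProcessor nor AVAudioEngine attaches cleanly to HLS items played by AVPlayer), but it shows the baseline the question starts from.

```swift
import AVFoundation

// Placeholder cookies and URL; in the real app these come from the S3/CloudFront setup.
let cookies: [HTTPCookie] = []
let asset = AVURLAsset(
    url: URL(string: "https://example.com/audio/index.m3u8")!,
    options: [AVURLAssetHTTPCookiesKey: cookies]
)
let item = AVPlayerItem(asset: asset)
let player = AVPlayer(playerItem: item)
player.play()

// Observer that fires when the streamed file finishes playing.
let endToken = NotificationCenter.default.addObserver(
    forName: .AVPlayerItemDidPlayToEndTime,
    object: item,
    queue: .main
) { _ in
    print("Finished playing")
}
```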
Post not yet marked as solved
1 Reply
979 Views
We have created an HLS playback framework and a lot of our clients are complaining about this error: {"code": -19152, "domain": "CoreMediaErrorDomain", "localizedDescription": "The operation couldn’t be completed. (CoreMediaErrorDomain error -19152 - The operation couldn’t be completed. (CoreMediaErrorDomain error -19152.))", "localizedFailureReason": "", "localizedRecoverySuggestion": ""} We are unable to reproduce this issue on our end, but our data shows the same error happening at a significant rate. Any help/hint is welcome. Thanks
Post not yet marked as solved
0 Replies
636 Views
Hi, is it possible to force/ensure an automatic license renewal in a FairPlay SPC response? I can find that feature in other DRM systems like Widevine (using specific parameters in the response). Searching in "FairPlay Streaming Server SDK 4.4.4" I can only find parameters related to lease/rental TTLs, but not an explicit renewal request. Does this feature exist in FairPlay? Any information will be appreciated. Have a great day!
Post not yet marked as solved
0 Replies
465 Views
When streaming HLS files in the iPhone/iPad browser on iOS 16.5 only, the bitrate does not switch: the lowest-quality bitrate is always selected regardless of the network environment. This does not occur on iOS 16.4 or iOS 16.6 (beta) in the same network environment and on the same device, where the bitrate switches according to network conditions. Has anyone else experienced a similar issue? Also, does anyone know how to remedy it?
Post not yet marked as solved
0 Replies
517 Views
Hey guys, I am looking to add VP9 support to an application and am trying to figure out how to access these APIs on iOS. Is it possible to access the VP9 encoders/decoders without using HLS? Looking for some help on this. Thanks, Norris
Post not yet marked as solved
3 Replies
1.2k Views
Hello, we are using HLS for our streaming iOS and tvOS applications. We have DRM protection on our applications, but we want to add another security layer, which is a CDN token. We want to send that CDN token in a header or in query parameters; either of the two works on our CDN side. The problem is on the client side: we want to send that token and refresh it at a given time. We add the token data initially using
let asset = AVURLAsset(url: url, options: ["AVURLAssetHTTPHeaderFieldsKey": headers])
and add an interceptor with asset.resourceLoader.setDelegate. It works seamlessly. We use AVAssetResourceLoaderDelegate, and we can intercept just the master playlist and the variant playlists via
func resourceLoader(_ resourceLoader: AVAssetResourceLoader, shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool
so we can refresh the CDN token only for playlists. The token can be in query params or in a header; it does not matter. For example:
#EXTM3U
#EXT-X-VERSION:3
#EXTINF:10.0
https://chunk1?cdntoken=A.ts
#EXTINF:10.0
https://chunk2?cdntoken=A.ts
#EXTINF:10.0
https://chunk3?cdntoken=A.ts
#EXTINF:10.0
Assume that this is our .m3u8 file for a given live video playlist. It has three chunks with the CDN token in the query params. If we give those chunks to AVPlayer, it will play them in order. When we change to a new CDN token in the query params, the chunk URLs change and our player stalls, because our CDN adds the new CDN token to each chunk's URL. That means our next .m3u8 playlist will look like this:
#EXT-X-VERSION:3
#EXTINF:10.0
https://chunk4?cdntoken=B.ts
#EXTINF:10.0
https://chunk5?cdntoken=B.ts
#EXTINF:10.0
https://chunk6?cdntoken=B.ts
#EXTINF:10.0
The CDN token is changed from A to B on the CDN side, and it sends a new playlist like that; that is why our player stalls. Is there any way to keep the player from stalling when the chunk URLs change? When we put the new CDN token in a header, it does not change the chunk URLs as in the first question, but AVPlayer does not allow intercepting chunk URLs, which means that before the https://chunk1?cdntoken=A.ts URL is called, I want to intercept it and add the new CDN token to the header. Is there any way to intercept chunk URLs the way we intercept playlists? Thanks for answers in advance.
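For the playlist side of this, one common pattern is to hand AVURLAsset a playlist URL with a custom scheme so that every playlist request goes through the resource loader delegate, fetch the real playlist yourself, and rewrite the cdntoken value before responding. A hedged sketch of that idea follows; the class name, scheme mapping, and token property are illustrative, not a drop-in implementation.

```swift
import AVFoundation

// Hedged sketch: playlists are requested through a custom scheme so the delegate
// is consulted; it fetches the real playlist over https and rewrites the token.
final class TokenRewritingLoader: NSObject, AVAssetResourceLoaderDelegate {
    var currentToken = "B"  // refreshed elsewhere when the CDN rotates tokens

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        guard let customURL = loadingRequest.request.url,
              var components = URLComponents(url: customURL, resolvingAgainstBaseURL: false)
        else { return false }
        components.scheme = "https"  // map the custom scheme back to the real CDN URL

        URLSession.shared.dataTask(with: components.url!) { data, _, error in
            guard let data, var playlist = String(data: data, encoding: .utf8) else {
                loadingRequest.finishLoading(with: error)
                return
            }
            // Illustrative rewrite: swap the stale token for the current one in every chunk URI.
            playlist = playlist.replacingOccurrences(of: "cdntoken=A",
                                                     with: "cdntoken=\(self.currentToken)")
            loadingRequest.dataRequest?.respond(with: Data(playlist.utf8))
            loadingRequest.finishLoading()
        }.resume()
        return true
    }
}
```

Segment requests made over plain https are not routed through the delegate, which is exactly the limitation described above, so the chunk URLs can only be influenced indirectly through what the rewritten playlist points at.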
Post not yet marked as solved
1 Reply
510 Views
I have used ffmpeg to transform a video into multiple chunks; the ffmpeg command works fine and I can play the video. I have created an index.m3u8 that contains 4 qualities, and each of those qualities contains the chunks of the video. I can play each chunk, but I can't play anything when using the index.m3u8. Here is my index:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
http://cdn...
#EXT-X-STREAM-INF:BANDWIDTH=2800000,RESOLUTION=1280x720
http://cdn...
#EXT-X-STREAM-INF:BANDWIDTH=1400000,RESOLUTION=842x480
http://cdn...
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
http://cdn...
Here is my quality file:
#EXTM3U
#EXT-X-PLAYLIST-TYPE:VOD
#EXT-X-TARGETDURATION:8.341667
#EXT-X-VERSION:3
#EXT-X-MEDIA-SEQUENCE:24
#EXTINF:8.341667,
http://cdn....
#EXTINF:8.341667,
http://cdn....
#EXTINF:8.341667,
http://cdn....
#EXTINF:6.873533,
http://cdn....
#EXT-X-ENDLIST
Post not yet marked as solved
0 Replies
333 Views
Hello, I have an IP camera that is able to do RTSP streaming. This stream would go to my server, where it would be forwarded in RTSP format to an iPhone device. I would like to know whether Apple allows incoming RTSP streaming (both audio and video); are there any specific guidelines for this? NOTE: I will use Xcode version 14.3.1 and Swift version 5.8. Thanks,
Post not yet marked as solved
1 Reply
422 Views
I am developing an iOS app using SwiftUI, and I want to provide the functionality to download an HLS (HTTP Live Streaming) livestream for offline usage. However, I'm facing difficulties finding specific examples or tutorials on how to achieve this in SwiftUI. Could anyone provide me with some guidance or point me to resources that demonstrate how to download HLS livestreams for offline playback using SwiftUI? I would appreciate any code snippets, libraries, or step-by-step explanations that can help me implement this feature successfully. Thank you in advance for your assistance!
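Offline HLS downloads are driven by AVFoundation rather than by SwiftUI itself (SwiftUI only hosts the UI, typically via an ObservableObject), and they apply to on-demand or event streams rather than an ongoing live broadcast. A minimal sketch follows, assuming a single-asset download; the URL, session identifier, and title are placeholders.

```swift
import AVFoundation

// Delegate that receives the location of the downloaded .movpkg bundle.
final class DownloadDelegate: NSObject, AVAssetDownloadDelegate {
    func urlSession(_ session: URLSession,
                    assetDownloadTask: AVAssetDownloadTask,
                    didFinishDownloadingTo location: URL) {
        // Persist this relative path and open it later with AVURLAsset(url:) for offline playback.
        print("Downloaded to", location)
    }
}

let configuration = URLSessionConfiguration.background(withIdentifier: "hls-download")
let delegate = DownloadDelegate()
let downloadSession = AVAssetDownloadURLSession(configuration: configuration,
                                                assetDownloadDelegate: delegate,
                                                delegateQueue: .main)

let asset = AVURLAsset(url: URL(string: "https://example.com/vod/index.m3u8")!)
let task = downloadSession.makeAssetDownloadTask(asset: asset,
                                                 assetTitle: "My Stream",
                                                 assetArtworkData: nil,
                                                 options: nil)
task?.resume()
```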
Post not yet marked as solved
0 Replies
558 Views
I want to edit the m3u8 manifest file for an HD movie that has about 500 .ts segments; I want to change the URL of each one. So far I've had to do it manually and was wondering if there's a tool or method to do this very quickly.
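This is a plain text substitution rather than anything HLS-specific, so a command-line tool such as sed, or a small script, can do it in one pass. A hedged sketch in Swift; the file path and both base URLs are placeholders.

```swift
import Foundation

// Rewrite every segment URI in a playlist by swapping the base URL (placeholders below).
let playlistURL = URL(fileURLWithPath: "/path/to/output.m3u8")
let oldBase = "https://old-cdn.example.com/video/"
let newBase = "https://new-cdn.example.com/video/"

do {
    var text = try String(contentsOf: playlistURL, encoding: .utf8)
    text = text.replacingOccurrences(of: oldBase, with: newBase)
    try text.write(to: playlistURL, atomically: true, encoding: .utf8)
} catch {
    print("Rewrite failed:", error)
}
```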
Post not yet marked as solved
0 Replies
562 Views
Hi, replay doesn't work for HLS videos (.m3u8) on iOS 16 Safari when you get to the end of a video. It works with .mp4s, .movs, etc. I have written a GitHub issue on the video.js repo here: https://github.com/videojs/video.js/issues/8345 But I'm starting to think it's the new native iOS 16 player that is causing the issue, and not the library itself.
Post not yet marked as solved
0 Replies
621 Views
Hello, I am developing a firewall against HTTP attacks at Layer 7. Every system (Linux/Windows/Android/BSD/iOS versions < 16) works flawlessly, except iOS 16. iOS 16 device screen recording (error): veed . io/view/ab86584b-c054-4b70-8c73-6ae9782fabad Old iOS version test (no error): I am using a Go HTTP service in addition to nginx at the opened URL, and when I try to access this Go code directly from an iOS 16 device I get a 503 error. All this HTTP service does is set a cookie on the client after getting the user-agent and IP information. Code: What new feature in iOS 16 prevents my service from working, and how can I fix this? Note: on iOS 16 the situation is the same in all browsers, not just Safari; I tried it on Chrome as well. However, there is no problem when I try it on iOS 15 and lower versions. Thanks for your help in advance.
Post not yet marked as solved
1 Reply
501 Views
I am aware that HLS is required for most video streaming use cases (watching a movie, TV show, or YouTube video). This is a requirement for all apps. However, I am confused as to whether this would also apply to video chat/video conferencing apps. It would be inefficient to upload compressed video using RTMP/RTP, decompress it, and create HLS segments. Low-latency requirements only make this worse. So, is it permissible to use other protocols for video conferencing use cases? Thanks
Post not yet marked as solved
1 Reply
696 Views
Problem description: this HLS video https://lf3-vod-cdn-tos.douyinstatic.com/obj/vodsass/hls/main.m3u8 starts producing noise at 22 seconds when played directly in Safari on macOS 12.6.6, and the same happens in Safari on iOS (16.5.1). But there is no noise when playing with MSE on the Mac through a third-party open-source web player such as hls.js in Safari. Test tool: hls.js test demo: https://hlsjs.video-dev.org/demo/
Post not yet marked as solved
0 Replies
397 Views
Hi there, I'm currently making a web application using WebRTC. Even though all the SDP info and ICE candidates of the caller and callee are transmitted correctly, the connection keeps failing. Other devices work fine; it fails only on the iPhone (13). I tried connecting on the same network, and that works, so I think it's a problem with the ICE candidates. I read a similar post on how to avoid this issue: when one of Safari's advanced options, called "WebRTC platform UDP sockets", is disabled, it works. Is there a way to connect without tuning Safari's options? Thanks. FYI, this is one of my iPhone's ICE candidates: 842163049 1 udp 1685921535 118.235.10.100 50750 typ srflx raddr 0.0.0.0 rport 50750 generation 0 ufrag 7e7f network-id 3 network-cost 900
Post marked as solved
1 Reply
558 Views
I am facing an issue with video content that I have converted to an HLS playlist (using ffmpeg) and added to an S3 bucket that is shared through a CloudFront distribution. My scenario is the following: I have a bucket called bucket-a, with a "folder" video-1 which contains the following files:
output.m3u8
output0.ts
...
output15.ts
audio/
audio.aac
image.jpg
All items in bucket-a are blocked from public access through S3; content is only vended through a CloudFront distribution which has origin bucket-a. I am able to access https://.cloudfront.net/path/output.m3u8 in a desktop browser without fail, with no errors thrown, but the file output.m3u8 and all .ts files are not available in iPhone mobile browsers. The peculiar part is that this is not true for all playlist content in bucket-a. For example, I have a "folder" video-2 within bucket-a that has the same file structure as video-1 and is completely accessible through all mobile browsers. Here is an example master playlist error: https://dbs3s11vyxuw0.cloudfront.net/bottle-promo/script_four/output.m3u8 Even more head-scratching is that I am able to access all the playlists that are within this playlist. What I've tried: initially, I believed the issue was due to the way the video was transcoded, so I standardized the video transcoding. Then I believed the issue was due to CloudFront permissions, though those seem to be fine. I've validated my stream here: https://ott.dolby.com/OnDelKits_dev/StreamValidator/Start_Here.html Not sure which way to turn.
Post not yet marked as solved
0 Replies
313 Views
Hi, I am using HLS playback for live broadcasting, with AVPlayerItem. During live playback, I found that seekableDuration always has some offset from the live edge compared to the same playback on Chrome or Android. As far as I have dug into it, the difference approximately matches recommendedTimeOffsetFromLive (usually 6~9 seconds in my tests). The problem is, I tried to minimize configuredTimeOffsetFromLive but it does not have any effect; even if I set it to 1~2 seconds, the offset is always the same as recommendedTimeOffsetFromLive. I tried changing automaticallyPreservesTimeOffsetFromLive as well, but nothing seems to work. How do these properties work, and how can I minimize the time offset?
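For reference, these are the properties in question; a minimal sketch follows, with a placeholder URL. In practice the effective offset appears to be floored at recommendedTimeOffsetFromLive, which AVFoundation derives from the playlist itself (typically a few target durations, or the server's hold-back settings), so getting closer to the live edge usually means changing the stream, for example by adopting Low-Latency HLS.

```swift
import AVFoundation

// Minimal sketch of the live-offset properties (the URL is a placeholder).
let item = AVPlayerItem(url: URL(string: "https://example.com/live/index.m3u8")!)
item.automaticallyPreservesTimeOffsetFromLive = true
// Requested distance from the live edge; values below the recommended offset
// did not take effect in the scenario described above.
item.configuredTimeOffsetFromLive = CMTime(seconds: 2, preferredTimescale: 1)

let player = AVPlayer(playerItem: item)
player.play()

// Compare what was requested with what the stream recommends.
print("recommended:", item.recommendedTimeOffsetFromLive.seconds,
      "configured:", item.configuredTimeOffsetFromLive.seconds)
```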
Post not yet marked as solved
0 Replies
322 Views
Hi, I have HLS content, i.e., an .m3u8 manifest file, but the segments are encoded with MPEG-2 video. Is such an encoding supported by HLS, or is only H.264/AVC or HEVC/H.265 supported?
Stream #0:0[0x281]: Video: mpeg2video (Main) ([2][0][0][0] / 0x0002), yuv420p(tv, bt470bg, top first), 720x576 [SAR 16:15 DAR 4:3], 3125 kb/s, 25 fps, 25 tbr, 90k tbn
Stream #0:1[0x201]: Audio: mp2 ([3][0][0][0] / 0x0003), 48000 Hz, stereo, fltp, 128 kb/s
Thanks.
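A quick way to check what a particular device will accept is to ask AVFoundation whether it considers the asset playable before handing it to AVPlayer; a small sketch follows, with a placeholder URL (on newer SDKs the async load(.isPlayable) API does the same).

```swift
import AVFoundation

// Load the "playable" key asynchronously and report whether this device
// thinks it can play the stream (the URL is a placeholder).
let asset = AVURLAsset(url: URL(string: "https://example.com/mpeg2/index.m3u8")!)
asset.loadValuesAsynchronously(forKeys: ["playable"]) {
    var error: NSError?
    let status = asset.statusOfValue(forKey: "playable", error: &error)
    if status == .loaded {
        print("isPlayable:", asset.isPlayable)
    } else {
        print("Could not determine playability:", error ?? "unknown error")
    }
}
```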