HTTP Live Streaming

Send audio and video over HTTP from an ordinary web server for playback on Mac, iOS, and tvOS devices using HTTP Live Streaming (HLS).

Posts under HTTP Live Streaming tag

84 Posts

mediastreamvalidator always signals 'Unsupported audio track codec' with SAMPLE-AES encrypted streams
Hi,

When I run mediastreamvalidator on HLS streams (produced by mediafilesegmenter) with SAMPLE-AES applied, the latest version always signals:

    Unsupported audio track codec: Unknown

as a 'MUST FIX' issue. This result occurs even when I run the validator against the FairPlayStreaming example content supplied with the developer SDK package.

This seems to have been introduced in HTTPLiveStreamingTools Version 1.2 (170524). It does not happen with tools Version 1.2 (160525).

Is this a bug in the tool, or perhaps a new requirement? Hopefully someone can help me out with this...
Replies: 2 · Boosts: 0 · Views: 1.5k · Activity: Aug ’23
HLS media playlist Captions and the DEFAULT attribute
When describing closed captions renditions through HLS master playlists, should a properly formed playlist with closed captions contain one closed-caption rendition with the DEFAULT=YES attribute? I searched the HLS RFC, and section 4.3.4.1 on EXT-X-MEDIA mentions that no more than one rendition should contain the DEFAULT=YES attribute. I was hoping to find recommendations on whether, with one or more EXT-X-MEDIA renditions in the same group, it is required that one of them contain DEFAULT=YES.

The media player I am using does not display closed captions if none of the closed-caption renditions contains the DEFAULT=YES attribute, and I am wondering whether that is an issue with the player or an issue with a malformed HLS playlist. Note the lack of a DEFAULT=YES attribute in the example playlist below. Should this still be considered a valid playlist?

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-MEDIA:TYPE=CLOSED-CAPTIONS,GROUP-ID="CC",LANGUAGE="eng",NAME="English",INSTREAM-ID="CC1"
#EXT-X-STREAM-INF:BANDWIDTH=2103200,AVERAGE-BANDWIDTH=2305600,CODECS="avc1.640029,mp4a.40.2",RESOLUTION=960x540,FRAME-RATE=30.000,CLOSED-CAPTIONS="CC"
https://the.link.to.my.stream
#EXT-X-STREAM-INF:BANDWIDTH=804760,AVERAGE-BANDWIDTH=875600,CODECS="avc1.640029,mp4a.40.2",RESOLUTION=640x360,FRAME-RATE=30.000,CLOSED-CAPTIONS="CC"
https://the.link.to.my.stream
#EXT-X-STREAM-INF:BANDWIDTH=1304160,AVERAGE-BANDWIDTH=1425600,CODECS="avc1.640029,mp4a.40.2",RESOLUTION=768x432,FRAME-RATE=30.000,CLOSED-CAPTIONS="CC"
https://the.link.to.my.stream
#EXT-X-STREAM-INF:BANDWIDTH=505120,AVERAGE-BANDWIDTH=545600,CODECS="avc1.640029,mp4a.40.2",RESOLUTION=480x270,FRAME-RATE=30.000,CLOSED-CAPTIONS="CC"
https://the.link.to.my.stream
#EXT-X-STREAM-INF:BANDWIDTH=3102000,AVERAGE-BANDWIDTH=3405600,CODECS="avc1.640029,mp4a.40.2",RESOLUTION=1280x720,FRAME-RATE=30.000,CLOSED-CAPTIONS="CC"
https://the.link.to.my.stream
Replies: 1 · Boosts: 0 · Views: 3.9k · Activity: Sep ’23
-16832/CoreMediaErrorDomain Warning: restarting from end of live playlist
Hello guys,

Recently I've been getting this kind of issue on HLS live streaming content. The error detail is as below:

    error -16832/CoreMediaErrorDomain Warning: restarting 12.494080s from end of live playlist; target duration 9s - stall danger for https://C4--147-21-8-0-24-8395.dfwlive-v1-c4p1-sponsored.dfw.vcdn.att-idns.net/Content/HLS.abre/Live/channel(CMTHD-8395.dfw.1080)/uc9zjusg758wyw3zmumhqyck92mpguti-ks7qcjbcm2aaaaaa-20191125T201851-197474271-114.vtt

Does anyone know what this is? It seems to make the picture freeze or degrade the streaming quality, and sometimes it ends up restarting the playback.

Thanks & regards.
Replies: 2 · Boosts: 2 · Views: 1.7k · Activity: Jun ’23
Putting the TS into tsrecompressor
I notice that precious few HTTP Live Streaming questions have gotten responses, let alone answers. But I'll be optimistic!

I'd like to try streaming my own live video through the LL-HLS tools (currently prerelease 73), in multiple bitrates.

I succeeded in following directions: I use tsrecompressor to generate "bip-bop" video and pass three compressed variations of that into three instances of mediastreamsegmenter, then out through three instances of ll-hls-origin-example.go. It works as promised, end-to-end. (Brief aside, for any who may stumble after me: it took me too long to realize that I should improve my knowledge of IP multicasting and use the prescribed 224.0.0.50 address. I got nowhere trying to simply route familiar UDP unicast between my processes.)

So far, so good. Now I want to supply my own video from an external feed. Not the generated "bip-bop" or any local capture devices.

    % tsrecompressor --help
    tsrecompressor: unrecognized option `--help'
    Read input MPEG-2 TS, recompress and write to output.
    Usage: tsrecompressor [options]
    where options are:
        -i | --input-file= : input file path (default is stdin)
        ... etc.

That sounds fantastic — I'd love to feed tsrecompressor through stdin! But in what format? It doesn't say, and my first few dozen guesses came up cold.

The man page for mediastreamsegmenter appears to point the way:

    The mediastreamsegmenter only accepts MPEG-2 Transport Streams as defined in ISO/IEC 14496-1 as input. The transport stream must contain H.264 (MPEG-4, part 10) video and AAC or MPEG audio. If AAC audio is used, it must have ADTS headers. H.264 video access units must use Access Unit Delimiter NALs, and must be in unique PES packets.

Of course, that's mediastreamsegmenter and not tsrecompressor. But it's a start. So this is my best guess at the appropriate ffmpeg output. (Recall that I want to eventually pass a live stream into ffmpeg; for now I'm starting with an m4v file.)

    % ffmpeg -re -i infile.m4v \
        -c:v h264_************ \
        -c:a aac_at \
        -f mpegts \
        - | tsrecompressor -h -a \
        -O 224.0.0.50:9121 \
        -L 224.0.0.50:9123 \
        -P 224.0.0.50:9125

This ends abruptly after 9 frames:

    av_interleaved_write_frame(): Broken pipe
    Error writing trailer of pipe:: Broken pipe

My best results are when I change from H.264 to H.265:

    % ffmpeg -re -i infile.m4v \
        -c:v hevc_************ \
        -c:a aac_at \
        -f mpegts \
        - | tsrecompressor -h -a \
        -O 224.0.0.50:9121 \
        -L 224.0.0.50:9123 \
        -P 224.0.0.50:9125

Now it doesn't break the pipe. It keeps counting along, frame after frame. The VTEncoderXPCService starts up, and sampling of the tsrecompressor process shows both producer and consumer threads for audio recompression.

But there's no output. There was output for the generated "bip-bop" video. Not for HEVC-TS via stdin. I'm not 100% certain yet, but I see no indication of any UDP output from tsrecompressor. The three mediastreamsegmenter processes sit idle.

Am I missing some tag, or something, in the input stream? Do I need to pay more attention to chunk sizes and frame offsets?

Thanks, all, for any insight or experience.
Replies: 7 · Boosts: 0 · Views: 1.8k · Activity: Aug ’23
How to cache an HLS video while playing it
Hi, I'm working on an app where a user can scroll through a feed of short videos (a bit like TikTok). I do some pre-fetching of videos a couple of positions ahead of the user's scroll position, so the video can start playing as soon as he or she scrolls to it. Currently, I just pre-fetch by initializing a few AVPlayers. However, I'd like to add a better caching system.

I'm looking for the best way to get videos to start playing as quickly as possible, while minimizing re-downloading of videos if a user scrolls away from a video and back to it. Is there a way to cache the contents of an AVPlayer that has loaded an HLS video?

Alternatively, I've explored using AVAssetDownloadTask to download the HLS videos. My issue is that I can't download the full video and then play it - I need to start playing the video as soon as the user scrolls to it, even if it's not done downloading. Is there a way to start an HLS download with an AVAssetDownloadTask, and then start playing the video while it continues downloading? Thank you!
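Editor's note: a minimal sketch of the "play while downloading" pattern the last paragraph asks about. AVFoundation allows an AVPlayerItem to be created from the same AVURLAsset that an AVAssetDownloadTask is downloading, so playback can begin before the download finishes and reuses the downloaded media. The URL, session identifier, and class name below are hypothetical; this is an illustration, not the poster's code.

    import AVFoundation

    final class PrefetchingDownloader: NSObject, AVAssetDownloadDelegate {
        private var session: AVAssetDownloadURLSession!

        override init() {
            super.init()
            // AVAssetDownloadURLSession requires a background configuration (identifier is hypothetical).
            let config = URLSessionConfiguration.background(withIdentifier: "com.example.hls-prefetch")
            session = AVAssetDownloadURLSession(configuration: config,
                                                assetDownloadDelegate: self,
                                                delegateQueue: .main)
        }

        // Starts a download and returns a player item backed by the same asset,
        // so playback can begin before the download completes.
        func prefetchAndMakePlayerItem(for url: URL) -> AVPlayerItem? {
            let asset = AVURLAsset(url: url)
            guard let task = session.makeAssetDownloadTask(asset: asset,
                                                           assetTitle: "Feed video",
                                                           assetArtworkData: nil,
                                                           options: nil) else { return nil }
            task.resume()
            return AVPlayerItem(asset: task.urlAsset)   // shares the download's cached media
        }

        // Persist the finished download's location so it can be reused later.
        func urlSession(_ session: URLSession,
                        assetDownloadTask: AVAssetDownloadTask,
                        didFinishDownloadingTo location: URL) {
            UserDefaults.standard.set(location.relativePath,
                                      forKey: assetDownloadTask.taskDescription ?? "asset")
        }
    }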
Replies: 10 · Boosts: 0 · Views: 6.4k · Activity: Oct ’23
Video Quality selection in HLS streams
Hello there, in our team we were requested to add the possibility to manually select the video quality. I know that HLS is an adaptive stream and that, depending on the network conditions, it chooses the best quality that fits the current situation. I also tried some settings with preferredMaximumResolution and preferredPeakBitRate, but none of them worked once the user was already watching the stream. I also tried something like replacing the currentPlayerItem with the new configuration, but that only allowed me to downgrade the quality of the video. When I wanted to set it to 4K, for example, it did not change to that track even if I set very high values for both params mentioned above.

My question is whether there is any method which would allow me to force a certain quality from the manifest file. I already have some kind of extraction which can parse the manifest file and provide me with all the available information, but I still couldn't figure out how to make the player play a specific stream with my desired quality from the available playlist.
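Editor's note: for reference, a minimal sketch of the two caps the post mentions (the URL is hypothetical). As far as the public AVPlayerItem API goes, these properties only constrain what the adaptive algorithm may pick; they do not force a particular variant from the multivariant playlist.

    import AVFoundation

    // Hypothetical stream URL, for illustration only.
    let url = URL(string: "https://example.com/master.m3u8")!
    let item = AVPlayerItem(url: url)

    // Cap variant selection to roughly 720p / 3 Mbit/s.
    item.preferredMaximumResolution = CGSize(width: 1280, height: 720)
    item.preferredPeakBitRate = 3_000_000   // bits per second; 0 means no limit

    let player = AVPlayer(playerItem: item)
    player.play()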
Replies: 4 · Boosts: 0 · Views: 6.5k · Activity: Oct ’23
iOS 15 - HLS parallel download issue
Hi,

We are using AVPlayer for an FPL HLS stream. After migrating to iOS 15 (currently Beta 3), we have observed a strange behavior during segment download:

- The player downloads the last video segment of the VOD HLS stream
- The player downloads all audio segments
- Playback starts playing from the end (last segment)
- Playback needs to be restarted to start downloading all video segments

Note that we do not have this behavior with older versions of iOS (14 and before).

m3u8 file

Thank you,
Replies: 3 · Boosts: 0 · Views: 2.0k · Activity: Jun ’23
How to insert timed metadata (id3) into live HLS files with Apple's mediastreamsegmenter and ffmpeg
I am trying to insert timed metadata (ID3) into a live HLS stream created with Apple's mediastreamsegmenter tool. I am getting the video from an ffmpeg stream; here is the command I run to test from an existing file:

    ffmpeg -re -i vid1.mp4 -vcodec libx264 -acodec aac -f mpegts - | mediastreamsegmenter -f /Users/username/Sites/video -s 10 -y test -m -M 4242 -l log.txt

To inject metadata, I run this command:

    id3taggenerator -text '{"x":"data dan","y":"36"}' -a localhost:4242

This setup creates the expected .ts files and I can play back the video/audio with no issues. However, the metadata I am attempting to insert does not work in the final file. I know the metadata is there in some form: when I file-compare a no-metadata version of the video to one I injected metadata into, I can see the ID3 tags within the binary data.

Bad File Analysis

When I analyze the generated files using ffmpeg (ffmpeg -i video1.ts), the output I get is:

    [mpegts @ 0x7fb00a008200] start time for stream 2 is not set in estimate_timings_from_pts
    [mpegts @ 0x7fb00a008200] stream 2 : no TS found at start of file, duration not set
    [mpegts @ 0x7fb00a008200] Could not find codec parameters for stream 2 (Audio: mp3, 0 channels): unspecified frame size
    Consider increasing the value for the 'analyzeduration' (0) and 'probesize' (5000000) options
    Input #0, mpegts, from 'video1.ts':
      Duration: 00:00:10.02, start: 0.043444, bitrate: 1745 kb/s
      Program 1
        Stream #0:0[0x100]: Video: h264 (High) ([27][0][0][0] / 0x001B), yuv420p(tv, bt709, progressive), 848x464 [SAR 1:1 DAR 53:29], 30 fps, 30 tbr, 90k tbn, 60 tbc
        Stream #0:1[0x101]: Audio: aac (LC) ([15][0][0][0] / 0x000F), 44100 Hz, stereo, fltp, 130 kb/s
      No Program
        Stream #0:2[0x102]: Audio: mp3, 0 channels

Note how the third stream (stream #0:2) is marked as mp3... this is incorrect! Also it says "No Program", instead of being in "Program 1".

When I analyze a properly encoded video file with inserted ID3 metadata that I created with Apple's mediafilesegmenter tool, the analysis shows a "timed_id3" track, and this metadata track works properly in my web browser.

Good File Analysis

    ffmpeg -i video1.ts
    Input #0, mpegts, from 'video1.ts':
      Duration: 00:00:10.08, start: 19.984578, bitrate: 1175 kb/s
      Program 1
        Stream #0:0[0x101]: Video: h264 (High) ([27][0][0][0] / 0x001B), yuv420p(tv, bt709, progressive), 848x464, 30 fps, 30 tbr, 90k tbn, 180k tbc
        Stream #0:1[0x102]: Audio: aac (LC) ([15][0][0][0] / 0x000F), 44100 Hz, stereo, fltp, 67 kb/s
        Stream #0:2[0x103]: Data: timed_id3 (ID3  / 0x20334449)

I must use mediastreamsegmenter because that is required for live streams. Does anyone know how I can get timed ID3 metadata into a live HLS stream properly?
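Editor's note: not a fix for the segmenter pipeline above, but a small client-side sketch that can help distinguish "the ID3 tags never made it into the stream" from "the player is ignoring them". AVPlayerItemMetadataOutput delivers timed metadata groups during playback on Apple platforms. The stream URL and class name are hypothetical.

    import AVFoundation

    final class TimedMetadataReader: NSObject, AVPlayerItemMetadataOutputPushDelegate {
        let player: AVPlayer

        init(streamURL: URL) {
            let item = AVPlayerItem(url: streamURL)
            player = AVPlayer(playerItem: item)
            super.init()

            // nil identifiers = deliver every timed metadata group, including ID3.
            let output = AVPlayerItemMetadataOutput(identifiers: nil)
            output.setDelegate(self, queue: .main)
            item.add(output)
            player.play()
        }

        func metadataOutput(_ output: AVPlayerItemMetadataOutput,
                            didOutputTimedMetadataGroups groups: [AVTimedMetadataGroup],
                            from track: AVPlayerItemTrack?) {
            for group in groups {
                for metadataItem in group.items {
                    print("Timed metadata item:", metadataItem.identifier ?? "?", metadataItem.value ?? "nil")
                }
            }
        }
    }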
Replies: 2 · Boosts: 0 · Views: 2.2k · Activity: Sep ’23
iOS 16 - Exception Thrown for processContentKeyResponse
Issue: I am supporting an iOS application that streams FairPlay DRM-protected content. On iOS 16 devices, I am seeing intermittent exceptions thrown when trying to process the CKC returned by the license server. The thrown exception is as follows:

    -[AVContentKeyRequest processContentKeyResponse:] AVContentKeySession's keySystem is not same as that of keyResponse

This issue does not occur on older devices (we support iOS 13, 14, 15). I am unable to find documentation about this error, so any insight is appreciated.

High-Level Code Overview

- Use ContentKeyRequest to request an application certificate
- Use the returned cert to call makeStreamingContentKeyRequestData
- Use the returned data to request the FairPlay license
- Use the returned CKC to generate an AVContentKeyResponse (i.e. AVContentKeyResponse(fairPlayStreamingKeyResponseData:_))
- Call processContentKeyResponse(_)
- App crash/exception thrown when calling processContentKeyResponse

I am seeing other issues related to DRM and iOS 16, but these are specific to downloaded and offline content, which does not match my use case.
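Editor's note: for context, a compressed sketch of the flow those steps describe. requestApplicationCertificate() and requestCKC(spc:) are hypothetical stand-ins for the app's own networking, error handling is omitted, and this is an illustration rather than the poster's implementation. The detail relevant to the exception text is that the key request must be answered on the same AVContentKeySession (created with keySystem .fairPlayStreaming) that issued it.

    import AVFoundation

    final class KeyDelegate: NSObject, AVContentKeySessionDelegate {
        func contentKeySession(_ session: AVContentKeySession,
                               didProvide keyRequest: AVContentKeyRequest) {
            let certificate = requestApplicationCertificate()            // hypothetical helper
            let contentID = "skd://asset-id".data(using: .utf8)!          // hypothetical identifier

            keyRequest.makeStreamingContentKeyRequestData(forApp: certificate,
                                                          contentIdentifier: contentID,
                                                          options: nil) { spc, error in
                guard let spc = spc, error == nil else { return }
                let ckc = self.requestCKC(spc: spc)                       // hypothetical helper
                let response = AVContentKeyResponse(fairPlayStreamingKeyResponseData: ckc)
                keyRequest.processContentKeyResponse(response)
            }
        }

        private func requestApplicationCertificate() -> Data { Data() }  // placeholder
        private func requestCKC(spc: Data) -> Data { Data() }            // placeholder
    }

    // Usage sketch: create one session, keep strong references, attach the protected asset.
    let keySession = AVContentKeySession(keySystem: .fairPlayStreaming)
    let keyDelegate = KeyDelegate()
    keySession.setDelegate(keyDelegate, queue: DispatchQueue(label: "cks.queue"))
    // keySession.addContentKeyRecipient(asset)   // asset: AVURLAsset for the protected stream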
Replies: 3 · Boosts: 0 · Views: 2.5k · Activity: Jul ’23
Audio Interruption Issues with AVPlayer During Live Streaming via Amazon Kinesis
Hello,

I've encountered a recurring issue while trying to play back live streams using AVPlayer in an iOS app. The video stream is being delivered via Amazon Kinesis Video Streams (KVS) using HLS. The specific issue is that audio frequently gets interrupted during playback. The video continues to play just fine, but the audio stops. This issue seems to occur only on iOS devices and not on other platforms or players.

When I check the console logs, I see a number of error messages that may be related to the issue:

    2023-05-11 20:57:27.494719+0200 Development[53868:24121620] HALPlugIn::DeviceGetCurrentTime: got an error from the plug-in routine, Error: 1937010544 (stop)
    2023-05-11 20:57:27.534340+0200 Development[53868:24121620] [aqme] AQMEIO.cpp:199 timed out after 0.011s (6269 6269); suspension count=0 (IOSuspensions: )
    2023-05-11 20:57:30.592067+0200 Development[53868:24122309] HALPlugIn::DeviceGetCurrentTime: got an error from the plug-in routine, Error: 1937010544 (stop)
    2023-05-11 20:57:30.592400+0200 Development[53868:24122309] HALPlugIn::DeviceGetCurrentTime: got an error from the plug-in routine, Error: 1937010544 (stop)

I've attempted to troubleshoot this issue in various ways, including trying different iOS devices and networks. I've also attempted to use VLC's player on iOS, which doesn't have the audio interruption issue, but it does encounter other problems. I believe there might be some compatibility issue between AVPlayer and KVS.

I've posted a similar issue on the Amazon KVS GitHub repo, but I am reaching out here to see if anyone has faced a similar issue with AVPlayer and has found a solution or can provide some guidance. Has anyone encountered this issue before, or does anyone have suggestions on how to address it? Any help would be greatly appreciated!
Replies: 1 · Boosts: 1 · Views: 1.4k · Activity: Oct ’23
CoreMediaErrorDomain : code : -16012
Hi,

We are getting big spikes of errors on a very few programs in some live channels:

    [-16012:CoreMediaErrorDomain] [Error Domain=CoreMediaErrorDomain Code=-16012 "(null)"]

When this error occurs, AVPlayer stops and users have to restart the playback. This happens in some live programs. We have the same setup and use the same transcoders, etc., in all programs. Mostly we have a very low error rate in the player with live programs, but with this error the error rate can increase up to 80% of users, affecting pretty much all the users on Apple devices.

Does anyone know what this error actually means? What is the context and what is the reason behind it? It seems like this may be related to subtitles, and it occurs only when subtitles are enabled. (The subtitles are not embedded in the stream; it is teletext.)

We tried to find this in Apple documents and online, but unfortunately nothing could be found.
Replies: 1 · Boosts: 0 · Views: 1.2k · Activity: Oct ’23
Error 16833 and 19152
We have created an HLS playback framework, and a lot of our clients are complaining about this error:

    {"code": -19152, "domain": "CoreMediaErrorDomain", "localizedDescription": "The operation couldn’t be completed. (CoreMediaErrorDomain error -19152 - The operation couldn’t be completed. (CoreMediaErrorDomain error -19152.))", "localizedFailureReason": "", "localizedRecoverySuggestion": ""}

We are unable to reproduce this issue on our end, but we have data reflecting the same error happening at a good rate. Any help/hint is welcome. Thanks
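Editor's note: when CoreMediaErrorDomain codes like this only show up in the field, AVPlayerItem's error log often carries HTTP/URI context that the bare NSError lacks and that can be forwarded to analytics. A small diagnostic sketch (an illustration, not the poster's code; assumes the item is already attached to a player):

    import AVFoundation

    func attachErrorLogging(to item: AVPlayerItem) {
        NotificationCenter.default.addObserver(
            forName: .AVPlayerItemNewErrorLogEntry,
            object: item,
            queue: .main
        ) { _ in
            guard let event = item.errorLog()?.events.last else { return }
            // Forward these fields to your analytics pipeline of choice.
            print("""
            CoreMedia error \(event.errorStatusCode) in \(event.errorDomain)
            comment: \(event.errorComment ?? "-")
            uri: \(event.uri ?? "-")
            server: \(event.serverAddress ?? "-")
            """)
        }
    }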
Replies: 1 · Boosts: 0 · Views: 1.1k · Activity: Jun ’23
How to intercept HLS Playlist chunk request for CDN token implementation
Hello,

We are using HLS for our streaming iOS and tvOS applications. We have DRM protection on our applications, but we want to add another secure layer, which is a CDN token. We want to add that CDN token data in a header or in query parameters; either is applicable on our CDN side.

There is a problem on the client side. We want to send that token and refresh it at a given time. We add the token data at the initial state with

    let asset = AVURLAsset(url: url, options: ["AVURLAssetHTTPHeaderFieldsKey": headers])

and add an interceptor with asset.resourceLoader.setDelegate. It works seamlessly. We use AVAssetResourceLoaderDelegate and we can intercept just the master playlist and the playlists via

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader, shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool

and we can refresh the CDN token data only for the playlists. The token data can be in query params or in a header; it does not matter. For example, assume this is our .m3u8 file for a given live video playlist:

    #EXTM3U
    #EXT-X-VERSION:3
    #EXTINF:10.0
    https://chunk1?cdntoken=A.ts
    #EXTINF:10.0
    https://chunk2?cdntoken=A.ts
    #EXTINF:10.0
    https://chunk3?cdntoken=A.ts
    #EXTINF:10.0

It has three chunks with CDN token data in the query params. If we give those chunks to AVPlayer, it is going to play the chunks in order. When we change to a new CDN token in the query params, it affects our chunk URLs and our player stalls. It is because our CDN adds the new CDN token to the chunk's URL, which means our next .m3u8 playlist is going to look like this:

    #EXT-X-VERSION:3
    #EXTINF:10.0
    https://chunk4?cdntoken=B.ts
    #EXTINF:10.0
    https://chunk5?cdntoken=B.ts
    #EXTINF:10.0
    https://chunk6?cdntoken=B.ts
    #EXTINF:10.0

The CDN token data is changed from A to B on the CDN side, and it sends a new playlist like that. That's why our player stalls. Is there any way to keep the player from stalling with edited chunk URLs?

When we change the CDN token in a header, it does not change the chunk URLs like in the first question, but AVPlayer does not allow us to intercept chunk URLs. That means that before calling the https://chunk1?cdntoken=A.ts URL, I want to intercept it and add the new CDN token data to the header. Is there any way to intercept chunk URLs like intercepting the playlist?

Thanks for answers in advance
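Editor's note: a sketch of the playlist-interception approach the post describes: serve the player a custom-scheme URL so the resource loader delegate is consulted, fetch the real playlist, and rewrite the cdntoken query parameter before returning it. Media segment requests do not go through AVAssetResourceLoader, so only the playlist text can be rewritten this way. The scheme, URLs, and currentToken() helper are hypothetical, and the token rewrite is deliberately simplified to the example above.

    import AVFoundation

    final class TokenRefreshingLoader: NSObject, AVAssetResourceLoaderDelegate {
        func currentToken() -> String { "B" }   // hypothetical: fetch the latest CDN token

        func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                            shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
            guard let customURL = loadingRequest.request.url,
                  var comps = URLComponents(url: customURL, resolvingAgainstBaseURL: false) else { return false }
            comps.scheme = "https"   // map the custom scheme back to the real origin
            guard let realURL = comps.url else { return false }

            URLSession.shared.dataTask(with: realURL) { data, _, error in
                guard let data = data, let text = String(data: data, encoding: .utf8), error == nil else {
                    loadingRequest.finishLoading(with: error)
                    return
                }
                // Illustrative rewrite: swap the stale token value for a fresh one in the playlist text.
                let refreshed = text.replacingOccurrences(of: "cdntoken=A",
                                                          with: "cdntoken=\(self.currentToken())")
                loadingRequest.dataRequest?.respond(with: Data(refreshed.utf8))
                loadingRequest.finishLoading()
            }.resume()
            return true
        }
    }

    // Usage sketch: a custom scheme makes AVFoundation ask the delegate for the playlist.
    let asset = AVURLAsset(url: URL(string: "cdnplaylist://example.com/live/playlist.m3u8")!)
    let loaderDelegate = TokenRefreshingLoader()
    asset.resourceLoader.setDelegate(loaderDelegate, queue: DispatchQueue(label: "cdn.loader"))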
Replies: 3 · Boosts: 0 · Views: 1.4k · Activity: Aug ’23
I can't play an m3u8 containing video with the libx265 codec.
I have used ffmpeg to transform a video into multiple chunks; the ffmpeg command works fine and I can play the video. I have created an index.m3u8 that contains 4 qualities, and each of those qualities contains the chunks of the video. I can play each chunk, but I can't play anything when using the index.m3u8.

Here is my index:

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
    http://cdn...
    #EXT-X-STREAM-INF:BANDWIDTH=2800000,RESOLUTION=1280x720
    http://cdn...
    #EXT-X-STREAM-INF:BANDWIDTH=1400000,RESOLUTION=842x480
    http://cdn...
    #EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
    http://cdn...

Here is my quality file:

    #EXTM3U
    #EXT-X-PLAYLIST-TYPE:VOD
    #EXT-X-TARGETDURATION:8.341667
    #EXT-X-VERSION:3
    #EXT-X-MEDIA-SEQUENCE:24
    #EXTINF:8.341667,
    http://cdn....
    #EXTINF:8.341667,
    http://cdn....
    #EXTINF:8.341667,
    http://cdn....
    #EXTINF:6.873533,
    http://cdn....
    #EXT-X-ENDLIST
Replies: 1 · Boosts: 0 · Views: 562 · Activity: Jun ’23
Live Streaming
Hello,

I have an IP camera that is able to do RTSP streaming. This stream would go to my server, where it will be forwarded in RTSP format to an iPhone device. I would like to know if Apple allows incoming RTSP streaming (both audio and video); are there any specific guidelines for this?

NOTE: I will use Xcode version 14.3.1 and Swift version 5.8.

Thanks,
Replies: 0 · Boosts: 0 · Views: 361 · Activity: Jun ’23
Downloading HLS Livestream for Offline Usage in SwiftUI
I am developing an iOS app using SwiftUI, and I want to provide the functionality to download an HLS (HTTP Live Streaming) livestream for offline usage. However, I'm facing difficulties finding specific examples or tutorials on how to achieve this in SwiftUI. Could anyone provide me with some guidance or point me to resources that demonstrate how to download HLS livestreams for offline playback using SwiftUI? I would appreciate any code snippets, libraries, or step-by-step explanations that can help me implement this feature successfully. Thank you in advance for your assistance!
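Editor's note: a minimal sketch of kicking off an HLS download from a view model usable in SwiftUI. AVAssetDownloadURLSession is an AVFoundation API rather than a SwiftUI one, so the same code works regardless of UI framework; as far as I know it targets VOD (or ended event) playlists, since a continuously live playlist has no defined end to download. The URL, session identifier, and class name are hypothetical.

    import AVFoundation
    import Combine

    final class DownloadViewModel: NSObject, ObservableObject, AVAssetDownloadDelegate {
        @Published var progress: Double = 0
        @Published var localFileLocation: URL?

        private lazy var session: AVAssetDownloadURLSession = {
            // A background configuration is required (identifier is hypothetical).
            let config = URLSessionConfiguration.background(withIdentifier: "com.example.hls-downloads")
            return AVAssetDownloadURLSession(configuration: config,
                                             assetDownloadDelegate: self,
                                             delegateQueue: .main)
        }()

        func startDownload(from url: URL, title: String) {
            let asset = AVURLAsset(url: url)
            session.makeAssetDownloadTask(asset: asset,
                                          assetTitle: title,
                                          assetArtworkData: nil,
                                          options: nil)?.resume()
        }

        // Progress callback based on the loaded time ranges.
        func urlSession(_ session: URLSession, assetDownloadTask: AVAssetDownloadTask,
                        didLoad timeRange: CMTimeRange, totalTimeRangesLoaded loadedTimeRanges: [NSValue],
                        timeRangeExpectedToLoad: CMTimeRange) {
            guard timeRangeExpectedToLoad.duration.seconds > 0 else { return }
            let loaded = loadedTimeRanges.reduce(0.0) { $0 + $1.timeRangeValue.duration.seconds }
            progress = loaded / timeRangeExpectedToLoad.duration.seconds
        }

        func urlSession(_ session: URLSession, assetDownloadTask: AVAssetDownloadTask,
                        didFinishDownloadingTo location: URL) {
            localFileLocation = location   // persist this relative path for offline playback
        }
    }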
Replies: 1 · Boosts: 1 · Views: 448 · Activity: Jul ’23
iOS 16 Webview Set Cookie Or Response Error
Hello,

I am developing a firewall against HTTP attacks at the Layer 7 level. Every system (Linux/Windows/Android/BSD/iOS versions below 16) works flawlessly, except iOS 16.

iOS 16 device screen recording (error): veed . io/view/ab86584b-c054-4b70-8c73-6ae9782fabad)

Old iOS version test (no error):

I am using a golang HTTP service in addition to nginx at the opened URL. When I try to access this golang code directly from an iOS 16 device, I get a 503 error. All this HTTP service does is set a cookie on the client after getting the user agent and IP information.

Code:

What new feature in iOS 16 prevents my service from running, and how can I fix this?

Note: On iOS 16 the situation is the same in all browsers, not just Safari; I tried it on Chrome. However, there is no problem when I try it on iOS 15 and lower versions.

Thanks for your help in advance.
Replies: 0 · Boosts: 2 · Views: 673 · Activity: Jul ’23
Video Streaming Protocols Supported
I am aware that HLS is required for most video streaming use cases (watching a movie, TV show, or YouTube video). This is a requirement for all apps. However, I am confused as to whether this would also apply to video chat/video conferencing apps. It would be inefficient to upload compressed video using rtmp/rtp, decompress it, and create HLS segments. Low latency requirements only make this worse. So, is it permissible to use other protocols for video conferencing use cases? Thanks
Replies: 1 · Boosts: 0 · Views: 530 · Activity: Jul ’23