HTTP Live Streaming

Send audio and video over HTTP from an ordinary web server for playback on Mac, iOS, and tvOS devices using HTTP Live Streaming (HLS).

Posts under the HTTP Live Streaming tag (83 posts)

Vision Pro capability
Hello everyone! I'm planning to buy an Apple Vision Pro (to replace a Varjo XR-3) for a professional project, and I want to know whether it can fit our needs. I want to develop a program on the Vision Pro that plays live streaming video from our local network cameras (using RTSP). Is it possible to receive and play more than one live video stream? One of those streams comes from a stereo camera delivering side-by-side 3D stereo video. Is it possible to have a classic 2D video on one ultra-wide virtual screen and another virtual screen simultaneously displaying a 3D video with depth? Thank you all in advance. Regards.
Replies: 0 · Boosts: 0 · Views: 70 · Activity: 5d
Error 15517 when playing HLS
Playing an fMP4 HLS stream on visionOS beta. This is the stream, HEVC Main 10 and E-AC-3 6-channel:

    #EXT-X-STREAM-INF:BANDWIDTH=6760793,AVERAGE-BANDWIDTH=6760793,VIDEO-RANGE=PQ,CODECS="hvc1.2.4.L150.B0,mp4a.a6",RESOLUTION=3840x2160,FRAME-RATE=23.976,SUBTITLES="subs"

This is what AVPlayer says:

    Error Domain=AVFoundationErrorDomain Code=-11848 "Cannot Open" UserInfo={NSLocalizedFailureReason=The media cannot be used on this device., NSLocalizedDescription=Cannot Open, NSUnderlyingError=0x3009e37b0 {Error Domain=CoreMediaErrorDomain Code=-15517 "(null)"}}

I can't find any documentation for the underlying error -15517. Is it because "mp4a.a6" is declared in the codec list and not "ec-3"? hlsreport flags these MUST FIX issues:

1. Measured peak bitrate compared to multivariant playlist declared value exceeds error tolerance (Multivariant Playlist Stream Definition for All Variants)
2. Stereo audio in AAC-LC, HE-AAC v1, or HE-AAC v2 format MUST be provided (Multivariant Playlist)
3. If Dolby Digital Plus is provided then Dolby Digital MUST be provided also (Multivariant Playlist)
4. I-frame playlists (EXT-X-I-FRAME-STREAM-INF) MUST be provided to support scrubbing and scanning UI (Multivariant Playlist)
5. The server MUST deliver playlists using gzip content-encoding (All Variants, All Renditions, Multivariant Playlist)
6. You MUST provide multiple bit rates of video (Multivariant Playlist)
7. Playlist codec type doesn't match content codec type (All Variants)
8. (Segment) The operation couldn't be completed. (HTTPPumpErrorDomain error -16845 - HTTP 400: (unhandled)) (list of subtitle renditions)
9. (Segment) HTTP 400 - HTTP/2.0 400 Bad Request (list of subtitle renditions)
10. Multichannel audio MUST be separate audio stream (All Variants)
11. If EXT-X-INDEPENDENT-SEGMENTS is not in the multivariant playlist, then you MUST use the EXT-X-INDEPENDENT-SEGMENTS tag in all video media playlists (All Variants)
12. The CODECS attribute MUST include every media format present (All Variants; does not declare EC-3)
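A likely culprit is the codec string: the RFC 6381 identifier HLS expects for Dolby Digital Plus (E-AC-3) is ec-3, and the report's own item 12 points the same way ("does not declare EC-3"). A sketch of the corrected stream definition, keeping every other attribute from the post (unverified against the actual content):

    #EXT-X-STREAM-INF:BANDWIDTH=6760793,AVERAGE-BANDWIDTH=6760793,VIDEO-RANGE=PQ,CODECS="hvc1.2.4.L150.B0,ec-3",RESOLUTION=3840x2160,FRAME-RATE=23.976,SUBTITLES="subs"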
Replies: 1 · Boosts: 0 · Views: 109 · Activity: 6d
How can I cache only the initial few seconds (chunks) of an HLS stream on disk?
Hi, I am working on an app that is very similar to TikTok in terms of video experience. There is an infinite scroll feed of videos, and I am using HLS URLs as the video source. My requirement is to cache the initial few seconds of each video on disk while the video is playing. The next time a user views the video, it should play the initial few seconds from the cache, with the subsequent chunks coming from the network. Additionally, when there is no network connection, the video should still play the initial few seconds from the cache. I was able to achieve this with MP4 using AVAssetResourceLoaderDelegate, but the same approach is not possible with HLS. What are some other ways to implement this feature? Thanks.
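One direction worth sketching, assuming AVAssetDownloadURLSession fits the use case (the classes and delegate methods below are real AVFoundation API; the 5-second threshold and identifiers are made up for illustration): start an HLS download, watch the loaded time ranges, and cancel once the head of the stream is on disk. Cancelled download tasks still report a partial asset location, which can be persisted and played later.

    import AVFoundation

    final class HLSPrefetcher: NSObject, AVAssetDownloadDelegate {
        private var session: AVAssetDownloadURLSession!

        override init() {
            super.init()
            let config = URLSessionConfiguration.background(withIdentifier: "hls-prefetch")
            session = AVAssetDownloadURLSession(configuration: config,
                                                assetDownloadDelegate: self,
                                                delegateQueue: .main)
        }

        // Start downloading the stream; the delegate cancels once ~5 s are cached.
        func prefetch(_ url: URL) {
            let task = session.makeAssetDownloadTask(asset: AVURLAsset(url: url),
                                                     assetTitle: "prefetch",
                                                     assetArtworkData: nil,
                                                     options: nil)
            task?.resume()
        }

        func urlSession(_ session: URLSession,
                        assetDownloadTask: AVAssetDownloadTask,
                        didLoad timeRange: CMTimeRange,
                        totalTimeRangesLoaded loadedTimeRanges: [NSValue],
                        timeRangeExpectedToLoad: CMTimeRange) {
            let cachedSeconds = loadedTimeRanges
                .map { $0.timeRangeValue.duration.seconds }
                .reduce(0, +)
            if cachedSeconds >= 5 { assetDownloadTask.cancel() }  // keep only the head
        }

        func urlSession(_ session: URLSession,
                        assetDownloadTask: AVAssetDownloadTask,
                        didFinishDownloadingTo location: URL) {
            // Called even after cancellation; persist `location` and build
            // AVURLAsset(url: location) on the next view of this video.
        }
    }

Whether AVPlayer then transparently fetches the remainder from the network for a partially downloaded asset is the part to validate for this feed use case.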
Replies: 2 · Boosts: 0 · Views: 233 · Activity: 4w
mediafilesegmenter can't create format reader
I am using the HTTP Live Streaming Tools to segment a spatial video (MV-HEVC) recorded by Vision Pro. I first used the macOS build on my MacBook; it works beautifully with the command:

    mediafilesegmenter -r -f path/to/destination path/to/movie.MOV

But when I tried to use the CentOS build in a Docker container to segment the exact same file with the exact same command, it gives the following error:

    can't create format reader /path/to/movie.MOV 561211770
    Unable to find any valid tracks to segment.

I looked up the error code; it seems to correspond to kAudioSessionBadPropertySizeError. Any idea why?
Replies: 2 · Boosts: 0 · Views: 190 · Activity: 4w
AVPlayer and TLS 1.3 compliance for low latency HLS live stream
Hi guys, I'm investigating a failure to play a low-latency live HLS stream, and I'm getting the following error:

    <AVPlayerItemErrorLog: 0x30367da10>
    #Version: 1.0
    #Software: AppleCoreMedia/1.0.0.21L227 (Apple TV; U; CPU OS 17_4 like Mac OS X; en_us)
    #Date: 2024/05/17 13:11:46.046
    #Fields: date time uri cs-guid s-ip status domain comment cs-iftype
    2024/05/17 13:11:16.016 https://s2-h21-nlivell01.cdn.xxxxxx.***/..../xxxx.m3u8 -15410 "CoreMediaErrorDomain" "Low Latency: Server must support http2 ECN and SACK" -
    2024/05/17 13:11:17.017 -15410 "CoreMediaErrorDomain" "Invalid server blocking reload behavior for low latency" -
    2024/05/17 13:11:17.017

The stream works when loading from a dev server with TLS 1.3, but fails on CDN servers with TLS 1.2. Regular live streams and VOD streams work normally on those CDN servers. I tried to configure TLSv1.2 in Info.plist, but that didn't help. When running nscurl --ats-diagnostics --verbose, the server with TLS 1.3 passes, but the CDN servers with TLS 1.2 fail with error Code=-1005 "The network connection was lost."

Is TLS 1.3 required or just recommended? I'm referring to https://developer.apple.com/documentation/http-live-streaming/enabling-low-latency-http-live-streaming-hls and https://datatracker.ietf.org/doc/html/draft-pantos-hls-rfc8216bis. Is it possible to configure AVPlayer to skip ECN and SACK validation? Thanks.
Replies: 0 · Boosts: 0 · Views: 284 · Activity: May ’24
AVPlayer CoreMediaErrorDomain -12642
Hi everyone, I am having a problem with AVPlayer when I try to play some videos. The video starts for a few seconds, but immediately after I see a black screen, and the console shows the following error:

    <__NSArrayM 0x14dbf9f30>(
    {
        StreamPlaylistError = "-12314";
        comment = "have audio audio-aacl-54 in STREAMINF without EXT-X-MEDIA audio group";
        date = "2024-05-13 20:46:19 +0000";
        domain = CoreMediaErrorDomain;
        status = "-12642";
        uri = "http://127.0.0.1:8080/master.m3u8";
    },
    {
        "c-conn-type" = 1;
        "c-severity" = 2;
        comment = "Playlist parse error";
        "cs-guid" = "871C1871-D566-4A3A-8465-2C58FDC18A19";
        date = "2024-05-13 20:46:19 +0000";
        domain = CoreMediaErrorDomain;
        status = "-12642";
        uri = "http://127.0.0.1:8080/master.m3u8";
    }
    )
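The first entry suggests the multivariant playlist references an audio rendition in EXT-X-STREAM-INF without a matching EXT-X-MEDIA group. A minimal sketch of the shape the parser expects (the group ID is taken from the log; names, codecs, and URIs are hypothetical):

    #EXTM3U
    #EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio-aacl-54",NAME="English",DEFAULT=YES,AUTOSELECT=YES,URI="audio/en.m3u8"
    #EXT-X-STREAM-INF:BANDWIDTH=2000000,CODECS="avc1.4d401f,mp4a.40.2",AUDIO="audio-aacl-54"
    video/720p.m3u8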
Replies: 1 · Boosts: 0 · Views: 372 · Activity: May ’24
iOS to Android H264 encoding issue.
I'm trying to cast the screen from an iOS device to an Android device. I'm leveraging ReplayKit on iOS to capture the screen and VideoToolbox to compress the captured video data into H.264 format using CMSampleBuffers. Both iOS and Android are configured for H.264 compression and decompression. While screen casting works flawlessly within the same platform (iOS to iOS or Android to Android), I'm encountering an error ("not in avi mode") on the Android receiver when casting from iOS. My research suggests that the underlying packaging of the H.264 stream might differ between iOS and Android. Data transmission over the TCP socket seems to be functioning correctly. My question is: is there a way to ensure a common H.264 format for compression and decompression across the iOS and Android platforms?

Here's a breakdown of the iOS sender details:
- Device: iPhone 13 mini running iOS 17
- Development environment: Xcode 15 with a minimum deployment target of iOS 16
- Screen capture: ReplayKit for capturing the screen and obtaining CMSampleBuffers
- Video compression: VideoToolbox for H.264 compression
- Compression properties:
  - kVTCompressionPropertyKey_ConstantBitRate: 6144000 (bitrate)
  - kVTCompressionPropertyKey_ProfileLevel: kVTProfileLevel_H264_Main_AutoLevel (profile and level)
  - kVTCompressionPropertyKey_MaxKeyFrameInterval: 60 (maximum keyframe interval)
  - kVTCompressionPropertyKey_RealTime: true (real-time encoding)
  - kVTCompressionPropertyKey_Quality: 1 (lowest quality)
- NAL unit handling: a custom header is added to NAL units

Android receiver details:
- Device: Redmi 7A running Android 10
- Video decoding: MediaCodec API for receiving and decoding the H.264 stream
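If the mismatch is the common one, it is less about the container than about NAL unit framing: VideoToolbox emits AVCC samples (each NAL unit prefixed with a 4-byte big-endian length), while a MediaCodec decoder fed a raw stream generally expects Annex B start codes, with SPS/PPS sent ahead of keyframes. A sketch of the conversion under that assumption:

    import CoreMedia
    import Foundation

    // Convert an AVCC (length-prefixed) H.264 sample to Annex B
    // (start-code-prefixed) bytes. For keyframes, also send the SPS/PPS from
    // CMVideoFormatDescriptionGetH264ParameterSetAtIndex before this payload.
    func annexBData(from sampleBuffer: CMSampleBuffer) -> Data? {
        guard let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else { return nil }
        var totalLength = 0
        var pointer: UnsafeMutablePointer<CChar>?
        guard CMBlockBufferGetDataPointer(blockBuffer,
                                          atOffset: 0,
                                          lengthAtOffsetOut: nil,
                                          totalLengthOut: &totalLength,
                                          dataPointerOut: &pointer) == kCMBlockBufferNoErr,
              let base = pointer else { return nil }

        var output = Data()
        var offset = 0
        while offset + 4 <= totalLength {
            // Read the 4-byte big-endian NAL unit length.
            var nalLength: UInt32 = 0
            memcpy(&nalLength, base + offset, 4)
            nalLength = UInt32(bigEndian: nalLength)
            offset += 4
            guard offset + Int(nalLength) <= totalLength else { break }
            // Replace the length prefix with an Annex B start code.
            output.append(contentsOf: [0, 0, 0, 1])
            let nalStart = UnsafeRawPointer(base + offset).assumingMemoryBound(to: UInt8.self)
            output.append(UnsafeBufferPointer(start: nalStart, count: Int(nalLength)))
            offset += Int(nalLength)
        }
        return output
    }

With framing normalized on the wire, the custom NAL header mentioned in the post may be unnecessary.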
Replies: 0 · Boosts: 0 · Views: 269 · Activity: May ’24
Does HLS support MV-HEVC HDR10?
I am setting up an HLS server for MV-HEVC files. I find that if I tag the mp4 files using AVAssetWriter with

    let colorPropertySettings = [
        AVVideoColorPrimariesKey: AVVideoColorPrimaries_ITU_R_709_2,
        AVVideoYCbCrMatrixKey: AVVideoYCbCrMatrix_ITU_R_709_2,
        AVVideoTransferFunctionKey: AVVideoTransferFunction_ITU_R_709_2
    ]

the HLS plays well in Safari. But if I tag the mp4 files using AVAssetWriter with

    let colorPropertySettings = [
        AVVideoColorPrimariesKey: AVVideoColorPrimaries_ITU_R_2020,
        AVVideoYCbCrMatrixKey: AVVideoYCbCrMatrix_ITU_R_2020,
        AVVideoTransferFunctionKey: AVVideoTransferFunction_SMPTE_ST_2084_PQ
    ]

the HLS cannot play in Safari. It looks like HLS does NOT support MV-HEVC HDR10, or have I missed a setting for MV-HEVC HDR10? Thanks.
Replies: 0 · Boosts: 0 · Views: 221 · Activity: May ’24
RealityView with FairPlay
I'm trying to implement playback of HLS content with FairPlay, and I want to insert it into a RealityView using a VideoMaterial on a sphere. When I use unencrypted HLS content everything works correctly, but when I use FairPlay it doesn't. To initialize FairPlay I am using the following in the view:

    let contentKeyDelegate = ContentKeySessionDelegate(licenseURL: licenseURL,
                                                       certificateURL: certificateURL)
    // Create the Content Key Session using the FairPlay Streaming key system.
    let contentKeySession = AVContentKeySession(keySystem: .fairPlayStreaming)
    contentKeySession.setDelegate(contentKeyDelegate, queue: DispatchQueue.main)
    contentKeySession.addContentKeyRecipient(asset)

Has anyone else encountered this problem?

Note: I'm testing on Vision Pro directly because the simulator doesn't support FairPlay.
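For reference, a minimal sketch of the RealityKit side (the entity setup here is assumed, not taken from the post). Two things worth double-checking in this configuration: the AVContentKeySession and its delegate must stay retained for the whole playback lifetime, and the recipient added to the session must be the same AVURLAsset instance the player item wraps:

    import AVFoundation
    import RealityKit

    func makeVideoSphere(asset: AVURLAsset, keySession: AVContentKeySession) -> ModelEntity {
        // The asset must be registered with the key session before playback begins.
        keySession.addContentKeyRecipient(asset)

        let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))
        let sphere = ModelEntity(mesh: .generateSphere(radius: 10),
                                 materials: [VideoMaterial(avPlayer: player)])
        // Invert the sphere so the video is visible from inside.
        sphere.scale = SIMD3<Float>(1, 1, -1)

        player.play()
        return sphere
    }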
Replies: 2 · Boosts: 0 · Views: 340 · Activity: May ’24
Issue with AssetDownloadTask for HLS stream
In my app I play HLS streams via AVPlayer, and it works well. However, when I try to download those same HLS URLs via makeAssetDownloadTask, I regularly come across this error:

    Download error for identifier 21222: Error Domain=CoreMediaErrorDomain Code=-12938 "HTTP 404: File Not Found" UserInfo={NSDescription=HTTP 404: File Not Found, _NSURLErrorRelatedURLSessionTaskErrorKey=(
        "BackgroundAVAssetDownloadTask <CE9B10ED-E749-49FF-9942-3F8728210B20>.<1>"
    ), _NSURLErrorFailingURLSessionTaskErrorKey=BackgroundAVAssetDownloadTask <CE9B10ED-E749-49FF-9942-3F8728210B20>.<1>}

I have a feeling that AVPlayer has a way to resolve this that makeAssetDownloadTask lacks. I am wondering if any of you have come across this or have insight. Thank you!

BTW, this is using Xcode Version 15.3 (15E204a) and developing for visionOS 1.0.1.
Replies: 2 · Boosts: 0 · Views: 319 · Activity: Apr ’24
EXT-X-DEFINE: how to implement QUERYPARAM on iOS 16.5+
I have a stream.m3u8 file with the following contents:

    #EXTM3U
    #EXT-X-VERSION:11
    #EXT-X-DEFINE:QUERYPARAM="token"
    #EXT-X-STREAM-INF:BANDWIDTH=400000,RESOLUTION=640x360
    360.m3u8{$token}
    #EXT-X-DEFINE:QUERYPARAM="token"
    #EXT-X-STREAM-INF:BANDWIDTH=1000000,RESOLUTION=960x540
    540.m3u8{$token}
    #EXT-X-STREAM-INF:BANDWIDTH=2000000,RESOLUTION=1280x720
    #EXT-X-DEFINE:QUERYPARAM="token"
    720.m3u8{$token}
    #EXT-X-STREAM-INF:BANDWIDTH=8000000,RESOLUTION=1920x1080
    #EXT-X-DEFINE:QUERYPARAM="token"
    1080.m3u8{$token}
    #EXT-X-STREAM-INF:BANDWIDTH=16000000,RESOLUTION=3840x2160
    #EXT-X-DEFINE:QUERYPARAM="token"
    2160.m3u8{$token}

The requesting URL is https://example.com/stream.m3u8?token= and I am using an iPhone 15 running iOS 17.1. The QUERYPARAM substitution as described here is not working: the requested URL for 540.m3u8 does not have a token appended to it. I know this because I checked my Apache error log. The file is hosted on a Debian 11 server. Have I implemented this correctly, and do I have the correct tech stack? Furthermore, what debugging tools should I be running to investigate further?
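Two details in the rfc8216bis draft seem relevant here (worth verifying against the current revision): a playlist must not define the same variable name in more than one EXT-X-DEFINE tag, and a QUERYPARAM variable expands to the parameter's value only, not to the whole key=value pair. Under that reading, a corrected playlist would define the variable once and re-append the parameter name in each URI:

    #EXTM3U
    #EXT-X-VERSION:11
    #EXT-X-DEFINE:QUERYPARAM="token"
    #EXT-X-STREAM-INF:BANDWIDTH=400000,RESOLUTION=640x360
    360.m3u8?token={$token}
    #EXT-X-STREAM-INF:BANDWIDTH=1000000,RESOLUTION=960x540
    540.m3u8?token={$token}
    #EXT-X-STREAM-INF:BANDWIDTH=2000000,RESOLUTION=1280x720
    720.m3u8?token={$token}
    #EXT-X-STREAM-INF:BANDWIDTH=8000000,RESOLUTION=1920x1080
    1080.m3u8?token={$token}
    #EXT-X-STREAM-INF:BANDWIDTH=16000000,RESOLUTION=3840x2160
    2160.m3u8?token={$token}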
Replies: 1 · Boosts: 0 · Views: 281 · Activity: Apr ’24
HLS+FairPlay stream playback sometimes fails with Safari on macOS when variants are not ordered by increasing bitrate
While setting up our premium video-on-demand workflow in AWS, using AWS MediaConvert and MediaPackage with license delivery from drmToday, we encountered an issue with HLS+FairPlay playback, only in Safari on macOS. The issue is that sometimes (more than 50% of the time on the same video) the video player initialization fails, with a simple event of type "error" in the onerror callback. We are using the Shaka player in our web application, so we first assumed this (random) issue was due to Shaka. However, we also tested direct playback via the <video> tag and observed the same issue at the same frequency.

Since we have some content for which this problem never occurs and other content where it occurs very frequently, we tried to understand what could explain the difference. We noticed that for assets where the problem never occurs, the video submanifests are ordered by increasing bitrate, whereas for assets where the problem occurs frequently, the order is decreasing. To isolate the issue we created a standalone page for a 2-minute asset, and we can demonstrate that on this asset, when the bitrates are in decreasing order, playback in Safari on macOS fails more than 50% of the time.

Test pages using the <video> tag:
KO: https://ntg-test-public-scr.s3.eu-west-1.amazonaws.com/aws-video.html
OK: https://ntg-test-public-scr.s3.eu-west-1.amazonaws.com/aws-video.html?ok=1

Test pages using Shaka:
KO: https://ntg-test-public-scr.s3.eu-west-1.amazonaws.com/aws-shaka.html
OK: https://ntg-test-public-scr.s3.eu-west-1.amazonaws.com/aws-shaka.html?ok=1

Notes:
- the issue is only reproducible with Safari on macOS (not with Safari on iOS)
- the same HLS content + FairPlay plays OK 100% of the time on tvOS
- the issue is only reproducible for HLS content with FairPlay (OK if no DRM)
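Until the root cause is known, the observation above suggests a workaround: order the variant streams by increasing BANDWIDTH in the multivariant playlist, the arrangement under which the post reports no failures. A sketch with hypothetical values:

    #EXTM3U
    #EXT-X-STREAM-INF:BANDWIDTH=1200000,RESOLUTION=640x360,CODECS="avc1.4d401f,mp4a.40.2"
    360p.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=3500000,RESOLUTION=1280x720,CODECS="avc1.64001f,mp4a.40.2"
    720p.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=6000000,RESOLUTION=1920x1080,CODECS="avc1.640028,mp4a.40.2"
    1080p.m3u8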
Replies: 0 · Boosts: 0 · Views: 552 · Activity: Apr ’24
tsrecompressor (LL-HLS) input file option & 4K source
I'm preparing an LL-HLS demonstration using the HLS tools. Generated pattern and screen capture work fine, but the input file option does not work. Please check whether it's possible to use an explicit input file as the source. Also, in the case of screen recording, the maximum resolution in the multivariant manifest is FHD even though the display resolution is 3840x2160. Please check whether it's possible to set 4K UHD resolution.
Replies: 1 · Boosts: 0 · Views: 294 · Activity: Apr ’24
AVAssetDurationDidChange notification not received in iOS 17
We use AVFragmentedAssetMinder to refresh the player data, and AVAssetDurationDidChange notifications were consistently received whenever the asset duration changed. However, since the release of iOS 17, AVAssetDurationDidChange notifications are no longer received. Could anyone please advise why this notification is not being triggered, and what we have to change?

    NotificationCenter.default.addObserver(self,
                                           selector: #selector(self.onVideoUpdate),
                                           name: .AVAssetDurationDidChange,
                                           object: nil)

#AVPlayer #AVMutableMovie
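While that is investigated, one hedged workaround is to key-value observe the player item's duration instead, since AVPlayerItem.duration is documented as KVO-observable. A minimal sketch (the wrapper class and callback are illustrative, not from the post):

    import AVFoundation

    final class DurationWatcher {
        private var observation: NSKeyValueObservation?

        func watch(_ item: AVPlayerItem, onChange: @escaping (CMTime) -> Void) {
            // Fires as the fragmented movie grows, replacing the
            // AVAssetDurationDidChange notification path.
            observation = item.observe(\.duration, options: [.new]) { item, _ in
                onChange(item.duration)
            }
        }
    }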
Replies: 0 · Boosts: 0 · Views: 342 · Activity: Mar ’24
How to encode MV-HEVC spatial videos using HLS tool correctly?
Hi all, I am a graduate student who is looking into making MV-HEVC videos streamable. May I ask whether it is possible to encode MV-HEVC videos with the HLS (HTTP Live Streaming) protocol? I've been trying to use Apple's HLS tools to encode a spatial video recorded on Vision Pro:

    mediafilesegmenter -iso-fragmented -t 4 -f sp_video-1-vp spatial-video-by-vp.MOV

But the output HLS playlist doesn't match the format Apple proposes in the WWDC video. For example, the attribute EXT-X-VERSION is 7 instead of 12, and there is no REQ-VIDEO-LAYOUT=CH-STEREO attribute, which should be the key indicator of the spatial video type. From what the WWDC video showcased, I assume Apple's HLS tools support this; maybe my usage is not correct. Just curious what you guys think about it, thank you!
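For comparison, the multivariant playlist shape shown for spatial video in the WWDC material looks roughly like this (bandwidth, codec string, and URI are hypothetical placeholders; note the version bump and the quoted REQ-VIDEO-LAYOUT attribute):

    #EXTM3U
    #EXT-X-VERSION:12
    #EXT-X-STREAM-INF:BANDWIDTH=20000000,CODECS="hvc1.2.4.L153.b0",RESOLUTION=1920x1080,FRAME-RATE=30.000,REQ-VIDEO-LAYOUT="CH-STEREO"
    spatial/prog_index.m3u8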
Replies: 1 · Boosts: 1 · Views: 620 · Activity: Mar ’24
iPhone webRTC / getUserMedia only uses headset mic with low volume. Can we change mics?
Hi, hope all are well! We've been working on a live streaming app and it's going quite well; we just got the aspect ratio locked as desired. Now the audio: its volume is extremely low. It sounds like it's using the headset mic instead of the bottom mic that FaceTime and speakerphone calls use. We tried flipping cameras, specifying sample rates, and almost every constraint in MediaConstraints, with no luck. Is there any way to specify which microphone is used? Thanks in advance!
Replies: 1 · Boosts: 0 · Views: 401 · Activity: Mar ’24