HTTP Live Streaming


Send audio and video over HTTP from an ordinary web server for playback on Mac, iOS, and tvOS devices using HTTP Live Streaming (HLS).


Posts under HTTP Live Streaming tag

92 Posts
Post not yet marked as solved
0 Replies
434 Views
Hey Apple! I'm just wondering if there are any recommendations on best practices for supporting AV experiences in SwiftUI. As far as I know, VideoPlayer is the only API in AVKit (https://developer.apple.com/documentation/avkit) supported directly in SwiftUI, without the need for UIViewRepresentable / UIViewControllerRepresentable bridging of AVPlayer. However, there are many core video and audio experiences that a modern audience expects but that VideoPlayer does not support, e.g. Picture in Picture. Is there a roadmap for direct support in SwiftUI? Thanks!
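For reference, a minimal sketch of the UIViewControllerRepresentable bridge mentioned above, which is currently the usual way to get features like Picture in Picture that VideoPlayer lacks (assuming iOS 14+; the wrapper name is illustrative):

import SwiftUI
import AVKit

// Hypothetical wrapper: bridges AVPlayerViewController into SwiftUI.
struct BridgedPlayerView: UIViewControllerRepresentable {
    let player: AVPlayer

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = player
        // PiP is available here, unlike in SwiftUI's VideoPlayer.
        controller.allowsPictureInPicturePlayback = true
        return controller
    }

    func updateUIViewController(_ controller: AVPlayerViewController, context: Context) {
        controller.player = player
    }
}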
Posted by em_walks. Last updated.
Post not yet marked as solved
1 Reply
1.1k Views
Hi, we are seeing big spikes of errors on a very few programs in some live channels: [-16012:CoreMediaErrorDomain] [Error Domain=CoreMediaErrorDomain Code=-16012 "(null)"]. When this error occurs, AVPlayer stops and users have to restart playback. It happens only in some live programs, although we have the same setup and use the same transcoders for all programs. Normally our player error rate on live programs is very low, but when this error appears, the rate can climb to 80% of users, affecting pretty much everyone on Apple devices. Does anyone know what this error actually means? What is the context, and what is the reason behind it? It seems to be related to subtitles, and it occurs only when subtitles are enabled. (The subtitles are not embedded in the stream; it is teletext.) We tried to find this in Apple's documentation and online, but unfortunately found nothing.
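For anyone debugging a failure like this, a small diagnostic sketch (assuming a playerItem already exists) that dumps the item's HLS error log whenever a new entry such as this -16012 event is recorded:

import AVFoundation

// Observe new error-log entries on the item and print the latest event.
NotificationCenter.default.addObserver(
    forName: .AVPlayerItemNewErrorLogEntry,
    object: playerItem,
    queue: .main
) { notification in
    guard let item = notification.object as? AVPlayerItem,
          let event = item.errorLog()?.events.last else { return }
    print("domain=\(event.errorDomain) code=\(event.errorStatusCode) " +
          "uri=\(event.uri ?? "-") comment=\(event.errorComment ?? "-")")
}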
Posted by zkyki. Last updated.
Post not yet marked as solved
2 Replies
680 Views
We have a web application in which an HTML video element stalls on iOS Safari (iOS 17.0.3). The HTML page contains an inline script tag, for example <script> ... </script>, that runs immediately when the page loads. The script does the following: videoElement.src = url; videoElement.load(); The url is an HLS manifest URL. After the DOM elements are created, we attach the videoElement to a <div> and expect the video to eventually reach canplaythrough and start playing. Actual behavior: the video never plays; the videoElement's readyState and networkState are both stuck at 1. We found that a "suspend" event was triggered on the video element, and we are not sure what is triggering it. Temporary mitigation: when the video is stalled, if we call videoElement.load() manually in the Safari JS console, readyState and networkState advance, HLS video segments are fetched, and the video eventually reaches canplaythrough. This happens only on iOS Safari, not macOS Safari. We suspect it's because videoElement.src was set and .load() was called while the videoElement was not yet attached to any div, so iOS decided to stop it to save battery. But that's a completely uneducated guess, and any help would be appreciated. Thanks!
Posted by tyrelltle. Last updated.
Post marked as solved
4 Replies
6.1k Views
Hello there, our team was asked to add the ability to manually select the video quality. I know that HLS is an adaptive stream and that, depending on network conditions, it chooses the best quality that fits the current situation. I tried settings such as preferredMaximumResolution and preferredPeakBitRate, but neither worked once the user was already watching the stream. I also tried replacing the currentPlayerItem with the new configuration, but that only allowed me to downgrade the quality of the video. When I wanted to set it to, for example, 4K, the player did not switch to that track even if I set very high values for both parameters mentioned above. My question is whether there is any method that would allow me to force a certain quality from the manifest file. I already have an extraction step that parses the manifest file and provides all the available information, but I still couldn't figure out how to make the player reproduce a specific stream with my desired quality from the available playlist.
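For context, a sketch of the capping approach the post refers to. As far as I know, both properties only set upper bounds on the variant the player may pick, which is consistent with only downgrades taking effect; I'm not aware of a public API that pins playback to one specific variant. (manifestURL, player, and the values below are illustrative.)

import AVFoundation
import CoreGraphics

// Cap adaptive selection from above; neither property can force an upgrade.
let item = AVPlayerItem(url: manifestURL)
item.preferredPeakBitRate = 1_500_000 // ~1.5 Mbps ceiling
item.preferredMaximumResolution = CGSize(width: 1280, height: 720)
player.replaceCurrentItem(with: item)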
Posted. Last updated.
Post not yet marked as solved
1 Reply
539 Views
Hard/software: iPad 7, iPadOS 17.0.3, MobileVLCKit, SDP/RTSP stream, USB/Ethernet cable. Description: I have an iPad application for live video streaming. For security reasons, it must work over cable (USB/Ethernet). The application uses the MobileVLCKit player to play videos. Videos are available via URLs with an IP address and use the SDP/RTSP format (URL example: http://172.20.129.69/stream/sdp). Code example:

let camUri = URL(string: "http://172.20.129.69/stream/sdp")! // VLCMedia expects a URL, not a String
let options = ["--network-caching=100"]
var players: [VLCMediaPlayer] = []
players.append(VLCMediaPlayer(options: options))
let media = VLCMedia(url: camUri)
players[0].media = media
players[0].drawable = centerView
players[0].play()

Problem: the application worked on various iPads with iPadOS 16 and lower (over cable or Wi-Fi). After updating to iPadOS 17, the application no longer works over cable. What could be the problem, and how can it be fixed?
Posted. Last updated.
Post not yet marked as solved
0 Replies
329 Views
I am trying to play multiple HLS streams with AVPlayer. Starting with 4 cameras everything works fine, but whenever I change cameras the player gets stuck in AVPlayer.TimeControlStatus.waitingToPlayAtSpecifiedRate. This works fine on iOS 15 but has issues on iOS 16 and 17. I have checked with different network speeds, so bandwidth is fine.
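One diagnostic worth noting here: when AVPlayer sits in .waitingToPlayAtSpecifiedRate, reasonForWaitingToPlay says why. A minimal sketch, assuming player is the stuck instance:

import AVFoundation

// KVO on timeControlStatus; print the waiting reason when stuck.
// Keep `observation` alive for as long as the logging is needed.
let observation = player.observe(\.timeControlStatus, options: [.new]) { player, _ in
    if player.timeControlStatus == .waitingToPlayAtSpecifiedRate {
        print("waiting because: \(player.reasonForWaitingToPlay?.rawValue ?? "unknown")")
    }
}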
Posted. Last updated.
Post not yet marked as solved
0 Replies
753 Views
Hello community, I'm encountering a perplexing issue with my React Native app on iOS, specifically related to network connectivity when making API calls using Axios. The problem manifests as intermittent network errors, and what's even more puzzling is that the issue seems to be specific to iOS devices and Apple browsers. Here's a brief overview of the problem:
Intermittent network errors: Occasionally, when making API calls using Axios in my React Native app on iOS, I receive network errors. The strange part is that this issue is sporadic and doesn't occur consistently.
Works on cellular network: When the app encounters these network issues on WiFi, I've observed that switching to a cellular network resolves the problem, and the API calls start working again.
Android and other devices are unaffected: Interestingly, the app works flawlessly on Android devices and other platforms. The issue appears to be isolated to iOS and Apple browsers.
Has anyone else in the community faced a similar problem, or does anyone have insights into what might be causing this? I've already ruled out general connectivity issues, as the app works perfectly on other devices and networks. Any suggestions, tips, or shared experiences would be greatly appreciated. I'm open to trying out different approaches or debugging techniques to get to the bottom of this issue. Thanks in advance for your assistance!
Posted. Last updated.
Post not yet marked as solved
0 Replies
534 Views
Hi there, I want to start playing a stream of audio as soon as the first chunk arrives. The data stream looks like this: Audio_data, Delimiter, Json_data. Currently I handle all chunks before the delimiter and add them to the queue of the AVQueuePlayer. However, playing the audio during the stream produces many glitches and does not work well. Waiting until all chunks have arrived and then playing the audio works well, so I assume the problem is not with the audio data but with handling the chunks as they come and playing them immediately. Happy about any advice you have! I am pretty lost right now. Thank you so much.

import SwiftUI
import AVFoundation

struct AudioStreamView: View {
    @State private var players: [AVAudioPlayer] = []
    @State private var jsonString: String = ""
    @State private var queuePlayer = AVQueuePlayer()
    var streamDelegate = AudioStreamDelegate() // defined elsewhere in the project

    var body: some View {
        VStack(spacing: 20) {
            Button("Fetch Stream") {
                fetchDataFromServer()
            }
            .padding()
            TextEditor(text: $jsonString)
                .disabled(true)
                .border(Color.gray)
                .padding()
                .frame(minHeight: 200, maxHeight: .infinity)
        }
    }

    func fetchDataFromServer() {
        guard let url = URL(string: "https://dev-sonia.riks0trv4c6ns.us-east-1.cs.amazonlightsail.com/voiceMessage") else { return }
        var request = URLRequest(url: url)
        request.httpMethod = "POST" // Specify the request type as POST
        let parameters: [String: Any] = [
            "message_uuid": "value1",
            "user_uuid": "68953DFC-B9EA-4391-9F32-0B36A34ECF56",
            "session_uuid": "value3",
            "timestamp": "value4",
            "voice_message": "Whats up?"
        ]
        request.httpBody = try? JSONSerialization.data(withJSONObject: parameters, options: .fragmentsAllowed)
        request.addValue("application/json", forHTTPHeaderField: "Content-Type")
        // Note: a View is a struct, so a `[weak self]` capture does not compile here;
        // the view value is captured directly instead.
        let task = URLSession.shared.dataTask(with: request) { (data, response, error) in
            if let error = error {
                print("Error occurred: \(error)")
                return
            }
            // You might want to handle the server's response more effectively based on
            // the API's design. For now, assume the server returns the audio URL in the
            // response JSON.
            if let data = data {
                do {
                    if let jsonResponse = try JSONSerialization.jsonObject(with: data, options: []) as? [String: Any],
                       let audioURLString = jsonResponse["audioURL"] as? String,
                       let audioURL = URL(string: audioURLString) {
                        DispatchQueue.main.async {
                            self.playAudioFrom(url: audioURL)
                            self.jsonString = String(data: data, encoding: .utf8) ?? "Invalid JSON"
                        }
                    } else {
                        print("Invalid JSON structure.")
                    }
                } catch {
                    print("JSON decoding error: \(error)")
                }
            }
        }
        task.resume()
    }

    func playAudioFrom(url: URL) {
        let playerItem = AVPlayerItem(url: url)
        queuePlayer.replaceCurrentItem(with: playerItem)
        queuePlayer.play()
    }
}
Posted by dustin_kl. Last updated.
Post not yet marked as solved
10 Replies
6.1k Views
Hi, I'm working on an app where a user can scroll through a feed of short videos (a bit like TikTok). I do some pre-fetching of videos a couple positions ahead of the user's scroll position, so the video can start playing as soon as he or she scrolls to the video. Currently, I just pre-fetch by initializing a few AVPlayers. However, I'd like to add a better caching system. I'm looking for the best way to: get videos to start playing as possible, while making sure to minimize re-downloading of videos if a user scrolls away from a video and back to it. Is there a way that I can cache the contents of an AVPlayer that has loaded an HLS video? Alternatively, I've explored using AVAssetDownloadTask to download the HLS videos. My issue is that I can't download the full video and then play it - I need to start playing the video as soon as the user scrolls to it, even if it's not done downloading. Is there a way to start an HLS download with an AVAssetDownloadTask, and then start playing the video while it continues downloading? Thank you!
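On the last question: an AVAssetDownloadTask's urlAsset can be handed to an AVPlayerItem while the download is still in flight, so playback and download share the same data. A minimal sketch of that pattern (the session identifier, asset title, and class name are illustrative):

import AVFoundation

final class FeedDownloader: NSObject, AVAssetDownloadDelegate {
    func urlSession(_ session: URLSession, assetDownloadTask: AVAssetDownloadTask,
                    didFinishDownloadingTo location: URL) {
        print("Stored at \(location)") // persist this URL for offline reuse
    }

    func startAndPlay(url: URL, with player: AVPlayer) {
        let config = URLSessionConfiguration.background(withIdentifier: "feed.downloads")
        let session = AVAssetDownloadURLSession(configuration: config,
                                                assetDownloadDelegate: self,
                                                delegateQueue: .main)
        guard let task = session.makeAssetDownloadTask(asset: AVURLAsset(url: url),
                                                       assetTitle: "Feed video",
                                                       assetArtworkData: nil,
                                                       options: nil) else { return }
        task.resume()
        // Play immediately from the same asset; the download keeps going.
        player.replaceCurrentItem(with: AVPlayerItem(asset: task.urlAsset))
        player.play()
    }
}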
Posted by porges. Last updated.
Post not yet marked as solved
0 Replies
351 Views
I've been working with the eligibleForHDRPlayback property to determine whether HDR playback is supported. However, I've noticed an inconsistency: when the video format is switched from HDR to SDR in the Settings menu on Apple TV, the property still returns true, indicating HDR is playable even when it's not (this seems to contradict what is mentioned around the 20:40 mark of this WWDC video). I've tried using eligibleForHDRPlaybackDidChangeNotification and even restarted the app, but I still encounter the same issue. Are there alternative approaches to accurately determine whether the app can play HDR content on Apple TV?
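For reference, the observation pattern described above looks like the sketch below; whether the property actually tracks the Apple TV video-format setting is exactly what is in question here.

import AVFoundation

// Class-level eligibility check plus the change notification.
print("eligible now: \(AVPlayer.eligibleForHDRPlayback)")
NotificationCenter.default.addObserver(
    forName: AVPlayer.eligibleForHDRPlaybackDidChangeNotification,
    object: nil,
    queue: .main
) { _ in
    print("eligibility changed: \(AVPlayer.eligibleForHDRPlayback)")
}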
Posted by ferasa. Last updated.
Post not yet marked as solved
2 Replies
2.1k Views
I am trying to insert timed metadata (ID3) into a live HLS stream created with Apple's mediastreamsegmenter tool. I am getting the video from an ffmpeg stream; here is the command I run to test from an existing file:

ffmpeg -re -i vid1.mp4 -vcodec libx264 -acodec aac -f mpegts - | mediastreamsegmenter -f /Users/username/Sites/video -s 10 -y test -m -M 4242 -l log.txt

To inject metadata, I run this command:

id3taggenerator -text '{"x":"data dan","y":"36"}' -a localhost:4242

This setup creates the expected .ts files and I can play back the video/audio with no issues. However, the metadata I am attempting to insert does not work in the final file. I know the metadata is there in some form: when I file-compare a no-metadata version of the video to one I injected metadata into, I can see the ID3 tags within the binary data.

Bad file analysis

When I analyze the generated files using ffmpeg (ffmpeg -i video1.ts), the output I get is:

[mpegts @ 0x7fb00a008200] start time for stream 2 is not set in estimate_timings_from_pts
[mpegts @ 0x7fb00a008200] stream 2 : no TS found at start of file, duration not set
[mpegts @ 0x7fb00a008200] Could not find codec parameters for stream 2 (Audio: mp3, 0 channels): unspecified frame size
Consider increasing the value for the 'analyzeduration' (0) and 'probesize' (5000000) options
Input #0, mpegts, from 'video1.ts':
  Duration: 00:00:10.02, start: 0.043444, bitrate: 1745 kb/s
  Program 1
    Stream #0:0[0x100]: Video: h264 (High) ([27][0][0][0] / 0x001B), yuv420p(tv, bt709, progressive), 848x464 [SAR 1:1 DAR 53:29], 30 fps, 30 tbr, 90k tbn, 60 tbc
    Stream #0:1[0x101]: Audio: aac (LC) ([15][0][0][0] / 0x000F), 44100 Hz, stereo, fltp, 130 kb/s
  No Program
    Stream #0:2[0x102]: Audio: mp3, 0 channels

Note how the third stream (stream #0:2) is marked as mp3... this is incorrect! It is also listed under "No Program" instead of "Program 1". When I analyze a properly encoded video file with inserted ID3 metadata that I created with Apple's mediafilesegmenter tool, the analysis shows a "timed_id3" track, and this metadata track works properly in my web browser.

Good file analysis

ffmpeg -i video1.ts

Input #0, mpegts, from 'video1.ts':
  Duration: 00:00:10.08, start: 19.984578, bitrate: 1175 kb/s
  Program 1
    Stream #0:0[0x101]: Video: h264 (High) ([27][0][0][0] / 0x001B), yuv420p(tv, bt709, progressive), 848x464, 30 fps, 30 tbr, 90k tbn, 180k tbc
    Stream #0:1[0x102]: Audio: aac (LC) ([15][0][0][0] / 0x000F), 44100 Hz, stereo, fltp, 67 kb/s
    Stream #0:2[0x103]: Data: timed_id3 (ID3  / 0x20334449)

I must use mediastreamsegmenter because that is required for live streams. Does anyone know how I can get timed ID3 metadata into a live HLS stream properly?
Posted by danfarrow. Last updated.
Post not yet marked as solved
2 Replies
423 Views
For live content, if the user pauses playback and their playhead falls out of the valid DVR window, AVPlayer stops reporting seekableTimeRanges. Why does this happen, and is there a workaround? For example: suppose we are streaming content with a 5-minute DVR window and the user pauses playback for 6 minutes. Their current position is now outside the valid seekable time range, and AVPlayer stops reporting seekableTimeRanges altogether. This is problematic for two reasons: (1) we have observed that AVPlayer generally becomes unresponsive when this happens, i.e. any seek action causes the player to freeze up; (2) without knowing the seekable range, we don't know how to return the user to the live edge when they resume playback. This seems to be the same issue described in this thread: https://developer.apple.com/forums/thread/45850
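For the second point, the usual way back to the live edge is to seek to the end of the last reported seekable range, as in the sketch below (assuming player exists); the problem described above is precisely that this range can stop being reported after a long pause.

import AVFoundation

// Seek to the live edge using the last seekable time range, if any.
if let range = player.currentItem?.seekableTimeRanges.last?.timeRangeValue {
    player.seek(to: range.end) { _ in
        player.play()
    }
}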
Posted. Last updated.
Post not yet marked as solved
1 Reply
563 Views
Hi guys, setting AVPlayerViewController.transportBarCustomMenuItems is not working on tvOS; I still see the 2 default icons for Audio and Subtitles.

let menuItemAudioAndSubtitles = UIMenu(
    image: UIImage(systemName: "heart")
)
playerViewController.transportBarCustomMenuItems = [menuItemAudioAndSubtitles]

The WWDC 2021 video (https://developer.apple.com/videos/play/wwdc2021/10191/) is insufficient to make this work; it doesn't say what exactly I need to do. Do I need to disable the subtitle options?

viewController.allowedSubtitleOptionLanguages = []

This didn't work, and I still see the default icon loaded by the player. Do I need to create a subclass of AVPlayerViewController? I just want to replace those 2 default icons with 1 icon as a test, but I was unsuccessful after many hours of work. Is it mandatory to define child menu items for the main item? Or do I perhaps need to define a UIAction? The documentation and video are insufficient in providing guidance on how to do that. I did something like this before, but that was more than 3 years ago, when audio and subtitles were showing at the top of the player screen as tabs, if I remember correctly. Is transportBarCustomMenuItems perhaps deprecated? Is it possible that when an AVPlayerItem loads and detects audio and subtitles in the stream, it automatically resets the AVPlayerViewController menu? How do I suppress this behavior? I'm currently loading AVPlayerViewController into a SwiftUI interface. Is that perhaps the problem? Should I write a SwiftUI player overlay from scratch? Thanks, Robert
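For comparison, a sketch of the shape this API seems to expect on tvOS 15+: a UIMenu whose children are UIActions. The menu title, icon, and rate-setting are illustrative, and this adds a custom item rather than guaranteeing the stock icons are replaced (assuming playerViewController exists):

import AVKit
import UIKit

// Build a custom transport-bar menu out of child UIActions.
let actions = ["0.5x", "1x", "2x"].map { title in
    UIAction(title: title) { _ in
        // hypothetical: set the player rate here
    }
}
let speedMenu = UIMenu(title: "Speed",
                       image: UIImage(systemName: "speedometer"),
                       children: actions)
playerViewController.transportBarCustomMenuItems = [speedMenu]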
Posted. Last updated.
Post not yet marked as solved
0 Replies
523 Views
Hello, I'm trying to create a working clear-key HLS stream using Shaka-packager, which supports only SAMPLE-AES. As implemented, it looks like the resulting stream is not playable in the Safari browser. This is how the example M3U8 file starts:

#EXTM3U
#EXT-X-VERSION:6
## Generated with https://github.com/google/shaka-packager version v2.6.1-634af65-release
#EXT-X-TARGETDURATION:13
#EXT-X-PLAYLIST-TYPE:VOD
#EXT-X-MAP:URI="audio_und_2c_128k_aac_init.mp4"
#EXT-X-KEY:METHOD=SAMPLE-AES,URI="https://www.httpstest.com:771/key.key",IV=0x00000000000000000000000000000000,KEYFORMAT="identity"
#EXTINF:10.008,
audio_und_2c_128k_aac_1.mp4
#EXTINF:10.008,
audio_und_2c_128k_aac_2.mp4
#EXTINF:9.985,
audio_und_2c_128k_aac_3.mp4
#EXTINF:10.008,
audio_und_2c_128k_aac_4.mp4
#EXTINF:10.008,
audio_und_2c_128k_aac_5.mp4
#EXTINF:9.985,
audio_und_2c_128k_aac_6.mp4
#EXTINF:0.093,
audio_und_2c_128k_aac_7.mp4
#EXT-X-ENDLIST

I'm looking for: A. a working example of a SAMPLE-AES clear-key encrypted HLS stream (so I can learn from it how it should be defined for iOS); B. help on how to create a working clear-key HLS stream for iOS/macOS (Safari).
Posted. Last updated.
Post not yet marked as solved
1 Reply
3.8k Views
When describing closed captions renditions through HLS master playlists, should a properly formed playlist with closed captions contain one closed-caption rendition with the DEFAULT=YES attribute? I searched the HLS RFC, and section 4.3.4.1 on EXT-X-MEDIA mentions that no more than one should contain the DEFAULT=YES attribute. I was hoping to find recommendations on whether, with one or more EXT-X-MEDIA renditions in the same group, it is required that one of them contain DEFAULT=YES. The media player I am using does not display closed captions if no closed-caption rendition contains the DEFAULT=YES attribute, and I am wondering whether that is an issue with the player or with a malformed HLS playlist. Note the lack of a DEFAULT=YES attribute in the example playlist below. Should this still be considered a valid playlist?

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-MEDIA:TYPE=CLOSED-CAPTIONS,GROUP-ID="CC",LANGUAGE="eng",NAME="English",INSTREAM-ID="CC1"
#EXT-X-STREAM-INF:BANDWIDTH=2103200,AVERAGE-BANDWIDTH=2305600,CODECS="avc1.640029,mp4a.40.2",RESOLUTION=960x540,FRAME-RATE=30.000,CLOSED-CAPTIONS="CC"
https://the.link.to.my.stream
#EXT-X-STREAM-INF:BANDWIDTH=804760,AVERAGE-BANDWIDTH=875600,CODECS="avc1.640029,mp4a.40.2",RESOLUTION=640x360,FRAME-RATE=30.000,CLOSED-CAPTIONS="CC"
https://the.link.to.my.stream
#EXT-X-STREAM-INF:BANDWIDTH=1304160,AVERAGE-BANDWIDTH=1425600,CODECS="avc1.640029,mp4a.40.2",RESOLUTION=768x432,FRAME-RATE=30.000,CLOSED-CAPTIONS="CC"
https://the.link.to.my.stream
#EXT-X-STREAM-INF:BANDWIDTH=505120,AVERAGE-BANDWIDTH=545600,CODECS="avc1.640029,mp4a.40.2",RESOLUTION=480x270,FRAME-RATE=30.000,CLOSED-CAPTIONS="CC"
https://the.link.to.my.stream
#EXT-X-STREAM-INF:BANDWIDTH=3102000,AVERAGE-BANDWIDTH=3405600,CODECS="avc1.640029,mp4a.40.2",RESOLUTION=1280x720,FRAME-RATE=30.000,CLOSED-CAPTIONS="CC"
https://the.link.to.my.stream
Posted by smithks. Last updated.
Post not yet marked as solved
0 Replies
443 Views
I'm working on a live-streaming encoder, but I cannot get playback of a live stream to start correctly when the stream type (#EXT-X-PLAYLIST-TYPE) is EVENT. When I try to play it, the stream starts from the beginning, not the current position. When I tested on iPhone 7 (iOS 15.7.8), live streams start correctly, but on iPhone 8 Plus (iOS 16.6) they start from the beginning. (I tested on other iPhones with iOS 16 or later, and the result is the same.) I also tried adding the "#EXT-X-START:TIME-OFFSET" tag, but it didn't work. Is this behavior a bug, or do I have to add some tag to make it play?
Posted by S_Yama. Last updated.
Post not yet marked as solved
0 Replies
322 Views
Hi, I have HLS content, i.e. an .m3u8 manifest file, but the segments are encoded with MPEG-2 video. Is such an encoding supported by HLS, or does HLS only support H.264/AVC and HEVC/H.265?

Stream #0:0[0x281]: Video: mpeg2video (Main) ([2][0][0][0] / 0x0002), yuv420p(tv, bt470bg, top first), 720x576 [SAR 16:15 DAR 4:3], 3125 kb/s, 25 fps, 25 tbr, 90k tbn
Stream #0:1[0x201]: Audio: mp2 ([3][0][0][0] / 0x0003), 48000 Hz, stereo, fltp, 128 kb/s

Thanks.
Posted by Sneh. Last updated.