HTTP Live Streaming

Send audio and video over HTTP from an ordinary web server for playback on Mac, iOS, and tvOS devices using HTTP Live Streaming (HLS).

HTTP Live Streaming Documentation

Posts under HTTP Live Streaming tag

102 results found

URLSessionWebSocketTask SIP Register

I can connect to my SIP server (Asterisk), but my REGISTER message is not getting a response, and then I get disconnected 30 seconds later. Does anyone have a working example, or maybe a tutorial or video on how to do this? This is the REGISTER request I'm sending:

REGISTER sip:\(host) SIP/2.0
Via: SIP/2.0/WSS ij5094obipus.invalid;branch=\(branch)
Max-Forwards: 69
To: <sip:\(username)@\(host)>
From: <sip:\(username)@\(host)>;tag=\(tag)
Call-ID: \(callID)
CSeq: 2 REGISTER
Contact: <sip:pqqr773g@3tb56aj0gkb8.invalid;transport=ws>;+sip.ice;reg-id=1;+sip.instance="<urn:uuid:\(uuid)>";expires=60
Expires: 60
Allow: INVITE,ACK,CANCEL,BYE,UPDATE,MESSAGE,OPTIONS,REFER,INFO,NOTIFY
Supported: path,gruu,outbound
User-Agent: CoNameWebRTC-v1.5
Content-Length: 0
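For what it's worth, here is a minimal sketch of opening the WebSocket and sending one SIP message with URLSessionWebSocketTask. The URL, the "sip" subprotocol, and the trimmed-down REGISTER text are placeholders and assumptions, not a known-good Asterisk configuration:

import Foundation

// Sketch: open a WSS connection and send a single SIP REGISTER.
// The URL and the "sip" subprotocol are assumptions about the Asterisk setup.
let url = URL(string: "wss://sip.example.com:8089/ws")!
let task = URLSession.shared.webSocketTask(with: url, protocols: ["sip"])
task.resume()

// Headers trimmed for brevity; use the full header set from the post.
let register = """
REGISTER sip:sip.example.com SIP/2.0
CSeq: 2 REGISTER
Content-Length: 0

"""

// SIP requires CRLF line endings, which Swift multi-line string literals do not produce.
let wireMessage = register.replacingOccurrences(of: "\n", with: "\r\n")

task.send(.string(wireMessage)) { error in
    if let error = error {
        print("send failed: \(error)")
    }
}

// Read the server's reply (e.g. a 401 digest challenge followed by 200 OK).
task.receive { result in
    switch result {
    case .success(let message):
        print("received: \(message)")
    case .failure(let error):
        print("receive failed: \(error)")
    }
}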

iOS 15 - HLS parallel download issue

Hi, we are using AVPlayer for an FPL HLS stream. After migrating to iOS 15 (currently Beta 3), we have observed strange behavior during segment download:

- The player downloads the last video segment of the VOD HLS stream.
- The player downloads all audio segments.
- Playback starts playing from the end (the last segment).
- Playback needs to be restarted before all video segments start downloading.

Note that we do not have this behavior with older versions of iOS (14 and before).
m3u8 file
Thank you,
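Not a fix, but one way to see exactly which segments the player requested, and in what order, is to dump the AVPlayerItem access log; the API below is standard AVFoundation, only the print formatting is illustrative:

import AVFoundation

// Diagnostic sketch: list the segments AVPlayer actually fetched for the item.
func dumpAccessLog(for item: AVPlayerItem) {
    guard let events = item.accessLog()?.events else { return }
    for event in events {
        print("uri:", event.uri ?? "-",
              "| media requests:", event.numberOfMediaRequests,
              "| downloaded duration:", event.segmentsDownloadedDuration)
    }
}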
Asked by yolotn.

Problems with HLS in iOS15 / tvOS15 - requires alignment of video and audio segment length

We have live HLS streams with separate AAC audio for multi-language tracks. Previously we had 6-second segments, which led to a small variation in audio segment length, although this played back everywhere on iOS 14/tvOS 14. On iOS 15/tvOS 15 these streams fail to play unless we align audio and video segment lengths, which in turn requires us to run 4- or 8-second segments. To get back to 6 seconds, we would need to sample at a much higher rate (192 kHz), as opposed to the 96 kHz we have working today, or the 48 kHz we had working previously. Has anyone else spotted this?
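For context, the small variation comes from AAC framing: each AAC frame carries 1024 samples, so an audio segment can only end on a frame boundary. A quick sketch of the arithmetic, using the 6-second target and the sample rates mentioned in the post:

// Each AAC frame is 1024 samples, so audio segment durations are quantized
// to multiples of 1024 / sampleRate. This prints the closest achievable
// duration to a 6-second target for the sample rates from the post.
let targetSeconds = 6.0
let samplesPerFrame = 1024.0

for sampleRate in [48_000.0, 96_000.0, 192_000.0] {
    let frameDuration = samplesPerFrame / sampleRate
    let frames = (targetSeconds / frameDuration).rounded()
    let achievable = frames * frameDuration
    print("\(Int(sampleRate)) Hz -> \(achievable) s (\(Int(frames)) frames)")
}
// Only 192 kHz lands exactly on 6.0 s; 48 kHz and 96 kHz fall slightly off target.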
Asked by imbimp.

HLS HEVC+Alpha doesn't work as SceneKit material

Hi, when I use a local .mp4 video file encoded in HEVC + alpha channel with an AVPlayer as the material of a SCNNode, the transparency is rendered correctly, just as if I used a .png image with transparency. The issue: when I encode this same .mp4 file into an HLS stream using mediafilesegmenter and try to play it in the same manner, as a SCNNode material with AVPlayer, the transparency is not rendered and the transparent zones are instead filled with opaque black. (The HLS stream has correct transparency, as verified by opening its URL in Safari.) Sample test:

import UIKit
import ARKit
import AVFoundation

class ViewController: UIViewController {

    private var arView: ARSCNView!

    lazy var sphere: SCNNode = {
        let node = SCNSphere(radius: 5)
        node.isGeodesic = false
        node.segmentCount = 64
        node.firstMaterial?.lightingModel = .constant
        node.firstMaterial?.diffuse.contents = #colorLiteral(red: 0, green: 0, blue: 0, alpha: 0)
        node.firstMaterial?.cullMode = .front
        return SCNNode(geometry: node)
    }()

    private var avPlayer: AVPlayer!

    override func viewDidLoad() {
        super.viewDidLoad()
        setupArView()
        setupArSession()
        setupButton()
    }

    private func setupButton() {
        let button = UIButton()
        button.setTitle("START", for: .normal)
        button.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(button)
        NSLayoutConstraint.activate([
            button.centerXAnchor.constraint(equalTo: view.centerXAnchor),
            button.centerYAnchor.constraint(equalTo: view.centerYAnchor)
        ])
        button.addTarget(self, action: #selector(createSphere), for: .touchUpInside)
    }

    @IBAction func createSphere() {
        guard avPlayer == nil else { return }
        addSphere()
    }
}

extension ViewController {

    private func setupArView() {
        arView = ARSCNView()
        arView.backgroundColor = .black
        arView.translatesAutoresizingMaskIntoConstraints = false
        view.insertSubview(arView, at: 0)
        NSLayoutConstraint.activate([
            arView.leadingAnchor.constraint(equalTo: view.leadingAnchor),
            arView.topAnchor.constraint(equalTo: view.topAnchor),
            arView.trailingAnchor.constraint(equalTo: view.trailingAnchor),
            arView.bottomAnchor.constraint(equalTo: view.bottomAnchor)
        ])
        arView.preferredFramesPerSecond = 60
    }

    private func setupArSession() {
        let configuration = ARWorldTrackingConfiguration()
        configuration.worldAlignment = .gravityAndHeading
        configuration.environmentTexturing = .none
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
            configuration.frameSemantics.insert(.personSegmentationWithDepth)
        }
        if ARWorldTrackingConfiguration.supportsUserFaceTracking {
            configuration.userFaceTrackingEnabled = true
        }
        arView.session.run(configuration)
    }

    private func addSphere() {
//        let asset = AVURLAsset(url: URL(string: "https://SOMECLOUDSTORAGE.com/hls-bug/prog_index.m3u8")!)
        let asset = AVURLAsset(url: Bundle.main.url(forResource: "puppets", withExtension: "mp4")!)
        let playerItem = AVPlayerItem(asset: asset)
        playerItem.addObserver(self, forKeyPath: #keyPath(AVPlayerItem.status), options: [.old, .new], context: nil)
        playerItem.isAudioSpatializationAllowed = true
        playerItem.allowedAudioSpatializationFormats = .monoStereoAndMultichannel
        avPlayer = AVPlayer()
        sphere.position = SCNVector3(0, 0, 0)
        arView.scene.rootNode.addChildNode(sphere)
        avPlayer.replaceCurrentItem(with: playerItem)
    }

    override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
        if keyPath == #keyPath(AVPlayerItem.status) {
            let status: AVPlayerItem.Status
            if let statusNumber = change?[.newKey] as? NSNumber {
                status = AVPlayerItem.Status(rawValue: statusNumber.intValue)!
            } else {
                status = .unknown
            }
            switch status {
            case .readyToPlay:
                DispatchQueue.main.async {
                    self.avPlayer.playImmediately(atRate: 1)
                    self.sphere.geometry?.firstMaterial?.diffuse.contents = self.avPlayer
                }
            case .failed, .unknown:
                break
            @unknown default:
                break
            }
        } else {
            super.observeValue(forKeyPath: keyPath, of: object, change: change, context: context)
        }
    }
}

The video file used is the puppets_with_alpha_hevc.mov file from Apple's HEVC alpha demo, which I re-muxed into an .mp4 container using ffmpeg. To reproduce both scenarios, replace the AVURLAsset with either the local .mp4 file or the HLS stream URL. The issue is reproduced on an iPhone 11 Pro running iOS 15. This issue has been unresolved for a while now, although I have tried everything to get attention: an unsuccessful TSI ticket, a silent Feedback Assistant bug report, and I even discussed this bug during WWDC 2021 with Shiva Sundar, who is in charge of HEVC development and said it would be checked. Hopes
Asked by mehdiklk.

USB Live Stream not working in macOS 12

Dear Apple experts, our project uses the libUSB library to interact with a USB-based camera device. Our application works fine on macOS Mojave (10.14.6). When the new macOS 12 beta version was made available, we tested our code, but when we try to claim the interface via the "CreateInterfaceIterator" API we get the "kIOReturnExclusiveAccess" error code and ultimately our application fails. The failure is observed with both libUSB versions 1.0.23 and 1.0.24. Could you help us by explaining whether there is a change in the new OS with respect to access to USB devices?
Asked by Akshit04.

Changing "master manifest file" HLS terminology to "main manifest file" to make it more inclusive

Hi, the term "master" is out of favor in the computing world and beyond. To make codebases more inclusive, we are wondering whether the "master manifest file" HLS terminology could be changed to "main manifest file". This would also help us shift away from the non-inclusive term in our organization's codebase. Thanks!
Asked by agadekar.

How to insert timed metadata (id3) into live HLS files with Apple's mediastreamsegmenter and ffmpeg

I am trying to insert timed metadata (ID3) into a live HLS stream created with Apple's mediastreamsegmenter tool. I am getting the video from an ffmpeg stream; here is the command I run to test from an existing file:

ffmpeg -re -i vid1.mp4 -vcodec libx264 -acodec aac -f mpegts - | mediastreamsegmenter -f /Users/username/Sites/video -s 10 -y test -m -M 4242 -l log.txt

To inject metadata, I run this command:

id3taggenerator -text '{"x":"data dan","y":"36"}' -a localhost:4242

This setup creates the expected .ts files and I can play back the video/audio with no issues. However, the metadata I am attempting to insert does not work in the final file. I know the metadata is there in some form: when I file-compare a no-metadata version of the video to one I injected metadata into, I can see the ID3 tags within the binary data.

Bad file analysis

When I analyze the generated files using ffmpeg (ffmpeg -i video1.ts), the output I get is:

[mpegts @ 0x7fb00a008200] start time for stream 2 is not set in estimate_timings_from_pts
[mpegts @ 0x7fb00a008200] stream 2 : no TS found at start of file, duration not set
[mpegts @ 0x7fb00a008200] Could not find codec parameters for stream 2 (Audio: mp3, 0 channels): unspecified frame size
Consider increasing the value for the 'analyzeduration' (0) and 'probesize' (5000000) options
Input #0, mpegts, from 'video1.ts':
  Duration: 00:00:10.02, start: 0.043444, bitrate: 1745 kb/s
  Program 1
    Stream #0:0[0x100]: Video: h264 (High) ([27][0][0][0] / 0x001B), yuv420p(tv, bt709, progressive), 848x464 [SAR 1:1 DAR 53:29], 30 fps, 30 tbr, 90k tbn, 60 tbc
    Stream #0:1[0x101]: Audio: aac (LC) ([15][0][0][0] / 0x000F), 44100 Hz, stereo, fltp, 130 kb/s
  No Program
    Stream #0:2[0x102]: Audio: mp3, 0 channels

Note how the third stream (Stream #0:2) is marked as mp3... this is incorrect! Also, it says "No Program" instead of being in "Program 1". When I analyze a properly encoded video file with inserted ID3 metadata that I created with Apple's mediafilesegmenter tool, the analysis shows a "timed_id3" track, and this metadata track works properly in my web browser.

Good file analysis

ffmpeg -i video1.ts

Input #0, mpegts, from 'video1.ts':
  Duration: 00:00:10.08, start: 19.984578, bitrate: 1175 kb/s
  Program 1
    Stream #0:0[0x101]: Video: h264 (High) ([27][0][0][0] / 0x001B), yuv420p(tv, bt709, progressive), 848x464, 30 fps, 30 tbr, 90k tbn, 180k tbc
    Stream #0:1[0x102]: Audio: aac (LC) ([15][0][0][0] / 0x000F), 44100 Hz, stereo, fltp, 67 kb/s
    Stream #0:2[0x103]: Data: timed_id3 (ID3  / 0x20334449)

I must use mediastreamsegmenter because that is required for live streams. Does anyone know how I can get timed ID3 metadata into a live HLS stream properly?
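Not an answer on the segmenter side, but a quick way to confirm on the playback side whether the timed ID3 actually reaches the player is AVPlayerItemMetadataOutput; the class and delegate method below are standard AVFoundation, only the logging is illustrative:

import AVFoundation

// Sketch: log timed ID3 metadata as AVPlayer plays the stream.
final class MetadataLogger: NSObject, AVPlayerItemMetadataOutputPushDelegate {

    let output = AVPlayerItemMetadataOutput(identifiers: nil)

    func attach(to item: AVPlayerItem) {
        output.setDelegate(self, queue: .main)
        item.add(output)
    }

    func metadataOutput(_ output: AVPlayerItemMetadataOutput,
                        didOutputTimedMetadataGroups groups: [AVTimedMetadataGroup],
                        from track: AVPlayerItemTrack?) {
        for group in groups {
            for item in group.items {
                print("id3:", item.identifier?.rawValue ?? "-", item.stringValue ?? "-")
            }
        }
    }
}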
Asked by danfarrow.

HLS fmp4 support

I know that HLS supports fmp4, but I understand that to use fmp4 instead of TS you need to support CENC cbcs. cbcs is only supported on Android 7.0 or later, so if you do not use cbcs, can you still use fmp4?
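For reference, fragmented MP4 in HLS is declared with an EXT-X-MAP tag that points at the initialization segment, and encryption is a separate, optional concern. A minimal unencrypted fMP4 media playlist sketch (file names and durations are placeholders):

#EXTM3U
#EXT-X-VERSION:7
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:0
#EXT-X-MAP:URI="init.mp4"
#EXTINF:6.000,
segment0.m4s
#EXTINF:6.000,
segment1.m4s
#EXT-X-ENDLIST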
Asked by SangHyuk.

HLS: AVPlayer not reacting to connection loss on partly buffered assets

When the connection is cut before the track is fully buffered and the player reaches the end of the loaded time ranges, audio stops but the status is not updated:

elapsed time continues to send events
player.timeControlStatus = playing
player.status = readyToPlay
player.currentItem!.status = readyToPlay
currentItem!.isPlaybackLikelyToKeepUp = true

But an event is logged in errorLog(): "NSURLErrorDomain", errorStatusCode -1009. This results in weird behaviour where a progress bar keeps showing progress without any sound; it even continues beyond the total track duration. Reproducible in the demo app https://github.com/timstudt/PlayerApp:

- start playback and let it buffer to e.g. 70 s (loadedTimeRanges)
- activate flight mode
- seek to 60 s (the seek returns successful)
- watch: when the player reaches the 70 s mark, audio stops, but elapsed time keeps counting

Note: without seeking, the player stalls correctly at the 70 s mark.
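One way to surface this in the UI, rather than relying on timeControlStatus alone, is to listen for the stall and error-log notifications AVFoundation posts; the notification names and the errorLog() API below are standard, the handler bodies are only illustrative:

import AVFoundation

// Sketch: observe stalls and new error-log entries that playback status alone does not reflect.
func observePlaybackProblems(for item: AVPlayerItem) -> [NSObjectProtocol] {
    let center = NotificationCenter.default

    let stallToken = center.addObserver(forName: .AVPlayerItemPlaybackStalled,
                                        object: item, queue: .main) { _ in
        print("playback stalled")
    }

    let errorToken = center.addObserver(forName: .AVPlayerItemNewErrorLogEntry,
                                        object: item, queue: .main) { _ in
        // The latest entry typically carries the network error, e.g. NSURLErrorDomain -1009.
        if let entry = item.errorLog()?.events.last {
            print("error log:", entry.errorDomain, entry.errorStatusCode)
        }
    }

    // Keep these tokens alive for as long as you want to observe.
    return [stallToken, errorToken]
}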
Asked by tstudt.

Install HTTP Live Streaming Tools on M1 MacBook

For my app project I want to try the HTTP Live Streaming tools provided by Apple. I was able to download the .pkg file and follow the install instructions, but after a successful installation (the completion prompt appeared) I cannot find the programs anywhere. From the readme: "The HLS tools package requires an Intel-based Mac running macOS 10.15 Catalina or later." So is the problem the M1 chip? Is there a solution for it?
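As a first check (a sketch, not an official answer): the HLS tools are command-line programs, so look for them from Terminal rather than in /Applications, and an Intel-only binary can usually still be run under Rosetta 2 with the arch command. The binary name below is one of the tools in the package; your install location may differ:

# See whether the tools ended up on your PATH after installation
which mediafilesegmenter

# If the binary is Intel-only, try forcing it through Rosetta 2 on Apple silicon
# (install Rosetta first with: softwareupdate --install-rosetta)
arch -x86_64 mediafilesegmenter <your usual arguments>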

Caching video streamed via HLS

Hello everyone! Recently our backend team integrated video streaming via HLS; before that we had default HTTP streaming. With HTTP streaming this exporting code worked fine:

private func cacheFile(from asset: AVURLAsset) {
    guard asset.isExportable,
          let fileName = asset.url.pathComponents.last,
          let outputURL = self.cacheDirectory?.appendingPathComponent(fileName),
          !FileManager.default.fileExists(atPath: outputURL.path)
    else { return }

    asset.resourceLoader.setDelegate(self, queue: backgroundQueue.underlyingQueue)

    let exporter = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetHighestQuality)
    exporter?.outputURL = outputURL
    exporter?.determineCompatibleFileTypes(completionHandler: { types in
        guard let type = types.first else { return }
        exporter?.outputFileType = type
        exporter?.exportAsynchronously(completionHandler: {
            if let error = exporter?.error {
                print(error)
            }
        })
    })
}

This code works great with HTTP streaming, but for HLS asset.isExportable is false. After removing the check for asset.isExportable, exporter?.determineCompatibleFileTypes passes an empty array into its closure. If I set outputFileType to .mp4 or .mov, I receive an error inside the exportAsynchronously completion handler:

Error Domain=AVFoundationErrorDomain Code=-11838 "Operation Stopped" UserInfo={NSLocalizedFailureReason=The operation is not supported for this media., NSLocalizedDescription=Operation Stopped, NSUnderlyingError=0x6000025abd50 {Error Domain=NSOSStatusErrorDomain Code=-16976 "(null)"

Why does this happen? Can AVAssetExportSession not combine all parts of an .m3u8 into an .mp4? Is there any alternative way to cache video streamed via HLS?
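Not the original code, but the usual alternative for persisting HLS is AVAssetDownloadURLSession/AVAssetDownloadTask, which stores the stream as a .movpkg bundle instead of exporting it to .mp4. A minimal sketch (the session identifier and asset title are placeholders):

import AVFoundation

// Sketch: download an HLS asset for offline playback instead of exporting it.
final class HLSDownloader: NSObject, AVAssetDownloadDelegate {

    private lazy var session: AVAssetDownloadURLSession = {
        // The background configuration identifier is a placeholder.
        let config = URLSessionConfiguration.background(withIdentifier: "hls-downloads")
        return AVAssetDownloadURLSession(configuration: config,
                                         assetDownloadDelegate: self,
                                         delegateQueue: .main)
    }()

    func download(_ url: URL) {
        let asset = AVURLAsset(url: url)
        let task = session.makeAssetDownloadTask(asset: asset,
                                                 assetTitle: "Cached video",
                                                 assetArtworkData: nil,
                                                 options: nil)
        task?.resume()
    }

    // The system reports where the .movpkg landed; persist this URL and build
    // an AVURLAsset from it later for offline playback.
    func urlSession(_ session: URLSession,
                    assetDownloadTask: AVAssetDownloadTask,
                    didFinishDownloadingTo location: URL) {
        print("downloaded to:", location)
    }
}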
Asked by Chopyhoo.

Different EXT-X-DISCONTINUITY between audio and video

I have a live presentation where the encoder was not sending data during different time ranges for audio and video. For example:

Audio: 1, 2, 3, -, -, 6, -, -, 9
Video: 1, 2, 3, -, -, -, -, -, 9

Should the packager emit two discontinuity tags for audio, or just discard timestamp 6 and merge them into one discontinuity? Section 8.15 of the HLS authoring spec says: "All variants and renditions MUST have discontinuities at the same points in time." Does this mean that audio, video and text variants all need to align? However, the documentation for EXT-X-DISCONTINUITY-SEQUENCE seems to imply that audio and video may not necessarily have the same count of discontinuities. Please clarify which is correct and how the packager should have emitted the discontinuity tags above. Thanks!
Asked by nlin.

Bug with RTSP video: "Unable to determine our source address: This computer has an invalid IP address: 0.0.0.0"

Hello, I have a problem with my iOS application, which displays video streaming via MobileVlcKit. This feature has worked in my application for many months, but for the past few days I cannot see the video stream in my application. When I use the Xcode simulator, the video stream is displayed correctly, but when I launch a local build or a TestFlight build I get a black screen without my video stream. When I run a local build via the USB cord, I see this message in the debugging console: "Unable to determine our source address: This computer has an invalid IP address: 0.0.0.0". Can someone please help me?
Asked by CAMBOX44.

Issue with HLS on iOS devices

I'm having some issues with the generation of HLS (.ts) files. We are working on generating an HLS stream from an RTSP source (an IP camera) using FFmpeg. The issue is this: if we use FFmpeg directly to generate the HLS, we have no issues at all playing it on macOS, iPadOS and iOS. If we use the internal FFmpeg included inside our software (an .exe running on Windows), we have issues with iOS: the HLS won't play only on iOS devices (iPads and Macs work like a charm). We've tried to understand what the issue is, but we have not been able to dig up anything other than something related to Media Source Extensions. I've included links where you can find both the file generated by our software (streamATC.ts) and the one generated directly by FFmpeg (streamFF.ts), if you want to take a look. GitHub forum thread about this issue. Our repo with the .ts files. There is no difference between the two files from a codec standpoint, as you can see. Do you have any help in order to understand this strange behaviour?
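One suggestion that is not in the original post: the one-line codec summary can hide muxing differences, so it may help to dump the full stream and container parameters of both files with ffprobe and diff them:

ffprobe -hide_banner -show_format -show_streams streamATC.ts > atc.txt
ffprobe -hide_banner -show_format -show_streams streamFF.ts > ff.txt
diff atc.txt ff.txt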

create .m3u8

Hi all, I tried to create an .m3u8 playlist to stream a screen over IP to a TV with Chromecast, but it doesn't work as expected: it's laggy and can't display the picture. Looking forward to hearing from you, gentlemen.
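Since the post does not say how the playlist is produced, here is a generic baseline for generating an HLS playlist with ffmpeg (the input file, codecs and segment duration are placeholders; real-time screen capture and Chromecast delivery need more work than this):

ffmpeg -i input.mp4 -c:v libx264 -c:a aac -f hls \
       -hls_time 4 -hls_list_size 0 stream.m3u8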