HTTP Live Streaming


Send audio and video over HTTP from an ordinary web server for playback on Mac, iOS, and tvOS devices using HTTP Live Streaming (HLS).

Posts under HTTP Live Streaming tag

77 Posts

AVPlayer HLS: Restrict Stream Resolution
Hello, we have an HLS streaming app that uses AVPlayer. We want to let users pin a specific resolution: for example, if a user selects 1080p, the player should play only the 1080p variant. We have tried the preferredMaximumResolution and preferredPeakBitRate properties, but AVPlayer does not enforce them strictly. Setting preferredMaximumResolution = CGSize(width: 1920, height: 1080) only caps the resolution; the player still drops to 720p, which we do not want when the user has explicitly selected 1080p. Is there any way to force a single variant, even if the stream stalls? Thank you in advance.
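
For reference, a minimal sketch of the capping approach described above; these AVPlayerItem properties set an upper bound only and do not pin a single variant (the stream URL and bit-rate value are hypothetical):

    import AVFoundation

    let item = AVPlayerItem(url: URL(string: "https://example.com/master.m3u8")!) // hypothetical URL
    // Caps variant selection from above; AVPlayer may still switch to variants *below* the cap.
    item.preferredMaximumResolution = CGSize(width: 1920, height: 1080)
    item.preferredPeakBitRate = 8_000_000 // bits per second, hypothetical value
    let player = AVPlayer(playerItem: item)
    player.play()

A workaround sometimes used (an assumption, not confirmed in this thread) is to hand AVPlayer the 1080p media playlist directly instead of the multivariant playlist, which removes ABR switching entirely at the cost of losing fallback.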
Replies: 0 · Boosts: 0 · Views: 35 · Activity: 1d
tvOS 18 playback issues on 4K
We are observing 4K playback issues on tvOS 18: HTTP 416 (Range Not Satisfiable) errors when the player requests byte ranges outside the data available on the server. This leads to a fatal playback error: CoreMediaErrorDomain error -12939 - HTTP 416: Requested Range Not Satisfiable. Notably, there are no customizations or modifications to the standard AVPlayerViewController on tvOS. AVPlayer requests a resource 739 bytes long with an invalid open-ended byte range (739-); since the request is not satisfiable, the server returns 416. This is limited to tvOS 18, and we are trying to understand why AVPlayer makes this invalid request on tvOS 18, resulting in a playback error.
Replies: 1 · Boosts: 0 · Views: 79 · Activity: 6d
Spatial streaming from iPhone
Hi, I am trying to stream spatial video in real time from my iPhone 16. I am able to record spatial video to a file using:

    let videoDeviceOutput = AVCaptureMovieFileOutput()

However, when I try to grab the raw sample buffers, they don't include any spatial information:

    let captureOutput = AVCaptureVideoDataOutput()
    // when initializing the camera
    session.addOutput(captureOutput)
    captureOutput.setSampleBufferDelegate(self, queue: sessionQueue)

    // finally
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        // use sample buffer (but no spatial data available here)
    }

Is this how it's supposed to work, or am I missing something? This video: https://developer.apple.com/videos/play/wwdc2023/10071 gives a clue about setting up spatial streaming, and I've got the backend ready for 3D HLS streaming. Now I am only stuck on how to send the video stream to my server.
Replies: 1 · Boosts: 0 · Views: 237 · Activity: 1w
Glitch in AVPlayer while playing HLS videos
We have observed consistent glitches when using AVPlayer to stream HLS (HTTP Live Streaming) videos on iOS. The issue manifests as intermittent frame drops, stuttering, and playback instability during HLS streams; the same behavior is not present when playing MP4 videos with the same AVPlayer instance. The HLS streams follow standard encoding practices, and network conditions have been ruled out as a cause. https://drive.google.com/file/d/1lhdpHTyjPYCYLHjzvb6ZF6P6jehIuwY0/view?usp=sharing

Steps to reproduce:
1. Load an HLS video into AVPlayer and start playback.
2. Observe intermittent glitches and stuttering during playback.
3. Load and play an MP4 video in the same AVPlayer instance.
4. Notice that MP4 playback is smooth, without glitches.
Replies: 2 · Boosts: 0 · Views: 200 · Activity: 3w
Streaming HLS from hotspot IoT device on iOS
Hi, brief background on what I'm trying to achieve: I have an IoT device that serves an HLS stream of saved videos when they are accessed through the device's broadcast hotspot. To join the hotspot, I use an NEHotspotConfiguration. When I use AVPlayer to watch the HLS stream, everything is fine! When I use a media pod (VLC) to consume the same stream, the traffic goes over the cellular network, even though the device's host address is 192.168.1.254, which I understand to always be a local network address. I haven't spent much time digging into the VLC code to figure out why, but when I disable cellular access in my app's settings, the VLC request resolves perfectly. I've been met with radio silence on their forums and issue tracker, so I thought this would be the place to ask if there's another solution. Is there something going on with the way iOS routes requests to local network devices? My IoT device's hotspot never has internet access, and after reading Quinn's Extra-ordinary Networking advice (https://developer.apple.com/forums/thread/734348), I'm still at a loss for how to force my request onto the Wi-Fi network rather than cellular. Does anyone have any recommendations? Thanks in advance!
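
As a sketch, two commonly used ways to keep such requests off cellular (the port is an assumption, and whether VLC's own networking stack can be configured this way is a separate question):

    import Foundation
    import Network

    // Option 1: URLSession level: disallow cellular for this session's requests.
    let config = URLSessionConfiguration.default
    config.allowsCellularAccess = false
    let session = URLSession(configuration: config)

    // Option 2: Network framework: require the Wi-Fi interface explicitly.
    let params = NWParameters.tcp
    params.requiredInterfaceType = .wifi // fail instead of falling back to cellular
    let connection = NWConnection(host: "192.168.1.254", port: 80, using: params)
    connection.stateUpdateHandler = { state in print("connection state:", state) }
    connection.start(queue: .main)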
Replies: 1 · Boosts: 0 · Views: 213 · Activity: Sep ’24
macOS Sequoia 'Cannot Decode' HLS Video
I use AVPlayer to play HLS video successfully on macOS Sonoma, but I encounter this error on macOS Sequoia. Please help me:

    Error Domain=AVFoundationErrorDomain Code=-11833 'Cannot Decode'
    UserInfo={NSUnderlyingError=0x600001e57330 {Error Domain=CoreMediaErrorDomain Code=-12906 '(null)'},
    NSLocalizedFailureReason=The decoder required for this media cannot be found.,
    AVErrorMediaTypeKey=vide, NSLocalizedDescription=Cannot Decode}

Thanks!
Replies: 1 · Boosts: 0 · Views: 211 · Activity: 4w
Issue with Low-Latency HLS Playback Using AVAssetResourceLoaderDelegate
Hi, I am seeking help or a workaround for an issue I encountered while implementing Low-Latency HLS playback using AVAssetResourceLoaderDelegate. I have been successfully loading playlists during HLS live playback through the delegate. However, after introducing Low-Latency HLS, I ran into a problem: when AVPlayer loads the low-latency content natively, everything works fine, but when I load it through the delegate, AVPlayer's status observer reports the following error: CoreMediaErrorDomain -15410 Low Latency: Server must support http2 ECN and SACK. Playback does not stop, so it may seem harmless, but a critical part is missing: playback never achieves the expected low latency and behaves like standard HLS. This behavior only occurs on iOS 16 devices and simulators; on iOS 17 devices and simulators the error does not appear and latency stays low as expected. I therefore suspect a misjudgment in the verification step inside AVPlayer's internal implementation. Since our app needs to support iOS 16, I would appreciate any solutions, methods to try, or workarounds you could share. Thank you.
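
For context, a minimal sketch of delegate-based playlist loading of the kind described above (the custom URL scheme and the pass-through fetch are illustrative assumptions, not the poster's code):

    import AVFoundation

    final class PlaylistLoader: NSObject, AVAssetResourceLoaderDelegate {
        func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                            shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
            // Map the custom scheme back to HTTPS and fetch the playlist ourselves.
            guard let url = loadingRequest.request.url,
                  var comps = URLComponents(url: url, resolvingAgainstBaseURL: false)
            else { return false }
            comps.scheme = "https"
            guard let realURL = comps.url else { return false }
            URLSession.shared.dataTask(with: realURL) { data, _, error in
                if let data = data {
                    loadingRequest.dataRequest?.respond(with: data)
                    loadingRequest.finishLoading()
                } else {
                    loadingRequest.finishLoading(with: error)
                }
            }.resume()
            return true
        }
    }

    // A custom scheme forces AVFoundation to route these requests through the delegate.
    let loader = PlaylistLoader()
    let asset = AVURLAsset(url: URL(string: "custom-hls://example.com/llhls/master.m3u8")!)
    asset.resourceLoader.setDelegate(loader, queue: DispatchQueue(label: "playlist.loader"))
    let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))

One plausible reading of the -15410 error (an assumption, not a confirmed diagnosis): content fetched through the delegate bypasses AVFoundation's own HTTP/2 connection, so the player cannot verify the server capabilities Low-Latency HLS requires and falls back to standard-latency behavior.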
Replies: 1 · Boosts: 0 · Views: 243 · Activity: 3w
2.0 System Controls In Immersive Space
How do I enable the system hand controls within an immersive space? I have an ImmersiveScene and would like to enable the new 2.0 system controls, like the home button and volume slider.

    ImmersiveSpace(id: appModel.immersiveSpaceID) {
        ImmersiveView()
    }
    .immersionStyle(selection: .constant(.full), in: .full)
    .upperLimbVisibility(.visible)

While I can see my hands and arms in this view, I cannot get the "New Hand Gestures" to show up on visionOS 2.0. When I leave the immersive space, they appear.
Replies: 1 · Boosts: 0 · Views: 376 · Activity: Sep ’24
HLS output - influencing fragment timing
Hello, I'm trying to create HLS output with a segment duration of 6 seconds but sync samples (fragments) every 1 second; that is, I want AVAssetWriter to write a sync sample / moof header every second. Am I correct in understanding that I can only achieve this with a pre-fragmented MP4, using a passthrough rendition, setting preferredOutputSegmentInterval to indefinite, and calling flushSegment() as needed? Or is there another method using AVFoundation? Thanks in advance.
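
A sketch of the manual-flushing setup the question refers to (input configuration is elided, and calling flushSegment() once per second of appended media is the assumption being asked about):

    import AVFoundation
    import UniformTypeIdentifiers

    final class SegmentReceiver: NSObject, AVAssetWriterDelegate {
        func assetWriter(_ writer: AVAssetWriter,
                         didOutputSegmentData segmentData: Data,
                         segmentType: AVAssetSegmentType,
                         segmentReport: AVAssetSegmentReport?) {
            // Each call delivers one fragment: the init segment or a moof+mdat pair.
            print(segmentType == .initialization ? "init" : "media", segmentData.count, "bytes")
        }
    }

    let delegate = SegmentReceiver()
    let writer = AVAssetWriter(contentType: .mpeg4Movie)
    writer.outputFileTypeProfile = .mpeg4AppleHLS
    writer.preferredOutputSegmentInterval = .indefinite // caller decides fragment boundaries
    writer.initialSegmentStartTime = .zero
    writer.delegate = delegate

    // ... add a passthrough AVAssetWriterInput, call startWriting() and
    // startSession(atSourceTime: .zero), then append samples ...

    // For every second of appended media:
    writer.flushSegment() // closes the current fragment at the next appended sync sample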
Replies: 0 · Boosts: 0 · Views: 273 · Activity: Sep ’24
Detect Dolby Atmos programmatically
Hi, I am trying to detect whether an audio stream is Dolby Atmos. I have existing code that classifies a stream as Dolby Atmos based on the following:
- Channel count is greater than or equal to 8
- Binaural is true
- Immersive is true
- Downmix is false
I am trying to determine whether these rules are correct, and to find documentation specifying them that I can reference in the future. Any help you can provide is greatly appreciated. Regards, John
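
As a sketch, the per-rendition flags listed above surface in AVFoundation through AVAssetVariant's audio attributes (the URL is supplied by the caller, and treating these flags as a definitive Atmos test is the heuristic in question, not a documented rule):

    import AVFoundation

    func logAudioRenditionFlags(for url: URL) async throws {
        let asset = AVURLAsset(url: url)
        let variants = try await asset.load(.variants)
        guard let group = try await asset.loadMediaSelectionGroup(for: .audible) else { return }
        for variant in variants {
            guard let audio = variant.audioAttributes else { continue }
            // kAudioFormatEnhancedAC3 would indicate E-AC-3; Atmos is carried as E-AC-3 JOC.
            print("format IDs:", audio.formatIDs)
            for option in group.options {
                if let attrs = audio.renditionSpecificAttributes(for: option) {
                    print(option.displayName,
                          "binaural:", attrs.isBinaural,
                          "immersive:", attrs.isImmersive,
                          "downmix:", attrs.isDownmix)
                }
            }
        }
    }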
Replies: 1 · Boosts: 0 · Views: 422 · Activity: Sep ’24
Clarification on Providing a Justification for NSAllowsArbitraryLoads
I'm developing an IPTV app where users can add their own links to TV channels and watch them through the app. Since not all IPTV links use HTTPS, I've set NSAllowsArbitraryLoads to true in the Info.plist. Apple mentions that if you set this to true, you need to provide a justification. What kind of explanation do they require, and how should I provide it? Thanks!
Replies: 1 · Boosts: 0 · Views: 293 · Activity: Aug ’24
How to Manage HLS Assets Downloaded with AVAssetDownloadTask to Appear in the iOS Settings App
I have an application that downloads content using AVAssetDownloadTask. In the iOS Settings app, these downloads are listed in the Storage section as a collection of downloaded movies, displaying the asset image, whether it has been watched, the file size, and an option to delete it. Curious about how other apps handle this display, I noticed that Apple Music shows every downloaded artist, album, and song individually. Can I achieve something similar in my application? On the other hand, apps like Spotify and Amazon Music don't show any downloaded files in the Settings app. Is it possible to implement that approach as well? [Screenshot: the Apple Music Storage section in the Settings app.] I tried moving the download directory into a subfolder using FileManager, but every variation I tried made the downloads stop showing in the Settings app.
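
For reference, a minimal sketch of the download setup; the title and artwork passed here are what the Settings Storage list displays for each item (the session identifier, URL, and artwork data are hypothetical):

    import AVFoundation

    let config = URLSessionConfiguration.background(withIdentifier: "hls-downloads") // hypothetical identifier
    let downloadSession = AVAssetDownloadURLSession(configuration: config,
                                                    assetDownloadDelegate: nil,
                                                    delegateQueue: .main)
    let asset = AVURLAsset(url: URL(string: "https://example.com/movie/master.m3u8")!)
    let artworkData: Data? = nil // PNG/JPEG data shown next to the Settings entry
    let task = downloadSession.makeAssetDownloadTask(asset: asset,
                                                     assetTitle: "Movie Title",     // name shown in Settings > Storage
                                                     assetArtworkData: artworkData,
                                                     options: nil)
    task?.resume()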
Replies: 1 · Boosts: 0 · Views: 429 · Activity: Aug ’24
HLS authoring - creating an I-frame playlist
Hello, is there an example from Apple of how to extract the data needed to create an I-frame playlist using AVAssetSegmentTrackReport? I'm following the HLS authoring example from WWDC 2020, Author fragmented MPEG-4 content with AVAssetWriter, which states: "You can create the playlist and the I-frame playlist based on the information AVAssetSegmentReport provides." I've examined AVAssetSegmentTrackReport, and it only appears to provide firstVideoSampleInformation, which is good for the first frame, but the content I'm creating contains an I-frame every second within 6-second segments. I've tried parsing the Data object from the asset writer delegate's didOutputSegmentData parameter, but I've only gotten so far parsing the NALUs; the length prefixes seem to go wrong when I hit the first NALU of type 8 (PPS) in the first segment. Alternatively, I could parse the output from ffmpeg, but I'm hoping there's a solution within Swift. Many thanks.
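
For context, a sketch of the delegate callback and the report fields mentioned above (logging only; deriving byte ranges for every I-frame beyond the first sample is exactly the open question):

    import AVFoundation

    final class SegmentLogger: NSObject, AVAssetWriterDelegate {
        func assetWriter(_ writer: AVAssetWriter,
                         didOutputSegmentData segmentData: Data,
                         segmentType: AVAssetSegmentType,
                         segmentReport: AVAssetSegmentReport?) {
            guard let report = segmentReport else { return } // nil for the initialization segment
            for track in report.trackReports where track.mediaType == .video {
                // Only the *first* video sync sample is reported; an I-frame playlist
                // needs the offset and length of every sync sample in the segment.
                if let info = track.firstVideoSampleInformation {
                    print("first sync sample: offset", info.offset,
                          "length", info.length,
                          "pts", info.presentationTimeStamp.seconds)
                }
            }
        }
    }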
Replies: 2 · Boosts: 0 · Views: 405 · Activity: Aug ’24
Audio track metadata on AirPlay receiver differs from source
Description: An HLS VOD stream contains several audio tracks marked with the same LANGUAGE tag but different NAME tags, e.g. https://devstreaming-cdn.apple.com/videos/streaming/examples/bipbop_16x9/bipbop_16x9_variant.m3u8:

    #EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="bipbop_audio",LANGUAGE="eng",NAME="BipBop Audio 1",AUTOSELECT=YES,DEFAULT=YES
    #EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="bipbop_audio",LANGUAGE="eng",NAME="BipBop Audio 2",AUTOSELECT=NO,DEFAULT=NO,URI="alternate_audio_aac/prog_index.m3u8"

You set up AirPlay from, e.g., an iPhone, Mac, or iPad to an Apple TV or a Mac.

Expected behavior: The audio track dropdown in AVPlayer and QuickTime shows both LANGUAGE and NAME information on the AirPlay receiver as it does on the sender; the user interface is consistent between playing a local stream and an AirPlay stream.

Current status: The player UI on the AirPlay receiver shows only the LANGUAGE tag.

Question: Do you have an idea whether this is a missing feature of AirPlay itself or a bug?

Background: We'd like to offer an additional audio track with enhanced audio characteristics for better intelligibility of spoken words, "Klare Sprache". Technically, "Klare Sprache" works by using an AI-based algorithm that separates speech from other audio elements in the broadcast. The algorithm enhances the clarity of the dialogue by amplifying the speech and diminishing the volume of background sounds like music or environmental noise. The technology was introduced by ARD and ZDF in Germany and is available on select programs, primarily via HD broadcasts and digital platforms like HbbTV. Users can enable this feature directly from their television's audio settings, where it may be labeled "deu (qks)" or "Klare Sprache" depending on the device. The feature is available on a growing number of channels and is part of a broader effort to make television more accessible to viewers with hearing difficulties. It can be correctly signaled in HLS, e.g. https://ccavmedia-amd.akamaized.net/test/bento4multicodec/airplay1.m3u8:

    # Audio
    #EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="stereo-aac",LANGUAGE="de",NAME="Deutsch",DEFAULT=YES,AUTOSELECT=YES,CHANNELS="2",URI="ST.m3u8"
    #EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="stereo-aac",LANGUAGE="de",NAME="Deutsch (Klare Sprache)",DEFAULT=NO,AUTOSELECT=YES,CHARACTERISTICS="public.accessibility.enhances-speech-intelligibility",CHANNELS="2",URI="KS.m3u8"

Still, with an AirPlay stream you don't get this extra information, only the LANGUAGE tag.
Replies: 2 · Boosts: 0 · Views: 356 · Activity: Aug ’24