HTTP Live Streaming


Send audio and video over HTTP from an ordinary web server for playback on Mac, iOS, and tvOS devices using HTTP Live Streaming (HLS).

Posts under HTTP Live Streaming tag

81 Posts
AVAssetDurationDidChange notification not received in iOS 17
We use AVFragmentedAssetMinder to refresh the player data, and notifications for AVAssetDurationDidChange were consistently received whenever the player duration changed. However, since the release of iOS 17, these notifications are no longer received. Could anyone advise why this notification is not being triggered, and what we need to change? This is our observer registration:
NotificationCenter.default.addObserver(self, selector: #selector(self.onVideoUpdate), name: .AVAssetDurationDidChange, object: nil)
#AVPlayer #AVMutableMovie
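For reference, a minimal version of the surrounding setup looks like this (a simplified sketch; the asset URL and minding interval are illustrative placeholders, not our production values):

import AVFoundation

// Sketch: mind a fragmented asset and observe duration changes.
// The URL and the 1-second minding interval are placeholders.
let asset = AVFragmentedAsset(url: URL(string: "https://example.com/live/fragmented.mp4")!)
let minder = AVFragmentedAssetMinder(asset: asset, mindingInterval: 1.0)

NotificationCenter.default.addObserver(
    forName: .AVAssetDurationDidChange,
    object: asset,  // observing the specific asset rather than nil
    queue: .main
) { _ in
    print("Duration changed: \(asset.duration.seconds) s")
}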
Replies: 0 | Boosts: 0 | Views: 394 | Activity: Mar ’24
How to encode MV-HEVC spatial videos using the HLS tools correctly?
Hi all, I am a graduate student looking into making MV-HEVC videos streamable. Is it possible to encode MV-HEVC videos for the HLS (HTTP Live Streaming) protocol? I've been trying to use Apple's HLS tools to segment a spatial video recorded with Vision Pro:
mediafilesegmenter -iso-fragmented -t 4 -f sp_video-1-vp spatial-video-by-vp.MOV
But the output HLS playlist doesn't match the format Apple proposes in the WWDC video. For example, the EXT-X-VERSION attribute is 7 instead of 12, and there is no REQ-VIDEO-LAYOUT=CH-STEREO attribute, which should be the key indicator of the spatial video type. From what the WWDC video showcased, I assume Apple's HLS tools support this, so maybe my usage is incorrect. Just curious what you all think about it, thank you!
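For reference, this is roughly the multivariant playlist entry I expected based on the WWDC session (a sketch; the bandwidth, resolution, and URI values are illustrative, not taken from my actual output):

#EXTM3U
#EXT-X-VERSION:12
#EXT-X-STREAM-INF:BANDWIDTH=25000000,RESOLUTION=2048x2048,FRAME-RATE=30.000,REQ-VIDEO-LAYOUT="CH-STEREO"
spatial_video/prog_index.m3u8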
Replies: 1 | Boosts: 1 | Views: 703 | Activity: Mar ’24
iPhone webRTC / getUserMedia only uses headset mic with low volume. Can we change mics?
Hi, hope all are well! We've been working on a live streaming app and it's going quite well: we just got the aspect ratio locked as desired. Now for the audio: its volume is extremely low. It sounds like it's using the headset mic instead of the bottom mic that's used on FaceTime or speakerphone calls. We tried flipping cameras, specifying sample rates, and almost every constraint in MediaConstraints, with no luck. Is there any way to specify this? Thanks in advance!
Replies: 1 | Boosts: 0 | Views: 454 | Activity: Mar ’24
How to achieve license renewal in Safari based on ContentKeyDuration
Dear Apple Engineers, I have downloaded the FairPlay Streaming SDK 4.4, in which I was able to use the fps_safari_hls_example.html file to successfully play FairPlay-protected content in Safari by pointing to our FairPlay license server, .m3u8 file, and certificate. Now I'm trying to achieve renewal, so I used the fps_safari_hls_key_renewal.html file and set ContentKeyDuration to 20 seconds in the FairPlay license server backend. However, the client didn't make any subsequent FairPlay license request around or after the 20-second mark. I wonder whether this use case can be achieved in Safari at all. The only extra functionality I can see in the renewal HTML file is this piece of code:
await runAndWaitForLicenseRequest(session, keyURI, () => { session.update(stringToUInt8Array("renew")) });
Based on this, I assume it is meant to make the client aware of when to renew the license. But in my case no FairPlay request was made; in fact, this code executed immediately after the first FairPlay license request, and the protected media continued to play despite ContentKeyDuration being set to 20 seconds with LIMITED as the content key type. Could you please suggest how to achieve the subsequent renewal request from the client, based on the ContentKeyDuration sent in the CKC response, using this sample renewal HTML file? Are there any tweaks to be made in the HTML file? Kindly suggest.
Replies: 0 | Boosts: 0 | Views: 460 | Activity: Mar ’24
Video player shows an incorrect seek bar time for live streams on iOS 17
I've encountered an issue with the seek bar time display in the video player on iOS 17, specifically affecting live stream videos using HLS manifests with the time displayed in AM/PM format. As the video progresses, the displayed start time shifts backwards in time with a peculiar pattern: Displayed Start Time = Normal Start Time - Viewed Duration. For instance, if a program begins at 9:00 AM, then at 9:30 AM the start time shown will erroneously be 8:30 AM; similarly, at 9:40 AM the displayed start time will be 8:20 AM. This issue is observed with both VideoPlayer and AVPlayerViewController on iOS 17. The same video player implementation on iOS 16 displays the duration of the viewed program and doesn't have any issues. Please advise on any known workarounds or solutions to address this issue on iOS 17.
Replies: 3 | Boosts: 1 | Views: 626 | Activity: Jun ’24
180VR streaming expectations
I have a visionOS app that streams 180VR video to create immersive experiences, similar to the Apple TV+ immersive content. These videos are 8K 60 fps. Looking at the "video encoding requirements" (1.25) in the HLS authoring specification, there are only entries for up to 4K 30 fps for MV-HEVC. Using the "power of 0.75 rule" (8K60 is 4x the pixels and 2x the frame rate of 4K30, an 8x raw factor, and 8^0.75 ≈ 4.8), I estimated that 160,000 kbps could be close to an 8K60 recommendation. For my app's launch I made my best guess and created the multivariant playlist below, with a generous low end and a very generous high end, and 6-second segment durations for all variants. In practice, however, users seem to only be picking up the lowest-bandwidth content (80,000 kbps) even when speed tests on the device show 5x that. This leads to a lot of artifacts in the content during playback, since it's lower quality. If I hard-code a higher-bitrate variant (like 240,000 kbps) it does play back, but obviously takes a bit longer to start. Now that I have my Vision Pro, I've been able to watch the Apple TV+ immersive content. I could be wrong, but it doesn't feel like its playback quality varies: whether watched tethered to my phone or on high-speed Wi-Fi, the content looks the same, just a little slower to load on the phone. I'm looking for three points of guidance:
1. Are there HLS recommendations for 8K60 video, for both bitrates and target segment duration?
2. Any guesses as to why I am not able to pick up the higher bitrates? (This could simply be that the higher ends are still too high.)
3. While 180VR is just stereo video on a larger canvas, the viewing experience is quite different due to the immersion. Are there special recommendations for 180VR video, such as having only one variant with a fixed bitrate, since a changing bitrate/video quality could be jarring to the viewer?
Example HLS multivariant playlist:
#EXTM3U
#EXT-X-STREAM-INF:AVERAGE-BANDWIDTH=80632574,BANDWIDTH=82034658,VIDEO-RANGE=SDR,CODECS="mp4a.40.2,hvc1.1.60000000.L183.B0",RESOLUTION=4096x4096,FRAME-RATE=59.940,CLOSED-CAPTIONS=NONE
https://myurl.com/stream/t4096p_080000_kbps_t6/prog_index.m3u8
#EXT-X-STREAM-INF:AVERAGE-BANDWIDTH=160782750,BANDWIDTH=162523567,VIDEO-RANGE=SDR,CODECS="mp4a.40.2,hvc1.1.60000000.L186.B0",RESOLUTION=4096x4096,FRAME-RATE=59.940,CLOSED-CAPTIONS=NONE
https://myurl.com/stream/t4096p_160000_kbps_t6/prog_index.m3u8
#EXT-X-STREAM-INF:AVERAGE-BANDWIDTH=240941602,BANDWIDTH=243997537,VIDEO-RANGE=SDR,CODECS="mp4a.40.2,hvc1.1.60000000.L186.B0",RESOLUTION=4096x4096,FRAME-RATE=59.940,CLOSED-CAPTIONS=NONE
https://myurl.com/stream/t4096p_240000_kbps_t6/prog_index.m3u8
#EXT-X-STREAM-INF:AVERAGE-BANDWIDTH=321061516,BANDWIDTH=325312545,VIDEO-RANGE=SDR,CODECS="mp4a.40.2,hvc1.1.60000000.H255.B0",RESOLUTION=4096x4096,FRAME-RATE=59.940,CLOSED-CAPTIONS=NONE
https://myurl.com/stream/t4096p_320000_kbps_t6/prog_index.m3u8
Replies: 2 | Boosts: 0 | Views: 767 | Activity: Mar ’24
Spatial Video In Apple Vision Pro
HELP! How can I play a spatial video in my own Vision Pro app the way the official Photos app does? I've used AVKit APIs to play a spatial video in the Xcode Vision Pro simulator, following the official developer documentation. The video plays, but it looks different from how it plays in Photos: in Photos the edges of the video appear soft and fuzzy, while in my own app the edges are sharp. How can I play spatial video in my own app with the same effect as in Photos?
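For reference, the minimal AVKit setup I'm using looks roughly like this (a simplified sketch; the video URL is a placeholder):

import SwiftUI
import AVKit

// Sketch of the current approach: plain AVPlayerViewController playback of the
// spatial (MV-HEVC) file. The URL is an illustrative placeholder.
struct SpatialPlayerView: UIViewControllerRepresentable {
    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = AVPlayer(url: URL(string: "https://example.com/spatial.mov")!)
        controller.player?.play()
        return controller
    }

    func updateUIViewController(_ controller: AVPlayerViewController, context: Context) {}
}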
Replies: 1 | Boosts: 0 | Views: 836 | Activity: 3w
Is CAN-BLOCK-RELOAD=YES required for low-latency behaviour in Safari and AVPlayer?
I am testing HLS low-latency streams but could not get to a low playback latency. For comparison I used the Apple HLS-LL test stream (https://ll-hls-test.cdn-apple.com/llhls4/ll-hls-test-04/multi.m3u8), where I get a latency of 6 seconds in Safari and AVPlayer. The main difference between the non-working and working streams is the CAN-BLOCK-RELOAD attribute. It is set to YES for the Apple HLS-LL test stream, where latency is observed at 6 seconds, versus the other stream, where latency is roughly 3 full segment durations (~18 seconds). As a test I changed CAN-BLOCK-RELOAD=YES to CAN-BLOCK-RELOAD=NO in the variant playlist response of the Apple stream using a proxy tool, after which I get the same result: a latency of roughly 3 full segment durations (~18 seconds). In AVPlayer, I see the following log with CAN-BLOCK-RELOAD=YES, which seems to indicate that AVPlayer enters low-latency mode when setting up blocking reload:
mediaserverd <SEGPUMP> segPumpSetupBlockingReload: 0x1040e7600: Player entering Low Latency mode
Can Apple support, or forum members who have experience with this, please share whether CAN-BLOCK-RELOAD=YES is mandatory for low-latency mode to be enabled in the Safari browser and in AVPlayer on iOS/tvOS?
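For context, the working stream's media playlist advertises blocking reload along these lines (a sketch; the durations, sequence numbers, and part URIs are illustrative, not copied from the Apple stream):

#EXTM3U
#EXT-X-TARGETDURATION:6
#EXT-X-SERVER-CONTROL:CAN-BLOCK-RELOAD=YES,PART-HOLD-BACK=3.0
#EXT-X-PART-INF:PART-TARGET=1.0
#EXT-X-MEDIA-SEQUENCE:630
#EXT-X-PART:DURATION=1.0,URI="segment630.part0.mp4"
#EXT-X-PART:DURATION=1.0,URI="segment630.part1.mp4"
#EXT-X-PRELOAD-HINT:TYPE=PART,URI="segment630.part2.mp4"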
Replies: 3 | Boosts: 1 | Views: 722 | Activity: Feb ’24
JS Fullscreen APIs not working on iOS 17
According to the docs, webkitEnterFullScreen() only works on iOS if the element is a <video>, but after updating to iOS 17.1.2 it is not working; I have tested it in both Chrome and Safari. Even the W3Schools fullscreen test code does not work on iOS 17.1.2.
Test done:
Model: iPhone 15 Pro Max
iOS version: 17.0.2
Browser: Chrome
Test URL: https://www.w3schools.com/howto/howto_js_fullscreen.asp
Replies: 0 | Boosts: 1 | Views: 796 | Activity: Jan ’24
AppleCoreMedia 1.0.0.21B101 always requesting 2 streams from ABR
Hi Team, we see an issue with this version of CoreMedia: it requests multiple qualities at all times for a stream. We don't see this issue on 1.0.0.21C62, and we are unsure what would be causing it.
[2024-01-05 16:53:51] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=0 HTTP/1.0" 200 1145 2529 1090199 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:52] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=0 HTTP/1.0" 200 1146 2396 1013356 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:52] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_0.fmp4 HTTP/1.0" 200 1139 24975 1013385 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:52] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=1 HTTP/1.0" 200 1145 2603 998670 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:52] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_1.fmp4 HTTP/1.0" 200 1138 40534 998739 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:53] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=2 HTTP/1.0" 200 1145 2677 835327 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:53] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_2.fmp4 HTTP/1.0" 200 1138 57656 835207 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:53] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=1 HTTP/1.0" 200 1146 2458 986038 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:53] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_1.fmp4 HTTP/1.0" 200 1139 24700 986032 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:54] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=3 HTTP/1.0" 200 1145 2751 1013257 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:54] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_3.fmp4 HTTP/1.0" 200 1138 55900 1013324 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:54] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=2 HTTP/1.0" 200 1146 2520 1016693 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:54] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_2.fmp4 HTTP/1.0" 200 1139 25014 1016717 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:55] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=4 HTTP/1.0" 200 1145 2825 917753 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:55] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_4.fmp4 HTTP/1.0" 200 1138 103745 917903 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:55] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=3 HTTP/1.0" 200 1146 2582 958102 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:55] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_3.fmp4 HTTP/1.0" 200 1139 24782 958195 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:56] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=5 HTTP/1.0" 200 1145 2899 931101 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:56] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_5.fmp4 HTTP/1.0" 200 1138 112113 931228 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:56] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=4 HTTP/1.0" 200 1146 2644 935550 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:56] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_4.fmp4 HTTP/1.0" 200 1139 24824 937720 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:57] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=5 HTTP/1.0" 200 1146 2706 895680 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:57] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_5.fmp4 HTTP/1.0" 200 1139 24843 895734 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:57] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=631&_HLS_part=0 HTTP/1.0" 200 1145 2529 907045 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
Replies: 0 | Boosts: 0 | Views: 532 | Activity: Jan ’24
No audio in mpeg4AppleHLS output generated by AVAssetWriter
I'm trying to use AVCaptureSession and AVAssetWriter to convert video and audio from an iPhone's camera and microphone into a fragmented video file in AppleHLS format. Below is part of the code. The capture appears to be successful, and I have confirmed that the data received in captureOutput() can be appended to videoWriterInput and audioWriterInput using append(). When executing audioWriterInput!.append(sampleBuffer), sampleBuffer has the following values, so it looks like the audio data is being passed to the asset writer:
sampleBuffer.duration: CMTime(value: 941, timescale: 44100, flags: __C.CMTimeFlags(rawValue: 1), epoch: 0)
sampleBuffer.totalSampleSize: 1882
However, the final output init.mp4 and *.m4s files do not contain audio. (The video plays without any problems.) Could you please point out any problems, or give hints as to why the audio is not included?

import AVFoundation
import UniformTypeIdentifiers
import VideoToolbox

/// Capture session
let captureSession = AVCaptureSession()

/// Capture inputs
var videoDevice: AVCaptureDevice?
var audioDevice: AVCaptureDevice?

/// Configure and start the capture session
func startCapture() throws {
    // Start configuration
    captureSession.beginConfiguration()

    // Set up the video input
    videoDevice = self.defaultCamera(cameraSide: cameraSide)
    videoDevice!.activeVideoMinFrameDuration = CMTimeMake(value: 1, timescale: 30)
    let videoInput = try AVCaptureDeviceInput(device: videoDevice!)
    captureSession.addInput(videoInput)

    // Set up the audio input
    audioDevice = AVCaptureDevice.default(for: AVMediaType.audio)
    let audioInput = try AVCaptureDeviceInput(device: audioDevice!)
    captureSession.addInput(audioInput)

    // Set up the video output
    let videoDataOutput = AVCaptureVideoDataOutput()
    videoDataOutput.setSampleBufferDelegate(self, queue: recordingQueue)
    videoDataOutput.alwaysDiscardsLateVideoFrames = true
    videoDataOutput.videoSettings = [
        kCVPixelBufferPixelFormatTypeKey: Int(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)] as [String: Any]
    captureSession.addOutput(videoDataOutput)

    // Set up the audio output
    let audioDataOutput = AVCaptureAudioDataOutput()
    audioDataOutput.setSampleBufferDelegate(self, queue: recordingQueue)
    captureSession.addOutput(audioDataOutput)

    // End configuration
    captureSession.commitConfiguration()

    // Start capture
    captureSession.startRunning()
}

private var assetWriter: AVAssetWriter?
private var startTimeOffset: CMTime?
private var startTime: CMTime?
private var audioWriterInput: AVAssetWriterInput?
private var videoWriterInput: AVAssetWriterInput?
private var pixelBuffer: AVAssetWriterInputPixelBufferAdaptor?

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    if assetWriter == nil {
        // Asset writer
        assetWriter = AVAssetWriter(contentType: UTType(AVFileType.mp4.rawValue)!)
        self.startTimeOffset = CMTime(value: 1, timescale: 1)

        // Set up the audio input.
        let audioCompressionSettings: [String: Any] = [
            AVFormatIDKey: kAudioFormatMPEG4AAC,
            AVSampleRateKey: 44_100,
            AVNumberOfChannelsKey: 1,
            AVEncoderBitRateKey: 128_000
        ]
        audioWriterInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioCompressionSettings)
        audioWriterInput!.expectsMediaDataInRealTime = true
        assetWriter!.add(audioWriterInput!)

        // Set up the video input.
        let videoCompressionSettings: [String: Any] = [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: 1280,
            AVVideoHeightKey: 720,
            AVVideoCompressionPropertiesKey: [
                kVTCompressionPropertyKey_AverageBitRate: 1_024_000,
                kVTCompressionPropertyKey_ProfileLevel: kVTProfileLevel_H264_Baseline_AutoLevel
            ]
        ]
        videoWriterInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoCompressionSettings)
        videoWriterInput!.expectsMediaDataInRealTime = true
        pixelBuffer = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: videoWriterInput!, sourcePixelBufferAttributes: [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)])
        assetWriter!.add(videoWriterInput!)

        // Configure the asset writer for writing data in fragmented MPEG-4 format.
        assetWriter!.outputFileTypeProfile = AVFileTypeProfile.mpeg4AppleHLS
        assetWriter!.preferredOutputSegmentInterval = CMTime(seconds: 1.0, preferredTimescale: 1)
        assetWriter!.initialSegmentStartTime = startTimeOffset!
        assetWriter!.delegate = self

        // Start the asset writer
        startTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        assetWriter!.startWriting()
        assetWriter!.startSession(atSourceTime: startTime!)
    }

    let isVideo = output is AVCaptureVideoDataOutput
    if isVideo {
        if videoWriterInput!.isReadyForMoreMediaData {
            videoWriterInput!.append(sampleBuffer)
        }
    } else {
        if audioWriterInput!.isReadyForMoreMediaData {
            audioWriterInput!.append(sampleBuffer)
        }
    }
}

func assetWriter(_ writer: AVAssetWriter, didOutputSegmentData segmentData: Data, segmentType: AVAssetSegmentType, segmentReport: AVAssetSegmentReport?) {
    // …
}
Replies: 0 | Boosts: 0 | Views: 444 | Activity: Dec ’23