Post not yet marked as solved
Hello,
We're developing a video streaming service. We can't get the video to stream on AirPlay 2-enabled TVs; only a loading spinner is visible on the TV.
We can only get the audio to stream to AirPlay 2 speakers.
The Media Stream Validator HLS report on the playlists shows that the video segments are recognized but cannot be processed ("Processed 0 of 600 segments", etc.).
We're at a loss on how to debug this. Is there any way to get access to AirPlay 2 logs to see what isn't working?
Post not yet marked as solved
Hello, we are experiencing a quality drop when streaming via HTTP Live Streaming on iOS 15. The file behaves as expected on iOS 14 and iOS 13.
Is this a known issue? If so when can we expect a fix or is there a workaround?
Post not yet marked as solved
When playing several short HLS clips using AVPlayer connected to a TV through Apple's Lightning-to-HDMI adapter (A1438), playback often fails with these unknown errors:
CoreMediaErrorDomain -12034
and
CoreMediaErrorDomain -12158
Does anyone have a clue what these errors mean?
Environment:
iPhone 8
iOS 15.4
Lightning-to-HDMI adapter (A1438)
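For anyone else chasing these, AVPlayerItem exposes an error log that sometimes carries a human-readable comment alongside the raw CoreMediaErrorDomain code. A minimal sketch, assuming `item` is the AVPlayerItem that failed (the helper name is illustrative):

```swift
import AVFoundation

// Illustrative helper: dump any CoreMediaErrorDomain details AVPlayer has logged.
func dumpErrorLog(for item: AVPlayerItem) {
    guard let log = item.errorLog() else { return }
    for event in log.events {
        // errorStatusCode is the raw code (e.g. -12034); errorComment may explain it.
        print("domain: \(event.errorDomain), code: \(event.errorStatusCode), " +
              "comment: \(event.errorComment ?? "none"), uri: \(event.uri ?? "n/a")")
    }
}
```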
Post not yet marked as solved
Hi,
I have an app that uses AVPlayer to stream and play HLS videos, but I'm struggling to find a way to do the same with fMP4.
This is what I use to play my HLS stream. I tried simply replacing the URL with the fMP4 one, but it does not work:
private func connect() {
    let stringUrl = "https://wolverine.raywenderlich.com/content/ios/tutorials/video_streaming/foxVillage.m3u8"
    let url = URL(string: stringUrl)!
    let asset = AVURLAsset(url: url)
    let item = AVPlayerItem(asset: asset)
    if #available(iOS 10.0, *) {
        // 50 seconds of preferred forward buffer
        item.preferredForwardBufferDuration = Double(50000) / 1000
    }
    if #available(iOS 13.0, *) {
        item.automaticallyPreservesTimeOffsetFromLive = true
    }
    self.player = AVPlayer(playerItem: item)
    let playerLayer = AVPlayerLayer(player: self.player)
    playerLayer.frame = self.playerView.bounds
    playerLayer.videoGravity = .resizeAspect
    self.playerView.layer.addSublayer(playerLayer)
    self.videoLayer = playerLayer
    self.videoLayer?.frame = self.playerView.bounds
    player?.play()
}
I haven't had any luck finding a possible solution and I'm out of ideas. I'd be really grateful if anyone could point me in a good direction.
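One thing worth checking: AVPlayer plays fMP4 content when the segments are referenced from an HLS (.m3u8) playlist, so if the URL points at a raw fragment it may simply fail. A sketch of observing the item's status so a failure surfaces an actual error instead of silently not playing (names are illustrative):

```swift
import AVFoundation

// Sketch: observe the item's status so a failure surfaces an actual error.
// `item` is the AVPlayerItem created in connect(); keep the observation alive.
var statusObservation: NSKeyValueObservation?

func observeStatus(of item: AVPlayerItem) {
    statusObservation = item.observe(\.status, options: [.new]) { item, _ in
        if item.status == .failed {
            // For fMP4, AVPlayer expects segments referenced from an HLS
            // (.m3u8) playlist; a bare fragment URL typically fails here.
            print("Playback failed: \(item.error?.localizedDescription ?? "unknown error")")
        }
    }
}
```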
Post not yet marked as solved
We are continuously getting playback failures for a live stream, with error logs such as:
//Error 1
Segment exceeds specified bandwidth for variant||The operation couldn’t be completed. (CoreMediaErrorDomain error -12889.)||
//Error 2
The operation couldn’t be completed. (CoreMediaErrorDomain error -12888 - Playlist File unchanged for longer than 1.5 * target duration)||
//Error 3
Segment exceeds specified bandwidth for variant||
If we could find out what the -16042 error actually signifies, that would be great.
P.S.: We can provide any details regarding the stream if required.
Post not yet marked as solved
We are playing an HLS stream with AVPlayer and trying to read the HLS manifest. We are able to detect the majority of the tags; however, the player is not detecting the EXT-X-DATERANGE tag that has an ID with a DURATION, i.e.
#EXT-X-DATERANGE:ID="aba74c45-e963-45bf-8171-1f910c33f64a",DURATION=32.44
Whereas the other #EXT-X-DATERANGE tag is detected at the beginning of the manifest:
#EXT-X-DATERANGE:ID="aba74c45-e963-45bf-8171-1f910c33f64a",START-DATE="2022-03-10T13:18:15.179Z",PLANNED-DURATION=15,X-AD-ID="9858"
#EXT-X-DISCONTINUITY
We are using AVPlayer's metadata collector delegate method to detect the metadata:
func metadataCollector(_ metadataCollector: AVPlayerItemMetadataCollector,
                       didCollect metadataGroups: [AVDateRangeMetadataGroup],
                       indexesOfNewGroups: IndexSet,
                       indexesOfModifiedGroups: IndexSet) {}
We are not able to detect the EXT-X-DATERANGE tag with DURATION using the delegate above.
Any help appreciated.
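For reference, here's roughly how we wire up the collector (a sketch of our setup, not a fix; names are illustrative):

```swift
import AVFoundation

final class DateRangeObserver: NSObject, AVPlayerItemMetadataCollectorPushDelegate {
    // nil identifiers/labels means collect all date-range metadata groups.
    private let collector = AVPlayerItemMetadataCollector(identifiers: nil,
                                                          classifyingLabels: nil)

    // Attach the collector to the player item being played.
    func attach(to item: AVPlayerItem) {
        collector.setDelegate(self, queue: .main)
        item.add(collector)
    }

    func metadataCollector(_ metadataCollector: AVPlayerItemMetadataCollector,
                           didCollect metadataGroups: [AVDateRangeMetadataGroup],
                           indexesOfNewGroups: IndexSet,
                           indexesOfModifiedGroups: IndexSet) {
        for group in metadataGroups {
            // startDate/endDate are derived from START-DATE / DURATION in the playlist.
            print("date range: \(group.startDate) - \(String(describing: group.endDate))")
        }
    }
}
```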
Post not yet marked as solved
I would like to implement an OTT service with an Apple HLS playback URL. I wonder what the best practice is for serving HLS streams to the player:
1. Two separate endpoints for the playlist and the segments (so the playlist is not cached at the middle layer, while segments can be cached), or
2. The same endpoint for both the playlist and the segments (again with the playlist not cached at the middle layer, while segments can be cached).
I suspect scenario 1 can cause problems on the player side, making the stream's video and audio go out of sync or making the stream less smooth compared with scenario 2.
Can somebody give me a suggestion, please?
Thank you
Post not yet marked as solved
I use encrypted content in my app. Before playing, I request the decryption key, which allows playing the content online. I want to play content offline too, so I download the content. But how can I store the decryption keys?
Post not yet marked as solved
I've been searching all over to find an answer to this question. I know that FairPlay/HLS supports audio streams, since there's obviously audio in video, but I'm wondering whether this is practical. Also, is there any way to stream FairPlay DRM-encrypted content without HLS? Or, if I use HLS, could I create only a single audio bitrate in order to save on hosting costs? I work on an audiobook app that requires DRM for some content.
Also, any links to documentation, videos, tutorials, blog posts, etc. on the topic would be awesome too.
I very much appreciate anyone who takes the time to answer this. I wish this were discussed more explicitly on the Apple Dev site, but it seems very geared towards video streaming.
Hello,
I'd like to know whether Multipeer Connectivity (MPC) between two modern iOS devices can support cross-streaming of video content at low latency and relatively high resolution. I learned from Ghostbusters that crossing the streams is a bad idea, but I need this for my app so I'm going to ignore their sage advice.
The application I have in mind involves one iOS device (A) live-streaming a feed to another iOS device (B) running the same app. At the same time, B is live-streaming its own feed to A. Neither feed needs to be crystal clear, but there has to be low latency. The sample code for "Streaming an AR Experience" seems to contain some answers, as it's based on MPC (and ARKit), but my project isn't quite AR and the latency seems high.
If MPC isn't suitable for this task (as my searches seem to indicate), is it possible to have one device set up a hotspot and link the two this way to achieve my cross-streaming ambitions? This seems like a more conservative method, assuming the hotspot and its client behave like they're wifi peers (not sure). I might start a new thread with just this question if that's more appropriate.
A third idea that's not likely to work (for various reasons) is data transfer over a lightning-lightning or lightning-usb-c connection.
If I've missed any other possible solutions to my cross-streaming conundrum, please let me know.
I've been reading around this subject on this forum as well and would be hugely grateful if Eskimo would grace me with his wisdom.
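In case it's useful context, MPC does expose a raw stream API that I'm considering for the low-latency path. A minimal sketch, assuming the peers are already discovered and connected (the stream name and function are arbitrary):

```swift
import MultipeerConnectivity

// Sketch: open a byte stream to an already-connected peer; encoded video
// frames would then be written into the returned OutputStream.
func openVideoStream(session: MCSession, to peer: MCPeerID) -> OutputStream? {
    do {
        let stream = try session.startStream(withName: "video-feed", toPeer: peer)
        stream.schedule(in: .main, forMode: .default)
        stream.open()
        return stream
    } catch {
        print("Could not open stream: \(error)")
        return nil
    }
}
```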
Post not yet marked as solved
When playing an HLS .m3u8 playlist containing fragmented MP4 segments that specify a transformation matrix (defined in the Movie Header Box (mvhd) and Track Header Box (tkhd) atoms in ISO 14496-12), for instance a 90-degree clockwise rotation, the transformation is ignored and the video plays untransformed. This occurs both in QuickTime Player and in Safari when playing the .m3u8 playlist.
Concatenating the init.mp4 and .m4s files in the playlist into a single file and playing the result does apply the transformation, both in QuickTime Player and Safari.
Am I doing something wrong? Are MP4 transformations simply not supported in HLS? Rotations and flips seem like a pretty fundamental use case; otherwise the video needs to be transcoded.
Sample files to reproduce issue here:
https://bugs.webkit.org/show_bug.cgi?id=222781
Post not yet marked as solved
I've built a web app that uses WebRTC to allow people to video chat in the browser and not be forced to download an app.
However, it would be really helpful if iOS users could stream their video using their native camera app and not just the RTC element in the browser.
Is this possible?
I've found a way to open the native camera app using this HTML:
<input type="file" accept="video/*" capture="environment">
However, this only allows the user to upload their video and not stream it.
Post not yet marked as solved
I'm getting an issue where my fairplay video playback fails by raising AVPlayerItemFailedToPlayToEndTime with
Error Domain=CoreMediaErrorDomain Code=-12927 "(null)"
I can't find a single hit on Google for this error code.
I suspect that it has something to do with some kind of bad content in the FPS license response from the server. I can play unencrypted files OK; it's just the FPS content that fails. But my DRM resource loader delegate "acts" like it takes the license fine. I've played other vendors' FPS content using the same code, and it works there.
All I need is a hint as to what -12927 means. Is there some way to look this up?
Post not yet marked as solved
We are working on an IoT-based app where we connect via Wi-Fi to a dash camera and access video files from the dash cam. We load the video link using the VLCKit iOS SDK.
It works in India, both when Wi-Fi is connected to the dash cam and when mobile data is connected.
It does not work in the US when Wi-Fi is connected to the dash cam and mobile data is ON.
It does work in the US when Wi-Fi is connected to the dash cam and mobile data is OFF.
Post not yet marked as solved
AVPlayer started to throw an unknown error:
The operation couldn’t be completed. (CoreMediaErrorDomain error -16190.)
Is there any exhaustive list of CoreMediaErrorDomain's errors?
Post not yet marked as solved
We tried transcoding a video file to fMP4+HLS+HEVC using ffmpeg. The produced video won't play on iOS devices. Testing with mediastreamvalidator, we see "Error injecting segment data" while it's fetching the media files.
[/stream/qa-josh-content/transcode_exercise/josh/fmp4_exp/x265/hvc1_tag/2a7679347f1779c1b1575488a3f140a8_master_fs_main10_yuv420.m3u8]
Started root playlist download
[v0/2a7679347f1779c1b1575488a3f140a8_prog_index_fs_main10_yuv420.m3u8]
Started media playlist download
[v1/2a7679347f1779c1b1575488a3f140a8_prog_index_fs_main10_yuv420.m3u8]
Started media playlist download
Error injecting segment data
Error injecting segment data
Error injecting segment data
Error injecting segment data
Error injecting segment data
Error injecting segment data
Error injecting segment data
Error injecting segment data
Error injecting segment data
Error injecting segment data
[v0/2a7679347f1779c1b1575488a3f140a8_prog_index_fs_main10_yuv420.m3u8]
All media files delivered and have end tag, stopping
Error injecting segment data
Error injecting segment data
Error injecting segment data
Error injecting segment data
Error injecting segment data
Error injecting segment data
[v1/2a7679347f1779c1b1575488a3f140a8_prog_index_fs_main10_yuv420.m3u8]
All media files delivered and have end tag, stopping
--------------------------------------------------------------------------------
v0/2a7679347f1779c1b1575488a3f140a8_prog_index_fs_main10_yuv420.m3u8
--------------------------------------------------------------------------------
HTTP Content-Type: application/x-mpegURL
Processed 0 out of 4 segments
Average segment duration: 5.808333
Total segment bitrates (all discontinuities): average: 497.11 kb/s, max: 528.99 kb/s
Playlist max bitrate: 542.558000 kb/s
Audio Group ID: AUDIO
Discontinuity: sequence: 0, parsed segment count: 0 of 4, duration: 23.233 sec, average: 497.11 kb/s, max: 528.99 kb/s
--------------------------------------------------------------------------------
v1/2a7679347f1779c1b1575488a3f140a8_prog_index_fs_main10_yuv420.m3u8
--------------------------------------------------------------------------------
HTTP Content-Type: application/x-mpegURL
Processed 0 out of 4 segments
Average segment duration: 5.808333
Total segment bitrates (all discontinuities): average: 2132.38 kb/s, max: 2229.71 kb/s
Playlist max bitrate: 2341.058000 kb/s
Audio Group ID: AUDIO
Discontinuity: sequence: 0, parsed segment count: 0 of 4, duration: 23.233 sec, average: 2132.38 kb/s, max: 2229.71 kb/s
But the master.m3u8 is playable locally (tested in VLC and ffplay).
What is the root cause, and how can we solve it? Any help is appreciated; we've been badly stuck on this for a long time.
Post not yet marked as solved
Situation
My team uses AVPlayer to play live audio on iPhones. We would like to better understand why a user experiences buffering.
What we are currently doing:
We currently monitor the following AVPlayer attributes:
buffering reason
indicated bitrate
observed bitrate
error log events
What we have noticed:
Buffering reason - is always toMinimizeStalls, due to the fact that the buffer is empty.
Indicated bitrate - reports the BANDWIDTH from the manifest URL, as expected.
Observed bitrate - values reported here can be lower than the indicated bitrate, yet the stream still plays without buffering. I would expect values under the indicated bitrate to cause buffering, as described here on the Apple developer website.
Error log events - occasionally the error log reports an error code and message; however, around 60% of the time we don't get any details here that indicate why the user is experiencing buffering. When we do get error codes, there doesn't appear to be any mapping that explains what they mean.
Questions:
Is there a way to get signal strength from an iPhone? (A weak signal would give us some explanation for buffering.)
What is the recommended approach for finding the reasons for buffering? (How do we distinguish between a server-side issue and a client-side issue?)
Are there AVPlayer settings we can manipulate to reduce buffering?
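For context, this is roughly how we pull the numbers above, combining the item's access log and error log (a sketch; `item` is the AVPlayerItem we monitor):

```swift
import AVFoundation

// Sketch: log bitrate and stall-related details from AVPlayerItem's logs.
func logDiagnostics(for item: AVPlayerItem) {
    if let access = item.accessLog() {
        for event in access.events {
            print("indicated: \(event.indicatedBitrate), " +
                  "observed: \(event.observedBitrate), stalls: \(event.numberOfStalls)")
        }
    }
    if let errors = item.errorLog() {
        for event in errors.events {
            print("error \(event.errorStatusCode) in \(event.errorDomain): " +
                  "\(event.errorComment ?? "no comment")")
        }
    }
}

// New error-log entries can also be observed as they arrive.
let token = NotificationCenter.default.addObserver(
    forName: AVPlayerItem.newErrorLogEntryNotification,
    object: nil, queue: .main
) { note in
    print("new error log entry on \(String(describing: note.object))")
}
```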
Post not yet marked as solved
I'm using AVPlayer to stream video, like this:
let strURL = "https://multiplatform-f.akamaihd.net/i/multi/will/bunny/big_buck_bunny_,640x360_400,640x360_700,640x360_1000,950x540_1500,.f4v.csmil/master.m3u8"
if let url = URL(string: strURL) {
    let asset = AVAsset(url: url)
    let playerItem = AVPlayerItem(asset: asset)
    player = AVPlayer(playerItem: playerItem)
    let layer = AVPlayerLayer(player: player)
    layer.frame = viewPlaying.bounds
    viewPlaying.layer.addSublayer(layer)
    player?.playImmediately(atRate: 4.0)
}
The video won't play at that rate; it plays normally with a rate <= 2.0 on iOS 15.x.
Can anybody advise me on how to fix this issue?
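A sketch of the workaround I'm testing, based on AVPlayerItem's canPlayFastForward property (my understanding is that rates above 2.0 need the item to support fast-forward, which for HLS typically means I-frame playlists; treat that as an assumption):

```swift
import AVFoundation

// Sketch: only request a high rate when the item says it supports it;
// otherwise fall back to a rate that is known to work.
func playFast(_ player: AVPlayer, item: AVPlayerItem) {
    if item.canPlayFastForward {
        player.playImmediately(atRate: 4.0)
    } else {
        player.playImmediately(atRate: 2.0)
    }
}
```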
I've been maintaining adaptive HLS+FairPlay streams with audio and video for years. My implementation works great. However, I'm now also implementing captions or subtitles, and I'm having trouble with the latter.
I'm able to generate my HLS streams with WebVTT subtitles, and they work great. But as soon as I encrypt the streams, Apple players stop working (they stall forever). And my FairPlay implementation works perfectly when no subtitles are involved.
I'm not encrypting the WebVTT chunks: they travel as plain text, as stated in Apple's guidelines. I believe this may be the issue: encrypted A/V streams with an unencrypted subtitles stream. However, encrypting plain-text subtitles with SAMPLE-AES makes no sense to me, and so far I have been unable to find a single HLS example online with subtitles that also has FairPlay encryption. All the documents I have about FairPlay say nothing about this either.
I've also tried CEA-608 closed captions in the video stream, and this actually works great with FairPlay. But CEA-608 has its own issues, so I would like to migrate to WebVTT, which also works great except when FairPlay is involved.
I understand that Apple also allows TTML (IMSC1) inside fMP4, which I suspect may be SAMPLE-AES-encryptable. However, given my customers' use cases, I need to use the TS format for HLS, and so I can't use fMP4.
With all this in mind, does anybody know how to properly configure HLS+FairPlay with a plain-text WebVTT subtitles stream?
Please note this is about live streaming, and not VOD nor offline playback.
Thanks.
Post not yet marked as solved
Hi, I am trying to use mediastreamsegmenter and tsrecompressor, but it seems I am unable to make the UDP connection, as I am getting a "video encoder pipeline full" error while using tsrecompressor.
mediastreamsegmenter -w 1002 -t 4 224.0.0.50:9123 -s 16 -D -T -f /path
tsrecompressor -L 224.0.0.50:9123 -h -g -x -a
Please let me know what else needs to be done.
Thanks