Post not yet marked as solved
We are experiencing audio sync issues during playback on fMP4 HLS live streams (HLS and LL-HLS) on Apple devices only (iOS and macOS) and we're not sure what's causing the problem. The issue does not occur during playback on Windows or Android platforms.
During playback in Safari, everything is fine until sync is suddenly lost, usually 5-10 minutes after playback begins. The extent of the desync varies, but when it happens it is very noticeable, usually in the 15-30 frame range. Sync is always restored by restarting the player, until it is lost again some minutes later.
We are capturing the streams on iPhone devices and encoding HEVC / AAC-LC at 30fps locally on the device, and then sending to a media server for further processing. We then transcode the source stream and create multiple variations at different bitrates (HEVC). Because we are streaming from mobile devices in the field, during our server-side transcoding we set a constant 30fps frame rate in case of drops due to network issues. I should add that the issue occurs just as much with H.264 as with HEVC (we've tested many different combinations of input/output formats and protocols).
Regardless of whether we playback the source stream, the individual transcoded variations, or the ABR playlist with all variations, the sync problem appears in the same manner.
One interesting note. The issue seldom occurs on one of our older devices, an iPhone 6s Plus running a slightly older iOS version (14.4.1).
We suspect it has something to do with discontinuities inherent in our input streams that are not being corrected during our normalization/transcoding process, and that the Apple player is not compensating for them the way players on other platforms do.
We've run Apple's mediastreamvalidator tool and discovered multiple "must fix" issues, but it's not clear which of these, if any, are causing our problems. See the output attached.
MediaStreamValidator output
Also, here is the full HLS report from the validator tool (in PNG format due to file restrictions here):
Happy to share more details or run more tests. We've been trying to debug this for weeks now. Thanks for your help.
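For anyone triaging a similar issue: one low-tech way to test the discontinuity theory is to dump each segment's start timestamp and duration (e.g. from ffprobe output) and look for timeline gaps that would accumulate into desync. A minimal sketch, assuming you have already extracted (start, duration) pairs per segment; the Segment type and tolerance are illustrative, not part of any Apple API:

```swift
// Hypothetical helper: flag gaps or overlaps between consecutive segments
// larger than a tolerance. Repeated small gaps here are the kind of timeline
// discontinuity that can drift into visible A/V desync.
struct Segment {
    let start: Double     // presentation start time, seconds
    let duration: Double  // segment duration, seconds
}

func timelineGaps(_ segments: [Segment], tolerance: Double = 0.001) -> [(index: Int, gap: Double)] {
    guard segments.count > 1 else { return [] }
    var gaps: [(index: Int, gap: Double)] = []
    for i in 1..<segments.count {
        // Where the previous segment says the next one should begin.
        let expected = segments[i - 1].start + segments[i - 1].duration
        let gap = segments[i].start - expected
        if abs(gap) > tolerance {
            gaps.append((index: i, gap: gap))
        }
    }
    return gaps
}
```

Running this over each variant's video and audio tracks separately would show whether the two timelines drift apart at the same media sequence where playback desyncs.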
I've been searching all over for an answer to this question. I know that FairPlay/HLS supports audio streams, since there's obviously audio in video, but I'm wondering whether this is practical. Also, is there any way to stream FairPlay DRM-encrypted content without HLS? And if I use HLS, could I create only a single audio bitrate format in order to save on hosting costs? I work on an audiobook app that requires DRM for some content.
Also any links to documentation, videos, tutorials, blog posts... etc on the topic would be awesome too.
Very much appreciate anyone who takes the time to answer this. I wish this were discussed more explicitly on the Apple Dev site, but it seems very geared toward video streaming.
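For what it's worth, HLS supports audio-only variants, so a single-bitrate audio-only stream is legal, and FairPlay sample encryption applies to audio segments the same way it does to video. A minimal multivariant playlist sketch (the bandwidth value and file names are made up for illustration):

```
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-STREAM-INF:BANDWIDTH=160000,CODECS="mp4a.40.2"
audio_128k/prog_index.m3u8
```

With only one variant there is no ABR switching, which is exactly the single-bitrate, lower-hosting-cost setup described above.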
Hi
I'd like to know where I can find documentation or an SDK for using Widevine and/or FairPlay in our application.
Are there any examples of using them with AVPlayer?
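On the AVPlayer side, the current FairPlay entry point is AVContentKeySession (the older AVAssetResourceLoader flow also works). As far as I know, Widevine is not natively supported by AVPlayer and generally requires a third-party player SDK. A heavily abbreviated sketch of the FairPlay wiring, with the key-server exchange left as comments because it is entirely app-specific:

```swift
import AVFoundation

// Sketch only: the certificate loading and key-server POST are app-specific
// and omitted; the queue label is arbitrary.
final class KeyDelegate: NSObject, AVContentKeySessionDelegate {
    func contentKeySession(_ session: AVContentKeySession,
                           didProvide keyRequest: AVContentKeyRequest) {
        // 1. Load your FairPlay application certificate.
        // 2. Call keyRequest.makeStreamingContentKeyRequestData(forApp:contentIdentifier:options:completionHandler:)
        //    to obtain the SPC.
        // 3. POST the SPC to your key server and receive the CKC.
        // 4. Deliver it with keyRequest.processContentKeyResponse(
        //        AVContentKeyResponse(fairPlayStreamingKeyResponseData: ckc)).
    }
}

let keySession = AVContentKeySession(keySystem: .fairPlayStreaming)
let keyDelegate = KeyDelegate()
keySession.setDelegate(keyDelegate, queue: DispatchQueue(label: "fps.keys"))
// Before creating the AVPlayerItem:
// keySession.addContentKeyRecipient(asset)
```

Apple's "FairPlay Streaming" package on the developer site includes a full sample server and client.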
Thank you very much for any help.
Hello,
I'd like to know whether Multipeer Connectivity (MPC) between two modern iOS devices can support cross-streaming of video content at low latency and relatively high resolution. I learned from Ghostbusters that crossing the streams is a bad idea, but I need this for my app so I'm going to ignore their sage advice.
The application I have in mind involves one iOS device (A) live-streaming a feed to another iOS device (B) running the same app. At the same time, B is live-streaming its own feed to A. Neither feed needs to be crystal clear, but there has to be low latency. The sample code for "Streaming an AR Experience" seems to contain some answers, as it's based on MPC (and ARKit), but my project isn't quite AR and the latency seems high.
If MPC isn't suitable for this task (as my searches seem to indicate), is it possible to have one device set up a hotspot and link the two this way to achieve my cross-streaming ambitions? This seems like a more conservative method, assuming the hotspot and its client behave like they're wifi peers (not sure). I might start a new thread with just this question if that's more appropriate.
A third idea that's not likely to work (for various reasons) is data transfer over a lightning-lightning or lightning-usb-c connection.
If I've missed any other possible solutions to my cross-streaming conundrum, please let me know.
I've been reading around this subject on this forum as well and would be hugely grateful if Eskimo would grace me with his wisdom.
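On the MPC question: nothing in MultipeerConnectivity prevents sending data in both directions over one MCSession, so bidirectional streaming is structurally possible; whether the latency is acceptable depends mostly on compressing frames before sending (e.g. via VideoToolbox) and on using the unreliable send mode. A minimal sketch of the session plumbing, with the frame encode/decode steps assumed to exist elsewhere:

```swift
import UIKit
import MultipeerConnectivity

// Sketch: peer discovery (browser/advertiser) is omitted; both devices would
// run this same class, so each side both sends and receives.
final class PeerLink: NSObject, MCSessionDelegate {
    let peerID = MCPeerID(displayName: UIDevice.current.name)
    lazy var session = MCSession(peer: peerID,
                                 securityIdentity: nil,
                                 encryptionPreference: .none)

    override init() {
        super.init()
        session.delegate = self
    }

    func sendFrame(_ data: Data) {
        // .unreliable favors latency over delivery; dropped frames are acceptable
        // for a live feed where the next frame supersedes the lost one.
        try? session.send(data, toPeers: session.connectedPeers, with: .unreliable)
    }

    // MARK: MCSessionDelegate
    func session(_ session: MCSession, peer peerID: MCPeerID, didChange state: MCSessionState) {}
    func session(_ session: MCSession, didReceive data: Data, fromPeer peerID: MCPeerID) {
        // Decode the compressed frame and hand it to the display layer.
    }
    func session(_ session: MCSession, didReceive stream: InputStream, withName streamName: String, fromPeer peerID: MCPeerID) {}
    func session(_ session: MCSession, didStartReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, with progress: Progress) {}
    func session(_ session: MCSession, didFinishReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, at localURL: URL?, withError error: Error?) {}
}
```

A personal hotspot would put the two devices on one local network, but at that point a direct socket (Network framework) rather than MPC may be the simpler design.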
Trying to download an encrypted HLS stream we faced the following issue:
When we start a new download by calling the resume() function of AVAssetDownloadTask, the download process gets stuck (though not every time), and neither the urlSession(_:assetDownloadTask:didFinishDownloadingTo:) nor the urlSession(_:task:didCompleteWithError:) delegate function (AVAssetDownloadDelegate) gets called.
There are cases where not even the urlSession(_:assetDownloadTask:didLoad:totalTimeRangesLoaded:timeRangeExpectedToLoad:) delegate function gets called.
Any suggestions on how to troubleshoot?
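Hard to diagnose without code, but one wiring detail worth double-checking: the delegate has to be attached when the AVAssetDownloadURLSession is created (it cannot be added to an existing session), and the background configuration identifier must be unique and stable across launches. A sketch of the expected setup, with the identifier and title as placeholders:

```swift
import AVFoundation

// Sketch of the expected wiring; "hls-downloads" and "My Stream" are placeholders.
final class Downloader: NSObject, AVAssetDownloadDelegate {
    lazy var downloadSession: AVAssetDownloadURLSession = {
        // The delegate is fixed at creation time; reusing the same background
        // identifier for a second live session is a common source of silent hangs.
        let config = URLSessionConfiguration.background(withIdentifier: "hls-downloads")
        return AVAssetDownloadURLSession(configuration: config,
                                         assetDownloadDelegate: self,
                                         delegateQueue: .main)
    }()

    func start(url: URL) {
        let asset = AVURLAsset(url: url)
        let task = downloadSession.makeAssetDownloadTask(asset: asset,
                                                         assetTitle: "My Stream",
                                                         assetArtworkData: nil,
                                                         options: nil)
        task?.resume()
    }

    func urlSession(_ session: URLSession, assetDownloadTask: AVAssetDownloadTask,
                    didFinishDownloadingTo location: URL) {
        print("Saved to \(location)")
    }

    func urlSession(_ session: URLSession, task: URLSessionTask,
                    didCompleteWithError error: Error?) {
        print("Completed, error: \(String(describing: error))")
    }
}
```

If the setup matches this and it still stalls, a sysdiagnose plus a Feedback Assistant report is probably the next step, since the background transfer daemon's logs are not visible to the app.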
Hi, it seems pretty easy to consume HTTP Live Streaming content in an iOS app. Unfortunately, I need to consume media from an RTSP server. It seems to me that this is a very similar thing, and that all of the underpinnings for doing it ought to be present in iOS, but I'm having a devil of a time figuring out how to make it work without doing a lot of programming.

For starters, I know that there are web-based services that can consume an RTSP stream and rebroadcast it as an HTTP Live Stream that can be easily consumed by the media players in iOS. This won't work for me because my application needs to function in an environment where there is no internet access (it's on a private Wi-Fi network where the only other thing on the network is the device that is serving the RTSP stream).

Having read everything I can get my hands on and explored third-party and open-source solutions, I've compiled the following list of ideas:

1. Using an iOS build of the open-source ffmpeg library, which supports RTSP, I've come up with a test app that can receive the RTSP packets, decode them, create UIImages out of the frames, and display those frames on-screen. This provides a crude player, but performance is poor, most likely because ffmpeg can't take advantage of any hardware acceleration. It also doesn't provide me with any way to integrate the video stream into AVFoundation, so I'm on my own as far as saving the stream to a file, transcoding it, etc.

2. I know that the AVURLAsset class doesn't directly support the RTSP scheme. Since I have access to the undecoded RTSP packets via ffmpeg, I've thought it should be possible to implement RTSP support myself via a custom NSURLProtocol, essentially fooling AVFoundation into reading those packets as if they originated in a file. I'm not sure if this would work, since the raw packets coming from the RTSP server might lack the headers that would otherwise be present in data being read from a file. I'm not even sure if AVFoundation would recognize my custom protocol.

3. If a protocol doesn't work, I've considered that I might be able to implement my own local HTTP Live Streaming server that converts the RTSP packets into an HTTP stream that the media players can read. This sounds like a terribly convoluted solution to the problem at best, and very difficult at worst.

4. Going back to solution (1), if I could speed up the decoding by using some iOS CoreVideo function instead of ffmpeg, this solution might be okay. However, I can't find any documentation for CoreVideo on iOS (Apple only documents it for OS X).

5. I'm certainly willing to license a third-party solution if it works well and provides good performance. Unfortunately, everything I've found so far is pretty crummy and mostly just leverages ffmpeg and/or VLC.

What is most disappointing to me is that nobody seems to be able or willing to provide a solution that neatly integrates with AVFoundation. I really want to make my RTSP stream available as an AVAsset so I can use it with AVFoundation players and other classes; I don't want to build an app that relies on custom third-party code for everything.

Any ideas, tips, or advice would be greatly appreciated.

Thanks,
Frank
I'm seeing this error a lot while playing an HTTP live stream; AVPlayer stops playback when it encounters the error. I researched the error but couldn't find any clue. Does anyone have any recommendations or workarounds?
When playing an HLS .m3u8 playlist containing fragmented MP4 segments which specify a transformation matrix (defined in the Movie Header Box (mvhd) and Track Header Box (tkhd) atoms in ISO 14496-12), for instance a 90 degree clockwise rotation, the transformation is ignored and the video plays untransformed. This occurs both in QuickTime Player and in Safari when playing the .m3u8 playlist.
Concatenating the init.mp4 and .m4s files in the playlist into a file and playing the resulting file does apply the transformation, both in Quicktime Player and Safari.
Am I doing something wrong? Are MP4 transformations simply not supported in HLS? Rotations and flips seem like a pretty fundamental use case; otherwise the video needs to be transcoded.
Sample files to reproduce issue here:
https://bugs.webkit.org/show_bug.cgi?id=222781
Hi,
when I use a local .mp4 video file encoded in HEVC with an alpha channel, played with an AVPlayer as the material of an SCNNode, the transparency is rendered correctly, just as if I had used a .png image with transparency.
The issue is:
when I encode this same .mp4 file into an HLS stream using mediafilesegmenter and try to play it in the same manner, as an SCNNode material with AVPlayer, the transparency is not preserved and the transparent zones are instead filled with opaque black. (The HLS stream itself has correct transparency, as verified by opening its URL in Safari.)
Sample Test:
import UIKit
import ARKit
import AVFoundation // AVPlayer / AVURLAsset

class ViewController: UIViewController {

    private var arView: ARSCNView!

    lazy var sphere: SCNNode = {
        let node = SCNSphere(radius: 5)
        node.isGeodesic = false
        node.segmentCount = 64
        node.firstMaterial?.lightingModel = .constant
        node.firstMaterial?.diffuse.contents = #colorLiteral(red: 0, green: 0, blue: 0, alpha: 0)
        node.firstMaterial?.cullMode = .front
        return SCNNode(geometry: node)
    }()

    private var avPlayer: AVPlayer!

    override func viewDidLoad() {
        super.viewDidLoad()
        setupArView()
        setupArSession()
        setupButton()
    }

    private func setupButton() {
        let button = UIButton()
        button.setTitle("START", for: .normal)
        button.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(button)
        NSLayoutConstraint.activate([
            button.centerXAnchor.constraint(equalTo: view.centerXAnchor),
            button.centerYAnchor.constraint(equalTo: view.centerYAnchor)
        ])
        button.addTarget(self, action: #selector(createSphere), for: .touchUpInside)
    }

    @IBAction func createSphere() {
        guard avPlayer == nil else { return }
        addSphere()
    }
}

extension ViewController {

    private func setupArView() {
        arView = ARSCNView()
        arView.backgroundColor = .black
        arView.translatesAutoresizingMaskIntoConstraints = false
        view.insertSubview(arView, at: 0)
        NSLayoutConstraint.activate([
            arView.leadingAnchor.constraint(equalTo: view.leadingAnchor),
            arView.topAnchor.constraint(equalTo: view.topAnchor),
            arView.trailingAnchor.constraint(equalTo: view.trailingAnchor),
            arView.bottomAnchor.constraint(equalTo: view.bottomAnchor)
        ])
        arView.preferredFramesPerSecond = 60
    }

    private func setupArSession() {
        let configuration = ARWorldTrackingConfiguration()
        configuration.worldAlignment = .gravityAndHeading
        configuration.environmentTexturing = .none
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
            configuration.frameSemantics.insert(.personSegmentationWithDepth)
        }
        if ARWorldTrackingConfiguration.supportsUserFaceTracking {
            configuration.userFaceTrackingEnabled = true
        }
        arView.session.run(configuration)
    }

    private func addSphere() {
        // let asset = AVURLAsset(url: URL(string: "https://SOMECLOUDSTORAGE.com/hls-bug/prog_index.m3u8")!)
        let asset = AVURLAsset(url: Bundle.main.url(forResource: "puppets", withExtension: "mp4")!)
        let playerItem = AVPlayerItem(asset: asset)
        playerItem.addObserver(self, forKeyPath: #keyPath(AVPlayerItem.status), options: [.old, .new], context: nil)
        playerItem.isAudioSpatializationAllowed = true
        playerItem.allowedAudioSpatializationFormats = .monoStereoAndMultichannel
        avPlayer = AVPlayer()
        sphere.position = SCNVector3(0, 0, 0)
        arView.scene.rootNode.addChildNode(sphere)
        avPlayer.replaceCurrentItem(with: playerItem)
    }

    override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
        if keyPath == #keyPath(AVPlayerItem.status) {
            let status: AVPlayerItem.Status
            if let statusNumber = change?[.newKey] as? NSNumber {
                status = AVPlayerItem.Status(rawValue: statusNumber.intValue)!
            } else {
                status = .unknown
            }
            switch status {
            case .readyToPlay:
                DispatchQueue.main.async {
                    self.avPlayer.playImmediately(atRate: 1)
                    self.sphere.geometry?.firstMaterial?.diffuse.contents = self.avPlayer
                }
            case .failed, .unknown:
                break
            @unknown default:
                break
            }
        } else {
            super.observeValue(forKeyPath: keyPath, of: object, change: change, context: context)
        }
    }
}
The video file used is puppets_with_alpha_hevc.mov from Apple's HEVC-with-alpha demo, re-muxed into an .mp4 container using ffmpeg.
To reproduce both scenarios, replace the AVURLAsset with either the local .mp4 file or the HLS stream URL.
The issue reproduces on an iPhone 11 Pro running iOS 15.
This issue has been unresolved for quite some time now, and I've tried everything to get attention on it: an unsuccessful TSI ticket, a Feedback Assistant bug report that went silent, and I even discussed this bug during WWDC 2021 with Shiva Sundar, who is in charge of HEVC development and said it would be checked.
Hoping someone can help.
We tried transcoding a video file to fMP4 HLS with HEVC using ffmpeg. The produced video is unable to play on iOS devices. Testing with mediastreamvalidator, we see "Error injecting segment data" while it fetches the media files.
[/stream/qa-josh-content/transcode_exercise/josh/fmp4_exp/x265/hvc1_tag/2a7679347f1779c1b1575488a3f140a8_master_fs_main10_yuv420.m3u8]
Started root playlist download
[v0/2a7679347f1779c1b1575488a3f140a8_prog_index_fs_main10_yuv420.m3u8]
Started media playlist download
[v1/2a7679347f1779c1b1575488a3f140a8_prog_index_fs_main10_yuv420.m3u8]
Started media playlist download
Error injecting segment data
Error injecting segment data
Error injecting segment data
Error injecting segment data
Error injecting segment data
Error injecting segment data
Error injecting segment data
Error injecting segment data
Error injecting segment data
Error injecting segment data
[v0/2a7679347f1779c1b1575488a3f140a8_prog_index_fs_main10_yuv420.m3u8]
All media files delivered and have end tag, stopping
Error injecting segment data
Error injecting segment data
Error injecting segment data
Error injecting segment data
Error injecting segment data
Error injecting segment data
[v1/2a7679347f1779c1b1575488a3f140a8_prog_index_fs_main10_yuv420.m3u8]
All media files delivered and have end tag, stopping
--------------------------------------------------------------------------------
v0/2a7679347f1779c1b1575488a3f140a8_prog_index_fs_main10_yuv420.m3u8
--------------------------------------------------------------------------------
HTTP Content-Type: application/x-mpegURL
Processed 0 out of 4 segments
Average segment duration: 5.808333
Total segment bitrates (all discontinuities): average: 497.11 kb/s, max: 528.99 kb/s
Playlist max bitrate: 542.558000 kb/s
Audio Group ID: AUDIO
Discontinuity: sequence: 0, parsed segment count: 0 of 4, duration: 23.233 sec, average: 497.11 kb/s, max: 528.99 kb/s
--------------------------------------------------------------------------------
v1/2a7679347f1779c1b1575488a3f140a8_prog_index_fs_main10_yuv420.m3u8
--------------------------------------------------------------------------------
HTTP Content-Type: application/x-mpegURL
Processed 0 out of 4 segments
Average segment duration: 5.808333
Total segment bitrates (all discontinuities): average: 2132.38 kb/s, max: 2229.71 kb/s
Playlist max bitrate: 2341.058000 kb/s
Audio Group ID: AUDIO
Discontinuity: sequence: 0, parsed segment count: 0 of 4, duration: 23.233 sec, average: 2132.38 kb/s, max: 2229.71 kb/s
But the master .m3u8 is playable locally (tested in VLC and ffplay).
What's the root cause, and how can we solve it? We've been badly stuck on this for a long time.
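In case it helps narrow things down: for HEVC in fMP4 HLS, Apple players require the video sample entry to be 'hvc1' rather than ffmpeg's default 'hev1' (the hvc1_tag in your paths suggests you may already be doing this, so treat the following only as a baseline to compare your command against). A hedged ffmpeg invocation with placeholder file names:

```shell
# Baseline fMP4 HLS encode for Apple players (input/output names are placeholders).
# -tag:v hvc1 rewrites the sample entry from ffmpeg's default 'hev1' to 'hvc1',
# which Apple's HLS authoring rules require for HEVC.
ffmpeg -i input.mp4 \
  -c:v libx265 -tag:v hvc1 \
  -c:a aac \
  -hls_segment_type fmp4 \
  -hls_time 6 \
  -hls_playlist_type vod \
  out.m3u8
```

It is also worth re-running mediastreamvalidator against a locally served copy of the same files; "Error injecting segment data" can reflect a fetch problem with the server rather than the segments themselves.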
I'm getting an issue where my fairplay video playback fails by raising AVPlayerItemFailedToPlayToEndTime with
Error Domain=CoreMediaErrorDomain Code=-12927 "(null)"
I can't find a single hit on Google for this error code.
I suspect it has something to do with some kind of bad content in the FPS license response from the server. I can play unencrypted files OK; it's just the FPS content that fails. Yet my DRM resource loader delegate acts like it accepts the license fine, and I've played other vendors' FPS content using the same code, where it works.
All I need is a hint as to what -12927 means. Is there some way to look this up?
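I don't know of a public table for CoreMediaErrorDomain codes, but you can often extract more context than the top-level code. A sketch that dumps the nested error and the player item's error log when the failure notification fires (the notification observer wiring itself is assumed to exist elsewhere in your app):

```swift
import AVFoundation

// Sketch: call this from your AVPlayerItemFailedToPlayToEndTime observer.
// The error log frequently carries a per-request errorComment and URI that
// the top-level NSError does not.
func dumpFailure(for item: AVPlayerItem, notification: Notification) {
    if let error = notification.userInfo?[AVPlayerItemFailedToPlayToEndTimeErrorKey] as? NSError {
        print("domain:", error.domain, "code:", error.code)
        print("userInfo:", error.userInfo) // may contain an underlying error
        if let underlying = error.userInfo[NSUnderlyingErrorKey] as? NSError {
            print("underlying:", underlying.domain, underlying.code)
        }
    }
    if let log = item.errorLog() {
        for event in log.events {
            print("errorLog:", event.errorDomain, event.errorStatusCode,
                  event.errorComment ?? "", event.uri ?? "")
        }
    }
}
```

If the error log points at the key-delivery request, comparing your CKC response headers and body framing against the working vendor's would be the next step.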
We are working on an IoT-based app in which we connect over Wi-Fi to a dash camera and access video files from it. We load the video link using the VLCKit iOS SDK.
In India it works, with the phone connected to the dash cam's Wi-Fi and mobile data also connected.
In the US it does not work when connected to the dash cam's Wi-Fi with mobile data ON, but it does work when connected to the dash cam's Wi-Fi with mobile data OFF.
AVPlayer has started to throw an unknown error:
The operation couldn’t be completed. (CoreMediaErrorDomain error -16190.)
Is there any exhaustive list of CoreMediaErrorDomain's errors?
Situation
My team uses AVPlayer to play live audio on iPhones. We would like to better understand why a user experiences buffering.
What we are currently doing:
We currently monitor the following AVPlayer attributes:
buffering reason
indicated bitrate
observed bitrate
error log events
What we have noticed:
Buffering reason - is always toMinimizeStalls, due to the fact that the buffer is empty.
Indicated bitrate - reports the BANDWIDTH from the manifest URL as expected.
Observed bitrate - values reported here can be lower than the indicated bitrate, yet the stream plays without buffering. I would expect values under the indicated bitrate to cause buffering, as described on the Apple developer website.
Error log events - occasionally the error log reports an error code and message; however, around 60% of the time we don't get any details indicating why the user is experiencing buffering. When we do get error codes, there doesn't appear to be any mapping showing what they mean.
Questions:
Is there a way to get signal strength from an iPhone? (A weak signal would give us some explanation for buffering.)
What is the recommended approach for getting reasons for buffering? (How to distinguish between a server side issue and a client side issue)
Are there AVPlayer settings we can manipulate to reduce buffering?
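On the signal-strength question: as far as I know there is no supported public API for reading cellular or Wi-Fi signal strength on iOS. For separating client-side from server-side causes, the access log is usually the most informative signal AVPlayer exposes; a small sketch using real AVPlayerItemAccessLogEvent fields:

```swift
import AVFoundation

// Sketch: each access-log event summarizes a stretch of playback. High
// numberOfStalls with observedBitrate well below indicatedBitrate suggests
// the client's network; normal bitrates with stalls point more toward
// origin/CDN segment delivery.
func summarizeAccessLog(for item: AVPlayerItem) {
    guard let log = item.accessLog() else { return }
    for event in log.events {
        print("stalls:", event.numberOfStalls,
              "indicated:", event.indicatedBitrate,
              "observed:", event.observedBitrate,
              "dropped video frames:", event.numberOfDroppedVideoFrames,
              "media requests:", event.numberOfMediaRequests)
    }
}
```

Pairing these counters with the error log's per-request URIs and status codes usually gives enough to attribute a stall to one side or the other.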
I'm using AVPlayer to stream video, as follows:
let strURL = "https://multiplatform-f.akamaihd.net/i/multi/will/bunny/big_buck_bunny_,640x360_400,640x360_700,640x360_1000,950x540_1500,.f4v.csmil/master.m3u8"
if let url = URL(string: strURL) {
    let asset = AVAsset(url: url)
    let playerItem = AVPlayerItem(asset: asset)
    player = AVPlayer(playerItem: playerItem)
    let layer = AVPlayerLayer(player: player)
    layer.frame = viewPlaying.bounds
    viewPlaying.layer.addSublayer(layer)
    player?.playImmediately(atRate: 4.0)
}
The video won't play at this rate; it plays normally with rate <= 2.0 on iOS 15.x.
Can anybody give me advice on how to fix this issue?
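One thing worth ruling out: whether the item advertises fast-forward support at all. For HLS, rates above 2.0 generally require the stream to provide I-frame playlists (EXT-X-I-FRAME-STREAM-INF); without them, high rates may simply refuse to play. A defensive sketch (playFast is a hypothetical helper, not an AVFoundation API):

```swift
import AVFoundation

// Hypothetical helper: only request the high rate when the item says it can
// fast-forward; otherwise fall back to normal speed instead of failing silently.
func playFast(_ player: AVPlayer, item: AVPlayerItem, desiredRate: Float) {
    let rate: Float = item.canPlayFastForward ? desiredRate : 1.0
    player.playImmediately(atRate: rate)
}
```

If canPlayFastForward is false for this stream, the fix is on the packaging side (adding I-frame playlists), not in the player code.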
Hi, I am trying to use mediastreamsegmenter and tsrecompressor, but it seems I am unable to make the UDP connection, as I am getting a "video encoder pipeline full" error while using tsrecompressor.
mediastreamsegmenter -w 1002 -t 4 224.0.0.50:9123 -s 16 -D -T -f /path
tsrecompressor -L 224.0.0.50:9123 -h -g -x -a
Please let me know what else needs to be done.
Thanks
I am using GoogleWebRTC for live view. There is padding data attached to each video frame, and I want to get at that padding data. Is there any way to do this?
Hello! I am trying to insert and play a live stream inside a video tag. In Chrome it works perfectly: the video starts playing as soon as possible. In Safari I can see when the live stream becomes available and the poster is removed (autoplay is successfully triggered), but until I switch between browser tabs I see only a black screen where the video should be. I can also see my live stream if I switch between application windows, e.g. switching to Slack and then back to Safari, at which point the video plays.
While studying the problem, I found that it is not a WebRTC problem (on the server I can see that the live stream bytes are delivered to the client), and it is not a problem with the HTML5 video tag, because autoplay (muted, of course) is always triggered as expected. It may be due to internal Safari handling of web pages, because only one thing changes when I switch between tabs/apps: document.visibilityState.
How can I fix this behavior? I want my stream to work equally well in all browsers, and right now the problem I described is observed exclusively in Safari.
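A workaround that often helps in cases like this is to make the element explicitly inline, muted, and autoplaying, and to call play() yourself once metadata arrives, rather than relying on Safari to start painting on its own. A hedged sketch (attachStream is hypothetical glue; stream is the remote MediaStream from your peer connection's track event):

```javascript
// Hypothetical helper: wire a remote WebRTC stream into a <video> element in a
// way Safari's autoplay and inline-playback policies accept.
function attachStream(video, stream) {
  video.muted = true;        // required for autoplay without a user gesture
  video.playsInline = true;  // maps to the playsinline attribute Safari expects
  video.autoplay = true;
  video.srcObject = stream;
  video.onloadedmetadata = () => {
    const p = video.play();  // explicit play() nudges Safari to start rendering
    if (p && p.catch) {
      p.catch(() => { /* if blocked, retry after a user gesture */ });
    }
  };
}
```

If the black frame persists, listening for visibilitychange and calling video.play() when the document becomes visible mimics the tab-switch that currently unblocks it, which at least confirms the diagnosis.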
On iOS 15.2, when playing live stream content, AVPlayerItem.status reports AVPlayerItemStatusUnknown but never moves to AVPlayerItemStatusReadyToPlay or AVPlayerItemStatusFailed.
It somehow remains in a buffering state. It is not a network/connection issue, because it works fine on older versions of iOS. It also works fine on other types of content; it only happens on live streams, about 90% of the time.
Hello, I have some videos that I can play with AVPlayer, but I can't download them and play them offline on my device. If I download the asset and play it one time while online, I can then play it offline as if it had been fully downloaded.

I use the code from Apple's recommended project: https://developer.apple.com/documentation/avfoundation/media_assets_playback_and_editing/using_avfoundation_to_play_and_persist_http_live_streams

Replacing the link in Apple's project with my own link, I experience the same result. The other .m3u8 videos that I am trying to download in my project work fine; the main difference is that my link uses URL signing. In Apple's project, too, the video can be played but not downloaded and played offline.

Example link to test my download: https://tr.vod.cdn.cosmotetvott.gr/v1/524/16/403201605073/403201605073.ism/.m3u8?hdnts=st=1578561097~exp=1578604327~acl=*~id=3497d811-9d5d-4f6e-9727-e08e759b3eec~hmac=014af872b1c5455e3369c2a6005a3346e74a7da4

The video does not have any encryption, so that's not the issue, but it might be the download over HTTPS. When I download the container from Devices for my app, I see the downloaded myVideo.movpkg file; inside that folder there are the root.xml file, the Data folder, the boot.xml file, and only 2 folders with fragment data (in some videos only 1 folder). When I play the downloaded video with an internet connection, more fragment folders are downloaded, and if I download all the fragment folders, the video can be played offline normally.

root.xml file:

<?xml version="1.0" encoding="UTF-8"?>
<MoviePackage xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://apple.com/IMG/Schemas/MoviePackage" xsi:schemaLocation="http://apple.com/IMG/Schemas/MoviePackage /System/Library/Schemas/MoviePackage.xsd">
  <Version>1.0</Version>
  <MoviePackageType>HLS</MoviePackageType>
  <BootImage>boot.xml</BootImage>
</MoviePackage>

boot.xml file:

<?xml version="1.0" encoding="UTF-8"?>
<HLSMoviePackage xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://apple.com/IMG/Schemas/HLSMoviePackage" xsi:schemaLocation="http://apple.com/IMG/Schemas/HLSMoviePackage /System/Library/Schemas/HLSMoviePackage.xsd">
  <Version>1.0</Version>
  <HLSMoviePackageType>PersistedStore</HLSMoviePackageType>
  <Streams>
    <Stream ID="0-ROFVRWFPBCMNXKILE7FOE4O3PNGGD2HZ-454000" Path="0-ROFVRWFPBCMNXKILE7FOE4O3PNGGD2HZ-454000" NetworkURL="https://gr-ampelokipoi-prod-cache01.vod.cdn.cosmotetvott.gr/v1/524/16/403201605073/403201605073.ism/403201605073-audio_eng=128000-video=300000.m3u8?hdnts=st=1578561097~exp=1578604327~acl=*~id=3497d811-9d5d-4f6e-9727-e08e759b3eec~hmac=014af872b1c5455e3369c2a6005a3346e74a7da4"><Complete>YES</Complete></Stream>
    <Stream ID="0-IZ4BCCN2TMQXA2S47ILRQOD5IPNLNK5R-454000" Path="0-IZ4BCCN2TMQXA2S47ILRQOD5IPNLNK5R-454000" NetworkURL="https://tr.vod.cdn.cosmotetvott.gr/v1/524/16/403201605073/403201605073.ism/403201605073-audio_eng=128000-video=300000.m3u8?hdnts=st=1578561097~exp=1578604327~acl=*~id=3497d811-9d5d-4f6e-9727-e08e759b3eec~hmac=014af872b1c5455e3369c2a6005a3346e74a7da4"><Complete>NO</Complete></Stream>
  </Streams>
  <MasterPlaylist><NetworkURL>https://tr.vod.cdn.cosmotetvott.gr/v1/524/16/403201605073/403201605073.ism/.m3u8?hdnts=st=1578561097~exp=1578604327~acl=*~id=3497d811-9d5d-4f6e-9727-e08e759b3eec~hmac=014af872b1c5455e3369c2a6005a3346e74a7da4</NetworkURL></MasterPlaylist>
  <DataItems Directory="Data"><DataItem><ID>65E586E6-53C7-4750-A5B7-55416C783867</ID><Category>Playlist</Category><Name>master.m3u8</Name><DataPath>Playlist-master.m3u8-65E586E6-53C7-4750-A5B7-55416C783867.data</DataPath><Role>Master</Role></DataItem></DataItems>
</HLSMoviePackage>

Inside the Data folder, the Playlist-master.m3u8-65...67.data file:

#EXTM3U
#EXT-X-VERSION:1
## Created with Unified Streaming Platform (version=1.10.12-18737)
# variants
#EXT-X-STREAM-INF:BANDWIDTH=454000,CODECS="mp4a.40.2,avc1.77.30",RESOLUTION=424x240,FRAME-RATE=25
403201605073-audio_eng=128000-video=300000.m3u8?hdnts=st=1578561097~exp=1578604327~acl=*~id=3497d811-9d5d-4f6e-9727-e08e759b3eec~hmac=014af872b1c5455e3369c2a6005a3346e74a7da4
#EXT-X-STREAM-INF:BANDWIDTH=878000,CODECS="mp4a.40.2,avc1.77.30",RESOLUTION=640x360,FRAME-RATE=25
403201605073-audio_eng=128000-video=700000.m3u8?hdnts=st=1578561097~exp=1578604327~acl=*~id=3497d811-9d5d-4f6e-9727-e08e759b3eec~hmac=014af872b1c5455e3369c2a6005a3346e74a7da4
#EXT-X-STREAM-INF:BANDWIDTH=1726000,CODECS="mp4a.40.2,avc1.77.31",RESOLUTION=1024x576,FRAME-RATE=25
403201605073-audio_eng=128000-video=1500000.m3u8?hdnts=st=1578561097~exp=1578604327~acl=*~id=3497d811-9d5d-4f6e-9727-e08e759b3eec~hmac=014af872b1c5455e3369c2a6005a3346e74a7da4
#EXT-X-STREAM-INF:BANDWIDTH=3952000,CODECS="mp4a.40.2,avc1.77.31",RESOLUTION=1280x720,FRAME-RATE=25
403201605073-audio_eng=128000-video=3600000.m3u8?hdnts=st=1578561097~exp=1578604327~acl=*~id=3497d811-9d5d-4f6e-9727-e08e759b3eec~hmac=014af872b1c5455e3369c2a6005a3346e74a7da4
#EXT-X-STREAM-INF:BANDWIDTH=7026000,CODECS="mp4a.40.2,avc1.77.40",RESOLUTION=1920x1080,FRAME-RATE=25
403201605073-audio_eng=128000-video=6500000.m3u8?hdnts=st=1578561097~exp=1578604327~acl=*~id=3497d811-9d5d-4f6e-9727-e08e759b3eec~hmac=014af872b1c5455e3369c2a6005a3346e74a7da4
# variants
#EXT-X-STREAM-INF:BANDWIDTH=136000,CODECS="mp4a.40.2"
403201605073-audio_eng=128000.m3u8?hdnts=st=1578561097~exp=1578604327~acl=*~id=3497d811-9d5d-4f6e-9727-e08e759b3eec~hmac=014af872b1c5455e3369c2a6005a3346e74a7da4
My question is: what's wrong? Is it a client issue, or must something change on the back-end?
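Two things that might be worth checking: whether the signed URL (the exp= token above) expires before all fragment folders are fetched, since an expired signature would stop mid-download exactly as described, and whether an aggregate download task behaves differently, since it lets you request media selections explicitly. A sketch under those assumptions (the session wiring is as in Apple's sample project; the title is a placeholder):

```swift
import AVFoundation

// Sketch (iOS 11+): request the asset's preferred media selection explicitly
// via an aggregate download task. If this also stops at the same point, the
// expiring signed URL is the more likely culprit than the task configuration.
func downloadAll(from url: URL, session: AVAssetDownloadURLSession) {
    let asset = AVURLAsset(url: url)
    let preferred = asset.preferredMediaSelection
    let task = session.aggregateAssetDownloadTask(with: asset,
                                                  mediaSelections: [preferred],
                                                  assetTitle: "My Video",
                                                  assetArtworkData: nil,
                                                  options: nil)
    task?.resume()
}
```

The boot.xml above already shows one stream with Complete=NO, so confirming whether its NetworkURL is still valid (not past exp=) at the moment the download stalls would directly test the back-end theory.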