I'm working on a project that carries private information in ID3 PRIV frames within an HLS stream, and I expect to receive that information through the metadataOutput(_:didOutputTimedMetadataGroups:from:) callback of AVPlayerItemMetadataOutputPushDelegate. But there is a tricky situation during playback: metadataOutput is not always triggered. Sometimes it fires, sometimes it doesn't. However, a web player, Video.js, always gets the data when it plays the same stream. I'm wondering what the difference is between AVPlayer and Video.js here.
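For context, here is roughly how I attach the metadata output (a minimal sketch; the payload decoding is a placeholder for my actual setup):

```swift
import AVFoundation

final class MetadataReader: NSObject, AVPlayerItemMetadataOutputPushDelegate {
    let player: AVPlayer
    private let metadataOutput = AVPlayerItemMetadataOutput(identifiers: nil)
    private let delegateQueue = DispatchQueue(label: "metadata.output")

    init(url: URL) {
        let item = AVPlayerItem(url: url)
        player = AVPlayer(playerItem: item)
        super.init()
        metadataOutput.setDelegate(self, queue: delegateQueue)
        item.add(metadataOutput)
        player.play()
    }

    func metadataOutput(_ output: AVPlayerItemMetadataOutput,
                        didOutputTimedMetadataGroups groups: [AVTimedMetadataGroup],
                        from track: AVPlayerItemTrack?) {
        for group in groups {
            for item in group.items where item.identifier == .id3MetadataPrivateFrame {
                // A PRIV frame carries an owner string plus raw private data;
                // decoding the payload is application-specific.
                print("PRIV payload:", item.dataValue ?? Data())
            }
        }
    }
}
```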
To dig into this, I ran a couple of experiments.
First, I played Apple's basic HLS sample stream (https://devstreaming-cdn.apple.com/videos/streaming/examples/bipbop_16x9/bipbop_16x9_variant.m3u8), which carries ID3 data every 5 seconds. Every time I launched the player, metadataOutput was triggered as expected, and I noticed that in those segments the audio, video, and data streams all share the same PTS at the beginning of each segment. I'm curious whether that is the root cause of my problem, i.e. whether matching PTS values are a requirement of AVPlayer.
In my own HLS stream, the audio, data, and video tracks have different PTS values. If I play the audio playlist alone, the metadata start time shows the offset between the audio track and the data track, and metadataOutput is triggered every time. But if I play the master playlist, which includes the audio playlist and the video playlists, it is not triggered every time. When it is (luckily) triggered, the metadata start time shows the offset between the data track and the video track.
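For reference, this is how I measured those offsets: a small helper (logOffsets is my own name for it, called from the delegate callback above) that compares each group's start time against the playhead:

```swift
import AVFoundation

// A sketch of the logging added inside the delegate callback: each group's
// timeRange.start is expressed on the player item's timeline, so comparing
// it with the current playback time makes the track offset visible.
func logOffsets(for groups: [AVTimedMetadataGroup], player: AVPlayer) {
    for group in groups {
        let groupStart = group.timeRange.start.seconds
        let playhead = player.currentTime().seconds
        print(String(format: "metadata starts at %.3fs, playhead at %.3fs, offset %.3fs",
                     groupStart, playhead, groupStart - playhead))
    }
}
```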
Could anyone point out what the problem is, and what the expected PTS sync behavior is when using an A/V-split (demuxed) HLS stream?
All the details are captured as screenshots, and the links are included in the attachment.
Note: in one situation the data PTS is the same as the video PTS, while the audio PTS starts from 0. Every time I press the power button to turn the screen off and back on, metadataOutput is called immediately, but I don't know why.