Hi.
Does the Safari / WebKit MediaSource API implementation support a "low-delay" mode for sub-second latency live streaming scenarios?
If not, is there another way to achieve my use case? (See details below.)
I am trying to stream live video to the Safari browser by appending fragmented MP4 to a MediaSource SourceBuffer (simplified sketch below).
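For context, here is a simplified version of my setup. The WebSocket endpoint and the codec string are placeholders, not my exact values:

```ts
const video = document.querySelector('video') as HTMLVideoElement;
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  // Fragmented MP4, H.264 video only; the codec string is illustrative.
  const sourceBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.64001f"');

  const socket = new WebSocket('wss://example.com/live'); // placeholder endpoint
  socket.binaryType = 'arraybuffer';

  // appendBuffer is asynchronous, so queue fragments and feed them one at a time.
  const queue: ArrayBuffer[] = [];
  const pump = () => {
    if (!sourceBuffer.updating && queue.length > 0) {
      sourceBuffer.appendBuffer(queue.shift()!);
    }
  };
  sourceBuffer.addEventListener('updateend', pump);
  socket.onmessage = (event) => {
    queue.push(event.data as ArrayBuffer);
    pump();
  };
});
```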
See this StackOverflow answer: Chrome uses hints from the MP4 stream to decide whether to enable this mode. In this mode the buffer size is reduced and, importantly, the video does not enter the "paused" state when it is not fed in time (buffer underflow); it continues playing as soon as new frames are fed.
In Safari, the <video> element will pause if I run out of data to feed to the SourceBuffer, and it will not start playing again even if I feed more data. I have to forcefully call HTMLVideoElement.play(), but this causes hitches in playback.
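Simplified, the workaround I am using today looks like this (the helper name is mine, nothing standard):

```ts
// Whenever new data lands and Safari has paused on underflow, force the
// element back into playback. This works, but it is exactly what produces
// the visible hitches described above.
function resumeAfterAppend(video: HTMLVideoElement, sourceBuffer: SourceBuffer) {
  sourceBuffer.addEventListener('updateend', () => {
    if (video.paused) {
      // play() returns a Promise; it can reject (e.g. autoplay policy).
      video.play().catch(() => { /* ignore and retry on the next append */ });
    }
  });
}
```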
Other details:
- I often run out of data to feed to SourceBuffer.appendBuffer, causing buffer underflow, either because the video has not changed server-side or because network conditions cause data to be dropped (see the sketch after this list).
- My video is H.264 encoded with a single I-frame followed only by P-frames, to reduce network bandwidth and minimize latency.
- The WebRTC MediaStream API is not an option, as I need more control over the encoding and the network protocol.
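For reference, this is roughly the logging I use to observe the underflow; the helper name and log format are mine, purely illustrative:

```ts
function monitorUnderflow(video: HTMLVideoElement) {
  video.addEventListener('waiting', () => {
    const end = video.buffered.length > 0
      ? video.buffered.end(video.buffered.length - 1)
      : 0;
    console.log(`stalled at ${video.currentTime.toFixed(3)}s, buffered end ${end.toFixed(3)}s`);
  });
  video.addEventListener('pause', () => {
    // In Safari this fires on underflow, and playback does not resume on its own.
    console.log('element paused');
  });
}
```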
Thank you!