
Situation: My team uses AVPlayer to play live audio on iPhones. We would like to better understand why a user experiences buffering.

What we are currently doing: We monitor the following AVPlayer attributes:

- buffering reason
- indicated bitrate
- observed bitrate
- error log events

What we have noticed:

- Buffering reason: always toMinimizeStalls, because the buffer is empty.
- Indicated bitrate: reports the BANDWIDTH value from the manifest as expected.
- Observed bitrate: values reported here can be lower than the indicated bitrate, yet the stream still plays without any buffering. I would expect values under the indicated bitrate to cause buffering, as described on the Apple developer website.
- Error log events: occasionally the error log reports an error code and message, but around 60% of the time it gives us no details indicating why the user is experiencing buffering. When we do receive error codes, there doesn't appear to be any documentation mapping the codes to their meanings.

Questions:

1. Is there a way to get signal strength from an iPhone? (A weak signal would give us some explanation for buffering.)
2. What is the recommended approach for determining the reason for buffering? (How do we distinguish between a server-side issue and a client-side issue?)
3. Are there AVPlayer settings we can manipulate to reduce buffering?
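For context, the attributes listed above can be read from AVPlayerItem's KVO properties and its access log. Below is a minimal sketch of that wiring; the `BufferingMonitor` class name is illustrative, and the print statements stand in for whatever reporting pipeline is actually used:

```swift
import AVFoundation

final class BufferingMonitor {
    private var observations: [NSKeyValueObservation] = []

    func attach(to item: AVPlayerItem) {
        // Fires when playback stalls because the buffer has run dry.
        observations.append(item.observe(\.isPlaybackBufferEmpty, options: [.new]) { item, _ in
            if item.isPlaybackBufferEmpty {
                print("Buffer empty - stall likely")
            }
        })
        // Fires when AVPlayer judges playback can continue without stalling.
        observations.append(item.observe(\.isPlaybackLikelyToKeepUp, options: [.new]) { item, _ in
            print("Likely to keep up: \(item.isPlaybackLikelyToKeepUp)")
        })
    }

    // Snapshot the access log: indicated vs. observed bitrate, plus stall count.
    func logAccessEvents(for item: AVPlayerItem) {
        guard let events = item.accessLog()?.events else { return }
        for event in events {
            print("indicated: \(event.indicatedBitrate), "
                + "observed: \(event.observedBitrate), "
                + "stalls: \(event.numberOfStalls)")
        }
    }
}
```

The error log is available the same way via `item.errorLog()?.events`, and `AVPlayerItem.newErrorLogEntryNotification` can be observed to react to new entries as they arrive.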
Posted by schretze. Last updated.
Situation: I have an HLS audio-only stream comprised of AAC files. I've confirmed with ffprobe that timed metadata is attached to the stream. Unfortunately, I'm unable to access the timed metadata from the AVPlayer.

Output from ffprobe:

```
~ ffprobe index_1_296.aac
....
Input #0, aac, from 'index_1_296.aac':
  Metadata:
    id3v2_priv.com.apple.streaming.transportStreamTimestamp: \x00\x00\x00\x00.\x00\x05\xc0
  Duration: 00:00:06.02, bitrate: 96 kb/s
  Stream #0:0: Audio: aac (LC), 48000 Hz, stereo, fltp, 96 kb/s
```

What I've done: In my class containing the AVPlayer I've adopted AVPlayerItemMetadataOutputPushDelegate and implemented the metadataOutput method. I followed an example I found here: https://dcordero.medium.com/hls-timed-metadata-with-avplayer-9e20806ef92f. Below is my implementation of the metadataOutput method:

```swift
func metadataOutput(_ output: AVPlayerItemMetadataOutput,
                    didOutputTimedMetadataGroups groups: [AVTimedMetadataGroup],
                    from track: AVPlayerItemTrack?) {
    if let item = groups.first?.items.first,
       let metadataValue = item.value {
        print("Metadata value:\n\(metadataValue)")
    } else {
        print("Metadata error")
    }
}
```

What I'm seeing: When playing manifests containing .ts files, this metadataOutput method is triggered with timed metadata. However, when playing a manifest containing only .aac files, the metadataOutput method is never triggered.

Questions: Does AVPlayer support extracting timed metadata from AAC files? If it does, are there any examples of this working?
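For reference, the delegate method alone is not enough: an AVPlayerItemMetadataOutput must be created, given a delegate and queue, and added to the player item. Below is a minimal sketch of that full wiring; the `MetadataReader` class name and the `url` parameter are illustrative:

```swift
import AVFoundation

final class MetadataReader: NSObject, AVPlayerItemMetadataOutputPushDelegate {
    let player: AVPlayer
    // Passing nil requests metadata for all identifiers.
    private let metadataOutput = AVPlayerItemMetadataOutput(identifiers: nil)

    init(url: URL) {
        let item = AVPlayerItem(url: url)
        player = AVPlayer(playerItem: item)
        super.init()
        // Deliver timed metadata groups to this delegate on the main queue.
        metadataOutput.setDelegate(self, queue: .main)
        item.add(metadataOutput)
    }

    func metadataOutput(_ output: AVPlayerItemMetadataOutput,
                        didOutputTimedMetadataGroups groups: [AVTimedMetadataGroup],
                        from track: AVPlayerItemTrack?) {
        // Log every item in every delivered group, not just the first.
        for item in groups.flatMap({ $0.items }) {
            print("id: \(String(describing: item.identifier)), "
                + "value: \(String(describing: item.value))")
        }
    }
}
```

If the delegate fires for .ts segments but not .aac segments with this wiring in place, the difference lies in how AVPlayer surfaces ID3 metadata for the two container formats rather than in the delegate setup.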
Posted by schretze. Last updated.