VTCompressionSession Bitrate/Datarate overshooting

I have been working on an H.264 hardware-accelerated encoder implementation using VideoToolbox's VTCompressionSession for a while now, and a consistent problem has been the unreliable bitrate coming out of it. I have read many forum posts and looked through existing code for this, and tried to follow suit, but the bitrate out of my encoder is almost always somewhere between 5% and 50% off its target, and on occasion I've seen some huge errors, even 400% overshoot, where a single frame will be twice the size of the configured average bitrate.


My session is set up as follows:

  • kVTCompressionPropertyKey_AverageBitRate = desired bitrate
  • kVTCompressionPropertyKey_DataRateLimits = [desired bitrate / 8, 1]; accounting for bits vs. bytes
  • kVTCompressionPropertyKey_ExpectedFrameRate = framerate (30, 15, 5, or 1 fps)
  • kVTCompressionPropertyKey_MaxKeyFrameInterval = 1500
  • kVTCompressionPropertyKey_MaxKeyFrameIntervalDuration = 1500 / framerate
  • kVTCompressionPropertyKey_AllowFrameReordering = NO
  • kVTCompressionPropertyKey_ProfileLevel = kVTProfileLevel_H264_Main_AutoLevel
  • kVTCompressionPropertyKey_RealTime = YES
  • kVTCompressionPropertyKey_H264EntropyMode = kVTH264EntropyMode_CABAC
  • kVTCompressionPropertyKey_BaseLayerFrameRate = framerate / 2


And I adjust the average bitrate and datarate values throughout the session to try to compensate for the volatility (if the output is too high, I reduce them a bit; if too low, I increase them, with limits on how far they can move in either direction). I create the session, apply the above configuration as a single dictionary using VTSessionSetProperties, and feed frames into it like this:


VTCompressionSessionEncodeFrame(compressionSessionRef,
                                static_cast<CVImageBufferRef>(pixelBuffer),  // source frame
                                CMTimeMake(capturetime, 1000),               // presentation timestamp, ms timescale
                                kCMTimeInvalid,                              // no explicit frame duration
                                frameProperties,
                                frameDetailsStruct,                          // sourceFrameRefCon
                                &encodeInfoFlags);


So I'm supplying timing information as the API requires. Then I add up the size of the output for each frame over a fixed time window and divide by the window length to determine the outgoing bitrate and its error from the target. This is where I see the significant volatility.


I'm looking for any help in getting the bitrate under control, as I'm not sure what to do at this point. Thank you!

We see the same thing, although we are using CAVLC.

The bitrate appears to be overrunning because the session is putting out only I-frames.

We also tried kVTCompressionPropertyKey_AllowTemporalCompression = true, to no effect.
