Use Metal to convert an HDR pixel buffer to an SDR pixel buffer

I have seen demos that convert HDR video to SDR pixel buffers using AVAssetReader, AVVideoComposition, AVComposition, and other AVFoundation APIs. But in some cases I want to render the HDR pixel buffers myself and record the video.

AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetHigh;
AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if ([videoDevice isVideoHDRSupported]) {
    NSError *error = nil;
    if ([videoDevice lockForConfiguration:&error]) {
        videoDevice.automaticallyAdjustsVideoHDREnabled = NO;
        videoDevice.videoHDREnabled = YES; // Enable HDR
        [videoDevice unlockForConfiguration];
    } else {
        NSLog(@"Error: %@", error.localizedDescription);
    }
}

Real-time processing of HDR data means processing the video frame data itself (for example, applying filters) while ensuring that the whole processing chain preserves 10-bit color depth and HDR metadata. I also want to use the image buffers for object tracking, etc. How can I solve this problem?
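For reference, here is a rough sketch of one conversion path I have experimented with. It uses VTPixelTransferSession from VideoToolbox rather than a hand-written Metal kernel; the 32BGRA destination format and the BT.709 attachments are my assumptions, and error handling is abbreviated:

```objectivec
#import <VideoToolbox/VideoToolbox.h>

// Sketch: convert a 10-bit HDR CVPixelBuffer to an 8-bit SDR buffer.
CVPixelBufferRef ConvertToSDR(CVPixelBufferRef hdrBuffer) {
    VTPixelTransferSessionRef session = NULL;
    VTPixelTransferSessionCreate(kCFAllocatorDefault, &session);

    size_t width = CVPixelBufferGetWidth(hdrBuffer);
    size_t height = CVPixelBufferGetHeight(hdrBuffer);

    CVPixelBufferRef sdrBuffer = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                        kCVPixelFormatType_32BGRA, NULL, &sdrBuffer);

    // Tag the destination as BT.709 (SDR) so the transfer session
    // performs the color conversion for us.
    CVBufferSetAttachment(sdrBuffer, kCVImageBufferColorPrimariesKey,
                          kCVImageBufferColorPrimaries_ITU_R_709_2,
                          kCVAttachmentMode_ShouldPropagate);
    CVBufferSetAttachment(sdrBuffer, kCVImageBufferTransferFunctionKey,
                          kCVImageBufferTransferFunction_ITU_R_709_2,
                          kCVAttachmentMode_ShouldPropagate);
    CVBufferSetAttachment(sdrBuffer, kCVImageBufferYCbCrMatrixKey,
                          kCVImageBufferYCbCrMatrix_ITU_R_709_2,
                          kCVAttachmentMode_ShouldPropagate);

    VTPixelTransferSessionTransferImage(session, hdrBuffer, sdrBuffer);
    VTPixelTransferSessionInvalidate(session);
    CFRelease(session);
    return sdrBuffer; // caller is responsible for releasing
}
```

This handles the pixel-format and color-space conversion, but I am not sure it is the right approach when I also need to keep the HDR path intact for recording.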

Hi manqinlin,

There is a difference between EDR ("enhanced" dynamic range) and HDR. The "videoHDREnabled" API actually refers to the former. It is 8-bit, but uses extended dynamic range compared to SDR. To get true HDR, you need to iterate through the AVCaptureDevice's formats and look for one at your desired resolution and frame rate that has the pixel format "x420" or "x422". These are both 10-bit video pixel formats.
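As a sketch of that iteration (the 1920x1080 target is a placeholder, and frame-rate filtering is omitted; "x420" corresponds to kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange):

```objectivec
AVCaptureDevice *device =
    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
for (AVCaptureDeviceFormat *format in device.formats) {
    FourCharCode subType =
        CMFormatDescriptionGetMediaSubType(format.formatDescription);
    CMVideoDimensions dims =
        CMVideoFormatDescriptionGetDimensions(format.formatDescription);
    // Look for a 10-bit 'x420' format at the desired resolution.
    if (subType == kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange &&
        dims.width == 1920 && dims.height == 1080) {
        NSError *error = nil;
        if ([device lockForConfiguration:&error]) {
            device.activeFormat = format;
            [device unlockForConfiguration];
        }
        break;
    }
}
```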

You can record either of these formats in HEVC or in Apple ProRes.
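For example, an AVAssetWriter input configured for HEVC might look like this (the output URL, dimensions, and error variable are placeholders):

```objectivec
NSError *error = nil;
AVAssetWriter *writer =
    [AVAssetWriter assetWriterWithURL:outputURL
                             fileType:AVFileTypeQuickTimeMovie
                                error:&error];
// Request HEVC; the encoder can carry 10-bit content.
NSDictionary *settings = @{
    AVVideoCodecKey: AVVideoCodecTypeHEVC,
    AVVideoWidthKey: @1920,
    AVVideoHeightKey: @1080
};
AVAssetWriterInput *input =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:settings];
input.expectsMediaDataInRealTime = YES;
[writer addInput:input];
```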

If we've misunderstood your question, please let us know.
