I wrote code to capture video frames through AVCaptureSession.
The code looks like the following.
func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
        return
    }
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    // Make sure the buffer is unlocked on every exit path
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }
    guard let baseRawAddress = CVPixelBufferGetBaseAddress(pixelBuffer) else {
        return
    }
    // Reinterpret the raw base address as a pointer to bytes
    let byteBuffer = baseRawAddress.assumingMemoryBound(to: UInt8.self)
    // ... read pixel data through byteBuffer here ...
}
Is gamma correction already applied to the color information of the pixels retrieved here (byteBuffer in the Swift code above)?
If so, is it possible to retrieve the gamma value?
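For context, one approach I considered is inspecting the pixel buffer's color attachments. As I understand it, the transfer function (and, for certain formats, an explicit numeric gamma level) is advertised there rather than in the pixel data itself. A sketch, assuming the buffer carries the usual kCVImageBuffer* attachments (I have not confirmed which attachments AVCaptureSession actually sets on my device):

```swift
import CoreVideo
import Foundation

func logTransferFunction(for pixelBuffer: CVPixelBuffer) {
    // The transfer function attachment names the gamma/EOTF curve applied
    // to the samples, e.g. kCVImageBufferTransferFunction_ITU_R_709_2.
    if let tf = CVBufferGetAttachment(pixelBuffer,
                                      kCVImageBufferTransferFunctionKey,
                                      nil)?.takeUnretainedValue() {
        print("Transfer function: \(tf)")
    }
    // A numeric gamma value is only attached when the transfer function
    // is kCVImageBufferTransferFunction_UseGamma.
    if let gamma = CVBufferGetAttachment(pixelBuffer,
                                         kCVImageBufferGammaLevelKey,
                                         nil)?.takeUnretainedValue() as? NSNumber {
        print("Gamma level: \(gamma.doubleValue)")
    }
}
```

Calling this from captureOutput(_:didOutput:from:) with the same pixelBuffer would show what the capture pipeline reports, but I'm not sure whether this tells me definitively that the bytes themselves are already gamma-encoded.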