I am new to Swift and iOS programming. I am currently handling the camera's output buffer in the following delegate callback:
extension VideoCapture: AVCaptureVideoDataOutputSampleBufferDelegate {
    public func captureOutput(_ output: AVCaptureOutput,
                              didOutput sampleBuffer: CMSampleBuffer,
                              from connection: AVCaptureConnection) {
        guard let pixelBuffer = sampleBuffer.imageBuffer else { return }
        var image: CGImage?
        // Here I want to create `image` with the same dimensions as
        // pixelBuffer, filled with a uniform colour
    }
}
What I want is to create a CGImage canvas with the same dimensions as the input buffer, filled with a solid colour (black, for example), in an efficient way. What would be the best approach?
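For context, this is the kind of thing I have been experimenting with: drawing into a CGContext sized from the pixel buffer and filling it before snapshotting a CGImage. The helper name and defaults are my own, and I am not sure this is the most efficient route:

```swift
import CoreGraphics
import CoreVideo

// Hypothetical helper: builds a CGImage matching the buffer's dimensions,
// filled with one solid colour (opaque black by default).
func makeSolidImage(matching pixelBuffer: CVPixelBuffer,
                    color: CGColor = CGColor(gray: 0, alpha: 1)) -> CGImage? {
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    // bytesPerRow: 0 lets CoreGraphics pick a suitable row stride.
    guard let context = CGContext(data: nil,
                                  width: width,
                                  height: height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: 0,
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
    else { return nil }
    context.setFillColor(color)
    context.fill(CGRect(x: 0, y: 0, width: width, height: height))
    return context.makeImage()
}
```

This does allocate a fresh bitmap on every frame, which is why I suspect there is a cheaper way for a per-frame callback.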
Additionally, would it also be possible to overwrite the input sampleBuffer itself with all black?
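For that second part, the only idea I have so far is to lock the pixel buffer and zero its bytes in place. This sketch assumes a single-plane BGRA buffer (for biplanar YCbCr formats, all-zero chroma is not neutral, so zeroing would not give black), and I am unsure whether mutating the buffer handed to the delegate is even safe:

```swift
import CoreVideo
import Foundation

// Experimental: zero out the pixel data in place.
// Assumes a non-planar (e.g. BGRA) buffer, where all-zero bytes are black.
func fillBlack(_ pixelBuffer: CVPixelBuffer) {
    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }
    guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return }
    let size = CVPixelBufferGetBytesPerRow(pixelBuffer)
               * CVPixelBufferGetHeight(pixelBuffer)
    memset(base, 0, size)
}
```

Is this approach reasonable, or is there a better-supported way?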