I'm really excited to finally have access to raw camera data. However, there seems to be something odd with the linear space filter. I tried running my conversion with a CILinearGradient filter (which, as I understand it, should replace my entire image with a linear gradient), but I just get my processed photo out. Am I doing something wrong? The documentation at this point isn't exactly ample yet, so maybe I'm missing something:
func capture(_ captureOutput: AVCapturePhotoOutput,
             didFinishProcessingRawPhotoSampleBuffer rawSampleBuffer: CMSampleBuffer?,
             previewPhotoSampleBuffer: CMSampleBuffer?,
             resolvedSettings: AVCaptureResolvedPhotoSettings,
             bracketSettings: AVCaptureBracketedStillImageSettings?,
             error: Error?) {
    print("Received photo!")

    // Package the RAW sample buffer as DNG data for the Core Image RAW converter.
    guard let rawSampleBuffer = rawSampleBuffer,
          let data = AVCapturePhotoOutput.dngPhotoDataRepresentation(forRawSampleBuffer: rawSampleBuffer,
                                                                     previewPhotoSampleBuffer: nil) else {
        print("Could not get DNG data from the sample buffer")
        return
    }

    // This filter should be applied in linear space, before gamma correction.
    let processor = CIFilter(name: "CILinearGradient")!
    guard let rawConverter = CIFilter(imageData: data,
                                      options: [kCGImageSourceTypeIdentifierHint as String: "com.adobe.raw-image",
                                                kCIInputLinearSpaceFilter: processor]),
          let rawConverterOutput = rawConverter.outputImage,
          let size = rawConverter.value(forKey: kCIOutputNativeSizeKey) as? CIVector else {
        print("RAW conversion failed")
        return
    }
    print("Finished image dimensions:", size.x, size.y)

    let ctx = CIContext()
    let finished = ctx.createCGImage(rawConverterOutput,
                                     from: CGRect(x: 0, y: 0, width: size.x, height: size.y))
    ImageTakenView.image = UIImage(cgImage: finished!)
}
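One thing I want to rule out: CILinearGradient is a generator (it has no `inputImage` key), so maybe the RAW converter just ignores it and falls back to the unfiltered output. A minimal sketch of what I mean, untested, with CISepiaTone chosen arbitrarily as a filter that does consume `inputImage` (`dngData` here is a stand-in for the DNG data from the delegate callback above):

```swift
import CoreImage

// Sketch: use a linear-space filter that consumes inputImage, unlike the
// CILinearGradient generator. CISepiaTone is just an arbitrary example.
let linearFilter = CIFilter(name: "CISepiaTone")!
linearFilter.setValue(0.9, forKey: kCIInputIntensityKey)

// Hypothetical helper; `dngData` stands in for the DNG data produced by
// AVCapturePhotoOutput.dngPhotoDataRepresentation(...) in the delegate.
func makeRawConverter(dngData: Data) -> CIFilter? {
    return CIFilter(imageData: dngData,
                    options: [kCIInputLinearSpaceFilter: linearFilter])
}
```

If a filter like this shows up in the output while CILinearGradient doesn't, that would at least tell me the linear-space hook itself is working.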