I'm writing an app to help with astrophotography, and I need to apply a contrast stretch to my images: they come from a specialized monochrome astrophotography camera, and most of the data is very dark.
Most astrophotography software (astropy, PixInsight) has something called an autostretch, which is a form of contrast stretching. I would like to do the same thing in my iOS app, using the tools available to me in SwiftUI, UIImage, CIImage, and CGImage.
I've gotten to the point where, inside withUnsafeMutableBufferPointer, I have a buffer containing the image data as 16-bit unsigned integers (the format the camera gives me). I then create a vImage_Buffer using:
var buffer = vImage_Buffer(data: outPtr.baseAddress,
                           height: vImagePixelCount(imageHeight),
                           width: vImagePixelCount(imageWidth),
                           rowBytes: MemoryLayout<UInt16>.size * imageWidth)
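Since the contrast-stretch function I mention below appears to operate on 32-bit floats, here's my untested sketch of how I'd convert the 16-bit buffer into a planar float buffer first (the 1/65535 scale is my own assumption for mapping the full 16-bit range onto 0.0...1.0):

// Untested sketch: convert the camera's 16-bit values into 32-bit floats.
let floatData = UnsafeMutablePointer<Float>.allocate(capacity: imageWidth * imageHeight)
var floatBuffer = vImage_Buffer(data: floatData,
                                height: vImagePixelCount(imageHeight),
                                width: vImagePixelCount(imageWidth),
                                rowBytes: MemoryLayout<Float>.size * imageWidth)
// vImageConvert_16UToF computes: float = value * scale + offset.
let convErr = vImageConvert_16UToF(&buffer, &floatBuffer, 0, 1.0 / 65535.0, vImage_Flags(kvImageNoFlags))
assert(convErr == kvImageNoError)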
Now I would like to apply either a histogram equalization or a contrast stretch to the buffer. What do I need to do? Do I need to create a vImage_CGImageFormat, like this?
let cgImageFormat = vImage_CGImageFormat(bitsPerComponent: 16,
                                         bitsPerPixel: 16,
                                         colorSpace: CGColorSpaceCreateDeviceGray(),
                                         bitmapInfo: bitmapInfo)!
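Here, bitmapInfo is my guess at what 16-bit, single-channel grayscale with no alpha should look like (I'm not certain about the byte-order flag):

// My guess for 16-bit grayscale: no alpha, 16-bit little-endian samples.
let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.none.rawValue | CGBitmapInfo.byteOrder16Little.rawValue)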
Which function should I use to do the equalization or contrast stretch? There appears to be a vImageContrastStretch_PlanarF() function, but I'm not sure my input data will be in the proper format (is a monochrome CGImage 32-bit planar F?), and I certainly don't know how to set up the histogram_entries parameter for that function. It seems like the function could just scan the image itself, build the histogram, and then stretch it, right?
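In case it helps show where I'm stuck, here's my best guess at the call — 256 histogram entries is a stab in the dark, and I'm assuming the float pixels live in 0.0...1.0:

// Using the floatBuffer from the conversion sketch above.
let stretchErr = vImageContrastStretch_PlanarF(&floatBuffer,
                                               &floatBuffer,   // in place? or does dest need its own storage?
                                               256,            // histogram_entries -- how do I choose this?
                                               0.0,            // minimum
                                               1.0,            // maximum
                                               vImage_Flags(kvImageNoFlags))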
So a code example would help a lot!
Thanks in advance, Robert