How Can I Apply A vImage ContrastStretch To A Grayscale Image

I'm writing an app to help with astrophotography, and I need to perform a contrast stretch on the image, because it was taken with a specialized astrophotography camera in monochrome and most of the data is very dark.

Most astrophotography software (astropy, Pixinsight) has something called an autostretch, which is a form of contrast stretching. I would like to do the same thing in my iOS app, using the tools available to me in SwiftUI, UIImage, CIImage, and CGImage.

I've gotten to the point where I have a buffer (obtained via .withUnsafeMutableBufferPointer) that contains the image data as 16-bit unsigned integers (the format given to me by the camera). I then create a vImage_Buffer using:

var buffer = vImage_Buffer(
    data: outPtr.baseAddress,
    height: vImagePixelCount(imageHeight),
    width: vImagePixelCount(imageWidth),
    rowBytes: MemoryLayout<UInt16>.size * imageWidth)

... and now I would like to apply either an equalizeHistogram() or a contrastStretch() to the buffer. What do I need to do? Do I need to create a CGImageFormat, like this?

let cgiImageFormat = vImage_CGImageFormat(bitsPerComponent: 16, bitsPerPixel: 16, colorSpace: CGColorSpaceCreateDeviceGray(), bitmapInfo: bitmapInfo)!

Which function should I use to do the equalization or contrast stretch? There appears to be a vImageContrastStretch_PlanarF() function, but I'm not sure the input data will be in the proper format (is a monochrome CGImage 32-bit planar F?), and I certainly don't know how to set up the histogram_entries parameter for that function. It seems like the function could just scan the image itself, form the histogram, and then stretch it, right?

So a code example would help a lot!

Thanks in advance, Robert

Answered by Engineer in 824655022

Hi, you are correct, vImage provides 8-bit unsigned integer and 32-bit floating point histogram functions. I would suggest you use vImageConvert_16UToF to convert your images to 32-bit float and perform the transformations with those.
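As a rough sketch of that conversion step (buffer names, dimensions, and the error handling here are hypothetical, not from your code):

```swift
import Accelerate

// Hypothetical dimensions; substitute your camera's actual image size.
let imageWidth = 1024
let imageHeight = 1024

// Allocate a source buffer for the 16-bit camera data and a destination
// buffer for the 32-bit float version. In a real app, src16 would be
// filled from your camera's pixel data before converting.
var src16 = try! vImage_Buffer(width: imageWidth, height: imageHeight, bitsPerPixel: 16)
var dstF = try! vImage_Buffer(width: imageWidth, height: imageHeight, bitsPerPixel: 32)
defer {
    src16.free()
    dstF.free()
}

// Map 0...65535 into 0...1 (dest = src * scale + offset), so the
// float image is normalized for the later histogram operations.
let error = vImageConvert_16UToF(&src16, &dstF,
                                 0,              // offset
                                 1.0 / 65535.0,  // scale
                                 vImage_Flags(kvImageNoFlags))
guard error == kvImageNoError else {
    fatalError("vImageConvert_16UToF failed: \(error)")
}
```

The scale of 1/65535 is just one reasonable choice; any consistent range works as long as you pass the same min/max to the histogram functions later.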

You can create a CGImage from the 32-bit planar buffer using:

var imageFormat = vImage_CGImageFormat(
    bitsPerComponent: 32,
    bitsPerPixel: 32,
    colorSpace: CGColorSpaceCreateDeviceGray(),
    bitmapInfo: CGBitmapInfo(rawValue:
                             CGBitmapInfo.byteOrder32Little.rawValue |
                             CGBitmapInfo.floatComponents.rawValue |
                             CGImageAlphaInfo.none.rawValue))

If you haven't seen it, we have an article on the vImage histogram functions, Enhancing image contrast with histogram manipulation.
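Tying those two pieces together, a minimal sketch (assuming `dstF` is the 32-bit planar float buffer from the conversion step; the force-unwraps are for brevity):

```swift
import Accelerate

// The grayscale, 32-bit float image format described above.
let imageFormat = vImage_CGImageFormat(
    bitsPerComponent: 32,
    bitsPerPixel: 32,
    colorSpace: CGColorSpaceCreateDeviceGray(),
    bitmapInfo: CGBitmapInfo(rawValue:
                             CGBitmapInfo.byteOrder32Little.rawValue |
                             CGBitmapInfo.floatComponents.rawValue |
                             CGImageAlphaInfo.none.rawValue))!

// createCGImage(format:) copies the buffer's pixels into a new CGImage,
// which you can then hand to UIImage / SwiftUI for display.
let cgImage = try! dstF.createCGImage(format: imageFormat)
```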

I don't know anything about Accelerate, other than it looks really difficult and isn't all that accelerated.

The documentation for vImage_Buffer provides several links to long-format descriptions that are more tutorials than examples.

Look at:

WWDC 2018 session 701 Using Accelerate and simd

WWDC 2019 session 718 Introducing Accelerate for Swift

WWDC 2014 session 703 What's new in the Accelerate Framework

WWDC 2013 session 713 The Accelerate Framework

There are some examples that use these functions in:

WWDC 2020 Creating a game with scene understanding

WWDC 2018 Halftone descreening with 2D fast Fourier transform

WWDC 2018 Realtime video effects with vImage

WWDC 2013 Running with a snap

WWDC 2013 UIImageEffects

(sorry, can't find links for these last ones)

I'm currently working on a very similar app. I'm doing all this directly in Metal. For me, it seems more straightforward and more efficient. Be careful using high-level APIs like CGImage, UIImage, and CIImage. It's fine for small images and getting things up and running. But for this domain in particular, these APIs don't scale with the size of the images. Tiling is hard.


Well, I think I will just create a 32-bit floating point buffer in the first place. I was hoping I was just missing a function definition somewhere (like vImageConvert_16UToF).

By the way, I had a second question buried in my post (sorry about that). What does the parameter "histogram_entries" represent in the vImageContrastStretch_PlanarF() method? It says it is the "number of histogram entries". Do I just pick a number? Or does it represent the number of unique pixel values in my data?

Thanks, Robert

I will take a look at Metal. I'm not really doing much image processing in the app. I just want to show a stretched version of the last image taken by my camera. Most of the app deals with controlling all the equipment at the observatory and setting up the automated imaging runs.

I don't recommend using the comment feature. I almost missed your reply.

If your images will fit into RAM, maybe look at CIImage. It's kind of the best of both worlds. You can more easily get data into and out of it. You can define your own Metal-based stretch filter really easily. And there are a ton of built-in filters, including histograms, that might already do what you want.
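For a feel of that API, here's a hedged sketch using a built-in filter (CIColorControls is a simple linear adjustment, not a true autostretch; `cgImage` is assumed to be the grayscale CGImage produced from your vImage buffer):

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

// Wrap the existing CGImage in a CIImage.
let input = CIImage(cgImage: cgImage)

// Apply a built-in filter. A custom Metal (CIKernel-based) stretch
// filter would plug into the same pipeline.
let filter = CIFilter.colorControls()
filter.inputImage = input
filter.contrast = 2.0

if let output = filter.outputImage {
    // Render back to a CGImage for display.
    let context = CIContext()
    let stretched = context.createCGImage(output, from: output.extent)
}
```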

The "number of histogram entries" is the number of histogram bins. For 8-bit, vImage uses 256 bins (one for every possible value). For 32-bit, vImage allows you to specify this value. You'll get better quality (for example, less banding) with higher values, at the expense of longer processing times.
