- iOS 13.0+
- Xcode 11.0+
Histogram specification is a technique that allows you to calculate the histogram of a reference image and apply it to an input image. This sample walks you through the steps to implement histogram specification in vImage:
- Creating vImage buffers that represent the reference image and input image.
- Calculating the histogram of the reference image.
- Specifying the histogram of the input image as the reference image's histogram.
The example below shows an input image (top left) and a histogram reference image (bottom left), with the result on the right:
Create the vImage Buffers
To learn about creating a Core Graphics image format that describes your input and reference images, see Creating a Core Graphics Image Format. In this example, `format` describes an 8-bit-per-channel ARGB image:
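A minimal sketch of creating such a format follows; the force-unwrap assumes the combination of parameters describes a format that vImage supports:

```swift
import Accelerate

// An 8-bit-per-channel, 32-bit-per-pixel format with the alpha
// channel first: ARGB8888.
let format = vImage_CGImageFormat(
    bitsPerComponent: 8,
    bitsPerPixel: 32,
    colorSpace: CGColorSpaceCreateDeviceRGB(),
    bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.first.rawValue))!
```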
The following code shows how to create a vImage buffer initialized with the input image (the flower in the above image).
The following code shows how to create a vImage buffer initialized with the histogram reference image (the rainbow in the above image).
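The two buffers can be created from `CGImage` instances with the throwing `vImage_Buffer(cgImage:format:)` initializer. In this sketch, the asset names `"flower"` and `"rainbow"` are placeholders, and `format` is the ARGB8888 format described above:

```swift
import Accelerate
import UIKit

// The asset names are placeholders; substitute your own images.
guard
    let inputCGImage = UIImage(named: "flower")?.cgImage,
    let referenceCGImage = UIImage(named: "rainbow")?.cgImage,
    var source = try? vImage_Buffer(cgImage: inputCGImage,
                                    format: format),
    var histogram = try? vImage_Buffer(cgImage: referenceCGImage,
                                       format: format)
else {
    fatalError("Unable to create vImage buffers.")
}

// Free the buffers' memory when you're finished with them.
defer {
    source.free()
    histogram.free()
}
```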
Calculate the Reference Image Histogram
The histogram data is stored in four arrays—one for each channel—where the value of each element is the number of pixels in the reference image with that color value. In an 8-bit-per-channel image, each color channel can hold 256 different values, so each array is defined with a count of 256.
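The four arrays might be declared as follows; the variable names are illustrative, and `vImagePixelCount` is the element type the histogram functions expect:

```swift
import Accelerate

// One 256-bin histogram per channel, in ARGB channel order.
var histogramBinZero = [vImagePixelCount](repeating: 0, count: 256)   // alpha
var histogramBinOne = [vImagePixelCount](repeating: 0, count: 256)    // red
var histogramBinTwo = [vImagePixelCount](repeating: 0, count: 256)    // green
var histogramBinThree = [vImagePixelCount](repeating: 0, count: 256)  // blue
```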
To populate the histogram arrays with the calculated histogram data, prepare an array of `UnsafeMutablePointer<vImagePixelCount>?` values from the arrays, and pass it to `vImageHistogramCalculation_ARGB8888(_:_:_:)`. When the function returns, the four arrays are populated with the histogram data from the image that `histogram` points to.
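One way to do this, assuming the `histogram` buffer and the four bin arrays declared earlier, is to nest `withUnsafeMutableBufferPointer` calls so the pointers remain valid for the duration of the function call:

```swift
histogramBinZero.withUnsafeMutableBufferPointer { zeroPtr in
    histogramBinOne.withUnsafeMutableBufferPointer { onePtr in
        histogramBinTwo.withUnsafeMutableBufferPointer { twoPtr in
            histogramBinThree.withUnsafeMutableBufferPointer { threePtr in
                // The function expects a pointer to four per-channel
                // bin arrays, in ARGB order.
                var histogramBins = [zeroPtr.baseAddress, onePtr.baseAddress,
                                     twoPtr.baseAddress, threePtr.baseAddress]

                histogramBins.withUnsafeMutableBufferPointer { histogramBinsPtr in
                    let error = vImageHistogramCalculation_ARGB8888(
                        &histogram,
                        histogramBinsPtr.baseAddress!,
                        vImage_Flags(kvImageNoFlags))

                    guard error == kvImageNoError else {
                        fatalError("Error calculating histogram: \(error)")
                    }
                }
            }
        }
    }
}
```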
Specify the Input Image Histogram
The `vImageHistogramSpecification_ARGB8888(_:_:_:_:)` function accepts a different parameter type to receive the histogram data: an array of `UnsafePointer<vImagePixelCount>?`. The following code prepares the four arrays for use in the specification function.
Because `vImageHistogramSpecification_ARGB8888(_:_:_:_:)` can work in place, you can pass the source buffer as both the source and destination:
On return, `source` contains the original input image with the histogram specified by the reference image.
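Putting the two preceding steps together, and assuming the `source` buffer and populated bin arrays from earlier, the specification might look like this; note the immutable `withUnsafeBufferPointer` this time, because the histogram is an input to the function:

```swift
histogramBinZero.withUnsafeBufferPointer { zeroPtr in
    histogramBinOne.withUnsafeBufferPointer { onePtr in
        histogramBinTwo.withUnsafeBufferPointer { twoPtr in
            histogramBinThree.withUnsafeBufferPointer { threePtr in
                // Immutable pointers: the bins describe the desired
                // histogram and aren't modified.
                var histogramBins = [zeroPtr.baseAddress, onePtr.baseAddress,
                                     twoPtr.baseAddress, threePtr.baseAddress]

                histogramBins.withUnsafeMutableBufferPointer { histogramBinsPtr in
                    // Passing `source` as both source and destination
                    // performs the specification in place.
                    let error = vImageHistogramSpecification_ARGB8888(
                        &source,
                        &source,
                        histogramBinsPtr.baseAddress!,
                        vImage_Flags(kvImageNoFlags))

                    guard error == kvImageNoError else {
                        fatalError("Error specifying histogram: \(error)")
                    }
                }
            }
        }
    }
}
```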