Create a displayable ARGB image from the luminance and chrominance information supplied by your device’s camera.
- iOS 11.2+
- Xcode 10.0+
As an alternative to the technique discussed in Applying vImage Operations to Video Sample Buffers, vImage provides lower-level functions for creating RGB images from the separate luminance and chrominance planes provided by an AVCaptureSession instance. These functions offer better performance and more granular configuration than the higher-level technique.
This sample walks you through the steps for converting separate luminance and chrominance planes to an ARGB image:
1. Configuring the YpCbCr-to-ARGB conversion information.
2. Defining the vImage destination buffer.
3. Initializing the source buffers.
4. Initializing the destination buffer.
5. Converting the Yp and CbCr planes to ARGB.
6. Displaying the result.
Configure YpCbCr-to-ARGB Information
The vImageConvert_YpCbCrToARGB_GenerateConversion(_:_:_:_:_:_:) function generates the information required to convert the luminance and chrominance planes to a single ARGB image. You supply it with a matrix containing the coefficients for the conversion and, in this example, kvImage_YpCbCrToARGBMatrix_ITU_R_601_4 contains the conversion matrix from ITU-R Recommendation BT.601-4.
The following code shows how to populate
info with the required conversion information for 8-bit pixels, clamped to a video range. An 8-bit video range format typically uses the range
[16,235] for luminance and
[16,240] for chrominance.
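A minimal sketch of that setup follows. The variable names info and pixelRange are illustrative, and the pixel-range values implement the video-range clamping described above:

```swift
import Accelerate

// Video-range pixel values for 8-bit YpCbCr:
// luminance (Yp) is clamped to [16, 235], chrominance (CbCr) to [16, 240].
var pixelRange = vImage_YpCbCrPixelRange(Yp_bias: 16,
                                         CbCr_bias: 128,
                                         YpRangeMax: 235,
                                         CbCrRangeMax: 240,
                                         YpMax: 235,
                                         YpMin: 16,
                                         CbCrMax: 240,
                                         CbCrMin: 16)

var info = vImage_YpCbCrToARGB()

// Populate `info` from the ITU-R BT.601-4 matrix, for a bi-planar
// 4:2:0 8-bit source and an ARGB8888 destination.
let error = vImageConvert_YpCbCrToARGB_GenerateConversion(
    kvImage_YpCbCrToARGBMatrix_ITU_R_601_4!,
    &pixelRange,
    &info,
    kvImage420Yp8_CbCr8,
    kvImageARGB8888,
    vImage_Flags(kvImageNoFlags))
```

Because the conversion information doesn't change between frames, you can generate it once, for example in an initializer, and reuse it for every frame.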
Define the vImage Destination Buffer
To avoid reinitializing the destination RGB buffer for every video frame, declare it as a class member rather than declaring it inside the captureOutput(_:didOutput:from:) method of the sample buffer delegate:
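For example, assuming the delegate is a class (the name CaptureDelegate is illustrative), the declaration might look like this:

```swift
import Accelerate

class CaptureDelegate: NSObject {
    // Reused across frames; its memory is allocated lazily on the
    // first frame, so `data` stays nil until then.
    var destination = vImage_Buffer()

    deinit {
        // vImageBuffer_Init allocates the buffer with malloc; release it
        // when the owning object goes away.
        free(destination.data)
    }
}
```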
Initialize the Source Buffers from Pixel Buffer Planes
The source luminance and chrominance vImage buffers are initialized directly from the two planes of the pixel buffer. The first plane contains luminance information and the second plane contains chrominance information.
Use the following code to query pixel buffer properties for each plane and initialize your source buffers:
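A sketch of that step, assuming pixelBuffer is the CVPixelBuffer extracted from the delivered sample buffer and uses a bi-planar 4:2:0 YpCbCr pixel format:

```swift
import Accelerate
import CoreVideo

// Lock the pixel buffer while vImage reads directly from its planes;
// unlock it after the conversion is complete.
CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)

// Plane 0: full-resolution luminance (Yp).
var sourceLumaBuffer = vImage_Buffer(
    data: CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0),
    height: vImagePixelCount(CVPixelBufferGetHeightOfPlane(pixelBuffer, 0)),
    width: vImagePixelCount(CVPixelBufferGetWidthOfPlane(pixelBuffer, 0)),
    rowBytes: CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0))

// Plane 1: interleaved chrominance (CbCr) at half resolution.
var sourceChromaBuffer = vImage_Buffer(
    data: CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1),
    height: vImagePixelCount(CVPixelBufferGetHeightOfPlane(pixelBuffer, 1)),
    width: vImagePixelCount(CVPixelBufferGetWidthOfPlane(pixelBuffer, 1)),
    rowBytes: CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1))
```

Because the vImage buffers point directly into the pixel buffer's planes, no pixel data is copied in this step.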
Initialize the Destination Buffer
Check the data property of the destination buffer you declared earlier to find out whether it needs to be initialized. The destination buffer will contain the ARGB image after conversion. The following code initializes destination on the first pass and sets its size to match the luminance plane of the pixel buffer:
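Assuming destination is the class-level vImage_Buffer and sourceLumaBuffer is the luminance buffer from the previous step, the first-pass initialization might look like this:

```swift
import Accelerate

// Initialize the destination only once: its data pointer is nil until
// vImageBuffer_Init allocates it, then it's reused for every frame.
if destination.data == nil {
    let error = vImageBuffer_Init(&destination,
                                  sourceLumaBuffer.height,
                                  sourceLumaBuffer.width,
                                  32,   // bits per pixel for ARGB8888
                                  vImage_Flags(kvImageNoFlags))
    precondition(error == kvImageNoError)
}
```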
Convert Yp and CbCr Planes to ARGB
With the conversion information, destination buffer, and source buffers initialized, you're ready to convert the two source buffers to the destination buffer by using the vImageConvert_420Yp8_CbCr8ToARGB8888(_:_:_:_:_:_:_:) function.
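Using the names from the previous steps, the call might look like the following sketch:

```swift
import Accelerate
import CoreVideo

// Convert the planar Yp and interleaved CbCr buffers to interleaved ARGB.
// A nil permute map keeps the ARGB channel order; 255 is written into
// the alpha channel of every destination pixel.
let error = vImageConvert_420Yp8_CbCr8ToARGB8888(&sourceLumaBuffer,
                                                 &sourceChromaBuffer,
                                                 &destination,
                                                 &info,
                                                 nil,
                                                 255,
                                                 vImage_Flags(kvImageNoFlags))

// The source planes are no longer needed; release the lock taken earlier.
CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly)
```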
Display the Result
To display the ARGB image to the user, create a Core Graphics image from destination, and initialize a UIImage instance from that. The vImageCreateCGImageFromBuffer(_:_:_:_:_:_:) function returns an unmanaged CGImage instance based on the supplied buffer and the same format you used earlier. Because captureOutput(_:didOutput:from:) runs on a background thread, you must dispatch the call that updates the image view to the main thread.
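A sketch of that final step; the cgImageFormat and imageView names are illustrative, and the format is assumed to describe 8-bit-per-channel ARGB to match the destination buffer:

```swift
import Accelerate
import UIKit

// Describe the destination buffer's layout: 8 bits per channel, 32 bits
// per pixel, alpha first and ignored (the "A" in ARGB).
var cgImageFormat = vImage_CGImageFormat(
    bitsPerComponent: 8,
    bitsPerPixel: 32,
    colorSpace: nil,
    bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.noneSkipFirst.rawValue),
    version: 0,
    decode: nil,
    renderingIntent: .defaultIntent)

var error = kvImageNoError
let cgImage = vImageCreateCGImageFromBuffer(&destination,
                                            &cgImageFormat,
                                            nil,   // no cleanup callback
                                            nil,
                                            vImage_Flags(kvImageNoFlags),
                                            &error)

if let cgImage = cgImage, error == kvImageNoError {
    // The delegate callback runs on a background queue; UIKit views
    // must only be touched on the main queue.
    DispatchQueue.main.async {
        self.imageView.image = UIImage(cgImage: cgImage.takeRetainedValue())
    }
}
```

Taking the retained value balances the +1 retain on the unmanaged CGImage that vImageCreateCGImageFromBuffer returns.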