CAMetalLayer renders HDR images with a color shift

I can't get Core Image to render an HDR image file with correct colors to a CAMetalLayer on macOS 14. I'm comparing the result with NSImageView and with the SupportingHDRImagesInYourApp 'HDRDemo23' sample code, both of which use CVPixelBuffer. With CAMetalLayer, the images are displayed as HDR (there are definitely more highlights), but they're darker, with some kind of saturation increase and color shift.

Files I've tested include the sample ISO HDR files in the SupportingHDRImagesInYourApp sample code. The methods I've tried for rendering to the CAMetalLayer include:

  • Modifying the GeneratingAnAnimationWithACoreImageRenderDestination sample code's ContentView so it uses HDRDemo23/example-ISO-HDR-images/image_01.heic, loaded using CIImage(contentsOf:)
  • Creating a test AppKit app that uses MTKView and CIRenderDestination the same way. I have an NSImageView and the MTKView in the same window for comparison.
  • Using CIRAWFilter > CIRenderDestination > IOSurface > MTKView/CAMetalLayer

All these methods produce an image with the exact same appearance: a dark HDR image with some saturation/color shift.
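
For reference, the draw path shared by all three attempts boils down to something like the following (a condensed sketch; the identifiers here are mine, not the sample code's own names):

```swift
import CoreImage
import MetalKit

// Condensed sketch of the MTKView/CIRenderDestination draw path described above.
final class CIDrawDelegate: NSObject, MTKViewDelegate {
    let ciContext: CIContext
    let queue: MTLCommandQueue
    let image: CIImage

    init(device: MTLDevice, imageURL: URL) {
        ciContext = CIContext(mtlDevice: device)
        queue = device.makeCommandQueue()!
        image = CIImage(contentsOf: imageURL)!   // e.g. image_01.heic
        super.init()
    }

    func draw(in view: MTKView) {
        guard let drawable = view.currentDrawable,
              let commandBuffer = queue.makeCommandBuffer() else { return }
        // Render the CIImage directly into the drawable's texture.
        let destination = CIRenderDestination(mtlTexture: drawable.texture,
                                              commandBuffer: commandBuffer)
        try? ciContext.startTask(toRender: image, to: destination)
        commandBuffer.present(drawable)
        commandBuffer.commit()
    }

    func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {}
}
```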

The only clue I've found is that when I use the Metal Debugger on the test AppKit app, the CAMetalLayer's 'Present' step shows an 'input' thumbnail that is HDR without the color shift, while the 'output' thumbnail looks like what I actually see. I've tried changing the color profile on the layer to various values, but nothing looked more correct.

I've tried this on two Macs: an M1 Mac Studio with an LG display and an M2 MacBook Air. The MacBook Air shows the same color shift, but since it has less dynamic range overall, there isn't as much difference between NSImageView and MTKView.

Did you set wantsExtendedDynamicRangeContent on the CAMetalLayer? What happens when you set the layer's colorspace to an HDR color space? Also, make sure that you set the CIRenderDestination's colorSpace to the same space as the layer's.
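
In code, that suggestion looks roughly like this (a minimal sketch; the PQ color space and .rgba16Float pixel format are just one plausible combination, not something from the original post):

```swift
import CoreImage
import Metal
import QuartzCore

// Configure the layer for EDR content.
func configureLayerForHDR(_ layer: CAMetalLayer) {
    layer.wantsExtendedDynamicRangeContent = true
    layer.pixelFormat = .rgba16Float   // leaves headroom for values above 1.0
    layer.colorspace = CGColorSpace(name: CGColorSpace.itur_2100_PQ)
}

// When rendering, give the CIRenderDestination the same color space as the layer.
func makeDestination(for drawable: CAMetalDrawable,
                     layer: CAMetalLayer,
                     commandBuffer: MTLCommandBuffer) -> CIRenderDestination {
    let destination = CIRenderDestination(mtlTexture: drawable.texture,
                                          commandBuffer: commandBuffer)
    destination.colorSpace = layer.colorspace
    return destination
}
```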

I found that Metal/Core Image doesn't process the PQ transfer function when rendering, which is why it looks dark. Core Animation handles it when it renders a CVPixelBuffer tagged with that transfer function, which is why it looks correct there.

I was able to get it to render properly by using CIRenderDestination to render into an IOSurface with the ITU Rec. 2100 PQ color space, and then using MPSImageConversion between the surface and the layer's drawable. The converter's source color space is set to PQ (matching the surface) and its destination space to the CAMetalLayer's.

There might be a way to do this without an intermediate IOSurface/CVPixelBuffer, but I'm already using a surface anyway.
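
A rough sketch of the approach (the helper's name, the fixed pixel format, and the guard structure are my own choices; it assumes the layer was configured with framebufferOnly = false, pixelFormat = .rgba16Float, and wantsExtendedDynamicRangeContent = true):

```swift
import CoreImage
import CoreVideo
import IOSurface
import Metal
import MetalPerformanceShaders
import QuartzCore

// Render a CIImage into a PQ-tagged IOSurface, then let MPSImageConversion
// apply the PQ -> layer color space conversion while copying into the drawable.
func present(_ image: CIImage,
             on layer: CAMetalLayer,
             ciContext: CIContext,
             device: MTLDevice,
             queue: MTLCommandQueue) {
    let pq = CGColorSpace(name: CGColorSpace.itur_2100_PQ)!
    let width = Int(image.extent.width)
    let height = Int(image.extent.height)

    // A 64-bit half-float RGBA IOSurface to hold the PQ-encoded render.
    let properties: [String: Any] = [
        kIOSurfaceWidth as String: width,
        kIOSurfaceHeight as String: height,
        kIOSurfaceBytesPerElement as String: 8,
        kIOSurfacePixelFormat as String: kCVPixelFormatType_64RGBAHalf,
    ]
    guard let surface = IOSurfaceCreate(properties as CFDictionary) else { return }

    // Render into the surface in PQ, and wait so MPS doesn't read it too early.
    let destination = CIRenderDestination(ioSurface: surface)
    destination.colorSpace = pq
    try? ciContext.startTask(toRender: image, to: destination).waitUntilCompleted()

    // Wrap the surface in a texture so MPS can sample it.
    let desc = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .rgba16Float,
                                                        width: width,
                                                        height: height,
                                                        mipmapped: false)
    desc.usage = .shaderRead
    guard let source = device.makeTexture(descriptor: desc, iosurface: surface, plane: 0),
          let drawable = layer.nextDrawable(),
          let layerSpace = layer.colorspace,
          let info = CGColorConversionInfo(src: pq, dst: layerSpace),
          let commandBuffer = queue.makeCommandBuffer() else { return }

    // Source space matches the surface (PQ); destination matches the layer.
    let conversion = MPSImageConversion(device: device,
                                        srcAlpha: .alphaIsOne,
                                        destAlpha: .alphaIsOne,
                                        backgroundColor: nil,
                                        conversionInfo: info)
    conversion.encode(commandBuffer: commandBuffer,
                      sourceTexture: source,
                      destinationTexture: drawable.texture)
    commandBuffer.present(drawable)
    commandBuffer.commit()
}
```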

I have the same issue.

Any chance you could post some sample working code, either here or in a Dropbox folder?

Thanks!

Regarding the "SupportingHDRImagesInYourApp" sample app from Apple:

If I load a RAW image file, it initially looks correct in the UIImageView, but it doesn't seem to have enough dynamic range in Edit Mode when the adjustment sliders are displayed (the colors seem slightly darker).
