Images with unusual color spaces not correctly loaded by Core Image

Some users reported that their images are not loading correctly in our app. After a lot of debugging, we identified the following:

  • This only happens when the app is built for Mac Catalyst. Not on iOS, iPadOS, or “real” macOS (AppKit).
  • The images in question have unusual color spaces. We observed the issue for uRGB and eciRGB v2.
  • Those images are rendered correctly in Photos and Preview on all platforms.
  • When displayed in a UIImageView or a SwiftUI Image, the images render correctly.
  • The issue only occurs when loading the image via Core Image.
  • The Core Image render graphs of the AppKit (working) and Catalyst (faulty) builds look identical, yet the results differ.

Mac (AppKit): [render graph screenshot]

Catalyst: [render graph screenshot]

Something seems to be off when Core Image tries to load an image with a foreign color space in Catalyst.
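For reference, this is roughly the loading path that exhibits the issue (a minimal sketch; the exact loading code in our app differs, and `url` stands in for the affected image file):

```swift
import CoreImage

// Hypothetical minimal repro: load an eciRGB v2 image via Core Image
// and render it back out. On Catalyst the rendered colors are wrong;
// on AppKit the same code produces the expected result.
let url = URL(fileURLWithPath: "/path/to/eciRGBv2-image.jpg")
let context = CIContext()
if let image = CIImage(contentsOf: url) {
    let rendered = context.createCGImage(image, from: image.extent)
    // `rendered` shows the incorrect colors on Catalyst.
}
```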

We identified a workaround: transcoding the image with a CGImageDestination using the kCGImageDestinationOptimizeColorForSharing option causes Image I/O to convert it to sRGB (or similar), after which Core Image loads it correctly. However, the conversion can cost color fidelity.
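The workaround can be sketched like this (assumptions: the function name and the in-place file URLs are illustrative; the actual Image I/O calls and the kCGImageDestinationOptimizeColorForSharing key are real API):

```swift
import ImageIO

// Transcode an image file so Image I/O converts it to a broadly
// compatible color space (sRGB or similar) that Core Image handles
// correctly on Catalyst. Returns true on success.
func transcodeForSharing(from sourceURL: URL, to destinationURL: URL) -> Bool {
    guard let source = CGImageSourceCreateWithURL(sourceURL as CFURL, nil),
          let type = CGImageSourceGetType(source),
          let destination = CGImageDestinationCreateWithURL(
              destinationURL as CFURL, type, 1, nil)
    else { return false }

    // This option asks Image I/O to rewrite the pixel data in a
    // color space suitable for sharing, dropping the exotic profile.
    let options: [CFString: Any] = [
        kCGImageDestinationOptimizeColorForSharing: true
    ]
    CGImageDestinationAddImageFromSource(destination, source, 0,
                                         options as CFDictionary)
    return CGImageDestinationFinalize(destination)
}
```

Loading the transcoded file via CIImage then renders correctly, at the price of the original color profile.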

Or might there be a better workaround?

Also filed as FB17081255, including a sample project.

Thanks for looking into this!
