Why does Apple use a simplified 1.961 gamma instead of the precise ITU-R 709 transfer function?

(For Apple folks: rdar://47577096.)


# Background


The Core Video function `CVImageBufferCreateColorSpaceFromAttachments` creates custom color profiles with simplified transfer functions instead of using the standard system color profiles. Let’s take ITU-R 709 as an example.
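For context, here is a minimal sketch of how such a color space can be requested, using the standard BT.709 attachment constants; the resulting `CGColorSpace` is the Apple-custom `HDTV` profile described below rather than the system `Rec. ITU-R BT.709-5` profile:

```swift
import CoreGraphics
import CoreVideo

// Standard BT.709 image buffer attachments.
let attachments: [CFString: Any] = [
    kCVImageBufferColorPrimariesKey: kCVImageBufferColorPrimaries_ITU_R_709_2,
    kCVImageBufferTransferFunctionKey: kCVImageBufferTransferFunction_ITU_R_709_2,
    kCVImageBufferYCbCrMatrixKey: kCVImageBufferYCbCrMatrix_ITU_R_709_2,
]

// Core Video builds a custom CGColorSpace from the attachments instead of
// returning the system Rec. 709 profile.
if let colorSpace = CVImageBufferCreateColorSpaceFromAttachments(attachments as CFDictionary)?
    .takeRetainedValue() {
    print(colorSpace)
}
```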


The macOS `Rec. ITU-R BT.709-5` system color profile specifies the transfer function as

f(x) = { (0.91x + 0.09)^2.222 where x >= 0.081
       { 0.222x where x < 0.081


The Apple-custom `HDTV` color profile created by the above Core Video function specifies the transfer function as

f(x) = x^1.961


My understanding is that `x^1.961` is the closest approximation of the more complex ITU-R 709 transfer function.
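To check how close that approximation actually is, here is a small sketch of my own that evaluates both decode curves (the constants are taken from the two profiles quoted above):

```swift
import Foundation

/// Decode curve from the system "Rec. ITU-R BT.709-5" profile (signal -> linear).
func rec709InverseOETF(_ signal: Double) -> Double {
    signal >= 0.081 ? pow(0.91 * signal + 0.09, 2.222) : 0.222 * signal
}

/// Decode curve from the Apple-custom "HDTV" profile.
func hdtvGamma(_ signal: Double) -> Double {
    pow(signal, 1.961)
}

// The two curves agree reasonably well above the 0.081 breakpoint
// but diverge severalfold in the near-black region below it.
for signal in [0.02, 0.05, 0.081, 0.25, 0.5, 0.75, 1.0] {
    print(String(format: "signal %.3f  exact %.5f  approx %.5f",
                 signal, rec709InverseOETF(signal), hdtvGamma(signal)))
}
```

The divergence is concentrated in the shadows, which is exactly where the compatibility problem in question 2 below shows up.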


# Questions


1. Why use a custom color profile with a simplified transfer function rather than the official specification?

- Was it done for performance?

- Was it done for compatibility with non-QuickTime-based applications?

- etc.


2. Speaking of compatibility, there is a problem when the encoding application uses the official transfer function and the decoding application uses the approximated one. I tested this using two images: one uses the `Rec. ITU-R BT.709-5` color profile, and the other is derived from it by assigning the Apple-custom `HDTV` color profile. The latter image loses the details in the darker areas of the image (see the numeric sketch after this list). Why go to the trouble of approximating the transfer function when the approximation isn’t that great?


3. Are the Apple-custom color profiles also used for encoding? Or are they only for decoding?


4. Another thing that concerns me is that the Apple-custom `HDR (PQ)` and `HDR (HLG)` color profiles use the same simplified transfer function of `f(x) = x^1.801`. Isn’t the whole point of the PQ and HLG standards to define more sophisticated transfer functions? Doesn’t simplifying those two transfer functions defeat their purpose? (A comparison sketch follows this list.)
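To put numbers on the mismatch in question 2, here is a small sketch of my own, assuming the encoder uses the exact BT.709 OETF and the decoder applies the `x^1.961` curve from the `HDTV` profile:

```swift
import Foundation

/// Exact BT.709 OETF (linear scene light -> video signal).
func rec709OETF(_ linear: Double) -> Double {
    linear >= 0.018 ? 1.099 * pow(linear, 0.45) - 0.099 : 4.5 * linear
}

// Encode dark linear values exactly, then decode with the x^1.961 approximation.
for linear in [0.002, 0.005, 0.01, 0.018] {
    let signal = rec709OETF(linear)
    let decoded = pow(signal, 1.961)
    print(String(format: "linear %.4f -> signal %.4f -> decoded %.5f", linear, signal, decoded))
}
// Near-black values come back several times darker than they went in,
// which matches the crushed shadows in the test image.
```

And for question 4, here is a sketch comparing the real PQ EOTF (constants from SMPTE ST 2084) with a plain `x^1.801` power curve, both normalized to 0…1:

```swift
import Foundation

/// SMPTE ST 2084 (PQ) EOTF, normalized so that 1.0 corresponds to 10,000 nits.
func pqEOTF(_ signal: Double) -> Double {
    let m1 = 2610.0 / 16384.0
    let m2 = 2523.0 / 4096.0 * 128.0
    let c1 = 3424.0 / 4096.0
    let c2 = 2413.0 / 4096.0 * 32.0
    let c3 = 2392.0 / 4096.0 * 32.0
    let p = pow(signal, 1.0 / m2)
    return pow(max(p - c1, 0.0) / (c2 - c3 * p), 1.0 / m1)
}

// The printed values are nowhere near each other, so whatever x^1.801 is for,
// it is not a shape approximation of PQ the way x^1.961 approximates BT.709.
for signal in [0.1, 0.25, 0.5, 0.75, 1.0] {
    print(String(format: "signal %.2f  PQ %.6f  x^1.801 %.6f",
                 signal, pqEOTF(signal), pow(signal, 1.801)))
}
```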


# Replies

The linear segment (just like in the sRGB curve) only matters when you encode the picture; you will lose some black gradations otherwise. On decoding, it will just be off by one.

You are wrong. That math comes from the 1.2 end-to-end gamma (an illusion caused by our eyes watching content in the dark) on the 2.35-gamma reference CRT of the time, which was later changed to 2.4 gamma on OLED, because black on OLED is much darker. The math:

1.2 / 2.35 ≈ 0.51 = 1/1.9608.

Simple.

Are the Apple-custom color profiles also used for encoding? Or are they only for decoding?

Neither. Decoding cannot be done with an OETF, while encoding cannot be done with this simple approximation. It is like you totally know nothing. ICC profiles are not for decoding; they are for color management, and the linear data will have to be further managed.

Was it done for performance?

No, that is what the reference was before 2011, when BT.1886 was created by Sony.

Was it done for compatibility with non-QuickTime-based applications?

Only mpv supports color management of primaries and transfer. Maybe DaVinci Resolve does too.

Doesn’t simplifying those two transfer functions defeat their purpose?

No, it does not, because the PQ values are untouched. All color management happens on the device; on the LG C9, webOS does it. HLG may not stay untouched, though; on Windows, only PQ can be output.

TL;DR: You do not know what scene-referred means. It means that the linear light is scene light (remember, a change of primaries always happens on linear light). So the inverse OETF is used to get back scene light, then you color manage, then you re-encode with the OETF, and then the display applies its EOTF: 2.4 gamma for a perfect OLED. Imagine it is an LED display; then the gamma is 2.2, since its black is bad. Imagine sunlight; then the gamma is as it was originally, 1.9. That is all just because of our vision. Next, imagine an absolutely black cinema room with bad projectors allowing only 48 nits; then the gamma is 2.6.
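A minimal sketch of the pipeline described above, with the change of primaries left as a placeholder and a 2.4-gamma reference display per BT.1886:

```swift
import Foundation

/// BT.709 OETF and its inverse (scene light <-> video signal).
func oetf(_ scene: Double) -> Double {
    scene >= 0.018 ? 1.099 * pow(scene, 0.45) - 0.099 : 4.5 * scene
}
func inverseOETF(_ signal: Double) -> Double {
    signal >= 0.081 ? pow((signal + 0.099) / 1.099, 1.0 / 0.45) : signal / 4.5
}

/// Placeholder for a change of primaries; a real implementation would apply a
/// 3x3 matrix to the linear RGB triple here.
func convertPrimaries(_ scene: Double) -> Double { scene }

/// One component through the scene-referred pipeline: decode to scene light,
/// color manage in linear light, re-encode, then let the display apply its EOTF.
func displayedLight(signal: Double, displayGamma: Double = 2.4) -> Double {
    let scene = inverseOETF(signal)        // signal -> scene light
    let managed = convertPrimaries(scene)  // color management on linear light
    let reencoded = oetf(managed)          // scene light -> signal
    return pow(reencoded, displayGamma)    // display EOTF -> displayed light
}
```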