Hello,
The existing behaviour before any of our colour work was that the native
display's primaries were being used for SDR content (i.e. just scanning
out the game's buffer directly).
Games are not submitting us any primaries for the buffers they are sending.
I mean they are saying they are sRGB so "technically 709", but
colorimetry for SDR content (outside of mastering) is very wishy-washy.
Deck Display Info:
static constexpr displaycolorimetry_t displaycolorimetry_steamdeck_spec
{
    .primaries = { { 0.602f, 0.355f }, { 0.340f, 0.574f }, { 0.164f, 0.121f } },
    .white = { 0.3070f, 0.3220f }, // not D65
};
static constexpr displaycolorimetry_t displaycolorimetry_steamdeck_measured
{
    .primaries = { { 0.603f, 0.349f }, { 0.335f, 0.571f }, { 0.163f, 0.115f } },
    .white = { 0.296f, 0.307f }, // not D65
};
https://github.com/ValveSoftware/gamescope/blob/master/src/color_helpers.h#L451
For the rest of this, consider displaycolorimetry_steamdeck_measured to
be what we use for the internal display.
To improve the rendering of content on the Deck's internal display with
its modest gamut, we go from the display's native primaries (sub-709) to
somewhere between the native primaries (0.0) and a hypothetical wider
gamut display (1.0) that we made up.
The hypothetical display's primaries were chosen based on making
content look appealing:
static constexpr displaycolorimetry_t displaycolorimetry_widegamutgeneric
{
    .primaries = { { 0.6825f, 0.3165f }, { 0.241f, 0.719f }, { 0.138f, 0.050f } },
    .white = { 0.3127f, 0.3290f }, // D65
};
We have a single knob for this: in code it's "SDR Gamut Wideness", but
it's known in the UI as "Color Vibrance". It's the knob that picks the
target color gamut that gets mapped to the native display.
This is how that single value is used to pick the target primaries:
https://github.com/ValveSoftware/gamescope/blob/master/src/color_helpers.cpp#L798
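To make that concrete, here is a rough sketch of what that mapping boils
down to, assuming it is essentially a linear blend of chromaticities; the
struct and function names below are made up for illustration and are not
the actual gamescope code:

struct chromaticity_t { float x, y; };

struct colorimetry_t
{
    chromaticity_t primaries[3];
    chromaticity_t white;
};

static chromaticity_t lerp_xy( const chromaticity_t &a, const chromaticity_t &b, float t )
{
    return { a.x + ( b.x - a.x ) * t, a.y + ( b.y - a.y ) * t };
}

// sdrGamutWideness: 0.0 = native panel primaries, 1.0 = the hypothetical wide gamut.
static colorimetry_t pick_target_colorimetry( const colorimetry_t &native,
                                              const colorimetry_t &wide,
                                              float sdrGamutWideness )
{
    colorimetry_t out;
    for ( int i = 0; i < 3; i++ )
        out.primaries[i] = lerp_xy( native.primaries[i], wide.primaries[i], sdrGamutWideness );
    out.white = wide.white; // whitepoint is handled separately (chromatic adaptation below)
    return out;
}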
We then use the resulting target gamut to do a simple saturation fit based
on the knob and some additional parameters that control how we interpolate.
(blendEnableMinSat, blendEnableMaxSat, blendAmountMin, blendAmountMax)
Those parameters also change with the SDR Gamut Wideness value, based on
things that "look nice". :P
https://github.com/ValveSoftware/gamescope/blob/master/src/color_helpers.cpp#L769
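Roughly, those parameters work like this (a simplified sketch of how such
parameters could drive the blend, read from the names rather than the
exact gamescope math):

#include <algorithm>

struct blend_params_t
{
    float blendEnableMinSat;
    float blendEnableMaxSat;
    float blendAmountMin;
    float blendAmountMax;
};

// Ramp the blend weight from blendAmountMin to blendAmountMax as the
// pixel's saturation moves from blendEnableMinSat to blendEnableMaxSat.
static float saturation_blend_weight( float sat, const blend_params_t &p )
{
    float t = ( sat - p.blendEnableMinSat ) /
              std::max( p.blendEnableMaxSat - p.blendEnableMinSat, 1e-6f );
    t = std::clamp( t, 0.0f, 1.0f );
    return p.blendAmountMin + ( p.blendAmountMax - p.blendAmountMin ) * t;
}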
We also do some other things like Bradford chromatic adaptation to fix
the slightly-off whitepoint.
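For reference, the Bradford adaptation itself is standard: build a 3x3
matrix that takes XYZ relative to the panel's measured white to XYZ
relative to D65. The matrices below are the standard Bradford values, not
gamescope code:

#include <array>

using vec3 = std::array<float, 3>;
using mat3 = std::array<vec3, 3>;

static const mat3 kBradford = {{
    {  0.8951f,  0.2664f, -0.1614f },
    { -0.7502f,  1.7135f,  0.0367f },
    {  0.0389f, -0.0685f,  1.0296f },
}};

static const mat3 kBradfordInv = {{
    {  0.9869929f, -0.1470543f,  0.1599627f },
    {  0.4323053f,  0.5183603f,  0.0492912f },
    { -0.0085287f,  0.0400428f,  0.9684867f },
}};

// xy chromaticity to XYZ with Y = 1.
static vec3 xy_to_XYZ( float x, float y )
{
    return { x / y, 1.0f, ( 1.0f - x - y ) / y };
}

static vec3 mul( const mat3 &m, const vec3 &v )
{
    return { m[0][0] * v[0] + m[0][1] * v[1] + m[0][2] * v[2],
             m[1][0] * v[0] + m[1][1] * v[1] + m[1][2] * v[2],
             m[2][0] * v[0] + m[2][1] * v[1] + m[2][2] * v[2] };
}

static mat3 mul( const mat3 &a, const mat3 &b )
{
    mat3 r{};
    for ( int i = 0; i < 3; i++ )
        for ( int j = 0; j < 3; j++ )
            for ( int k = 0; k < 3; k++ )
                r[i][j] += a[i][k] * b[k][j];
    return r;
}

// e.g. src = xy_to_XYZ( 0.296f, 0.307f )   (measured panel white),
//      dst = xy_to_XYZ( 0.3127f, 0.3290f ) (D65).
static mat3 bradford_adaptation( const vec3 &srcWhiteXYZ, const vec3 &dstWhiteXYZ )
{
    vec3 srcCone = mul( kBradford, srcWhiteXYZ );
    vec3 dstCone = mul( kBradford, dstWhiteXYZ );
    mat3 scale = {{
        { dstCone[0] / srcCone[0], 0.0f, 0.0f },
        { 0.0f, dstCone[1] / srcCone[1], 0.0f },
        { 0.0f, 0.0f, dstCone[2] / srcCone[2] },
    }};
    return mul( kBradfordInv, mul( scale, kBradford ) );
}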
We use all this to generate a 3D LUT containing that saturation fit and
chromatic adaptation, and apply it with a Shaper + 3D LUT at scanout time.
(We also have a shader-based fallback path.)
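Conceptually the scanout path then looks something like this
(nearest-neighbour lookup for brevity; the hardware or the shader path
would interpolate, and the names here are illustrative rather than the
real plumbing):

#include <algorithm>
#include <array>
#include <cmath>
#include <vector>

struct rgb_t { float r, g, b; };

struct lut3d_t
{
    int size = 17;               // e.g. a 17x17x17 cube
    std::vector<rgb_t> data;     // size*size*size entries, red varying fastest

    rgb_t sample_nearest( const rgb_t &c ) const
    {
        auto idx = [&]( float v ) {
            return std::clamp( (int)std::lround( v * ( size - 1 ) ), 0, size - 1 );
        };
        return data[ ( idx( c.b ) * size + idx( c.g ) ) * size + idx( c.r ) ];
    }
};

// The shaper (per-channel 1D LUT) maps encoded input into the 3D LUT's index
// space; the 3D LUT holds the combined saturation fit + chromatic adaptation.
static rgb_t apply_shaper_and_lut3d( rgb_t in,
                                     const std::array<float, 256> &shaper,
                                     const lut3d_t &lut )
{
    auto shape = [&]( float v ) {
        int i = std::clamp( (int)std::lround( v * 255.0f ), 0, 255 );
        return shaper[i];
    };
    rgb_t shaped = { shape( in.r ), shape( in.g ), shape( in.b ) };
    return lut.sample_nearest( shaped );
}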
The goal of all of this work is less 'color accuracy' and more 'making
the display more in line with consumer expectations'.
We wanted to try and make the display appear much more 'vivid' and
colourful without introducing horrible clipping.
We also use this same logic for wider gamut displays (where 0.0 = sRGB
and 1.0 = native) and for SDR content on HDR.
Hope this helps!
- Joshie 🐸✨
On 11/3/23 13:00, Pekka Paalanen wrote:
This is a continuation of
https://gitlab.freedesktop.org/wayland/wayland-protocols/-/merge_requests/14#note_2152254
because this is off-topic in that thread.
No, we did widening. The Deck's internal display has a modest gamut
that is < 71% sRGB.
If games do wide (well, full sRGB or wider) gamut, then why would you
need to make that gamut even wider to fit nicely into a significantly
smaller gamut display?
Here's what I think happened.
You have a game that produces saturation up to P3, let's say. When you
did the colorimetrically correct matrix conversion (CTM) from BT.2020
to the "modest gamut", you found out that it is horribly clipping
colors, right?
If you then removed that CTM, it means that you are
re-interpreting BT.2020 RGB encoding *as if* it was "modest gamut" RGB
encoding. This happens if you simply apply the input image EOTF and
then apply the display inverse-EOTF and do nothing to the color gamut
in between. Adjusting dynamic range does not count here. This is an
extreme case of saturation reduction.
(Note: Doing nothing to the numbers amounts to applying a major semantic
operation. It is like telling someone a length in cm and they take that
number in mm instead. Or metric vs. imperial units. Color space
primaries and white point define the units for RGB values, and if you
have other RGB values, they are not comparable without the proper CTM
conversion.)
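Sketched in code, the only difference between the correct conversion and
the re-interpretation is whether the CTM is applied between decoding and
re-encoding (simplified gamma-2.2 transfer function and a placeholder
matrix here, not any real implementation):

#include <array>
#include <cmath>

using vec3 = std::array<float, 3>;
using mat3 = std::array<vec3, 3>;

static vec3 mul( const mat3 &m, const vec3 &v )
{
    return { m[0][0] * v[0] + m[0][1] * v[1] + m[0][2] * v[2],
             m[1][0] * v[0] + m[1][1] * v[1] + m[1][2] * v[2],
             m[2][0] * v[0] + m[2][1] * v[1] + m[2][2] * v[2] };
}

static vec3 eotf( vec3 v )     { for ( float &c : v ) c = std::pow( c, 2.2f ); return v; }
static vec3 inv_eotf( vec3 v ) { for ( float &c : v ) c = std::pow( c, 1.0f / 2.2f ); return v; }

// Correct: decode, convert BT.2020 -> display gamut in linear light, re-encode.
// The CTM would be derived from the source and destination primaries/white points.
static vec3 correct_path( const vec3 &bt2020, const mat3 &ctm_2020_to_display )
{
    return inv_eotf( mul( ctm_2020_to_display, eotf( bt2020 ) ) );
}

// Re-interpretation: decode and re-encode only; the BT.2020 code values are
// treated as if they were already display-gamut values (an implicit identity CTM).
static vec3 reinterpret_path( const vec3 &bt2020 )
{
    return inv_eotf( eotf( bt2020 ) );
}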
That does not look good either, so after that re-interpretation you
added saturation boosting that nicely makes use of the capabilities of
the integrated display's "modest gamut" so that the image looks more
"vibrant" and less de-saturated. However, the total effect is still
saturation reduction, because the re-interpretation of the game content
RGB values is such a massive saturation reduction that your boosting
does not overcome it.
I could make up an analogy: Someone says they are making all sticks
50% longer than what you ask. You ask them to make a stick 100 long.
They give you a stick that you measure to be 15 long, and they still
claim it is 50% longer than what you asked. How is this possible? The
length spaces are different: you were thinking and measuring in cm,
they did mm. They did give you a stick of 150 mm, which is 50% longer
than 100 mm. But from your perspective, the stick is 85% shorter than
what you asked for. If one had started by converting to a mutual length
space first (that is the role of the correct CTM), there would have been
an initial agreement on how long 1 is.
Thanks,
pq