> The BT.709 and BT.2020 OETFs are the same, the only difference
> being that the BT.2020 variant is defined with more precision
> for 10 and 12-bit per color encodings.
Just to make sure, the spec defines this precision, correct? It's
not an AMD-specific thing?
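For context, this is how I read the two Recommendations: the curve itself is
identical, and BT.2020 merely states the constants with one extra digit for
10/12-bit systems. A quick userspace sketch, purely for illustration and not
part of the patch:

	#include <math.h>

	/*
	 * Shared BT.709 / BT.2020 OETF, L and the result both in [0, 1].
	 * BT.709 and 10-bit BT.2020: alpha = 1.099,  beta = 0.018
	 * 12-bit BT.2020:            alpha = 1.0993, beta = 0.0181
	 */
	static double oetf(double L, double alpha, double beta)
	{
		return L < beta ? 4.5 * L : alpha * pow(L, 0.45) - (alpha - 1.0);
	}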
> Both are used as encoding functions [...]
Is this magic number of 125 something we can expect other hardware to
implement as well?
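Purely a guess on my part, in case it helps: if the 125 is just the ratio of
the 10000 nit PQ peak to an 80 nit SDR reference white, i.e.

	/* Guess, not taken from the patch: PQ peak / SDR reference white. */
	#define HDR_MULT_GUESS	(10000 / 80)	/* == 125 */

then it would be a convention other vendors could share rather than an
AMD-specific hardware detail. But I may be misreading where the constant
comes from.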
Could AMD use the HDR multiplier or another block to behave as if
the multiplier didn't exist?
Note, I am no HDR expert. Maybe others have a better idea whether this
makes sense or not.
> diff --git a/drivers/gpu/drm/drm_atomic_uapi.c b/drivers/gpu/drm/drm_atomic_uapi.c
> index a3e1fcad47ad..4744c12e429d 100644
> --- a/drivers/gpu/drm/drm_atomic_uapi.c
> +++ b/drivers/gpu/drm/drm_atomic_uapi.c
> @@ -701,6 +701,9 @@ static int drm_atomic_color_set_data_property(struct drm_col
> + prop = drm_property_create_range(dev, DRM_MODE_PROP_IMMUTABLE, "SIZE",
Ah, I forgot something: I think this needs to be DRM_MODE_PROP_ATOMIC?
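To illustrate what I mean (sketch only: the range bounds below are
placeholders since the real ones are cut off in the quote, and whether
IMMUTABLE stays OR'd in depends on whether the property is meant to be
read-only):

	prop = drm_property_create_range(dev,
					 DRM_MODE_PROP_ATOMIC | DRM_MODE_PROP_IMMUTABLE,
					 "SIZE", 0, U32_MAX);
	if (!prop)
		return -ENOMEM;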
Reviewed-by: Simon Ser