Re: The state of Quantization Range handling
On Thu, 17 Nov 2022 22:39:36 +0100 Sebastian Wick wrote:

> On Wed, Nov 16, 2022 at 1:34 PM Pekka Paalanen wrote:
> >
> > Is it a good idea to put even more automation/magic into configuring
> > the color pipeline and metadata for a sink, making them even more
> > intertwined?
> >
> > I would prefer the opposite direction, making things more explicit
> > and orthogonal.
>
> In general I completely agree with this, I just don't think it's
> feasible with the current state of KMS. For the color pipeline API [1]
> that's exactly the behavior I want, but it should be guarded behind a
> DRM cap.
>
> For that new API, user space needs direct control over the
> quantization range infoframe and the kernel has to somehow tell user
> space what quantization range the sink expects for the default
> behavior. User space then programs the infoframe when possible and
> builds the color pipeline in such a way that the output is whatever
> the sink expects.
>
> The issue really is that if we push this all to user space it would be
> a backwards incompatible change. So let's fix the current situation in
> a backwards compatible way and then get it right for the new API that
> user space can opt into.
>
> Does that make sense?

It makes sense as far as userspace does not need to be changed to make
use of this. But if userspace will need changes regardless, why
continue on a dead-end API? One reason could be that a new explicit API
is too much work compared to when you want your issues fixed.

If you are introducing a new KMS property (the override control), then
by definition userspace needs changes to use it.

[1] OTOH is a re-design-the-world approach, which I am not suggesting
when I talk about making this explicit. I'm thinking about a much
smaller step that concerns only quantization range handling inside the
existing color pipeline "framework". E.g. deprecate the "Broadcast RGB"
property and add a "quantization range conversion" property that is
orthogonal to another new property for the quantization range metadata
sent to the sink.

Thanks,
pq

> > > Appendix A: Broadcast RGB property
> > >
> > > A few drivers already implement the Broadcast RGB property to
> > > control the quantization range. However, it is pointless: It can
> > > be set to Auto, Full and Limited when the sink supports explicitly
> > > setting the quantization range. The driver expects full range
> > > content and converts it to limited range content when necessary.
> > > Selecting limited range never makes any sense: the out-of-range
> > > bits can't be used because the input is full range. Selecting
> > > Default never makes sense: relying on the default quantization
> > > range is risky because sinks often get it wrong and as we
> > > established there is no reason to select limited range if not
> > > necessary. The limited and full options also are not suitable as
> > > an override because the property is not available if the sink does
> > > not support explicitly setting the quantization range.
>
> [1] https://gitlab.freedesktop.org/pq/color-and-hdr/-/issues/11
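To make the shape of the proposed split concrete, here is a minimal
sketch of how a compositor might program two such orthogonal connector
properties through the existing libdrm atomic API. Only the libdrm
calls are real; the two property IDs are assumed to have been looked up
by name at startup, and the names ("quantization range conversion",
"quantization range metadata") and their enum values are hypothetical
placeholders for whatever the new UAPI would actually define.

/*
 * Hypothetical sketch: conv_value and meta_value stand in for the enum
 * values a "quantization range conversion" / "quantization range
 * metadata" property pair might define. No such properties exist in
 * KMS today; only the libdrm calls are real.
 */
#include <errno.h>
#include <stdint.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

int set_quant_range(int fd, uint32_t connector_id,
		    uint32_t conv_prop_id, uint64_t conv_value,
		    uint32_t meta_prop_id, uint64_t meta_value)
{
	drmModeAtomicReq *req = drmModeAtomicAlloc();
	int ret;

	if (!req)
		return -ENOMEM;

	/* What the pipeline does to pixel values on the way out... */
	drmModeAtomicAddProperty(req, connector_id, conv_prop_id,
				 conv_value);
	/* ...is now independent of what the infoframe tells the sink. */
	drmModeAtomicAddProperty(req, connector_id, meta_prop_id,
				 meta_value);

	ret = drmModeAtomicCommit(fd, req, DRM_MODE_ATOMIC_ALLOW_MODESET,
				  NULL);
	drmModeAtomicFree(req);
	return ret;
}

The point of the split is visible in the two calls: a compositor that
already produces limited range pixels itself could request a
pass-through conversion while still signalling limited range in the
infoframe, a combination the current tri-state "Broadcast RGB" property
cannot express.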
Re: The state of Quantization Range handling
On Thu, 17 Nov 2022 22:13:26 +0100 Sebastian Wick wrote:

> Hi Dave,
>
> I noticed that I didn't get the Broadcast RGB property thanks to you
> (more below)
>
> On Tue, Nov 15, 2022 at 2:16 PM Dave Stevenson wrote:
> >
> > Hi Sebastian
> >
> > Thanks for starting the conversation - it's stalled a number of
> > times previously.
> >
> > On Mon, 14 Nov 2022 at 23:12, Sebastian Wick wrote:
> > >
> > > There are still regular bug reports about monitors (sinks) and
> > > sources disagreeing about the quantization range of the pixel
> > > data. In particular sources sending full range data when the sink
> > > expects limited range. From a user space perspective, this is all
> > > hidden in the kernel. We send full range data to the kernel and
> > > then hope it does the right thing but as the bug reports show:
> > > some combinations of displays and drivers result in problems.
> >
> > I'll agree that we as Raspberry Pi also get a number of bug reports
> > where sinks don't always look at the infoframes and misinterpret
> > the data.
> >
> > > In general the whole handling of the quantization range on Linux
> > > is not defined or documented at all. User space sends full range
> > > data because that's what seems to work most of the time but
> > > technically this is all undefined and user space can not fix
> > > those issues. Some compositors have resorted to giving users the
> > > option to choose the quantization range but this really should
> > > only be necessary for straight up broken hardware.
> >
> > Wowsers! Making userspace worry about limited range data would be a
> > very weird decision in my view, so compositors should always deal
> > in full range data.
>
> Making this a user space problem is IMO the ideal way to deal with it
> but that's a bit harder to do (I'll answer that in the reply to
> Pekka). So let's just assume we all agree that user space only deals
> with full range data.

Limited range was invented for some reason, so it must have some use
somewhere, at least in the past. Maybe it was needed to calibrate mixed
digital/analog video processing chains with test images that needed to
contain sub-blacks and super-whites, to make sure that sub-blacks come
out as the nominal black, etc. Just because desktop computers do not
seem to have any need for limited range, I personally wouldn't be as
arrogant as to say it's never useful. Maybe there are professional
video/broadcasting needs that currently can only be realized with
proprietary OS/hardware, because Linux just can't do it today?

Why would TVs support limited range, if it was never useful? Why would
video sources produce limited range if it was always strictly inferior
to full range?

Even digital image processing algorithms might make use of
out-of-unit-range values, not just analog circuitry for overshoot.

But no, I can't give a real example, just speculation. Hence it's fine
by me to discard limited range processing for now. Still, what I
explain below would allow limited range processing without any extra
complexity by making the KMS color pipeline better defined and less
limiting for userspace.

> > How would composition of multiple DRM planes work if some are
> > limited range and some are full but you want limited range output?
> > Your hardware needs to have CSC matrices to convert full range down
> > to limited range, and know that you want to use them to effectively
> > compose to limited range.
> > In fact you can't currently tell DRM that an RGB plane is limited
> > range - the values in enum drm_color_range are
> > DRM_COLOR_YCBCR_LIMITED_RANGE and DRM_COLOR_YCBCR_FULL_RANGE [1].

Yeah, that's because range conversion has been conflated with
YUV-to-RGB conversion, and the result is always full-range RGB in
practice, AFAIU. There is no way to feed limited range color into the
further color pipeline in KMS, but that's actually a good thing. (*)

The following is my opinion of the future, as someone who has been
thinking about how to make HDR work on Wayland while allowing the
display quality and hardware optimizations that Wayland was designed
for:

Userspace should not tell KMS about a plane being limited range at all.
The reason is the same why userspace should not tell KMS about what
colorspace a plane is in. Instead, userspace wants to program specific
mathematical operations into KMS hardware without any associated or
implied semantics. It's just math. The actual semantics have been
worked out by userspace beforehand. This allows userspace to use the
KMS hardware to its fullest effect, even for things the hardware or
KMS UAPI designers did not anticipate.

IMO, framebuffers and KMS planes should ultimately be in undefined
quantization range, undefined color space, and undefined dynamic range.
The correct processing of the pixel values is programmed by per-plane
KMS properties like CTM, LUT, and more specialized components like a
quantization range converter or a YUV-to-RGB converter (which is just
another CTM at
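The "just math" that such a quantization range converter (or a
CTM-plus-offset stage) would be programmed with is small. For 8-bit RGB
with the usual 16-235 nominal range, the conversion is a per-channel
scale and offset and nothing more. A minimal sketch; the function names
are made up here for illustration:

/*
 * Quantization range conversion for 8-bit RGB channels, per the
 * CTA-861 nominal ranges: full [0, 255] <-> limited [16, 235].
 * It is a plain affine map with no semantics attached.
 */
#include <stdint.h>

static uint8_t full_to_limited(uint8_t v)
{
	/* Map [0, 255] onto [16, 235]: scale by 219/255, offset by 16. */
	return (uint8_t)(16 + (v * 219 + 127) / 255);
}

static uint8_t limited_to_full(uint8_t v)
{
	/* Inverse map; sub-blacks and super-whites must be clamped. */
	if (v <= 16)
		return 0;
	if (v >= 235)
		return 255;
	return (uint8_t)(((v - 16) * 255 + 109) / 219);
}

Note that limited_to_full() necessarily discards the sub-blacks and
super-whites mentioned above; preserving them is only possible if the
rest of the pipeline operates in a wider or undefined range, which is
exactly the argument being made.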
Re: The state of Quantization Range handling
On 11/18/22 10:53, Dave Stevenson wrote:
> Hi Pekka
>
> On Fri, 18 Nov 2022 at 10:15, Pekka Paalanen wrote:
>>
>> On Thu, 17 Nov 2022 22:13:26 +0100
>> Sebastian Wick wrote:
>>
>>> Hi Dave,
>>>
>>> I noticed that I didn't get the Broadcast RGB property thanks to you
>>> (more below)
>>>
>>> On Tue, Nov 15, 2022 at 2:16 PM Dave Stevenson wrote:
>>>>
>>>> Hi Sebastian
>>>>
>>>> Thanks for starting the conversation - it's stalled a number of
>>>> times previously.
>>>>
>>>> On Mon, 14 Nov 2022 at 23:12, Sebastian Wick wrote:
>>>>>
>>>>> There are still regular bug reports about monitors (sinks) and
>>>>> sources disagreeing about the quantization range of the pixel
>>>>> data. In particular sources sending full range data when the sink
>>>>> expects limited range. From a user space perspective, this is all
>>>>> hidden in the kernel. We send full range data to the kernel and
>>>>> then hope it does the right thing but as the bug reports show:
>>>>> some combinations of displays and drivers result in problems.
>>>>
>>>> I'll agree that we as Raspberry Pi also get a number of bug reports
>>>> where sinks don't always look at the infoframes and misinterpret
>>>> the data.
>>>>
>>>>> In general the whole handling of the quantization range on Linux
>>>>> is not defined or documented at all. User space sends full range
>>>>> data because that's what seems to work most of the time but
>>>>> technically this is all undefined and user space can not fix
>>>>> those issues. Some compositors have resorted to giving users the
>>>>> option to choose the quantization range but this really should
>>>>> only be necessary for straight up broken hardware.
>>>>
>>>> Wowsers! Making userspace worry about limited range data would be
>>>> a very weird decision in my view, so compositors should always
>>>> deal in full range data.
>>>
>>> Making this a user space problem is IMO the ideal way to deal with
>>> it but that's a bit harder to do (I'll answer that in the reply to
>>> Pekka). So let's just assume we all agree that user space only
>>> deals with full range data.
>>
>> Limited range was invented for some reason, so it must have some use
>> somewhere, at least in the past. Maybe it was needed to calibrate
>> mixed digital/analog video processing chains with test images that
>> needed to contain sub-blacks and super-whites, to make sure that
>> sub-blacks come out as the nominal black, etc. Just because desktop
>> computers do not seem to have any need for limited range, I
>> personally wouldn't be as arrogant as to say it's never useful.
>> Maybe there are professional video/broadcasting needs that currently
>> can only be realized with proprietary OS/hardware, because Linux
>> just can't do it today?
>>
>> Why would TVs support limited range, if it was never useful? Why
>> would video sources produce limited range if it was always strictly
>> inferior to full range?
>>
>> Even digital image processing algorithms might make use of
>> out-of-unit-range values, not just analog circuitry for overshoot.
>>
>> But no, I can't give a real example, just speculation. Hence it's
>> fine by me to discard limited range processing for now. Still, what
>> I explain below would allow limited range processing without any
>> extra complexity by making the KMS color pipeline better defined and
>> less limiting for userspace.
>
> AIUI limited range comes from the analogue world, or possibly creative
> (film/TV) world, hence being used on Consumer devices rather than IT
> ones (CTA and CEA modes vs VESA and DMT modes).
>
> YCbCr output from video codecs typically uses a range of 16-235,
> therefore a media player wanting to pass the decoded video out to the
> display exactly as-is needs to be able to signal that to the display
> for it to be interpreted correctly.
>
> HDMI extended DVI. I believe both YCbCr support and range control
> were added to the HDMI spec at the same time, presumably to allow for
> this use case. Limited range RGB seems to be a bit of a quirk though.
>
> Just to be annoying, JPEG uses full range YCbCr.
>
>>>> How would composition of multiple DRM planes work if some are
>>>> limited range and some are full but you want limited range output?
>>>> Your hardware needs to have CSC matrices to convert full range
>>>> down to limited range, and know that you want to use them to
>>>> effectively compose to limited range.
>>>> In fact you can't currently tell DRM that an RGB plane is limited
>>>> range - the values in enum drm_color_range are
>>>> DRM_COLOR_YCBCR_LIMITED_RANGE and DRM_COLOR_YCBCR_FULL_RANGE [1].
>>
>> Yeah, that's because range conversion has been conflated with
>> YUV-to-RGB conversion, and the result is always full-range RGB in
>> practice, AFAIU. There is no way to feed limited range color into
>> the further color pipeline in KMS, but that's actually a good
>> thing. (*)
>>
>> The following is my opinion of the future, as someone who has been
>> thinking about how to make HDR work on Wayland while allowing the
>> display quality and hardware optimizations that Wayland was designed
>> for:
Re: The state of Quantization Range handling
Hi Pekka

On Fri, 18 Nov 2022 at 10:15, Pekka Paalanen wrote:
>
> On Thu, 17 Nov 2022 22:13:26 +0100
> Sebastian Wick wrote:
>
> > Hi Dave,
> >
> > I noticed that I didn't get the Broadcast RGB property thanks to you
> > (more below)
> >
> > On Tue, Nov 15, 2022 at 2:16 PM Dave Stevenson wrote:
> > >
> > > Hi Sebastian
> > >
> > > Thanks for starting the conversation - it's stalled a number of
> > > times previously.
> > >
> > > On Mon, 14 Nov 2022 at 23:12, Sebastian Wick wrote:
> > > >
> > > > There are still regular bug reports about monitors (sinks) and
> > > > sources disagreeing about the quantization range of the pixel
> > > > data. In particular sources sending full range data when the
> > > > sink expects limited range. From a user space perspective, this
> > > > is all hidden in the kernel. We send full range data to the
> > > > kernel and then hope it does the right thing but as the bug
> > > > reports show: some combinations of displays and drivers result
> > > > in problems.
> > >
> > > I'll agree that we as Raspberry Pi also get a number of bug
> > > reports where sinks don't always look at the infoframes and
> > > misinterpret the data.
> > >
> > > > In general the whole handling of the quantization range on
> > > > Linux is not defined or documented at all. User space sends
> > > > full range data because that's what seems to work most of the
> > > > time but technically this is all undefined and user space can
> > > > not fix those issues. Some compositors have resorted to giving
> > > > users the option to choose the quantization range but this
> > > > really should only be necessary for straight up broken
> > > > hardware.
> > >
> > > Wowsers! Making userspace worry about limited range data would be
> > > a very weird decision in my view, so compositors should always
> > > deal in full range data.
> >
> > Making this a user space problem is IMO the ideal way to deal with
> > it but that's a bit harder to do (I'll answer that in the reply to
> > Pekka). So let's just assume we all agree that user space only
> > deals with full range data.
>
> Limited range was invented for some reason, so it must have some use
> somewhere, at least in the past. Maybe it was needed to calibrate
> mixed digital/analog video processing chains with test images that
> needed to contain sub-blacks and super-whites, to make sure that
> sub-blacks come out as the nominal black, etc. Just because desktop
> computers do not seem to have any need for limited range, I
> personally wouldn't be as arrogant as to say it's never useful. Maybe
> there are professional video/broadcasting needs that currently can
> only be realized with proprietary OS/hardware, because Linux just
> can't do it today?
>
> Why would TVs support limited range, if it was never useful? Why
> would video sources produce limited range if it was always strictly
> inferior to full range?
>
> Even digital image processing algorithms might make use of
> out-of-unit-range values, not just analog circuitry for overshoot.
>
> But no, I can't give a real example, just speculation. Hence it's
> fine by me to discard limited range processing for now. Still, what I
> explain below would allow limited range processing without any extra
> complexity by making the KMS color pipeline better defined and less
> limiting for userspace.

AIUI limited range comes from the analogue world, or possibly creative
(film/TV) world, hence being used on Consumer devices rather than IT
ones (CTA and CEA modes vs VESA and DMT modes).

YCbCr output from video codecs typically uses a range of 16-235,
therefore a media player wanting to pass the decoded video out to the
display exactly as-is needs to be able to signal that to the display
for it to be interpreted correctly.

HDMI extended DVI. I believe both YCbCr support and range control were
added to the HDMI spec at the same time, presumably to allow for this
use case. Limited range RGB seems to be a bit of a quirk though.

Just to be annoying, JPEG uses full range YCbCr.

> > > How would composition of multiple DRM planes work if some are
> > > limited range and some are full but you want limited range
> > > output? Your hardware needs to have CSC matrices to convert full
> > > range down to limited range, and know that you want to use them
> > > to effectively compose to limited range.
> > > In fact you can't currently tell DRM that an RGB plane is limited
> > > range - the values in enum drm_color_range are
> > > DRM_COLOR_YCBCR_LIMITED_RANGE and DRM_COLOR_YCBCR_FULL_RANGE [1].
>
> Yeah, that's because range conversion has been conflated with
> YUV-to-RGB conversion, and the result is always full-range RGB in
> practice, AFAIU. There is no way to feed limited range color into the
> further color pipeline in KMS, but that's actually a good thing. (*)
>
> The following is my opinion of the future, as someone who has been
> thinking about how to make HDR work on Wayland while allowing the
> display quality and hardware optimizations that Wayland was designed
> for:
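The conflation being described is visible in the arithmetic itself:
decoding limited-range YCbCr to full-range RGB bakes the range
expansion into the conversion, so selecting
DRM_COLOR_YCBCR_LIMITED_RANGE performs a range conversion as a side
effect of the color space conversion. A sketch of the BT.601 case,
assuming 8-bit samples; float math is used for readability only, real
CSC hardware is fixed-point:

/*
 * BT.601 limited-range YCbCr -> full-range RGB. The 255/219 (luma)
 * and 255/224 (chroma) factors are the quantization range expansion,
 * folded into the same step as the YCbCr-to-RGB matrix.
 */
#include <stdint.h>

static uint8_t clamp8(float v)
{
	if (v < 0.0f)
		return 0;
	if (v > 255.0f)
		return 255;
	return (uint8_t)(v + 0.5f);
}

static void ycbcr601_limited_to_rgb_full(uint8_t y, uint8_t cb,
					 uint8_t cr, uint8_t rgb[3])
{
	float yf = (y - 16) * (255.0f / 219.0f);
	float cbf = (cb - 128) * (255.0f / 224.0f);
	float crf = (cr - 128) * (255.0f / 224.0f);

	rgb[0] = clamp8(yf + 1.402f * crf);
	rgb[1] = clamp8(yf - 0.344136f * cbf - 0.714136f * crf);
	rgb[2] = clamp8(yf + 1.772f * cbf);
}

Drop the -16 offset and the two scale factors and the same matrix
decodes JPEG-style full-range YCbCr (DRM_COLOR_YCBCR_FULL_RANGE), which
illustrates why the range conversion and the color space conversion are
better treated as separate operations in the pipeline.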