Hi, I'm new to this list and probably can't contribute much, but I am interested. I passed your original posting to a friend and have enclosed his thoughts below ... old hash or food for thought??? I ask your forgiveness if you find this inappropriate (I'm of the ilk that acts first and asks forgiveness afterward). ;-)
Steven

--------------------- start

Steven Kucharzyk wrote:
> Thought you might find this of interest.

Hi, thanks for sending it to me. Unfortunately I don't know enough about the context to say anything specific about it. The best I can do is state the big-picture aims I would look for, as someone with a background in display-systems electronic design, rendering software development and Color Science. (I apologize in advance if any of this is preaching to the choir!)

1) I would make sure that someone with a strong Color Science background was consulted in the development of the API.

2) I would measure the API against its ability to support a "profiling" color management workflow. This workflow allows using the full capability of a display, while also allowing simultaneous display of multiple sources encoded in any colorspace. So the basic architecture is to have a final frame buffer (real or virtual) in the display's native colorspace, and to use any graphics hardware color transform and rendering capability to assist with transforming data in different source colorspaces into the display's native colorspace.

3) The third thing I would look for is enough standardization that user-mode software can be written that gets the key benefits of what's available in the hardware, without needing to be customized to lots of different hardware specifics. For instance, I'd make sure that there is a standard frame-buffer-to-display mode that applies per-channel curves specified in a standard way (i.e. make sure that there is an easy-to-use replacement for XRRCrtcGamma). Any API that is specific to a type or model of graphics card will retard the development of color management support to a very large degree - the financial and development cost of obtaining, configuring and testing against multiple graphics card makes and models puts this in the too-hard basket for anyone other than a corporation.

Perhaps little of the above is relevant if this is a low-level API that is to be used by other operating-system sub-systems, such as display graphics APIs like X11 or Wayland, which will choose specific display rendering models and implement them with the hardware capabilities that are available. From a color management point of view, the operating system & UI graphics APIs are the ones that are desirable to work with, since they are meant to insulate applications from hardware details.

Cheers,
    Graeme Gill.

------------------------ end

On Sat, 6 May 2023 06:40:20 +1000 Dave Airlie <airl...@gmail.com> wrote:

> On Fri, 5 May 2023 at 01:23, Simon Ser <cont...@emersion.fr> wrote:
> >
> > Hi all,
> >
> > The goal of this RFC is to expose a generic KMS uAPI to configure
> > the color pipeline before blending, ie. after a pixel is tapped
> > from a plane's framebuffer and before it's blended with other
> > planes. With this new uAPI we aim to reduce the battery life impact
> > of color management and HDR on mobile devices, to improve
> > performance and to decrease latency by skipping composition on the
> > 3D engine. This proposal is the result of discussions at the Red
> > Hat HDR hackfest [1] which took place a few days ago. Engineers
> > familiar with the AMD, Intel and NVIDIA hardware have participated
> > in the discussion.
> >
> > This proposal takes a prescriptive approach instead of a
> > descriptive approach. Drivers describe the available hardware
> > blocks in terms of low-level mathematical operations, then
> > user-space configures each block.
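To make the "user-space configures each block" idea concrete for myself, I sketched roughly what the configuration side might look like through the existing libdrm atomic API. The COLOR_PIPELINE-style property and the colorop object ID below are my own hypothetical placeholders - the RFC has not settled on any names - so treat this as an illustration of the plumbing, not the proposed uAPI:

#include <xf86drm.h>
#include <xf86drmMode.h>

/* Hypothetical sketch: point a plane at the first color operation of a
 * driver-advertised pipeline and test-commit it. The property and colorop
 * object are placeholders, not real uAPI names. */
int set_plane_color_pipeline(int drm_fd, uint32_t plane_id,
                             uint32_t pipeline_prop_id,
                             uint64_t first_colorop_id)
{
    drmModeAtomicReq *req = drmModeAtomicAlloc();
    if (!req)
        return -1;

    /* Each colorop the driver exposes (1D curve, 3x3 matrix, LUT, ...)
     * would then be programmed with its own properties in the same way. */
    int ret = drmModeAtomicAddProperty(req, plane_id, pipeline_prop_id,
                                       first_colorop_id);
    if (ret >= 0)
        ret = drmModeAtomicCommit(drm_fd, req, DRM_MODE_ATOMIC_TEST_ONLY,
                                  NULL);

    drmModeAtomicFree(req);
    return ret < 0 ? ret : 0;
}

Whether code like that keeps working when hardware blocks are added or removed is exactly the forward-compatibility question Dave raises further down. Back to the thread: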
> > We decided against a descriptive
> > approach where user-space would provide a high-level description of
> > the colorspace and other parameters: we want to give more control
> > and flexibility to user-space, e.g. to be able to replicate exactly
> > the color pipeline with shaders and switch between shaders and KMS
> > pipelines seamlessly, and to avoid forcing user-space into a
> > particular color management policy.
>
> I'm not 100% sold on the prescriptive here, let's see if someone can
> get me over the line with some questions later.
>
> My feeling is color pipeline hw is not a done deal, and that hw
> vendors will be revising/evolving/churning the hw blocks for a while
> longer, as there are no real standards in the area to aim for; all the
> vendors are mostly just doing whatever gets Windows over the line and
> keeps hw engineers happy. So I have some concerns here around forwards
> compatibility and hence the API design.
>
> I guess my main concern is: if you expose a bunch of hw blocks and
> someone comes up with a novel new thing, will all existing userspace
> work without falling back to shaders?
> Do we have minimum guarantees on what hardware blocks have to be
> exposed to build a useable pipeline?
> If a hardware block goes away in a new silicon revision, do I have to
> rewrite my compositor? Or will it be expected that the kernel will
> emulate the old pipelines on top of whatever new fancy thing exists?
>
> We are not Android, or even Steam OS on a Steamdeck; we have to be
> able to independently update the kernel for new hardware, and not
> require every compositor currently providing HDR to support
> new hardware blocks and models at the same time.
>
> Dave.
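For reference, the XRRCrtcGamma interface Graeme mentions in his point (3) above is about as simple as such an interface gets: per-channel curves handed to the server in a standard structure, with no card-specific knowledge needed. A minimal use of it (error handling mostly omitted, and the linear ramp is just a placeholder) looks something like this:

#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>

/* Load an identity (linear) per-channel curve into a CRTC via XRandR.
 * Real color management code would fill the ramps from a display profile. */
void set_linear_gamma(Display *dpy, RRCrtc crtc)
{
    int size = XRRGetCrtcGammaSize(dpy, crtc);
    if (size < 2)
        return;

    XRRCrtcGamma *gamma = XRRAllocGamma(size);
    if (!gamma)
        return;

    for (int i = 0; i < size; i++) {
        unsigned short v = (unsigned short)((65535L * i) / (size - 1));
        gamma->red[i] = gamma->green[i] = gamma->blue[i] = v;
    }

    XRRSetCrtcGamma(dpy, crtc, gamma);
    XRRFreeGamma(gamma);
}

A per-plane KMS color pipeline that hopes for broad color management support would presumably need an equally card-independent path for this kind of per-channel curve, which seems to be the crux of both Graeme's and Dave's concerns.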