Søren Sandmann wrote:
However, unless using 16 bpp for the internal buffers is really a performance win, there doesn't seem to be any benefit to doing it this way over just dithering the final image. And I definitely want to see convincing numbers before believing that 16 bpp for internal buffers is a performance benefit.
No, you are confusing dithering with just adding enough noise to hide the quantization artifacts. And that is a *lot* of noise. Imagine the destination is 1-bit and you want to show a grayscale image on it: it would turn into black and white blotches, and I think enough noise to hide the edges of those blotches would result in no image being visible at all!
Dithering involves adding the pattern to the source before quantization, such that a pixel of 5.5 is much more likely to end up >= 6 than a pixel of 5.0, and thus on average will be brighter. If both of these are converted to 5.0 before the pattern is added, then both will end up at the same level.
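To make that concrete, here is a minimal sketch in plain C (not Pixman code; the 2x2 Bayer thresholds and function names are just illustrative), with the pattern added to the source value before truncation:

#include <stdio.h>

/* Ordered-dither thresholds in [0,1): (m + 0.5) / 4 for m = {0,2; 3,1}. */
static const float bayer2x2[2][2] = {
    { 0.125f, 0.625f },
    { 0.875f, 0.375f },
};

/* Quantize a value given in destination units (e.g. a 5-bit channel),
 * adding the dither threshold *before* truncating. */
static int
dither_quantize (float value, int x, int y)
{
    return (int)(value + bayer2x2[y & 1][x & 1]);
}

int
main (void)
{
    float sources[2] = { 5.0f, 5.5f };

    for (int i = 0; i < 2; i++) {
        int sum = 0;
        for (int y = 0; y < 2; y++)
            for (int x = 0; x < 2; x++)
                sum += dither_quantize (sources[i], x, y);
        printf ("source %.1f -> average quantized %.2f\n",
                sources[i], sum / 4.0f);
    }
    return 0;
}

Averaged over the 2x2 tile, the 5.5 source comes out at 5.50 and the 5.0 source at 5.00; truncate both to 5 first and add the pattern afterwards, and they come out identical.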
Error diffusion produces the highest-quality results, such as really silky smooth gradients. However, it is a pain to implement on the GPU, so most schemes add a fixed pattern instead.
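For comparison, here is a minimal single-channel Floyd-Steinberg sketch (plain C, illustrative only, not anything from Pixman). The way each pixel's quantization error is pushed onto neighbours that have not been processed yet is exactly the serial dependency that makes it such a pain on the GPU:

#include <stdio.h>

/* Quantize a width x height buffer of values given in destination units
 * (e.g. 0..31 for a 5-bit channel), diffusing each pixel's rounding
 * error onto the neighbours to the right and below. */
static void
floyd_steinberg (float *img, int width, int height, int max_level)
{
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            float old = img[y * width + x];
            float q = (float)(int)(old + 0.5f);
            if (q < 0.0f)             q = 0.0f;
            if (q > (float)max_level) q = (float)max_level;
            float err = old - q;
            img[y * width + x] = q;

            /* Distribute the error to unprocessed neighbours
             * (7/16 right, 3/16 below-left, 5/16 below, 1/16 below-right). */
            if (x + 1 < width)
                img[y * width + x + 1] += err * 7.0f / 16.0f;
            if (y + 1 < height) {
                if (x > 0)
                    img[(y + 1) * width + x - 1] += err * 3.0f / 16.0f;
                img[(y + 1) * width + x] += err * 5.0f / 16.0f;
                if (x + 1 < width)
                    img[(y + 1) * width + x + 1] += err * 1.0f / 16.0f;
            }
        }
    }
}

int
main (void)
{
    /* A 2 x 4 gray ramp quantized down to just two levels (0 and 1). */
    float img[8] = { 0.2f, 0.4f, 0.6f, 0.8f,
                     0.2f, 0.4f, 0.6f, 0.8f };

    floyd_steinberg (img, 4, 2, 1);
    for (int y = 0; y < 2; y++) {
        for (int x = 0; x < 4; x++)
            printf ("%g ", img[y * 4 + x]);
        printf ("\n");
    }
    return 0;
}

The scan order is the problem: every pixel depends on errors from pixels processed before it, so there is no straightforward way to run rows or columns in parallel, which is why GPU implementations fall back on a precomputed pattern instead.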
