On Mon, 20 May 2013 13:56:27 -0500 Jason Ekstrand <[email protected]> wrote:
> On Mon, May 20, 2013 at 4:00 AM, Pekka Paalanen <[email protected]> wrote:
>
> > On Thu, 16 May 2013 16:43:52 -0500
> > Jason Ekstrand <[email protected]> wrote:
> >
> > > The point of this series is to allow surfaces to render the same size on
> > > different density outputs.
> >
> > Are you serious? Really? Same size measured in meters?
>
> No, measured in inches. :-P
>
> Seriously though. While we can't make it *exactly* the same on all your
> displays, we should be able to make it usably close.

I do not think that should be a goal here, on the core protocol level.
It's a can of worms, as you can see from all the fractional-pixel
problems raised, which the current integer-only proposal does not have.

> > I do not think that will ever work:
> > http://blogs.gnome.org/danni/2011/12/15/more-on-dpi/
> > and doing it via scaling is going to be worse.
>
> Yes, scaling looks bad. I don't know that we can avoid it in all cases
> (see also the 200 DPI and 300 DPI case).

Sorry, which email was this in?

> > Going for the same size is a very different problem than just trying to
> > get all apps readable by default. I'm not sure same size is a better
> > goal than same look.
> >
> > And on a side note:
> > http://web.archive.org/web/20120102153021/http://www.fooishbar.org/blog
>
> What I would like in the end is a per-output slider bar (or something of
> that ilk) that lets the user select the interface size on that output.
> Sure, they probably won't be able to select *any* resolution (the
> compositor may limit it to multiples of 24 dpi or something). And they can
> certainly make an ugly set-up for themselves. However, I want them to be
> able to make something more-or-less reasonable, and I see no reason why the
> compositor shouldn't coordinate this and why this "scale factor" can't be
> used for that.

I think that is an orthogonal issue. That would be a DE thing, just like
choosing font sizes. Buffer_scale, OTOH, is a Wayland core feature, and
is best kept as simple as possible. The slider would control window and
widget sizes, while buffer_scale only controls the resolution they are
rendered in (there is a rough client-side sketch at the end of this
mail). Or...

> My primary concern is that integer multiples of 96 DPI isn't going to be
> enough granularity. I don't know whether we can really accomplish a higher
> granularity in a reasonable way.

For the cases where buffer_scale cannot offer a usable resolution, we can
still fall back to arbitrary scaling in the compositor via private surface
or output transformations. That does not allow the pixel-accurate,
high-resolution presentation of windows that buffer_scale does, but I
believe it is an acceptable compromise. Didn't OS X or something do
something similar for the 1.5 factor? I recall someone mentioning that,
but I couldn't find it.

...or the slider could control buffer_scale and output scaling in tandem,
using buffer_scale for integer factors (which the GUI would recommend)
and realizing non-integer factors by some combination of the two (also
sketched below). Naturally, the units on the slider would be scaling
factors, not DPI, since DPI is meaningless to a user.

I can imagine how hilarious it would be to have "Please try to use
integer multiples of 96 DPI for the best performance and look" in the
GUI. ;-)
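To make the above concrete, here is a minimal sketch of what the
integer-only proposal asks of a client, assuming the names used in the
series under discussion (a wl_output.scale event and a
wl_surface.set_buffer_scale request). draw_frame() is a hypothetical
helper, and listener setup and error handling are omitted:

    #include <wayland-client.h>

    static int32_t output_scale = 1;

    /* wl_output advertises an integer scale factor per output; hook
     * this into a struct wl_output_listener alongside the
     * geometry/mode/done handlers (elided here). */
    static void
    handle_output_scale(void *data, struct wl_output *output,
                        int32_t factor)
    {
            output_scale = factor;
    }

    static void
    redraw(struct wl_surface *surface, int32_t width, int32_t height)
    {
            /* Render at output_scale times the logical surface size;
             * draw_frame() is a hypothetical helper returning a
             * wl_buffer of the given pixel size. */
            struct wl_buffer *buffer =
                    draw_frame(width * output_scale,
                               height * output_scale);

            /* Tell the compositor the buffer is pre-scaled, so it can
             * be shown pixel-for-pixel instead of being scaled up.
             * Damage stays in surface-local (logical) coordinates. */
            wl_surface_set_buffer_scale(surface, output_scale);
            wl_surface_attach(surface, buffer, 0, 0);
            wl_surface_damage(surface, 0, 0, width, height);
            wl_surface_commit(surface);
    }

A client that ignores all of this just keeps buffer_scale at its default
of 1 and renders exactly as before, which is what keeps the integer-only
proposal simple.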
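And for the tandem idea, the arithmetic is just a decomposition of the
user's fractional factor into an integer buffer_scale plus a residual
compositor-side transform. A toy illustration, with all choices here
(rounding up, the 1.5 example) being mine, not anything the protocol
mandates:

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
            double desired = 1.5;  /* user's slider position */

            /* Round up so clients render at the next integer scale and
             * the compositor scales *down*, which looks better than
             * scaling a too-small buffer up. */
            int buffer_scale = (int)ceil(desired);
            double output_scale = desired / buffer_scale;

            /* desired = 1.5 -> buffer_scale 2, output transform 0.75:
             * the OS X-style trick mentioned above. */
            printf("buffer_scale = %d, output transform = %.3f\n",
                   buffer_scale, output_scale);
            return 0;
    }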
Thanks,
pq