Quentin Glidic wrote:

> I use a mouse button as my Push-to-Talk button. This button is
> detected as “History back” by my web browser, and *I want to have
> both working.*

I think you are confused about what I was complaining about, which was the idea that there is some difference between your bind "actions" and the events sent from the compositor to clients: basically, if the compositor is going to do a lot of decoding, then the decoded result should be sent to the clients in events.

As for your request, I think whether a binding "eats" the event can be a fixed setting for each type of gesture: i.e. mouse buttons and modifier keys are not eaten, all keys that produce text are eaten, etc.
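To make that concrete, here is a purely hypothetical sketch (in C; none of these names exist in any real Wayland protocol) of a fixed per-gesture eat policy the compositor could consult:

    #include <stdint.h>

    /* Hypothetical sketch only: a per-binding "eat" policy the
     * compositor checks when a bound event fires. */
    enum binding_eat_policy {
        BINDING_PASS_THROUGH, /* client still sees the event */
        BINDING_EAT           /* event is consumed by the binding */
    };

    struct binding {
        uint32_t event_code;            /* e.g. BTN_SIDE or KEY_A */
        enum binding_eat_policy policy; /* fixed per gesture type */
    };

With a default of BINDING_PASS_THROUGH for mouse buttons and bare modifiers, the Push-to-Talk binding and the browser's "History back" would both keep working.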

> Also, binding gestures and complex stuff is a really good feature to
> have (think of three-finger scrolling to turn the volume up/down as
> an example), and more easily done on the compositor side, since
> doing it client-side would require exposing yet more information
> (pointer movement, touchscreen touches) to the client.

I agree that translating "gestures" should be done by the compositor.

But I would like to see the simplest gestures attacked first: the "gesture" of pushing a button with the text "FOOBAR" on it should send an event containing the text "FOOBAR" to the client!
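As a purely hypothetical illustration (this is not an existing wl_keyboard event; real Wayland key events carry raw keycodes and leave decoding to the client), such a decoded event could carry something like:

    /* Hypothetical decoded key event; not part of any real protocol. */
    struct decoded_key_event {
        uint32_t serial;
        uint32_t time;
        const char *text;  /* UTF-8 the key produces, e.g. "FOOBAR" */
        uint32_t keysym;   /* keysym, for keys that produce no text */
        uint32_t raw_code; /* raw scancode, still available if wanted */
    };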

Or maybe all the gestures should be done by the client; that would at least be consistent. However, I would prefer that the compositor do it all.

>> Events should certainly be decoded to something more like the
>> xkeysym level.

> AFAIU, the decoding is done by libxkbcommon, and it was designed so
> that clients would have to support that explicitly. This has nothing
> to do with global bindings.

I think that is a serious mistake. The biggest problem I have with remote X (both NX and normal $DISPLAY) is bungled scancodes, because the local X server has to translate key events backwards into phony scancodes. When I log in remotely I always have to run xmodmap with a special table to fix this. That is obviously a buggy hack, and Wayland should fix it.
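For reference, the explicit client-side decoding the quoted text refers to looks roughly like this with libxkbcommon (a minimal sketch: a real Wayland client gets its keymap string from the compositor via wl_keyboard.keymap rather than building one from system defaults):

    #include <stdio.h>
    #include <xkbcommon/xkbcommon.h>

    int main(void)
    {
        struct xkb_context *ctx = xkb_context_new(XKB_CONTEXT_NO_FLAGS);
        struct xkb_keymap *keymap =
            xkb_keymap_new_from_names(ctx, NULL, XKB_KEYMAP_COMPILE_NO_FLAGS);
        struct xkb_state *state = xkb_state_new(keymap);

        /* Wayland delivers evdev keycodes; xkb keycodes are offset by 8. */
        xkb_keycode_t key = 30 + 8; /* evdev KEY_A */

        char buf[8];
        xkb_state_key_get_utf8(state, key, buf, sizeof(buf));
        printf("key produces text: \"%s\"\n", buf); /* prints "a" */

        xkb_state_unref(state);
        xkb_keymap_unref(keymap);
        xkb_context_unref(ctx);
        return 0;
    }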

> Please note that music and video players are not currently using
> multimedia keys directly: mplayer and VLC use the space bar as
> play/pause (while focused, of course), and VLC uses the mouse wheel
> to control volume. This is good design: it keeps the global keys
> usable at the system level (e.g. for controlling the PCM volume).

I want the multimedia app to be able to decide whether the volume up/down keys adjust its volume or the global one. What I proposed was that all events go to the clients, and they can say they did not handle them (thus the volume buttons go to the focused media player, but adjust the global volume if the focused app ignores them). I think there is some dislike for this idea, so I now propose that it be moved to the binding API.

> That would require a round trip, and that would be wrong in some
> (many?) cases: I have two music players A and B, and I focused B
> last. Why would A have to decline the action if it can handle it
> perfectly, since it does not have the knowledge that B exists at
> all?

It would be sent to B first (as you proposed, events are sent to the last-focused client). The only change I proposed is that B could say "I did not use it", and then (perhaps) A gets it.
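A compositor-side sketch of that fallback chain (purely hypothetical: offer_key and adjust_global_volume are made-up names, and in a real protocol each offer would be an asynchronous round trip, which is exactly the cost objected to above):

    #include <stdbool.h>
    #include <stdint.h>

    /* Hypothetical system-level fallback action (stub). */
    static void adjust_global_volume(uint32_t key) { (void)key; }

    struct media_client {
        struct media_client *next; /* ordered by last focus: B, then A */
        bool (*offer_key)(struct media_client *c,
                          uint32_t key);   /* true = "I used it" */
    };

    /* Offer the key to each client in focus order; stop at the first
     * one that claims it, otherwise fall back to the global action. */
    static void dispatch_media_key(struct media_client *head, uint32_t key)
    {
        for (struct media_client *c = head; c; c = c->next)
            if (c->offer_key(c, key))
                return;            /* B used it; A never sees it */
        adjust_global_volume(key); /* nobody claimed it */
    }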