When I first became aware of this patch, my initial response was excitement that Wayland would finally take gamepad input into account when doing screensaver suppression.

However, with the idea of passing file descriptors for bottleneck avoidance, I've been growing increasingly concerned that no mention has been made of how it would interact with screensaver suppression.

For the following reasons, activity detection is the only satisfactory means of screensaver suppression on my desktop and, under X11, I've actually been meaning to build a daemon which injects no-op keycodes via uinput in response to evdev activity:

1. I run my videos full-screened on one of my three monitors, but
   explicitly disable screensaver suppression in MPV because:

        a. I sometimes fall asleep while watching video and want to
           preserve my monitors' ability to sleep under those
           circumstances.

        b. Screensaver suppression, as implemented by MPV, interferes
           with system idle detection, which is used for triggering
           heuristic behaviour that would otherwise require
           instrumenting my chair with a wireless load sensor.

2. I tend to run my games windowed because most games crop the HUD
   pathologically when presented with a 5:4 aspect ratio (which means
   that suppressing the screensaver based on fullscreened windows is
   not helpful).

3. Experience has shown that asking applications to perform manual
   screensaver suppression is unreliable at best and tends to
   result in strange, buggy interactions with other programs.
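The no-op-injection daemon I mention above would boil down to a simple rate-limited trigger: gamepad activity on an evdev node causes a fake keypress through uinput, which resets the idle counter. A sketch of the core throttling logic, with the actual injection stubbed out in comments (the python-evdev package and the choice of KEY_F24 as the no-op are my assumptions, not anything from this thread):

```python
import time

class IdleKicker:
    """Decide when gamepad activity should trigger a no-op key injection.

    Injections are rate-limited so that, e.g., a held analogue stick
    doesn't spam the input layer with thousands of fake keypresses.
    """
    def __init__(self, min_interval=30.0, clock=time.monotonic):
        self.min_interval = min_interval  # seconds between injections
        self.clock = clock
        self.last_inject = float("-inf")

    def on_gamepad_event(self):
        """Return True if a no-op keycode should be injected now."""
        now = self.clock()
        if now - self.last_inject >= self.min_interval:
            self.last_inject = now
            return True
        return False

# With python-evdev (an assumption), the injection itself might look like:
#   ui = evdev.UInput()
#   ui.write(evdev.ecodes.EV_KEY, evdev.ecodes.KEY_F24, 1)  # press
#   ui.write(evdev.ecodes.EV_KEY, evdev.ecodes.KEY_F24, 0)  # release
#   ui.syn()
```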

I worry that, if the gaming input protocol doesn't take this into account, I might actually find myself worse off than under X11.

(Under X11, I'd have to proxy gamepad activity to devices X11 takes into consideration. I worry that, in the Wayland world where monkey-patching isn't the word of the day, I might even wind up needing to go the other way and work around non-gamepad-input screensaver suppression by writing something which maintains its own definition of "system idle" and uses home automation gear to actually cut off power to the monitors to overrule Wayland.)

P.S. Because my inbox is such a horrendous mess, it's not feasible for me to subscribe to mailing lists and then filter down (and GMane NNTP isn't working for me for some reason), so I'm watching this thread by manually polling the list archive for updates twice a day.

Hi all,

sorry for the fly-by commenting, but I would like to mention an idea
that has popped up in the past for game controller support, IIRC on
wayland-devel at .

That idea was that the compositor would not translate everything into
Wayland protocol. Instead, the compositor would hand out open evdev
device file descriptors to applications when game controller focus is
given, and revoke those file descriptors (make them dead) when game
controller focus is taken away.

(Hmm, do we have a mechanism to revoke fds or was that never merged in
the kernel?)
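The fd hand-out itself is ordinary SCM_RIGHTS passing over the Unix socket. A minimal sketch using Python's socket.send_fds/recv_fds, with a socketpair standing in for the compositor-client connection and a temp file standing in for the evdev node (both my stand-ins, not real protocol):

```python
import os
import socket
import tempfile

# Stand-in for the compositor<->client Wayland connection.
compositor, client = socket.socketpair(socket.AF_UNIX, socket.SOCK_STREAM)

# Compositor side: open the "device" (a temp file here, not a real
# evdev node) and hand the open descriptor over with SCM_RIGHTS.
dev = tempfile.TemporaryFile()
dev.write(b"EV_KEY 1\n")
dev.flush()
socket.send_fds(compositor, [b"gamepad"], [dev.fileno()])

# Client side: receive the fd and read from it directly, so every
# subsequent event bypasses the compositor entirely.
msg, fds, _flags, _addr = socket.recv_fds(client, 1024, 1)
event_fd = fds[0]
data = os.pread(event_fd, 16, 0)
```

Revocation is the part without an obvious counterpart: once the client holds a dup of the fd, the compositor can't unilaterally make it dead without kernel help.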

If you have something wild like a motion sensor reporting at 1 kHz, the compositor might actually become a performance bottleneck, or at least consume quite a bit of CPU, if it were forwarding every event. Passing device file descriptors to the app would avoid that.

This means that applications would need to understand the kernel ABI
and do all the semantic mappings themselves, plus handle different
capability sets etc. A possible solution to that could be a
"libgamecontroller", somewhat akin to libinput except used by apps.
That would also support game apps that did not use a display server at
all. I'm not sure I remember right, but someone might have started on
one already.
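"Understanding the kernel ABI" here means parsing raw struct input_event records off the fd, which is exactly the kind of detail a "libgamecontroller" would hide. A sketch of that parsing, assuming the 64-bit Linux layout (two longs for the timeval, then type, code, value; the 'l' fields are narrower on 32-bit):

```python
import struct

# struct input_event on 64-bit Linux: struct timeval (two longs),
# then __u16 type, __u16 code, __s32 value.
EVENT_FMT = "llHHi"
EVENT_SIZE = struct.calcsize(EVENT_FMT)

EV_KEY = 0x01
BTN_SOUTH = 0x130  # the "A" button on most gamepads

def parse_events(buf):
    """Yield (type, code, value) tuples from raw evdev bytes."""
    for off in range(0, len(buf) - EVENT_SIZE + 1, EVENT_SIZE):
        _sec, _usec, etype, code, value = struct.unpack_from(EVENT_FMT, buf, off)
        yield etype, code, value

# Synthetic record: BTN_SOUTH pressed.
raw = struct.pack(EVENT_FMT, 0, 0, EV_KEY, BTN_SOUTH, 1)
```

On top of this, the library would still have to map codes to semantic gamepad capabilities per device, which is the genuinely hard part.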

The justification why this could work with game controllers while it
does not work with mice, keyboards, etc. is the difference in focus and
state handling. When a user activates a game, the compositor could just
give the game all the game controllers and not care what happens with
them. There is no global state to be managed, like pointer position or
keys held down, and switching focus would be very rare.

However, of course it is possible that in some cases the compositor
would need to catch and react to game controller events, e.g. if there
are no other input devices. I suppose the compositor would need to keep
its own instance of the device open and filter the event stream? Or
maybe the kernel provided a separate evdev device for "system
attention" buttons? I'm not sure.

Even bigger open questions are how to handle features like touchpads. I
also do not know how player ID assignments should be handled: by the
compositor or by the app? Who would, e.g., set up the player ID LEDs on
a controller?

This email is meant as food for thought, an example of how things might
be designed completely differently. I would also encourage searching the
wayland-devel@ mailing list archives for the earlier game controller /
gamepad threads, both for ideas and for interested people to CC. IIRC
there were quite a few people and long discussions.

Dennis, Jingkui, did you ever consider this approach, and if you did,
what downsides did you see?


Thanks,
pq

--
Stephan Sokolow

Note: My e-mail address IS valid. It's a little trick I use to fool
"smarter" spambots and remind friends and family to use the custom
aliases I gave them.
_______________________________________________
wayland-devel mailing list
[email protected]
https://lists.freedesktop.org/mailman/listinfo/wayland-devel
