Thanks for the reply. We want to build an application for touch-input mapping: it maps a touch input device to an output device (screen) when there are multiple screens.
The mapping needs an input device and an output device. In our design, we traverse all screens and ask the user "is this the screen you want to map?", and the user chooses a touch device by tapping on it, much like the "touch config" feature in Windows 11. In short, we want to build a touch-input-mapping application; if there is a better design, I will discuss it with our designer.

At 2023-11-22 06:31:07, "Peter Hutterer" <peter.hutte...@who-t.net> wrote:
>On Tue, Nov 21, 2023 at 04:20:10PM +0800, weinan wang wrote:
>> Hey Guys,
>> We have an application that needs to get the touch device
>> corresponding to the touch event, and on X11, we can get the device ID
>> that sends this event according to the XI_Touch events, and then find
>> the device we want according to this ID. But on Wayland, it seems
>> that neither wl_seat nor wl_touch can get device-related information.
>> This application runs with user permissions, so we can't use libinput
>> to get more low-level events.
>> Please ask if there is any other way to get the device that sends the
>> event?
>
>This is not possible in wayland, physical devices are abstracted into
>wl_pointer/wl_keyboard/wl_touch and the underlying device is not exposed
>anywhere through wayland or even current wayland-protocols.
>
>The sole exception is the wl_tablet protocol which is per physical
>device but that's due to how tablets are being used.
>
>You don't usually have access to the libinput context inside the
>compositor either (because Wayland itself doesn't require libinput, it's
>an implementation detail).
>
>The question here is: what are you trying to achieve? maybe there's a
>different way to do it than having access to the specific physical
>device.
>
>Cheers,
>   Peter
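For what it's worth, the calibration flow described above (walk each screen, prompt the user to tap it, record which device produced the tap) can be sketched independently of the display server. This is only a hypothetical sketch: all names here (`calibrate`, `wait_for_tap`, the device identifiers) are made up, and the display-server-specific part (reading the source device from XI2 touch events on X11; not currently possible on Wayland, per the quoted reply) is abstracted behind a callable.

```python
# Hypothetical sketch of the screen-to-touch-device calibration flow:
# for each screen, show the "is this the screen you want to map?"
# prompt, wait for a tap, and record which device delivered it.
# The event source is passed in as a callable so the platform-specific
# part stays pluggable (XI2 source device on X11; no equivalent exists
# in core Wayland protocols).

from typing import Callable, Dict, List

def calibrate(screens: List[str],
              wait_for_tap: Callable[[str], str]) -> Dict[str, str]:
    """Return a mapping from screen name to touch-device identifier.

    wait_for_tap(screen) is expected to block until the user taps
    while `screen` shows the prompt, then return an identifier for
    the device that produced the touch.
    """
    mapping: Dict[str, str] = {}
    for screen in screens:
        mapping[screen] = wait_for_tap(screen)
    return mapping

# Usage with a stubbed event source standing in for real touch events:
taps = iter(["usb-touchscreen-0", "usb-touchscreen-1"])
result = calibrate(["HDMI-1", "eDP-1"], lambda screen: next(taps))
print(result)  # {'HDMI-1': 'usb-touchscreen-0', 'eDP-1': 'usb-touchscreen-1'}
```

Keeping the per-platform device lookup behind `wait_for_tap` means the rest of the tool works the same on X11 and on any future Wayland protocol that exposes the source device.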