Hi,

How are touchscreens in multi-screen setups supposed to work nowadays? The issue with touchscreens is that their input events have to be transformed according to the configuration of the screen they are physically connected to.
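
To illustrate what I mean by "transformed": somewhere, something has to apply roughly the per-output mapping sketched below. This is only a rough sketch under my own assumptions, not actual evtouch or evdev code; the struct, function names and geometry values are invented.

/* Rough sketch only -- not evtouch/evdev code, all names invented.
 * Map a raw touch coordinate, reported over the device's full axis
 * range, onto one RandR output's rectangle inside the combined
 * framebuffer, instead of onto the whole screen.
 */
#include <stdio.h>

struct output_geom {
    int x, y;          /* position of the output in the framebuffer */
    int width, height; /* size of the output in pixels */
};

static void map_touch_to_output(const struct output_geom *out,
                                int raw_x, int raw_y,
                                int raw_max_x, int raw_max_y,
                                int *screen_x, int *screen_y)
{
    *screen_x = out->x + raw_x * out->width  / (raw_max_x + 1);
    *screen_y = out->y + raw_y * out->height / (raw_max_y + 1);
}

int main(void)
{
    /* e.g. the right-hand 1280x1024 output of a 2560x1024 screen */
    struct output_geom right = { 1280, 0, 1280, 1024 };
    int sx, sy;

    /* a touch in the middle of the device (14-bit axes assumed) */
    map_touch_to_output(&right, 8192, 8192, 16383, 16383, &sx, &sy);
    printf("%d,%d\n", sx, sy); /* -> 1920,512 */
    return 0;
}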
For a concrete example, I have a dual-screen setup with NextWindow USB (HID) touchscreens. This used to work nicely with the evtouch driver, where you could configure each touchscreen in xorg.conf to bind to a specific screen number, which evtouch would then use to deliver the input events transformed to that screen. Now, with xrandr instead of Xinerama there is only one screen, and the above no longer works: the coordinates get scaled to the entire screen instead of to the individual outputs.

What is the suggested solution for this? Should RandR output tracking be added to evtouch, or should evdev handle this instead?

--
Bye, Peter Korsgaard
