Okay, but I am still not seeing a clear answer to "why are touch pads
and touch screens different?" Perhaps the client has to do gestures, but
it seems to me that the answer will be the same whether the controlling
surface coincides with the screen or not.
The fact that you can point to som
On 21/02/2015 06:08, Bill Spitzak wrote:
On 02/19/2015 04:49 PM, Peter Hutterer wrote:
Unless you have the context you cannot know, and the only thing that has
that context is the client. Sure, you can make all sorts of exceptions
("but double-tap should always be double-tap"), but that just changes the
caliber you're going to shoot yourself with.
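Peter's point about context can be made concrete with a sketch. Even a "simple" double-tap is defined by a timeout and a movement slop, and the right values depend on the UI the taps land in, which only the client knows. Everything below is hypothetical illustration, not libinput API:

```c
/* Sketch: a minimal tap pairer whose verdict depends entirely on
 * client-supplied context (timeout and slop). All names are made up. */
#include <stdbool.h>
#include <stdlib.h>

struct tap_event { int ms; int x; int y; };   /* time and position of one tap */

struct tap_ctx {
    int timeout_ms;   /* max gap between the two taps; UI-dependent */
    int slop_px;      /* max movement between the two taps; DPI/UI-dependent */
    struct tap_event last;
    bool have_last;
};

/* Feed one completed tap; returns true when it forms a double-tap
 * under this context's parameters. */
static bool feed_tap(struct tap_ctx *c, struct tap_event ev)
{
    bool dbl = c->have_last &&
               ev.ms - c->last.ms <= c->timeout_ms &&
               abs(ev.x - c->last.x) <= c->slop_px &&
               abs(ev.y - c->last.y) <= c->slop_px;
    c->have_last = !dbl;   /* a recognized double-tap consumes both taps */
    c->last = ev;
    return dbl;
}
```

The same physical pair of taps is a double-tap under one context and two single taps under another (say, a canvas that wants a 100 ms window versus a file manager happy with 300 ms), which is why a blanket "double-tap is always double-tap" rule in the library can't work.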
On Thu, Feb 19, 2015 at 01:15:31PM -0800, Bill Spitzak wrote:
I think I'm not explaining my question right.
I fully think it is correct for libinput to do gesture recognition.
My question is why you think this should not be done for touch screens.
I think it should be done for them, and for every other input device in
the world (including mundane things
On Wed, Feb 18, 2015 at 11:36:18AM -0800, Bill Spitzak wrote:
On 02/18/2015 04:26 AM, Hans de Goede wrote:
For touchscreens we always send raw touch events to the compositor, and the
compositor or application toolkits do gesture recognition. This makes sense
because on a touchscreen, which window / widget the touches are over is
important context for interpreting gestures.
On touchpads however we ne