> On 20 Apr 2017, at 11:31, Alexandru Croitor <alexandru.croi...@qt.io> wrote:
> 
> So I found out about this recently as well: touch events are not sent to 
> QQuickItems on macOS by default, because QNativeGestures are sent instead, 
> since the OS itself delivers high-precision gestures.
> 
> To receive touch events you need to use private API at the moment. Have a 
> look at the implementation of
> void QQuickMultiPointTouchArea::setTouchEventsEnabled(bool enable)
> 
> You would need to get the relevant function pointer and call 
> registerTouchWindow on your custom QQuickItem.

Right, that’s because of an unfortunate OS limitation: you can either get 
touchpoints from the trackpad or gesture events (pinches, swipes, etc.), but 
not both.  A trackpad isn’t a touchscreen.  The first finger that you put 
down lightly on the surface moves the virtual mouse cursor and generates only 
mouse move events.  The second finger doesn’t generate a touchpoint, because 
the OS is busy trying to detect 2-finger gestures (there are many of them, 
and it’s not practical to disable that detection).  When you do the 2-finger 
flick, you normally get QWheelEvents in Qt, and those can drive the 
Flickable, so that it has a fairly native feel, like flicking text in Mail or 
Safari or whatever.
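
If you want to handle that flick yourself in a custom item, the wheel events 
are easy to consume directly.  A minimal sketch (the FlickSurface class and 
its content-offset bookkeeping are made up for illustration; pixelDelta() and 
angleDelta() are real QWheelEvent API):

    #include <QQuickItem>
    #include <QWheelEvent>

    class FlickSurface : public QQuickItem   // made-up name
    {
        Q_OBJECT
    public:
        FlickSurface() { setFlag(ItemHasContents); }  // we paint our own content

    protected:
        void wheelEvent(QWheelEvent *event) override
        {
            // On a macOS trackpad the 2-finger flick arrives as wheel events
            // carrying high-resolution pixelDelta(); a plain mouse wheel
            // only provides angleDelta() in 1/8-degree steps.
            const QPoint delta = event->pixelDelta().isNull()
                    ? event->angleDelta() / 8     // fall back for real mice
                    : event->pixelDelta();
            m_contentY -= delta.y();              // scroll our content
            update();                             // schedule a repaint
            event->accept();
        }

    private:
        qreal m_contentY = 0;
    };
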
But because MultiPointTouchArea enables touch events via that private API, 
gesture detection is disabled for that one NSView (which is normally the 
window containing the whole QtQuick scene), and that means you can get the 
touchpoints when you place 2 or more fingers on the trackpad.  It’s not a 
direct touch device, though: you only know where the first point is onscreen 
(it’s the cursor), and the remaining ones will be nearby, offset according to 
how far apart the fingers are on the trackpad.
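
For reference, the private-API dance looks roughly like this (a sketch 
modeled on QQuickMultiPointTouchArea::setTouchEventsEnabled(); the qpa/ 
headers are private API, so this can break in any Qt release):

    #include <QGuiApplication>
    #include <QQuickItem>
    #include <qpa/qplatformnativeinterface.h>

    static void setTouchEventsEnabled(QQuickItem *item, bool enable)
    {
        // Ask the platform plugin for its "registertouchwindow" function;
        // only the cocoa plugin provides one, so a null result is normal
        // on other platforms.
        QPlatformNativeInterface *iface = QGuiApplication::platformNativeInterface();
        QPlatformNativeInterface::NativeResourceForIntegrationFunction fn =
                iface->nativeResourceFunctionForIntegration("registertouchwindow");
        if (!fn)
            return;
        typedef void (*RegisterTouchWindowFn)(QWindow *, bool);
        reinterpret_cast<RegisterTouchWindowFn>(fn)(item->window(), enable);
        // Note: this flips touch delivery for the whole NSView (the whole
        // QtQuick window), not just for the one item.
    }
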
But you will have trouble trying to tap on something onscreen with the second 
or third finger, say, because there is no visual feedback about where that 
finger will land until you actually put it down - and then any feedback is up 
to the application.  And if you have some 3-finger OS gestures enabled in 
System Preferences, those will still fire when you have 3 fingers in play, 
and interfere with what you are trying to do in the MPTA.

With all those limitations, MPTA isn’t actually that useful on macOS (except 
just to visualize the finger positions for a demo), and most likely you’re 
not going to have a great experience with a custom Item either (if you enable 
touchpoints via the private API, and thus disable gestures, that applies to 
the whole scene).  We prefer to have the OS-generated gesture events by 
default, because they enable smoother interaction with a native feel.  
PinchArea works great with gesture events, and it doesn’t even have to do as 
much calculation, because the pinch event tells it exactly what to do: how 
much to zoom and how much to rotate.  (It may be that later on we will prefer 
native gesture events whenever possible on other platforms too, but on macOS 
we were practically forced into it.)
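
Handling those gesture events in a custom item is straightforward too.  A 
minimal sketch (QNativeGestureEvent and the Qt::ZoomNativeGesture / 
Qt::RotateNativeGesture types are public API since Qt 5.2; the GestureItem 
class is made up, and the rotation sign may need flipping depending on the 
convention you want):

    #include <QQuickItem>
    #include <QNativeGestureEvent>

    class GestureItem : public QQuickItem   // made-up name
    {
        Q_OBJECT
    protected:
        bool event(QEvent *e) override
        {
            if (e->type() == QEvent::NativeGesture) {
                auto *ng = static_cast<QNativeGestureEvent *>(e);
                switch (ng->gestureType()) {
                case Qt::ZoomNativeGesture:
                    // value() is the incremental scale change for this
                    // event; no centroid math needed, unlike raw touchpoints
                    setScale(scale() * (1.0 + ng->value()));
                    return true;
                case Qt::RotateNativeGesture:
                    // value() is the incremental rotation in degrees
                    setRotation(rotation() - ng->value());
                    return true;
                default:
                    break;
                }
            }
            return QQuickItem::event(e);
        }
    };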

What I wish Apple had done instead is send tentative touchpoints right away, 
then recall them whenever a gesture is detected, and let you either veto the 
recall to keep getting touchpoints, or accept the recall to switch over to 
gesture events for the duration of the gesture.  Then we could have quality 
data either way; it would be more flexible, but also more complex.  If done 
right, an API like that could maybe even be portable between iOS and macOS.  
But the trackpad evolved as more of a mouse-with-gestures than a touch 
device.  Touchpad drivers on other platforms tend to be that way too, for 
that matter, with some exceptions.
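
In rough code terms, something like this (entirely hypothetical - no such API 
exists in AppKit or in Qt; the TentativeTouchHandler name and its methods are 
invented just to illustrate the protocol):

    #include <QTouchEvent>

    class TentativeTouchHandler   // hypothetical, for illustration only
    {
    public:
        virtual ~TentativeTouchHandler() = default;

        // Touchpoints arrive immediately, before gesture detection finishes.
        virtual void touchpointsUpdated(
                const QList<QTouchEvent::TouchPoint> &points) = 0;

        // When the OS detects a gesture it asks to "recall" the touchpoints.
        // Return false to veto: you keep receiving raw touchpoints and no
        // gesture events are synthesized.  Return true to accept: touchpoint
        // delivery pauses and gesture events are sent until the gesture ends.
        virtual bool recallTouchpoints(Qt::NativeGestureType detected) = 0;
    };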

On iOS the native APIs are different; you’ve got a real touchscreen after all.

