I think we should do this in steps since everybody is currently starved for
cycles. Basically:
1) Get the win32 widget cleaned up so that it sends W3C touch events
(via WM_TOUCH, which the OS delivers once RegisterTouchWindow is
called), and, as a temporary measure, keep it firing simulated click
events and pixel scroll as it does currently.
2) File bugs on moving simulated clicks into dom, and work on that once
we have the time / people who understand all the current touch-capable
platforms (win32, winrt, android, b2g) to do it.
3) File a bug to investigate moving pixel scroll out of win32 widget and
into front-end code, similar to what metrofx is doing. This work may be
impacted by OMTC / APZC in the near future, so it might be prudent to
wait until that gets fleshed out anyway.
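As a rough illustration of the widget work in step 1: WM_TOUCH delivers TOUCHINPUT records whose coordinates are in hundredths of a physical screen pixel, so the widget has to scale them before building W3C touch events. The sketch below is self-contained and platform-independent; SimpleTouchInput is a stand-in for the real Windows TOUCHINPUT struct, and MakeTouchPoint is a hypothetical helper, not actual widget code.

```cpp
#include <cstdint>

// Stand-in for the Windows TOUCHINPUT struct: WM_TOUCH reports
// touch coordinates in hundredths of a physical screen pixel.
struct SimpleTouchInput {
  int32_t x;   // hundredths of a pixel
  int32_t y;   // hundredths of a pixel
  uint32_t id; // contact identifier, stable across a touch sequence
};

// The pixel-space point a W3C touch event would carry.
struct TouchPoint {
  int32_t x;   // pixels
  int32_t y;   // pixels
  uint32_t id;
};

// Convert one raw WM_TOUCH record into pixel coordinates.
TouchPoint MakeTouchPoint(const SimpleTouchInput& aInput) {
  TouchPoint p;
  p.x = aInput.x / 100;
  p.y = aInput.y / 100;
  p.id = aInput.id;
  return p;
}
```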
With step 1 complete, content should work better and metrofx /
desktopfx should generally match each other. Steps 2 and 3 can be
worked on when we have the time.
Jim
-----Original Message-----
From: Tim Abraldes
Sent: Thursday, April 25, 2013 6:32 PM
Newsgroups: mozilla.dev.platform
To: dev-platform@lists.mozilla.org
Subject: A Proposal for Reorganizing Processing of Touch Input
In this thread [1], we discussed a proposal that aimed to clean up
Windows widget code, speed up performance, and consolidate desktop and
metro logic. Thanks to everyone who participated in that thread! Based
on the input there, and an extensive discussion in #windev, the
following seems to be what the collective mind is converging on in terms
of how touch input should be processed. Because this proposal modifies
shared code (dom), let's discuss in this new thread.
Keep in mind that the following describes the processing of touch input
only! It makes no mention of how other types of input (keyboard, mouse,
stylus, yelling, etc) will be processed.
widget:
* Emit W3C Touch events and W3C Pointer events. No exceptions. No
additional events emitted from widget.
dom:
* Translate Touch/Pointer events into mouse/click events when the touch
input is recognized as a "tap" [2]
* Translate Touch/Pointer events into rotation/pinch-zoom/etc simple
gesture events [2][3]
chrome JS:
* When a sequence of Touch/Pointer events is recognized as a scrolling
gesture, scroll [4]
[1]
https://groups.google.com/forum/?fromgroups=#!topic/mozilla.dev.platform/QxgrqBlqAdk
[2] dom should be aware of whether preventDefault was called on the
touchstart/touchmove events that are part of these gestures and NOT
perform the translation if preventDefault was called.
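To make [2] concrete, here is one way the dom-side tap translation could work: record where and when the touch started, and only synthesize a click if the touch ended quickly, near its starting point, and nothing called preventDefault. The class name, thresholds, and method shapes below are all made up for illustration; this is not Gecko's actual implementation.

```cpp
#include <cmath>

// Hypothetical tap recognizer; thresholds are illustrative.
class TapRecognizer {
 public:
  void TouchStart(double aX, double aY, double aTimeMs) {
    mStartX = aX;
    mStartY = aY;
    mStartTime = aTimeMs;
    mDefaultPrevented = false;
  }

  // Content called preventDefault() on touchstart/touchmove.
  void PreventDefault() { mDefaultPrevented = true; }

  // Returns true if this touch sequence should be translated
  // into mouse/click events.
  bool TouchEnd(double aX, double aY, double aTimeMs) const {
    if (mDefaultPrevented) {
      return false;  // per [2]: no translation after preventDefault
    }
    const double kMaxMovePx = 10.0;       // illustrative slop radius
    const double kMaxDurationMs = 300.0;  // illustrative tap timeout
    double dist = std::hypot(aX - mStartX, aY - mStartY);
    return dist <= kMaxMovePx && (aTimeMs - mStartTime) <= kMaxDurationMs;
  }

 private:
  double mStartX = 0, mStartY = 0, mStartTime = 0;
  bool mDefaultPrevented = false;
};
```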
[3] This seems like a large amount of effort with less benefit than
other aspects of the proposal. Perhaps the rest of the proposal could
be landed first and this could be follow-up work?
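For the gesture translation in [3], the core arithmetic is small even if the event plumbing is not: a pinch-zoom factor is just the ratio of the current distance between the two touch points to their distance when the gesture began. A hypothetical sketch (function names are invented for illustration):

```cpp
#include <cmath>

// Distance between two touch points.
static double Dist(double x0, double y0, double x1, double y1) {
  return std::hypot(x1 - x0, y1 - y0);
}

// Pinch-zoom scale: ratio of the current finger separation to the
// separation when the gesture started. Values > 1 mean zoom in.
double PinchScale(double startX0, double startY0,
                  double startX1, double startY1,
                  double curX0, double curY0,
                  double curX1, double curY1) {
  double start = Dist(startX0, startY0, startX1, startY1);
  double cur = Dist(curX0, curY0, curX1, curY1);
  return start > 0 ? cur / start : 1.0;  // guard against coincident fingers
}
```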
[4] Eventually, on both desktop and metro, we will use async scrolling
via the APZC, so this chrome JS is an interim solution until APZC takes
over. Both the chrome JS and the APZC would need the ability to
recognize whether preventDefault was called on the touchstart/touchmove
events and ignore the gesture if so.
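As a sketch of what the interim scrolling logic in [4] has to do (written in C++ for consistency with the other examples, though the interim version would live in chrome JS): ignore movement inside a small slop radius so taps don't scroll, then once the slop is exceeded, turn each subsequent move into a scroll delta, unless preventDefault was called. Names and the slop value are assumptions.

```cpp
#include <cmath>

// Illustrative pan/scroll recognizer, not the real implementation.
class ScrollRecognizer {
 public:
  void TouchStart(double aX, double aY) {
    mLastX = aX;
    mLastY = aY;
    mPanning = false;
    mDefaultPrevented = false;
  }

  void PreventDefault() { mDefaultPrevented = true; }

  // On each touchmove: returns true and fills *aDx/*aDy with a scroll
  // delta once the gesture is recognized as a pan; false otherwise.
  bool TouchMove(double aX, double aY, double* aDx, double* aDy) {
    *aDx = *aDy = 0;
    if (mDefaultPrevented) {
      return false;  // per [4]: ignore the gesture after preventDefault
    }
    double dx = aX - mLastX;
    double dy = aY - mLastY;
    if (!mPanning) {
      const double kSlopPx = 8.0;  // illustrative slop radius
      if (std::hypot(dx, dy) < kSlopPx) {
        return false;  // still inside the slop: might be a tap
      }
      mPanning = true;
    }
    *aDx = dx;
    *aDy = dy;
    mLastX = aX;
    mLastY = aY;
    return true;
  }

 private:
  double mLastX = 0, mLastY = 0;
  bool mPanning = false, mDefaultPrevented = false;
};
```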
Open questions:
Is this a crazy amount of effort?
Do we need a less ambitious, incremental solution that would still give
us perf/cleanliness gains?
(How badly) Will this break platforms other than Windows?
_______________________________________________
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform