Touch code should live in its own framework #636

tim-evans opened this Issue Dec 8, 2011 · 4 comments



tim-evans commented Dec 8, 2011

Referencing #635.


charlesjolley commented Dec 8, 2011



publickeating commented Dec 8, 2011

+1 to start the process (after we have a chance to tag off 1.7.2 though?)


tim-evans commented Dec 8, 2011

That sounds good to me!


publickeating commented May 8, 2013

Closing this. Touch/MSPointer support must be built into all code, because we can only roughly guess whether we will receive mouse, touch, or pointer events, and more often than not we will be wrong if we try.

For instance, a device may act as a "touch" device at one moment and a "mouse" device at the next. Desktop devices may have touch screens; mobile devices may have touch screens yet use touch events in one browser and not in another; a user session may swap intermittently between touch and mouse events depending on Bluetooth peripherals; and so on. Since determining the touch support of the device is impossible (all we can determine is whether the browser supports touch events, not whether the device actually uses them), I don't think there is any benefit in trying to separate this out. Every app must be designed to handle touch even if we think the app is only for "desktop", so I now believe we need to keep all user event handling code together.
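The approach described above — attaching every event family unconditionally and treating them as one logical stream, rather than guessing the device type up front — can be sketched roughly as follows. This is a hypothetical illustration, not SproutCore's actual API; the names `EVENT_MAP`, `normalizeEventType`, and `attachUnifiedHandlers` are invented for the example.

```javascript
// Hypothetical sketch (not SproutCore code): map mouse, touch, and
// pointer event names onto one logical event stream, so the same
// handlers run regardless of which family the browser delivers.
const EVENT_MAP = {
  mousedown: 'down', touchstart: 'down', pointerdown: 'down',
  mousemove: 'move', touchmove:  'move', pointermove: 'move',
  mouseup:   'up',   touchend:   'up',   pointerup:   'up',
};

function normalizeEventType(type) {
  // Returns the logical name, or null for events we don't unify.
  return EVENT_MAP[type] || null;
}

// Attach every family unconditionally: a device that "becomes" a touch
// device mid-session (e.g. a Bluetooth mouse disconnects) is still
// handled correctly, because we never guessed the device type up front.
function attachUnifiedHandlers(target, handlers) {
  for (const rawType of Object.keys(EVENT_MAP)) {
    target.addEventListener(rawType, (evt) => {
      const logical = normalizeEventType(evt.type);
      if (logical && handlers[logical]) handlers[logical](evt);
    });
  }
}
```

The key design point matches the comment above: no feature detection decides which listeners to install, so there is no wrong guess to make.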
