Allow touchmode-interface for Microsoft Surface tablets #613
Navigate to about:flags in MS Edge in order to change the default "Always off" to anything else. Once that is done master works with touch events without a problem.
The only issue on master right now is that once that is done, then mouse events are disabled. One of the fixes for that can be #619 (which, this time, was properly tested on touch devices, including an android tablet for clicking and dragging).
Yeah, I'm quite aware that this would only solve part of the problem. The fact still stands that the UI-detection for touch is incomplete. I believe that we should display the touch controls in the UI for any touch device, even if the device has access to an ordinary mouse and keyboard.
Expecting the users to change settings is not a good "solution" in my mind.
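A minimal sketch of what broader touch detection could look like. This is not noVNC's actual code; the function name is my own, and the environment is passed in as a parameter purely so the logic can be exercised outside a browser (in real code you would check `window` and `navigator` directly):

```javascript
// Hypothetical sketch: treat a device as touch-capable if the browser
// exposes touch events OR reports touch hardware, even when a mouse and
// keyboard are also present.
function isTouchDevice(env) {
    // 'ontouchstart' is present when the browser supports touch events;
    // navigator.maxTouchPoints covers browsers that report touch hardware
    // through the pointer-events model (e.g. IE/Edge).
    return ('ontouchstart' in env) ||
           !!(env.navigator && env.navigator.maxTouchPoints > 0);
}
```

With a check like this, the touch controls would be shown on any device that reports touch capability, regardless of whether a mouse is also attached.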
In order to properly support noVNC on MS Surface tablets for instance, we would need both the changes from this PR and the changes from your #619
Agreed, but without that option change, MS Edge pretends not to have a touchscreen, meaning that even though the touch interface will be displayed, it will have no effect since only "mouse" messages will be coming through.
I removed the whole condition for touch events locally and put a console.log statement in the mouse button handler. With the option at its default setting, even though a touchstart listener was registered, only MouseEvents were received.
So in the end it is not a problem with the code as much as with the browser. By the way, the same thing happens with Firefox on desktops (both mobile IE and Firefox work fine with touch).
Maybe include a warning: "We've detected that your device has a touchscreen, but touch support is disabled in this browser. Click here for instructions on how to enable touch support."
Or something like that. What do you think?
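To illustrate the idea, here is a hypothetical check for when that warning would fire (the function name is my own, and the assumption, which would need verifying against Edge's actual behaviour, is that a browser with touch events disabled may still report touch hardware through `navigator.maxTouchPoints`):

```javascript
// Sketch: warn when we see touch hardware but no touch event support,
// i.e. the situation described above in MS Edge with "Always off".
function shouldWarnTouchDisabled(env) {
    const hasTouchHardware =
        !!(env.navigator && env.navigator.maxTouchPoints > 0);
    const touchEventsEnabled = 'ontouchstart' in env;
    return hasTouchHardware && !touchEventsEnabled;
}
```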
Actually, it is even worse than I thought: with touchscreen support disabled, attempting to double tap engages some weird zoom mode that does not show up as zoom in the menu, but still makes elements bigger and messes with the coordinate translation in the Mouse object.
In case you do not have a Windows touch device, we could set up a call and I'll share my screen to show you what is going on, if you'd like. Having more experience with noVNC, you are more likely to understand what is happening.
P.S.: a few double taps later the canvas area disappeared; this is what I saw in the DOM explorer:
You are confusing me here… Didn't you say that #619 is a fix for sending touch events correctly, even on MS Edge?
#619 only concerns mouse events sent to the server. This PR only concerns user interface. The goal should be to fix both these parts.
The functions that are specific to touch devices in the UI are:
I might've claimed something like that in one of my edited messages, but that was before I read up more about touch events and browser support, and before I had actually tried it in the Edge browser. My tests were more concerned with covering a variety of devices than a variety of browsers, which is why I did not notice that Edge does not process touches. (I don't have Firefox installed on my touch device, by the way.)
PR #619 is concerned with mouse events being captured and processed on the client side without interfering with touch event handling, so that dual-input devices can use both touch and mouse when touch is supported by both the browser and the device. In other words: fix mouse control without breaking touch.
@samhed, sorry for confusing you, hope that this reply will introduce more clarity. Also, please let me know if my awkward attempts to be helpful are not helping.
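As a minimal sketch (not noVNC's actual code) of one way mouse and touch can coexist on a dual-input device: remember when the last touch happened and drop the compatibility mouse events browsers synthesize shortly after a tap. The 500 ms window, the names, and the injected clock are my own assumptions for illustration:

```javascript
// Sketch: a filter that lets real mouse input through while ignoring the
// synthetic mouse events that follow a touch.
function makeInputFilter(now) {
    let lastTouch = -Infinity;
    return {
        // Call from the touchstart handler.
        onTouchStart() { lastTouch = now(); },
        // Real mouse input on a dual-input device arrives well after any
        // touch, so it passes; post-tap synthetic events do not.
        shouldHandleMouse() { return now() - lastTouch > 500; }
    };
}
```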
Fixed in d6cb04a