Hello,
I've been using Weylus for a while and recently got the Windows version. When I connect to my desktop from a touchscreen device, I've noticed that even though the Weylus client sends "touch" input, the Weylus server registers it as left clicks and mouse movement. This makes dragging and scrolling impractical: a swipe is interpreted as a left-click drag (e.g. it selects text with the cursor instead of scrolling a web page) rather than as swipe navigation.
With the Linux version of Weylus, if it is set up correctly, touch input from the client is registered as touch input on the host machine (e.g. swiping on a page scrolls it down rather than simulating a left-click drag and selecting text).
My suggestion is to rework how Weylus injects touch input on Windows so that it uses Windows' native touchscreen input mechanism rather than registering everything as mouse movement and left-click drags. This would be similar to how some drawing tablets (with the right drivers) register drags as swipes, behaving like a touchscreen device instead of just a trackpad or mouse.
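For reference, Windows 8 and later expose a Touch Injection API (`InitializeTouchInjection` / `InjectTouchInput` in `user32`) that injects real touch contacts, which the system then turns into pans, flicks, and other touch gestures. A minimal sketch in C of what the server side could do instead of sending mouse events (this is an illustration under my own assumptions, not Weylus's actual code; coordinates and timing are made up):

```c
// Sketch: inject a single touch contact and drag it upward using the
// Windows Touch Injection API (Windows 8+). The system interprets the
// drag as a touch pan (i.e. scrolling), not a left-click text selection.
// Build with: cl inject.c user32.lib
#include <windows.h>
#include <stdio.h>

int main(void) {
    // Allow one simultaneous contact, with the default touch visualization.
    if (!InitializeTouchInjection(1, TOUCH_FEEDBACK_DEFAULT)) {
        fprintf(stderr, "InitializeTouchInjection failed: %lu\n", GetLastError());
        return 1;
    }

    POINTER_TOUCH_INFO contact = {0};
    contact.pointerInfo.pointerType = PT_TOUCH;
    contact.pointerInfo.pointerId = 0;          // first (and only) contact
    contact.pointerInfo.ptPixelLocation.x = 640; // arbitrary start point
    contact.pointerInfo.ptPixelLocation.y = 480;
    contact.touchFlags = TOUCH_FLAG_NONE;
    contact.touchMask = TOUCH_MASK_CONTACTAREA | TOUCH_MASK_PRESSURE;
    contact.pressure = 512;
    contact.rcContact.left   = 640 - 2;          // small contact area
    contact.rcContact.right  = 640 + 2;
    contact.rcContact.top    = 480 - 2;
    contact.rcContact.bottom = 480 + 2;

    // Finger down.
    contact.pointerInfo.pointerFlags =
        POINTER_FLAG_DOWN | POINTER_FLAG_INRANGE | POINTER_FLAG_INCONTACT;
    InjectTouchInput(1, &contact);

    // Swipe upward in small steps; each update moves the contact.
    for (int i = 1; i <= 20; i++) {
        contact.pointerInfo.ptPixelLocation.y = 480 - 10 * i;
        contact.pointerInfo.pointerFlags =
            POINTER_FLAG_UPDATE | POINTER_FLAG_INRANGE | POINTER_FLAG_INCONTACT;
        InjectTouchInput(1, &contact);
        Sleep(10);
    }

    // Lift the finger.
    contact.pointerInfo.pointerFlags = POINTER_FLAG_UP;
    InjectTouchInput(1, &contact);
    return 0;
}
```

If the server mapped each client touch event to `InjectTouchInput` calls like these (with one `POINTER_TOUCH_INFO` per finger for multi-touch), swipes on the client would scroll pages on the Windows host instead of selecting text.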