Touch support #323
Labels: `A-input` (Area: Input handling, IME etc.), `enhancement` (New feature or request), `help wanted` (Extra attention is needed)
At the moment, with the focus on native development (Oni 2), we haven't implemented any touch support. This is glaringly apparent if you try out our playground on any sort of touch device (iPad, iPhone).
And even though our focus is on desktop, lots of laptops and monitors do support touch - so it's important to create a story for this, even if mobile is still a ways out.
However, prior to implementing gestures (#229), we need a way to capture and register touch activity, so that it can be plugged into the 'gesture engine' discussed in #229.
To implement the gesture engine, we'd need to provide these callbacks:

```
onTouchesBegan(interaction, beganTouches, allTouches)
onTouchesMoved(interaction, movedTouches, allTouches)
onTouchesEnded(interaction, endedTouches, allTouches)
```
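As a rough illustration of how these callbacks and the `allTouches` bookkeeping could fit together, here is a hedged sketch in plain JavaScript (since prototyping would start in the JS build). The `TouchTracker` name and the touch record fields (`identifier`, `x`, `y`, `timestamp`) are assumptions for illustration, not a settled Revery API:

```javascript
// Hypothetical sketch: maintain the set of active touches and dispatch
// the three proposed callbacks. Field names are illustrative assumptions.
class TouchTracker {
  constructor(callbacks) {
    this.callbacks = callbacks;  // { onTouchesBegan, onTouchesMoved, onTouchesEnded }
    this.allTouches = new Map(); // identifier -> latest touch record
  }
  begin(touches) {
    for (const t of touches) this.allTouches.set(t.identifier, t);
    this.callbacks.onTouchesBegan(this, touches, [...this.allTouches.values()]);
  }
  move(touches) {
    for (const t of touches) this.allTouches.set(t.identifier, t);
    this.callbacks.onTouchesMoved(this, touches, [...this.allTouches.values()]);
  }
  end(touches) {
    // Ended touches are still visible in allTouches during the callback,
    // then removed.
    this.callbacks.onTouchesEnded(this, touches, [...this.allTouches.values()]);
    for (const t of touches) this.allTouches.delete(t.identifier);
  }
}
```

Here the `interaction` argument is passed as the tracker itself; how the real API would model an interaction is an open question.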
A good overview of what the `touch` object itself could look like is provided by @jordwalke's gesture detection prototype here: https://github.com/reasonml/reason-react/blob/ac13a4308eeb057eeb72571a96550fb1726306a5/explorations/lib/interact/Touch.re

The best way for us to start prototyping this would be in our WebGL + JS build, via the JS Touch API.
We could start implementing events for these APIs on the JS side - and then later add native implementations, which will be different for each platform. But we could at least start prototyping and seeing how the gestures 'feel' with interactions like the scroll bounce in #315 - and once we get the native support, they should feel even faster and more fluid.
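To make the JS-side prototyping concrete, here is a minimal sketch of translating a browser `TouchEvent` into plain touch records. The output record shape (`identifier`, `x`, `y`, `timestamp`) is an assumption, not a settled API:

```javascript
// Translate a browser TouchEvent (touchstart / touchmove / touchend)
// into plain touch records for the gesture engine to consume.
function touchesFromEvent(event) {
  // `changedTouches` lists only the touches that changed in this event;
  // `Touch.identifier` stays stable for the lifetime of a finger contact.
  return Array.from(event.changedTouches, (t) => ({
    identifier: t.identifier,
    x: t.clientX,
    y: t.clientY,
    timestamp: event.timeStamp,
  }));
}
```

On the JS side this would be wired up with something like `canvas.addEventListener('touchstart', e => onTouchesBegan(touchesFromEvent(e)))`, and similarly for `touchmove` and `touchend`.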
This is a prerequisite to implementing gestures in #229, because we need to be able to track the touch points before we can engage a gesture engine.
We could start by adding the following events to our Window object:
- `onTouchesBegan`
- `onTouchesMoved`
- `onTouchesEnded`
And then listen to them here:
https://github.com/revery-ui/revery/blob/master/src/UI/Revery_UI.re#L89
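Once those events are flowing, a first consumer could be hardcoded directly, before any engine exists. As a hedged sketch (the thresholds, names, and touch record fields below are assumptions, not the proposed `GestureRecognizer` API), a minimal 'tap' detector might look like:

```javascript
// A tap = a touch that ends close to where it began, within a time limit.
const TAP_MAX_DISTANCE = 10;  // pixels of allowed finger drift (assumption)
const TAP_MAX_DURATION = 300; // milliseconds (assumption)

function makeTapRecognizer(onTap) {
  const startPoints = new Map(); // identifier -> touch record at touchesBegan
  return {
    onTouchesBegan(touches) {
      for (const t of touches) startPoints.set(t.identifier, t);
    },
    onTouchesMoved(_touches) {
      // A stricter recognizer could cancel here once drift exceeds the limit.
    },
    onTouchesEnded(touches) {
      for (const t of touches) {
        const start = startPoints.get(t.identifier);
        startPoints.delete(t.identifier);
        if (!start) continue;
        const drift = Math.hypot(t.x - start.x, t.y - start.y);
        const duration = t.timestamp - start.timestamp;
        if (drift <= TAP_MAX_DISTANCE && duration <= TAP_MAX_DURATION) {
          onTap(t);
        }
      }
    },
  };
}
```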
And from there - the challenge will be figuring out how to hook up the `GestureRecognizer` engine and the available gestures. A good starting point for the purposes of this PR would be a simple 'tap' gesture, hardcoded without the engine - and then we could expand it with #229.

Open issues:
Should this live in Revery or `reason-glfw`? I think it's reasonable to start in Revery, and we could extract it back out later, if it makes sense.

Goals: