Touch support #323

Open
bryphe opened this issue Feb 9, 2019 · 0 comments
Labels
A-input (Area: Input handling, IME etc.), enhancement (New feature or request), help wanted (Extra attention is needed)

Comments

bryphe (Member) commented Feb 9, 2019

At the moment, with the focus on native development (Oni 2), we haven't implemented any touch support. This is glaringly apparent if you try out our playground on any sort of touch device (iPad, iPhone).

And even though our focus is on desktop, lots of laptops and monitors do support touch - so it's important to create a story for this, even if mobile is still a ways out.

However, prior to implementing gestures (#229), we need a way to capture and register touch activity, so that it can be plugged into the 'gesture engine' discussed in #229.

To implement the gesture engine - we'd need to provide these callbacks:

  • onTouchesBegan(interaction, beganTouches, allTouches)
  • onTouchesMoved(interaction, movedTouches, allTouches)
  • onTouchesEnded(interaction, endedTouches, allTouches)

A good overview of what the touch object itself could look like is provided by @jordwalke's gesture detection prototype here: https://github.com/reasonml/reason-react/blob/ac13a4308eeb057eeb72571a96550fb1726306a5/explorations/lib/interact/Touch.re
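To make that concrete, here's a rough sketch (in Reason) of what the touch record and the callbacks above could look like. Everything here is illustrative - the field names, the `phase` variant, and the handler record are assumptions for discussion, not a committed API:

```reason
/* Illustrative sketch only - names and fields are up for discussion. */
module Touch = {
  type phase =
    | Began
    | Moved
    | Ended;

  type t = {
    /* Stable identifier for the lifetime of this touch point */
    id: int,
    /* Position in window coordinates */
    x: float,
    y: float,
    phase: phase,
  };
};

/* The callbacks from the list above, grouped into a handler record.
   `interaction` is left abstract ('a) since its shape belongs to #229. */
type touchHandlers('a) = {
  onTouchesBegan: ('a, list(Touch.t), list(Touch.t)) => unit,
  onTouchesMoved: ('a, list(Touch.t), list(Touch.t)) => unit,
  onTouchesEnded: ('a, list(Touch.t), list(Touch.t)) => unit,
};
```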

The best way for us to start prototyping this would be in our WebGL + JS build, via the JS Touch API.

We could start implementing events for these APIs on the JS side - and then later add native implementations, which will be different for each platform. But we could at least start prototyping and seeing how the gestures 'feel' with interactions like the scroll bounce in #315 - and once we get the native support, they should feel even faster and more fluid.
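To sketch how that JS-side plumbing could feed into this (building on the `Touch` sketch above): the `jsTouchPoint` record below is just a stand-in for whatever our actual binding to the browser's `TouchEvent` / `Touch` objects would expose - it is not an existing Revery or binding type - and the conversion fires the matching callback:

```reason
/* Stand-in for the data we'd read off a browser Touch object; the real
   binding layer is out of scope for this sketch. */
type jsTouchPoint = {
  identifier: int,
  clientX: float,
  clientY: float,
};

let toTouch = (phase: Touch.phase, p: jsTouchPoint): Touch.t => {
  {Touch.id: p.identifier, x: p.clientX, y: p.clientY, phase: phase};
};

/* Called from the browser handlers (touchstart / touchmove / touchend).
   `changed` are the touches reported by this event; `all` is every active touch. */
let handleBrowserTouchEvent =
    (
      handlers: touchHandlers('a),
      interaction: 'a,
      phase: Touch.phase,
      changed: list(jsTouchPoint),
      all: list(jsTouchPoint),
    ) => {
  let changedTouches = List.map(toTouch(phase), changed);
  let allTouches = List.map(toTouch(phase), all);
  switch (phase) {
  | Touch.Began => handlers.onTouchesBegan(interaction, changedTouches, allTouches)
  | Touch.Moved => handlers.onTouchesMoved(interaction, changedTouches, allTouches)
  | Touch.Ended => handlers.onTouchesEnded(interaction, changedTouches, allTouches)
  };
};
```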

This is a prerequisite to implementing gestures in #229, because we need to be able to track the touch points before we can engage a gesture engine.

We could start by adding the following events to our Window object (a rough sketch follows the list):

  • onTouchesBegan
  • onTouchesMoved
  • onTouchesEnded
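Here's what those additions could look like. The tiny `Event` helper below is defined inline only so the snippet stands on its own - Revery's real event utilities and Window record will differ - and the payload shape (changed touches, all touches) is just one option:

```reason
/* Minimal inline event helper; Revery's actual Event module may differ. */
module Event = {
  type t('a) = ref(list('a => unit));
  let create = () => ref([]);
  let subscribe = (evt, f) => evt := [f, ...evt^];
  let dispatch = (evt, payload) => List.iter(f => f(payload), evt^);
};

/* Each event carries (changedTouches, allTouches), reusing the Touch sketch above. */
type touchPayload = (list(Touch.t), list(Touch.t));

/* Hypothetical additions to the Window record - one event per touch phase. */
type window = {
  onTouchesBegan: Event.t(touchPayload),
  onTouchesMoved: Event.t(touchPayload),
  onTouchesEnded: Event.t(touchPayload),
};

let window: window = {
  onTouchesBegan: Event.create(),
  onTouchesMoved: Event.create(),
  onTouchesEnded: Event.create(),
};

/* The platform layer would dispatch into these, and a consumer (e.g. the UI
   layer) could subscribe: */
let () =
  Event.subscribe(window.onTouchesBegan, ((changed, all)) =>
    Printf.printf(
      "touches began: %d changed, %d total\n",
      List.length(changed),
      List.length(all),
    )
  );
```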

And then listen to them here:
https://github.com/revery-ui/revery/blob/master/src/UI/Revery_UI.re#L89

And from there - the challenge will be figuring out how to hook up the GestureRecognizer engine and the available gestures. A good starting point for a first PR would be a simple 'tap' gesture, hardcoded without the engine - and then we could expand it with #229.
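As a starting point, here's a minimal sketch of such a hardcoded tap detector (the `TapDetector` module, the threshold, and the `onTap` callback are illustrative assumptions, not existing Revery code): remember where each touch began and, when it ends without having moved far, report a tap.

```reason
/* Throwaway tap detector - a placeholder until the real gesture engine
   from #229 exists. The threshold is arbitrary. */
module TapDetector = {
  /* Maximum distance (pixels) a touch may travel and still count as a tap */
  let maxMovement = 10.0;

  /* Starting position of every active touch, keyed by touch id */
  let startPositions: Hashtbl.t(int, (float, float)) = Hashtbl.create(8);

  let onTouchesBegan = (touches: list(Touch.t)) =>
    List.iter(
      (t: Touch.t) => Hashtbl.replace(startPositions, t.id, (t.x, t.y)),
      touches,
    );

  /* `onTap` is called with the (x, y) where the tap ended. */
  let onTouchesEnded = (~onTap, touches: list(Touch.t)) =>
    List.iter(
      (t: Touch.t) =>
        switch (Hashtbl.find_opt(startPositions, t.id)) {
        | Some((sx, sy)) => {
            Hashtbl.remove(startPositions, t.id);
            if (abs_float(t.x -. sx) <= maxMovement
                && abs_float(t.y -. sy) <= maxMovement) {
              onTap((t.x, t.y));
            };
          }
        | None => ()
        },
      touches,
    );
};
```

A first integration could forward the new Window events into this detector and have `onTap` drive the same code path as a mouse click on the hit-tested node; swapping `TapDetector` out for the real engine is then the work tracked in #229.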

Open issues:

  • Should this touch work be implemented in Revery itself, or in reason-glfw? I think it's reasonable to start in Revery, and we could extract it out into reason-glfw later, if that makes sense.

Goals:

  • After this issue is completed, I should be able to 'tap' buttons in our sample app on an iOS device (iPad/iPhone) using our WebGL + JS build.
@bryphe bryphe added enhancement New feature or request help wanted Extra attention is needed labels Feb 9, 2019
@glennsl glennsl added the A-input Area: Input handling, IME etc. label Nov 25, 2019