Generic & user-mappable touch input system #2135

Merged
merged 13 commits into xbmc:master from Montellese:touch_input on Mar 7, 2013


@Montellese
Member

These commits change the way we handle touch input on Android and ideally also on other touch-enabled platforms (with the help of other devs). Right now every touch-enabled platform has its own touch input handling logic and maps every touch event to some platform-specific action, which means a lot of logic is duplicated and hardcoded.

This PR is meant to improve the touch input system with two changes:

  1. The touch input system used by Android has been reworked into a generic implementation based on interfaces which can be implemented by any platform (see the sketch after this list). There's a lower level (ITouchInputHandler) responsible for handling "raw" touch events (touch down/up/move/...) and interpreting them to detect gestures etc. On platforms with built-in gesture recognition (like iOS) the lower-level implementation would basically just forward whatever the built-in logic provides to the proper callbacks of the higher level. The higher level (ITouchActionHandler) can be platform-specific but ideally doesn't have to be. It is responsible for translating touch actions like "single tap", "single long press", "rotate" or "swipe left" into XBMC-specific actions. There's a generic implementation for both levels, but every platform can provide its own implementation or extension of either (or both) levels, which offers maximum flexibility.
  2. The touch actions translated into XBMC-specific actions by the higher level are user-mappable through touchscreen.xml (an example follows below). Available tags right now are <tap>, <longpress>, <doubletap>, <doublelongpress>, <pan>, <swipe direction="left/right/up/down">, <zoom> and <rotate>, and they can be mapped to any existing XBMC action in the same way keyboard or remote mapping works.
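To make the split in the first item concrete, here is a minimal, illustrative sketch of the two levels. The method names and parameters are simplified assumptions; the actual interfaces in the PR carry more events and callbacks:

```cpp
#include <cstdint>

// Hypothetical, simplified event/direction types for this sketch.
enum TouchEvent { TouchDown, TouchUp, TouchMove };
enum SwipeDirection { SwipeLeft, SwipeRight, SwipeUp, SwipeDown };

// Higher level: translates recognized touch actions into XBMC actions.
// There is a generic implementation, but platforms may provide their own.
class ITouchActionHandler
{
public:
  virtual ~ITouchActionHandler() { }
  virtual bool OnSingleTap(float x, float y) = 0;
  virtual bool OnSingleLongPress(float x, float y) = 0;
  virtual bool OnSwipe(SwipeDirection direction, float xDown, float yDown,
                       float xUp, float yUp) = 0;
  // ... plus callbacks for double tap, pan, zoom, rotate, etc.
};

// Lower level: fed raw touch events by the platform's event loop. The
// generic implementation detects gestures itself (as on Android); a
// platform with built-in recognition (iOS) just forwards its results
// to the ITouchActionHandler callbacks.
class ITouchInputHandler
{
public:
  explicit ITouchInputHandler(ITouchActionHandler &handler) : m_handler(handler) { }
  virtual ~ITouchInputHandler() { }
  virtual bool Handle(TouchEvent event, float x, float y,
                      int64_t time, int32_t pointer) = 0;

protected:
  ITouchActionHandler &m_handler;
};
```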
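And for the second item, a hedged example of what a touchscreen.xml mapping could look like. Only the gesture tags are taken from the list above; the wrapper elements follow the existing keyboard keymap convention and are an assumption, while the action names are existing XBMC actions:

```xml
<!-- Illustrative only: the <keymap>/<global>/<touch> nesting mirrors the
     keyboard keymap convention and may differ from the actual schema. -->
<keymap>
  <global>
    <touch>
      <tap>Select</tap>
      <longpress>ContextMenu</longpress>
      <doubletap>FullScreen</doubletap>
      <swipe direction="left">Back</swipe>
    </touch>
  </global>
</keymap>
```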

As I only have an Android phone with touch support I was only able to implement this for Android, but I hope that @Memphiz (or someone else) will look into adopting this for iOS, and someone else for Windows 7/8 touch. The interfaces of ITouchInputHandler might be a bit Android-specific and I'm open to any suggestions to make them more generic. ITouchActionHandler is just a list of methods representing touch actions but obviously can also be extended if necessary.

Last but not least the generic implementation of ITouchInputHandler (CGenericTouchInputHandler) has been refactored to use implementations of the IGenericTouchGestureDetector interface to detect complex touch gestures like swipe, rotate and zoom. That cleans up CGenericTouchInputHandler a bit and should make it easier to add any other touch gestures.
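A rough sketch of the detector idea, with a simplified pointer type; the method names here are assumptions rather than the actual IGenericTouchGestureDetector interface:

```cpp
#include <cstdint>

struct Pointer { float x, y; int64_t time; };  // simplified pointer state

// Each complex gesture (swipe, rotate, zoom, ...) lives in its own detector
// that consumes the same raw events as the input handler. Adding a new
// gesture then only means adding another detector implementation.
class IGenericTouchGestureDetector
{
public:
  virtual ~IGenericTouchGestureDetector() { }
  virtual bool OnTouchDown(int32_t pointer, const Pointer &point) = 0;
  virtual bool OnTouchUp(int32_t pointer, const Pointer &point) = 0;
  virtual bool OnTouchMove(int32_t pointer, const Pointer &point) = 0;
  // true once the detector has recognized or definitely ruled out its gesture
  virtual bool IsDone() const = 0;
};
```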

As always I'm open to feedback on the idea, concept and implementation. Furthermore I really hope that devs with touch devices running other platforms will look into this and see if it can be used to unify XBMC's touch input system. I'd be willing to do the work for Windows 7/8, WP8 and Ubuntu (or whatever Linux OS has touch-capable devices) if I could get my hands on a device.

PS: b48c855 is only listed as a separate commit because I'd like to get some extra feedback on that implementation, i.e. using the screen's DPI value to determine the minimum touch movement in the swipe gesture detection.
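For reference, the idea boils down to something like the following; the physical distance used here is an illustrative assumption, not the value from the commit:

```cpp
// Scale the minimum movement that counts as a swipe by the screen's DPI,
// so the threshold matches a physical distance rather than a fixed number
// of pixels (which would differ wildly between low- and high-DPI screens).
float MinimumSwipeDistance(float screenDPI)
{
  const float minSwipeInches = 0.5f;  // assumed physical travel
  return minSwipeInches * screenDPI;  // threshold in pixels for this screen
}
```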

@Montellese
Member

After some discussion with Memphiz, who looked at how to integrate this with iOS, I've made some changes to the ITouchActionHandler interface. OnSingleTap(), OnSingleLongPress(), OnDoubleTap() and OnSwipe() now take an optional "pointers" parameter which indicates how many fingers/pointers were involved in the touch action. So OnSingleTap() with "pointers" = 2 means that there was a single tap with two fingers/pointers, etc. Because of this change I also removed OnDoubleLongPress(), as it can be represented by OnSingleLongPress() with "pointers" = 2, whereas the meaning of OnDoubleTap() has changed to a touch action where the screen is tapped twice in quick succession (like a double click), which IMO makes more sense.
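Sketched out, the adjusted callbacks look roughly like this (signatures simplified; the default of 1 pointer is an assumption):

```cpp
#include <cstdint>

class ITouchActionHandler  // abbreviated to the affected callbacks
{
public:
  virtual ~ITouchActionHandler() { }
  // pointers = number of fingers involved in the action
  virtual bool OnSingleTap(float x, float y, int32_t pointers = 1) = 0;
  // OnSingleLongPress(x, y, 2) now covers the removed OnDoubleLongPress()
  virtual bool OnSingleLongPress(float x, float y, int32_t pointers = 1) = 0;
  // OnDoubleTap now means "tapped twice in quick succession"
  virtual bool OnDoubleTap(float x, float y, int32_t pointers = 1) = 0;
};
```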

This change affects how we do the keymapping because we need to support all those touch/gesture actions for multiple pointers. What I've come up with is to define e.g. ACTION_TOUCH_TAP and ACTION_TOUCH_TAP_TEN with values that range over 10 numbers. So a tap with two pointers would be ACTION_TOUCH_TAP + 1, and ACTION_TOUCH_TAP_TEN = ACTION_TOUCH_TAP + 9 covers ten pointers. I'm not sure if this is the best way to do this, but our action translation system basically relies on the fact that there is an action ID for every mappable action. If anyone has a better idea, please let me know.
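In code, the scheme amounts to this (the base value is a placeholder, not the ID actually chosen in Key.h):

```cpp
// Reserve a contiguous block of ten action IDs per gesture so that every
// (gesture, pointer count) pair stays an ordinary, keymappable action ID.
#define ACTION_TOUCH_TAP      500                     // placeholder base: 1 pointer
#define ACTION_TOUCH_TAP_TEN  (ACTION_TOUCH_TAP + 9)  // 10 pointers

inline int TapActionId(int pointers)  // pointers in 1..10
{
  return ACTION_TOUCH_TAP + (pointers - 1);  // 2 pointers -> base + 1, etc.
}
```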

Memphiz and I also discovered another problem: how to distinguish between a pan and a swipe gesture. Detecting a swipe gesture is easy, but it is only possible once the gesture is finished, so as long as the gesture isn't finished yet it is also a pan gesture. In the generic touch input handler I wrote (which is used for Android) I send both pan gesture and swipe gesture events, but on iOS if you want to detect swipe gestures you can't detect pan gestures and vice versa. Up until now this was solved by having swipe detection logic in the pan gesture action handling in the places where necessary, but that requires code duplication everywhere swipe detection is needed and obviously doesn't allow swipe gestures to be user-mappable. Does anyone have an idea on how to tackle this?

@Memphiz
Member
Memphiz commented Jan 31, 2013

On rebase I realised that the focus stuff before the touch action is fired is missing (mouse move event ...) ;)

@Memphiz
Member
Memphiz commented Jan 31, 2013

nice

<swipe direction="left" pointers="2">Back</swipe>

works now on iOS :D Good work. Only the swipe is left - after that, iOS gestures are compatible with how they were before, now using the new unified handler.

Two things I realised, too:

  1. The zoom gesture now behaves differently. When the image is zoomed and I put the fingers there again, the zoom is reset to the factor my fingers represent at that moment. Wasn't it that the zoom continued where we left off before? Same for rotation.
  2. The snap to 90° in the fullscreen picture is broken somehow (it doesn't snap back for me).
@Montellese
Member

I added the work from @Memphiz to integrate iOS into this. There's still the pan/swipe problem that needs resolving though.

@Memphiz: both zooming and rotation snap back still work the same on Android for me.

@Montellese
Member

I rebased the commits and made some additional changes:

  • I turned the enum values of EVENT_RESULT into flags so that a GESTURE_NOTIFY query can return more than one possible gesture (e.g. zooming and rotating) and also adjusted CWinEventsWin32 accordingly (see the sketch after this list).
  • I made use of the new flag nature of EVENT_RESULT in CGUIWindowSlideshow.
  • I fixed a bug in CGUIWindowSlideshow which made it impossible to vertically move a picture, if it was zoomed but not enough to be horizontally moveable.
  • I added a new method QuerySupportedGestures() to CGenericTouchActionHandler as a wrapper for the GESTURE_NOTIFY queries to the GUI control.
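As a sketch of the first point: with flag-style values, each gesture occupies its own bit, so a single GESTURE_NOTIFY answer can advertise several gestures at once. EVENT_RESULT_SWIPE appears in the iOS diff further down; the other names and bit values here are assumptions:

```cpp
enum EVENT_RESULT
{
  EVENT_RESULT_UNHANDLED      = 0x00,
  EVENT_RESULT_HANDLED        = 0x01,
  EVENT_RESULT_PAN_VERTICAL   = 0x02,
  EVENT_RESULT_PAN_HORIZONTAL = 0x04,
  EVENT_RESULT_ROTATE         = 0x08,
  EVENT_RESULT_ZOOM           = 0x10,
  EVENT_RESULT_SWIPE          = 0x20
};

// a zoomed slideshow picture that can still change slides horizontally
// could answer a GESTURE_NOTIFY query with:
//   return (EVENT_RESULT)(EVENT_RESULT_PAN_VERTICAL | EVENT_RESULT_SWIPE);
```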

@Memphiz: I had to do a forced push. The code should now be ready for you to try to figure out the pan/swipe problem on iOS. There are cases where a control supports both panning and swiping (e.g. in the slideshow, if a picture has been zoomed and can be moved (through panning) vertically but cannot be moved horizontally, swipe left/right still works to get to the previous/next picture).

@anyoneelse: Does anyone have a touch device with Windows 7 who'd either be willing to do the necessary changes to the win32 touch handling code OR be willing to be my guinea pig for blind coding changes?

What about Windows 8? I checked and it comes with a new (and probably improved) touch input API (the old implementation should still work though). Does anyone have a (non-ARM) touch device running windows 8 and XBMC?

I checked the code and didn't find any touch input support for Ubuntu/Linux. Are there no such devices or is there no touch support? Or is the touch just turned into a mouse event by the OS?

@Montellese
Member

@Memphiz: I've pulled in your swipe support work and have done some adjustments to the ITouchActionHandler interface. Did you have time to look into calling OnTouchGestureStart() and OnTouchGestureEnd() for zoom and rotate gestures? I haven't done a rebase yet because you mentioned possible merge conflicts with some stuff done by @ulion. How should we handle that?

@Memphiz
Member
Memphiz commented Mar 4, 2013

Sorry - still on Ski vacation ;)

@theuni
Member
theuni commented Mar 5, 2013

@Montellese Just had a look.. that's.. impressive to say the least.

For linux you'll need a newish X with multitouch hooked up. I have a laptop that I could use for testing.

@elupus: Somewhat related, I picked the x11 winevents out of your SDL-nuke tree for something I was working on a few weeks ago, and it looks like it'd be handy here again. Maybe it'd be worth breaking out the input part into a PR of its own? It'd be a shame to hack on multi-touch here when it's just going to be ripped back out.

@Memphiz
Member
Memphiz commented Mar 5, 2013

@Montellese - just rebase and leave the conflicts in there. Then ping me and I'll fix it up.

@Montellese
Member

@Memphiz: done, just let me know which commits to pull in and then I can squash and rebase them with your other commits. Thanks.

@Montellese
Member

@Memphiz: Thanks for the update. I've merged the xcode changes from your fixup commit with the other commit messing around in the xcode file. Furthermore I've added the gesture start/end changes into 79c1030. Should I merge d091776 (swipe support) into 79c1030 as well or do you want to keep it separate?

Does the rotation snap back now work? Anything else that needs work or is all good now on iOS?

@Memphiz
Member
Memphiz commented Mar 5, 2013

Well done. Everything is fine now on iOS (rotation snap back is working too now, of course :D). Thx for the commit cleanup. No need to squash it down further imo.

@Montellese
Member

Great, so I guess this is ready to go in during this merge window. I'll give it another day or two in case someone else has something to say about it, and if not I'll punch it in. The win32 adaptation can be done as soon as I can test it, and @theuni said that touch support on Linux (which we currently don't have anyway) requires some more work as well, which can be done in the future.

@Montellese Montellese was assigned Mar 5, 2013
@Memphiz
Member
Memphiz commented Mar 5, 2013

Sounds like a good plan to me :)

@Montellese Montellese merged commit e46bb02 into xbmc:master Mar 7, 2013
@Montellese Montellese deleted the Montellese:touch_input branch Mar 7, 2013
@ulion
Contributor
ulion commented Mar 9, 2013

GUI_MSG_UNFOCUS_ALL may need to be added to OnTap or other similar handlers; the code in CWinEventsIOS always sends this for the XBMC_MOUSEBUTTONUP event on iOS. The current code will leave the button in a focused state without unfocusing it correctly on iOS.

@ulion ulion commented on the diff Mar 17, 2013
xbmc/osx/ios/XBMCController.mm
+ CGPoint point = [sender locationOfTouch:0 inView:m_glView];
+ point.x *= screenScale;
+ point.y *= screenScale;
+ swipeAllowed = false;
+
+ if ([sender numberOfTouches] == 2)
+ {
+   swipeAllowed = true;
+ }
+ else
+ {
+   int gestures = CGenericTouchActionHandler::Get().QuerySupportedGestures(point.x, point.y);
+   if (gestures & EVENT_RESULT_SWIPE)
+   {
+     swipeAllowed = true;
+   }
@ulion
ulion Mar 17, 2013 Contributor

Why do we need this? How about sending swipe actions into the touch handler directly?
This causes problems, since QuerySupportedGestures does not support being called in a thread other than the XBMC main thread.
Here's my UI deadlock traceback caused by this: http://www.xbmclogs.com/show.php?id=4973

@ulion ulion commented on the diff Mar 17, 2013
xbmc/input/touch/generic/GenericTouchActionHandler.cpp
+  return true;
+}
+
+bool CGenericTouchActionHandler::OnTouchGesturePan(float x, float y, float offsetX, float offsetY, float velocityX, float velocityY)
+{
+  sendEvent(ACTION_GESTURE_PAN, x, y, offsetX, offsetY);
+
+  return true;
+}
+
+bool CGenericTouchActionHandler::OnTouchGestureEnd(float x, float y, float offsetX, float offsetY, float velocityX, float velocityY)
+{
+  sendEvent(ACTION_GESTURE_END, velocityX, velocityY, x, y);
+
+  // unfocus the focused GUI item
+  g_windowManager.SendMessage(GUI_MSG_UNFOCUS_ALL, 0, 0, 0, 0);
@ulion
ulion Mar 17, 2013 Contributor

Use SendThreadMessage; OnTouchGestureEnd could be called by a thread other than the XBMC main one.
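A minimal sketch of the suggested change, assuming the CGUIMessage-based SendThreadMessage overload: the message is queued and processed on the main thread instead of being dispatched synchronously from the calling thread.

```cpp
// queue the unfocus for the GUI thread instead of dispatching it directly
CGUIMessage msg(GUI_MSG_UNFOCUS_ALL, 0, 0);
g_windowManager.SendThreadMessage(msg);
```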

@ulion
ulion Mar 19, 2013 Contributor

Even using SendThreadMessage, the unfocus event is processed before the ACTION_GESTURE_END; they are in different event queues. Then the only way to do it correctly is to do it in CWinEvents like before.
But I found that the unfocus is not always the expected UI reaction. So the question is: why do we need the unfocus? Is it always needed?

@koying
Member
koying commented on 967c20b Mar 22, 2013

Is there specific iOS code for swipe?
AFAICT, the generic swipe detector only allows 1 pointer...

Member

Sure - we use the native iOS gesture recognizers:

https://github.com/xbmc/xbmc/blob/master/xbmc/osx/ios/XBMCController.mm#L624

Member

Ah, okay. Thanks.
