touchscreen support #7789

GregDMeyer opened this Issue Jan 10, 2017 · 5 comments



I'm working on adding touch-to-drag and pinch-to-zoom support for matplotlib. It is generally working, but there are a few things to finish before I submit a pull request. I wanted to get the conversation started on a few points, though, and to ask for general input from the community.

Specific points:

  1. Is this a good idea? (I can't think of a reason it wouldn't be, but please advise if you see one.)
  2. Should it be a toggleable feature, for example via rcParams? (I can't really think of a reason it should be, but maybe there is a use case in which people want touches to behave as mouse input.)
  3. Pinch-to-zoom in my implementation keeps the same data points on the plot under your fingers. So the x and y axes rescale independently, and the aspect ratio changes as you move the two touches. Does that sound reasonable? (It feels intuitive to me.)
  4. Key input should hold x, y, or aspect ratio (ctrl), as it does in the current interactive mode. But maybe there should be other situations in which we hold x or y: for example, if the two touches are at approximately the same y-value, small changes in their y-locations will lead to huge scale changes for the y-axis. So maybe if the touches are sufficiently close along an axis, that axis should be locked?
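To make point 3 concrete, here is a minimal sketch (not the actual implementation) of the per-axis rescaling: given two anchored data values that must stay under the moving fingers, fit the linear screen-to-data map through the two anchors and evaluate it at the canvas edges. The function name and signature are hypothetical. It also illustrates point 4: when the two screen positions nearly coincide on an axis, the fit degenerates, which is exactly when you would want to lock that axis.

```python
def rescale_axis(d1, d2, s1, s2, s_min, s_max, min_sep=10.0):
    """Return new data limits (lo, hi) for one axis so that data values
    d1 and d2 land at screen positions s1 and s2 (pixels), where the
    axis spans screen pixels [s_min, s_max].  Returns None when the
    touches are closer than min_sep pixels on this axis (lock the axis).
    """
    if abs(s2 - s1) < min_sep:
        return None  # degenerate: tiny finger motion would explode the scale
    # fit the linear map data = a * screen + b through the two anchors
    a = (d2 - d1) / (s2 - s1)
    b = d1 - a * s1
    return (a * s_min + b, a * s_max + b)
```

For example, anchoring data values 0 and 10 at pixels 100 and 200 on a 400-pixel-wide axes gives new limits (-10, 30): each finger still sits on the data value it started on.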

Open to any other ideas!


How does touch-screen handling relate to multi-touch trackpad handling? I suspect that will be a much larger user base than actual touch-screen users.

Can you do this in a backend-independent way? There is a kivy backend (maintained externally).

Putting a knob to turn it off seems reasonable (I suspect it will be a conditional on hooking up some event handlers?).

Have not thought about this enough to have reasonable opinions about 3 and 4.

@tacaswell tacaswell added the wishlist label Jan 11, 2017

I think touchscreen and trackpad input are handled differently. With my current setup (Qt5, Linux), multitouch events on my trackpad simply get translated to scroll events, unlike touchscreen events, which arrive from Qt as QTouchEvent instances. I agree, though, that it would be nice to have multitouch trackpad support, since more users are likely to have that. Because a multitouch touchscreen event and a multitouch trackpad event are physically different (a touchscreen event corresponds to actual screen locations), I have a feeling they may have to be handled separately--but both do sound useful.

It is pretty backend-independent, though I don't know how much the kivy backend inherits from matplotlib. Currently in my code the actual handling of the touches, zooming, panning, etc. happens in FigureCanvasBase and NavigationToolbar2. So individual backends just have to build a container of the touch events and their locations, and the base code will handle them generically (similarly to how other interaction events, like mouse events, are handled).
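As a rough sketch of that split (the class and function names below are hypothetical, not the actual patch): each backend would only populate a small container per touch point, and backend-independent code would decide what gesture the set of touches represents.

```python
from dataclasses import dataclass

@dataclass
class TouchPoint:
    """Hypothetical per-touch container a backend would fill in."""
    id: int    # persistent identifier for this finger across events
    x: float   # canvas x position in pixels
    y: float   # canvas y position in pixels

def classify_gesture(touches):
    """Backend-independent dispatch: one touch pans, two touches zoom.
    Anything else is ignored here for simplicity."""
    if len(touches) == 1:
        return "pan"
    if len(touches) == 2:
        return "pinch-zoom"
    return None
```

The point of the split is that a Qt backend would build `TouchPoint`s from `QTouchEvent.touchPoints()`, while kivy would build them from its own touch objects, and the pan/zoom logic lives in one place.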

At least for Qt it is trivial to turn off--there is an "accept touch events" flag that can be set to determine whether touches are delivered as touch events or as synthesized mouse events. So setting up such a knob should not be hard.
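A sketch of what such a knob might look like, assuming a hypothetical rcParam name (`"interactive.touch"` is made up for illustration); only the pure toggle logic is shown, with the Qt call left as a comment since it requires a running Qt widget:

```python
def want_touch_events(rcparams):
    """Decide from an rcParams-like mapping whether the canvas should
    accept raw touch events (default: yes).  The key name is hypothetical."""
    return bool(rcparams.get("interactive.touch", True))

# In a Qt backend, one would then do something like:
#   if want_touch_events(matplotlib.rcParams):
#       canvas.setAttribute(QtCore.Qt.WA_AcceptTouchEvents, True)
# Leaving the attribute unset means Qt synthesizes mouse events instead.
```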


👍 that all sounds perfectly reasonable.

tacaswell commented Jan 11, 2017 edited

attn @andnovar Given your work on kivy, you may have some insight on this.


I just took a look at the kivy backend source--it looks like its classes do inherit from NavigationToolbar2 and FigureCanvasBase, so it should be easy to set up that backend to use the pinch-to-zoom and touch-to-drag functionality from here. I think it will actually be especially cool for kivy, since that framework embraces multitouch environments!
