PalmTouch Model and Dataset

PalmTouch: Using the Palm as an Additional Input Modality on Commodity Smartphones

This repository contains the dataset, Python notebooks for training and preprocessing, the model reported in the paper, and an Android demo application to run the model.
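The notebooks cover preprocessing of the raw capacitive images before they are fed to the model. As a rough illustration only (the real pipeline lives in the notebooks; the image resolution and all names below are assumptions, not taken from this repository), preprocessing such an image might look like:

```python
import numpy as np

# Assumed capacitive image resolution; check the notebooks for the real value.
CAP_IMAGE_SHAPE = (27, 15)

def preprocess(raw):
    """Clip negative sensor noise and scale raw capacitance values to [0, 1].

    Hypothetical sketch, not the repository's actual preprocessing code.
    """
    img = np.asarray(raw, dtype=np.float32).reshape(CAP_IMAGE_SHAPE)
    img = np.clip(img, 0.0, None)       # capacitance below baseline -> 0
    peak = img.max()
    return img / peak if peak > 0 else img

# Example with synthetic data standing in for a raw capacitive frame.
raw = np.random.default_rng(0).integers(0, 255, size=27 * 15)
x = preprocess(raw)
print(x.shape)  # (27, 15)
```

The scaled image can then be passed to the classifier, which outputs whether the touch blob stems from a finger or the palm.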

Paper Abstract

Touchscreens are the most successful input method for smartphones. Despite their flexibility, touch input is limited to the location of taps and gestures. We present PalmTouch, an additional input modality that differentiates between touches of fingers and the palm. Touching the display with the palm can be a natural gesture since moving the thumb towards the device’s top edge implicitly places the palm on the touchscreen. We present different use cases for PalmTouch, including the use as a shortcut and for improving reachability. To evaluate these use cases, we have developed a model that differentiates between finger and palm touch with an accuracy of 99.53% in realistic scenarios. Results of the evaluation show that participants perceive the input modality as intuitive and natural to perform. Moreover, they appreciate PalmTouch as an easy and fast solution to address the reachability issue during one-handed smartphone interaction compared to thumb stretching or grip changes.

This work can be cited as follows:

@inproceedings{le2018palmtouch,
  title = {PalmTouch: Using the Palm as an Additional Input Modality on Commodity Smartphones},
  author = {Le, Huy Viet and Kosch, Thomas and Bader, Patrick and Mayer, Sven and Henze, Niels},
  doi = {10.1145/3173574.3173934},
  year = {2018},
  booktitle = {Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems},
  publisher = {ACM},
  numpages = {10},
  address = {New York, NY, USA},
  series = {CHI '18},
  keywords = {Palm, capacitive image, machine learning, smartphone}
}