
Gesture Mouse

A library that allows controlling the mouse and keyboard with head movements and face gestures. This project is based on Google's MediaPipe library (https://google.github.io/mediapipe/).

Installation instructions

Tested with Python 3.10 and PySide6 6.4.3.

  1. Clone the repository
  2. Create a subfolder: mkdir venv
  3. Create a virtual environment: python -m venv venv
  4. Activate the virtual environment: venv\Scripts\activate.bat (Linux: . venv/bin/activate)
  5. Install the required packages: pip install -r requirements.txt

Running Gesture Mouse

Starting the application

  • python gui.py to start the GUI (Linux: sudo ./venv/bin/python3.10 gui.py)
  • Alt+1 to toggle whether the mouse is controlled by Gesture Mouse or by the system.
  • Esc to quit the program (useful if you lose control of the mouse).

Alternatively, you can create an executable distribution and run dist\gui.exe (Windows) or dist/gui (Linux).

Creating an exe distribution

To create a distribution folder which includes an executable and all necessary .dll files, you can use PyInstaller (https://pyinstaller.org). Instructions:

  1. Follow the installation instructions above
  2. Activate the virtual environment: venv\Scripts\activate on Windows, source venv/bin/activate on Linux
  3. Install PyInstaller: pip install pyinstaller
  4. Execute the build process with
    pyinstaller gui.py -D --add-data config;config --add-data data;data --collect-all mediapipe on Windows
    pyinstaller gui.py -D --add-data config:config --add-data data:data --collect-all mediapipe on Linux

Algorithms

The gesture calculation (e.g. eye blink) uses the MediaPipe facial landmark detection in combination with a modified eye aspect ratio (EAR) algorithm. The EAR algorithm helps make the gesture invariant to head movements and rotations.

Calculation of the eye aspect ratio (EAR)
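As an illustration, the following is a minimal sketch of the classic eye aspect ratio formula (Soukupová & Čech): the sum of the two vertical lid distances divided by twice the horizontal eye width. The six-landmark ordering and the sample coordinates are illustrative assumptions, not the exact landmark indices this project uses from MediaPipe.

```python
import math

def ear(landmarks):
    """Eye aspect ratio from six (x, y) eye landmarks.

    Landmark order follows the common EAR convention:
    p1, p4 are the horizontal eye corners; p2, p3 are the
    upper lid points and p6, p5 the lower lid points.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    p1, p2, p3, p4, p5, p6 = landmarks
    # EAR = (|p2 - p6| + |p3 - p5|) / (2 * |p1 - p4|)
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

# Illustrative coordinates: an open eye yields a larger ratio
# than a nearly closed one, so a blink shows up as a dip in EAR.
open_eye = [(0, 0), (1, 1), (2, 1), (3, 0), (2, -1), (1, -1)]
closed_eye = [(0, 0), (1, 0.1), (2, 0.1), (3, 0), (2, -0.1), (1, -0.1)]
```

Because both the numerator and denominator are distances between face landmarks, scaling or rotating the whole face leaves the ratio unchanged, which is what makes the blink gesture robust to head movement.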

Links and Credits

The work for GestureMouse has been accomplished at the UAS Technikum Wien in the course of the R&D projects WBT (MA23 project 26-02) and Inclusion International (MA23 project 33-02), which have been supported by the City of Vienna.

Have a look at the AsTeRICS Foundation homepage and our other Open Source AT projects:

  • AsTeRICS: AsTeRICS framework homepage, AsTeRICS framework GitHub: The AsTeRICS framework provides a much higher flexibility for building assistive solutions. The FLipMouse is also AsTeRICS compatible, so it is possible to use the raw input data for a different assistive solution.

  • FABI: FABI: Flexible Assistive Button Interface GitHub: The Flexible Assistive Button Interface (FABI) provides basically the same control methods (mouse, clicking, keyboard, ...), but the input is limited to simple buttons. Therefore, this interface is available at a very low price (if you buy the Arduino Pro Micro from China, it can be under $5).

  • FLipMouse: The FLipMouse controller: a highly sensitive finger-/lip-controller for computers and mobile devices with minimal muscle movement.

  • FLipPad: The FLipPad controller: a flexible touchpad for controlling computers and mobile devices with minimal muscle movement.

  • AsTeRICS Grid: AsTeRICS Grid AAC Web-App: an open source, cross-platform communicator / talker for Augmentative and Alternative Communication (AAC).

Support us

Please support the development by donating to the AsTeRICS Foundation:

