GazeVehicle

Use eye gaze to control the movements of a vehicle in ROS.

What's in this repo?

  • A robot with a camera sensor
  • Face detection and analysis
  • Eye gaze estimation
  • Mouth status estimation
  • Rendering of control signals sent to the robot
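As an illustration of how the pieces above could fit together, the sketch below maps an estimated gaze direction plus a mouth-open gate to a (linear, angular) velocity pair. The function, its thresholds, and the gating logic are assumptions for illustration, not the repo's actual controller:

```python
def gaze_to_cmd(yaw_deg, pitch_deg, mouth_open,
                max_linear=0.5, max_angular=1.0, deadzone_deg=5.0):
    """Map a gaze direction to a (linear, angular) velocity pair.

    yaw_deg:    horizontal gaze angle, positive = looking left
    pitch_deg:  vertical gaze angle, positive = looking up
    mouth_open: gate flag; the robot only moves while the mouth is open
    All thresholds here are illustrative, not taken from the repo.
    """
    if not mouth_open:
        return 0.0, 0.0          # mouth closed -> stop
    # Ignore small gaze angles so the robot holds still when the
    # user looks roughly straight ahead.
    linear = max_linear if pitch_deg > deadzone_deg else 0.0
    angular = 0.0
    if abs(yaw_deg) > deadzone_deg:
        # Scale turn rate with gaze deviation, saturating at 30 degrees.
        angular = max_angular * max(-1.0, min(1.0, yaw_deg / 30.0))
    return linear, angular
```

In a ROS node, such a pair would typically be published as a geometry_msgs/Twist message on a velocity topic such as /cmd_vel.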

Preparation

  • Install ROS and Gazebo
  • Install tensorflow-gpu==1.14.0 (CUDA 10.0, cuDNN 7.4)
  • Install the Python libraries dlib and scipy
  • Download the eye gaze models and extract them to $ROOT_REPO

Compile

cd $ROOT_REPO
catkin_make

Launch

source devel/setup.bash
roslaunch launch/one_of_the_files.launch

Demo

  1. Show the robot in the Gazebo simulator
  2. Show the image obtained by the camera on the robot
  3. Analyze mouth status and estimate eye gaze to control the robot

    • Use keys as commands
    • Use gaze as the direction and the space key as the move command
    • Use gaze dwell to push the command buttons
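Gaze-dwell selection can be sketched as a timer that fires once the gaze point has stayed inside a button's region long enough. The region, dwell time, and class below are illustrative assumptions, not code from this repo:

```python
import time

class DwellButton:
    """Fire once when gaze dwells inside a rectangular screen region."""

    def __init__(self, rect, dwell_s=1.0, clock=time.monotonic):
        self.x0, self.y0, self.x1, self.y1 = rect
        self.dwell_s = dwell_s
        self.clock = clock       # injectable clock, useful for testing
        self._entered = None     # time the gaze entered the region

    def update(self, gx, gy):
        """Feed one gaze sample; return True once per completed dwell."""
        inside = self.x0 <= gx <= self.x1 and self.y0 <= gy <= self.y1
        if not inside:
            self._entered = None  # gaze left the button: reset the timer
            return False
        if self._entered is None:
            self._entered = self.clock()
            return False
        if self.clock() - self._entered >= self.dwell_s:
            self._entered = None  # reset so the button fires only once
            return True
        return False
```

Each on-screen command button would own one such object and receive every gaze sample; a True return triggers the corresponding robot command.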

Cite

This repo's models are from the papers below:

@inproceedings{poy2021multimodal,
  title={A multimodal direct gaze interface for wheelchairs and teleoperated robots},
  author={Poy, Isamu and Wu, Liang and Shi, Bertram E},
  booktitle={2021 43rd Annual International Conference of the IEEE Engineering in Medicine \& Biology Society (EMBC)},
  pages={4796--4800},
  year={2021},
  organization={IEEE}
}

@inproceedings{chen2018appearance,
  title={Appearance-based gaze estimation using dilated-convolutions},
  author={Chen, Zhaokang and Shi, Bertram E},
  booktitle={Asian Conference on Computer Vision},
  pages={309--324},
  year={2018},
  organization={Springer}
}
