Go1-Gesture-Command

This repository contains two ROS2 packages for using hand gestures to send motion commands to the Unitree Go1:

  • ros2_hgr
    • A ROS2 Python package that uses MediaPipe and a TensorFlow classifier to recognize hand gestures and publish the result.
  • go1_cmd
    • A ROS2 C++ package that receives the hand gesture data and uses services to send the corresponding motion commands to the Go1.
Demo video: winter_project_demo_compressed.mp4

How It Works

Google's MediaPipe is an open-source framework that provides machine learning solutions, including hand detection and face detection. Here, I use MediaPipe to detect my hands and a TensorFlow model from kinivi's repository to classify the gesture I am making.

Figure: MediaPipe Hand Landmarks (hand_landmarks)

Figure: Project Flowchart (flowchart)
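To make the pipeline concrete, here is a minimal sketch of the landmark preprocessing step commonly used with kinivi's classifier: the 21 MediaPipe hand landmarks are made relative to the wrist, flattened, and scaled before being fed to the TensorFlow model. This is an illustrative assumption about the preprocessing, not the exact code in ros2_hgr.

```python
import numpy as np

def preprocess_landmarks(landmarks):
    """Turn 21 (x, y) hand landmarks into a flat, normalized feature
    vector suitable for a gesture classifier.

    `landmarks` is a list of 21 (x, y) pairs in image coordinates,
    as produced by MediaPipe's hand landmark detector.
    """
    pts = np.array(landmarks, dtype=np.float32)
    # Make every point relative to the wrist (landmark 0) so the
    # features are invariant to where the hand appears in the frame.
    pts -= pts[0]
    # Flatten to a 42-element vector: (x0, y0, x1, y1, ..., x20, y20).
    flat = pts.flatten()
    # Scale by the largest absolute coordinate so values lie in [-1, 1],
    # making the features invariant to hand size / camera distance.
    max_abs = np.max(np.abs(flat))
    if max_abs > 0:
        flat /= max_abs
    return flat
```

The wrist-relative, scale-normalized encoding is what lets a small fully connected model generalize across hand positions and distances from the camera.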

Dependencies

You can import the necessary repositories listed in go1_hgr.repos into your workspace using vcs. To do so, clone this repository into the src directory of your workspace. Then in the root of your workspace, run the following:
vcs import < src/go1-gesture-command/go1_hgr.repos

Other dependencies include:

Launch

ros2 launch ros2_hgr hgr.launch.xml

  • By default, the ros2_hgr launch file runs the hgr_node with your computer's built-in webcam.
  • The use_realsense launch argument defaults to false; setting it to true launches a separate hgr_node tailored to the RealSense camera.

To launch using an external RealSense camera instead of a built-in webcam, use
ros2 launch ros2_hgr hgr.launch.xml use_realsense:=true

You can also launch using the cameras onboard the Go1 with
ros2 launch ros2_hgr hgr.launch.xml dogcam:=true

Gestures Guide

  1. Open - stop
  2. Close - look forward (normal 0° yaw)
  3. Pointer - recover stand up
  4. OK - look up
  5. Peace - look down
  6. Thumbs Up - walk forward
  7. Thumbs Down - walk backward
  8. Quiet Coyote - lie down
Demo video: hand.gesture.recognition.demo.mp4
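The gesture table above amounts to a lookup from classifier label to motion command. The sketch below shows that mapping; the label strings and command names are hypothetical placeholders, since the actual go1_cmd package dispatches commands through ROS2 services rather than plain strings.

```python
# Hypothetical gesture-to-command lookup mirroring the Gestures Guide.
# The real go1_cmd package sends these as ROS2 service calls.
GESTURE_TO_COMMAND = {
    "open": "stop",
    "close": "look_forward",     # return to normal 0-degree yaw
    "pointer": "recover_stand",
    "ok": "look_up",
    "peace": "look_down",
    "thumbs_up": "walk_forward",
    "thumbs_down": "walk_backward",
    "quiet_coyote": "lie_down",
}

def command_for(gesture: str) -> str:
    """Look up the motion command for a recognized gesture.

    Unrecognized or low-confidence gestures fall back to 'stop',
    which is the safe default for a walking robot.
    """
    return GESTURE_TO_COMMAND.get(gesture, "stop")
```

Defaulting to "stop" for anything unrecognized is a deliberate fail-safe: a misclassified frame should halt the robot rather than trigger an arbitrary motion.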

Notes

This project would not have been possible without the help of Katie Hughes, Nick Morales, Marno Nel, and Professor Matt Elwin, with whom I worked on the Go1 disassembly and upgrades that this project and the other students' projects required.

Check out my portfolio post on this project here.
