
Real-time pose matching demo

A real-time demo that runs on a webcam feed and guides the subject to match pre-defined target poses.

Match the target pose!

Move your body to match each of the target poses on the right!

Setup

conda env create -f environment.yml -n movenet-dance-demo
conda activate movenet-dance-demo
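
To check that the environment resolved correctly (assuming the demo's main dependencies are TensorFlow and OpenCV, as is typical for MoveNet inference):

python -c "import tensorflow as tf, cv2; print(tf.__version__, cv2.__version__)"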

To change the target poses, see the notebook: make_target_poses.ipynb

For an example of standalone inference, see the notebook: inference_demo.ipynb
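The sketch below shows the general shape of standalone MoveNet inference, loading the single-pose Lightning model from TensorFlow Hub; the notebook's actual loading code and model variant may differ.

```python
# Minimal MoveNet inference sketch (assumes tensorflow and tensorflow_hub are
# installed; see inference_demo.ipynb for the repo's own version).
import numpy as np
import tensorflow as tf
import tensorflow_hub as hub

# Load the single-pose Lightning variant of MoveNet from TF Hub.
model = hub.load("https://tfhub.dev/google/movenet/singlepose/lightning/4")
movenet = model.signatures["serving_default"]

# MoveNet Lightning expects a 192x192 int32 RGB image with a batch dimension.
image = np.zeros((1, 192, 192, 3), dtype=np.int32)  # stand-in for a real frame
outputs = movenet(tf.constant(image))

# Output: a (1, 1, 17, 3) array of (y, x, confidence) per COCO keypoint.
keypoints = outputs["output_0"].numpy()
print(keypoints.shape)
```

Keypoint coordinates are normalized to the input frame, so they can be rescaled to any display resolution.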

Usage

To run the demo:

python live_demo.py

To use a different camera:

python live_demo.py -c 1
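
The -c flag selects an OpenCV camera index. If you are unsure which index your camera has, a quick probe (a hypothetical helper, not part of the repo) can list the usable ones:

```python
# Probe the first few OpenCV camera indices and report which ones deliver frames.
import cv2

for i in range(5):
    cap = cv2.VideoCapture(i)
    ok, _ = cap.read()  # a camera is usable if it returns a frame
    cap.release()
    if ok:
        print(f"Camera index {i} is available")
```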

To make the poses easier to match, increase the target tolerance:

python live_demo.py --tolerance 0.7
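
A higher tolerance accepts looser matches. The sketch below is one way tolerance-based matching could work (an assumption about the demo's scoring, not its actual code); the hypothetical pose_matches helper treats poses as (17, 2) arrays of normalized keypoint coordinates:

```python
import numpy as np

def pose_matches(pose, target, tolerance=0.7):
    """Return True if the mean keypoint distance is within tolerance."""
    # Mean Euclidean distance between corresponding keypoints; a larger
    # tolerance makes the target pose easier to satisfy.
    dist = np.linalg.norm(pose - target, axis=-1).mean()
    return dist < tolerance
```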

Hotkeys:

1 - 9: Switch between target poses.

Tab: Cycle through target poses.

R: Reset completed poses.

Q or Esc: Quit the demo.

Credits

This demo was written by Talmo Pereira at the Salk Institute for Biological Studies to be presented at the SICB 2022 Annual Meeting.

This code uses MoveNet, a pose estimation model developed by Ronny Votel, Na Li, and other contributors at Google.

The target pose data is courtesy of the danceTactics crew, directed by Keith A. Thompson at Arizona State University.
