
[Digital Human Proj] Implementation of a motion matching pipeline with AABB search, spring damping, and inertialization. Built on the LAFAN1 dataset, it supports multiple input modalities to give users an immersive game experience.
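The spring damper named above smooths transitions by pulling a value toward a goal without overshoot. Below is a minimal standalone sketch of a critically damped spring with a halflife parameterization; the function name and parameters are our illustration, not necessarily the exact code in this repository:

```python
import math

def damper_exact(x, v, x_goal, halflife, dt):
    """Critically damped spring step: x converges to x_goal without overshoot.

    halflife is roughly the time for the remaining distance to halve;
    smaller values make the character respond faster.
    """
    y = (2.0 * math.log(2.0)) / halflife   # damping derived from halflife
    j0 = x - x_goal
    j1 = v + j0 * y
    eydt = math.exp(-y * dt)
    x_new = eydt * (j0 + j1 * dt) + x_goal  # closed-form critically damped solution
    v_new = eydt * (v - j1 * y * dt)
    return x_new, v_new
```

Stepping this at 60 FPS drives any animated quantity (position, facing angle) smoothly toward its target, which is what makes blended transitions look natural.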


Motion Matching for Responsive Animation for Digital Humans

Longteng Duan* · Guo Han* · Boxiang Rong* · Hang Yin*

(* Equal Contribution)



Table of Contents
  1. Compilation
  2. Dataset
  3. Functionalities
  4. Example Demos
  5. Acknowledgement

Compilation

The program is easy to compile with CMake (for example, from Visual Studio Code):

```shell
git clone https://github.com/Ribosome-rbx/Motion-Matching-for-Human-Skeleton.git
cd Motion-Matching-for-Human-Skeleton
mkdir build
cd build
cmake ..
make
```

Note that building this project fetches the repository https://github.com/guo-han/motion-matching, where some of our development was done. Make sure to fetch the most recent version of its master branch.

Dataset

The dataset we used to construct the matching database is LAFAN1. You do not need to download or process it yourself: all necessary data has been included in this repository under the ./data folder.

Functionalities

Users can control the digital character with the keyboard, a drawn trajectory, or human poses captured from a camera. Details are elaborated below.

To start motion matching, press the space key once the application is ready.

Keyboard Control

| Keys | Action | Details |
| --- | --- | --- |
| W, S, A, D, W+A, W+D, S+A, S+D | move in eight directions | while the keys are pressed, the character moves along the given direction |
| up arrow key ↑, down arrow key ↓ | increase / decrease forward speed | the speed changes by 0.3 every time the key is pressed |
| left Shift | sprint | once pressed, the character starts dashing; once released, it returns to normal speed |
| J | jump | once pressed, the character jumps; once released, it stops jumping |
| C | creep | once pressed, the character creeps; once released, it stops creeping |
| P | dance | press once to start dancing; press again to stop |
| Space | pause/start the program | |
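The key handling above can be sketched as mapping held keys to a target direction and speed. This is an illustrative sketch, not the repository's code; the 0.3 speed step comes from the table, while the 2.0 sprint multiplier is an assumption:

```python
import math

# Unit direction for each movement key; diagonals come from summing two keys.
DIRS = {"W": (0.0, 1.0), "S": (0.0, -1.0), "A": (-1.0, 0.0), "D": (1.0, 0.0)}

def target_direction(pressed):
    """Combine the held movement keys into a unit direction, or None if idle."""
    dx = sum(DIRS[k][0] for k in pressed if k in DIRS)
    dz = sum(DIRS[k][1] for k in pressed if k in DIRS)
    n = math.hypot(dx, dz)
    return None if n == 0.0 else (dx / n, dz / n)

def target_speed(base, arrow_up_presses, arrow_down_presses, sprinting):
    """Each arrow press adjusts speed by 0.3; Shift applies a sprint factor."""
    speed = base + 0.3 * (arrow_up_presses - arrow_down_presses)
    return speed * (2.0 if sprinting else 1.0)  # sprint factor is an assumption
```

For example, holding W+D yields the diagonal direction (0.707, 0.707), and pressing the up arrow twice raises a base speed of 1.5 to 2.1.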

Drawn Trajectory

When the Paint trajectory box in the application UI is checked, you can use the mouse to draw a trajectory on the ground. After painting a trajectory, uncheck the Paint trajectory box and press any of the 'W, A, S, D' keys to give the character an initial speed. Keyboard direction control is now disabled, but you can still press Shift to run. The character follows a straight line to the nearest dot on the trajectory and then follows the whole trajectory; on reaching the end, it goes straight back to the start point and does another round. To leave trajectory control mode, click the Clear Painting button; keyboard control then works again.
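The following behavior, finding the nearest painted dot and then looping over the trajectory, can be sketched like this (function names and the reach radius are illustrative assumptions, not the repository's API):

```python
import math

def nearest_index(pos, traj):
    """Index of the painted dot closest to the character's position."""
    return min(range(len(traj)), key=lambda i: math.dist(pos, traj[i]))

def advance_waypoint(pos, traj, current, reach_radius=0.1):
    """Move on to the next dot once the current one is reached.

    Wrapping with modulo sends the character back to the start,
    so it does another round, as described above.
    """
    if math.dist(pos, traj[current]) < reach_radius:
        current = (current + 1) % len(traj)
    return current
```

Each frame, the character steers toward `traj[current]`; `nearest_index` is used once at the start to pick the entry point onto the drawn path.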

[Demo for drawn trajectory (click to expand)]

Human Pose Control

Compile and run this application with the companion script running. Only 13 predefined poses are available to animate the character, and the input is captured from the laptop camera. More information can be found in this repository.

We use FastPose to capture human poses. For the 13 motions, namely *crawl forward*, *crawl left*, *crawl right*, *dance*, *jump*, *punch*, *run forward*, *run left*, *run right*, *stand still*, *walk forward*, *walk left*, and *walk right*, we pre-recorded 13 videos as references for the corresponding poses. A kNN classifier is then used for pose classification, and a Python script translates real-time human poses into keyboard input for program control: at runtime, when a new pose arrives from the camera, the kNN classifier assigns it a label, and the script sends this application the corresponding key-down instruction to animate the digital character.
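The kNN step can be sketched as a majority vote over the nearest reference poses. This is a minimal illustration assuming poses are flattened keypoint vectors; the function name and `k` value are our assumptions:

```python
import math
from collections import Counter

def knn_classify(pose, references, k=3):
    """Label a pose by majority vote among its k nearest reference poses.

    references: (label, keypoint_vector) pairs, e.g. extracted from the
    13 pre-recorded videos; pose is a keypoint vector of the same length.
    """
    nearest = sorted(references, key=lambda r: math.dist(pose, r[1]))[:k]
    votes = Counter(label for label, _ in nearest)
    return votes.most_common(1)[0][0]
```

The predicted label (e.g. *jump*) is then mapped to the matching keyboard instruction from the table above.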

[Demo for human pose control (click to expand)]

Camera Tracking Mode

This mode is designed for Human Pose Control, where you can no longer modify the view direction with the mouse. To use it, check the Track Velocity box in the Camera hidden menu; the window view will then follow the character's velocity direction.
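One way such velocity tracking can work is to steer the camera yaw toward the velocity heading while holding it steady when the character stands still. The blend factor and function name below are our assumptions, meant only to suggest how jitter near zero velocity could be avoided:

```python
import math

def track_velocity_yaw(vx, vz, prev_yaw, blend=0.1, eps=1e-4):
    """Blend the camera yaw toward the velocity direction each frame."""
    if math.hypot(vx, vz) < eps:
        return prev_yaw  # no meaningful velocity: keep the current view
    target = math.atan2(vx, vz)
    # Wrap the difference into [-pi, pi) so the camera takes the shortest turn.
    diff = (target - prev_yaw + math.pi) % (2.0 * math.pi) - math.pi
    return prev_yaw + blend * diff
```

Calling this once per frame turns the view smoothly as the character changes direction instead of snapping.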

External Terrain Object Loading

To run motion matching on different terrains, open the Ground hidden menu and select a terrain from the Show Terrain drop-down menu.

Example Demos

Basic Square terrain

walk run
creep dance

Forest terrain

walk run
jump creep

Acknowledgement

This project was undertaken as part of the 2023HS digital human course at ETH Zurich. We would like to express our sincere appreciation for the valuable guidance and support provided by our supervisor, Dongho Kang.
