Ensemble-based Graph Convolutional Networks (EGCN)

This repository holds the codebase, dataset, and models for the paper: EGCN: An Ensemble-based Learning Framework for Exploring Effective Skeleton-based Rehabilitation Exercise Assessment, by Bruce X.B. Yu, Yan Liu, Xiang Zhang, Gong Chen, Keith C.C. Chan, IJCAI 2022.

  • An advanced version, EGCN++, and the EHE Dataset are available at the EGCN++ repository.

Introduction

Rehabilitation exercise aims to restore physical function after injury. With the release of motion sensors like Kinect, skeleton-based rehabilitation assessment has attracted increasing research interest in computer vision. Existing attempts at skeleton-based rehabilitation exercise assessment usually rely on geometric features or statistical methods and lack effective skeleton data representation methods. Skeleton data are typically collected with sensors like Kinect or motion capture systems, which provide two groups of features (i.e., position and orientation features). The Graph Convolutional Network (GCN) has achieved encouraging performance for skeleton-based action recognition. However, it might not fully exploit the different features of the skeleton data. To advance the prior work, we propose an Ensemble-based GCN (EGCN) learning framework for rehabilitation exercise assessment.

Visualization of Position and Orientation Features of Skeleton Joints

EGCN makes use of the position and angle features of the skeleton data for exercise evaluation. The figures below show visualized views of the skeleton features from the KIMORE and UI-PRMD datasets. The first row shows the 3D position features, and the second row shows the angle features (a.k.a. orientation features).

Figure panels: Es5 in KIMORE (Kinect v2), E1 in UI-PRMD (Kinect v2), E1 in UI-PRMD (Vicon)
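
To make the two feature groups concrete, the Python sketch below stacks per-joint position (x, y, z) and orientation features of one sequence into channel-first tensors. The tensor layout and the frame/joint counts are illustrative assumptions, not the exact format produced by our data tools.

import numpy as np

# Illustrative sizes (assumptions, not the exact values used by the data tools)
T, V = 100, 25                             # frames, joints (Kinect v2 provides 25 joints)

position = np.random.randn(T, V, 3)        # position group: x, y, z per joint per frame
orientation = np.random.randn(T, V, 3)     # orientation group: angle components per joint per frame

# Channel-first layout (C, T, V) that graph convolutional models typically consume
pos_input = position.transpose(2, 0, 1)    # (3, T, V)
ori_input = orientation.transpose(2, 0, 1)

print(pos_input.shape, ori_input.shape)    # (3, 100, 25) (3, 100, 25)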

Prerequisites

Our codebase is based on Python3 (>=3.5). There are a few dependencies required to run the code. The major ones are

  • PyTorch (Release version 0.4.0)
  • Other Python libraries can be installed by pip install -r requirements.txt
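
A quick way to verify that the environment matches these prerequisites is the small check below (assuming PyTorch is already installed):

import sys
import torch

# The codebase targets Python >= 3.5 and PyTorch 0.4.0 (see the prerequisites above)
assert sys.version_info >= (3, 5), 'Python >= 3.5 is required'
print('Python:', sys.version.split()[0])
print('PyTorch:', torch.__version__)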

Installation

Install the bundled torchlight package from the repository root:

cd torchlight; python setup.py install; cd ..

Data Preparation

We experimented on two skeleton-based action evaluation datasets: UI-PRMD and KIMORE.

UI-PRMD

UI-PRMD is a dataset of movements related to common exercises performed by patients in physical therapy and rehabilitation programs. The dataset consists of 10 rehabilitation movements. A sample of 10 healthy individuals repeated each movement 10 times in front of two motion capture systems: a Vicon optical tracker and a Kinect camera. The data are presented as positions and angles of the body joints in the skeletal models provided by the Vicon and Kinect mocap systems. We use the consistent exercise repetitions in the Reduced Dataset (174M), the same as in prior work. The dataset in our experimental format can be downloaded from Google Drive and Baidu Wangpan (code: 1234).

After uncompressing, put the data into the folder ./data/UI_PRMD and rebuild the database with this command:

sh ./tools/gen/ui_prmd_gendata_all.sh
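
After generation, you can sanity-check the resulting files with a sketch like the one below. The file names here are hypothetical; check the output of the generation script (or the training configs) for the actual names it writes under ./data/UI_PRMD.

import os
import pickle
import numpy as np

data_dir = './data/UI_PRMD'

# Hypothetical file names for illustration; the generation script's output may differ.
data_path = os.path.join(data_dir, 'train_data.npy')
label_path = os.path.join(data_dir, 'train_label.pkl')

if os.path.exists(data_path) and os.path.exists(label_path):
    data = np.load(data_path)
    with open(label_path, 'rb') as f:
        labels = pickle.load(f)
    print('feature tensor shape:', data.shape)
    print('number of labelled samples:', len(labels))
else:
    print('Generated files not found; verify the paths produced by the script.')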

KIMORE

For the KIMORE dataset, we perform manual segmentation based on exercise-specific features. Below are depth views of the exercises in KIMORE. Es2-Es4 are segmented into left and right directions.

Depth-view panels: KIMORE Es1, Es2 (L), Es3 (L), Es4 (L), Es2 (R), Es3 (R), Es4 (R), Es5

KIMORE can be downloaded from the authors' website. The segmented skeleton data can be downloaded from Google Drive and Baidu Wangpan (code: 1234). After uncompressing, put the data into the folder ./data/KiMoRe and rebuild the database with this command:

sh ./tools/gen/kimore_gendata_all.sh

Test the Trained Results

We provide the trained results of our EGCN in the folder ./work_dir. To test the trained results on UI-PRMD, run

python ./tools/result/ui_prmd_folds_statistics.py

To test the trained results on KIMORE, run

python ./tools/result/kimore_folds_statistics.py

Training

To train different ensemble strategies of EGCN for KIMORE, run

sh train_kimore_cv.sh

To train different ensemble strategies of EGCN for UI-PRMD, run

sh train_ui_prmd_cv.sh
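
The ensemble strategies combine predictions from the GCN streams trained on the position and orientation feature groups. As a minimal illustration of score-level (late) fusion, the sketch below averages per-sample scores from two streams; the file names and equal weights are assumptions, not the exact fusion used by the training scripts.

import numpy as np

# Hypothetical per-stream score files; the actual training outputs are written under ./work_dir
pos_scores = np.load('./work_dir/position_stream_scores.npy')     # shape: (num_samples,)
ori_scores = np.load('./work_dir/orientation_stream_scores.npy')  # shape: (num_samples,)

# Weighted score-level fusion; equal weights shown for illustration
w_pos, w_ori = 0.5, 0.5
fused = w_pos * pos_scores + w_ori * ori_scores

print('fused assessment scores (first 5):', fused[:5])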

Acknowledgements

This repo is based on

Thanks to the original authors for their work!

Contact

For any questions, feel free to contact:

Bruce Yu: b r u c e x b y u AT g m a i l . c o m (remove space)
