Online Action Recognition for Human Risk Prediction with Anticipated Haptic Alert via Wearables

Cheng Guo, Lorenzo Rapetti, Kourosh Darvish, Riccardo Grieco, Francesco Draicchio and Daniele Pucci

Video: Humanoid_2023_video_for_paper.mp4

2023 IEEE-RAS International Conference on Humanoid Robots (Humanoids)

Data

🔓 The labeled dataset (as txt files), the raw wearables dataset, and the models can be downloaded here.
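As a quick reference, the snippet below is a minimal Python/NumPy sketch for loading one of the labeled txt files. The file name, the whitespace-separated layout, and the label being stored in the last column are assumptions, so adapt them to the actual format of the downloaded data.

import numpy as np

# Hypothetical file name; replace it with a file from the downloaded labeled dataset.
# Assumption: one frame per row, whitespace-separated features, integer action label in the last column.
data = np.loadtxt("labeled_lifting_trial_01.txt")
features = data[:, :-1]           # wearable/kinematic features per frame
labels = data[:, -1].astype(int)  # action index per frame
print(features.shape, labels.shape)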

Dependencies

📌 Required

  • YARP: a library and toolkit for communication and device interfaces.
  • YCM: a set of CMake files that support the creation and maintenance of repositories and software packages.
  • CMake: an open-source, cross-platform family of tools designed to build, test and package software.
  • HDE: a collection of YARP devices for the online estimation of the kinematics and dynamics of a human subject.

📎 Optional

  • iDynTree: a library of robot dynamics algorithms for control, estimation and simulation.
  • Wearables: a library for communication and interfaces with wearable sensors.
  • iFeel: a wearable perception system providing kinematic (positions and velocities) and dynamic human information.

🔥 Ubuntu 20.04.5 LTS (Focal Fossa) is used in this project.

Installation

First, clone this repository:

git clone https://github.com/ami-iit/paper_Guo_2023_Humanoid_Action_Recognition_For_Risk_Prediction.git

🔧 Install robotology-superbuild

  • Install mamba if you don't have it already; you can follow the instructions here.
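The commands below are a rough sketch of building robotology-superbuild from source inside a mamba environment, with the human-dynamics profile enabled so that HDE and Wearables are installed. The environment name and the chosen CMake options are assumptions; refer to the official robotology-superbuild documentation for the authoritative steps.

mamba create -n robotologyenv
mamba activate robotologyenv
mamba install -c conda-forge compilers cmake make ninja pkg-config
git clone https://github.com/robotology/robotology-superbuild.git
cd robotology-superbuild && mkdir build && cd build
cmake -DROBOTOLOGY_ENABLE_HUMAN_DYNAMICS:BOOL=ON ..
make
source ./install/share/robotology-superbuild/setup.sh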

🔧 Install this project
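Detailed build steps are not listed here yet. The later instructions assume that element_human-action-intention-recognition is built and installed (binaries under build/install/bin) and that a Python virtual environment with the training/inference dependencies is available. Purely as a placeholder, and assuming a standard CMake layout and a requirements.txt file (both assumptions), the setup might look like:

cd element_human-action-intention-recognition
mkdir build && cd build
cmake -DCMAKE_INSTALL_PREFIX=./install ..
make install
cd ../..
python3 -m venv venv && source venv/bin/activate
pip install -r element_human-action-intention-recognition/requirements.txt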

Running

🔨 Offline annotation and training

To annotate the data, one may follow the instructions below:

  • Start the YARP name server:
yarpserver --write
  • Open yarpdataplayer to replay the recorded data:
yarpdataplayer --withExtraTimeCol 2
  • Go to ~/robotology-superbuild/src/HumanDynamicsEstimation/conf/xml and run the configuration file (use Human.xml for the full joint list, or HumanStateProvider_ifeel_0.xml for the reduced joint list):
yarprobotinterface --config proper-configuration-file.xml
  • Go to ~/element_human-action-intention-recognition/build/install/bin, make sure you are in the previously installed virtual environment and that all parameters in humanDataAcquisition.ini are set properly, then run:
./humanDataAcquisitionModule --from humanDataAcquisition.ini
  • To start the annotation you may need to visualize the human model; make sure the parameter settings in HumanPredictionVisualizer.ini are correct, then run:
./HumanPredictionVisualizer --from HumanPredictionVisualizer.ini

Recalling the index of each action defined here, one can annotate the data manually.

🔨 Test on recorded data

  • First of all, make sure yarpserver is running.
  • Open yarpdataplayer to replay data.
  • Go to ~/robotology-superbuild/src/HumanDynamicsEstimation/conf/xml and run the configuration file (for the reduced 31-DoF joint list) with:
yarprobotinterface --config configuration_file_name.xml
  • Then go to ~/element_human-action-intention-recognition/build/install/bin and run:
./humanDataAcquisitionModule --from humanDataStreamingOnlineTest.ini
  • (Remember to be in the virtual environment) Go to ~/element_human-action-intention-recognition and run:
python3 ./scripts/MoE/main_test_moe.py
  • (Remember to be in the virtual environment) Additional: to display the action recognition/motion prediction results, go to ~/element_human-action-intention-recognition_modified/scripts/MoE and run:
bash ./runAnimators.sh
  • Additional: to visualize the simulated human models, go to ~/element_human-action-intention-recognition_modified/build/install/bin and run:
./HumanPredictionVisualizer --from HumanPredictionVisualizer.ini
  • Additional: to calibrate the simulated model, download the file here and run it while the human model is in T-pose (you can pause yarpdataplayer during calibration and replay it again afterwards):
bash ./TPoseCalibration.sh zero
  • Go to ~/element_risk-prediction and run:
python3 ./src/main_model_based_risk_evaluation.py
  • To start the NIOSH-based ergonomics evaluation module (a sketch of the underlying lifting equation is given after this list), run:
python3 ./src/niosh_method/nioshOnlineEasyUse.py
  • To display the ergonomics evaluation results, go to ~/element_risk-prediction/src/niosh_method and run:
bash ./runAnimators.sh
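For context, the NIOSH method referenced above is the revised NIOSH lifting equation, which multiplies a load constant by a set of penalty factors to obtain a Recommended Weight Limit (RWL) and then compares the actual load against it through the Lifting Index (LI). The sketch below is a generic Python implementation of that equation for illustration only; it is not the code of nioshOnlineEasyUse.py, and the frequency and coupling multipliers are passed in directly instead of being looked up from the standard NIOSH tables.

def niosh_rwl(h_cm, v_cm, d_cm, a_deg, fm, cm):
    """Recommended Weight Limit (kg), revised NIOSH lifting equation (metric form)."""
    LC = 23.0                                                                # load constant (kg)
    HM = 1.0 if h_cm <= 25 else (25.0 / h_cm if h_cm <= 63 else 0.0)         # horizontal multiplier
    VM = 1.0 - 0.003 * abs(v_cm - 75.0) if 0 <= v_cm <= 175 else 0.0         # vertical multiplier
    DM = 1.0 if d_cm <= 25 else (0.82 + 4.5 / d_cm if d_cm <= 175 else 0.0)  # distance multiplier
    AM = 1.0 - 0.0032 * a_deg if a_deg <= 135 else 0.0                       # asymmetry multiplier
    return LC * HM * VM * DM * AM * fm * cm

def lifting_index(load_kg, rwl_kg):
    """LI > 1 indicates an increased risk of lifting-related low-back injury."""
    return load_kg / rwl_kg

# Example: load held 30 cm from the ankles, lifted from 40 cm to 120 cm height,
# 30 degrees of trunk asymmetry, frequency multiplier 0.88, good coupling (CM = 1.0).
rwl = niosh_rwl(h_cm=30, v_cm=40, d_cm=80, a_deg=30, fm=0.88, cm=1.0)
print(f"RWL = {rwl:.1f} kg, LI for a 10 kg load = {lifting_index(10.0, rwl):.2f}")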

🔨 Online inference

Under construction; for the moment, one may follow the instructions here.

Maintainer

👤 This repository is maintained by:

@Zweisteine96
