
This is the repo for the paper "Learn to Predict How Humans Manipulate Large-Sized Objects From Interactive Motions"


Learn to Predict How Humans Manipulate Large-Sized Objects From Interactive Motions

Hello, welcome to our project repository!

Dataset Access

To access our dataset, please visit this Google Drive link.

Visualizing Dataset Samples

To view samples from our dataset, use the following command:

python visData.py
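The visualization logic lives in the repo's visData.py; as a rough illustration of the first step such a script performs, here is a minimal sketch of turning a flat motion-capture array into per-frame joint positions. The joint count and array layout are assumptions for illustration, not the dataset's documented format.

```python
import numpy as np

def load_motion_frames(flat: np.ndarray, n_joints: int) -> np.ndarray:
    """Reshape a flat (T * J * 3,) coordinate array into (T, J, 3) frames."""
    assert flat.size % (n_joints * 3) == 0, "length must be a multiple of n_joints * 3"
    return flat.reshape(-1, n_joints, 3)

# Hypothetical example: 5 frames of a 21-joint skeleton.
frames = load_motion_frames(np.arange(5 * 21 * 3, dtype=float), n_joints=21)
print(frames.shape)  # (5, 21, 3)
```

Each frames[t] can then be handed to a 3D scatter/line plot to animate the skeleton.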

Obtaining SMPL Parameters

To extract SMPL parameters of human skeleton motion:

1. Download the dataset and place it in the ./data/ directory.
2. Adjust the configuration settings in render_mesh.py.
3. Run the following command to extract the SMPL parameters with SMPLify (see the SMPLify website). This step may be time-consuming:

python render_mesh.py

The SMPL meshes and parameters will be saved in ./data/SAMPLE_NAME/SMPL_result/. To view the SMPL joint motions, use:

python visData.py --smpl_joint True
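The --smpl_joint True style of flag suggests the argument is parsed from a string rather than as a store-true switch (argparse's type=bool would treat both "True" and "False" as truthy). A sketch of how such a flag is typically handled, as a hypothetical illustration rather than the repo's actual parser:

```python
import argparse

def str2bool(v: str) -> bool:
    """Interpret common truthy/falsy strings ('True', 'false', '1', ...)."""
    if v.lower() in ("true", "t", "yes", "1"):
        return True
    if v.lower() in ("false", "f", "no", "0"):
        return False
    raise argparse.ArgumentTypeError(f"expected a boolean, got {v!r}")

parser = argparse.ArgumentParser()
parser.add_argument("--smpl_joint", type=str2bool, default=False,
                    help="also visualize the extracted SMPL joint motion")
args = parser.parse_args(["--smpl_joint", "True"])
print(args.smpl_joint)  # True
```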

Simulation Tool

This repository also includes a basic simulator for testing the physical properties of different object layouts. Run it with:

python Simulator/Simulate_Demo.py
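The simulator's internals aren't documented here; as an illustration of one basic physical-property check a layout tester might perform, here is a sketch of a static-stability test: an object resting on the ground is stable only if its center of mass projects inside its support polygon. This is a hypothetical example, not the repo's Simulate_Demo.py.

```python
def is_statically_stable(com_xy, support_polygon):
    """Return True if the center-of-mass projection lies inside the convex
    support polygon (vertices given in counter-clockwise order), using the
    sign of the cross product along each edge."""
    n = len(support_polygon)
    for i in range(n):
        x1, y1 = support_polygon[i]
        x2, y2 = support_polygon[(i + 1) % n]
        cross = (x2 - x1) * (com_xy[1] - y1) - (y2 - y1) * (com_xy[0] - x1)
        if cross < 0:  # point is on the outside of this edge
            return False
    return True

# A unit box footprint, CCW vertices.
footprint = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(is_statically_stable((0.5, 0.5), footprint))  # True
print(is_statically_stable((1.5, 0.5), footprint))  # False (tips over)
```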

Future Updates

Stay tuned for more materials and updates to this project.
