We use Anaconda to manage the necessary packages. You can create an Anaconda environment called acid_train
using
conda env create -f environment.yaml
conda activate acid_train
pip install torch-scatter==2.0.4 -f https://pytorch-geometric.com/whl/torch-1.4.0+cu101.html
Next, we need to compile the extension modules used for mesh utilities, which come from Convolutional Occupancy Networks. You can do this via
python setup.py build_ext --inplace
You can obtain our pre-generated manipulation trajectories from PlushSim from this Google Drive directory. The manipulation trajectories are broken into 10GB chunks. We recommend using gdown for downloading.
After downloading, please run the following commands to decompress the data:
cat data_plush.zip.part-* > data_plush.zip
unzip data_plush.zip
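Before unzipping, you can verify that the reassembled archive is intact; a minimal sketch (the path is the output of the cat command above):

```python
# Sanity-check the reassembled archive before unzipping.
# The path below is the file produced by the `cat` command above.
import zipfile

def is_valid_archive(path):
    """Return True if `path` looks like a well-formed zip file."""
    return zipfile.is_zipfile(path)

if __name__ == "__main__":
    print(is_valid_archive("data_plush.zip"))
```

If this prints False, one of the chunks is likely missing or truncated; re-download it before unzipping.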
You should have the following folder structure:
ACID/
    data_plush/
        metadata/
            split1/
            ...
            split2/
            ...
            split3/
            ...
        split1/
        ...
        split2/
        ...
        split3/
        ...
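To catch download or extraction problems early, a small check like the following can confirm the layout above; the directory names come from the tree, and the root path is an assumption you may need to adjust:

```python
# Verify the data_plush layout shown above.
# Expected directory names are taken from the tree; pass the repo root as `root`.
from pathlib import Path

EXPECTED = [
    "data_plush/metadata/split1",
    "data_plush/metadata/split2",
    "data_plush/metadata/split3",
    "data_plush/split1",
    "data_plush/split2",
    "data_plush/split3",
]

def missing_dirs(root="."):
    """Return the expected subdirectories that are absent under `root`."""
    return [p for p in EXPECTED if not (Path(root) / p).is_dir()]

if __name__ == "__main__":
    print(missing_dirs() or "layout OK")
```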
To generate input-output pairs for ACID training, run the following scripts:
cd preprocess
python gen_data_flow_plush.py
python gen_data_flow_splits.py
python gen_data_contrastive_pairs_flow.py
This should create a train_data directory inside this folder, with the following structure:
ACID/
    train_data/
        flow/
            split1/
            split2/
            split3/
            train.pkl
            test.pkl
        pair/
            split1/
            split2/
            split3/
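As a quick sanity check after preprocessing, you can peek at the generated split files. This sketch assumes train.pkl and test.pkl are ordinary Python pickles (which the .pkl extension suggests, but see the preprocessing scripts for their exact contents):

```python
# Peek at a generated split file; assumes it is a standard Python pickle.
import pickle

def load_split(path):
    """Load a .pkl split file and report its top-level type and size."""
    with open(path, "rb") as f:
        data = pickle.load(f)
    size = len(data) if hasattr(data, "__len__") else None
    return type(data).__name__, size

if __name__ == "__main__":
    print(load_split("train_data/flow/train.pkl"))
```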
If you wish to generate the data at another location, you can pass in different flags; check each preprocessing script for details.
Finally, to train the ACID model from scratch, run:
python plush_train.py configs/plush_dyn_geodesics.yaml
For available training options, please take a look at configs/default.yaml and configs/plush_dyn_geodesics.yaml.
You can download pretrained weights from Google Drive; please save model_best.pt to result/geodesics/.
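To confirm the downloaded weights load correctly before running evaluation, a minimal sketch, assuming model_best.pt is a standard PyTorch checkpoint written with torch.save (the key names in the real checkpoint may differ):

```python
# Inspect a downloaded checkpoint without moving anything to GPU.
# Assumes model_best.pt was written with torch.save; key names vary by repo.
import torch

def summarize_checkpoint(path):
    """Return the sorted top-level keys of a dict-style checkpoint."""
    ckpt = torch.load(path, map_location="cpu")
    if isinstance(ckpt, dict):
        return sorted(ckpt.keys())
    return type(ckpt).__name__

if __name__ == "__main__":
    print(summarize_checkpoint("result/geodesics/model_best.pt"))
```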
Please check the LICENSE file. ACID may be used non-commercially, meaning for research or evaluation purposes only. For business inquiries, please contact researchinquiries@nvidia.com.
If you find our code or paper useful, please consider citing:
@article{shen2022acid,
    title={ACID: Action-Conditional Implicit Visual Dynamics for Deformable Object Manipulation},
    author={Shen, Bokui and Jiang, Zhenyu and Choy, Christopher and Guibas, Leonidas J. and Savarese, Silvio and Anandkumar, Anima and Zhu, Yuke},
    journal={Robotics: Science and Systems (RSS)},
    year={2022}
}