Data preparation

We adapt the data preparation code from Graph-CMR. To use this functionality, you need to download the relevant datasets. The datasets that our code supports are:

  1. Human3.6M
  2. UP-3D
  3. LSP
  4. MPII
  5. COCO

More specifically:

  1. Human3.6M: Unfortunately, due to license limitations, we are not allowed to redistribute the MoShed data that we used for training. We only provide code to evaluate our approach on this benchmark. To download the relevant data, please visit the website of the dataset and download the Videos, BBoxes MAT (under Segments) and 3D Positions Mono (under Poses) for Subjects S9 and S11. After downloading and uncompressing the data, store them in the folder ${Human3.6M root}. The structure of the data should look like this:
${Human3.6M root}
|-- S9
    |-- Videos
    |-- Segments
    |-- Bboxes
|-- S11
    |-- Videos
    |-- Segments
    |-- Bboxes

You also need to edit the file utils.config.py to reflect the path ${Human3.6M root} you used to store the data (see the config sketch after this list).

  2. UP-3D: We use this data both for training and evaluation. You need to download the UP-3D zip (which provides images and 3D shapes for training and testing) and the UPi-S1h zip (which we will need for silhouette evaluation on the LSP dataset). After you unzip them, please edit config.py to include the paths for the two datasets.

  3. LSP: We again use LSP both for training and evaluation. You need to download the high resolution version of the dataset, LSP dataset original (for training), and the low resolution version, LSP dataset (for evaluation). After you unzip the dataset files, please complete the relevant root paths of the datasets in the file config.py.

  4. MPII: We use this dataset for training. You need to download the compressed file of the MPII dataset. After uncompressing, please complete the root path of the dataset in the file config.py.

  5. COCO: We use this dataset for training. You need to download the images and the annotations for the 2014 training set of the dataset. After you unzip the files, the folder structure should look like:

${COCO root}
|-- train2014
|-- annotations

Then, you need to edit the utils.config.py file with the ${COCO root} path.
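
For reference, the dataset root entries in the config file might look roughly like the sketch below. All variable names and paths here are placeholders rather than the repository's actual definitions, so match them to whatever config.py / utils.config.py expects in your version of the code.

```python
# Illustrative sketch of dataset root settings -- variable names are assumptions.
# Point each entry at the folder where you stored the corresponding dataset.
H36M_ROOT = '/data/Human3.6M'                      # ${Human3.6M root}
UP_3D_ROOT = '/data/UP-3D'
UPI_S1H_ROOT = '/data/UPi-S1h'                     # used for silhouette evaluation on LSP
LSP_ORIGINAL_ROOT = '/data/lsp_dataset_original'   # high-resolution LSP (training)
LSP_ROOT = '/data/lsp_dataset'                     # low-resolution LSP (evaluation)
MPII_ROOT = '/data/mpii'
COCO_ROOT = '/data/coco'                           # ${COCO root} with train2014/ and annotations/
```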

Generate dataset files

After preparing the data, we continue with the preprocessing to produce the data/annotations for each dataset in the expected format. You need to run the file preprocess_datasets.py from the main folder of this repo, which will do all this work automatically. Depending on whether you want to do evaluation and/or training, we provide two modes:

If you want to generate the files such that you can evaluate our pretrained models, you need to run:

python preprocess_datasets.py --eval_files

If you want to generate the files such that you can train using the supported datasets, you need to run:

python preprocess_datasets.py --train_files
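
If you need both sets of files, the two flags can likely be combined in a single run, mirroring the preprocess_extra_datasets.py invocation shown later; note that this combined form is an assumption, not a documented option:

python preprocess_datasets.py --eval_files --train_files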

Generate ground truth IUV image

For the training process, we also need to generate the GT IUV image using the GT SMPL parameters. You need to run:

python preprocess_datasets.py --gt_iuv

The above command will generate the IUV images under our new UV map. If you want to generate the IUV images under the SMPL default UV map, you may run:

python preprocess_datasets.py --gt_iuv --uv_type=SMPL

Extra datasets preparation

We also provide the code to train and evaluate our model on some extra datasets:

  1. SURREAL
  2. 3DPW
  3. MPI-INF-3DHP
  4. HR-LSPET

Download Data

  1. SURREAL: We use the SURREAL dataset for training and evaluation. You need to download the data from the dataset website, and then complete the root path of the dataset in the file utils.config.py. For the evaluation on SURREAL, we use the same setting as BodyNet. BodyNet does not use all of the evaluation images, so you may need to download the valid image list from here. The valid image list comes from this issue.

  2. 3DPW: We use this dataset only for evaluation. You need to download the data from the dataset website. After you unzip the dataset files, please complete the root path of the dataset in the file utils.config.py.

  3. MPI-INF-3DHP: We use this dataset for training and evaluation. You need to download the data from the dataset website. The expected folder structure at the end of the processing looks like:

${MPI_INF_3DHP root}
|-- mpi_inf_3dhp_test_set
    |-- TS1
|-- S1
    |-- Seq1
        |-- imageFrames
            |-- video_0

Then, you need to edit the utils.config.py file with the ${MPI_INF_3DHP root} path.

  4. HR-LSPET: We use the extended training set of LSP in its high resolution form (HR-LSPET). You need to download the high resolution images. After you unzip the dataset files, please complete the root path of the dataset in the file utils.config.py.
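
As with the main datasets, the extra dataset roots in utils.config.py might look roughly like the following sketch; the variable names are placeholders and may differ from the actual file:

```python
# Illustrative sketch of extra dataset roots -- names are assumptions.
SURREAL_ROOT = '/data/SURREAL'
PW3D_ROOT = '/data/3DPW'
MPI_INF_3DHP_ROOT = '/data/mpi_inf_3dhp'   # ${MPI_INF_3DHP root}
LSPET_ROOT = '/data/hr_lspet'              # high-resolution LSP extended (HR-LSPET)
```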

Generate Dataset Files

In order to generate the dataset files, you may run:

python preprocess_extra_datasets.py --eval_files --train_files

For the MPI-INF-3DHP and 3DPW datasets, you may also directly use the processed dataset files provided by SPIN. You may run:

wget http://visiondata.cis.upenn.edu/spin/dataset_extras.tar.gz

Then extract the files into the directory containing your dataset files.
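
Since dataset_extras.tar.gz is a gzipped tarball, a typical download-and-extract sequence looks like the sketch below; the destination path is a placeholder for wherever your generated dataset files live:

```sh
wget http://visiondata.cis.upenn.edu/spin/dataset_extras.tar.gz
# Extract into the folder containing your dataset files (placeholder path).
tar -xzf dataset_extras.tar.gz -C /path/to/your/dataset_files
```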

Use SPIN Fitting Results as GT

If you want to use the fitting results of SPIN as GT, you can download the fitting results from here and unzip the fits to the directory containing your dataset files.

Generate GT IUV Images

You may generate the GT IUV images by running:

python preprocess_extra_datasets.py --gt_iuv