Graph Distillation

This is the code for the paper Graph Distillation for Action Detection with Privileged Modalities, presented at ECCV 2018.

Please note that this is not an officially supported Google product.

In this work, we propose a method termed graph distillation that incorporates rich privileged information from a large-scale multi-modal dataset in the source domain, and improves learning in the target domain, where training data and modalities are scarce.
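The "graph" here is a set of learned edge weights that control how strongly each privileged modality guides the modality being trained. Below is a minimal, illustrative PyTorch sketch of such a distillation loss. It is not the repository's implementation (the actual method also learns example-specific graphs and distills intermediate representations), all names are hypothetical, and it targets a modern PyTorch API rather than the 0.3.1 release listed under Setup below.

import torch
import torch.nn.functional as F

def graph_distillation_loss(student_logits, teacher_logits_list, edge_logits, T=2.0):
    """Hypothetical graph-distillation-style loss.

    student_logits: (B, C) logits of the modality being trained.
    teacher_logits_list: list of (B, C) logits from privileged modalities.
    edge_logits: (num_teachers,) learnable scores for the graph edges.
    """
    # Normalize the learnable edge scores into weights over the teachers.
    weights = F.softmax(edge_logits, dim=0)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    loss = 0.0
    for w, t_logits in zip(weights, teacher_logits_list):
        p_teacher = F.softmax(t_logits.detach() / T, dim=1)
        # KL divergence from each teacher to the student, weighted by its edge.
        loss = loss + w * F.kl_div(log_p_student, p_teacher, reduction='batchmean')
    return (T * T) * loss  # standard temperature scaling for distillation

# Example: distill from two privileged modalities into one student.
student = torch.randn(8, 60)                # batch of 8, 60 action classes
teachers = [torch.randn(8, 60) for _ in range(2)]
edges = torch.zeros(2, requires_grad=True)  # learnable graph edge scores
loss = graph_distillation_loss(student, teachers, edges)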

If you find this code useful in your research, please cite:

@inproceedings{luo2018graph,
  title={Graph Distillation for Action Detection with Privileged Modalities},
  author={Luo, Zelun and Hsieh, Jun-Ting and Jiang, Lu and Niebles, Juan Carlos and Fei-Fei, Li},
  booktitle={ECCV},
  year={2018}
}

Setup

All code was developed and tested on Ubuntu 16.04 with Python 3.6 and PyTorch 0.3.1.
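If you need to recreate a comparable environment, one possible route (an assumption on our part, not an official setup recipe) is conda:

conda create -n graph_distillation python=3.6
source activate graph_distillation
conda install pytorch=0.3.1 -c pytorch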

Pretrained Models

You can download the pretrained models used in our paper by running the script:

sh scripts/download_models.sh

Alternatively, you can fetch the models directly from Google Cloud Storage:

  1. Install the Google Cloud SDK (https://cloud.google.com/sdk/install)
  2. Copy the pretrained models using the following command:

gsutil -m cp -r gs://graph_distillation/ckpt .

Running Models

Use the scripts in scripts/ to train models on different modalities.

Classification

See classification/run.py for descriptions of the arguments.

scripts/train_ntu_rgbd.sh trains a model for a single modality.

scripts/train_ntu_rgbd_distillation.sh trains a model with graph distillation. The modality being trained is specified by the xfer_to argument, and the modalities to distill from are specified by the modalities argument.
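As a purely hypothetical example (the real flag syntax and modality names are defined in classification/run.py and the shell scripts, so check those first), training an RGB model while distilling from a depth model might look like:

python classification/run.py --xfer_to rgb --modalities rgb depth  # hypothetical flags and values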

Detection

See detection/run.py for descriptions of the arguments. Note that the visual_encoder_ckpt_path argument points to a pretrained visual encoder checkpoint, which should come from the classification training step above.

scripts/train_pku_mmd.sh trains a model for a single modality.

scripts/train_pku_mmd_distillation.sh trains a model with graph distillation. As in classification, the modality being trained is specified by the xfer_to argument, and the modalities to distill from are specified by the modalities argument.
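Detection follows the same pattern. A hypothetical invocation (flag names taken from the descriptions above; the checkpoint path and modality values are invented for illustration) might be:

python detection/run.py --xfer_to rgb --modalities rgb depth --visual_encoder_ckpt_path ckpt/classification_rgb  # hypothetical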