TGN

Tensorflow Implementation of the EMNLP-2018 paper Temporally Grounding Natural Sentence in Video by Jingyuan Chen et al.

![Method overview](method.png)

Requirements

```
pip install -r requirements.txt
```

Data Preparation

  1. Download GloVe word embedding data.

```
cd download/
sh download_glove.sh
```

  2. Download dataset features.

TACoS: BaiduDrive, GoogleDrive

Charades-STA: BaiduDrive, GoogleDrive

ActivityNet-Captions: BaiduDrive, GoogleDrive

Put the feature hdf5 file in the corresponding directory ./datasets/{DATASET}/features/

We decode TACoS/Charades videos at 16 fps and extract C3D (fc6) features for each non-overlapping 16-frame snippet, so each feature corresponds to a 1-second snippet. For ActivityNet, each feature corresponds to a 2-second snippet. To extract C3D fc6 features, I mainly refer to this code.

  3. Download trained models.

Download the checkpoints and put them in the corresponding ./checkpoints/{DATASET}/ directory.

BaiduDrive, GoogleDrive

  4. Data Preprocessing (Optional)

```
cd datasets/tacos/
sh prepare_data.sh
```

Then copy the generated data into ./data/save/.

Use the corresponding scripts to prepare data for the other datasets.

You may skip this step, as the prepared data is already provided in ./datasets/{DATASET}/data/save/.

Testing and Evaluation

```
sh scripts/test_tacos.sh
sh scripts/eval_tacos.sh
```

Use the corresponding scripts to test or evaluate on the other datasets.

The predicted results are also provided in ./results/{DATASET}/ .
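The exact metrics reported by eval.py are not shown here, but temporal grounding methods are commonly scored by "R@n, IoU=m": the fraction of sentence queries for which at least one of the top-n predicted segments overlaps the ground-truth segment with temporal IoU of at least m. A minimal sketch of that metric (function names are mine, not from eval.py):

```python
def temporal_iou(pred, gt):
    """IoU between two [start, end] segments in seconds.

    Non-overlapping pairs get intersection 0, hence IoU 0.
    """
    inter = max(0.0, min(pred[1], gt[1]) - max(pred[0], gt[0]))
    union = max(pred[1], gt[1]) - min(pred[0], gt[0])
    return inter / union if union > 0 else 0.0

def recall_at_n(predictions, ground_truths, n=1, iou_thresh=0.5):
    """R@n, IoU=m: fraction of queries with a top-n hit above the threshold.

    predictions: per-query lists of [start, end] segments, ranked by score.
    ground_truths: one [start, end] segment per query.
    """
    hits = sum(
        any(temporal_iou(p, g) >= iou_thresh for p in preds[:n])
        for preds, g in zip(predictions, ground_truths)
    )
    return hits / len(ground_truths)
```

Results are typically reported at several operating points, e.g. n in {1, 5} crossed with IoU thresholds in {0.1, 0.3, 0.5}.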

Training

```
sh scripts/train_tacos.sh
```

Use the corresponding scripts to train on the other datasets.
