# MULE: Multimodal Universal Language Embedding (AAAI 2020 Oral)

This repository implements:

Donghyun Kim, Kuniaki Saito, Kate Saenko, Stan Sclaroff, Bryan A. Plummer.

MULE: Multimodal Universal Language Embedding. AAAI, 2020 (Oral).

Our project page can be found here.

## Environment

This code was tested with Python 2.7 and TensorFlow 1.2.1.
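
A minimal setup might look like the following (a sketch assuming conda is available; only the Python and TensorFlow versions above are stated requirements, everything else here is an assumption):

```bash
# Hypothetical setup; swap tensorflow-gpu for tensorflow on CPU-only machines.
conda create -n mule python=2.7
conda activate mule
pip install tensorflow-gpu==1.2.1
```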

## Preparation

1. Download data
   - Download the data from here
   - Unzip the file and place the data in the repo; all data files should end up in `./data` (see the sketch after this list)
2. Download FastText
   - `sh fetch_fasttext_embeddings.sh`
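
In practice, step 1 might look like the following (a sketch; `mule_data.zip` is a hypothetical name for the downloaded archive):

```bash
# Hypothetical archive name; use whatever the downloaded file is actually called.
unzip mule_data.zip
# All data files should now be in ./data
ls ./data
```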

## Training and Testing

```bash
./run_mule.sh [MODE] [GPU_ID] [DATASET] [TAG] [EPOCH]
# MODE: {train, test, val} -- train the model, or evaluate it on the test or val split
# GPU_ID: the ID of the GPU to run on
# DATASET: {multi30k, coco}, as defined in run_mule.sh
# TAG: an experiment name
# EPOCH: optional; the epoch to evaluate. If omitted, the model that performed
#        best on the validation split is used.
# Examples:
./run_mule.sh train 0 multi30k mule
./run_mule.sh train 1 coco mule
./run_mule.sh test 1 coco mule
./run_mule.sh val 0 multi30k mule 20
```

By default, trained networks are saved under:

```
models/[NET]/[DATASET]/[TAG]/
```

## Citation

If you find our code useful, please consider citing:

```bibtex
@inproceedings{kimMULEAAAI2020,
  title={{MULE: Multimodal Universal Language Embedding}},
  author={Donghyun Kim and Kuniaki Saito and Kate Saenko and Stan Sclaroff and Bryan A. Plummer},
  booktitle={AAAI Conference on Artificial Intelligence},
  year={2020}
}
```