Learning Factorized Multimodal Representations

PyTorch implementation for learning factorized multimodal representations using deep generative models. (A conceptual sketch of the factorization follows the paper reference below.)

Correspondence to:

Paper

Learning Factorized Multimodal Representations
Yao-Hung Hubert Tsai*, Paul Pu Liang*, Amir Zadeh, Louis-Philippe Morency, and Ruslan Salakhutdinov
ICLR 2019. (*equal contribution)
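
As a rough intuition for the factorization, here is a minimal conceptual sketch (illustrative only, not the architecture implemented in this repository; all module names and dimensions below are made up): the model splits multimodal inputs into a shared multimodal discriminative factor used for prediction and one modality-specific generative factor per modality used for reconstruction.

```python
import torch
import torch.nn as nn

class MFMSketch(nn.Module):
    """Toy factorization: a shared discriminative factor z_y for prediction,
    plus one private generative factor z_m per modality for reconstruction."""

    def __init__(self, input_dims, z_dim=32, num_classes=1):
        super().__init__()
        # per-modality encoders infer the modality-specific factors z_m
        self.encoders = nn.ModuleList(nn.Linear(d, z_dim) for d in input_dims)
        # per-modality decoders reconstruct each input from [z_y, z_m]
        self.decoders = nn.ModuleList(nn.Linear(2 * z_dim, d) for d in input_dims)
        # a joint encoder infers the shared multimodal factor z_y
        self.joint_encoder = nn.Linear(sum(input_dims), z_dim)
        self.classifier = nn.Linear(z_dim, num_classes)

    def forward(self, xs):
        z_y = self.joint_encoder(torch.cat(xs, dim=-1))       # shared factor
        z_ms = [enc(x) for enc, x in zip(self.encoders, xs)]  # private factors
        recons = [dec(torch.cat([z_y, z_m], dim=-1))
                  for dec, z_m in zip(self.decoders, z_ms)]
        return self.classifier(z_y), recons

# usage: three modalities (e.g. text/audio/vision features), batch of 8
dims = [300, 74, 35]
model = MFMSketch(dims)
prediction, reconstructions = model([torch.randn(8, d) for d in dims])
```

In the paper the factors are learned with a variational objective; the toy code above only mirrors the factorized wiring, not the training procedure.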

Installation

First, check that the requirements below are satisfied (a pip command is sketched after the list):
- Python 3.6/3.7
- PyTorch 0.4.0
- numpy 1.13.3
- sklearn 0.20.0
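
The packages can typically be installed with pip, for example (an untested suggestion; sklearn is published on PyPI as scikit-learn, and a PyTorch wheel this old may require the platform-specific instructions from pytorch.org):

```
pip install torch==0.4.0 numpy==1.13.3 scikit-learn==0.20.0
```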

The next step is to clone the repository:

git clone https://github.com/pliang279/factorized.git

Dataset

Please download the latest versions of the CMU-MOSI, CMU-MOSEI, POM, and IEMOCAP datasets, which can be found at https://github.com/A2Zadeh/CMU-MultimodalSDK/.
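
For example, once the SDK from that repository is installed, CMU-MOSI can be fetched along the following lines (a sketch based on the SDK's documented recipes; the destination folder 'cmumosi/' is an arbitrary local path, and the recipe names should be double-checked against the SDK's README):

```python
from mmsdk import mmdatasdk

# download the pre-extracted high-level features for CMU-MOSI
mosi = mmdatasdk.mmdataset(mmdatasdk.cmu_mosi.highlevel, 'cmumosi/')
# add the sentiment labels to the same dataset object
mosi.add_computational_sequences(mmdatasdk.cmu_mosi.labels, 'cmumosi/')
```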

Scripts

Please run

python mfm_test_mosi.py

from the command line.

Similar scripts for loading and running models on the other datasets can be found in mfm_test_mmmo.py, mfm_test_moud.py, etc.

If you use this code, please cite our paper:

@inproceedings{DBLP:journals/corr/abs-1806-06176,
  title     = {Learning Factorized Multimodal Representations},
  author    = {Yao{-}Hung Hubert Tsai and
               Paul Pu Liang and
               Amir Zadeh and
               Louis{-}Philippe Morency and
               Ruslan Salakhutdinov},
  booktitle = {ICLR},
  year      = {2019}
}

Related papers and repositories building upon these datasets:
- CMU-MOSEI dataset: paper, code
- Memory Fusion Network: paper, code
- Multi-Attention Recurrent Network: paper, code
- Graph-MFN: paper, code
- Multimodal Transformer: paper, code
- Multimodal Cyclic Translations: paper, code
