End2You - The Imperial Toolkit for Multimodal Profiling

We introduce End2You, the Imperial toolkit for multimodal profiling. This repository provides easy-to-use scripts to train and evaluate unimodal or multimodal models in an end-to-end manner, for either regression or classification outputs. The input to a model can be 1D (e.g. audio, EEG, heart rate), 2D (e.g. spectrogram, image), or 3D (e.g. video). The main building blocks of the unimodal and multimodal models are (i) a Convolutional Neural Network (CNN) that extracts spatial features from the raw data, and (ii) a Recurrent Neural Network (RNN) that captures the temporal information in the data. The models can be combined in any desirable way, and the user can also define new models and combine them with the existing implementations.
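To make the CNN-plus-RNN structure concrete, here is a minimal illustrative sketch in PyTorch of a 1D (audio-style) model. The layer sizes, class name, and hyperparameters are our own assumptions for illustration, not the actual End2You implementation:

```python
import torch
import torch.nn as nn

class AudioCnnRnn(nn.Module):
    """Illustrative CNN + RNN model: per-frame spatial features
    from a 1D CNN, temporal modelling with a GRU (assumed sizes)."""

    def __init__(self, hidden_size=64, num_outputs=1):
        super().__init__()
        # CNN extracts spatial features from raw 1D samples.
        self.cnn = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=8, stride=4), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=8, stride=4), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        # RNN captures temporal information across frames.
        self.rnn = nn.GRU(32, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, num_outputs)

    def forward(self, x):
        # x: (batch, num_frames, samples_per_frame) of raw 1D data
        b, t, s = x.shape
        feats = self.cnn(x.reshape(b * t, 1, s)).reshape(b, t, -1)
        out, _ = self.rnn(feats)
        return self.head(out)  # one prediction per frame

model = AudioCnnRnn()
y = model(torch.randn(2, 5, 640))
print(tuple(y.shape))  # (2, 5, 1): batch, frames, outputs
```

The same pattern extends to 2D or 3D inputs by swapping the 1D convolutions for `Conv2d`/`Conv3d` front-ends feeding the same recurrent back-end.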

The End2You workflow (shown below) comprises the data generator, which converts the raw files to HDF5; the data provider, which feeds the data to the models; and, finally, the prediction step. We provide a number of audio/visual/multimodal models (see tutorials or cli). One can easily train their own models, or use our pre-trained models on their own dataset.
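As a rough sketch of the first step, the data generator writes raw inputs and labels into an HDF5 file that the data provider later reads. The dataset names and shapes below are hypothetical, chosen only to illustrate the raw-to-HDF5 conversion; the real End2You generator defines its own layout:

```python
import os
import tempfile

import h5py
import numpy as np

# Hypothetical layout: 100 frames of 640 raw audio samples each,
# with one regression label per frame (assumed names and shapes).
path = os.path.join(tempfile.mkdtemp(), "sample.hdf5")

with h5py.File(path, "w") as f:
    f.create_dataset("raw_audio", data=np.zeros((100, 640), dtype="float32"))
    f.create_dataset("labels", data=np.zeros((100, 1), dtype="float32"))

# A data provider would open the file and stream frames to the model.
with h5py.File(path, "r") as f:
    shapes = {name: f[name].shape for name in f}

print(shapes)
```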


Citing

If you use End2You or any code from End2You in your research, you are kindly asked to acknowledge its use in your publications.

Tzirakis, P., Zafeiriou, S., & Schuller, B. (2018). End2You -- The Imperial Toolkit for Multimodal Profiling by End-to-End Learning. arXiv preprint arXiv:1802.01115.

@article{tzirakis2018end2you,
  title={End2You--The Imperial Toolkit for Multimodal Profiling by End-to-End Learning},
  author={Tzirakis, Panagiotis and Zafeiriou, Stefanos and Schuller, Bjorn W},
  journal={arXiv preprint arXiv:1802.01115},
  year={2018}
}

Challenges

End2You has been used with great success, providing strong baselines in the following challenges:

Installation

Conda Installation

We highly recommend using conda as your Python distribution. After downloading and installing conda, you can install this project using the conda_setup.yml file as follows:

$ conda env create -f conda_setup.yml

You can now activate the environment and use End2You:

$ conda activate end2you

Pip Installation

Another way to install End2You is via pip as shown below:

$ pip install git+https://github.com/end2you/end2you.git

Dependencies

The following modules are required to run the code:

  • Python >= 3.7
  • NumPy >= 1.19.2
  • Pytorch >= 1.7 (see Installation section for installing this module)
  • MoviePy >= 1.0.3
  • scikit-learn >= 0.23.2
  • h5py >= 2.10.0
  • facenet-pytorch >= 2.5
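A quick way to verify what is present in your environment is to query installed distribution versions with the standard library. The distribution names below follow the PyPI conventions (e.g. `torch` for PyTorch, `scikit-learn` for sklearn); adjust them if your environment packages these differently:

```python
import importlib.metadata as md

# PyPI distribution name -> minimum version from the list above.
required = {
    "numpy": "1.19.2",
    "torch": "1.7",
    "moviepy": "1.0.3",
    "scikit-learn": "0.23.2",
    "h5py": "2.10.0",
    "facenet-pytorch": "2.5",
}

results = {}
for dist, minimum in required.items():
    try:
        results[dist] = md.version(dist)  # installed version string
    except md.PackageNotFoundError:
        results[dist] = None  # not installed; needs >= minimum

print(results)
```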
