
PSR

This is the official implementation for the paper "EMP: Emotion-guided Multi-modal Fusion and Contrastive Learning for Personality Traits Recognition".

Structure of EMP

(Figure: overall architecture of the EMP model)


Dependencies

  • Python 3.9.1
  • pytorch-lightning 1.7.2
  • Linux 5.11.0-46-generic

Requirements

  • We use Sentence-Transformers for text feature extraction. Sentence Embedding
  • We use the large X3D network for visual feature extraction. X3D

Dataset

The ChaLearn First Impressions dataset can be found at First impressions.

The ELEA dataset can be found on the official website ELEA, and you need to apply for access.

Other required files, such as annotations and transcriptions, can all be found on the dataset websites.

Usage

Train the model

ulimit -SHn 51200    # raise the soft and hard open-file limits
python main.py --accelerator 'gpu' --devices 1

Citation

If this repository is useful to you, please cite:

@inproceedings{10.1145/3591106.3592243,
author = {Wang, Yusong and Li, Dongyuan and Funakoshi, Kotaro and Okumura, Manabu},
title = {EMP: Emotion-Guided Multi-Modal Fusion and Contrastive Learning for Personality Traits Recognition},
year = {2023},
booktitle = {Proceedings of the 2023 ACM International Conference on Multimedia Retrieval},
pages = {243–252},
}

Contact

wang.y.ce@m.titech.ac.jp
