
Code for the paper "ECCVW2022 - Affective Behaviour Analysis Using Pretrained Model with Facial Prior", imported from https://github.com/JackYFL/EMMA_CoTEX_ABAW4




Affective Behaviour Analysis Using Pretrained Model with Facial Priori for ABAW4

[Paper], [slides] (code: ABAW), [video] (code: ABAW)

This repository is the codebase for the ABAW4 challenge. It includes EMMA for the multi-task learning (MTL) challenge and Masked CoTEX for the learning-from-synthetic-data (LSD) challenge.

Citing this paper

If you find this repo useful, please cite the following BibTeX entry. Thank you!

@inproceedings{li2023affective,
  title={Affective Behaviour Analysis Using Pretrained Model with Facial Prior},
  author={Li, Yifan and Sun, Haomiao and Liu, Zhaori and Han, Hu and Shan, Shiguang},
  booktitle={European Conference on Computer Vision Workshop},
  pages={19--30},
  year={2023},
  organization={Springer}
}

Pretrained models

The pretrained models for EMMA and CoTEX are available at the following links:

MAE ViT pretrained on CelebA [link] (code: ABAW)
DAN pretrained on AffectNet [link] (code: ABAW)

We also provide the pretrained EMMA model:

EMMA [link] (code: ABAW)

Requirements

This codebase is based on Python 3.7. To install all the necessary Python packages, run:

pip install -r requirements.txt
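A typical setup might look like the following sketch. It assumes conda is available (any Python 3.7 environment tool works), and the environment name `emma` is an arbitrary choice, not something the repo prescribes:

```shell
# Create and activate an isolated Python 3.7 environment
# (conda assumed here; venv/virtualenv work equally well)
conda create -n emma python=3.7 -y
conda activate emma

# Clone the repository and install its dependencies
git clone https://github.com/HanHuCAS/EMMA.git
cd EMMA
pip install -r requirements.txt
```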

Data

Please download the ABAW4 data, including the MTL and LSD subsets, before running the code.

Training

EMMA

  • First, update the pretrained model and dataset directories in the script shs/train_EMMA.sh.

  • Second, run the following command:

sh shs/train_EMMA.sh

Masked CoTEX

  • First, update the pretrained model and dataset directories in the script shs/train_masked_CoTEX.sh.

  • Second, run the following command:

sh shs/train_masked_CoTEX.sh

Reference

This code builds on the masked autoencoder (MAE) and DAN codebases. Thank you!
