[Paper], [slides] (code: ABAW), [video] (code: ABAW)
This repository is the codebase for the ABAW4 challenge. It includes EMMA for the multi-task learning (MTL) challenge and masked CoTEX for the learning from synthetic data (LSD) challenge.
If you find this repo useful, please cite the following BibTeX entry. Thank you!
@inproceedings{li2023affective,
title={Affective Behaviour Analysis Using Pretrained Model with Facial Prior},
author={Li, Yifan and Sun, Haomiao and Liu, Zhaori and Han, Hu and Shan, Shiguang},
booktitle={European Conference on Computer Vision Workshop},
pages={19--30},
year={2023},
organization={Springer}
}
The pretrained models for EMMA and CoTEX are provided through the following URLs:
MAE ViT pretrained on CelebA [link] (code: ABAW)
DAN pretrained on AffectNet [link] (code: ABAW)
We also provide the pretrained EMMA model:
EMMA [link] (code: ABAW)
This codebase is based on Python 3.7.
Ensure you have installed all the necessary Python packages: pip install -r requirements.txt
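As a sketch, the setup above can be done in an isolated virtual environment; the environment name "abaw4" is an assumption, and requirements.txt is assumed to sit at the repository root:

```shell
# Create and activate a virtual environment (name "abaw4" is illustrative),
# then install the pinned dependencies from the repo root.
python3.7 -m venv abaw4
source abaw4/bin/activate
pip install -r requirements.txt
```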
Please download the ABAW4 data including MTL and LSD before running the code.
To train EMMA:
- First, change the pretrained model and dataset directories in the script shs/train_EMMA.sh.
- Second, run the following command: sh shs/train_EMMA.sh
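For illustration, the edited shs/train_EMMA.sh might look like the following. The variable names, checkpoint filenames, and command-line flags here are assumptions, not the script's actual contents; check the script itself for the real options:

```shell
# Hypothetical example only: paths, filenames, and flags are illustrative
# and must be matched to the real shs/train_EMMA.sh.
DATA_ROOT=/path/to/ABAW4/MTL
MAE_CKPT=/path/to/mae_vit_celeba.pth     # MAE ViT pretrained on CelebA
DAN_CKPT=/path/to/dan_affectnet.pth      # DAN pretrained on AffectNet

python train_EMMA.py \
    --data_root "$DATA_ROOT" \
    --mae_pretrained "$MAE_CKPT" \
    --dan_pretrained "$DAN_CKPT"
```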
To train masked CoTEX:
- First, change the pretrained model and dataset directories in the script shs/train_masked_CoTEX.sh.
- Second, run the following command: sh shs/train_masked_CoTEX.sh
This codebase builds on masked autoencoder (MAE) and DAN. Thanks to their authors!