
About the pre-trained models #10

Closed
balabala-h opened this issue Dec 27, 2022 · 2 comments

Comments

@balabala-h

No description provided.

@balabala-h
Author

Hi, it seems that the pre-trained models are not available now.
I have also tried to train this model on RAF-DB without the pre-trained models, and I got an unsatisfactory result. Would you please provide the results without pre-training?

@balabala-h changed the title from "About" to "About the pre-trained models" Dec 27, 2022
@zyh-uaiaaaa
Owner

You can find the pretrained model here:
https://drive.google.com/file/d/1yQRdhSnlocOsZA4uT_8VO0-ZeLXF4gKd/view?usp=sharing

Yes, pretraining is important for the performance, as the attention consistency module needs a relatively strong backbone to effectively calculate the attention map. Without pretraining, the performance on RAF-DB with 10%, 20%, and 30% noise is around 73.36%, 71.21%, and 68.20%, respectively.
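For anyone following along: the usual way to use a checkpoint like the one linked above is to initialize the backbone from it before fine-tuning. A minimal PyTorch sketch, assuming a toy backbone as a stand-in (the repository's actual architecture, checkpoint filename, and state-dict keys may differ):

```python
import torch
import torch.nn as nn

# Hypothetical small backbone standing in for the real one (e.g. a ResNet);
# the actual model in the repo and the checkpoint layout may differ.
class Backbone(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)
        self.fc = nn.Linear(8, 7)  # RAF-DB has 7 basic expression classes

    def forward(self, x):
        h = self.conv(x).mean(dim=(2, 3))  # global average pooling
        return self.fc(h)

model = Backbone()

# Save and reload a checkpoint to illustrate strict=False loading, which
# tolerates mismatched keys (e.g. a classifier head trained on other classes).
torch.save(model.state_dict(), "pretrained_backbone.pth")
state = torch.load("pretrained_backbone.pth", map_location="cpu")
missing, unexpected = model.load_state_dict(state, strict=False)
print(len(missing), len(unexpected))  # 0 0 when all keys match
```

With `strict=False`, `load_state_dict` returns the lists of missing and unexpected keys, which is a quick sanity check that the checkpoint actually matched the backbone you built.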
