Performance of BERT on ACE2005 and MAVEN #4
Hi, thanks for your work on this dataset.

I notice that you compare the performance of BiLSTM and BERT on both ACE2005 and MAVEN, and it seems that BiLSTM outperforms BERT on ACE2005. However, some papers report different results. For example, https://www.aclweb.org/anthology/P19-1522/ reports an 80+ F1 score with BERT, and in https://www.aclweb.org/anthology/2020.emnlp-main.435/ the result of BERT+MLP is better than that of DMBERT (76.2 vs. 74.9). What do you think of these results?

Hi, thanks for your interest in our work. I didn't reproduce the two works, so I will only give some intuitions here.

Thanks for your reply!

Hi,

Thanks for your help!
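For context on the numbers being compared: event-detection F1 on ACE2005 and MAVEN is typically reported as micro-averaged F1 over predicted (trigger span, event type) pairs, so differences in how spans and types are matched can shift scores by a few points. A minimal sketch of that metric, with purely illustrative example data (not from either dataset or the repo):

```python
# Sketch of the micro-averaged F1 commonly reported for event detection
# (trigger identification + classification). The tuples below are
# illustrative examples, not real ACE2005/MAVEN annotations.

def micro_f1(gold, pred):
    """gold, pred: sets of (sentence_id, trigger_span, event_type) tuples."""
    tp = len(gold & pred)  # a prediction counts only if span AND type match
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    if precision + recall == 0.0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

gold = {(0, (3, 4), "Attack"), (0, (7, 8), "Die"), (1, (2, 3), "Meet")}
pred = {(0, (3, 4), "Attack"), (1, (2, 3), "Transport")}  # second span mistyped

print(round(micro_f1(gold, pred), 3))  # 1 TP, P=0.5, R=1/3 -> F1=0.4
```

Because the mistyped trigger counts as a full error, a model's ranking can change depending on whether a paper scores identification only or identification plus classification, which may partly explain the differing BERT numbers across papers.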