
BERT + EDA ? #25

Closed
guotong1988 opened this issue Jun 30, 2020 · 2 comments
guotong1988 commented Jun 30, 2020

Will Random Swap (RS) and Random Deletion (RD) work well for BERT, given that BERT is based on contextual pre-training?

Thank you very much. @jasonwei20

jasonwei20 (Owner) commented
If you have a large dataset and are using BERT fine-tuning, the performance difference will probably be marginal. If your data size is small, however, it could be worth a try. I have not done a formal BERT + EDA experiment.
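For reference, the two operations asked about can be sketched in a few lines. This is a minimal illustration of EDA-style Random Swap and Random Deletion as described in the paper, not the repository's actual implementation; the function names and default parameters here are illustrative.

```python
import random

def random_swap(words, n=1):
    """EDA Random Swap (RS): swap two randomly chosen words, n times."""
    words = words[:]  # avoid mutating the caller's list
    for _ in range(n):
        if len(words) < 2:
            break
        i, j = random.sample(range(len(words)), 2)
        words[i], words[j] = words[j], words[i]
    return words

def random_deletion(words, p=0.1):
    """EDA Random Deletion (RD): drop each word with probability p.
    If every word would be dropped, keep one word at random."""
    kept = [w for w in words if random.random() > p]
    return kept if kept else [random.choice(words)]

sentence = "the quick brown fox jumps over the lazy dog".split()
print(" ".join(random_swap(sentence, n=2)))
print(" ".join(random_deletion(sentence, p=0.2)))
```

Both operations perturb word order or drop tokens, which is exactly why the question is interesting for BERT: a contextually pre-trained model may be more robust to (or more confused by) such perturbations than the non-contextual models evaluated in the paper.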


lian6605 commented Sep 2, 2020

I read the EDA paper by Jason Wei. The paper says the gains on BERT are unlikely to be large.
Would applying EDA to BERT still produce better results on natural language processing tasks?

3 participants