Is the code for pretraining available? #10
It seems that this repository only contains the code for finetuning a pretrained RoBERTa model. Is the code for pretraining available now? Could you possibly add an example command for doing TAPT? Any advice or explanation would be highly appreciated. Thanks in advance!
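In the meantime, TAPT amounts to continuing RoBERTa's masked-LM pretraining on the task's own unlabeled text before running the finetuning scripts. Below is a minimal sketch using the Hugging Face Trainer API; it assumes a transformers version newer than the 2.4.1 pinned in environment.yml (the Trainer API does not exist there), and the file path and hyperparameters are illustrative placeholders, not this repo's settings:

```python
# Minimal TAPT sketch: continue masked-LM pretraining of RoBERTa on the
# task's unlabeled text. Paths and hyperparameters are placeholders.
from transformers import (
    DataCollatorForLanguageModeling,
    LineByLineTextDataset,
    RobertaForMaskedLM,
    RobertaTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
model = RobertaForMaskedLM.from_pretrained("roberta-base")

# One raw document/sentence per line; "task_unlabeled.txt" is a placeholder.
dataset = LineByLineTextDataset(
    tokenizer=tokenizer,
    file_path="task_unlabeled.txt",
    block_size=512,
)

# Dynamic masking of 15% of tokens, matching RoBERTa's MLM objective.
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15
)

args = TrainingArguments(
    output_dir="roberta-tapt",
    num_train_epochs=100,           # task corpora are small; adjust to yours
    per_device_train_batch_size=8,
    learning_rate=1e-4,
    save_steps=5000,
)

Trainer(
    model=model,
    args=args,
    data_collator=collator,
    train_dataset=dataset,
).train()

model.save_pretrained("roberta-tapt")
tokenizer.save_pretrained("roberta-tapt")
```

The resulting roberta-tapt directory can then be passed as the pretrained model path to the finetuning step.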
Comments

Hi there, check the …
Thanks for your quick reply! Just to clarify: in my case, the script does not work with transformers==2.4.1 as pinned in environment.yml; it works well with transformers==2.8.0.
Yes, same here.