Could you provide the trained weights? #14
Hello,

I am training BERT+ERM on the Amazon dataset, but it is very time-consuming. Would it be possible to provide the best trained parameters to users? (Just as BERT provides its pretrained weights, perhaps you could add a folder under examples containing all the weights.) This would save users about a week of computation. Thank you!

Comments

Hi, thanks for the question! This should be possible; we're looking into it, though it will likely take a few weeks.

Thanks a lot!

Thanks again for the feedback! Model parameters for all baselines are available on CodaLab here. If you click on the experiment you're interested in, you should be able to find the model file. In addition, we have updated the Amazon-WILDS dataset as well as its default model (from BERT to DistilBERT) in our v1.1 release. It is much faster to train now, taking about 5 hours on a V100. We hope this makes it easier for you to run your experiments. However, note that these are breaking changes, so if you're currently running WILDS experiments, please update the package (which will update the datasets) and your default models. Sorry for the inconvenience, and thank you!
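The package update the maintainer asks for can be sketched as a pip upgrade; a minimal sketch, assuming the package is published on PyPI under the name `wilds` (the dataset updates are then picked up by the package itself, per the comment above):

```shell
# Upgrade the WILDS package to pick up the v1.1 breaking changes
# (updated Amazon-WILDS dataset and the BERT -> DistilBERT default model).
pip install --upgrade wilds

# Show the installed version to confirm the upgrade took effect.
pip show wilds
```

After upgrading, previously downloaded datasets and any locally modified default-model configs should be refreshed before re-running experiments, since the release notes above call these breaking changes.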