
Could you provide the trained weights? #14

Closed

yachuan opened this issue Feb 2, 2021 · 3 comments

yachuan commented Feb 2, 2021

Hello,

I am training BERT+ERM on the Amazon dataset, but it is very time-consuming. Would it be possible to provide the best trained parameters to users? (Just as BERT provides pretrained weights, you could perhaps add a folder under examples containing all the trained weights.) This would save users about a week of computation. Thank you!

kohpangwei (Collaborator) commented

Hi, thanks for the question! This should be possible; we're looking into it, though it will likely take a few weeks.

yachuan (Author) commented Feb 3, 2021

Thanks a lot!

ssagawa (Collaborator) commented Mar 10, 2021

Thanks again for the feedback!

Model parameters for all baselines are available on Codalab here. If you click on the experiment you’re interested in, you should be able to find the model as best_model.pth.
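For reference, here is a minimal loading sketch (not from the thread itself). It assumes the downloaded best_model.pth is either a bare state dict or wraps one under a key such as "algorithm", and that the saved keys may carry a "model." prefix from the training wrapper; the actual checkpoint layout may differ, so treat this as a starting point rather than the canonical procedure.

```python
# Sketch: load a downloaded Amazon-WILDS checkpoint into a DistilBERT classifier.
# Assumptions (not confirmed by the thread): checkpoint structure, key prefixes,
# and the 5-way star-rating label space.
import torch
from transformers import DistilBertForSequenceClassification

CKPT_PATH = "best_model.pth"  # hypothetical local path to the CodaLab download

model = DistilBertForSequenceClassification.from_pretrained(
    "distilbert-base-uncased",
    num_labels=5,  # Amazon-WILDS predicts 1-5 star ratings
)

checkpoint = torch.load(CKPT_PATH, map_location="cpu")
# The file may be a bare state dict, or a dict wrapping one under "algorithm".
state_dict = checkpoint.get("algorithm", checkpoint) if isinstance(checkpoint, dict) else checkpoint
# Strip a possible "model." prefix added by the training wrapper.
state_dict = {
    (k[len("model."):] if k.startswith("model.") else k): v
    for k, v in state_dict.items()
}
missing, unexpected = model.load_state_dict(state_dict, strict=False)
print("missing keys:", missing)
print("unexpected keys:", unexpected)
model.eval()
```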

In addition, we have updated the Amazon-WILDS dataset, as well as its default model (from BERT to DistilBERT), in our v1.1 release. It is much faster to train now, taking about 5 hours on a V100. We hope this makes it easier for you to run your experiments. However, these are breaking changes, so if you’re currently running WILDS experiments, please update the package (which will update the datasets) and your default models. Sorry for the inconvenience, and thank you!
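As a quick sketch of the update path, assuming the public wilds package API: after upgrading the package (e.g. `pip install --upgrade wilds`), re-downloading the Amazon data should fetch the updated v1.1 version.

```python
# Sketch: fetch the updated Amazon-WILDS data after upgrading the wilds package.
# Run `pip install --upgrade wilds` first; download=True pulls the data if it
# is not already present locally.
from wilds import get_dataset

dataset = get_dataset(dataset="amazon", download=True)
train_data = dataset.get_subset("train")
print(len(train_data), "training examples")
```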

ssagawa closed this as completed Mar 10, 2021
teetone added a commit that referenced this issue Dec 10, 2021