fix AdamW import in Training loop notebook #610
Conversation
      "outputs": [],
      "source": [
-       "from transformers import AdamW\n",
+       "from torch.optim import AdamW\n",
I realized there were some hidden changes in the Jupyter notebook; this is the relevant change.
      "source": [
-       "from transformers import AdamW, AutoModelForSequenceClassification, get_scheduler\n",
+       "from transformers import AutoModelForSequenceClassification, get_scheduler\n",
+       "from torch.optim import AdamW\n",
And here, too. It's two import changes from two cells.
pcuenca left a comment
Hello @Letssharearow, thank you for your contribution! Would you mind submitting just the relevant changes, rather than replacing the notebook? I can't even see the diff on GitHub because it's too large.
For reference, here's how a related PR did it: https://github.com/huggingface/notebooks/pull/606/files
Force-pushed from 5640a3f to ec81b7b
@pcuenca
pcuenca left a comment
Thank you!
Not sure, to be honest. I don't think there's an automatic process to sync them; these are just code-only notebooks without much of the prose from the course. cc @burtenshaw as he may know more details.
AdamW doesn't exist in the transformers library; instead, you can import AdamW from torch.optim.
This is already done in the course repository:
https://github.com/huggingface/course/blob/main/chapters/en/chapter3/4.mdx
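For context, a minimal sketch of the corrected setup: AdamW comes from torch.optim, and the rest of the training step works unchanged. The tiny linear model and tensor shapes here are placeholders, not the notebook's actual sequence-classification model.

```python
import torch
from torch.optim import AdamW  # AdamW was removed from transformers; use torch's

# hypothetical stand-in for the notebook's model
model = torch.nn.Linear(4, 2)
optimizer = AdamW(model.parameters(), lr=5e-5)

# one dummy optimization step to show the optimizer behaves as before
x = torch.randn(8, 4)
loss = model(x).pow(2).mean()
loss.backward()
optimizer.step()
optimizer.zero_grad()
```

The only change a notebook user sees is the import line; the optimizer's constructor and step/zero_grad API are the same.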
And as other commits also directly change the files in here, I am not sure if this comment is still relevant:
course: Open a PR directly on the course repo (https://github.com/huggingface/course) @sgugger