Fix: Correct AdamW import path #606
Conversation
Update the import statement for AdamW to use `torch.optim.AdamW` instead of `transformers.AdamW`
pcuenca
left a comment
Hi @b1n1yam, the line you proposed is malformed and would not work.
also cc @burtenshaw
Thanks for pointing that out @pcuenca - the issue has been addressed in the latest commit.
Vaibhavs10
left a comment
thanks - this must be a very old notebook 😅
Yeah, it looks like this notebook was using some outdated imports. I’ve updated it to use the correct AdamW path from PyTorch. Let me know if you spot anything else that needs updating!
pcuenca
left a comment
There's still a missing `\n` that needs to go at the end of the line `from torch.optim import AdamW`
burtenshaw
left a comment
Thanks
Thanks @burtenshaw for the final fix!
Fix: Correct AdamW import path
Update the import statement for AdamW to use `torch.optim.AdamW` instead of `transformers.AdamW`. This corrects the import path to align with the standard PyTorch library, ensuring compatibility and proper usage of the AdamW optimizer.

What does this PR do?

This PR fixes an `ImportError` by correcting the import path for the `AdamW` optimizer in the PyTorch training code snippet. It changes the import from a potentially incorrect location (like `transformers` or `transformers.optimization`) to the correct one within the PyTorch library (`torch.optim`).

@Vaibhavs10
@pcuenca
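The corrected import can be sketched as below; this is a minimal illustration of the fix, not the notebook's actual training code, and the `nn.Linear` model and learning rate are placeholder assumptions:

```python
import torch
from torch.optim import AdamW  # correct path, instead of transformers.AdamW

# Placeholder model standing in for the notebook's actual model.
model = torch.nn.Linear(10, 2)

# AdamW from torch.optim takes the same core arguments
# (params, lr, weight_decay, ...) that the notebook relied on.
optimizer = AdamW(model.parameters(), lr=5e-5)

# One illustrative optimization step.
loss = model(torch.randn(4, 10)).sum()
loss.backward()
optimizer.step()
optimizer.zero_grad()
```

Importing from `torch.optim` keeps the snippet working regardless of the installed `transformers` version, since newer releases no longer ship their own `AdamW`.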