Sharpness Aware Minimization (SAM) requires closure #64
Comments
Hello! First of all, thanks for your interest in the repo! You can find the usage in the docstring here. If possible, please upload your code so that I can debug it more accurately :) For now the docs are lacking, but someday I'm going to build proper documentation to make it easier to use (I can't say when it will be done). If you have more questions, feel free to comment here. Best regards
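For example, here is a minimal sketch of the closure-based usage, assuming the SAM class follows the same API as https://github.com/davda54/sam (a `step(closure)` call where the closure re-runs the forward/backward pass). The model, data, and import path below are placeholders:

```python
import torch
from torch import nn
# adjust this import to wherever the SAM class lives in the package you installed
from sam import SAM

model = nn.Linear(10, 2)
criterion = nn.CrossEntropyLoss()
optimizer = SAM(model.parameters(), torch.optim.SGD, lr=0.1, momentum=0.9)

# toy batch so the sketch is self-contained
inputs, targets = torch.randn(8, 10), torch.randint(0, 2, (8,))

def closure():
    # re-evaluates the model and backprops the loss; SAM calls this
    # internally for the second (perturbed-weights) pass
    loss = criterion(model(inputs), targets)
    loss.backward()
    return loss

# first forward/backward pass to get the gradients for the ascent step
loss = criterion(model(inputs), targets)
loss.backward()
optimizer.step(closure)   # calling step() without a closure triggers the "requires closure" error
optimizer.zero_grad()
```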
Thank you for your reply. I have gone through this documentation, but I still don't understand how to fix it. However, the code is here:

```python
if method == 'lloss':
    ...
```
I think the definition part (your code) is fine; it's the training part that needs to change. To use SAM, the training loop has to run the forward/backward pass twice per batch and call first_step() / second_step() (or pass a closure to step()).
This is the training part:

```python
def train(models, method, criterion, optimizers, schedulers, dataloaders, num_epochs, epoch_loss):
    ...
```
Maybe in the `train_epoch` part (the inner training loop) is where it needs to change.
Thank you so much for your help. I wrote something like this:

```python
def train_epoch(models, method, criterion, optimizers, dataloaders, epoch, epoch_loss):
    ...
```
Maybe it should be changed to something like the sketch below, following a similar scheme to the SAM example.
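Something along these lines for the 'lloss' branch. This is only a sketch built around the function signature you posted; names like `models['backbone']`, `models['module']`, the `(scores, features)` backbone output, and `loss_pred_criterion` are my assumptions about your repo, so adapt them:

```python
def train_epoch(models, method, criterion, optimizers, dataloaders, epoch, epoch_loss):
    models['backbone'].train()
    if method == 'lloss':
        models['module'].train()

    for inputs, labels in dataloaders['train']:

        def compute_loss():
            # full forward pass; it must be re-run for SAM's second step,
            # because first_step() has perturbed the weights
            scores, features = models['backbone'](inputs)
            target_loss = criterion(scores, labels)          # assumes reduction='none'
            loss = target_loss.mean()
            if method == 'lloss':
                pred_loss = models['module'](features).view(-1)
                # loss_pred_criterion is a placeholder for your loss-prediction loss
                loss = loss + loss_pred_criterion(pred_loss, target_loss.detach())
            return loss

        # SAM step 1: backprop, then ascend to the perturbed weights
        compute_loss().backward()
        optimizers['backbone'].first_step(zero_grad=True)

        if method == 'lloss':
            optimizers['module'].zero_grad()     # keep only the second-pass grads for the module

        # SAM step 2: fresh forward/backward at the perturbed weights, then the real update
        compute_loss().backward()
        optimizers['backbone'].second_step(zero_grad=True)

        if method == 'lloss':
            # if optimizers['module'] is also SAM, give it the same two-step treatment instead
            optimizers['module'].step()
            optimizers['module'].zero_grad()
```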
Maybe. I think the whole code (below) is only for the `lloss` method?
No, actually this repo uses multiple methods, such as Random or 'lloss':

```python
def train_epoch(models, method, criterion, optimizers, dataloaders, epoch, epoch_loss):
    ...
```
I guess this should work. If there is still an error, please check my test code! (The code below runs with no errors and shows the correct usage.)
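For reference, this is the kind of self-contained check I mean: a toy model and random data, just to verify the SAM call sequence in isolation. It's my own sketch and assumes a davda54/sam-style first_step/second_step API and import path:

```python
import torch
from torch import nn
from sam import SAM   # adjust to the package you installed

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
criterion = nn.CrossEntropyLoss()
optimizer = SAM(model.parameters(), torch.optim.SGD, lr=0.1, momentum=0.9)

x, y = torch.randn(64, 10), torch.randint(0, 2, (64,))

for step in range(5):
    # first forward/backward pass
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.first_step(zero_grad=True)

    # second forward/backward pass, on a fresh forward
    criterion(model(x), y).backward()
    optimizer.second_step(zero_grad=True)

    print(f'step {step}: loss = {loss.item():.4f}')
```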
Thank you so much for your help and recommendations. I cannot thank you enough.

```python
# ----------------- SAM Optimizer -------------------
...
```
Maybe you should only call it as in the code below; that is the correct usage!
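Roughly this call order per batch (same placeholder names as above); note that backward() runs twice, but the second time on a new loss from a fresh forward pass:

```python
# first pass: compute the loss, backprop, move to the worst-case neighbourhood
loss = criterion(model(inputs), targets)
loss.backward()
optimizer.first_step(zero_grad=True)

# second pass: NEW forward pass at the perturbed weights, then the real update
criterion(model(inputs), targets).backward()
optimizer.second_step(zero_grad=True)
```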
When I call loss.backward() twice it gives me the following error: `RuntimeError: Trying to backward through the graph a second time (or directly access saved tensors after they have already been freed). Saved intermediate values of the graph are freed when you call .backward() or autograd.grad(). Specify retain_graph=True if you need to backward through the graph a second time or if you need to access saved tensors after calling backward.`
Going by that error message, you can specify `retain_graph=True` in the first backward() call.
None of them are working:

```python
# ----------------- SAM Optimizer -------------------
...
# ----------------- SAM Optimizer for LLOSS Method -------------------
...
```
As per the sample given here https://github.com/davda54/sam, loss.backward() is not required for second_step. The first one is working fine for me, but it is not working for my LLOSS method.
Actually, it does (backward twice)! In the example code (from https://github.com/davda54/sam), it calls backward() twice. And by the concept of SAM, the second backward() has to come from a fresh forward pass at the perturbed weights, not from the same loss tensor as the first pass.
Okay, right, but there are some errors when calling backward() the second time. I don't know how to resolve it.
It depends on the output(s) of the model & the loss function in your code, so take a look into that part!
Hi, thank you so much for your repo. I am using the SAM optimizer but I am facing this error; how do I fix it?
`RuntimeError: [-] Sharpness Aware Minimization (SAM) requires closure`