
SAM with Mixed precision training #11

Closed
hihunjin opened this issue Jan 22, 2021 · 1 comment
Labels
enhancement (New feature or request), stale

Comments

@hihunjin

Hi. What great work. Thanks for sharing it.

I'm trying to use your SAM optimizer together with the mixed precision training that PyTorch > 1.7.0 provides.
There is no guideline about this. Here is what I've been trying:

...
scaler.scale(loss).backward()
...
scaler.step(optimizer.first_step)
scaler.update()
optimizer.zero_grad()

loss = loss_fn(image_preds, image_labels)
scaler.scale(loss).backward()
scaler.step(optimizer.second_step)
scaler.update()
optimizer.zero_grad()

Please let me know how to incorporate this into the mixed precision training process.
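
For reference, here is a rough sketch of what I imagine a working version might look like (just my guess, not an official recipe from this repo). Since GradScaler.step() only knows how to call a plain optimizer.step(), the sketch unscales the gradients manually with scaler.unscale_() and then calls first_step()/second_step() directly, calling scaler.update() after each step so the scaler's per-optimizer state is reset before the next unscale_(). The names model, loss_fn, optimizer (a base optimizer wrapped in SAM) and train_loader are placeholders. One caveat: unlike scaler.step(), this does not skip the update when the unscaled gradients contain inf/NaN.

import torch

scaler = torch.cuda.amp.GradScaler()

for image, image_labels in train_loader:
    # first forward/backward pass: gradients for SAM's ascent (perturbation) step
    with torch.cuda.amp.autocast():
        image_preds = model(image)
        loss = loss_fn(image_preds, image_labels)
    scaler.scale(loss).backward()
    scaler.unscale_(optimizer)   # bring gradients back to their true scale
    optimizer.first_step()       # perturb the weights
    optimizer.zero_grad()
    scaler.update()              # update the scale and reset per-optimizer state

    # second forward/backward pass at the perturbed weights
    with torch.cuda.amp.autocast():
        image_preds = model(image)
        loss = loss_fn(image_preds, image_labels)
    scaler.scale(loss).backward()
    scaler.unscale_(optimizer)
    optimizer.second_step()      # restore the weights and apply the base optimizer step
    optimizer.zero_grad()
    scaler.update()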

Moreover, it would be helpful to clarify which PyTorch versions are supported.


stale bot commented Feb 12, 2021

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale bot added the stale label on Feb 12, 2021
stale bot closed this as completed on Feb 19, 2021
davda54 added the enhancement (New feature or request) label on Mar 25, 2021