Support for multiple loss functions and GradScaler #4

Open
kenmbkr opened this issue May 26, 2022 · 1 comment

kenmbkr commented May 26, 2022

I came across your work, and you did a fantastic job improving the performance of SAM.

It seems that the current implementation supports only a single loss function.
Also, while the code example does cover the fp16 case, there is no mention of gradient scaling, which is commonly used together with fp16 in AMP.

Are there any plans to support multiple loss functions and GradScaler?
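
For reference, here is roughly how I would expect SAM's two-step update to combine with GradScaler, assuming the `first_step`/`second_step` interface used by the popular PyTorch SAM implementations. `model`, `optimizer`, `criterion`, and `loader` are placeholders defined elsewhere, and `grads_finite` is a helper I made up for the overflow check:

```python
import torch
from torch.cuda.amp import autocast, GradScaler

scaler = GradScaler()

def grads_finite(model):
    # True only if no gradient overflowed after unscaling.
    return all(torch.isfinite(p.grad).all()
               for p in model.parameters() if p.grad is not None)

for x, y in loader:
    optimizer.zero_grad(set_to_none=True)

    # First forward/backward pass: compute the ascent perturbation e(w).
    with autocast():
        loss = criterion(model(x), y)
    scaler.scale(loss).backward()
    scaler.unscale_(optimizer)                 # gradients back to true scale
    if grads_finite(model):
        optimizer.first_step(zero_grad=True)   # w <- w + e(w)

        # Second forward/backward pass at the perturbed weights.
        with autocast():
            loss = criterion(model(x), y)
        scaler.scale(loss).backward()
        # unscale_() may only be called once between update() calls,
        # so unscale the second pass manually.
        inv_scale = 1.0 / scaler.get_scale()
        for p in model.parameters():
            if p.grad is not None:
                p.grad.mul_(inv_scale)
        optimizer.second_step(zero_grad=True)  # back to w, then base update
    scaler.update()  # grow/shrink the loss scale based on recorded overflows
```

A production version should probably also skip `second_step` when the second backward pass overflows; I left that out to keep the sketch short.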


dydjw9 (Owner) commented May 27, 2022

Thank you so much!
Yes, currently we have only tested it with CE loss and label-smoothed CE loss, and the results are good. Sharpness-aware learning should also be effective with multiple loss functions.
We have tested FP16 on the ImageNet dataset, but the accuracy drops by about 0.5%. We will look into FP16 and then multiple loss functions now that the NeurIPS deadline has passed :D
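
In the meantime, a weighted sum should already work, since each of SAM's two passes only needs a single scalar loss to differentiate. A minimal sketch, again assuming the common `first_step`/`second_step` SAM interface; `criterion_a`, `criterion_b`, and the 0.5 weight are illustrative:

```python
def combined_loss(output, target):
    # SAM only needs one scalar per backward pass, so multiple criteria
    # can be combined by (weighted) summation before calling backward().
    return criterion_a(output, target) + 0.5 * criterion_b(output, target)

for x, y in loader:
    combined_loss(model(x), y).backward()
    optimizer.first_step(zero_grad=True)   # ascend to w + e(w)
    combined_loss(model(x), y).backward()
    optimizer.second_step(zero_grad=True)  # descend from the perturbed point
```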
