
Plans for pytorch_optimizer v3 #164

Closed
10 of 11 tasks
kozistr opened this issue May 9, 2023 · 1 comment · Fixed by #235
@kozistr (Owner) commented May 9, 2023

In pytorch-optimizer v3, loss functions will be added. So, finally, optimizers, LR schedulers, and loss functions will all be in one package.
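A minimal pure-Python sketch of the load-by-name registry pattern such an all-in-one package could use to expose optimizers, schedulers, and losses behind one interface. The class and function names below are illustrative stand-ins, not pytorch_optimizer's actual API.

```python
# Hypothetical load-by-name registry for an all-in-one package.
# Real entries would be torch.optim.Optimizer / nn.Module subclasses;
# plain classes stand in here so the sketch stays self-contained.

OPTIMIZERS = {}
LOSSES = {}


def register(registry, name):
    """Decorator that files a class under a lowercase name."""
    def wrap(cls):
        registry[name.lower()] = cls
        return cls
    return wrap


@register(OPTIMIZERS, "AdamP")
class AdamP:  # stand-in for a real optimizer class
    pass


@register(LOSSES, "FocalLoss")
class FocalLoss:  # stand-in for a real loss module
    pass


def load_optimizer(name):
    """Look up an optimizer class by case-insensitive name."""
    try:
        return OPTIMIZERS[name.lower()]
    except KeyError:
        raise ValueError(f"unknown optimizer: {name}") from None


opt_cls = load_optimizer("adamp")
print(opt_cls.__name__)  # AdamP
```

With one registry per component type, users can swap optimizers, schedulers, and losses by string name from a single import.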

Feature

  • support at least 60 optimizers
  • support at least 10 objectives
  • support bitsandbytes (& 4-bit optimizers)
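On the bitsandbytes item: the core idea behind 8-bit optimizers is storing optimizer state (e.g. Adam's moments) as int8 with a per-block scale instead of float32. The sketch below shows the simplest variant, blockwise absmax quantization, in pure Python; real implementations (like bitsandbytes) use CUDA kernels and a nonlinear dynamic quantization map, so this is a conceptual illustration only.

```python
# Blockwise absmax quantization: each block of an optimizer-state
# tensor is stored as int8 in [-127, 127] plus one float scale.
# Memory drops ~4x vs. float32 at the cost of bounded rounding error.

def quantize_block(values):
    """Quantize a block of floats to int8 with a shared absmax scale."""
    scale = max(abs(v) for v in values) or 1.0
    q = [round(v / scale * 127) for v in values]
    return q, scale


def dequantize_block(q, scale):
    """Recover approximate floats from int8 codes and the block scale."""
    return [v / 127 * scale for v in q]


state = [0.02, -0.5, 0.13, 0.4]   # toy optimizer-state block
q, scale = quantize_block(state)
restored = dequantize_block(q, scale)
# each restored element is within scale / 127 of the original
```

The per-element error is bounded by half a quantization step (at most `scale / 127` here), which is why keeping blocks small, so outliers don't inflate the scale for the whole tensor, matters in practice.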

Refactor

  • Organize utils

Docs

  • Organize documentation
  • Add a contribution guide (implementation, tests, etc.)
  • Add issue templates
  • Migrate to mkdocs
  • Create Q&A page
  • Benchmark on ImageNet

Test

  • Organize test cases
@kozistr kozistr added the feature New features label May 9, 2023
@kozistr kozistr self-assigned this May 9, 2023
@redknightlois

IMAE for Noise-Robust Learning: Mean Absolute Error Does Not Treat Examples Equally and Gradient Magnitude's Variance Matters
https://arxiv.org/abs/1903.12141
