It's better to make the optimizer a class #17
Comments
We do not implement optimizers; we only borrow them from PyTorch. Currently a single function as an interface is enough. Making the optimizer a class is reasonable if it provides more functionality.
Yes, but it's more convenient for users to choose an optimizer by "selecting" from enumerated options instead of "typing" the name.
There is also a potential plan to implement some customized optimizers.
Rewriting an optimizer class may be redundant. In Tester or Trainer, we actually use the optimizers from PyTorch, and further customized optimizers can inherit from them. But I agree that choosing an optimizer should avoid typing names; maybe we can change optimizer.py to import all of the PyTorch optimizers and the implemented customized optimizers:

```python
from torch.optim import *
from XXoptim import XXoptimizer
...
```

Then users can "select" different optimizers.
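The selection-by-name idea above could be sketched as a small registry that maps enumerated keys to optimizer classes. This is a minimal illustration, not fastNLP's actual API: the `SGD`/`Adam` stand-in classes and the `get_optimizer` helper are hypothetical placeholders for the real `torch.optim` (and customized) optimizers, used here so the example is self-contained.

```python
# Hypothetical stand-ins for torch.optim optimizers, so the
# sketch runs without PyTorch installed. In fastNLP these would
# be the real PyTorch classes plus any customized optimizers.
class SGD:
    def __init__(self, params, lr=0.01):
        self.params, self.lr = params, lr

class Adam:
    def __init__(self, params, lr=0.001):
        self.params, self.lr = params, lr

# Enumerate the available optimizers once; users then "select"
# a key instead of typing a free-form class name.
OPTIMIZERS = {"sgd": SGD, "adam": Adam}

def get_optimizer(name, params, **kwargs):
    """Look up an optimizer class by key and instantiate it."""
    try:
        cls = OPTIMIZERS[name.lower()]
    except KeyError:
        raise ValueError(
            f"Unknown optimizer {name!r}; choose from {sorted(OPTIMIZERS)}"
        )
    return cls(params, **kwargs)

opt = get_optimizer("adam", params=[], lr=0.005)
```

A typo like `get_optimizer("adma", ...)` then fails immediately with the list of valid choices, which is the convenience argued for in the comments above.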
Agree.
All right.
Referenced: fastNLP/fastNLP/action/optimizor.py, line 1, at commit 7c2f260