refactor optimizer and scheduler selection #55
I see multiple options here, but I'm sure there are more.
- Use create_optimizer and create_scheduler in all files that define an experiment. Merge options into these two functions.
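For concreteness, a rough sketch of what the call site in an experiment file could then look like; the import path, config keys and training-loop details are invented for illustration (a sketch of the factory functions themselves follows the quoted comment further down):

```python
# Hypothetical call site in an experiment file; the import path, config keys
# and net.cost() are illustrative only, not the project's actual API.
from scheduler import create_optimizer, create_scheduler  # proposed module


def experiment(net, data, config):
    optimizer = create_optimizer(config, net.parameters())
    scheduler = create_scheduler(config, optimizer, len(data['train']))
    for _ in range(config['epochs']):
        for batch in data['train']:
            loss = net.cost(batch)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
            if scheduler is not None:  # 'none': rely on the optimizer alone
                scheduler.step()
```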
cwmeijer added commits that referenced this issue on Feb 12, 2021
Note that I removed an if statement on the learning rate config, as it didn't seem to have any effect: the cyclic scheduler was always used, so the configured lr was always ignored because of that.
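A minimal illustration (with made-up values) of why a configured lr is a no-op once a cyclic scheduler is always attached: PyTorch's CyclicLR overwrites the optimizer's learning rate from base_lr as soon as it is constructed, and keeps recomputing it from base_lr/max_lr on every step.

```python
import torch
from torch import optim
from torch.optim.lr_scheduler import CyclicLR

# Made-up values, just to show the configured lr being overridden.
params = [torch.nn.Parameter(torch.zeros(1))]
optimizer = optim.Adadelta(params, lr=0.05)      # lr from the config...
scheduler = CyclicLR(optimizer, base_lr=1e-6, max_lr=2e-4,
                     cycle_momentum=False)
print(optimizer.param_groups[0]['lr'])           # ...already replaced: 1e-06
```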
Maybe we should take this as an opportunity to use a uniform scheduler creation procedure. I see that different experiment functions use different code (e.g. asr.experiment() has very different rules than basic.experiment()). This should include:
- the choice of optimizer (adadelta, adam, ...)
- the choice of scheduler (none, cyclic, noam; where I would use none instead of constant, as adaptive optimizers don't use a constant learning rate even without a scheduler)
- the naming of the config parameters (constant_lr -> lr)
- a module scheduler.py to handle this, called from all experiment() functions

Originally posted by @bhigy in #51 (comment)
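A minimal sketch of what such a scheduler.py module could look like, following the option names in the list above; the config keys, defaults and function signatures are assumptions for illustration, not the project's actual interface:

```python
# One possible shape for the proposed scheduler.py; all names, config keys
# and defaults here are assumptions, not the project's API.
import torch.optim as optim
from torch.optim.lr_scheduler import CyclicLR, LambdaLR


def create_optimizer(config, params):
    """Pick the optimizer named in the config (adadelta, adam, ...)."""
    name = config.get('optimizer', 'adam')
    lr = config.get('lr', 1.0)
    if name == 'adam':
        return optim.Adam(params, lr=lr)
    if name == 'adadelta':
        return optim.Adadelta(params, lr=lr)
    raise ValueError(f'unknown optimizer: {name}')


def create_scheduler(config, optimizer, steps_per_epoch):
    """Pick the scheduler named in the config: none, cyclic or noam."""
    name = config.get('scheduler', 'none')
    if name == 'none':
        # No scheduler: adaptive optimizers adapt the step size on their own.
        return None
    if name == 'cyclic':
        return CyclicLR(optimizer, base_lr=config['min_lr'],
                        max_lr=config['max_lr'],
                        step_size_up=steps_per_epoch,
                        cycle_momentum=False)
    if name == 'noam':
        # Noam schedule from "Attention Is All You Need"; assumes the
        # optimizer was created with lr=1.0, so the factor below is the
        # full learning rate.
        d_model, warmup = config['d_model'], config['warmup']

        def noam(step):
            step = max(step, 1)
            return d_model ** -0.5 * min(step ** -0.5, step * warmup ** -1.5)

        return LambdaLR(optimizer, noam)
    raise ValueError(f'unknown scheduler: {name}')
```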