
Redundant learning rate decay when resuming experiment #19

Closed
natlouis opened this issue Aug 20, 2019 · 1 comment

natlouis commented Aug 20, 2019

Relevant lines of code: https://github.com/MichiganCOG/ViP/blob/master/train.py#L114-L115

Loading saved weights with the pretrained argument also restores the last saved learning rate (already decayed according to the config file). However, the lines above then decay that learning rate further, because the scheduler "loops" through all of the completed epochs again.

Example: I ended an experiment with a learning rate of 1e-6 after decaying twice from 1e-4. Resuming that experiment gives me a starting learning rate of 1e-8.
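
A minimal sketch of the pattern (hypothetical names and scheduler settings, not the exact `train.py` code), assuming a `MultiStepLR`-style scheduler whose step multiplies the *current* learning rate, as in recent PyTorch releases:

```python
import torch

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-4)
scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[10, 20], gamma=0.1)

# Resuming at epoch 30: the checkpoint already holds lr = 1e-6
# (decayed twice from 1e-4), so it is restored directly.
for group in optimizer.param_groups:
    group['lr'] = 1e-6  # value loaded from the saved checkpoint

# Problematic resumption: replaying the scheduler through every completed
# epoch crosses both milestones again and decays the restored lr twice more.
start_epoch = 30
for epoch in range(start_epoch):
    scheduler.step()

print(optimizer.param_groups[0]['lr'])  # ~1e-8 instead of the expected 1e-6
```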

@natlouis natlouis added the bug Something isn't working label Aug 20, 2019
@zeonzir zeonzir self-assigned this Aug 22, 2019

zeonzir commented Aug 23, 2019

Completed the fix by removing the scheduler replay loop during resumption.
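
For reference, a hedged sketch of resumption without the replay loop (same hypothetical setup as above, not the repo's actual code): the restored learning rate is left untouched and the scheduler only steps for future epochs.

```python
import torch

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-4)
scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[10, 20], gamma=0.1)

# Restore the learning rate saved in the checkpoint (already decayed to 1e-6).
for group in optimizer.param_groups:
    group['lr'] = 1e-6

# No replay loop over past epochs: training continues from start_epoch and
# scheduler.step() is only called once per *future* epoch.
start_epoch = 30
print(optimizer.param_groups[0]['lr'])  # stays at 1e-6
```

An alternative, assuming the checkpoint also stores the scheduler state, is to call `scheduler.load_state_dict()` so `last_epoch` is restored without re-stepping.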
