Loading saved weights using the `pretrained` argument also loads the last saved learning rate (already decayed per the config file). However, the learning rate is then decayed further, because on resume the scheduler loops through all of the epochs again and re-applies each decay step.

Example: an experiment ends with a learning rate of 1e-6 after decaying twice from 1e-4. Resuming that experiment gives a starting learning rate of 1e-8.
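A minimal sketch of the compounding effect, assuming a MultiStepLR-style schedule (the function name, milestones, and gamma below are illustrative, not ViP's actual config):

```python
# Hypothetical reproduction of the resume bug; not ViP's actual API.
def decayed_lr(base_lr, gamma, milestones, epoch):
    """MultiStepLR-style decay: multiply by gamma at each milestone passed."""
    lr = base_lr
    for m in milestones:
        if epoch >= m:
            lr *= gamma
    return lr

base_lr = 1e-4
gamma = 0.1
milestones = [30, 60]  # assumed values for illustration

# End of the first run: the lr has decayed twice, to 1e-6.
lr_at_checkpoint = decayed_lr(base_lr, gamma, milestones, epoch=90)

# Buggy resume: the checkpoint's decayed lr is treated as the base lr,
# and the scheduler replays every milestone again, giving 1e-8.
lr_after_resume = decayed_lr(lr_at_checkpoint, gamma, milestones, epoch=90)
```

In PyTorch terms, the fix is typically to either restore the scheduler's state (so it knows which milestones have already fired) or pass the resumed epoch as `last_epoch` when reconstructing the scheduler, rather than stepping it from epoch 0 with an already-decayed learning rate.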
Relevant lines of code: https://github.com/MichiganCOG/ViP/blob/master/train.py#L114-L115