
fix(pytorch): correct epoch-based learning rate decay behavior #410

Merged
rickstaa merged 1 commit into main from fix_epoch_lr_decay on Feb 19, 2024

Conversation

rickstaa (Owner) commented:

This pull request fixes a bug in the epoch-based learning rate decay mechanism. Previously, the decay schedule never reached the specified final learning rate. With this fix, the learning rate decays to the intended final value by the last epoch.
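For illustration only (not the repository's actual code), here is a minimal sketch of an epoch-based linear decay schedule built on PyTorch's `LambdaLR`. Dividing by `num_epochs - 1` rather than `num_epochs` is the kind of detail that determines whether the schedule actually hits the final learning rate on the last epoch; the names `lr_init`, `lr_final`, and `num_epochs` are assumed placeholders.

```python
# Hypothetical sketch: linear epoch-based decay from lr_init to lr_final
# that reaches lr_final exactly on the last epoch.
import torch

lr_init, lr_final, num_epochs = 1e-3, 1e-5, 100

model = torch.nn.Linear(4, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=lr_init)

def lr_factor(epoch: int) -> float:
    # Fraction of the decay completed after `epoch` epochs (0.0 -> 1.0).
    # Using (num_epochs - 1) makes the last epoch correspond to frac == 1.0.
    frac = min(epoch / (num_epochs - 1), 1.0)
    # Interpolate between lr_init and lr_final, returned as a multiplier
    # of the optimizer's base learning rate (lr_init).
    return (lr_init + frac * (lr_final - lr_init)) / lr_init

scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=lr_factor)

for epoch in range(num_epochs):
    # ... training step(s) for this epoch ...
    optimizer.step()
    scheduler.step()

print(optimizer.param_groups[0]["lr"])  # ~1e-5, i.e. lr_final is reached
```

Had the factor used `epoch / num_epochs` instead, the multiplier would stop one step short of the final value, which matches the symptom this PR describes.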

@rickstaa merged commit a8df90f into main on Feb 19, 2024
13 checks passed
@rickstaa deleted the fix_epoch_lr_decay branch on February 19, 2024 at 15:45