
Fix Torch handling of multistep LR policy #550

Merged 1 commit on Jan 27, 2016

Conversation

gheinrich
Contributor

Add unit test and fix Torch handling of multistep LR policy

fix #547
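For context, a "multistep" (arbitrary step down) learning-rate policy keeps the base learning rate until training passes each configured step boundary, multiplying by a decay factor at every boundary crossed. Below is a minimal sketch of that computation in Python; the function name and parameters are hypothetical illustrations, not the actual DIGITS/Torch code touched by this PR.

```python
# Hypothetical sketch of a "multistep" LR policy: the base LR is
# multiplied by `gamma` once for every step boundary the current
# iteration has passed. Names here are illustrative only.
def multistep_lr(base_lr, gamma, stepvalues, iteration):
    """Return the learning rate at `iteration` for arbitrary step boundaries."""
    steps_passed = sum(1 for s in stepvalues if iteration >= s)
    return base_lr * (gamma ** steps_passed)

# Before the first boundary the base LR is unchanged; after each
# boundary it drops by a factor of gamma.
print(multistep_lr(0.01, 0.1, [100, 200], 50))   # no boundary passed
print(multistep_lr(0.01, 0.1, [100, 200], 150))  # one boundary passed
```

Unlike the plain "step" policy, which decays at a fixed interval, the step values here need not be evenly spaced.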

@lukeyeager
Member

Awesome, thanks for the tests!

lukeyeager added a commit that referenced this pull request Jan 27, 2016
Fix Torch handling of multistep LR policy
@lukeyeager lukeyeager merged commit d681946 into NVIDIA:master Jan 27, 2016
@gheinrich gheinrich deleted the dev/torch-multistep-lr-policy branch May 24, 2016 11:21
Successfully merging this pull request may close these issues.

Error when use "Step Down (arbitrary steps)" in learning rate policy