
Fix parallel load non parallel model #49

Merged
Nash2325138 merged 2 commits into master from fix_parallel_load_non_parallel_model on Aug 11, 2019

Conversation

@amjltc295 (Owner) commented on Aug 11, 2019

Why do we need this PR?

  • A data-parallel model fails to load a pretrained non-data-parallel model (see the linked issue below)

How is it implemented?

  • Load the model.module weights when multiple GPUs are used (see the sketch below)
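
A minimal sketch of the idea, not the repository's actual code: the toy model, the checkpoint path, and the 'state_dict' key are placeholders. The point is that a checkpoint saved from a non-DataParallel model has no "module." prefix on its keys, so when multiple GPUs are used it must be loaded into model.module rather than the DataParallel wrapper.

```python
import torch
import torch.nn as nn

# Toy stand-in for the project's model (illustrative only).
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2))

# Checkpoint saved from a *non*-DataParallel model, so its state_dict keys
# have no "module." prefix. Path and dict key are placeholders.
checkpoint = torch.load('pretrained.pth', map_location='cpu')

if torch.cuda.device_count() > 1:
    # DataParallel prefixes every parameter key with "module.".
    # Loading the plain checkpoint into model.module avoids the key mismatch.
    model = nn.DataParallel(model)
    model.module.load_state_dict(checkpoint['state_dict'])
else:
    model.load_state_dict(checkpoint['state_dict'])
```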

Other changes

  • Change the lr decay rate to 0.9 (see the sketch below)
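
For context, a decay rate of 0.9 corresponds to a scheduler gamma of 0.9 in PyTorch terms. The optimizer, scheduler type, and per-epoch stepping below are assumptions for illustration; the repository may set this value through its config instead.

```python
import torch.nn as nn
import torch.optim as optim

model = nn.Linear(8, 2)                              # toy model for illustration
optimizer = optim.Adam(model.parameters(), lr=1e-3)  # optimizer choice is assumed

# "lr decay rate 0.9": multiply the learning rate by 0.9 each scheduler step.
scheduler = optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.9)

for epoch in range(3):
    # ... training and optimizer.step() would go here ...
    scheduler.step()                                  # lr *= 0.9 after each epoch
```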

PR readiness checklist

  • Did it pass the Flake8 check?
  • Did you test this PR?

@amjltc295 added the bug label on Aug 11, 2019
@Nash2325138 merged commit 8a28732 into master on Aug 11, 2019
@Nash2325138 deleted the fix_parallel_load_non_parallel_model branch on Aug 11, 2019 at 07:10
Development

Successfully merging this pull request may close these issues.

Data-parallel model fails to load pretrained non-data-parallel model