
[WIP] Fix loading state dict in DeepSpeed Plugin #7297

Closed
wants to merge 1 commit

Conversation

SeanNaren (Contributor)

What does this PR do?

Closes #7282

This fixes an issue where resuming from a saved DeepSpeed checkpoint crashed under Stage 3, where parameters are sharded across processes. As suggested by DeepSpeed, we gather parameters at a per-module level on rank 0, load the state dict there, and the updated parameters are then propagated to all processes.
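The pattern described above can be sketched in plain Python, with no DeepSpeed dependency. Everything here is an illustration of the gather/load/re-shard idea, not the plugin's actual implementation (which uses DeepSpeed's own gather utilities and real tensors):

```python
def gather(shards):
    """Concatenate each parameter's per-rank slices into the full value
    (the view that 'rank 0' materializes)."""
    names = shards[0].keys()
    return {n: [x for rank in shards for x in rank[n]] for n in names}


def scatter(full, nranks):
    """Split each full parameter back into equal per-rank slices."""
    out = []
    for r in range(nranks):
        out.append(
            {n: v[r * len(v) // nranks:(r + 1) * len(v) // nranks]
             for n, v in full.items()}
        )
    return out


def load_checkpoint(shards, checkpoint):
    """Load a full (unsharded) state dict into a sharded model."""
    nranks = len(shards)
    full = gather(shards)        # 1. gather sharded params onto rank 0
    full.update(checkpoint)      # 2. load the saved state dict there
    return scatter(full, nranks) # 3. push updated params back to all ranks
```

For example, with two ranks each holding half of a parameter `w`, loading the checkpoint `{"w": [1, 2, 3, 4]}` leaves rank 0 with `[1, 2]` and rank 1 with `[3, 4]`.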

Before submitting

  • Was this discussed/approved via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or internal minor changes/refactorings)

PR review

Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing, make sure you have read the review guidelines. In short:

  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

Did you have fun?

Make sure you had fun coding 🙃

@SeanNaren added the distributed (Generic distributed-related topic), 3rd party (Related to a 3rd-party), and bug (Something isn't working) labels Apr 30, 2021
@SeanNaren self-assigned this Apr 30, 2021
@SeanNaren added this to the v1.3 milestone Apr 30, 2021
codecov bot commented Apr 30, 2021

Codecov Report

Merging #7297 (db83a02) into master (ea2287e) will decrease coverage by 8%.
The diff coverage is 18%.

@@           Coverage Diff            @@
##           master   #7297     +/-   ##
========================================
- Coverage      91%     83%     -8%     
========================================
  Files         199     199             
  Lines       12808   13574    +766     
========================================
- Hits        11688   11290    -398     
- Misses       1120    2284   +1164     

@edenlightning edenlightning removed this from the v1.3 milestone May 4, 2021
SeanNaren (Contributor, Author)
The approach here needs to change: we should always use DeepSpeed's checkpointing logic both to save everything (including optimizer/LR scheduler states) and to load it. To save a single checkpoint, we can add a flag to the config telling DeepSpeed to do so.
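As a sketch of what such a config flag could look like: later DeepSpeed versions expose a key in the `zero_optimization` section that consolidates the sharded 16-bit weights into a single state dict on save. The exact key name available at the time of this PR is an assumption here; check the DeepSpeed ZeRO config docs for the version in use:

```json
{
  "zero_optimization": {
    "stage": 3,
    "stage3_gather_16bit_weights_on_model_save": true
  }
}
```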

Successfully merging this pull request may close these issues.

Model restore fails from stored checkpoint when using Deepspeed