Merged
Conversation
szhengac (Contributor) reviewed on Jan 20, 2023: "LGTM."
szhengac (Contributor) approved these changes on Jan 20, 2023: "Let me know if it is ready for merge."
Author (Contributor) replied: "Thanks @szhengac."
zarzen reviewed on Jan 20, 2023, commenting on this snippet:
```python
from deepspeed import pipe

model = pipe.PipelineModule(
    stage_modules,
```
Contributor: "Is stage_modules a list of layers? Or is it partitioned by the Slapo schedule?"
chhzh123 pushed a commit to chhzh123/slapo that referenced this pull request on Jan 22, 2023.
Description
This PR analyzes whether parameters are tied together by looking at their `nn.Parameter` object IDs. Accordingly, this analysis has to be done before the consolidation (if needed); otherwise, all `nn.Parameter` objects will be replaced, so their object IDs won't be the same anymore.

Note that the above analysis works only when parameters point to the same Python object. In other words, for distributed training that requires weight consolidation, users have to explicitly call `tie_weights` one more time when initializing the model with empty weights. For example:

In addition, this PR also refactors the pipeline logic in `schedule.build`. The ultimate goal is to ensure that `schedule.build` doesn't have any framework-specific logic (e.g., `if target == "deepspeed":`).

cc @szhengac @zarzen
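The identity-based tie analysis described above, and the reason `tie_weights` must be re-called after parameters are replaced, can be illustrated with a minimal sketch in plain PyTorch. `find_tied_parameters` and `TiedModel` are hypothetical names for illustration, not Slapo's actual API:

```python
import torch
import torch.nn as nn

def find_tied_parameters(model: nn.Module):
    """Group parameter names that point to the same nn.Parameter object.

    Illustrative helper (not Slapo's API): ties are detected purely by
    Python object identity, mirroring the PR's object-ID analysis.
    """
    groups = {}
    # remove_duplicate=False makes named_parameters yield every alias.
    for name, param in model.named_parameters(remove_duplicate=False):
        groups.setdefault(id(param), []).append(name)
    return [names for names in groups.values() if len(names) > 1]

class TiedModel(nn.Module):
    """Toy model with embedding/output weight tying."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(10, 4)
        self.proj = nn.Linear(4, 10, bias=False)
        self.tie_weights()

    def tie_weights(self):
        # Point both modules at the same nn.Parameter object.
        self.proj.weight = self.embed.weight

model = TiedModel()
assert find_tied_parameters(model) == [["embed.weight", "proj.weight"]]

# Weight consolidation replaces nn.Parameter objects (simulated here),
# so the identity-based tie is lost...
model.embed.weight = nn.Parameter(torch.randn(10, 4))
assert find_tied_parameters(model) == []

# ...and tie_weights must be called one more time to restore it.
model.tie_weights()
assert find_tied_parameters(model) == [["embed.weight", "proj.weight"]]
```

Because the analysis relies on object identity rather than tensor values, any step that swaps in fresh `nn.Parameter` objects silently breaks the recorded ties, which is why the PR runs the analysis before consolidation and asks users to re-tie after initializing with empty weights.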
Checklist