[hotfix] fix ColoTensor GPT2 unitest #1309

Merged · 1 commit merged into hpcaitech:main on Jul 14, 2022
Conversation

@1SAA (Contributor) commented on Jul 14, 2022

No description provided.

@1SAA requested review from feifeibear and ver217 on Jul 14, 2022, 07:39

A review thread was opened on the following lines of the diff:

    for i, (input_ids, attn_mask) in enumerate(train_dataloader):
        logits = model(input_ids, attn_mask)
        colo_input = ColoTensor.from_torch_tensor(input_ids, ColoTensorSpec(pg))
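(For context, a self-contained sketch of the conversion in the diff above. ColoTensor.from_torch_tensor and ColoTensorSpec are taken from the diff itself; the ProcessGroup construction, the tp_degree value, the dataloader, and the forward call consuming colo_input are illustrative assumptions, not code from this PR.)

    # Sketch only; assumes the colossalai.tensor API of this era.
    from colossalai.tensor import ColoTensor, ColoTensorSpec, ProcessGroup

    pg = ProcessGroup(tp_degree=4)  # hypothetical layout: [dp: 1, tp: 4]

    for i, (input_ids, attn_mask) in enumerate(train_dataloader):
        # Tag the raw torch tensor with its process group before the forward
        # pass, so any later redistribution uses the intended group.
        colo_input = ColoTensor.from_torch_tensor(input_ids, ColoTensorSpec(pg))
        logits = model(colo_input, attn_mask)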
Contributor:

Do we really need to convert the input to a ColoTensor?
What happens if we pass a plain torch tensor into the colo model?

Contributor Author:

I'm not sure whether this line is necessary.
Since we have a problem with the transformation between [dp: 4, tp: 1] and [dp: 1, tp: 4], I think we'd better add this line to ensure that transformation is not triggered. Otherwise, to_replicate may use the [dp: 4, tp: 1] process group, and the tensor wouldn't be replicated when we want to get 4 copies of it.
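(To make the failure mode described above concrete, a hedged sketch; the 4-GPU job, the tp_degree/dp_degree arguments, and both group layouts are assumptions for illustration, not code from this PR.)

    # Assumed 4-rank job; layouts are illustrative.
    from colossalai.tensor import ColoTensor, ColoTensorSpec, ProcessGroup

    pg_tp = ProcessGroup(tp_degree=4)  # [dp: 1, tp: 4] -- what the test wants
    pg_dp = ProcessGroup(dp_degree=4)  # [dp: 4, tp: 1] -- a pure-DP layout

    # If the input were tagged with pg_dp, its TP group would have size 1, so
    # a later to_replicate() would be a no-op and the 4 ranks would never
    # receive identical copies. Tagging with pg_tp up front avoids that path:
    colo_input = ColoTensor.from_torch_tensor(input_ids, ColoTensorSpec(pg_tp))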

Contributor:

If the dp degree is consistent among the model parameters, I believe we should pass the torch tensor into the model.

Contributor:

Did you test the correctness of the original form?

    logits = model(input_ids, attn_mask)

Contributor Author (@1SAA), Jul 14, 2022:

> If the dp degree is consistent among the model parameters, I believe we should pass the torch tensor into the model.

I don't think so. Actually, the passed data has its own distribution information. In a TP environment, the passed data is replicated among the TP process group. In a DP environment, the passed data is just one part of the batched data. It is not appropriate to regard it as no different from an ordinary tensor.
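(A small illustration of that distinction in plain PyTorch; the 4-way DP layout, the dp_rank value, and the batch shape are hypothetical, chosen only to show shard-vs-replica semantics.)

    import torch

    # Hypothetical 4-way DP job; dp_rank would come from the runtime.
    dp_degree, dp_rank = 4, 0

    global_batch = torch.arange(16).reshape(8, 2)

    # Under DP, each rank sees a *different* shard of the global batch:
    dp_shard = global_batch.chunk(dp_degree, dim=0)[dp_rank]   # shape (2, 2)

    # Under TP, every rank in the TP group sees an *identical* replica:
    tp_replica = global_batch.clone()                          # shape (8, 2)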

Contributor:

OK, that makes sense.

@feifeibear merged commit 3608692 into hpcaitech:main on Jul 14, 2022