
[hotfix] fix zero init ctx numel #1128

Merged: 1 commit into hpcaitech:main from hotfix/zero-init-ctx on Jun 16, 2022
Conversation

@ver217 (Member) commented on Jun 16, 2022

No description provided.

@ver217 requested review from feifeibear and 1SAA on June 16, 2022 08:50
@@ -169,11 +172,18 @@ def _post_context_exec(self):
torch.set_rng_state(self.cpu_rng_state)
torch.cuda.set_rng_state(self.cuda_rng_state)

params = frozenset(self.top_module.parameters())
A Contributor commented on the added line:
Because Module's parameters() function uses a set() to store its result, I think it will not return duplicated params?
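A minimal sketch of this deduplication behavior, assuming a toy module with tied weights (the names here are illustrative, not from the PR):

```python
import torch.nn as nn

# Toy module with a shared (tied) weight, purely for illustration.
lin1 = nn.Linear(8, 8)
lin2 = nn.Linear(8, 8)
lin2.weight = lin1.weight          # tie the weights

model = nn.Sequential(lin1, lin2)

# parameters() deduplicates via an internal memo set, so the shared
# weight is yielded only once: 3 parameters, not 4.
print(len(list(model.parameters())))  # 3
```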

Member Author (@ver217) replied:
Yes, but for the "in" operation, a set is much faster than a list.
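A minimal sketch of that cost difference, assuming a hypothetical toy model (the PR itself builds the frozenset from self.top_module.parameters()):

```python
import timeit
import torch.nn as nn

# Hypothetical toy model, used only to compare membership-check costs.
model = nn.Sequential(*[nn.Linear(64, 64) for _ in range(500)])

params_list = list(model.parameters())
params_set = frozenset(params_list)    # tensors hash by object identity
target = params_list[-1]               # worst case for a linear scan

# "target in params_list" would trigger elementwise == on tensors, so the
# list-style lookup is emulated with an identity scan here.
list_check = lambda: any(p is target for p in params_list)  # O(n)
set_check = lambda: target in params_set                    # O(1) on average

print(timeit.timeit(list_check, number=1_000))
print(timeit.timeit(set_check, number=1_000))
```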

@ver217 merged commit a1a7899 into hpcaitech:main on Jun 16, 2022
@ver217 deleted the hotfix/zero-init-ctx branch on June 16, 2022 09:17