[ddp] supported customized torch ddp configuration #1123

Merged (2 commits into hpcaitech:main on Jun 15, 2022)

Conversation

FrankLeeeee (Contributor):
Fixed #995 by allowing the user to set parameters such as find_unused_parameters for torch DDP in the config file.
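Since Colossal-AI config files are plain Python modules, the new option can be expressed as an ordinary dict. A minimal sketch follows; the `torch_ddp` key name is an assumption inferred from the branch name (hotfix/torch-ddp-config), not a confirmed API, and the kwargs are presumably forwarded to `torch.nn.parallel.DistributedDataParallel`:

```python
# config.py -- sketch only; the `torch_ddp` key is an assumption based on the
# branch name (hotfix/torch-ddp-config), not a documented Colossal-AI setting.
torch_ddp = dict(
    find_unused_parameters=True,  # tolerate parameters that receive no gradient in a step
)
```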

FrankLeeeee merged commit 91a5999 into hpcaitech:main on Jun 15, 2022.
FrankLeeeee deleted the hotfix/torch-ddp-config branch on January 26, 2023 at 07:08.
Development

Successfully merging this pull request may close these issues:
[BUG]: RuntimeError: Expected to have finished reduction in the prior iteration before starting a new one. (#995)
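For context on the linked bug: PyTorch raises this RuntimeError when DDP's gradient reducer is still waiting on parameters that never received gradients in the previous backward pass (common with conditionally executed submodules), and passing find_unused_parameters=True to DistributedDataParallel is PyTorch's documented remedy, which is the knob this PR exposes through the config file. A minimal plain-PyTorch sketch, assuming a process group has already been initialized and each rank has a local GPU:

```python
import torch
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP

# Assumes torch.distributed.init_process_group(...) has already run; process-group
# and device setup are outside this sketch.
model = nn.Linear(16, 4).to(torch.cuda.current_device())
ddp_model = DDP(
    model,
    device_ids=[torch.cuda.current_device()],
    find_unused_parameters=True,  # allow parameters to be skipped in a given backward pass
)
```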
2 participants