Apex with DataParallel fixes #1032
Conversation
if isinstance(model, nn.Module):
    model = nn.Sequential(model)
Why are we checking the model for nn.Module, but casting it to nn.Sequential after that?
model = torch.nn.DataParallel(model[0])
model = _patch_forward(model)
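For context, a minimal runnable sketch of the wrapping pattern in the diff above: a plain nn.Module is first wrapped in nn.Sequential so it can be indexed uniformly, and then the inner module is handed to DataParallel. The nn.Linear model here is illustrative, not Catalyst's code; `_patch_forward` is omitted since it is PR-specific.

```python
import torch
import torch.nn as nn

# Illustrative model; any nn.Module works here.
model = nn.Linear(4, 2)

# Wrap a bare nn.Module in nn.Sequential so model[0] indexing is uniform.
if isinstance(model, nn.Module):
    model = nn.Sequential(model)

# Hand the inner (unwrapped) module to DataParallel, as in the diff.
dp = torch.nn.DataParallel(model[0])

# On a CPU-only machine DataParallel simply delegates to the wrapped module.
out = dp(torch.randn(3, 4))
print(tuple(out.shape))  # → (3, 2)
```

Note that DataParallel replicates `model[0]` across visible GPUs at forward time; with no GPUs it falls through to the wrapped module directly.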
Okay, that looks like a really tricky solution.
catalyst/utils/components.py
    else tensor
)

model.forward = lambda *args, old_fwd=model.forward, input_caster=input_caster_lambda, output_caster=output_caster_lambda, **kwargs: apex.amp._initialize.applier(
Is it possible to use **kwargs with this hack?
Yep.
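A minimal, torch-free sketch of why **kwargs survive this patching: the old forward is captured as a default argument, and both positional and keyword arguments are cast before the call. All names here (Model, the caster functions) are illustrative stand-ins, not Catalyst's or Apex's API.

```python
class Model:
    def forward(self, x, *, scale=1):
        return x * scale

def input_caster(x):
    return float(x)      # stand-in for casting input tensors (e.g. to fp16)

def output_caster(x):
    return round(x, 3)   # stand-in for casting outputs back (e.g. to fp32)

model = Model()

# Patch forward: defaults capture the old bound method and the casters,
# *args and **kwargs both pass through the casting before the real call.
model.forward = lambda *args, old_fwd=model.forward, ic=input_caster, oc=output_caster, **kwargs: oc(
    old_fwd(*(ic(a) for a in args), **{k: ic(v) for k, v in kwargs.items()})
)

result = model.forward(2, scale=3)  # keyword args reach the patched forward
print(result)  # → 6.0
```

Since the instance attribute shadows the class method, the lambda intercepts every call while the captured `old_fwd` keeps a reference to the original bound method.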
Changelog is updated, codestyle is fixed.
Before submitting
- Code style is checked with catalyst-make-codestyle && catalyst-check-codestyle (pip install -U catalyst-codestyle).
- Docs are checked with make check-docs.
- Unit tests pass with pytest .
Description
Related Issue
Type of Change
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.