
Support for PyTorch ZeroRedundancyOptimizer #15176

Closed · schmidt-jake opened this issue Oct 18, 2022 · 2 comments
Labels: question (Further information is requested), strategy

Comments

@schmidt-jake (Contributor)

🚀 Feature

Implement a DDP-ZeroRedundancyOptimizer strategy.

Motivation

I know that ZeRO is available in Fairscale, but since PyTorch now has its own implementation at torch.distributed.optim.ZeroRedundancyOptimizer, I think it would make sense to support it.

Pitch

A new strategy, called e.g. DDPZero, that combines DDP with torch's ZeroRedundancyOptimizer, including support for overlapping the optimizer step with DDP's backward pass ("overlap_with_ddp") and for the related DDP comm hooks. A sketch of the underlying PyTorch pattern is below.
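
For reference, here's a minimal sketch of the raw-PyTorch pattern such a strategy would wrap, following the overlap recipe in the ZeroRedundancyOptimizer docs. It assumes a process group launched via torchrun with NCCL; the model, dimensions, and hyperparameters are placeholders:

import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.distributed.optim import ZeroRedundancyOptimizer
from torch.distributed.algorithms.ddp_comm_hooks.default_hooks import allreduce_hook
from torch.distributed.algorithms.ddp_comm_hooks.ddp_zero_hook import hook_with_zero_step

dist.init_process_group("nccl")
local_rank = int(os.environ["LOCAL_RANK"])  # set by torchrun
model = DDP(torch.nn.Linear(32, 2).to(local_rank), device_ids=[local_rank])

# overlap_with_ddp=True defers each shard's step() into DDP's backward pass;
# it requires an optimizer with a functional counterpart (Adam has one).
zero = ZeroRedundancyOptimizer(
    model.parameters(),
    optimizer_class=torch.optim.Adam,
    overlap_with_ddp=True,
    lr=0.1,
)

# Register a comm hook that runs the sharded optimizer step as gradient
# buckets finish all-reducing during the backward pass.
model.register_comm_hook(None, hook_with_zero_step(allreduce_hook, model, zero))

Note that, per the PyTorch docs, the first couple of iterations with overlapping enabled act as a warm-up and don't perform parameter updates.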

Alternatives

As mentioned above, Fairscale implements ZeRO, but using it means taking on an additional dependency. Since PyTorch has a native implementation, it could make sense for Lightning to support that instead. A sketch of the Fairscale route follows for comparison.
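
For comparison, here's a minimal sketch of the Fairscale equivalent via fairscale.optim.OSS, assuming fairscale is installed; the model and hyperparameters are again placeholders:

import torch
from fairscale.optim.oss import OSS

# OSS shards optimizer state across ranks, analogous to ZeRO stage 1
model = torch.nn.Linear(32, 2)
optimizer = OSS(model.parameters(), optim=torch.optim.Adam, lr=0.1)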

schmidt-jake added the needs triage label on Oct 18, 2022
@rohitgr7 (Contributor)

Support was added in #14208 and will be available in the next release, v1.8, in a few days :)

You'll be able to use it like this:

import torch
from pytorch_lightning import LightningModule, Trainer
from torch.distributed.optim import ZeroRedundancyOptimizer

class LitModel(LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)  # placeholder module
    def configure_optimizers(self):
        # ZeroRedundancyOptimizer shards the Adam state across DDP ranks
        return ZeroRedundancyOptimizer(self.layer.parameters(), optimizer_class=torch.optim.Adam, lr=0.1)

model = LitModel()
trainer = Trainer(accelerator="gpu", devices=2, strategy="ddp")
trainer.fit(model)

rohitgr7 added the question and strategy labels and removed the needs triage label on Oct 21, 2022
@schmidt-jake (Contributor, Author)

Fantastic! Apologies, I missed that.
