
01 adam optimizer #1790

Merged
jeffra merged 24 commits into deepspeedai:master from EugeneLYC:01Adam
Mar 11, 2022

Conversation

@EugeneLYC (Contributor)

Maximizing Communication Efficiency for Large-scale Training via 0/1 Adam

Authors: @EugeneLYC, @conglongli, @minjiaz, Christopher De Sa, Yuxiong He
Paper: https://arxiv.org/abs/2202.06009
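The paper's core idea is cutting communication cost during distributed Adam training. A key primitive in this line of work (1-bit Adam and its successors, including 0/1 Adam) is error-compensated 1-bit compression: each worker transmits only the sign of a tensor plus a single scale, and feeds the quantization error back into the next step. Below is a minimal illustrative sketch of that primitive in NumPy; it is not DeepSpeed's implementation, and the function name `one_bit_compress` is hypothetical.

```python
import numpy as np

def one_bit_compress(grad, residual):
    """Error-compensated 1-bit (sign) compression sketch.

    The residual left over from the previous step is folded back in
    before compressing, so quantization error is delayed rather than
    lost -- this is what keeps the compressed updates unbiased over time.
    """
    corrected = grad + residual
    scale = np.abs(corrected).mean()          # one scalar per tensor
    compressed = np.sign(corrected) * scale   # 1 bit per element + the scale
    new_residual = corrected - compressed     # fed back into the next step
    return compressed, new_residual

rng = np.random.default_rng(0)
grad = rng.standard_normal(8)
residual = np.zeros_like(grad)
compressed, residual = one_bit_compress(grad, residual)
```

Note that `compressed + residual` exactly equals the error-corrected gradient, so nothing is permanently discarded; only its delivery is deferred across iterations.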

@conglongli (Contributor) left a comment

Thanks Yucheng for the great work! I left a few comments that should be easy to fix. I think this PR should be ready to merge after the fix and adding the doc/tutorial.

@conglongli (Contributor) left a comment

Thanks Yucheng for applying the requested changes and adding the doc/tutorial. I reviewed and left a few minor comments.

@conglongli (Contributor) left a comment

LGTM, thank you Yucheng!

@conglongli conglongli enabled auto-merge (squash) March 10, 2022 04:48
@jeffra jeffra disabled auto-merge March 11, 2022 05:28
@jeffra jeffra merged commit b80e562 into deepspeedai:master Mar 11, 2022


3 participants