
Add the maximize flag to all optimizers #68052

Closed
10 of 11 tasks
albanD opened this issue Nov 9, 2021 · 11 comments
Labels
feature A request for a proper, new feature. module: optimizer Related to torch.optim triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module

Comments

@albanD
Collaborator

albanD commented Nov 9, 2021

See #46480 for the original discussion.

All optimizers should get the same treatment:

cc @vincentqb @jbschlosser @albanD
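For background on what the flag does: with `maximize=True` an optimizer performs gradient ascent instead of descent, which amounts to flipping the sign of the update. A minimal plain-Python sketch of the semantics (the function name `sgd_step` is illustrative, not the actual torch.optim API):

```python
def sgd_step(params, grads, lr=0.1, maximize=False):
    """One plain SGD update. With maximize=True the parameters move
    with the gradient (ascent) instead of against it (descent)."""
    sign = 1.0 if maximize else -1.0
    return [p + sign * lr * g for p, g in zip(params, grads)]

# Minimizing moves against the gradient; maximize=True moves with it.
print(sgd_step([1.0], [2.0]))                 # roughly [0.8]
print(sgd_step([1.0], [2.0], maximize=True))  # roughly [1.2]
```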

@albanD added the feature, module: optimizer, and triaged labels on Nov 9, 2021
@OliverFM
Contributor

Which one is the next highest priority? Adam?
I would like to try to do another at some point in the next week or two.

@albanD
Collaborator Author

albanD commented Nov 10, 2021

Adam sounds like a good candidate. I re-ordered the list above to be roughly in priority order.

@wukong1992
Contributor

Hey!
I'm a newcomer and want to contribute to torch. Which optimizers still need to be done? I'll spend some time on a PR.

@OliverFM
Contributor

> I'm a newcomer and want to contribute to torch. Which optimizers still need to be done? I'll spend some time on a PR.

Hey @wukong1992! Welcome!
I am also new here, however I implemented the fix for SGD, and opened #68164 for Adam.

In my opinion, the best first step is to set up the torch repo locally and run the tests. After that, you'd probably want to pick one of the optimizers and mention here that you are working on its implementation.
You can look at the existing implementation in SGD for reference on how this ought to be done, and at test_sgd in torch/test_optim.py for an example of the testing you'd need to do.

I hope this helps!
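The approach the SGD change takes, roughly, is to negate the gradient once when maximize is set and then reuse the unchanged descent path; the tests then rely on the fact that maximizing f is the same as minimizing -f. A hedged sketch of both ideas in plain Python (`descend` is a hypothetical toy loop, not the actual torch.optim code):

```python
def descend(x, grad_fn, lr=0.1, steps=5, maximize=False):
    """Toy optimizer loop. With maximize=True the gradient is negated
    once up front; the descent update itself stays unchanged."""
    for _ in range(steps):
        g = grad_fn(x)
        if maximize:
            g = -g  # flip once, then fall through to the usual update
        x = x - lr * g
    return x

# Property the tests can check: maximizing f matches minimizing -f.
ascend_f = descend(3.0, lambda x: 2 * x, maximize=True)  # f(x) = x**2
minimize_neg_f = descend(3.0, lambda x: -2 * x)          # gradient of -f
assert ascend_f == minimize_neg_f
```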

@wukong1992
Contributor


OK! Thank you so much!

alexnikulkov added a commit to alexnikulkov/ReAgent that referenced this issue Nov 17, 2021
Summary:
A new attribute has been added to SGD and will be added to other optimizers in the future. We need to make a corresponding change to `OptimizerConfig`
pytorch/pytorch#68052

Differential Revision: D32513683

fbshipit-source-id: addba19d29dd04792ce770ca4df82388146668ad
facebook-github-bot pushed a commit to facebookresearch/ReAgent that referenced this issue Nov 18, 2021
Summary:
Pull Request resolved: #581

A new attribute has been added to SGD and will be added to other optimizers in the future. We need to make a corresponding change to `OptimizerConfig`
pytorch/pytorch#68052

Reviewed By: czxttkl

Differential Revision: D32513683

fbshipit-source-id: 61f4042c10f9843f73d886b9d8c1d90baa52c5c1
facebook-github-bot pushed a commit that referenced this issue Dec 13, 2021
Summary:
Solves the next most important use case in #68052.

I have kept the style as close to that in SGD as seemed reasonable, given the slight differences in their internal implementations.

All feedback welcome!

cc pietern mrshenli pritamdamania87 zhaojuanmao satgera rohan-varma gqchen aazzolini osalpekar jiayisuse SciPioneer H-Huang

Pull Request resolved: #68164

Reviewed By: VitalyFedyunin

Differential Revision: D32994129

Pulled By: albanD

fbshipit-source-id: 65c57c3f3dbbd3e3e5338d51def54482503e8850
@OliverFM
Contributor

Adam is done (#68164); I may start work on AdamW this week.

@albanD
Collaborator Author

albanD commented Dec 17, 2021

Thanks!

@Adnios
Contributor

Adnios commented Dec 18, 2021

Oops, @OliverFM, have you started AdamW? I just finished it (#70146).

@OliverFM
Contributor

> Oops, @OliverFM, have you started AdamW? I just finished it (#70146).

Fortunately not :)
I'll pick up ASGD instead!

facebook-github-bot pushed a commit that referenced this issue Dec 23, 2021
Summary:
Related issue: #68052

cc pietern mrshenli pritamdamania87 zhaojuanmao satgera rohan-varma gqchen aazzolini osalpekar jiayisuse SciPioneer H-Huang

Pull Request resolved: #70146

Reviewed By: malfet

Differential Revision: D33254561

Pulled By: albanD

fbshipit-source-id: f190c836a4162f936c5953e076747c345df21421
pytorchmergebot pushed a commit that referenced this issue Apr 8, 2022
Added the maximize flag to Adadelta optimizer (#68052) and adjusted tests to take maximize into account.
Pull Request resolved: #75330
Approved by: https://github.com/cpuhrsch
facebook-github-bot pushed a commit that referenced this issue Apr 11, 2022
Summary:
Added the maximize flag to Adadelta optimizer (#68052) and adjusted tests to take maximize into account.

Pull Request resolved: #75330
Approved by: https://github.com/cpuhrsch

Test Plan: contbuild & OSS CI, see https://hud.pytorch.org/commit/pytorch/pytorch/58a44523c1f74093ad9a05513a32c45a109b6c55

Reviewed By: b0noI

Differential Revision: D35510886

fbshipit-source-id: 3d123365c381a83a3b2f71eb6a889641d96af766
@zaxtax
Contributor

zaxtax commented Apr 14, 2022

I'll handle the rest of these!

pytorchmergebot pushed a commit that referenced this issue Apr 20, 2022
This adds maximize to Adagrad (#68052) along with updates to the respective tests.

Pull Request resolved: #75968
Approved by: https://github.com/albanD
malfet pushed a commit that referenced this issue Apr 20, 2022
This adds maximize to Adagrad (#68052) along with updates to the respective tests.

Pull Request resolved: #75968
Approved by: https://github.com/albanD

(cherry picked from commit 6642e88)
facebook-github-bot pushed a commit that referenced this issue Apr 21, 2022
Summary:
This adds maximize to Adagrad (#68052) along with updates to the respective tests.

Pull Request resolved: #75968
Approved by: https://github.com/albanD

Test Plan: contbuild & OSS CI, see https://hud.pytorch.org/commit/pytorch/pytorch/6642e88ad23bfcf1349f2bce06f3490e49f195bf

Reviewed By: seemethere

Differential Revision: D35785877

fbshipit-source-id: d494b629f3a1b77cec3b3f1ab507516ddea0005a
pytorchmergebot pushed a commit that referenced this issue May 16, 2022
Added the maximize flag (#68052) to the Adamax optimizer and updated the respective tests.
Pull Request resolved: #77409
Approved by: https://github.com/albanD
facebook-github-bot pushed a commit that referenced this issue May 17, 2022
Summary:
Added the maximize flag (#68052) to the Adamax optimizer and updated the respective tests.

Pull Request resolved: #77409
Approved by: https://github.com/albanD

Test Plan: contbuild & OSS CI, see https://hud.pytorch.org/commit/pytorch/pytorch/2a496e2f80a9937c336aaabf13f99cbc3983998c

Reviewed By: atalman

Differential Revision: D36421575

fbshipit-source-id: 298aa514ffa138c01daa8c538ba757b2af31b2e1
pytorchmergebot pushed a commit that referenced this issue Jul 8, 2022
Added the maximize flag (#68052) to the ASGD optimizer and updated the respective tests.
Pull Request resolved: #80323
Approved by: https://github.com/albanD
pytorchmergebot pushed a commit that referenced this issue Jul 8, 2022
Added the maximize flag (#68052) to the RMSprop optimizer and updated the respective tests.

Pull Request resolved: #80326
Approved by: https://github.com/albanD
pytorchmergebot pushed a commit that referenced this issue Jul 8, 2022
Added the maximize flag (#68052) to the rprop optimizer and updated the respective tests.
Pull Request resolved: #80335
Approved by: https://github.com/albanD
pytorchmergebot pushed a commit that referenced this issue Jul 8, 2022
Added the maximize flag (#68052) to the SparseAdam optimizer and updated the respective tests.
Pull Request resolved: #80336
Approved by: https://github.com/albanD
facebook-github-bot pushed a commit that referenced this issue Jul 8, 2022
Summary:
Added the maximize flag (#68052) to the RMSprop optimizer and updated the respective tests.

Pull Request resolved: #80326
Approved by: https://github.com/albanD

Test Plan: contbuild & OSS CI, see https://hud.pytorch.org/commit/pytorch/pytorch/a1fd5b42730924c5e5a2beb5dabc779725768830

Reviewed By: mehtanirav

Differential Revision: D37717279

Pulled By: mehtanirav

fbshipit-source-id: a87ef2779dd8ec28513c7cff61b7cbbbf675b913
facebook-github-bot pushed a commit that referenced this issue Jul 8, 2022
Summary:
Added the maximize flag (#68052) to the SparseAdam optimizer and updated the respective tests.

Pull Request resolved: #80336
Approved by: https://github.com/albanD

Test Plan: contbuild & OSS CI, see https://hud.pytorch.org/commit/pytorch/pytorch/f24c94d7ae00a7acc1fc3d543e6217509cfeb885

Reviewed By: mehtanirav

Differential Revision: D37717288

Pulled By: mehtanirav

fbshipit-source-id: 89d20b886febdad620b0c7e52592931500a8beef
pytorchmergebot pushed a commit that referenced this issue Jul 22, 2022
Added the maximize flag (#68052) to the ASGD optimizer and updated the respective tests.
Pull Request resolved: #81875
Approved by: https://github.com/albanD
facebook-github-bot pushed a commit that referenced this issue Jul 26, 2022
Summary:
Added the maximize flag (#68052) to the ASGD optimizer and updated the respective tests.

Pull Request resolved: #81875
Approved by: https://github.com/albanD

Test Plan: contbuild & OSS CI, see https://hud.pytorch.org/commit/pytorch/pytorch/50c655d5e33eb2ec4d501fe3b03647720a3e67c8

Reviewed By: osalpekar

Differential Revision: D38119385

Pulled By: osalpekar

fbshipit-source-id: cb1a91a94a642b39113ad5a163c7901e64fd80a8
pytorchmergebot pushed a commit that referenced this issue Aug 16, 2022
Added the maximize flag (#68052) to the rprop optimizer and updated the respective tests.
Pull Request resolved: #81864
Approved by: https://github.com/albanD
facebook-github-bot pushed a commit that referenced this issue Aug 16, 2022
Summary:
Added the maximize flag (#68052) to the rprop optimizer and updated the respective tests.

Pull Request resolved: #81864
Approved by: https://github.com/albanD

Test Plan: contbuild & OSS CI, see https://hud.pytorch.org/commit/pytorch/pytorch/ff75562cffb54d7500a94a1091e06dc9b5c284fc

Reviewed By: atalman

Differential Revision: D38745528

fbshipit-source-id: 3b59bf641ff08d37921f231cd4a04244f8a045b6
@albanD
Collaborator Author

albanD commented Oct 5, 2022

We will not do LBFGS as part of this issue, as it is more complex than the others.

@albanD closed this as completed on Oct 5, 2022