Add the `maximize` flag to all optimizers #68052
Comments
Which one is the next highest priority? Adam?
Adam sounds like a good candidate. I re-ordered the list above to roughly reflect priority.
Hey!
Hey @wukong1992! Welcome! In my opinion, the best first step is to set up the torch repo locally and run the tests. After that, you'd probably want to pick one of the optimizers and mention here that you are working on that implementation. I hope this helps!
OK! Thank you so much!
Summary: Pull Request resolved: #581 A new attribute has been added to SGD and will be added to other optimizers in the future. We need to make a corresponding change to `OptimizerConfig` pytorch/pytorch#68052 Reviewed By: czxttkl Differential Revision: D32513683 fbshipit-source-id: 61f4042c10f9843f73d886b9d8c1d90baa52c5c1
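The `OptimizerConfig` change referenced above lives in a downstream config layer. As a hedged illustration only (the field names and dataclass shape here are assumptions, not the actual code in that repo), mirroring the new optimizer argument would look roughly like this:

```python
from dataclasses import dataclass

@dataclass
class OptimizerConfig:
    lr: float = 0.01
    momentum: float = 0.0
    # Hypothetical field mirroring the new torch.optim.SGD `maximize` argument.
    maximize: bool = False
```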
Summary: Solves the next most important use case in #68052. I have kept the style as close to that in SGD as seemed reasonable, given the slight differences in their internal implementations. All feedback welcome! cc pietern mrshenli pritamdamania87 zhaojuanmao satgera rohan-varma gqchen aazzolini osalpekar jiayisuse SciPioneer H-Huang Pull Request resolved: #68164 Reviewed By: VitalyFedyunin Differential Revision: D32994129 Pulled By: albanD fbshipit-source-id: 65c57c3f3dbbd3e3e5338d51def54482503e8850
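For context, the pattern these PRs follow is small: when `maximize=True`, the optimizer negates the gradient before applying the usual descent update, which turns minimization into maximization. A hand-rolled sketch (illustrative only, not the actual `torch.optim` source, which also handles momentum, weight decay, and so on):

```python
import torch

def sgd_step(params, lr, maximize=False):
    # Negate the gradient when maximizing, so the descent update below
    # becomes an ascent update on the original objective.
    for p in params:
        if p.grad is None:
            continue
        grad = -p.grad if maximize else p.grad
        p.data.add_(grad, alpha=-lr)
```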
Adam is done (#68164); I may start work on AdamW this week.
Thanks!
Summary: Related issue: #68052 cc pietern mrshenli pritamdamania87 zhaojuanmao satgera rohan-varma gqchen aazzolini osalpekar jiayisuse SciPioneer H-Huang Pull Request resolved: #70146 Reviewed By: malfet Differential Revision: D33254561 Pulled By: albanD fbshipit-source-id: f190c836a4162f936c5953e076747c345df21421
Added the maximize flag to the Adadelta optimizer (#68052) and adjusted the tests to take maximize into account. Pull Request resolved: #75330 Approved by: https://github.com/cpuhrsch
I'll handle the rest of these!
This adds maximize to Adagrad (#68052) and updates the respective tests. Pull Request resolved: #75968 Approved by: https://github.com/albanD
Added the maximize flag (#68052) to the Adamax optimizer and updated the respective tests. Pull Request resolved: #77409 Approved by: https://github.com/albanD
Added the maximize flag (#68052) to the ASGD optimizer and updated the respective tests. Pull Request resolved: #80323 Approved by: https://github.com/albanD
Added the maximize flag (#68052) to the RMSprop optimizer and updated the respective tests. Pull Request resolved: #80326 Approved by: https://github.com/albanD
Added the maximize flag (#68052) to the Rprop optimizer and updated the respective tests. Pull Request resolved: #80335 Approved by: https://github.com/albanD
Added the maximize flag (#68052) to the SparseAdam optimizer and updated the respective tests. Pull Request resolved: #80336 Approved by: https://github.com/albanD
Added the maximize flag (#68052) to the ASGD optimizer and updated the respective tests. Pull Request resolved: #81875 Approved by: https://github.com/albanD
Added the maximize flag (#68052) to the Rprop optimizer and updated the respective tests. Pull Request resolved: #81864 Approved by: https://github.com/albanD
We will not do LBFGS as part of this issue, as it is more complex than the others.
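For anyone who needs ascent with LBFGS in the meantime, the standard workaround (a general technique, not something prescribed in this thread) is to minimize the negated objective, which is mathematically equivalent to maximizing the original:

```python
import torch

param = torch.tensor([0.0], requires_grad=True)
opt = torch.optim.LBFGS([param], lr=0.5)

def closure():
    opt.zero_grad()
    objective = -(param - 3.0) ** 2  # concave toy objective, maximum at 3
    loss = -objective                # negate: maximizing objective == minimizing loss
    loss.backward()
    return loss

for _ in range(10):
    opt.step(closure)
# param is now close to 3.0
```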
See #46480 for the original discussion.
All optimizers should get the same treatment:

- Add the `maximize` flag to SGD. #67847

cc @vincentqb @jbschlosser @albanD
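For reference, minimal usage of the flag as it landed in SGD (the toy objective and hyperparameters below are illustrative):

```python
import torch

w = torch.tensor([0.0], requires_grad=True)
# maximize=True makes the optimizer ascend the objective, so there is
# no need to hand-negate the loss.
opt = torch.optim.SGD([w], lr=0.1, maximize=True)

for _ in range(100):
    opt.zero_grad()
    objective = -(w - 3.0) ** 2  # concave toy objective, maximum at w == 3
    objective.backward()
    opt.step()

print(w.item())  # approximately 3.0
```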