
Multiple Primal Optimizers #45

Merged (12 commits) on Aug 24, 2022

Conversation

juan43ramirez
Collaborator

Closes #39

Changes

Parameter primal_optimizer of ConstrainedOptimizer renamed to primal_optimizers. It accepts either (i) a single torch.optim.Optimizer or (ii) a list of Optimizers. Behavior in case (i) is unchanged.
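Accepting both call signatures suggests a small normalization step inside the constructor. The sketch below is illustrative only: the helper name ensure_list is hypothetical, and a real implementation would more likely check isinstance(..., torch.optim.Optimizer) than list/tuple.

```python
def ensure_list(primal_optimizers):
    """Wrap a bare optimizer into a list; pass lists/tuples through.

    Hypothetical helper -- Cooper's actual internal logic may differ.
    """
    if isinstance(primal_optimizers, (list, tuple)):
        return list(primal_optimizers)
    return [primal_optimizers]


class FakeOptimizer:
    """Stand-in for a torch.optim.Optimizer, for demonstration only."""


single = FakeOptimizer()
assert ensure_list(single) == [single]    # case (i): wrapped internally
assert ensure_list([single]) == [single]  # case (ii): passed through unchanged
```

With this normalization, the rest of the class can always iterate over a list, regardless of which form the user passed.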

Lines previously performing constrained_optimizer.primal_optimizer.method() now iterate over constrained_optimizer.primal_optimizers and call primal_optimizer.method() on each. For instance,

for primal_optimizer in self.primal_optimizers:
    primal_optimizer.step()

Saving and loading of checkpoints was modified to allow for lists of primal optimizers.
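The checkpointing change presumably amounts to saving one state_dict per primal optimizer and restoring them in order. A minimal sketch of that pattern in plain PyTorch (the checkpoint key name primal_opt_states is made up here, not Cooper's actual format):

```python
import io

import torch

params = [torch.nn.Parameter(torch.zeros(2))]
primal_optimizers = [
    torch.optim.SGD(params, lr=0.1),
    torch.optim.Adam(params, lr=1e-3),
]

# Save: one state_dict per primal optimizer, in order.
checkpoint = {"primal_opt_states": [opt.state_dict() for opt in primal_optimizers]}
buffer = io.BytesIO()
torch.save(checkpoint, buffer)

# Load: zip the saved states back onto freshly constructed optimizers.
buffer.seek(0)
loaded = torch.load(buffer)
for opt, state in zip(primal_optimizers, loaded["primal_opt_states"]):
    opt.load_state_dict(state)
```

Restoring by position assumes the optimizers are rebuilt in the same order they were saved; a robustness check on the list lengths before zipping would be a reasonable addition.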

Warning
This breaks backward compatibility when instantiating a ConstrainedOptimizer with keyword argument primal_optimizer.

Testing

Toy2D testing of constrained and unconstrained execution included in test_optimizer.py.
Checkpointing test included in test_checkpoint.py.

This functionality is not tested together with other optimization methods or formulations (Extragradient, Augmented Lagrangian, Proxy constraints).

Docs

Note added in constrained_optimizer.rst indicating that this functionality exists.
Small section added in optim.rst describing how to set up multiple primal optimizers and how Cooper handles them.
Docstrings and overall documentation updated to be general enough for multiple optimizers.

@juan43ramirez juan43ramirez added the enhancement New feature or request label Aug 23, 2022
@juan43ramirez juan43ramirez self-assigned this Aug 23, 2022
@juan43ramirez juan43ramirez linked an issue Aug 23, 2022 that may be closed by this pull request
Collaborator

@gallego-posada gallego-posada left a comment


The implementation of multiple primal optimizers looks good!

I left several repeated change requests suggesting that callers not wrap a single optimizer object into a list, since the ConstrainedOptimizer class already does that internally.
IMO this keeps usage less cluttered, as users will most often pass a single primal optimizer anyway.

Review comments (resolved) on:
cooper/constrained_optimizer.py
README.md
tutorials/max_entropy.ipynb
tutorials/scripts/plot_gaussian_mixture.py
tutorials/scripts/plot_max_entropy.py
tutorials/widget.py
docs/source/additional_features.rst
@gallego-posada
Collaborator

A natural follow-up feature for this would be multiple (partially instantiated) dual optimizers.
It might be tricky to allow for too much granularity (say a different optimizer for each constraint). But allowing for different optimizers for equality and inequality constraints could be a good first step.
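As a sketch, the partial instantiation mentioned above could use functools.partial, completing each optimizer once the corresponding multiplier tensors exist. The dictionary keys and variable names below are hypothetical, not Cooper's API:

```python
from functools import partial

import torch

# One partially-instantiated optimizer class per constraint type.
dual_optimizer_partials = {
    "eq": partial(torch.optim.SGD, lr=1e-2),
    "ineq": partial(torch.optim.Adam, lr=1e-3),
}

# Once the Lagrange-multiplier tensors exist, each partial is completed.
eq_multipliers = [torch.nn.Parameter(torch.zeros(4))]
ineq_multipliers = [torch.nn.Parameter(torch.zeros(2))]

dual_optimizers = {
    "eq": dual_optimizer_partials["eq"](eq_multipliers),
    "ineq": dual_optimizer_partials["ineq"](ineq_multipliers),
}
```

Deferring instantiation this way mirrors how a dual optimizer cannot be built until the formulation has created its multiplier variables.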

However, this does not seem to be a pressing feature to implement at the moment.

juan43ramirez and others added 7 commits August 23, 2022 19:05
Co-authored-by: Jose Gallego-Posada <jgalle29@gmail.com>