
Generalize Optim.jl to be an interface for nonlinear optimization #309

Closed
ChrisRackauckas opened this issue Nov 27, 2016 · 26 comments

@ChrisRackauckas
Contributor

ChrisRackauckas commented Nov 27, 2016

I think Optim.jl is in a prime position to generalize itself into a common interface for nonlinear optimization. While JuMP has some support for nonlinear optimization, by design it won't be able to be as comprehensive as what is envisioned here, because JuMP's constraints have special requirements that one cannot generally satisfy. For example, while NLopt and IPOPT accept general constraint functions g, JuMP only allows functions defined within its macros, and places heavy restrictions like "no linear algebra".

Thus I see Optim.jl as well placed to provide a common interface for optimizing a function f subject to a constraint function g, and to expand this common interface to include non-Julia implementations, etc. Following the examples of other metapackages/ecosystems like JuMP, Plots, JuliaML, and DifferentialEquations, I propose the following structure:

These parts I'd consider for an OptimBase.jl:

  • An UnconstrainedOptProblem(f,x0) type which holds the information for an unconstrained optimization problem, and a ConstrainedOptProblem(f,g,x0) (plus slots for Jacobians/Hessians/whatever else)
  • A common interface built around optimize(prob,alg()). This is similar to before, but allows dispatching on the problem type (which may get more refined later, e.g. a LinearConstrainedOptProblem or something)
  • A promotion from Unconstrained to Constrained with a trivial g. This would allow developers to target just the constrained problem and get the other "for free" (sometimes this might make sense).
  • A solution type (which Optim.jl already has)

I think it would then be wise to have a package for the native solvers (currently Optim.jl), and to add bindings to this common interface for packages like NLopt, so that optimize(prob,GN_ISRES()) makes it easy to try methods from other packages.
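To make the proposal concrete, here is a rough sketch of what such an OptimBase.jl core could look like. Everything here is hypothetical, following the bullet points above; none of these types or the promotion exist yet:

```julia
# Hypothetical sketch of the proposed OptimBase.jl problem types.
abstract type AbstractOptProblem end

struct UnconstrainedOptProblem{F,X} <: AbstractOptProblem
    f::F    # objective function
    x0::X   # initial point
end

struct ConstrainedOptProblem{F,G,X} <: AbstractOptProblem
    f::F    # objective function
    g::G    # constraint function
    x0::X   # initial point
end

# Promotion: an unconstrained problem is a constrained one with a trivial g,
# so a solver that targets only constrained problems handles both "for free".
ConstrainedOptProblem(p::UnconstrainedOptProblem) =
    ConstrainedOptProblem(p.f, x -> eltype(p.x0)[], p.x0)

# Solver packages would then dispatch on the problem and algorithm types:
# optimize(prob::AbstractOptProblem, alg) = ...
```

With something like this in place, optimize(prob, GN_ISRES()) from an NLopt binding and optimize(prob, BFGS()) from the native solvers would share the same front end.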

What do you guys think of this proposal, or of a similar design with the same goals? I for one would find this immensely useful, as it would allow me to target just a single interface while giving users the flexibility to pick backend methods from various packages.

(I also have on my mind a similar package for root finding, to blend together NLsolve, Roots.jl, and Sundials' KINSOL. Chat with me if you're interested.)

Edit: Added link to JuMP-dev Gitter Archive talking about the (lack of) possibility for this in JuMP

@mlubin
Contributor

mlubin commented Nov 27, 2016

For the purposes of nonlinear optimization, JuMP is just a nice interface to AD. MathProgBase is already the "common interface" you're looking for, for constrained problems.

@ChrisRackauckas
Contributor Author

I see. Then what about making Optim.jl available in the MathProgBase nonlinear interface?

@mlubin
Contributor

mlubin commented Nov 27, 2016

Anyone is welcome to do so. Relevant discussions:
#107
JuliaOpt/MathProgBase.jl#87

@timholy
Contributor

timholy commented Nov 27, 2016

Need #303 to handle constraints.

@pkofod pkofod mentioned this issue Dec 22, 2016
@gragusa

gragusa commented Jan 12, 2017

If somebody is interested, I put together glue code to integrate Optim.jl with the MathProgBase nonlinear interface. Nothing too fancy, but I needed it, so I went ahead and implemented it. Comments (and PRs!) welcome: OptimMPG.jl.

@pkofod
Member

pkofod commented Jan 14, 2017

> If somebody is interested, I put together glue code to integrate Optim.jl with the MathProgBase nonlinear interface. Nothing too fancy, but I needed it, so I went ahead and implemented it. Comments (and PRs!) welcome: OptimMPG.jl.

Wonderful! Great to see someone thinking about this. I'm not sure I would use it all that often personally, but on the other hand I can see how it could be useful to some people, especially once the constrained optimization features improve.

@pkofod
Member

pkofod commented Mar 24, 2017

@gragusa if I understand correctly, your package allows you to use Optim through MPG, right? It's not code that allows you to use Optim to reach NLopt, for example.

@gragusa

gragusa commented Mar 24, 2017 via email

@pkofod
Member

pkofod commented Jun 17, 2017

ping @gragusa

I may be wrong, but on reflection I think it makes the most sense not to keep OptimMPB.jl separate. What about an OptimMPB.jl package under JuliaNLSolvers? We could simply move your existing package there.

@gragusa

gragusa commented Jun 17, 2017 via email

@pkofod
Member

pkofod commented Jun 17, 2017

I am happy to transfer ownership of the package to JuliaNLSolvers. I had been planning to get the package in better shape, but I waited to see whether the work done on constrained optimization would be merged. That's probably not going to happen anytime soon, though, so there's no need to wait.

I intend to make a push for getting the constrained ball rolling after JuliaCon.

@jonathanBieler
Contributor

jonathanBieler commented Dec 21, 2017

I'm trying to write an optimizer (M <: Optimizer). In particular, I want to reuse optimize by implementing all the methods it calls (initial_state, update_state!, ...) for my optimizer, all outside of Optim. Is that possible? It doesn't seem like MathProgBase provides the same kind of utilities.

The first issue I ran into is a No default objective type for M... error, but I can't tag my optimizer as a FirstOrderSolver since that's hardcoded here.

It seems we would need M <: FirstOrderSolver <: Optimizer, or wouldn't that work?

@pkofod
Member

pkofod commented Dec 21, 2017

The hard coding of that is indeed going to be removed very soon!

@anriseth
Contributor

> The hard coding of that is indeed going to be removed very soon!

Great :) I hit this issue as well with my implementation of N-GMRES.

@pkofod
Member

pkofod commented Dec 21, 2017

I've hit it as well. It'll be there in a week, promise! What are you implementing, Jonathan?

@jonathanBieler
Contributor

Nice.

@pkofod I'm trying to make a composable gradient descent type that covers all possible use cases, e.g.:

gd = GradientDescent(
    MiniBatch(f,data,batch_size),
    AdaGrad(SimpleMomentum(1e-5)),
    HypergradientDescent(),
)

optimize(gd...)

I'm just playing with it to see if it's feasible and desirable at the moment.

Otherwise I'd also like to have a good CMA-ES implementation at some point, there's some versions around but they are not optimal afaik.

@pkofod
Member

pkofod commented Dec 22, 2017

Alright, cool. Would love to see what you cook up.

@jonathanBieler
Contributor

jonathanBieler commented Jan 12, 2018

@pkofod,
I'm trying to make minimal examples. It works well for FirstOrderOptimizer, but I'm running into some issues with ZerothOrderOptimizer: this part here has some hard-coded convergence tests, and my optimizer throws an error when gradient! gets called.

https://github.com/JuliaNLSolvers/Optim.jl/blob/master/src/multivariate/optimize/optimize.jl#L33

Maybe introducing a method initial_convergence(state, d, options) (replacing lines 33-41) would do the trick? I also noticed there's a ZerothOrderState but not a FirstOrderState.
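A sketch of what such a dispatch hook could look like (the types and function signature here are only illustrative, not Optim's actual internals):

```julia
# Hypothetical initial_convergence hook replacing the hard-coded checks.
abstract type AbstractOptimizer end
struct ZerothOrderOptimizer <: AbstractOptimizer end
struct FirstOrderOptimizer  <: AbstractOptimizer end

# Fallback for zeroth-order methods: there is no gradient, so never report
# convergence at iteration zero (and gradient! is never called).
initial_convergence(g, method::AbstractOptimizer, g_tol) = false

# First-order methods can check the initial gradient's infinity norm.
initial_convergence(g, method::FirstOrderOptimizer, g_tol) =
    maximum(abs, g) < g_tol
```

Because the zeroth-order fallback returns false without touching the gradient, a derivative-free optimizer would pass straight through this check instead of erroring.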

I also couldn't find a generic fallback method for trace! (something like default_convergence_assessment), so I had to copy some code there.

This is what I got for first order:

https://gist.github.com/jonathanBieler/ed2ae8868e7b317c9e6d2db86f6ed2b9

And zeroth order:

https://gist.github.com/jonathanBieler/47d9ae7e95e7ca0f7352de8f84827ae3

@pkofod
Member

pkofod commented Jan 12, 2018

Thank you, this is very helpful. I'll have a look soon.

@Nosferican

Status?

@pkofod
Member

pkofod commented Jun 3, 2019

I think this might be easier after the rewrite. It's not a priority of mine, though. This thread also has two discussions in one, I think :p

@gragusa

gragusa commented Jun 3, 2019 via email

@pkofod
Member

pkofod commented Jun 3, 2019

Are you talking about an MOI rewrite?

@gragusa

gragusa commented Jun 3, 2019 via email

@pkofod
Member

pkofod commented Aug 16, 2020

I believe @ChrisRackauckas has found his answer in GalacticOptim :)

@pkofod pkofod closed this as completed Aug 16, 2020
@ChrisRackauckas
Contributor Author

Yup 😄


8 participants