Optim of Tomorrow #326

pkofod opened this Issue Dec 22, 2016 · 6 comments



pkofod commented Dec 22, 2016

aka Optim Roadmap 2.0 aka Make Optim Greater Still...

These lists are up for discussion; I just figured I would start that discussion now. JuliaLang v1.0 is approaching rapidly, with the v0.6 feature freeze not far into the future. I am not sure we should tag Optim v1.0 the day JuliaLang starts shipping as non-beta, but it would be cool to have a lot of the current issues sorted out by then.

Let the number of issues guide you as to what I will tackle first :)


  • Replace df.f(x), df.g!(x, stor) and df.h!(x, stor) with value(df, x), gradient!(df, x) and hessian!(df, x), where the old storage argument is embedded in df. #156 #163 #219 #241 #287 #305 #306
  • Restore the matrix API; ref #198 #312
  • Figure out what to do about tests and benchmarks. The current idea seems to be to have some testing here, and then extensive benchmarking and testing in OptimTests.jl. Relevant issues and PRs: JuliaOpt/JuMP.jl#42 JuliaNLSolvers/OptimTests.jl#7
  • Move problems and some testing infrastructure out of Optim
  • Add limits to f_/g_/h_calls through a keyword and a new function #241
  • A value/gradient!/hessian! interface #308 (I intend to implement this together with the first bullet)
  • Figure out if we want to integrate with JuMP/MathProgBase (for MathProgBase at least) #107 #309
  • Better/additional/flexible stopping criteria? #374
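For the first bullet, here is a rough sketch of what an embedded-storage interface might look like. The type name `OnceDifferentiable`, the field layout, and the constructor are illustrative assumptions, not the final design:

```julia
# Hypothetical sketch: gradient storage lives inside the differentiable
# object, so solvers call value(df, x) / gradient!(df, x) instead of
# threading a storage array through every call.
mutable struct OnceDifferentiable{T}
    f               # objective f(x)
    g!              # in-place gradient g!(x, storage)
    f_x::T          # last objective value
    g_x::Vector{T}  # embedded gradient storage
    f_calls::Int
    g_calls::Int
end

function value(df::OnceDifferentiable, x)
    df.f_calls += 1
    df.f_x = df.f(x)
    return df.f_x
end

function gradient!(df::OnceDifferentiable, x)
    df.g_calls += 1
    df.g!(x, df.g_x)
    return df.g_x
end

rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
function rosenbrock_g!(x, storage)
    storage[1] = -2.0 * (1.0 - x[1]) - 400.0 * x[1] * (x[2] - x[1]^2)
    storage[2] = 200.0 * (x[2] - x[1]^2)
    return storage
end

df = OnceDifferentiable(rosenbrock, rosenbrock_g!, 0.0, zeros(2), 0, 0)
value(df, [1.0, 1.0])      # 0.0 at the minimizer; df.f_calls is now 1
gradient!(df, [0.0, 0.0])  # fills df.g_x with [-2.0, 0.0]
```

A nice side effect is that the call counters live next to the storage, which makes the f_/g_/h_calls limits bullet straightforward to implement.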


Documentation

  • General QA on the contents
  • Identify weak pages
  • Add references


  • Read through to find money for @timholy so he can take a sabbatical from his boring image lab, and work on his fun fun fun constrained optimizer instead! (@anriseth did it for free instead :) )

Solvers (up for grabs if the person is willing to "maintain")

In general I think we might want to delete feature/solver requests and keep a single issue for them, as the requesters are rarely the ones willing to write and maintain the code anyway, so the issues always end up idle.

... and more! I think 2017 is going to be a good year for Optim.


cortner commented Dec 23, 2016

I'd like to add

  • rewrite line searches as types (I realise this is LineSearches.jl, but it can only go hand in hand with Optim.jl); see JuliaNLSolvers/LineSearches.jl#9
  • an intelligent objective that avoids multiple evaluations; #282
  • a benchmark function that, given a small set of objectives, tests a variety of methods maybe with different parameters, and recommends the optimal configuration, which can then be reused for multiple future runs.

pkofod commented Dec 23, 2016

Those are good points.

Dispatch based linesearches
The first one has to go hand in hand with NLsolve as well, I guess, @KristofferC. It would be great to have an interface such that it is easy to call a solver here with a line search that is not in LineSearches.jl, and I think the dispatch-based idea you have in mind does exactly that. This would to some degree separate the work here from the work in LineSearches, to the benefit of both parties.
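A minimal sketch of the dispatch-based idea (the type names and the call signature here are my assumptions, not LineSearches.jl's actual API): each line search is a callable type, so anything implementing the call works with a solver, whether it ships with LineSearches.jl or is defined in user code.

```julia
# Hypothetical sketch: line searches as callable types. A solver only
# sees the abstract type and the call signature, so user-defined line
# searches plug in the same way as library-provided ones.
abstract type AbstractLineSearch end

struct FixedStep <: AbstractLineSearch
    alpha::Float64
end

# ϕ is the one-dimensional function t -> f(x + t*d) along the search direction.
(ls::FixedStep)(ϕ) = ls.alpha

struct Backtracking <: AbstractLineSearch
    rho::Float64  # shrink factor
    c::Float64    # sufficient-decrease constant
end

function (ls::Backtracking)(ϕ)
    t = 1.0
    ϕ0 = ϕ(0.0)
    # Crude finite-difference slope estimate, just for the sketch;
    # a real implementation would receive the directional derivative.
    slope = (ϕ(1e-8) - ϕ0) / 1e-8
    while ϕ(t) > ϕ0 + ls.c * t * slope
        t *= ls.rho
    end
    return t
end

# A toy "solver" step that only knows the abstract interface:
step_length(ls::AbstractLineSearch, ϕ) = ls(ϕ)
```

With this shape, swapping line searches is just passing a different object, e.g. `step_length(FixedStep(0.5), ϕ)` versus `step_length(Backtracking(0.5, 1e-4), ϕ)`.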
Function evaluations
I think the second point relates to the DifferentiableFunction rewrite. We could certainly make it so that value(df, x) only actually evaluated f if x was different from last time. This would solve it, right?

```julia
function value(df, x)
    if is_new(x, df.last_x)
        df.f_calls += 1
        df.f_x = df.f(x)
        copy!(df.last_x, x)
    end
    return df.f_x
end
```
The benchmarking function sounds fun, but it should probably be started outside of Optim. It can also be started as a PR here, but I think it would need some work before we can merge it as an official part of this package.

You are thinking something like recommend(problems), where problems is some collection of solver/starting point/parameter specifications? I guess it could also be based on CUTEst problems and more. That is, some sort of recommender system in a stored state based on the large problem collection; you could then add your own problems, which would maybe carry a relatively large weight and update the parameters of the recommender system. I'm not sure how well it would work in practice, but we could start the discussion over at JuliaML (gitter?).


cortner commented Dec 24, 2016

Function evaluations
I think the second point relates to the DifferentiableFunction rewrite. We could certainly make it so that value(df, x) only actually evaluated f if x was different from last time. This would solve it, right?

Yes. A key advantage is that it would remove the need for passing function values into line searches and other things.


ChrisRackauckas commented Feb 9, 2017

I would like to add:

  • Less strict typing of inputs.

For the native Julia methods, there's no reason to restrict to Array{T,N}. AbstractArrays, which by definition should have a linear index, seem like they would work fine in the methods that I checked. You wouldn't be able to do a type-check until v0.6 since it would require triangular dispatch. But I would really like to start optimizing some crazy things like MultiScaleArrays, and it seems the only thing blocking it is the type-checks requiring it to be a traditional array. Note that the same change should make it compatible with things like SharedArrays, which would be an early step to parallelism.
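As a sketch of the kind of loosening meant here (the function names are hypothetical, not Optim's internals), the change is essentially just the annotation in the method signature:

```julia
# Hypothetical before/after: restricting to Array{T} excludes SharedArrays,
# MultiScaleArrays, views, etc.; AbstractArray accepts anything array-like.

# Before: only plain Arrays are accepted.
gradient_norm_strict(g::Array{T}) where {T<:Real} = sqrt(sum(abs2, g))

# After: any AbstractArray works, including SubArrays and SharedArrays.
gradient_norm(g::AbstractArray{T}) where {T<:Real} = sqrt(sum(abs2, g))

v = view([3.0, 4.0, 0.0], 1:2)  # a SubArray, not an Array
gradient_norm(v)                 # 5.0
# gradient_norm_strict(v)        # MethodError: SubArray is not an Array
```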


timholy commented Aug 23, 2018

Congrats!! 🎆


brilhana commented Aug 25, 2018

I will attempt to work on #272. I'll look at the C implementation in NLopt.
