
optimize: curve-fitting - implement Levenberg-Marquardt algorithm (damped least-squares) #295

Open
fawick opened this issue Nov 9, 2017 · 13 comments


@fawick
Contributor

fawick commented Nov 9, 2017

(This is basically a dupe of gonum/optimize#174, but as that repo is deprecated, I'd like to track progress here.)

I'd love to see Levenberg-Marquardt implemented in optimize.

In the meantime I have used Nelder-Mead with acceptable results for my use cases so far, but I always needed to tune it to avoid getting stuck in local optima.

@btracey
Member

btracey commented Nov 9, 2017

In the meantime, you could also use LBFGS or Newton. They both take advantage of gradient information and so should be much faster.
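For readers following along, here is a minimal sketch (not from the thread; the toy objective is made up) of what switching from Nelder-Mead to a gradient-based method looks like with the optimize API:

```go
package main

import (
	"fmt"
	"log"

	"gonum.org/v1/gonum/optimize"
)

func main() {
	// Toy objective f(x) = (x0-1)^2 + 10*(x1+2)^2 and its gradient.
	p := optimize.Problem{
		Func: func(x []float64) float64 {
			return (x[0]-1)*(x[0]-1) + 10*(x[1]+2)*(x[1]+2)
		},
		Grad: func(grad, x []float64) {
			grad[0] = 2 * (x[0] - 1)
			grad[1] = 20 * (x[1] + 2)
		},
	}
	x0 := []float64{0, 0}

	// Same Problem, two methods: gradient-free Nelder-Mead versus LBFGS,
	// which uses the gradient and typically needs far fewer evaluations.
	for _, m := range []optimize.Method{&optimize.NelderMead{}, &optimize.LBFGS{}} {
		r, err := optimize.Minimize(p, x0, nil, m)
		if err != nil {
			log.Fatal(err)
		}
		fmt.Printf("%T: x = %v, func evals = %d\n", m, r.X, r.FuncEvaluations)
	}
}
```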

@cnbuff410

As someone with no background in this area who has been tasked with reproducing the behavior of scipy.optimize.curve_fit in Go, is there a suggested method for getting results closest to the Levenberg-Marquardt algorithm?

@btracey mentioned trying LBFGS or Newton, and it seems there are other algorithms available in the optimize package. Are there other algorithms I should try to see which one gives results closest to LM, or are LBFGS/Newton enough?

@vladimir-ch
Member

The objective (cost) function for fitting a curve to a series of data points will most likely have a special, sum-of-squares form. Any of the mentioned methods will be able to minimize such functions, but Levenberg-Marquardt, if we had it, could take advantage of its special structure. Apart from that, all these methods (including LM) are local, meaning that they will converge from a given starting point to only a local minimum.

Without LM, I would start with BFGS, LBFGS, or CG (depending on the problem size). They need only the function itself and its gradient. Newton is also possible, but it additionally needs the Hessian (matrix of second derivatives) of the objective function.
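Not part of the original thread, but to make the above concrete: a minimal sketch of treating a curve fit as a sum-of-squares minimization with the optimize API. The exponential model, data, and starting point below are made-up assumptions for illustration.

```go
package main

import (
	"fmt"
	"log"
	"math"

	"gonum.org/v1/gonum/optimize"
)

func main() {
	// Made-up data to be fit with the model y = a*exp(b*t); parameters x = [a, b].
	ts := []float64{0, 1, 2, 3, 4}
	ys := []float64{1.1, 1.9, 3.1, 5.2, 8.9}

	p := optimize.Problem{
		// Sum-of-squares objective: sum_i (a*exp(b*t_i) - y_i)^2.
		Func: func(x []float64) float64 {
			a, b := x[0], x[1]
			var sum float64
			for i, t := range ts {
				r := a*math.Exp(b*t) - ys[i]
				sum += r * r
			}
			return sum
		},
		// Analytic gradient of the objective with respect to a and b.
		Grad: func(grad, x []float64) {
			a, b := x[0], x[1]
			grad[0], grad[1] = 0, 0
			for i, t := range ts {
				e := math.Exp(b * t)
				r := a*e - ys[i]
				grad[0] += 2 * r * e
				grad[1] += 2 * r * a * t * e
			}
		},
	}

	// All of these methods are local, so the starting point matters.
	x0 := []float64{1, 0.5}
	res, err := optimize.Minimize(p, x0, nil, &optimize.LBFGS{})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("a = %.3f, b = %.3f, sum of squares = %.4f\n", res.X[0], res.X[1], res.F)
}
```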

@maorshutman

Is an implementation of LM in pure Go still relevant? If so, I would like to start on this project.

@sbinet
Member

sbinet commented Mar 8, 2019

Yes, please! :)

@btracey
Member

btracey commented Mar 8, 2019

I'm not sure it fits into the current optimize package, because it requires the objective to take a specific functional form. I'm not sure whether it should be in its own sub-package or grouped with some other things. For instance, I have a bunch of SGD step sizers that could nicely go somewhere.

@maorshutman

I implemented the LM method described in "Methods for Non-Linear Least Squares Problems", 2nd edition, 2004 (Madsen & Nielsen). I followed the SGD implementation of @btracey. See lm (github.com/maorshutman/lm). It currently has 4 tests.
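For readers unfamiliar with the method, here is a minimal sketch (not the code from that repository) of the damped linear solve at the core of the Madsen & Nielsen algorithm, using gonum/mat; the gain-ratio update of the damping parameter and the stopping tests are left out.

```go
package main

import (
	"fmt"

	"gonum.org/v1/gonum/mat"
)

// lmStep solves the damped normal equations (JᵀJ + mu*I) h = -Jᵀr for a
// Levenberg-Marquardt step h, given the Jacobian J of the residual vector r
// at the current parameters and the damping parameter mu.
func lmStep(jac *mat.Dense, r *mat.VecDense, mu float64) (*mat.VecDense, error) {
	_, n := jac.Dims()

	// A = JᵀJ + mu*I.
	var a mat.Dense
	a.Mul(jac.T(), jac)
	for i := 0; i < n; i++ {
		a.Set(i, i, a.At(i, i)+mu)
	}

	// g = -Jᵀr, the negative gradient of ½‖r‖².
	var negR mat.VecDense
	negR.ScaleVec(-1, r)
	var g mat.VecDense
	g.MulVec(jac.T(), &negR)

	// Solve A h = g for the step.
	var h mat.VecDense
	if err := h.SolveVec(&a, &g); err != nil {
		return nil, err
	}
	return &h, nil
}

func main() {
	// Toy problem: fit y = a*t + b to the points (1, 1) and (2, 2),
	// evaluated at a = b = 0, so r = model - data = [-1, -2] and the
	// Jacobian rows are [t_i, 1].
	jac := mat.NewDense(2, 2, []float64{
		1, 1,
		2, 1,
	})
	r := mat.NewVecDense(2, []float64{-1, -2})

	h, err := lmStep(jac, r, 1e-3)
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println("step h =", mat.Formatted(h.T()))
}
```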

@sbinet
Member

sbinet commented Mar 14, 2019

Nice!

@fawick
Contributor Author

fawick commented Mar 14, 2019

Thank you @maorshutman!

@kegsay

kegsay commented Jun 13, 2021

Will https://github.com/maorshutman/lm be merged into gonum at some point? It sounds like the only reason not to is not knowing where to place it, but presumably that's solvable?

@sbinet
Member

sbinet commented Jun 14, 2021

this would need a "champion" to spearhead that work.

as for the proper home for that LM curve fitting procedure, there are (I think) 3 ways to paint that shed:

  • in a new package lm sibling of optimize
  • in a new package optimize/lm
  • integrated inside optimize

there would probably be a case to be made for somehow reusing the optimize machinery for lm.LM

(also the license of that lm work would need to be clarified before being integrated into Gonum)

@vladimir-ch
Member

I haven't forgotten that gonum/exp#35 exists. It's the lack of time, not of interest, that blocks me. I'll get to it, eventually.

@sbinet
Member

sbinet commented Jun 18, 2021

I obviously did forget about that one :)
