The role of Levenberg-Marquardt #207
Comments
It could also make sense to put it in `NLsolve`. They are the exact same algorithms.
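The overlap this comment alludes to can be made concrete: for a square, invertible Jacobian, the Newton step for the root-finding problem F(x) = 0 coincides with the Gauss-Newton step for the least squares problem min ½‖F(x)‖². A minimal one-dimensional sketch (plain Python for brevity, not code from either package):

```python
# Hypothetical sketch: for F(x) = x^2 - 2, the Newton step for F(x) = 0
# and the Gauss-Newton step for min 0.5*F(x)^2 are identical whenever
# the (here 1x1) Jacobian is invertible.
F = lambda x: x**2 - 2.0
J = lambda x: 2.0 * x                    # derivative / 1x1 Jacobian

x = 1.0
for _ in range(6):
    newton_step = F(x) / J(x)                          # J^{-1} F
    gauss_newton_step = (J(x) * F(x)) / (J(x) * J(x))  # (J'J)^{-1} J'F
    assert abs(newton_step - gauss_newton_step) < 1e-12
    x -= newton_step

print(x)  # converges toward sqrt(2)
```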
From a certain perspective, it would make sense to have least squares algorithms in

It is also true that the Levenberg-Marquardt algorithm has optimal convergence properties whenever

Also, now that PR #196 is merged, the documentation could mention this algorithm, and indeed have a section on least squares algorithms. Perhaps someone would then contribute other ones, e.g. the hybrid algorithm of Dennis et al. (1981).
Thank you for joining the discussion @tpapp. It is not that I am unaware that it is a solver, albeit a very specialized one. It is more a question of whether it fits in Optim, and that of course depends on what Optim is supposed to be. Personally, I do not think it should contain methods as specialized as LM, but others might disagree.

The fact that you mention other algorithms for least squares is exactly why I think a dedicated least squares package (there is even one out there already!) is more appropriate. There you can have a common interface for various least squares solvers. If we start adding more here, then we have to maintain one interface for "common optimization" and another one specifically for least squares.

Now, I am very happy to have had @bjarthur being active here in 2016, I just know I am not going to prioritize these methods myself. I did not add documentation for two reasons: 1) honestly, I don't know too many details about the solver, and 2) I had this issue in mind. However, if someone wants to add documentation, I would welcome a PR with open arms.
One reason for including least squares routines in Optim (besides the fact that they are optimization routines for specialized problems) is sharing code for various common algorithms. E.g. trust region and damped methods have a similar general setup and require solving similar subproblems, Newton-type algorithms involve line search in both cases, etc. Perhaps they are not written to share this functionality at the moment, but IMO it would be great if they did whenever applicable.

OTOH, I fully understand that having least squares would involve yet another interface. Perhaps a wrapper similar to

Regarding LM: in current master it does not have a method for
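The shared "damped" subproblem mentioned above can be sketched in a few lines: a Levenberg-Marquardt step solves (JᵀJ + λI)δ = -Jᵀr, which reduces to a Gauss-Newton step when λ = 0 and shrinks toward a small gradient-descent step as λ grows. A hypothetical one-parameter illustration in plain Python (not code from Optim); the data, model, and update schedule for λ are all invented for the example:

```python
# Hypothetical LM sketch: fit y = exp(a*x) by solving the damped
# subproblem (J'J + lambda) * step = -J'r, which is scalar here.
import math

xs = [0.0, 1.0, 2.0, 3.0]
ys = [math.exp(0.5 * x) for x in xs]    # synthetic data, true a = 0.5

def residuals(a):
    return [math.exp(a * x) - y for x, y in zip(xs, ys)]

def jacobian(a):                         # d r_i / d a
    return [x * math.exp(a * x) for x in xs]

a, lam = 0.0, 1e-3
for _ in range(50):
    r, Jv = residuals(a), jacobian(a)
    JtJ = sum(j * j for j in Jv)
    Jtr = sum(j * ri for j, ri in zip(Jv, r))
    step = -Jtr / (JtJ + lam)            # the damped (LM) subproblem
    if sum(ri**2 for ri in residuals(a + step)) < sum(ri**2 for ri in r):
        a += step
        lam *= 0.5                       # success: trust the model more
    else:
        lam *= 10.0                      # failure: damp harder

print(a)  # ≈ 0.5
```

A trust region method solves essentially the same subproblem with the damping chosen implicitly from a step-length constraint, which is why the comment argues the setups could share code.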
I personally think this is too detailed an interface for
And there are many more structured objective functions people may want to build specialized algorithms for in the future.
But that doesn't mean LM needs to live in Optim. Why not have it live in a separate repository?

If a centralized repo is wanted, we could create a
We currently have an active PR #196, so I thought this might be relevant to discuss the role and future of LM in Optim.jl.

I find it a bit strange that it is a part of this package. Generally, I think of Optim as a package that takes a Julia function (and gradient and Hessian) and optimizes it (hopefully with better support for constraints in the future). LM does not fit in this framework at all, and I wonder if it should go just as nnls did with the API update. Sure, it takes an `f` function and a `g` function, but there are very special assumptions about these.

It would require a few minutes worth of work to simply copy it line by line into `LsqFit.jl` (mostly by adding `Optim.` where appropriate). Is this something "we" want, though? Is it something `LsqFit.jl` wants?
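The "very special assumptions" about `f` and `g` can be made concrete: a least squares problem is defined by residuals r(x) and a Jacobian J(x), but the generic objective a package like Optim sees is only the derived pair f(x) = ½‖r(x)‖² and g(x) = Jᵀr, so the residual structure LM exploits is lost in that reduction. A hypothetical sketch in plain Python (the model and data are invented for illustration):

```python
# Hypothetical sketch: a least squares problem supplies r(x) and J(x);
# a generic optimizer only ever sees f(x) = 0.5*||r||^2 and g = J'r,
# and cannot form J'J the way Levenberg-Marquardt does.
import math

xs = [0.0, 1.0, 2.0]
ys = [math.exp(0.3 * x) for x in xs]     # synthetic data, true a = 0.3

def r(a):                                 # residual vector
    return [math.exp(a * x) - y for x, y in zip(xs, ys)]

def f(a):                                 # generic objective
    return 0.5 * sum(ri * ri for ri in r(a))

def g(a):                                 # generic gradient, J(a)' r(a)
    return sum(x * math.exp(a * x) * ri for x, ri in zip(xs, r(a)))

# Plain gradient descent works on (f, g) but ignores the structure.
a = 0.0
for _ in range(200):
    a -= 0.05 * g(a)

print(round(a, 4))  # ≈ 0.3
```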