[GSoC] WIP: Bounded LSQ algorithms #5019
Conversation
Why is test_ltisys.py suddenly not passing?
The failing ltisys test is testing that some warnings are emitted. You have […]
Any possibility of using optimize.differential_evolution or other optimize.minimize variants (e.g. L-BFGS) here? Those are widely used for curve fitting.
@andyfaff: least squares has more information available than scalar minimization, so it seems those are out of scope. In principle what's done here has similarities to L-BFGS, but the problem being solved is different, and the L-BFGS Fortran code is not very reusable.
@andyfaff Nikolay benchmarked L-BFGS-B on a series of LSQ problems, see the link in the OP. AFAICT the conclusion is that the specialized algorithms he implemented often outperform the general-purpose ones.
I'm really looking at a way of plugging in the global minimisers here, like […]
At least for basinhopping, these are probably OK as the local optimization step. Some extra plumbing seems to be required, as the LSQ solver needs to get the residual vector rather than a scalar cost.
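A minimal sketch of that plumbing, assuming a made-up model and data for illustration: `basinhopping` is handed a scalar cost, but the local step is delegated to `least_squares` via a custom minimizer callable, so the LSQ solver still sees the residual vector.

```python
import numpy as np
from scipy.optimize import OptimizeResult, basinhopping, least_squares

# Hypothetical toy data: fitting amplitude and frequency of a sine,
# a problem with several local least-squares minima.
t = np.linspace(0, 10, 100)
y = 2.0 * np.sin(1.3 * t)

def residuals(p):
    return p[0] * np.sin(p[1] * t) - y

def scalar_cost(p):
    # basinhopping works on a scalar objective.
    r = residuals(p)
    return 0.5 * r.dot(r)

def lsq_local_min(fun, x0, args=(), **kwargs):
    # The "plumbing": a custom local minimizer that hands the residual
    # vector to least_squares and repackages the answer in the
    # OptimizeResult form that basinhopping expects back from minimize.
    res = least_squares(residuals, x0)
    return OptimizeResult(x=res.x, fun=res.cost,
                          success=res.success, nfev=res.nfev)

out = basinhopping(scalar_cost, x0=[1.0, 0.5], niter=25,
                   minimizer_kwargs={"method": lsq_local_min})
print(out.x, out.fun)
```

The custom-callable `method` hook of `scipy.optimize.minimize` is what lets `minimizer_kwargs` route the local step to a least-squares solver.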
When I first encountered scipy I thought it could be my one-stop shop for curve fitting. Unfortunately, curve_fit and leastsq are missing a lot of functionality: constraints, global minimisation, etc. I use lmfit at the moment, but I'd like to see scipy slowly get the bits that are missing.
Would you list some specific problems? More problems to test the current algorithms on are definitely very welcome --- the selection in Nikolay's sandbox repo could definitely be expanded, and I wouldn't be surprised if he accepts PRs with more benchmarks. Regarding the new minimizers, I am tempted to consider these as an enhancement request. It would be most helpful if you could detail your suggestions in, e.g., this wiki page, or even implement the needed changes to the […]

The main aim of this PR, as far as I understand it, is to provide two new algorithms, dogbox and trf, for least squares with bounds --- as such, they complement and compete with implementations of bounds by transformation of variables, as is done in leastsqbound and lmfit.
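For context, here is a minimal sketch of the transformation-of-variables approach mentioned above (a sine mapping in the style of leastsqbound/lmfit; the exponential model and bounds are invented for illustration): an unbounded internal parameter u is mapped into [lo, hi], and plain `leastsq` runs on u.

```python
import numpy as np
from scipy.optimize import leastsq

def to_bounded(u, lo, hi):
    # Map an unbounded internal variable u into the box [lo, hi].
    return lo + (hi - lo) * (np.sin(u) + 1.0) / 2.0

def residuals(p, t, y):
    # Hypothetical model: amplitude * exp(-rate * t).
    return p[0] * np.exp(-p[1] * t) - y

t = np.linspace(0, 4, 30)
y = 2.5 * np.exp(-1.3 * t)  # noiseless synthetic data

lo = np.array([0.0, 0.0])
hi = np.array([5.0, 3.0])

def internal_residuals(u):
    # leastsq never sees the bounds; it optimizes u freely.
    return residuals(to_bounded(u, lo, hi), t, y)

u0 = np.zeros(2)  # maps to the midpoint of each interval
u_opt, ier = leastsq(internal_residuals, u0)
p_opt = to_bounded(u_opt, lo, hi)
print(p_opt)  # close to [2.5, 1.3]
```

A known drawback of this approach, and part of the motivation for native bound handling in trf/dogbox, is that the transform distorts the problem's scaling near the bounds.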
Yes, the […]
I'd suggest that discussion of the details of the interface be diverted to gh-5020, and that we continue discussing the trf/dogbox algorithms here.
On 6 Jul 2015 1:01 pm, "Evgeni Burovski" notifications@github.com wrote:
Sure. Anything which depends on the starting position would be useful here.
It's a question of finding the global minimum among several local ones, right? It's not directly related to what I'm doing.
As Evgeni asked, I removed all sparse features from _numdiff.py (they weren't used in any way).
An example […]
You'll find with that starting point and […]
In your example, […]
Both methods included in this PR are local, and they can very well converge to a local optimum. That being said, […]
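A toy illustration of that caveat, with a model and data invented for the example: trf (like dogbox) is a local method, so the solution it returns depends on the starting point.

```python
import numpy as np
from scipy.optimize import least_squares

t = np.linspace(0, 10, 50)
y = np.sin(t)  # data generated with frequency p = 1.0

def residuals(p):
    # One-parameter model whose sum of squares has several local minima.
    return np.sin(p[0] * t) - y

results = {}
for x0 in (0.95, 2.5):
    res = least_squares(residuals, [x0], method='trf')
    results[x0] = res
    print(f"start {x0} -> p = {res.x[0]:.3f}, cost = {res.cost:.3g}")
```

Starting near the true frequency recovers it with essentially zero cost, while starting far away gets trapped in a local optimum with a large residual.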
The original idea of separating dense and sparse computations did not work out. Closing this in favor of gh-5044.
Hi!

I finally prepared the code for the PR. It is based on the not yet merged #4884, so the first commit related to least squares is e908f76.

The main (and only public) function is `scipy.optimize.least_squares`, which wraps `scipy.optimize.leastsq` (as `method='lm'`) and two new algorithms I was working on, `'dogbox'` and `'trf'`. The docstring is in reasonable shape, so I encourage you to try `least_squares` on your problems. Here is the link to the benchmarks I did with the algorithms.

The issues at the moment:

- […] `scipy.optimize.minimize`, where general documentation is accompanied by documentation for the specific methods, like http://docs.scipy.org/doc/scipy-dev/reference/optimize.minimize-neldermead.html But currently I don't know how to do that, and I'm not sure it is the best decision in general.
- […] `leastsq`. I think `least_squares` might feel more solid and homogeneous if I write a new wrapper to the MINPACK functions, instead of calling `leastsq`.
- […] `leastsq`. Now it seems OK to wrap it, but I'm working on an extension to sparse Jacobians which will add quite a few additional options, which apply to both of the new algorithms but not at all to `leastsq`. So the tempting solution is to drop `leastsq` (it will still be available, after all) and make a very clean interface (without `options`, additional doc pages, etc.)
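For readers who want to try the function described above, a small example using the `least_squares` signature that eventually landed in scipy (the exponential model and bounds here are made up for illustration):

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(p, t, y):
    # Hypothetical model: amplitude * exp(-rate * t).
    return p[0] * np.exp(-p[1] * t) - y

t = np.linspace(0, 4, 40)
y = 2.0 * np.exp(-0.8 * t)  # noiseless synthetic data

# Box bounds are passed natively; method='trf' is one of the two new
# bounded algorithms this PR adds (the other being 'dogbox').
res = least_squares(residuals, x0=[1.0, 1.0], args=(t, y),
                    bounds=([0.0, 0.0], [10.0, 5.0]), method='trf')
print(res.x)  # close to [2.0, 0.8]
```

The returned object also carries the residual vector (`res.fun`), the cost (`res.cost`), and the Jacobian at the solution (`res.jac`).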