This package solves nonlinear least squares problems. It is inspired by the Ceres library.
To install the package, run

```julia
Pkg.add("LeastSquaresOptim")
```
To find `x` that minimizes `f'(x)f(x)`, construct a `LeastSquaresProblem` object with:

- `x`: an initial set of parameters.
- `f!`: a callable object such that `f!(x, out)` writes `f(x)` in `out`.
- `output_length`: the length of the output vector.
And optionally:

- `g!`: a function such that `g!(x, out)` writes the Jacobian at `x` in `out`. Otherwise, the Jacobian will be computed with the ForwardDiff.jl package.
- `y`: a preallocation for `f`.
- `J`: a preallocation for the Jacobian.
A simple example:

```julia
using LeastSquaresOptim

function rosenbrock_f!(x, fcur)
    fcur[1] = 1 - x[1]
    fcur[2] = 100 * (x[2] - x[1]^2)
end

x = [-1.2; 1.0]
optimize!(LeastSquaresProblem(x = x, f! = rosenbrock_f!, output_length = 2))
```
The Jacobian can be supplied explicitly (note that `∂f₂/∂x₂ = 100`):

```julia
function rosenbrock_g!(x, J)
    J[1, 1] = -1
    J[1, 2] = 0
    J[2, 1] = -200 * x[1]
    J[2, 2] = 100
end

optimize!(LeastSquaresProblem(x = x, f! = rosenbrock_f!, g! = rosenbrock_g!, output_length = 2))
```
The main `optimize!` method accepts two further arguments: `optimizer` and `solver`.

- Choose an optimization method:
    - `LeastSquaresOptim.LevenbergMarquardt()`
    - `LeastSquaresOptim.Dogleg()`
- Choose a least squares solver (a least squares optimization method proceeds by successively solving linear least squares problems `min ||Ax - b||^2`):
    - `LeastSquaresOptim.QR()`. Available for dense Jacobians.
    - `LeastSquaresOptim.Cholesky()`. Available for dense and sparse Jacobians. For sparse Jacobians, a symbolic factorization is computed at the first iteration from SuiteSparse and numerically updated at each iteration.
    - `LeastSquaresOptim.LSMR()`. A conjugate gradient method (LSMR with diagonal preconditioner). The Jacobian can be of any type that defines the following interface:
        - `A_mul_B!(α::Number, A, x, β::Number, y)` updates `y` to `αAx + βy`.
        - `Ac_mul_B!(α::Number, A, y, β::Number, x)` updates `x` to `αA'y + βx`.
        - `colsumabs2!(x, A)` updates `x` to the sum of squared elements of each column of `A`.
        - `size(A, d)` returns the nominal dimension along the `d`th axis in the equivalent matrix representation of `A`.
        - `eltype(A)` returns the element type implicit in the equivalent matrix representation of `A`.
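To make the interface concrete, here is a minimal sketch implementing it for a plain dense-matrix wrapper. `DenseOperator` and the method bodies are illustrative assumptions; only the five function names and signatures come from the list above.

```julia
using LinearAlgebra

# Hypothetical wrapper around a dense matrix, implementing the
# operator interface expected by the LSMR solver.
struct DenseOperator{T}
    A::Matrix{T}
end

# y ← α*A*x + β*y
function A_mul_B!(α::Number, op::DenseOperator, x, β::Number, y)
    y .= α .* (op.A * x) .+ β .* y
    return y
end

# x ← α*A'*y + β*x
function Ac_mul_B!(α::Number, op::DenseOperator, y, β::Number, x)
    x .= α .* (op.A' * y) .+ β .* x
    return x
end

# x ← sum of squared elements of each column of A
function colsumabs2!(x, op::DenseOperator)
    for j in eachindex(x)
        x[j] = sum(abs2, view(op.A, :, j))
    end
    return x
end

Base.size(op::DenseOperator, d) = size(op.A, d)
Base.eltype(::DenseOperator{T}) where {T} = T
```

A sparse, matrix-free, or structured operator would implement the same five functions without ever materializing `A`.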
Similarly, `x` or `f(x)` may be custom types. An example of the interface to define can be found in the package SparseFactorModels.jl.

For the `LSMR` solver, you can optionally specify a function `preconditioner!` and a matrix `P` such that `preconditioner!(x, J, P)` updates `P` as a preconditioner for `J'J` in the case of the Dogleg optimization method, and such that `preconditioner!(x, J, λ, P)` updates `P` as a preconditioner for `J'J + λ` in the case of the Levenberg-Marquardt optimization method. By default, the preconditioner is chosen as the diagonal of the matrix `J'J`. The preconditioner can be any type that supports `A_ldiv_B!(x, P, y)`.
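As an illustration, a diagonal preconditioner matching the Levenberg-Marquardt signature could be sketched as follows. Only the `preconditioner!(x, J, λ, P)` signature comes from the text above; the function body and the choice of storing the result in a dense matrix `P` are assumptions.

```julia
# Hypothetical diagonal preconditioner for the LSMR solver: stores the
# diagonal of J'J + λI in P. The body is illustrative, mirroring the
# package's stated default (the diagonal of J'J).
function preconditioner!(x, J, λ, P)
    for j in 1:size(J, 2)
        # diagonal entry j of J'J is the squared norm of column j of J
        P[j, j] = sum(abs2, view(J, :, j)) + λ
    end
    return P
end
```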
The `optimizer`s and `solver`s are presented in more depth in the Ceres documentation. For dense Jacobians, the default options are `Dogleg()` and `QR()`. For sparse Jacobians, the default options are `LevenbergMarquardt()` and `LSMR()`.
`optimize!` also accepts the options `ftol`, `xtol`, `gr_tol`, `iterations` and `Δ` (initial trust-region radius).
The package is written with large-scale problems in mind. In particular, memory is allocated once and for all at the start of the function call; objects are updated in place at each iteration.
You can even avoid initial allocations by directly passing a `LeastSquaresProblemAllocated` to the `optimize!` function. Such an object bundles a `LeastSquaresProblem` with a few storage objects. This may be useful when repeatedly solving nonlinear least squares problems.
```julia
fcur = zeros(2)      # preallocation for f
J = zeros(2, 2)      # preallocation for the Jacobian
rosenbrock = LeastSquaresProblemAllocated(x, fcur, rosenbrock_f!, J, rosenbrock_g!)
optimize!(rosenbrock, LeastSquaresOptim.Dogleg(), LeastSquaresOptim.QR())
```
Related:

- LsqFit.jl is a higher-level package to fit curves (i.e. models of the form `y = f(x, β)`).
- Optim.jl solves general optimization problems.
- IterativeSolvers.jl includes several iterative solvers for linear least squares.
- NLsolve.jl solves nonlinear equations by least squares minimization.

References:

- Wright, Stephen J. and J. N. Holt. (1985) An Inexact Levenberg-Marquardt Method for Large Sparse Nonlinear Least Squares. The Journal of the Australian Mathematical Society.
- Fong, D. C. and Michael Saunders. (2011) LSMR: An Iterative Algorithm for Sparse Least-Squares Problems. SIAM Journal on Scientific Computing.
- Agarwal, Sameer, Keir Mierle and others. (2010) Ceres Solver.