switch argument order for gradient and hessian evaluations? #156
It would be nice if the input conventions were consistent. Currently ForwardDiff produces:

```julia
g! = ForwardDiff.gradient(f, mutates=true)
g!(storage, x)  # compute gradient at x, writing into storage
```
Meanwhile Optim expects:

```julia
g!(x, storage)
```
I think having the mutated argument come first makes sense as a convention. However, this is a small detail, and changing it will break a lot of code. So maybe it is best to keep things the way they are, but I wanted to bring up the possibility.
+1 from me.
As it is right now you need to do quite a bit of juggling with function arguments to make it work with ForwardDiff; see for example https://github.com/EconForge/NLsolve.jl/blob/master/src/autodiff.jl
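The juggling amounts to a small adapter that swaps the argument order. A minimal sketch, where `fd_g!` is a hypothetical stand-in for the `(storage, x)`-style closure returned by `ForwardDiff.gradient(f, mutates=true)`:

```julia
# Stand-in for ForwardDiff's mutating gradient of f(x) = sum(abs2, x),
# which takes the mutated argument FIRST: fd_g!(storage, x).
fd_g!(storage, x) = (storage .= 2 .* x; storage)

# Adapter matching the (x, storage) convention Optim/NLsolve expect,
# with the mutated argument LAST:
g!(x, storage) = fd_g!(storage, x)

x = [1.0, 2.0, 3.0]
storage = similar(x)
g!(x, storage)
# storage is now [2.0, 4.0, 6.0]
```

The adapter is trivial, but it has to be written (and kept straight) at every boundary between the two conventions, which is the friction this issue is about.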
The problem is that many of the Julia optimization/solver packages are consistent in having the mutated input last, i.e. `g!(x, storage)`,
so maybe it is too disruptive a change...