gradient only accurate to 10 digits? #70
Comments
Hi @GravityAssisted, forward-mode AD avoids the truncation errors of numerical differentiation and is in general accurate. It may be affected by the actual implementation of the evaluated function, as this may cause propagation of machine-precision errors, so that's where I would start. That being said, forward mode has been proven to be backward stable in the sense of Wilkinson, which means that even small perturbations of the original function on the order of machine eps should still yield accurate derivatives. A reference for this theoretical point is perhaps Griewank's recent paper (2014), see here.
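To make the "no truncation error" point concrete, here is a minimal dual-number sketch of forward mode. This is illustrative only, not ForwardDiff's actual implementation: the chain-rule update is applied exactly at every operation, so the only error left is ordinary floating-point rounding.

```julia
# Minimal dual-number sketch of forward-mode AD (illustrative only;
# not ForwardDiff's actual implementation).
struct Dual
    val::Float64   # primal value
    der::Float64   # derivative (tangent) value
end

Base.:+(a::Dual, b::Dual) = Dual(a.val + b.val, a.der + b.der)
Base.:*(a::Dual, b::Dual) = Dual(a.val * b.val, a.der * b.val + a.val * b.der)
Base.sin(a::Dual) = Dual(sin(a.val), cos(a.val) * a.der)

# d/dx [x * sin(x)] at x = -0.5: seed the input with der = 1.
x = Dual(-0.5, 1.0)
y = x * sin(x)
# y.der now equals sin(-0.5) - 0.5*cos(-0.5), with no truncation error.
```

There is no step size to choose and no subtraction of nearby values, which is exactly why forward mode does not suffer the 7-8 digit ceiling of finite differencing.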
@scidom I just did a
That would explain the 10 digits of precision. Roundoff errors can't be that bad on this function. Seems fishy to me.
Just found the error... The code is not even calling the ForwardDiff library! It's calling the built-in Julia gradient function. Very suspicious that it didn't warn me of that when I did "using ForwardDiff". Now, if I do the following I match up to 16 digits.
The module should warn me about this, or this is something that should be in the documentation, as I am sure a lot of people might be using the gradient function directly and accumulating errors needlessly.
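A minimal sketch of the fix being described, i.e. module-qualifying the call so it cannot silently resolve to another `gradient` in scope. The function here is a hypothetical stand-in, since the original isn't quoted in the thread, and the API shown is current ForwardDiff rather than the Julia-0.4-era interface:

```julia
using ForwardDiff

# Hypothetical stand-in for the original function (not shown in the thread).
f(x) = sin(x[1]) * exp(-x[1]^2)

kv = [-0.5]

# An unqualified `gradient` may resolve to a different definition in scope
# (on Julia 0.4 it hit Base's finite-difference gradient), so always
# qualify the call with the module name:
g = ForwardDiff.gradient(f, kv)
```

Because ForwardDiff does not export `gradient`, the qualified form is the only one guaranteed to reach the AD implementation.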
erm: http://www.juliadiff.org/ForwardDiff.jl/perf_diff.html#gradients
Great, good to see you spotted the error @GravityAssisted; since the warning is already documented, as @KristofferC mentioned, I suppose we can close this issue.
@KristofferC , @scidom thanks, my bad. I should have read the documentation more carefully. |
Glad to see that you got it sorted @GravityAssisted; if I'm not mistaken, @jrevels has updated the examples to use
Yup. It's a very easy mistake to make (I still make it myself sometimes when doing quick hacks in the REPL). I've taken to always using
Here is a function I like to calculate a gradient of:
Its actual derivative at `kv = [-0.5]` is `-2.8185256628482382`, but using the function `gradient` I get an answer which is only 10 digits accurate. Using the complex-step derivative I get the full 16 digits of accuracy.
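The complex-step derivative mentioned here can be sketched as follows. The test function is a placeholder (the original function isn't quoted in the thread); the technique itself is standard:

```julia
# Complex-step differentiation: f'(x) ≈ imag(f(x + im*h)) / h.
# No subtraction of nearby values means no cancellation error, so with
# a tiny step the result is accurate to machine precision.
complex_step(f, x; h = 1e-20) = imag(f(x + im * h)) / h

# Placeholder test function (the original isn't shown in the thread):
f(x) = exp(x) / sqrt(sin(x)^3 + cos(x)^3)

d = complex_step(f, -0.5)
```

The catch is that `f` must be analytic and evaluable on complex inputs, which is why forward-mode AD (which imposes no such restriction and needs no step size at all) achieves the same machine-precision accuracy more generally.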
I was under the impression that ForwardDiff was accurate to machine precision; is that wrong?
I am on Julia 0.4, MacOSX, LLVM 3.3
thanks,
Nitin