Add function that uses automatic differentiation to calculate derivatives #10624
Comments
@MichaelWedel (2014-07-01T15:23:28):
@MichaelWedel (2014-07-03T06:47:00):
@MichaelWedel (2014-07-03T06:48:41):
@MichaelWedel (2014-07-03T06:52:21): Note that I will remove the testing algorithm and function before the ticket is finished.
@MichaelWedel (2014-07-03T11:24:51): Added doxygen comments and some general documentation. Cleaned up the interface of AutoDiffJacobianAdapter and added error handling.
@MichaelWedel (2014-07-03T12:51:48):
@MichaelWedel (2014-07-03T14:08:48):
@MichaelWedel (2014-07-11T09:17:02): '''Linux'''
'''Windows'''
'''Mac OS X'''
@MichaelWedel (2014-07-24T12:04:00): '''Windows'''
'''General'''
'''What's left to do'''
@MichaelWedel (2014-07-25T08:02:08):
@MichaelWedel (2014-07-25T08:20:12): Only Mac OS X left now.
@MichaelWedel (2014-07-31T08:43:24): The same as GaussianAutoDiff, for comparison.
@MichaelWedel (2014-07-31T08:43:24): Roman's changes to make a comparison possible between numerical and auto diff.
@MichaelWedel (2014-07-31T08:43:24):
@MichaelWedel (2014-08-13T07:09:32):
@MichaelWedel (2014-08-13T07:17:13):
@MichaelWedel (2014-08-13T08:03:51): Performance tests conducted by Roman and me showed that, in contrast to my expectation, automatic differentiation is slightly slower for the cases we tested, even with a fair number of parameters and spectra to fit. I also tested different approaches to integrating adept into the function fitting framework, but none of them led to significant speedups. Compiling adept with a larger initial stack size helped a bit, since it does not need to grow the stack so often anymore (growing the stack involves re-allocation of memory and is rather expensive).

Another argument generally made in favor of automatic differentiation is the better precision of the resulting derivatives, so I wanted to test this as well. I modified AutoDiffTestAlg to also include a function with hand-coded derivatives and to make it more suitable for systematic testing. The algorithm expects a MatrixWorkspace with 4 spectra, which contain the following data:
It produces an equivalent workspace with the differences (reference - calculated) from calls to IFunction::function and IFunction::functionDeriv, using the specified method (num, adept, handcoded). In attachment:Gaussian_y_J_053.nxs there are data for a Gaussian (Centre: 0.0, Height: 4.0, Sigma: 3.5) on the range [-5, 5] (produced with the Matlab Symbolic Toolbox). Assuming the file is located in the data search path, the following script generates workspaces for all three methods:
For hand-coded and automatic derivatives the differences are all in the range [-5e-16, 5e-16], but not identical. I suspect differences on that level are not significant given floating point precision (the values are on the order of 1e0 or 1e-1; with 15-17 significant digits I would classify these differences as "tolerable").

In the numerical case all three derivatives behave differently. For df/d(centre) the differences to the correct value are largest, with values on the interval [-0.03, 0.03], where the derivatives themselves are on a scale of roughly [-0.8, 0.8]. df/d(height) looks reasonable, with differences of [-5e-14, 5e-14] in a distribution that seems more or less random. The differences for df/d(sigma) are on the interval [0, 7e-4], with derivative values in [0, 0.9]. Here the differences look very systematic; the difference curve looks very similar to the derivative itself, just three orders of magnitude smaller. With the script above and the attached data these results should be reproducible.

Overall, the hand-coded and automatic derivatives are comparable and do not show significant deviations from the reference results. Numeric derivatives show differences which can become large in some cases. I know that this is expected, because of truncation and rounding of floating point numbers, and that it depends on many factors such as the parameter value range, the function value range, and possibly other factors that I am forgetting now. I tried to choose "reasonable" example parameter ranges, but I am willing to try some more function/parameter combinations as well.
@MichaelWedel (2014-08-13T12:55:29): The algorithm now takes function parameters (for calculating values) and measures performance of derivative calculation.
@MichaelWedel (2014-08-13T12:55:29): I was not aware how incredibly slow pow() is for squaring a value.
@MichaelWedel (2014-08-13T13:07:41):
@MichaelWedel (2014-08-13T14:37:13): They all live in a namespace called "Lorentzians".
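The accuracy gap between numerical and hand-coded derivatives described above can be reproduced outside Mantid. Below is a small self-contained sketch (plain Python, not the actual AutoDiffTestAlg; function and step names are mine) comparing a central finite difference against the hand-coded df/d(centre) of the same Gaussian (Centre: 0.0, Height: 4.0, Sigma: 3.5) on [-5, 5]:

```python
import math

def gaussian(x, centre, height, sigma):
    return height * math.exp(-(x - centre) ** 2 / (2.0 * sigma ** 2))

def ddcentre(x, centre, height, sigma):
    # hand-coded partial derivative with respect to the centre parameter
    return gaussian(x, centre, height, sigma) * (x - centre) / sigma ** 2

def ddcentre_numeric(x, centre, height, sigma, step=1e-7):
    # central finite difference in the centre parameter
    up = gaussian(x, centre + step, height, sigma)
    down = gaussian(x, centre - step, height, sigma)
    return (up - down) / (2.0 * step)

# absolute differences between the two methods on a grid over [-5, 5]
errs = [abs(ddcentre(x, 0.0, 4.0, 3.5) - ddcentre_numeric(x, 0.0, 4.0, 3.5))
        for x in [i * 0.1 - 5.0 for i in range(101)]]
# the finite-difference error sits many orders of magnitude above
# machine precision, illustrating the effect observed in the thread
```

The worst-case error of the central difference here is dominated by rounding in the subtraction `up - down`, which is the same truncation/rounding trade-off mentioned above; the exact magnitudes in the thread also depend on parameter and value ranges.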
@MichaelWedel (2014-08-13T14:37:13):
@MichaelWedel (2014-08-13T14:44:54):
@MichaelWedel (2014-08-18T08:52:03):
@MichaelWedel (2014-08-18T09:11:19): With the attachment data, you can do:
The derivatives are not as precise as the ones for Gaussian and Lorentzian, not even the hand-coded ones. I guess this has something to do with the implementation of the mathematical functions that are used and, in the end, with floating point accuracy.
This is a summary of the performance (on my Ubuntu machine); the order is always Hand-coded : Numerical : Adept.
'''Gaussian'''
'''Lorentzian'''
'''Pearson-VII'''
Based on these findings I think it's really worthwhile to include this feature in Mantid.
@MichaelWedel (2014-10-31T15:49:40):
@MichaelWedel (2014-10-31T16:26:28): In Mantid::CurveFitting there is a test algorithm, AutoDiffTestAlg, which can be invoked by this script:
It records timing information and generates output with accuracy information for the partial derivatives. To use it you need to download the attached data files.
@MichaelWedel (2014-12-05T12:14:43): Attached data files:
http://trac.mantidproject.org/mantid/raw-attachment/ticket/9782/WorkspaceData.nxs
http://trac.mantidproject.org/mantid/raw-attachment/ticket/9782/6904_peaks_table_fortran.nxs
http://trac.mantidproject.org/mantid/raw-attachment/ticket/9782/6904_calc_peaks1-4_fortran.nxs
http://trac.mantidproject.org/mantid/raw-attachment/ticket/9782/Gaussian_y_J_053.nxs
http://trac.mantidproject.org/mantid/raw-attachment/ticket/9782/Lorentzian_y_J_d.nxs
http://trac.mantidproject.org/mantid/raw-attachment/ticket/9782/PearsonVII_y_J_d.nxs
http://trac.mantidproject.org/mantid/raw-attachment/ticket/9782/Gaussian_y_J_d.nxs
This issue was originally TRAC 9782
As described in the following document:
https://github.com/mantidproject/documents/blob/master/Design/Autodiff.md
I would like to have a function interface that uses automatic differentiation instead of numerical differentiation, utilizing "adept":
http://www.met.rdg.ac.uk/clouds/adept/
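Adept itself is a C++ operator-overloading library working in reverse mode; it is not sketched here. As an illustration of the underlying idea only, the following minimal Python "dual number" shows forward-mode automatic differentiation: a value is carried together with its derivative, so derivatives are exact to rounding rather than approximated by finite differences. All class and function names below are mine, not Mantid's or adept's:

```python
import math

class Dual:
    """A value together with its derivative w.r.t. one seeded input."""
    def __init__(self, val, dot=0.0):
        self.val = val   # function value
        self.dot = dot   # derivative
    def __sub__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val - other.val, self.dot - other.dot)
    def __rsub__(self, other):
        return Dual(other) - self
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)  # product rule
    __rmul__ = __mul__
    def __truediv__(self, scalar):  # dividing by a plain number suffices here
        return Dual(self.val / scalar, self.dot / scalar)
    def __neg__(self):
        return Dual(-self.val, -self.dot)

def dexp(d):
    e = math.exp(d.val)
    return Dual(e, e * d.dot)       # chain rule for exp

def gaussian(x, centre, height, sigma):
    return height * dexp(-((x - centre) * (x - centre)) / (2.0 * sigma * sigma))

# df/d(centre) at x = 1.3 for Centre 0.0, Height 4.0, Sigma 3.5:
g = gaussian(1.3, Dual(0.0, 1.0), 4.0, 3.5)   # seed d(centre)/d(centre) = 1
analytic = 4.0 * math.exp(-1.3**2 / (2 * 3.5**2)) * 1.3 / 3.5**2
# g.dot agrees with the analytic derivative to machine precision
```

Reverse mode, as used by adept, records the operations on a stack and propagates adjoints backwards instead, which is what makes the stack size and its re-allocation behavior relevant for performance.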
Since this new function requires adept as a third party library (and two or three very small source code changes to one of its files in order to make it compile on Windows), I would appreciate some advice on the best way to integrate this library.
I will use this branch to keep track of my local work; the code will not compile before the "library integration issue" is solved.
If you would like to try it anyway, I can explain how to build and install adept on Linux (a target to build a shared object file has to be added to the makefile, since by default it only builds a static library) and/or supply the changes required to build on Windows.