Numdifftools

Numdifftools is a suite of tools written in Python to solve automatic numerical differentiation problems in one or more variables. Finite differences are used in an adaptive manner, coupled with a Richardson extrapolation methodology, to provide a maximally accurate result. The user can configure many options, such as the order of the method or of the extrapolation, and can specify whether complex-step, central, forward or backward differences are used.
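
For example, the difference method and the order of the approximation can be chosen when a differentiator is constructed. A minimal sketch, assuming the method and order keywords of Derivative behave as documented:

>>> import numpy as np
>>> import numdifftools as nd
>>> d_sin = nd.Derivative(np.sin, method='complex')  # complex-step differences
>>> np.allclose(d_sin(0.0), 1.0)
True
>>> d_sin4 = nd.Derivative(np.sin, method='central', order=4)  # 4th order central
>>> np.allclose(d_sin4(np.pi / 4), np.cos(np.pi / 4))
True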

The methods provided are:

  • Derivative: Compute the derivatives of order 1 through 10 of any scalar function.
  • directionaldiff: Compute the directional derivative of a function of n variables.
  • Gradient: Compute the gradient vector of a scalar function of one or more variables.
  • Jacobian: Compute the Jacobian matrix of a vector-valued function of one or more variables.
  • Hessian: Compute the Hessian matrix of all second partial derivatives of a scalar function of one or more variables (see the sketch after this list).
  • Hessdiag: Compute only the diagonal elements of the Hessian matrix.
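
As a quick illustration of Hessian, here is a minimal sketch on a Rosenbrock-type function, whose exact Hessian at (1, 1) is [[842, -420], [-420, 210]]:

>>> import numpy as np
>>> import numdifftools as nd
>>> rosen = lambda x: (1 - x[0])**2 + 105. * (x[1] - x[0]**2)**2
>>> Hfun = nd.Hessian(rosen)
>>> np.allclose(Hfun([1, 1]), [[842., -420.], [-420., 210.]])
True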

All of these methods also produce error estimates on the result.
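
For example, the error estimate can be inspected by passing full_output=True to the constructor; a sketch, assuming the returned info object carries an error_estimate attribute as documented:

>>> import numpy as np
>>> import numdifftools as nd
>>> fd = nd.Derivative(np.exp, full_output=True)
>>> val, info = fd(1)
>>> np.allclose(val, np.exp(1))
True
>>> bool(np.all(info.error_estimate < 1e-6))
True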

Numdifftools also provides an easy-to-use interface to derivatives calculated with AlgoPy. AlgoPy stands for Algorithmic Differentiation in Python. Its purpose is the evaluation of higher-order derivatives in the forward and reverse modes of algorithmic differentiation (AD) of functions that are implemented as Python programs.

Getting Started

Visualize high-order derivatives of the tanh function:

>>> import numpy as np
>>> import numdifftools as nd
>>> import matplotlib.pyplot as plt
>>> x = np.linspace(-2, 2, 100)
>>> for i in range(10):
...    df = nd.Derivative(np.tanh, n=i)   # i-th derivative; n=0 gives tanh itself
...    y = df(x)
...    h = plt.plot(x, y/np.abs(y).max())  # normalize each curve to unit peak
>>> plt.show()
.. image:: https://raw.githubusercontent.com/pbrod/numdifftools/master/examples/fun.png

Compute the 1st and 2nd derivatives of exp(x) at x == 1:

>>> fd = nd.Derivative(np.exp)        # 1st derivative
>>> fdd = nd.Derivative(np.exp, n=2)  # 2nd derivative
>>> np.allclose(fd(1), 2.7182818284590424)
True
>>> np.allclose(fdd(1), 2.7182818284590424)
True

Nonlinear least squares:

>>> xdata = np.reshape(np.arange(0,1,0.1),(-1,1))
>>> ydata = 1+2*np.exp(0.75*xdata)
>>> fun = lambda c: (c[0]+c[1]*np.exp(c[2]*xdata) - ydata)**2
>>> Jfun = nd.Jacobian(fun)
>>> np.allclose(np.abs(Jfun([1,2,0.75])), 0) # should be numerically zero
True
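
The same Jacobian machinery can drive a plain Gauss-Newton refinement of the parameters. This is an illustrative sketch only (not part of numdifftools), working on the residuals directly rather than their squares:

>>> res = lambda c: (c[0] + c[1]*np.exp(c[2]*xdata) - ydata).ravel()
>>> Jres = nd.Jacobian(res)
>>> c = np.array([0.9, 2.1, 0.7])  # start near the true parameters (1, 2, 0.75)
>>> for _ in range(10):
...     J, r = Jres(c), res(c)
...     c = c - np.linalg.solve(np.dot(J.T, J), np.dot(J.T, r))  # Gauss-Newton step
>>> np.allclose(c, [1, 2, 0.75])
True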

Compute gradient of sum(x**2):

>>> fun = lambda x: np.sum(x**2)
>>> dfun = nd.Gradient(fun)
>>> dfun([1,2,3])
array([ 2.,  4.,  6.])
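
Hessdiag returns the diagonal of the Hessian directly; for the same function every diagonal element is 2. A minimal sketch:

>>> ddfun = nd.Hessdiag(fun)
>>> np.allclose(ddfun([1, 2, 3]), [2., 2., 2.])
True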

Compute the same with the easy-to-use interface to AlgoPy:

>>> import numdifftools.nd_algopy as nda
>>> import numpy as np
>>> fd = nda.Derivative(np.exp)        # 1st derivative
>>> fdd = nda.Derivative(np.exp, n=2)  # 2nd derivative
>>> np.allclose(fd(1), 2.7182818284590424)
True
>>> np.allclose(fdd(1), 2.7182818284590424)
True

Nonlinear least squares:

>>> xdata = np.reshape(np.arange(0,1,0.1),(-1,1))
>>> ydata = 1+2*np.exp(0.75*xdata)
>>> fun = lambda c: (c[0]+c[1]*np.exp(c[2]*xdata) - ydata)**2
>>> Jfun = nda.Jacobian(fun, method='reverse')
>>> np.allclose(np.abs(Jfun([1,2,0.75])), 0) # should be numerically zero
True

Compute gradient of sum(x**2):

>>> fun = lambda x: np.sum(x**2)
>>> dfun = nda.Gradient(fun)
>>> dfun([1,2,3])
array([ 2.,  4.,  6.])
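
The AlgoPy interface also exposes a Hessian; a quick sketch, assuming nd_algopy mirrors the numdifftools classes as in the examples above:

>>> Hfun = nda.Hessian(fun)
>>> np.allclose(Hfun([1, 2, 3]), 2*np.eye(3))
True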

See also

scipy.misc.derivative

Documentation and code

Numdifftools works on Python 2.7+ and Python 3.0+.

Official releases are available at: http://pypi.python.org/pypi/numdifftools

Official documentation is available at: http://numdifftools.readthedocs.io/en/latest/

Bleeding edge: https://github.com/pbrod/numdifftools.

Installation

If you have pip installed, then simply type:

$ pip install numdifftools

to get the latest stable version. Using pip also has the advantage that all requirements are installed automatically.

Unit tests

To test if the toolbox is working, paste the following into an interactive Python session:

import numdifftools as nd
nd.test(coverage=True, doctests=True)

Acknowledgement

The numdifftools package for Python was written by Per A. Brodtkorb based on the adaptive numerical differentiation toolbox written in Matlab by John D'Errico [DErrico2006].

As of version 0.9, numdifftools has been extended with some of the functionality found in the statsmodels.tools.numdiff module written by Josef Perktold [Perktold2014].

References

[DErrico2006] D'Errico, J. R. (2006), Adaptive Robust Numerical Differentiation, http://www.mathworks.com/matlabcentral/fileexchange/13490-adaptive-robust-numerical-differentiation
[Perktold2014] Perktold, J. (2014), numdiff package, http://statsmodels.sourceforge.net/0.6.0/_modules/statsmodels/tools/numdiff.html
[Lantoine2010] Lantoine, G. (2010), A methodology for robust optimization of low-thrust trajectories in multi-body environments, PhD thesis, Georgia Institute of Technology.
[LantoineEtal2012] Lantoine, G., Russell, R. P., and Dargent, T. (2012), Using multicomplex variables for automatic computation of high-order derivatives, ACM Transactions on Mathematical Software, Vol. 38, No. 3, Article 16, April 2012, 21 pages, http://doi.acm.org/10.1145/2168773.2168774
[Luna-ElizarrarasEtal2012] Luna-Elizarraras, M. E., Shapiro, M., Struppa, D. C., and Vajiac, A. (2012), Bicomplex numbers and their elementary functions, CUBO A Mathematical Journal, Vol. 14, No. 2, pp. 61-80, June 2012.
[Verheyleweghen2014] Verheyleweghen, A. (2014), Computation of higher-order derivatives using the multi-complex step method, project report, NTNU.

Note

This project has been set up using PyScaffold 3.0. For details and usage information on PyScaffold, see http://pyscaffold.readthedocs.org/.