
Commit

Changed Example and Reference to Examples and References in docstrings to comply with numpydoc-style.
pbrod committed Mar 20, 2018
1 parent e79dfe0 commit a4bf4ba
Showing 10 changed files with 62 additions and 65 deletions.
2 changes: 1 addition & 1 deletion docs/changelog.rst
@@ -1,2 +1,2 @@
.. _changes:
-.. include:: ../CHANGES.rst
+.. include:: ../CHANGELOG.rst
28 changes: 14 additions & 14 deletions src/numdifftools/core.py
@@ -88,8 +88,8 @@ def _assert(cond, msg):
For all methods one should be careful in decreasing the step size too much
due to round-off errors.
%(extra_note)s
-Reference
----------
+References
+----------
Ridout, M.S. (2009) Statistical applications of the complex-step method
of numerical differentiation. The American Statistician, 63, 66-74
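The complex-step method cited here (Ridout, 2009) admits a very short sketch; the test function `np.exp` and step `1e-20` are illustrative choices, not part of this commit:

```python
import numpy as np

def complex_step_derivative(f, x, h=1e-20):
    # f'(x) ~ Im(f(x + i*h)) / h; avoids subtractive round-off entirely,
    # so h can be taken extremely small
    return np.imag(f(x + 1j * h)) / h

d = complex_step_derivative(np.exp, 1.0)
# d is accurate to machine precision: e = 2.71828...
```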
@@ -128,8 +128,8 @@ class Derivative(_Limit):
der : ndarray
array of derivatives
""", example="""
-Example
--------
+Examples
+--------
>>> import numpy as np
>>> import numdifftools as nd
@@ -485,8 +485,8 @@ def directionaldiff(f, x0, vec, **options):
dder: scalar
estimate of the first derivative of f in the specified direction.
-Example
--------
+Examples
+--------
At the global minimizer (1,1) of the Rosenbrock function,
compute the directional derivative in the direction [1 2]
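A directional derivative of this kind can be sketched with a central difference along a unit direction; the standard Rosenbrock function with coefficient 100 is used here (the docstring's variant may use a different coefficient):

```python
import numpy as np

def directional_diff(f, x0, vec, h=1e-6):
    # Central difference of f along the unit vector vec / ||vec||
    v = np.asarray(vec, dtype=float)
    v = v / np.linalg.norm(v)
    x = np.asarray(x0, dtype=float)
    return (f(x + h * v) - f(x - h * v)) / (2 * h)

# standard Rosenbrock, minimized at (1, 1)
rosen = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
d = directional_diff(rosen, [1.0, 1.0], [1, 2])
# d is ~0 at the minimizer, in any direction
```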
@@ -535,8 +535,8 @@ class Jacobian(Derivative):
with the Jacobian of each observation with shape xk x nobs x xk. I.e.,
the Jacobian of the first observation would be [:, 0, :]
""", example="""
-Example
--------
+Examples
+--------
>>> import numdifftools as nd
#(nonlinear least squares)
@@ -683,8 +683,8 @@ class Gradient(Jacobian):
If x0 is an n x m array, then fun is assumed to be a function of n * m
variables.
""", example="""
-Example
--------
+Examples
+--------
>>> import numpy as np
>>> import numdifftools as nd
>>> fun = lambda x: np.sum(x**2)
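For comparison, a bare-bones central-difference gradient of the same test function; `gradient_fd` is a hypothetical helper for illustration, not numdifftools API:

```python
import numpy as np

def gradient_fd(f, x, h=1e-6):
    # Central-difference gradient, one coordinate at a time
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

g = gradient_fd(lambda x: np.sum(x ** 2), [1.0, 2.0, 3.0])
# gradient of sum(x**2) is 2*x, so g ~ [2, 4, 6]
```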
@@ -739,8 +739,8 @@ class Hessdiag(Derivative):
also suffer more from numerical problems. First order methods are usually
not recommended.
""", example="""
-Example
--------
+Examples
+--------
>>> import numpy as np
>>> import numdifftools as nd
>>> fun = lambda x : x[0] + x[1]**2 + x[2]**3
@@ -850,8 +850,8 @@ class Hessian(Hessdiag):
where :math:`e_j` is a vector with element :math:`j` is one and the rest
are zero and :math:`d_j` is a scalar spacing :math:`steps_j`.
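The stepping scheme described by this formula corresponds to the classic central second-difference Hessian; a plain, non-adaptive sketch (illustrative, not numdifftools' actual implementation):

```python
import numpy as np

def hessian_fd(f, x, h=1e-4):
    # H[i, j] ~ (f(x + h*ei + h*ej) - f(x + h*ei - h*ej)
    #            - f(x - h*ei + h*ej) + f(x - h*ei - h*ej)) / (4*h**2)
    x = np.asarray(x, dtype=float)
    n = x.size
    H = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4 * h ** 2)
    return H

H = hessian_fd(lambda x: x[0] ** 2 + 3 * x[0] * x[1] + 2 * x[1] ** 2,
               [1.0, 1.0])
# exact Hessian of this quadratic is [[2, 3], [3, 4]]
```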
""", example="""
-Example
--------
+Examples
+--------
>>> import numpy as np
>>> import numdifftools as nd
16 changes: 8 additions & 8 deletions src/numdifftools/extrapolation.py
@@ -210,8 +210,8 @@ class EpsAlg(object):
This implementation is from [1]_
-Reference
----------
+References
+----------
.. [1] E. J. Weniger (1989)
"Nonlinear sequence transformations for the acceleration of
convergence and the summation of divergent series"
@@ -335,8 +335,8 @@ def dea3(v0, v1, v2, symmetric=False):
convergence. The routine is based on the epsilon algorithm of
P. Wynn, see [1]_.
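The core idea of this kind of three-term acceleration is closely related to Aitken's delta-squared process; a sketch of that simpler transformation (not the library's exact algorithm), applied to partial sums of the alternating harmonic series:

```python
def aitken_delta2(v0, v1, v2):
    # Aitken's delta-squared extrapolation of three successive values
    d1, d2 = v1 - v0, v2 - v1
    denom = d2 - d1
    if denom == 0:
        return v2  # sequence already converged
    return v2 - d2 ** 2 / denom

# partial sums of 1 - 1/2 + 1/3 - ..., which converges slowly to ln(2)
s = [sum((-1) ** (k + 1) / k for k in range(1, n + 1)) for n in (10, 11, 12)]
est = aitken_delta2(*s)
# est is far closer to ln(2) = 0.693147... than the raw partial sums
```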
-Example
--------
+Examples
+--------
# integrate sin(x) from 0 to pi/2
>>> import numpy as np
@@ -356,8 +356,8 @@ def dea3(v0, v1, v2, symmetric=False):
--------
dea
-Reference
----------
+References
+----------
.. [1] C. Brezinski and M. Redivo Zaglia (1991)
"Extrapolation Methods. Theory and Practice", North-Holland.
@@ -407,8 +407,8 @@ class Richardson(object):
we can fit a polynomial to that sequence of approximations.
This is exactly what this class does.
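Richardson extrapolation of this kind can be sketched with the classic step-halving table; a minimal non-adaptive version for a first derivative (illustrative, not the class's implementation):

```python
import numpy as np

def richardson_derivative(f, x, h=0.1, steps=4):
    # Central differences at h, h/2, h/4, ..., then eliminate the
    # O(h^2), O(h^4), ... error terms one sweep at a time
    d = [(f(x + h / 2 ** k) - f(x - h / 2 ** k)) / (2 * h / 2 ** k)
         for k in range(steps)]
    T = np.array(d, dtype=float)
    for m in range(1, steps):
        T = (4 ** m * T[1:] - T[:-1]) / (4 ** m - 1)
    return T[0]

d = richardson_derivative(np.sin, 0.5)
# d agrees with cos(0.5) to ~machine precision despite the coarse h
```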
-Example
--------
+Examples
+--------
>>> import numpy as np
>>> import numdifftools as nd
>>> n = 3
20 changes: 10 additions & 10 deletions src/numdifftools/fornberg.py
@@ -60,8 +60,8 @@ def fd_weights_all(x, x0=0, n=1):
---------
fd_weights
-Reference
----------
+References
+----------
B. Fornberg (1998)
"Calculation of weights_and_points in finite difference formulas",
SIAM Review 40, pp. 685-691.
@@ -107,8 +107,8 @@ def fd_weights(x, x0=0, n=1):
order of derivative. Note for n=0 this can be used to evaluate the
interpolating polynomial itself.
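The weights returned by a routine like this satisfy a Taylor-moment condition; a sketch that solves that condition directly via a Vandermonde system (equivalent in exact arithmetic to Fornberg's recursion, but less numerically robust, and shown only to illustrate what the weights satisfy):

```python
import numpy as np
from math import factorial

def fd_weights_vandermonde(x, x0=0, n=1):
    # Solve sum_i w_i * (x_i - x0)**k / k! = delta(k, n), k = 0..len(x)-1
    x = np.asarray(x, dtype=float) - x0
    m = len(x)
    fac = np.array([factorial(k) for k in range(m)], dtype=float)
    A = np.vander(x, m, increasing=True).T / fac[:, None]
    b = np.zeros(m)
    b[n] = 1.0
    return np.linalg.solve(A, b)

w = fd_weights_vandermonde([-1.0, 0.0, 1.0], n=2)
# w ~ [1, -2, 1]: the classic 3-point second-derivative stencil
```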
-Example
--------
+Examples
+--------
>>> import numpy as np
>>> import numdifftools.fornberg as ndf
>>> x = np.linspace(-1, 1, 5) * 1e-3
@@ -146,8 +146,8 @@ def fd_derivative(fx, x, n=1, m=2):
vector function f(x) using the Fornberg finite difference method.
Restrictions: 0 < n < len(x) and 2*mm+2 <= len(x)
-Example
--------
+Examples
+--------
>>> import numpy as np
>>> import numdifftools.fornberg as ndf
>>> x = np.linspace(-1, 1, 25)
@@ -363,8 +363,8 @@ def taylor(fun, z0=0, n=1, r=0.0061, num_extrap=3, step_ratio=1.6, **kwds):
an answer will still be computed and returned but should be used with
caution.
-Example
--------
+Examples
+--------
Compute the first 6 taylor coefficients 1 / (1 - z) expanded round z0 = 0:
>>> import numdifftools.fornberg as ndf
@@ -519,8 +519,8 @@ def derivative(fun, z0, n=1, **kwds):
an answer will still be computed and returned but should be used with
caution.
-Example
--------
+Examples
+--------
To compute the first five derivatives of 1 / (1 - z) at z = 0:
Compute the first 6 taylor derivatives of 1 / (1 - z) at z0 = 0:
2 changes: 1 addition & 1 deletion src/numdifftools/info.py
@@ -157,7 +157,7 @@
Numdifftools has as of version 0.9 been extended with some of the functionality
found in the statsmodels.tools.numdiff module written by Josef Perktold
-[Perktold2014]_.
+[Perktold2014]_ and in the project report of [Verheyleweghen2014]_.
References
4 changes: 2 additions & 2 deletions src/numdifftools/limits.py
@@ -272,8 +272,8 @@ class Limit(_Limit):
accurate. The `step_ratio` MUST be a scalar larger than 1. A value in the
range [2,100] is recommended. 4 seems a good compromise.
-Example
--------
+Examples
+--------
Compute the limit of sin(x)./x, at x == 0. The limit is 1.
>>> import numpy as np
28 changes: 14 additions & 14 deletions src/numdifftools/nd_algopy.py
@@ -41,8 +41,8 @@
for gradient-based optimization algorithms. Algorithmic differentiation
solves all of these problems.
-Reference
----------
+References
+----------
Sebastian F. Walter and Lutz Lehmann 2013,
"Algorithmic differentiation in Python with AlgoPy",
in Journal of Computational Science, vol 4, no 5, pp 334 - 344,
@@ -164,8 +164,8 @@ class Derivative(_Derivative):
der : ndarray
array of derivatives
""", example="""
-Example
--------
+Examples
+--------
# 1'st and 2'nd derivative of exp(x), at x == 1
>>> import numpy as np
@@ -226,8 +226,8 @@ class Gradient(_Derivative):
grad : array
gradient
""", example="""
-Example
--------
+Examples
+--------
>>> import numdifftools.nd_algopy as nda
>>> fun = lambda x: np.sum(x**2)
>>> df = nda.Gradient(fun, method='reverse')
@@ -285,8 +285,8 @@ class Jacobian(Gradient):
jacob : array
Jacobian
""", example="""
-Example
--------
+Examples
+--------
>>> import numdifftools.nd_algopy as nda
#(nonlinear least squares)
@@ -364,8 +364,8 @@ class Hessian(_Derivative):
hess : ndarray
array of partial second derivatives, Hessian
""", extra_note='', example="""
-Example
--------
+Examples
+--------
>>> import numdifftools.nd_algopy as nda
# Rosenbrock function, minimized at [1,1]
@@ -428,8 +428,8 @@ class Hessdiag(Hessian):
hessdiag : ndarray
Hessian diagonal array of partial second order derivatives.
""", extra_note='', example="""
-Example
--------
+Examples
+--------
>>> import numdifftools.nd_algopy as nda
# Rosenbrock function, minimized at [1,1]
@@ -501,8 +501,8 @@ def directionaldiff(f, x0, vec, **options):
dder: scalar
estimate of the first derivative of fun in the specified direction.
-Example
--------
+Examples
+--------
At the global minimizer (1,1) of the Rosenbrock function,
compute the directional derivative in the direction [1 2]
5 changes: 3 additions & 2 deletions src/numdifftools/nd_scipy.py
@@ -98,8 +98,9 @@ class Gradient(Jacobian):
x * _EPS**(1/3) for method==`central`.
method : {'central', 'complex', 'forward'}
defines the method used in the approximation.
-Example
--------
+Examples
+--------
>>> import numpy as np
>>> import numdifftools.nd_scipy as nd
>>> fun = lambda x: np.sum(x**2)
12 changes: 5 additions & 7 deletions src/numdifftools/nd_statsmodels.py
@@ -33,8 +33,8 @@ class Hessian(_Common):
method : {'central', 'complex', 'forward'}
defines the method used in the approximation.
-Example
--------
+Examples
+--------
>>> import numpy as np
>>> import numdifftools.nd_statsmodels as nd
@@ -149,8 +149,9 @@ class Gradient(Jacobian):
x * _EPS**(1/3) for method==`central`.
method : {'central', 'complex', 'forward'}
defines the method used in the approximation.
-Example
--------
+Examples
+--------
>>> import numpy as np
>>> import numdifftools.nd_statsmodels as nd
>>> fun = lambda x: np.sum(x**2)
@@ -235,9 +236,6 @@ def approx_fprime(x, f, epsilon=None, args=(), kwargs=None, centered=True):
with the Jacobian of each observation with shape xk x nobs x xk. I.e.,
the Jacobian of the first observation would be [:, 0, :]
-Example
--------
"""
kwargs = {} if kwargs is None else kwargs
n = len(x)
10 changes: 4 additions & 6 deletions src/numdifftools/step_generators.py
@@ -75,9 +75,8 @@ class BasicMaxStepGenerator(object):
offset : real scalar, optional, default 0
offset to the base step
-Example
--------
+Examples
+--------
>>> from numdifftools.step_generators import BasicMaxStepGenerator
>>> step_gen = BasicMaxStepGenerator(base_step=2.0, step_ratio=2,
... num_steps=4)
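A plausible reading of this truncated doctest, sketched as a plain generator; `basic_max_step_generator` is a hypothetical stand-in for the class, not numdifftools API:

```python
def basic_max_step_generator(base_step, step_ratio, num_steps, offset=0):
    # Yield steps decreasing geometrically from base_step:
    # base_step * step_ratio**(-i + offset) for i = 0..num_steps-1
    for i in range(num_steps):
        yield base_step * step_ratio ** (-i + offset)

steps = list(basic_max_step_generator(2.0, 2, 4))
# steps == [2.0, 1.0, 0.5, 0.25]
```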
@@ -126,9 +125,8 @@ class BasicMinStepGenerator(BasicMaxStepGenerator):
offset : real scalar, optional, default 0
offset to the base step
-Example
--------
+Examples
+--------
>>> from numdifftools.step_generators import BasicMinStepGenerator
>>> step_gen = BasicMinStepGenerator(base_step=0.25, step_ratio=2,
... num_steps=4)
