DOC: fix documentation for scipy.optimize.brenth #14729

Merged: 2 commits, Nov 10, 2021
49 changes: 27 additions & 22 deletions scipy/optimize/_zeros_py.py
@@ -792,11 +792,15 @@ def brenth(f, a, b, args=(),

A variation on the classic Brent routine to find a zero of the function f
between the arguments a and b that uses hyperbolic extrapolation instead of
- inverse quadratic extrapolation. There was a paper back in the 1980's ...
+ inverse quadratic extrapolation. Bus & Dekker (1975) guarantee convergence
+ for this method, claiming that the upper bound of function evaluations here
+ is 4 or 5 times smaller than that for bisection.
f(a) and f(b) cannot have the same signs. Generally, on a par with the
brent routine, but not as heavily tested. It is a safe version of the
- secant method that uses hyperbolic extrapolation. The version here is by
- Chuck Harris.
+ secant method that uses hyperbolic extrapolation.
+ The version here is by Chuck Harris, and implements Algorithm M of
+ [BusAndDekker1975]_, where further details (convergence properties,
+ additional remarks and such) can be found.

Parameters
----------
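
Reviewer note (not part of the diff): the convergence claim added above is easy to sanity-check. A minimal doctest-style sketch, assuming only the public scipy.optimize API, where full_output=True returns a RootResults object with converged and function_calls attributes:

>>> from scipy.optimize import brenth, bisect
>>> def f(x):
...     return x**2 - 2
>>> root, r = brenth(f, 0, 2, full_output=True)
>>> r.converged
True
>>> _, r_bis = bisect(f, 0, 2, full_output=True)
>>> r.function_calls < r_bis.function_calls
True

With the default tolerances, brenth finds this root in far fewer evaluations than bisection, consistent with the Bus & Dekker bound cited above.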
@@ -843,6 +847,26 @@ def brenth(f, a, b, args=(),
Object containing information about the convergence. In particular,
``r.converged`` is True if the routine converged.

+ See Also
+ --------
+ fmin, fmin_powell, fmin_cg, fmin_bfgs, fmin_ncg : multivariate local optimizers
+ leastsq : nonlinear least squares minimizer
+ fmin_l_bfgs_b, fmin_tnc, fmin_cobyla : constrained multivariate optimizers
+ basinhopping, differential_evolution, brute : global optimizers
+ fminbound, brent, golden, bracket : local scalar minimizers
+ fsolve : N-D root-finding
+ brentq, brenth, ridder, bisect, newton : 1-D root-finding
+ fixed_point : scalar fixed-point finder
+
+ References
+ ----------
+ .. [BusAndDekker1975]
+    Bus, J. C. P., Dekker, T. J.,
+    "Two Efficient Algorithms with Guaranteed Convergence for Finding a Zero
+    of a Function", ACM Transactions on Mathematical Software, Vol. 1, Issue
+    4, Dec. 1975, pp. 330-345. Section 3: "Algorithm M".
+    :doi:`10.1145/355656.355659`
+
Examples
--------
>>> def f(x):
@@ -858,25 +882,6 @@ def brenth(f, a, b, args=(),
>>> root
1.0
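
Reviewer note (not part of the diff): the requirement above that f(a) and f(b) cannot have the same signs is enforced at call time. A small sketch; the exact exception message is an assumption, not quoted from the source:

>>> from scipy.optimize import brenth
>>> brenth(lambda x: x**2 + 1, -1, 1)  # f(-1) and f(1) are both positive
Traceback (most recent call last):
    ...
ValueError: f(a) and f(b) must have different signs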

- See Also
- --------
- fmin, fmin_powell, fmin_cg,
- fmin_bfgs, fmin_ncg : multivariate local optimizers
-
- leastsq : nonlinear least squares minimizer
-
- fmin_l_bfgs_b, fmin_tnc, fmin_cobyla : constrained multivariate optimizers
-
- basinhopping, differential_evolution, brute : global optimizers
-
- fminbound, brent, golden, bracket : local scalar minimizers
-
- fsolve : N-D root-finding
-
- brentq, brenth, ridder, bisect, newton : 1-D root-finding
-
- fixed_point : scalar fixed-point finder
-
"""
if not isinstance(args, tuple):
args = (args,)
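Reviewer note (not part of the diff): the unchanged lines closing the hunk show that a non-tuple args is wrapped into a one-element tuple, so the scalar and tuple calling conventions below are equivalent. A minimal sketch using only the documented signature:

>>> from scipy.optimize import brenth
>>> def f(x, c):
...     return x**2 - c
>>> root = brenth(f, 0, 2, args=(2,))  # documented tuple form
>>> abs(root - 2 ** 0.5) < 1e-9        # the root is sqrt(2)
True
>>> brenth(f, 0, 2, args=2) == root    # bare scalar is wrapped to (2,)
True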