scipy.optimize.brent(func, args=(), brack=None, tol=1.48e-08, full_output=0, maxiter=500)

Given a function of one variable and a possible bracket, return a local minimizer of the function isolated to a fractional precision of tol.

Parameters

func : callable f(x, *args)

Objective function.

args : tuple, optional

Additional arguments (if present).

brack : tuple, optional

Either a triple (xa, xb, xc) satisfying xa < xb < xc and func(xb) < func(xa) and func(xb) < func(xc), or a pair (xa, xb) to be used as initial points for a downhill bracket search (see scipy.optimize.bracket). The minimizer x will not necessarily satisfy xa <= x <= xb.

tol : float, optional

Relative error in solution xopt acceptable for convergence.

full_output : bool, optional

If True, return all output args (xmin, fval, iter, funcalls).

maxiter : int, optional

Maximum number of iterations in solution.


Returns

xmin : float

Optimum point.

fval : float

(Optional output) Optimum function value.

iter : int

(Optional output) Number of iterations.

funcalls : int

(Optional output) Number of objective function evaluations made.

See also


minimize_scalar

Interface to minimization algorithms for scalar univariate functions. See the ‘Brent’ method in particular.
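As a short sketch of that recommended interface, minimize_scalar with method='Brent' runs the same algorithm but returns an OptimizeResult object; the parabola below is an illustrative objective, not part of this page's own example.

```python
from scipy import optimize

def f(x):
    # Illustrative objective with its minimum at x = 1.
    return (x - 1) ** 2

# method='Brent' uses the same bracketed Brent minimization;
# the result carries x, fun, nit, and nfev as attributes.
res = optimize.minimize_scalar(f, bracket=(-1, 0.5, 2), method='Brent')
print(res.x, res.fun)
```

The OptimizeResult attributes correspond to the tuple (xmin, fval, iter, funcalls) that brent returns with full_output=True.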


Notes

Uses inverse parabolic interpolation when possible to speed up convergence of the golden section method.

Does not ensure that the minimum lies in the range specified by brack. See scipy.optimize.fminbound.
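To illustrate the bounded alternative mentioned above, here is a minimal sketch using scipy.optimize.fminbound, which does constrain the search to the given interval (the objective and interval are illustrative assumptions):

```python
from scipy import optimize

def f(x):
    # Illustrative objective; unconstrained minimum at x = 1.
    return (x - 1) ** 2

# The unconstrained minimum at x = 1 lies outside [2, 3], so fminbound
# returns a point at (or numerically near) the lower bound x = 2.
xopt = optimize.fminbound(f, 2, 3)
print(xopt)
```

By contrast, brent with brack=(2, 3) would walk downhill out of that interval and return a point near x = 1.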


Examples

We illustrate the behaviour of the function when brack is of size 2 and 3, respectively. In the case where brack is of the form (xa, xb), we can see that, for the given values, the output does not necessarily lie in the range (xa, xb).

>>> def f(x):
...     return (x-1)**2
>>> from scipy import optimize
>>> minimizer = optimize.brent(f, brack=(1, 2))
>>> minimizer
1.0
>>> res = optimize.brent(f, brack=(-1, 0.5, 2), full_output=True)
>>> xmin, fval, iter, funcalls = res
>>> f(xmin), fval
(0.0, 0.0)
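The args parameter can forward extra values to the objective. The sketch below assumes an illustrative shifted parabola g with a shift parameter c (not part of the original example); it also shows the minimizer landing outside the pair given as brack.

```python
from scipy import optimize

# c is an extra parameter forwarded to the objective via `args`;
# the minimum of g is at x = c.
def g(x, c):
    return (x - c) ** 2

# brack=(0, 1) only seeds the downhill bracket search, so the result
# lies near 3.0 even though that is outside (0, 1).
xmin = optimize.brent(g, args=(3.0,), brack=(0, 1))
print(xmin)
```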