SciPy

Optimization and Root Finding (scipy.optimize)

SciPy optimize provides functions for minimizing (or maximizing) objective functions, possibly subject to constraints. It includes solvers for nonlinear problems (with support for both local and global optimization algorithms), linear programming, constrained and nonlinear least-squares, root finding, and curve fitting.

Common functions and objects, shared across different solvers, are:

show_options([solver, method, disp]) Show documentation for additional options of optimization solvers.
OptimizeResult Represents the optimization result.
OptimizeWarning Warning issued by optimization routines when a non-fatal problem is encountered.
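
The minimize-style solvers report their outcome through an OptimizeResult. As a minimal sketch (the quadratic objective below is purely illustrative), the fields most often inspected are x, fun, and success:

    import numpy as np
    from scipy.optimize import minimize

    # Illustrative quadratic objective with its minimum at (1, 2).
    def objective(x):
        return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

    res = minimize(objective, x0=np.zeros(2))  # res is an OptimizeResult
    print(res.success)  # True if the solver converged
    print(res.x)        # solution, approximately [1., 2.]
    print(res.fun)      # objective value at the solution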

Optimization

Scalar Functions Optimization

minimize_scalar(fun[, bracket, bounds, …]) Minimization of scalar function of one variable.

The minimize_scalar function supports the following methods: 'brent' (the default), 'bounded' (which requires finite bounds), and 'golden'.
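
As a minimal sketch (the one-variable function below is purely illustrative), minimize_scalar can be called with or without bounds:

    from scipy.optimize import minimize_scalar

    # Illustrative scalar function with its minimum at x = 2.
    f = lambda x: (x - 2.0) ** 2 + 1.0

    res = minimize_scalar(f)                                      # Brent's method (default)
    res_b = minimize_scalar(f, bounds=(0, 10), method='bounded')  # bounded search
    print(res.x, res_b.x)  # both approximately 2.0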

Local (Multivariate) Optimization

minimize(fun, x0[, args, method, jac, hess, …]) Minimization of scalar function of one or more variables.

The minimize function supports the following methods: 'Nelder-Mead', 'Powell', 'CG', 'BFGS', 'Newton-CG', 'L-BFGS-B', 'TNC', 'COBYLA', 'SLSQP', 'trust-constr', 'dogleg', 'trust-ncg', 'trust-exact', and 'trust-krylov'.
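
A short sketch of a typical call, using the Rosenbrock benchmark function provided by this module and supplying its analytic gradient through jac:

    import numpy as np
    from scipy.optimize import minimize, rosen, rosen_der

    x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
    res = minimize(rosen, x0, method='BFGS', jac=rosen_der, options={'gtol': 1e-8})
    print(res.x)    # approximately all ones (the known minimum)
    print(res.nit)  # number of iterations performed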

Constraints are passed to minimize function as a single object or as a list of objects from the following classes:

NonlinearConstraint(fun, lb, ub[, jac, …]) Nonlinear constraint on the variables.
LinearConstraint(A, lb, ub[, keep_feasible]) Linear constraint on the variables.

Simple bound constraints are handled separately and there is a special class for them:

Bounds(lb, ub[, keep_feasible]) Bounds constraint on the variables.
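
The sketch below (the objective, constraint values, and bounds are all illustrative) shows how these three classes can be combined and passed to minimize with the 'trust-constr' method:

    import numpy as np
    from scipy.optimize import minimize, LinearConstraint, NonlinearConstraint, Bounds

    # Illustrative two-variable objective.
    def objective(x):
        return (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2

    # Linear constraint:     x0 + x1 <= 3   (written as -inf <= x0 + x1 <= 3)
    lin_con = LinearConstraint([[1.0, 1.0]], -np.inf, 3.0)

    # Nonlinear constraint:  x0**2 + x1 <= 4
    nonlin_con = NonlinearConstraint(lambda x: x[0] ** 2 + x[1], -np.inf, 4.0)

    # Simple bounds:         0 <= x0 <= 5,  0 <= x1 <= 5
    bounds = Bounds([0.0, 0.0], [5.0, 5.0])

    res = minimize(objective, x0=[0.5, 0.5], method='trust-constr',
                   constraints=[lin_con, nonlin_con], bounds=bounds)
    print(res.x)  # constrained minimizer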

Quasi-Newton strategies implementing the HessianUpdateStrategy interface can be used to approximate the Hessian in the minimize function (available only for the ‘trust-constr’ method). The available quasi-Newton methods implementing this interface are:

BFGS([exception_strategy, min_curvature, …]) Broyden-Fletcher-Goldfarb-Shanno (BFGS) Hessian update strategy.
SR1([min_denominator, init_scale]) Symmetric-rank-1 Hessian update strategy.
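
As a sketch, either strategy can be instantiated and passed through the hess argument when method='trust-constr' (the Rosenbrock test problem is used here only for illustration):

    import numpy as np
    from scipy.optimize import minimize, BFGS, SR1, rosen, rosen_der

    x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
    res_bfgs = minimize(rosen, x0, method='trust-constr', jac=rosen_der, hess=BFGS())
    res_sr1  = minimize(rosen, x0, method='trust-constr', jac=rosen_der, hess=SR1())
    print(res_bfgs.x)  # approximately all ones
    print(res_sr1.x)   # approximately all ones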

Global Optimization

basinhopping(func, x0[, niter, T, stepsize, …]) Find the global minimum of a function using the basin-hopping algorithm.
brute(func, ranges[, args, Ns, full_output, …]) Minimize a function over a given range by brute force.
differential_evolution(func, bounds[, args, …]) Finds the global minimum of a multivariate function.
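
Unlike the local solvers, these routines search a whole region; differential_evolution, for example, needs only box bounds rather than a starting point. A minimal sketch on the Rosenbrock test function:

    from scipy.optimize import differential_evolution, rosen

    bounds = [(-5.0, 5.0), (-5.0, 5.0)]                  # search box for each variable
    res = differential_evolution(rosen, bounds, seed=1)  # seed for reproducibility
    print(res.x, res.fun)  # approximately [1., 1.] and 0.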

Least-squares and Curve Fitting

Nonlinear Least-Squares

least_squares(fun, x0[, jac, bounds, …]) Solve a nonlinear least-squares problem with bounds on the variables.
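
A minimal sketch (the exponential-decay model and synthetic data are purely illustrative): least_squares takes a function returning the residual vector and minimizes half of its squared norm, optionally within bounds:

    import numpy as np
    from scipy.optimize import least_squares

    t = np.linspace(0, 4, 50)
    rng = np.random.default_rng(0)
    y = 2.5 * np.exp(-1.3 * t) + 0.05 * rng.normal(size=t.size)  # synthetic data

    def residuals(params):
        a, b = params
        return a * np.exp(-b * t) - y   # model minus data

    res = least_squares(residuals, x0=[1.0, 1.0], bounds=([0, 0], [10, 10]))
    print(res.x)  # approximately [2.5, 1.3]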

Linear Least-Squares

nnls(A, b[, maxiter]) Solve argmin_x || Ax - b ||_2 for x>=0.
lsq_linear(A, b[, bounds, method, tol, …]) Solve a linear least-squares problem with bounds on the variables.
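
A small illustrative system A x ~ b, solved once with a nonnegativity constraint and once with box bounds:

    import numpy as np
    from scipy.optimize import nnls, lsq_linear

    A = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [0.0, 2.0]])
    b = np.array([1.0, 2.0, 3.0])

    x_nnls, rnorm = nnls(A, b)               # x >= 0
    res = lsq_linear(A, b, bounds=(0, 2.0))  # 0 <= x <= 2
    print(x_nnls)
    print(res.x)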

Curve Fitting

curve_fit(f, xdata, ydata[, p0, sigma, …]) Use non-linear least squares to fit a function, f, to data.
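
A minimal sketch fitting an illustrative three-parameter model to synthetic noisy data; curve_fit returns the optimal parameters together with their estimated covariance:

    import numpy as np
    from scipy.optimize import curve_fit

    def model(x, a, b, c):
        return a * np.exp(-b * x) + c

    xdata = np.linspace(0, 4, 50)
    rng = np.random.default_rng(0)
    ydata = model(xdata, 2.5, 1.3, 0.5) + 0.05 * rng.normal(size=xdata.size)

    popt, pcov = curve_fit(model, xdata, ydata, p0=[1.0, 1.0, 1.0])
    print(popt)  # approximately [2.5, 1.3, 0.5]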

Root finding

Scalar functions

brentq(f, a, b[, args, xtol, rtol, maxiter, …]) Find a root of a function in a bracketing interval using Brent’s method.
brenth(f, a, b[, args, xtol, rtol, maxiter, …]) Find a root of a function in a bracketing interval using Brent’s method with hyperbolic extrapolation.
ridder(f, a, b[, args, xtol, rtol, maxiter, …]) Find a root of a function in an interval using Ridder’s method.
bisect(f, a, b[, args, xtol, rtol, maxiter, …]) Find a root of a function within an interval using bisection.
newton(func, x0[, fprime, args, tol, …]) Find a zero using the Newton-Raphson or secant method.
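
The bracketing methods (brentq, brenth, ridder, bisect) require an interval [a, b] on which the function changes sign; newton instead starts from a single initial guess. A sketch on an illustrative cubic:

    from scipy.optimize import brentq, newton

    f = lambda x: x ** 3 - 1.0             # real root at x = 1
    root_bracketed = brentq(f, 0.0, 2.0)   # f(0) < 0 < f(2), so [0, 2] brackets the root
    root_newton = newton(f, x0=0.5)        # secant method, since no fprime is supplied
    print(root_bracketed, root_newton)     # both approximately 1.0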

Fixed point finding:

fixed_point(func, x0[, args, xtol, maxiter, …]) Find a fixed point of the function.
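
A fixed point x* satisfies func(x*) == x*. For example, cos has a fixed point near 0.739 (the Dottie number):

    import numpy as np
    from scipy.optimize import fixed_point

    x_star = fixed_point(np.cos, x0=0.5)
    print(x_star)  # approximately 0.739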

Linear Programming

linprog(c[, A_ub, b_ub, A_eq, b_eq, bounds, …]) Minimize a linear objective function subject to linear equality and inequality constraints.

The linprog function supports the following methods: ‘simplex’ and ‘interior-point’.
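
A minimal sketch of an illustrative LP in the form linprog expects, with inequalities written as A_ub @ x <= b_ub:

    from scipy.optimize import linprog

    # minimize   -x0 - 2*x1
    # subject to  x0 + x1 <= 4,  -x0 + x1 <= 2,  x0 >= 0,  x1 >= 0
    c = [-1.0, -2.0]
    A_ub = [[ 1.0, 1.0],
            [-1.0, 1.0]]
    b_ub = [4.0, 2.0]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    print(res.x, res.fun)  # optimal point (1, 3) with objective value -7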

The simplex method supports callback functions, such as:

linprog_verbose_callback(xk, **kwargs) A sample callback function demonstrating the linprog callback interface.

Assignment problems:

linear_sum_assignment(cost_matrix) Solve the linear sum assignment problem.
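
A sketch with an illustrative cost matrix, where cost[i, j] is the cost of assigning row (worker) i to column (job) j:

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    cost = np.array([[4, 1, 3],
                     [2, 0, 5],
                     [3, 2, 2]])
    row_ind, col_ind = linear_sum_assignment(cost)
    print(col_ind)                        # job assigned to each worker
    print(cost[row_ind, col_ind].sum())   # minimal total cost (5 here)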

Utilities

Finite-Difference Approximation

approx_fprime(xk, f, epsilon, *args) Finite-difference approximation of the gradient of a scalar function.
check_grad(func, grad, x0, *args, **kwargs) Check the correctness of a gradient function by comparing it against a (forward) finite-difference approximation of the gradient.
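
A sketch on an illustrative function with a known analytic gradient; check_grad returns the norm of the difference between the analytic and finite-difference gradients, so a small value indicates agreement:

    import numpy as np
    from scipy.optimize import approx_fprime, check_grad

    def f(x):
        return x[0] ** 2 + np.sin(x[1])

    def grad(x):
        return np.array([2.0 * x[0], np.cos(x[1])])

    x = np.array([1.5, 0.3])
    print(approx_fprime(x, f, 1e-8))  # finite-difference gradient estimate
    print(check_grad(f, grad, x))     # close to zero => gradients agree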

Hessian Approximation

LbfgsInvHessProduct(sk, yk) Linear operator for the L-BFGS approximate inverse Hessian.
HessianUpdateStrategy Interface for implementing Hessian update strategies.

Benchmark Problems

rosen(x) The Rosenbrock function.
rosen_der(x) The derivative (i.e. gradient) of the Rosenbrock function.
rosen_hess(x) The Hessian matrix of the Rosenbrock function.
rosen_hess_prod(x, p) Product of the Hessian matrix of the Rosenbrock function with a vector.
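
These test problems have a known minimum of 0 at the point of all ones, which makes them convenient for sanity checks:

    import numpy as np
    from scipy.optimize import rosen, rosen_der, rosen_hess

    x = np.ones(3)
    print(rosen(x))             # 0.0 at the minimum
    print(rosen_der(x))         # gradient is all zeros at the minimum
    print(rosen_hess(x).shape)  # (3, 3) Hessian matrix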

Legacy Functions

The functions below are not recommended for use in new scripts; all of these methods are accessible via the newer, more consistent interfaces described above.

Optimization

General-purpose multivariate methods:

fmin(func, x0[, args, xtol, ftol, maxiter, …]) Minimize a function using the downhill simplex algorithm.
fmin_powell(func, x0[, args, xtol, ftol, …]) Minimize a function using modified Powell’s method.
fmin_cg(f, x0[, fprime, args, gtol, norm, …]) Minimize a function using a nonlinear conjugate gradient algorithm.
fmin_bfgs(f, x0[, fprime, args, gtol, norm, …]) Minimize a function using the BFGS algorithm.
fmin_ncg(f, x0, fprime[, fhess_p, fhess, …]) Unconstrained minimization of a function using the Newton-CG method.

Constrained multivariate methods:

fmin_l_bfgs_b(func, x0[, fprime, args, …]) Minimize a function func using the L-BFGS-B algorithm.
fmin_tnc(func, x0[, fprime, args, …]) Minimize a function with variables subject to bounds, using gradient information in a truncated Newton algorithm.
fmin_cobyla(func, x0, cons[, args, …]) Minimize a function using the Constrained Optimization BY Linear Approximation (COBYLA) method.
fmin_slsqp(func, x0[, eqcons, f_eqcons, …]) Minimize a function using Sequential Least SQuares Programming.
differential_evolution(func, bounds[, args, …]) Finds the global minimum of a multivariate function.

Univariate (scalar) minimization methods:

fminbound(func, x1, x2[, args, xtol, …]) Bounded minimization for scalar functions.
brent(func[, args, brack, tol, full_output, …]) Given a function of one-variable and a possible bracket, return the local minimum of the function isolated to a fractional precision of tol.
golden(func[, args, brack, tol, …]) Return the minimum of a function of one variable using golden section method.

Least-Squares

leastsq(func, x0[, args, Dfun, full_output, …]) Minimize the sum of squares of a set of equations.

Root Finding

General nonlinear solvers:

fsolve(func, x0[, args, fprime, …]) Find the roots of a function.
broyden1(F, xin[, iter, alpha, …]) Find a root of a function, using Broyden’s first Jacobian approximation.
broyden2(F, xin[, iter, alpha, …]) Find a root of a function, using Broyden’s second Jacobian approximation.
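
For reference, a sketch of the legacy fsolve interface on an illustrative two-equation system (new code would normally prefer the interfaces above):

    import numpy as np
    from scipy.optimize import fsolve

    # Solve x0*cos(x1) = 4 and x0*x1 = 5 simultaneously.
    def equations(x):
        return [x[0] * np.cos(x[1]) - 4.0,
                x[0] * x[1] - 5.0]

    sol = fsolve(equations, x0=[1.0, 1.0])
    print(sol, equations(sol))  # residuals approximately zero at the solution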

Large-scale nonlinear solvers:

newton_krylov(F, xin[, iter, rdiff, method, …]) Find a root of a function, using Krylov approximation for inverse Jacobian.
anderson(F, xin[, iter, alpha, w0, M, …]) Find a root of a function, using (extended) Anderson mixing.

Simple iteration solvers:

excitingmixing(F, xin[, iter, alpha, …]) Find a root of a function, using a tuned diagonal Jacobian approximation.
linearmixing(F, xin[, iter, alpha, verbose, …]) Find a root of a function, using a scalar Jacobian approximation.
diagbroyden(F, xin[, iter, alpha, verbose, …]) Find a root of a function, using diagonal Broyden Jacobian approximation.

Additional information on the nonlinear solvers