NonlinearConstraint
- class scipy.optimize.NonlinearConstraint(fun, lb, ub, jac='2-point', hess=None, keep_feasible=False, finite_diff_rel_step=None, finite_diff_jac_sparsity=None)
Nonlinear constraint on the variables.
The constraint has the general inequality form:
lb <= fun(x) <= ub
Here the vector of independent variables x is passed as ndarray of shape (n,) and fun returns a vector with m components. It is possible to use equal bounds to represent an equality constraint or infinite bounds to represent a one-sided constraint.
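For instance, a two-component constraint can mix an equality constraint (equal bounds) with a one-sided constraint (an infinite bound). A minimal sketch, using a hypothetical constraint function chosen only for illustration:
>>> import numpy as np
>>> from scipy.optimize import NonlinearConstraint
>>> fun = lambda x: np.array([x[0]**2 + x[1], x[0] - x[1]])
>>> # first component: equality, x0**2 + x1 == 1 (lb == ub)
>>> # second component: one-sided, x0 - x1 <= 0.5 (lb = -inf)
>>> nlc = NonlinearConstraint(fun, [1, -np.inf], [1, 0.5])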
- Parameters:
- fun : callable
The function defining the constraint. The signature is fun(x) -> array_like, shape (m,).
- lb, ub : array_like
Lower and upper bounds on the constraint. Each array must have the shape (m,) or be a scalar; in the latter case the bound will be the same for all components of the constraint. Use np.inf with an appropriate sign to specify a one-sided constraint. Set components of lb and ub equal to represent an equality constraint. Note that you can mix constraints of different types (interval, one-sided, or equality) by setting different components of lb and ub as necessary.
- jac : {callable, ‘2-point’, ‘3-point’, ‘cs’}, optional
Method of computing the Jacobian matrix (an m-by-n matrix, where element (i, j) is the partial derivative of f[i] with respect to x[j]). The keywords {‘2-point’, ‘3-point’, ‘cs’} select a finite difference scheme for the numerical estimation. A callable must have the following signature:
jac(x) -> {ndarray, sparse array}, shape (m, n)
Default is ‘2-point’.
- hess : {callable, ‘2-point’, ‘3-point’, ‘cs’, HessianUpdateStrategy, None}, optional
Method for computing the Hessian matrix. The keywords {‘2-point’, ‘3-point’, ‘cs’} select a finite difference scheme for numerical estimation. Alternatively, objects implementing the HessianUpdateStrategy interface can be used to approximate the Hessian; the currently available implementations are BFGS and SR1. A callable must return the Hessian matrix of dot(fun, v) and must have the following signature: hess(x, v) -> {LinearOperator, sparse array, array_like}, shape (n, n). Here v is an ndarray with shape (m,) containing Lagrange multipliers (see the sketch after this parameter list).
- keep_feasible : array_like of bool, optional
Whether to keep the constraint components feasible throughout iterations. A single value sets this property for all components. Default is False. Has no effect for equality constraints. Note that finite difference approximation of the Jacobian may still violate the constraint; it is recommended to provide an analytical Jacobian function to handle this case.
- finite_diff_rel_step : None or array_like, optional
Relative step size for the finite difference approximation. Default is None, which selects a reasonable value automatically depending on the finite difference scheme.
- finite_diff_jac_sparsity : {None, array_like, sparse array}, optional
Defines the sparsity structure of the Jacobian matrix for finite difference estimation; its shape must be (m, n). If the Jacobian has only a few non-zero elements in each row, providing the sparsity structure will greatly speed up the computations. A zero entry means that the corresponding element in the Jacobian is identically zero. If provided, forces the use of the ‘lsmr’ trust-region solver. If None (default), dense differencing will be used.
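As an illustration of the callable jac and hess signatures, here is a minimal sketch for a single scalar constraint (m = 1) in two variables (n = 2); the constraint function and its derivatives are hypothetical and chosen only for brevity:
>>> import numpy as np
>>> from scipy.optimize import NonlinearConstraint
>>> def cons_f(x):                   # fun(x) -> shape (m,) = (1,)
...     return np.array([x[0]**2 + x[1]**2])
>>> def cons_jac(x):                 # jac(x) -> shape (m, n) = (1, 2)
...     return np.array([[2 * x[0], 2 * x[1]]])
>>> def cons_hess(x, v):             # hess(x, v) -> shape (n, n) = (2, 2)
...     return v[0] * 2 * np.eye(2)  # v holds the m Lagrange multipliers
>>> nlc = NonlinearConstraint(cons_f, 1, 4, jac=cons_jac, hess=cons_hess)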
Notes
Finite difference schemes {‘2-point’, ‘3-point’, ‘cs’} may be used for approximating either the Jacobian or the Hessian. However, they cannot be used to approximate both simultaneously: whenever the Jacobian is estimated via finite differences, the Hessian must be estimated using one of the quasi-Newton strategies.
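A minimal sketch of this combination, assuming a hypothetical constraint function, pairs a finite-difference Jacobian with the BFGS quasi-Newton strategy:
>>> import numpy as np
>>> from scipy.optimize import NonlinearConstraint, BFGS
>>> nlc = NonlinearConstraint(lambda x: x[0]**2 + x[1]**2, -np.inf, 1,
...                           jac='2-point', hess=BFGS())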
The scheme ‘cs’ is potentially the most accurate, but requires the function to correctly handle complex inputs and be analytically continuable to the complex plane. The scheme ‘3-point’ is more accurate than ‘2-point’ but requires twice as many operations.
Whilst NonlinearConstraint can be used to specify constraints for many different optimizers, the class is not responsible for enforcing those constraints; that is done by the individual minimizer. Importantly, the keep_feasible keyword is only ever used within the trust-constr optimizer; it is not used by other minimize methods. The other methods may, or may not, keep solutions strictly feasible during operation.
Examples
Constrain x[0] < sin(x[1]) + 1.9:
>>> from scipy.optimize import NonlinearConstraint
>>> import numpy as np
>>> con = lambda x: x[0] - np.sin(x[1])
>>> nlc = NonlinearConstraint(con, -np.inf, 1.9)
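A constraint object like this can then be passed to an optimizer that supports it, for example the trust-constr method of minimize; the objective function and starting point below are hypothetical placeholders:
>>> from scipy.optimize import minimize
>>> res = minimize(lambda x: x[0]**2 + x[1]**2, x0=[0.5, 0.5],
...                method='trust-constr', constraints=[nlc])
>>> # res.x now holds a solution satisfying x[0] - sin(x[1]) <= 1.9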