scipy.special.rel_entr

scipy.special.rel_entr(x, y, out=None) = <ufunc 'rel_entr'>

Elementwise function for computing relative entropy.

\[\begin{split}\mathrm{rel\_entr}(x, y) = \begin{cases} x \log(x / y) & x > 0, y > 0 \\ 0 & x = 0, y \ge 0 \\ \infty & \text{otherwise} \end{cases}\end{split}\]
Parameters:
x, y : array_like

Input arrays

out : ndarray, optional

Optional output array for the function results

Returns:
scalar or ndarray

Relative entropy of the inputs
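
The three branches of the definition above can be checked directly; a quick illustrative sketch with arbitrary input values:

>>> import numpy as np
>>> from scipy.special import rel_entr
>>> round(float(rel_entr(0.5, 0.25)), 6)  # x > 0, y > 0: x * log(x / y)
0.346574
>>> round(float(0.5 * np.log(0.5 / 0.25)), 6)
0.346574
>>> float(rel_entr(0.0, 1.0))  # x = 0, y >= 0: 0
0.0
>>> float(rel_entr(1.0, 0.0))  # x > 0, y = 0: inf
inf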

Notes

New in version 0.15.0.

This function is jointly convex in x and y.
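Joint convexity can be spot-checked numerically: at the midpoint of two sample points, the function value never exceeds the average of the endpoint values. An illustrative check with arbitrary sample points:

>>> from scipy.special import rel_entr
>>> x1, y1 = 0.2, 0.8
>>> x2, y2 = 0.9, 0.1
>>> mid = rel_entr((x1 + x2) / 2, (y1 + y2) / 2)
>>> avg = (rel_entr(x1, y1) + rel_entr(x2, y2)) / 2
>>> bool(mid <= avg)
True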

The origin of this function is in convex programming; see [1]. Given two discrete probability distributions \(p_1, \ldots, p_n\) and \(q_1, \ldots, q_n\), the definition of relative entropy in the context of information theory is

\[\sum_{i = 1}^n \mathrm{rel\_entr}(p_i, q_i).\]

To compute the latter quantity, use scipy.stats.entropy.

See [2] for details.
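As a minimal sketch (the two distributions below are arbitrary), summing rel_entr over the elements of two probability vectors agrees with scipy.stats.entropy:

>>> import numpy as np
>>> from scipy.special import rel_entr
>>> from scipy.stats import entropy
>>> p = np.array([0.5, 0.3, 0.2])
>>> q = np.array([0.4, 0.4, 0.2])
>>> round(float(np.sum(rel_entr(p, q))), 6)
0.025267
>>> round(float(entropy(p, q)), 6)
0.025267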

References

[1]

Boyd, Stephen and Lieven Vandenberghe. Convex optimization. Cambridge University Press, 2004. DOI: https://doi.org/10.1017/CBO9780511804441

[2]

Kullback-Leibler divergence, https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence