log_softmax

scipy.special.log_softmax(x, axis=None)

Compute the logarithm of the softmax function.

In principle:

log_softmax(x) = log(softmax(x))

but computed with a numerically more accurate implementation.
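For intuition, the standard stable formulation subtracts logsumexp rather than exponentiating first. The sketch below (a hypothetical helper, not SciPy's actual implementation) illustrates the idea using scipy.special.logsumexp:

>>> import numpy as np
>>> from scipy.special import logsumexp
>>> def log_softmax_sketch(x, axis=None):
...     # log(softmax(x)) = x - logsumexp(x): subtracting the log of the
...     # normalizer avoids overflow in exp() for large entries of x
...     x = np.asarray(x)
...     return x - logsumexp(x, axis=axis, keepdims=True)
...
>>> log_softmax_sketch(np.array([1000.0, 1.0]))
array([   0., -999.])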

Parameters:
x : array_like

Input array.

axis : int or tuple of ints, optional

Axis to compute values along. Default is None, in which case log_softmax is computed over the entire array x.

Returns:
s : ndarray or scalar

An array with the same shape as x. Exponential of the result will sum to 1 along the specified axis. If x is a scalar, a scalar is returned.
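For example, exponentiating the result recovers softmax, which sums to 1 along the chosen axis:

>>> import numpy as np
>>> from scipy.special import log_softmax
>>> x = np.arange(6.0).reshape(2, 3)
>>> y = log_softmax(x, axis=1)   # log-softmax of each row
>>> np.exp(y).sum(axis=1)        # exponentials sum to 1 along axis 1
array([1., 1.])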

Notes

log_softmax is more accurate than np.log(softmax(x)) with inputs that make softmax saturate (see examples below).

Added in version 1.5.0.

log_softmax has experimental support for Python Array API Standard compatible backends in addition to NumPy. Please consider testing these features by setting the environment variable SCIPY_ARRAY_API=1 and providing CuPy, PyTorch, JAX, or Dask arrays as array arguments. The following combinations of backend and device (or other capability) are supported.

Library    CPU    GPU
NumPy      ✅     n/a
CuPy       n/a    ✅
PyTorch    ✅     ✅
JAX        ✅     ✅
Dask       ✅     ✅

See Support for the array API standard for more information.
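As a rough sketch of what this looks like (assuming SCIPY_ARRAY_API=1 was exported before Python started and that PyTorch is installed; the printed output is illustrative):

>>> # Assumes SCIPY_ARRAY_API=1 was set before importing scipy,
>>> # and that PyTorch is available.
>>> import torch
>>> from scipy.special import log_softmax
>>> log_softmax(torch.tensor([1000.0, 1.0]))  # result stays a torch.Tensor
tensor([   0., -999.])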

Examples

>>> import numpy as np
>>> from scipy.special import log_softmax
>>> from scipy.special import softmax
>>> np.set_printoptions(precision=5)
>>> x = np.array([1000.0, 1.0])
>>> y = log_softmax(x)
>>> y
array([   0., -999.])
>>> with np.errstate(divide='ignore'):
...   y = np.log(softmax(x))
...
>>> y
array([  0., -inf])

Here softmax saturates: exp(1.0 - 1000.0) underflows to 0, so np.log produces -inf, whereas log_softmax returns the exact finite value -999.