scipy.special.xlogy

scipy.special.xlogy(x, y, out=None) = <ufunc 'xlogy'>

Compute x*log(y) so that the result is 0 if x = 0.

Parameters:
x : array_like

Multiplier

y : array_like

Argument

out : ndarray, optional

Optional output array for the function results

Returns:
z : scalar or ndarray

Computed x*log(y)

Notes

The log function used in the computation is the natural log.

New in version 0.13.0.
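
As a brief, illustrative check of both notes (the inputs below are arbitrary): a plain product 0 * np.log(0) evaluates to nan, while xlogy returns 0 whenever x is 0, and the natural-log base can be verified with y = e:

>>> import numpy as np
>>> from scipy.special import xlogy
>>> with np.errstate(divide='ignore', invalid='ignore'):
...     plain = 0 * np.log(0)  # 0 * (-inf) -> nan
>>> plain
nan
>>> xlogy(0, 0)  # x = 0, so the result is 0 regardless of y
0.0
>>> xlogy(2, np.e)  # natural log: 2 * ln(e) = 2
2.0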

Examples

We can use this function to calculate the binary logistic loss, also known as binary cross-entropy. This loss function is used for binary classification problems and is defined as:

\[L = -\frac{1}{n} \sum_{i=1}^{n} \left(y_i \log(y\_pred_i) + (1 - y_i) \log(1 - y\_pred_i)\right)\]

In terms of the function's parameters, x corresponds to y and y corresponds to y_pred. Here y is the array of actual labels, each either 0 or 1, and y_pred is the array of predicted probabilities for the positive class (1).

>>> import numpy as np
>>> from scipy.special import xlogy
>>> y = np.array([0, 1, 0, 1, 1, 0])
>>> y_pred = np.array([0.3, 0.8, 0.4, 0.7, 0.9, 0.2])
>>> n = len(y)
>>> loss = -(xlogy(y, y_pred) + xlogy(1 - y, 1 - y_pred)).sum()
>>> loss /= n
>>> loss
0.29597052165495025

A lower loss is usually better, as it indicates that the predictions are close to the actual labels. In this example, since the predicted probabilities are close to the actual labels, the overall loss is reasonably low.
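
As an illustrative aside, this is also why xlogy is preferable here to a plain y * np.log(y_pred): if a predicted probability is exactly 0 or 1 and matches its label, the naive formula multiplies 0 by log(0) and produces nan, whereas xlogy yields the correct contribution of 0. The arrays below are arbitrary values chosen to show this edge case:

>>> y = np.array([0, 1])
>>> y_pred = np.array([0.0, 1.0])  # perfectly confident, correct predictions
>>> with np.errstate(divide='ignore', invalid='ignore'):
...     naive = y * np.log(y_pred) + (1 - y) * np.log(1 - y_pred)
>>> naive  # 0 * log(0) poisons every term
array([nan, nan])
>>> xlogy(y, y_pred) + xlogy(1 - y, 1 - y_pred)  # per-sample contributions
array([0., 0.])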