scipy.special.xlogy(x, y, out=None) = <ufunc 'xlogy'>#

Compute x*log(y) so that the result is 0 if x = 0.





Parameters:
x : array_like
    Multiplier
y : array_like
    Argument
out : ndarray, optional
    Optional output array for the function results

Returns:
z : scalar or ndarray
    Computed x*log(y)


Notes

The log function used in the computation is the natural log.

New in version 0.13.0.
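A quick sketch of the convention this function implements: a naive `x * np.log(y)` propagates `nan` when `x = 0` and `y = 0` (since `0 * (-inf)` is `nan`), whereas `xlogy` returns 0 in that case.

```python
import numpy as np
from scipy.special import xlogy

# Naive product: log(0) is -inf, and 0 * (-inf) evaluates to nan.
with np.errstate(divide="ignore", invalid="ignore"):
    naive = 0.0 * np.log(0.0)

# xlogy applies the convention 0 * log(y) == 0, even at y == 0.
safe = xlogy(0.0, 0.0)

print(naive)  # nan
print(safe)   # 0.0
```

The `np.errstate` context only suppresses the floating-point warnings; it does not change the `nan` result of the naive expression.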


Examples

We can use this function to calculate the binary logistic loss, also known as binary cross-entropy. This loss function is used for binary classification problems and is defined as:

\[L = -\frac{1}{n} \sum_{i=1}^{n} \left(y_i \log(y\_pred_i) + (1 - y_i) \log(1 - y\_pred_i)\right)\]

We can define the parameters x and y as y and y_pred respectively. y is the array of actual labels, which here are either 0 or 1. y_pred is the array of predicted probabilities for the positive class (1).

>>> import numpy as np
>>> from scipy.special import xlogy
>>> y = np.array([0, 1, 0, 1, 1, 0])
>>> y_pred = np.array([0.3, 0.8, 0.4, 0.7, 0.9, 0.2])
>>> n = len(y)
>>> loss = -(xlogy(y, y_pred) + xlogy(1 - y, 1 - y_pred)).sum()
>>> loss /= n
>>> loss
0.29597052165495025

A lower loss is usually better, as it indicates that the predictions are close to the actual labels. In this example, since our predicted probabilities are close to the actual labels, the overall loss is reasonably low.
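One practical reason to prefer `xlogy` here (an illustration extending the example above, with made-up data): if a predicted probability is exactly 0 or 1, the naive formula turns the whole loss into `nan`, while `xlogy` keeps the loss finite as long as the fully confident predictions are correct.

```python
import numpy as np
from scipy.special import xlogy

y = np.array([0, 1, 0, 1, 1, 0])
# Hypothetical predictions with one fully confident (and correct) value:
# y_pred[1] == 1.0, so the term (1 - y[1]) * log(1 - y_pred[1]) is 0 * log(0).
y_pred = np.array([0.3, 1.0, 0.4, 0.7, 0.9, 0.2])

# Naive formula: 0 * log(0) -> nan, which contaminates the mean.
with np.errstate(divide="ignore", invalid="ignore"):
    naive = -(y * np.log(y_pred) + (1 - y) * np.log(1 - y_pred)).mean()

# xlogy treats 0 * log(0) as 0, so the loss stays finite.
safe = -(xlogy(y, y_pred) + xlogy(1 - y, 1 - y_pred)).mean()

print(naive)  # nan
print(np.isfinite(safe))  # True
```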