kl_divergence

kl_divergence(distribution_A, distribution_B)

Compute the Kullback-Leibler divergence between two distributions. Also known as relative entropy, the Kullback-Leibler divergence measures how one probability distribution diverges from a second, reference distribution. It is defined as the expectation, under the first distribution, of the logarithm of the ratio of the two distributions.
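In discrete form, writing the normalized inputs as A and B, this is (assuming the natural logarithm; the page does not state the log base):

D_{KL}(A \,\|\, B) = \sum_i A_i \log \frac{A_i}{B_i}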

Parameters:
distribution_A : array-like

The first distribution.

distribution_B : array-like

The second distribution.

Returns:
float

The Kullback-Leibler Divergence between the two distributions.

Notes

NaN values in the distributions are ignored. Both distributions are normalized to sum to 1 before the divergence is computed.
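
Examples

A minimal reference sketch of the documented behavior (NaN handling, normalization, then the sum form of the definition above), assuming NumPy array inputs and a natural logarithm; the function name kl_divergence_sketch is hypothetical, and the actual implementation may differ in log base and in how it treats NaN positions:

import numpy as np

def kl_divergence_sketch(distribution_A, distribution_B):
    # Convert inputs to float arrays.
    a = np.asarray(distribution_A, dtype=float)
    b = np.asarray(distribution_B, dtype=float)

    # Drop positions where either distribution is NaN (one possible
    # reading of "NaN values are ignored"; pairwise removal is an
    # assumption here).
    mask = ~(np.isnan(a) | np.isnan(b))
    a, b = a[mask], b[mask]

    # Normalize each distribution to sum to 1, as the Notes describe.
    a = a / a.sum()
    b = b / b.sum()

    # Sum A_i * log(A_i / B_i); terms with A_i == 0 contribute 0
    # by the usual convention.
    nonzero = a > 0
    return float(np.sum(a[nonzero] * np.log(a[nonzero] / b[nonzero])))

For example, kl_divergence_sketch([0.2, 0.3, 0.5], [0.1, 0.4, 0.5]) returns approximately 0.0523 under these assumptions.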