Kullback–Leibler Divergence

Definition

For two distributions p and q, the Kullback–Leibler divergence (KL divergence) is defined as

D_KL(p ∣∣ q) := H(p, q) − H(p) = E_p[log(p/q)],

where H(p, q) is the cross-entropy of p and q, and H(p) is the entropy of p.

The KL divergence measures how much p differs from q: it is the expected extra code length incurred by encoding samples from p with a code optimized for q.
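As a concrete illustration (not part of the original note), here is a minimal NumPy sketch of the definition for discrete distributions, checking that the direct sum E_p[log(p/q)] agrees with the cross-entropy-minus-entropy form H(p, q) − H(p):

```python
import numpy as np

def kl_divergence(p, q):
    """D_KL(p || q) = sum_i p_i * log(p_i / q_i) for discrete distributions."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    # Terms with p_i == 0 contribute 0 by convention (x log x -> 0 as x -> 0).
    m = p > 0
    return float(np.sum(p[m] * np.log(p[m] / q[m])))

def cross_entropy(p, q):
    """H(p, q) = -sum_i p_i * log(q_i)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = p > 0
    return float(-np.sum(p[m] * np.log(q[m])))

def entropy(p):
    """H(p) = H(p, p)."""
    return cross_entropy(p, p)

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
print(kl_divergence(p, q))  # ≈ 0.0253, equals H(p, q) - H(p)
```

Natural log is used here; base-2 logs would give the same divergence in bits instead of nats.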

Properties

KL divergence is additive for independent distributions: if p(x, y) = p1(x) p2(y) and q(x, y) = q1(x) q2(y), then

D_KL(p1 ∣∣ q1) + D_KL(p2 ∣∣ q2) = D_KL(p1 p2 ∣∣ q1 q2).
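The additivity property can be verified numerically (a sketch with made-up example distributions): the joint of two independent discrete distributions is the outer product of their probability vectors, and its divergence equals the sum of the marginal divergences.

```python
import numpy as np

def kl(p, q):
    """D_KL(p || q) for discrete distributions, in nats."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = p > 0
    return float(np.sum(p[m] * np.log(p[m] / q[m])))

# Arbitrary example marginals (assumed for illustration).
p1, q1 = np.array([0.6, 0.4]), np.array([0.5, 0.5])
p2, q2 = np.array([0.7, 0.2, 0.1]), np.array([0.3, 0.3, 0.4])

# Joint distribution of independent components: outer product, flattened.
p12 = np.outer(p1, p2).ravel()
q12 = np.outer(q1, q2).ravel()

lhs = kl(p1, q1) + kl(p2, q2)
rhs = kl(p12, q12)
print(lhs, rhs)  # the two values agree
```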

KL Divergence is not a Metric

KL divergence is not a metric, because it satisfies neither the symmetry axiom (in general D_KL(p ∣∣ q) ≠ D_KL(q ∣∣ p)) nor the triangle inequality. The Jensen–Shannon divergence (JS divergence), defined as the average of D_KL(p ∣∣ m) and D_KL(q ∣∣ m) where m = (p + q)/2 is the mixture of the two distributions, is a symmetrized version of it.
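The asymmetry, and the symmetry of the Jensen–Shannon divergence built from the mixture m = (p + q)/2, can be checked directly (example distributions are assumed for illustration):

```python
import numpy as np

def kl(p, q):
    """D_KL(p || q) for discrete distributions, in nats."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = p > 0
    return float(np.sum(p[m] * np.log(p[m] / q[m])))

def js(p, q):
    """Jensen-Shannon divergence: average KL to the mixture m = (p + q) / 2."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = np.array([0.9, 0.1])
q = np.array([0.5, 0.5])

print(kl(p, q), kl(q, p))  # different values: KL is asymmetric
print(js(p, q), js(q, p))  # equal values: JS is symmetric
```

Unlike KL, the JS divergence is always finite (bounded by log 2 in nats) and its square root is a metric.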