Kullback–Leibler Divergence
Definition
For two probability distributions $P$ and $Q$ over the same sample space $\mathcal{X}$, the KL divergence is defined as

$$D_{\mathrm{KL}}(P \parallel Q) = \sum_{x \in \mathcal{X}} P(x) \log \frac{P(x)}{Q(x)}$$

The KL divergence can be used to measure how much $P$ differs from $Q$: it is the expected extra code length incurred when encoding samples drawn from $P$ using a code optimized for $Q$.
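As a concrete illustration of the definition above, here is a minimal sketch in Python (the distributions `p` and `q` are example values, not from the text); it uses the natural log, so the result is in nats:

```python
import math

def kl_divergence(p, q):
    """D_KL(p || q) for two discrete distributions given as lists of
    probabilities. By convention, terms with p(x) = 0 contribute 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))  # positive, since p != q
print(kl_divergence(p, p))  # 0.0, since the distributions coincide
```

Note that the sum skips terms where $P(x) = 0$, matching the convention $0 \log 0 = 0$; if $Q(x) = 0$ while $P(x) > 0$, the divergence is infinite.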
Properties
- From Gibbs' inequality, $D_{\mathrm{KL}}(P \parallel Q) \ge 0$, and the equality holds if and only if $P = Q$ almost everywhere.
- The KL divergence is additive for independent distributions: if $P(x, y) = P_1(x)\,P_2(y)$ and $Q(x, y) = Q_1(x)\,Q_2(y)$, then
$$D_{\mathrm{KL}}(P \parallel Q) = D_{\mathrm{KL}}(P_1 \parallel Q_1) + D_{\mathrm{KL}}(P_2 \parallel Q_2)$$
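The additivity property can be checked numerically. The sketch below (with arbitrary example marginals) builds joint distributions as products of independent marginals and compares the joint KL divergence to the sum of the marginal divergences:

```python
import math

def kl(p, q):
    """D_KL(p || q) for discrete distributions, in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Example marginals (arbitrary values chosen for illustration).
p1, q1 = [0.3, 0.7], [0.5, 0.5]
p2, q2 = [0.2, 0.8], [0.6, 0.4]

# Joint distributions of the independent pairs: P(x, y) = P1(x) * P2(y).
p_joint = [a * b for a in p1 for b in p2]
q_joint = [a * b for a in q1 for b in q2]

# The two quantities agree up to floating-point error.
print(kl(p_joint, q_joint))
print(kl(p1, q1) + kl(p2, q2))
```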
KL divergence is not a Metric Distance
KL divergence is not a metric distance, because it satisfies neither the symmetry axiom ($D_{\mathrm{KL}}(P \parallel Q) \neq D_{\mathrm{KL}}(Q \parallel P)$ in general) nor the triangle inequality. The Jensen–Shannon divergence, or JS divergence, is a symmetrized version of it.
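The asymmetry, and the symmetry of the JS divergence, can be demonstrated directly. This sketch (with example distributions) computes the JS divergence as the average KL divergence of each distribution to their mixture:

```python
import math

def kl(p, q):
    """D_KL(p || q) for discrete distributions, in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js(p, q):
    """Jensen-Shannon divergence: average KL to the mixture m = (p + q) / 2."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = [0.9, 0.1]
q = [0.5, 0.5]
print(kl(p, q), kl(q, p))  # different values: KL is asymmetric
print(js(p, q), js(q, p))  # same value: JS is symmetric
```

Unlike KL, the JS divergence is always finite (the mixture $m$ is nonzero wherever $p$ or $q$ is), and its square root is a true metric.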