
Kullback-Leibler Divergence (KL Divergence) is a measure used in statistics and information theory to quantify how one probability distribution diverges from a second, reference probability distribution. It indicates the amount of information lost when one distribution is used to approximate the other.
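For discrete distributions P and Q defined over the same outcomes, the divergence is conventionally written as follows (standard definition; the base of the logarithm sets the unit, bits for base 2 and nats for the natural logarithm):

```latex
D_{\mathrm{KL}}(P \,\|\, Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}
```

The quantity is always non-negative, equals zero exactly when P and Q coincide, and is not symmetric: D_KL(P ∥ Q) generally differs from D_KL(Q ∥ P).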

Applications/Use Cases:

  • Machine Learning: Evaluating how well a model’s predicted probability distribution matches the true distribution of the data (see the sketch after this list).
  • Data Compression: Assessing the efficiency of data encoding schemes.
  • Anomaly Detection: Identifying unusual patterns by comparing observed data distributions to expected ones.
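As a concrete illustration of the machine-learning use case, the sketch below computes the divergence for two small, made-up discrete distributions, both directly from the definition and via scipy.stats.entropy. The distributions and variable names are illustrative assumptions, not part of the glossary entry.

```python
import numpy as np
from scipy.stats import entropy

# Illustrative distributions over the same three outcomes (assumed for this example).
p = np.array([0.5, 0.3, 0.2])  # "true" distribution P
q = np.array([0.4, 0.4, 0.2])  # approximating distribution Q (e.g. a model's prediction)

# Direct evaluation of D_KL(P || Q) = sum_x P(x) * log(P(x) / Q(x)), in nats.
kl_manual = float(np.sum(p * np.log(p / q)))

# scipy.stats.entropy(pk, qk) returns the same KL divergence (natural logarithm).
kl_scipy = entropy(p, q)

print(f"D_KL(P || Q) = {kl_manual:.4f} nats")
print(f"SciPy check  = {kl_scipy:.4f} nats")
```

A larger value means Q is a poorer approximation of P; in anomaly detection, a jump in this value between an observed and an expected distribution is what flags unusual behaviour.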