Chapter 2 Entropy, Relative Entropy & Mutual Information
DEF Entropy of a discrete random variable $X$ with pmf $p(x)$ is defined by $H(X) = -\sum_{x \in \mathcal{X}} p(x) \log p(x)$.
PROP $H(X) \ge 0$, and $H(X) = E_p\!\left[\log \frac{1}{p(X)}\right]$.
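For a concrete feel, here is a minimal Python/NumPy sketch (my own illustration, not from the notes; the function name `entropy` and the example pmfs are assumptions) that evaluates $H(X)$ in bits for a pmf given as an array:

```python
import numpy as np

def entropy(p, base=2):
    """H(X) = -sum p(x) log p(x) for a pmf given as an array of probabilities."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                                 # convention: 0 log 0 = 0
    return -np.sum(p * np.log(p)) / np.log(base)

print(entropy([0.5, 0.5]))   # 1.0 bit for a fair coin
print(entropy([0.9, 0.1]))   # ~0.469 bits for a biased coin
```

A fair coin gives exactly 1 bit while a biased coin gives less, matching the reading of entropy as average uncertainty.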
DEF Joint Entropy of a pair of discrete random variables $(X, Y)$ with a joint distribution $p(x, y)$ is defined as $H(X, Y) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x, y) \log p(x, y)$.
DEF Conditional Entropy is defined as $H(Y \mid X) = \sum_{x \in \mathcal{X}} p(x) H(Y \mid X = x) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x, y) \log p(y \mid x)$.
THEOREM (Chain rule) $H(X, Y) = H(X) + H(Y \mid X)$.
COR $H(X, Y \mid Z) = H(X \mid Z) + H(Y \mid X, Z)$.
PROOF Since $p(x, y) = p(x)\, p(y \mid x)$, we have $\log p(x, y) = \log p(x) + \log p(y \mid x)$; taking the expectation of both sides with respect to $p(x, y)$ gives $H(X, Y) = H(X) + H(Y \mid X)$.
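The chain rule can also be checked numerically. The sketch below is illustrative only: the helper `H` and the example joint pmf `pxy` are hypothetical choices, not taken from the chapter.

```python
import numpy as np

def H(p, base=2):
    """Entropy of a pmf given as an array of probabilities (zeros ignored)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(base)

# Hypothetical joint pmf p(x, y): rows indexed by x, columns by y.
pxy = np.array([[1/8,  1/16, 1/32, 1/32],
                [1/16, 1/8,  1/32, 1/32],
                [1/16, 1/16, 1/16, 1/16],
                [1/4,  0,    0,    0   ]])

px = pxy.sum(axis=1)      # marginal p(x)
H_joint = H(pxy)          # H(X, Y)
H_X = H(px)               # H(X)
# H(Y|X) = sum_x p(x) H(Y | X = x), using the conditional pmfs p(y|x) = p(x,y)/p(x)
H_Y_given_X = sum(px[i] * H(pxy[i] / px[i]) for i in range(len(px)) if px[i] > 0)

print(np.isclose(H_joint, H_X + H_Y_given_X))   # True: H(X,Y) = H(X) + H(Y|X)
```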
DEF Kullback-Leibler Distance / Relative Entropy between two pmfs $p(x)$ and $q(x)$ is defined as $D(p \,\|\, q) = \sum_{x \in \mathcal{X}} p(x) \log \frac{p(x)}{q(x)}$.
PROP $D(p \,\|\, q) \ge 0$, with equality if and only if $p = q$. Note that $D(p \,\|\, q)$ is not symmetric and does not satisfy the triangle inequality, so it is not a true distance.
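A short sketch of relative entropy, again as an assumed illustration (the function name `kl_divergence` and the pmfs `p`, `q` are mine, not from the notes):

```python
import numpy as np

def kl_divergence(p, q, base=2):
    """D(p||q) = sum p(x) log(p(x)/q(x)); assumes q(x) > 0 wherever p(x) > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0                      # convention: 0 log(0/q) = 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask])) / np.log(base)

p = np.array([0.5, 0.5])
q = np.array([0.75, 0.25])
print(kl_divergence(p, q))   # positive
print(kl_divergence(q, p))   # a different positive value, so D is not symmetric
print(kl_divergence(p, p))   # 0.0, equality holds exactly when p = q
```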
DEF Mutual Information is the relative entropy between the joint distribution $p(x, y)$ and the product distribution $p(x)\,p(y)$: $I(X; Y) = D\big(p(x, y) \,\|\, p(x)\,p(y)\big) = \sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x, y) \log \frac{p(x, y)}{p(x)\,p(y)}$.
PROP $I(X; Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X) = H(X) + H(Y) - H(X, Y)$.
PROP $I(X; Y) = I(Y; X)$, and $I(X; X) = H(X)$, so entropy is sometimes called self-information.
PROP $I(X; Y) \ge 0$, with equality if and only if $X$ and $Y$ are independent.
The relations between entropy, joint entropy, conditional entropy, and mutual information can be represented by a Venn diagram.
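These relations are also easy to verify numerically. The sketch below (the helper names and the example joint pmf are assumptions for illustration) computes $I(X; Y)$ directly from the relative-entropy definition and checks it against $H(X) + H(Y) - H(X, Y)$:

```python
import numpy as np

def H(p, base=2):
    """Entropy of a pmf given as an array (zeros ignored)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(base)

def mutual_information(pxy, base=2):
    """I(X;Y) = D(p(x,y) || p(x)p(y)) computed from a joint pmf matrix."""
    pxy = np.asarray(pxy, dtype=float)
    px = pxy.sum(axis=1, keepdims=True)   # marginal p(x), as a column
    py = pxy.sum(axis=0, keepdims=True)   # marginal p(y), as a row
    prod = px * py                        # product distribution p(x)p(y)
    mask = pxy > 0
    return np.sum(pxy[mask] * np.log(pxy[mask] / prod[mask])) / np.log(base)

# Hypothetical joint pmf of two correlated binary variables.
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])

I = mutual_information(pxy)
H_X, H_Y, H_XY = H(pxy.sum(axis=1)), H(pxy.sum(axis=0)), H(pxy)
print(np.isclose(I, H_X + H_Y - H_XY))    # True: I(X;Y) = H(X) + H(Y) - H(X,Y)
```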