Average mutual information
Definition: I(X;Y) = H(X) - H(X|Y)
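As a sketch of the definition, the snippet below computes H(X), H(X|Y), and their difference I(X;Y) for a small hypothetical joint distribution (the joint pmf values are illustrative, not from the notes):

```python
import math

# Hypothetical joint pmf p(x, y) over X in {0, 1}, Y in {0, 1},
# chosen only to illustrate the computation.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def h(probs):
    """Entropy in bits of a collection of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Marginals p(x) and p(y)
p_x = {x: sum(p for (xx, _), p in p_xy.items() if xx == x) for x in (0, 1)}
p_y = {y: sum(p for (_, yy), p in p_xy.items() if yy == y) for y in (0, 1)}

H_X = h(p_x.values())

# H(X|Y) = sum_y p(y) * H(X | Y = y)
H_X_given_Y = sum(
    p_y[y] * h([p_xy[(x, y)] / p_y[y] for x in (0, 1)])
    for y in (0, 1)
)

I_XY = H_X - H_X_given_Y  # I(X;Y) = H(X) - H(X|Y)
```

Here observing Y removes part of the uncertainty about X, so H(X|Y) < H(X) and I(X;Y) is strictly positive.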
Properties
- I(X;Y) >= 0
- With equality iff X and Y are statistically independent
- I(X;Y) <= H(X)
- With equality iff X is a function of Y
- X cannot convey more information about Y than its own entropy H(X)
- I(X;Y) <= H(Y)
- With equality iff Y is a function of X, by symmetry
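The properties above can be checked numerically. The sketch below uses the equivalent double-sum form I(X;Y) = sum over x,y of p(x,y) log2[p(x,y) / (p(x)p(y))] (a standard identity, equal to H(X) - H(X|Y)), and the two joint distributions are hypothetical examples:

```python
import math

def mutual_information(p_xy):
    """I(X;Y) in bits from a joint pmf given as {(x, y): prob}."""
    xs = sorted({x for x, _ in p_xy})
    ys = sorted({y for _, y in p_xy})
    p_x = {x: sum(p_xy.get((x, y), 0.0) for y in ys) for x in xs}
    p_y = {y: sum(p_xy.get((x, y), 0.0) for x in xs) for y in ys}
    # I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) p(y)) )
    return sum(
        p * math.log2(p / (p_x[x] * p_y[y]))
        for (x, y), p in p_xy.items() if p > 0
    )

# Independent X and Y: I(X;Y) should be 0
indep = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}

# X a deterministic function of Y (here X = Y):
# I(X;Y) should equal H(X) = 1 bit
det = {(0, 0): 0.5, (1, 1): 0.5}
```

In the independent case the log argument is 1 everywhere, so every term vanishes; in the deterministic case the bound I(X;Y) <= H(X) is met with equality.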