Entropy
Entropy is defined for a probability distribution p1, p2, ..., pm over a symbol alphabet {a1, a2, ..., am}.
H is the average number of bits required to encode a symbol, given that all we know about the source is the probability distribution of its symbols.
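Concretely, if symbol ai occurs with probability pi, the entropy in bits (hence the base-2 logarithm) is

    H = -\sum_{i=1}^{m} p_i \log_2 p_i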
H is the Shannon lower bound: no uniquely decodable lossless code can achieve fewer than H bits per symbol on average under this source model.
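As a sketch of how H is computed in practice, here is a minimal Python example; the four-symbol distribution below is an arbitrary illustration, not taken from the source:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    # Terms with p = 0 contribute nothing (p * log2(p) -> 0), so skip them.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Example: a skewed four-symbol source.
probs = [0.5, 0.25, 0.125, 0.125]
print(entropy(probs))  # 1.75 bits/symbol
```

For this distribution H = 1.75 bits/symbol, below the 2 bits/symbol a fixed-length code for four symbols would spend; because every probability here is a power of 1/2, a Huffman code attains the bound exactly.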
Stronger source models exploit context: conditioning each symbol's probability on the preceding symbols yields a conditional entropy that is at most H, so a context-aware coder can approach a lower bit rate.
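The standard form of this bound, for a symbol X conditioned on a context C, is

    H(X \mid C) = -\sum_{c} p(c) \sum_{x} p(x \mid c) \log_2 p(x \mid c) \le H(X)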