First-Order Entropy of a String
Suppose we are given a string x1x2...xn over an alphabet {a1, a2, ..., am}, where P(ai) is the probability of symbol ai.
The first-order entropy of x1x2...xn is

H(x1x2...xn) = n * sum_{i=1}^{m} P(ai) * log2(1 / P(ai)).
H(x1x2...xn) is a lower bound on the number of bits needed to code the string x1x2...xn given only the probabilities of the symbols. This is the Shannon lower bound.
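As a small illustration (a sketch, not part of the original notes), the Python snippet below computes this quantity for a string, under the assumption that the probabilities P(ai) are taken to be the empirical symbol frequencies of the string itself; the function name first_order_entropy is only illustrative.

```python
from collections import Counter
from math import log2

def first_order_entropy(s):
    """Return the first-order entropy H(x1...xn) of s, in bits,
    using the empirical symbol frequencies of s as the probabilities P(ai)."""
    n = len(s)
    counts = Counter(s)
    # H = n * sum_i P(ai) * log2(1 / P(ai)), with P(ai) = count(ai) / n
    return n * sum((c / n) * log2(n / c) for c in counts.values())

# Example: in "aab", P(a) = 2/3 and P(b) = 1/3, so
# H = 3 * (2/3 * log2(3/2) + 1/3 * log2(3)) ≈ 2.75 bits.
print(first_order_entropy("aab"))
```

In the example, no symbol-by-symbol code can encode "aab" in fewer than about 2.75 bits on average when only these symbol probabilities are known.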