Including Context
Suppose we add a one-symbol context. That is, in compressing a string x_1 x_2 ... x_n we want to take into account x_{k-1} when encoding x_k.
- New model, so the entropy based on independent symbol probabilities no longer applies. The new entropy model (2nd-order entropy) conditions each symbol's probability on the symbol that precedes it.
- Example: alphabet {a, b, c}
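The difference between the two entropy models can be illustrated on a small alphabet like {a, b, c}. The sketch below (function names are my own, not from the notes) estimates both the independent-symbol entropy and the conditional (2nd-order) entropy from a string's counts:

```python
from collections import Counter
from math import log2

def order0_entropy(s):
    """Bits/symbol when symbols are modeled as independent."""
    n = len(s)
    return -sum((c / n) * log2(c / n) for c in Counter(s).values())

def order1_entropy(s):
    """Bits/symbol when each symbol's probability is conditioned
    on the previous symbol (one-symbol context)."""
    pairs = Counter(zip(s, s[1:]))   # counts of (previous, current) pairs
    prev_counts = Counter(s[:-1])    # counts of each context symbol
    total = len(s) - 1
    h = 0.0
    for (prev, cur), c in pairs.items():
        p_pair = c / total               # P(previous, current)
        p_cond = c / prev_counts[prev]   # P(current | previous)
        h -= p_pair * log2(p_cond)
    return h

s = "abcabcabcabc"
print(order0_entropy(s))  # ≈ log2(3) ≈ 1.585: symbols look uniform in isolation
print(order1_entropy(s))  # 0.0: each symbol is fully determined by the previous one
```

On this string the gap is extreme: each of a, b, c is equally frequent, so the independent model charges about 1.585 bits per symbol, while with one-symbol context the next symbol is always predictable, so the conditional entropy drops to zero.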