Information Theory
Developed by Claude Shannon in the 1940s and 1950s
Attempts to explain the limits of communication using probability theory.
Example: Suppose English text is being sent
- Suppose a “t” is received. Given English, the next symbol being a “z” has very low probability, while the next symbol being an “h” has much higher probability. Because it is less expected, receiving a “z” carries much more information than receiving an “h”: we already knew an “h” was likely.
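The idea above can be made quantitative with self-information, I(x) = -log2 p(x) bits: the rarer the symbol, the more bits of information its arrival conveys. A minimal sketch, where the probabilities for the symbol following a “t” are made-up illustrative values, not measured English statistics:

```python
import math

def self_information(p):
    """Self-information in bits: I(x) = -log2 p(x)."""
    return -math.log2(p)

# Illustrative (assumed) probabilities for the next symbol after a "t"
p_h = 0.30    # "h" after "t" is common in English ("the", "that", ...)
p_z = 0.001   # "z" after "t" is very rare

print(f"I(h) = {self_information(p_h):.2f} bits")  # a likely symbol: few bits
print(f"I(z) = {self_information(p_z):.2f} bits")  # a rare symbol: many bits
```

The low-probability “z” yields roughly 10 bits versus under 2 bits for the “h”, matching the intuition that surprising symbols are the informative ones.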