Comments on the theorem (cont'd)
Regardless of our tolerance for error, in the limit of large N we need H bits per outcome to specify outcomes drawn from the ensemble X
- H is the amount of information in X
- Also called the entropy, self-information, or uncertainty of X
- For large N, (1/N) H_δ(X^N) is approximately a constant function of δ
- This example: H = 0.469 bits
- For any allowed error probability δ between 0% and 100%, the required bits per outcome are asymptotically close to 0.469 bits (see the numerical sketches below)
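A minimal numerical check of the 0.469-bit figure, assuming (as that value suggests) that the running example is a binary ensemble with P(x = 1) = 0.1; the function name binary_entropy is illustrative, not from the lecture:

```python
import math

def binary_entropy(p):
    """Binary entropy H2(p) = -p*log2(p) - (1-p)*log2(1-p), in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Assumed example: a biased binary source with P(x = 1) = 0.1
print(f"H2(0.1) = {binary_entropy(0.1):.3f} bits")  # prints 0.469
```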
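To illustrate the "approximately constant in δ" claim under the same assumed binary source, here is a sketch that builds the smallest δ-sufficient subset directly (outcomes with fewer 1s are more probable, so they are added shell by shell) and reports (1/N) log2 of its size; H_delta_per_outcome is a hypothetical helper, not lecture code. As N grows, the values for different δ narrow toward 0.469 bits, though the convergence is slow:

```python
import math

def H_delta_per_outcome(N, delta, p=0.1):
    """(1/N) * log2 |S_delta| for N i.i.d. bits with P(x = 1) = p < 0.5.

    S_delta is the smallest set of length-N outcomes whose total probability
    is at least 1 - delta, built by adding outcomes in order of decreasing
    probability (i.e. increasing number of 1s, counted shell by shell).
    """
    count, prob = 0, 0.0
    for k in range(N + 1):
        n_k = math.comb(N, k)                # outcomes with exactly k ones
        p_each = p**k * (1 - p)**(N - k)     # probability of each such outcome
        if prob + n_k * p_each >= 1 - delta:
            # only part of this shell is needed to reach probability 1 - delta
            count += math.ceil((1 - delta - prob) / p_each)
            break
        count += n_k
        prob += n_k * p_each
    return math.log2(count) / N

for N in (10, 100, 1000):
    row = [f"d={d}: {H_delta_per_outcome(N, d):.3f}" for d in (0.01, 0.1, 0.5, 0.9)]
    print(f"N={N:4d}  bits/outcome:", ", ".join(row))
```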