Notes on entropy (cont'd)
Entropy is a measure of uncertainty
- Certain events convey zero information
- Likely events convey little information
- Unlikely events convey large amounts of information
Entropy is an average measure of information
What is the information in a specific event?
- Consider a random variable X taking values in {a1, a2, a3, …, aK}
- When event ak occurs, the information conveyed (its self-information) is log[1/P(ak)] = -log P(ak)
- Entropy is the probability-weighted average of these values over all K events
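A minimal sketch tying the two ideas together (the distribution below is made up for illustration; base-2 logarithms give information in bits): each event contributes log[1/P(ak)], and entropy is the probability-weighted average of those contributions.

```python
import math

def self_information(p, base=2):
    """Information conveyed by an event of probability p (bits for base 2)."""
    return math.log(1.0 / p, base)

def entropy(probs, base=2):
    """Entropy H(X): the average self-information over all events of X.
    Zero-probability events contribute nothing to the average."""
    return sum(p * self_information(p, base) for p in probs if p > 0)

# Hypothetical source X = {a1, a2, a3, a4} with these probabilities:
probs = [0.5, 0.25, 0.125, 0.125]
for p in probs:
    print(f"P = {p:<6} -> information = {self_information(p):.3f} bits")
print(f"H(X) = {entropy(probs):.3f} bits")
```

For this example the rare events (P = 0.125) carry 3 bits each, the likely event (P = 0.5) carries only 1 bit, and H(X) = 1.75 bits is their average, matching the intuition in the bullets above.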