Definition of information
The average self-information, or entropy, of a set X is H(X) = -Σ P(x) log2 P(x), where P(x) is the probability of symbol x.
Examples: What is the entropy of the set {0, 1} if
P(0) = P(1) = 0.5?
P(0) = 3/4 and P(1) = 1/4?
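The two examples above can be checked with a short sketch; the function name `entropy` is my own, not from the slides:

```python
from math import log2

def entropy(probs):
    # Average self-information H(X) = -sum of p * log2(p), in bits.
    # Terms with p == 0 contribute nothing and are skipped.
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))    # uniform case -> 1.0 bit
print(entropy([0.75, 0.25]))  # skewed case -> about 0.811 bits
```

The uniform distribution gives exactly 1 bit per symbol (the maximum for two symbols); the skewed 3/4 : 1/4 distribution gives about 0.811 bits, illustrating that less uniform sources carry less information per symbol.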