Huffman Code vs. Entropy
P(a) = .4, P(b) = .1, P(c) = .3, P(d) = .1, P(e) = .1

Entropy
H = -(.4 x log2(.4) + .1 x log2(.1) + .3 x log2(.3)
      + .1 x log2(.1) + .1 x log2(.1))
  ≈ 2.05 bits per symbol

Huffman Code (codeword lengths: a = 1, b = 4, c = 2, d = 3, e = 4)
HC = .4 x 1 + .1 x 4 + .3 x 2 + .1 x 3 + .1 x 4
   = 2.1 bits per symbol
pretty good!
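A minimal sketch of how both numbers can be computed, using Python's heapq to build the Huffman tree (the symbol names and tree representation here are illustrative, not from the slide):

```python
import heapq
from math import log2

# Symbol probabilities from the slide.
probs = {"a": 0.4, "b": 0.1, "c": 0.3, "d": 0.1, "e": 0.1}

# Entropy: H = -sum(p * log2(p)).
entropy = -sum(p * log2(p) for p in probs.values())

# Build a Huffman tree with a min-heap of (probability, tiebreak, subtree).
heap = [(p, i, sym) for i, (sym, p) in enumerate(probs.items())]
heapq.heapify(heap)
count = len(heap)
while len(heap) > 1:
    p1, _, left = heapq.heappop(heap)    # two least-probable subtrees
    p2, _, right = heapq.heappop(heap)
    heapq.heappush(heap, (p1 + p2, count, (left, right)))
    count += 1

# Codeword length of a symbol = its depth in the tree.
def code_lengths(node, depth=0):
    if isinstance(node, str):
        return {node: max(depth, 1)}     # guard for a one-symbol alphabet
    left, right = node
    lengths = code_lengths(left, depth + 1)
    lengths.update(code_lengths(right, depth + 1))
    return lengths

lengths = code_lengths(heap[0][2])
avg_len = sum(probs[s] * lengths[s] for s in probs)

print(f"entropy = {entropy:.2f} bits/symbol")  # ≈ 2.05
print(f"huffman = {avg_len:.2f} bits/symbol")  # = 2.10
```

The average code length comes out at 2.1 bits, within 0.05 bits of the entropy bound, matching the slide's arithmetic.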