Example: Channel capacity (cont’d)
The number of possible message strings is 2^(RT)
Assume no noise (no errors)
T is the time to send the string
The maximum entropy of the source is H0 = log2(2^(RT)) = RT bits
The source rate is (1/T)·H0 = R bits per second
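A quick worked check of the noiseless arithmetic, using hypothetical values R = 1000 bits/s and T = 10 s (illustrative numbers, not part of the example above):

```latex
% Hypothetical values for illustration: R = 1000 bits/s, T = 10 s
\begin{align*}
  \text{possible strings} &= 2^{RT} = 2^{10\,000},\\
  H_0 &= \log_2\!\left(2^{RT}\right) = RT = 10\,000 \text{ bits},\\
  \tfrac{1}{T}\,H_0 &= \tfrac{10\,000\ \text{bits}}{10\ \text{s}} = 1000 \text{ bits/s} = R.
\end{align*}
```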
The entropy of the noise (per transmitted bit), where q is the probability that a bit is received in error, is
Hn = q·log2(1/q) + (1 - q)·log2(1/(1 - q))
The channel capacity is C = R - R·Hn = R(1 - Hn) bits per second (see the numeric sketch below)
For any error probability q > 0, C is strictly less than R (a fixed fraction of R)!
We must therefore add error-correcting code bits to fix the errors in the received message
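A minimal numeric sketch of the capacity formula, assuming q is the per-bit error probability and using a hypothetical transmission rate R = 1000 bits/s (both values are illustrative, not from the slide):

```python
import math

def binary_entropy(q: float) -> float:
    """Hn: entropy in bits of a binary event with probabilities q and 1 - q."""
    if q in (0.0, 1.0):          # the q*log(1/q) terms vanish in the limit
        return 0.0
    return q * math.log2(1.0 / q) + (1.0 - q) * math.log2(1.0 / (1.0 - q))

def capacity(R: float, q: float) -> float:
    """C = R(1 - Hn) in bits per second, for raw rate R and bit-error probability q."""
    return R * (1.0 - binary_entropy(q))

R = 1000.0                       # hypothetical raw rate, bits per second
for q in (0.0, 0.01, 0.1, 0.5):
    print(f"q = {q:4}  Hn = {binary_entropy(q):.4f}  C = {capacity(R, q):7.1f} bits/s")
```

At q = 0.01 roughly 8% of the raw rate is already lost to noise, and at q = 0.5 the capacity drops to zero, which is why the code bits above are needed.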