CSE599 Lecture 3: Information theory

Table of Contents

Information theory and CSE599 
What is “computation”
Information and Complexity
Axioms of probability theory
Results from probability theory
Combinatorics
Four postulates of information theory
Deterministic information Ho
Probabilistic information Hd
Entropy definition of information
Notes on entropy
Notes on entropy (con’t)
Class example
Joint ensembles
Entropies of joint ensembles
Notes on joint entropies
Average mutual information
Random variables: Mean and variance
Shannon’s source-coding theorem
Comments on the theorem
Comments on the theorem (con’t)
Source coding (data compression)
Source-coding definitions
Huffman coding
Constructing a Huffman code
Limitations of Huffman coding
Information channels
Example: Joint ensemble
Example: Channel capacity
Example: Channel capacity (con’t)
How many code bits must we add?
Shannon’s Channel-Coding Theorem
The Channel-Coding Theorem (con’t)
Error-correction codes
Hamming codes
Hamming codes (con’t)
Hamming codes (con’t)
Hamming codes (con’t)
Communication is a discrete, finite process
Kolmogorov-Chaitin complexity
Author: Chris Diorio 

Email: diorio@cs.washington.edu 

Home Page: http://www.cs.washington.edu/education/courses/599/99sp/