Coding Theory: Efficient and Reliable Transfer of Information. Two branches: Information Theory and Algebraic Coding Theory.
[Diagram: Shannon's communication model — Information Source → Encoder → Communication Channel (with Noise) → Decoder → Information Sink]
What is Information?
Information Theory: information is a decrease in uncertainty. But what is uncertainty?
Uncertainty as the number of possibilities: one choice can go 2 ways, another 3 ways; together, 6 ways. Possibilities multiply, but we prefer an additive measure, so we take the logarithm of the number of symbols M.
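The additivity argument above can be checked directly: the log of a product of counts equals the sum of the logs. A minimal sketch (the counts 2, 3, and 6 are from the slide):

```python
import math

# Uncertainty of independent choices should add: 2 ways x 3 ways = 6 ways.
# Taking the logarithm of the number of possibilities gives an additive measure.
u_first = math.log2(2)       # 1 bit
u_second = math.log2(3)      # ~1.585 bits
u_together = math.log2(2 * 3)

print(u_first + u_second)    # same value as...
print(u_together)            # ...the uncertainty of the joint choice
```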
Symbols with unequal probability: a rarer symbol is more surprising, so it should contribute more to the uncertainty.
Average uncertainty per symbol is the Shannon entropy: H = -Σ_i p_i log2(p_i) bits per symbol.
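A short sketch of the entropy formula; the example distributions are illustrative, not from the slides:

```python
import math

def shannon_entropy(probs):
    """H = -sum(p_i * log2(p_i)): average uncertainty in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Equiprobable symbols recover log2(M): 4 symbols -> 2 bits.
print(shannon_entropy([0.25] * 4))                    # 2.0
# Unequal probabilities lower the average uncertainty.
print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))     # 1.75
```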
Mutual Information: the change in uncertainty in one random variable from knowledge of another, I(X;Y) = H(X) - H(X|Y).
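Mutual information can be computed from a joint distribution via the equivalent form I(X;Y) = H(X) + H(Y) - H(X,Y). The joint probabilities below are invented for illustration:

```python
import math

def H(probs):
    """Shannon entropy of a probability distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Example joint distribution p(x, y) for two correlated binary variables
# (the specific numbers are an assumption, chosen for the demo).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions of X and Y.
px = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)}
py = {y: sum(p for (_, b), p in joint.items() if b == y) for y in (0, 1)}

# I(X;Y) = H(X) + H(Y) - H(X,Y): the decrease in uncertainty about X
# obtained from knowledge of Y.
I = H(px.values()) + H(py.values()) - H(joint.values())
print(I)  # positive, since X and Y are correlated
```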
Molecular Information Theory applied to DNA sequences: a pattern of information, or label, tells the ribosome to start translation at a given site.
Channel Capacity: the maximum rate at which information can be sent and recovered with vanishing probability of error (the Channel Coding Theorem).
Source Coding: information cannot be compressed below the entropy of the source. Optimal prefix-free codes are used for representation; their average code length is greater than or equal to the source entropy.
Rate Distortion: how many bits of information per symbol are needed to encode a source if a certain distortion (error) is allowed?
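For a Bernoulli(p) source under Hamming distortion, the rate-distortion function has the closed form R(D) = H(p) - H(D) for 0 ≤ D ≤ min(p, 1-p) (an assumed example; the slide does not fix a source model):

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def rate_distortion(p, D):
    """R(D) = H(p) - H(D) for a Bernoulli(p) source with Hamming distortion,
    valid for 0 <= D <= min(p, 1-p); beyond that, zero rate suffices."""
    if D >= min(p, 1 - p):
        return 0.0
    return binary_entropy(p) - binary_entropy(D)

# A fair binary source: tolerating more error lowers the required rate.
for D in (0.0, 0.1, 0.25, 0.5):
    print(D, rate_distortion(0.5, D))
```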
Algorithmic Information Theory (Kolmogorov): the descriptive complexity of an object is the length of the shortest computer program that describes the object. If that length equals the length of the object itself, the object is random. Pi, for example, has infinitely many digits but a short generating program, so it is not random.
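Kolmogorov complexity is uncomputable, but a general-purpose compressor gives a crude upper bound on descriptive complexity, which illustrates the regular-versus-random distinction (the test strings here are my own choices):

```python
import random
import zlib

random.seed(0)
structured = b"01" * 500                                        # highly regular
scrambled = bytes(random.randrange(256) for _ in range(1000))   # pseudo-random

# A compressed length near the original length suggests the string is random
# in Kolmogorov's sense; a short compressed form is a short description.
print(len(zlib.compress(structured)))  # far below 1000 bytes
print(len(zlib.compress(scrambled)))   # close to (or above) 1000 bytes
```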