1 Claude Shannon – In Memoriam
Jean Walrand, U.C. Berkeley
www.eecs.berkeley.edu/~wlr

2 Outline
Claude Shannon
Entropy
Source Coding
Channel Coding
Separation Theorem

3 Claude Shannon
4/30/1916 – 2/24/2001
1937: Boolean algebra → logical circuits
1948: A Mathematical Theory of Communication

4 Entropy
How much information is required to convey the value of a random variable?
Key insight: the quantity of information is related to the uncertainty of that value.
Example 1: one coin flip = 1 bit of information
Example 2: two coin flips = 2 bits
Example 3: N equally likely values = log2(N) bits
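These bit counts are just base-2 logarithms; a one-line check in Python (illustrative only, not part of the talk):

```python
import math

# Bits needed to name one of N equally likely values: log2(N).
for n_values in (2, 4, 1024):
    print(n_values, "values ->", math.log2(n_values), "bits")
# 2 values -> 1.0 bits (one coin flip); 4 -> 2.0; 1024 -> 10.0
```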

5 Source Coding
How do we encode the values so as to convey them with the minimum number of bits?
Key idea: look at a sequence of outcomes X(1), X(2), …, X(n), where each X(m) is in {1, 2, …, K}.
For n large, there are only about 2^(nH) equally likely sequences, where H is at most log2(K).
In fact, H = -Σ_i p_i log2(p_i).
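In code, the entropy formula reads as follows (a minimal sketch; the helper is illustrative, not from the talk):

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p_i * log2(p_i)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform K-ary source attains the maximum, log2(K) bits:
print(entropy([0.25] * 4))             # 2.0 (= log2(4))
# A skewed source carries less information per symbol:
print(entropy([0.7, 0.1, 0.1, 0.1]))   # ~1.357 < 2.0
```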

6 Source Coding (cont'd)
Example: P(1) = p = 1 – P(0)
H = –p log2(p) – (1 – p) log2(1 – p)
[Plot: the binary entropy H as a function of p, rising from 0 at p = 0 to its maximum of 1 at p = 0.5 and falling back to 0 at p = 1]
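The plotted curve can be tabulated directly (a small sketch reproducing the values on the slide):

```python
import math

def binary_entropy(p):
    """H(p) = -p log2(p) - (1-p) log2(1-p); H(0) = H(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.0, 0.1, 0.3, 0.5, 0.7, 0.9, 1.0):
    print(f"H({p}) = {binary_entropy(p):.3f}")
# The values peak at H(0.5) = 1.000 and fall symmetrically to 0 at p = 0 and 1.
```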

7 Source Coding (cont'd)
Thus, for large n:
2^n possible n-bit words, but only 2^(nH) equally likely ("typical") n-bit words.
Consequently, about nH bits suffice to index the typical sequences.
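That only about 2^(nH) sequences carry essentially all the probability (the asymptotic equipartition property) can be checked empirically; a minimal sketch with an assumed p = 0.2:

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 0.2, 10_000
H = -p * np.log2(p) - (1 - p) * np.log2(1 - p)   # ~0.722 bits/symbol

x = rng.random((5, n)) < p                 # five random source sequences
k = x.sum(axis=1)                          # number of ones in each
log2_prob = k * np.log2(p) + (n - k) * np.log2(1 - p)

print("H =", round(H, 3))
print("-log2 P(x)/n =", np.round(-log2_prob / n, 3))
# Each sequence has probability close to 2^(-nH): it is "typical".
```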

8 Channel Capacity
Question: How fast can one transmit bits reliably through a noisy channel?
Naïve answer: no reliable transmission is possible.
Shannon's formulation: What rate is achievable in the long run if one wants the bit error rate to be arbitrarily small?
Shannon's answer: the channel capacity C.

9 Channel Capacity (cont'd)
Example: the binary symmetric channel.
[Diagram: each sent bit (0 or 1) is received correctly with probability 1 – p and flipped with probability p]
[Plot: the capacity C as a function of p, equal to 1 at p = 0 and p = 1 and dropping to 0 at p = 0.5]
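Consistent with the counting argument on the next slide, the plotted capacity is C(p) = 1 – H(p); a small sketch tabulating it:

```python
import math

def bsc_capacity(p):
    """Capacity (bits per channel use) of a binary symmetric channel."""
    if p in (0.0, 1.0):
        return 1.0   # a perfectly reliable (or perfectly inverted) channel
    return 1 + p * math.log2(p) + (1 - p) * math.log2(1 - p)

for p in (0.0, 0.05, 0.11, 0.5, 0.89, 1.0):
    print(f"C({p}) = {bsc_capacity(p):.3f}")
# C(0.5) = 0: a channel that flips fair coins carries no information.
```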

10 Channel Capacity (cont'd)
Justification:
Choose 2^(nC) n-bit codewords at random (fair coin flips).
There are 2^n equally likely n-bit words in total.
For each codeword sent, the noise yields about 2^(nH) equally likely received n-bit words.
=> 2^n / 2^(nH) = 2^(n(1 – H)) = 2^(nC) distinguishable codewords.
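This random-coding argument can be imitated on a toy scale (block length, crossover probability, and rate below are assumptions chosen for illustration; the guarantee itself is asymptotic in n):

```python
import numpy as np

rng = np.random.default_rng(0)

n = 15      # block length (small, for illustration only)
p = 0.11    # BSC crossover probability; C = 1 - H(p) ~ 0.5
R = 0.3     # code rate, chosen below capacity
M = 2 ** int(n * R)                          # number of codewords, 2^(nR)

codebook = rng.integers(0, 2, size=(M, n))   # random codewords: fair coin flips

trials, errors = 2000, 0
for _ in range(trials):
    i = rng.integers(M)                        # pick a codeword to send
    noise = (rng.random(n) < p).astype(int)    # channel flips each bit w.p. p
    received = codebook[i] ^ noise
    dists = (codebook ^ received).sum(axis=1)  # Hamming distance to each codeword
    if dists.argmin() != i:                    # decode to the nearest codeword
        errors += 1

print(f"R = {R} < C ~ 0.5; block error rate ~ {errors / trials:.3f}")
# For R < C, the error rate can be driven to 0 by increasing n (Shannon's theorem).
```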

11 Separation Theorem
Source with entropy H bits per symbol; channel with capacity C bits per second.
One can send C/H symbols per second:
First, source-code the symbols (n symbols => nH bits).
Then, channel-code the bits (send them using suitable codewords).
Hence: source coding and channel coding can be done separately!
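As a back-of-the-envelope check of the C/H rate (numeric values assumed for illustration):

```python
H = 0.5          # source entropy, bits per symbol (assumed)
C = 1_000_000    # channel capacity, bits per second (assumed)

# n symbols -> nH bits after source coding -> nH / C seconds on the channel,
# so the sustainable rate is C / H symbols per second.
print(f"{C / H:,.0f} symbols per second")   # 2,000,000
```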

