UCB Claude Shannon – In Memoriam. Jean Walrand, U.C. Berkeley. www.eecs.berkeley.edu/~wlr

UCB Outline
- Claude Shannon
- Entropy
- Source Coding
- Channel Coding
- Separation Theorem

UCB Claude Shannon
4/30/1916 – 2/24/2001
1937: Boolean algebra → logical circuits
1948: A Mathematical Theory of Communication

UCB Entropy
How much information is required to convey the value of a random variable?
Key insight: the quantity of information is related to the uncertainty of that value.
Example 1: one fair coin flip = 1 bit of information.
Example 2: two fair coin flips = 2 bits.
Example 3: N equally likely values = $\log_2(N)$ bits.
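
A minimal Python sketch of these three examples (my addition, not part of the original slides):

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits; zero-probability outcomes are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))                 # one fair coin flip -> 1.0 bit
print(entropy([0.25] * 4))                 # two fair coin flips -> 2.0 bits
N = 8
print(entropy([1 / N] * N), math.log2(N))  # N equally likely values -> log2(N) = 3.0 bits
```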

UCB Source Coding
How do we encode the values so as to convey them with the minimum number of bits?
Key idea: look at a sequence of outcomes X(1), X(2), …, X(n), where each X(m) is in {1, 2, …, K}.
For n large, there are essentially only $2^{nH}$ (roughly) equally likely sequences, where H is at most $\log_2 K$.
In fact, $H = -\sum_i p_i \log_2(p_i)$.
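
To make the $2^{nH}$ typical-sequence count concrete, a hedged numerical sketch (my addition): for a biased coin with P(1) = p, the number of length-n strings whose count of ones is near pn grows like $2^{nH}$.

```python
import math

n, p = 100, 0.2
H = -p * math.log2(p) - (1 - p) * math.log2(1 - p)  # entropy of one coin flip, in bits

k = round(p * n)                   # "typical" strings contain about p*n ones
typical = math.comb(n, k)          # number of n-bit strings with exactly k ones

print(math.log2(typical))          # ~ 68.9, close to n*H up to lower-order terms
print(n * H)                       # ~ 72.2 = nH
print(n)                           # log2 of the total 2^n strings: 100
```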

UCB Source Coding (c’d)
Example: a binary source with P(1) = p = 1 − P(0).
Then $H(p) = -p \log_2 p - (1 - p) \log_2(1 - p)$.
[Figure: plot of H against p, rising from 0 at p = 0 to a peak of 1 bit at p = 1/2 and back to 0 at p = 1.]
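
To see the shape of the plotted curve, a small tabulation (my addition):

```python
import math

def binary_entropy(p):
    """H(p) = -p log2 p - (1 - p) log2 (1 - p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in [0.0, 0.1, 0.25, 0.5, 0.75, 0.9, 1.0]:
    print(f"p = {p:4.2f}   H(p) = {binary_entropy(p):.3f} bits")
# The curve rises from 0, peaks at 1 bit at p = 0.5, and falls back to 0 at p = 1.
```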

UCB Source Coding (c’d)
Thus, for large n, among the $2^n$ possible n-bit words, only about $2^{nH}$ are (roughly) equally likely, so about nH bits suffice to encode n source symbols.

UCB Channel Capacity
Question: how fast can one transmit bits reliably through a noisy channel?
Naïve answer: no reliable transmission is possible.
Shannon’s formulation: what rate is achievable in the long run if one wants the bit error rate to be arbitrarily small?
Shannon’s answer: the channel capacity C.

UCB Channel Capacity (c’d)
Example: the binary symmetric channel. A sent bit is flipped with probability p and delivered intact with probability 1 − p.
[Figure: the sent/received transition diagram and the plot of the capacity C against p; by the counting argument on the next slide, $C = 1 - H(p)$.]
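
A small sketch (my addition, using the formula C = 1 − H(p) implied by the counting argument on the next slide):

```python
import math

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p: C = 1 - H(p)."""
    if p in (0.0, 1.0):
        return 1.0  # a deterministic channel is noiseless (just invert the output if p = 1)
    return 1.0 + p * math.log2(p) + (1 - p) * math.log2(1 - p)

for p in [0.0, 0.01, 0.1, 0.5]:
    print(f"p = {p:4.2f}   C = {bsc_capacity(p):.3f} bits per use")
# C = 1 for a noiseless channel and drops to 0 at p = 0.5 (pure noise).
```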

UCB Channel Capacity (c’d)
Justification: choose $2^{nK}$ n-bit codewords at random (fair coin flips).
There are $2^n$ equally likely n-bit words in total, and for each codeword sent, the channel noise makes about $2^{nH}$ received words equally likely.
=> at most $2^n / 2^{nH} = 2^{n(1-H)} = 2^{nC}$ distinguishable codewords.
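
Putting illustrative numbers on this counting argument (my addition; n and p are arbitrary choices):

```python
import math

n, p = 1000, 0.1                                    # block length; BSC crossover probability
H = -p * math.log2(p) - (1 - p) * math.log2(1 - p)  # noise entropy per channel use
C = 1 - H                                           # resulting capacity

print(f"total n-bit words:          2^{n}")
print(f"noise ball per codeword:    2^{n * H:.0f}")  # about 2^(nH) likely received words
print(f"distinguishable codewords:  2^{n * C:.0f}")  # 2^n / 2^(nH) = 2^(nC)
print(f"=> up to about {C:.3f} reliable bits per channel use")
```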

UCB Separation Theorem
Given a source with entropy H bits per symbol and a channel with capacity C bits per second, one can send C/H symbols per second.
First code the source (n symbols => about nH bits), then code the channel (send those bits using suitable codewords).
Hence: separate source and channel coding!
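
A worked example with hypothetical numbers (my addition; the source distribution and channel figures are made up for illustration):

```python
import math

# A four-symbol source and a channel running 500,000 uses/s over a BSC with p = 0.1.
probs = [0.5, 0.25, 0.125, 0.125]          # source symbol probabilities
H = -sum(q * math.log2(q) for q in probs)  # source entropy: 1.75 bits/symbol

p = 0.1
C = 500_000 * (1 + p * math.log2(p) + (1 - p) * math.log2(1 - p))  # capacity, bits/s

print(f"H = {H:.2f} bits/symbol, C = {C:,.0f} bits/s")
print(f"=> about C/H = {C / H:,.0f} symbols per second end to end")
```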