The Binary Symmetric Channel (BSC) is an idealised model of a noisy channel. It is symmetric in the sense that the two crossover probabilities are equal: p(0→1) = p(1→0).

Binary Symmetric Channel (BSC). The transmitted source symbols are Xi, i = 0, 1, and the received symbols are Yj, j = 0, 1. Given the source probabilities P(Xi), the forward transition probabilities of the channel are

P(Yj|Xi) = Pe if i ≠ j;    P(Yj|Xi) = 1 − Pe if i = j,

where Pe is the given error rate.

[Figure: forward transition probabilities of the noisy binary symmetric channel, showing inputs X0, X1 with probabilities P(X0), P(X1), outputs Y0, Y1 with probabilities P(Y0), P(Y1), and the four branch probabilities P(Yj|Xi).]
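As a concrete illustration (not from the slides), the forward transition probabilities can be collected into a 2×2 matrix; the error rate Pe = 0.1 below is an assumption chosen purely for the example.

```python
import numpy as np

def bsc_transition_matrix(pe):
    """Rows are inputs X0, X1; columns are outputs Y0, Y1.
    Entry [i, j] is P(Yj|Xi): 1 - pe on the diagonal, pe off it."""
    return np.array([[1 - pe, pe],
                     [pe, 1 - pe]])

# Assumed example error rate of 0.1
p_y_given_x = bsc_transition_matrix(0.1)
print(p_y_given_x)               # [[0.9 0.1]
                                 #  [0.1 0.9]]
print(p_y_given_x.sum(axis=1))   # each row sums to 1, as a transition matrix must
```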

Calculate the average mutual information: the average amount of source information acquired per received symbol, as distinguished from the amount per source symbol, which is given by the entropy H(X).

Step 1: P(Yj, Xi) = P(Xi) P(Yj|Xi);  i = 0, 1; j = 0, 1.

Step 2: P(Yj) = ΣX P(Yj, Xi);  j = 0, 1.

(The probability of receiving a particular Yj is the sum of the joint probabilities over all Xi. For example, the probability of receiving a 1 is the probability of sending 1 and receiving 1 plus the probability of sending 0 and receiving 1; that is, the probability of sending 1 and receiving it correctly plus the probability of sending 0 and receiving it wrongly.)

Step 3: I(Xi, Yj) = log{P(Xi|Yj)/P(Xi)} = log{P(Yj, Xi)/[P(Xi)P(Yj)]};  i = 0, 1; j = 0, 1.

(This quantifies the amount of information conveyed when Xi is transmitted and Yj is received. Over a perfect noiseless channel it equals the self-information of Xi, because each received symbol uniquely identifies a transmitted symbol, so P(Xi|Yj) = 1. If the channel is very noisy, or communication breaks down entirely so that P(Yj, Xi) = P(Xi)P(Yj), it is zero: no information has been transferred.)

Step 4: I(X, Y) = I(Y, X) = ΣX ΣY P(Yj, Xi) log{P(Xi|Yj)/P(Xi)} = ΣX ΣY P(Yj, Xi) log{P(Yj, Xi)/[P(Xi)P(Yj)]}.
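To make Steps 1-4 concrete, here is a minimal Python sketch (not part of the original slides) that carries them out for a BSC. The equiprobable source and error rate Pe = 1/8 are assumptions chosen to match the example further below.

```python
import numpy as np

def average_mutual_information(p_x, p_y_given_x):
    """Steps 1-4: joint distribution, output distribution, and I(X;Y) in bits."""
    # Step 1: joint probabilities P(Yj, Xi) = P(Xi) * P(Yj|Xi)
    p_xy = p_x[:, None] * p_y_given_x
    # Step 2: output probabilities P(Yj) = sum over Xi of P(Yj, Xi)
    p_y = p_xy.sum(axis=0)
    # Steps 3-4: average of log2{ P(Yj, Xi) / [P(Xi) P(Yj)] } weighted by P(Yj, Xi)
    mask = p_xy > 0                           # skip impossible pairs to avoid log(0)
    ratio = p_xy[mask] / np.outer(p_x, p_y)[mask]
    return float(np.sum(p_xy[mask] * np.log2(ratio)))

# Assumed example: equiprobable source, error rate 1/8
p_x = np.array([0.5, 0.5])
pe = 1 / 8
p_y_given_x = np.array([[1 - pe, pe],
                        [pe, 1 - pe]])
print(round(average_mutual_information(p_x, p_y_given_x), 3))  # ~0.456 bits/symbol
```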

Equivocation represents the destructive effect of noise, or equivalently the additional information needed to make the reception correct. Imagine a hypothetical observer who sees both the transmitted digit x and the received digit y, and reports to the receiver over a separate noiseless channel: if the two digits are the same the observer sends a '1', if they differ a '0'.

[Figure: transmitter, noisy channel and receiver carrying x to y, with a hypothetical observer watching both digits and feeding corrections to the receiver over a noiseless channel.]

The information sent by the observer is easily evaluated as −[p(0) log p(0) + p(1) log p(1)], applied to the binary string of the observer's reports. The probability of a '0' is just the channel error probability.

Example: a binary system produces Marks and Spaces with equal probabilities, and 1/8 of all pulses are received in error. Find the information sent by the observer.

The information sent by the observer is −[(7/8) log(7/8) + (1/8) log(1/8)] = 0.54 bits. Since the input information is 1 bit/symbol, the net information transferred is 1 − 0.54 = 0.46 bits, agreeing with the previous results.
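A quick numerical check of this example (an illustrative sketch, not from the slides), using the binary entropy of the observer's report string:

```python
from math import log2

def binary_entropy(p):
    """Entropy in bits of a binary source with symbol probabilities p and 1 - p."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * log2(p) + (1 - p) * log2(1 - p))

error_rate = 1 / 8                       # 1/8 of pulses received in error
observer_info = binary_entropy(error_rate)
print(round(observer_info, 2))           # 0.54 bits sent by the observer
print(round(1 - observer_info, 2))       # 0.46 bits of net information transferred
```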

Equivalently, the noise in the system has destroyed about 0.54 bits of information per symbol; this destroyed information is the equivocation.

General expression for equivocation. Consider a specific pair of transmitted and received digits {x, y}:

Noisy channel: probability change p(x) → p(x|y).
Receiver (with the observer's help): probability correction p(x|y) → 1.

The information provided by the observer for this pair is −log p(x|y). Averaging over all pairs, weighted by the probability p(x, y) of a given pair, gives the general expression for the equivocation:

H(x|y) = −Σx Σy p(x, y) log p(x|y).

The information transferred via the noisy channel (in the absence of the observer) is the information in the noiseless system (the source entropy) minus the information loss due to noise (the equivocation):

I(x, y) = H(x) − H(x|y).

Example: a binary system produces Marks with probability 0.7 and Spaces with probability 0.3; 2/7 of the Marks are received in error, as are 1/3 of the Spaces. Find the information transfer using the expression for equivocation.

P(x):    P(x=M) = 0.7,  P(x=S) = 0.3
P(y):    P(y=M) = 0.6,  P(y=S) = 0.4
P(y|x):  P(y=M|x=M) = 5/7,  P(y=S|x=M) = 2/7,  P(y=M|x=S) = 1/3,  P(y=S|x=S) = 2/3
P(x|y):  P(x=M|y=M) = 5/6,  P(x=S|y=M) = 1/6,  P(x=M|y=S) = 1/2,  P(x=S|y=S) = 1/2
P(x,y):  P(M,M) = 0.5,  P(M,S) = 0.2,  P(S,M) = 0.1,  P(S,S) = 0.2
I(x,y) terms P(x,y) log[P(x|y)/P(x)]:  (M,M) 0.126,  (M,S) −0.097,  (S,M) −0.085,  (S,S) 0.147

Summing the four terms gives I(x, y) ≈ 0.09 bits per symbol.
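A numerical check of this example (an illustrative sketch, not part of the slides), computing the result both as H(x) − H(x|y) and as the direct sum of the I(x, y) terms:

```python
from math import log2

# Joint probabilities P(x, y) for the Marks/Spaces example, keyed by (x, y)
p_xy = {('M', 'M'): 0.5, ('M', 'S'): 0.2, ('S', 'M'): 0.1, ('S', 'S'): 0.2}
p_x = {'M': 0.7, 'S': 0.3}
p_y = {'M': 0.6, 'S': 0.4}

# Source entropy H(x)
h_x = -sum(p * log2(p) for p in p_x.values())

# Equivocation H(x|y) = -sum P(x,y) log P(x|y), where P(x|y) = P(x,y)/P(y)
h_x_given_y = -sum(p * log2(p / p_y[y]) for (x, y), p in p_xy.items())

# Mutual information two ways: via equivocation, and via the I(x,y) terms directly
i_xy = sum(p * log2(p / (p_x[x] * p_y[y])) for (x, y), p in p_xy.items())

print(round(h_x, 3), round(h_x_given_y, 3))          # ~0.881 and ~0.790 bits
print(round(h_x - h_x_given_y, 3), round(i_xy, 3))   # both ~0.091 bits
```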

Summary of the basic formulae by Venn diagram.

[Figure: two overlapping circles H(x) and H(y); the overlap is the mutual information I(x, y), the part of H(x) outside the overlap is the equivocation H(x|y), and the part of H(y) outside the overlap is the error entropy H(y|x).]

Quantity and definition (reconstructed from the preceding slides):

Source information:          I(xi) = −log P(xi)
Received information:        I(yj) = −log P(yj)
Mutual information:          I(xi, yj) = log[P(xi|yj)/P(xi)]
Average mutual information:  I(x, y) = Σx Σy P(xi, yj) log[P(xi|yj)/P(xi)]
Source entropy:              H(x) = −Σx P(xi) log P(xi)
Destination entropy:         H(y) = −Σy P(yj) log P(yj)
Equivocation:                H(x|y) = −Σx Σy P(xi, yj) log P(xi|yj)
Error entropy:               H(y|x) = −Σx Σy P(xi, yj) log P(yj|xi)

Channel capacity: C = max I(x, y), the maximum information transfer, where the maximum is taken over the source (input) probabilities.

Binary Symmetric Channel. When the noise in the system is random, the probabilities of error in '0' and '1' are the same, so the channel is characterised by a single binary error probability p. The channel capacity of this channel is

C = 1 + p log p + (1 − p) log(1 − p) = 1 − H(p) bits/symbol (logs to base 2),

achieved when the two input symbols are equally likely.
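A brief sketch (illustrative, not from the slides) evaluating this capacity formula for a few assumed error probabilities:

```python
from math import log2

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with error probability p, in bits/symbol."""
    if p in (0.0, 1.0):
        return 1.0   # a deterministic channel (even one that always flips) carries 1 bit/symbol
    return 1 + p * log2(p) + (1 - p) * log2(1 - p)

for p in (0.0, 0.01, 0.125, 0.5):
    print(p, round(bsc_capacity(p), 3))
# Capacity falls from 1 bit/symbol at p = 0 to 0 at p = 0.5:
# mutual information increases as the error rate decreases.
```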

Channel capacity of the BSC channel: mutual information increases as the error rate decreases.

[Figure: BSC transition diagram with transmitted symbols x = 0, 1 having probabilities p(0) and p(1) = 1 − p(0), connected to received symbols y = 0, 1.]