
1

2. A discrete memoryless channel has a finite input alphabet X = {x_0, x_1, ..., x_{J-1}} and a finite output alphabet Y = {y_0, y_1, ..., y_{K-1}}. The current output symbol y_k depends only on the current input symbol x_j.

3. Noiseless binary channel (transmitted symbols x_0 = 0, x_1 = 1; received symbols y_0 = 0, y_1 = 1):
P(y_0 | x_0) = P(0|0) = 1, P(y_1 | x_1) = P(1|1) = 1, P(y_0 | x_1) = P(0|1) = 0, P(y_1 | x_0) = P(1|0) = 0.

4. The conditional probability P(y_k | x_j) is the probability of receiving symbol y_k given that symbol x_j was transmitted. Example, for the noiseless channel:
- The probability of receiving a 0 given that a 0 was transmitted: P(0|0) = 1
- The probability of receiving a 0 given that a 1 was transmitted: P(0|1) = 0
- The probability of receiving a 1 given that a 0 was transmitted: P(1|0) = 0
- The probability of receiving a 1 given that a 1 was transmitted: P(1|1) = 1

5. Binary symmetric channel (transmitted symbols x_0 = 0, x_1 = 1; received symbols y_0 = 0, y_1 = 1):
P(y_0 | x_0) = P(0|0) = 1 - P_e, P(y_1 | x_1) = P(1|1) = 1 - P_e, P(y_0 | x_1) = P(0|1) = P_e, P(y_1 | x_0) = P(1|0) = P_e.

6. The same channel diagram, read two ways. For a fixed input x_j, the arrows leaving it cover every possible output and their probabilities sum to 1 (Σ_k P(y_k | x_j) = 1); for a fixed output y_k, the arrows arriving at it come from every possible input. The transition probabilities are unchanged: P(0|0) = P(1|1) = 1 - P_e and P(0|1) = P(1|0) = P_e.
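A minimal sketch (variable names and the value of P_e are illustrative, not from the slides): the channel can be stored as a transition matrix with one row per input and one column per output, and the row sums check the "fixed input" statement above.

```python
import numpy as np

# Illustrative crossover probability (not a value from the slides)
Pe = 0.1

# Transition matrix of the binary symmetric channel:
# rows are indexed by the input x_j, columns by the output y_k,
# so P[j, k] = P(y_k | x_j).
P = np.array([[1 - Pe, Pe],
              [Pe,     1 - Pe]])

# For every fixed input, the probabilities over all outputs sum to 1.
assert np.allclose(P.sum(axis=1), 1.0)
print(P)
```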

7. The same two views for the general channel with input alphabet X = {x_0, x_1, ..., x_{J-1}} and output alphabet Y = {y_0, y_1, ..., y_{K-1}}: a fixed input x_j can produce any of the K outputs, and a fixed output y_k can have been produced by any of the J inputs.

8. Source probability: the probability of each symbol emitted by the source at the transmitter side, P(x_j) = P(X = x_j). Transition probability: the probability of receiving symbol y_k given that symbol x_j was transmitted, P(y_k | x_j) = P(Y = y_k | X = x_j).

9. Joint probability: the probability of sending symbol x_j and receiving symbol y_k,
P(x_j, y_k) = P(X = x_j, Y = y_k) = P(Y = y_k | X = x_j) P(X = x_j) = P(y_k | x_j) P(x_j).

10. Output probability: the probability of receiving symbol y_k,
P(y_k) = P(Y = y_k) = Σ_j P(x_j, y_k) = Σ_{j=0}^{J-1} P(y_k | x_j) P(x_j).
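Continuing the sketch with an assumed source distribution, slide 9's joint probabilities and slide 10's output probabilities become a couple of array operations:

```python
import numpy as np

Pe = 0.1                                   # illustrative crossover probability
P = np.array([[1 - Pe, Pe],
              [Pe,     1 - Pe]])           # P[j, k] = P(y_k | x_j)
p_x = np.array([0.7, 0.3])                 # illustrative source probabilities P(x_j)

# Joint probabilities P(x_j, y_k) = P(y_k | x_j) * P(x_j)   (slide 9)
p_xy = p_x[:, None] * P

# Output probabilities P(y_k) = sum_j P(x_j, y_k)           (slide 10)
p_y = p_xy.sum(axis=0)
print(p_y)                                 # [0.66, 0.34] for these numbers
```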

11. Bayes' rule: since P(x_j, y_k) = P(y_k, x_j), we have P(y_k | x_j) P(x_j) = P(x_j | y_k) P(y_k), so
P(x_j | y_k) = P(y_k | x_j) P(x_j) / P(y_k) = P(y_k | x_j) P(x_j) / Σ_j P(y_k | x_j) P(x_j).
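The same illustrative numbers give the posterior probabilities of slide 11; every column (a fixed received symbol) sums to 1:

```python
import numpy as np

Pe = 0.1
P = np.array([[1 - Pe, Pe],
              [Pe,     1 - Pe]])           # P[j, k] = P(y_k | x_j)
p_x = np.array([0.7, 0.3])                 # P(x_j)

p_xy = p_x[:, None] * P                    # P(x_j, y_k)
p_y = p_xy.sum(axis=0)                     # P(y_k)

# Bayes' rule: P(x_j | y_k) = P(x_j, y_k) / P(y_k)
p_x_given_y = p_xy / p_y[None, :]

# Each column (a fixed received symbol y_k) is a distribution over the inputs.
assert np.allclose(p_x_given_y.sum(axis=0), 1.0)
print(p_x_given_y)
```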

12. Applying these formulas to the binary symmetric channel (P(0|0) = P(1|1) = 1 - P_e, P(0|1) = P(1|0) = P_e):
P(y_0) = (1 - P_e) P(x_0) + P_e P(x_1) and P(y_1) = P_e P(x_0) + (1 - P_e) P(x_1).

13. Binary erasure channel (transmitted symbols x_0 = 0, x_1 = 1; received symbols y_0 = 0, y_1 = 1, y_2 = e):
P(y_0 | x_0) = P(0|0) = 1 - q, P(y_2 | x_0) = P(e|0) = q, P(y_1 | x_1) = P(1|1) = 1 - q, P(y_2 | x_1) = P(e|1) = q.
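A sketch of the erasure channel as a 2x3 transition matrix (the erasure probability q = 0.2 is an assumed example value):

```python
import numpy as np

q = 0.2  # illustrative erasure probability

# Binary erasure channel: rows = inputs (0, 1), columns = outputs (0, 1, e),
# so P_bec[j, k] = P(y_k | x_j).
P_bec = np.array([[1 - q, 0.0,   q],
                  [0.0,   1 - q, q]])

assert np.allclose(P_bec.sum(axis=1), 1.0)
print(P_bec)
```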

14. Source entropy: the average information transmitted over the channel per symbol,
H(X) = -Σ_{j=0}^{J-1} P(x_j) log_2 P(x_j).
Conditional entropy for a given received symbol: the average information lost due to the channel per symbol, given that a particular symbol y_k is received,
H(X | y_k) = -Σ_{j=0}^{J-1} P(x_j | y_k) log_2 P(x_j | y_k).

15. Equivocation of X with respect to Y: the mean of H(X | y_k) over all the received symbols,
H(X | Y) = Σ_{k=0}^{K-1} P(y_k) H(X | y_k) = -Σ_k Σ_j P(y_k) P(x_j | y_k) log_2 P(x_j | y_k),
or, since P(y_k) P(x_j | y_k) = P(x_j, y_k),
H(X | Y) = -Σ_k Σ_j P(x_j, y_k) log_2 P(x_j | y_k).

16. Equivocation of Y with respect to X: similarly,
H(Y | x_j) = -Σ_{k=0}^{K-1} P(y_k | x_j) log_2 P(y_k | x_j),
H(Y | X) = Σ_{j=0}^{J-1} P(x_j) H(Y | x_j) = -Σ_j Σ_k P(y_k, x_j) log_2 P(y_k | x_j).
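A sketch computing the entropy and equivocation formulas of slides 14-16 for the illustrative BSC and source distribution used earlier (the probabilities are strictly positive here, so zero terms need no special handling):

```python
import numpy as np

Pe = 0.1                                   # illustrative crossover probability
P = np.array([[1 - Pe, Pe],
              [Pe,     1 - Pe]])           # P(y_k | x_j)
p_x = np.array([0.7, 0.3])                 # illustrative P(x_j)

p_xy = p_x[:, None] * P                    # P(x_j, y_k)
p_y = p_xy.sum(axis=0)                     # P(y_k)
p_x_given_y = p_xy / p_y[None, :]          # P(x_j | y_k)

# Source entropy H(X) = -sum_j P(x_j) log2 P(x_j)          (slide 14)
H_X = -np.sum(p_x * np.log2(p_x))

# Equivocations as double sums over the joint distribution  (slides 15-16)
H_X_given_Y = -np.sum(p_xy * np.log2(p_x_given_y))          # H(X|Y)
H_Y_given_X = -np.sum(p_xy * np.log2(P))                    # H(Y|X)
print(H_X, H_X_given_Y, H_Y_given_X)
```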

17. Mutual information: the average information the receiver receives per symbol,
I(X, Y) = H(X) - H(X | Y).

18. Expanding the definition:
I(X, Y) = H(X) - H(X | Y) = -Σ_j P(x_j) log_2 P(x_j) + Σ_j Σ_k P(x_j, y_k) log_2 P(x_j | y_k).
Using Σ_k P(y_k | x_j) = 1, we can write P(x_j) = Σ_k P(x_j, y_k), so
I(X, Y) = Σ_j Σ_k P(x_j, y_k) log_2 [ P(x_j | y_k) / P(x_j) ] = Σ_j Σ_k P(x_j, y_k) log_2 [ P(x_j, y_k) / (P(x_j) P(y_k)) ].

19.
- I(X, Y) = H(X) - H(X | Y)
- I(Y, X) = H(Y) - H(Y | X)
- I(X, Y) = I(Y, X)
- I(X, Y) = I(Y, X) = H(X) + H(Y) - H(X, Y), where the joint entropy is H(X, Y) = -Σ_j Σ_k P(x_j, y_k) log_2 P(x_j, y_k).
(Venn-style diagram relating I(X, Y), H(X), H(Y), H(X | Y), H(Y | X) and H(X, Y).)
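As a numerical sanity check, not something from the original slides, the identities above can be verified for the same illustrative channel; the conditional entropies are obtained from the joint entropy via H(X|Y) = H(X,Y) - H(Y):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability terms."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

Pe, p_x = 0.1, np.array([0.7, 0.3])        # illustrative values
P = np.array([[1 - Pe, Pe],
              [Pe,     1 - Pe]])
p_xy = p_x[:, None] * P
p_y = p_xy.sum(axis=0)

H_X, H_Y, H_XY = entropy(p_x), entropy(p_y), entropy(p_xy.ravel())
I_a = H_X - (H_XY - H_Y)                   # H(X) - H(X|Y)
I_b = H_Y - (H_XY - H_X)                   # H(Y) - H(Y|X)
I_c = H_X + H_Y - H_XY                     # H(X) + H(Y) - H(X,Y)
assert np.isclose(I_a, I_b) and np.isclose(I_a, I_c)
print(I_a)
```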

20.
- The channel capacity of a discrete memoryless channel is defined as the maximum rate at which information can be transmitted through the channel.
- It is the maximum of the mutual information over all possible input probability distributions P(x_j):
C = max over {P(x_j)} of I(X, Y).
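A sketch of the definition itself: for a binary-input channel the maximization over P(x_j) can be approximated by a simple sweep of P(x_0) (the grid and the value of P_e are arbitrary choices for illustration). For the BSC the sweep peaks at P(x_0) = 0.5, matching the closed form on slide 23 below.

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(p_x, P):
    """I(X,Y) = H(X) + H(Y) - H(X,Y) for input distribution p_x and channel P[j,k] = P(y_k|x_j)."""
    p_xy = p_x[:, None] * P
    p_y = p_xy.sum(axis=0)
    return entropy(p_x) + entropy(p_y) - entropy(p_xy.ravel())

Pe = 0.1                                   # illustrative crossover probability
P = np.array([[1 - Pe, Pe],
              [Pe,     1 - Pe]])

# Sweep the input probability P(x_0) and keep the largest I(X,Y).
grid = np.linspace(0.001, 0.999, 999)
C = max(mutual_information(np.array([p0, 1 - p0]), P) for p0 in grid)
print(C)                                   # ~0.531 bits/use, attained at P(x_0) = 0.5
```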

21. Binary symmetric channel, with P(x_0) = P_0, P(x_1) = P_0'' = 1 - P_0 and P_e'' = 1 - P_e:
I(X, Y) = Σ_j Σ_k P(x_j, y_k) log_2 [ P(y_k | x_j) / P(y_k) ], with the four terms
- k = 0, j = 0: P_0 (1 - P_e) log_2 [ (1 - P_e) / P(y_0) ]
- k = 0, j = 1: (1 - P_0) P_e log_2 [ P_e / P(y_0) ]
- k = 1, j = 0: P_0 P_e log_2 [ P_e / P(y_1) ]
- k = 1, j = 1: (1 - P_0) (1 - P_e) log_2 [ (1 - P_e) / P(y_1) ]
where P(y_0) = P_0 (1 - P_e) + (1 - P_0) P_e and P(y_1) = P_0 P_e + (1 - P_0)(1 - P_e).

22. Collecting the four terms, the parts involving P(y_0) and P(y_1) form H(Y), while the remaining parts depend only on P_e:
I(X, Y) = -P(y_0) log_2 P(y_0) - P(y_1) log_2 P(y_1) + P_e log_2 P_e + (1 - P_e) log_2 (1 - P_e)
= H(Y) + P_e log_2 P_e + (1 - P_e) log_2 (1 - P_e).

23. I(X, Y) is maximum when all the transmitted symbols are equiprobable, i.e. P(x_0) = P(x_1) = 0.5 (so P_0'' = 1 - P_0 = 0.5), which makes P(y_0) = P(y_1) = 0.5 and H(Y) = 1. The capacity is therefore
C = max I(X, Y) = 1 + P_e log_2 P_e + (1 - P_e) log_2 (1 - P_e) bits per channel use.
For P_e = 0 or P_e = 1, C attains its maximum value of 1 bit per channel use.
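The closed form above, evaluated at the same illustrative P_e, agrees with the grid search sketched after slide 20:

```python
import numpy as np

Pe = 0.1                                   # illustrative crossover probability
C_bsc = 1 + Pe * np.log2(Pe) + (1 - Pe) * np.log2(1 - Pe)
print(C_bsc)                               # ~0.531 bits per channel use
```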

24. Binary erasure channel, with P(x_0) = P_0, P(x_1) = P_0'' = 1 - P_0 and q'' = 1 - q:
I(X, Y) = Σ_j Σ_k P(x_j, y_k) log_2 [ P(y_k | x_j) / P(y_k) ], with the six terms
- k = 0, j = 0: P_0 (1 - q) log_2 [ (1 - q) / P(y_0) ]
- k = 0, j = 1: 0 (since P(y_0 | x_1) = 0)
- k = 1, j = 0: 0 (since P(y_1 | x_0) = 0)
- k = 1, j = 1: (1 - P_0) (1 - q) log_2 [ (1 - q) / P(y_1) ]
- k = 2, j = 0: P_0 q log_2 [ q / P(y_2) ]
- k = 2, j = 1: (1 - P_0) q log_2 [ q / P(y_2) ]
where P(y_0) = P_0 (1 - q), P(y_1) = (1 - P_0)(1 - q), P(y_2) = q.

25. Since P(y_2) = q, the last two terms vanish and the expression reduces to
I(X, Y) = (1 - q) [ -P_0 log_2 P_0 - (1 - P_0) log_2 (1 - P_0) ] = (1 - q) H(X).
I(X, Y) is maximum when all the transmitted symbols are equiprobable, i.e. P(x_0) = P(x_1) = 0.5 (P_0'' = 1 - P_0 = 0.5), giving
C = max I(X, Y) = 1 - q bits per channel use.
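A quick check of the erasure-channel result with an assumed q: evaluating I(X, Y) at equiprobable inputs reproduces C = 1 - q.

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

q = 0.2                                    # illustrative erasure probability
P_bec = np.array([[1 - q, 0.0,   q],
                  [0.0,   1 - q, q]])
p_x = np.array([0.5, 0.5])                 # equiprobable inputs achieve capacity

p_xy = p_x[:, None] * P_bec
p_y = p_xy.sum(axis=0)
I = entropy(p_x) + entropy(p_y) - entropy(p_xy.ravel())
print(I, 1 - q)                            # both are 0.8 bits per channel use
```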

26. Given J input symbols and K output symbols, the channel capacity of a symmetric discrete memoryless channel (achieved with equiprobable inputs) is given by
C = log_2 K + Σ_{k=0}^{K-1} P(y_k | x_j) log_2 P(y_k | x_j) bits per channel use,
where the sum may be taken over any one fixed input x_j, since every row of the transition matrix contains the same set of probabilities.
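A sketch of this formula as a function (it assumes a genuinely symmetric transition matrix, where every row contains the same set of probabilities; here it is applied to the BSC):

```python
import numpy as np

def symmetric_dmc_capacity(P):
    """C = log2(K) + sum_k P(y_k|x_j) log2 P(y_k|x_j), using any one row of a
    symmetric transition matrix P[j, k] = P(y_k | x_j) with K output symbols."""
    K = P.shape[1]
    row = P[0]
    row = row[row > 0]
    return np.log2(K) + np.sum(row * np.log2(row))

Pe = 0.1
P_bsc = np.array([[1 - Pe, Pe],
                  [Pe,     1 - Pe]])
print(symmetric_dmc_capacity(P_bsc))       # ~0.531, same as 1 - H_b(Pe) above
```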

