
1
Information theory: Multi-user information theory. A.J. Han Vinck, Essen, 2004

2
Content:
- Some examples of channels
- Additive coding for broadcasting
- Superposition coding for multi-access
- Coding for the two-way channel
- Coding for the switching channel
- Some more

3
Goal of the lectures:
- Introduction of some classical models: two-way, two-access, broadcast
- Connected problems: formulation and calculation of capacity
- Development of coding strategies

4
Time sharing (TDMA): users 1, 2 and 3 take turns on a common channel, alternating message and idle slots. Time sharing is easy to organize, but it is inefficient when not many users are active, and its efficiency depends on the channel.

5
Two-way channel: terminals X1 and X2 communicate with each other by observing the outputs Y1 and Y2. R1 = I(X1;Y2|X2), R2 = I(X2;Y1|X1). Maximize (R1, R2) over the input distribution P(X1,X2), where I(X1;Y2|X2) := H(X1|X2) - H(X1|X2,Y2).

6
Note: I(X1;Y2|X2) := H(X1|X2) - H(X1|X2,Y2).
H(X1|X2) = minimum average number of bits needed to specify X1 given X2.
H(X1|X2,Y2) = minimum average number of bits needed to specify X1 given X2 and the observation Y2.
The difference is what we learn from the transmission over the channel: the reduction in the average specification length of X1 given X2.

7
Example: the AND channel, y = x1 AND x2; both users observe y.

X1  X2  Y
0   0   0
0   1   0
1   0   0
1   1   1

When X1 = 0, user 1 does not learn X2; when X1 = 1, Y = X2, so user 1 knows X2. The same holds for user 2.
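The table can be reproduced with a minimal sketch (Python; the helper name `and_channel` is ours):

```python
# Two-way AND channel: both terminals observe the same output y = x1 AND x2.
def and_channel(x1, x2):
    return x1 & x2

# Enumerate the table: when x1 = 1 user 1 sees y = x2 and learns x2 exactly;
# when x1 = 0 the output is 0 regardless of x2, so user 1 learns nothing.
table = [(x1, x2, and_channel(x1, x2)) for x1 in (0, 1) for x2 in (0, 1)]
for row in table:
    print(row)
```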

8
A coding example: both users transmit their information bit; if y = 0, each transmits the inverse of its bit in a second slot, after which both inputs are known. With independent uniform bits, y = 0 occurs with probability 3/4, so each user needs on average 2*(3/4) + 1*(1/4) = 7/4 channel uses per bit. Rate per user: 4/7 = 0.57, so the sum rate is 8/7 > 1!
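The rate computation can be checked exactly; a sketch (Python), assuming both users draw their bits uniformly and independently:

```python
from fractions import Fraction

# "Retransmit the inverse when y = 0" on the AND channel: the pair (1,1)
# is resolved in one use (y = 1); the other three pairs force a second use.
uses = Fraction(0)
for x1 in (0, 1):
    for x2 in (0, 1):
        y = x1 & x2
        uses += Fraction(1, 4) * (1 if y == 1 else 2)

rate_per_user = 1 / uses
print(uses, rate_per_user)   # 7/4 and 4/7
```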

9
Another coding example: each user now sends one of three messages, resolving the remaining ambiguity over at most three channel uses. Rate per user: log2(3) / (2*(3/9) + 3*(6/9)) = 0.59, so the sum rate is about 1.19 > 1!

10
Dependent inputs X1 and X2: take P(X1=0, X2=0) = 0, P(X1=0, X2=1) = P(X1=1, X2=0) = p, and P(X1=1, X2=1) = 1-2p. Then P(X1=1) = P(X1=1, X2=0) + P(X1=1, X2=1) = 1-p, and R1 = R2 = I(X2;Y|X1) = I(X1;Y|X2) = H(Y|X1) = (1-p) h(p/(1-p)). The maximum over p is 0.694.
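The maximum can be found numerically; a sketch (Python) that grid-searches (1 - p) h(p/(1 - p)) over p:

```python
import math

def h(q):
    """Binary entropy in bits."""
    if q <= 0.0 or q >= 1.0:
        return 0.0
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

# R1 = R2 = (1 - p) h(p / (1 - p)) for the dependent-input distribution;
# p stays below 1/2 so that P(X1=1, X2=1) = 1 - 2p is non-negative.
best_rate, best_p = max(((1 - p) * h(p / (1 - p)), p)
                        for p in (i / 100000 for i in range(1, 50000)))
print(round(best_rate, 3), round(best_p, 3))
```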

11
Note: P(Y=0|X1=1) = P(Y=0, X1=1)/P(X1=1) = p/(1-p), and P(Y=1|X1=1) = P(Y=1, X1=1)/P(X1=1) = (1-2p)/(1-p).

12
A lower bound: let X1 and X2 transmit independently, with P(X1=1) = 1 - P(X1=0) = a and P(X2=1) = 1 - P(X2=0) = a. Then R1 = I(X1;Y|X2) = H(Y|X2) - H(Y|X1,X2) = a h(a) = R2. The maximum is 0.616 > 4/7.
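A grid search over a confirms the quoted value (the search gives about 0.617, i.e. 0.616 up to rounding); a sketch in Python:

```python
import math

def h(a):
    """Binary entropy in bits."""
    if a <= 0.0 or a >= 1.0:
        return 0.0
    return -a * math.log2(a) - (1 - a) * math.log2(1 - a)

# With independent inputs, R1 = R2 = a * h(a); search for the best a.
best_rate, best_a = max((a * h(a), a)
                        for a in (i / 100000 for i in range(1, 100000)))
print(round(best_rate, 3), round(best_a, 3))
```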

13
The upper (outer) bound: the inner bound (for independent transmission) lies strictly inside the Shannon outer bound (X1 and X2 dependent). For R1 = R2: inner rate 0.616, outer rate 0.694. The exact capacity of the two-way AND channel is unknown!

14
Bounds: plot of the (R1, R2) rate region, with the inner bound (independent inputs) inside the outer bound.

15
Broadcast: Z transmits information to X and the same information to Y. R1 <= I(X;Z), R2 <= I(Y;Z), and R1 + R2 <= I(Z;(X,Y)) = I(Z;X).

16
Broadcast: Z transmits information to X and different information to Y. R1 <= I(X;Z), R2 <= I(Y;Z), and R1 + R2 <= I(Z;(X,Y)) = I(Z;X) + I(Z;Y|X).

17
Example: the Blackwell broadcast channel, a deterministic mapping from Z to the pair (X, Y):

Z  X  Y
0  0  0
1  0  1
2  1  1

R1 <= I(X;Z) = H(X) - H(X|Z), R2 <= I(Y;Z) = H(Y) - H(Y|Z), and R1 + R2 <= I(Z;(X,Y)) <= log2 3.
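With the deterministic mapping from the table, a short computation gives the single-letter mutual informations (Python; assuming a uniform input distribution on Z, which need not be optimal):

```python
import math
from collections import Counter

# Blackwell broadcast channel: Z in {0,1,2} maps deterministically to (X, Y).
table = {0: (0, 0), 1: (0, 1), 2: (1, 1)}

def entropy(dist):
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

pz = {z: 1 / 3 for z in table}        # assumed uniform input
px, py = Counter(), Counter()
for z, (x, y) in table.items():
    px[x] += pz[z]
    py[y] += pz[z]

# The channel is noiseless, so I(X;Z) = H(X) and I(Y;Z) = H(Y).
print(entropy(px), entropy(py))       # both equal h(1/3), about 0.918
```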

18
Example over two channel uses: by pairing Blackwell inputs (Z pairs 01, 12, 02, 10, 21, 20) one can arrange I(Y;Z) = 1 and I(X;Z) = log2 3, giving R_sum = (1 + log2 3)/2 = 1.29 bit per transmission.

19
Two-access channel: X1 and X2 want to communicate with Y at the same time! Obvious bound on the sum rate: R1 + R2 <= H(Y) - H(Y|X1,X2) <= H(Y).

20
Two-access models: the switching channel and the two-adder.

Two-adder: y = x1 + x2
x1 x2 | y
0  0  | 0
0  1  | 1
1  0  | 1
1  1  | 2

Switching: x2 operates a switch; y = x1 when x2 = 1, and y = blank when x2 = 0.

21
Two-adder (1), capacity region: R1 <= 1 (from X1 to Y), R2 <= 1 (from X2 to Y), and R1 + R2 <= H(Y) <= 1.5. The region is the pentagon with corner points (R1, R2) = (1, 0.5) and (0.5, 1), which strictly contains the time-sharing line.
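The H(Y) <= 1.5 bound is attained by uniform independent inputs; a sketch (Python):

```python
import math
from collections import Counter

# Output distribution of the two-adder y = x1 + x2 for uniform inputs.
py = Counter()
for x1 in (0, 1):
    for x2 in (0, 1):
        py[x1 + x2] += 0.25

H_y = -sum(p * math.log2(p) for p in py.values())
print(dict(py), H_y)   # {0: 0.25, 1: 0.5, 2: 0.25} and H(Y) = 1.5
```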

22
Two-adder (2), a coding strategy (epsilon-error): user 1 transmits at rate R1 = 1 bit, i.e. P(X1=0) = 1/2. User 2 then sees an erasure channel: y = 0 and y = 2 reveal X2, while y = 1 (probability 1/2) is an erasure. Max I(X2;Y) = max H(X2) - H(X2|Y) = 1/2. Hence the rate pair (R1, R2) = (1, 1/2) is achievable.
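The erasure-channel view can be verified numerically; a sketch (Python) maximizing I(X2;Y) over P(X2=1) = b while user 1 stays uniform:

```python
import math

def H(ps):
    return -sum(p * math.log2(p) for p in ps if p > 0)

# With uniform x1: P(y=0) = (1-b)/2, P(y=1) = 1/2, P(y=2) = b/2, and
# H(Y|X2) = 1, so I(X2;Y) = H(Y) - 1.  Grid-search over b.
best_rate, best_b = max((H([(1 - b) / 2, 0.5, b / 2]) - 1, b)
                        for b in (i / 1000 for i in range(1, 1000)))
print(best_rate, best_b)   # maximum 0.5 at b = 0.5
```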

23
Two-adder (3), a simple asymmetric zero-error strategy: X1 uses the seven codewords 000, 001, 010, 011, 100, 101, 110 and X2 uses {000, 111}. All 7 * 2 = 14 output triples (000 through 110, and 111 through 221) are distinct. Efficiency = (1/3) log2 14 = 1.27.
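Assuming X1 uses the seven binary triples 000..110 and X2 uses {000, 111} (our reading of the flattened table), the zero-error property is easy to check (Python):

```python
import math
from itertools import product

C1 = [tuple(int(b) for b in format(i, "03b")) for i in range(7)]  # 000..110
C2 = [(0, 0, 0), (1, 1, 1)]

# Component-wise integer sums; all 7 * 2 = 14 combinations must differ.
outputs = {tuple(a + b for a, b in zip(w1, w2)) for w1, w2 in product(C1, C2)}
print(len(outputs), round(math.log2(len(outputs)) / 3, 2))   # 14 and 1.27
```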

24
Two-adder with feedback (1). Question: can feedback enlarge the capacity region? Yes (Gaarder-Wolf)! Is this a surprise? The capacity region characterization is due to Cover-Leung, later modified by Willems.

25
Two-adder with feedback (2): first send N independent transmissions, which leave some residual uncertainty at the receiver; then use the feedback to resolve that uncertainty in steps, with jointly selected inputs. Total efficiency: about 1.52!

26
Switching channel (1): X1 in {0,1}, X2 in {0,1}; the output is tri-state, y in {blank, 0, 1}. When x2 = 1 the switch passes x1 (y = x1); when x2 = 0 the output is blank. With P(blank, pass) = (1-a, a): R_sum(max) = max over a of [a + h(a)] = log2 3.
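The stated maximum checks out numerically: a + h(a) peaks at a = 2/3, where it equals log2 3. A sketch (Python):

```python
import math

def h(a):
    return -a * math.log2(a) - (1 - a) * math.log2(1 - a)

# h(a) bits from the switch position plus one bit from x1 in the
# fraction a of slots that the switch passes.
best_rate, best_a = max((a + h(a), a)
                        for a in (i / 10000 for i in range(1, 10000)))
print(best_rate, best_a)
```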

27
A simple coding example (2). For n = 2: the switch user X2 uses the code {11, 10, 01} (log2 3 bits), while X1 repeats its bit with the code {00, 11} (1 bit, received through whichever slot the switch passes). Sum rate: R_sum = (log2 3 + 1)/2 = 1.3.

28
Switching channel with general coding (3). Strategy: the switch user (code 1) uses words with fewer than d_min zeros, so at most d_min - 1 positions are blanked. User 1 (code 2) uses a linear code of length n, minimum distance d_min, and dimension k <= n - d_min + 1; such a code corrects the d_min - 1 erasures in the received word. Performance: the sum rate equals the capacity C!!

29
Extensions (2): the T-user adder channel. Each of the T users inputs a symbol from {0, 1}; the output is the integer sum, in {0, 1, ..., T}.

30
T-user adder (1), T = 3, block length 2. User 1: {00, 11}; user 2: {10, 01}; user 3: {00, 10}. The eight output sums {10, 01, 11, 20, 21, 31, 12, 22} are all distinct, so decoding is unique. Efficiency: 3 * (1/2) = 1.5.
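Unique decodability of this three-user code can be checked by enumerating all sums; a sketch (Python):

```python
from itertools import product

C1 = [(0, 0), (1, 1)]   # user 1: 1 bit over 2 uses
C2 = [(1, 0), (0, 1)]   # user 2: 1 bit over 2 uses
C3 = [(0, 0), (1, 0)]   # user 3: 1 bit over 2 uses

sums = [tuple(a + b + c for a, b, c in zip(w1, w2, w3))
        for w1, w2, w3 in product(C1, C2, C3)]
# 8 combinations, 8 distinct outputs: the receiver recovers all three bits.
print(len(sums), len(set(sums)))   # 8 8
```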

31
T-user adder (2), T = 3, block length 3. User 1: {000, 111}; user 2: {110, 001}; user 3: {000, 001, 010, 100, 101, 011}. All 24 output sums (110, 111, 120, 210, 211, 121, etc.) are distinct. Efficiency = (2 + log2 6)/3 = 1.53 bits/tr. Record: 1.551, by van Tilborg (1991).
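The same distinctness check for the length-3 construction, together with the efficiency computation (Python):

```python
import math
from itertools import product

C1 = [(0, 0, 0), (1, 1, 1)]
C2 = [(1, 1, 0), (0, 0, 1)]
C3 = [(0, 0, 0), (0, 0, 1), (0, 1, 0), (1, 0, 0), (1, 0, 1), (0, 1, 1)]

# Component-wise sums over all 2 * 2 * 6 = 24 codeword combinations.
sums = {tuple(map(sum, zip(*ws))) for ws in product(C1, C2, C3)}
efficiency = (1 + 1 + math.log2(6)) / 3
print(len(sums), round(efficiency, 2))   # 24 distinct sums, 1.53 bits/tr.
```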
