1 Institute for Experimental Mathematics, Ellernstrasse 29, 45326 Essen, Germany. DATA COMMUNICATION: Introduction. A.J. Han Vinck, May 10, 2003

2 Content. We describe: a simple transmission model, detection, error probability, and a comparison of coded with uncoded transmission.

3 Figure 1: source, transmitter, channel, receiver. The source produces a message i = 1, ..., M; the transmitter maps it to a signal s(i); the receiver observes the channel output r and produces an estimate i'. The optimum receiver maximizes the probability of being correct.

4 Optimum Receiver. Suppose that with every possible received signal r we associate a message estimate. P(correct | r) = P(i transmitted | r and estimate is i). Optimum detection rule: for a given r, set the estimate to i if P(i transmitted | r) ≥ P(k transmitted | r) for all k ≠ i. (Figure: the space of received signals is partitioned into regions; the region labelled k is the set of received signals with estimate k.)

5 Maximum likelihood receiver. By Bayes' rule, the optimum receiver gives as estimate the i that maximizes P(i) p(r | i) (note that p(r) is independent of the index i). The ML receiver gives as estimate the i that maximizes p(r | i) alone.
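
As a concrete illustration of the two rules, here is a small Python sketch (my own example, not from the slides): the messages are scalar signal levels in Gaussian noise, so p(r | i) is a Gaussian density centred at level s[i]; the MAP rule weights it by the prior P(i), the ML rule does not.

    import numpy as np

    def map_estimate(r, levels, priors, sigma2):
        # p(r | i): Gaussian likelihood of the received value r for each signal level
        likelihood = np.exp(-(r - levels) ** 2 / (2 * sigma2))
        return int(np.argmax(priors * likelihood))    # maximize P(i) p(r | i)

    def ml_estimate(r, levels, sigma2):
        likelihood = np.exp(-(r - levels) ** 2 / (2 * sigma2))
        return int(np.argmax(likelihood))             # maximize p(r | i) only

    levels = np.array([-1.0, +1.0])                   # two messages, antipodal levels
    print(map_estimate(0.2, levels, np.array([0.5, 0.5]), 1.0),
          ml_estimate(0.2, levels, 1.0))              # equal priors: both print 1

For equal priors the two rules coincide, which is why the slides use the ML receiver from here on.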

6 Binary signaling. A 1 or a 0 is sent as a pulse s(t) of duration T and energy E, with opposite polarity for the two symbols; the channel adds noise n(t), so the received waveform is r(t) = s(t) + n(t), which is sampled every T seconds to give y_i. Hard decision: y_i > 0, output symbol 1; y_i < 0, output symbol 0. Error probability p := probability that a 1 (0) is transmitted and a 0 (1) is decided.
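
A minimal simulation sketch of this hard-decision receiver (my own illustration; the amplitude ±√E and noise level σ are free parameters):

    import numpy as np

    rng = np.random.default_rng(1)
    n_bits, E, sigma = 100_000, 1.0, 0.5
    bits = rng.integers(0, 2, n_bits)
    y = np.sqrt(E) * (2 * bits - 1) + sigma * rng.standard_normal(n_bits)
    decisions = (y > 0).astype(int)                   # y > 0 -> 1, y < 0 -> 0
    print("estimated error probability p =", np.mean(decisions != bits))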

7 Binary Symmetric Channel model (BSC), the famous satellite transmission model: input 0 is received as 0 with probability 1-p and as 1 with probability p; input 1 is received as 1 with probability 1-p and as 0 with probability p.

8 We transmit a series of symbols or codewords: message → C → channel → R → estimate, where each symbol passes through the BSC (kept with probability 1-p, flipped with probability p). Let R have d differences with C; then P(R | C) = p^d (1-p)^(n-d). Hence, for p < 1/2, the decoder should find the codeword C with minimum d, i.e. with minimum Hamming distance to R.
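
A short sketch of this minimum-Hamming-distance rule (illustrative code of my own, using a trivial repetition code as the codebook): pass a codeword through a BSC and decode by choosing the nearest codeword.

    import numpy as np

    rng = np.random.default_rng(2)

    def bsc(c, p):
        return c ^ (rng.random(c.size) < p)            # flip each symbol with probability p

    def min_distance_decode(r, codebook):
        d = (codebook != r).sum(axis=1)                # Hamming distance to every codeword
        return codebook[np.argmin(d)]

    codebook = np.array([[0, 0, 0, 0, 0], [1, 1, 1, 1, 1]], dtype=bool)
    r = bsc(codebook[1], p=0.2)
    print(r.astype(int), "->", min_distance_decode(r, codebook).astype(int))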

9 The noise n(t) comes from an Additive White Gaussian Noise source. A noise process is called Gaussian if ∫ n(t) g(t) dt is Gaussian distributed for every weighting function g(t). White: noise samples are statistically independent of each other, i.e. p_n(a, b) = p_n(a) p_n(b).

10 For our example (formulas on the slide): the average noise level, the variance of the noise, the probability density of the noise, and the expected detector output.
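
The formulas themselves are images and did not survive in this transcript; the following is only a plausible reconstruction, assuming a zero-mean Gaussian noise sample n with variance σ² and antipodal detector levels ±√E (my assumptions):

    E[n] = 0, \qquad \operatorname{Var}(n) = E[n^2] = \sigma^2,
    \qquad p(n) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-n^2/2\sigma^2},
    \qquad E[y \mid 1 \text{ sent}] = +\sqrt{E}, \qquad E[y \mid 0 \text{ sent}] = -\sqrt{E}.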

11 Intermezzo.

12 Intermezzo, cont'd.

13 One-dimensional model of transmission: the received value lies on a line; on one side of the threshold the receiver decides 1, on the other side it decides 0.
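
The error probability for this one-dimensional model is not spelled out in the transcript; the standard result, assuming antipodal levels ±√E, noise variance σ², and a threshold at 0, is

    p = Q\!\left(\frac{\sqrt{E}}{\sigma}\right), \qquad
    Q(x) = \int_x^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-t^2/2}\, dt,

which is consistent with the E_b/σ² axis of the "waterfall curves" later on.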

14 "Obvious" receiver. The received waveform s(t) + n(t), where n(t) is Gaussian noise with power spectral density 2σ², is passed through a filter that passes the basic signal components: sine waves with frequencies up to some multiple (at least 1) of the signal bandwidth B, with 2T = 1/B. The filtered noise n'(t) is again Gaussian, with E[(n'(t))²] proportional to the filter bandwidth. Error probability: if the multiple is 1, the performance is the same as the optimum receiver (although the signal is not reproducible); if it is larger than 1, there is a loss. The filter output is sampled at the symbol moments.

16 Filtering causes interference: limiting the highest frequency smears each pulse into its neighbours. An eye diagram is an overlapping registration of the signal in time.

17 Amplitude shift keying (ASK) with four amplitude levels, labelled 00, 01, 11, 10. Homework: calculate the error probabilities for average energy E. Why did we choose this bit assignment?

18 Coded vs. uncoded transmission with bandwidth expansion. Uncoded: k information bits in time T = kτ with total energy k E_b. Coded: n code digits in the same time T = n τ_s with total energy n E_s = k E_b, so kτ = n τ_s and τ_s < τ. The shorter symbol duration (wider bandwidth) increases the noise power by a factor n/k.

19 Coded vs. uncoded transmission without bandwidth expansion. Uncoded: k information bits in time T = kτ with total energy k E_b. Coded: n code digits in time nτ (so kτ < nτ, the transmission takes longer) with total energy n E_s = k E_b. The symbol energy is reduced to E_s = (k/n) E_b.
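
The two slides above say the same thing in different guises (my own summary; the symbols τ, E_b, E_s stand in for characters lost in this transcript): whether the n coded symbols are squeezed into the original time slot (more noise per sample) or sent at the original rate with the same total energy (less energy per symbol), the signal-to-noise ratio per transmitted symbol drops by the code rate R = k/n,

    \frac{E_s}{\sigma_s^{2}} \;=\; \frac{k}{n}\,\frac{E_b}{\sigma^{2}} \;=\; R\,\frac{E_b}{\sigma^{2}},

so the code has to buy back at least this factor 1/R, which is where the coding-gain condition on slide 21 comes from.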

20 Uncoded error probability (bound shown on the slide). Homework: derive this upper bound.
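
The bound itself is an image and is missing from this transcript; a standard bound of this type, which may or may not be the one intended, combines the union bound over the k bits with the Gaussian tail bound Q(x) ≤ ½ e^(-x²/2):

    P_e^{\text{uncoded}} \;=\; 1-(1-p)^k \;\le\; k\,p, \qquad
    p \;=\; Q\!\left(\frac{\sqrt{E_b}}{\sigma}\right) \;\le\; \tfrac12\, e^{-E_b/2\sigma^{2}}.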

21 For a code with minimum distance d_min, the coded error probability (bound on the slide) shows a CODING GAIN if the product R·d_min/2 > 1. Hence: look for codes with large d_min at a high ratio k/n = R.
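
A quick numerical check of this criterion (the codes below are standard textbook examples of my choosing, not taken from the slides):

    import math

    # hard-decision criterion from the slide: coding gain if R * d_min / 2 > 1
    for name, n, k, dmin in [("Hamming (7,4)", 7, 4, 3), ("extended Golay (24,12)", 24, 12, 8)]:
        g = (k / n) * dmin / 2
        note = f"~{10 * math.log10(g):.1f} dB gain" if g > 1 else "no gain"
        print(f"{name}: R*dmin/2 = {g:.2f} -> {note}")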

22 "Waterfall curves": the error probability Pe, on a logarithmic scale from 10^0 down to 10^-4, plotted against 10 log10(E_b/σ²) in decibel (dB) over the range 0 to 12 dB.

23 Can we do better with "soft decisions"? The code symbols c_i take one of two signal values and are received as r_i = c_i + n_i, without hard quantization. ML decoder: choose the code vector C_I of length n that maximizes p(R | C_I).

24 Soft decoding, cont'd. For Gaussian noise, maximizing p(R | C) is the same as choosing the codeword that minimizes the "squared Euclidean distance" to R.
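
A minimal sketch of this squared-Euclidean-distance decoder (illustrative code; mapping the bits {0,1} to antipodal levels -A and +A is my assumption, consistent with the binary signalling model used earlier):

    import numpy as np

    def soft_decode(r, codebook, A=1.0):
        signals = A * (2.0 * codebook - 1.0)           # bits {0,1} -> levels {-A, +A}
        d2 = ((signals - r) ** 2).sum(axis=1)          # squared Euclidean distance to R
        return codebook[np.argmin(d2)]

    codebook = np.array([[0, 0, 0], [1, 1, 1]])        # length-3 repetition code
    r = np.array([0.9, -0.2, 0.4])                     # noisy reception of +A, +A, +A
    print(soft_decode(r, codebook))                    # -> [1 1 1]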

25 Performance. Assume we transmit the all-zero codeword, i.e. c_i corresponds to the symbol 0 for i = 1, 2, ..., n. Let D be the set of d positions in which a competing codeword differs.

26 Performance, cont'd. Note: the relevant decision variable is the sum of d independent Gaussian random variables.

27 For a code with minimum distance d_min: a factor of 2 larger CODING GAIN than with hard decisions. We transmit k bits of information with total energy k E_b in n transmissions with total energy n E; thus E = k E_b / n, and substituting this into the error probability gives a gain governed by R·d_min instead of R·d_min/2.
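
The formula on the slide is an image; the standard pairwise error probability behind this "factor of 2" statement, for two codewords that differ in d positions with symbol energy E and noise variance σ² (my reconstruction, not copied from the slide), is

    P_2(d) \;=\; Q\!\left(\frac{\sqrt{d\,E}}{\sigma}\right)
            \;=\; Q\!\left(\frac{\sqrt{d\,R\,E_b}}{\sigma}\right)
            \;\le\; \tfrac12\,\exp\!\left(-\frac{d\,R\,E_b}{2\sigma^{2}}\right),

whose exponent contains R·d where the corresponding hard-decision bound only has R·d/2.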

28 Example. Single parity check code: n-1 information bits and 1 parity bit, minimum distance d_min = 2. Hard decision: no gain, since R·d_min/2 = (n-1)/n < 1. Soft decision: gain of 10 log10(2) ≈ 3 dB. Check!
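
The "Check!" can be done numerically (illustrative values of n of my choosing; the gain approaches the 3 dB of the slide as n grows, since R → 1):

    import math

    for n in (4, 8, 32):
        R, dmin = (n - 1) / n, 2
        hard, soft = R * dmin / 2, R * dmin
        print(f"n={n}: R*dmin/2 = {hard:.3f} (< 1, no hard-decision gain), "
              f"soft-decision gain = {10 * math.log10(soft):.2f} dB")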

29 Sum of 2 Gaussian Random Variables. Motivation: any process that combines many random variables produces random variables that are approximately Gaussian (Central Limit Theorem). Let z = x + y, where x and y are independent Gaussian random variables with E(x) = E(y) = 0, E(x²) = σ_x², and E(y²) = σ_y². Then z is a Gaussian random variable with E(z) = 0 and E(z²) = σ_x² + σ_y². Homework: calculate the probability density function of z.
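
A quick numerical sanity check of the stated mean and variance (my own sketch; the homework, finding the density of z, is left to the reader):

    import numpy as np

    rng = np.random.default_rng(3)
    sx, sy = 1.0, 2.0                                   # standard deviations of x and y
    x = sx * rng.standard_normal(1_000_000)
    y = sy * rng.standard_normal(1_000_000)
    z = x + y
    print("E(z) ~", z.mean(), "  E(z^2) ~", (z ** 2).mean(), "  sx^2 + sy^2 =", sx**2 + sy**2)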

30 Sum of 2 Gaussian Random Variables: sketch of proof (on the slide).

