Introduction to Data Communication: the discrete channel model A.J. Han Vinck University of Essen April 2005.

1 Introduction to Data Communication: the discrete channel model A.J. Han Vinck University of Essen April 2005

2 Content: communication model; transmission model; MAP-ML receiver; burst error model; interleaving (block and convolutional); several channel models

3 The communication model: source → data reduction/compression → data protection → channel → decoder → message reconstruction → sink. Compression maps the source output to k information bits; protection expands these k bits into an n-bit code word.

4 Point-to-point: transmitter → channel → receiver. The transmitter's signal generator maps the message to bits and to a physical (modem) signal; the receiver's signal processor maps the received signal back to bits and then to the message.

5 Transmission model (OSI): the physical link provides unreliable transmission of bits; the Data Link Control layer turns this into transmission of reliable packets.

6 Transmission channel model: input x_i, output y, with transition probabilities P(y | x_i). Memoryless: the output depends only on the current input; input and output alphabets are finite.

7 Binary symmetric channel model (BSC): y_i = x_i ⊕ e_i, where X_i is the binary information sequence for message i and E is the binary error sequence with P(1) = 1 - P(0) = p. Transitions: 0→0 and 1→1 with probability 1-p; 0→1 and 1→0 with probability p.
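The BSC of this slide can be sketched in a few lines: each transmitted bit is flipped independently with probability p. The function name and the seeded generator are illustrative choices, not from the slides.

```python
import random

def bsc(x_bits, p, rng=random.Random(0)):
    """Pass a bit sequence through a binary symmetric channel:
    each bit is flipped independently with crossover probability p,
    i.e. y_i = x_i XOR e_i with P(e_i = 1) = p."""
    return [x ^ (1 if rng.random() < p else 0) for x in x_bits]

y = bsc([0, 1, 0, 1, 1, 0, 0, 1], p=0.1)
```

With p = 0 the channel is noiseless; with p = 1 every bit is inverted, which makes the edge cases easy to check.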

8 Error probability (MAP): suppose the decision is message i for a received vector Y. Then the probability of a correct decision is P( X_i transmitted | Y received ). Hence, decide the i that maximizes P( X_i transmitted | Y received ) (Maximum A Posteriori probability, MAP).

9 Maximum Likelihood (ML) receiver: find the i that maximizes P( X_i | Y ) = P( X_i, Y ) / P( Y ) = P( Y | X_i ) P( X_i ) / P( Y ). For equally likely X_i this is equivalent to finding the maximum of P( Y | X_i ).

10 Example: for p = 0.1, X_1 = ( 0 0 ) with P( X_1 ) = 1/3 and X_2 = ( 1 1 ) with P( X_2 ) = 2/3, give your MAP and ML decision for Y = ( 0 1 ).
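The exercise above can be worked through numerically. This sketch computes the BSC likelihood P(Y | X_i) for each candidate codeword and compares the ML rule (likelihood only) with the MAP rule (likelihood times prior); the variable names are my own.

```python
import math

def likelihood(y, x, p):
    """P(y | x) over a BSC: factor p for each flipped bit, 1-p otherwise."""
    return math.prod(p if yi != xi else 1 - p for yi, xi in zip(y, x))

p = 0.1
y = (0, 1)
priors = {(0, 0): 1/3, (1, 1): 2/3}   # P(X_1), P(X_2)

ml_choice  = max(priors, key=lambda x: likelihood(y, x, p))
map_choice = max(priors, key=lambda x: likelihood(y, x, p) * priors[x])
```

Here both likelihoods equal 0.09 = 0.9 × 0.1, so the ML decision is a tie; the MAP rule breaks it in favor of ( 1 1 ), since 0.09 × 2/3 > 0.09 × 1/3.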

11 Something to think about: message → compression → protection of bits → channel → correction of incorrect bits → decompression → message. Compression (MPEG, JPEG, etc.) reduces the bit rate; error-correction protection increases it.

12 Bit protection is obtained by Error Control Codes (ECC): Forward Error Correction (FEC), or error detection with feedback (ARQ). Performance depends on the error statistics, so error models are very important.

13 Error control code with rate k/n: message → encoder → n-bit code word → channel → receiver → decoder → message estimate. There are 2^k code words of length n; the code book contains all 2^k of them.

14 Example. Transmit 0 0 0 or 1 1 1: how many errors can we correct? How many errors can we detect? Transmit A = 00000, B = 01011, C = 10101, D = 11110: how many errors can we correct? How many can we detect? What is the difference?
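Both questions reduce to the minimum Hamming distance d of the code book: a block code detects d-1 errors and corrects ⌊(d-1)/2⌋. A small sketch (helper names are mine) that evaluates both codes from the slide:

```python
from itertools import combinations

def hamming(a, b):
    """Number of positions in which two words differ."""
    return sum(x != y for x, y in zip(a, b))

def code_capability(codebook):
    """Return (min distance d, detectable errors d-1, correctable floor((d-1)/2))."""
    d = min(hamming(a, b) for a, b in combinations(codebook, 2))
    return d, d - 1, (d - 1) // 2

rep3 = code_capability(["000", "111"])
abcd = code_capability(["00000", "01011", "10101", "11110"])
```

Both codes turn out to have minimum distance 3, so each detects 2 errors and corrects 1; the difference is that the second code carries 2 information bits per 5 channel bits instead of 1 per 3.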

15 A simple error detection method: row parity. Fill a bit array row-wise, appending a parity bit to each row, and transmit column-wise. RESULT: any burst of length up to L (the number of rows) can be detected, since it hits each row in at most one position. What happens with bursts of length larger than L?
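The row-parity scheme can be sketched as follows: even parity per row, column-wise transmission, and a parity check after de-interleaving. The array dimensions and function names are illustrative.

```python
def encode(rows):
    """Append even parity to each row, then transmit column-wise.
    rows: list of equal-length bit lists."""
    coded = [r + [sum(r) % 2] for r in rows]
    n_rows, n_cols = len(coded), len(coded[0])
    return [coded[i][j] for j in range(n_cols) for i in range(n_rows)]

def detect(stream, n_rows):
    """Rebuild the rows from the column-wise stream and check each parity.
    A burst of length <= n_rows hits every row at most once, so it is caught."""
    n_cols = len(stream) // n_rows
    rows = [[stream[j * n_rows + i] for j in range(n_cols)] for i in range(n_rows)]
    return all(sum(r) % 2 == 0 for r in rows)

tx = encode([[0, 1, 1, 0], [1, 0, 0, 1], [1, 1, 0, 0]])
```

Flipping three consecutive transmitted bits (a burst of length 3, equal to the number of rows) corrupts one bit in each row, so every row parity fails and the burst is detected.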

16 Modeling the binary transmission channel: compare a test sequence with the received sequence to obtain an error sequence, e.g. 0 1 0 0 0 1 0 1 0 0 0 0 1 1 1. Problem: determining where a burst ends and a guard space begins (burst - guard - burst).

17 Modeling: how to model scratches on a CD? The answer is important for the design of the ECC.

18 CD → DVD → blue-laser discs: increasing storage density increases sensitivity to defects.

19 Modeling: networking (Ack/Nack). One error causes retransmission of the whole packet; long packets almost always contain an error; short packets with ECC give lower efficiency. Suppose that a packet arrives correctly with probability Q. What is then the throughput as a function of Q?
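The throughput question can be checked by simulation. For a stop-and-wait Ack/Nack scheme (my assumption for this sketch), each packet is repeated until it gets through, so the expected number of transmissions per packet is 1/Q and the throughput tends to Q useful packets per transmission.

```python
import random

def arq_throughput(Q, n_packets=100_000, rng=random.Random(1)):
    """Simulate stop-and-wait ARQ: retransmit until the packet gets through.
    Throughput = delivered packets / total transmissions, which tends to Q."""
    transmissions = 0
    for _ in range(n_packets):
        transmissions += 1
        while rng.random() >= Q:   # failure: retransmit
            transmissions += 1
    return n_packets / transmissions

# analytically: E[transmissions per packet] = 1/Q, so throughput -> Q
```

This ignores acknowledgment delay and header overhead; with those included the optimum packet length becomes the trade-off the slide hints at.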

20 Burst error model. Random error channel: outputs independent, P(0) = 1 - P(1). Burst error channel: outputs depend on a state (good or bad) with transition probabilities P_gg, P_gb, P_bg, P_bb; state info: P(0 | state = bad) = P(1 | state = bad) = 1/2, P(0 | state = good) = 1 - P(1 | state = good) = 0.999.

21 Question: with P(0 | state = bad) = P(1 | state = bad) = 1/2 and P(0 | state = good) = 1 - P(1 | state = good) = 0.99, what is the average P(0) for P_gg = 0.9, P_gb = 0.1, P_bg = 0.99, P_bb = 0.01? Indicate how you can extend the model.
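The average P(0) follows from the stationary distribution of the two-state chain: for a two-state Markov chain, pi_g · P_gb = pi_b · P_bg together with pi_g + pi_b = 1 gives pi_g = P_bg / (P_gb + P_bg). A sketch (function name is mine) with the slide's numbers:

```python
def average_p0(p_gb, p_bg, p0_good, p0_bad):
    """Average P(0) of a two-state (good/bad) channel: weight the per-state
    P(0) by the stationary state probabilities pi_g, pi_b, obtained from
    pi_g * P_gb = pi_b * P_bg and pi_g + pi_b = 1."""
    pi_g = p_bg / (p_gb + p_bg)
    pi_b = 1 - pi_g
    return pi_g * p0_good + pi_b * p0_bad

# slide values: P_gb = 0.1, P_bg = 0.99, P(0|good) = 0.99, P(0|bad) = 0.5
avg = average_p0(0.1, 0.99, 0.99, 0.5)
```

With these numbers pi_g ≈ 0.908, pi_b ≈ 0.092, and the average P(0) ≈ 0.945. A natural extension of the model is to add more states (e.g. several "bad" severities), giving a general hidden-Markov error model.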

22 Interleaving: block. Channel models are difficult to derive: what defines a burst (a burst starts and ends with a 1)? Are the errors random or bursty? For practical reasons, convert burst errors into random errors: read the data in row-wise and transmit it column-wise.
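The row-in/column-out idea can be sketched directly; the de-interleaver is simply the same operation with the dimensions swapped. Function names and dimensions are illustrative.

```python
def block_interleave(bits, n_rows, n_cols):
    """Write symbols row-wise into an n_rows x n_cols array, read column-wise.
    A channel burst of length n_rows then appears as isolated, spread-out
    errors after de-interleaving."""
    assert len(bits) == n_rows * n_cols
    return [bits[i * n_cols + j] for j in range(n_cols) for i in range(n_rows)]

def block_deinterleave(bits, n_rows, n_cols):
    # the inverse: write column-wise, read row-wise
    return block_interleave(bits, n_cols, n_rows)

data = list(range(12))
scrambled = block_interleave(data, 3, 4)
```

De-interleaving the scrambled stream with the same parameters recovers the original order exactly.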

23 Example (from Timo Korhonen, Helsinki). In fading channels the received data can experience burst errors that destroy a large number of consecutive bits, which is harmful for channel coding. Interleaving distributes burst errors along the data stream; a drawback of interleaving is the extra delay it introduces. The slide's figure shows block interleaving: the received power over time after a fading channel, the received interleaved data, block de-interleaving, and the recovered data.

24 Example. Consider the code C = { 000, 111 }: a burst error of length 3 cannot be corrected. Now use a 3×3 block interleaver: code words A1 A2 A3, B1 B2 B3, C1 C2 C3 are transmitted as A1 B1 C1 A2 B2 C2 A3 B3 C3. A burst of 2 channel errors then leaves at most 1 error per code word after de-interleaving, which each code word can correct.

25 De-interleaving: block. Read the received stream in column-wise and read it out row-wise: a burst of erasures (marked e) that spans several consecutive transmitted symbols is then spread over the rows, so each row contains only 1 error.

26 Interleaving: convolutional. Input sequence 0 gets no delay; input sequence 1 a delay of b elements; …; input sequence m-1 a delay of (m-1)·b elements. Example: b = 5, m = 3.
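A minimal sketch of the branch-delay idea, under the assumption that symbols are fed to the m branches round-robin and that branch k (holding k·b delay cells, each advanced once per commutator cycle) re-emits its symbol k·b·m symbol periods later. Output slots not yet filled during flushing are left as None.

```python
def conv_interleave(symbols, b, m):
    """Convolutional interleaver sketch: symbol at time t enters branch
    k = t mod m and re-emerges k*b*m symbol periods later. Positions keep
    their residue mod m, so no two symbols collide."""
    total = len(symbols) + (m - 1) * b * m
    out = [None] * total
    for t, s in enumerate(symbols):
        k = t % m
        out[t + k * b * m] = s
    return out

scrambled = conv_interleave(list(range(9)), b=1, m=3)
```

A matching de-interleaver would apply the complementary delays (m-1-k)·b on branch k, so that every symbol experiences the same total delay; comparing this total delay with that of a block interleaver is essentially the homework on the next slide.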

27 Interleaving destroys memory: message → encoder → interleaver → bursty channel → de-interleaver → decoder → message, so the decoder sees "random errors". Note: interleaving introduces encoding and decoding delay. Homework: compare block and convolutional interleaving w.r.t. delay.

28 Middleton-type burst channel model: select channel k with probability Q(k); channel k is a binary channel with transition probability p(k).

29 Impulsive noise classification. (a) Single-transient model, with parameters: peak amplitude, pseudo-frequency f_0 = 1/T_0, damping factor, duration, interarrival time. Measurements carried out by France Telecom in a house during 40 h found 2 classes of pulses (on 1644 pulses): single transients and bursts.

30 The Z-channel, with application in optical communications: a transmitted 0 (light on) is always received correctly, while a transmitted 1 (light off) is received as 0 with probability p and as 1 with probability 1-p. P( x = 0 ) = 1 - P( x = 1 ) = P_0.

31 The erasure channel, with applications in CDMA detection and disk arrays: input x ∈ {0, 1}, output y ∈ {0, E, 1}. Each symbol is received correctly with probability 1-e and erased (output E) with probability e, so the position of the error is known. P( x = 0 ) = 1 - P( x = 1 ) = P_0. Example: disks 1-5 in a disk array, where a failed disk is an erasure at a known position.

32 From Gaussian to binary to erasure: y_i = x_i + e with input x_i = ±1. Hard decision maps the output to + or -; outputs falling in an uncertainty zone around 0 are marked as erasures E.

33 A simple code. For low packet loss rates (e.g. 5%), sending duplicates is expensive (it wastes bandwidth). XOR code: XOR a group of data packets together to produce a repair packet, and transmit data + XOR. This can recover 1 lost packet: XOR the repair packet with the packets that did arrive, e.g. for the group 10101, 11100, 00111.
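The XOR-repair scheme can be sketched in a few lines; the packet values reuse the slide's group, and the function names are mine.

```python
from functools import reduce

def xor_repair(packets):
    """Build one repair packet as the bitwise XOR of a group of data packets."""
    return [reduce(lambda a, b: a ^ b, bits) for bits in zip(*packets)]

def recover(received, repair):
    """Recover a single lost packet: XOR the repair packet with all
    data packets that did arrive."""
    return xor_repair(received + [repair])

data = [[1, 0, 1, 0, 1], [1, 1, 1, 0, 0], [0, 0, 1, 1, 1]]
repair = xor_repair(data)          # transmitted alongside the data
```

Because x ⊕ x = 0, XOR-ing the repair packet with the surviving packets cancels them out and leaves exactly the missing packet; with two or more losses in one group, the scheme fails.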

34 Channel with insertions and deletions. Bad synchronization or clock recovery at the receiver: insertion: 1 0 0 0 1 1 1 0 0 1 0 → 1 0 0 1 0 1 1 1 0 0 1 0; deletion: 1 0 0 0 1 1 1 0 0 1 0 → 1 0 0 1 1 1 0 0 1 0. Problem: finding the start and end of messages.

35 Channel with insertions and deletions, due to errors in the bit pattern. Flag = 1 1 1 1 1 0; bit stuffing avoids 1 1 1 1 1 inside a frame. A single bit error can turn frame data into a flag (a spurious frame boundary is inserted) or turn a flag into frame data (a frame boundary is deleted).
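The stuffing rule that keeps the flag pattern out of the frame body can be sketched as follows (HDLC-style: insert a 0 after every run of five 1s; the unstuffer removes it again). Function names are illustrative.

```python
def bit_stuff(frame):
    """Insert a 0 after every run of five consecutive 1s so the flag
    pattern 1 1 1 1 1 0 cannot appear inside the frame body."""
    out, run = [], 0
    for b in frame:
        out.append(b)
        run = run + 1 if b == 1 else 0
        if run == 5:
            out.append(0)   # stuffed bit
            run = 0
    return out

def bit_unstuff(bits):
    """Remove the 0 that follows every run of five 1s."""
    out, run, i = [], 0, 0
    while i < len(bits):
        b = bits[i]
        out.append(b)
        run = run + 1 if b == 1 else 0
        if run == 5:
            i += 1          # skip the stuffed 0
            run = 0
        i += 1
    return out
```

Stuffing and unstuffing are exact inverses on error-free data; the slide's point is that a channel bit error can break this invariant and create or destroy a flag.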

36 Channels with interference. Example (optical channel): the error probability depends on the symbols in neighboring slots.

37 Channels with memory (example: recording): Y_i = X_i + X_{i-1}, with X_i ∈ {+1, -1} and Y_i ∈ {+2, 0, -2}.
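The memory channel of this slide is a two-line computation; the initial-state convention x_{0} = -1 is my assumption for the sketch.

```python
def dicode(x, x_prev=-1):
    """Channel with memory: y_i = x_i + x_{i-1}, x_i in {+1, -1},
    so y_i takes values in {+2, 0, -2}."""
    out = []
    for xi in x:
        out.append(xi + x_prev)
        x_prev = xi
    return out
```

A 0 at the output is ambiguous on its own (it follows any transition +1→-1 or -1→+1), which is exactly why such channels need sequence detection rather than symbol-by-symbol decisions.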

38 Tasks: construct a probability transformer from uniform to Gaussian; give an overview of burst error models and the statistics of their important parameters.
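For the first task, one standard probability transformer is the Box-Muller method (my choice here; inverse-CDF transformation would also work): two independent uniform(0,1) samples yield a standard-normal sample.

```python
import math, random

def uniform_to_gaussian(rng=random.Random(0)):
    """Probability transformer (Box-Muller): map two independent
    uniform(0,1) samples to one standard-normal N(0,1) sample."""
    u1 = 1.0 - rng.random()   # in (0,1], avoids log(0)
    u2 = rng.random()
    return math.sqrt(-2.0 * math.log(u1)) * math.cos(2.0 * math.pi * u2)

samples = [uniform_to_gaussian() for _ in range(100_000)]
mean = sum(samples) / len(samples)
var = sum(s * s for s in samples) / len(samples) - mean ** 2
```

With enough samples the empirical mean approaches 0 and the variance approaches 1, which is a quick sanity check on the transformer.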

