
1 Turbo Codes COE 543 Mohammed Al-Shammeri

2 Agenda PProject objectives and motivations EError Correction Codes TTurbo Codes Technology TTurbo decoding TTurbo Codes Performance TTurbo Coding Application CConclusion Remarks

3 Introduction
• Motivation
   Can we make communication as close to error-free as possible?
   Can we reach the Shannon limit?
• Objectives
   Study channel coding
   Understand channel capacity
   Find ways to increase the data rate
   Provide a reliable communication link

4 Turbo Codes History
• Introduced at the IEEE International Conference on Communications (ICC), Geneva, 1993
• Berrou, Glavieux, and Thitimajshima: "Near Shannon Limit Error-Correcting Coding and Decoding: Turbo-Codes"
• Provided virtually error-free communication at data-rate/power efficiencies beyond what most experts thought possible

5 Turbo Codes History…
• Double the data throughput at a given power, or operate at half the power
• The two authors were unknown at the time; most assumed they had made a calculation error
• Replication soon showed the results were real
• Many companies adopted the codes, and new companies were founded around them: TurboConcept and iCoding
• Within 0.5 dB of the Shannon limit at P_e = 10^-6

6 Communication System
• Structured, modular approach: various components with defined functions
• Transmit/receive chain: formatting/digitization, source coding, channel coding, multiplexing, modulation, access techniques

7 Channel Coding
• Accounts for the channel
• Two categories:
   Waveform/signal design: more detectable signals
   Structured sequences: added redundancy
• Objective: produce coded signals with better distance properties

8 Binary Symmetric Channel
• A special case of the DMC: discrete input and discrete output, both drawn from {0, 1}
• Memoryless: each symbol is affected independently
• Crossover probability p: a transmitted 0 or 1 is flipped with probability p and passes unchanged with probability 1 − p
• Suited to hard-decision decoding
• p is related to the bit energy
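A minimal sketch (not from the slides) of such a channel in Python; the crossover probability p and the seed are assumed parameters:

```python
import numpy as np

def bsc(bits, p, rng=np.random.default_rng(0)):
    """Flip each bit independently with probability p (memoryless)."""
    flips = rng.random(bits.shape) < p
    return bits ^ flips

bits = np.array([0, 1, 1, 0, 1, 0, 0, 1])
print(bsc(bits, p=0.1))   # on average, about 1 in 10 bits is flipped
```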

9 Gaussian Channel
• Discrete inputs with a continuous output
• Noise is added to the signals passing through it
• Noise is a Gaussian random variable with zero mean and variance σ^2
• The resulting pdf, the likelihood of u_k, is p(y | u_k) = (1 / √(2πσ^2)) · exp(−(y − u_k)^2 / (2σ^2))
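A minimal sketch of that likelihood, assuming antipodal signal levels u_k ∈ {−1, +1} (the levels are an assumption, not from the slide):

```python
import math

def likelihood(y, u_k, sigma):
    """Gaussian pdf of observation y given transmitted level u_k."""
    return math.exp(-(y - u_k) ** 2 / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

print(likelihood(0.8, +1, sigma=1.0))  # observation 0.8 under hypothesis u_k = +1
print(likelihood(0.8, -1, sigma=1.0))  # same observation under u_k = -1
```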

10 Why Use ECC
• Consider the following trade-offs:
   Error performance vs. bandwidth: high redundancy consumes bandwidth
   Power vs. bandwidth: coding allows a reduction in E_b/N_0
   Data rate vs. bandwidth: coding supports a higher rate

11 Shannon Theory
• Founded information theory
• Stated the maximum data rate of a channel for a given error rate and power
• Did not say how to achieve it!
• Clue: large data words, in terms of number of bits, give good distance properties
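The slide names the result without the formula; for reference, Shannon's capacity of a band-limited AWGN channel with bandwidth B and signal-to-noise ratio S/N (a fact from the 1948 paper, not stated on the slide) is:

```latex
C = B \log_2\!\left(1 + \frac{S}{N}\right) \quad \text{bits/s}
```

Shannon proved such rates are achievable but gave no constructive code; closing that gap is exactly the problem turbo codes address.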

12 Error Correction Mechanisms
• Backward error correction (ARQ)
   Needs only error detection capability
   Costs communication (retransmissions)
   Poor fit for real-time traffic
• Forward error correction (FEC)
   Detects and corrects errors
   More complex receivers
   DSP cost

13 Forward Error Correction
• Block codes
   Data split into blocks
   Checks are contained within each block
• Convolutional codes
   Operate on a bit stream
   Involve memory
• Turbo codes
   Built from convolutional codes
   Special properties

14 Structured Redundancy
• A channel encoder maps each k-bit input word to an n-bit codeword (block codes) or code sequence (convolutional codes)
• Redundancy = n − k
• Code rate = k/n (e.g., a rate-1/2 code doubles the number of transmitted bits)

15 Coding Advantages
• [Figure: bit error probability vs. E_b/N_0 (dB), uncoded and coded curves between 10^-3 and 10^-8; at a given error rate the coded curve needs less E_b/N_0, and the horizontal gap between the curves is the coding gain.]

16 Coding Disadvantages
• More bandwidth, due to the redundancy
• Processing delay
• Design complexity

17 Error Correction
• Codewords are points in a hyperspace
• Noise can alter some bits, displacing the point
• If two codewords are close together and an error moves one onto the other, a decoding error results
• So keep large differences between codewords
• But watch the decoder complexity!

18 Hyperspace and Codewords
• [Diagram: codewords as points separated by Hamming distance; received points within a codeword's decision region decode back to the same word.]

19 Good Codes
• Shannon's argument points to random codes
• But with 1000 bits per word there are 2^1000 ≈ 10^301 possible words, an astronomical number
• No way to encode or decode that with conventional coding schemes

20 Turbo Codes
• 30 years earlier, Forney had studied nonsystematic, nonrecursive combinations of convolutional encoders
• Berrou et al., 1993: recursive and systematic
   Based on pseudo-random interleaving
   Works better at high rates or high noise levels
   Uses return-to-zero sequences

21 Turbo Encoder
• [Block diagram: the input X is transmitted as the systematic part; one RSC encoder produces parity Y1 from the natural-order input; a pseudo-random interleaver reorders X and feeds a second RSC encoder producing parity Y2. The codeword is (X, Y1, Y2).]
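A minimal structural sketch of that parallel concatenation. The memory-2 RSC polynomials (feedback 1+D+D^2, forward 1+D^2) and the fixed pseudo-random permutation are illustrative assumptions, not taken from the slides:

```python
import numpy as np

def rsc_parity(bits):
    """Recursive systematic convolutional encoder: return the parity stream
    (the systematic output is just `bits` itself)."""
    s1 = s2 = 0
    parity = []
    for d in bits:
        a = d ^ s1 ^ s2          # recursion: feedback taps at D and D^2
        parity.append(a ^ s2)    # forward taps at 1 and D^2
        s1, s2 = a, s1           # shift the register
    return np.array(parity)

def turbo_encode(x, perm):
    y1 = rsc_parity(x)           # parity from the natural-order input
    y2 = rsc_parity(x[perm])     # parity from the interleaved input
    return x, y1, y2             # systematic + two parity streams (rate 1/3)

rng = np.random.default_rng(1)
x = rng.integers(0, 2, size=16)
perm = rng.permutation(16)       # the pseudo-random interleaver
print(turbo_encode(x, perm))
```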

22 Turbo Codes
• Parallel concatenation
   The k-bit block is encoded N times, each with a different version (order) of the sequence
   The probability that the sequence remains RTZ in all encoders is 1/2^(Nv)
   With 2 encoders, this quasi-random construction reaches an error probability of about 10^-5
   The permutations are chosen to fix d_min

23 Recursive Systematic Coders
• [Diagram: shift register with stages S1, S2, S3 and a feedback path. The data stream is copied out in natural order (the systematic part) while the register computes the parity bits.]

24 Return-to-Zero Sequences
• A non-recursive encoder's state goes to zero after v consecutive '0' inputs
• An RSC returns to zero with probability P = 1/2^v under random input
• So if one wants to turn the convolutional code into a block code, the termination is automatically built in
• The initial state i repeats after encoding k bits
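A quick Monte Carlo check of the 1/2^v claim, assuming the same memory-2 RSC recursion sketched above (v = 2, so the final state should be all-zero about a quarter of the time):

```python
import numpy as np

def final_state(bits):
    """Track only the RSC register state (parity output not needed here)."""
    s1 = s2 = 0
    for d in bits:
        a = int(d) ^ s1 ^ s2
        s1, s2 = a, s1
    return (s1, s2)

rng = np.random.default_rng(2)
hits = sum(final_state(rng.integers(0, 2, 64)) == (0, 0) for _ in range(10000))
print(hits / 10000)   # ≈ 0.25 = 1/2**2
```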

25 Convolutional Encoders
• [Diagram: input stream entering a 4-stage shift register; modulo-2 adders tap the stages to produce the serialized output stream.]
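A minimal sketch in the spirit of the diagram: a shift register tapped by modulo-2 adders. For brevity it assumes a 3-stage register with the textbook rate-1/2 generators (7, 5) in octal, rather than the slide's 4-stage register:

```python
def conv_encode(bits, g=(0b111, 0b101)):
    """Nonrecursive convolutional encoder; one output bit per generator tap set."""
    reg = 0
    out = []
    for b in bits:
        reg = ((reg << 1) | b) & 0b111                      # shift the new bit in
        out += [bin(reg & gi).count("1") & 1 for gi in g]   # modulo-2 adders
    return out

print(conv_encode([1, 0, 1, 1, 0, 0]))  # serialized output stream, rate 1/2
```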

26 Turbo Decoding

27 • Criterion: for n probabilistic processors working together to estimate common symbols, all of them should agree on the symbols with the same probabilities a single decoder would produce

28 Turbo Decoder

29 • The inputs to the decoders are the log-likelihood ratios (LLRs) for the individual symbols d. • The LLR for a symbol d is defined (Berrou) as Λ(d) = log [ P(d = 1 | observation) / P(d = 0 | observation) ]

30 Turbo Decoder
• Each SISO decoder re-evaluates the LLR, using the local redundancy (Y1 or Y2) to improve its confidence
• The value z is the extrinsic information produced by that decoder: negative if d is estimated to be 0, positive if d is estimated to be 1
• The updated LLR is fed to the other decoder, which computes its own z and updates the LLR in turn
• After several iterations, both decoders converge to a value for the symbol
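A structural sketch only, not the slides' algorithm: it shows how the extrinsic values z pass between the two SISO stages through the interleaver. `siso_extrinsic` is a hypothetical placeholder so the loop runs; a real decoder computes it with BCJR/MAP or SOVA over the RSC trellis:

```python
import numpy as np

def siso_extrinsic(llr_sys, llr_parity, a_priori):
    # Hypothetical stand-in, NOT a MAP decoder: a real SISO stage returns
    # genuine extrinsic evidence not already contained in its inputs.
    return np.tanh((llr_sys + a_priori) / 2) + 0.5 * llr_parity

def turbo_decode(llr_x, llr_y1, llr_y2, perm, iters=8):
    """llr_x, llr_y1 in natural order; llr_y2 in interleaved order (see encoder)."""
    inv = np.argsort(perm)          # inverse permutation (the de-interleaver)
    z2 = np.zeros_like(llr_x)       # decoder 2's extrinsic, natural order
    z1 = np.zeros_like(llr_x)
    for _ in range(iters):
        z1 = siso_extrinsic(llr_x, llr_y1, z2)                   # decoder 1
        z2 = siso_extrinsic(llr_x[perm], llr_y2, z1[perm])[inv]  # decoder 2
    return llr_x + z1 + z2          # final LLR for each symbol
```

The sign of the returned LLR gives the hard decision described on slide 32.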

31 Turbo Decoding
• Assume:
   U_i: modulating bit, {0, 1}
   Y_i: received value, the output of a correlator; can take any value (soft)
• The turbo decoder input is the log-likelihood ratio R(u_i) = log [ P(Y_i | U_i = 1) / P(Y_i | U_i = 0) ]
• For BPSK, R(u_i) = 2·Y_i / σ^2
• For each data bit, calculate the LLR given that a sequence of bits was sent

32 Turbo Decoding
• Compare the LLR output against zero to see whether the estimate leans toward 0 or 1, then take the hard decision
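A minimal sketch of slides 31–32, assuming the BPSK mapping 0 → −1, 1 → +1 over an AWGN channel; the noise level `sigma` is an assumed parameter:

```python
import numpy as np

rng = np.random.default_rng(3)
bits = rng.integers(0, 2, size=10)
sigma = 0.8
y = (2 * bits - 1) + sigma * rng.normal(size=bits.size)  # correlator outputs

llr = 2 * y / sigma**2           # R(u_i): positive leans toward 1, negative toward 0
decisions = (llr > 0).astype(int)  # hard decision on the sign of the LLR
print(bits, decisions, sep="\n")
```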

33 Soft-In/Soft-Out Processor
• At the heart of the decoder
• Represents all possible states of the encoder (a trellis)
• The number of states at a particular clock tick is 2^n, where n = number of flip-flops in the shift register
• The trellis shows:
   The current state
   The possible paths leading to that state

34 SISO
• Label every branch with a branch metric, a function of the processor inputs
• Obtain the LLR for each data bit by traversing the trellis
• Two algorithms:
   Soft-Output Viterbi Algorithm (SOVA)
   Maximum A Posteriori (MAP), often implemented as Log-MAP

35 How Do They Work [illustration © IEEE Spectrum]


37 Turbo Codes Performance [figure]

38 Turbo Codes Applications
• Deep-space exploration
   ESA's SMART-1 probe
   JPL equipped Pathfinder (1997)
• Mobile 3G systems
   In use in Japan: UMTS, NTT DoCoMo
   Turbo codes: pictures, video, mail
   Convolutional codes: voice

39 Conclusion: End of the Search
• Turbo codes approach the theoretical limits with only a small gap
• They revived interest in related codes, notably Low-Density Parity-Check (LDPC) codes
• Decoding delay still needs improvement

