MAP decoding: The BCJR algorithm

MAP decoding: The BCJR algorithm
Maximum A Posteriori probability decoding
Bahl-Cocke-Jelinek-Raviv (1974); cf. the Baum-Welch algorithm (1963?)*
Decoder inputs:
Received sequence r (soft or hard)
A priori L-values La(ul) = ln( P(ul = 1) / P(ul = 0) )
Decoder outputs:
A posteriori L-values L(ul) = ln( P(ul = 1 | r) / P(ul = 0 | r) )
L(ul) > 0: ul is most likely to be 1
L(ul) < 0: ul is most likely to be 0
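The sign/magnitude convention for L-values can be sketched in a few lines of Python (the function names are illustrative, not from the slides):

```python
import math

def llr(p1: float) -> float:
    """L-value of a bit with P(u = 1) = p1: L = ln(p1 / (1 - p1))."""
    return math.log(p1 / (1.0 - p1))

def hard_decision(L: float) -> int:
    """L > 0 means u = 1 is more likely; L < 0 means u = 0 is."""
    return 1 if L > 0 else 0
```

The magnitude of L carries the reliability: |L| grows without bound as the probability approaches 0 or 1, and L = 0 means the two values are equally likely.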

BCJR (Continued)
[several figure-only slides in the original: the BCJR derivation, including the AWGN channel case]

MAP algorithm
Initialize the forward recursion α0(s) and the backward recursion βN(s)
Compute the branch metrics {γl(s′, s)}
Carry out the forward recursion: compute {αl+1(s)} from {αl(s)}
Carry out the backward recursion: compute {βl−1(s)} from {βl(s)}
Compute the APP L-values
Complexity: approximately 3× Viterbi
Requires detailed knowledge of the SNR; Viterbi just maximizes the correlation r·v and does not require the exact SNR
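The recursion structure above can be sketched in the probability domain on a generic trellis. This is a toy illustration, not the slides' implementation: the branch-metric representation (a dict per trellis section keyed by (previous state, next state, input bit)) is an assumption made for compactness.

```python
import math

def bcjr(gammas, num_states):
    """Probability-domain BCJR on a terminated trellis.

    gammas[l] maps (s_prev, s_next, u) -> branch metric gamma_l(s', s)
    for trellis section l.  Returns the APP L-value of each input bit.
    """
    N = len(gammas)
    # Initialization: the encoder starts and ends in state 0.
    alpha = [[0.0] * num_states for _ in range(N + 1)]
    beta = [[0.0] * num_states for _ in range(N + 1)]
    alpha[0][0] = 1.0
    beta[N][0] = 1.0
    # Forward recursion: alpha_{l+1}(s) = sum_{s'} alpha_l(s') gamma_l(s', s)
    for l in range(N):
        for (sp, s, u), g in gammas[l].items():
            alpha[l + 1][s] += alpha[l][sp] * g
    # Backward recursion: beta_l(s') = sum_{s} gamma_l(s', s) beta_{l+1}(s)
    for l in range(N - 1, -1, -1):
        for (sp, s, u), g in gammas[l].items():
            beta[l][sp] += g * beta[l + 1][s]
    # APP L-values: ratio of path probabilities for u = 1 vs u = 0 branches
    L = []
    for l in range(N):
        num = den = 0.0
        for (sp, s, u), g in gammas[l].items():
            p = alpha[l][sp] * g * beta[l + 1][s]
            if u == 1:
                num += p
            else:
                den += p
        L.append(math.log(num / den))
    return L
```

The direct products of probabilities underflow for long blocks, which is one reason the log-domain versions on the following slides are preferred in practice.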

BCJR (Continued)
[figure-only slide: trellis labeled with information bits and termination bits]

Log-MAP algorithm
Initialize the forward recursion α0*(s) and the backward recursion βN*(s)
Compute the branch metrics {γl*(s′, s)}
Carry out the forward recursion: compute {αl+1*(s)} from {αl*(s)}
Carry out the backward recursion: compute {βl−1*(s)} from {βl*(s)}
Compute the APP L-values
Advantages over the MAP algorithm: easier to implement, and numerically stable
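Log-MAP replaces every sum of probabilities with the max* (Jacobian logarithm) operation on log-domain metrics. A sketch with the table-lookup correction term, where the 8-entry/0.5-step table sizing is a typical implementation choice and not taken from the slides:

```python
import math

# Quantized correction table for ln(1 + e^{-x}); size and step are
# illustrative assumptions, as used in many hardware log-MAP designs.
_STEP = 0.5
_TABLE = [math.log1p(math.exp(-i * _STEP)) for i in range(8)]

def max_star(a: float, b: float) -> float:
    """max*(a, b) = ln(e^a + e^b) = max(a, b) + ln(1 + e^{-|a - b|})."""
    d = abs(a - b)
    i = int(d / _STEP)
    # Beyond the table the correction is negligible and is taken as 0.
    return max(a, b) + (_TABLE[i] if i < len(_TABLE) else 0.0)
```

Because all quantities stay in the log domain, the recursions add metrics instead of multiplying probabilities, which is what makes the algorithm numerically stable.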

Max-log-MAP algorithm
Replace max* by max, i.e. delete the table-lookup correction term
Advantage: simpler and (usually) faster; the forward and backward passes become equivalent to Viterbi decoding
Disadvantage: less accurate (but the correction term is bounded in size by ln 2)
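The size of what Max-log-MAP discards can be checked directly; here the exact max* is computed in closed form rather than by table lookup:

```python
import math

def max_star_exact(a: float, b: float) -> float:
    """Exact Jacobian logarithm: ln(e^a + e^b)."""
    return max(a, b) + math.log1p(math.exp(-abs(a - b)))

# The error of the plain-max approximation is ln(1 + e^{-|a - b|}):
# largest (ln 2) when the competing metrics are equal, and negligible
# once they are a few units apart.
worst_case = max_star_exact(1.0, 1.0) - max(1.0, 1.0)
far_apart = max_star_exact(8.0, 0.0) - max(8.0, 0.0)
```

This bounded error is why Max-log-MAP typically costs only a fraction of a dB, and why it needs no exact SNR knowledge, just like Viterbi.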

Example: log-MAP
[trellis figure with numerical state and branch metrics; assumes Es/N0 = 1/4 = −6.02 dB and R = 3/8, so Eb/N0 = 2/3 = −1.76 dB]

Example: Max-log-MAP
[the same example decoded with Max-log-MAP; the figure's metrics differ slightly from the log-MAP values]

Punctured convolutional codes
Recall that an (n,k) convolutional code has a decoder trellis with 2^k branches entering each state, making decoding more complex
Solutions: syndrome decoding, a bit-oriented trellis, or punctured codes
Punctured codes: start with a low-rate convolutional mother code (often rate 1/n)
Puncture (delete) some code bits according to a predetermined pattern
Punctured bits are not transmitted; the code rate is increased, but the distance between codewords is reduced
The decoder inserts dummy bits with a neutral metric contribution at the punctured positions
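The puncturing step at the transmitter and the dummy-bit insertion at the decoder can be sketched as follows (pattern and function names are illustrative; a period-4 pattern keeping 3 of every 4 rate-1/2 output bits yields rate 2/3):

```python
def puncture(bits, pattern):
    """Delete the code bits where the periodic puncturing map has a 0."""
    return [b for i, b in enumerate(bits) if pattern[i % len(pattern)]]

def depuncture(bits, pattern, total_len, erasure=None):
    """Re-insert dummy symbols at the punctured positions.

    The erasure symbol stands for a neutral (zero) metric contribution,
    so the decoder treats punctured positions as carrying no information.
    """
    out, it = [], iter(bits)
    for i in range(total_len):
        out.append(next(it) if pattern[i % len(pattern)] else erasure)
    return out
```

Because depuncturing restores the mother code's trellis length, the decoder for the rate-1/2 mother code is reused unchanged; only its input metrics differ.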

Example: Rate 2/3 punctured from rate 1/2 (dfree = 3)
The punctured code is also a convolutional code; note the bit-level trellis

Example: Rate 3/4 punctured from rate 1/2 (dfree = 3)

More on punctured convolutional codes
Rate-compatible punctured convolutional (RCPC) codes
For applications that need to support several code rates, for example adaptive coding
A sequence of codes is obtained by repeated puncturing
Advantage: one decoder can decode all codes in the family
Disadvantage: the resulting codes may be sub-optimum
Puncturing patterns: usually periodic puncturing maps, found by computer search
Care must be exercised to avoid catastrophic encoders
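The rate-compatibility constraint, namely that each higher-rate code in the family punctures a superset of the positions punctured by the lower-rate codes, can be stated as a simple check. The patterns below are hypothetical, chosen only to illustrate the nesting:

```python
# Hypothetical period-8 patterns over a rate-1/2 mother code
# (8 coded bits per 4 info bits; keeping k of 8 gives rate 4/k).
patterns = {
    "1/2": [1, 1, 1, 1, 1, 1, 1, 1],
    "2/3": [1, 1, 1, 0, 1, 1, 1, 0],   # keep 6 of 8
    "4/5": [1, 1, 1, 0, 1, 0, 1, 0],   # keep 5 of 8
}

def is_rate_compatible(low_rate, high_rate):
    """Higher-rate patterns may only puncture further, never restore bits."""
    return all(l >= h for l, h in zip(low_rate, high_rate))
```

This nesting is what lets a single mother-code decoder serve every rate in the family, and it is why incremental-redundancy schemes can send only the extra bits when stepping down to a lower rate.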

Best punctured codes
[table of best punctured codes; the entries are not recoverable from the transcript]

Tailbiting convolutional codes
Purpose: avoid the terminating tail (and its rate loss), ideally without distance loss; the distance loss cannot be avoided completely
Tailbiting codes: paths through the trellis
Codewords can start in any state: gives 2^ν times as many codewords
…but each codeword must end in the same state that it started from: gives 2^−ν times as many codewords
Tailbiting codes are increasingly popular for moderate-length applications
DVB: turbo codes with tailbiting component codes

”Circular” trellis
Decoding: try all possible starting states (multiplies complexity by 2^ν)
Suboptimum alternatives:
Viterbi: let decoding proceed until the best ending state is found, then continue ”one round” from there
MAP: similar
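The brute-force variant can be sketched generically; here decode_from_state is a stand-in for one constrained Viterbi (or MAP) run and not a real decoder:

```python
def tailbiting_decode(r, decode_from_state, num_states):
    """Try every hypothesized starting state s and keep the best result.

    decode_from_state(r, s) is assumed to return (metric, info_bits) for
    the best path that both starts and ends in state s; lower metric is
    better.  Trying all 2**nu states multiplies complexity by 2**nu.
    """
    best = None
    for s in range(num_states):
        metric, bits = decode_from_state(r, s)
        if best is None or metric < best[0]:
            best = (metric, bits)
    return best[1]
```

The suboptimum "one round" trick avoids this factor: a single pass around the circular trellis usually converges to the correct starting state, at the cost of a small loss in optimality.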

Example: Feedforward encoder
Feedforward: it is always possible to find an information vector that ends in the proper state → a path can start in any state and be driven to end in that same state

Example: Feedback encoder
Feedback encoder: it is not always possible, for every length, to construct a tailbiting systematic recursive encoder
For each u, a unique starting state must be found
L* = 6: not OK; L* = 5: OK

Tailbiting codes as block codes Tailbiting codes are block codes

Suggested exercises 12.27-12.39