MAP decoding: The BCJR algorithm


1 MAP decoding: The BCJR algorithm
Maximum A posteriori Probability
Bahl-Cocke-Jelinek-Raviv (1974); related to the Baum-Welch algorithm (1963?)*
Decoder inputs:
Received sequence r (soft or hard)
A priori L-values La(ul) = ln( P(ul=1) / P(ul=0) )
Decoder outputs:
A posteriori L-values L(ul) = ln( P(ul=1 | r) / P(ul=0 | r) )
L(ul) > 0: ul is most likely to be 1
L(ul) < 0: ul is most likely to be 0

2 BCJR (Continued)

3 BCJR (Continued)

4 BCJR (Continued)

5 BCJR (Continued) AWGN

6 MAP algorithm
Initialize forward recursion α_0(s) and backward recursion β_N(s)
Compute branch metrics {γ_l(s′,s)}
Carry out forward recursion {α_{l+1}(s)} based on {α_l(s)}
Carry out backward recursion {β_{l-1}(s)} based on {β_l(s)}
Compute APP L-values
Complexity: approximately 3× Viterbi
Requires detailed knowledge of the SNR
Viterbi just maximizes the correlation r·v and does not require the exact SNR
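The forward/backward steps above can be sketched on a toy trellis. This is an illustrative sketch only: the 2-state trellis and the random branch metrics gamma[l, s', s] are stand-ins for the γ_l(s′,s) values that a real decoder computes from the received sequence, and branches into state 1 are arbitrarily assumed to carry u = 1.

```python
import numpy as np

S, N = 2, 4                          # toy trellis: 2 states, 4 steps
rng = np.random.default_rng(0)
gamma = rng.random((N, S, S))        # gamma[l, s', s] > 0 (illustrative)

# Forward recursion: alpha_{l+1}(s) = sum_{s'} alpha_l(s') * gamma_l(s', s)
alpha = np.zeros((N + 1, S))
alpha[0, 0] = 1.0                    # encoder starts in the zero state
for l in range(N):
    alpha[l + 1] = alpha[l] @ gamma[l]
    alpha[l + 1] /= alpha[l + 1].sum()   # normalize for numerical stability

# Backward recursion: beta_l(s') = sum_s gamma_l(s', s) * beta_{l+1}(s)
beta = np.zeros((N + 1, S))
beta[N] = 1.0 / S                    # unterminated: all end states equally likely
for l in range(N - 1, -1, -1):
    beta[l] = gamma[l] @ beta[l + 1]
    beta[l] /= beta[l].sum()

# APP L-value: ratio of branch probabilities alpha * gamma * beta
L_values = []
for l in range(N):
    joint = alpha[l][:, None] * gamma[l] * beta[l + 1][None, :]
    L_values.append(np.log(joint[:, 1].sum() / joint[:, 0].sum()))
print([round(float(L), 3) for L in L_values])
```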

7 BCJR (Continued) Information bits Termination bits

8 BCJR (Continued)

9 BCJR (Continued)

10 Log-MAP algorithm
Initialize forward recursion α_0*(s) and backward recursion β_N*(s)
Compute branch metrics {γ_l*(s′,s)}
Carry out forward recursion {α_{l+1}*(s)} based on {α_l*(s)}
Carry out backward recursion {β_{l-1}*(s)} based on {β_l*(s)}
Compute APP L-values
Advantages over the MAP algorithm: easier to implement, numerically stable
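The Log-MAP recursions replace multiplications and sums by additions and the max* operation, max*(a, b) = ln(e^a + e^b) = max(a, b) + ln(1 + e^{-|a-b|}). A minimal sketch of that identity:

```python
import math

def max_star(a, b):
    """max*(a, b) = ln(e^a + e^b) = max(a, b) + correction term."""
    return max(a, b) + math.log1p(math.exp(-abs(a - b)))

# Both forms agree; the second avoids overflow for large metrics.
print(round(max_star(2.0, 1.0), 4))                       # 2.3133
print(round(math.log(math.exp(2.0) + math.exp(1.0)), 4))  # 2.3133
```

In hardware the correction term ln(1 + e^{-|a-b|}) is typically read from a small lookup table rather than computed.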

11 Max-log-MAP algorithm
Replace max* by max: delete the table-lookup correction term
Advantage: simpler, faster (?)
The forward and backward passes become equivalent to Viterbi decoders
Disadvantage: less accurate (but the correction term is bounded in size by ln 2)
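The size of the dropped correction term can be checked directly; it peaks at ln 2 when the two metrics are equal and decays quickly as they separate:

```python
import math

def correction(a, b):
    """Term dropped by Max-log-MAP: ln(1 + e^{-|a-b|})."""
    return math.log1p(math.exp(-abs(a - b)))

# Worst case at a == b: correction = ln 2
print(round(correction(1.0, 1.0), 4))   # 0.6931
# Already negligible once the metrics differ by a few units:
print(round(correction(5.0, 0.0), 4))   # 0.0067
```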

12 Example: log-MAP

13 Example: log-MAP
Assume Es/N0 = 1/4 = -6.02 dB; R = 3/8, so Eb/N0 = 2/3 = -1.76 dB
[Trellis diagram with branch metrics and APP L-values; the numerical labels are not recoverable from this transcript.]

14 Example: Max-log-MAP
Assume Es/N0 = 1/4 = -6.02 dB; R = 3/8, so Eb/N0 = 2/3 = -1.76 dB
[Trellis diagram with branch metrics and APP L-values; the numerical labels are not recoverable from this transcript.]
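The dB values in the example setup follow from Eb/N0 = Es/(R·N0) and the usual 10·log10 conversion:

```python
import math

def to_db(x):
    """Convert a power ratio to decibels."""
    return 10 * math.log10(x)

Es_N0 = 1 / 4
R = 3 / 8
Eb_N0 = Es_N0 / R            # Eb/N0 = Es/(R*N0) = 2/3

print(round(to_db(Es_N0), 2))   # -6.02
print(round(to_db(Eb_N0), 2))   # -1.76
```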

15 Punctured convolutional codes
Recall that an (n,k) convolutional code has a decoder trellis with 2^k branches going into each state, i.e., more complex decoding
Solutions: syndrome decoding, bit-oriented trellis, punctured codes
Start with a low-rate convolutional mother code (rate 1/n?)
Puncture (delete) some code bits according to a predetermined pattern
Punctured bits are not transmitted; hence the code rate is increased, but the distance between codewords is reduced
The decoder inserts dummy bits with a neutral metric contribution
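The puncture/depuncture steps can be sketched as follows. The period-2 pattern that turns a rate-1/2 mother code into rate 2/3 is an illustrative textbook-style choice, not taken from the slides, and the code bits are made-up values:

```python
# (keep v0?, keep v1?) per time step, period 2: drops every other v1 bit
pattern = [(1, 1), (1, 0)]

def puncture(code_bits, pattern):
    """Transmit only the mother-code bits whose pattern entry is 1."""
    out = []
    for t, (v0, v1) in enumerate(code_bits):
        keep0, keep1 = pattern[t % len(pattern)]
        if keep0: out.append(v0)
        if keep1: out.append(v1)
    return out

def depuncture(rx, pattern, n_steps, neutral=0.0):
    """Reinsert a neutral metric (e.g. LLR 0) where bits were punctured."""
    out, it = [], iter(rx)
    for t in range(n_steps):
        keep0, keep1 = pattern[t % len(pattern)]
        out.append((next(it) if keep0 else neutral,
                    next(it) if keep1 else neutral))
    return out

coded = [(1, 0), (1, 1), (0, 0), (1, 1)]   # 4 steps * 2 bits: rate 1/2
sent = puncture(coded, pattern)            # 6 of 8 bits sent: rate 4/6 = 2/3
print(sent)                                # [1, 0, 1, 0, 0, 1]
```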

16 Example: Rate 2/3 punctured from rate 1/2
dfree = 3
The punctured code is also a convolutional code
Note the bit-level trellis

17 Example: Rate 3/4 punctured from rate 1/2
dfree = 3

18 More on punctured convolutional codes
Rate-compatible punctured convolutional codes:
For applications that need to support several code rates, for example adaptive coding
Sequence of codes obtained by repeated puncturing
Advantage: one decoder can decode all codes in the family
Disadvantage: the resulting codes may be sub-optimum
Puncturing patterns:
Usually periodic puncturing maps
Found by computer search
Care must be exercised to avoid catastrophic encoders
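Rate compatibility means the patterns are nested: a higher-rate (more heavily punctured) code transmits only positions that every lower-rate code in the family also transmits. A small sketch of that check; the flattened patterns below are hypothetical illustrations, not a code family from the slides:

```python
# Flattened keep/drop patterns over the mother-code bit stream (period repeats).
# Hypothetical family punctured from a rate-1/2 mother code.
family = {
    "2/3": [1, 1, 1, 0],                  # keep 3 of 4 mother bits
    "4/5": [1, 1, 1, 0, 1, 0, 1, 0],      # keep 5 of 8 mother bits
}

def rate_compatible(low_rate, high_rate):
    """True if the high-rate pattern keeps only positions the low-rate one keeps."""
    reps = len(low_rate) * len(high_rate)       # compare over a common period
    lo = low_rate * (reps // len(low_rate))
    hi = high_rate * (reps // len(high_rate))
    return all(l >= h for l, h in zip(lo, hi))

print(rate_compatible(family["2/3"], family["4/5"]))   # True
```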

19 Best punctured codes
[Table of best punctured code parameters; the entries are not recoverable from this transcript.]

20 Tailbiting convolutional codes
Purpose: avoid the terminating tail (rate loss) without distance loss? Cannot avoid distance loss completely
Tailbiting codes: paths through the trellis
Codewords can start in any state: gives 2^ν times as many codewords (ν = encoder memory)
But each codeword must end in the same state that it started from: gives 2^-ν times as many codewords
Tailbiting codes are increasingly popular for moderate block lengths
DVB: Turbo codes with tailbiting component codes

21 "Circular" trellis decoding
Try all possible starting states (multiplies complexity by 2^ν)
Suboptimum alternatives:
Viterbi: let decoding proceed until the best ending state is found, then continue "one round" from there
MAP: similar

22 Example: Feedforward encoder
Feedforward: it is always possible to find an information vector that ends in the proper state, so a codeword can start in any state and end in any state
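For a feedforward encoder the state is simply the most recent ν input bits, so tailbiting is straightforward: preload the shift register with the last ν information bits and the encoder is guaranteed to end in its starting state. A sketch with the standard (7, 5) octal, memory-2 generators as an assumed example encoder:

```python
def tailbite_encode(u, nu=2):
    """Tailbiting encoding for a feedforward rate-1/2 encoder, g = (7, 5) octal."""
    state = list(reversed(u[-nu:]))   # start state = last nu info bits, newest first
    start = tuple(state)
    out = []
    for bit in u:
        s = [bit] + state             # register contents, newest first
        v0 = s[0] ^ s[1] ^ s[2]       # g0 = 111 (7 octal)
        v1 = s[0] ^ s[2]              # g1 = 101 (5 octal)
        out.extend([v0, v1])
        state = s[:nu]                # shift register update
    assert tuple(state) == start      # tailbiting: end state == start state
    return out

u = [1, 0, 1, 1, 0]
code = tailbite_encode(u)
print(len(code))                      # 10 code bits, no termination tail
```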

23 Example: Feedback encoder
Feedback encoder: it is not always possible, for every length, to construct a tailbiting systematic recursive encoder
For each u, a unique starting state must be found
L* = 6: not OK; L* = 5: OK

24 Tailbiting codes as block codes
Tailbiting codes are block codes

25 Suggested exercises

