Mandatory exercise for Inf 244 – Deadline: October 29th – The assignment is to implement an encoder/decoder system.


1 http://www.ii.uib.no/~paale/oblig.xhtml
– Mandatory exercise for Inf 244
– Deadline: October 29th
– The assignment is to implement an encoder/decoder system in Matlab using the Communication Blockset. The system must simulate communication over an AWGN channel using one of these codes:
– Block code
– Convolutional code
– PCCC
– LDPC
– You are free to implement any of these coding techniques, as long as the following requirements are fulfilled: information length k = 1024, block length n = 3072, Eb/N0 = 1.25
– We will test your answers on our own computer and evaluate them based on bit error rate versus CPU time usage according to the formula p = T · BER
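The combined score rewards both speed and accuracy; a small sketch (the decoder figures below are hypothetical, only the formula p = T · BER is from the assignment):

```python
def penalty(cpu_time, ber):
    """Assignment score p = T * BER: CPU time multiplied by bit error rate.
    Lower is better."""
    return cpu_time * ber

# Hypothetical trade-off: a slower decoder can still win on the combined
# score if its BER is low enough.
fast = penalty(cpu_time=10.0, ber=1e-3)      # p = 10.0 * 1e-3 = 0.01
strong = penalty(cpu_time=120.0, ber=1e-5)   # p = 120.0 * 1e-5 = 0.0012
```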

2 http://www.ii.uib.no/~paale/oblig.xhtml
How to create and run simulations in MATLAB from scratch:
– Run matlab from a command window.
– Type simulink in MATLAB's command window.
– Choose File -> New -> Model in Simulink's main window.
– Create the model by dragging and dropping elements into it.
How to finish this exercise starting from a demo:
– Run matlab from a command window.
– Type sccc_sim in MATLAB's command window. A ready-made demo of an SCCC opens.
– Study the demo closely and modify it to your needs.

3 Design of turbo codes
Turbo codes can be designed for performance at:
– High SNR: dominated by errors that an ML decoder would also make
– Low SNR: dominated by errors due to imperfectness of the decoding algorithm
Design choices: constituent codes, interleaver

4 Design of turbo codes for high SNR
Goal: maximize the minimum distance d and minimize the number A_d of codewords of weight d. In general, design the code for a thin weight spectrum.
Use recursive component encoders!
Simple (but flawed!) approach: concentrate on the weight-two inputs. This applies if the interleavers are chosen at random, but it is possible (and even easy) to avoid the problem.

5 Weight-two inputs
Assume a primitive feedback polynomial of degree ν (here ν = 2, period 2^ν − 1 = 3). Two weight-two input vectors that take the encoder out of and back to the initial state:
(1, 0, 0, 1) – corresponds to parity weight z_min
(1, 0, 0, 0, 0, 0, 1)
In general: a 1 followed by 3m − 1 zeros followed by a 1. Even more generally: a 1 followed by (2^ν − 1)m − 1 zeros followed by a 1.
Effective free distance: d_eff = 2 + 2 · z_min
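This periodicity is easy to check by simulating the feedback register directly. A sketch (not from the slides) for a rate-1 recursive encoder with feedback d(D) = 1 + D + D^2, ν = 2, so weight-two inputs 1 + D^{3m} terminate the trellis:

```python
# Recursive encoder with primitive feedback d(D) = 1 + D + D^2 (nu = 2).
# A weight-two input 1 + D^{3m} drives the encoder out of and back to the
# all-zero state, because the feedback register cycles with period 2^nu - 1 = 3.

def final_state(bits):
    """Shift-register state after feeding `bits` into the recursive encoder
    with feedback taps at D and D^2."""
    s = [0, 0]
    for u in bits:
        fb = u ^ s[0] ^ s[1]   # feedback bit
        s = [fb, s[0]]         # shift the register
    return s

for m in (1, 2, 3):
    x = [0] * (3 * m + 1)
    x[0] = x[3 * m] = 1        # input 1 + D^{3m}
    print(m, final_state(x))   # returns to [0, 0] for every m
```

A weight-two input with the wrong spacing, e.g. (1, 1), leaves the encoder in a nonzero state, so its parity sequence keeps accumulating weight until termination.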

6 Theorem on z_min
Theorem: Let G(D) = [1, e(D)/d(D)], where the denominator polynomial d(D) is primitive of degree ν. Then z_min = 2^(ν−1) + 2s, where s = 1 if e(D) has constant coefficient 1 and degree ν, and s = 0 otherwise.
Proof:
– d(D) is the generator polynomial of a cyclic (2^ν − 1, 2^ν − 1 − ν) Hamming code.
– q(D) = (D^(2^ν − 1) + 1) / d(D) is the generator polynomial of a cyclic (2^ν − 1, ν) maximum-length code of minimum distance 2^(ν−1).
– If deg e(D) < ν: e(D)q(D) is a codeword of the maximum-length code and so has weight 2^(ν−1).
– If deg e(D) = ν: write e(D) = D · D^(ν−1) + e^(2)(D). Then c_1(D) = D^(ν−1) q(D) and c_2(D) = e^(2)(D) q(D) are both codewords of the maximum-length code and so have weight 2^(ν−1). Moreover, D · c_1(D) = [cyclic shift of c_1(D)] + D^(2^ν − 1) + 1, so e(D)q(D) is [a codeword with constant coefficient 0] + 1 + D^(2^ν − 1).
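The theorem can be verified numerically for a small case. The sketch below (not from the slides) takes ν = 3 with the primitive polynomial d(D) = 1 + D + D^3, representing GF(2) polynomials as Python integers with bit i holding the coefficient of D^i:

```python
def gf2_mul(a, b):
    """Multiply two GF(2) polynomials packed into integers."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        b >>= 1
    return r

def gf2_div(num, den):
    """Long division of GF(2) polynomials; returns (quotient, remainder)."""
    q = 0
    while num.bit_length() >= den.bit_length():
        shift = num.bit_length() - den.bit_length()
        q ^= 1 << shift
        num ^= den << shift
    return q, num

nu = 3
d = 0b1011                                     # d(D) = 1 + D + D^3, primitive
q, rem = gf2_div((1 << (2**nu - 1)) ^ 1, d)    # q(D) = (D^7 + 1) / d(D)
assert rem == 0                                # d(D) divides D^7 + 1

def parity_weight(e):
    """Weight of e(D)q(D): parity weight for the weight-two input 1 + D^7."""
    return bin(gf2_mul(e, q)).count("1")

print(parity_weight(0b1101))   # e = 1 + D^2 + D^3: deg nu, const coeff 1 -> s = 1 -> 4 + 2 = 6
print(parity_weight(0b0110))   # e = D + D^2: s = 0 -> 2^(nu-1) = 4
```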

7 Convolutional recursive encoders for PCCC codes
[Table of recommended recursive encoders; the generator-polynomial entries did not survive transcription.]

8 Convolutional recursive encoders for PCCC codes
[Continuation of the encoder table; the entries did not survive transcription.]

9 Choice of component codes
The listed codes may not have the best free distance, but they have a better mapping (compared to "optimum" convolutional codes) of input weights to output weights.
The overall turbo code performance also depends on the actual interleaver used.

10 Choice of interleaver
Pseudorandom interleavers with enhanced requirements:
– Interleavers that avoid the problem with weight-2 inputs: if |i − j| = (2^ν − 1)m, then |π(i) − π(j)| ≠ (2^ν − 1)n (for n + m small)
– S-random interleaver: if |i − j| ≤ S, then |π(i) − π(j)| ≥ S
– Interleavers specialized to the actual encoders: maintain a list of "critical" sets of positions (the information symbols of low-weight words), and do not map one critical set into another
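The S-random condition can be met with a simple greedy construction that restarts on dead ends. A sketch (not from the slides); the choice S ≈ sqrt(N/2) is a common rule of thumb, and the parameters below are illustrative:

```python
import random

def s_random_interleaver(n, S, restarts=2000, seed=1):
    """Greedy construction of a permutation pi of {0, ..., n-1} with the
    spreading property: |i - j| <= S  implies  |pi(i) - pi(j)| >= S."""
    rng = random.Random(seed)
    for _ in range(restarts):
        pool = list(range(n))
        rng.shuffle(pool)
        pi = []
        while pool:
            for k, cand in enumerate(pool):
                # Check the candidate against the S most recently placed values.
                if all(abs(cand - p) >= S for p in pi[-S:]):
                    pi.append(pool.pop(k))
                    break
            else:
                break        # dead end: restart with a fresh shuffle
        if len(pi) == n:
            return pi
    raise RuntimeError("no S-random permutation found; try a smaller S")

pi = s_random_interleaver(64, S=4)
```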

11 Design of turbo codes for low SNR
The foregoing discussion assumes that the decoding is close to maximum likelihood. This is not the case at very low SNRs.
Goal for low SNR: optimize the interchange of information between the constituent decoders.
Analyze this interchange by using density evolution or EXIT charts.

12 EXtrinsic Information Transfer (EXIT) charts
Approach: a SISO block produces more accurate information about the transmitted information at its output than what is available at its input. The amount of information can be precisely quantified using information theory:
– The entropy of a stochastic variable X is H(X) = −Σ_x P(X=x) log P(X=x). It is a measure of uncertainty.
– The mutual information is I(X;Y) = H(X) − H(X|Y).
For a specified SNR (and thus a known amount of information about the u_l due to the channel values), consider:
– I_a = I(u_l; L_a(u_l)), the a priori information at the decoder input
– I_e = I(u_l; L_e(u_l)), the extrinsic information at the decoder output
EXIT chart: I_e as a function of I_a, e.g. for the log-MAP decoder.
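These quantities can be estimated directly from simulated LLRs. A sketch (not from the slides) using the standard time-average estimator I ≈ 1 − E[log2(1 + e^{−(1−2u)L})] for equiprobable bits, together with the consistent-Gaussian LLR model that underlies the J-function; the sigma values are illustrative:

```python
import math
import random

def mutual_information(bits, llrs):
    """Time-average estimate of I(U; L) for equiprobable bits u in {0, 1}:
    I = 1 - E[ log2(1 + exp(-(1 - 2u) * L)) ]."""
    acc = sum(math.log2(1.0 + math.exp(-(1 - 2 * u) * l))
              for u, l in zip(bits, llrs))
    return 1.0 - acc / len(bits)

def consistent_gaussian_llrs(bits, sigma, rng):
    """'Consistent' Gaussian LLR model: mean (1 - 2u) * sigma^2 / 2,
    variance sigma^2. I as a function of sigma is the J-function."""
    return [(1 - 2 * u) * sigma * sigma / 2 + rng.gauss(0.0, sigma)
            for u in bits]

rng = random.Random(0)
bits = [rng.randrange(2) for _ in range(50_000)]
weak = mutual_information(bits, consistent_gaussian_llrs(bits, 0.5, rng))
strong = mutual_information(bits, consistent_gaussian_llrs(bits, 4.0, rng))
# weak is close to 0 and strong is close to 1: more reliable LLRs
# carry more information about the transmitted bits.
```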

13 EXIT curves
Obtained by simulations (but much simpler than turbo code simulations)

14 EXIT charts
Next, plot the EXIT curve for one SNR together with its mirror image. These curves represent the EXIT curves of the two constituent decoders.
Open tunnel: decoding will proceed to convergence.

15 EXIT charts
EXIT chart for another SNR. Closed tunnel: decoding will get stuck.

16 SNR threshold
SNR threshold: the smallest SNR with an open EXIT chart tunnel. It defines the start of the waterfall region: once the tunnel opens, non-convergence becomes a small problem.

17 EXIT chart
The EXIT curve is a property of the constituent encoder and can be used to find good constituent encoders for low SNRs. In general, simple is good (flatter EXIT curve).
EXIT charts can be used for codes with different constituent encoders too; the constituent coders can in this case be fitted to each other's EXIT curve, providing a lower SNR threshold.
It is assumed that the interleavers are very long, so that a Gaussian approximation applies: errors in the extrinsic values occur according to a Gaussian distribution.

18 Iterative decoding
Decoding examples
Some observations

19 Decoding example

20 Decoding example: K=4

21 The effect of many iterations

22 Iterative decoding: stopping criteria
– Fixed number of iterations
– Hard decisions: if the hard decisions of the two extrinsic value vectors coincide, assume that convergence has been achieved
– Cross-entropy
– Outer error-detecting codes
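The hard-decision rule can be sketched in a few lines (illustrative code, not from the slides; it assumes the common sign convention that a positive LLR means bit 0):

```python
def hard_decisions(llrs):
    """Map LLRs to bits with the convention L > 0 -> 0, otherwise 1."""
    return [0 if l > 0 else 1 for l in llrs]

def hard_decision_stop(extrinsic1, extrinsic2):
    """Stop iterating when the hard decisions of the two decoders'
    extrinsic value vectors coincide (convergence is then assumed)."""
    return hard_decisions(extrinsic1) == hard_decisions(extrinsic2)
```

Only the sign patterns are compared; the magnitudes of the two extrinsic vectors may still differ when the criterion fires.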

23 Iterative decoding: some observations
– Parallel implementations: the constituent decoders can work in parallel
– The final decision can be taken from the a posteriori values of either constituent decoder, from their average, or from the extrinsic values
– The decoder may sometimes, depending on the SNR and on the occurrence of structural faults in the interleaver, oscillate between correct and incorrect decisions
– Max-log-MAP can be shown to be equivalent to SOVA
– Max-log-MAP is (a little!) simpler to implement than log-MAP, but suffers a penalty of about 0.5 dB

24 Suggested exercises: 16.16–16.30

