Optimizing LDPC Codes for message-passing decoding. Jeremy Thorpe Ph.D. Candidacy 2/26/03.


1 Optimizing LDPC Codes for message-passing decoding. Jeremy Thorpe Ph.D. Candidacy 2/26/03

2 Overview  Research Projects  Background to LDPC Codes  Randomized Algorithms for designing LDPC Codes  Open Questions and Discussion

3 Data Fusion for Collaborative Robotic Exploration  Developed a version of the Mastermind game as a model for autonomous inference.  Applied the Belief Propagation algorithm to solve this problem.  Showed that the algorithm had an interesting performance-complexity tradeoff.  Published in JPL's IPN Progress Reports.

4 Dual-Domain Soft-in Soft-out Decoding of Conv. Codes  Studied the feasibility of using the Dual SISO algorithm for high rate turbo-codes.  Showed that reduction in state-complexity was offset by increase in required numerical accuracy.  Report circulated internally at DSDD/HIPL S&S Architecture Center, Sony.

5 Short-Edge Graphs for Hardware LDPC Decoders.  Developed criteria to predict performance and implementational simplicity of graphs of Regular (3,6) LDPC codes.  Optimized criteria via randomized algorithm (Simulated Annealing).  Achieved codes of reduced complexity and superior performance to random codes.  Published in ISIT 2002 proceedings.

6 Evaluation of Probabilistic Inference Algorithms  Characterize the performance of probabilistic algorithms based on observable data  Axiomatic definition of "optimal characterization"  Existence, non-existence, and uniqueness proofs for various axiom sets  Unpublished

7 Optimized Coarse Quantizers for Message-Passing Decoding  Mapped 'additive' domains for variable and check node operations  Defined quantized message passing rule in these domains  Optimized quantizers for 1-bit to 4-bit messages  Submitted to ISIT 2003

8 Graph Optimization using Randomized Algorithms  Introduce Proto-graph framework  Use approximate density evolution to predict performance of particular graphs  Use randomized algorithms to optimize graphs (Extends short-edge work)  Achieves new asymptotic performance-complexity mark

9 Background to LDPC codes

10 The Channel Coding Strategy  Encoder chooses the m-th codeword in codebook C and transmits it across the channel  Decoder observes the channel output y and generates m′ based on the knowledge of the codebook C and the channel statistics  (diagram: Encoder → Channel → Decoder)

11 Linear Codes  A linear code C (over a finite field) can be defined in terms of either a generator matrix or a parity-check matrix.  Generator matrix G (k×n)  Parity-check matrix H ((n−k)×n)
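A minimal sketch of the two definitions, using the [7,4] Hamming code as a hypothetical stand-in (it is not one of the codes in this talk): for a systematic pair G = [I | P] and H = [Pᵀ | I], every codeword produced by G has zero syndrome under H.

```python
# Toy linear code over GF(2): systematic generator G = [I | P] and
# matching parity-check matrix H = [P^T | I].  Hypothetical example:
# the [7,4] Hamming code (k = 4 message bits, r = 3 parity bits).

P = [[1, 1, 0],
     [1, 0, 1],
     [0, 1, 1],
     [1, 1, 1]]
k, r = 4, 3  # n = k + r = 7

G = [[int(i == j) for j in range(k)] + P[i] for i in range(k)]
H = [[P[i][j] for i in range(k)] + [int(j == jj) for jj in range(r)]
     for j in range(r)]

def mod2_matvec(M, x):
    """Multiply matrix M by vector x over GF(2)."""
    return [sum(m * xi for m, xi in zip(row, x)) % 2 for row in M]

def encode(msg):
    """Codeword = msg * G over GF(2)."""
    return [sum(g * m for g, m in zip(col, msg)) % 2
            for col in zip(*G)]  # iterate over columns of G

codeword = encode([1, 0, 1, 1])
syndrome = mod2_matvec(H, codeword)
print(syndrome)  # [0, 0, 0] -- every codeword satisfies H x = 0
```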

12 LDPC Codes  LDPC codes -- linear codes defined in terms of H  H is sparse: it has a small average number of non-zero elements per row and column.

13 Graph Representation of LDPC Codes  H is represented by a bipartite graph with variable nodes v and check nodes c.  There is an edge from v to c if and only if H_c,v = 1  A codeword is an assignment of the v's such that Hx = 0

14 Message-Passing Decoding of LDPC Codes  Message Passing (or Belief Propagation) decoding is a low-complexity algorithm which approximately answers the question “what is the most likely x given y?”  MP recursively defines messages m_v,c^(i) and m_c,v^(i) between each variable node v and each adjacent check node c, for iteration i = 0, 1, ...

15 Two Types of Messages...  Likelihood Ratio: λ(y) = P(y|x=1)/P(y|x=0)  For y_1, ..., y_n independent conditionally on x: λ(y_1, ..., y_n) = ∏_i λ(y_i)  Probability Difference: δ = P(x=0) − P(x=1)  For x_1, ..., x_n independent: the probability difference of x_1 ⊕ ... ⊕ x_n is ∏_i δ_i

16 ...Related by the Bilinear Transform  Definition: B(x) = (1 − x)/(1 + x)  Properties: B is its own inverse, B(B(x)) = x; it maps the likelihood-ratio domain (0, ∞) onto the probability-difference domain (−1, 1), with δ = B(λ)
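A quick numeric check of the transform, under the sign convention assumed above (λ = P(y|x=1)/P(y|x=0), so that B carries a likelihood ratio to the corresponding probability difference):

```python
# Bilinear transform between the likelihood-ratio domain (0, inf)
# and the probability-difference domain (-1, 1).

def bilinear(x):
    """B(x) = (1 - x) / (1 + x); B is its own inverse."""
    return (1.0 - x) / (1.0 + x)

lam = 0.25                 # a likelihood ratio
delta = bilinear(lam)      # the corresponding probability difference
print(delta)               # 0.6
print(bilinear(delta))     # 0.25 -- applying B twice returns the input
```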

17 Message Domains  Likelihood Ratio ↔ (log) ↔ Log Likelihood Ratio  Probability Difference ↔ (log) ↔ Log Prob. Difference  Likelihood Ratio ↔ (bilinear transform) ↔ Probability Difference

18 Variable to Check Messages  On any iteration i, the message from v to c is the product of the channel likelihood ratio at v and the incoming messages from all other checks: m_v,c^(i) = λ(y_v) ∏_{c′≠c} m_c′,v^(i−1)  In the additive (log-likelihood) domain, the product becomes a sum

19 Check to Variable Messages  On any iteration, the message from c to v is the product, in the probability-difference domain, of the incoming messages from all other variables: m_c,v^(i) = ∏_{v′≠v} m_v′,c^(i)  In the additive (log probability-difference) domain, the product becomes a sum

20 Decision Rule  After sufficiently many iterations, return the likelihood ratio combining the channel and all incoming check messages: λ(y_v) ∏_c m_c,v^(i), and decide x_v by comparing it to 1
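The three rules above can be sketched in the log-likelihood-ratio domain, where the variable rule is a plain sum and the check rule becomes the familiar tanh form of the probability-difference product. A minimal sketch, using the standard convention LLR = log P(y|x=0)/P(y|x=1) (note the opposite sign of the ratio used on the slides), not code from the talk:

```python
import math

def var_to_chk(llr_ch, incoming):
    """Variable-to-check message: channel LLR plus all other
    incoming check-to-variable messages (additive variable domain)."""
    return llr_ch + sum(incoming)

def chk_to_var(incoming):
    """Check-to-variable message via the tanh rule -- the product
    rule in the probability-difference domain, expressed in LLRs."""
    prod = 1.0
    for m in incoming:
        prod *= math.tanh(m / 2.0)
    return 2.0 * math.atanh(prod)

def decide(llr_ch, all_incoming):
    """Decision rule: sign of the total LLR (0 if non-negative)."""
    total = llr_ch + sum(all_incoming)
    return 0 if total >= 0 else 1

# A very reliable input (LLR 20) barely changes the check output:
print(chk_to_var([20.0, 1.5]))   # approximately 1.5
```

Note how the check rule is dominated by its least reliable input, which is why weak messages create the decoding bottlenecks discussed later.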

21 Theorem about MP Algorithm  If the algorithm stops after r iterations, then it returns the maximum a posteriori probability estimate of x_v given the y's within radius r of v.  This requires that the variables within radius r of v be coupled only through the checks within radius r of v (i.e., the neighborhood of v must be tree-like).

22 Regular (λ,ρ) LDPC codes  Every variable node has degree λ, every check node has degree ρ.  The best rate-1/2 regular code is (3,6), with threshold 1.09 dB.  These codes were introduced by Robert Gallager in 1962.

23 Regular LDPC codes look the same from anywhere!  The neighborhood of every edge looks the same.  If the all-zeros codeword is sent, the distribution of any message depends only on its neighborhood.  We can calculate a single message distribution once and for all for each iteration.

24 Analysis of Message Passing Decoding (Density Evolution)  We assume that the all-zeros codeword was transmitted (requires a symmetric channel).  We compute the distribution of likelihood ratios coming from the channel.  For each iteration, we compute the message distributions from variable to check and check to variable.

25 D.E. Update Rule  The update rule for Density Evolution is defined in the additive domain of each type of node.  Whereas in B.P. we add (log) messages: m = m_1 + m_2 + ... + m_k  In D.E. we convolve message densities: f = f_1 ∗ f_2 ∗ ... ∗ f_k (the density of a sum of independent variables is the convolution of their densities)

26 Familiar Example:  If one die has density function (1/6, 1/6, 1/6, 1/6, 1/6, 1/6) on the values 1–6,  the density function for the sum of two dice is given by the convolution: (1/36, 2/36, 3/36, 4/36, 5/36, 6/36, 5/36, 4/36, 3/36, 2/36, 1/36) on the values 2–12
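The same example in code, convolving the die's density with itself:

```python
# The density of a sum of two independent discrete variables is the
# convolution of their densities -- here, two fair dice.

def convolve(f, g):
    """Discrete convolution of two density lists."""
    out = [0.0] * (len(f) + len(g) - 1)
    for i, fi in enumerate(f):
        for j, gj in enumerate(g):
            out[i + j] += fi * gj
    return out

die = [1.0 / 6] * 6                # P(1), ..., P(6)
two_dice = convolve(die, die)      # P(2), ..., P(12)
print(two_dice[5])                 # P(sum = 7) = 6/36
```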

27 D.E. Threshold  Fixing the channel message densities, the message densities will either "converge" to minus infinity, or they won't.  For the Gaussian channel, the smallest SNR for which the densities converge is called the density evolution threshold.

28 D.E. Simulation of (3,6) codes  Threshold for regular (3,6) codes is 1.09 dB  Set SNR to 1.12 dB (.03 above threshold)  Watch fraction of "erroneous messages" from check to variable
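On the binary erasure channel, density evolution collapses to a one-dimensional recursion, which makes the threshold phenomenon easy to reproduce in a few lines. A sketch for the regular (3,6) ensemble, as a BEC analogue of the AWGN simulation on this slide (the BEC threshold ε* ≈ 0.429 plays the role of the 1.09 dB AWGN threshold):

```python
# Density evolution for a regular (3,6) code on the binary erasure
# channel, where message "densities" reduce to a single erasure
# probability x.  Check update: 1 - (1 - x)^(rho-1); variable
# update: eps * y^(lambda-1).  Combined: x <- eps*(1-(1-x)**5)**2.

def de_erasure_fraction(eps, iters=2000):
    x = eps
    for _ in range(iters):
        x = eps * (1.0 - (1.0 - x) ** 5) ** 2
    return x

print(de_erasure_fraction(0.42))   # below threshold: drives erasures to ~0
print(de_erasure_fraction(0.44))   # above threshold: stalls at a nonzero fixed point
```

Near the threshold the recursion crawls through a bottleneck before collapsing, mirroring the "erroneous message fraction" curves described on the next slides.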

29 Improvement vs. current error fraction for Regular (3,6)  Improvement per iteration is plotted against current error fraction  Note there is a single bottleneck that consumes most of the decoding iterations

30 Irregular (λ, ρ) LDPC codes  A fraction λ_i of variable nodes has degree i, and a fraction ρ_i of check nodes has degree i.  Edges are connected through a single random permutation π.  Nodes have become specialized.

31 D.E. Simulation of Irregular Codes (Maximum degree 10)  Set SNR to 0.42 dB (~.03 above threshold)  Watch fraction of erroneous check to variable messages.  This code was designed by Richardson et al.

32 Comparison of Regular and Irregular codes  Notice that the irregular curve is much flatter  Note: Capacity-achieving LDPC codes for the erasure channel were designed by making this line exactly flat

33 Constructing LDPC code graphs from a proto-graph  Consider a bipartite graph G, called a "proto-graph"  Generate a graph G_α, called an "expanded graph": replace each node by α nodes, and replace each edge by α edges, permuted at random  (example: α = 2 expands G into G_2)
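A sketch of the expansion step, with a small hypothetical proto-graph given as (variable, check) pairs; each proto-edge is replicated α times through its own random permutation, so every node copy inherits the degree of its proto-node:

```python
import random

def expand(proto_edges, alpha, rng=random.Random(0)):
    """Expand a proto-graph by a factor alpha: node u becomes copies
    (u, 0..alpha-1); each proto-edge becomes alpha edges joined
    through an independent random permutation."""
    expanded = []
    for (v, c) in proto_edges:
        perm = list(range(alpha))
        rng.shuffle(perm)
        for i in range(alpha):
            expanded.append(((v, i), (c, perm[i])))
    return expanded

# Hypothetical proto-graph (parallel edges are allowed):
proto = [(0, 0), (0, 1), (1, 0), (1, 1), (1, 1)]
big = expand(proto, alpha=4)
print(len(big))   # alpha * len(proto) = 20 edges
```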

34 Local Structure of G_α  The structure of the neighborhood of any edge in G_α can be found by examining G  The neighborhood of radius r of a random edge is loop-free with probability approaching 1 as α→∞.

35 Density Evolution on G  Density evolution is run per proto-graph edge: for each edge (c,v) in G, compute the density of m_v,c^(i) by convolving the channel density with the densities on the other edges at v,  and compute the density of m_c,v^(i) by convolving, in the check node's additive domain, the densities on the other edges at c.

36 Density Evolution without convolution  One-dimensional approximation to D.E., which requires: a statistic that is approximately additive for check nodes; a statistic that is approximately additive for variable nodes; a way to go between these two statistics; a way to characterize the message distribution from the channel

37 Optimizing a Proto Graph using Simulated Annealing  Simulated Annealing is an iterative algorithm that approximately minimizes an energy function  Requirements: A space S over which to find the optimum point An energy function E(s):S→R A random perturbation function p(s):S→S A "temperature profile" t(i)

38 Optimization Space  Graphs with a fixed number of variable and check nodes (rate is fixed)  Optionally, we can add untransmitted (state) variables to the code  Typical Parameters 32 transmitted variables 5 untransmitted variables 21 parity checks

39 Energy function  Ideal: density evolution threshold.  Practical: Approximate density evolution threshold Number of iterations to converge to fixed error probability at fixed SNR

40 Perturbations  Types of operation: Add an edge; Delete an edge; Swap two edges  Note: the edge-swapping operation is not necessary to span the space

41 Basic Simulated Annealing Algorithm  Take s_0 = a random point in S  For each iteration i, define s_i′ = p(s_i)  If E(s_i′) ≤ E(s_i), set s_{i+1} = s_i′  If E(s_i′) > E(s_i), set s_{i+1} = s_i′ with probability exp(−(E(s_i′) − E(s_i))/t(i)), and s_{i+1} = s_i otherwise
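The loop above, sketched on a toy one-dimensional energy E(s) = s² standing in for the graph energy of the previous slides (the acceptance rule is the standard Metropolis form; all parameters here are illustrative):

```python
import math
import random

def simulated_annealing(E, perturb, s0, temp, iters, rng):
    """Minimize E over the space reached by perturb, accepting worse
    moves with probability exp(-dE / t(i))."""
    s, best, best_e = s0, s0, E(s0)
    for i in range(iters):
        s_new = perturb(s, rng)
        d = E(s_new) - E(s)
        if d <= 0 or rng.random() < math.exp(-d / temp(i)):
            s = s_new
        if E(s) < best_e:
            best, best_e = s, E(s)
    return best, best_e

rng = random.Random(0)
best, best_e = simulated_annealing(
    E=lambda s: s * s,                          # toy energy
    perturb=lambda s, r: s + r.choice((-1, 1)), # random +/-1 step
    s0=10,
    temp=lambda i: max(0.99 ** i, 1e-9),        # geometric cooling
    iters=2000,
    rng=rng,
)
print(best_e)   # 0 -- the global minimum
```

In the talk's setting, s is a proto-graph, p(s) applies one of the add/delete/swap edge perturbations, and E(s) is the approximate density-evolution criterion.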

42 Degree Profile of Optimized Code  The optimized graph has a large fraction of degree-1 variables  Check nodes range from degree 3 to degree 8  (Recall that the graph is not defined by its degree profile alone)

43 Threshold vs. Complexity  Designed codes of rate 1/2 with threshold 8 mB (0.08 dB) from channel capacity on the AWGN channel  Low complexity (maximum degree = 8)

44 Improvement vs. Error Fraction Comparison to Regular (3,6)  The regular (3,6) code has a dramatic bottleneck.  The irregular code with maximum degree 10 is flatter, but has a bottleneck.  The optimized proto-graph based code is nearly flat for a long stretch.

45 Simulation Results  n = 8192, k = 4096  Achieves bit error rate of about 4×10⁻⁴ at SNR = 0.8 dB.  Beats the performance of the n = 10000 code in [1] by a small margin.  There is evidence of an error floor.

46 Review  We introduced the idea of LDPC graphs based on a proto-graph  We designed proto-graphs using the Simulated Annealing algorithm, with a fast approximation to density evolution  The designs handily beat other published codes of similar maximum degree

47 Open Questions  What's the ultimate limit to the performance vs. maximum degree tradeoff?  Can we find a way to achieve the same tradeoff without randomized algorithms?  Why does optimizing distributions sometimes force the codes to have low-weight codewords?

48 A Big Question  Can we derive the Shannon limit in the context of MP decoding of LDPC codes, so that we can meet the inequalities with equality?

49 Free Parameters within S.A.  Rate  Maximum check, variable degrees  Proto-graph size  Fraction of untransmitted variables  Channel Parameter (SNR)  Number of iterations in Simulated Annealing

50 Performance of Designed MET Codes  Shows performance competitive with the best published codes  Block error probability < 10⁻⁵ at 1.2 dB  A soft error floor is observed at very high SNR, but it is not due to low-weight codewords

51 Multi-edge-type construction  Edges of a particular "color" are connected through a permutation.  Edges become specialized. Each edge type has a different message distribution each iteration.

52 MET D.E. vs. decoder simulation

