Analysis of Iterative Decoding

Analysis of Iterative Decoding
Alexei Ashikhmin
Research Department of Mathematics of Communications, Bell Laboratories

Outline:
- Mutual Information and Channel Capacity
- LDPC Codes
- Density Evolution Analysis of LDPC Codes
- EXIT Functions Analysis of LDPC Codes
  - Binary Erasure Channel
  - Gaussian Channel
  - MIMO Channel
- Expander Codes

Shannon’s Channel Coding Theorem
In 1948 Claude Shannon, generally regarded as the father of the Information Age, published the paper “A Mathematical Theory of Communication”, which laid the foundations of Information Theory.

In this remarkable paper he formulated the notion of channel capacity, the maximal rate at which information can be transmitted reliably over a channel.

Shannon proved that for any channel there exists a family of codes (including linear block codes) that achieves an arbitrarily small probability of error at any communication rate up to the channel capacity.

Linear binary codes
A binary linear [n,k] code C is a k-dimensional subspace of GF(2)^n; R = k/n is the code rate. Example of a [6,2] code:
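The example matrix on this slide is not recoverable from the transcript, so the sketch below uses a hypothetical generator matrix G (an illustrative assumption, not the talk's example) to show what a [6,2] binary code looks like as a 2-dimensional subspace of GF(2)^6.

```python
import itertools

# Hypothetical generator matrix of a [6,2] binary code (the slide's own
# example is lost, so this G is an illustrative assumption).
G = [
    [1, 0, 1, 1, 0, 1],
    [0, 1, 0, 1, 1, 1],
]

def encode(msg):
    """Multiply a 2-bit message by G over GF(2)."""
    return [sum(m * g for m, g in zip(msg, col)) % 2 for col in zip(*G)]

# A k-dimensional subspace of GF(2)^n contains 2^k codewords.
code = {tuple(encode(m)) for m in itertools.product([0, 1], repeat=2)}
rate = len(G) / len(G[0])  # R = k/n = 2/6

# The code is linear: it is closed under bitwise mod-2 addition.
for a in code:
    for b in code:
        assert tuple(x ^ y for x, y in zip(a, b)) in code
```

Any full-rank 2x6 binary matrix would do here; the closure check at the end is exactly the subspace property the definition refers to.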

Repetition code of length 3; single parity check code of length 3: the mod-2 sum of the bits of any codeword equals zero, i.e. the number of ones in any codeword is even.
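As a minimal illustration (not taken from the slides), both length-3 codes can be written out exhaustively:

```python
# Length-3 repetition code, rate 1/3: both codewords have all bits equal.
rep3 = {(0, 0, 0), (1, 1, 1)}

# Length-3 single parity check (SPC) code, rate 2/3: the third bit is the
# mod-2 sum of the first two, so every codeword has an even number of ones.
spc3 = {(a, b, (a + b) % 2) for a in (0, 1) for b in (0, 1)}

assert all(sum(c) % 2 == 0 for c in spc3)   # even weight in every codeword
assert all(len(set(c)) == 1 for c in rep3)  # all bits equal
```

These two codes are exactly the local constraints at the variable and check nodes of the LDPC codes discussed next.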

Shannon’s Channel Coding Theorem (Cont.)
Shannon proved that if R < C, then a typical (random) code has a probability of error that decreases exponentially fast with the code length (SNR is the Signal-to-Noise Ratio).

The complexity of decoding a random code is exponential in the code length, so we need codes with a nice structure. Algebraic codes (BCH, Reed-Solomon, Algebraic Geometry) have nice structure, but do not allow one to achieve capacity.


Low Density Parity Check (LDPC) Codes
LDPC codes can be defined with the help of bipartite graphs, with variable nodes on one side and check nodes on the other.

LDPC codes – Definition (Cont.)
The graph is sparse. dv is the average variable node degree and dc is the average check node degree; n is the code length, m is the number of parity checks, and n-m is the number of information symbols.
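A Tanner graph is conveniently stored as a parity-check matrix H: rows are check nodes, columns are variable nodes, and the 1-entries are the edges. The tiny H below is an illustrative assumption, not a code from the talk.

```python
# Parity-check matrix: 3 check nodes (rows) over 6 variable nodes (columns).
H = [
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 0, 0, 1, 1],
]
n = len(H[0])  # code length = number of variable nodes
m = len(H)     # number of parity checks = number of check nodes
k = n - m      # number of information symbols (when H has full rank)

edges = sum(sum(row) for row in H)
dv = edges / n  # average variable node degree
dc = edges / m  # average check node degree
```

"Low density" means exactly that the fraction of 1-entries in H stays small as n grows, so dv and dc remain bounded.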

Belief Propagation Decoding
We receive from the channel a vector of corrupted symbols, and for each symbol we compute a log-likelihood ratio.
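For BPSK (+1/-1) over an AWGN channel with noise variance sigma^2 the log-likelihood ratio has the well-known closed form 2y/sigma^2; the sketch below (an illustration, not the slide's lost formula) verifies it against the ratio of the two Gaussian densities.

```python
import math

def llr_awgn(y, sigma):
    """LLR log p(y | x=+1) / p(y | x=-1) for BPSK over AWGN."""
    # Ratio of the two Gaussian densities centred at +1 and -1 ...
    num = math.exp(-(y - 1.0) ** 2 / (2.0 * sigma ** 2))
    den = math.exp(-(y + 1.0) ** 2 / (2.0 * sigma ** 2))
    direct = math.log(num / den)
    # ... agrees with the closed form 2*y / sigma^2.
    closed = 2.0 * y / sigma ** 2
    assert abs(direct - closed) < 1e-9
    return closed
```

The sign of the LLR gives the hard decision for the symbol; its magnitude is the reliability that belief propagation passes along the graph edges.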

Belief Propagation Decoding (Cont.)
(figure: message passing on the sparse graph)


Density Evolution Analysis (T. Richardson and R. Urbanke)
Assume that we transmit +1, -1 through a Gaussian channel. Then the received symbols are Gaussian random variables, and their log-likelihood ratios (LLRs) are also Gaussian random variables.
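A quick Monte Carlo sanity check of this (an illustration under the stated BPSK/AWGN model, not from the slides): for a transmitted +1 the LLR is Gaussian with mean 2/sigma^2 and variance 4/sigma^2, i.e. variance = 2 * mean, the symmetry condition that density evolution exploits.

```python
import random

random.seed(1)
sigma = 0.8
# Transmit +1; the received symbol is 1 + noise, and its LLR is 2*y/sigma^2.
llrs = [2.0 * (1.0 + random.gauss(0.0, sigma)) / sigma ** 2
        for _ in range(200_000)]

mean = sum(llrs) / len(llrs)
var = sum((l - mean) ** 2 for l in llrs) / len(llrs)

# Expected: mean = 2/sigma^2 = 3.125 and var = 4/sigma^2 = 6.25 = 2 * mean.
```

Because a symmetric Gaussian is described by a single parameter, density evolution can track one number (the mean) per iteration instead of a full density.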

Density Evolution Analysis (Cont.)
(figure: evolution of message densities on the sparse graph)


Extrinsic Information Transfer (EXIT) Functions
Stephan ten Brink (1999): introduced EXIT functions for the analysis of iterative decoding of turbo codes.
Ashikhmin, Kramer, ten Brink (2002): EXIT function analysis of LDPC codes and properties of EXIT functions in the binary erasure channel.
E. Sharon, A. Ashikhmin, S. Litsyn (2003): EXIT functions for continuous channels.
I. Sutskover, S. Shamai, J. Ziv (2003): bounds on EXIT functions.
I. Land, S. Huettinger, P. Hoeher, J. Huber (2003): bounds on EXIT functions.
And others.

EXIT Functions (Cont.)
(figure: source -> encoder -> extrinsic channel -> APP decoder)
Average a priori information: I_A. Average extrinsic information: I_E. The EXIT function gives I_E as a function of I_A.

Simplex [15,4] Code and a Good Code of Infinite Length with R=4/15

(figure: source -> encoder 1 -> communication and extrinsic channels -> APP decoder 2)
The average a priori information, the average communication information, and the average extrinsic information together define the EXIT function in this setting.


EXIT Function in the Binary Erasure Channel (A. Ashikhmin, G. Kramer, S. ten Brink)
The split support weights (or generalized Hamming weights) of a code count the number of subspaces of the code that have dimension r, support weight i on the first n positions, and support weight j on the second m positions; the EXIT function can be expressed in terms of these weights.

(figure: extrinsic channel and communication channel feeding the decoder)

Example for the BEC with erasure probability q
Let dv = 2 and dc = 4; the code rate is R = 1 - dv/dc = 1/2. This code does not achieve capacity.
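This claim can be checked numerically with the standard BEC density-evolution recursion for a regular (dv, dc) ensemble (the recursion is the textbook update, not transcribed from the slide): x_{l+1} = q * (1 - (1 - x_l)^(dc-1))^(dv-1), where x is the probability that a variable-to-check message is an erasure.

```python
def de_bec(q, dv, dc, iters=500):
    """Iterate BEC density evolution for a regular (dv, dc) LDPC ensemble."""
    x = q  # initial erasure probability of variable-to-check messages
    for _ in range(iters):
        x = q * (1.0 - (1.0 - x) ** (dc - 1)) ** (dv - 1)
    return x

# (2,4)-regular ensemble, rate 1/2: decoding succeeds at q = 0.3 ...
assert de_bec(0.3, 2, 4) < 1e-9
# ... but fails at q = 0.4, although capacity would allow q up to 0.5.
assert de_bec(0.4, 2, 4) > 0.1
```

For this ensemble the recursion contracts to zero exactly when 3q < 1, so the decoding threshold is q* = 1/3, well short of the Shannon limit 1 - R = 0.5 — which is the sense in which the code does not achieve capacity.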

(figure: EXIT chart in the BEC with q = 0.3)

Area Theorems for the Binary Erasure Channel
(figure: source -> encoder -> extrinsic channel -> APP decoder)
Theorem: the area under the EXIT function equals 1 - R, where R is the rate of the code.

(figure: EXIT functions of a code with large minimum distance and of a code with small minimum distance)

(figure: source -> encoder 1 -> communication and extrinsic channels -> APP decoder 2)
Theorem: …, where C is the capacity of the communication channel.

For successful decoding the EXIT functions must not intersect. This is possible only if the area under the variable-node function is larger than the area under the check-node function.

To construct an LDPC code that achieves capacity on the BEC we must match the EXIT functions of the variable and check nodes. Examples: Tornado LDPC codes (A. Shokrollahi) and right-regular LDPC codes (A. Shokrollahi), the latter obtained with the help of a Taylor series expansion of the EXIT functions.


AWGN and Other Communication Channels (E. Sharon, A. Ashikhmin, S. Litsyn)
To analyse LDPC codes we need the EXIT functions of the repetition and single parity check codes in channels other than the BEC, e.g. the EXIT function of repetition codes for the AWGN channel.

Let T denote the “soft bits”, and let f be the conditional probability density of T given that 1 was transmitted. If the channel is T-consistent, then the EXIT function of the repetition code of length n can be expressed in terms of this density.

How accurate can we be with EXIT functions?
Take an LDPC code with a given variable node degree distribution and check node degree distribution. According to Density Evolution analysis this code can work on the AWGN channel at Eb/No = 0.3 dB; according to the EXIT function analysis it can work at Eb/No = 0.30046 dB. The difference is only 0.00046 dB.


Application to the Multiple Antenna Channel
The capacity of the Multiple Input Multiple Output (MIMO) channel grows linearly with the number of antennas. We assume that the detector knows the channel coefficients.
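A Monte Carlo sketch of this linear growth (illustrative, using the standard ergodic-capacity formula E[log2 det(I + (SNR/nt) H H^H)] for an i.i.d. Rayleigh channel H known at the detector; none of the numbers come from the talk):

```python
import math
import random

random.seed(7)

def cgauss():
    """Sample a CN(0, 1) circularly symmetric complex Gaussian."""
    return complex(random.gauss(0.0, math.sqrt(0.5)),
                   random.gauss(0.0, math.sqrt(0.5)))

def cap_1x1(snr, trials=3000):
    """Ergodic capacity (bits/use) of a single-antenna Rayleigh channel."""
    return sum(math.log2(1.0 + snr * abs(cgauss()) ** 2)
               for _ in range(trials)) / trials

def cap_2x2(snr, trials=3000):
    """Ergodic capacity of a 2x2 channel: E[log2 det(I + (snr/2) H H^H)]."""
    total = 0.0
    for _ in range(trials):
        h11, h12, h21, h22 = cgauss(), cgauss(), cgauss(), cgauss()
        s = snr / 2.0  # transmit power split over 2 antennas
        # Entries of A = I + s * H H^H, written out for the 2x2 case.
        a11 = 1.0 + s * (abs(h11) ** 2 + abs(h12) ** 2)
        a22 = 1.0 + s * (abs(h21) ** 2 + abs(h22) ** 2)
        a12 = s * (h11 * h21.conjugate() + h12 * h22.conjugate())
        total += math.log2(a11 * a22 - abs(a12) ** 2)  # real, positive det
    return total / trials
```

At high SNR cap_2x2 approaches roughly twice cap_1x1, which is the "capacity grows linearly with the number of antennas" statement in miniature.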

Design of LDPC Codes for the MIMO Channel (S. ten Brink, G. Kramer, A. Ashikhmin)
We construct the combined EXIT function of the detector and the variable nodes and match it with the EXIT function of the check nodes. The resulting node degree distribution is different from the one for the AWGN channel.

Probability of Error and Decoding Complexity
Let A be an LDPC code with rate R = (1 - ε)C.
Conjecture 1: the probability of decoding error of A decreases only polynomially with the code length.
Conjecture 2: the complexity of decoding behaves like …


Expander Codes (M. Sipser and D. A. Spielman, 1996)
Take a bipartite expander graph and assign code bits to its edges such that the bits on the edges connected to any left node form a codeword of code C1, and the bits on the edges connected to any right node form a codeword of code C2.

A graph (V, E) is called an (α, β)-expander if every subset of at most α|V| vertices has at least β|V| neighbors.

Decoding of Expander Codes
Alternate rounds of maximum likelihood decoding of the local codes: decode all copies of C2, then all copies of C1, and repeat.

M. Sipser and D. Spielman (1996) showed that an expander code can decode d/48 errors and that d grows linearly with N.
G. Zemor (2001) proved that an expander code can decode d/4 errors.
A. Barg and G. Zemor proved that expander codes have a positive error exponent if R < C.
R. Roth and V. Skachek (2003) proved that an expander code can decode d/2 errors.
What is the complexity of decoding of expander codes?

Complexity of Decoding of Expander Codes
Let N be the entire code length and let n be the length of the codes C1 and C2. We choose n = log2 N and let N tend to infinity. The complexity of ML decoding of C1 and C2 is then O(2^n) = O(N), and the overall complexity of decoding is linear in N. At the same time, if R = (1 - ε)C, then the complexity of decoding depends on ε. Can we replace ML decoding with decoding up to half the minimum distance?

Threshold of Decoding of Expander Codes
Barg and Zemor: choose C2 to be a good code with R2 ≈ 1 and C1 to be a good code with R1 ≈ C (capacity). The rate of the expander code is then R = R1 + R2 - 1 ≈ C (capacity).

Codes with Polynomial Decoding Complexity and Positive Error Exponent (A. Ashikhmin and V. Skachek, preliminary results)
We assume that there exist LDPC codes for which Conjecture 1 and Conjecture 2 hold, and use such LDPC codes as the constituent codes C1 and C2 in an expander code Cexp with rate R = (1 - ε)C.

Theorem: the complexity of decoding of Cexp is polynomial, and the error exponent E is small but positive. Hence P_error = 2^(-NE) decreases exponentially fast with the code length N.

(figure: performance of the concatenation of LDPC and expander codes compared with random codes)

Thank you for your attention
