Channel Polarization and Polar Codes


Channel Polarization and Polar Codes By Fakhruddin Mahmood and Anlei Rao

Outline Introduction · Channel Polarization (Channel Combining, Channel Splitting) · Polar Codes (Polar Coding, Successive Decoding) · Conclusion

Introduction Shannon’s proof of the noisy-channel coding theorem used random coding to show the existence of capacity-achieving code sequences. Explicit construction of capacity-achieving code sequences remained an elusive goal. Polar codes [Arikan] were the first provably capacity-achieving codes for any symmetric binary-input discrete memoryless channel (B-DMC), with low encoding and decoding complexity O(N log N). The main idea of polar codes is based on the phenomenon of channel polarization.

Introduction By recursively combining and splitting independent copies of a channel, some of the resulting bit channels become nearly error free while others turn into pure noise. The fraction of channels that become noiseless approaches I(W), the symmetric capacity. I(W) equals the Shannon capacity C when the B-DMC W is symmetric. The symmetric capacity is the highest rate at which reliable communication is possible across W when the input letters of the channel are used with equal probability.
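As a concrete check, the symmetric capacity I(W) can be computed directly from the channel transition probabilities under a uniform input distribution. The sketch below is an illustration (not part of the original slides); it evaluates I(W) for a binary symmetric channel, where it reduces to 1 − H(p):

```python
import math

def symmetric_capacity(W):
    """I(W) = sum_y sum_x (1/2) W(y|x) log2( W(y|x) / ((1/2)W(y|0) + (1/2)W(y|1)) ).

    W is a dict mapping (y, x) -> transition probability, for inputs x in {0, 1}.
    """
    ys = {y for (y, _) in W}
    total = 0.0
    for y in ys:
        qy = 0.5 * W[(y, 0)] + 0.5 * W[(y, 1)]  # output distribution under uniform input
        for x in (0, 1):
            p = W[(y, x)]
            if p > 0:
                total += 0.5 * p * math.log2(p / qy)
    return total

# Binary symmetric channel with crossover probability p: I(W) = 1 - H(p)
p = 0.11
bsc = {(0, 0): 1 - p, (1, 0): p, (0, 1): p, (1, 1): 1 - p}
print(symmetric_capacity(bsc))  # ~0.500 for p = 0.11
```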

Introduction Polar coding is the construction of codes that achieve I(W) by taking advantage of the polarizing effect. The basic idea is to create a coding system in which each coordinate channel can be accessed individually, and to send data only through those whose capacity is close to 1.

Channel Polarization An operation converting N independent copies of a B-DMC W into a polarized set of bit channels {W_N^(i) : 1 ≤ i ≤ N}. Each polarized channel becomes either almost pure noise or almost noiseless as the block length N goes to infinity. By sending the information bits through the noiseless channels, we can achieve the symmetric capacity of the B-DMC. Channel polarization consists of two parts: channel combining and channel splitting.

Channel Polarization Channel Combining: N copies of W are combined into a vector channel W_N with transition probabilities W_N(y_1^N | u_1^N) = W^N(y_1^N | u_1^N G_N), where W^N denotes N independent uses of W. G_N: the generator matrix, G_N = B_N F^{⊗n} with n = log2 N, kernel F = [[1, 0], [1, 1]], and bit-reversal permutation B_N; it can be calculated recursively, since F^{⊗n} = F ⊗ F^{⊗(n−1)}. R_N: the reverse-shuffle permutation {1, 2, 3, …, N} → {1, 3, …, N−1, 2, 4, …, N} (odd indices first, then even), used in the recursive construction.

Channel Polarization Structure: W_N is built recursively from two copies of W_{N/2} through a layer of F kernels and the permutation R_N (circuit figure not reproduced).

Example with N=8 Channel Combining: (circuit diagram not reproduced). The generator matrix for N = 8 can be computed from the recursion as G_8 = B_8 F^{⊗3}.

Channel Polarization Channel Splitting: the vector channel W_N is split back into a set of N binary-input bit channels W_N^(i), 1 ≤ i ≤ N, with transition probabilities W_N^(i)(y_1^N, u_1^{i−1} | u_i) = Σ over u_{i+1}^N of (1 / 2^{N−1}) · W_N(y_1^N | u_1^N). The i-th bit channel sees the channel output y_1^N together with the previous inputs u_1^{i−1} as its output.
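For the binary erasure channel, one combine-and-split step has a closed form: BEC(ε) splits into a degraded channel W⁻ = BEC(2ε − ε²) and an upgraded channel W⁺ = BEC(ε²), with total capacity preserved. A small illustrative sketch (the slides do not fix a particular channel; the BEC is assumed here for concreteness):

```python
def polarize_step(eps):
    """One polarization step for BEC(eps): returns (worse, better) erasure probabilities."""
    return 2 * eps - eps ** 2, eps ** 2

eps = 0.5
worse, better = polarize_step(eps)
# For a BEC, capacity is I(W) = 1 - eps
print(1 - worse, 1 - better)        # 0.25 0.75
print((1 - worse) + (1 - better))   # 1.0 = 2 * I(W): capacity is conserved, only redistributed
```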

Example with N=8 After channel combining, the combined channel W_8 is split step by step into the eight bit channels W_8^(1), …, W_8^(8) (sequence of figures not reproduced).

Polar Codes Polar Coding Based on the process of channel combining. Encoding uses the generator matrix: x_1^N = u_1^N G_N. Choose the information set S = {i : Z(W_N^(i)) is close to 0}, i.e., the indices with the smallest Bhattacharyya parameters. The frozen bits (the positions outside S) can be chosen at will, e.g., all zeros.

Polar Codes Successive Decoding Based on the process of channel splitting. Bits are decoded one at a time: for each index i, the ML rule for the bit channel W_N^(i) decides û_i from (y_1^N, û_1^{i−1}); frozen bits are set to their known values. The probability of block error is bounded as P_e ≤ Σ_{i∈S} Z(W_N^(i)). Encoding and decoding complexity: O(N log N).

Example of N=8 (worked decoding example shown as figures; not reproduced)

Conclusion By recursively combining and splitting N independent copies of a B-DMC, we obtain polarized bit channels that are either almost error free or almost pure noise. By transmitting information bits only through the noiseless channels while fixing the symbols sent through the pure-noise ones, the Shannon capacity of a symmetric B-DMC can be achieved. Polar codes, based on the phenomenon of channel polarization, are capacity-achieving for any symmetric B-DMC, with low encoding and decoding complexity O(N log N) and vanishing block error probability.