
1. Coding Theory

2. Communication System
[Block diagram of a digital communication system: a source (voice, image, data) feeds a source encoder, CRC encoder, channel encoder, and interleaver before the modulator; the channel introduces impairments (noise, fading); the receiver side mirrors this chain with a demodulator, deinterleaver, channel decoding, and source decoder. The channel encoder/decoder pair provides error control.]

3. Error control coding
Limits in communication systems:
–Bandwidth limit
–Power limit
–Channel impairments: attenuation, distortion, interference, noise, and fading
Error control techniques are used in digital communication systems to achieve reliable transmission under these limits.

4. Power limit vs. bandwidth limit

5. Error control coding
Advantages of error control coding:
–In principle: every channel has a capacity C. If information is transmitted at a rate R < C, error-free transmission is possible (see the capacity sketch after this slide).
–In practice:
  Reduce the error rate
  Reduce the transmitted power requirement
  Increase the operational range of a communication system
Classification of error control techniques:
–Forward error correction (FEC)
–Error detection: cyclic redundancy check (CRC)
–Automatic repeat request (ARQ)
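As a concrete illustration of the capacity limit (not part of the original slides), here is a minimal Python sketch computing the capacity of a binary symmetric channel, C = 1 - H2(p); any rate R below this value is achievable with arbitrarily small error probability.

```python
import math

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel with crossover probability p:
    C = 1 - H2(p) bits per channel use, where H2 is the binary entropy function."""
    if p in (0.0, 1.0):
        return 1.0
    h2 = -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)
    return 1.0 - h2

print(bsc_capacity(0.1))  # ~0.531 -> any rate R < 0.531 is achievable
```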

6. History
Shannon (1948):
–R: transmission rate of the data
–C: channel capacity
–If R < C, it is possible to transfer information at error rates that can be reduced to any desired level.

7. History
–Hamming codes (1950): single-error correcting
–Convolutional codes (Elias, 1956)
–BCH codes (1960) and RS codes (1960): multiple-error correcting
–Goppa codes (1970): generalization of BCH codes
–Algebraic geometric codes (1982): generalization of RS codes, constructed over algebraic curves
–Turbo codes (1993)
–LDPC codes

8. Channel
–Memoryless channel: the probability of error is independent from one symbol to the next.
–Symmetric channel: P(i | j) = P(j | i) for all symbol values i and j. Ex) binary symmetric channel (BSC)
–Additive white Gaussian noise (AWGN) channel
–Burst error channel
–Compound (or diffuse) channel: the errors are a mixture of bursts and random errors. Since many codes work best when errors are random, an interleaver and deinterleaver are added.
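A minimal Python sketch (not from the slides) of the memoryless BSC just defined: each transmitted bit is flipped independently with crossover probability p.

```python
import random

def bsc(bits, p, rng=random.Random(0)):
    """Binary symmetric channel: flip each bit independently with
    probability p (memoryless and symmetric)."""
    return [b ^ (rng.random() < p) for b in bits]

tx = [1, 0, 1, 1, 0, 0, 1, 0]
rx = bsc(tx, p=0.1)
print(rx, "errors:", sum(t != r for t, r in zip(tx, rx)))
```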

9. Channel
Random error channels:
–Deep-space channels
–Satellite channels
→ use random-error-correcting codes
Burst error channels (channels with memory):
–Radio channels: signal fading due to multipath transmission
–Wire and cable transmission: impulse switching noise, crosstalk
–Magnetic recording: tape dropouts due to surface defects and dust particles
→ use burst-error-correcting codes

10. Encoding
Block codes: an [n, k] block code encodes a k-bit message into an n-bit codeword.
–Redundancy: n - k
–Code rate: k / n
Message m = (m_1, m_2, …, m_k)
Codeword c = (m_1, m_2, …, m_k, p_1, p_2, …, p_{n-k})
–Add n - k redundant parity-check symbols (p_1, p_2, …, p_{n-k}); a systematic encoding sketch follows.
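A minimal sketch of this systematic encoding in Python. The parity matrix P is inferred from the [6, 3] code tabulated on slide 13 (it is not given explicitly in the slides).

```python
# Systematic encoding over GF(2): c = (m, m.P), appending n-k parity symbols.
# P is inferred from the [6,3] codeword table on slide 13.
P = [[1, 0, 1],
     [0, 1, 1],
     [1, 1, 1]]

def encode(m):
    """Return (m_1..m_k, p_1..p_{n-k}) with p_j = sum_i m_i * P[i][j] mod 2."""
    parity = [sum(mi * P[i][j] for i, mi in enumerate(m)) % 2
              for j in range(len(P[0]))]
    return list(m) + parity

print(encode([1, 0, 0]))  # [1, 0, 0, 1, 0, 1] -> codeword 100101, as on slide 13
```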

11. Decoding
Decoding an [n, k] block code:
–Decide what the transmitted information was.
–Minimum distance decoding is optimum in a memoryless channel.
Received vector r = (r_1, r_2, …, r_n)
Decoded message: correct the errors and remove the n - k redundant symbols.
Error vector e = (e_1, e_2, …, e_n) = (r_1, r_2, …, r_n) - (c_1, c_2, …, c_n)
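For binary codes the subtraction is a bitwise XOR; a quick sketch (not in the original slides) using the numbers from the example on slide 13:

```python
# Error vector e = r - c; over GF(2), subtraction is bitwise XOR.
r = [1, 0, 1, 1, 0, 1]  # received vector (101101) from slide 13
c = [1, 0, 0, 1, 0, 1]  # transmitted codeword (100101)
e = [ri ^ ci for ri, ci in zip(r, c)]
print(e)  # [0, 0, 1, 0, 0, 0] -> a single error, in the third position
```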

12. Decoding
[Figure: decoding plane, showing a received vector r and decision regions around the codewords c_1, …, c_6; r is decoded to the nearest codeword.]

13. Decoding
Ex) Encoding and decoding procedure of the [6, 3] code:
1. Generate the information (100) in the source.
2. Transmit the codeword (100101) corresponding to (100).
3. The vector (101101) is received.
4. Choose the codeword (100101) nearest to (101101).
5. Extract the information (100) from the codeword (100101).

Information   Codeword   Distance from (101101)
000           000000     4
100           100101     1
010           010011     5
110           110110     4
001           001111     2
101           101010     3
011           011100     3
111           111001     2
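A minimal, self-contained Python sketch of this procedure (the codebook is the one tabulated above); it reproduces steps 2-5 by exhaustive minimum distance decoding:

```python
from itertools import product

# [6,3] code from the table above: message bits followed by 3 parity bits.
P = [[1, 0, 1], [0, 1, 1], [1, 1, 1]]

def encode(m):
    return list(m) + [sum(mi * P[i][j] for i, mi in enumerate(m)) % 2
                      for j in range(3)]

def hamming(u, v):
    return sum(a != b for a, b in zip(u, v))

def decode(r):
    """Minimum distance decoding: choose the message whose codeword is
    nearest to r (step 4), which also recovers the information (step 5)."""
    return min((list(m) for m in product([0, 1], repeat=3)),
               key=lambda m: hamming(encode(m), r))

c = encode([1, 0, 0])      # step 2: codeword (100101)
r = [1, 0, 1, 1, 0, 1]     # step 3: received vector (101101)
print(decode(r))           # steps 4-5: -> [1, 0, 0]
```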

14. Parameters of block codes
Hamming distance d_H(u, v):
–the number of positions at which the symbols of two vectors differ
–Ex) u = (1 0 1 0 0 0), v = (1 1 1 0 1 0) → d_H(u, v) = 2
Hamming weight w_H(u):
–the number of nonzero elements in a vector
–Ex) w_H(u) = 2, w_H(v) = 4
Relation between Hamming distance and Hamming weight:
–Binary code: d_H(u, v) = w_H(u + v), where '+' is bitwise exclusive OR
–Nonbinary code: d_H(u, v) = w_H(u - v)
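These definitions translate directly into code; a small sketch (not from the slides) checking the example above:

```python
def hamming_distance(u, v):
    """Number of positions at which u and v differ."""
    return sum(a != b for a, b in zip(u, v))

def hamming_weight(u):
    """Number of nonzero elements of u."""
    return sum(x != 0 for x in u)

u = [1, 0, 1, 0, 0, 0]
v = [1, 1, 1, 0, 1, 0]
print(hamming_distance(u, v))                         # 2
print(hamming_weight(u), hamming_weight(v))           # 2 4
# Binary case: d_H(u, v) = w_H(u XOR v)
print(hamming_weight([a ^ b for a, b in zip(u, v)]))  # 2
```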

15. Parameters of block codes
Minimum distance d:
–d = min d_H(c_i, c_j) over all codewords c_i ≠ c_j in C
–Any two codewords differ in at least d places.
–An [n, k] code with minimum distance d is written [n, k, d].
Error detection and correction capability (checked on the [6, 3] example in the sketch after this slide):
–Let s = the number of errors to be detected and t = the number of errors to be corrected (s ≥ t).
–Then d ≥ s + t + 1.
Error correction capability:
–Any block code correcting t or fewer errors satisfies d ≥ 2t + 1.
–Thus t = ⌊(d - 1) / 2⌋.
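A brief sketch computing d and t for the [6, 3] code of slide 13 by brute force (same assumption as before: P inferred from that slide's table):

```python
from itertools import product

P = [[1, 0, 1], [0, 1, 1], [1, 1, 1]]  # parity part of the slide-13 code

def encode(m):
    return list(m) + [sum(mi * P[i][j] for i, mi in enumerate(m)) % 2
                      for j in range(3)]

codewords = [encode(m) for m in product([0, 1], repeat=3)]
# Minimum distance: smallest pairwise Hamming distance over distinct codewords.
d = min(sum(a != b for a, b in zip(u, v))
        for i, u in enumerate(codewords) for v in codewords[i + 1:])
t = (d - 1) // 2
print(f"[6, 3, {d}] code -> corrects t = {t} error(s)")  # [6, 3, 3], t = 1
```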

16. Parameters of block codes
Ex)
–d = 3, 4 → t = 1: single-error-correcting (SEC) codes
–d = 5, 6 → t = 2: double-error-correcting (DEC) codes
–d = 7, 8 → t = 3: triple-error-correcting (TEC) codes
[Figure: coding spheres of radius t around codewords c_i and c_j, which are at distance d from each other.]

17. Code performance and coding gain
Criteria for performance of a coded system:
–BER: bit error rate of the information after decoding, P_b
–SNR: signal-to-noise ratio, E_b / N_0
  E_b = signal energy per bit
  N_0 = one-sided noise power spectral density in the channel
–Coding gain (for a given BER), illustrated in the sketch below:
  G = (E_b / N_0)_without FEC - (E_b / N_0)_with FEC [dB]
At a given BER P_b, the coded system saves G [dB] of transmission power over the uncoded system.
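A tiny sketch of the definition; the numbers are purely illustrative assumptions (uncoded BPSK needs roughly 9.6 dB for P_b = 1e-5, and a hypothetical coded system is assumed to reach the same BER at 7.6 dB):

```python
def coding_gain_db(ebn0_uncoded_db, ebn0_coded_db):
    """Coding gain at a fixed BER: G = (Eb/N0)_uncoded - (Eb/N0)_coded, in dB."""
    return ebn0_uncoded_db - ebn0_coded_db

g = coding_gain_db(9.6, 7.6)  # illustrative Eb/N0 values at Pb = 1e-5
print(f"G = {g:.1f} dB -> transmit power reduced by a factor of {10**(g/10):.2f}")
```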

18. Minimum distance decoding
Maximum-likelihood decoding (MLD):
–m̂: estimated message after decoding
–ĉ: estimated codeword in the decoder
Assume that c was transmitted.
–A decoding error occurs if ĉ ≠ c.
Conditional error probability of the decoder, given r:
  P(E | r) = P(ĉ ≠ c | r)
Error probability of the decoder:
  P(E) = Σ_r P(E | r) P(r), where P(r) is independent of the decoding rule.

19. Minimum distance decoding
Optimum decoding rule: minimize the error probability P(E).
–Since P(r) does not depend on the decoding rule, P(E) is minimized by minimizing P(E | r) for each r, which is equivalent to maximizing P(ĉ = c | r).
Optimum decoding rules:
–argmax_c P(c | r): maximum a posteriori probability (MAP) decoding
–argmax_c P(r | c): maximum-likelihood (ML) decoding
Bayes' rule: P(c | r) = P(r | c) P(c) / P(r)
–If the codewords c are equiprobable, MAP = ML.
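This is also why minimum distance decoding (slide 11) is optimum on a memoryless channel: on a BSC with crossover probability p < 1/2, the likelihood depends on r only through the Hamming distance. A short derivation (not in the original slides):

```latex
P(\mathbf{r} \mid \mathbf{c})
  = p^{\,d_H(\mathbf{r},\mathbf{c})}\,(1-p)^{\,n - d_H(\mathbf{r},\mathbf{c})}
  = (1-p)^n \left(\frac{p}{1-p}\right)^{d_H(\mathbf{r},\mathbf{c})}
```

Since p < 1/2 implies p/(1-p) < 1, maximizing P(r | c) over the codewords is the same as minimizing d_H(r, c); that is, ML decoding reduces to minimum distance decoding.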

20. Problems
Basic problems in coding:
–Find good codes
–Find their decoding algorithms
–Implement the decoding algorithms
Cost of forward error correction:
–With an [n, k] code, the number of transmitted symbols increases from k to n: either the channel bandwidth increases by a factor of n / k, or the message transmission rate decreases by a factor of k / n.

21. Classification
Classification of FEC:
–Block codes: Hamming, BCH, RS, Golay, Goppa, algebraic geometric codes (AGC)
–Tree codes: convolutional codes
–Linear codes: Hamming, BCH, RS, Golay, Goppa, AGC, etc.
–Nonlinear codes: Nordstrom-Robinson, Kerdock, Preparata, etc.
–Systematic codes vs. nonsystematic codes

