
1 Error-Correcting Codes: Progress & Challenges
Madhu Sudan, MIT CSAIL

2 Communication in presence of noise
Sender → Noisy Channel → Receiver. Example: “We are now ready” is sent; “We are not ready” is received. If information is digital, reliability is critical.

3 Shannon’s Model: Probabilistic Noise
Sender encodes (expands): E : Σ^k → Σ^n. Receiver decodes (compresses?): D : Σ^n → Σ^k. Probabilistic noise: e.g., every letter is flipped to a random other letter of Σ with probability p. Focus: design good Encode/Decode algorithms.
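To make the model concrete, here is a minimal Python sketch (not from the talk) of the simplest possible scheme over Σ = {0,1}: a 3-fold repetition encoder, a binary symmetric channel, and a majority-vote decoder. The names (bsc, encode, decode) and the rate-1/3 code are illustrative only; Shannon’s point is that far better trade-offs exist.

import random

def bsc(word, p):
    # binary symmetric channel: flip each bit independently with probability p
    return [b ^ (random.random() < p) for b in word]

def encode(msg, reps=3):
    # E : {0,1}^k -> {0,1}^n with n = 3k (trivial repetition code)
    return [b for b in msg for _ in range(reps)]

def decode(word, reps=3):
    # D : majority vote within each block of `reps` received bits
    return [int(sum(word[i:i + reps]) > reps // 2)
            for i in range(0, len(word), reps)]

msg = [1, 0, 1, 1]
print(decode(bsc(encode(msg), p=0.1)))  # usually recovers [1, 0, 1, 1]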

4 Hamming Model: Worst-case error
Errors: up to t worst-case errors. Focus: the code C = { E(x) : x ∈ Σ^k } (note: not encoding/decoding). Goal: design the code so as to correct any pattern of t errors.

5 Problems in Coding Theory, Broadly
Combinatorics: Design best possible error-correcting codes. Probability/Algorithms: Design algorithms correcting random/worst-case errors.

6 Part I (of III): Combinatorial Results

7 Hamming Notions
Hamming distance: Δ(x, y) = |{ i : x_i ≠ y_i }|. Distance of a code: Δ(C) = min over distinct x, y ∈ C of Δ(x, y); a code of distance d corrects ⌊(d−1)/2⌋ errors. Four parameters: length n, message length k, distance d, alphabet size q = |Σ|. Main question: how do they relate? Asymptotically: let R = k/n and δ = d/n; how do R, δ, q relate?
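A brute-force check of these definitions in Python (the 4-codeword code below is just an example):

def hamming_dist(x, y):
    # Delta(x, y) = number of coordinates where x and y differ
    return sum(a != b for a, b in zip(x, y))

def code_distance(C):
    # Delta(C) = min over distinct pairs of codewords (brute force)
    return min(hamming_dist(x, y)
               for i, x in enumerate(C) for y in C[i + 1:])

C = ["00000", "01011", "10101", "11110"]  # n = 5, k = 2, q = 2
print(code_distance(C))  # 3, so this code corrects (3 - 1) // 2 = 1 error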

8 Simple results
Ball: B(x, r) = { y ∈ Σ^n : Δ(x, y) ≤ r }. Volume of ball: Vol(q, n, r) = |B(x, r)|. Entropy function: H_q(δ) = the constant c such that Vol(q, n, δn) ≈ q^{cn}. Hamming (packing) bound: balls of radius ⌊(d−1)/2⌋ around codewords are disjoint, so no code can have too many codewords: q^k · q^{H_q(δ/2)·n} ≲ q^n, i.e., R ≤ 1 − H_q(δ/2).
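These quantities are easy to compute exactly; a small sketch (the parameter choices are arbitrary):

from math import comb, log

def vol(q, n, r):
    # |B(x, r)|: words within Hamming distance r of a fixed center in Sigma^n, |Sigma| = q
    return sum(comb(n, i) * (q - 1) ** i for i in range(r + 1))

q, n, d = 2, 100, 20
t = (d - 1) // 2  # packing radius
# Hamming bound: q^k * vol(q, n, t) <= q^n, i.e. k <= n - log_q vol(q, n, t)
print(n - log(vol(q, n, t), q))  # ~ 59.1: any binary code with n = 100, d = 20 has k <= 59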

9 Simple results (contd.)
Gilbert-Varshamov (Greedy) Bound: L e t C : k ! n b m a x i l o f d s c = . T h r u 1 w v S o q k H ( n ) . O r : R 1
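The greedy argument is directly executable for tiny parameters (exponential time; illustration only). For n = 7, d = 3, greedily scanning words in lexicographic order happens to produce 16 = 2^4 codewords, far more than the counting bound guarantees:

from itertools import product

def hamming_dist(x, y):
    return sum(a != b for a, b in zip(x, y))

def gv_greedy(n, d):
    # scan all of {0,1}^n, keep any word at distance >= d from everything kept so far
    code = []
    for w in product([0, 1], repeat=n):
        if all(hamming_dist(w, c) >= d for c in code):
            code.append(w)
    return code

print(len(gv_greedy(7, 3)))  # 16, versus the GV guarantee of 2^7 / vol(2,7,2) = ceil(128/29) codewords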

10 Simple results (Summary)
For the best code: After fifty years of research … We still don’t know. 1 H q ( ) R = 2 Which is right?

11 Binary case
Case of large distance: δ = 1/2 − ε, ε → 0: Ω(ε²) ≤ R ≤ Õ(ε²) (lower bound: GV/Chernoff; upper bound: LP bound). Case of small (relative) distance: no upper bound better than R ≤ 1 − (1 − o(1)) · H(δ/2) (Hamming). Case of constant distance d: n − k = (d/2) · log n · (1 ± o(1)) (BCH codes achieve this; the Hamming bound shows it is optimal).

12 Binary case (Closer look):
For general n, d: # codewords ≥ 2^n / Vol(2, n, d − 1) (greedy/GV). Can we do better? Twice as many codewords? (Won’t change the asymptotics of R.) Recent progress [Jiang-Vardy]: # codewords ≥ Ω(n) · 2^n / Vol(2, n, d − 1), for fixed δ = d/n.

13 Proof idea of [Jiang-Vardy]:
Look at the Hamming distance graph: vertices = {0,1}^n, with u ~ v iff Δ(u, v) < d. Code = independent set in this graph. GV bound: there is an independent set of size ≥ (# vertices) / degree. Jiang-Vardy: notice the graph has few triangles (it is locally sparse). Use [AKS]: for graphs with a small number of triangles, the independent-set bound improves by a factor of roughly log(degree).

14 Major questions in binary codes:
Give explicit construction meeting GV bound. Is Hamming tight when Is LP Bound tight? I n p a r t i c u l , g v e o d s f = 1 2 ( ! ) R . ! ? D e s i g n c o d f t a , w h r R = 1 ( + ) l 2 m < . H a m i n g : c = 1 2

15 Combinatorics (contd.): q-ary case
Fix δ and let q → ∞. GV bound: R ≥ 1 − δ − O(1/log q); Plotkin bound: R ≤ 1 − δ. Surprising result (’80s): Algebraic Geometry codes yield R ≥ 1 − δ − 1/(√q − 1), beating the GV bound for large q. (Also a negative surprise: BCH codes only yield n − k ≈ (1 − 1/q) · d · log n — not Hamming.)

16 Major questions: q-ary case
Have R ≥ 1 − δ − f(q); what is the fastest-decaying f(q) for which this holds? (AG codes give f(q) = O(1/√q).) Give a natural explanation of why f(q) = O(1/√q) is achievable. Fix d and let q → ∞: how should n − k grow? Is the Hamming bound, n − k ≈ (d/2) · log n, achievable?

17 Part II (of III): Correcting Random Errors

18 Recall Shannon
q-ary symmetric channel with error probability p: a transmitted symbol σ ∈ Σ arrives intact with probability 1 − p, and becomes a uniformly random τ ≠ σ with probability p. Shannon’s Coding Theorem: can transmit at rate R = 1 − H_q(p) − ε, for every ε > 0: there exist E : Σ^{Rn} → Σ^n and D : Σ^n → Σ^{Rn} with Pr[D(received word) ≠ message] → 0. Converse Coding Theorem: cannot transmit at rate R = 1 − H_q(p) + ε, for any ε > 0. So no mystery?
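A two-line computation of the binary capacity formula (the value p = 0.11 is chosen for illustration):

from math import log2

def H(p):
    # binary entropy function H_2(p)
    return -p * log2(p) - (1 - p) * log2(1 - p)

p = 0.11
print(1 - H(p))  # ~ 0.50: rate ~1/2 is the best possible against 11% random errors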

19 Constructive versions
Shannon’s functions: E picked at random, D by brute force. Can we get polynomial-time computable E, D? [Forney ’66]: gave polynomial-time computable E, D (using concatenation of Reed-Solomon codes). Didn’t completely satisfy. Why? [Spielman ’94 + Barg-Zémor ’97]: linear-time computable E, D (satisfied?). [Berrou et al. ’92]: Turbo codes + belief propagation; no theorems so far.

20 What is satisfaction?
Articulated by [Luby, Mitzenmacher, Shokrollahi, Spielman ’96]. Practicality: for [Forney] and successors, the decoding complexity grows exponentially in 1/ε, where ε = (1 − H_q(p)) − R is the gap to capacity. The right question: get decoding complexity poly(1/ε) · n at rate R = 1 − H_q(p) − ε.

21 Current state of the art
Luby et al.: Propose study of codes based on irregular graphs (“Irregular LDPC Codes”).

22 LDPC Codes
Bipartite graph: n left vertices (the codeword coordinates) and n − k right vertices (the parity checks). Codeword = a 0/1 assignment to the left vertices such that every right vertex sees even parity among its neighbors; this implicitly defines E : {0,1}^k → {0,1}^n. Low-degree (“low-density”) graph ⇒ Low-Density Parity-Check code.
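A minimal sketch of the definition (the 3 × 6 parity-check matrix below is a toy example, far too small and dense to be a real LDPC code):

import numpy as np

# rows = right (check) vertices, columns = left (codeword) vertices
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])

def is_codeword(word):
    # codeword <=> every check vertex sees even parity among its neighbors
    return not (H @ word % 2).any()

print(is_codeword(np.array([1, 1, 1, 0, 0, 0])))  # True: all three checks are satisfied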

23 LDPC Codes: Decoding
Intuition: a failed parity check means some neighboring bit is corrupted, so flip bits that lie in many failed checks. [Gallager ’63, Sipser-Spielman ’96]: this corrects an Ω(1) fraction of errors (when the graph expands). Current hope: pick the graph/degree sequence carefully and decode with message passing.
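A sketch of the flipping intuition on the toy matrix from the previous slide (this naive variant is only guaranteed to work on expanding graphs, not on a 6-bit example in general; the single-error input below is chosen so it does converge):

import numpy as np

H = np.array([[1, 1, 0, 1, 0, 0],   # same toy parity-check matrix as above
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])

def bit_flip_decode(word, rounds=50):
    # Gallager-style flipping: while some check fails, flip the bit in the most failed checks
    word = word.copy()
    for _ in range(rounds):
        syndrome = H @ word % 2        # which parity checks fail
        if not syndrome.any():
            break                      # all checks satisfied: done
        fails = H.T @ syndrome         # per-bit count of failed checks it touches
        word[np.argmax(fails)] ^= 1
    return word

received = np.array([1, 0, 1, 0, 0, 0])  # codeword 111000 with bit 1 flipped
print(bit_flip_decode(received))         # -> [1 1 1 0 0 0]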

24 Current state of the art
Luby et al.: Propose study of codes based on irregular graphs (“Irregular LDPC Codes”). No theorems so far for erroneous channels. Strong analysis for the (much) simpler case of erasure channels (symbols are erased), with decoding time O(n · log(1/ε)) for codes within ε of capacity. (It is easy to get “composition”-based algorithms with decoding time O(n · poly(1/ε)).) There are proposals for errors as well (with analysis by Luby et al., Richardson & Urbanke), but none known to converge to the Shannon limit.

25 Still open
Articulated by [Luby, Mitzenmacher, Shokrollahi, Spielman ’96]. The right question: get decoding complexity poly(1/ε) · n at rate R = 1 − H_q(p) − ε.

26 Part III: Correcting Adversarial Errors

27 Motivation: As notions of communication/storage get more complex, modeling error as oblivious (to message/encoding/decoding) may be too simplistic. Need more general models of error + encoding/decoding for such models. Most pessimistic model: errors are worst-case.

28 Gap between worst-case & random errors
In the Shannon model, with a binary channel: can correct up to 50% (random) errors. (q-ary: a 1 − 1/q fraction of errors.) In the Hamming model, for a binary channel: a code with more than n codewords has distance at most 50%, so it corrects at most 25% worst-case errors. (q-ary: a (1/2)(1 − 1/q) fraction of errors.) Need new approaches to bridge the gap.

29 Approach: List-decoding
Main reason for the gap between Shannon & Hamming: the insistence on uniquely recovering the message. List-decoding [Elias ’57, Wozencraft ’58]: a relaxed notion of recovery from error. The decoder produces a small list (of L codewords) that includes the message. A code is (p, L)-list-decodable if it corrects a p fraction of errors with lists of size L.
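The definition, spelled out as brute force in Python (the toy code and radius are arbitrary):

def list_decode(C, received, p):
    # return every codeword within Hamming distance p*n of the received word
    n = len(received)
    dist = lambda x, y: sum(a != b for a, b in zip(x, y))
    return [c for c in C if dist(c, received) <= p * n]

C = ["00000", "01011", "10101", "11110"]
print(list_decode(C, "00011", p=0.4))  # ['00000', '01011']: a list of size 2 at 40% error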

31 What to do with list? Probabilistic error: the list has size one with probability nearly 1. General channel: need side information of only O(log n) bits to disambiguate [Guruswami ’03]. (Alternatively, if sender and receiver share O(log n) bits, they can disambiguate [Langberg ’04].) Computationally bounded error: model introduced by [Lipton; Ding-Gopalan-Lipton]. List-decoding results can be extended (assuming PKI and some memory at the sender) [Micali et al.].

32 List-decoding: State of the art
[Zyablov-Pinsker/Blinovskii, late ’80s]: there exist codes of rate 1 − H_q(p) − ε that are (p, O(1/ε))-list-decodable. Matches Shannon’s converse perfectly! (So we can’t do better even for random error!) But [ZP/B] is non-constructive!

33 Algorithms for List-decoding
Not examined until ’88. First results: [Goldreich-Levin] for “Hadamard” codes (non-trivial in their setting). More recent work: [S. ’96, Shokrollahi-Wasserman ’98, Guruswami-S. ’99, Parvaresh-Vardy ’05, Guruswami-Rudra ’06]: decode algebraic codes. [Guruswami-Indyk ’00-’02]: decode graph-theoretic codes. [Ta-Shma-Zuckerman ’02, Trevisan ’03]: propose new codes for list-decoding.

34 Results in List-decoding
q-ary case [Guruswami-Rudra ’06]: codes of rate R correcting a 1 − R − ε fraction of errors, with alphabet size q depending only on ε. Converges to Shannon capacity! Binary case: there exist codes of rate ε^c correcting a 1/2 − ε fraction of errors. c = 4: Guruswami et al.; c → 3: implied by Parvaresh-Vardy ’05.

35 Few lines about Guruswami-Rudra
Code = collated Reed-Solomon code + concatenation. Alphabet = F_q^C; q → ∞, C a constant. The code maps Σ^K → Σ^N. Message: a polynomial P of degree ≈ CK over F_q. Encoding: first partition F_q into special sets S_0, S_1, …, S_N with |S_i| = C; the i-th symbol of the encoding is the C-tuple ⟨P(α) : α ∈ S_i⟩.

36 Few lines about Guruswami-Rudra
Special properties: Is this code combinatorially good? Algorithmically good!! (uses ideas from [S’96,GS’98,PV’05 + new ones]. Can concatenate to reduce alphabet size. o f K , S i s S i = f C ; : 1 g . s a t i e x q = m o d h ( ) f r u c b l g C K . D o B a l s f r d i u ( 1 ) N K h v e w c ?

37 Few lines about Guruswami-Rudra
Warnings: K, N, and the partition are all very special. Alphabet = F_q^C; q → ∞, C a constant. The code maps Σ^K → Σ^N. Message: a polynomial P of degree ≈ CK over F_q. Encoding: first partition F_q into special sets S_0, S_1, …, S_N, with |S_1| = ⋯ = |S_N| = C. Say S_1 = {α_1, …, α_C}, S_2 = {α_{C+1}, …, α_{2C}}, etc. Encoding of P: ⟨⟨P(α_1), …, P(α_C)⟩, ⟨P(α_{C+1}), …, P(α_{2C})⟩, …⟩.
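A toy rendition of the collation step in Python, over a small prime field and with a naive consecutive partition instead of the very special S_i the construction actually needs (so this illustrates only the data layout, not the algebra):

# toy collated evaluation: q = 13, C = 3 evaluations per symbol, N = 4 symbols
q, C, N = 13, 3, 4
P = [2, 0, 1]  # message polynomial P(x) = 2 + x^2 over F_q

def ev(P, x, q):
    # Horner evaluation of P at x modulo q
    r = 0
    for coeff in reversed(P):
        r = (r * x + coeff) % q
    return r

# naive partition of {1, ..., N*C} into consecutive blocks S_1, ..., S_N of size C
symbols = [tuple(ev(P, a, q) for a in range(i * C + 1, i * C + C + 1)) for i in range(N)]
print(symbols)  # each C-tuple is one symbol of the big alphabet F_q^C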

38 Major open question
Construct (p, O(1))-list-decodable binary codes of rate 1 − H(p) − ε, with a polynomial-time list-decoding algorithm. Note: if the running time is poly(1/ε), this would also settle the random-error question above.

39 Conclusions
Many mysteries in the combinatorial setting. Significant progress in the algorithmic setting, but many important open questions as well.

