Presentation on theme: "Local Algorithms & Error-correction Madhu Sudan Microsoft Research July 25, 2011 1 Local Error-Correction."— Presentation transcript:

1 Local Algorithms & Error-correction. Madhu Sudan, Microsoft Research. July 25, 2011.

2 Prelude
Algorithmic Problems in Coding Theory
New Paradigm in Algorithms
The Marriage: Local Error-Detection & Correction

3 Algorithmic Problems in Coding Theory
Code: Σ = finite alphabet (e.g., {0,1}, {A … Z}); E: Σ^k → Σ^n; Image(E) = C ⊆ Σ^n; R(C) = k/n; δ(C) = normalized Hamming distance of the code.
Encoding: fix a code C and its associated E; given m ∈ Σ^k, compute E(m).
Error-detection (ε-testing): given x ∈ Σ^n, decide whether ∃ m such that x = E(m); or, more robustly, decide whether ∃ m such that δ(x, E(m)) ≤ ε.
Error-correction (decoding): given x ∈ Σ^n, compute (all) m such that δ(x, E(m)) ≤ ε (if any exist).
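A minimal Python sketch of these three tasks, using a toy 3-fold repetition map as a stand-in for E (the choice of E, the alphabet, and the parameters are illustrative assumptions, not anything from the talk):

```python
from itertools import product

SIGMA = (0, 1)          # the alphabet Sigma
K = 2                   # message length k; codeword length n = 3k below

def encode(m):
    """Toy encoding E (illustrative assumption): repeat each symbol 3 times."""
    return tuple(s for s in m for _ in range(3))

def delta(x, y):
    """Normalized Hamming distance between equal-length words."""
    return sum(a != b for a, b in zip(x, y)) / len(x)

CODE = {encode(m): m for m in product(SIGMA, repeat=K)}    # C = Image(E)

def detect(x, eps):
    """Error-detection (eps-testing): is x within distance eps of some codeword?"""
    return any(delta(x, c) <= eps for c in CODE)

def decode(x, eps):
    """Error-correction: all messages m with delta(x, E(m)) <= eps, by brute force."""
    return [m for c, m in CODE.items() if delta(x, c) <= eps]

x = (1, 0, 0, 1, 1, 1)                   # E((0, 1)) with one flipped symbol
print(detect(x, 1/6), decode(x, 1/6))    # True [(0, 1)]
```

Brute-force detection and decoding read all of x and enumerate |Σ|^k codewords; the rest of the talk asks what can be done with far fewer queries.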

4 Sublinear time algorithmics
Given f: {0,1}^k → {0,1}^n, can f be "computed" in o(k, n) time?
Answer 1: Clearly NO, since that is the time it takes to even read the input / write the output.
Answer 2: YES, if we are willing to
1. Present the input implicitly (by an oracle).
2. Represent the output implicitly.
3. Compute the function on an approximation to the input.
Extends to computing relations as well.
(Figure: an x-oracle answers query j with x_j; on input i, the algorithm outputs f(x)_i, or f(x')_i where x' ≈ x.)
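A toy illustration of this access model; the function f below is hypothetical, chosen only so that each output bit depends on two input bits:

```python
import random

def make_oracle(x):
    """Present the input implicitly: the algorithm sees x only through queries x[j]."""
    return lambda j: x[j]

def local_f(oracle, i, k):
    """Represent the output implicitly: return bit i of f(x), for the toy function
    f(x)_i = x_i AND x_{(i+1) mod k}. Uses 2 oracle queries, independent of k."""
    return oracle(i) & oracle((i + 1) % k)

k = 1_000_000
x = [random.randint(0, 1) for _ in range(k)]
print(local_f(make_oracle(x), 42, k))     # one output bit, without reading all of x
```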

5 Sub-linear time algorithms
Initiated in the late eighties in the context of:
Program checking [Blum-Kannan, Blum-Luby-Rubinfeld]
Interactive Proofs / PCPs [Babai-Fortnow-Lund]
Now successful in many more contexts:
Property testing / graph-theoretic algorithms
Sorting / searching
Statistics / entropy computations
(High-dimensional) computational geometry
Many initial results are coding-theoretic!

6 Sub-linear time algorithms & coding
Encoding: not reasonable to expect in sub-linear time.
Testing? Decoding? These can be done in sublinear time; in fact, many initial results do so!
Codes that admit efficient …
… testing: Locally Testable Codes (LTCs)
… decoding: Locally Decodable Codes (LDCs)

7 Rest of this talk
Definitions of LDCs and LTCs
Quick description of known results
The first result: Hadamard codes
Some basic constructions
Recent constructions of LDCs:
[Kopparty-Saraf-Yekhanin '11]
[Yekhanin '07, Raghavendra '08, Efremenko '09]

8 Definitions

9 Locally Decodable Code
C: Σ^k → Σ^n is (q, ε)-locally decodable if there is a decoder D such that: given i ∈ [k] and oracle access to w: [n] → Σ with δ(w, C(m)) ≤ ε ≤ δ(C)/2 for some message m, D^w(i) reads q(n) positions of w (chosen randomly) and outputs m_i with probability ≥ 2/3.
What if ε > δ(C)/2? We might need to report a list of codewords.

10 Locally List-Decodable Code
C is (ε, L)-list-decodable if for every w ∈ Σ^n, the number of codewords c ∈ C with δ(w, c) ≤ ε is at most L.
C is (q, ε, L)-locally list-decodable if there is a decoder D such that: given oracle access to w: [n] → Σ, for every m ∈ Σ^k with δ(w, C(m)) ≤ ε, there exists j ∈ [L] such that for every i ∈ [k], D^w(i, j) reads q(n) positions of w (chosen randomly) and outputs m_i with probability ≥ 2/3.

11 History of definitions
Constructions predate formal definitions:
[Goldreich-Levin '89]
[Beaver-Feigenbaum '90, Lipton '91]
[Blum-Luby-Rubinfeld '90]
Hints at a definition (in particular, an interpretation in the context of error-correcting codes): [Babai-Fortnow-Levin-Szegedy '91]
Formal definitions:
[S.-Trevisan-Vadhan '99] (local list-decoding)
[Katz-Trevisan '00]

12 Locally Testable Codes
"Weak" definition: hinted at in [BFLS], explicit in [RS '96, Arora '94, Spielman '94, FS '95].
C is (q, ε)-locally testable if there is a tester T that reads q(n) positions of w (probabilistically) such that:
If w ∈ C, T accepts with probability 1.
If δ(w, C) > ε, T rejects with probability ≥ ½.

13 Strong Locally Testable Codes
"Strong" definition: [Goldreich-S. '02]
C is (q, ε)-(strongly) locally testable if there is a tester T that reads q(n) positions of w (probabilistically) such that:
If w ∈ C, T accepts with probability 1.
For every w ∈ Σ^n, T rejects with probability ≥ Ω(δ(w, C)).

14 Motivations

15 Local Decoding: Worst-case vs. Average-case
Suppose C ⊆ Σ^N is locally decodable for N = 2^n (and, furthermore, that we can locally decode all bits of the codeword, not just the message bits).
A codeword c ∈ C can be viewed as a function c: {0,1}^n → Σ.
Local decoding roughly means: we can compute c(x) for every x if we can compute c(x') for most x'.
This relates average-case complexity to worst-case complexity [Lipton, STV].
Alternate interpretation: we can compute c(x) without revealing x.
Leads to Instance Hiding Schemes [BF] and Private Information Retrieval [CGKS].

16 Motivation for Local Testing
No generic applications known. However:
An interesting phenomenon on its own.
An intangible connection to Probabilistically Checkable Proofs (PCPs).
A potentially good approach to understanding the limitations of PCPs (though all resulting work has led to improvements).

17 Contrast between decoding and testing
Decoding: a property of words near codewords. Testing: a property of words far from the code.
Decoding: the motivations are happy with n = quasi-poly(k) and q = polylog n; lower bounds show that q = O(1) with n nearly linear in k is impossible.
Testing: better tradeoffs are possible! Likely more useful in practice.
Even conceivable: n = O(k) with q = O(1)?

18 Some LDCs and LTCs

19 Hadamard (1st-Order RM) Codes
Messages: (coefficients of) linear functions {L: F_2^k → F_2}.
Encoding: evaluations of L on all of F_2^k.
Parameters: k-bit messages → 2^k-bit codewords.
Locality:
2-locally decodable [folklore/exercise]
3-locally testable [Blum-Luby-Rubinfeld]
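A minimal sketch of both local algorithms for the Hadamard code; the corruption model and the parameters are illustrative assumptions:

```python
import random
from itertools import product

k = 10
msg = tuple(random.randint(0, 1) for _ in range(k))   # coefficients of L(x) = <msg, x> mod 2

def dot(a, b):
    return sum(u & v for u, v in zip(a, b)) % 2

def xor_vec(a, b):
    return tuple(u ^ v for u, v in zip(a, b))

# The codeword is the truth table of L; corrupt an eps fraction of its 2^k positions.
eps = 0.05
word = {x: dot(msg, x) for x in product((0, 1), repeat=k)}
for x in random.sample(list(word), int(eps * 2 ** k)):
    word[x] ^= 1

def local_decode(w, i):
    """2-query decoder: m_i = L(e_i) = L(r) xor L(r + e_i) for a random r.
    Each query is individually uniform, so each bit is correct w.p. >= 1 - 2*eps."""
    e_i = tuple(1 if j == i else 0 for j in range(k))
    r = tuple(random.randint(0, 1) for _ in range(k))
    return w[r] ^ w[xor_vec(r, e_i)]

def blr_test(w):
    """3-query BLR linearity test: accept iff w(x) xor w(y) == w(x xor y)."""
    x = tuple(random.randint(0, 1) for _ in range(k))
    y = tuple(random.randint(0, 1) for _ in range(k))
    return w[x] ^ w[y] == w[xor_vec(x, y)]

print([local_decode(word, i) for i in range(k)], list(msg))   # bits match w.p. >= 0.9 each
print(sum(not blr_test(word) for _ in range(1000)) / 1000)    # rejection rate grows with corruption
```

The decoder works because L(r) xor L(r + e_i) = L(e_i) = m_i for every r, while each of the two queried positions on its own is uniformly distributed over the table.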

20 Hadamard (1st-Order RM) Codes
Summary: there exist infinite families of codes with constant locality (for both testing and correcting).

21 Codes via Multivariate Polynomials
Message: coefficients of a degree-t, m-variate polynomial P over a finite field F (the (generalized) Reed-Muller code).
Encoding: evaluations of P over all of F^m.
Parameters: k ≈ (t/m)^m; n = |F|^m; δ(C) ≈ 1 - t/|F|.

22 Basic insight to locality
An m-variate polynomial of degree t, restricted to an m'-dimensional affine subspace (m' < m), is still a polynomial of degree t.
Local decoding: given an oracle for w ≈ P and a point x ∈ F^m,
pick a subspace A through x,
query w on A and decode for P|_A.
Query complexity: q = |F|^{m'}; time = poly(q); m' = o(m) ⇒ sublinear!
Local testing: verify that w restricted to a subspace has degree t. Same complexity; the analysis is much harder.
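A minimal sketch of the m' = 1 case (decoding along a random line) over a small prime field, with random corruption as the error model. This is the textbook line-restriction idea rather than the exact decoder from the talk, and the field size, degree, and error rate are illustrative assumptions:

```python
import random
from itertools import product
from math import prod

p = 101                      # a prime, so F_p arithmetic is just mod p
m, deg = 3, 5                # 3 variables, total degree 5 (deg < p, so distance ~ 1 - deg/p)

# A random m-variate polynomial of total degree <= deg: {exponent tuple: coefficient}.
monomials = [e for e in product(range(deg + 1), repeat=m) if sum(e) <= deg]
P = {e: random.randrange(p) for e in monomials}

def evaluate(poly, x):
    return sum(co * prod(pow(xi, ei, p) for xi, ei in zip(x, e))
               for e, co in poly.items()) % p

def oracle(x):
    """Corrupted codeword access: wrong value on roughly 2% of queries."""
    return random.randrange(p) if random.random() < 0.02 else evaluate(P, x)

def interpolate_at_zero(points):
    """Lagrange-interpolate a univariate polynomial through (s, v) pairs; evaluate at 0."""
    total = 0
    for si, vi in points:
        num = den = 1
        for sj, _ in points:
            if sj != si:
                num = num * (-sj) % p
                den = den * (si - sj) % p
        total = (total + vi * num * pow(den, p - 2, p)) % p
    return total

def local_decode(x):
    """Decode P(x) from deg+1 queries on a random line {x + s*y}: the restriction of a
    degree-deg polynomial to a line is univariate of degree <= deg, and its value at
    s = 0 is P(x). (A robust decoder would run Reed-Solomon decoding on the line.)"""
    y = [random.randrange(p) for _ in range(m)]
    params = random.sample(range(1, p), deg + 1)            # distinct nonzero s values
    points = [(s, oracle([(xi + s * yi) % p for xi, yi in zip(x, y)])) for s in params]
    return interpolate_at_zero(points)

x = [random.randrange(p) for _ in range(m)]
print(local_decode(x), evaluate(P, x))    # agree unless one of the ~6 queries was corrupted
```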

23 Polynomial Codes
Many parameters: m, t, F. Many tradeoffs are possible (a few sample points are computed in the sketch below):
Locality (log k)^2 with n = k^4;
Locality ε·k with n = O(k);
Locality q (constant) with n = exp(k^{1/(q-1)}).
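A small calculator for these tradeoffs, using the approximate formulas from slide 21 (k ≈ C(t+m, m) monomials, n = |F|^m) and locality ≈ |F| for line decoding; the specific parameter choices are illustrative assumptions:

```python
from math import comb

def rm_params(m, t, q):
    """Approximate Reed-Muller parameters: q = field size, m = variables, t = degree (t < q)."""
    k = comb(t + m, m)          # number of coefficients (monomials of total degree <= t)
    n = q ** m                  # one evaluation per point of F^m
    locality = q                # queries on one line through the target point
    return k, n, locality

# Few variables, high degree: modest length but large locality (~ sqrt(n)).
print(rm_params(m=2, t=100, q=256))
# Many variables, low degree: small locality, but n blows up exponentially in m.
print(rm_params(m=10, t=20, q=64))
```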

24 Are Polynomial Codes (Roughly) Best?
No! [Ambainis '97] [Goldreich-S. '00] …
No!! [Beimel, Ishai, Kushilevitz, Raymond]
Really … seriously … no!!!!
[Yekhanin '07, Raghavendra '08, Efremenko '09]
[Kopparty-Saraf-Yekhanin '10]

25 Recent LDCs - I [Kopparty-Saraf-Yekhanin '10]

26 The Concern
Poor rate of polynomial codes:
Best rate (for any non-trivial locality): ½ (bivariate polynomials, √n locality).
Locality n^ε: rate ε^{1/ε} (use 1/ε variables).
Practical codes use high rates (say 80%).

27 Bivariate Polynomials
Use t = (1 - ρ)·|F|, with ρ → 0. This yields δ(C) ≈ ρ.
Number of coefficients: k < ½·(1 - ρ)²·|F|².
Encoding length: n = |F|².
Rate ≈ ½·(1 - ρ)².
We can't use degree > |F|; hence rate < ½!

28 Multiplicity Codes
Idea: encode a polynomial P(x,y) by its evaluations, together with the evaluations of its (partial) derivatives!
Sample parameters:
n = 3·|F|² (the |F|² evaluations of the triple {P, P_x, P_y}).
However, the degree can now be larger than |F|: t = 2·(1 - ρ)·|F| ⇒ δ(C) = ρ.
k ≈ 2·(1 - ρ)²·|F|²; rate ≈ 2/3.
Locality = O(|F|) = O(√k).
Getting better:
With more multiplicity, the rate goes up.
With more variables, the locality goes down.
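A back-of-the-envelope comparison of the two settings above (plain bivariate Reed-Muller from slide 27 vs. the order-1 multiplicity code), treating the approximations on the slides as exact; the numbers are only illustrative:

```python
def bivariate_rm(F, rho):
    """Slide 27: degree t = (1 - rho) * F, one field element per evaluation point."""
    k = 0.5 * (1 - rho) ** 2 * F ** 2
    n = F ** 2
    return k / n                        # approaches 1/2 as rho -> 0

def multiplicity_order1(F, rho):
    """Slide 28: degree t = 2 * (1 - rho) * F, each point carries (P, P_x, P_y)."""
    k = 2 * (1 - rho) ** 2 * F ** 2     # ~ t^2 / 2 coefficients
    n = 3 * F ** 2                      # three field elements per evaluation point
    return k / n                        # approaches 2/3 as rho -> 0

F, rho = 256, 0.05
print(round(bivariate_rm(F, rho), 3), round(multiplicity_order1(F, rho), 3))
# 0.451 vs. 0.602: same field, same ~rho distance, noticeably better rate
```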

29 Multiplicity Codes: The Theorem
Theorem: for all α, β > 0, there exist δ > 0 and an LDC C: {0,1}^k → {0,1}^n with
rate ≥ 1 - α,
distance ≥ δ,
locality ≤ k^β (decodable with k^β queries).

30 Recent LDCs - II [Yekhanin '07, Raghavendra '08, Efremenko '09]

31 Other end of the spectrum
What is the minimum locality possible?
q = 2: Hadamard codes achieve n = 2^k; [Kerenidis, de Wolf]: n ≥ exp(k).
q = 3: best possible = ?
Until 2006, the widely held belief: n ≥ exp(k^{0.1}).
[Yekhanin '07]: n ≤ exp(k^{0.0000001}).
[Raghavendra '08]: clarified the above.
[Efremenko '09]: n ≤ exp(exp(√(log k))) …

32 Essence of the idea
Build a "good" combinatorial matrix over Z_m (the integers modulo m).
Embed Z_m in a multiplicative subgroup of F.
Get a locally decodable code over F.

33 "Good" combinatorial matrix
(Figure) A = k x n matrix over Z_m: zeroes on the diagonal, non-zero entries off the diagonal, remaining entries arbitrary; the set of columns is closed under addition.
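A minimal checker for these properties on a tiny hand-made example; the matrix, the modulus m = 6, and the reading of "diagonal" for a k x n matrix are assumptions made for illustration:

```python
from itertools import product

def is_good(A, m):
    """Check the figure's properties for a k x n matrix A over Z_m (k <= n), reading
    'diagonal' as positions (i, i) of the first k columns (an interpretive assumption):
    zero diagonal, non-zero off-diagonal among the first k columns, and the set of
    columns closed under addition mod m."""
    k = len(A)
    cols = {tuple(row[j] % m for row in A) for j in range(len(A[0]))}
    for i in range(k):
        if A[i][i] % m != 0:
            return False
        if any(A[i][j] % m == 0 for j in range(k) if j != i):
            return False
    return all(tuple((a + b) % m for a, b in zip(u, v)) in cols
               for u, v in product(cols, repeat=2))

# Tiny hand-made example over Z_6: the columns form the subgroup {0, 3} x {0, 3}.
A = [[0, 3, 3, 0],
     [3, 0, 3, 0]]
print(is_good(A, 6))   # True
```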

34 Embedding into a field
Let A = [a_ij] be good over Z_m.
Let ω ∈ F be a primitive m-th root of unity.
Let G = [ω^{a_ij}].
Theorem [Y, R, E]: G generates an m-query LDC over F!!!
Highly non-intuitive!

35 Improvements
Let A = [a_ij] be good; let G = [ω^{a_ij}].
If the off-diagonal entries of A come from a set S, then the code is (|S|+1)-locally decodable (this suffices for [Efremenko]).
If {ω^s : s ∈ S} are roots of a t-sparse polynomial, then the code is t-locally decodable (critical for [Yekhanin]).

36 "Good" Matrices?
[Yekhanin]:
Picked m prime.
Hand-constructed matrix.
Achieved n = exp(k^{1/|S|}). Optimal if m is prime!
Managed to make S large (10^6) with t = 3.
[Efremenko]:
m composite!
Achieves |S| = 3 and n = exp(exp(√(log k))) ([Beigel, Barrington, Rudich]; [Grolmusz]).
Optimal?

37 Limits to Local Decodability: Katz-Trevisan
q queries ⇒ n = k^{1 + Ω(1/q)}.
Technique:
Recall that the decoder D, given x, computes C(x) w.h.p., for all x.
We can assume (with some modifications) that the query pattern is uniform for any fixed x.
We can find many random strings such that their query sets are disjoint.
In that case, a random subset of n^{1 - 1/q} coordinates of the codeword contains at least one query set, for most x.
This yields the desired bound.

38 Some general results
Sparse, high-distance codes are locally decodable and testable [Kaufman-Litsyn, Kaufman-S.].
2-transitive codes of small dual-distance are locally decodable [Alon, Kaufman, Krivelevich, Litsyn, Ron].
Linear-invariant codes of small dual-distance are also locally testable [Kaufman-S.].

39 Summary
Local algorithms in error-detection/correction lead to interesting new questions.
Non-trivial progress so far.
Limits largely unknown:
O(1)-query LDCs must have rate tending to 0 [Katz-Trevisan].

40 Questions
Can LTCs replace RS (on your hard disks)?
Lower bounds? Better error models?
Simple/general near-optimal constructions?
Other applications to mathematics/computation? (Are PCPs necessary/sufficient?)
Lower bounds for LDCs? Better constructions?

41 Thank You!

