1
Optimal Lower Bounds for 2-Query Locally Decodable Linear Codes
Kenji Obata

2
Codes
Error correcting code C : {0,1}^n → {0,1}^m with decoding procedure A s.t. for y ∈ {0,1}^m with d(y, C(x)) ≤ δm, A(y) = x

3
Locally Decodable Codes
Weaken the power of A: it can only look at a constant number q of input bits
Weaken the requirements: A need only recover a single given bit of x, and can fail with some probability bounded away from ½
Study initiated by Katz and Trevisan [KT00]

4
Locally Decodable Codes
Define a (q, δ, ε)-locally decodable code:
A can make q queries (w.l.o.g. exactly q queries)
For all x ∈ {0,1}^n, all y ∈ {0,1}^m with d(y, C(x)) ≤ δm, and all input bits i ∈ {1,…,n}: A(y, i) = x_i with probability ≥ ½ + ε
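
As a concrete illustration (not part of the slides), the Hadamard code gives a simple 2-query LDC; the sketch below, with helper names of my own choosing, empirically checks the definition for q = 2 against random corruption of a δ ≈ 0.05 fraction of the codeword:

```python
import random

def hadamard_encode(x):
    """Encode an n-bit message x (list of 0/1) as its 2^n-bit Hadamard
    codeword: position a holds the inner product <a, x> mod 2."""
    n = len(x)
    return [sum(((a >> k) & 1) * x[k] for k in range(n)) % 2
            for a in range(2 ** n)]

def decode_bit(y, i, n, rng):
    """2-query local decoder for x_i: query a random position a and the
    position a XOR e_i, then XOR the two answers.  Each single query is
    uniformly distributed, so at most a 2*delta fraction of the coin
    tosses touches a corrupted bit."""
    a = rng.randrange(2 ** n)
    return (y[a] + y[a ^ (1 << i)]) % 2

rng = random.Random(0)
n = 8
x = [rng.randrange(2) for _ in range(n)]
y = hadamard_encode(x)

# Corrupt roughly a delta = 5% fraction of the codeword.
m = len(y)
for pos in rng.sample(range(m), m // 20):
    y[pos] ^= 1

# Each bit should be recovered with probability >= 1/2 + eps,
# where eps = 1/2 - 2*delta (here about 0.9 overall).
trials = 2000
success_rates = [sum(decode_bit(y, i, n, rng) == x[i]
                     for _ in range(trials)) / trials
                 for i in range(n)]
```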

5
LDC Applications
Direct: scalable fault-tolerant information storage
Indirect: lower bounds for certain classes of private information retrieval schemes (more on this later)

6
Lower Bounds for LDCs
[KT00] proved a general lower bound m ≥ n^{q/(q−1)} (at best n², but known codes are exponential)
For 2-query linear LDCs, Goldreich, Karloff, Schulman, and Trevisan [GKST02] proved an exponential bound m ≥ 2^{Ω(εδn)}

7
Lower Bounds for LDCs
Restriction to linear codes is interesting, since known LDC constructions are linear
But 2^{Ω(εδn)} is not quite right:
– The lower bound should increase arbitrarily as the decoding probability → 1 (ε → ½)
– No matching construction
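
A quick numeric illustration of the first point (example values are my own): the [GKST02] exponent εδ stays bounded by δ/2 as ε → ½, while an exponent of the form δ/(1−2ε) grows without bound, matching the intuition that near-perfect decoding should force an ever-longer encoding.

```python
# Compare the two exponents as the decoding advantage eps approaches 1/2.
delta = 0.05
eps_values = (0.1, 0.4, 0.49, 0.499)
old_exps = [eps * delta for eps in eps_values]          # GKST02: eps*delta
new_exps = [delta / (1 - 2 * eps) for eps in eps_values]  # this work

# The old exponent never exceeds delta/2; the new one blows up.
assert all(e <= delta / 2 for e in old_exps)
assert new_exps[-1] > 100 * new_exps[0]
```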

8
Lower Bounds for LDCs
In this work, we prove that for 2-query linear LDCs, m ≥ 2^{Ω(δ/(1−2ε)·n)}
Optimal: there is an LDC construction matching this to within a constant factor in the exponent

9
Techniques from [KT00]
Fact: An LDC is also a smooth code (A queries each position with roughly the same probability) … so we can study smooth codes
Connects LDCs to information-theoretic PIR schemes: q queries ↔ q servers; smoothness ↔ statistical indistinguishability

10
Techniques from [KT00]
For i ∈ {1,…,n}, define the recovery graph G_i associated with C:
Vertex set {1,…,m} (bits of the codeword)
Edges are pairs (q_1, q_2) such that, conditioned on A querying q_1, q_2, A(C(x), i) outputs x_i with probability > ½
Call these edges good edges (their endpoints contain information about x_i)
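
For linear codes the recovery graph has a clean description: viewing codeword bit j as an inner product q_j · x, a pair is a good edge for bit i exactly when the two query vectors sum to e_i. A small sketch (helper names are mine; the Hadamard code serves as the example):

```python
# For a linear code, codeword bit j equals q_j · x for a query vector
# q_j in {0,1}^n (packed into an int here).  A pair (j1, j2) is a good
# edge for message bit i exactly when q_{j1} + q_{j2} = e_i (mod 2):
# then y_{j1} XOR y_{j2} = x_i on any uncorrupted codeword.
def recovery_graph(query_vectors, i):
    e_i = 1 << i
    where = {}
    for j, q in enumerate(query_vectors):
        where.setdefault(q, []).append(j)
    return [(j1, j2)
            for j1, q in enumerate(query_vectors)
            for j2 in where.get(q ^ e_i, [])
            if j1 < j2]

# Hadamard code on {0,1}^4: the query vectors are all of {0,1}^4, and
# each G_i comes out as a perfect matching on the 16 codeword positions.
queries = list(range(2 ** 4))
g0 = recovery_graph(queries, 0)
touched = {v for e in g0 for v in e}
```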

11
Techniques from [KT00]/[GKST02]
Theorem: If C is (2, c, ε)-smooth, then G_i contains a matching of size ≥ εm/c.
Better to work with non-degenerate codes: each bit of the encoding depends on more than one bit of the message. For linear codes, good edges are non-trivial linear combinations.
Fact: Any smooth code can be made non-degenerate (with constant loss in parameters).

12
Core Lemma [GKST02]
Let q_1,…,q_m be linear functions on {0,1}^n s.t. for every i ∈ {1,…,n} there is a set M_i of at least γm disjoint pairs of indices (j_1, j_2) such that x_i = q_{j_1}(x) + q_{j_2}(x). Then m ≥ 2^{γn}.
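
The lemma's hypothesis is exactly what the Hadamard code supplies, with γ = ½. A small check (my own illustration, packing bit-vectors into ints):

```python
import random

# Hadamard code on {0,1}^n: q_a(x) = a · x (mod 2).  For each i, the set
# M_i = { (a, a XOR e_i) : bit i of a is 0 } consists of m/2 disjoint
# pairs with x_i = q_{j1}(x) + q_{j2}(x), so gamma = 1/2 and the lemma's
# conclusion m >= 2^{n/2} indeed holds for m = 2^n.
rng = random.Random(1)
n = 6
m = 2 ** n

def q(a, x):  # inner product of two bit-vectors packed into ints
    return bin(a & x).count("1") % 2

x = rng.randrange(2 ** n)
for i in range(n):
    pairs = [(a, a ^ (1 << i)) for a in range(m) if not (a >> i) & 1]
    assert len(pairs) == m // 2  # gamma * m disjoint pairs, gamma = 1/2
    assert all((q(j1, x) + q(j2, x)) % 2 == (x >> i) & 1
               for j1, j2 in pairs)
gamma = 0.5
```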

13
Putting it all together…
If C is a (2, c, ε)-smooth linear code, then (by the reduction to a non-degenerate code + existence of large matchings + the core lemma), m ≥ 2^{εn/4c}.
If C is a (2, δ, ε)-locally decodable linear code, then (by the LDC → smooth reduction), m ≥ 2^{εδn/8}.

14
Putting it all together…
Summary: locally decodable → smooth → big matchings → exponential size
This work: locally decodable → big matchings (skip the smoothness reduction and argue directly about LDCs)

15
The Blocking Game
Let G(V, E) be a graph on n vertices, w a probability distribution on E, X_w an edge sampled according to w, and S a subset of V
Define the blocking probability β_δ(G) as min_w ( max_{|S| ≤ δn} Pr(X_w intersects S) )
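
The definition can be made concrete with a brute-force sketch on a toy graph (my own example, not from the slides): player 1 picks the edge distribution w, the blocker picks S, and spreading weight across a matching is what keeps the blocking probability down.

```python
from itertools import combinations

def max_block_prob(n, edges, w, delta):
    """Best response of the blocker: max over |S| <= delta*n of
    Pr(an edge sampled from w touches S), by exhaustive search."""
    k = int(delta * n)
    best = 0.0
    for S in combinations(range(n), k):
        s = set(S)
        p = sum(wt for (u, v), wt in zip(edges, w) if u in s or v in s)
        best = max(best, p)
    return best

# Path 0-1-2-3 with delta = 1/4: the blocker removes one vertex.
n = 4
edges = [(0, 1), (1, 2), (2, 3)]
# Spreading weight over the two matching edges limits every single
# vertex to blocking probability 1/2 ...
assert abs(max_block_prob(n, edges, [0.5, 0.0, 0.5], 0.25) - 0.5) < 1e-9
# ... while concentrating on one edge lets one vertex block everything.
assert abs(max_block_prob(n, edges, [1.0, 0.0, 0.0], 0.25) - 1.0) < 1e-9
```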

16
The Blocking Game
Want to characterize β_δ(G) in terms of the size of a maximum matching M(G), or equivalently the defect d(G) = n − 2M(G)
Theorem: Let G be a graph with defect αn. Then β_δ(G) ≥ min(δ/(1−α), 1).

17
The Blocking Game
Define K(n,α) to be the edge-maximal graph on n vertices with defect αn.
[Figure: K(n,α) drawn as a part K_1 on αn vertices joined to a clique K_2 on (1−α)n vertices]

18
The Blocking Game
Optimization on K(n,α) is a relaxation of optimization on any graph with defect αn:
If d(G) ≥ αn then β_δ(G) ≥ β_δ(K(n,α))
So it is enough to think about K(n,α).

19
The Blocking Game
Intuitively, the best strategy for player 1 is to spread the distribution as uniformly as possible
A (λ_1, λ_2)-symmetric dist: all edges in (K_1, K_2) have weight λ_1; all edges in (K_2, K_2) have weight λ_2
Lemma: There is a (λ_1, λ_2)-symmetric dist w s.t. β_δ(K(n,α)) = max_{|S| ≤ δn} Pr(X_w intersects S).

20
The Blocking Game
Claim: Let w_1,…,w_k be dists s.t. max_{|S| ≤ δn} Pr(X_{w_i} intersects S) = β_δ(G). Then for any convex combination w = Σ γ_i w_i,
max_{|S| ≤ δn} Pr(X_w intersects S) = β_δ(G).
Proof: For S ⊆ V, |S| ≤ δn, the intersection probability is ≤ Σ γ_i β_δ(G) = β_δ(G). So max_{|S| ≤ δn} Pr(X_w intersects S) ≤ β_δ(G). But by definition of β_δ(G), this must be ≥ β_δ(G).

21
The Blocking Game
Proof: Let w be any distribution optimizing β_δ(G). If w does, then so does π(w) for any π ∈ Aut(G) = Γ. By the prior claim, so does w̄ = (1/|Γ|) Σ_{π ∈ Γ} π(w).
For e ∈ E, σ ∈ Γ: w̄(e) = (1/|Γ|) Σ_{π ∈ Γ} w(π(e)) = (1/|Γ|) Σ_{π ∈ Γ} w(πσ(e)) = w̄(σ(e)).
So if e, e′ are in the same Γ-orbit, they have the same weight in w̄ ⇒ w̄ is (λ_1, λ_2)-symmetric.

22
The Blocking Game
Claim: If w is (λ_1, λ_2)-symmetric then ∃ S ⊆ V, |S| ≤ δn, s.t. Pr(X_w intersects S) ≥ min(δ/(1−α), 1).
Proof: If δ ≥ 1 − α then S can cover every edge. Otherwise, set S = any δn vertices of K_2. Then
Pr = δ ( 1/(1−α) + ½ n² (1−α−δ) λ_2 )
which, for δ < 1 − α, is at least δ/(1−α) (minimized when λ_2 = 0).

23
The Blocking Game
Theorem: Let G be a graph with defect αn. Then β_δ(G) ≥ min(δ/(1−α), 1).
Proof: β_δ(G) ≥ β_δ(K(n,α)). The blocking probability on K(n,α) is attained by some (λ_1, λ_2)-symmetric dist. For any such dist w, there are δn vertices blocking w with probability ≥ min(δ/(1−α), 1).
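
A numeric sanity check of the claim on K(n,α), assuming the complete-split-graph reading of the figure on slide 17 (K_1 independent, K_2 a clique, all (K_1, K_2) edges present); helper names and parameter values are mine:

```python
# K(n, alpha): a part K1 of alpha*n vertices completely joined to a
# clique K2 of (1-alpha)*n vertices (reconstruction of slide 17's figure).
n, alpha, delta = 20, 0.5, 0.2
k1, k2 = int(alpha * n), int((1 - alpha) * n)

def block_prob(lam2_share):
    """Blocking probability of S = delta*n vertices of K2 against the
    (lam1, lam2)-symmetric distribution that puts total mass lam2_share
    on the K2-internal edges."""
    e12 = k1 * k2                # number of (K1, K2) edges
    e22 = k2 * (k2 - 1) // 2     # number of (K2, K2) edges
    lam2 = lam2_share / e22
    lam1 = (1 - lam2_share) / e12
    s = int(delta * n)
    hit12 = k1 * s * lam1                              # (K1,K2) edges into S
    hit22 = (s * (k2 - s) + s * (s - 1) // 2) * lam2   # K2 edges touching S
    return hit12 + hit22

bound = delta / (1 - alpha)
for share in (0.0, 0.25, 0.5, 1.0):
    # Matches the slide's formula delta*(1/(1-a) + (1/2)n^2(1-a-d)*lam2) ...
    lam2 = share / (k2 * (k2 - 1) // 2)
    slide_pr = delta * (1 / (1 - alpha)
                        + 0.5 * n * n * (1 - alpha - delta) * lam2)
    assert abs(block_prob(share) - slide_pr) < 1e-9
    # ... and is always at least delta/(1-alpha), with equality at lam2 = 0.
    assert block_prob(share) >= bound - 1e-9
```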

24
Lower Bound for LDLCs
Still need a degenerate → non-degenerate reduction (this time for LDCs instead of smooth codes)
Theorem: Let C be a (2, δ, ε)-locally decodable linear code. Then, for large enough n, there exists a non-degenerate (2, δ/2.01, ε)-locally decodable linear code C′ : {0,1}^n → {0,1}^{2m}.

25
Lower Bound for LDLCs
Theorem: Let C be a (2, δ, ε)-LDLC. Then, for large enough n, m ≥ 2^{(1/4.03)·δ/(1−2ε)·n}.
Proof: Make C non-degenerate. Then:
local decodability ⇒ low blocking probability (at most ¼ − ½ε)
⇒ low defect (α ≤ 1 − (δ/2.01)/(1−2ε))
⇒ big matching (of size ≥ ½ · (δ/2.01)/(1−2ε) · 2m)
⇒ exponentially long encoding (m ≥ 2^{(1/4.02)·δ/(1−2ε)·n − 1})

26
Matching Upper Bound
Hadamard code on {0,1}^n: y_i = a_i · x (a_i runs through {0,1}^n)
2-query locally decodable
Recovery graphs are perfect matchings on the n-dim hypercube
Success parameter ε = ½ − 2δ
Can use concatenated Hadamard codes (Trevisan):

27
Matching Upper Bound
Set c = (1−2ε)/4δ (it can be shown that for feasible values of δ, ε, c ≥ 1). Divide the input into c blocks of n/c bits, and encode each block with the Hadamard code on {0,1}^{n/c}.
Each block then has at most a fraction cδ of corrupt entries, so the code has recovery parameter ½ − 2·((1−2ε)/4δ)·δ = ε
The code has length ((1−2ε)/4δ) · 2^{4δ/(1−2ε)·n}
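
The block arithmetic above can be sketched numerically (illustrative values for n, δ, ε; the function name is mine):

```python
import math

# Concatenated Hadamard code: c = (1-2*eps)/(4*delta) blocks of n/c bits,
# each Hadamard-encoded separately.
def concat_hadamard_length(n, delta, eps):
    c = (1 - 2 * eps) / (4 * delta)
    return c * 2 ** (n / c)  # c blocks, each of length 2^{n/c}

n, delta, eps = 120, 0.05, 0.1
c = (1 - 2 * eps) / (4 * delta)  # = 4 blocks of 30 bits each

# Worst case, all delta*m corruptions land in one block, which then has a
# corrupt fraction of c*delta, so that block still decodes with advantage
# 1/2 - 2*c*delta = eps.
assert abs((0.5 - 2 * c * delta) - eps) < 1e-12
# Total length agrees with the slide's ((1-2e)/4d) * 2^{4d*n/(1-2e)}.
assert math.isclose(concat_hadamard_length(n, delta, eps),
                    c * 2 ** (4 * delta * n / (1 - 2 * eps)),
                    rel_tol=1e-9)
```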

28
Conclusions
There is a matching upper bound (concatenated Hadamard code)
New results for 2-query non-linear codes (but using apparently completely different techniques)
q > 2?
– No analog to the core lemma for more queries
– But the blocking game analysis might generalize to useful properties other than matching size
