
1
Expander Graphs, Randomness Extractors, and List-Decodable Codes
Salil Vadhan, Harvard University
Joint work with Venkat Guruswami (UW) & Chris Umans (Caltech)

2
Connections in Pseudorandomness (diagram): Expander Graphs, Randomness Extractors, List-Decodable Error-Correcting Codes, Pseudorandom Generators, and Samplers, connected by [GW94,WZ95,TUZ01,RVW00,CRVW02], [Tre99,RRV99,ISW99,SU01,U02], [Tre99,TZ01,TZS01,SU01], [CW89,Z96], [PV05,GR06], and this work.

3
Outline Expander Construction Application to Extractors Connections Conclusions

4
(Bipartite) Expander Graphs
N left vertices, M right vertices, left-degree D. "(K,A) expander": every S with |S| ≤ K has |Γ(S)| ≥ A·|S|.
Goals: minimize D, maximize A, minimize M.
Nonconstructive: D = O(log(N/M)/ε) (O(1) if M = Ω(N); O(log N) if M ≤ √N), A = (1−ε)·D, M = O(KD/ε).

5
Applications of Expanders
– Fault-tolerant networks (e.g., [Pin73,Chu78,GG81])
– Sorting in parallel [AKS83]
– Complexity theory [Val77,Urq87]
– Derandomization [AKS87,INW94,Rei05,…]
– Randomness extractors [CW89,GW94,TUZ01,RVW00]
– Ramsey theory [Alo86]
– Error-correcting codes [Gal63,Tan81,SS94,Spi95,LMSS01]
– Distributed routing in networks [PU89,ALM96,BFU99]
– Data structures [BMRS00]
– Distributed storage schemes [UW87]
– Hard tautologies in proof complexity [BW99,ABRW00,AR01]
– Other areas of math [KR83,Lub94,Gro00,LP01]
Need explicit constructions (deterministic, time poly(log N)).

6
Advantage of Expansion (1−ε)·D
If |Γ(S)| ≥ (1−ε)·D·|S| for |S| ≤ K, then at least (1−2ε)·D·|S| elements of Γ(S) are unique neighbors: they touch exactly one edge from S.
Fault tolerance: even if an adversary removes most (say ¾ of the) edges from each vertex, lossless expansion is maintained (with ε′ = 4ε).
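The unique-neighbor claim above is just edge counting: S emits D·|S| edges, and every non-unique neighbor absorbs at least two of them, so at least 2·|Γ(S)| − D·|S| = (1−2ε)·D·|S| neighbors are unique. A brute-force sketch of this bound on toy random graphs (all names and parameters below are illustrative):

```python
import itertools
import random

def unique_neighbor_check(trials=100, n=8, m=12, d=3):
    """Check the counting argument behind the unique-neighbor bound:
    if |Gamma(S)| = (1 - eps) * d * |S|, then at least
    (1 - 2*eps) * d * |S| = 2*|Gamma(S)| - d*|S| vertices of Gamma(S)
    touch exactly one edge from S."""
    rng = random.Random(1)
    for _ in range(trials):
        # a random d-left-regular bipartite graph (parallel edges allowed)
        nbrs = [[rng.randrange(m) for _ in range(d)] for _ in range(n)]
        for size in (1, 2, 3):
            for S in itertools.combinations(range(n), size):
                edges = [v for x in S for v in nbrs[x]]
                gamma = set(edges)
                unique = [v for v in gamma if edges.count(v) == 1]
                # integer form of the (1 - 2*eps) * d * |S| lower bound
                assert len(unique) >= 2 * len(gamma) - d * size
    return True
```

The assertion holds with equality exactly when every non-unique neighbor absorbs exactly two edges.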

7
Application to Data Structures [BMRS00]
Goal: store a small S ⊆ [N] so that membership can be tested by (probabilistically) reading 1 bit.
Expansion (1−ε)·D ⇒ ∃ a 0/1 assignment to [M] such that for every x ∈ [N], a 1−O(ε) fraction of its neighbors hold the correct answer!
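As an illustration of the 1-probe scheme, here is a hand-made toy instance; the graph, set, and bit table below are invented for illustration (the slide only asserts that a good assignment exists):

```python
import random

# Toy sketch of the [BMRS00]-style 1-probe membership test. The graph,
# set S, and bit table are a hand-made illustrative instance, not an
# actual expander from the talk.
nbrs = [[0, 1, 2], [3, 4, 5], [0, 3, 4], [1, 4, 5]]  # neighbors in [M] = 6
S = {0}
bits = [1, 1, 1, 0, 0, 0]  # 1 exactly on Gamma(S)

def probe_membership(x, rng=random):
    """Guess 'x in S?' by reading ONE random bit of the table."""
    return bool(bits[rng.choice(nbrs[x])])

# every x has a >= 2/3 fraction of neighbors holding the correct answer
for x in range(len(nbrs)):
    correct = sum(bool(bits[z]) == (x in S) for z in nbrs[x])
    assert correct >= 2 * len(nbrs[x]) / 3
```

For x = 0 and x = 1 every neighbor agrees, so the single probe is always correct; for the other vertices it errs with probability 1/3, the toy analogue of the O(ε) error.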

8
Application to Data Structures [BMRS00]
Size: M = O(K·log N) with an optimal expander; Ω(K·log N) is necessary just to represent the set.
Compare perfect hashing: same size, but reads an O(log N)-bit word.

9
Explicit Constructions (degree D, expansion A, right-side size M; ε an arbitrary constant; quasipoly(t) = exp(polylog t))
– Nonconstructive: D = O(log(N/M)), A = (1−ε)·D, M = O(KD)
– Ramanujan graphs […LPS86,M88]: D = O(1), A ≈ D/2 [Kah94], M = N
– Zig-zag [CRVW02]: D = O(1), A = (1−ε)·D, M = N
– Ta-Shma, Umans, Zuckerman [TUZ01]: D = polylog(N), A = (1−ε)·D, M = quasipoly(KD); also D = quasipoly(log N), A = (1−ε)·D, M = poly(KD)
– Our result: D = polylog(N), A = (1−ε)·D, M = poly(KD)

10
Our Result
Thm: For every N, K, ε > 0, ∃ an explicit (K,A) expander with degree D = poly(log N, 1/ε), expansion A = (1−ε)·D, and #right vertices M = D²·K^{1.01}. (|Γ(S)| ≥ A·|S| for every S with |S| ≤ K.)

11
Our Construction
Left vertices = F_q^n = polynomials of degree ≤ n−1 over F_q. Degree = q. Right vertices = F_q^{m+1}.
Γ(f,y) = y'th neighbor of f = (y, f(y), (f^h mod E)(y), (f^{h^2} mod E)(y), …, (f^{h^{m−1}} mod E)(y)),
where E(Y) is an irreducible polynomial of degree n and h is a parameter.
Thm: This is a (K,A) expander with K = h^m, A = q − hnm.
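The neighbor map can be sketched directly with coefficient-list polynomial arithmetic over a prime field. The tiny parameters q, n, m, h and the choice of E below are illustrative, not the settings from the talk:

```python
# Minimal sketch of the neighbor map Gamma(f, y) over a prime field F_q.
# Polynomials are coefficient lists (low-order first); the tiny parameters
# below are illustrative choices only.
q, n, m, h = 5, 2, 2, 2
E = [1, 1, 1]  # E(Y) = 1 + Y + Y^2, irreducible over F_5 (monic, degree n)

def poly_mul(a, b):
    prod = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            prod[i + j] = (prod[i + j] + ai * bj) % q
    return prod

def poly_mod(p):
    """Reduce p modulo the monic polynomial E."""
    p = [c % q for c in p]
    while len(p) > n:
        c = p.pop()
        d = len(p)  # degree of the popped term
        for k in range(n):
            p[d - n + k] = (p[d - n + k] - c * E[k]) % q
    return p + [0] * (n - len(p))

def poly_pow_mod(f, e):
    """Square-and-multiply: f^e mod E."""
    result, base = [1] + [0] * (n - 1), poly_mod(f)
    while e:
        if e & 1:
            result = poly_mod(poly_mul(result, base))
        base = poly_mod(poly_mul(base, base))
        e >>= 1
    return result

def poly_eval(f, y):
    acc = 0
    for c in reversed(f):  # Horner's rule
        acc = (acc * y + c) % q
    return acc

def neighbor(f, y):
    """Gamma(f, y) = (y, f(y), (f^h mod E)(y), ..., (f^(h^(m-1)) mod E)(y))."""
    return tuple([y] + [poly_eval(poly_pow_mod(f, h**i), y) for i in range(m)])

print(neighbor([3, 1], 2))  # f(Y) = 3 + Y  ->  (2, 0, 3)
```

Each left vertex (a length-n coefficient list) has exactly q neighbors, one per seed y, matching the degree claim on the slide.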

12
Setting Parameters
Γ(f,y) = (y, f(y), (f^h mod E)(y), …, (f^{h^{m−1}} mod E)(y)); f a polynomial of degree ≤ n−1, E irreducible of degree n.
N = |F_q^n| = q^n, D = q, M = |F_q^{m+1}| = q^{m+1}.
Thm: This is a (K,A) expander with K = h^m, A = q − hnm.
Set h = poly(nm/ε) and q = h^{1.01}. Then:
D = q = poly(log N, 1/ε)
A = q − hnm ≥ (1−ε)·D
M = q^{m+1} = q·(h^{1.01})^m = D·K^{1.01}

13
Relation to Parvaresh–Vardy Codes [PV05]
Γ(f,y) = (y, f(y), (f^h mod E)(y), …, (f^{h^{m−1}} mod E)(y)); f of degree ≤ n−1, E irreducible of degree n.
Thm: This is a (K,A) expander with K = h^m, A = q − hnm.
Γ(f,y) = (y, y'th symbol of the PV encoding of f). The proof of expansion is inspired by the list-decoding algorithm for PV codes.

14
List-Decoding View of Expanders
For T ⊆ [M], define LIST(T) = {x ∈ [N] : Γ(x) ⊆ T}.
Lemma: G is a (=K,A) expander (i.e., |Γ(S)| ≥ A·K for every S with |S| = K) iff for all T ⊆ [M] of size A·K − 1 we have |LIST(T)| ≤ K − 1.
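The lemma can be sanity-checked by brute force on tiny random bipartite graphs, testing both sides of the equivalence directly (all parameters and helper names below are toy choices):

```python
import itertools
import random

def gamma(nbrs, S):
    out = set()
    for x in S:
        out |= nbrs[x]
    return out

def is_eqK_expander(nbrs, K, A):
    """|Gamma(S)| >= A*K for every S of size exactly K."""
    return all(len(gamma(nbrs, S)) >= A * K
               for S in itertools.combinations(range(len(nbrs)), K))

def list_condition(nbrs, M, K, A):
    """|LIST(T)| <= K-1 for every T of size A*K - 1."""
    for T in itertools.combinations(range(M), A * K - 1):
        Tset = set(T)
        if sum(nbrs[x] <= Tset for x in range(len(nbrs))) > K - 1:
            return False
    return True

def lemma_check(trials=100, N=5, M=6, D=3, K=2, A=2, seed=0):
    rng = random.Random(seed)
    for _ in range(trials):
        nbrs = [set(rng.sample(range(M), D)) for _ in range(N)]
        if is_eqK_expander(nbrs, K, A) != list_condition(nbrs, M, K, A):
            return False
    return True
```

The equivalence needs M ≥ A·K − 1 so that any deficient Γ(S) can be padded to a witness T, which the toy sizes satisfy.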

15
Comparing List-Decoding Views
Γ : [N] × [D] → [D] × [M], T ⊆ [D] × [M].
– Expanders: Γ(x,y) = y'th neighbor of x; x ∈ LIST(T) iff ∀y Γ(x,y) ∈ T; decoding problem: |T| < AK ⇒ |LIST(T)| < K.
– List-decodable codes: Γ(x,y) = (y, ECC(x)_y); x ∈ LIST(T) iff Pr_y[Γ(x,y) ∈ T] ≥ 1/M + ε; decoding problem: T = {(y, r_y)} ⇒ |LIST(T)| < K.

16
Proof of Expansion
Γ(f,y) = (y, f(y), (f^h mod E)(y), …, (f^{h^{m−1}} mod E)(y)); f of degree ≤ n−1, E irreducible of degree n.
Thm: For A = q − nmh and any K ≤ h^m: T ⊆ F_q^{m+1} of size A·K − 1 ⇒ |LIST(T)| ≤ K − 1.
Proof outline (following [S97,GS99,PV05]):
1. Find a low-degree polynomial Q vanishing on T.
2. Show that every f ∈ LIST(T) is a "root" of a related polynomial Q′.
3. Show that deg(Q′) ≤ K − 1.

17
Proof of Expansion: Step 1
Γ(f,y) = (y, f(y), (f^h mod E)(y), …, (f^{h^{m−1}} mod E)(y)); f of degree ≤ n−1, E irreducible of degree n.
Thm: For A = q − nmh, K = h^m: |T| ≤ A·K − 1 ⇒ |LIST(T)| ≤ K − 1.
Step 1: Find a low-degree polynomial Q vanishing on T.
Take Q(Y, Z_1, …, Z_m) of degree ≤ A − 1 in Y and degree ≤ h − 1 in each Z_i.
#coefficients = A·h^m = A·K > |T| = #constraints ⇒ a nonzero solution exists.
WLOG E(Y) does not divide Q(Y, Z_1, …, Z_m).
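The counting step can be made concrete: with m = 1 and toy parameters, a brute-force search over all A·h = A·K coefficient vectors finds a nonzero Q vanishing on any A·K − 1 points, exactly as the dimension count guarantees. The parameters and point set below are hypothetical:

```python
import itertools

# Toy instance of Step 1 with m = 1 (so Q = Q(Y, Z)): tiny hypothetical
# parameters, chosen only to make exhaustive search feasible.
q, A, h = 5, 2, 2             # K = h^m = 2, so |T| <= A*K - 1 = 3
T = [(0, 1), (2, 3), (4, 4)]  # three points (y, z) in F_q^2

def q_eval(coeffs, y, z):
    # coeffs[i][j] multiplies Y^i * Z^j  (deg_Y <= A-1, deg_Z <= h-1)
    return sum(coeffs[i][j] * pow(y, i, q) * pow(z, j, q)
               for i in range(A) for j in range(h)) % q

def find_vanishing_Q():
    for flat in itertools.product(range(q), repeat=A * h):
        if any(flat):  # skip the zero polynomial
            coeffs = [list(flat[i * h:(i + 1) * h]) for i in range(A)]
            if all(q_eval(coeffs, y, z) == 0 for y, z in T):
                return coeffs
    return None

Q = find_vanishing_Q()
assert Q is not None  # guaranteed: A*K unknowns vs. A*K - 1 homogeneous constraints
```

In the real proof this is just linear algebra over F_q: a homogeneous system with more unknowns than equations has a nonzero kernel vector.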

18
Proof of Expansion: Step 2
Γ(f,y) = (y, f(y), (f^h mod E)(y), …, (f^{h^{m−1}} mod E)(y)); f of degree ≤ n−1, E irreducible of degree n.
Thm: For A = q − nmh, K = h^m: |T| ≤ A·K − 1 ⇒ |LIST(T)| ≤ K − 1.
Step 1 gave: ∃ Q vanishing on T, deg ≤ A−1 in Y, ≤ h−1 in each Z_i, E ∤ Q.
Step 2: Every f ∈ LIST(T) is a "root" of a related Q′.
f(Y) ∈ LIST(T)
⇒ ∀y ∈ F_q: Q(y, f(y), (f^h mod E)(y), …, (f^{h^{m−1}} mod E)(y)) = 0
⇒ Q(Y, f(Y), (f^h mod E)(Y), …, (f^{h^{m−1}} mod E)(Y)) ≡ 0
   (this univariate polynomial has degree less than q but vanishes at all q points of F_q, so it is identically zero)
⇒ Q(Y, f(Y), f(Y)^h, …, f(Y)^{h^{m−1}}) ≡ 0 (mod E(Y))
⇒ Q′(f) = 0 in F_q[Y]/E(Y), where Q′(Z) = Q(Y, Z, Z^h, …, Z^{h^{m−1}}) mod E(Y).

19
Proof of Expansion: Step 3
Γ(f,y) = (y, f(y), (f^h mod E)(y), …, (f^{h^{m−1}} mod E)(y)); f of degree ≤ n−1, E irreducible of degree n.
Thm: For A = q − nmh, K = h^m: |T| ≤ A·K − 1 ⇒ |LIST(T)| ≤ K − 1.
Step 1: ∃ Q vanishing on T, deg ≤ A−1 in Y, ≤ h−1 in each Z_i, E ∤ Q.
Step 2: ∀ f ∈ LIST(T): Q′(f) = 0, where Q′(Z) = Q(Y, Z, Z^h, …, Z^{h^{m−1}}) mod E(Y).
Step 3: Show that deg(Q′) ≤ K − 1.
Q′(Z) is nonzero because Q(Y, Z_1, …, Z_m) is not divisible by E(Y) and has degree ≤ h−1 in each Z_i.
deg(Q′(Z)) ≤ (h−1) + (h−1)·h + … + (h−1)·h^{m−1} = h^m − 1 = K − 1.
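The final equality is a telescoping geometric sum; a quick numeric check:

```python
# Numeric check of the Step 3 degree bound:
# (h-1) + (h-1)*h + ... + (h-1)*h^(m-1) = h^m - 1 = K - 1
for h in range(2, 12):
    for m in range(1, 7):
        assert sum((h - 1) * h**i for i in range(m)) == h**m - 1
```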

20
Proof of Expansion: Wrap-Up
Γ(f,y) = (y, f(y), (f^h mod E)(y), …, (f^{h^{m−1}} mod E)(y)); f of degree ≤ n−1, E irreducible of degree n.
Thm: For A = q − nmh, K = h^m: |T| ≤ A·K − 1 ⇒ |LIST(T)| ≤ K − 1.
Step 1: ∃ Q vanishing on T, deg ≤ A−1 in Y, ≤ h−1 in each Z_i, E ∤ Q.
Step 2: ∀ f ∈ LIST(T): Q′(f) = 0, where Q′(Z) = Q(Y, Z, Z^h, …, Z^{h^{m−1}}) mod E(Y).
Step 3: deg(Q′) ≤ K − 1.
Proof of Thm: |LIST(T)| ≤ deg(Q′) ≤ K − 1. ∎

21
Our Result
Thm: For every N, K, ε > 0, ∃ an explicit (K,A) expander with degree D = poly(log N, 1/ε), expansion A = (1−ε)·D, and #right vertices M = D²·K^{1.01}. (|Γ(S)| ≥ A·|S| for every S with |S| ≤ K.)

22
Outline Expander Construction Application to Extractors Connections Conclusions

23
Extractors: Original Motivation [SV84,Vaz85,VV85,CG85,Vaz87,CW89,Zuc90,Zuc91] Randomization is pervasive in CS –Algorithm design, cryptography, distributed computing, … Typically assume perfect random source. –Unbiased, independent random bits –Unrealistic? Can we use a “weak” random source? –Source of biased & correlated bits. –More realistic model of physical sources. (Randomness) Extractors: convert a weak random source into an almost-perfect random source.

24
Applications of Extractors
– Derandomization of (poly-time/log-space) algorithms [Sip88,NZ93,INW94,GZ97,RR99,MV99,STV99,GW02]
– Distributed & network algorithms [WZ95,Zuc97,RZ98,Ind02]
– Hardness of approximation [Zuc93,Uma99,MU01,Zuc06]
– Data structures [Ta02]
– Cryptography [BBR85,HILL89,CDHKS00,Lu02,DRS04,NV04]
– Metric embeddings [Ind06]

25
Extractors [NZ93]
Def: A (k,ε)-extractor is Ext : {0,1}^n × {0,1}^d → {0,1}^m such that for every k-source X, Ext(X, U_d) is ε-close to U_m (in variation distance).
(A k-source satisfies Pr[X = x] ≤ 2^{−k} for all x. The extractor turns a k-source of length n, plus a d-bit random seed, into m almost-uniform bits.)
Optimal (nonconstructive): d = log(n−k) + 2·log(1/ε) + O(1), m = k + d − 2·log(1/ε) − O(1).
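To make the definition concrete, here is a toy computation of the extractor error ε for the classical inner-product map (not the construction from this talk) on a hand-picked flat 2-source; all choices below are illustrative:

```python
from itertools import product
from fractions import Fraction

# Toy illustration of the (k, eps)-extractor definition, using the
# inner-product map Ext(x, y) = <x, y> mod 2. NOT this talk's construction.
n = d = 4

def ext(x, y):
    return sum(a & b for a, b in zip(x, y)) & 1

def extractor_error(source):
    """Variation distance of Ext(X, U_d) from U_1, for X flat on `source`."""
    p0 = Fraction(0)
    for x in source:
        for y in product((0, 1), repeat=d):
            if ext(x, y) == 0:
                p0 += Fraction(1, len(source) * 2**d)
    return abs(p0 - Fraction(1, 2))

# uniform on a 4-element subspace of {0,1}^4: min-entropy k = 2
source = [(a, b, 0, 0) for a in (0, 1) for b in (0, 1)]
print(extractor_error(source))  # -> 1/8
```

The error 1/8 here comes from the seeds orthogonal to the source's subspace, on which the output bit is constant.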

26
Our Result
Thm: For every n, k, ε > 0, ∃ an explicit (k,ε) extractor with seed length d = O(log(n/ε)) and output length m = .99k.
Previously achieved by [LRVW03]:
– only worked for ε ≥ 1/n^{o(1)}
– complicated recursive construction

27
Approach: Condensers [RR99,RSW00]
Def: A k →_ε k′ condenser is Con : {0,1}^n × {0,1}^d → {0,1}^m such that for every k-source X, Con(X, U_d) is ε-close to some k′-source. (It maps a k-source of length n, using a d-bit seed, to ≈ a k′-source of length m.)
Can extract from the output: easier if k′/m > k/n. Called lossless if k′ = k + d.

28
Lossless Condensers ≡ Expanders
Lemma [TUZ01]: Con : {0,1}^n × {0,1}^d → {0,1}^m is a k →_ε k+d condenser iff, viewed as a bipartite graph ({0,1}^n on the left, {0,1}^m on the right, x joined to Con(x,y) for each seed y), it is a (2^k, (1−ε)·2^d) expander.
Proof (⇐): It suffices to condense sources uniform on 2^k strings. Expansion ⇒ |Γ(S)| ≥ (1−ε)·2^d·2^k, so Γ can be made 1-to-1 by moving an ε fraction of the edges.
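The (⇐) direction can be checked numerically: for X flat on S, the distance from Con(X, U_d) to the nearest source that is flat at level 1/(|S|·2^d) equals exactly the expansion deficiency 1 − |Γ(S)|/(|S|·2^d). A sketch with a toy graph (assuming the output domain has room to absorb the excess mass):

```python
from fractions import Fraction

def condenser_error(nbrs, S):
    """For X flat on S and Con(x, y) = nbrs[x][y], the distance from
    Con(X, U_d) to the nearest source flat at level 1/(|S|*d) equals
    sum_z max(p_z - 1/(|S|*d), 0) = 1 - |Gamma(S)| / (|S|*d)."""
    d = len(nbrs[0])         # number of seeds = left-degree
    total = len(S) * d       # total edge mass out of S
    counts = {}
    for x in S:
        for z in nbrs[x]:
            counts[z] = counts.get(z, 0) + 1
    err = sum(max(Fraction(c, total) - Fraction(1, total), 0)
              for c in counts.values())
    # excess mass above the flat level = deficiency of expansion, exactly
    assert err == 1 - Fraction(len(counts), total)
    return err

# a toy 2-regular graph on 4 left / 4 right vertices
nbrs = [[0, 1], [1, 2], [2, 3], [3, 0]]
print(condenser_error(nbrs, [0, 1]))  # Gamma(S) = {0,1,2}: error = 1 - 3/4
```

Every edge carries mass 1/(|S|·d), so each collision leaves exactly one unit of excess; losslessness (ε = 0) corresponds to Γ being 1-to-1 on S.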

29
Our Condenser
Thm: For every N, K, ε > 0, ∃ an explicit (K,A) expander with degree D = poly(log N, 1/ε), expansion A = (1−ε)·D, and #right vertices M = D²·K^{1.01}.
⇒ Thm: For every n, k, ε > 0, ∃ an explicit k →_ε k+d condenser with seed length d = O(log n + log(1/ε)) and output length m = 2d + 1.01·k.
Con(f, y) = (y, f(y), (f^h mod E)(y), …, (f^{h^{m−1}} mod E)(y))

30
Our Extractor
Condense: ∃ an explicit k →_ε k+d condenser with seed length d = O(log n + log(1/ε)) and output length m ≈ 1.01k.
Con(f, y) = (y, f(y), (f^h mod E)(y), …, (f^{h^{m−1}} mod E)(y))
Then extract: apply an extractor for min-entropy rate .99, constant ε:
– Ext(x, y) = y'th vertex on the expander walk specified by x.
– Extraction follows from the Chernoff bound for expander walks [G98], via the equivalence of extractors and samplers [Z96].
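The walk-based extraction step can be sketched as follows; the graph below is a 5-cycle standing in for a genuine expander, and encoding x as (start vertex, step labels) is an illustrative assumption about the input format:

```python
# Sketch of the walk-based extraction step. The 5-cycle is only a
# stand-in for a real expander.

def walk_extract(nbrs, start, steps, y):
    """Return the y'th vertex on the walk that starts at `start` and
    follows the given edge labels; the walk visits len(steps)+1 vertices."""
    visited = [start]
    for s in steps:
        visited.append(nbrs[visited[-1]][s])
    return visited[y]

nbrs = {v: [(v - 1) % 5, (v + 1) % 5] for v in range(5)}  # 2-regular cycle
print(walk_extract(nbrs, 0, [1, 1, 0], 2))  # walk 0 -> 1 -> 2 -> 1, outputs 2
```

With a true expander, the Chernoff bound for walks says the visited vertices behave like independent samples, which is what makes the y'th walk vertex a good extractor output.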

31
Our Extractor
Condense: ∃ an explicit k →_ε k+d condenser with seed length d = O(log n + log(1/ε)) and output length m ≈ 1.01k.
Con(f, y) = (y, f(y), (f^h mod E)(y), …, (f^{h^{m−1}} mod E)(y))
Then extract: apply an extractor for min-entropy rate .99, arbitrary ε:
– Zuckerman's extractor for constant min-entropy rate [Z96].

32
Variations on the Condenser
Thm: ∃ an explicit k →_ε k+d condenser with seed length d = O(log n + log(1/ε)) and output length m ≈ 1.01k.
Con(f, y) = (y, f(y), (f^h mod E)(y), …, (f^{h^{m−1}} mod E)(y))
Variations (lose a constant fraction of the min-entropy):
– "Repeated roots" [GS99] in the analysis:
  seed length d = log n + log(1/ε) + O(1)
  output length m = O(k · log(n/ε))

33
Variations on the Condenser
Thm: ∃ an explicit k →_ε k+d condenser with seed length d = O(log n + log(1/ε)) and output length m ≈ 1.01k.
Con(f, y) = (y, f(y), (f^h mod E)(y), …, (f^{h^{m−1}} mod E)(y))
Variations (lose a constant fraction of the min-entropy):
– E(Y) = Y^{q−1} − γ for a primitive root γ [GR06]
  ⇒ (f^{h^i} mod E)(y) = f(γ^i y)
  ⇒ a univariate analogue of the Shaltiel–Umans extractor [SU01].
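The algebraic fact behind this variation (assuming the parameter choice h = q, which is what makes the identity hold) is the Frobenius identity f(Y)^q ≡ f(γY) (mod Y^{q−1} − γ). A toy verification over F_5 with γ = 2:

```python
q, gamma = 5, 2   # gamma = 2 is a primitive root mod 5; E(Y) = Y^(q-1) - gamma

def poly_mulmod(a, b):
    """Multiply coefficient lists (low-order first) mod E and mod q."""
    prod = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            prod[i + j] = (prod[i + j] + ai * bj) % q
    while len(prod) > q - 1:          # use Y^(q-1) == gamma  (mod E)
        c = prod.pop()
        d = len(prod) - (q - 1)       # Y^deg == gamma * Y^(deg-(q-1))
        prod[d] = (prod[d] + c * gamma) % q
    return prod

f = [1, 1]                            # f(Y) = 1 + Y
fq = [1]
for _ in range(q):
    fq = poly_mulmod(fq, f)           # f^q mod E, by repeated multiplication

assert fq == [1, 2, 0, 0]             # equals f(gamma * Y) = 1 + 2Y
```

The identity follows from f(Y)^q = f(Y^q) over F_q together with Y^q ≡ γY (mod E).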

34
Outline Expander Construction Application to Extractors Connections Conclusions

35
Comparing List-Decoding Views
Γ : {0,1}^n × {0,1}^d → {0,1}^d × {0,1}^m, T ⊆ {0,1}^d × {0,1}^m, N = 2^n, D = 2^d, …
– Expanders: Γ(x,y) = y'th neighbor of x; x ∈ LIST(T) iff ∀y Γ(x,y) ∈ T; decoding problem: |T| < AK ⇒ |LIST(T)| < K.
– List-decodable codes: Γ(x,y) = (y, ECC(x)_y); x ∈ LIST(T) iff Pr_y[Γ(x,y) ∈ T] ≥ 1/2^m + ε; decoding problem: T = {(y, r_y)} ⇒ |LIST(T)| < K.
– Extractors: x ∈ LIST(T) iff Pr_y[Γ(x,y) ∈ T] ≥ |T|/2^{m+d} + ε; decoding problem: ∀T, |LIST(T)| < K.
– Lossy condensers: x ∈ LIST(T) iff Pr_y[Γ(x,y) ∈ T] ≥ |T|/2^{m+d} + ε; decoding problem: |T| ≤ K′−1 ⇒ |LIST(T)| ≤ K.

36
Outline Expander Construction Application to Extractors Connections Conclusions

37
Conclusions
The list-decoding view ⇒ best known constructions of:
– highly unbalanced expanders
– lossless condensers
– randomness extractors
Push it further?
– nonbipartite expanders
– a direct construction of an extractor
– extractors optimal up to additive constants
– better list-decodable codes
