1
List decoding and pseudorandom constructions: lossless expanders and extractors from Parvaresh-Vardy codes
Venkatesan Guruswami, Carnegie Mellon University
--- CMI Pseudorandomness Workshop, Aug 23, 2011 ---

2
Connections in Pseudorandomness (diagram): Randomness Extractors, Expander Graphs, Error-Correcting Codes, and Pseudorandom Generators, connected by results including [GW94, WZ95, TUZ01, RVW00, CRVW02], [STV99, SU01, Uma02], [Tre99, TZ01, TZS01, SU01], algebraic list decoding [SS96, Spi96, GI02, GI03, GR06, GUV07], and [Tre99, RRV99, ISW99, SU01, Uma02]; also Euclidean sections / compressed sensing [GLR08, GLW08] and expander codes.

3
Connections in Pseudorandomness (diagram, repeated with the codes node specialized to List-Decodable Error-Correcting Codes): Randomness Extractors, Expander Graphs, List-Decodable Error-Correcting Codes, Pseudorandom Generators, with edges labeled [GW94, WZ95, TUZ01, RVW00, CRVW02], [STV99, SU01, U02], [Tre99, TZ01, TZS01, SU01], [PV05, GR06], [GI02, GI03], [Tre99, RRV99, ISW99, SU01, U02]; the edges from list-decodable codes to expanders and extractors are marked "this talk".

4
List-Decodable Codes
Code C ⊆ Σ^D with N codewords, alphabet Σ of size |Σ| = Q.
(e, L)-list-decodable: every Hamming ball of radius e contains at most L codewords of C.
– Combinatorial packing condition
– Balls of radius e around codewords cover each point at most L times
– List error correction of e errors with worst-case list size L
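
The packing condition can be checked by brute force on a toy code; the code and parameters below are our illustrative choices, not from the talk.

```python
from itertools import product

def hamming(u, v):
    """Hamming distance between two equal-length tuples."""
    return sum(a != b for a, b in zip(u, v))

def is_list_decodable(code, e, L, Q, D):
    """Packing condition: every radius-e Hamming ball contains <= L codewords."""
    return all(sum(hamming(center, c) <= e for c in code) <= L
               for center in product(range(Q), repeat=D))

# Toy code: the binary repetition code of length 5 (N = 2, Q = 2, D = 5).
code = [(0,) * 5, (1,) * 5]
print(is_list_decodable(code, 2, 1, 2, 5))   # unique decoding from 2 errors
print(is_list_decodable(code, 3, 1, 2, 5))   # at radius 3 some balls hold both words
```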

5
List Decoding Centric View of Pseudorandom Objects

6
List decoding, in different notation
Encoding function E : [N] → [Q]^D.
View as a map (bipartite graph) Γ : [N] × [D] → [D] × [Q], Γ(x, y) = (y, E(x)_y).
List decoding property: for all r ∈ [Q]^D, if T = { (y, r_y) : y ∈ [D] } then |LIST(T)| ≤ L, where we define
LIST(T) = { x : Γ(x, y) ∈ T for at least D − e values of y }.
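
A tiny sketch of this graph view (our toy parameters, not from the talk): messages are pairs (a, b) over Z_5 encoded by evaluating a + b·y at three points, and LIST(T) is computed exactly as defined above.

```python
Q, D = 5, 3   # alphabet Z_5, three evaluation points

def E(x):
    """Toy encoding [N] -> [Q]^D: x encodes (a, b), codeword = evals of a + b*y."""
    a, b = divmod(x, Q)
    return tuple((a + b * y) % Q for y in range(D))

def Gamma(x, y):
    """Gamma(x, y) = (y, E(x)_y): the bipartite-graph view of the code."""
    return (y, E(x)[y])

def LIST(T, e):
    """{ x : Gamma(x, y) in T for at least D - e values of y }."""
    return [x for x in range(Q * Q)
            if sum(Gamma(x, y) in T for y in range(D)) >= D - e]

# Received word r = (1, 2, 3) gives T = {(y, r_y)}; list-decode from e = 1 error.
r = (1, 2, 3)
T = {(y, r[y]) for y in range(D)}
print(LIST(T, 1))   # messages agreeing with r in >= 2 of the 3 positions
```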

7
Bipartite expanders
Γ : [N] × [D] → [M]; "(K, A) expander": |Γ(S)| ≥ A·|S| for all S ⊆ [N] with |S| ≤ K (vertex expansion; A = expansion factor).
Equivalently: for all K' ≤ K and T ⊆ [M] with |T| < AK', |LIST(T)| < K', where
LIST(T) = { x ∈ [N] : Γ(x, y) ∈ T for all y ∈ [D] }.
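
The two definitions (vertex expansion vs. the LIST condition) can be checked against each other by brute force on a toy graph; all names and parameters below are ours, for illustration only.

```python
import random
from itertools import combinations

def nbrs(Gamma, x, D):
    """Right-neighborhood of left vertex x."""
    return {Gamma[(x, y)] for y in range(D)}

def is_expander(Gamma, N, D, K, A):
    """Vertex expansion: |Gamma(S)| >= A*|S| for every S with |S| <= K."""
    return all(len(set().union(*(nbrs(Gamma, x, D) for x in S))) >= A * k
               for k in range(1, K + 1)
               for S in combinations(range(N), k))

def is_expander_list_form(Gamma, N, D, M, K, A):
    """LIST form: for all K' <= K and T with |T| < A*K', |LIST(T)| < K'."""
    for Kp in range(1, K + 1):
        for t in range(A * Kp):                 # all sizes |T| < A*K'
            for T in combinations(range(M), t):
                T = set(T)
                if sum(nbrs(Gamma, x, D) <= T for x in range(N)) >= Kp:
                    return False
    return True

random.seed(1)
N, D, M, K, A = 6, 3, 9, 2, 2
Gamma = {(x, y): random.randrange(M) for x in range(N) for y in range(D)}
# The two definitions are equivalent, so they agree on any graph.
print(is_expander(Gamma, N, D, K, A) == is_expander_list_form(Gamma, N, D, M, K, A))
```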

8
Extractors
Γ : [N] × [D] → [M] (N = 2^n, D = 2^d, M = 2^m) is a (k, ε)-extractor if for all T ⊆ [M], |LIST(T)| < 2^k, where
LIST(T) = { x ∈ [N] : Pr_y[ Γ(x, y) ∈ T ] ≥ |T|/M + ε }.
Diagram: Ext takes an unknown source of length n with k bits of "min-entropy", plus d random bits (the "seed"), and outputs m almost-uniform bits; would like m ≈ k.

9
Condensers (weaker objects en route to extractors)
Output is not close to uniform, but is close to a source with good min-entropy.
– Ideally k' ≈ k (don't lose entropy) and m ≈ k (good entropy "rate")
Can also be captured by a list-decoding-type definition:
– LIST(T) small for all small subsets T ⊆ [M], where LIST(T) = { x : Pr_y[ Γ(x, y) ∈ T ] ≥ ε }
Diagram: Cond maps a k-source of length n, with a d-bit random seed, to a ≈ k'-source of length m.

10
The common framework
Definitions of various useful objects Γ : [N] × [D] → [M] are captured as: "for all subsets T ⊆ [M] that obey a certain property, a suitably defined list decoding of T, LIST(T), has small size."
– List-decodable codes: T arising from received words
– Expanders, condensers: T of small size (also the case for "list-recoverable codes")
– Extractors: arbitrary T
The framework gives not just unified abstractions, but also a proof method that leads to the best constructions and analyses.

11
Parameters of interest
Map Γ : [N] × [D] → [M]. What we care about varies by object:
– Extractors: small seed length D (= poly(log N)); large output length M
– Codes: small alphabet size M, small D (= O(log N)); small |LIST(T)|, plus an efficient algorithm to recover LIST(T)
Tight analysis of the size of LIST(T):
– the exact value is not too crucial for codes;
– for lossless expanders it is crucial (a factor-2 worse bound implies factor-2 worse expansion).

12
The abstraction in action
– Unbalanced expanders
– Expander construction from Parvaresh-Vardy codes
– View as condensers, and application to extractors
– Conclusions

13
Unbalanced Expander Graphs
Γ : [N] × [D] → [M]; "(K, A) expander": |Γ(S)| ≥ A·|S| for all S with |S| ≤ K (vertex expansion).
Goals: minimize D; maximize A (lossless expansion: A close to D); minimize M (not much larger than O(KD)).

14
Expanders have many uses…
Fault-tolerant networks (e.g., [Pin73, Chu78, GG81])
Sorting in parallel [AKS83]
Derandomization [AKS87, IZ89, INW94, IW97, Rei05, …]
PCP theorem [Din06]
Randomness extractors [CW89, GW94, TUZ01, RVW00, GUV07]
Error-correcting codes [SS96, Spi96, LMSS01, GI01-04]
Distributed routing in networks [PU89, ALM96, BFU99]
Data structures [BMRV00]
Hard tautologies in proof complexity [BW99, ABRW00, AR01]
Pseudorandom matrices, almost-Euclidean sections of ℓ_1^N [GLR'08, GLW'08]
…
Need explicit constructions (deterministic, time poly(log N)).

15
(Bipartite) Expander Graphs
Γ : [N] × [D] → [M]; "(K, A) expander": |Γ(S)| ≥ A·|S| for all S, |S| ≤ K.
Goals: minimize D; maximize A; minimize M.
Optimal (non-constructive): D = O(log(N/M)/ε), A = (1−ε)·D, M = O(KD/ε).

16
Explicit Constructions (ε an arbitrary positive constant)

Construction | degree D | expansion A | right side M
Optimal | O(log(N/M)/ε) | (1−ε)·D | O(KD/ε)
Ramanujan graphs | O(1) | ≈ D/2 | N
Zig-zag [CRVW02] | O(1) | (1−ε)·D | N
Ta-Shma, Umans, Zuckerman [TUZ01] | polylog(N) | (1−ε)·D | exp(poly(log KD))
Ta-Shma, Umans, Zuckerman [TUZ01] | exp(poly(log log N)) | (1−ε)·D | poly(KD)
G., Umans, Vadhan | polylog(N) | (1−ε)·D | poly(KD)

17
Utility of Expansion
|Γ(S)| ≥ (1−ε)·D·|S| ⇒ at least (1−2ε)·D·|S| elements of Γ(S) are unique neighbors: they touch exactly one edge from S.
– Set membership in the bit-probe model [BMRV'00]
– Fault tolerance: even if an adversary removes, say, 3/4 of the edges at each vertex, lossless expansion is maintained (with 4ε in place of ε)
– Useful in expander codes [SS'96]

18
The Result
Theorem [GUV]: For all N, K, ε > 0, there is an explicit (K, A) expander with
degree D = poly(log N, 1/ε)
expansion A = (1−ε)·D
#right vertices M = D²·K^{1.01}

19
Parvaresh-Vardy codes
Variant of Reed-Solomon codes. Parameters of the construction: n, F_q, m, h, and an irreducible polynomial E(Y) of degree n over F_q.
Encoding: given a message f ∈ F_q^n, i.e., a polynomial f(Y) ∈ F_q[Y] of degree ≤ n−1:
PV(f)_y = (f_0(y), f_1(y), …, f_{m−1}(y)) for y ∈ F_q, where f_i(Y) = f(Y)^{h^i} mod E(Y).
Consider the bipartite expander whose neighborhood map is Γ(f, y) = (y, PV(f)_y).

20
Expander theorem
Left vertices = polynomials of degree ≤ n−1 over F_q (N = q^n). Degree D = q. Right vertices = F_q^{m+1} (M = q^{m+1}).
Γ(f, y) = y'th neighbor of f = (y, f(y), (f^h mod E)(y), (f^{h²} mod E)(y), …, (f^{h^{m−1}} mod E)(y)),
where E(Y) = an irreducible* polynomial of degree n over F_q, and h = a parameter.
Thm [GUV'07]: This is a (K, A) expander for K = h^m, A = q − hnm.
(* E can be found deterministically in poly(n, log q, char(F_q)) time.)
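
A sketch of this neighborhood map over a small prime field. The concrete values q = 11, n = m = h = 2 and E(Y) = Y² − 2 are our toy choices, not from the talk; with them K = h^m = 4 and A = q − hnm = 3. The code exercises only the map Γ itself, not the expansion proof.

```python
q, n, h, m = 11, 2, 2, 2   # toy parameters: K = h^m = 4, A = q - h*n*m = 3
E = [9, 0, 1]              # E(Y) = Y^2 + 9 = Y^2 - 2, irreducible over F_11
                           # (2 is a non-square mod 11)

def polymulmod(a, b):
    """Multiply coefficient lists (low degree first) in F_q[Y] / (E(Y))."""
    d = len(E) - 1
    res = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            res[i + j] = (res[i + j] + ai * bj) % q
    for i in range(len(res) - 1, d - 1, -1):   # reduce mod monic E
        c = res[i]
        if c:
            for j in range(d + 1):
                res[i - d + j] = (res[i - d + j] - c * E[j]) % q
    return (res + [0] * d)[:d]

def polypowmod(f, e):
    """f^e mod E(Y) by square-and-multiply."""
    r, base = [1] + [0] * (len(E) - 2), f[:]
    while e:
        if e & 1:
            r = polymulmod(r, base)
        base = polymulmod(base, base)
        e >>= 1
    return r

def ev(p, y):
    """Evaluate a coefficient list at y in F_q."""
    return sum(c * pow(y, i, q) for i, c in enumerate(p)) % q

def Gamma(f, y):
    """(y, f(y), (f^h mod E)(y), ..., (f^{h^{m-1}} mod E)(y))."""
    return (y,) + tuple(ev(polypowmod(f, h ** i), y) for i in range(m))

f = [3, 5]                 # message f(Y) = 3 + 5Y, degree <= n-1
print([Gamma(f, y) for y in range(3)])
```

Because the tuple always contains y and f(y), distinct messages get distinct neighborhoods, so each left vertex has exactly q = D neighbors.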

21
Close relation to list decoding
The proof of expansion is based on list decoding of Parvaresh-Vardy codes:
– need a tight analysis of the list size
– for the "list recovery" version
(Diagram: sets S_1, S_2, …, S_q of possible values for positions y_1, y_2, …, y_q.)

22
Recall: list decoding view
For T ⊆ [M], define LIST(T) = { x ∈ [N] : Γ(x) ⊆ T }.
Lemma: G is a (=K, A) expander (i.e., |Γ(S)| ≥ A·K for all S with |S| = K) if and only if for all T ⊆ [M] of size AK−1, we have |LIST(T)| ≤ K−1.

23
Expansion analysis
Γ(f, y) = (y, f(y), (f^h mod E)(y), …, (f^{h^{m−1}} mod E)(y));
f = polynomial of degree ≤ n−1, y ∈ F_q, E = irreducible of degree n.
Theorem: For A = q − nmh and any K ≤ h^m: T ⊆ F_q^{m+1} of size ≤ AK−1 ⇒ |LIST(T)| ≤ K−1.
Proof outline, following [S97, GS99, PV05]:
1. Find a nonzero low-degree multivariate polynomial Q vanishing on T.
2. Show that every f ∈ LIST(T) is a root of a related univariate polynomial Q*.
3. Show that Q* is nonzero and deg(Q*) ≤ K−1.

24
Proof of Expansion: Step 1
Thm: For A = q − nmh, K = h^m: |T| ≤ AK−1 ⇒ |LIST(T)| ≤ K−1.
Step 1: Find a low-degree polynomial Q vanishing on T ⊆ F_q^{m+1}.
Take Q(Y, Z_1, …, Z_m) of degree ≤ A−1 in Y and degree ≤ h−1 in each Z_i.
# coefficients = A·h^m = AK > |T| = # homogeneous linear constraints, so a nonzero solution exists.
WLOG E(Y) does not divide Q(Y, Z_1, …, Z_m).
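
A toy instance of the counting argument (parameters ours, for illustration): with m = 1, deg_Y ≤ A−1 = 1 and deg_Z ≤ h−1 = 1, Q has A·h = 4 coefficients but only |T| = 3 homogeneous constraints, so a nonzero Q must exist; brute force over F_3 finds one.

```python
from itertools import product

q, A, h = 3, 2, 2                      # toy parameters, m = 1
T = [(0, 1), (1, 2), (2, 2)]           # points (y, z) where Q must vanish

def Qval(c, y, z):
    """Q(Y, Z) = c0 + c1*Y + c2*Z + c3*Y*Z evaluated at (y, z) over F_q."""
    return (c[0] + c[1] * y + c[2] * z + c[3] * y * z) % q

# 4 unknowns, 3 homogeneous linear constraints over a field:
# the nullspace is nontrivial, so this search always succeeds.
Q = next(c for c in product(range(q), repeat=4)
         if any(c) and all(Qval(c, y, z) == 0 for y, z in T))
print(Q)
```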

25
Proof of Expansion: Step 2
Γ(f, y) = (y, f(y), (f^h mod E)(y), …, (f^{h^{m−1}} mod E)(y))
Step 1 gave: Q(Y, Z_1, …, Z_m) vanishing on T, of degree ≤ A−1 in Y and ≤ h−1 in each Z_i, with E ∤ Q.
Step 2: Every f ∈ LIST(T) is a "root" of a related Q*.
f ∈ LIST(T)
⇒ for all y ∈ F_q: Q(y, f(y), (f^h mod E)(y), …, (f^{h^{m−1}} mod E)(y)) = 0
⇒ Q(Y, f(Y), (f^h mod E)(Y), …, (f^{h^{m−1}} mod E)(Y)) ≡ 0
   [its degree ≤ A−1+nmh < q ≤ # roots]
⇒ Q(Y, f(Y), f(Y)^h, …, f(Y)^{h^{m−1}}) ≡ 0 (mod E(Y))
⇒ Q*(f) = 0 in the extension field U = F_q[Y]/(E(Y)), where Q* ∈ U[Z] is given by
   Q*(Z) = Q(Y, Z, Z^h, …, Z^{h^{m−1}}) mod E(Y).

26
Proof of Expansion: Step 3
Step 2 gave: for all f ∈ LIST(T), Q*(f) = 0, where Q*(Z) = Q(Y, Z, Z^h, …, Z^{h^{m−1}}) mod E(Y).
Step 3: Show that Q* is nonzero and deg(Q*) ≤ K−1.
Q*(Z) is nonzero because:
– Q(Y, Z_1, …, Z_m) mod E(Y) is nonzero, and
– Q has degree ≤ h−1 in each Z_i, so distinct monomials get mapped to distinct powers of Z when we set Z_i = Z^{h^i}.
deg(Q*) ≤ (h−1) + (h−1)·h + … + (h−1)·h^{m−1} = h^m − 1 = K−1.
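
The two facts used here are base-h arithmetic: substituting Z_i = Z^{h^{i-1}} sends the monomial Z_1^{e_1}…Z_m^{e_m} (each e_i ≤ h−1) to Z raised to the base-h number with digits e_i, which is injective, and the largest such exponent is h^m − 1. A quick check (parameters ours):

```python
from itertools import product

# Distinct monomials map to distinct powers of Z, and deg(Q*) <= h^m - 1 = K - 1.
for h, m in [(2, 3), (3, 2), (5, 4)]:
    exps = {sum(e * h ** i for i, e in enumerate(es))
            for es in product(range(h), repeat=m)}
    assert exps == set(range(h ** m))                      # injective, onto 0..h^m-1
    assert sum((h - 1) * h ** i for i in range(m)) == h ** m - 1
print("degree bound checks pass")
```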

27
Proof of Expansion: Wrap-Up
Γ(f, y) = (y, f(y), (f^h mod E)(y), …, (f^{h^{m−1}} mod E)(y)); LIST(T) = { x ∈ [N] : Γ(x) ⊆ T }.
Theorem: For A = q − nmh, K = h^m: |T| ≤ AK−1 ⇒ |LIST(T)| ≤ K−1.
There is a nonzero polynomial Q* over U = F_q[Y]/(E(Y)) with deg(Q*) ≤ K−1 such that every f ∈ LIST(T) satisfies Q*(f) = 0. Hence |LIST(T)| ≤ deg(Q*) ≤ K−1. ∎

28
Parameter Choices
Left side = F_q^n (N = q^n), degree D = q, right side = F_q^{m+1}.
We have a (K, A) expander with K = h^m, A = q − nmh.
– To make A ≥ (1−ε)·D, pick q ≥ nmh/ε.
– To make M ≈ KD, need q^{m+1} ≈ q·h^m, so take q ≈ h^{1+α}.
Set h ≈ (nm/ε)^{1/α} and q ≈ h^{1+α}. Then:
A = q − nmh ≥ (1−ε)·q = (1−ε)·D
M = q^{m+1} ≈ q·h^{(1+α)m} ≈ D·K^{1+α}
D = (nm/ε)^{1+1/α} ≈ ((log N)(log K)/ε)^{1+1/α}
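
The parameter choices can be sanity-checked numerically. The concrete numbers below are illustrative (α = 1, with some slack in h so the comparison avoids boundary cases):

```python
eps = 0.1
n, m = 100, 10               # degree bound of message polynomials, folding parameter

h = 2 * int(n * m / eps)     # h >= nm/eps with slack; h ~ (nm/eps)^{1/alpha}
q = h ** 2                   # q ~ h^{1+alpha} with alpha = 1; left degree D = q
D, K = q, h ** m             # K = h^m
A = q - n * m * h            # expansion from the PV expander theorem

assert A >= (1 - eps) * D            # A = (1-eps')*D for some eps' <= eps
assert q ** (m + 1) == D * K ** 2    # M = q^{m+1} = D*K^{1+alpha} exactly, alpha = 1
print("D =", D, " A =", A, " M = D*K^2")
```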

29
Our Expander Result
Thm: For every N, K, ε > 0 (and any constant α > 0), there is an explicit (K, A) expander with
degree D = O((log N)·(log K)/ε)^{1+1/α}
expansion A = (1−ε)·D
#right vertices M = (D·K)^{1+α}

30
Outline
– Unbalanced expanders
– Expander construction from Parvaresh-Vardy codes
– View as condensers, and application to extractors
– Conclusions

31
Extractors [NZ’93] Goal: Output -close to uniform on {0,1} m (for large m and small d) Optimal (nonconstructive): d = log n + 2 log(1/ ) + O(1) m = (k+d) - 2 log(1/ ) - O(1) d random bits “seed” E XT Uniform sample from unknown subset X {0,1} n of size 2 k m almost-uniform bits

32
Extractors: Original Motivation
Randomization is pervasive in CS
– Algorithm design, cryptography, distributed computing, …
Typically assume a perfect random source
– Unbiased, independent random bits
– Unrealistic?
Can we use a "weak" random source?
– Source of biased & correlated bits
– More realistic model of physical sources
(Randomness) Extractors: convert a weak random source into an almost-perfect random source.
Dozens of constructions over 15+ years.

33
Extractors: many "extraneous" uses…
Derandomization of (poly-time/log-space) algorithms [Sip88, NZ93, INW94, GZ97, RR99, MV99, STV99, GW02]
Distributed & network algorithms [WZ95, Zuc97, RZ98, Ind02]
Hardness of approximation [Zuc93, Uma99, MU01, Zuc06]
Data structures [Ta02]
Cryptography [BBR85, HILL89, CDHKS00, Lu02, DRS04, NV04]
List decodable codes [TZ01, Gur04]
Metric embeddings [Ind06]
Compressed sensing [Ind07]

34
[GUV] Result on Extractors
Thm: For every n, k, ε > 0, there is an explicit (k, ε) extractor with seed length d = O(log n + log(1/ε)) and output length m = 0.99k.
Previously achieved by [LRVW03]:
– only worked for ε ≥ 1/n^{o(1)}
– complicated recursive construction
Optimal up to constant factors.

35
Expanders & Lossless Condensers
Lemma [TUZ01]: Γ : {0,1}^n × {0,1}^d → {0,1}^m is a lossless ((n, k) → (m, k+d)) condenser if the graph is a (2^k, (1−ε)·2^d) expander.
Proof: expansion ⇒ Γ can be made 1-to-1 by moving an ε fraction of the edges.
Diagram: Cond maps x, an n-bit source with entropy k, together with a d-bit seed y, to Γ(x, y), an m ≈ 1.01k-bit source with entropy ≈ k+d; the 2^k left vertices expand to ≥ (1−ε)·2^d·2^k right vertices.

36
Extractor
Using the PV code, we have compressed the n-bit source to 1.01k bits while retaining all the entropy (using an O(log n)-bit seed):
– Cond(f, y) = (y, f(y), (f^h mod E)(y), …, (f^{h^{m−1}} mod E)(y))
Now extract 0.99k bits from the 1.01k-bit source with entropy k:
– an easier, specialized task (due to the high entropy percentage)
– good constructions already known
For constant error ε, we can use a simple random-walk-based extractor; compose it with our condenser to get the final extractor.

37
Extractor for high min-entropy
Extractor for min-entropy rate 99% that extracts 99% of the input min-entropy with constant error ε:
Ext(x, y) = y'th vertex on the expander walk specified by x
(the n-bit source specifies a walk of length n/c on a degree-2^c expander on 2^{(1−δ)n} nodes).
Extraction follows from the Chernoff bound for expander walks [Gil98].
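
A minimal sketch of Ext(x, y) = y'th vertex on the walk specified by x. The underlying graph here is our own toy choice, the standard 3-regular expander on Z_p with edges x → x+1, x−1, x^{-1}; the slide's parameters (degree 2^c, 2^{(1−δ)n} nodes) are not modeled.

```python
p = 101   # toy expander on Z_p: edges v -> v+1, v-1, v^{-1} (with 0 -> 0)

def step(v, b):
    """Follow edge b in {0, 1, 2} out of vertex v."""
    if b == 0:
        return (v + 1) % p
    if b == 1:
        return (v - 1) % p
    return pow(v, p - 2, p) if v else 0    # modular-inverse edge (Fermat)

def walk(x_steps, start=1):
    """Vertices visited by the walk whose steps are the source symbols."""
    vs = [start]
    for b in x_steps:
        vs.append(step(vs[-1], b))
    return vs

def ext(x_steps, y):
    """Ext(x, y) = y'th vertex on the walk specified by x."""
    return walk(x_steps)[y]

sample = [0, 2, 1, 1, 0, 2, 2, 0]   # source symbols encoding the walk
print(ext(sample, 3))
```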

38
Variation on the Condenser
Cond(f, y) = (y, f(y), (f^h mod E)(y), …, (f^{h^{m−1}} mod E)(y))
Use E(Y) = Y^{q−1} − γ, for γ a generator of F_q* [G.-Rudra'06]. Since f(Y)^q = f(Y^q) ≡ f(γY) mod E(Y), we get (f^{q^i} mod E)(y) = f(γ^i y), so (taking h = q):
Cond(f, y) = (y, f(y), f(γy), f(γ²y), …, f(γ^{m−1}y))
This is a condenser from the Folded Reed-Solomon code [GR06]:
– loses a small constant fraction of the min-entropy (okay for the extractor application)
– a univariate analogue of the Shaltiel-Umans extractor
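
The key identity can be checked directly over a toy field (q = 7 and γ = 3 are our choices; 3 generates F_7*): since f(Y)^q = f(Y^q) over F_q and Y^q ≡ γ·Y mod (Y^{q−1} − γ), the polynomial f^q mod E has exactly the coefficients of f(γY).

```python
q, g = 7, 3   # g = 3 generates F_7^* (powers: 3, 2, 6, 4, 5, 1)

def polymulmod(a, b):
    """Multiply in F_q[Y] and reduce mod E(Y) = Y^{q-1} - g, i.e. Y^{q-1} = g."""
    res = [0] * (q - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            k, c = i + j, ai * bj
            while k >= q - 1:      # fold exponents using Y^{q-1} = g
                k -= q - 1
                c *= g
            res[k] = (res[k] + c) % q
    return res

def polypow(f, e):
    """f^e mod E(Y) by repeated multiplication."""
    r = [1] + [0] * (q - 2)
    for _ in range(e):
        r = polymulmod(r, f)
    return r

f = [2, 5, 0, 1, 3, 6]            # f of degree <= q-2, coefficients low-first
fq = polypow(f, q)                # f^q mod E
fg = [(c * pow(g, i, q)) % q for i, c in enumerate(f)]   # coefficients of f(g*Y)
print(fq == fg)
```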

39
Conclusions
The list decoding view + an algebraic code construction ⇒ the best known constructions of:
– highly unbalanced expanders
– lossless condensers
– randomness extractors
Future directions:
– Constant-degree lossless expanders (alternative to zig-zag); non-bipartite expanders?
– Direct construction of a simple, algebraic extractor
– Extractors with better (or even optimal) entropy loss? (It suffices to achieve this for entropy rate 0.999.)
– Other pseudorandom objects: multi-source extractors?
