
1. Linear-Degree Extractors and the Inapproximability of Max Clique and Chromatic Number
David Zuckerman, University of Texas at Austin

2. Max Clique and Chromatic Number
[FGLSS,…,Håstad]: Max Clique inapproximable to n^(1-ε), any ε > 0, assuming NP ≠ ZPP.
[LY,…,FK]: Same for Chromatic Number.
Can we assume just NP ≠ P?
Thm: Both inapproximable to n^(1-ε), any ε > 0, assuming NP ≠ P.
Thm (derandomized [Khot]): Both inapproximable to n/2^((log n)^(1-γ)), some γ > 0, assuming NQ ≠ Q (quasipolynomial-time classes).
Derandomization tool: disperser.

3. Outline
– Extractors and Dispersers
– Dispersers and Inapproximability
– Extractor/Disperser Construction: Additive Number Theory
– Conclusion

4. Weak Random Sources
Random element from a set A ⊆ {0,1}^n with |A| ≥ 2^k.

5. Weak Random Sources
Can arise in different ways:
– Physical source of randomness.
– Condition on some information:
   Cryptography: bounded storage model.
   Pseudorandom generators for space-bounded machines.
Convex combinations yield the more general model, a k-source: ∀x, Pr[X = x] ≤ 2^(-k).
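The k-source condition above is easy to check directly; here is a minimal Python sketch (the function names are illustrative, not from the talk):

```python
import math

def min_entropy(dist):
    """Min-entropy: H_min(X) = -log2(max_x Pr[X = x])."""
    return -math.log2(max(dist.values()))

def is_k_source(dist, k):
    """X is a k-source iff Pr[X = x] <= 2^-k for every x."""
    return min_entropy(dist) >= k

# Basic example: a flat distribution on a set A with |A| = 2^k.
flat = {x: 1 / 16 for x in range(16)}  # |A| = 16, so k = 4
print(min_entropy(flat))     # 4.0
print(is_k_source(flat, 4))  # True
```

A flat distribution on 2^k points is exactly a k-source, and convex combinations of flat sources recover the general definition.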

6. Weak Random Sources
Goal: algorithms that work for any k-source, without depending on knowledge of the k-source.
First attempt: convert weak randomness to good randomness.

7. Goal
Ext: very long weakly random input → long almost-random output.
Should work for all k-sources.
Problem: impossible.

8. Solution: Extractor [Nisan-Z]
Ext: very long weakly random input + short truly random seed → long almost-random output.

9. Extractor Parameters [NZ,…, Lu-Reingold-Vadhan-Wigderson]
Ext: n-bit k-source + d = O(log n) truly random bits → m = .99k almost-random bits.
Almost random in variation (statistical) distance.
Error ε = arbitrary constant > 0.

10. Graph-Theoretical View of Extractor
Bipartite graph: N = 2^n left vertices, M = 2^m right vertices, left-degree D = 2^d.
A k-source corresponds to K = 2^k left vertices x, each with neighbors E(x,y_1), E(x,y_2), E(x,y_3), …
Extractor: the output of any k-source is ε-uniform on the right side.
Disperser: the output hits at least (1-ε)M right vertices.
Think of K = N^Ω(1). Goal: D = O(log N).

11. Applications of Extractors
– PRGs for space-bounded computation [Nisan-Z]
– PRGs for random sampling [Z]
– Cryptography [Lu, Vadhan, CDHKS, Dodis-Smith]
– Expander graphs and superconcentrators [Wigderson-Z]
– Coding theory [Ta-Shma-Z]
– Hardness of approximation [Z, Umans, Mossel-Umans]
– Efficient deterministic sorting [Pippenger]
– Time-space tradeoffs [Sipser]
– Data structures [Fiat-Naor, Z, BMRV, Ta-Shma]

12. Extractor Degree
In many applications, the left-degree D is the relevant quantity:
– Random sampling: D = number of samples.
– Extractor codes: D = length of code.
– Inapproximability of Max Clique: size of graph = (large-case clique size) · D (scaled down).

13. Extractor/Disperser Constructions
n = lg N = input length.
Previous typical good extractor: D = n^O(1).
[Ta-Shma-Z-Safra]:
– D = O(n · log* n), but M = K^o(1).
– For K = N^Ω(1): D = n · polylog(n), M = K^Ω(1).
New extractor: for K = N^Ω(1), D = O(n) and M = K^.99.
New disperser: same, even D = O(n / log s^(-1)), where s = 1-ε = fraction hit on right side.

14. Extractor Parameters [Nisan-Z,…, Lu-Reingold-Vadhan-Wigderson]
Ext: n-bit k-source (k = lg K) + O(log n)-bit random seed → .99k almost-random bits.
Almost random in variation (statistical) distance; error ε = arbitrary constant > 0.
[TZS]: For k = Ω(n), lg n + O(log log n) bit seed.
New theorem: For k = Ω(n), lg n + O(1) bit seed.

15. Dispersers and Inapproximability
Max Clique: can amplify success probability of PCP verifier using an appropriate disperser [Z].
Chromatic Number: derandomize the Feige-Kilian reduction.
– [FK]: randomized graph products [BS].
– We use derandomized graph powering; the derandomized graph products of [Alon-Feige-Wigderson-Z] are too weak.

16. Fractional Chromatic Number
Chromatic number: χ(G) ≥ N/α(G), where α(G) = independence number.
Fractional chromatic number χ_f(G): χ(G)/log N ≤ χ_f(G) ≤ χ(G), and χ_f(G) ≥ N/α(G).

17. Overview of Feige-Kilian Reduction
Poly-time reduction from NP-complete L to Gap-Chromatic Number:
– x ∈ L ⇒ χ_f(G) ≤ b/c,
– x ∉ L ⇒ χ_f(G) > b.

18. Overview of Feige-Kilian Reduction
Poly-time reduction from NP-complete L to Gap-Chromatic Number:
– x ∈ L ⇒ χ_f(G) ≤ b/c,
– x ∉ L ⇒ χ_f(G) > b.
Amplify: G → G^D, the OR product:
– (v_1,…,v_D) ~ (w_1,…,w_D) iff ∃i, v_i ~ w_i.
– α(G^D) = α(G)^D.
– χ_f(G^D) = χ_f(G)^D.
– Gap: c → c^D.
Graph too large: take a random subset of V^D.
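The OR-product amplification above can be checked by brute force on a tiny graph; a sketch (illustrative code, not the paper's construction):

```python
from itertools import combinations, product

def or_power(edges, n, D):
    """OR (co-normal) graph product: two D-tuples are adjacent iff
    SOME coordinate pair is adjacent in the base graph."""
    E = {frozenset(e) for e in edges}
    verts = list(product(range(n), repeat=D))
    def adj(u, w):
        return any(frozenset((a, b)) in E for a, b in zip(u, w))
    return verts, adj

def independence_number(verts, adj):
    """Brute-force alpha(G); fine for tiny graphs only."""
    for size in range(len(verts), 0, -1):
        for S in combinations(verts, size):
            if all(not adj(u, w) for u, w in combinations(S, 2)):
                return size
    return 0

# Path on 3 vertices: alpha = 2, so alpha(G^2) should be 2^2 = 4.
edges, n = [(0, 1), (1, 2)], 3
verts, adj = or_power(edges, n, 2)
print(independence_number(verts, adj))  # 4
```

An independent set I in G lifts to the independent set I^D in G^D, which is where the multiplicativity α(G^D) = α(G)^D comes from.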

19. Disperser picks the subset V′ of V^D deterministically
Each left vertex x ∈ V′ corresponds to the tuple (x_1, x_2, …, x_D) of its disperser neighbors; similarly y ↦ (y_1, y_2, …, y_D).
Strong disperser: ∀A with |A| ≥ K, ∃i: |Γ_i(A)| ≥ sM.

20. α(G) < sM ⇒ α(G′) < K
If A is independent in G′ with |A| ≥ K, then ∃i: |Γ_i(A)| ≥ sM (strong disperser).
But each Γ_i(A) is independent in G, since G^D is the OR graph product — contradiction when α(G) < sM.

21. Properties of Derandomized Powering
If α(G) < s|V|, then α(G′) < K.
χ_f(G′) ≤ χ_f(G^D) = χ_f(G)^D.

22. Properties of Derandomized Powering
If α(G) < s|V|, then α(G′) < K.
– If x ∉ L, then α(G′) < K ≤ N′^δ, so χ_f(G′) ≥ N′^(1-δ).
– Works since the disperser handles any entropy rate δ > 0.
χ_f(G′) ≤ χ_f(G^D) = χ_f(G)^D.
– If x ∈ L, then χ_f(G′) ≤ N′^δ.
– Uses D = O((log N) / log s^(-1)).

23. Extractor/Disperser Outline
Condense: weak source + O(1) random bits → entropy rate .9.
Extract: rate-.9 source + lg n + O(1) random bits → uniform output.

24. Extractor for Entropy Rate .9 (extension of [AKS])
G = 2^c-regular expander on {0,1}^m.
Weak-source input: a walk (v_1, v_2, …, v_D) in G — m + (D-1)c bits.
Random seed: i ∈ [D]. Output: v_i.
Proof: Chernoff bounds for random walks [Gil, Kah].
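The walk-based extractor above is mechanical to sketch in code. This is illustrative only: a real instantiation needs a genuine expander, while the stand-in graph below is just any 2^c-regular graph.

```python
def expander_walk_extractor(graph, c, D, source_bits, seed_i):
    """Sketch of the [AKS]-style extractor from the slide.

    graph: adjacency lists of a 2^c-regular graph on 2^m vertices
    (a true expander in the real construction).
    source_bits: m + (D-1)*c weak-source bits encoding a start vertex
    plus one c-bit edge label per step, i.e. a walk (v_1, ..., v_D).
    seed_i: the truly random seed, an index in {0, ..., D-1}.
    Returns the seed_i-th vertex of the walk (0-indexed).
    """
    m = (len(graph) - 1).bit_length()
    assert len(source_bits) == m + (D - 1) * c
    to_int = lambda bits: int("".join(map(str, bits)), 2) if bits else 0
    v = to_int(source_bits[:m])
    walk, pos = [v], m
    for _ in range(D - 1):
        v = graph[v][to_int(source_bits[pos:pos + c])]
        pos += c
        walk.append(v)
    return walk[seed_i]

# Stand-in "expander": complete graph with self-loops on 4 vertices
# (4-regular, so c = 2); D = 3 steps.
graph = [[0, 1, 2, 3]] * 4
print(expander_walk_extractor(graph, c=2, D=3,
                              source_bits=[1, 0, 1, 1, 0, 1], seed_i=1))  # 3
```

The seed costs only lg D bits; the Chernoff bound for expander walks is what makes a random vertex of the walk close to uniform.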

25. Condensing with O(1) bits
1. [BKSSW, Raz]: somewhere condenser — some choice of seed condenses.
   – Uses additive number theory [BKT, BIW].
   – 2-bit seed suffices to increase entropy rate.
   – New result: 1-bit seed suffices. Simpler.
2. [Raz]: convert to a condenser.

26. Condensing via Incidence Graph
Bipartite graph: lines vs. points of F_p^2; (L,P) is an edge iff P lies on L; ≈ N^(3/2) edges.
1-Bit Somewhere Condenser:
– Input: an edge (L,P).
– Output: a random endpoint.
Condenses rate δ to somewhere rate (1+α)δ, some α > 0:
– The distribution of (L,P) is a somewhere rate-(1+α)δ source.
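The 1-bit somewhere condenser above is literally "output a random endpoint of the edge". A sketch over F_p, where an edge is encoded as a line (a, c) plus the x-coordinate of a point on it (this encoding is my assumption, not from the talk):

```python
def one_bit_somewhere_condenser(a, c, x, p, seed_bit):
    """Input: an edge of the incidence graph over F_p, encoded as the
    line y = a*x + c together with the x-coordinate of a point on it.
    The 1-bit seed picks which endpoint (line or point) to output;
    the somewhere-condenser guarantee is that one of the two choices
    has higher entropy rate than the input."""
    line = (a, c)
    point = (x, (a * x + c) % p)
    return line if seed_bit == 0 else point

print(one_bit_somewhere_condenser(2, 3, 4, 7, 0))  # (2, 3): the line
print(one_bit_somewhere_condenser(2, 3, 4, 7, 1))  # (4, 4): the point on it
```

The analysis (next slides) shows that if neither endpoint gained entropy, the source would concentrate on a small set of lines and points with too many incidences.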

27. Somewhere r-source
(X,Y) is an elementary somewhere r-source if either X or Y is an r-source.
Somewhere r-source: a convex combination of elementary somewhere r-sources.

28. Incidence Theorem [BKT]
P, L sets of points and lines in F_p^2 with |P|, |L| ≤ M ≤ p^1.9.
Then the number of incidences I(P,L) = O(M^(3/2-α)), some α > 0 — few edges between P and L in the incidence graph.
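The incidence count I(P, L) in the theorem is easy to compute by brute force for small cases; a sketch (vertical lines are omitted for simplicity — an assumption of this sketch):

```python
def incidences(points, lines, p):
    """I(P, L): number of (point, line) pairs with the point on the line.
    A line is a pair (a, c), meaning y = a*x + c over F_p."""
    return sum((a * x + c) % p == y % p
               for (x, y) in points
               for (a, c) in lines)

# All p^2 points and all p^2 non-vertical lines of F_5^2:
p = 5
points = [(x, y) for x in range(p) for y in range(p)]
lines = [(a, c) for a in range(p) for c in range(p)]
print(incidences(points, lines, p))  # 125 = p^3: each line contains p points
```

The full incidence graph has about M^(3/2) edges (M = p^2 here); the theorem says that much smaller point/line sets cannot come close to that density.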

29. Simple Statistical Lemma
If distribution X is ε-far from every r-source, then ∃S, |S| < 2^r, with Pr[X ∈ S] ≥ ε.
Proof: take S = {x : Pr[X = x] > 2^(-r)}.
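The lemma and its one-line proof can be checked numerically. In this sketch the distance to the nearest r-source is taken to be the total probability mass above the 2^(-r) cap, which is valid when the domain is large enough to absorb the excess (an assumption here):

```python
def heavy_set(dist, r):
    """The proof's witness set S = {x : Pr[X = x] > 2^-r}."""
    return {x for x, pr in dist.items() if pr > 2 ** (-r)}

def distance_to_r_sources(dist, r):
    """Variation distance from X to the nearest r-source: the excess
    probability mass above the 2^-r cap (assuming a large domain)."""
    return sum(pr - 2 ** (-r) for pr in dist.values() if pr > 2 ** (-r))

# X uniform on 2 points, r = 2: eps = 2 * (1/2 - 1/4) = 1/2.
X = {0: 0.5, 1: 0.5}
S = heavy_set(X, 2)
eps = distance_to_r_sources(X, 2)
print(len(S) < 2 ** 2)              # True: |S| < 2^r
print(sum(X[x] for x in S) >= eps)  # True: Pr[X in S] >= eps
```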

30. Statistical Lemma for Condenser
Lemma: If (X,Y) is ε-far from every somewhere r-source, then ∃S ⊆ supp(X), T ⊆ supp(Y), |S|, |T| < 2^r, such that Pr[X ∈ S and Y ∈ T] ≥ ε.
Proof: S = {s : Pr[X = s] > 2^(-r)}, T = {t : Pr[Y = t] > 2^(-r)}.

31. Statistical Lemma for Condenser
Lemma: If (X,Y) is ε-far from every somewhere r-source, then ∃S ⊆ supp(X), T ⊆ supp(Y), |S|, |T| < 2^r, such that Pr[X ∈ S and Y ∈ T] ≥ ε.
Proof of Condenser: Suppose the output is ε-far from every somewhere r-source. Get the sets S and T; then I(S,T) ≥ ε·2^(r_in), where r_in = input min-entropy. Contradicts the Incidence Theorem.

32. Additive Number Theory
A = set of integers; A+A = set of pairwise sums.
Can have |A+A| < 2|A| if A is an arithmetic progression, e.g. {1,2,…,100}.
Similarly, can have |A·A| < 2|A|.
Can't have both simultaneously:
– [ES, Elekes]: max(|A+A|, |A·A|) ≥ |A|^(5/4).
– False in F_p: take A = F_p.

33. Additive Number Theory
A = set of integers; A+A = set of pairwise sums.
Can have |A+A| < 2|A| if A is an arithmetic progression, e.g. {1,2,…,100}.
Similarly, can have |A·A| < 2|A|.
Can't have both simultaneously:
– [ES, Elekes]: max(|A+A|, |A·A|) ≥ |A|^(5/4).
– [Bourgain-Katz-Tao, Konyagin]: similar bound |A|^(1+α) over prime fields F_p, provided |A| is not too close to p.
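The sum-set/product-set dichotomy above is easy to see experimentally (a quick sketch, not a proof):

```python
def sumset(A):
    """A + A: all pairwise sums."""
    return {a + b for a in A for b in A}

def productset(A):
    """A * A: all pairwise products."""
    return {a * b for a in A for b in A}

A = set(range(1, 101))            # arithmetic progression
G = {2 ** i for i in range(100)}  # geometric progression

print(len(sumset(A)))      # 199 = 2|A| - 1: sums are few
print(len(productset(G)))  # 199 = 2|A| - 1: products are few
# But never both small at once [ES, Elekes]: max >= |A|^(5/4)
print(max(len(sumset(A)), len(productset(A))) >= len(A) ** 1.25)  # True
```

An arithmetic progression compresses sums, a geometric progression compresses products, and the sum-product theorem says no set can do both.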

34. Independent Sources
Corollary: if |A| ≤ p^.9, then |A·A + A| ≥ |A|^(1+α).
Can we get a statistical version of the corollary?
– If A, B, C are independent k-sources, k ≤ .9n, is A·B + C close to a k′-source, k′ = (1+α)k? (n = log p)
– [Z]: yes, under the Generalized Paley Graph conjecture.
– [Barak-Impagliazzo-Wigderson] proved the statistical version.

35. Simplifying and Slight Strengthening
Strengthening: assume (A,C) is a 2k-source and B an independent k-source.
Use the Incidence Theorem. Relevance: lines of the form y = ax + c.
How can we get a statistical version?

36. Simplified Proof of BIW
Suppose A·B + C is ε-far from every k′-source.
Let S = the set of size < 2^(k′) from the simple statistical lemma.
Let points P = supp(B) × S.
Let lines L = supp((A,C)), where (a,c) ↦ the line y = ax + c.
Then I(P,L) ≥ ε · |L| · |supp(B)| = ε · 2^(3k).
Contradicts the Incidence Theorem.

37. Conclusions and Future Directions
1. NP-hard to approximate Max Clique and Chromatic Number to within n^(1-ε), any ε > 0. NQ-hard to within n/2^((log n)^(1-γ)), some γ > 0. What is the right n^(1-o(1)) factor?
2. Extractor construction with linear degree for k = Ω(n), m = .99k output bits. Linear degree for general k?
3. 1-bit somewhere-condenser. Also simplify/strengthen [BIW, BKSSW, Bo]. Other uses of additive number theory?
