CS151 Complexity Theory Lecture 10 April 29, 2015.


1 CS151 Complexity Theory Lecture 10 April 29, 2015

2 Codes and Hardness

[figure: the truth table m of f is encoded as Enc(m), the truth table of f′; a small circuit C agrees with a corrupted received word R; a decoding procedure D turns C into a small circuit computing f exactly]

f : {0,1}^(log k) → {0,1} (its truth table is the message m, of length k)
f′ : {0,1}^(log n) → {0,1} (its truth table is the codeword Enc(m), of length n)
Given a small circuit C approximating f′, the decoding procedure produces a small circuit that, on input i ∈ {0,1}^(log k), computes f(i) exactly.

3 Encoding

– Emb : [k] → S^t ⊆ F_q^t
– p_m: a degree-h polynomial with p_m(Emb(i)) = m_i
– evaluate p_m at all x ∈ F_q^t
– encode each symbol of the evaluation table with Had : {0,1}^(log q) → {0,1}^q
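The inner code above can be sketched directly. A minimal brute-force version, assuming the standard Hadamard convention Had(z)[x] = ⟨z, x⟩ mod 2 (which matches the {0,1}^(log q) → {0,1}^q type on this slide):

```python
# Sketch of the inner Hadamard code (assumed convention:
# Had(z)[x] = <z, x> mod 2 for every x in {0,1}^(log q)).

def hadamard_encode(z_bits):
    """Encode a (log q)-bit string z as the q = 2^(log q) inner products <z, x> mod 2."""
    n = len(z_bits)
    codeword = []
    for x in range(2 ** n):
        x_bits = [(x >> i) & 1 for i in range(n)]
        codeword.append(sum(a * b for a, b in zip(z_bits, x_bits)) % 2)
    return codeword

# Any two distinct codewords agree on exactly half the positions,
# which is what makes the later list-decoding step work.
c1 = hadamard_encode([1, 0, 1])
c2 = hadamard_encode([0, 1, 1])
agreement = sum(a == b for a, b in zip(c1, c2))
```

The half-agreement property follows because Had(z1) xor Had(z2) = Had(z1 xor z2), which is a balanced function whenever z1 ≠ z2.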

4 Decoding

small circuit C computing R, agreement ½ + δ
Decoding step 1:
– produce a circuit C′ from C that, given x ∈ F_q^t, outputs a "guess" for p_m(x)
– C′ computes the list {z : Had(z) has agreement ≥ ½ + δ/2 with the x-th block} and outputs a random z from this list

5 Decoding

Decoding step 1 (continued):
– for at least a δ/2 fraction of the blocks, the agreement within the block is at least ½ + δ/2
– Johnson bound: when this happens, the list has size S = O(1/δ²), so the probability that C′ is correct is ≥ 1/S
– altogether: Pr_x[C′(x) = p_m(x)] ≥ Ω(δ³)
– C′ makes q queries to C and runs in time poly(q)
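For intuition, decoding step 1 can be simulated by brute force on a single block: enumerate every candidate z and keep those whose Hadamard codeword clears the agreement threshold. This is an illustration only; the actual C′ must be a small circuit, and the Johnson bound caps the list at O(1/δ²) regardless of how it is computed.

```python
# Brute-force stand-in for decoding step 1: given one received block of
# length q = 2^(log q), list every z whose Hadamard encoding agrees with
# the block on at least a (1/2 + delta) fraction of positions.

def hadamard_list_decode(block, delta):
    n = len(block).bit_length() - 1          # block length is 2^n
    assert len(block) == 2 ** n
    out = []
    for z in range(2 ** n):
        z_bits = [(z >> i) & 1 for i in range(n)]
        agree = 0
        for x in range(2 ** n):
            x_bits = [(x >> i) & 1 for i in range(n)]
            bit = sum(a * b for a, b in zip(z_bits, x_bits)) % 2
            agree += (bit == block[x])
        if agree >= (0.5 + delta) * len(block):
            out.append(z_bits)
    return out

# An uncorrupted codeword is recovered uniquely: the block below is
# Had([1,1,1]) (the parity of x for x = 0..7).
recovered = hadamard_list_decode([0, 1, 1, 0, 1, 0, 0, 1], 0.4)
```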

6 Decoding

small circuit C′ computing R′, agreement δ′ = Ω(δ³)
Decoding step 2:
– produce a circuit C′′ from C′ that, given x ∈ Emb([k]), outputs p_m(x)
– idea: restrict p_m to a random curve, apply efficient Reed-Solomon list decoding, and fix "good" random choices

7 Restricting to a curve

– points x = α_1, α_2, α_3, …, α_r ∈ F_q^t specify a degree-r curve L : F_q → F_q^t
– w_1, w_2, …, w_r are distinct elements of F_q
– for each i, L_i : F_q → F_q is the degree-r polynomial for which L_i(w_j) = (α_j)_i for all j
– write p_m(L(z)) for p_m(L_1(z), L_2(z), …, L_t(z)); in particular p_m(L(w_1)) = p_m(x)
– p_m(L(·)) is a univariate polynomial of degree at most r·h·t
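The curve construction is plain Lagrange interpolation, coordinate by coordinate. A sketch over a prime field (the lecture works over F_q^t with q a prime power, but the interpolation identity L(w_j) = α_j is the same; the prime 101 and the sample points are illustrative):

```python
# Sketch of the curve through alpha_1..alpha_r: each coordinate L_i is
# the unique low-degree polynomial through (w_j, (alpha_j)_i).

q = 101                                   # a prime, standing in for the field size

def lagrange_eval(points, z):
    """Evaluate at z the unique polynomial of degree < len(points) through `points`."""
    total = 0
    for j, (wj, vj) in enumerate(points):
        num, den = 1, 1
        for l, (wl, _) in enumerate(points):
            if l != j:
                num = num * (z - wl) % q
                den = den * (wj - wl) % q
        # pow(den, q-2, q) is the modular inverse of den (Fermat, q prime)
        total = (total + vj * num * pow(den, q - 2, q)) % q
    return total

def curve(ws, alphas, z):
    """L(z) in F_q^t, where L(w_j) = alpha_j for every j."""
    t = len(alphas[0])
    return tuple(lagrange_eval([(w, a[i]) for w, a in zip(ws, alphas)], z)
                 for i in range(t))

ws = [1, 2, 3]                            # distinct w_1, w_2, w_3 in F_q
alphas = [(5, 7), (11, 13), (17, 19)]     # alpha_1, alpha_2, alpha_3 in F_q^2
```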

8 Decoding

small circuit C′ computing R′, agreement δ′ = Ω(δ³)
Decoding step 2 (continued):
– pick random w_1, w_2, …, w_r and α_2, α_3, …, α_r to determine the curve L
– the points on L are (r−1)-wise independent
– random variable: Agr = |{z : C′(L(z)) = p_m(L(z))}|
– E[Agr] = δ′q and Pr[Agr < (δ′q)/2] < O(1/(δ′q))^((r−1)/2)

9 Decoding

small circuit C′ computing R′, agreement δ′ = Ω(δ³)
Decoding step 2 (continued):
– agr = |{z : C′(L(z)) = p_m(L(z))}| is ≥ (δ′q)/2 with very high probability
– compute, via Reed-Solomon list decoding, the list {q(z) : deg(q) ≤ r·h·t, |{z : C′(L(z)) = q(z)}| ≥ (δ′q)/2}
– if agr ≥ (δ′q)/2 then p_m(L(·)) is in this list!

10 Decoding

Decoding step 2 (continued):
– assuming (δ′q)/2 > (2·r·h·t·q)^(1/2)
– Reed-Solomon list-decoding step: running time poly(q), list size S ≤ 4/δ′
– probability that the list fails to contain p_m(L(·)) is O(1/(δ′q))^((r−1)/2)
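For tiny parameters, the Reed-Solomon list-decoding step can be emulated by exhaustive search: enumerate all low-degree polynomials and keep those meeting the agreement threshold. This is a sketch only; real list decoders (e.g., Sudan's algorithm) run in time poly(q) rather than q^(deg+1), and the field size 7 and polynomial 3 + 2z below are illustrative.

```python
# Exhaustive-search stand-in for the Reed-Solomon list-decoding step.
from itertools import product

q = 7   # toy field size

def rs_list_decode(received, d, threshold):
    """All polynomials of degree <= d agreeing with `received` on >= threshold points."""
    out = []
    for coeffs in product(range(q), repeat=d + 1):
        agree = sum((sum(c * pow(z, i, q) for i, c in enumerate(coeffs)) % q) == rz
                    for z, rz in enumerate(received))
        if agree >= threshold:
            out.append(coeffs)
    return out

# Corrupt 2 of the 7 evaluations of p(z) = 3 + 2z; with agreement
# threshold 5 the list contains p alone, since any other degree-1
# polynomial can agree with the received word on at most 3 points.
p = lambda z: (3 + 2 * z) % q
received = [p(z) for z in range(q)]
received[0] = (received[0] + 1) % q
received[5] = (received[5] + 3) % q
```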

11 Decoding

Decoding step 2 (continued):
– tricky point: the functions in the list are determined by the set L(·), independent of the parameterization of the curve
– regard w_2, w_3, …, w_r as random points on the curve L
– for any q ≠ p_m(L(·)): Pr[q(w_i) = p_m(L(w_i))] ≤ (rht)/q
– Pr[∀i, q(w_i) = p_m(L(w_i))] ≤ [(rht)/q]^(r−1)
– Pr[∃ q in the list s.t. ∀i, q(w_i) = p_m(L(w_i))] ≤ (4/δ′)·[(rht)/q]^(r−1)

12 Decoding

Decoding step 2 (continued):
– with probability ≥ 1 − O(1/(δ′q))^((r−1)/2) − (4/δ′)·[(rht)/q]^(r−1), the list contains q* = p_m(L(·)), and q* is the unique q in the list for which q(w_i) = p_m(L(w_i)) (= p_m(α_i)) for i = 2, 3, …, r
– circuit C′′: hardwire w_1, w_2, …, w_r and α_2, α_3, …, α_r so that for all x ∈ Emb([k]) both events occur; hardwire p_m(α_i) for i = 2, …, r; on input x, find q* and output q*(w_1) (= p_m(x))

13 Decoding

Putting it all together:
– C approximating f′ is used to construct C′: C′ makes q queries to C and runs in time poly(q)
– C′ is used to construct C′′ computing f exactly: C′′ makes q queries to C′, has r−1 elements of F_q^t and 2r−1 elements of F_q hardwired, and runs in time poly(q)
– C′′ has size poly(q, r, t, size of C)

14 Picking parameters

k: truth-table size of f, hard for circuits of size s
q: field size; h: R-M degree; t: R-M dimension
r: degree of the curve used in decoding
– h^t ≥ k (to accommodate a message of length k)
– δ^6·q² > Ω(rhtq) (for R-S list decoding)
– k·[O(1/(δ′q))^((r−1)/2) + (4/δ′)·[(rht)/q]^(r−1)] < 1 (so there is a "good" fixing of the random bits)
– pick: h = s, t = (log k)/(log s)
– pick: r = Θ(log k), q = Θ(rht·δ^(−6))

15 Picking parameters

k: truth-table size of f, hard for circuits of size s
q: field size; h: R-M degree; t: R-M dimension
r: degree of the curve used in decoding
h = s, t = (log k)/(log s), r = Θ(log k), q = Θ(rht·δ^(−6))
Claim: the truth table of f′ is computable in time poly(k) (so f′ ∈ E if f ∈ E).
– poly(q^t) for the R-M encoding
– poly(q)·q^t for the Hadamard encoding
– q ≤ poly(s), so q^t ≤ poly(s)^t = poly(h)^t = poly(k)
(assuming log k, δ^(−1) < s)

16 Picking parameters

k: truth-table size of f, hard for circuits of size s
q: field size; h: R-M degree; t: R-M dimension
r: degree of the curve used in decoding
h = s, t = (log k)/(log s), r = Θ(log k), q = Θ(rht·δ^(−6))
Claim: if f′ is s′-approximable by C, then f is computable exactly in size s by C′′, for s′ = s^Ω(1).
– C has size s′ and agreement ½ + δ with f′, where δ = 1/s′
– C′′ has size poly(q, r, t, size of C) = s
(assuming log k, δ^(−1) < s)

17 Putting it all together

Theorem 1 (IW, STV): if E contains functions that require size-2^(Ω(n)) circuits, then E contains 2^(Ω(n))-unapproximable functions. (proof on next slide)
Theorem (NW): if E contains 2^(Ω(n))-unapproximable functions, then BPP = P.
Theorem (IW): E requires exponential-size circuits ⇒ BPP = P.

18 Putting it all together

Proof of Theorem 1:
– let f = {f_n} be hard for size-s(n) = 2^(δn) circuits
– define f′ = {f′_n} to be the just-described encoding of (the truth tables of) f = {f_n}
– two claims we just showed:
  f′ is in E since f is
  if f′ is s′(n) = 2^(δ′n)-approximable, then f is computable exactly by size-s(n) = 2^(δn) circuits
– contradiction.

19 Extractors

PRGs can remove randomness from algorithms, but:
– they are based on an unproven assumption
– they incur a polynomial slow-down
– they are not applicable in other settings
Question: can we use "real" randomness?
– a physical source
– imperfect: biased, correlated

20 Extractors

"Hardware" side:
– what physical source?
– ask the physicists…
"Software" side:
– what is the minimum we need from the physical source?

21 Extractors

imperfect sources:
– "stuck bits": e.g., certain positions always read 1 (111111…)
– "correlation"
– "more insidious correlation": e.g., the bits of the sequence of perfect squares
There are specific ways to get independent, unbiased random bits from specific imperfect physical sources.

22 Extractors

We want to avoid assuming detailed knowledge of the physical source.
– a general model capturing all of these? yes: "min-entropy"
– a universal procedure for all imperfect sources? yes: "extractors"

23 Min-entropy

General model of a physical source with k < n bits of hidden randomness.
Definition: a random variable X on {0,1}^n has min-entropy min_x −log(Pr[X = x]).
– min-entropy k implies no string has weight more than 2^(−k)
[figure: a flat source — a string sampled uniformly from a set of 2^k strings in {0,1}^n]
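The definition computes directly (a sketch; the distribution is given explicitly as a dictionary, which is only feasible for toy examples):

```python
# Min-entropy per the definition above: H_inf(X) = min_x -log2 Pr[X = x].
import math

def min_entropy(dist):
    """dist: map from outcome to probability; returns min_x -log2 Pr[X = x]."""
    return min(-math.log2(p) for p in dist.values() if p > 0)

# A flat source: uniform over 2^3 strings in {0,1}^5 has min-entropy exactly 3.
flat = {format(i, "05b"): 1 / 8 for i in range(8)}
```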

24 Extractor

Extractor: a universal procedure for "purifying" an imperfect source:
– E is efficiently computable
– a truly random seed acts as a "catalyst"
[figure: E maps an n-bit source string (drawn from a set of 2^k strings) and a t-bit seed to a near-uniform m-bit output]

25 Extractor

"(k, ε)-extractor": for all X with min-entropy k:
– the output fools all circuits C: |Pr_z[C(z) = 1] − Pr_{y, x←X}[C(E(x, y)) = 1]| ≤ ε
– the distributions E(X, U_t) and U_m are "ε-close" (L1 distance ≤ 2ε)
Notice the similarity to PRGs:
– the output of a PRG fools all efficient tests
– the output of an extractor fools all tests
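The "ε-close" condition is just a bound on L1 distance between distributions; a quick sketch (the two toy distributions below are illustrative):

```python
# L1 distance between two explicitly given distributions. "epsilon-close"
# on the slide means L1 distance <= 2*epsilon (equivalently, statistical
# distance <= epsilon).

def l1_distance(p, q):
    keys = set(p) | set(q)
    return sum(abs(p.get(k, 0) - q.get(k, 0)) for k in keys)

uniform = {"00": 0.25, "01": 0.25, "10": 0.25, "11": 0.25}
skewed  = {"00": 0.25, "01": 0.25, "10": 0.35, "11": 0.15}
# L1 distance 0.2, so `skewed` is 0.1-close to uniform.
```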

26 Extractors

Using extractors:
– use the output in place of randomness in any application
– this alters the probability of any outcome by at most ε
Main motivating application:
– use the output in place of randomness in an algorithm
– how do we get a truly random seed? enumerate all seeds and take the majority answer

27 Extractors

Goals:            good           best
short seed:       O(log n)       log n + O(1)
long output:      m = k^Ω(1)     m = k + t − O(1)
many k's:         k = n^Ω(1)     any k = k(n)
[figure: E maps an n-bit source string (from a set of 2^k strings) and a t-bit seed to a near-uniform m-bit output]

28 Extractors

A random function for E achieves "best"!
– but we need explicit constructions
– many are known; often complex and technical
– optimal extractors are still open
Trevisan extractor:
– insight: use the NW generator with the source string in place of the hard function
– this works (!!)
– the proof is slightly different from NW's, and easier


30 Trevisan Extractor

Ingredients (δ > 0 and m are parameters):
– an error-correcting code C : {0,1}^n → {0,1}^n′ with distance (½ − ¼m^(−4))·n′ and block length n′ = poly(n)
– a (log n′, a = δ·(log n)/3) design: S_1, S_2, …, S_m ⊆ {1…t}, t = O(log n′)
E(x, y) = C(x)[y|_S_1] ∘ C(x)[y|_S_2] ∘ … ∘ C(x)[y|_S_m]
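The design ingredient can be illustrated with a greedy search over subsets (a toy sketch: the actual parameters are (log n′, a = δ·(log n)/3), whereas the values below are small illustrative numbers; any family of equal-size sets with small pairwise intersections exhibits the point):

```python
# Greedy sketch of a combinatorial design: m subsets of {0..t-1}, each of
# size l, with pairwise intersections of size at most a.
from itertools import combinations

def greedy_design(t, l, a, m):
    sets = []
    for cand in combinations(range(t), l):
        # keep a candidate only if it intersects every chosen set in <= a points
        if all(len(set(cand) & set(s)) <= a for s in sets):
            sets.append(cand)
            if len(sets) == m:
                return sets
    raise ValueError("parameters too small for the requested design")

design = greedy_design(t=12, l=4, a=1, m=3)
```

In the extractor, y|_S_i simply reads the seed bits indexed by S_i and uses them as a position in the codeword C(x), so small intersections mean the m output bits reuse the short seed almost independently.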

31 Trevisan Extractor

E(x, y) = C(x)[y|_S_1] ∘ C(x)[y|_S_2] ∘ … ∘ C(x)[y|_S_m]
Theorem (T): E is an extractor for min-entropy k = n^δ, with
– output length m = k^(1/3)
– seed length t = O(log n)
– error ε ≤ 1/m
[figure: the seed y selects m positions of the codeword C(x)]

32 Trevisan Extractor

Proof:
– given X ⊆ {0,1}^n of size 2^k
– assume E fails to ε-pass a statistical test C: |Pr_z[C(z) = 1] − Pr_{x←X, y}[C(E(x, y)) = 1]| > ε
– distinguisher C ⇒ predictor P: Pr_{x←X, y}[P(E(x, y)_{1…i−1}) = E(x, y)_i] > ½ + ε/m

33 Trevisan Extractor

Proof (continued):
– for at least an ε/2 fraction of x ∈ X: Pr_y[P(E(x, y)_{1…i−1}) = E(x, y)_i] > ½ + ε/(2m)
– fix the seed bits outside of S_i so as to preserve the advantage: Pr_{y′}[P(E(x; y′)_{1…i−1}) = C(x)[y′]] > ½ + ε/(2m), where y′ ranges over the bits in S_i
– as y′ varies, for each j ≠ i the j-th bit of E(x; y′) ranges over only 2^a values
– so (m−1) tables of 2^a values suffice to supply E(x; y′)_{1…i−1}

34 Trevisan Extractor

[figure: the predictor P outputs C(x)[y′] with probability ½ + ε/(2m), as y′ ranges over {0,1}^(log n′)]

35 Trevisan Extractor

Proof (continued):
– the (m−1) tables of size 2^a constitute a description of a string that has ½ + ε/(2m) agreement with C(x)
– how many strings x have such a description? exp((m−1)·2^a) = exp(n^(2δ/3)) = exp(k^(2/3)) strings
– Johnson bound: each such string accounts for at most O(m^4) of the x's
– total: O(m^4)·exp(k^(2/3)) << 2^k·(ε/2); contradiction

36 Extractors

(k, ε)-extractor:
– E is efficiently computable
– for all X with min-entropy k, E fools all circuits C: |Pr_z[C(z) = 1] − Pr_{y, x←X}[C(E(x, y)) = 1]| ≤ ε
Trevisan: k = n^δ, t = O(log n), m = k^(1/3), ε = 1/m
[figure: E maps an n-bit source string (from a set of 2^k strings) and a t-bit seed to a near-uniform m-bit output]

37 Strong error reduction

L ∈ BPP if there is a p.p.t. TM M:
– x ∈ L ⇒ Pr_y[M(x, y) accepts] ≥ 2/3
– x ∉ L ⇒ Pr_y[M(x, y) rejects] ≥ 2/3
Want:
– x ∈ L ⇒ Pr_y[M(x, y) accepts] ≥ 1 − 2^(−k)
– x ∉ L ⇒ Pr_y[M(x, y) rejects] ≥ 1 − 2^(−k)
We saw: repeat O(k) times
– n = O(k)·|y| random bits; 2^(n−k) bad strings
Want: spend n = poly(|y|) random bits and achieve << 2^(n/3) bad strings

38 Strong error reduction

Better:
– let E be an extractor for min-entropy k = |y|³ = n^δ, with ε < 1/6
– pick a random w ∈ {0,1}^n, run M(x, E(w, z)) for all z ∈ {0,1}^t, and take the majority answer
– call w "bad" if maj_z M(x, E(w, z)) is incorrect; then |Pr_z[M(x, E(w, z)) = b] − Pr_y[M(x, y) = b]| ≥ 1/6
– extractor property: at most 2^k bad w
– n random bits; 2^(n^δ) bad strings
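The control flow of this scheme can be sketched with placeholder ingredients (E below is NOT a real extractor and M is a toy predicate; they only show how the majority over all 2^t seeds is taken for a single choice of w):

```python
# Sketch of the error-reduction driver: run M on every pseudorandom
# string E(w, z) for z in {0,1}^t and return the majority answer.

def amplify(M, x, w, E, t):
    votes = [M(x, E(w, z)) for z in range(2 ** t)]
    return sum(votes) > len(votes) // 2

# Placeholder ingredients for demonstration only:
E = lambda w, z: (w >> z) & 1        # NOT an extractor; a toy stand-in
M = lambda x, r: (x + r) % 2 == 0    # toy "algorithm" using one random bit
```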

39 RL

Recall: a probabilistic Turing machine is a deterministic TM with an extra tape for "coin flips".
RL (randomized logspace): L ∈ RL if there is a probabilistic logspace TM M:
– x ∈ L ⇒ Pr_y[M(x, y) accepts] ≥ ½
– x ∉ L ⇒ Pr_y[M(x, y) rejects] = 1
– important detail #1: only one-way access to the coin-flip tape is allowed
– important detail #2: M is explicitly required to run in polynomial time

40 RL

L ⊆ RL ⊆ NL ⊆ SPACE(log² n)
Theorem (SZ): RL ⊆ SPACE(log^(3/2) n)
Belief: L = RL (open problem)

