
1 The Power of Randomness in Computation. 呂及人, Institute of Information Science, Academia Sinica

2 PART I: Randomization

3 Random Sampling

4 Polling
Population: 20 million voters, each voting yellow or red.
Random sample: 3,000.
With probability > 99%, the fraction in the sample is within 5% of the fraction in the population, independent of the population size.

5 Lesson
A small set of random samples gives a good picture of the whole population.
Allows sub-linear time algorithms!
More applications:
–Volume estimation
–Clustering
–Machine learning, ...
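A minimal Python simulation of the polling slide. The 52% yellow share is an assumed true value for illustration; sampling a random voter from a huge population is modeled as a single Bernoulli draw, which is why the population size never appears.

```python
import random

def poll(p_yellow, sample_size):
    """Sample `sample_size` voters at random from a huge population in
    which a fraction p_yellow votes yellow, and return the sample's
    yellow fraction.  Each sampled voter is a Bernoulli(p_yellow) draw."""
    hits = sum(random.random() < p_yellow for _ in range(sample_size))
    return hits / sample_size

true_fraction = 0.52          # assumed true share of yellow voters
estimate = poll(true_fraction, 3000)
# With probability > 99%, a sample of 3,000 lands within 5% of the
# truth, whether the population is 20 million or 20 billion.
print(abs(estimate - true_fraction) <= 0.05)
```

The 5% bound at 3,000 samples follows from a Chernoff/normal tail estimate: the standard error is about 0.9%, so a 5% deviation is over five standard deviations out.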

6 Fingerprints

7 Problem
Alice holds x ∈ {0,1}^n; Bob holds y ∈ {0,1}^n.
They want to decide whether x = y.
Measure: communication complexity.

8 First Attempt
Alice picks a random i ∈ {1..n} and sends (i, x_i); Bob checks whether x_i = y_i.
This works only when the distance Δ(x, y) is large.

9 Solution
Encode first: C is an error-correcting code with relative distance δ ≈ 1; Alice computes C(x), Bob computes C(y).
Alice picks a random i ∈ {1..m} and sends (i, C(x)_i); Bob checks whether C(x)_i = C(y)_i.
x = y: Prob_i[C(x)_i ≠ C(y)_i] = 0.
x ≠ y: Prob_i[C(x)_i = C(y)_i] ≈ 0.
Can repeat several times to reduce error.

10 Lesson
Transform the data before random sampling!
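A Python sketch of the fingerprinting idea from slides 7–9, using a Reed-Solomon-style code (polynomial evaluation over a prime field) as the error-correcting code. The 61-bit Mersenne prime and the 1,000-bit inputs are arbitrary illustrative choices.

```python
import random

P = (1 << 61) - 1   # a Mersenne prime; field size makes collisions rare

def codeword_symbol(x_bits, i):
    """i-th symbol of a Reed-Solomon-style encoding of x_bits: the
    polynomial with coefficients x_bits, evaluated at i over F_P."""
    acc = 0
    for b in reversed(x_bits):      # Horner's rule
        acc = (acc * i + b) % P
    return acc

def probably_equal(x_bits, y_bits):
    """One round of the protocol: pick a random coordinate i, compare
    C(x)_i with C(y)_i.  If x == y the test always passes; if x != y,
    two distinct degree-<n polynomials agree on fewer than n of the P
    points, so it passes with probability < n/P."""
    i = random.randrange(P)
    return codeword_symbol(x_bits, i) == codeword_symbol(y_bits, i)

x = [1, 0, 1, 1] * 250          # 1000-bit inputs
y = list(x); y[500] ^= 1        # y differs from x in one position
print(probably_equal(x, x))     # always True
print(probably_equal(x, y))     # False with overwhelming probability
```

Only (i, C(x)_i), about 2·61 bits, crosses the channel, instead of the 1,000-bit string itself.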

11 Dimensionality Reduction
Raw data A ⊆ {0,1}^n, for very large n.
–e.g. images, voices, DNA sequences, ...
–|A| << 2^n.
Goal: compress each element of A while keeping its "essence".
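The slide states the goal but no method. One standard technique (my example, not named in the deck) is a Johnson-Lindenstrauss-style random projection, which preserves pairwise distances with high probability; dimensions and counts below are arbitrary.

```python
import math
import random

def random_projection(vectors, d):
    """Project n-dimensional 0/1 vectors down to d dimensions with a
    random Gaussian matrix scaled by 1/sqrt(d); pairwise Euclidean
    distances are preserved up to a small factor w.h.p. (JL lemma)."""
    n = len(vectors[0])
    R = [[random.gauss(0, 1) / math.sqrt(d) for _ in range(n)]
         for _ in range(d)]
    return [[sum(row[j] * v[j] for j in range(n)) for row in R]
            for v in vectors]

def dist(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

vs = [[random.randint(0, 1) for _ in range(1000)] for _ in range(3)]
ps = random_projection(vs, 200)
# distances between 200-dim sketches approximate the 1000-dim ones
print(dist(vs[0], vs[1]), dist(ps[0], ps[1]))
```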

12 Proof Systems

13 Classical Proof Systems
Prover: provides the proof.
–Hard.
Verifier: verifies the proof.
–Relatively easy!
–Still needs to read through the proof.
–What if you, the reviewer, receive a 300-page paper to verify...

14 Probabilistically Checkable Proofs (PCP)
Verifier:
–flips "some" random coins,
–reads only a "small" part of the proof,
–tolerates a "small" error.

15 Proof?
A format of arguments agreed upon by Prover and Verifier
–soundness & completeness.
Choosing a good proof format ⇒ fast & simple verification!

16 Probabilistically Checkable Proofs (PCP)
Prover: transforms the proof by encoding it with some error-correcting (testable) code!

17 PCP for NP
NP = PCP( O(log n), 3 ).
NP contains SAT, TSP, ..., and
MATH = { (S, 1^t) : ZFC ⊢ S in t steps }.

18 Graph Non- Isomorphism

19 Isomorphic? [figure: two graphs, G_1 and G_2]

20 Isomorphic! [figure: the same two graphs, G_1 and G_2]

21 Problem
Input: two graphs G_1 and G_2.
Output: yes iff G_1 and G_2 are not isomorphic.
G_1 isomorphic to G_2 ⇒ ∃ short proof (so graph non-isomorphism ∈ co-NP).
G_1 not isomorphic to G_2 ⇒ ∃ short proof???

22 Randomized Algorithm
Verifier:
–Picks a random i ∈ {1, 2}.
–Sends G, a random permutation of G_i.
Prover:
–Sends j ∈ {1, 2}.
Verifier:
–Outputs "non-isomorphic" iff i = j.
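A toy Python simulation of this interactive protocol. A brute-force isomorphism test stands in for the all-powerful prover, so it is feasible only for tiny graphs; the triangle-vs-path example is my own.

```python
import itertools
import random

def apply_perm(edges, perm):
    """Relabel the vertices of an undirected edge set by perm."""
    return frozenset(frozenset((perm[u], perm[v])) for u, v in edges)

def isomorphic(e1, e2, n):
    """Brute-force isomorphism test (the honest prover's unbounded power)."""
    return any(apply_perm(e1, p) == e2
               for p in itertools.permutations(range(n)))

def round_of_protocol(g1, g2, n):
    """One round: Verifier picks i and sends a random relabeling G of
    g_i; Prover answers which graph G came from; Verifier accepts iff
    the answer is correct."""
    i = random.choice([1, 2])
    perm = list(range(n)); random.shuffle(perm)
    g = apply_perm([g1, g2][i - 1], perm)
    j = 1 if isomorphic(g, g1, n) else 2     # prover's best guess
    return i == j

# A triangle vs. a path on 3 vertices: not isomorphic.
tri  = frozenset(map(frozenset, [(0, 1), (1, 2), (0, 2)]))
path = frozenset(map(frozenset, [(0, 1), (1, 2)]))
print(all(round_of_protocol(tri, path, 3) for _ in range(20)))
```

When the graphs are non-isomorphic the prover always identifies i, so the verifier always accepts; when they are isomorphic the prover can do no better than guess, so each round accepts with probability 1/2.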

23 New Features
Non-transferable proofs
Zero-knowledge proofs
IP = PSPACE: "a lot more can be proved efficiently"

24 Reachability

25 Problem
Input: undirected graph G and two nodes s, t.
Output: yes iff s is connected to t in G.
Complexity: poly(n) time, where n is the number of nodes!
Question: O(log n) space?

26 Randomized Algorithm
Take a random walk of length poly(n) from s.
Output yes iff t is visited during the walk.
Complexity: randomized O(log n) space
–only need to remember the current node.
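A Python sketch of the random-walk algorithm; the graph (a 5-node path plus a disconnected pair) and the n³ walk length are illustrative choices.

```python
import random

def random_walk_reachable(adj, s, t, steps):
    """Randomized low-space reachability for undirected graphs: walk
    randomly from s for `steps` steps and report whether t was seen.
    Only the current node (O(log n) bits) is kept in memory."""
    v = s
    for _ in range(steps):
        if v == t:
            return True
        v = random.choice(adj[v])
    return v == t

# A path 0-1-2-3-4 plus a separate component {5, 6}.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3], 5: [6], 6: [5]}
n = len(adj)
steps = 10 * n ** 3    # poly(n) steps cover a connected component w.h.p.
print(random_walk_reachable(adj, 0, 4, steps))   # True w.h.p.
print(random_walk_reachable(adj, 0, 5, steps))   # always False
```

The walk can never cross into {5, 6}, so the second answer is exact; the first fails only if the walk misses node 4 for all 10·n³ steps, which is astronomically unlikely on a 5-node path.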

27 Lesson
Interesting probabilistic phenomenon behind:
–Mixing rate of Markov chains (related to resistance of electrical networks).

28 Primality Testing

29 Problem
Input: a number x.
Output: yes iff x is a prime.
Important in cryptography, ...

30 Randomized Algorithm
Generate a random r ∈ { 1, ..., x }.
Output yes iff
–GCD( x, r ) = 1, and
–(r/x) ≡ r^((x-1)/2) (mod x), where (r/x) is the Jacobi symbol.
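The test on this slide is the Solovay-Strassen test. A Python sketch with a standard Jacobi-symbol routine follows; the round count of 20 is an arbitrary choice (each round catches a composite with probability at least 1/2).

```python
import math
import random

def jacobi(a, n):
    """Jacobi symbol (a/n) for odd n > 0, via quadratic reciprocity."""
    a %= n
    result = 1
    while a:
        while a % 2 == 0:               # pull out factors of two
            a //= 2
            if n % 8 in (3, 5):
                result = -result
        a, n = n, a                      # reciprocity swap
        if a % 4 == 3 and n % 4 == 3:
            result = -result
        a %= n
    return result if n == 1 else 0

def probably_prime(x, rounds=20):
    """Solovay-Strassen: x passes a round iff gcd(x, r) == 1 and
    (r/x) == r^((x-1)/2) (mod x), exactly as on the slide."""
    if x < 2:
        return False
    if x == 2:
        return True
    if x % 2 == 0:
        return False
    for _ in range(rounds):
        r = random.randrange(1, x)
        if math.gcd(r, x) != 1:
            return False
        if jacobi(r, x) % x != pow(r, (x - 1) // 2, x):
            return False
    return True

print(probably_prime(10**9 + 7))   # True (a well-known prime)
print(probably_prime(561))         # False w.h.p. (a Carmichael number)
```

Note the `% x` on the Jacobi symbol: it maps -1 to x-1 so it can be compared directly against the modular exponentiation.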

31 PART II: Derandomization

32 Issues
Randomized algorithm M for A:
–M has access to perfectly random y.
–∀x, Prob_y[ M(x,y) ≠ A(x) ] < 0.000000001.
Issues?
–Small probability of error.
–Need perfectly random y. How?

33 Solutions
Randomness extractors
Pseudo-random generators
Derandomization

34 Randomness Extractors

35 Setting
EXT takes a slightly random source together with a short truly random seed (a catalyst) and outputs an almost-random string.
Goal: short seed, long output.

36 Applications
Complexity
Cryptography
Data structures
Distributed computing
Error-correcting codes
Combinatorics, graph theory
...

37 Pseudo-Random Generators

38 Random?
Are coin tosses really random?
They "look random" to you because you don't have enough power (computation / measurement).
In many cases, "look random" is good enough!

39 PRG
A PRG stretches a short truly random seed into a long pseudo-random output.
Goal: short seed, long output.

40 Definition
G: {0,1}^n → {0,1}^m, for n < m, is an ε-PRG against a complexity class C if for every predicate T ∈ C,
| Prob_r[ T(G(r)) = 1 ] − Prob_y[ T(y) = 1 ] | < ε.

41 PRG exists?
From an "average-case hard" function f: {0,1}^n → {0,1}, define the PRG G: {0,1}^n → {0,1}^(n+1) as G(r) = r ∘ f(r).
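A literal Python rendering of the one-bit-stretch construction G(r) = r ∘ f(r). The parity predicate used for f below is a placeholder and is certainly not hard to predict, so the resulting G only illustrates the shape of the construction, not an actual PRG.

```python
def stretch_by_one(f):
    """Build G(r) = r ∘ f(r): copy the n-bit seed and append one extra
    bit computed by f.  G fools a class C exactly when no predictor in
    C can guess f(r) from r much better than chance."""
    def G(r_bits):
        return r_bits + (f(r_bits),)
    return G

# Toy (NOT hard!) predicate: parity of the seed bits.
f = lambda bits: sum(bits) % 2
G = stretch_by_one(f)
print(G((1, 0, 1)))   # (1, 0, 1, 0)
```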

42 PRG exists?
From a "worst-case hard" function f: {0,1}^n → {0,1}, define the PRG G: {0,1}^n → {0,1}^(n+1) as G(r) = r ∘ f(r).
From a one-way function...

43 Pseudo-Randomness
Foundation of cryptography
–Public-key encryption
–Zero-knowledge proofs
–Secure function evaluation, ...
–The secret is there, but it looks random.
More applications: learning theory, mathematics, physics, ...

44 Derandomization

45 Open Problems
Does randomness help poly-time / log-space / nondet. poly-time computation?
BPP = P? BPL = L? BP·NP = NP?

46 Open Problems
Is there a PRG with seed length O(log n) that fools poly-time / log-space / nondet. poly-time computation?

47 Derandomization
Randomized algorithm M for language A:
–Prob_y[ M(x, y) = A(x) ] > 0.99, ∀x.
Construct a PRG G (fooling M) s.t.
–Prob_r[ M(x, G(r)) = A(x) ] > 0.5, ∀x.
To determine A(x), take the majority vote of M(x, G(r)) over all possible seeds r.
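A toy Python rendering of the majority-vote step. The probing algorithm M and the identity "generator" (plain seed enumeration) are invented for illustration, not from the slides; with a real PRG, only the 2^(seed length) seeds are enumerated, which is polynomial when the seed length is O(log n).

```python
from collections import Counter

def derandomize(M, x, G, seed_len):
    """Deterministic simulation: run M(x, G(r)) for every seed r of
    seed_len bits and output the majority answer.  If G fools M, the
    majority vote equals A(x), at a 2**seed_len blow-up in time."""
    votes = Counter(M(x, G(r)) for r in range(2 ** seed_len))
    return votes.most_common(1)[0][0]

def M(x, y):
    """Toy randomized algorithm: guess 'the majority of x is 1' by
    probing the single position y % len(x); correct with probability
    equal to the fraction of bits that agree with the majority."""
    return x[y % len(x)] == 1

identity = lambda r: r      # trivial 'generator': raw seed enumeration
print(derandomize(M, (1, 1, 1, 0, 1, 0, 1, 1), identity, 4))   # True
print(derandomize(M, (0, 0, 1, 0, 0, 1, 0, 0), identity, 4))   # False
```

Both outputs are deterministic: every seed is tried, so the randomness has been voted away.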

48 Breakthroughs
Primality ∈ P: Agrawal-Kayal-Saxena 2002.
Undirected Reachability ∈ L: Reingold 2005.

49 Still Open
Graph non-isomorphism ∈ NP?
(If two graphs are non-isomorphic, is there always a short proof of that?)

50 Conclusion

51
Randomness is useful.
Interesting probabilistic phenomena behind.
Randomness is in the eye of the beholder.
Exciting area!

52 Appendix

53 PCP for any L ∈ NP
∃ efficient verifier V that
–uses O(log n) random bits,
–reads 3 bits from the proof.
Correctness:
–x ∈ L ⇒ ∃ short proof p, Prob[ V accepts p ] > 0.9.
–x ∉ L ⇒ ∀ short proofs p, Prob[ V accepts p ] < 0.5.

54 Definitions (extractor)
Randomness measure: min-entropy.
–H_∞(X) ≥ k ⇔ ∀x, Prob[ X = x ] ≤ 2^(-k).
Z is ε-random if || Z − U ||_1 ≤ ε.
EXT is a (k, ε)-extractor if H_∞(X) ≥ k ⇒ EXT(X, U) is ε-random.

