1 Computability & Complexity II Chris Umans Caltech

2 Complexity Theory
Classify problems according to the computational resources required:
– running time
– storage space
– parallelism
– randomness
– rounds of interaction, communication, others…
Attempt to answer: what is computationally feasible with limited resources?

3 The central questions
Is finding a solution as easy as recognizing one? P = NP?
Is every sequential algorithm parallelizable? P = NC?
Can every efficient algorithm be converted into one that uses a tiny amount of memory? P = L?
Are there small Boolean circuits for all problems that require exponential running time? EXP ⊆ P/poly?
Can every randomized algorithm be converted into a deterministic one? P = BPP?

4 Central Questions
We think we know the answers to all of these questions…
…but no one has been able to prove that even a small part of this "world-view" is correct.
If we're wrong on any one of these, then computer science will change dramatically.

5 Outline
We'll look at three of these problems:
– P vs. NP: "the power of nondeterminism"
– L vs. P: "the power of sequential computation"
– P vs. BPP: "the power of randomness"

6 P vs. NP

7 Completeness
In an ideal world, given a language L:
1. state an algorithm deciding L
2. prove that no algorithm does better
We are pretty good at part 1.
We are currently completely helpless when it comes to part 2, for most problems that we care about.

8 Completeness
In place of part 2 we can:
– relate the difficulty of problems to each other via reductions
– prove that a problem is a "hardest" problem in a complexity class via completeness
This is a powerful, successful surrogate for lower bounds.

9 Recall: reductions
A "many-one" reduction from language A to language B is a function f that maps yes-instances of A to yes-instances of B and no-instances of A to no-instances of B.
– now, require f computable in polynomial time
[figure: yes/no regions of A mapped by f into the yes/no regions of B]

10 Completeness
For a complexity class C (a set of languages), a language L is C-complete if:
– L is in C
– every language in C reduces to L
This formalizes "L is a hardest problem in complexity class C":
– L in P implies every language in C is in P
Related concept: language L is C-hard if
– every language in C reduces to L

11 Some problems we care about
3SAT = { φ : φ is a satisfiable 3-CNF formula }
TSP = { ⟨G, k⟩ : there is a tour of all nodes of G with weight ≤ k }
SUBSET-SUM = { ⟨x1, x2, …, xn, B⟩ : some subset of {x1, x2, …, xn} sums to B }
IS = { ⟨G, k⟩ : there is a subset of at least k vertices of G with no edges between them }
We only know exponential-time algorithms for these problems.
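
As a reminder of what those exponential-time algorithms look like, here is a minimal brute-force SUBSET-SUM sketch (my own illustration, not from the slides) that tries all 2^n subsets:

```python
# Brute-force SUBSET-SUM: check every subset of the n numbers.
# Runs in time roughly 2^n — the only kind of algorithm known for this problem.
from itertools import combinations

def subset_sum(xs, B):
    return any(sum(c) == B
               for r in range(len(xs) + 1)
               for c in combinations(xs, r))

print(subset_sum([3, 7, 12, 5], 19))  # True  (7 + 12 = 19)
print(subset_sum([3, 7, 12, 5], 6))   # False (no subset sums to 6)
```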

12 From last week…
We have defined the complexity classes P (polynomial time) and EXP (exponential time).
[figure: nested classes P ⊆ EXP ⊆ decidable ⊆ all languages, with RE, HALT, and some undecidable language also marked]

13 Why not EXP?
3SAT, TSP, SUBSET-SUM, IS, … — why not show these are EXP-complete?
They all possess a positive feature that some problems in EXP probably don't have: a solution can be recognized in polynomial time.

14 NP
Definition: NP = all languages decidable by a Nondeterministic TM in Polynomial time
Theorem: a language L is in NP if and only if it is expressible as
L = { x : ∃ y, |y| ≤ |x|^k, (x, y) ∈ R }
where R is a language in P.
– the poly-time TM M_R deciding R is a "verifier"
– y is a "witness" or "certificate"
– such languages are efficiently verifiable

15 Poly-time verifiers
Example: 3SAT is expressible as
3SAT = { φ : φ is a 3-CNF formula for which ∃ an assignment A with (φ, A) ∈ R }
R = { (φ, A) : A is a satisfying assignment for φ }
– a satisfying assignment A is a "witness" of the satisfiability of φ (it "certifies" satisfiability of φ)
– R is decidable in poly-time
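
To make the verifier concrete, here is a minimal sketch of R in Python (my own clause encoding, not code from the lecture): it checks a candidate assignment in time linear in the size of the formula.

```python
# Verifier R for 3SAT: decide whether (phi, assignment) ∈ R,
# i.e. whether `assignment` satisfies the 3-CNF formula `phi`.
# Encoding: the clause (x1 ∨ x2 ∨ ¬x3) is the list [1, 2, -3].

def R(phi, assignment):
    def literal_true(lit):
        value = assignment[abs(lit)]
        return value if lit > 0 else not value
    return all(any(literal_true(lit) for lit in clause) for clause in phi)

# (x ∨ y ∨ ¬z) ∧ (¬x ∨ w ∨ z) with x=1, y=2, z=3, w=4
phi = [[1, 2, -3], [-1, 4, 3]]
print(R(phi, {1: True, 2: False, 3: True, 4: False}))   # True: this A is a witness
print(R(phi, {1: False, 2: False, 3: True, 4: False}))  # False: not a witness
```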

16 June 25, 2004CBSSS16 all languages EXP decidable NP P P  NP  EXP believe all containments proper assuming P ≠ NP, can prove a problem is hard by showing it is NP-complete

17 NP-complete problems
L is NP-complete if:
– L is in NP
– every problem in NP reduces to L
Theorem (Cook): 3SAT is NP-complete.
– this opened the door to showing that thousands of problems we care about are NP-complete

18 NP-complete problems
Example reduction:
– 3SAT = { φ : φ is a 3-CNF Boolean formula that has a satisfying assignment } (3-CNF = AND of ORs of ≤ 3 literals)
– IS = { ⟨G, k⟩ : G is a graph with an independent set V' ⊆ V of size ≥ k } (independent set = set of vertices no 2 of which are connected by an edge)

19 Ind. Set is NP-complete
The reduction f: given φ = (x ∨ y ∨ ¬z) ∧ (¬x ∨ w ∨ z) ∧ … ∧ (…), we produce the graph G_φ:
[figure: a triangle on the literal-vertices x, y, ¬z; a triangle on ¬x, w, z; …]
– one triangle for each of the m clauses
– an edge between every pair of contradictory literals
– set k = m
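
A runnable sketch of this reduction (my own graph encoding, not code from the lecture): vertices are (clause index, literal) pairs, exactly the triangles described above.

```python
# 3SAT -> Independent Set: one triangle per clause, plus an edge between
# every pair of contradictory literal-occurrences; the target size is k = m.
import itertools

def three_sat_to_is(phi):
    """phi: list of clauses, each a list of signed variable indices.
    Returns (vertices, edges, k); a vertex is a (clause_index, literal) pair."""
    vertices = [(i, lit) for i, clause in enumerate(phi) for lit in clause]
    edges = set()
    for i, clause in enumerate(phi):                  # one triangle per clause
        for a, b in itertools.combinations(clause, 2):
            edges.add(frozenset({(i, a), (i, b)}))
    for u, v in itertools.combinations(vertices, 2):  # contradictory literals
        if u[1] == -v[1]:
            edges.add(frozenset({u, v}))
    return vertices, edges, len(phi)

# (x ∨ y ∨ ¬z) ∧ (¬x ∨ w ∨ z)
V, E, k = three_sat_to_is([[1, 2, -3], [-1, 4, 3]])
print(len(V), len(E), k)   # 6 vertices, 8 edges (6 triangle + 2 contradiction), k = 2
```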

20 Ind. Set is NP-complete
φ = (x ∨ y ∨ ¬z) ∧ (¬x ∨ w ∨ z) ∧ … ∧ (…)
Claim: φ has a satisfying assignment if and only if G_φ has an independent set of size at least k.
– Proof sketch: from a satisfying assignment, pick one true literal per clause; these m vertices contain no edge. Conversely, an independent set of size m has one vertex per triangle and no contradictory pair of literals, so setting those literals true extends to a satisfying assignment.
IS ∈ NP, so the reduction shows IS is NP-complete.

21 Interlude: Transforming Turing Machine computations into Boolean circuits

22 Configurations
Useful convention: Turing Machine configurations. Any point in a computation is represented by a string:
C = x1 x2 … xi q xi+1 xi+2 … xm
The start configuration for a single-tape TM on input x is: q_start x1 x2 … xn
[figure: tape cells x1 x2 … xi xi+1 … xm with the head over cell i+1 in state q]

23 Turing Machine Tableau
Tableau (configurations written in an array) for machine M on input x = x1 … xn:
[figure: the rows are successive configurations, starting with x1/q_s x2 … xn _ … and ending in a row containing the accept state]
height = time taken
width = space used

24 Turing Machine Tableau
Important observation: the contents of a cell in the tableau are determined by the 3 cells directly above it.
[figure: a cell and the three adjacent cells in the row above that determine it]

25 Turing Machine Tableau
Can build a Boolean circuit STEP:
– input: (binary encoding of) 3 cells
– output: (binary encoding of) 1 cell
Each output bit is some function of the inputs, so we can build a circuit for each.
The size of STEP is independent of the size of the tableau.
[figure: the STEP circuit mapping 3 adjacent cells to the cell below them]

26 Turing Machine Tableau
w copies of STEP compute row i from row i-1 of the width-w tableau for M on input x.
[figure: a row of STEP circuits producing row i of the tableau from row i-1]

27 June 25, 2004CBSSS27 x 1/ q s x2x2 …xnxn _ … STEP............ 1 iff cell contains q accept ignore these Circuit C built from TM M C(x) = 1 iff M accepts input x x1x1 x2x2 xnxn Size = O(width x height)

28 Back to NP-completeness

29 Circuit-SAT is NP-complete
Circuit SAT: given a Boolean circuit (gates ∧, ∨, ¬) with variables y1, y2, …, ym, is there some assignment that makes it output 1?
Theorem: Circuit SAT is NP-complete.
Proof:
– clearly in NP

30 NP-completeness
– Given L ∈ NP of the form L = { x : ∃ y such that (x, y) ∈ R }
– build the circuit produced from the TM deciding R, which outputs 1 iff (x, y) ∈ R
– hardwire the input x; leave y1, y2, …, ym as variables
[figure: circuit with inputs x1, x2, …, xn hardwired and y1, y2, …, ym left as variables]

31 3SAT is NP-complete
Idea: introduce an auxiliary variable g_i for each gate in the circuit (on inputs x1, x2, x3, x4, …, xn):
– ¬ gate g_i with input z: clauses (g_i ∨ z), (¬z ∨ ¬g_i)
– ∨ gate g_i with inputs z1, z2: clauses (¬z1 ∨ g_i), (¬z2 ∨ g_i), (¬g_i ∨ z1 ∨ z2)
– ∧ gate g_i with inputs z1, z2: clauses (¬g_i ∨ z1), (¬g_i ∨ z2), (¬z1 ∨ ¬z2 ∨ g_i)
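
A minimal executable sketch of this step (my own circuit encoding; `circuit_to_3sat` and its gate format are hypothetical, not from the lecture): one auxiliary variable per gate, the clause gadgets above, plus a unit clause forcing the output gate to 1.

```python
# Circuit-SAT -> 3SAT via one auxiliary variable g_i per gate.
def circuit_to_3sat(num_inputs, gates):
    """gates: list of ("NOT", a), ("AND", a, b) or ("OR", a, b), where a, b are
    1-based indices of inputs (1..num_inputs) or of earlier gates' variables.
    Returns a CNF as a list of clauses of signed variable indices."""
    clauses = []
    for i, gate in enumerate(gates):
        g = num_inputs + i + 1                              # auxiliary variable
        if gate[0] == "NOT":
            z = gate[1]
            clauses += [[g, z], [-g, -z]]                   # g <-> ¬z
        elif gate[0] == "OR":
            z1, z2 = gate[1], gate[2]
            clauses += [[-z1, g], [-z2, g], [-g, z1, z2]]   # g <-> z1 ∨ z2
        else:  # "AND"
            z1, z2 = gate[1], gate[2]
            clauses += [[-g, z1], [-g, z2], [-z1, -z2, g]]  # g <-> z1 ∧ z2
    clauses.append([num_inputs + len(gates)])               # output gate must be 1
    return clauses

# circuit on 2 inputs: g3 = x1 AND x2, g4 = NOT g3 (output g4)
print(circuit_to_3sat(2, [("AND", 1, 2), ("NOT", 3)]))
```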

32 P vs. NP
Most "hard" problems we care about have turned out to be NP-complete:
– L is in NP
– every problem in NP reduces to L
It is not known that P ≠ NP — arguably the most significant open problem in Computer Science and Mathematics.

33 L vs. P

34 Multitape TMs
A useful variant: the k-tape TM
– a read-only input tape
– k-1 read/write "work tapes"
– a finite control with k heads
[figure: the finite control reading the input tape and the k-1 work tapes]

35 Space complexity
SPACE(f(n)) = languages decidable by a multi-tape TM that touches at most f(n) squares of its work tapes, where n is the input length and f: N → N.
Important space class: L = SPACE(O(log n))

36 L and P
Configuration graph: nodes are configurations; there is an edge (C, C') iff C yields C' in one step.
The number of configurations of a TM that runs in space O(log n) is n^k for some constant k.
We can determine whether q_accept or q_reject is reached from the start configuration by exploring the configuration graph (e.g. using DFS); see the sketch below.
Conclude: L ⊆ P
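
A generic sketch of that argument (my own code, not tied to any particular machine): with polynomially many configurations, depth-first search over the configuration graph decides acceptance in polynomial time.

```python
# Decide acceptance by DFS over a configuration graph.
def reaches_accept(start, successors, is_accepting):
    """successors(c) yields the configurations reachable from c in one step."""
    seen, stack = {start}, [start]
    while stack:
        c = stack.pop()
        if is_accepting(c):
            return True
        for nxt in successors(c):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return False

# toy "configuration graph": states 0..9, one step adds 1 or 3 (mod 10), accept at 7
print(reaches_accept(0, lambda c: [(c + 1) % 10, (c + 3) % 10], lambda c: c == 7))  # True
```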

37 June 25, 2004CBSSS37 all languages EXP decidable NP P L  P  NP  EXP believe all containments proper L

38 Logspace
A question: given a Boolean formula with n nodes, can we evaluate it using O(log n) space?
[figure: a small formula tree of ∧ and ∨ gates over the inputs 1, 0, 1]
Depth-first traversal requires storing intermediate values (a sketch of this naive evaluation follows).
Idea: short-circuit ANDs and ORs when possible.
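
A minimal sketch of that naive depth-first evaluation (my own code): the recursion keeps one intermediate value per level, so the space used grows with the formula's depth rather than staying O(log n).

```python
# Naive depth-first evaluation of a Boolean formula.
def evaluate(node):
    """node is either a bit (0/1) or a tuple ("AND"/"OR"/"NOT", children...)."""
    if node in (0, 1):
        return bool(node)
    op, *children = node
    if op == "NOT":
        return not evaluate(children[0])
    vals = [evaluate(c) for c in children]   # intermediate values kept on the stack
    return all(vals) if op == "AND" else any(vals)

# AND( OR(1, 0), NOT(1) )  ->  False
print(evaluate(("AND", ("OR", 1, 0), ("NOT", 1))))
```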

39 Logspace
Can we evaluate an n-node Boolean circuit using O(log n) space?
[figure: a small circuit of ∧ and ∨ gates with shared subcircuits over the inputs 1, 0, 1]

40 A P-complete problem
We don't know how to prove L ≠ P.
But we can identify the problems in P least likely to be in L using P-completeness.
We need a stronger (more restricted) notion of reduction — why? A polynomial-time reduction could do all the work itself, which would make every nontrivial language in P P-complete.
[figure: yes/no regions of L1 mapped by f into the yes/no regions of L2]

41 A P-complete problem
Logspace reduction: f computable by a TM that uses O(log n) space.
Theorem: if L2 is P-complete, then L2 ∈ L implies L = P.

42 A P-complete problem
Circuit Value (CVAL): given a variable-free Boolean circuit (gates ∧, ∨, ¬, 0, 1), does it output 1?
Theorem: CVAL is P-complete.
Proof:
– already argued that it is in P
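
To make the easy direction concrete, here is a minimal sketch (my own circuit encoding, not the lecture's) of the linear-time evaluation showing CVAL is in P: evaluate the gates in topological order.

```python
# CVAL: evaluate a variable-free Boolean circuit given in topological order.
def cval(gates):
    """gates[i] is one of ("CONST", 0 or 1), ("NOT", a), ("AND", a, b),
    ("OR", a, b), where a, b index earlier gates. Output = last gate."""
    val = []
    for g in gates:
        if g[0] == "CONST":
            val.append(bool(g[1]))
        elif g[0] == "NOT":
            val.append(not val[g[1]])
        elif g[0] == "AND":
            val.append(val[g[1]] and val[g[2]])
        else:  # "OR"
            val.append(val[g[1]] or val[g[2]])
    return val[-1]

# (1 AND 0) OR NOT 0  ->  True
print(cval([("CONST", 1), ("CONST", 0), ("AND", 0, 1), ("NOT", 1), ("OR", 2, 3)]))
```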

43 A P-complete problem
– let L be an arbitrary language in P, decided by a TM M in n^k steps
– build the circuit produced from the TM deciding L (the tableau construction), which outputs 1 iff x ∈ L
– hardwire the input x
– the circuit has polynomial size, and the reduction is a logspace reduction
[figure: circuit with inputs x1, x2, x3, …, xn hardwired]

44 L vs. P
It is not known that L ≠ P.
Interesting connection to parallelizability:
– a small-space algorithm automatically gives an efficient parallel algorithm
– an efficient parallel algorithm automatically gives a small-space algorithm
P-complete problems are the least likely to be parallelizable.

45 P vs. BPP

46 Communication complexity
Two parties, Alice and Bob, and a function f: {0,1}^n × {0,1}^n → {0,1}.
Alice holds x ∈ {0,1}^n; Bob holds y ∈ {0,1}^n.
Goal: compute f(x, y) while communicating as few bits as possible between Alice and Bob.
We count the number of bits exchanged (computation is free).
At each step: one party sends bits that are a function of its held input and the bits received so far.

47 Communication complexity
A simple function (equality): EQ(x, y) = 1 iff x = y.
A simple protocol:
– Alice sends x to Bob (n bits)
– Bob sends EQ(x, y) to Alice (1 bit)
– total: n + 1 bits

48 Communication complexity
Can we do better?
– with a deterministic protocol?
– with a probabilistic protocol? At each step, one party sends bits that are a function of its held input, the bits received so far, and the result of some coin tosses; the protocol is required to output f(x, y) with high probability over all coin tosses.

49 Communication complexity
Theorem: no deterministic protocol can compute EQ(x, y) while exchanging fewer than n + 1 bits.
Proof:
– consider the "input matrix" of values f(x, y), with rows X = {0,1}^n and columns Y = {0,1}^n

50 Communication complexity
– assume without loss of generality that 1 bit is sent at a time
– Alice sends 1 bit depending only on x: this splits the rows into the inputs x causing her to send 1 and the inputs x causing her to send 0

51 Communication complexity
– Bob sends 1 bit depending only on y and the received bit: this splits the columns into the inputs y causing him to send 1 and the inputs y causing him to send 0

52 Communication complexity
– at the end of a protocol involving k bits of communication, the matrix is partitioned into at most 2^k combinatorial rectangles
– the bits sent in the protocol are the same for every input (x, y) in a given rectangle
– conclude: f(x, y) must be constant on each rectangle

53 Communication complexity
The matrix for EQ has 1s on the diagonal and 0s everywhere else.
– any partition into combinatorial rectangles on which f(x, y) is constant must have at least 2^n + 1 rectangles
– a protocol that exchanges ≤ n bits can only create 2^n rectangles, so it must exchange at least n + 1 bits

54 Communication complexity
A protocol for EQ employing randomness:
– Alice picks a random prime p in {1, …, 4n^2} and sends p and (x mod p)
– Bob sends (y mod p)
– the players output 1 if and only if (x mod p) = (y mod p)

55 Communication complexity
– O(log n) bits exchanged
– if x = y, always correct
– if x ≠ y, incorrect if and only if p divides |x – y|
– # primes in the range is ≥ 2n
– # primes dividing |x – y| is ≤ n
– probability of being incorrect is ≤ 1/2
Randomness gives an exponential advantage!
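
A runnable sketch of this fingerprinting protocol (my own code; the prime is chosen by naive trial division, which is fine at this scale):

```python
# Randomized equality protocol: compare x and y modulo a random small prime.
import random

def is_prime(m):
    return m > 1 and all(m % d for d in range(2, int(m ** 0.5) + 1))

def randomized_eq(x_bits, y_bits):
    n = len(x_bits)
    x, y = int(x_bits, 2), int(y_bits, 2)
    p = random.choice([m for m in range(2, 4 * n * n + 1) if is_prime(m)])
    # Alice -> Bob: p and (x mod p), O(log n) bits; Bob -> Alice: (y mod p)
    return x % p == y % p

x = "1011010011" * 10                     # n = 100
print(randomized_eq(x, x))                # True: equal inputs are always accepted
print(randomized_eq(x, x[:-1] + "0"))     # correct (False) with prob ≥ 1/2; here always,
                                          # since no prime divides |x - y| = 1
```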

56 BPP
Model: the probabilistic Turing Machine
– a deterministic TM with an additional read-only tape containing "coin flips"
BPP (Bounded-error Probabilistic Poly-time):
– L ∈ BPP if there is a p.p.t. TM M such that
x ∈ L ⇒ Pr_y[M(x, y) accepts] ≥ 2/3
x ∉ L ⇒ Pr_y[M(x, y) rejects] ≥ 2/3
– "p.p.t." = probabilistic polynomial time

57 June 25, 2004CBSSS57 all languages EXP decidable P L  P  BPP  EXP L BPP

58 P vs. BPP
Theorem: if E (= TIME(2^O(n))) contains a hard problem (a language L that requires exponential-size Boolean circuits), then P = BPP.
"For decision problems, randomness probably does not help."

