
1 Turing Machines for Dummies Why representations do matter Peter van Emde Boas ILLC-FNWI-Univ. of Amsterdam Bronstee.com Software & Services B.V. SOFSEM 2012 – Jan 25 2012, Špindlerův Mlýn, Czech Republic

2

3 Turing Machine Finite Program: P; Tape; Read/Write head. P ⊆ (K × Σ) × (K × Σ × {L,0,R}): (q,s,q',s',m) ∈ P denotes the instruction: when reading s in state q, print s', perform move m and proceed to state q'. Nondeterminism! K: States, Σ: tape symbols

4 Transitions Configuration c: a finite string in Σ* (K × Σ) Σ* between endmarkers, e.g. $ A B A A C C B B A $ (with the scanned cell carrying the state). Transition c → c' is obtained by performing an instruction in P; e.g., an instruction rewriting the scanned C to B yields $ A B A A C C B B A $ |-- $ A B A A C B B B A $. Computation: a sequence of transitions
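Below is a minimal sketch (not from the slides) of how a single transition on an intrinsic configuration could be coded; the encoding of configurations as Python lists with a (state, symbol) pair at the scanned cell, and the quintuple program format, are assumptions made for illustration only.

```python
# Intrinsic configuration: a list of tape symbols between '$' endmarkers,
# with the scanned cell replaced by the pair (state, symbol), e.g.
#   ['$', 'A', 'B', 'A', 'A', 'C', ('q', 'C'), 'B', 'B', 'A', '$'].
# The program is a set of quintuples (q, s, q2, s2, m) with m in {'L', '0', 'R'}.

def successors(config, program):
    """All configurations reachable in one step (there may be several:
    nondeterminism)."""
    i = next(j for j, cell in enumerate(config) if isinstance(cell, tuple))
    q, s = config[i]
    result = []
    for (p, a, p2, a2, m) in program:
        if (p, a) != (q, s):
            continue
        new = list(config)
        if m == '0':                        # stay put: rewrite the scanned cell
            new[i] = (p2, a2)
        elif m == 'R' and config[i + 1] != '$':
            new[i] = a2                     # leave the printed symbol behind
            new[i + 1] = (p2, config[i + 1])
        elif m == 'L' and config[i - 1] != '$':
            new[i] = a2
            new[i - 1] = (p2, config[i - 1])
        else:
            continue                        # running into an endmarker: ignored here
        result.append(new)
    return result
```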

5 Configurations Three ingredients are required for describing a Configuration: the Machine State: q; the contents of the tape (preferably with endmarkers): $ A B A A C B C B B A $; the position of the reading head: i. Available options. Mathematical Representation: the triple of these three ingredients. Intrinsic Representation: $ x_j x_j+1 …. q x_i …. x_l-1 x_l $ or $ x_j x_j+1 …. (q,x_i) …. x_l-1 x_l $. Semi-Intrinsic.

6

7 Theme of this presentation The Convenience of the Intrinsic Representation –Its History: who invented it, who saw its usefulness? –Applications which are hard, if not impossible, when the Mathematical representation is used: Chomsky Hierarchy and Automata models; Master reductions for NP: Cook-Levin and Tilings; Stockmeyer on Regular Expressions; Parallel Computation Thesis: the Second Machine Class –Is there a real problem?

8 HISTORY Turing Machines and their Use

9 The teachings of our Master Our textbooks present Turing Machine programs in the format of quintuples or quadruples. What format did Turing use himself? Some fragments of the 1936 paper. 'Configuration' means 'state' in our terminology. Looks like quintuples….

10 For Turing, Composite transitions are allowed

11 This is an example of the Intrinsic Representation. 'Complete Configuration' means 'Configuration' in our terminology.

12 A Macro language for Turing Machine programs 12

13 This Macro Language supports Recursion ! 13

14 The format of TM programs which today is conventional arises as a simplification introduced for the purpose of constructing the Universal Turing Machine. Turing operates as an Engineer (Programmer) rather than a Mathematician / Logician.

15 Nondeterminism Our concept of Nondeterminism (the applicable instruction is not necessarily unique) is for Turing a serious programming error. Nondeterminism became accepted in the late 1950s as a consequence of the needs of Automata Theory

16 Using the model Initial configuration on some input Final configuration –No available instruction –By final state (accept, reject) –Evaporation of the state Complete Computation –From initial to final configuration (or infinity) Result of computation –Language recognition (always halting, condition on final configuration) –Language acceptance (accept by termination) –Function evaluation (partial function/relation, requires termination) Non Terminating Computations –Stream Computing –Interactive computation 16

17 Example Turing Machine K = {q,r,_}, S = {0,1,B}, P = { (q,0,q,0,R), (q,1,q,1,R), (q,B,r,B,L), (r,0,_,1,0), (r,1,r,0,L), (r,B,_,1,0) }. Successive configurations: q0 1 0 1 1 B | 0 q1 0 1 1 B | 0 1 q0 1 1 B | 0 1 0 q1 1 B | 0 1 0 1 q1 B | 0 1 0 1 1 qB | 0 1 0 1 r1 B | 0 1 0 r1 0 B | 0 1 r0 0 0 B | 0 1 1 0 0 B. Successor Machine; increments a number in binary. _ represents the empty halting state. 11 + 1 = 12
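The following is a runnable sketch of this successor machine in Python; the encoding (a dictionary keyed by (state, scanned symbol), a list for the tape) is an assumption made here for illustration and is not part of the slides.

```python
# The successor machine from the slide.  It is deterministic, so the
# program can be a plain dictionary; '_' is the empty halting state.
P = {
    ('q', '0'): ('q', '0', 'R'),   # scan right over the digits
    ('q', '1'): ('q', '1', 'R'),
    ('q', 'B'): ('r', 'B', 'L'),   # reached the blank: turn around, start the carry
    ('r', '1'): ('r', '0', 'L'),   # 1 + carry = 0, keep carrying
    ('r', '0'): ('_', '1', '0'),   # 0 + carry = 1, halt
    ('r', 'B'): ('_', '1', '0'),   # carry ran off the digits (not exercised on this input)
}

def run(tape, state='q', head=0):
    tape = list(tape)
    while state != '_':
        print(state, head, ''.join(tape))          # one line per configuration
        state2, symbol, move = P[(state, tape[head])]
        tape[head] = symbol
        head += {'L': -1, '0': 0, 'R': 1}[move]
        state = state2
    return ''.join(tape)

print(run('01011B'))   # 01011 is 11 in binary; prints the trace and then 01100B (12)
```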

18 Variants Semi-Infinite Tape [figure: a two-way infinite tape folded into two tracks of a semi-infinite tape] Tape folding; remember which track you are on….

19 Multiple Tapes [figure: a machine with several tapes and heads]

20 Multiple Tapes [figure] Tapes become tracks on a single tape. Markers are used for maintaining the head positions on the tracks

21 Invariance Thesis Other variants –Multi dimensional tapes –Multi heads on a single tape –Jumps to other head positions All models of Turing Machines are equivalent –up to polynomial overhead in time and constant factor overhead in space First Machine Class –Includes RAM / RASP model as well 21

22 How Turing Machines are used Marvelous TM algorithms do exist in the literature –Hennie & Stearns: oblivious simulation of k tapes on two tapes –Slisenko: Real-time palindrome recognition and string matching –Vitányi: Real-time oblivious multi-counter simulation The main use of TMs is for proving negative results (using reductions) –Undecidability –NP-hardness –Other hardness results This requires direct encoding of TM-computations in target formalisms: Master Reductions

23 Time-Space Diagram q0 1 0 1 1 B | 0 q1 0 1 1 B | 0 1 q0 1 1 B | 0 1 0 q1 1 B | 0 1 0 1 q1 B | 0 1 0 1 1 qB | 0 1 0 1 r1 B | 0 1 0 r1 0 B | 0 1 r0 0 0 B | 0 1 1 0 0 B Master Reductions use this Time-Space Diagram as the representation of the computation subject to the Reduction. The Intrinsic Representation is far more useful, if not required, for constructing these Master Reductions. WHY?? Because validity of the Diagram can be checked Locally

24 Time for a new hero Larry Stockmeyer [photo: FOCS 1978, Ann Arbor © Peter van Emde Boas, 1978-10-16] Thesis MIT 1974, Rep. MAC-TR-133

25 Stockmeyer on representations This is a Mathematical Representation 25

26 Stockmeyer on representations For the Single Tape model the Intrinsic Representation is used 26

27 Stockmeyer's Lemma Validity of a transition becomes a local check on a 2 by 3 window in the Time-Space Diagram. NB: for Stockmeyer, Functions are partial and multi-valued, i.e., Relations. Is this the first time the convenience of the Intrinsic Representation is mentioned explicitly?
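A small sketch of the local check the lemma licenses is given below; the set LEGAL of allowed windows (whose construction from the TM program is the actual content of the lemma) is simply assumed to be given here.

```python
def diagram_is_locally_valid(rows, legal):
    """rows: the time-space diagram as a list of equal-length configuration
    rows (sequences of symbols, including the state markers);
    legal: a set of ((a, b, c), (d, e, f)) windows allowed by the program.
    The diagram is valid iff every 2-by-3 window of adjacent rows is legal."""
    for top, bottom in zip(rows, rows[1:]):
        assert len(top) == len(bottom)
        for j in range(len(top) - 2):
            window = (tuple(top[j:j + 3]), tuple(bottom[j:j + 3]))
            if window not in legal:
                return False            # a forbidden window: invalid transition
    return True
```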

28 Applications Automata Theory Master reductions for NP –Cook/Levin reduction to SAT –Bounded Tiling Stockmeyer on Regular Expressions Parallel Computation Thesis 28

29 Automata Theory The Machine-based characterization of the Chomsky Hierarchy –Regular grammars ↔ Finite Automata –Context-Free grammars ↔ Push-Down Automata –Context-Sensitive grammars ↔ Linear Bounded Automata –Unrestricted grammars ↔ Turing Machines

30 A Side Remark Traditional Textbooks present Automata Theory in the order: REG, CF, CSL, Type 0, resp. FA, PDA, LBA, TM. Alternative: start with Turing Machines, treating the alternative models as restricted models. Advantage: the concepts involving Configurations and Computations don't need a separate presentation for each model. The desired characterizations are obtained by correlating production steps in the grammar world and computation segments in the Machine world, observing the required restrictions on both sides

31 A trivial Observation The production process in the Grammar world can be simulated by a single-Tape Turing Machine; Turing Machines are after all perfect symbol manipulators. It remains to show that restricted grammar classes can be simulated by restricted Machine models

32 The Converse Direction Using the Intrinsic Representation, the transitions of a Turing Machine are described by Context Sensitive Rules: (q,a,p,b,R) corresponds to qaX → bpX; (q,a,p,b,0) corresponds to qa → pb; (q,a,p,b,L) corresponds to Xqa → pXb; etcetera
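A sketch of how these context-sensitive rules could be generated mechanically from a quintuple program follows; the tuple encoding of the rules is an assumption for illustration.

```python
def cs_rules(program, tape_symbols):
    """Turn every TM instruction (q, a, p, b, m) into context-sensitive
    productions over the alphabet of tape symbols plus states, one rule per
    possible neighbouring tape symbol X where a neighbour is involved."""
    rules = []
    for (q, a, p, b, m) in program:
        if m == '0':
            rules.append(((q, a), (p, b)))             # qa  -> pb
        elif m == 'R':
            for x in tape_symbols:
                rules.append(((q, a, x), (b, p, x)))   # qaX -> bpX
        elif m == 'L':
            for x in tape_symbols:
                rules.append(((x, q, a), (p, x, b)))   # Xqa -> pXb
    return rules
```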

33 The context In the grammar world a production starts with the start symbol S, and terminates in a string of terminals In the machine world a computation starts with an initial ID with the terminal string on the input tape, and ends in an accepting configuration Hence: Mutual Simulations require some adaptations…… 33

34 TM simulation of a type 0 grammar In the initial Configuration the TM writes the start symbol S in a second track of the tape. The productions in the grammar are stepwise simulated in this second track (shifting the symbols left/right of the rewritten ones over the required distance) When the production is completed the TM checks whether the two tracks contain the same string, and accepts accordingly Hence: the language generated by a type 0 grammar can be recognized by a Turing Machine 34

35 A type 0 grammar simulates a TM From S generate the Initial Configuration in two tracks (this can be done using Regular productions only). Simulate the TM computation using the CS rules in the first track, leaving the symbols in the second track invariant. If the machine accepts, erase the entire first track (this may require length-reducing rules, hence type 0….) Hence: The language accepted by a Turing Machine can be produced by a Type 0 grammar.

36 CS grammars and LBA The same proof idea works. LBA simulates CS grammar: no intermediate string exceeds the given input in length. CS grammar simulates LBA: in the previous proof erasing rules are only needed to remove extra workspace on the tape, and the LBA doesn't use extra workspace. Beware of the endmarkers: better have them printed as markers on the first and last input symbol….

37 CF grammars and PDA Snag: the PDA is a two-tape device. Solution: code configurations as a single string: the input consumed so far, a marker, then the stack contents. This yields a correspondence between leftmost derivations and PDA computations

38 [figure: syntax tree for the word xzyz] CF Rules: S → AB, A → AC, B → BA, A → C, A → x, B → y, C → z. Left Derivation (written as coded PDA configurations): *S, *AB, *ACB, x*CB, xz*B, xz*BA, xzy*A, xzy*C, xzyz*. PDA Instructions: (*,λ,S → *,AB), (*,λ,A → *,AC), (*,λ,B → *,BA), (*,λ,A → *,C), (*,x,A → *,λ), (*,y,B → *,λ), (*,z,C → *,λ). The Left derivation is equal to the time-space diagram of the PDA computation. Hence: a single state PDA can accept what the CF grammar produces

39 PDA and CFG A single state PDA can simulate a CFG; the PDA accepts by empty stack. If the PDA has several states the CFG rules must encode these states. PDA Instructions: (q,λ,S → r,AB), (r,λ,A → s,AC), (q,λ,A → s,C), (r,x,A → s,λ). CF Rules: [qSα] → [rAβ][βBα], [rAα] → [sAβ][βCα], [qAα] → [sCα], [rAs] → x; α, β range over the states. [rAs] means: in state r, with A on top of the stack, a computation starts after which in state s the symbol below A is exposed
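The rules above are instances of the classical triple construction. A hedged sketch of how they could be generated is shown below; the instruction and rule encodings are assumptions made for illustration only.

```python
from itertools import product

def triple_rules(instructions, states):
    """instructions: list of ((q, a, A), (r, gamma)) meaning: in state q,
    reading a (possibly the empty word ''), with A on top of the stack,
    go to state r and replace A by the string gamma (a tuple of symbols).
    Returns rules (head_triple, terminal, body_triples) standing for
    [q A alpha] -> a [r g1 b1][b1 g2 b2]...[b_{k-1} g_k alpha]."""
    rules = []
    for (q, a, A), (r, gamma) in instructions:
        if not gamma:                                   # pure pop: [q A r] -> a
            rules.append(((q, A, r), a, []))
            continue
        for alpha in states:
            for mids in product(states, repeat=len(gamma) - 1):
                seq = (r,) + mids + (alpha,)
                body = [(seq[i], X, seq[i + 1]) for i, X in enumerate(gamma)]
                rules.append(((q, A, alpha), a, body))
    return rules

# e.g. the slide's instruction (r, x, A -> s, pop) yields the rule [rAs] -> x:
print(triple_rules([(('r', 'x', 'A'), ('s', ()))], states=('q', 'r', 's')))
```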

40 Regular Grammars and Finite Automata The standard translation: the instruction q,a → r corresponds to the production rule q → ar. Machine configuration abbaqacbac versus partial derivation abbaq: for the computation the input already processed is irrelevant; all information resides in the state, and the unread input determines the computation. In the grammar the past symbols are produced and the future symbols are invisible. A matter of perspective: past vs. future
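A tiny sketch of this translation in code; the dictionary encoding of the automaton is an assumption for illustration, and the rules q → λ for accepting states are the usual addition that lets a derivation terminate.

```python
def right_linear_grammar(delta, accepting):
    """delta: dict mapping (state, letter) to a set of successor states;
    returns right-linear productions as (lefthand, righthand) pairs."""
    rules = []
    for (q, a), targets in delta.items():
        for r in targets:
            rules.append((q, (a, r)))      # instruction q,a -> r  becomes  q -> a r
    for q in accepting:
        rules.append((q, ()))              # accepting state q     becomes  q -> lambda
    return rules
```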

41 Master reductions for NP Cook-Levin reduction to SAT –Based on Mathematical representation –Based on Intrinsic representation What is the difference ? Tiling based reduction –Does it require an Intrinsic representation ? 41

42 Time-Space Diagram Intrinsic: q0 1 0 1 1 B | 0 q1 0 1 1 B | 0 1 q0 1 1 B | 0 1 0 q1 1 B | 0 1 0 1 q1 B | 0 1 0 1 1 qB | 0 1 0 1 r1 B | 0 1 0 r1 0 B | 0 1 r0 0 0 B | 0 1 1 0 0 B. Mathematical (tape contents, state, head position): 0 1 0 1 1 B, q, 0 | 0 1 0 1 1 B, q, 1 | 0 1 0 1 1 B, q, 2 | 0 1 0 1 1 B, q, 3 | 0 1 0 1 1 B, q, 4 | 0 1 0 1 1 B, q, 5 | 0 1 0 1 1 B, r, 4 | 0 1 0 1 0 B, r, 3 | 0 1 0 0 0 B, r, 2 | 0 1 1 0 0 B, _, 2

43 Reduction to SAT (intrinsic) The key idea is to introduce a family of propositional variables: P[i,j,a] expressing that at row i (time i), on position j (space j), the symbol a is written in the diagram. Conditions: I: at every position some symbol is written; II: at no position more than one symbol is written; III: the diagram starts with the initial configuration on the input; IV: the diagram terminates with an accepting configuration; V: the transitions follow the Turing Machine program; Va: expressed using implications (beware of Nondeterminism); Vb: expressed by exclusion of illegal transitions
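A sketch of how the first two clause families could be emitted for a DIMACS-style SAT solver; the integer encoding of the variables P[i,j,a] is an assumption made for illustration (conditions III to V add the boundary clauses and the window clauses in the same style).

```python
def var(i, j, a, T, SYMS):
    """Integer code of the propositional variable P[i, j, a]."""
    return (i * T + j) * len(SYMS) + SYMS.index(a) + 1

def clauses_I_II(T, SYMS):
    """T: number of rows and columns of the diagram; SYMS: a list holding the
    alphabet Sigma together with the (state, symbol) pairs and '$'."""
    clauses = []
    for i in range(T):
        for j in range(T):
            # I: some symbol is written at (i, j): one clause of size |SYMS|
            clauses.append([var(i, j, a, T, SYMS) for a in SYMS])
            # II: never two different symbols at (i, j): O(|SYMS|^2) binary clauses
            for x in range(len(SYMS)):
                for y in range(x + 1, len(SYMS)):
                    clauses.append([-var(i, j, SYMS[x], T, SYMS),
                                    -var(i, j, SYMS[y], T, SYMS)])
    return clauses        # sizes O(T^2 K) and O(T^2 K^2), as on the later slide
```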

44 Reduction to SAT (mathematical) The key idea is to introduce a family of propositional variables: P[i,j,a] expressing that at row i (time i), on position j (space j), the symbol a is written in the diagram; Q[i,q] expressing that at time i the machine is in state q; M[i,j] expressing that at time i the head is in position j. Extra Conditions: VI: at every time the machine is in some state; VII: at no time the machine is in more than one state; VIII: at every time the head is in some position; IX: at no time the head is in more than one position. The correctness conditions III, IV and V are rephrased, somewhat easier to understand….

45 What's the difference?? Assume that a computation of T steps is described, hence the height (but also the width) of the diagram is O(T). Assume that the number of symbols used is K, where K = O(#states · #tape symbols). Investigate the size of the required propositional formulas. Investigate whether these formulas are expressed as clauses.

46 Expressing the Conditions I (at every position some symbol is written): O(T²K). II (at no position more than one symbol is written): O(T²K²). III (the diagram starts with the initial configuration on the input): O(T). IV (the diagram terminates with an accepting configuration): O(1). V (the transitions follow the Turing Machine program): Va, expressed using implications: O(T²K²); Vb, expressed by exclusion of illegal transitions: O(T²K⁶). VI (at every time the machine is in some state): O(TK). VII (at no time the machine is in more than one state): O(TK²). VIII (at every time the head is in some position): O(T²). IX (at no time the head is in more than one position): O(T³). All conditions (except Va) are easily expressed by clauses

47 Conclusion The standard proof (e.g., Garey & Johnson) uses the Mathematical representation, yielding a cubic formula-size blow-up. However, a quadratic formula-size blow-up is achievable when using the intrinsic representation. The same overhead is obtained when taking the detour via the tiling-based reduction (next)

48 Tiling based Reduction Tile Type: a square divided into 4 coloured triangles; an infinite stock is available; no rotations or reflections allowed. Tiling: a covering of a region of the plane such that adjacent tiles have matching colours. Boundary condition: colours given along (part of) the edge of the region, or some given tile at some given position.

49 Turing Machines and Tilings Idea: tile a region and let successive colour sequences along rows correspond to successive configurations. [figure: symbol-passing tiles (s above s), state-accepting tiles, and instruction-step tiles for (q,s,q',s',0), (q,s,q',s',R), (q,s,q',s',L)] SNAG: Pairs of phantom heads appearing out of nowhere... Solution: Right and Left Moving States....
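One way the tile types could be generated from the program is sketched below. The edge conventions (configurations along the horizontal edges, the state crossing a vertical edge tagged as right- or left-moving to rule out the phantom-head pairs) are assumptions chosen for this sketch, not a verbatim rendering of the tiles on the slide.

```python
BLANK = ' '

def tiles(program, tape_symbols):
    """Tiles are (top, right, bottom, left).  The configuration at time t
    runs along the bottom edges of row t, the one at time t+1 along its top
    edges; vertical edges are blank except where the state crosses."""
    ts = [(s, BLANK, s, BLANK) for s in tape_symbols]          # symbol-passing tiles
    for (q, s, q2, s2, m) in program:
        if m == '0':
            ts.append(((q2, s2), BLANK, (q, s), BLANK))        # stay put
        elif m == 'R':
            ts.append((s2, ('R', q2), (q, s), BLANK))          # emit the state to the right
            ts += [((q2, t), BLANK, t, ('R', q2)) for t in tape_symbols]
        elif m == 'L':
            ts.append((s2, BLANK, (q, s), ('L', q2)))          # emit the state to the left
            ts += [((q2, t), ('L', q2), t, BLANK) for t in tape_symbols]
    return ts
```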

50 Example Turing Machine K = {q,r,_}, S = {0,1,B}, P = { (q,0,q,0,R), (q,1,q,1,R), (q,B,r,B,L), (r,0,_,1,0), (r,1,r,0,L), (r,B,_,1,0) }. Successive configurations: q0 1 0 1 1 B | 0 q1 0 1 1 B | 0 1 q0 1 1 B | 0 1 0 q1 1 B | 0 1 0 1 q1 B | 0 1 0 1 1 qB | 0 1 0 1 r1 B | 0 1 0 r1 0 B | 0 1 r0 0 0 B | 0 1 1 0 0 B. Successor Machine; adds 1 to a binary integer. _ denotes the empty halt state. 11 + 1 = 12

51 Reduction to Tilings [figure: the tiling corresponding to the successor computation q0 1 0 1 1 B | 0 q1 0 1 1 B | 0 1 q0 1 1 B | 0 1 0 q1 1 B | 0 1 0 1 q1 B | 0 1 0 1 1 qB | 0 1 0 1 r1 B | 0 1 0 r1 0 B | 0 1 r0 0 0 B | 0 1 1 0 0 B] © Peter van Emde Boas, 1992-10-29

52 Implementation in Hardware [photos © Peter van Emde Boas, 1995-03-10 and 1992-10-31] http://www.squaringthecircles.com/turingtiles

53 Tiling reductions [figure: a rectangular region with the initial configuration and the (by construction unique) accepting configuration along opposite edges and a blank border; the horizontal direction is space, the vertical direction is time] Program: Tile Types; Input: Boundary condition; Space: Width of the region; Time: Height of the region

54 Tiling Problems Square Tiling: tiling a given square with a boundary condition: Complete for NP. Corridor Tiling: tiling a rectangle with boundary conditions on entrance and exit (the length is undetermined): Complete for PSPACE. Origin Constrained Tiling: tiling the entire plane with a given Tile at the Origin: Complete for co-RE, hence Undecidable. Tiling: tiling the entire plane without constraints: still Complete for co-RE (Wang/Berger's Theorem); Hard to Prove!

55 Detour to SAT A reduction from Bounded Tiling to SAT requires propositional variables t[i,j,s] expressing that at position (i,j) a tile of type s is placed. Conditions: I (everywhere some tile is placed): O(T²K); II (nowhere more than one tile is placed): O(T²K²); III (boundary conditions are observed): O(TK); IV (adjacency conditions are observed): O(T²K²). T: height and width of the tiled region; K: number of tile types. All conditions are expressed as clauses
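A sketch of the adjacency clauses (family IV), again with an assumed integer encoding of the variables t[i,j,s]; the vertical direction is entirely symmetric and the remaining families follow the pattern of the earlier SAT sketch.

```python
def tvar(i, j, s, T, TILES):
    """Integer code of the variable t[i, j, s]; TILES is the list of tile
    types (top, right, bottom, left) and T the side length of the square."""
    return (i * T + j) * len(TILES) + TILES.index(s) + 1

def horizontal_adjacency_clauses(T, TILES):
    clauses = []
    for i in range(T):
        for j in range(T - 1):
            for a in TILES:
                for b in TILES:
                    if a[1] != b[3]:      # right colour of a vs. left colour of b
                        clauses.append([-tvar(i, j, a, T, TILES),
                                        -tvar(i, j + 1, b, T, TILES)])
    return clauses                        # size O(T^2 K^2), as on the slide
```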

56 Is the intrinsic representation needed? A tiling reduction is possible for the semi-intrinsic representation (state information can be transmitted through rows…). Translating numeric information into geometric information (without introducing a semi-intrinsic representation) seems hard if not impossible…

57 Stockmeyer on Regular Expressions Thesis MIT 1974 Rep. MAC-TR-133 57

58 Regular Expressions S: a finite alphabet (in our applications Σ ∪ (K × Σ) ∪ { $ }). REG(S): 0 ∈ REG(S), M(0) = the empty language; 1 ∈ REG(S), M(1) = {λ}, the singleton empty word; a ∈ REG(S) for a ∈ S, M(a) = {a}, the singleton letter-a word. If f, g ∈ REG(S) then: f + g ∈ REG(S), M(f + g) = M(f) ∪ M(g), union; f.g ∈ REG(S), M(f.g) = M(f).M(g), concatenation; f* ∈ REG(S), M(f*) = M(f)*, Kleene star, f* = 1 + f + f.f + f.f.f + …. Extra operations: f² = f.f, squaring; f ∩ g, M(f ∩ g) = M(f) ∩ M(g), intersection; ~f, M(~f) = S* \ M(f), complementation

59 Regular Expressions Describe the Regular languages over S. The transformation expression → Finite Automaton is easy (a construction where the parentheses in the expression become the states of the Finite Automaton). The converse transformation is more difficult but standard textbook material (induction over the number of states). Other interpretations exist and are useful: Regular Algebras, e.g., in programming logics (PDL); complete axiomatizations exist. There are no direct algebraic expressions for intersection and complementation; Regular languages being closed under intersection and complementation implies that these operations are expressible in all individual instances. Extra operators yield succinctness

60 Stockmeyer's Decision Problems NEC(f,S): is M(f) a proper subset of S*? EQ(f,g): is M(f) = M(g)? INEQ(f,g): is M(f) ≠ M(g)? NEC(f,S) is equivalent to INEQ(f,S*). EQ and INEQ are complementary problems. Stockmeyer (1974) characterizes the complexity of these problems, depending on the set of available operators. Considering complementary problems was meaningful: the Immerman-Szelepcsényi result was discovered only 13 years later….

61 Stockmeyer's Master Reduction Given a TM program P and some input string ω, there doesn't exist a regular expression denoting the (linearizations of) accepting time-space diagrams. But violations against representing such a diagram can be described by regular expressions. Syllabus Errorum approach: construct a Regular expression which enumerates all possible violations, and test whether there remains a string not covered by this expression (the NEC problem)

62 Syllabus Errorum A correct time-space diagram consists of configurations, all of equal length, separated by $ symbols. The first configuration must be the initial configuration on the given input. The last configuration must be the (unique) accepting configuration. The diagram may contain no illegal transitions; this condition is captured by the absence of forbidden 2 by 3 windows in the diagram, as expressed by Stockmeyer's lemma. For this method to work the Intrinsic representation seems essential.

63 Yardstick expressions Alphabet S used: Σ ∪ (K × Σ) ∪ { $ }. The width of the time-space diagram (the space consumed by the computation) is denoted M. Let V = Σ ∪ (K × Σ), W = Σ. Given an alphabet Z and a number N we construct a regular expression Ya(Z,N) representing the strings of length N over Z. Note that Ya(1+Z,N) then represents the strings of length at most N. Using these yardstick expressions the various sources of errors can be described
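A sketch of how yardstick expressions of the advertised sizes could be built; Z is assumed to be given as a string such as "(a+b+$)", and the Horner-style use of squaring is one standard way to reach the logarithmic size quoted two slides below, not necessarily Stockmeyer's own construction.

```python
def yardstick_plain(Z, N):
    """Ya(Z, N) without extra operators: simply Z written N times, size O(N)."""
    return ".".join([Z] * N)

def yardstick_squaring(Z, N):
    """Ya(Z, N) using the squaring operator ^2: process the binary expansion
    of N Horner-style, so the expression has only O(log N) operators."""
    bits = bin(N)[2:]                      # N >= 1, most significant bit first
    expr = Z                               # the leading bit is always 1
    for b in bits[1:]:
        expr = "(" + expr + ")^2"          # double the represented length ...
        if b == "1":
            expr = expr + "." + Z          # ... and append one more Z if needed
    return expr

print(yardstick_squaring("(a+b)", 5))      # (((a+b))^2)^2.(a+b): strings of length 5
```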

64 Error Descriptions There is a substring in between two $ symbols which is too short: S*.$.Ya(1+V,M-1).$.S*. There is a substring in between two $ symbols which is too long: S*.$.Ya(V,M+1).S*.$.S*. There is an incorrect transition in the diagram: S*.xyz.Ya($+V,M-2).uvw.S*, where the 2 by 3 window with top row x y z and bottom row u v w is forbidden in the diagram. Similar (even simpler) expressions exist for the properties 'starts wrong' and 'ends wrong'

65 The reduction The regular expression which is the sum of all these error types represents the exact complement of the set of time-space diagrams of accepting computations by P on input ω in space M. Denote this expression by ER(P, ω, M). The input is accepted iff NEC(ER(P, ω, M), S). It remains to estimate the length of this expression. Remember that we consider Nondeterministic space-bounded computations

66 The size of the yardstick expressions Without extra operators Ya(Z,N) is of size O(N), yielding NPSPACE-hardness for NEC. With the squaring operator ² Ya(Z,N) is of size O(log(N)), yielding NEXPSPACE-hardness for NEC

67 Expressions without * The same method also works without using *, but now the height of the diagram (time) must be restricted. This yields reductions showing NP-hardness and NEXPTIME-hardness for the INEQ problem (without or with squaring), and nonelementary hardness if complementation is added. Matching upper bounds are also obtained

68 Parallel Computation Thesis // PTIME = // NPTIME = PSPACE True for Computational Models which combine Exponential Growth potential with Uniform Behavior. The Second Machine Class 68

69 Representative examples Sequential devices operating on huge objects: Vector machine (Pratt & Stockmeyer 74, 76); MRAM (Hartmanis & Simon 74); MRAM without bit-logic (Bertoni, Mauri, Sabadini 81); EDITRAM (Stegwee, Torenvliet, VEB 85); ASMM (Tromp, VEB 90, 93); Alternating TM (RAM) (Chandra, Stockmeyer & Kozen 81). Parallel Devices: Parallel TM (Savitch 77); PRAM (Savitch & Stimson 76, 79); SIMDAG (Goldschlager 78, 82); Aggregate (Goldschlager 78, 82); Array Proc. Machine (v. Leeuwen & Wiedermann 85)

70 How to prove it Inclusion //NPTIME ⊆ PSPACE: –Guess a computation trace –Verify that it accepts by means of a recursive procedure –Validate that the parameters are polynomially bounded in size –Uniformity of behavior is essential

71 How to prove it Inclusion PSPACE ⊆ //PTIME: –Today's authors show that QBF ∈ //PTIME –Original proofs give direct simulations of PSPACE computations, based on techniques originating from the proof of Savitch's Theorem PSPACE = NPSPACE

72 Walter Savitch Amsterdam; CWI, Aug 1976 San Diego, Oct 1983 © Peter van Emde Boas Proved in 1970 PSPACE = NPSPACE 72

73 Understanding PSPACE Acceptance = Reachability in the Computation Graph. Solitaire Problem: finding an Accepting path in an Exponentially large, but highly Regular Graph. Matrix Powering Algorithm: Parallelism. Recursive Procedure: Savitch's Theorem. Logic: QBF, Alternation, Games

74 Polynomial Space Configuration Graph Configurations & Transitions: –(finite) State, Focus of Interaction & Memory Contents –Transitions are Local (involving State and Memory locations in Focus only; Focus may shift); only a Finite number of Transitions in a Configuration –Input Space doesn't count for the Space Measure

75 Polynomial Space Configuration Graph Exponential Size Configuration Graph: –input length: |x| = k; Space bound: S(k) –Number of States: q (constant) –Number of Focus Locations: k·S(k)^t (where t denotes the number of heads) –Number of Memory Contents: C^S(k) –Together: q·k·S(k)^t·C^S(k) = 2^O(S(k)) (assuming S(k) = Ω(log(k)))

76 Polynomial Space Configuration Graph Uniqueness of the Initial & Final Accepting Configuration: –Before Accepting, Erase Everything –Return the Focus to the Starting Positions –Halt in the Unique Accepting State [figure: Start and Goal nodes]

77 Path Finding in Configuration Graph 77

78 Path Finding in Configuration Graph Cycles in an accepting path are irrelevant. Trash Nodes: Unreachable or Useless

79 Unreasonable Algorithm Step 1: generate this Exponentially large structure. Step 2: Perform an Exponentially long heavy computation on this structure. Step 3: Extract a single bit of information from the result; the rest of the work is wasted. Which is just what the Parallel Models do.....

80 Unreasonable Algorithm Transitive Closure of the Adjacency Matrix by Iterated squaring ==> // Models. Recursive approaches ==> // Models, Savitch's Theorem & Hardness, QBF and Games
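The recursive approach can be sketched as follows: a Savitch-style midpoint search asking whether v is reachable from u in at most 2^k steps. The helper `neighbours` (enumerating the successor configurations) and the iterable `nodes` are assumptions of this sketch; in the real space-bounded argument the nodes are enumerated by counting configurations rather than stored.

```python
def reach(u, v, k, neighbours, nodes):
    """Is there a path from u to v of length at most 2**k?  Recursion depth k,
    one midpoint stored per level: O(S**2) space on the configuration graph
    of a space-S machine."""
    if u == v:
        return True
    if k == 0:
        return v in neighbours(u)
    return any(reach(u, w, k - 1, neighbours, nodes) and
               reach(w, v, k - 1, neighbours, nodes)
               for w in nodes)             # "guess" the midpoint w deterministically
```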

81 Adjacency Matrix M := rows 1 0 0 1 1 | 0 1 1 0 0 | 1 0 1 0 0 | 0 0 0 1 1 | 1 0 0 0 1. [figure: the corresponding 5-node graph] The Matrix describes the Presence of Edges in the Graph; 1 on the diagonal: length-zero paths

82 Adjacency Matrix In Boolean Matrix Algebra M²: paths up to length 2; M⁴: paths up to length 4. M² = 1 0 0 1 1 | 1 1 1 0 0 | 1 0 1 1 1 | 1 0 0 1 1 | 1 0 0 1 1. M⁴ = 1 0 0 1 1 | 1 1 1 1 1 | 1 0 1 1 1 | 1 0 0 1 1 | 1 0 0 1 1

83 Matrix Squaring M[i,j] := ∨_k ( M[i,k] ∧ M[k,j] ). On an N-node graph, a single squaring requires O(N³) operations. Log(N) squarings are required to compute the N-th Power of the Matrix. Remember that N = 2^O(S)
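A sketch of the iterated-squaring reachability test in code; plain Python lists stand in for the Boolean matrix (on the actual configuration graph the matrix is of course never stored explicitly, which is exactly the "unreasonable" part).

```python
def reachable(adj):
    """adj: N x N Boolean adjacency matrix.  Add the identity (length-zero
    paths) and square ceil(log2(N)) times; the result has m[i][j] == True
    iff j is reachable from i."""
    n = len(adj)
    m = [[adj[i][j] or i == j for j in range(n)] for i in range(n)]
    for _ in range(max(1, (n - 1).bit_length())):
        m = [[any(m[i][k] and m[k][j] for k in range(n)) for j in range(n)]
             for i in range(n)]
    return m

# acceptance is then the single bit m[initial][accepting]
```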

84 Think Parallel O(N³) processors can compute these squarings in time –O(log(N)) if unbounded fan-in is allowed –O(log(N)²) if fan-in is bounded. This is the basis for recognizing PSPACE in polynomial time on PRAM models. More in the Second Machine Class paper and/or the chapter in the Handbook of TCS (both publications from the 1980s)

85 How to obtain this Matrix? The row/column index is a binary number which codes a configuration. This code must be efficient in order that it is easy to recognize whether two configurations are connected by a transition. "Locality" of the transitions is key: a configuration only changes at the focus; everywhere else it remains the same.

86 How to obtain this Matrix? The intrinsic representation for Turing Machines has all the desired properties. We need some routine to extract from a binary bitstring the group of bits coding a single symbol. On a RAM model your values are numbers, not bitstrings. Extracting these symbol codes requires number-to-binary conversion, which presupposes the availability of some "multiplicative" instruction which is lacking in the standard model (but which is always granted).

87 The Problem ??! Is there a dragon out there ?? © Games Workshop 87

88 What do actual authors Use? Whenever TM computations are used in master reductions, the author will almost always choose either an intrinsic or a semi-intrinsic representation. This holds even if the formal definition for configurations is based on the mathematical representation

89 Representative authors (Author, Title; the columns Mathematical / Intrinsic / Semi-Intrinsic are marked X): Reidel, Ency of Math: X; R.I. Soare, RE sets and degrees 87: X; S.C. Kleene, Intro to Metamath 52: X; Boolos & Jeffrey, Computability & Logic 74: X; Börger, Computability 89: X; Cohen, Computability & Logic 87: X; M. Davis, Computability & Unsolv. 58: X

90 Representative authors (continued): Hopcroft & Ullman, Formal Lang & Autom 69: X, X (illustration); Hopcroft & Ullman, Formal Lang & Autom 79: X; F. Hennie, Intro to Comput 77: X (without states); Harrison, Intro Formal Languages 78: X; Sudkamp, Intro TCS 06: X (earlier eds as well); Lewis & Papadimitriou, Elts th of comp 81: X, X (for reductions); Mehlhorn, EATCS mon 2 84: X

91 Representative authors (continued): Rudich & Wigderson, Comp Compl Theory 04: X; J. Savage, Models of Computation 98: X (for k tapes), X; Balcazar, Diaz & Gabarro, Structural Complexity 88: X; H. Rogers, Th Recursive Functions 67: X; Odifreddi, Classical Rec Theory 89: X; W.J. Savitch, Abstr Mach & Grammars 82: X, X; J. Martin, Intr Lang & th of comp 97: X

92 We are Safe In practice, all authors on the basis of their intuition use the intrinsic representation. Why then does it seem that Stockmeyer is the unique author who makes the advantages explicit??

