CIS 262 Automata, Computability, and Complexity Spring 2019 http://www.seas.upenn.edu/~cse262/ Instructor: Aaron Roth aaroth@cis.upenn.edu Lecture: March 13, 2019

Logistics PS3 is due (already). PS4 is out. Midterm: Monday (March 18)! Just you and your pencil.

Decidable Languages A language L is called “decidable” if there exists a halting Turing machine M such that L(M) = L. Examples: PALINDROMES = { w | w is a palindrome } is decidable, as are PRIMES, all regular languages, { w | count(w, a) = count(w, b) }, … In general, all problems that can be solved by correct, terminating programs are decidable.
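
As a minimal Python sketch of the last point (the function name is illustrative): a correct, always-terminating program deciding { w | count(w, a) = count(w, b) }, which is exactly what makes that language decidable.

def decide_equal_ab(w):
    # Halts on every input and returns True exactly for strings with
    # equally many a's and b's, so the language is decidable.
    return w.count("a") == w.count("b")

assert decide_equal_ab("abba") and not decide_equal_ab("abb")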

Beyond Decidable Languages Consider a TM M with L(M) = { w | M halts on w in its accepting state }. A language L is called “recognizable” or, equivalently, “recursively enumerable” (RE), if there exists a TM M such that L(M) = L. This is a weaker notion than decidability: L is decidable implies that L is recognizable, but not conversely.

Problem Classification [Figure: nested classes of languages: REGULAR (e.g., a*b) inside DECIDABLE (e.g., PRIMES) inside RECOGNIZABLE inside ALL LANGUAGES, with HILBERT10 also shown]

Turing Machines: General Purpose Model? Church-Turing thesis: If a program on a modern computer can solve a problem, so can a TM! If we were to define a new type of machine, a problem solvable by this new type would still be solvable by a TM! How do we convince ourselves of this hypothesis? [Photo: Alonzo Church (1903-95), professor of logic, Princeton]

Standard TM Model [Figure: one-way infinite tape containing a b a a b a a # a b _ _, with the head in state q3] Single one-way infinite tape, deterministic machine. Agenda: let’s generalize the model (e.g., multiple tapes) and show that the generalized model does not increase computing power; the notions of “recognizable languages” and “decidable languages” are unchanged.

Two-tape Turing Machines [Figure: tape 1 contains a b a a b a a # a b _ _, tape 2 contains b b a a b _ _, machine in state q3] The machine has two tapes, both potentially unbounded on the right, and two read/write heads, each of which can move independently. A single transition depends on the current state and the contents of the two tape cells under the two heads; it updates the state, updates the contents of both cells, and moves each head either left or right.

Palindrome Checking by a Two-tape TM [Figure: tape 1 and tape 2 both contain a b b a b b a, machine in state q3] Initially, the input string is on tape 1, and tape 2 is all blanks. To check whether the input string is a palindrome: 1. Copy the input to tape 2. 2. Position Head1 at the left end and Head2 at the right end. 3. While the symbols under the two heads match, move Head1 to the right and Head2 to the left; reject on a mismatch, accept once the whole string has been scanned.
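
A high-level Python rendering of these three steps (a sketch with illustrative names, not an actual transition table; two Python lists stand in for the two tapes):

def two_tape_palindrome(w, blank="_"):
    tape1 = list(w) + [blank]
    tape2 = list(w) + [blank]      # step 1: copy the input to tape 2
    h1, h2 = 0, len(w) - 1         # step 2: Head1 at the left end, Head2 at the right end
    while h2 >= 0:                 # step 3: compare and move the heads toward each other
        if tape1[h1] != tape2[h2]:
            return False           # mismatch: reject
        h1, h2 = h1 + 1, h2 - 1
    return True                    # every comparison matched: accept

assert two_tape_palindrome("abbabba") and not two_tape_palindrome("abb")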

Two-tape TM Definition A 2-tape TM M has states Q, initial state q0, accepting state qa, rejecting state qr, tape alphabet Γ, input alphabet Σ, blank symbol _, and transition function δ : Q × Γ × Γ → Q × Γ × {L, R} × Γ × {L, R}. Initially Tape1 contains the input string w, Tape2 contains ε, and both heads are at the left-most ends. If the state equals qa, M halts and accepts w; rejection is defined analogously. As in the case of the standard TM, an infinite execution means non-termination. L(M) = { w | M accepts w }
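
A single transition with this signature can be spelled out concretely. The following Python sketch uses illustrative conventions (δ stored as a dictionary, each tape as a dictionary from cell index to symbol, and the assumption that a left move at the left-most cell leaves the head in place, which the definition above does not spell out):

# delta : Q x Γ x Γ -> Q x Γ x {L,R} x Γ x {L,R}, stored as a Python dict
def step_2tape(delta, config, blank="_"):
    # config = (state, tape1, head1, tape2, head2); absent cells read as the blank
    q, t1, h1, t2, h2 = config
    q_next, a_new, move1, b_new, move2 = delta[(q, t1.get(h1, blank), t2.get(h2, blank))]
    t1[h1], t2[h2] = a_new, b_new                    # update both current cells
    h1 = max(0, h1 + (1 if move1 == "R" else -1))    # move each head independently
    h2 = max(0, h2 + (1 if move2 == "R" else -1))
    return (q_next, t1, h1, t2, h2)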

Two-tape TM => Standard TM Goal: Describe a systematic way to construct, given a 2-tape TM M, a standard TM M’ such that on every input w, M accepts, rejects, or goes into an infinite loop exactly when M’ accepts, rejects, or goes into an infinite loop, respectively. If we succeed in this goal: M is halting if and only if M’ is halting, and L(M) = L(M’). Corollary: The notions of decidable languages and recognizable languages can be defined using 2-tape TMs (instead of standard TMs).

Two-tape TM M => Standard TM M’ [Figure: M’s two tapes, a b a a b a a and b b a a b, in state q3, together with the single-tape encoding # a b a x a b a a # b x b a a b _ _ and state (q3, a, b, s)] Key step in the transformation: how should we map the 2 tapes of M to the single tape of M’?

Two-tape TM M => Standard TM M’ [Encoded tape: # a b a x a b a a # b x b a a b _ _, state (q3, a, b, s)] State of M’ = state of M + contents of the two “current” cells + extra info. Tape of M’ = # contents of Tape 1 of M # contents of Tape 2 of M. Each of the two blocks has a single cell with x to mark the head position. Note: the symbols # and x are new (not part of the tape alphabet of M). Initially: change the input w to # x w # x
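
The encoding itself is easy to state as a function. This Python sketch (illustrative names) produces exactly the single-tape layout above, with # separating the two blocks and x placed just before the scanned cell of each block:

def encode_two_tapes(tape1, h1, tape2, h2):
    # '#' and 'x' are new symbols, not in M's tape alphabet
    def block(tape, head):
        return tape[:head] + ["x"] + tape[head:]
    return ["#"] + block(tape1, h1) + ["#"] + block(tape2, h2)

# Tape 1 = a b a a b a a (head on cell 3), Tape 2 = b b a a b (head on cell 1)
# encodes to:  # a b a x a b a a # b x b a a b
print(" ".join(encode_two_tapes(list("abaabaa"), 3, list("bbaab"), 1)))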

Simulating One Transition of M Current encoded tape: # a b a x a b a a # b x b a a b _ _, state (q3, a, b, s). Suppose δ(q3, a, b) = (q5, a’, R, b’, L); then M’ should take a sequence of steps leading to # a b a a’ x b a a # x b b’ a a b _ _ with state (q5, b, b, s).

Simulating One Transition of M Current encoded tape: # a b a x a b a a # b x b a a b _ _, state (q3, a, b, s). M’ scans its tape left to right looking for the two x’s. Using the transition of M, it overwrites the tape cells next to the x’s. It shifts each x one cell to the left or right, depending on how the corresponding head moves. It records the symbols next to the revised positions of the x’s in its state and moves all the way back to the left end.

Simulating One Transition of M Current encoded tape: # a b a a b a x a # b x b a a b _ _, state (q3, a, b, s), with δ(q3, a, b) = (q5, a’, R, b’, L). Interesting corner case: the Tape 1 head is at the right-most cell of its block and wants to move right. M’ needs to add a cell to the Tape 1 content and shift the entire Tape 2 block to the right, yielding # a b a a b a a’ x _ # x b b’ a a b _ _ with state (q5, _, b, s).

Two-tape TM M => Standard TM M’ To simulate a single transition of M, M’ needs to scan its entire tape, and may even have to shift it. The second component of its state can be used to remember which part of this simulation sequence it is executing. The low-level details may be tedious, but the high-level idea is clear. M’ halts with an accept/reject decision based on the state of M. If the execution of M on an input w is infinite, then so is the corresponding execution of M’. Goal achieved: M is halting iff M’ is halting, and L(M) = L(M’).

Multi-tape Turing Machines What if we allow K tapes, where K is some constant, say, 5? There are K read/write heads, each of which can move independently, and a transition depends on the current state and the vector of K symbols read. Different tapes can act like different variables in a program. Claim: this does not change the computation power! Given a K-tape TM M, there is a way to systematically construct an equivalent standard TM M’ (i.e., on each input w, M and M’ behave the same way).

Two-way Infinite Tape [Figure: tape _ _ _ a b a a b a a a b _ _, head in state q0 at the first input symbol] Suppose we change the definition of the TM so that the tape extends on both sides: there is no left-most end, and initially the head points to the first symbol of the input. All cells to the left contain blanks, potentially unboundedly many, and M can use all these cells to do its computation. Claim: this does not change the computation power! Given a TM M with a two-way infinite tape, there is a way to systematically construct an equivalent standard TM M’.
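
One standard way to carry out this construction (a sketch; the details are left out above) is to fold the two-way tape onto a one-way tape by interleaving the non-negative and negative positions; M’ then tracks in its state which of the two interleaved tracks the head of M is on.

def fold(i):
    # Map position i of the two-way tape to a cell of a one-way tape:
    # 0, 1, 2, 3, ... go to the even cells; -1, -2, -3, ... go to the odd cells.
    return 2 * i if i >= 0 else -2 * i - 1

# positions ... -3 -2 -1  0  1  2  3 ...  land on cells ... 5  3  1  0  2  4  6 ...
assert [fold(i) for i in (-3, -2, -1, 0, 1, 2, 3)] == [5, 3, 1, 0, 2, 4, 6]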

Nondeterministic Turing Machines (NTM) [Figure: tape a b a a b a a a b _ _ and a state diagram with states q, q1, q2, q3 and two transitions out of q on symbol a: a/x,R and a/b,L] Nondeterministic choice: based on the current state and current symbol, there are multiple possible transitions. Change in the syntactic definition: the transition function becomes δ : Q × Γ → 2^(Q × Γ × {L, R}). To simplify the presentation, let’s assume δ(q, γ) always contains exactly 2 choices; it can then be specified by δ0 : Q × Γ → Q × Γ × {L, R} and δ1 : Q × Γ → Q × Γ × {L, R}.

NTM Execution Tree On input w, the initial configuration is (ε, q0, w). At every step there are two possible successors, depending on the choice of δ0 or δ1. Possible execution: a path through this tree. Accepting execution: the state equals qa. Rejecting execution: the state equals qr. Nonterminating execution: an infinite path. Input w is accepted if some execution is accepting. Input w is rejected if all executions are finite and rejecting.

What Can NTMs Compute? Nondeterminism seems “powerful”: the machine can make a choice regarding what to do, and acceptance means one of these choices succeeds. Why is this relevant to real-world computing problems? More on this later; it is central to the class NP of optimization problems. For now: nondeterminism does not change which problems are solvable. Goal: given an NTM M, construct a TM M’ such that on every input w, M and M’ agree on whether to accept/reject.

Simulating an NTM by a Deterministic TM Key idea of the simulation: try all execution paths in a breadth-first manner, that is, without committing to one particular branch. [Figure: binary execution tree rooted at the initial configuration C0, with edges labeled 0 and 1] A (finite) execution of NTM M corresponds to a sequence of 0’s and 1’s. E.g., 0 1 1 0 means take 4 steps, using δ0 at the first step, δ1 at the second step, …

NTM M => 3-Tape DTM M’ [Figure: tape 1: a b a a b a a _ _, tape 2: a b a a b a a _ _, tape 3: 0 1 1 0 _ _] Tape 1 contains the original input w (never updated). Tape 3 contains a binary counter encoding the execution path (initially 0). Tape 2 is used as a work tape to simulate M (initially all blanks).

High-level Sketch of 3-Tape DTM M’ State of M’ = (State of M, Additional Info). Initially: Tape1 has w, Tape2 has ε, and Tape3 has 0.
Repeat {
  Copy contents of Tape1 to Tape2;
  Set State of M to q0 and Head2 to the left-most end of Tape2;
  While Head3 reads 0 or 1 {
    Based on the state of M and what Head2 reads, update the state of M, the current cell of Tape2, and move Head2,
      using δ0 if Head3 reads 0 and δ1 if Head3 reads 1;
    Move Head3 one step to the right;
  };
  /* Decide when to stop and accept or reject (see next slide) */
  Add 1 to the Tape3 content, viewing it as a binary number;
  Restore Tape2 to contain only blanks
}

When to Accept/Reject? When the state of M is accepting, M’ should accept. When should it reject? When all execution paths of the same length result in the rejecting state. Maintain a bit b, initially 1, in the state of M’:
If State of M = qa then stop and accept
else if State of M != qr then b = 0
else /* current execution is rejecting */
  if Tape3 contains only 1’s /* right-most path */
  then if b = 1 then stop and reject
       else b = 1 /* reinitialize for next level */
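
Putting the last two slides together, here is a compressed Python rendering of the simulation (a sketch under illustrative conventions: δ0 and δ1 are total functions stored as dictionaries, a left move at cell 0 leaves the head in place, and instead of a binary counter on Tape3 it simply enumerates all choice strings of each length). It accepts if some execution accepts, rejects once every execution of some length has rejected, and otherwise runs forever, just as M’ should.

from itertools import product

def simulate_ntm(delta0, delta1, q0, q_acc, q_rej, w, blank="_"):
    def run(choices):
        # Simulate M along one choice string and report how that path ended.
        tape, q, head = dict(enumerate(w)), q0, 0
        for c in choices:
            if q == q_acc:
                return "accept"
            if q == q_rej:
                return "reject"
            delta = delta0 if c == 0 else delta1
            q, sym, move = delta[(q, tape.get(head, blank))]
            tape[head] = sym
            head = max(0, head + (1 if move == "R" else -1))
        return "accept" if q == q_acc else "reject" if q == q_rej else "alive"

    depth = 0
    while True:                       # breadth-first over the execution tree
        outcomes = [run(cs) for cs in product((0, 1), repeat=depth)]
        if "accept" in outcomes:
            return True               # some execution is accepting
        if "alive" not in outcomes:
            return False              # all executions of this length have rejected
        depth += 1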

NTM Summary Given an NTM M, there exists an equivalent standard DTM M’. Nondeterminism does not change the notions of decidability/recognizability. If NTM M has an accepting execution of n steps on an input w, how long will DTM M’ execute before accepting w? It needs to explore all paths in the execution tree up to depth n, so the time is exponential in n. Thus, simulating an NTM by a DTM causes an “exponential” slow-down. Million-Dollar Question: Is such a slow-down unavoidable?

Church-Turing Thesis Turing machines can compute whatever any computer can! This is a scientific hypothesis, not a mathematical theorem. It holds not only for multi-tape machines and nondeterministic machines, as we have proved, but also for a rich variety of other models, including parallel computers, quantum computers, and so on. Henceforth, we will assume that TMs are equivalent to computer programs or high-level algorithms.

Closure Properties Are decidable languages closed under union? Yes means: if two languages are decidable, then their union is guaranteed to be decidable. Consider two decidable languages L1 and L2. There exist halting TMs M1 and M2 such that L(M1) = L1 and L(M2) = L2. Goal: show that there exists a halting TM M such that M accepts an input w exactly when M1 or M2 does.

Decidable Languages and Union Decidable languages are closed under union. Consider two decidable languages L1 and L2. There exist halting TMs M1 and M2 such that L(M1) = L1 and L(M2) = L2. Consider the following TM M: given an input w, execute M1 on w; if M1 accepts then accept, else { execute M2 on w; if M2 accepts, accept, else reject }. M is a halting TM, and L(M) = the union of L1 and L2.
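
Rendered as a program, the construction is just sequential composition. In this sketch, decide_L1 and decide_L2 are stand-ins for the always-halting programs corresponding to M1 and M2:

def union_decider(decide_L1, decide_L2):
    def decide(w):
        if decide_L1(w):        # execute M1 on w; if it accepts, accept
            return True
        return decide_L2(w)     # else execute M2 on w and return its verdict
    return decide

# Since decide_L1 and decide_L2 always halt, decide always halts,
# and it accepts w exactly when w is in L1 or in L2.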

Closure Properties of Recognizable Languages What about intersection, complement, concatenation, Kleene*? See HW.

Closure Properties of Recognizable Languages A language L is recognizable if there exists a TM M such that L(M) = L. Note: it is possible that M loops on some inputs. Are recognizable languages closed under union? Yes means: if two languages are recognizable, then their union is guaranteed to be recognizable. Consider two recognizable languages L1 and L2. There exist TMs M1 and M2 such that L(M1) = L1 and L(M2) = L2. Goal: show that there exists a TM M that accepts exactly those inputs w that at least one of M1 and M2 accepts.

Recognizable Languages and Union Consider two recognizable languages L1 and L2. There exist TMs M1 and M2 such that L(M1) = L1 and L(M2) = L2. Consider the following TM M: given an input w, execute M1 on w; if M1 stops and accepts then accept, else { execute M2 on w; if M2 stops and accepts, accept }. Is it the case that L(M) = the union of L1 and L2? No: the execution of M1 on w may not terminate, so when w is not in L1 but is in L2, M may never get to run M2 and thus may not accept w as required.

Fixing the construction for Union M should stop and accept if either one of M1 and M2 does. Key idea: we need to run both machines without committing to one. Basically, we want to run M1 and M2 in parallel, but our computational model does not explicitly support parallelism, so we simulate it by alternating steps of the two on two tapes. The input w is copied onto both tapes initially. Repeat { execute one transition of M1 on Tape1; if M1 stops and accepts, stop and accept; execute one transition of M2 on Tape2; if M2 stops and accepts, stop and accept }. M accepts w if and only if either M1 accepts w or M2 accepts w.
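
A Python sketch of this alternation, under the illustrative convention that each recognizer is modeled as a generator that yields after every simulated transition and finishes with True or False only if its machine halts (a machine that loops forever simply never finishes):

def union_recognizer(run_M1, run_M2):
    def recognize(w):
        machines = [run_M1(w), run_M2(w)]
        while machines:
            for m in list(machines):
                try:
                    next(m)                  # execute one transition of this machine
                except StopIteration as halt:
                    if halt.value:           # the machine halted and accepted
                        return True
                    machines.remove(m)       # the machine halted and rejected
        return False                         # both halted without accepting
    return recognize

# recognize(w) accepts exactly when M1 or M2 accepts w; if neither accepts,
# it rejects when both halt and otherwise runs forever, which recognizability allows.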