

Turing Machines, Part 2 (January 2003)

2 TM Recap
We have seen how an abstract TM can be built to implement any computable algorithm. A TM has the components M = (Q, T, g, q0, F). So we have a machine (or computer) that can do some work. Unfortunately, the abstract TM is so far removed from modern computers that comparisons are difficult: if we have an algorithm and can show that it works on a TM, how do we then implement it on a conventional computer? The TM can be related to conventional computers via a simple Random-Access Machine (RAM).

3 Random-Access Machines (RAM)
A RAM has a (finite or infinite) number of memory words, numbered 1, 2, ..., containing integer values v1, v2, ..., and a CPU containing a finite number of registers R1, R2, ..., Rn and a program counter PC. The program is stored in memory. An op-code specifies a standard operation (LOAD, STORE, ADD, JUMP, etc.), with operands for addresses or data as required.
[Figure: memory words 1, 2, ... holding v1, v2, ...; CPU with registers R1, ..., Rn and PC]
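The RAM just described can be sketched in a few lines. The instruction set below (LOAD/STORE/ADD/JUMP/HALT) and its encoding are illustrative assumptions, not taken from the slides, and for simplicity the program is held in a separate list rather than in the memory words themselves:

```python
# Minimal RAM sketch: a word memory, a few registers, and a program counter.
# The op-codes and their encoding are assumptions for illustration only.

def run_ram(program, memory, n_registers=2):
    """Execute a list of (op, *operands) instructions on a word memory."""
    reg = [0] * n_registers      # registers R1..Rn (0-indexed here)
    pc = 0                       # program counter
    while True:
        op, *args = program[pc]
        pc += 1
        if op == "LOAD":         # Rj := memory[addr]
            j, addr = args
            reg[j] = memory[addr]
        elif op == "STORE":      # memory[addr] := Rj
            j, addr = args
            memory[addr] = reg[j]
        elif op == "ADD":        # Rj := Rj + Rk
            j, k = args
            reg[j] += reg[k]
        elif op == "JUMP":       # unconditional jump
            (pc,) = args
        elif op == "HALT":
            return memory

# Add memory[0] and memory[1], storing the result in memory[2].
prog = [("LOAD", 0, 0), ("LOAD", 1, 1), ("ADD", 0, 1),
        ("STORE", 0, 2), ("HALT",)]
print(run_ram(prog, [3, 4, 0]))   # [3, 4, 7]
```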

4 RAM-equivalent Turing Machine
A Turing Machine equivalent to the RAM has n + 3 tapes (where n is the number of registers): one for each register, one for the PC, one for a data address register, and one for the memory.
[Figure: n + 3 tapes, one per CPU register R1, ..., Rn, plus PC, data address register, and memory]

5 RAM-equivalent Turing Machine
The memory tape consists of blocks of the form $i:vi, where $ and : are separators. E.g. the tape $1:v1$2:v2... records that address 1 contains v1, address 2 contains v2, and so on.

6 RAM-equivalent Turing Machine
The TM is provided with standard routines for searching and for all the internal operations of the RAM (this is long but straightforward in principle). The general process: if the PC tape holds the number i, the memory tape is searched for $i:. The op-code at that point specifies the routine to execute. A subsequent address (if needed) can also be copied from the memory tape into the data address register tape to control the search for the data. The PC tape is then incremented and the computation continues. This RAM-equivalent Turing Machine shows how a Turing Machine can (in principle) be made to carry out the same computations as a typical stored-program digital computer.
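The memory-tape lookup described above can be sketched directly: encode the memory as one string of $i:v blocks, then scan left to right for the marker $i:. This is only an illustration of the search; a real TM would perform it by head moves and state transitions rather than string operations:

```python
# Sketch of the memory-tape layout and the "$i:" search routine.
# Illustrative only: string scanning stands in for the TM's head moves.

def write_tape(values):
    """Encode memory words v1, v2, ... as $1:v1$2:v2..."""
    return "".join(f"${i}:{v}" for i, v in enumerate(values, start=1))

def read_block(tape, address):
    """Scan the tape for "$address:" and return the value stored there."""
    marker = f"${address}:"
    pos = tape.find(marker)            # the TM does this by moving its head
    if pos < 0:
        raise KeyError(address)
    end = tape.find("$", pos + len(marker))
    return tape[pos + len(marker): end if end >= 0 else len(tape)]

tape = write_tape([42, 7, 99])         # "$1:42$2:7$3:99"
print(read_block(tape, 2))             # "7"
```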

7 Church-Turing Thesis (Part 3)
No-one has been able either to extend the power of a simple TM, or to find a computational process that could reasonably be called an effective procedure which cannot be carried out by a Turing Machine. This supports the Church-Turing Thesis. Not to be confused with the Turing Complexity Thesis, which states that anything that can be computed at all can be computed by a TM with at most a polynomial slowdown. Writing T_fastest(A) for the fastest time to compute A, and T_TM(A) for the time to compute A on a TM, this says that T_TM(A) <= p(T_fastest(A)) for some polynomial p. It follows that anything that a TM cannot compute in polynomial time cannot be computed at all in polynomial time.

8 The Complexity Class P
Definition: The length of computation of a TM is the number of moves it makes before it halts. For a TM M, let Time_M(n) = max{m : there exists x in T^n such that the length of computation of M on input x is m}, that is, the maximum number of moves (worst case) over all inputs of length n symbols. If M halts for every input then Time_M(n) is finite. If Time_M(n) is O(n^k) for some k >= 0, then M is a polynomial-time algorithm.
Definition: The complexity class P is the set of problems for which a polynomial-time TM can (in principle) be constructed.

9 Non-deterministic Turing Machine
A non-deterministic Turing Machine (NDTM) M = (Q, T, g, q0, F) has a choice of moves, i.e. g(q, a) may be multiple-valued. As previously stated (when considering Turing Machine extensions), this adds nothing fundamentally to computability: the output of an NDTM TM1, for any input, can be exactly reproduced by a deterministic machine TM2, which methodically tries all permutations of TM1's choices. The number of choices tends to grow exponentially with the length of the input, so TM2 takes exponential time. TM1 is assumed to know which choice to make each time (by intuition?), and may run in polynomial time.
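The TM2-style simulation above can be illustrated with a toy model (an assumption for illustration, not the slides' machine): at each step the "machine" nondeterministically appends a 0 or a 1, and accepts if the guessed bits sum to a target. A deterministic simulator must try every branch, and the branch count grows as 2^n:

```python
# Deterministic simulation of nondeterministic choice by exhaustive search.
# Toy model: each step guesses a bit; accept if the guesses sum to target.

from itertools import product

def ndtm_accepts(n_choices, target):
    """Try every sequence of binary choices; accept if any branch does."""
    branches = 0
    for bits in product((0, 1), repeat=n_choices):   # 2**n_choices branches
        branches += 1
        if sum(bits) == target:
            return True, branches
    return False, branches

accepted, tried = ndtm_accepts(10, 10)   # only the all-ones branch works
print(accepted, tried)                   # True 1024
```

A lucky NDTM would take 10 steps; the deterministic simulation examines all 1024 branches before finding the accepting one.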

10 The Complexity Class NP
Definition: The complexity class NP is the set of problems for which a polynomial-time NDTM can (in principle) be constructed. Clearly P ⊆ NP. The outstanding problem in theoretical computer science is whether P = NP. It is generally believed that P ≠ NP, i.e. that there are problems which are solvable in principle (by a polynomial-time NDTM, or an exponential-time deterministic TM) but for which no polynomial-time deterministic TM can exist, and hence no polynomial-time algorithm on a real machine. These problems are intractable.

11 The Complexity Class NP-complete
Among the problems in NP \ P there is a class of hardest ones, called NP-complete. If a polynomial-time algorithm is found for any NP-complete problem, it will follow that P = NP. This is unlikely. Quantum computing: some research into the abstract idea of a quantum computer has suggested that NP-complete problems might be solvable in polynomial time on such a machine. However, the physical realisation of such a computer has not been established!

12 NP, P, NP-complete
[Diagram: P inside NP, with the NP-complete problems forming a region of NP outside P]
Hundreds of NP-complete problems are known. The Travelling Salesman Problem (TSP) is NP-complete. Another simple one to state is the following: given a finite set A consisting of n integers, and a number m, is there a subset of A which totals m?
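The subset question just stated (subset sum) shows the shape of an NP problem: checking a proposed subset is easy (polynomial time), but the obvious way to find one tries all 2^n subsets. A minimal sketch:

```python
# Subset sum: is there a subset of A totalling m? Verifying a candidate
# subset is polynomial-time; the exhaustive search below is exponential.

from itertools import combinations

def verify(subset, m):
    """Polynomial-time certificate check."""
    return sum(subset) == m

def subset_sum(A, m):
    """Exhaustive search over all 2**len(A) subsets."""
    for r in range(len(A) + 1):
        for subset in combinations(A, r):
            if verify(subset, m):
                return subset
    return None

print(subset_sum([3, 34, 4, 12, 5, 2], 9))   # (4, 5)
```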

13 Parallel Computation
It is often supposed that parallel computing is the solution to these problems of intractability. Alas not! Imagine a powerful parallel computer with a large array of processors, each of which is a serial computer with a fixed finite amount of memory (such as a Transputer). Choices in an NP problem could be assigned to different processors and computed in parallel. In practice, however, an exponentially-growing number of processors is just as impossible as an exponentially-growing period of time.

14 Parallel Computation
For example, suppose a TSP takes time T_S(n) = k(n-1)!/2 (for some value of k) for n cities on a serial computer, and suppose T_S(n1) = 10^5 seconds for n1 = 12, which is not too unrealistic. For the same computation time on a parallel machine in which 10^6 processors can be used efficiently, we can have a larger number n2 of cities, with T_P = 10^5 = T_S(n2)/10^6, i.e. T_S(n2) = 10^11, so a serial machine would take 10^11 seconds (about 3000 years) on this task. But T_S(n2) = T_S(n1)·(n2-1)!/(n1-1)! = 10^5·(n2-1)!/11!, from which n2 = 17 cities. Even with a million processors used to the full, the possible number of cities is still small (17).
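The arithmetic above is easy to check: scale T_S(12) = 10^5 s by the factorial ratio and find the largest n whose serial time fits the 10^11-second budget that a million processors buy:

```python
# Checking the 17-cities figure: T_S(n) scales as (n-1)!, with
# T_S(12) = 1e5 s, against the 1e5 s * 1e6 processors = 1e11 s budget.

from math import factorial

def serial_time(n, t12=1e5, n1=12):
    """Scale T_S(12) by the factorial ratio (n-1)!/(n1-1)!."""
    return t12 * factorial(n - 1) / factorial(n1 - 1)

budget = 1e5 * 1e6                    # 1e11 s of serial work
n = 12
while serial_time(n + 1) <= budget:
    n += 1
print(n)                              # 17
```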

15 Parallel Computation
Moreover, there is a fundamental limit to the utilisation of processors. Chip fabrication (and ultimately atomic physics) implies that there is a minimum volume V which a processor must occupy. As time passes, more and more processors can enter the computation, but information can only travel at the speed of light c, so after time T_P the information is within a sphere of radius c·T_P, with volume of order c^3·T_P^3, in which there are at most N = c^3·T_P^3/V processors. For a problem which takes time T_S on a serial computer, on the parallel computer T_P >= T_S/N = T_S·V/(c^3·T_P^3), hence T_P^4 >= (V/c^3)·T_S. It follows that T_P and T_S are polynomially related, so an algorithm which takes polynomial (or exponential) time on a serial machine also takes polynomial (or exponential) time on a parallel machine. The definitions of P, NP, and NP-complete are unaltered.
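The bound T_P^4 >= (V/c^3)·T_S can be made concrete numerically: taking a fourth root turns 2^n into 2^(n/4), which is still exponential. The constant V/c^3 below is arbitrary, chosen only for illustration:

```python
# Illustrating T_P**4 >= (V/c**3) * T_S: the light-speed lower bound on
# parallel time grows like the fourth root of serial time, so an
# exponential T_S still forces an exponential T_P.

def parallel_lower_bound(t_serial, v_over_c3=1e-30):
    """Best-case parallel time allowed by the light-speed argument."""
    return (v_over_c3 * t_serial) ** 0.25

for n in (40, 80, 120):
    ts = 2.0 ** n                     # exponential serial time
    print(n, parallel_lower_bound(ts))
```

Each time n grows by 40 (serial time up by a factor of 2^40), the lower bound on parallel time grows by 2^10 = 1024: slower growth, but still exponential in n.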

16 Summary
We can relate abstract Turing Machines to modern computers through the RAM. From a RAM we can generate an equivalent Turing Machine where the number of tapes is n + 3 (n = number of CPU registers). From the Turing Complexity Thesis we find that anything a TM cannot compute in polynomial time cannot be computed at all in polynomial time. The complexity class P is the set of problems for which a polynomial-time TM can (in principle) be constructed; NP is the set of problems for which a polynomial-time NDTM can (in principle) be constructed. NP-complete problems cannot be solved in polynomial time (assuming P ≠ NP). Parallelism is not the answer!


