INHERENT LIMITATIONS OF COMPUTER PROGRAMS CSci 4011

WHEN A COMPUTER IS USEFUL…

ENCODING CHESS An 8×8 chess board has 64 squares and 32 pieces. Each square can hold one of 13 values: 4 bits per square, plus 1 bit to encode whose turn it is. CHESS = { 〈B〉 | B is a chess board and white can force a win }.
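A quick size check for this encoding (plain arithmetic from the numbers above): 64 squares × 4 bits + 1 turn bit = 257 bits, so each board B is described by a string 〈B〉 of 257 bits.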

CHESS IS DECIDABLE

can_white_force_win(board, b_list):
  if board ∈ b_list: return NO        # black can force stalemate
  if checkmate(black, board): return YES
  if checkmate(white, board): return NO
  if turn(board) == white: win = NO
  else: win = YES
  for m ∈ legal_moves(board):
    win_move = can_white_force_win(move(board, m), b_list + [board])
    if turn(board) == white: win = win or win_move
    else: win = win and win_move
  return win

HOW LONG TO SAVE THE DAY? How many moves at any level? W ≤ 9Q + 1K + 2B + 2R + 2N = 376. How many levels? D ≤ 64³² · 2 ≈ 10⁵⁸. So the search explores at most W^D positions. By remembering whether each board B ∈ CHESS, we can explore “only” D ≈ 10⁵⁸ positions. Can we do better?
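A minimal Python sketch of the memoization idea above, assuming the slide's helpers (legal_moves, move, turn, checkmate) are provided elsewhere; like the slide's estimate, it glosses over the fact that a position's value can in principle depend on the repetition history. The point is only that each distinct board is solved at most once, so the work is bounded by the number of distinct boards rather than by W^D.

# Sketch only: legal_moves, move, turn, checkmate are assumed helpers (not defined here).
def can_white_force_win_memo(board, history, cache):
    if board in cache:                      # each distinct board is solved at most once
        return cache[board]
    if board in history:                    # repeated position: no forced win for white
        return False
    if checkmate("black", board):
        cache[board] = True
        return True
    if checkmate("white", board):
        cache[board] = False
        return False
    white_to_move = (turn(board) == "white")
    outcomes = [can_white_force_win_memo(move(board, m), history | {board}, cache)
                for m in legal_moves(board)]
    win = any(outcomes) if white_to_move else all(outcomes)
    cache[board] = win                      # caching ignores history, as in the slide's count
    return win

It would be called as can_white_force_win_memo(start_board, frozenset(), {}), assuming boards are encoded as hashable values.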

DIVIDING BUGS A circuit is a collection of boolean gates and inputs connected by wires (figure: a small circuit built from ∧, ∨, ¬ gates over inputs x₀, x₁, x₂). It is satisfiable if some setting of its inputs makes it output 1. It can be encoded by listing the inputs and operation of each gate. CIRCUIT-SAT = { 〈C〉 | C is a satisfiable circuit }

CIRCUIT-SAT IS DECIDABLE

is_satisfiable(C):
  for each input value x ∈ {0ⁿ, 0ⁿ⁻¹1, …, 1ⁿ}:
    if circuit_value(C, x) == 1: return True
  return False

How long to check the Pentium division circuit? Two 64-bit inputs: at most 2¹²⁸ calls to circuit_value, i.e. well over 10³⁸ steps. Can we do better?
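A runnable Python version of this brute-force decider, under an assumed circuit encoding (a list of (op, a, b) gates over n input wires, last wire = output); the encoding is illustrative, not the slide's.

from itertools import product

def circuit_value(gates, x):
    # Evaluate gates in order; operands index earlier wires ("not" ignores b).
    wires = list(x)
    for op, a, b in gates:
        if op == "and":
            wires.append(wires[a] & wires[b])
        elif op == "or":
            wires.append(wires[a] | wires[b])
        else:                               # "not"
            wires.append(1 - wires[a])
    return wires[-1]

def is_satisfiable(n_inputs, gates):
    # Try all 2^n input settings, exactly as in the slide.
    for x in product([0, 1], repeat=n_inputs):
        if circuit_value(gates, x) == 1:
            return True
    return False

# Example: (x0 AND x1) OR (NOT x2) is satisfiable.
print(is_satisfiable(3, [("and", 0, 1), ("not", 2, None), ("or", 3, 4)]))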

STABLE MARRIAGES A High School has N boys and N girls. Each has a ranked list of dates for the 1951 Senior Prom.

  Albert:  B, C, A      Alice: B, A, C
  Bob:     A, C, B      Betty: C, A, B
  Charlie: A, B, C      Carol: C, B, A

An unstable couple prefer each other to their current dates. STABLE = { 〈B,G〉 | there is a pairing with no unstable couple }
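To make the "unstable couple" condition concrete, here is a small Python check (the data layout and names are illustrative assumptions, not from the slide): a pairing is stable iff no boy and girl both prefer each other to their assigned partners.

def is_stable(boy_prefs, girl_prefs, pairs):
    # boy_prefs/girl_prefs: person -> ranked list (best first); pairs: boy -> girl.
    boy_of = {g: b for b, g in pairs.items()}
    def prefers(prefs, person, new, current):
        return prefs[person].index(new) < prefs[person].index(current)
    for b in boy_prefs:
        for g in girl_prefs:
            if (pairs[b] != g
                    and prefers(boy_prefs, b, g, pairs[b])
                    and prefers(girl_prefs, g, b, boy_of[g])):
                return False                # (b, g) is an unstable couple
    return True

# Tiny illustrative instance (not the slide's data):
boys  = {"B1": ["G1", "G2"], "B2": ["G1", "G2"]}
girls = {"G1": ["B2", "B1"], "G2": ["B2", "B1"]}
print(is_stable(boys, girls, {"B1": "G1", "B2": "G2"}))   # False: (B2, G1) is unstable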

COMPLEXITY THEORY Studies what can and can’t be computed under limited resources such as time, space, etc. Today: time complexity.

DECISION VS SEARCH
  Decision problems                             Search problems
  Is there a winning move for white?            Find a winning move for white.
  Is there an input to satisfy the circuit?     Find a satisfying input.
  Is there a stable pairing?                    Find a stable pairing.

MEASURING TIME COMPLEXITY We measure time complexity by counting the elementary steps required for a machine to halt. Consider the language A = { 0ᵏ1ᵏ | k ≥ 0 }:
  1. Scan across the tape and reject if the string is not of the form 0ᵐ1ⁿ.  (≈ 2k steps)
  2. Repeat the following if both 0s and 1s remain on the tape: scan across the tape, crossing off a single 0 and a single 1.  (≈ 2k² steps)
  3. If 0s remain after all 1s have been crossed off, or vice-versa, reject. Otherwise accept.  (≈ 2k steps)
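A Python sketch that mirrors the three stages on a list used as the tape and tallies sweeps, to make the 2k + 2k² + 2k ≈ O(n²) count above concrete (an informal model of the single-tape machine, not an exact simulation):

def decide_A(w):
    tape, steps = list(w), 0
    steps += len(tape)                      # stage 1: one sweep to check the form 0^m 1^n
    if "10" in w:
        return False, steps
    while "0" in tape and "1" in tape:      # stage 2: cross off one 0 and one 1 per sweep
        tape[tape.index("0")] = "x"
        tape[tape.index("1")] = "x"
        steps += 2 * len(tape)              # one back-and-forth sweep of the tape
    steps += len(tape)                      # stage 3: final check
    return ("0" not in tape and "1" not in tape), steps

print(decide_A("000111"))                   # (True, 48): step count grows quadratically
print(decide_A("00111"))                    # (False, 30)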

Definition: Let M be a TM that halts on all inputs. The running time or time-complexity of M is the function ƒ : ℕ → ℕ, where ƒ(n) is the maximum number of steps that M uses on any input of length n.

ASYMPTOTIC ANALYSIS 5n³ + 2n² + 22n + 6 = O(n³)

BIG-O Let f and g be two functions f, g : ℕ → ℝ⁺. We say that f(n) = O(g(n)) if positive integers c and n₀ exist so that for every integer n ≥ n₀, f(n) ≤ c·g(n). When f(n) = O(g(n)), we say that g(n) is an asymptotic upper bound for f(n).
  5n³ + 2n² + 22n + 6 = O(n³): if c = 6 and n₀ = 10, then 5n³ + 2n² + 22n + 6 ≤ cn³ for every n ≥ n₀.
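A quick numeric sanity check of the witnesses c = 6 and n₀ = 10 (a finite check over a range, not a proof):

f = lambda n: 5*n**3 + 2*n**2 + 22*n + 6
assert all(f(n) <= 6 * n**3 for n in range(10, 10000))   # holds for every n checked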

  3n log₂ n + 5n log₂ log₂ n = O(n log₂ n)
  2n^4.1 + … = O(n^4.1)
  n log₁₀ n + 78 = O(n log₁₀ n)
Since log₁₀ n = log₂ n / log₂ 10, we have O(n log₂ n) = O(n log₁₀ n) = O(n log n).

EXAMPLES Is f(n) = O(g(n)) or g(n) = O(f(n))?
  f(n)            g(n)
  4n²             2ⁿ
  n log n         n!
  n log log n     (log n)^(log n)
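One way to build intuition for these pairs (a rough numeric probe, not a proof, and assuming the pairs as reconstructed in the table above) is to watch the ratio f(n)/g(n) as n grows:

import math

pairs = [
    ("4n^2 vs 2^n",
     lambda n: 4 * n**2,
     lambda n: 2**n),
    ("n log n vs n!",
     lambda n: n * math.log2(n),
     lambda n: math.factorial(n)),
    ("n loglog n vs (log n)^(log n)",
     lambda n: n * math.log2(math.log2(n)),
     lambda n: math.log2(n) ** math.log2(n)),
]
for name, f, g in pairs:
    print(name, [f(n) / g(n) for n in (4, 16, 64)])   # ratio tending to 0 suggests f = O(g)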

WHY USE BIG-O? Suppose there is a k-state TM for L that runs in time f(n). Then there is an O(k·2ᶜ)-state TM for L that runs in time f(n)/c, by packing blocks of c tape cells into single symbols (so constant factors are a property of the machine model, not of the language).

Definition: TIME(t(n)) = { L | L is a language decided by an O(t(n))-time Turing machine }.
  A = { 0ᵏ1ᵏ | k ≥ 0 } ∈ TIME(n²)
  CIRCUIT-SAT ∈ TIME(2ⁿ)

A = { 0ᵏ1ᵏ | k ≥ 0 } ∈ TIME(n log n). Repeat until no 0s or no 1s remain: if the number of 0s and 1s left on the tape is odd, reject; otherwise cross off every other 0 and every other 1. Accept iff no 0s and no 1s remain.
  x0x0x0x0x0x0xx1x1x1x1x1x1x
  xxx0xxx0xxx0xxxx1xxx1xxx1x
  xxxxxxx0xxxxxxxxxxxx1xxxxx
  xxxxxxxxxxxxxxxxxxxxxxxxxx
  (tape contents of 0¹³1¹³ after successive crossing-off passes)
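A Python sketch of the same idea (counts stand in for the tape, and each loop iteration corresponds to one O(n) pass; an informal model, not a single-tape TM): the counts halve every round, so only O(log n) passes are needed.

def decide_A_nlogn(w):
    if "10" in w:                           # must look like 0^m 1^n
        return False
    zeros, ones = w.count("0"), w.count("1")
    while zeros > 0 and ones > 0:
        if (zeros + ones) % 2 == 1:         # parities differ: counts cannot be equal
            return False
        zeros //= 2                         # cross off every other 0 ...
        ones //= 2                          # ... and every other 1
    return zeros == 0 and ones == 0

print(decide_A_nlogn("0" * 13 + "1" * 13))  # True, matching the passes shown above
print(decide_A_nlogn("0" * 13 + "1" * 11))  # False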

We can prove that no single-tape TM can decide A in time o(n log n).

Can A = { 0ᵏ1ᵏ | k ≥ 0 } be decided in time O(n) with a two-tape TM? Yes: scan all the 0s and copy them to the second tape; then scan the 1s, crossing off a 0 from the second tape for each 1; accept iff the 0s run out exactly when the input ends.
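A Python sketch of the two-tape idea, with a list standing in for the second tape (an informal model, not an exact two-tape TM): one left-to-right pass over the input suffices, so the time is O(n).

def decide_A_two_tape(w):
    tape2 = []                              # second tape, used as a stack of 0s
    i, n = 0, len(w)
    while i < n and w[i] == "0":            # copy the 0s to the second tape
        tape2.append("0")
        i += 1
    while i < n and w[i] == "1":            # cross off one 0 per 1
        if not tape2:
            return False                    # more 1s than 0s
        tape2.pop()
        i += 1
    return i == n and not tape2             # accept iff input consumed and no 0s left

print(decide_A_two_tape("000111"))          # True
print(decide_A_two_tape("00011"))           # False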

Different models of computation yield different running times for the same language!

Theorem: Let t(n) be a function such that t(n) ≥ n. Then every t(n)-time multi-tape TM has an equivalent O(t(n)²)-time single-tape TM.

SIMULATING MULTIPLE TAPES
1. “Format” the tape: store the contents of all k tapes on the single tape, separated by # markers, with a dot marking the position of each virtual tape head. If a tape head goes off the right end of its block, insert a blank; if it goes off the left end, move back right.
2. For each move of the k-tape TM:
   - Scan left-to-right, finding the current (dotted) symbols.
   - Scan left-to-right, writing the new symbols.
   - Scan left-to-right, moving each tape head.
(The slide illustrates this with a 3-tape configuration stored as L # 1 0 0 # ␣ # ␣ # R on the single tape, dotted symbols marking the head positions, stepped through one simulated move.)

COUNTING STEPS
1. “Format” the tape (insert a blank when a head goes off the right end; move back right when it goes off the left end): O(n + k) steps.
2. For each move of the k-tape TM, repeated t(n) times:
   - Scan left-to-right, finding the current symbols: 2t(n) steps.
   - Scan left-to-right, writing the new symbols: 2t(n) steps.
   - Scan left-to-right, moving each tape head (shifting to insert blanks as needed): 2t(n) + 2k·t(n) steps.
Total = t(n)·((6 + 2k)·t(n)) + O(n + k) = O(t²(n)).
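As a concrete illustrative instance of this bound (arithmetic on the formula above, not from the slide): simulating a 2-tape TM that runs in time t(n) = n² costs about n² · (6 + 2·2) · n² + O(n + 2) = 10n⁴ + O(n) single-tape steps, which is O(t(n)²) = O(n⁴).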