::ICS 804:: Theory of Computation - Ibrahim Otieno - +254-0722-429297 SCI/ICT Building Rm. G15.

Course Outline
Mathematical Preliminaries
Turing Machines
Recursion Theory
Markov Algorithms
Register Machines
Regular Languages and Finite-State Automata
Aspects of Computability

Last Week: Recursion Theory
The primitive recursive functions
The partial recursive functions
The class of partial recursive functions = the class of Turing-computable functions

Course Outline
Mathematical Preliminaries
Turing Machines
◦ Additional Varieties of Turing Machines
Recursion Theory
Markov Algorithms
Register Machines
Regular Languages and Finite-State Automata
Aspects of Computability

Markov Algorithms
Sequential computation and Markov algorithms
Language acceptors/recognizers
Number-theoretic functions
Labeled Markov algorithms
Markov-computable functions and partial recursive functions
Efficiency

Sequential computation and Markov Algorithms

Three Computational Paradigms (recap)
Number-theoretic function computation (functions from N to N)
Language acceptance (recognition)
Transduction (transformation of an input into an appropriate output)

Markov Algorithms
So far: main focus on function computation and language acceptance
A. A. Markov formalized string-rewriting systems in the 1950s
Focus on transduction
BUT: paradigm interreducibility

An Algorithm Scheme
Let the input alphabet be Σ = {a, b, c, d}. By a Markov algorithm scheme (or schema) we mean a finite sequence of productions or rewrite rules. As a first example, consider the following two-member sequence of productions:
(i) a → c
(ii) b → Λ
(Λ denotes the empty word.)

An Algorithm Scheme
Apply the production sequence to words over Σ, e.g. w: baba
Applying a rule: find the leftmost occurrence of the left-hand side and substitute the right-hand side for it
Rule (i) transforms word w into w1
w: baba
w1: bcba
(i) a → c
(ii) b → Λ

An Algorithm Scheme
Rule (i) transforms word w1 into w2
w1: bcba
w2: bcbc
(i) a → c
(ii) b → Λ

An Algorithm Scheme
(i) cannot be applied any longer, so we investigate the applicability of (ii)
Rule (ii) transforms word w2 into w3
w2: bcbc
w3: cbc
(i) a → c
(ii) b → Λ

An Algorithm Scheme
(i) cannot be applied, so we investigate the applicability of (ii)
Rule (ii) transforms word w3 into w4
w3: cbc
w4: cc
No production rule applies to w4, so the substitution process ceases
(i) a → c
(ii) b → Λ

An Algorithm Scheme
abab ⇒* cc
abcd ⇒* ccd
bbbb ⇒* Λ
Λ ⇒* Λ
(i) a → c
(ii) b → Λ
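The leftmost-substitution loop described on these slides can be sketched as a short interpreter. This is a minimal sketch of my own, not part of the course software; the function name `run` and the list-of-pairs encoding of a scheme are assumptions for illustration:

```python
def run(schema, word, max_steps=10_000):
    """Run an ordered Markov algorithm scheme on `word`.

    `schema` is a list of (lhs, rhs) productions, tried in order;
    the first applicable one rewrites the leftmost occurrence of
    its left-hand side. The process stops when no production applies.
    """
    for _ in range(max_steps):
        for lhs, rhs in schema:
            if lhs in word:
                i = word.index(lhs)          # leftmost occurrence
                word = word[:i] + rhs + word[i + len(lhs):]
                break                        # restart from the first production
        else:
            return word                      # no production applicable: halt
    raise RuntimeError("no termination within the step bound")

# The two-production scheme (i) a -> c, (ii) b -> Lambda (empty word):
schema = [("a", "c"), ("b", "")]

print(run(schema, "abab"))   # -> cc
print(run(schema, "abcd"))   # -> ccd
print(run(schema, "bbbb"))   # -> the empty word
```

Note that after every substitution the search restarts from production (i), which is exactly why (ii) only fires once no a remains.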

Terminology
The substitution process that results when a Markov algorithm schema is applied to an input word is the Markov algorithm corresponding to that scheme
S: a Markov algorithm scheme (production/rewrite rules)
A_S: the corresponding Markov algorithm (substitution process)

Another Example
Given input alphabet Σ = {a, b, c, d}, consider the following three-member sequence of productions:
(i) a → c
(ii) bc → cb
(iii) b →. cd (terminal production)
Exercise: baba, baaa

Work Alphabet
Sometimes temporary markers are indispensable (cf. the additional symbols in a Turing machine's tape alphabet)
Γ: work alphabet containing the input alphabet plus temporary markers such as #, $, %, *
The extra symbols typically occur in neither the input nor the output

Appends ab
Let input alphabet Σ = {a, b} and work alphabet Γ = Σ ∪ {#}. The Markov algorithm scheme consisting of the four-member production sequence
(i) #a → a#
(ii) #b → b#
(iii) # →. ab
(iv) Λ → #
has the effect of appending the string ab to any word over Σ
Exercise: ba, aba
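This append-ab schema can be checked mechanically. The sketch below (my own encoding, not the courseware's) extends the leftmost-rewrite loop with terminal productions, marked by a `True` flag; Λ is the empty string `""`:

```python
def run(schema, word, max_steps=10_000):
    """schema: ordered list of (lhs, rhs, is_terminal) productions.

    The empty word "" occurs (leftmost) at position 0 of every word,
    so a production with lhs "" always applies and acts as a prepend.
    """
    for _ in range(max_steps):
        for lhs, rhs, terminal in schema:
            if lhs in word:
                i = word.index(lhs)
                word = word[:i] + rhs + word[i + len(lhs):]
                if terminal:
                    return word              # terminal production: halt
                break
        else:
            return word                      # no production applicable
    raise RuntimeError("no termination within the step bound")

# (i) #a -> a#  (ii) #b -> b#  (iii) # ->. ab  (iv) Lambda -> #
append_ab = [("#a", "a#", False), ("#b", "b#", False),
             ("#", "ab", True), ("", "#", False)]

print(run(append_ab, "ba"))    # -> baab
print(run(append_ab, "aba"))   # -> abaab
```

Rule (iv) plants the marker # at the front of the word; rules (i) and (ii) walk it to the right end; the terminal rule (iii) replaces it by ab and halts.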

Markov Schema
A Markov schema is an ordered sequence of production rules
Each production is of the form α → β (non-terminal production) or α →. β (terminal production)
Deterministic in nature

Markov Algorithm Process type: apply productions of the corresponding Markov algorithm scheme until no production is applicable or a terminal production has been applied

A Formal Definition
A Markov algorithm schema S is any triple ⟨Σ, Γ, Π⟩ where
Σ: nonempty input alphabet
Γ: finite work alphabet with Σ ⊆ Γ
Π: finite, ordered sequence of productions of form α → β or of form α →. β, with α and β (possibly empty) words over Γ
w ⇒_S w′ or w ⇒ w′ (one substitution step)
w ⇒*_S w′ or w ⇒* w′ (zero or more steps; w′ upon halting)

A Formal Definition
Abbreviations in schema: e.g. with Σ = {a, b}, the productions
#a → a#
#b → b#
can be summarized as #ξ → ξ# (for ξ ∈ Σ)

Language Acceptors/Recognizers

Markov Algorithms
We can implement each of our three computational paradigms: paradigm interreducibility
Language transduction
Language acceptance (recognition)
Function computation

Language Acceptance
EX4-2-1.MKV in Deus ex Machina
Input alphabet Σ = {a, b}
Work alphabet Γ = Σ ∪ {%, $, 1}
Six productions, one of them terminal
Transforms all and only words in the language {a^n b^m | n ≥ 0, m ≥ 1} into the word 1

Definition
Let S be a Markov algorithm schema with input alphabet Σ and work alphabet Γ, with 1 ∉ Σ and 1 ∈ Γ. Then S accepts word w if w ⇒* 1 (1: the accepting symbol)
If Markov algorithm schema S accepts word w, then the Markov algorithm A_S corresponding to S accepts word w as well

Markov-Acceptable Language
Schema S accepts language L if S accepts all and only the words in L. A language that is accepted by some Markov algorithm is said to be a Markov-acceptable language.

Example
The language {a^n b^m | n ≥ 0, m ≥ 1} is accepted
What happens when we take input word aabbba?

Language Recognition
Input alphabet Σ and work alphabet Γ such that 0, 1 ∈ Γ \ Σ
S recognizes language L over Σ if
S transforms each w ∈ L into 1, that is, w ⇒* 1 (accepting 1)
S transforms each w ∉ L into 0, that is, w ⇒* 0 (rejecting 0)
A language recognized by some Markov algorithm S is said to be Markov-recognizable
Example: EX4-2-2.MKV recognizes the language L = {(ab)^n | n ≥ 0}
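EX4-2-2.MKV itself is not reproduced in these slides, but one possible recognizer for {(ab)^n | n ≥ 0} can be sketched directly. This is my own schema, not the courseware's: the markers # and % are assumed work-alphabet symbols, # strips leading ab blocks, and % erases the remainder of a rejected word so that the computation ends in exactly 0 or exactly 1:

```python
def run(schema, word, max_steps=10_000):
    """Leftmost-rewrite loop; productions are (lhs, rhs, is_terminal)."""
    for _ in range(max_steps):
        for lhs, rhs, terminal in schema:
            if lhs in word:
                i = word.index(lhs)
                word = word[:i] + rhs + word[i + len(lhs):]
                if terminal:
                    return word
                break
        else:
            return word
    raise RuntimeError("no termination within the step bound")

recognize_abn = [
    ("#ab", "#", False),   # strip one leading ab block
    ("#a", "%", False),    # anything else after the marker: start rejecting
    ("#b", "%", False),
    ("%a", "%", False),    # erase the rest of the word
    ("%b", "%", False),
    ("%", "0", True),      # rejecting 0
    ("#", "1", True),      # marker alone: the word was (ab)^n, accepting 1
    ("", "#", False),      # plant the marker at the front
]

for w in ["", "ab", "abab", "aab", "ba"]:
    print(repr(w), "->", run(recognize_abn, w))
```

The production order matters: the erasing rules must come before the terminal % →. 0, and the terminal # →. 1 only fires once nothing follows the marker.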

Resources (Time)
Definition: time_S(n) = the maximum number of steps in any terminating computation of S on input of length n

Resources (Space)
A computation of a Markov algorithm is a sequence of computation words
Definition: space_S(n) = the maximum length of any computation word in any terminating computation of S on input of length n

Resource Analysis
EX4-2-3.MKV in Deus ex Machina accepts the language {w ∈ Σ* | n_a(w) = n_b(w)} with Σ = {a, b}
time_S(n) = (n/2)^2 + n + 2 for even n, so O(n^2)
◦ words of odd length can be disregarded for time/space analyses
◦ the most costly words to accept are a^{n/2} b^{n/2} and b^{n/2} a^{n/2}; these are accepted after (n/2)^2 + n + 2 steps
space_S(n) = n + 1, so O(n)

Complexity
Given a Turing machine M accepting L, there exists a Markov algorithm A_S that accepts L in O(time_M(n)) steps
Given a Markov algorithm A_S accepting L, there exists a Turing machine M that accepts L in O([time_{A_S}(n)]^4) steps

Number-theoretic functions

Function Computation
A natural number n is represented by n+1 1s
A pair is represented by a string such as 111*1111 (the pair (2, 3)), with the arguments separated by an asterisk

Example
Schema S for a unary number-theoretic function, with one terminal production:
1 →. 11
What does this Markov schema compute?

Example
Schema S for a unary number-theoretic function, with one terminal production:
1 →. 11
Computes the unary successor function

Formal Definition
S computes unary partial number-theoretic function f if S applied to input 1^{n+1} yields output 1^{f(n)+1}
If S is applied to input 1^{n+1} where f is not defined for n, then either the computation never terminates or its output is not an unbroken string of 1s
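Under this convention the successor schema from the preceding slides can be run directly. A minimal sketch; the interpreter and the encode/decode helpers are my own, not part of the course software:

```python
def run(schema, word, max_steps=10_000):
    """Leftmost-rewrite loop; productions are (lhs, rhs, is_terminal)."""
    for _ in range(max_steps):
        for lhs, rhs, terminal in schema:
            if lhs in word:
                i = word.index(lhs)
                word = word[:i] + rhs + word[i + len(lhs):]
                if terminal:
                    return word
                break
        else:
            return word
    raise RuntimeError("no termination within the step bound")

def encode(n):
    return "1" * (n + 1)             # n is represented by n+1 1s

def decode(word):
    assert set(word) <= {"1"}        # output must be an unbroken run of 1s
    return len(word) - 1

successor = [("1", "11", True)]      # the single terminal production 1 ->. 11

print(decode(run(successor, encode(3))))   # -> 4
```

The terminal production fires exactly once, turning 1^{n+1} into 1^{n+2}, i.e. the representation of n + 1.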

Exercise
Schema S for a unary number-theoretic function, with one production:
11 → 1
Which initial function does this schema compute?

Exercise
Schema S for a unary number-theoretic function, with one (non-terminal) production:
11 → 1
Which initial function does this schema compute?
The unary constant-0 function C_0^1(n) = 0 for all n
(The production reduces the input 1^{n+1} step by step to the single 1 representing 0, at which point nothing applies and the process halts)
All constant functions and projection functions are Markov-computable

Exercise
Schema S for a binary number-theoretic function:
$1 → 1$
$* → %
%1 → %
% →. Λ
Λ → $
Which number-theoretic function does this schema compute?

Exercise
Schema S for a binary number-theoretic function:
$1 → 1$
$* → %
%1 → %
% →. Λ
Λ → $
Which number-theoretic function does this schema compute?
p_1^2, the projection of a pair onto its first argument
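A trace makes the answer visible: $ is planted at the front and walks right over the first argument, $* becomes %, % eats the second argument, and the terminal production erases %. A runnable sketch (the interpreter and `encode_pair` helper are my own):

```python
def run(schema, word, max_steps=10_000):
    """Leftmost-rewrite loop; productions are (lhs, rhs, is_terminal)."""
    for _ in range(max_steps):
        for lhs, rhs, terminal in schema:
            if lhs in word:
                i = word.index(lhs)
                word = word[:i] + rhs + word[i + len(lhs):]
                if terminal:
                    return word
                break
        else:
            return word
    raise RuntimeError("no termination within the step bound")

proj = [("$1", "1$", False),   # walk $ right across the first argument
        ("$*", "%", False),    # $ meets the separator: switch to erasing
        ("%1", "%", False),    # erase the second argument
        ("%", "", True),       # drop the marker and halt
        ("", "$", False)]      # plant $ at the front

def encode_pair(m, n):
    return "1" * (m + 1) + "*" + "1" * (n + 1)   # m, n as runs of 1s

print(run(proj, encode_pair(2, 3)))   # -> 111, i.e. p_1^2(2, 3) = 2
```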

Exercise
Show that the sg function is Markov-computable
sg(n) = 1 if n = 0, 0 otherwise
Hint: reduce the problem of n > 0 to the question whether n = 1

Exercise
Show that the sg function is Markov-computable
sg(n) = 1 if n = 0, 0 otherwise
111 → 11
11 →. 1
1 →. 11
(The first, non-terminal production reduces any input 1^{n+1} with n > 1 to 11, the case n = 1, as the hint suggests)
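One way to check the hint: a schema whose first production 111 → 11 reduces every n > 1 to the case n = 1, followed by the terminal productions 11 →. 1 and 1 →. 11, computes sg as defined on this slide. A runnable sketch (interpreter and helpers are my own):

```python
def run(schema, word, max_steps=10_000):
    """Leftmost-rewrite loop; productions are (lhs, rhs, is_terminal)."""
    for _ in range(max_steps):
        for lhs, rhs, terminal in schema:
            if lhs in word:
                i = word.index(lhs)
                word = word[:i] + rhs + word[i + len(lhs):]
                if terminal:
                    return word
                break
        else:
            return word
    raise RuntimeError("no termination within the step bound")

def encode(n):
    return "1" * (n + 1)

def decode(word):
    return len(word) - 1

sg_schema = [("111", "11", False),  # reduce n > 1 to the case n = 1
             ("11", "1", True),     # n >= 1: output 1^1, i.e. sg(n) = 0
             ("1", "11", True)]     # n = 0: output 1^2, i.e. sg(0) = 1

for n in [0, 1, 5]:
    print(n, "->", decode(run(sg_schema, encode(n))))
```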

Labeled Markov Algorithms

Labeled
Adds the concept of branching and production labels ("gotos")
Easier to design and comprehend
Production labels: L1, L2, L3, …
Schema S: Li: α → β; Lj
replace α with β, then branch to production rule Lj

Labeled Markov Algorithm That Accepts {w | n_a(w) = n_b(w) = n_c(w)}
L1: a → Λ; L2
Λ → Λ; L4
L2: b → Λ; L3
Λ → Λ; L5
L3: c → Λ; L1
Λ → Λ; L5
L4: b → Λ; L5
c → Λ; L5
Λ → 1; L5
L5: Λ →. Λ
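The labeled control flow can be simulated with a small extension of the leftmost-rewrite loop that tracks the current label. This is a sketch with my own encoding (a dict from label to an ordered list of productions, each carrying the label to branch to), not the course software:

```python
def run_labeled(program, word, start="L1", max_steps=10_000):
    """program maps each label to an ordered list of
    (lhs, rhs, is_terminal, next_label) productions."""
    label = start
    for _ in range(max_steps):
        for lhs, rhs, terminal, nxt in program[label]:
            if lhs in word:
                i = word.index(lhs)
                word = word[:i] + rhs + word[i + len(lhs):]
                if terminal:
                    return word
                label = nxt                  # goto the named production group
                break
        else:
            return word                      # nothing applicable at this label
    raise RuntimeError("no termination within the step bound")

# Delete one a, one b, one c per round; once the a's run out,
# accept (emit 1) iff no b or c remains either.
program = {
    "L1": [("a", "", False, "L2"), ("", "", False, "L4")],
    "L2": [("b", "", False, "L3"), ("", "", False, "L5")],
    "L3": [("c", "", False, "L1"), ("", "", False, "L5")],
    "L4": [("b", "", False, "L5"), ("c", "", False, "L5"),
           ("", "1", False, "L5")],
    "L5": [("", "", True, None)],            # Lambda ->. Lambda: halt
}

for w in ["", "abc", "aabbcc", "abcb"]:
    print(repr(w), "->", run_labeled(program, w))   # accepted iff result is "1"
```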

Equivalence Result
Let A_S be a labeled Markov algorithm with input alphabet Σ. Then there exists a standard Markov algorithm A_S′ with input alphabet Σ that is computationally equivalent to A_S

Markov-Computable functions and partial recursive functions

Equivalence Result
The class of Markov-computable functions is identical to
◦ the class of partial recursive functions
◦ the class of Turing-computable functions
A function f is Markov-computable iff f is Turing-computable

Proof
Start Turing Machine: open EX4-5-1.TM and EX4-5-1.TT to see a Turing machine that simulates a Markov algorithm
Start Markov Algorithm: open EX MKV to see a Markov algorithm that simulates a Turing machine

Efficiency
Any language that is Markov-acceptable is also Turing-acceptable

Feasibility (recap)
Cobham–Edmonds thesis regarding computational feasibility: the problem of determining whether a given string is a member of a given language L is feasible if and only if L is in P
Complexity class P: the class of all languages accepted in polynomial time by some (single-tape) Turing machine

Efficiency
Any language that is Markov-acceptable is also Turing-acceptable
Markov algorithms are often easier to implement ⇒ easier to determine the computational feasibility of accepting some given language L