
1 ::ICS 804:: Theory of Computation - Ibrahim Otieno - iotieno@uonbi.ac.ke +254-0722-429297 SCI/ICT Building Rm. G15

2 Course Outline Mathematical Preliminaries Turing Machines Recursion Theory Markov Algorithms Register Machines Regular Languages and finite-state automata Aspects of Computability

3 Last Week: Recursion Theory The primitive recursive functions The partial recursive functions The class of partial recursive functions = class of Turing-computable functions

4 Course Outline Mathematical Preliminaries Turing Machines ◦ Additional Varieties of Turing Machines Recursion Theory Markov Algorithms Register Machines Regular Languages and finite-state automata Aspects of Computability

5 Markov Algorithms Sequential computation and Markov Algorithms Language Acceptors/Recognizers Number-theoretic functions Labeled Markov algorithms Markov-Computable functions and partial recursive functions Efficiency

6 Sequential computation and Markov Algorithms

7 Three Computational Paradigms (recap) Number-theoretic function computation (functions from N to N) Language acceptance (recognition) Transduction (transformation of an input into an appropriate output)

8 Markov Algorithms So far: main focus on function computation and language acceptance. A. A. Markov in the 1950s formalized string-rewriting systems. Focus on transduction, BUT: paradigm interreducibility

9 An Algorithm Scheme Let Σ be the alphabet {a, b, c, d}. By a Markov algorithm scheme or schema we shall mean a finite sequence of productions or rewrite rules. As a first example, consider the following two-member sequence of productions: (i) a → c (ii) b → ε

10 An Algorithm Scheme Apply the production sequence to words over Σ, e.g. w: baba. Apply rules: find the leftmost occurrence of the left-hand side and substitute it with the right-hand side. Rule (i) transforms word w into w1: w: baba, w1: bcba. (i) a → c (ii) b → ε

11 An Algorithm Scheme Apply the production sequence to words over Σ, e.g. w: baba. Apply rules: find the leftmost occurrence of the left-hand side and substitute it with the right-hand side. Rule (i) transforms word w1 into w2: w1: bcba, w2: bcbc. (i) a → c (ii) b → ε

12 An Algorithm Scheme Apply the production sequence to words over Σ, e.g. w: baba. Apply rules: (i) cannot be applied, so now we investigate the applicability of (ii). Rule (ii) transforms word w2 into w3: w2: bcbc, w3: cbc. (i) a → c (ii) b → ε

13 An Algorithm Scheme Apply the production sequence to words over Σ, e.g. w: baba. Apply rules: (i) cannot be applied, so now we investigate the applicability of (ii). Rule (ii) transforms word w3 into w4: w3: cbc, w4: cc. No more production rules can be applied to w4, so the substitution process ceases. (i) a → c (ii) b → ε

14 An Algorithm Scheme abab ⇒* cc, abcd ⇒* ccd, bbbb ⇒* ε, ε ⇒* ε. (i) a → c (ii) b → ε
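To make the rewriting process above concrete, here is a minimal Python sketch of a Markov algorithm interpreter (run_markov and its signature are my own, not part of the Deus ex Machina courseware): it applies the first applicable production to the leftmost occurrence of its left-hand side, restarts from the top of the production list after every non-terminal application, and halts after a terminal production or when no production applies.

```python
def run_markov(schema, word, max_steps=10_000):
    """Run a Markov algorithm scheme on `word`.

    `schema` is an ordered list of productions (lhs, rhs, is_terminal);
    the empty string plays the role of the empty word.
    """
    for _ in range(max_steps):
        for lhs, rhs, terminal in schema:
            pos = word.find(lhs)                      # '' matches at position 0
            if pos != -1:
                word = word[:pos] + rhs + word[pos + len(lhs):]
                if terminal:
                    return word                       # terminal production: halt
                break                                 # restart from the first production
        else:
            return word                               # no production applicable: halt
    raise RuntimeError("exceeded max_steps; the scheme may not terminate on this input")

# Slide 14's scheme: (i) a -> c   (ii) b -> empty word
scheme = [("a", "c", False), ("b", "", False)]
print(run_markov(scheme, "baba"))   # cc
print(run_markov(scheme, "abcd"))   # ccd
print(run_markov(scheme, "bbbb"))   # '' (the empty word)
```

Later sketches in this transcript reuse this helper.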

15 Terminology The substitution process that results when some Markov algorithm schema is applied to some input word = the Markov algorithm corresponding to that Markov algorithm scheme. S: a Markov algorithm scheme (production or rewrite rules); A_S: the corresponding Markov algorithm (substitution process)

16 Another Example Given input alphabet Σ = {a, b, c, d}. Next, consider the following three-member sequence of productions: (i) a → c (ii) bc → cb (iii) b →. cd (terminal production) Exercise: baba, baaa

17 Work Alphabet Sometimes temporary markers (cf. additional symbols in a Turing machine's tape alphabet) are indispensable: a work alphabet Γ containing the input alphabet Σ plus temporary markers like #, $, %, *. Extra symbols typically do not occur in the input or output

18 Appends ab Let input alphabet Σ = {a, b}. Let work alphabet Γ be Σ ∪ {#}. We see that the Markov algorithm scheme consisting of the four-member production sequence (i) #a → a# (ii) #b → b# (iii) # →. ab (iv) ε → # has the effect of appending string ab to any word over Σ. Exercise: ba, aba
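Running slide 18's append-ab scheme through the run_markov sketch given earlier (the empty word is written as the empty string):

```python
# (i) #a -> a#   (ii) #b -> b#   (iii) # ->. ab (terminal)   (iv) empty word -> #
append_ab = [
    ("#a", "a#", False),
    ("#b", "b#", False),
    ("#",  "ab", True),
    ("",   "#",  False),
]
print(run_markov(append_ab, "ba"))    # baab
print(run_markov(append_ab, "aba"))   # abaab
```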

19 Markov Schema Ordered sequence of production rules: Markov schema. Each production is of the form α → β (non-terminal production) or α →. β (terminal production). Deterministic in nature

20 Markov Algorithm Process type: apply productions of the corresponding Markov algorithm scheme until no production is applicable or a terminal production has been applied

21 A Formal Definition A Markov algorithm schema S is any triple ⟨Σ, Γ, Π⟩ with Σ a nonempty input alphabet, Γ a finite work alphabet with Σ ⊆ Γ, and Π a finite, ordered sequence of productions of form α → β or of form α →. β, where α and β are (possibly empty) words over Γ. Notation: w ⇒_S w′ or w ⇒ w′ (zero or more steps); w ⇒*_S w′ or w ⇒* w′ (upon halting)

22 A Formal Definition Abbreviations in a schema: e.g. with Σ = {a, b}, the productions #a → a# and #b → b# can be summarized as follows: #ξ → ξ# (for ξ ∈ Σ)
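The abbreviation can be expanded mechanically; a tiny helper of my own naming that produces the concrete productions #ξ → ξ# for each ξ ∈ Σ:

```python
def expand_shift_marker(sigma, marker="#"):
    """Expand the shorthand  marker ξ -> ξ marker  (for ξ in Σ)
    into one concrete non-terminal production per input symbol."""
    return [(marker + x, x + marker, False) for x in sigma]

print(expand_shift_marker("ab"))
# [('#a', 'a#', False), ('#b', 'b#', False)]
```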

23 Language Acceptors/Recognizers

24 Markov Algorithms We can implement each of our three computational paradigms (paradigm interreducibility): language transduction, language acceptance (recognition), function computation

25 Language Acceptance EX4-2-1.MKV in Deus ex Machina. Input alphabet Σ = {a, b}. Work alphabet Γ = Σ ∪ {@, %, $, 1}. Six productions, one of them terminal. Transforms all and only words in the language {a^n b^m | n ≥ 0, m ≥ 1} to the word 1

26 Definition Let S be a Markov algorithm schema with input alphabet Σ and work alphabet Γ with 1 ∈ Γ and 1 ∉ Σ. Then S accepts word w if w ⇒* 1 (1 is the accepting 1). If Markov algorithm schema S accepts word w, then Markov algorithm A_S, corresponding to S, accepts word w as well

27 Markov-Acceptable Language Schema S accepts language L if S accepts all and only words in L. A language that is accepted by some Markov algorithm is said to be a Markov-acceptable language.

28 Example Language {a^n b^m | n ≥ 0, m ≥ 1} is accepted. What happens when we take input word aabbba?
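The six productions of EX4-2-1.MKV are not reproduced on the slide, so the scheme below is my own illustrative acceptor for the same language (also six productions, one of them terminal, with % as a work-alphabet dead marker); tracing it with the run_markov sketch shows why aabbba is rejected.

```python
accept_anbm = [
    ("ba", "%",  False),   # any b occurring before an a poisons the word with %
    ("%a", "%",  False),   # % swallows whatever follows it ...
    ("%b", "%",  False),   # ... so a poisoned word can never become exactly 1
    ("ab", "b",  False),   # erase leading a's against the b-block
    ("bb", "b",  False),   # collapse the b-block to a single b
    ("b",  "1",  True),    # a lone b means the word was a^n b^m: accept with 1
]
print(run_markov(accept_anbm, "aabbb"))    # 1   -> accepted
print(run_markov(accept_anbm, "b"))        # 1   -> accepted
print(run_markov(accept_anbm, "aabbba"))   # 1%  -> not the word 1, so rejected
print(run_markov(accept_anbm, "aa"))       # aa  -> rejected (no b at all)
```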

29 Language Recognition Input alphabet Σ and work alphabet Γ such that both 0, 1 ∈ Γ \ Σ. S recognizes language L over Σ if: S transforms w ∈ L into 1, that is, w ⇒* 1 (accepting 1); S transforms w ∉ L into 0, that is, w ⇒* 0 (rejecting 0). A language recognized by some Markov algorithm S is said to be Markov-recognizable. Example: EX4-2-2.MKV recognizes language L = {(ab)^n | n ≥ 0}

30 Resources (Time) Definition: time_S(n) = the maximum number of steps in any terminating computation of S for input of length n

31 Resources (Space) Computation of a Markov algorithm = sequence of computation words. Definition: space_S(n) = the maximum length of any computation word in any terminating computation of S for input of length n

32 Resource Analysis EX4-2-3.MKV in Deus ex Machina accepts language {w ∈ Σ* | n_a(w) = n_b(w)} with Σ = {a, b}. time_S(n) = (n/2)^2 + n + 2 for even n, so O(n^2) ◦ words of odd length can be disregarded for time/space analyses ◦ the most costly words to accept are a^(n/2) b^(n/2) and b^(n/2) a^(n/2); these are accepted after (n/2)^2 + n + 2 steps. space_S(n) = n + 1, so O(n)
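These resource functions can also be estimated empirically. The sketch below (helper names are my own, and it assumes every tested computation terminates within the step bound) runs a scheme on all inputs of a given length and records the worst-case step count and computation-word length:

```python
from itertools import product

def run_with_trace(schema, word, max_steps=100_000):
    """Like run_markov, but also return the number of steps taken and the
    length of the longest computation word seen (the input word included)."""
    steps, longest = 0, len(word)
    for _ in range(max_steps):
        for lhs, rhs, terminal in schema:
            pos = word.find(lhs)
            if pos != -1:
                word = word[:pos] + rhs + word[pos + len(lhs):]
                steps += 1
                longest = max(longest, len(word))
                if terminal:
                    return word, steps, longest
                break
        else:
            return word, steps, longest
    raise RuntimeError("exceeded max_steps")

def worst_case(schema, sigma, n):
    """Empirical estimates of time_S(n) and space_S(n): maximum over all
    inputs of length n drawn from the input alphabet `sigma`."""
    t = s = 0
    for w in ("".join(p) for p in product(sigma, repeat=n)):
        _, steps, longest = run_with_trace(schema, w)
        t, s = max(t, steps), max(s, longest)
    return t, s

# e.g. worst_case(accept_anbm, "ab", 6) with the illustrative acceptor sketched above
```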

33 Complexity Given Turing machine M accepting L, there exists a Markov algorithm A_S that accepts L in O(time_M(n)) steps. Given Markov algorithm A_S accepting L, there exists a Turing machine M that accepts L in O([time_{A_S}(n)]^4) steps

34 Number-theoretic functions

35 Function Computation A natural number n is represented by n+1 1s. A pair is represented by a string such as 111*1111 (the pair (2, 3)), with the arguments separated by an asterisk

36 Example Schema S for a unary number-theoretic function: one terminal production 1 →. 11. What does this Markov schema compute?

37 Example Schema S for a unary number-theoretic function: one terminal production 1 →. 11. Computes the unary successor function

38 Formal Definition S computes unary partial number-theoretic function f if S applied to input 1^(n+1) yields output 1^(f(n)+1). If S is applied to input 1^(n+1) where function f is not defined for n, then either the computation never terminates or its output is not of the form of an unbroken series of 1s
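With this convention, the successor schema from slides 36-37 can be checked directly; to_unary and from_unary are my own helper names, and run_markov is the sketch from earlier.

```python
def to_unary(n):          # natural number n  ->  n+1 ones
    return "1" * (n + 1)

def from_unary(w):        # n+1 ones  ->  n  (only valid for an unbroken run of 1s)
    assert w and set(w) == {"1"}
    return len(w) - 1

successor = [("1", "11", True)]          # slides 36-37: the single terminal production 1 ->. 11
print(from_unary(run_markov(successor, to_unary(4))))   # 5
```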

39 Exercise Schema S for a unary number-theoretic function: one production 11 → 1. Which initial function does this schema compute?

40 Exercise Schema S for a unary number-theoretic function: one production 11 → 1. Which initial function does this schema compute? The unary constant-0 function C_0^1(n) = 0 for all n!! All constant functions and projection functions are Markov-computable

41 Exercise Schema S for a binary number-theoretic function: $1 → 1$ $* → % %1 → % % →. ε ε → $ Which number-theoretic function does this schema compute?

42 Exercise Schema S for a binary number-theoretic function: $1 → 1$ $* → % %1 → % % →. ε ε → $ Which number-theoretic function does this schema compute? The projection function p_1^2 (projecting the first of two arguments): p_1^2(m, n) = m
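A quick mechanical check of this projection schema, again using the earlier sketches ($ and % are work-alphabet markers, and the empty string stands for the empty word):

```python
# (i) $1 -> 1$   (ii) $* -> %   (iii) %1 -> %   (iv) % ->. empty   (v) empty -> $
proj_first = [
    ("$1", "1$", False),
    ("$*", "%",  False),
    ("%1", "%",  False),
    ("%",  "",   True),
    ("",   "$",  False),
]
pair = to_unary(3) + "*" + to_unary(5)            # encodes the pair (3, 5)
print(from_unary(run_markov(proj_first, pair)))   # 3, i.e. p_1^2(3, 5)
```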

43 Exercise Show that the sg function is Markov-computable: sg(n) = 1 if n = 0, 0 otherwise. Hint: reduce the problem of n > 0 to whether n = 1

44 Exercise Show that the sg function is Markov-computable: sg(n) = 1 if n = 0, 0 otherwise. 111 → 11 11 →. 1 1 →. 11
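Checking this schema with the earlier helpers (the productions are exactly those on the slide; the test harness is my own):

```python
sg_schema = [
    ("111", "11", False),   # shrink any run of three or more 1s down to two 1s
    ("11",  "1",  True),    # two 1s left, so n >= 1: answer 0
    ("1",   "11", True),    # a single 1, so n = 0: answer 1
]
print([from_unary(run_markov(sg_schema, to_unary(n))) for n in range(5)])
# [1, 0, 0, 0, 0]
```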

45 Labeled Markov Algorithms

46 Labeled Adds the concept of branching and production labels (“gotos”). Easier to design and comprehend. Production labels: L1, L2, L3, … Schema S: L_i: α → β; L_j means replace α with β and branch execution off to production rule L_j

47 Labeled Markov Algorithm That Accepts {w | n_a(w) = n_b(w) = n_c(w)}: L1: a → ε; L2 | ε → ε; L4. L2: b → ε; L3 | ε → ε; L5. L3: c → ε; L1 | ε → ε; L5. L4: b → ε; L5 | c → ε; L5 | ε → 1; L5. L5: ε →. ε
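One plausible way to interpret the labeled variant in code (the data layout and run_labeled are my own: each label maps to its productions, the first production whose left-hand side occurs is applied, and control then jumps to that production's goto label), together with the acceptor from this slide:

```python
def run_labeled(schema, word, start, max_steps=10_000):
    """schema: {label: [(lhs, rhs, goto_label, is_terminal), ...]}.
    Within the current label, apply the first production whose lhs occurs
    (the empty string matches at position 0), then jump to its goto label;
    a terminal production halts immediately."""
    label = start
    for _ in range(max_steps):
        for lhs, rhs, goto, terminal in schema[label]:
            pos = word.find(lhs)
            if pos != -1:
                word = word[:pos] + rhs + word[pos + len(lhs):]
                if terminal:
                    return word
                label = goto
                break
        else:
            return word            # no production at this label applies: halt
    raise RuntimeError("exceeded max_steps")

# Slide 47's acceptor for {w | n_a(w) = n_b(w) = n_c(w)} (empty string = empty word)
E = ""
balanced_abc = {
    "L1": [("a", E, "L2", False), (E, E, "L4", False)],
    "L2": [("b", E, "L3", False), (E, E, "L5", False)],
    "L3": [("c", E, "L1", False), (E, E, "L5", False)],
    "L4": [("b", E, "L5", False), ("c", E, "L5", False), (E, "1", "L5", False)],
    "L5": [(E, E, None, True)],
}
print(run_labeled(balanced_abc, "cabbca", "L1"))   # 1   -> accepted
print(run_labeled(balanced_abc, "aabc", "L1"))     # ''  -> rejected (not the word 1)
```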

48 Equivalence Result Let A_S be a labeled Markov algorithm with input alphabet Σ. Then there exists a standard Markov algorithm A_S′ with input alphabet Σ that is computationally equivalent to A_S

49 Markov-Computable functions and partial recursive functions

50 Equivalence Result The class of Markov-computable functions is identical to the class of ◦ partial recursive functions ◦ Turing-computable functions. Function f is Markov-computable iff f is Turing-computable

51 Proof Start Turing Machine: open EX4-5-1.TM and EX4-5-1.TT to see a Turing machine that simulates a Markov algorithm. Start Markov Algorithm and open EX4-5-2.MKV to see a Markov algorithm that simulates a Turing machine

52 Efficiency Any language that is Markov-acceptable is also Turing-acceptable

53 Feasibility (recap) Cobham–Edmonds thesis regarding computational feasibility: the problem of determining whether a given string is a member of a given language L is feasible if and only if L is in P. Complexity class P: the class of all languages accepted in polynomial time by some (single-tape) Turing machine

54 Efficiency Any language that is Markov-acceptable is also Turing-acceptable. Markov algorithms are often easier to implement ⇒ easier to determine the computational feasibility of accepting some given language L

