
1 Discrete Math II Howon Kim, 2017. 11.

2 Agenda
1 Automata Theory
2 Regular Expressions & Regular Languages
3 Finite Automata: DFA (Deterministic Finite Automata), FA Applications
4 Non-deterministic Finite Automata
5 Grammars & Languages
6 Theory of Computations

3 Language Types by Chomsky
Arbitrary (natural) language : no accepting machine
Recursively enumerable language : TM (Turing machine)
Context-sensitive language : LBA (linear-bounded automata)
Context-free language : PDA (pushdown automata)
Regular language : FA (NFA, DFA)

4 Language Types by Chomsky

5 Overview
Set Theory of Strings (Chapter 6.1)
Regular Expressions & Regular Languages
Finite State Machines: Deterministic Finite Automata (DFA), Non-deterministic Finite Automata (NFA)
Grammars and Languages: Types of Grammars and Languages; Associated Machines including the Turing Machine
Computation Theory: NP Problem

6 Context Free Grammar (CFG)
Def. A context-free grammar is G = ( V, Σ, S, P ), where all production rules have the form A → x, with A ∈ V and x ∈ (V ∪ Σ)*.
cf.) A regular grammar is one where all production rules have the form A → wB or A → w, with A, B ∈ V and w ∈ Σ*.
V : a finite set of variables (also called nonterminals)
Σ : an alphabet (also called terminals; from these we make the strings that will be the words of the language)
S : a start variable, S ∈ V
P : a set of production rules
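As a concrete illustration of this 4-tuple (not part of the original slides), a small CFG can be written down directly as Python data; the names V, Sigma, S, P and the helper is_context_free below are illustrative choices only.

```python
# A sketch of a CFG G = (V, Sigma, S, P) as plain Python data.
# The particular grammar and the names used here are illustrative.
V = {"S"}                      # variables (nonterminals)
Sigma = {"a", "b"}             # alphabet (terminals)
S = "S"                        # start variable
P = {                          # productions A -> x, with x a string over V and Sigma
    "S": ["aS", "bS", "a", "b"],
}

def is_context_free(P, V):
    """Check the defining condition: every left-hand side is a single variable."""
    return all(lhs in V for lhs in P)

print(is_context_free(P, V))   # True
```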

7 Examples
(Ex.) G = ( {S}, {a,b}, S, P )
P : S → aS, S → bS, S → a, S → b
In more compact notation, P : S → aS | bS | a | b.
(The arrow → is used in the statement of productions; the double arrow ⇒ is used in the derivation of a word.)
It can produce the word abbab as follows:
S ⇒ aS ⇒ abS ⇒ abbS ⇒ abbaS ⇒ abbab
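One way to make the derivation above concrete is to replace the leftmost S at each step. The sketch below (an illustration added here, with a hypothetical helper named apply) simply replays S ⇒ aS ⇒ abS ⇒ abbS ⇒ abbaS ⇒ abbab with the rules chosen by hand.

```python
# Leftmost derivation of "abbab" in G = ({S}, {a,b}, S, P), P: S -> aS | bS | a | b.
def apply(sentential_form, variable, replacement):
    """Replace the leftmost occurrence of `variable` with `replacement`."""
    return sentential_form.replace(variable, replacement, 1)

form = "S"
print(form)
for rhs in ["aS", "bS", "bS", "aS", "b"]:   # the rule chosen at each step
    form = apply(form, "S", rhs)
    print(form)                              # aS, abS, abbS, abbaS, abbab
```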

8 Examples (Ex.) G = ( {S}, {0,1}, S, P )
P : S → 0S1 | λ
L(G) = { 0ⁿ1ⁿ | n ≥ 0 }
(Ex.) G = ( {S,A}, {a,b,c}, S, P )
P : S → aSc | A
A → bAc | λ
L(G) = { aᵐbⁿcᵏ | m + n = k, k ≥ 0 }
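A quick sanity check of the first grammar (a sketch added here, not from the slides): applying S → 0S1 exactly n times and then S → λ always yields 0ⁿ1ⁿ, so the counts of 0's and 1's necessarily match.

```python
# Derive the word 0^n 1^n from G = ({S}, {0,1}, S, P) with P: S -> 0S1 | lambda.
def derive(n):
    form = "S"
    for _ in range(n):
        form = form.replace("S", "0S1", 1)   # apply S -> 0S1
    return form.replace("S", "", 1)          # finish with S -> lambda

print(derive(0), derive(1), derive(3))       # '' '01' '000111'
```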

9 Examples (Ex.) G = ( {S,X,Y}, {a,b}, S, P )
P : S → X | Y
X → λ  (the only word X can generate is λ)
Y → aY | bY | a | b  (Y produces (a+b)+)
So the CFL (context-free language) generated is (a+b)*.
(Ex.) G = ( {S}, {a,b}, S, P )
P : S → aS | bS | a | b | λ
The CFL is (a+b)*.
The sequence of productions used to generate a word is not unique: bab can be generated by S ⇒ bS ⇒ baS ⇒ babS ⇒ bab (using S → λ), or by S ⇒ bS ⇒ baS ⇒ bab (using S → b).

10 Examples – derivation tree
(Ex.) G = ( {S,A}, {a,b}, S, P )
P : S → AAA | A
A → AA | aA | Ab | a | b
Derivation: S ⇒ AAA ⇒ aAAA ⇒ abAA ⇒ abAbA ⇒ abaAbA ⇒ abaabA ⇒ abaaba
(The slide shows the corresponding derivation tree with root S.)
In a derivation tree, the root is the start variable, all internal nodes are labeled with variables, and all leaves are labeled with terminals. The children of an internal node are labeled, from left to right, with the right-hand side of the production used.

11 Pushdown Automata (PDA)
Recall that DFAs accept regular languages. We want to design machines similar to DFAs that will accept context-free languages. These machines will need to be more powerful. To handle a language like { aⁿbⁿ | n ≥ 0 }, the machine needs to "remember" the number of a's. To do this, we use a stack: a push-down automaton (PDA) is essentially an NFA with a stack. A small sketch of this idea follows.
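The stack idea can be sketched directly in a few lines of Python (an illustration added here, not a formal PDA): push one symbol for each a, pop one for each b, and accept when no a follows a b and the stack is empty at the end.

```python
# Stack-based recognizer for { a^n b^n | n >= 0 }: push on 'a', pop on 'b'.
def accepts(w):
    stack = []
    seen_b = False
    for ch in w:
        if ch == "a":
            if seen_b:            # an 'a' after a 'b' is not allowed
                return False
            stack.append("a")     # remember one more 'a'
        elif ch == "b":
            seen_b = True
            if not stack:         # more b's than a's
                return False
            stack.pop()
        else:
            return False
    return not stack              # counts matched exactly

print(accepts("aabb"), accepts("aab"), accepts(""))   # True False True
```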

12 Pushdown Automata (PDA)
Def. of Pushdown (Stack) Automata M = ( Q, Σ, Γ, Δ, q0, F )
Q : a finite set of states
Σ : a finite alphabet of tape symbols
Γ : a finite alphabet of stack symbols (stack start symbol = #)
Δ : a transition relation, a subset of ( Q × (Σ ∪ {λ}) × Γ* ) × ( Q × Γ* )
q0 : a start (initial) state, q0 ∈ Q
F : a set of final states, F ⊆ Q
(cf.) FA M = ( Q, Σ, δ or Δ, q0, F )

13 Pushdown Automata - transition
Δ : a transition relation, a subset of ( Q × (Σ ∪ {λ}) × Γ* ) × ( Q × Γ* )
Let ((p, a, β), (q, γ)) ∈ Δ be a transition. It means that we
Move from state p,
Read a from the tape,
Pop the string β from the stack,
Move to state q,
Push the string γ onto the stack.
The first three, (p, a, β), are the "input." The last two, (q, γ), are the "output."
(On a state diagram this is drawn as an edge from p to q labeled a, β ; γ.)

14 Pushdown Automata – pushing/popping
When we push β, we push the symbols of β as we read them from right to left. When we push the string abc, we push c, then push b, then push a.
When we pop γ, we pop the symbols of γ as we read them from left to right (the reverse of the order in which they were pushed). When we pop the string abc, we pop a, then pop b, then pop c.
(Stack contents after pushing abc, from top to bottom: a, b, c.)

15 Pushdown Automata – pushing/popping
Thus, if we push the string abc and then pop it, we will get back abc, not cba. If we wanted to reverse the order, we would use three separate transitions: push a, then push b, then push c.
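This push/pop convention is easy to check with a list used as a stack (a small sketch added here; the top of the stack is the end of the list):

```python
# Push the string "abc" symbol by symbol from right to left, then pop it.
stack = []
for ch in reversed("abc"):    # push c, then b, then a (so a ends up on top)
    stack.append(ch)

popped = ""
while stack:
    popped += stack.pop()     # pops a, then b, then c
print(popped)                 # 'abc', not 'cba'
```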

16 Pushdown Automata – configurations
A configuration fully describes the current "state" of the PDA:
The current state p.
The remaining input w.
The current stack contents γ.
Thus, a configuration is a triple (p, w, γ) ∈ Q × Σ* × Γ*.

17 Pushdown Automata – computations
A configuration (p, w, γ) yields a configuration (p', w', γ') in one step, denoted (p, w, γ) ⊢ (p', w', γ'), if there is a transition ((p, a, β), (p', η)) ∈ Δ such that w = aw', γ = βθ, and γ' = ηθ for some θ ∈ Γ*.
The reflexive, transitive closure of ⊢ is denoted ⊢*.
(Recall that a configuration is a triple (p, w, γ) ∈ Q × Σ* × Γ*.)
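The one-step yields relation can be written out mechanically. The sketch below (added here for illustration; the tuple encoding and the function name yields_one_step are my own choices) represents a transition as ((p, a, β), (q, η)) with plain strings for the pop and push parts.

```python
# One step of a PDA: (p, w, gamma) |- (p', w', gamma') under transition ((p, a, beta), (p', eta)).
def yields_one_step(config, transition):
    (p, w, gamma) = config
    ((tp, a, beta), (q, eta)) = transition
    if tp != p:
        return None
    if not w.startswith(a):                 # a may be "" (a lambda-move)
        return None
    if not gamma.startswith(beta):          # gamma = beta + theta
        return None
    theta = gamma[len(beta):]
    return (q, w[len(a):], eta + theta)     # gamma' = eta + theta

# Example: in state 'S' with input "0011" and stack "#", push 0 on reading 0.
t = (("S", "0", "#"), ("S", "0#"))
print(yields_one_step(("S", "0011", "#"), t))   # ('S', '011', '0#')
```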

18 Pushdown Automata – accepting strings
After processing the string on the tape, the PDA is in either a final or a nonfinal state, and the stack is either empty or not empty. The input string is accepted by M by final state, or by empty stack, or by final state and empty stack.
That is, the string w ∈ Σ* is accepted if (s, w, e) ⊢* (f, e, e) for some f ∈ F, where e denotes the empty string.
(In other words, the last symbol of the input w takes the machine to a final state; at that point the stack is empty, and since the whole input has been read, the remaining input is empty as well.)

19 Pushdown Automata – accepting strings
There are several equivalent ways of defining acceptance by a PDA.
One may define acceptance "by final state" only: the input is accepted if and only if the last state is a final state, regardless of whether the stack is empty.
One may define acceptance "by empty stack" only: the input is accepted if and only if the stack is empty once the input is processed, regardless of which state the PDA is in.
One may define acceptance "by final state and empty stack".

20 FA vs. PDA
FA:
No memory; a read-only head moves to the right.
Transition: (current state, tape symbol) → (next state).
Initially in the start state, with the head at the leftmost position.
PDA:
Same structure as the FA, except for the stack.
Transition: (current state, tape symbol, stack symbol) → (next state, stack string).
Initially (start state, input string w, #).
Accept w when the state of the PDA after reading w is a final state.
(Ex) push 1 : ( (p,0,0), (q,10) ), pop 0 : ( (p,0,0), (q,λ) )

21 An Example of PDA
Find L(M) for the following PDA.
M = ( {S,A,B}, {0,1}, Γ={0,#}, Δ, S, {B} )
Δ :
( (S,0,#), (S,0#) ) ; push 0
( (S,0,0), (S,00) ) ; push 0
( (S,λ,#), (B,#) )
( (S,1,0), (A,λ) ) ; pop 0 (the 0 is only read from the stack, nothing is written back)
( (A,1,0), (A,λ) ) ; pop 0
( (A,λ,#), (B,#) ) ; the stack has been emptied down to #
L(M) = { 0ⁿ1ⁿ : n ≥ 0 }
(The slide also traces the stack contents, the pushed and popped symbols, and the states S, A, B while processing an input.)
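A small breadth-first simulation can confirm the claim L(M) = { 0ⁿ1ⁿ : n ≥ 0 }. The sketch below is an illustration added here (the function name accepts and the step bound are my own choices); it encodes the six transitions of M, explores λ-moves as well, and accepts by final state once the input is consumed.

```python
from collections import deque

# Transitions of M: ((state, input symbol or "", pop string), (state, push string)).
DELTA = [
    (("S", "0", "#"), ("S", "0#")),   # push 0
    (("S", "0", "0"), ("S", "00")),   # push 0
    (("S", "",  "#"), ("B", "#")),    # lambda-move: accept the empty word
    (("S", "1", "0"), ("A", "")),     # pop 0
    (("A", "1", "0"), ("A", "")),     # pop 0
    (("A", "",  "#"), ("B", "#")),    # lambda-move: all 0s matched
]
FINAL = {"B"}

def accepts(w, start="S", bottom="#", max_steps=10_000):
    seen, queue = set(), deque([(start, w, bottom)])
    while queue and max_steps:
        max_steps -= 1
        p, rest, stack = queue.popleft()
        if rest == "" and p in FINAL:             # accept by final state
            return True
        for (tp, a, beta), (q, eta) in DELTA:
            if tp == p and rest.startswith(a) and stack.startswith(beta):
                nxt = (q, rest[len(a):], eta + stack[len(beta):])
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
    return False

print(accepts("0011"), accepts("001"), accepts(""))   # True False True
```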

22 Context-Sensitive Language
Def. of Context-sensitive Grammar G = ( V, Σ, S, P )
Each production rule x → y satisfies ||x|| ≤ ||y||, where x ∈ (V ∪ Σ)* V (V ∪ Σ)* and y ∈ (V ∪ Σ)*.
(Note) Each production rule can be converted into the form u A v → u w v, where A is sensitive to the context u and v.
(Def.) Recall the context-free grammar G = ( V, Σ, S, P ), where all production rules have the form A → x, with A ∈ V and x ∈ (V ∪ Σ)*.
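The length condition ||x|| ≤ ||y|| is easy to check mechanically; the sketch below (added here for illustration, with rules represented as plain string pairs) verifies it for a small rule set, including the aAa / bAb rules that appear two slides later.

```python
# Check the context-sensitive (non-contracting) condition ||x|| <= ||y|| on rules (x, y).
def is_noncontracting(rules):
    return all(len(x) <= len(y) for x, y in rules)

rules = [("aAa", "aba"), ("bAb", "bbabb"), ("S", "aSc")]
print(is_noncontracting(rules))              # True
print(is_noncontracting([("AB", "a")]))      # False: ||AB|| > ||a||
```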

23 Context-Sensitive Language
The reason why we call these grammars context-sensitive is that we can rewrite the productions of any context-sensitive grammar to be in this form:
xAy → xvy
where x, y, and v are strings of any combination of variables and terminals, v is nonnull, and A is a single variable.
A goes to v in the context of x on the left and y on the right.
cf. In a context-free grammar, the productions all have a single variable on the left, such as: A → aa

24 Context-Sensitive Language
However, in a context-sensitive grammar, we might have two rules such as:
aAa → aba
bAb → bbabb
which say that we can replace A with a single b when it is in the middle of two a's, but we replace it with bab when it is in the middle of two b's.

25 Context-Sensitive Language
Clearly, context-sensitive rules give a grammar more power. A context-sensitive grammar can use the surrounding characters to decide to do different things with a variable, instead of always having to do the same thing every time.
All productions in context-sensitive grammars are non-decreasing or non-contracting; that is, they never result in the length of the intermediate string being reduced.
(Def. of Context-sensitive Grammar G = ( V, Σ, S, P ): each production rule x → y satisfies ||x|| ≤ ||y||.)

26 Linear bounded automaton
A Turing machine that has the length of its tape limited to the length of the input string is called a linear-bounded automaton (LBA).

27 Turing Machine
Turing Machine (TM)
It consists of a tape and a controller.
The readable/writable tape head can move both to the left and to the right.
(The slide compares the controllers of the TM, the FA, and the PDA side by side.)

28 Turing Machine
Def. Turing Machine M = ( Q, Σ, Γ, δ, q0, H )
Q : a finite set of states
Σ : an input alphabet
Γ : a tape alphabet (#: blank symbol)
δ : a function (Q − H) × Γ → Q × Γ × {L, R, S}
q0 : a start state, q0 ∈ Q
H : a set of halt states, H ⊆ Q
(L, R, S : move the tape head to the left, move it to the right, or stay.)

29 TM Operation
A TM defined this way is deterministic, like a DFA, because δ is a function.
L and R in the definition of δ mean moving the tape head to the left and to the right, respectively; S means the tape head stays in place.
A TM can keep operating after reading all the input symbols, while a DFA or a PDA must stop. Thus, a set of halt states has to be defined for a TM.
We assume that the tape head cannot move to the left when it is at the leftmost position of the tape.

30 TM Configuration
A TM configuration is ( q, u a v ), where
q : the current state
a : the symbol at the position of the tape head
u : the string on the tape to the left of a
v : the string on the tape to the right of a
Thus, a configuration is determined by the current state q, the contents of the tape, and the position of the tape head.
Start configuration : ( q0, #w ), where w is the input string.

31 An Example
Find a TM that defines L = { 0ⁿ1ⁿ | n ≥ 1 }.
# 0 0 0 1 1 1 #   ; move R until finding a 0
# a 0 0 1 1 1 #   ; replace the 0 with a & move R to the first 1
# a 0 0 b 1 1 #   ; replace the 1 with b & move L until finding the leftmost 0
# a a 0 b 1 1 #
# a a 0 b b 1 #
# a a a b b 1 #
# a a a b b b #   ; no 0 & no 1 → halt
That is, the machine ends up having read equal numbers of 0's and 1's.

32 continue
M = ( {q0,…,q5}, {0,1}, {0,1,a,b,#}, δ, q0, {q5} )
Transition table for δ (rows: current state, columns: tape symbol read):
state |     0      |     1      |     a      |     b      |     #
q0    | (q1, a, R) |            |            | (q3, b, R) | (q0, #, R)
q1    | (q1, 0, R) | (q2, b, L) |            | (q1, b, R) |
q2    | (q2, 0, L) |            | (q0, a, R) | (q2, b, L) |
q3    |            |            |            | (q3, b, R) | (q5, #, S)
q4    |            |            |            |            |
For every entry left blank: δ(q, x) = (q4, x, S), where q4 is a dead (non-halting) state.
(Recall the Def. of a Turing Machine M = ( Q, Σ, Γ, δ, q0, H ): Q a finite set of states, Σ an input alphabet, Γ a tape alphabet with blank symbol #, δ a function (Q − H) × Γ → Q × Γ × {L, R, S}, q0 a start state, H ⊆ Q a set of halt states.)

33 An Example
The head starts on the leftmost # of the input tape.
# 0 0 0 1 1 1 #   ; move R until finding a 0
# a 0 0 1 1 1 #   ; replace the 0 with a & move R to the first 1
# a 0 0 b 1 1 #   ; replace the 1 with b & move L until finding the leftmost 0
# a a 0 b 1 1 #
# a a 0 b b 1 #
# a a a b b 1 #
# a a a b b b #   ; no 0 & no 1 → halt
Step by step:
(1) # : (q0, #, R) ; in state q0 the head reads #; the next state is q0, # is written back, and the head moves Right.
(2) 0 : (q1, a, R) ; the head reads 0; the next state is q1, the 0 is replaced by a, and the head moves R.
(3) 0 : (q1, 0, R) ; the head reads 0; the next state is q1, the 0 is rewritten as 0, and the head moves R.
(4) 0 : (q1, 0, R) ; as in (3).
(5) 1 : (q2, b, L)
(6) 0 : (q2, 0, L)
(7) 0 : (q2, 0, L)
(8) a : (q0, a, R)
(9) 0 : (q1, a, R)
(10) 0 : (q1, 0, R)
(11) b : (q1, b, R)
(12) 1 : (q2, b, L)
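The step-by-step application of δ can be replayed mechanically. The sketch below (added here for illustration; the function name run, the step bound, and the tape-growing guard are my own) encodes the transition table from the previous slide, treats missing entries as the dead state q4, and accepts exactly when the machine halts in q5.

```python
# Deterministic TM for { 0^n 1^n | n >= 1 }, transition table from the slide:
# DELTA[(state, symbol)] = (next state, symbol to write, head move).
DELTA = {
    ("q0", "0"): ("q1", "a", "R"), ("q0", "b"): ("q3", "b", "R"), ("q0", "#"): ("q0", "#", "R"),
    ("q1", "0"): ("q1", "0", "R"), ("q1", "1"): ("q2", "b", "L"), ("q1", "b"): ("q1", "b", "R"),
    ("q2", "0"): ("q2", "0", "L"), ("q2", "a"): ("q0", "a", "R"), ("q2", "b"): ("q2", "b", "L"),
    ("q3", "b"): ("q3", "b", "R"), ("q3", "#"): ("q5", "#", "S"),
}
HALT = {"q5"}

def run(w, max_steps=10_000):
    tape = list("#" + w + "#")            # start configuration (q0, #w)
    state, head = "q0", 0
    for _ in range(max_steps):
        if state in HALT:
            return True                   # halted in q5: input accepted
        if head == len(tape):
            tape.append("#")              # grow the tape with blanks if needed
        key = (state, tape[head])
        if key not in DELTA:              # unlisted entries lead to the dead state q4
            return False
        state, tape[head], move = DELTA[key]
        head += {"L": -1, "R": 1, "S": 0}[move]
    return False

print(run("000111"), run("0011"), run("001"))   # True True False
```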

34 continue
State diagram of M (edges labeled "read / write, move"):
q0 : # / #, R (stay in q0), 0 / a, R (to q1), b / b, R (to q3)
q1 : 0 / 0, R and b / b, R (stay in q1), 1 / b, L (to q2)
q2 : 0 / 0, L and b / b, L (stay in q2), a / a, R (to q0)
q3 : b / b, R (stay in q3), # / #, S (to q5)
q4 : x / x, S (dead state)
Sequence of configurations on the input 000111:
q0 : # 0 0 0 1 1 1 #
q1 : # a 0 0 1 1 1 #
q2 : # a 0 0 b 1 1 #
q0 : # a 0 0 b 1 1 #
q1 : # a a 0 b 1 1 #
q2 : # a a 0 b b 1 #
q1 : # a a a b b 1 #
q2 : # a a a b b b #
q0 : # a a a b b b #
q3 : # a a a b b b #
q5 : # a a a b b b #

35 Recursively Enumerable Lang.
Def. For a TM M = ( Q, Σ, Γ, δ, q0, H ),
L(M) = { w ∈ Σ* | (q0, #w) ⊢*M (h, u a v) for some h ∈ H }.
Thus, L(M) is the set of all the strings that make the TM halt.
Recursively Enumerable Language L: there exists a TM M such that L = L(M). That is, L is Turing-recognizable!

36 Non-deterministic TM Def. Non-deterministic TM
N = ( Q, Σ, Γ, δ, q0, H )
Q : a finite set of states
Σ : an input alphabet
Γ : a tape alphabet (#: blank)
δ : a subset of ( (Q − H) × Γ ) × ( Q × Γ × {L, R, S} )
q0 : a start state, q0 ∈ Q
H : a set of halt states, H ⊆ Q
(cf.) In a deterministic TM, δ is a function (Q − H) × Γ → Q × Γ × {L, R, S}.
Non-deterministic Turing machine: a set of rules that may prescribe more than one action for a given situation.
Deterministic Turing machine: exactly one action to be performed for any given situation.

37 Language of N-TM
Def. For an N-TM N = ( Q, Σ, Γ, δ, q0, H ),
L(N) = { w ∈ Σ* | (q0, #w) ⊢*N (h, u a v) for some h ∈ H }.
Because the N-TM is non-deterministic, L(N) is the set of all the strings that have at least one transition path leading to a halt state in H.
Theorem: A (deterministic) TM M can simulate an N-TM N.
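The theorem can be made plausible with a breadth-first search over configurations: a deterministic procedure tries, level by level, every action that δ allows and accepts as soon as any branch halts. The sketch below is purely illustrative and uses a toy nondeterministic rule set of my own (it accepts strings containing the substring 01), not a machine from the slides.

```python
from collections import deque

# Nondeterministic delta: (state, symbol) -> set of (next state, write, move).
# Toy rules (illustrative): nondeterministically guess a 0 that is followed by a 1.
DELTA = {
    ("q0", "#"): {("q0", "#", "R")},                      # skip the leading blank
    ("q0", "0"): {("q0", "0", "R"), ("q1", "x", "R")},    # two choices on a 0
    ("q0", "1"): {("q0", "1", "R")},
    ("q1", "1"): {("h", "1", "S")},                       # halt if a 1 follows the chosen 0
}
HALT = {"h"}

def ntm_accepts(w, max_steps=1000):
    """Deterministically simulate the N-TM by BFS over its configurations."""
    start = ("q0", 0, "#" + w + "#")
    queue, seen = deque([start]), {start}
    for _ in range(max_steps):
        if not queue:
            return False
        state, head, tape = queue.popleft()
        if state in HALT:
            return True                   # some branch halts: w is accepted
        for nstate, write, move in DELTA.get((state, tape[head]), ()):
            ntape = tape[:head] + write + tape[head + 1:]
            nhead = head + {"L": -1, "R": 1, "S": 0}[move]
            cfg = (nstate, nhead, ntape)
            if cfg not in seen and 0 <= nhead < len(ntape):
                seen.add(cfg)
                queue.append(cfg)
    return False

print(ntm_accepts("001"), ntm_accepts("000"))   # True False
```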

