
INFO 2950 – Prof. Carla Gomes – Module: Modeling Computation: Languages and Grammars – Rosen, Chapter 12.1

Modeling Computation. Given a task: can it be performed by a computer? We learned earlier that some tasks are unsolvable. For the tasks that can be performed by a computer, how can they be carried out? We learned earlier the concept of an algorithm – a description of a computational procedure. How can we model the computer itself, and what it is doing when it carries out an algorithm? Models of computation – we want to model the abstract process of computation itself.

We'll cover three types of structures used in modeling computation:
Grammars – used to generate the sentences of a language and to determine whether a given sentence is in a language. Formal languages, generated by grammars, provide models for programming languages (Java, C, etc.) as well as natural language – important for constructing compilers.
Finite-state machines (FSMs) – characterized by a set of states, an input alphabet, and a transition function that assigns a next state to each pair of a state and an input. We'll study FSMs with and without output. They are used in language recognition (they are equivalent to certain grammars), but also for other tasks such as controlling vending machines.
Turing machines – an abstraction of a computer; used to compute number-theoretic functions.

Early Models of Computation
– Recursive function theory – Kleene, Church, Turing, Post, 1930's (before computers!!)
– Turing machines – Turing, 1940's (defined: computable)
– RAM machines – von Neumann, 1940's ("real computer")
– Cellular automata – von Neumann, 1950's (Wolfram 2005; physics of our world?)
– Finite-state machines, pushdown automata – various people, 1950's
– VLSI models – 1970's (integrated circuits made of thousands of transistors forming a single chip)
– Parallel RAMs, etc. – 1980's

Computers as Transition Functions
A computer (or really any physical system) can be modeled as having, at any given time, a specific state s ∈ S from some (finite or infinite) state space S. Also, at any time, the computer receives an input symbol i ∈ I and produces an output symbol o ∈ O, where I and O are sets of symbols. Each "symbol" can encode an arbitrary amount of data. A computer can then be modeled as simply being a transition function T: S×I → S×O. Given the old state and the input, this tells us what the computer's new state and its output will be a moment later. Every model of computing we'll discuss can be viewed as just being some special case of this general picture.
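
To make this picture concrete, here is a minimal Python sketch of a machine given purely by a transition function T: S×I → S×O. The particular states and symbols (a toy parity counter) are invented for illustration; they are not part of the slides.

```python
def T(state, symbol):
    """Transition function of a toy 'parity counter': the state records whether
    an even or odd number of 1's has been seen; the output echoes that parity."""
    if symbol == "1":
        new_state = "odd" if state == "even" else "even"
    else:
        new_state = state
    return new_state, new_state      # (next state, output symbol)

def run(transition, start, inputs):
    """Drive the machine over a sequence of input symbols, collecting outputs."""
    state, outputs = start, []
    for sym in inputs:
        state, out = transition(state, sym)
        outputs.append(out)
    return state, outputs

print(run(T, "even", list("1101")))  # -> ('odd', ['odd', 'even', 'even', 'odd'])
```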

Language Recognition Problem
Let a language L be any set of some arbitrary objects s, which will be dubbed "sentences" – the "legal" or "grammatically correct" sentences of the language. The language recognition problem for L is: given a sentence s, is it a legal sentence of the language L? That is, is s ∈ L? Surprisingly, this simple problem is as general as our very notion of computation itself! Hmm… Ex: an addition "language" whose sentences have the form "num1-num2-(num1+num2)".
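
For instance, a recognizer for that addition "language" is only a few lines of code. The dash-separated decimal encoding below (e.g. "2-3-5" for 2 + 3 = 5) is an assumed concrete reading of the slide's "num1-num2-(num1+num2)" format.

```python
def in_addition_language(s: str) -> bool:
    """Recognizer sketch: a string "x-y-z" of decimal numbers is a legal
    sentence exactly when z = x + y (assumed encoding of the slide's example)."""
    parts = s.split("-")
    if len(parts) != 3 or not all(p.isdigit() for p in parts):
        return False
    x, y, z = map(int, parts)
    return x + y == z

print(in_addition_language("2-3-5"))   # True:  2 + 3 = 5
print(in_addition_language("2-3-6"))   # False: 2 + 3 != 6
```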

Languages and Grammars Finite-State Machines with Output Finite-State Machines with No Output Language Recognition Turing Machines

Languages & Grammars Phrase-Structure Grammars Types of Phrase-Structure Grammars Derivation Trees Backus-Naur Form

Intro to Languages. English grammar tells us whether a given combination of words is a valid sentence. The syntax of a sentence concerns its form, while the semantics concerns its meaning. E.g., "the mouse wrote a poem": from a syntax point of view this is a valid sentence; from a semantics point of view, not so fast… perhaps in Disneyland. Natural languages (English, French, Portuguese, etc.) have very complex rules of syntax that are not necessarily well defined.

Formal Language. A formal language is specified by a well-defined set of rules of syntax. We describe the sentences of a formal language using a grammar. Two key questions: 1 – is a combination of words a valid sentence in a formal language? 2 – how can we generate the valid sentences of a formal language? Formal languages provide models for both natural languages and programming languages.

Grammars A formal grammar G is any compact, precise mathematical definition of a language L. –As opposed to just a raw listing of all of the language’s legal sentences, or just examples of them. A grammar implies an algorithm that would generate all legal sentences of the language. –Often, it takes the form of a set of recursive definitions. A popular way to specify a grammar recursively is to specify it as a phrase-structure grammar.

Grammars (semi-formal). Example: a grammar that generates a subset of the English language (the grammar's rules are shown graphically on the slide).


A derivation of "the boy sleeps" is shown on the slide.

A derivation of "a dog runs" is shown on the slide.

Language of the grammar: L = { "a boy runs", "a boy sleeps", "the boy runs", "the boy sleeps", "a dog runs", "a dog sleeps", "the dog runs", "the dog sleeps" }

Notation: variable (non-terminal) symbols of the vocabulary, terminal symbols of the vocabulary, and production rules.

Basic Terminology
► A vocabulary/alphabet V is a finite nonempty set of elements called symbols. Example: V = {a, b, c, A, B, C, S}
► A word/sentence over V is a string of finite length of elements of V. Example: Aba
► The empty/null string λ is the string with no symbols.
► V* is the set of all words over V. Example: Aba, BBa, bAA, cab, … are elements of V*.
► A language over V is a subset of V*. We can give some criteria for a word to be in a language.

Phrase-Structure Grammars
A phrase-structure grammar (abbr. PSG) G = (V, T, S, P) is a 4-tuple, in which:
– V is a vocabulary (set of symbols): the "template vocabulary" of the language.
– T ⊆ V is a set of symbols called terminals: actual symbols of the language. Also, N :≡ V − T is a set of special symbols called nonterminals (representing concepts like "noun").
– S ∈ N is a special nonterminal, the start symbol (in our example the start symbol was "sentence").
– P is a set of productions (to be defined): rules for substituting one sentence fragment for another. Every production rule must contain at least one nonterminal on its left side.

Phrase-Structure Grammar
► EXAMPLE: Let G = (V, T, S, P), where V = {a, b, A, B, S}, T = {a, b}, S is the start symbol, and P = {S → ABa, A → BB, B → ab, A → Bb}. G is a phrase-structure grammar. What sentences can be generated with this grammar?
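
One way to answer that question is to apply the productions mechanically until only terminals remain. The following sketch does a breadth-first search over sentential forms for this particular grammar; the length bound max_len is an assumption added only to keep the search finite.

```python
from collections import deque

# Grammar from the example: S -> ABa, A -> BB, A -> Bb, B -> ab
productions = [("S", "ABa"), ("A", "BB"), ("A", "Bb"), ("B", "ab")]
nonterminals = {"S", "A", "B"}

def sentences(start="S", max_len=12):
    """Enumerate the terminal strings derivable from `start` by breadth-first
    search over sentential forms (bounded by max_len so the search terminates)."""
    found, seen, queue = set(), {start}, deque([start])
    while queue:
        form = queue.popleft()
        if not any(c in nonterminals for c in form):
            found.add(form)              # only terminals left: a sentence of L(G)
            continue
        for lhs, rhs in productions:
            i = form.find(lhs)
            while i != -1:               # apply the production at every position
                new = form[:i] + rhs + form[i + len(lhs):]
                if len(new) <= max_len and new not in seen:
                    seen.add(new)
                    queue.append(new)
                i = form.find(lhs, i + 1)
    return found

print(sorted(sentences()))   # -> ['abababa', 'abbaba']
```

So this grammar generates exactly two sentences: abbaba (via A → Bb) and abababa (via A → BB).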

Derivation
Definition: Let G = (V, T, S, P) be a phrase-structure grammar. Let w₀ = lz₀r (the concatenation of l, z₀, and r) and w₁ = lz₁r be strings over V. If z₀ → z₁ is a production of G, we say that w₁ is directly derivable from w₀, and we write w₀ ⇒ w₁. If w₀, w₁, …, wₙ are strings over V such that w₀ ⇒ w₁, w₁ ⇒ w₂, …, wₙ₋₁ ⇒ wₙ, then we say that wₙ is derivable from w₀, and we write w₀ ⇒* wₙ. The sequence of steps used to obtain wₙ from w₀ is called a derivation.

Productions
A production p ∈ P is a pair p = (b, a) of sentence fragments a, b (not necessarily in L), which may generally contain a mix of both terminals and nonterminals. We often denote the production as b → a, read "replace b by a"; call b the "before" string and a the "after" string. It is a kind of recursive definition, meaning that if lbr ∈ L_T, then lar ∈ L_T (L_T = the set of sentence "templates"). That is, if lbr is a legal sentence template, then so is lar: we can substitute a in place of b in any sentence template.

Languages from PSGs
The recursive definition of the language L defined by the PSG G = (V, T, S, P):
– Rule 1: S ∈ L_T (L_T is L's template language). The start symbol is a sentence template (member of L_T).
– Rule 2: ∀(b → a) ∈ P, ∀ l, r ∈ V*: lbr ∈ L_T → lar ∈ L_T. Any production, after substituting into any fragment of any sentence template, yields another sentence template.
– Rule 3: ∀σ ∈ L_T: (¬∃ n ∈ N: n ∈ σ) → σ ∈ L. All sentence templates that contain no nonterminal symbols are sentences in L.
Abbreviate the substitution in Rule 2 using lbr ⇒ lar (read: "lar is directly derivable from lbr").

Language
Let G = (V, T, S, P) be a phrase-structure grammar. The language generated by G (or the language of G), denoted by L(G), is the set of all strings of terminals that are derivable from the start symbol S: L(G) = {w ∈ T* | S ⇒* w}

Language L(G)
► EXAMPLE: Let G = (V, T, S, P), where V = {a, b, A, S}, T = {a, b}, S is the start symbol, and P = {S → aA, S → b, A → aa}. The language of this grammar is L(G) = {b, aaa}:
1. We can derive aA from S using S → aA, and then derive aaa using A → aa.
2. We can also derive b using S → b.

Another example (shown graphically on the slides): a grammar G = (V, T, S, P) with V = {a, b, S} and T = {a, b}. The slides give its productions P, derive a sample sentence step by step, show other derivations, and then ask: so, what is the language of the grammar with these productions? The answer is shown on the slide.

PSG Example – English Fragment We have G = (V, T, S, P), where: V = {(sentence), (noun phrase), (verb phrase), (article), (adjective), (noun), (verb), (adverb), a, the, large, hungry, rabbit, mathematician, eats, hops, quickly, wildly} T = {a, the, large, hungry, rabbit, mathematician, eats, hops, quickly, wildly} S = (sentence) P = (see next slide)

Productions for our Language P = { (sentence) → (noun phrase) (verb phrase), (noun phrase) → (article) (adjective) (noun), (noun phrase) → (article) (noun), (verb phrase) → (verb) (adverb), (verb phrase) → (verb), (article) → a, (article) → the, (adjective) → large, (adjective) → hungry, (noun) → rabbit, (noun) → mathematician, (verb) → eats, (verb) → hops, (adverb) → quickly, (adverb) → wildly }
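
A grammar like this is easy to turn into a sentence generator. The sketch below encodes the same productions as a Python dictionary (one convenient encoding, not something prescribed by the slides) and expands nonterminals at random.

```python
import random

# Each nonterminal maps to its list of alternative right-hand sides.
grammar = {
    "sentence":    [["noun phrase", "verb phrase"]],
    "noun phrase": [["article", "adjective", "noun"], ["article", "noun"]],
    "verb phrase": [["verb", "adverb"], ["verb"]],
    "article":     [["a"], ["the"]],
    "adjective":   [["large"], ["hungry"]],
    "noun":        [["rabbit"], ["mathematician"]],
    "verb":        [["eats"], ["hops"]],
    "adverb":      [["quickly"], ["wildly"]],
}

def generate(symbol="sentence"):
    """Expand a symbol by picking one of its productions at random;
    a symbol with no rule is a terminal and is returned as-is."""
    if symbol not in grammar:
        return [symbol]
    rhs = random.choice(grammar[symbol])
    return [word for part in rhs for word in generate(part)]

print(" ".join(generate()))   # e.g. "the hungry rabbit hops quickly"
```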

A Sample Sentence Derivation
(sentence) ⇒ (noun phrase) (verb phrase) ⇒ (article) (adj.) (noun) (verb phrase) ⇒ (art.) (adj.) (noun) (verb) (adverb) ⇒ the (adj.) (noun) (verb) (adverb) ⇒ the large (noun) (verb) (adverb) ⇒ the large rabbit (verb) (adverb) ⇒ the large rabbit hops (adverb) ⇒ the large rabbit hops quickly
On each step, we apply a production to a fragment of the previous sentence template to get a new sentence template. Finally, we end up with a sequence of terminals (real words), that is, a sentence of our language L.

Another Example
Let G = ({a, b, A, B, S}, {a, b}, S, {S → ABa, A → BB, B → ab, AB → b}). One possible derivation in this grammar is: S ⇒ ABa ⇒ Aaba ⇒ BBaba ⇒ Bababa ⇒ abababa.

Derivability
Recall that the notation w₀ ⇒ w₁ means that ∃(b → a) ∈ P, ∃ l, r ∈ V*: w₀ = lbr ∧ w₁ = lar; the template w₁ is directly derivable from w₀. If ∃ w₂, …, wₙ₋₁: w₀ ⇒ w₁ ⇒ w₂ ⇒ … ⇒ wₙ, then we write w₀ ⇒* wₙ, and say that wₙ is derivable from w₀. The sequence of steps wᵢ ⇒ wᵢ₊₁ is called a derivation of wₙ from w₀. Note that the relation ⇒* is just the transitive closure of the relation ⇒.

A Simple Definition of L(G)
The language L(G) (or just L) that is generated by a given phrase-structure grammar G = (V, T, S, P) can be defined by: L(G) = {w ∈ T* | S ⇒* w}. That is, L is simply the set of strings of terminals that are derivable from the start symbol.

Defining the PSG Types
Type 0: Phrase-structure grammars – no restrictions on the production rules.
Type 1: Context-sensitive PSGs – no "after" fragment is shorter than the corresponding "before" fragment (or the "after" fragment is empty): if b → a, then |b| ≤ |a| ∨ a = λ.
Type 2: Context-free PSGs – all "before" fragments have length 1 and are nonterminals: if b → a, then |b| = 1 (b ∈ N).
Type 3: Regular PSGs – all "before" fragments have length 1 and are nonterminals, and all "after" fragments are either single terminals or a terminal followed by a nonterminal: if b → a, then a ∈ T ∨ a ∈ TN.

Types of Grammars - Chomsky hierarchy of languages Venn Diagram of Grammar Types: Type 0 – Phrase-structure Grammars Type 1 – Context-Sensitive Type 2 – Context-Free Type 3 – Regular

Classifying grammars
Given a grammar, we need to be able to find the smallest class in which it belongs. This can be determined by answering three questions (a code sketch of the procedure follows this list):
1. Are the left-hand sides of all of the productions single non-terminals?
2. If yes: does each of the productions create at most one non-terminal, and is it on the right? Yes – regular. No – context-free.
3. If not: can any of the rules reduce the length of a string of terminals and non-terminals? Yes – unrestricted. No – context-sensitive.
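
These three questions translate almost directly into code. The sketch below is a rough version of the decision procedure; productions are represented as plain (left, right) string pairs, an encoding chosen here only for illustration, and edge cases such as λ-productions in context-sensitive grammars are glossed over.

```python
def classify(productions, nonterminals):
    """Classify a grammar by the three questions above.
    Productions are (lhs, rhs) pairs of strings; "" stands for lambda."""
    def single_nonterminal(lhs):
        return len(lhs) == 1 and lhs in nonterminals

    if all(single_nonterminal(lhs) for lhs, _ in productions):
        # Regular: at most one nonterminal on the right, and it must be last.
        def regular_rhs(rhs):
            before_last = [c for c in rhs[:-1] if c in nonterminals]
            return not before_last and sum(c in nonterminals for c in rhs) <= 1
        return "regular" if all(regular_rhs(rhs) for _, rhs in productions) \
               else "context-free"
    # Otherwise: does any rule shrink the string it rewrites?
    if any(len(rhs) < len(lhs) for lhs, rhs in productions):
        return "unrestricted (type 0)"
    return "context-sensitive"

print(classify([("S", "0S"), ("S", "0")], {"S"}))                 # regular
print(classify([("S", "0S1"), ("S", "")], {"S"}))                 # context-free
print(classify([("S", "aSBC"), ("CB", "BC")], {"S", "B", "C"}))   # context-sensitive
```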

A regular grammar is one where each production takes one of the following forms (where the capital letters are non-terminals and w is a non-empty string of terminals): S → λ, S → w, S → T, S → wT. Therefore, the grammar S → 0S1, S → λ is not regular; it is context-free. Only one nonterminal can appear on the right side, and it must be at the right end of the right side. Therefore the productions A → aBc and S → TU are not part of a regular grammar, but the production A → abcA is.

Definition: Context-Free Grammars
A context-free grammar G = (V, T, S, P) consists of a vocabulary V, terminal symbols T, a start variable S, and productions of the form A → w, where A is a single non-terminal and w is a string of variables and terminals.

Derivation Tree of a Context-Free Grammar
► Represents a derivation as an ordered rooted tree.
► The root represents the starting symbol.
► Internal vertices represent the nonterminal symbols that arise in the productions.
► Leaves represent the terminal symbols.
► If the production A → w arises in the derivation, where w is a word, the vertex that represents A has as children vertices that represent each symbol of w, in order from left to right.

Language Generated by a Grammar
Example: Let G = ({S, A, a, b}, {a, b}, S, {S → aA, S → b, A → aa}). What is L(G)? Easy: we can just draw a tree of all possible derivations. We have S ⇒ aA ⇒ aaa, and S ⇒ b. Answer: L = {aaa, b}. The slide draws this as a tree with root S branching to aA (which leads to aaa) and to b – an example of a derivation tree, or parse tree, or sentence diagram.

Example: Derivation Tree
► Let G be a context-free grammar with the productions P = {S → aAB, A → Bba, B → bB, B → c}. The word w = acbabc can be derived from S as follows: S ⇒ aAB ⇒ a(Bba)B ⇒ acbaB ⇒ acba(bB) ⇒ acbabc. In the corresponding derivation tree, the root S has children a, A, B; the A has children B, b, a (with that B deriving c); and the rightmost B has children b, B (with that B deriving c).
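
Reading the terminals off the leaves of this tree from left to right recovers the derived word. A small sketch (nested Python tuples are used here as an ad-hoc tree encoding) makes that concrete:

```python
# The derivation tree for w = acbabc as nested (symbol, children) tuples;
# leaves are plain one-character strings.
tree = ("S", ["a",
              ("A", [("B", ["c"]), "b", "a"]),
              ("B", ["b", ("B", ["c"])])])

def frontier(node):
    """Concatenate the tree's leaves from left to right (its yield)."""
    if isinstance(node, str):
        return node
    _, children = node
    return "".join(frontier(child) for child in children)

print(frontier(tree))   # -> "acbabc"
```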

Backus-Naur Form
⟨sentence⟩ ::= ⟨noun phrase⟩ ⟨verb phrase⟩
⟨noun phrase⟩ ::= ⟨article⟩ [⟨adjective⟩] ⟨noun⟩
⟨verb phrase⟩ ::= ⟨verb⟩ [⟨adverb⟩]
⟨article⟩ ::= a | the
⟨adjective⟩ ::= large | hungry
⟨noun⟩ ::= rabbit | mathematician
⟨verb⟩ ::= eats | hops
⟨adverb⟩ ::= quickly | wildly
Square brackets [ ] mean "optional"; vertical bars mean "alternatives".

Generating Infinite Languages
A simple PSG can easily generate an infinite language. Example: S → 11S, S → 0 (T = {0, 1}). The derivations are: S ⇒ 0; S ⇒ 11S ⇒ 110; S ⇒ 11S ⇒ 1111S ⇒ 11110; and so on… L = {(11)*0} – the set of all strings consisting of some number of concatenations of 11 with itself, followed by 0.
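
Since the grammar S → 11S, S → 0 is regular, membership in L can be checked with a regular expression; a minimal sketch:

```python
import re

def in_L(s: str) -> bool:
    """Membership test for L = {(11)*0}: some number of copies of "11"
    followed by a single "0"."""
    return re.fullmatch(r"(11)*0", s) is not None

print([w for w in ["0", "110", "11110", "10", "1110"] if in_L(w)])
# -> ['0', '110', '11110']
```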

Another example
Construct a PSG that generates the language L = {0ⁿ1ⁿ | n ∈ ℕ}. (0 and 1 here represent symbols being concatenated n times, not integers being raised to the nth power.) Solution strategy: each step of the derivation should preserve the invariant that the number of 0's equals the number of 1's in the template so far, and all 0's come before all 1's. Solution: S → 0S1, S → λ.
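
This language is the textbook example of a language that is not regular (no finite-state machine can match the count of 0's against the count of 1's), yet a direct membership check is still easy to program, using the same invariant the grammar maintains:

```python
def is_0n1n(s: str) -> bool:
    """Check membership in {0^n 1^n | n >= 0}: a block of 0's followed by
    an equally long block of 1's."""
    n = len(s) // 2
    return len(s) % 2 == 0 and s == "0" * n + "1" * n

print([w for w in ["", "01", "0011", "001", "0101"] if is_0n1n(w)])
# -> ['', '01', '0011']
```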

Context-Sensitive Languages
The language {aⁿbⁿcⁿ | n ≥ 1} is context-sensitive but not context-free. A grammar for this language (with terminals a, b, c and non-terminals S, B, C) is given by:
S → aSBC | aBC
CB → BC
aB → ab
bB → bb
bC → bc
cC → cc

A derivation from this grammar is:
S ⇒ aSBC
⇒ aaBCBC (using S → aBC)
⇒ aabCBC (using aB → ab)
⇒ aabBCC (using CB → BC)
⇒ aabbCC (using bB → bb)
⇒ aabbcC (using bC → bc)
⇒ aabbcc (using cC → cc)
which derives a²b²c².
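
Although no context-free grammar generates {aⁿbⁿcⁿ | n ≥ 1}, deciding membership is again straightforward with a direct program; the hierarchy is about what grammars can generate, not about what programs can decide. A minimal check:

```python
def is_anbncn(s: str) -> bool:
    """Membership check for {a^n b^n c^n | n >= 1}: equal-length blocks of
    a's, b's and c's, in that order, with at least one of each."""
    n = len(s) // 3
    return n >= 1 and len(s) == 3 * n and s == "a" * n + "b" * n + "c" * n

print([w for w in ["abc", "aabbcc", "aabbc", "abcabc"] if is_anbncn(w)])
# -> ['abc', 'aabbcc']
```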