1 CPE 480 Natural Language Processing Lecture 5: Parser Asst. Prof. Nuttanart Facundes, Ph.D.

2 FSA for Parsing: State Transitions [Figure: a finite-state network over the states Det, Noun, Verb, and Aux, with weighted transitions (e.g. 0.5)]

3 State Transitions and Observations [Figure: the same Det/Noun/Verb/Aux network, with each state now emitting observed words such as the, a, that, bark, run, bite, can, will, did, dog, cat]

4 The State Space [Figure: a trellis over the sentence "The dog can run", with the candidate states Det, Noun, Aux, and Verb listed for each word]

5 The State Space [Figure: the same trellis as slide 4]

6 The State Space [Figure: the same trellis as slide 4]
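
The trellis in these figures can be read as a small data structure: for each word of "The dog can run", the set of states it might occupy, where a path through the trellis is one tagging of the sentence. A minimal Python sketch; the particular candidate tag sets below are illustrative rather than taken from the figures:

```python
from itertools import product

# Candidate states (POS tags) per word of "The dog can run".
# Illustrative sets; the slides' figures list Det, Noun, Aux, and Verb as the states.
state_space = {
    "the": {"Det"},
    "dog": {"Noun"},
    "can": {"Aux", "Verb", "Noun"},   # "can" is genuinely ambiguous
    "run": {"Verb", "Noun"},
}

sentence = ["the", "dog", "can", "run"]

# Enumerate every path through the trellis: this is the space a tagger
# (or, later, a parser) has to search.
for path in product(*(state_space[w] for w in sentence)):
    print(list(zip(sentence, path)))
```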

7 Syntax What is syntax for?  Grammar checkers  Question answering  Information extraction  Machine translation

8 Context-Free Grammars Capture constituency and ordering  Ordering is easy: what are the rules that govern the ordering of words and bigger units in the language?  What’s constituency? How words group into units, and how the various kinds of units behave with respect to one another

9 CFG Examples S -> NP VP NP -> Det NOMINAL NOMINAL -> Noun VP -> Verb Det -> a Noun -> flight Verb -> left
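
This toy grammar can be written down and run as-is. A minimal sketch using NLTK (the library is my choice; the slides do not name one):

```python
import nltk

# The grammar from slide 9, verbatim, in NLTK's CFG notation.
grammar = nltk.CFG.fromstring("""
S -> NP VP
NP -> Det NOMINAL
NOMINAL -> Noun
VP -> Verb
Det -> 'a'
Noun -> 'flight'
Verb -> 'left'
""")

parser = nltk.ChartParser(grammar)
for tree in parser.parse(['a', 'flight', 'left']):
    tree.pretty_print()   # draws the single tree: S over NP ("a flight") and VP ("left")
```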

10 CFGs S -> NP VP  This says that there are units called S, NP, and VP in this language  That an S consists of an NP followed immediately by a VP  Doesn’t say that that’s the only kind of S  Nor does it say that this is the only place that NPs and VPs occur

11 Generativity As with FSAs and FSTs you can view these rules as either analysis or synthesis machines  Generate strings in the language  Reject strings not in the language  Impose structures (trees) on strings in the language

12 Derivations A derivation is a sequence of rules applied to a string that accounts for that string  Covers all the elements in the string  Covers only the elements in the string

13 Derivations as Trees

14 Parsing Parsing is the process of taking a string and a grammar and returning a (many?) parse tree(s) for that string It is completely analogous to running a finite-state transducer with a tape  It’s just more powerful  Remember this means that there are languages we can capture with CFGs that we can’t capture with finite-state methods

15 Other Options Regular languages (expressions)  Too weak Context-sensitive  Too powerful (maybe)

16 Context free All it really means is that the non-terminal on the left-hand side of a rule is out there all by itself (free of context) A -> B C Means that  I can rewrite an A as a B followed by a C regardless of the context in which A is found  Or when I see a B followed by a C I can infer an A regardless of the surrounding context

17 Key Constituents (English) Sentences Noun phrases Verb phrases Prepositional phrases

18 Sentence-Types Declaratives: A plane left S -> NP VP Imperatives: Leave! S -> VP Yes-No Questions: Did the plane leave? S -> Aux NP VP WH Questions: When did the plane leave? S -> WH Aux NP VP

19 Recursion We’ll have to deal with rules such as the following where the non-terminal on the left also appears somewhere on the right (directly). Nominal -> Nominal PP [[flight] [to Boston]] VP -> VP PP [[departed Miami] [at noon]]

20 Recursion Of course, this is what makes syntax interesting flights from Denver Flights from Denver to Miami Flights from Denver to Miami in February Flights from Denver to Miami in February on a Friday Flights from Denver to Miami in February on a Friday under $300 Flights from Denver to Miami in February on a Friday under $300 with lunch

21 Recursion Of course, this is what makes syntax interesting [[flights] [from Denver]] [[[Flights] [from Denver]] [to Miami]] [[[[Flights] [from Denver]] [to Miami]] [in February]] [[[[[Flights] [from Denver]] [to Miami]] [in February]] [on a Friday]] Etc.
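
Those bracketings fall out of the recursive Nominal rule from slide 19. A hedged sketch, with a lexicon of my own added to make it runnable (NLTK again):

```python
import nltk

# Recursive rule Nominal -> Nominal PP (slide 19); lexical entries are illustrative.
grammar = nltk.CFG.fromstring("""
NP -> Nominal
Nominal -> Nominal PP | Noun
PP -> Prep NP
Noun -> 'flights' | 'Denver' | 'Miami' | 'February'
Prep -> 'from' | 'to' | 'in'
""")

parser = nltk.ChartParser(grammar)
tokens = 'flights from Denver to Miami in February'.split()
for tree in parser.parse(tokens):
    print(tree)
# More than one tree prints: the same recursion that licenses arbitrarily long
# NPs also lets each PP attach at several levels (ambiguity, revisited later).
```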

22 The Point If you have a rule like  VP -> V NP  It only cares that the thing after the verb is an NP. It doesn’t have to know about the internal affairs of that NP

23 The Point

24 Conjunctive Constructions S -> S and S  John went to NY and Mary followed him NP -> NP and NP VP -> VP and VP … In fact the right rule for English is X -> X and X

25 Problems Agreement Subcategorization Movement (for want of a better term)

26 Agreement This dog Those dogs This dog eats Those dogs eat *This dogs *Those dog *This dog eat *Those dogs eats

27 Subcategorization Sneeze: John sneezed Find: Please find [a flight to NY] NP Give: Give [me] NP [a cheaper fare] NP Help: Can you help [me] NP [with a flight] PP Prefer: I prefer [to leave earlier] TO-VP Told: I was told [United has a flight] S …

28 Subcategorization *John sneezed the book *I prefer United has a flight *Give with a flight Subcat expresses the constraints that a verb places on the number and syntactic types of arguments it wants to take (occur with).
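
In a program these facts are often stored as a table from verbs to the complement frames they license. A minimal sketch; the frame names and the shape of the table are my own, inferred from the examples on slide 27:

```python
# Subcategorization frames for the verbs on slide 27 (illustrative encoding).
SUBCAT = {
    "sneeze": [[]],               # intransitive: no complements
    "find":   [["NP"]],           # find [a flight to NY]
    "give":   [["NP", "NP"]],     # give [me] [a cheaper fare]
    "help":   [["NP", "PP"]],     # help [me] [with a flight]
    "prefer": [["TO-VP"]],        # prefer [to leave earlier]
    "told":   [["S"]],            # was told [United has a flight]
}

def allows(verb, frame):
    """True if the verb licenses this sequence of complement categories."""
    return frame in SUBCAT.get(verb, [])

print(allows("sneeze", []))       # True
print(allows("sneeze", ["NP"]))   # False: rules out "*John sneezed the book"
```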

29 So? So the various rules for VPs overgenerate.  They permit the presence of strings containing verbs and arguments that don’t go together  For example  VP -> V NP therefore Sneezed the book is a VP since “sneeze” is a verb and “the book” is a valid NP
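
The overgeneration is easy to reproduce: a grammar whose only VP rule is VP -> Verb NP will happily accept the bad string. A hedged check (grammar and lexicon are illustrative):

```python
import nltk

# A deliberately naive grammar with no subcategorization constraints.
grammar = nltk.CFG.fromstring("""
VP -> Verb NP
NP -> Det Noun
Verb -> 'sneezed'
Det -> 'the'
Noun -> 'book'
""")

parser = nltk.ChartParser(grammar)
trees = list(parser.parse(['sneezed', 'the', 'book']))
print(len(trees))   # 1 -- the grammar licenses "*sneezed the book" as a VP
```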

30 So What? Now overgeneration is a problem  The grammar is supposed to account for all and only the strings in a language

31 Possible CFG Solution S -> NP VP NP -> Det Nominal VP -> V NP … SgS -> SgNP SgVP PlS -> PlNP PlVP SgNP -> SgDet SgNom PlNP -> PlDet PlNom PlVP -> PlV NP SgVP -> SgV NP …
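
Spelled out as a runnable grammar, the number split looks like this. A hedged sketch: the lexicon is mine, the VPs are simplified to intransitives to match the slide-26 examples, and the top S rule is added only so the grammar has a single start symbol:

```python
import nltk

# Number-split grammar in the style of slide 31 (illustrative lexicon).
grammar = nltk.CFG.fromstring("""
S -> SgS | PlS
SgS -> SgNP SgVP
PlS -> PlNP PlVP
SgNP -> SgDet SgNom
PlNP -> PlDet PlNom
SgVP -> SgV
PlVP -> PlV
SgDet -> 'this'
PlDet -> 'those'
SgNom -> 'dog'
PlNom -> 'dogs'
SgV -> 'eats'
PlV -> 'eat'
""")

parser = nltk.ChartParser(grammar)
print(len(list(parser.parse('this dog eats'.split()))))   # 1: accepted
print(len(list(parser.parse('this dogs eat'.split()))))   # 0: agreement violation rejected
```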

32 CFG Solution for Agreement It works and stays within the power of CFGs  But it’s ugly  And it doesn’t scale all that well

33 Subcategorization It turns out that verb subcategorization facts will provide a key element for semantic analysis (determining who did what to whom in an event).

34 Movement Core (canonical) example  My travel agent booked the flight

35 Movement Core example  [[My travel agent] NP [booked [the flight] NP ] VP ] S I.e. “book” is a straightforward transitive verb. It expects a single NP argument within the VP, and a single NP as the subject.

36 Movement What about?  Which flight do you want me to have the travel agent book? The direct object argument to “book” isn’t appearing in the right place. It is in fact a long way from where it’s supposed to appear. And note that it’s separated from its verb by two other verbs.

37 The Point CFGs appear to be just about what we need to account for a lot of basic syntactic structure in English. But there are problems  That can be dealt with adequately, although not elegantly, by staying within the CFG framework. There are simpler, more elegant, solutions that take us out of the CFG framework (beyond its formal power)

38 Parsing Parsing with CFGs refers to the task of assigning correct trees to input strings Correct here means a tree that covers all and only the elements of the input and has an S at the top It doesn’t actually mean that the system can select the correct tree from among all the possible trees

39 Parsing As with everything of interest, parsing involves a search which involves the making of choices We’ll start with some basic methods

40 For Now Assume…  You have all the words already in some buffer  The input isn’t POS tagged  We won’t worry about morphological analysis  All the words are known

41 Top-Down Parsing Since we’re trying to find trees rooted with an S (Sentences) start with the rules that give us an S. Then work your way down from there to the words.
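
Top-down parsing is what a recursive-descent parser does: start from S, expand rules left to right, and backtrack whenever a prediction fails to match the input. A hedged sketch with NLTK's RecursiveDescentParser on the slide-9 grammar (note this strategy cannot handle left-recursive rules):

```python
import nltk

grammar = nltk.CFG.fromstring("""
S -> NP VP
NP -> Det NOMINAL
NOMINAL -> Noun
VP -> Verb
Det -> 'a'
Noun -> 'flight'
Verb -> 'left'
""")

# Strictly top-down: predictions are made from S downward; trace=2 prints
# each expansion and backtrack so the search is visible.
parser = nltk.RecursiveDescentParser(grammar, trace=2)
for tree in parser.parse(['a', 'flight', 'left']):
    print(tree)
```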

42 Top Down Space

43 Bottom-Up Parsing Of course, we also want trees that cover the input words. So start with trees that link up with the words in the right way. Then work your way up from there.
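
Bottom-up parsing is the shift-reduce strategy: shift words onto a stack and reduce whenever the top of the stack matches the right-hand side of a rule. A hedged sketch with NLTK's ShiftReduceParser on the same toy grammar:

```python
import nltk

grammar = nltk.CFG.fromstring("""
S -> NP VP
NP -> Det NOMINAL
NOMINAL -> Noun
VP -> Verb
Det -> 'a'
Noun -> 'flight'
Verb -> 'left'
""")

# Strictly bottom-up: words are shifted and reduced to ever larger constituents.
# NLTK's implementation keeps no backtrack points, so it can miss parses that a
# fuller bottom-up search would find.
parser = nltk.ShiftReduceParser(grammar, trace=2)
for tree in parser.parse(['a', 'flight', 'left']):
    print(tree)
```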

44 Bottom-Up Space

45 Bottom-Up Space

46 Control Of course, in both cases we left out how to keep track of the search space and how to make choices  Which node to try to expand next  Which grammar rule to use to expand a node

47 Top-Down and Bottom-Up Top-down  Only searches for trees that can be answers (i.e. S’s)  But also suggests trees that are not consistent with any of the words Bottom-up  Only forms trees consistent with the words  But suggests trees that make no sense globally

48 Problems Even with the best filtering, backtracking methods are doomed if they don’t address certain problems  Ambiguity  Shared subproblems

49 Ambiguity
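
As a concrete illustration (my own example, in the spirit of the flight NPs earlier): the recursive Nominal grammar sketched earlier already assigns two trees to one string, because the final PP can attach at two different levels:

```python
import nltk

grammar = nltk.CFG.fromstring("""
NP -> Nominal
Nominal -> Nominal PP | Noun
PP -> Prep NP
Noun -> 'flights' | 'Denver' | 'Miami'
Prep -> 'from' | 'to'
""")

parser = nltk.ChartParser(grammar)
trees = list(parser.parse('flights from Denver to Miami'.split()))
print(len(trees))   # 2: "to Miami" modifies either "Denver" or "flights from Denver"
for t in trees:
    t.pretty_print()
```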

50 Shared Sub-Problems No matter what kind of search (top-down, bottom-up, or mixed) we choose, we don’t want to unnecessarily redo work we’ve already done.

51 Shared Sub-Problems Consider  A flight from Indianapolis to Houston on TWA

52 Shared Sub-Problems Assume a top-down parse making bad initial choices on the Nominal rule. In particular…  Nominal -> Nominal Noun  Nominal -> Nominal PP
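
These slides pose the problem; the usual remedy, which goes beyond what is shown here, is dynamic programming: record every completed constituent in a chart so it is built once and reused however often the search revisits that span. NLTK's ChartParser works this way; a hedged sketch with an illustrative grammar and lexicon for the slide-51 phrase:

```python
import nltk

# Nominal rules from slide 52 plus a minimal lexicon of my own.
grammar = nltk.CFG.fromstring("""
NP -> Det Nominal | PropNoun
Nominal -> Nominal Noun | Nominal PP | Noun
PP -> Prep NP
Det -> 'a'
Noun -> 'flight'
PropNoun -> 'Indianapolis' | 'Houston' | 'TWA'
Prep -> 'from' | 'to' | 'on'
""")

tokens = 'a flight from Indianapolis to Houston on TWA'.split()

# The chart stores each constituent (edge) exactly once; e.g. the PP
# "from Indianapolis" is built one time and reused by every larger hypothesis,
# instead of being re-derived after each bad choice of Nominal rule.
parser = nltk.ChartParser(grammar)
chart = parser.chart_parse(tokens)
print(chart.num_edges())                          # edges stored in the shared table
print(len(list(chart.parses(grammar.start()))))   # complete NP parses recovered from it
```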

53 Shared Sub-Problems

54 Shared Sub-Problems

55 Shared Sub-Problems

56 Shared Sub-Problems