Features and Unification Read J & M Chapter 11.

Solving the Agreement Problem

Number agreement:

S → NP VP      * Mary walk.       ⟨NP NUMBER⟩ = ⟨VP NUMBER⟩
NP → Det N     * those flight     ⟨DET NUMBER⟩ = ⟨N NUMBER⟩

Without this check, we'd have more ambiguity:
Flying planes is dangerous.
Flying planes are dangerous.
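
Not from the slides, but to make the idea concrete: a minimal Python sketch (all names illustrative) of checking the NUMBER constraint when S → NP VP applies, instead of multiplying categories like NP_sg / NP_pl.

```python
# Sketch: one rule plus an agreement check, instead of NP_sg/NP_pl category
# pairs. The structures and names are illustrative, not from the slides.

def agrees(np_feats, vp_feats):
    """S -> NP VP is licensed only if the NUMBER features are compatible."""
    n1, n2 = np_feats.get("NUMBER"), vp_feats.get("NUMBER")
    # An unspecified NUMBER (None) is compatible with anything.
    return n1 is None or n2 is None or n1 == n2

mary  = {"CAT": "NP", "NUMBER": "SG"}
walk  = {"CAT": "VP", "NUMBER": "PL"}
walks = {"CAT": "VP", "NUMBER": "SG"}

print(agrees(mary, walk))   # False -> "* Mary walk." is rejected
print(agrees(mary, walks))  # True  -> "Mary walks." is accepted
```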

Solving the Agreement Problem

Subcategorization of verbs:

VP → V VP_to     * Mary decided going.     [SUBCAT VP_to]
VP → V VP_ing    Mary imagined going.      [SUBCAT VP_ing]

Subcategorization of Verbs

Subcategorization of Verbs – Example

Subcat    Example
Quo       asked [Quo "What was it like?"]
NP        asking [NP a question]
Swh       asked [Swh what trades you're interested in]
Sto       ask [Sto him to tell you]
PP        that means asking [PP at home]
Vto       asked [Vto to see a girl called Evelyn]
NP Sif    asked [NP him] [Sif whether he could make]
NP NP     asked [NP myself] [NP a question]
NP Swh    asked [NP him] [Swh why he took time off]

Specifying Control Information

John persuaded Mary to go.
John promised Mary to go.

Who does the going?

Subcategorization of Nouns and Adjectives

Jane has a passion for old movies.
Jane has an interest in old movies.

[ORTH  passion
 CAT   N
 HEAD  [SUBCAT [CAT   PP
                HEAD  [PREP for]]]]

Reflexive Pronouns

Mary wants to take care of herself.
* Mary wants to take care of himself.
* John and Mary want to take care of himself.
Mary wants John to take care of himself.

Properties of a Good Solution

We want a solution to this problem that:
- avoids combinatorial explosion of features
- is declarative, so that:
  - it can be used for both recognition and generation
  - the linguistic facts can be reused if we change parsing algorithms to suit particular task environments
- has a clean, formal semantics, so that we can make correct statements about what the system will do

So we reject simply writing code to handle the various cases.

Feature Structures

Reentrant Feature Structures

Unification

[NUMBER SG] ⊔ [NUMBER SG] = [NUMBER SG]
[NUMBER SG] ⊔ [NUMBER PL]   fails
[NUMBER SG] ⊔ [NUMBER []] = [NUMBER SG]
[NUMBER SG] ⊔ [PERSON 3]  = [NUMBER SG
                             PERSON 3]

Two feature structures can be unified if there is no conflict between them.
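
These four cases can be reproduced with a small sketch. The encoding below (nested Python dicts, with {} playing the role of the empty structure []) is mine, not the book's, and it ignores reentrancy (the [1]/[2] tags).

```python
# A minimal sketch of non-destructive unification over feature structures
# encoded as nested dicts; {} is the empty structure []. Illustrative only.

FAIL = None

def unify(f, g):
    """Return the most general structure that both f and g subsume, or FAIL."""
    if f == {}:                       # [] unifies with anything
        return g
    if g == {}:
        return f
    if isinstance(f, dict) and isinstance(g, dict):
        result = dict(f)
        for feat, g_val in g.items():
            if feat in result:
                sub = unify(result[feat], g_val)
                if sub is FAIL:
                    return FAIL
                result[feat] = sub
            else:
                result[feat] = g_val
        return result
    return f if f == g else FAIL      # atomic values must match exactly

print(unify({"NUMBER": "SG"}, {"NUMBER": "SG"}))  # {'NUMBER': 'SG'}
print(unify({"NUMBER": "SG"}, {"NUMBER": "PL"}))  # None (fails)
print(unify({"NUMBER": "SG"}, {"NUMBER": {}}))    # {'NUMBER': 'SG'}
print(unify({"NUMBER": "SG"}, {"PERSON": "3"}))   # {'NUMBER': 'SG', 'PERSON': '3'}
```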

Unification of Reentrant Feature Structures

S → NP VP [the example AVM diagrams from this slide did not survive the transcript]

Subsumption

A less specific (more abstract) feature structure subsumes an equal or more specific one. If F subsumes G, we write F ⊑ G.

We can define unification in terms of subsumption: F ⊔ G is the most general feature structure H such that F ⊑ H and G ⊑ H.
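
Continuing the dict encoding from the unification sketch, a minimal subsumption check (reentrancy again ignored; illustrative only):

```python
# Sketch: F subsumes G iff every feature F specifies is present in G
# with a compatible value.

def subsumes(f, g):
    if f == {}:                       # [] subsumes every structure
        return True
    if isinstance(f, dict) and isinstance(g, dict):
        return all(feat in g and subsumes(val, g[feat])
                   for feat, val in f.items())
    return f == g                     # atoms subsume only themselves

print(subsumes({"CAT": "Noun"}, {"CAT": "Noun", "NUMBER": "SG"}))  # True
print(subsumes({"CAT": "Noun", "NUMBER": "SG"}, {"CAT": "Noun"}))  # False
```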

Two Views of Subsumption

A subsumes B: A ⊑ B
- Set theoretic: {objects satisfying B} ⊆ {objects satisfying A}
- Logical: B ⊨ A

Example:
A: [Noun]
B: [Noun
    NUMBER SG]

Note that [] subsumes all other feature structures. It is the least element of the semilattice formed by the set of feature structures (the identity for unification). In the set-theoretic view, it corresponds to the universe U. In the logical view, it corresponds to ⊤.

Analogy to Theorem Proving

We use the terms "unification" and "subsumption" here in essentially the same way in which they are used in theorem proving systems:

∀x, y: P(x, f(y)) → Q(x, y)
f(3) = 8
P(2, 8)
Conclude: ?

Think of "unification" as "matching".

Unification Example – Conjunctions

Mary [VP fed the cat] and [VP swept the floor].
Mary fed the cat and Joe swept the floor.
Mary bought the [Adj red] and [Adj green] ribbons.
[Vtrans Feed] and [Vtrans water] the plants.
* [Vtrans Feed] and [Vintrans cough] the plants.
* Mary fed [NP the cat] and [Adj green].

X0 → X1 CONJ X2
⟨X0 CAT⟩ = ⟨X1 CAT⟩ ⊔ ⟨X2 CAT⟩

Adding Unification to Grammar Rules

The rule S → NP VP becomes:

[CAT    S
 HEAD   [1]
 STRUCT [SUBJECT A1
         VP      A2]]
→  A1: [CAT       NP
        AGREEMENT [2] []]
   A2: [CAT       VP
        HEAD      [1]
        AGREEMENT [2]]

Note: The STRUCT feature is used here to record the tree structure of the constituents as they are produced.

Applying the Grammar Rule

This rule says that we can build an S from two components, A1 and A2. To apply this rule to two candidate components X and Y, the parser must:

1. Copy the rule structure to create a new instance of S. (Remember that unification is destructive and we'll need to be able to reuse the rule.)
2. Unify the feature structure of X with the specification for A1 on the right side of the rule. This will succeed if the CAT of X unifies with NP; if it succeeds, it will bind [2] to the AGREEMENT structure of X and A1 to X.
3. Unify the feature structure of Y with the specification for A2 on the right side of the rule. This will succeed if the CAT of Y unifies with VP and the AGREEMENT structure of Y unifies with [2], namely the agreement structure of X. If it succeeds, it will bind [1] to the HEAD feature of Y and A2 to Y.
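
A hedged sketch of those three steps for this one rule, reusing unify() and FAIL from the unification sketch above. The shared [1]/[2] tags are simulated by explicitly unifying and reassigning the AGREEMENT values; a real implementation would use reentrant DAGs.

```python
# Sketch of applying S -> NP VP with an agreement constraint. Illustrative
# only; depends on unify() and FAIL defined in the earlier sketch.
from copy import deepcopy

def apply_s_rule(x, y):
    """Try to build an S from candidate constituents x and y."""
    x, y = deepcopy(x), deepcopy(y)          # step 1: work on fresh copies
    if unify({"CAT": "NP"}, x) is FAIL:      # step 2: x must be an NP
        return None
    if unify({"CAT": "VP"}, y) is FAIL:      # step 3: y must be a VP
        return None
    agr = unify(x.get("AGREEMENT", {}),      # the shared [2] tag
                y.get("AGREEMENT", {}))
    if agr is FAIL:
        return None
    x["AGREEMENT"] = y["AGREEMENT"] = agr
    return {"CAT": "S",
            "HEAD": y.get("HEAD"),           # the [1] tag: S's head = VP's head
            "STRUCT": {"SUBJECT": x, "VP": y}}

dogs = {"CAT": "NP", "HEAD": "dogs", "STRUCT": {"NOM": "dogs"},
        "AGREEMENT": {"NBR": "PL", "PERS": "3"}}
ran = {"CAT": "VP", "HEAD": "ran", "STRUCT": {"V": "ran"}, "AGREEMENT": {}}
print(apply_s_rule(dogs, ran))               # the S shown on the next slide
```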

Example of Applying the Rule

A1: [CAT       NP
     HEAD      dogs
     STRUCT    [NOM dogs]
     AGREEMENT [NBR  PL
                PERS 3]]
A2: [CAT       VP
     HEAD      ran
     STRUCT    [V ran]
     AGREEMENT []]

⇒  [CAT    S
    HEAD   ran
    STRUCT [SUBJECT [CAT       NP
                     HEAD      dogs
                     STRUCT    [NOM dogs]
                     AGREEMENT [2] [NBR  PL
                                    PERS 3]]
            VP      [CAT       VP
                     HEAD      ran
                     STRUCT    [V ran]
                     AGREEMENT [2]]]]

Features Don't Have to Be Identical

Mary has become [NP a lawyer] and [AdjP very rich].

If we have an algebra of feature types, then we can use it for other things, even verging into semantics:

* [NP Mary] and [NP the potatoes] cooked.

What Features to Unify and Pass Up?

[NPdet The dog] and [NPdet the cats] like going outside.
[NPdet The dog] and [NPquant all the cats] like going outside.
The dog and the cat like going outside.

X0 → X1 CONJ X2
⟨X0 CAT⟩ = ⟨X1 CAT⟩ ⊔ ⟨X2 CAT⟩
⟨X0 AGREEMENT⟩ = ???

Implementing Unification in Parsing

The simple case: the parser is following a single path. To unify two feature structures we destructively alter them to create a single structure. We change our representation to allow this: each node of the structure (e.g., [NUMBER SG, PERSON 3]) is extended with a pointer field, so that two nodes can be merged in place by pointing one at the other.
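
The diagram for that extended representation did not survive the transcript; below is a hedged sketch of the standard forwarding-pointer scheme it refers to. Class and field names are illustrative.

```python
# Sketch of destructive unification via forwarding pointers. Each node wraps
# its content in a cell that may instead point at another node.

class Node:
    def __init__(self, content):
        self.content = content      # atom, or dict of feature -> Node
        self.forward = None         # set when this node is merged into another

def deref(node):
    """Follow forwarding pointers to the node's current representative."""
    while node.forward is not None:
        node = node.forward
    return node

def unify_destructive(a, b):
    a, b = deref(a), deref(b)
    if a is b:
        return a
    if isinstance(a.content, dict) and isinstance(b.content, dict):
        for feat, b_val in b.content.items():
            if feat in a.content:
                if unify_destructive(a.content[feat], b_val) is None:
                    return None
            else:
                a.content[feat] = b_val
        b.forward = a               # both names now denote the merged node
        return a
    if a.content == b.content:
        b.forward = a
        return a
    return None                     # conflict: unification fails

f = Node({"NUMBER": Node("SG")})
g = Node({"PERSON": Node("3")})
unify_destructive(f, g)
print({k: deref(v).content for k, v in deref(f).content.items()})
# {'NUMBER': 'SG', 'PERSON': '3'}
```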

Adding Unification to the Earley Algorithm

1. Use unification to check constraints before applying each grammar rule.
2. Copy the feature structures before unifying them so that the chart still works: a given entry in the chart may become part of (i.e., be unified with) multiple other entries as the parser follows multiple paths.
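
A rough sketch of point 2, reusing unify() and FAIL from the unification sketch above. The State shape here is illustrative, not a full Earley parser: the completer unifies copies, so one chart entry can combine with many different waiting states.

```python
# Sketch: unify copies so chart entries stay reusable across parse paths.
from copy import deepcopy
from dataclasses import dataclass, field

@dataclass
class State:
    rhs_feats: list                  # feature structure expected per RHS slot
    dot: int = 0
    found: list = field(default_factory=list)

def complete(waiting, completed_feats):
    """Advance `waiting` over one completed constituent, or return None."""
    merged = unify(deepcopy(waiting.rhs_feats[waiting.dot]),
                   deepcopy(completed_feats))
    if merged is FAIL:
        return None                  # constraint violated: don't advance
    return State(waiting.rhs_feats, waiting.dot + 1,
                 waiting.found + [merged])

# One chart entry for "dogs" can feed any number of waiting states:
np_dogs = {"CAT": "NP", "AGREEMENT": {"NBR": "PL"}}
s_rule  = State(rhs_feats=[{"CAT": "NP"}, {"CAT": "VP"}])
print(complete(s_rule, np_dogs).found)
```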

What if We Need a Basic CFG?

Suppose that we need a basic CFG, possibly because:
- run-time efficiency is critical, or
- the parser must be embedded inside a larger system, such as a speech understanding system, that requires it.

Then we can view the unification formalism as a high-level language, useful for description, and compile a unification grammar into a standard context-free grammar.
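
A toy sketch of that compilation step, under the assumption that every feature ranges over a small finite set of atomic values (names illustrative):

```python
# Sketch: multiply finite feature values out into atomic CFG categories.

NUMBERS = ["SG", "PL"]

def compile_s_rule():
    """S -> NP VP (NP NUMBER = VP NUMBER) becomes one plain rule per value."""
    return [("S", [f"NP_{n}", f"VP_{n}"]) for n in NUMBERS]

print(compile_s_rule())
# [('S', ['NP_SG', 'VP_SG']), ('S', ['NP_PL', 'VP_PL'])]
```

The price of the compilation is exactly the combinatorial growth of categories that features were introduced to avoid, which is why it is best left to a compiler rather than done by hand.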