Semantics. CS 224n / Lx 237, Tuesday, May 11, 2004. With slides borrowed from Jason Eisner.

Objects. Three kinds:
- Booleans: the semantic value of sentences.
- Entities: objects, NPs; maybe space/time specifications.
- Functions: predicates are functions returning a boolean; functions might return other functions, or take other functions as arguments.
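These three kinds of objects map directly onto values in a programming language. A minimal sketch in Python (the entities and facts below are invented for illustration): booleans for sentence denotations, plain values for entities, and curried functions for predicates.

```python
# Entities: plain values (strings here, purely illustrative).
george, laura = "George", "Laura"

# A one-place predicate: Entity -> Boolean.
def nation(x):
    return x in {"France", "Peru"}

# A two-place predicate, curried: Entity -> (Entity -> Boolean).
# Applying it to one argument returns another function.
def loves(y):
    return lambda x: (x, y) in {("George", "Laura")}

print(nation("France"))      # True: a Boolean, a sentence's value
print(loves(laura)(george))  # True: "George loves Laura"
```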

Nouns and their modifiers:
expert → λg expert(g)
big fat expert → λg big(g) ∧ fat(g) ∧ expert(g)
But: bogus expert. Wrong: λg bogus(g) ∧ expert(g). Right: λg (bogus(expert))(g) … bogus maps expert to a new concept.
Baltimore expert (white-collar expert, TV expert …): λg Related(Baltimore, g) ∧ expert(g). Or, with different intonation: λg (Modified-by(Baltimore, expert))(g).
Can't use Related for that case: law expert and dog catcher = λg Related(law,g) ∧ expert(g) ∧ Related(dog,g) ∧ catcher(g) = dog expert and law catcher.
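The contrast can be made concrete in code. A hedged sketch (the entity sets are invented, and this particular definition of bogus is just one possibility): intersective adjectives conjoin with the noun's predicate, while bogus consumes the predicate itself and returns a new one.

```python
# Intersective modification: predicate conjunction.
expert = lambda g: g in {"Alice"}
big    = lambda g: g in {"Alice", "Bob"}
fat    = lambda g: g in {"Alice"}

big_fat_expert = lambda g: big(g) and fat(g) and expert(g)

# Non-intersective: bogus maps the predicate expert to a NEW predicate;
# a bogus expert need not satisfy expert at all.
def bogus(pred):
    return lambda g: g in {"Carol"} and not pred(g)

print(big_fat_expert("Alice"))  # True
print(bogus(expert)("Carol"))   # True, yet expert("Carol") is False
```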

Compositional Semantics. We've discussed what semantic representations should look like. But how do we get them from sentences? First, parse to get a syntax tree. Second, look up the semantics for each word. Third, build the semantics for each constituent, working from the bottom up: the syntax tree is a "recipe" for how to do it.
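A sketch of that recipe in Python, assuming the parse tree has already been built (the tuple tree encoding, lexicon, and rule table below are our own illustration, not the course's):

```python
# Bottom-up semantic construction. A node is (rule, children);
# a leaf is (POS, word).

lexicon = {"George": "G", "Laura": "L",
           "loves": lambda y: lambda x: ("loves", x, y)}

# Each rule says how to combine its children's meanings.
rules = {"S -> NP VP": lambda np, vp: vp(np),   # feed subject to the VP
         "VP -> V NP": lambda v, np: v(np)}     # feed object to the V

def build_sem(node):
    name, children = node
    if isinstance(children, str):                # leaf: look up the word
        return lexicon[children]
    return rules[name](*[build_sem(c) for c in children])

tree = ("S -> NP VP",
        [("NP", "George"),
         ("VP -> V NP", [("V", "loves"), ("NP", "Laura")])])
print(build_sem(tree))                           # ('loves', 'G', 'L')
```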

Compositional Semantics. Add a "sem" feature to each context-free rule:
S → NP loves NP
S[sem=loves(x,y)] → NP[sem=x] loves NP[sem=y]
The meaning of S depends on the meanings of the NPs.
[Tree diagrams: S(NP, VP(V loves, NP)) annotated λx λy loves(x,y); and S(NP, VP(V kicked, NP the bucket)) annotated λx died(x), so the idiom "kicked the bucket" gets its own meaning.]
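As a tiny illustration (our encoding), the sem feature attaches a template to each rule; the children's meanings fill the slots, and an idiomatic rule can map straight to its own predicate:

```python
# sem templates for two rules, as functions of the children's sems.
s_loves = lambda x, y: f"loves({x},{y})"     # S -> NP loves NP
s_idiom = lambda x: f"died({x})"             # S -> NP kicked the bucket

print(s_loves("G", "L"))   # loves(G,L)
print(s_idiom("G"))        # died(G): the idiom has its own meaning
```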

Compositional Semantics. Instead of
S → NP loves NP
S[sem=loves(x,y)] → NP[sem=x] loves NP[sem=y]
we might want general rules like S → NP VP:
V[sem=loves] → loves
VP[sem=v(obj)] → V[sem=v] NP[sem=obj]
S[sem=vp(subj)] → NP[sem=subj] VP[sem=vp]
Now George loves Laura has sem = loves(Laura)(George). In this manner we'll sketch a version where we still compute semantics bottom-up, and the grammar is in Chomsky Normal Form, so each node has 2 children: one function and one argument. To get a node's semantics, apply the function to the argument!
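With CNF and curried meanings, the glue shrinks to a single operation: apply one child's meaning to the other's. A sketch with string-encoded logical forms (the encoding is ours):

```python
# Every binary node: parent sem = function child applied to argument child.
loves = lambda obj: lambda subj: f"loves({subj},{obj})"

vp = loves("Laura")   # VP[sem=v(obj)] -> V[sem=v] NP[sem=obj]
s  = vp("George")     # S[sem=vp(subj)] -> NP[sem=subj] VP[sem=vp]
print(s)              # loves(George,Laura), i.e. loves(Laura)(George)
```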

Example: Every nation wants George to love Laura.

[Parse tree, shown on each of the following slides: START → S_fin Punc(.); S_fin → NP(Det Every, N nation) VP_fin; VP_fin → T(-s) VP_stem; VP_stem → V_stem(want) S_inf; S_inf → NP(George) VP_inf; VP_inf → T(to) VP_stem; VP_stem → V_stem(love) NP(Laura).]

Working bottom-up, one node at a time:

loves(G,L) is the meaning that we want at S_inf: how can we arrange to get it?

NP George → G. What function should apply to G to yield the desired result? (This is like division!)

VP_inf → λx loves(x,L).

We'll say that "to" is just a bit of syntax that changes a VP_stem to a VP_inf with the same meaning: T to → λa a, so VP_stem love Laura is also λx loves(x,L).

That in turn comes from V_stem love → λy λx loves(x,y) applied to NP Laura → L.

By analogy, V_stem want → λy λx wants(x,y), applied to S_inf → loves(G,L), gives VP_stem want George to love Laura → λx wants(x, loves(G,L)).

T -s → λv λx present(v(x)), so VP_fin → λx present(wants(x, loves(G,L))).

Det Every → λn every(n) and N nation → nation give NP → every(nation), so S_fin → present(wants(every(nation), loves(G,L))).

Punc . → λs assert(s).

In Summary: From the Words. George → G, to → λa a, love → λy λx loves(x,y), Laura → L, want → λy λx wants(x,y), -s → λv λx present(v(x)), Every → λn every(n), nation → nation, . → λs assert(s). Result: assert(present(wants(every(nation), loves(G,L)))).
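The same derivation, run as code: the word meanings from the summary slide as Python lambdas, with logical forms encoded as strings (the string encoding is ours; the lambdas mirror the slides):

```python
love  = lambda y: lambda x: f"loves({x},{y})"
want  = lambda y: lambda x: f"wants({x},{y})"
to    = lambda a: a                              # pure syntax: identity
tense = lambda v: lambda x: f"present({v(x)})"   # the -s morpheme
every = lambda n: f"every({n})"
punc  = lambda s: f"assert({s})"                 # "." asserts the S

vp_love = love("L")            # λx loves(x,L)
s_inf   = to(vp_love)("G")     # loves(G,L)
vp_want = want(s_inf)          # λx wants(x, loves(G,L))
vp_fin  = tense(vp_want)       # λx present(wants(x, loves(G,L)))
print(punc(vp_fin(every("nation"))))
# assert(present(wants(every(nation),loves(G,L))))
```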

So now what? Now that we have the semantic meaning, what do we do with it? There is a huge literature on logical reasoning and knowledge learning. Reasoning versus inference:
"John ate a pizza." Q: What was eaten by John? A: A pizza.
"John ordered a pizza, but it came with anchovies. John then yelled at the waiter and stormed out." Q: What was eaten by John? A: Nothing.

Problem 1a. Write grammar rules, complete with semantic translations, that could be added to the grammar fragment and that will parse the above sentence and generate a semantic representation using the "own" predicate.