CS 4705 Semantic Analysis: Syntax-Driven Semantics

Semantics and Syntax
Some representations of meaning for "The cat howls at the moon.":
– Logic: howl(cat, moon)
– Frames:
    Event: howling
    Agent: cat
    Patient: moon
What shall we represent?
– Entities, categories, events, time, aspect
– Predicates, arguments (constants, variables)
– And… quantifiers, operators (e.g. temporal)
How do we compute such representations from a natural language sentence?

Compositional Semantics
Assumption: the meaning of the whole is composed of the meanings of its parts.
– George cooks. Dan eats. Dan is sick.
    cook(George)   eat(Dan)   sick(Dan)
– George cooks and Dan eats.
    cook(George) ^ eat(Dan)
– George cooks or Dan is sick.
    cook(George) v sick(Dan)
– If George cooks, Dan is sick.
    cook(George) → sick(Dan), or ~cook(George) v sick(Dan)
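A minimal sketch of this idea in Python (the function names like fopc_and are ours, for illustration, and meanings are encoded as plain FOPC strings): each connective composes the meanings of the parts into the meaning of the whole.

    # Sentence meanings as FOPC strings; connectives compose them.
    def fopc_and(p, q):   # 'George cooks and Dan eats'
        return f"({p} ^ {q})"

    def fopc_or(p, q):    # 'George cooks or Dan is sick'
        return f"({p} v {q})"

    def fopc_if(p, q):    # 'If George cooks, Dan is sick' (equivalently ~p v q)
        return f"({p} → {q})"

    cooks, eats, sick = "cook(George)", "eat(Dan)", "sick(Dan)"
    print(fopc_and(cooks, eats))              # (cook(George) ^ eat(Dan))
    print(fopc_if(fopc_and(cooks, eats), sick))
    # ((cook(George) ^ eat(Dan)) → sick(Dan))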

– If George cooks and Dan eats, Dan will get sick.
    (cook(George) ^ eat(Dan)) → sick(Dan)
    sick(Dan) → (cook(George) ^ eat(Dan)) ??
But the mapping from sentence to formula is not always so direct:
– Dan only gets sick when George cooks.
– You can have apple juice or orange juice.
– Mary got married and had a baby.
– George cooks but Dan eats.
– George cooks and eats Dan.

Meaning derives from:
– The entities and actions/states represented (predicates and arguments, or nouns and verbs)
– The way they are ordered and related: the syntax of the representation may correspond to the syntax of the sentence
Can we develop a mapping between syntactic representations and formal representations of meaning?

Syntax-Driven Semantics
[Figure: parse tree for "Dan eats": S → NP VP, with NP → Nom → N → Dan and VP → V → eats; the S node is annotated with eat(Dan)]
Goal: link syntactic structures to a corresponding semantic representation, producing a representation of the 'meaning' of a sentence while parsing it.

Specific vs. General-Purpose Rules
Don't want to have to specify, for every possible parse tree, what semantic representation it maps to.
Do want to identify general mappings from parse trees to semantic representations.
One way:
– Augment lexicon and grammar
– Devise mapping between rules of grammar and rules of semantic representation
– Rule-to-Rule Hypothesis: such a mapping exists

Semantic Attachment
Extend every grammar rule with 'instructions' on how to map the components of the rule to a semantic representation, e.g.
    S → NP VP  {VP.sem(NP.sem)}
Each semantic function is defined in terms of the semantic representation of choice.
Problem: how to define semantic functions, and how to specify their composition, so we always get the 'right' meaning representation from the grammar.
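One way to picture such attachments concretely (a hedged sketch; the dict-of-rules encoding is our illustrative choice, not a standard API): pair each CFG rule with a Python function that builds the parent's .sem from its children's .sem values, in rule order.

    # Each rule is paired with an attachment computing the parent's .sem
    # from the children's .sem values (arguments arrive in rule order).
    ATTACHMENTS = {
        ("S",  ("NP", "VP")):     lambda np_sem, vp_sem: vp_sem(np_sem),  # {VP.sem(NP.sem)}
        ("VP", ("V", "NP")):      lambda v_sem, np_sem: v_sem(np_sem),    # {V.sem(NP.sem)}
        ("NP", ("ProperNoun",)):  lambda pn_sem: pn_sem,                  # pass-through
    }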

Example: McDonalds serves burgers.
Associating constants with constituents:
– ProperNoun → McDonalds  {McDonalds}
– PluralNoun → burgers  {burgers}
Defining functions to produce these from input:
– NP → ProperNoun  {ProperNoun.sem}
– NP → PluralNoun  {PluralNoun.sem}
– Assumption: meaning representations of children are passed up to parents for non-branching rules (e.g. ProperNoun.sem(X) = X)
But… verbs are where the action is.

– V → serves  {E(e,x,y) (Isa(e,Serving) ^ Server(e,x) ^ Served(e,y))}
  where e = event, x = agent, y = patient
– Will every verb need its own distinct representation?
    McDonalds hires students.  Predicate(Agent, Patient)
    McDonalds gave customers a bonus.  Predicate(Agent, Patient, Beneficiary)
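A sketch of why the event-style representation helps (string-encoded FOPC as before; the role names Giver/Given for 'gave' are our assumption, chosen for illustration): extra participants become extra conjuncts on the same event variable, rather than a change in the predicate's arity.

    def serves_sem(agent, patient):
        # E(e) (Isa(e,Serving) ^ Server(e,agent) ^ Served(e,patient))
        return (f"E(e) (Isa(e,Serving) ^ Server(e,{agent})"
                f" ^ Served(e,{patient}))")

    def gave_sem(agent, patient, beneficiary):
        # Role names Giver/Given/Beneficiary assumed for illustration.
        return (f"E(e) (Isa(e,Giving) ^ Giver(e,{agent})"
                f" ^ Given(e,{patient}) ^ Beneficiary(e,{beneficiary}))")

    print(serves_sem("McDonalds", "burgers"))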

Composing Semantic Constituents
Once we have the semantics for each constituent, how do we combine them?
– E.g. VP → V NP  {V.sem(NP.sem)}
– If the goal for the VP semantics of 'serve' is the representation
    E(e,x) (Isa(e,Serving) ^ Server(e,x) ^ Served(e,Meat))
  then VP.sem must tell us:
    Which variables are to be replaced by which arguments?
    How is the replacement accomplished?

First… Lambda Notation
An extension to First Order Predicate Calculus:
    λx P(x): λ + variable(s) + an FOPC expression in those variables
Lambda reduction: apply a lambda-expression to logical terms to bind the lambda-expression's parameters to those terms:
    λx P(x)
    λx P(x)(car)
    P(car)
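Python's lambda mirrors the λ-operator directly, which makes the reduction step concrete (a toy sketch with meanings as strings):

    P = lambda x: f"P({x})"   # λx P(x): abstraction builds a function
    print(P("car"))           # applying λx P(x) to car reduces to: P(car)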

For NLP Semantics
The parameter list (e.g. the x in λx) in a lambda expression makes the variables (x) in the logical expression (P(x)) available for binding to external arguments (car) provided by the semantics of other constituents.

Defining VP Semantics
Recall we have VP → V NP  {V.sem(NP.sem)}
The target semantic representation is:
    {E(e,x,y) (Isa(e,Serving) ^ Server(e,y) ^ Served(e,x))}
Define V.sem as:
    {λx E(e,y) (Isa(e,Serving) ^ Server(e,y) ^ Served(e,x))}
– Now 'x' will be available for binding when V.sem is applied to the NP.sem of the direct object.

V.sem Applied to McDonalds serves burgers
λ-application binds x to the value of NP.sem (burgers):
    λx E(e,y) (Isa(e,Serving) ^ Server(e,y) ^ Served(e,x)) (burgers)
λ-reduction replaces x within the λ-expression with burgers. The value of V.sem(NP.sem) is now:
    E(e,y) (Isa(e,Serving) ^ Server(e,y) ^ Served(e,burgers))
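The same reduction as a Python sketch (string-encoded FOPC, as above): V.sem is a one-parameter function, and applying it to NP.sem substitutes burgers for x.

    v_sem = lambda x: f"E(e,y) (Isa(e,Serving) ^ Server(e,y) ^ Served(e,{x}))"
    vp_sem = v_sem("burgers")   # λ-reduction binds x to burgers
    print(vp_sem)
    # E(e,y) (Isa(e,Serving) ^ Server(e,y) ^ Served(e,burgers))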

But we're not done yet…
We need to define semantics for
– S → NP VP  {VP.sem(NP.sem)}
– Where is the subject?
    E(e,y) (Isa(e,Serving) ^ Server(e,y) ^ Served(e,burgers))
– We need another λ-expression in V.sem so the subject NP can be bound later in VP.sem.
– V.sem, version 2:
    λx λy E(e) (Isa(e,Serving) ^ Server(e,y) ^ Served(e,x))

– VP → V NP  {V.sem(NP.sem)}
    λx λy E(e) (Isa(e,Serving) ^ Server(e,y) ^ Served(e,x))(burgers)
    λy E(e) (Isa(e,Serving) ^ Server(e,y) ^ Served(e,burgers))
– S → NP VP  {VP.sem(NP.sem)}
    λy E(e) (Isa(e,Serving) ^ Server(e,y) ^ Served(e,burgers))(McDonald's)
    E(e) (Isa(e,Serving) ^ Server(e,McDonald's) ^ Served(e,burgers))

What is our grammar now?
    S → NP VP  {VP.sem(NP.sem)}
    VP → V NP  {V.sem(NP.sem)}
    V → serves  {λx λy E(e) (Isa(e,Serving) ^ Server(e,y) ^ Served(e,x))}
    NP → ProperNoun  {ProperNoun.sem}
    NP → PluralNoun  {PluralNoun.sem}
    ProperNoun → McDonalds
    PluralNoun → burgers
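Putting the whole grammar together as a runnable Python sketch: λ-expressions become nested (curried) lambdas, and the hard-wired derivation below stands in for a parser.

    # V → serves  {λx λy E(e) (Isa(e,Serving) ^ Server(e,y) ^ Served(e,x))}
    serves = lambda x: lambda y: (
        f"E(e) (Isa(e,Serving) ^ Server(e,{y}) ^ Served(e,{x}))")

    mcdonalds = "McDonalds"   # ProperNoun → McDonalds, passed up by NP
    burgers = "burgers"       # PluralNoun → burgers, passed up by NP

    vp_sem = serves(burgers)      # VP → V NP  {V.sem(NP.sem)}: binds x
    s_sem = vp_sem(mcdonalds)     # S → NP VP  {VP.sem(NP.sem)}: binds y
    print(s_sem)
    # E(e) (Isa(e,Serving) ^ Server(e,McDonalds) ^ Served(e,burgers))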

But this is just the tip of the iceberg…
Terms can be complex: A restaurant serves burgers.
– 'a restaurant': Ex Isa(x,Restaurant)
– Ee Isa(e,Serving) ^ Server(e, <Ex Isa(x,Restaurant)>) ^ Served(e,burgers)
– Complex terms allow quantified expressions to appear where terms can, by providing rules to turn them into well-formed FOPC expressions.
Issues of quantifier scope: Every restaurant serves every burger.

How do we represent other constituents?
Adjective phrases: happy people, cheap food, purple socks.
Intersective semantics works for some…
    Nom → Adj Nom  {λx (Nom.sem(x) ^ Isa(x,Adj.sem))}
    Adj → cheap  {Cheap}
    λx Isa(x,Food) ^ Isa(x,Cheap)
But… fake gun? local restaurant? former friend? would-be singer?
    Ex Isa(x,Gun) ^ Isa(x,Fake)
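Intersective semantics as a Python sketch (properties encoded as one-argument functions over a variable name, an illustrative choice): the Adj rule simply adds one more conjunct to the Nom property.

    nom_food = lambda x: f"Isa({x},Food)"                 # Nom → food
    # Nom → Adj Nom  {λx (Nom.sem(x) ^ Isa(x,Adj.sem))}, Adj → cheap {Cheap}
    cheap_food = lambda x: f"{nom_food(x)} ^ Isa({x},Cheap)"
    print(cheap_food("x"))   # Isa(x,Food) ^ Isa(x,Cheap)
    # The same rule wrongly yields Isa(x,Gun) ^ Isa(x,Fake) for 'fake gun'.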

Doing Compositional Semantics
To incorporate semantics into a grammar we must:
– Determine the 'right' representation for each basic constituent
– Determine the 'right' representation for constituents that take these basic constituents as arguments
– Incorporate semantic attachments into each rule of our CFG

Parsing with Semantic Attachments
Modify the parser to include operations on semantic attachments as well as syntactic constituents:
– E.g., change an Earley-style parser so that when constituents are completed, their attached semantic function is applied and a meaning representation is created and stored with the state
– Or… let the parser run to completion and then walk through the resulting tree, applying semantic attachments bottom-up (see the sketch below)
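A sketch of the second (pipelined) option, under assumed encodings: parse trees as (label, children) tuples, lexical meanings in a small table, and a bottom-up walk that applies each rule's attachment to the children's meanings.

    LEXICON = {
        "McDonalds": "McDonalds",
        "burgers": "burgers",
        "serves": lambda x: lambda y: (
            f"E(e) (Isa(e,Serving) ^ Server(e,{y}) ^ Served(e,{x}))"),
    }

    def interpret(tree):
        """Return the .sem of a parse-tree node, computed bottom-up."""
        label, children = tree
        if isinstance(children, str):          # preterminal: look up the word
            return LEXICON[children]
        sems = [interpret(c) for c in children]
        if label == "S":                       # S → NP VP  {VP.sem(NP.sem)}
            np_sem, vp_sem = sems
            return vp_sem(np_sem)
        if label == "VP":                      # VP → V NP  {V.sem(NP.sem)}
            v_sem, np_sem = sems
            return v_sem(np_sem)
        return sems[0]                         # non-branching: pass .sem up

    tree = ("S", [("NP", [("ProperNoun", "McDonalds")]),
                  ("VP", [("V", "serves"),
                          ("NP", [("PluralNoun", "burgers")])])])
    print(interpret(tree))
    # E(e) (Isa(e,Serving) ^ Server(e,McDonalds) ^ Served(e,burgers))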

Option 1 (Integrated Semantic Analysis)
S → NP VP  {VP.sem(NP.sem)}
– VP.sem has been stored in the state representing VP
– NP.sem is stored with the state for NP
– When the rule is completed, retrieve the values of VP.sem and NP.sem, and apply VP.sem to NP.sem
– Store the result in S.sem
As fragments of the input are parsed, semantic fragments are created; these can be used to block ambiguous representations.

Drawback
You also perform semantic analysis on orphaned constituents that play no role in the final parse. This is an argument for the pipelined approach: do semantics after the syntactic parse.

Non-Compositional Language
Some meaning isn't compositional:
– Non-compositional modifiers: fake, former, local, so-called, putative, apparent, …
– Metaphor: You're the cream in my coffee. She's the cream in George's coffee. The break-in was just the tip of the iceberg. This was only the tip of Shirley's iceberg.
– Idiom: The old man finally kicked the bucket. The old man finally kicked the proverbial bucket.
– Deferred reference: The ham sandwich wants his check.
Solution: special rules?

Summing Up
Hypothesis: Principle of Compositionality
– The semantics of NL sentences and phrases can be composed from the semantics of their subparts.
Rules can be derived which map syntactic analyses to semantic representations (the Rule-to-Rule Hypothesis).
– Lambda notation provides a way to extend FOPC to this end.
– But coming up with rule-to-rule mappings is hard: idioms, metaphors, and other non-compositional aspects of language make things tricky (e.g. fake gun).

Next: Read Ch. 16, sections 1-2.