CS 4705 Lecture 17 Semantic Analysis: Syntax-Driven Semantics.

Review
Some representations of meaning:
– First-order logic
– Frames
– etc.
Some linguistically relevant categories we want to represent:
– Predicates, arguments, variables, quantifiers
– Categories, events, time, aspect
Today: how can we compute such meaning representations, in terms of these categories, from a sentence's syntactic structure?

Compositional Semantics
The meaning of the whole is made up of the meanings of its parts:
– George cooks. Dan eats. Dan is sick.
– Cook(George) Eat(Dan) Sick(Dan)
– If George cooks and Dan eats, Dan will get sick.
– (Cook(George) ∧ Eat(Dan)) → Sick(Dan)
– Sick(Dan) → Cook(George)
Part of the meaning derives from the people and activities the sentence is about (predicates and arguments, or nouns and verbs), and part from the way they are ordered and related grammatically: the syntax.

Syntax-Driven Semantics

        S          eat(Dan)
       /  \
     NP    VP
      |     |
     Nom    V
      |     |
      N    eats
      |
     Dan

So… can we link up syntactic structures to a corresponding semantic representation, producing the 'meaning' of a sentence in the course of parsing it?
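As an illustration, here is a minimal Python sketch of that linkage: each tree node carries a semantic value (node.sem), and the S node's meaning is computed from its daughters'. The Node class and the tuple encoding of eat(Dan) are my own assumptions, not part of the lecture:

    # Minimal sketch: pair each parse-tree node with a semantic value,
    # computed bottom-up. Non-branching nodes pass their child's meaning up.
    class Node:
        def __init__(self, label, children=(), sem=None):
            self.label, self.children, self.sem = label, list(children), sem

    dan  = Node("N", sem="Dan")                   # Dan -> a constant
    nom  = Node("Nom", [dan], sem=dan.sem)        # pass meaning up
    np   = Node("NP", [nom], sem=nom.sem)
    eats = Node("V", sem=lambda x: ("Eat", x))    # eats -> a one-place predicate
    vp   = Node("VP", [eats], sem=eats.sem)
    s    = Node("S", [np, vp], sem=vp.sem(np.sem))
    print(s.sem)                                  # ('Eat', 'Dan')  ~  eat(Dan)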

Specific vs. General-Purpose Rules
We don't want to have to specify, for every possible parse tree, which semantic representation it maps to.
We want to identify general mappings from parse trees to semantic representations:
– Again (as with feature structures) we will augment the lexicon and the grammar
– Rule-to-rule hypothesis: a mapping exists between rules of the grammar and rules of semantic representation

Semantic Attachments
Extend each grammar rule with instructions on how to map the components of the rule to a semantic representation (grammars are getting complex):
S → NP VP {VP.sem(NP.sem)}
Each semantic function is defined in terms of the semantic representation of choice.
Problem: how do we define these functions, and how do we specify their composition, so that we always get the meaning representation we want from our grammar?
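For concreteness, a sketch of what rule-to-rule attachments might look like in Python, keying each CFG rule to a function over its daughters' .sem values (the table and its contents are illustrative assumptions, not a real parser API):

    # Each grammar rule carries a semantic attachment: a function from the
    # daughters' meanings to the mother's meaning (rule-to-rule style).
    SEM_ATTACHMENTS = {
        ("S",  ("NP", "VP")):     lambda np, vp: vp(np),  # {VP.sem(NP.sem)}
        ("VP", ("V", "NP")):      lambda v, np: v(np),    # {V.sem(NP.sem)}
        ("NP", ("ProperNoun",)):  lambda pn: pn,          # pass the child up
    }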

A ‘Simple’ Example
AyCaramba serves meat.
Associating constants with constituents:
– ProperNoun → AyCaramba {AyCaramba}
– MassNoun → meat {Meat}
Defining functions to produce these from input:
– NP → ProperNoun {ProperNoun.sem}
– NP → MassNoun {MassNoun.sem}
– Assumption: the meaning representations of children are passed up to parents for non-branching constituents
Verbs are where the action is.
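In code, these lexical attachments are just a table from words to constants, with the non-branching NP rules passing the single child's meaning up unchanged (a sketch; the dictionary form is my assumption):

    # Lexical semantic attachments: words map directly to FOPC constants.
    LEXICON = {
        "AyCaramba": "AyCaramba",   # ProperNoun -> AyCaramba {AyCaramba}
        "meat":      "Meat",        # MassNoun   -> meat      {Meat}
    }
    # Non-branching rules such as NP -> ProperNoun {ProperNoun.sem}
    # simply return the child's meaning:
    np_sem = lambda child_sem: child_sem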

–V  serves {E(e,x,y) Isa(e,Serving) ^ Server(e,x) ^ Served(e,y)} –Will every verb have its own distinct representation? –Predicate(Agent,Patient)… How do we combine these pieces? –VP  V NP –Goal: E(e,x) Isa(e,Serving) ^ Server(e,x) ^ Served(e,Meat) –VP semantics must tell us Which vars to be replaced by which args? How this replacement is done?

Lambda Notation
An extension to FOPC: λx P(x)
– λ + variable(s) + a FOPC expression in those variables
Lambda binding:
– Apply a lambda-expression to logical terms to bind the lambda-expression's parameters to those terms (lambda reduction)
– A simple process: substitute the terms for the variables in the lambda expression
λx P(x)(car) ⇒ P(car)
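Python's lambda gives a direct analogy for lambda reduction (a sketch; tuples stand in for FOPC formulas):

    # Applying a lambda-expression to a term substitutes the term for
    # the variable in the expression's body.
    P = lambda x: ("P", x)      # stands for the FOPC expression P(x)
    lam = lambda x: P(x)        # the lambda-expression  λx P(x)
    print(lam("car"))           # ('P', 'car')  ~  λx P(x)(car) => P(car)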

Lambda notation provides the requisite verb semantics:
– The formal parameter list makes variables within the body of the logical expression available for binding to external arguments provided by, e.g., NPs
– Lambda reduction implements the replacement
Semantic attachments:
– VP → V NP {V.sem(NP.sem)}
– V → serves {∃e,x,y Isa(e,Serving) ∧ Server(e,y) ∧ Served(e,x)} becomes {λx ∃e,y Isa(e,Serving) ∧ Server(e,y) ∧ Served(e,x)}
– Now 'x' is available to be bound when V.sem is applied to NP.sem
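A sketch of the revised attachment in Python (the tuple encoding of ∃ and ∧ is my own; e and y are left as unbound symbols here, as on the slide):

    # V -> serves {λx ∃e,y Isa(e,Serving) ∧ Server(e,y) ∧ Served(e,x)}
    serves = lambda x: ("exists", ("e", "y"),
                        ("and", ("Isa", "e", "Serving"),
                                ("Server", "e", "y"),
                                ("Served", "e", x)))
    vp_sem = serves("Meat")     # V.sem(NP.sem): lambda reduction binds x to Meat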

– λ-application binds x to the value of NP.sem (Meat)
– λ-reduction replaces x within the λ-expression with Meat
– The value of VP.sem becomes: {∃e,y Isa(e,Serving) ∧ Server(e,y) ∧ Served(e,Meat)}
Similarly, we need a semantic attachment for S → NP VP {VP.sem(NP.sem)} to add the subject NP to our semantic representation of AyCaramba serves meat:
– We need another λ-expression in the value of VP.sem
– But currently V.sem doesn't give us one
– So, we change V.sem to include another λ-expression:
– V → serves {λx λy ∃e Isa(e,Serving) ∧ Server(e,y) ∧ Served(e,x)}

VP semantics (V.sem(NP.sem)) binds the outer λ-expression to the object NP (Meat) but leaves the inner λ-expression for subsequent binding to the subject NP when the semantics of S is determined:
{∃e Isa(e,Serving) ∧ Server(e,AyCaramba) ∧ Served(e,Meat)}
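Putting it together, a runnable sketch of the whole derivation (the attachments are the slide's; the tuple encoding of FOPC is my assumption):

    # V -> serves {λx λy ∃e Isa(e,Serving) ∧ Server(e,y) ∧ Served(e,x)}
    serves = lambda x: lambda y: ("exists", "e",
                                  ("and", ("Isa", "e", "Serving"),
                                          ("Server", "e", y),
                                          ("Served", "e", x)))
    vp_sem = serves("Meat")          # VP.sem = V.sem(NP.sem): binds x
    s_sem  = vp_sem("AyCaramba")     # S.sem  = VP.sem(NP.sem): binds y
    print(s_sem)
    # ('exists', 'e', ('and', ('Isa', 'e', 'Serving'),
    #                  ('Server', 'e', 'AyCaramba'),
    #                  ('Served', 'e', 'Meat')))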

Some Additional Problems to Solve
Complex terms: A restaurant serves meat.
– 'a restaurant': <∃x Isa(x,Restaurant)>
– ∃e Isa(e,Serving) ∧ Server(e, <∃x Isa(x,Restaurant)>) ∧ Served(e,Meat)
– Complex terms allow quantified expressions to appear where terms can, by providing rules to turn them into well-formed FOPC expressions
Quantifier scope:
– Every restaurant serves meat.
– Every restaurant serves every meat.
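One possible rewriting of the complex term, shown as a formula (giving the restaurant quantifier wide scope; choosing the quantifier order is precisely the scope question):

    % A restaurant serves meat, with the complex term expanded in place:
    \exists x\, Isa(x, Restaurant) \wedge
    \exists e\, \big( Isa(e, Serving) \wedge Server(e, x)
                      \wedge Served(e, Meat) \big)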

Appropriate representations for other constituents?
– Adjective phrases: intersective semantics
– Nom → Adj Nom {λx Nom.sem(x) ∧ Isa(x,Adj.sem)}
– Adj → cheap {Cheap}
– 'cheap restaurant': λx Isa(x,Restaurant) ∧ Isa(x,Cheap)
But… fake gun? ∃x Isa(x,Gun) ∧ AM(x,Fake)
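The intersective pattern, sketched in Python (names and the tuple encoding are illustrative):

    # Nom -> Adj Nom {λx Nom.sem(x) ∧ Isa(x, Adj.sem)}
    restaurant_sem = lambda x: ("Isa", x, "Restaurant")    # Nom.sem
    cheap_sem = "Cheap"                                    # Adj.sem
    cheap_restaurant = lambda x: ("and", restaurant_sem(x),
                                         ("Isa", x, cheap_sem))
    print(cheap_restaurant("x1"))
    # ('and', ('Isa', 'x1', 'Restaurant'), ('Isa', 'x1', 'Cheap'))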

Doing Compositional Semantics
To incorporate semantics into the grammar we must:
– Figure out the right representation for a single constituent, based on the parts of that constituent (e.g., Adj)
– Figure out the right representation for a category of constituents, based on the other grammar rules making use of that constituent (e.g., Nom → Adj Nom)
This gives us a set of function-like semantic attachments incorporated into our CFG:
– E.g., Nom → Adj Nom {λx Nom.sem(x) ∧ Isa(x,Adj.sem)}

What do we do with them?
As we did with feature structures:
– Alter an Earley-style parser so that when a constituent is completed (the dot is at the end of the rule), the attached semantic function is applied and a meaning representation is created and stored with the state
– Or, let the parser run to completion and then walk through the resulting tree, running the semantic attachments bottom-up
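A sketch of the second, pipelined option: walk the finished tree bottom-up, looking words up in the lexicon and applying each rule's attachment to the daughters' meanings. The tree shape, the simplified categories (PN for both nouns), and the simplified verb semantics are all my assumptions:

    class Node:
        def __init__(self, label, children=(), word=None):
            self.label, self.children, self.word = label, list(children), word

    ATTACH = {
        ("S",  ("NP", "VP")): lambda np, vp: vp(np),
        ("NP", ("PN",)):      lambda pn: pn,
        ("VP", ("V", "NP")):  lambda v, np: v(np),
    }
    LEX = {
        "AyCaramba": "AyCaramba",
        "meat":      "Meat",
        "serves":    lambda x: lambda y: ("Serve", y, x),  # simplified V.sem
    }

    def interpret(node):
        if node.word is not None:                  # preterminal: look up word
            return LEX[node.word]
        kids = [interpret(c) for c in node.children]
        rule = (node.label, tuple(c.label for c in node.children))
        return ATTACH[rule](*kids)                 # apply the attachment

    tree = Node("S", [Node("NP", [Node("PN", word="AyCaramba")]),
                      Node("VP", [Node("V", word="serves"),
                                  Node("NP", [Node("PN", word="meat")])])])
    print(interpret(tree))                         # ('Serve', 'AyCaramba', 'Meat')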

Option 1 (Integrated Semantic Analysis)
S → NP VP {VP.sem(NP.sem)}
– VP.sem has been stored in the state representing VP
– NP.sem is stored with the state for NP
– When the rule is completed, retrieve the value of VP.sem and of NP.sem, and apply VP.sem to NP.sem
– Store the result in S.sem
As fragments of the input are parsed, semantic fragments are created.
These can be used to block ambiguous representations.
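Sketched as the completion step of an Earley-style parser (the state fields here are hypothetical, not a full parser):

    # When the dot reaches the end of a rule, apply the rule's semantic
    # attachment to the completed daughters' stored meanings.
    def complete(state, attachments):
        """state.rule = (lhs, rhs); state.child_sems filled during parsing."""
        lhs, rhs = state.rule
        state.sem = attachments[(lhs, tuple(rhs))](*state.child_sems)
        return state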

Drawback
You also perform semantic analysis on orphaned constituents that play no role in the final parse.
Hence the case for a pipelined approach: do semantics after the syntactic parse.

Non-Compositional Language
What do we do with language whose meaning isn't derived from the meanings of its parts?
– Metaphor: You're the cream in my coffee. She's the cream in George's coffee. The break-in was just the tip of the iceberg. This was only the tip of Shirley's iceberg.
– Idioms: The old man finally kicked the bucket. The old man finally kicked the proverbial bucket.
Solutions?
– Mix lexical items with special grammar rules?

Summing Up
Hypothesis: the Principle of Compositionality
– The semantics of NL sentences and phrases can be composed from the semantics of their subparts
Rules can be derived that map syntactic analyses to semantic representations (the Rule-to-Rule Hypothesis):
– Lambda notation provides a way to extend FOPC to this end
– But coming up with rule-to-rule mappings is hard
– Idioms and metaphors perplex the process