Natural Logic? Lauri Karttunen, Cleo Condoravdi, Annie Zaenen. Palo Alto Research Center.



Overview
Part 1: Why Natural Logic; MacCartney's NatLog
Part 2: PARC Bridge; Discussion

Anna Szabolcsi on the semantic enterprise (2005): On this view, model theory has primacy over proof theory. A language may be defined or described perfectly well without providing a calculus (thus, a logic) for it, but a calculus is of distinctly limited interest without a class of models with respect to which it is known to be sound and (to some degree) complete. It seems fair to say that (a large portion of) mainstream formal semantics as practiced by linguists is exclusively model theoretic. As I understand it, the main goal is to elucidate the meanings of expressions in a compositional fashion, and to do that in a way that offers an insight into natural language metaphysics (Bach 1989) and uncovers universals of the syntax/semantics interface.

Anna Szabolcsi: The idea that our way of doing semantics (model theory) is both insightful and computationally (psychologically) unrealistic has failed to intrigue formal semanticists into action. Why? There are various, to my mind respectable, possibilities. (i) Given that the field is young and still in the process of identifying the main facts it should account for, we are going for the insight as opposed to the potential of computational / psychological reality. (ii) We don't care about psychological reality and only study language in the abstract. (iii) We do care about potential psychological reality but are content to separate the elucidation of meaning (model theory) from the account of inferencing (proof theory). But if the machineries of model theory and proof theory are sufficiently different, option (iii) may end up with a picture where speakers cannot know what sentences mean, so to speak, only how to draw inferences from them. Is that the correct picture?

Why model theory is not in fashion in Computational Linguistics: computers don't have realistic models; up to now, everything is syntax.

Moss (2005) If one is seriously interested in entailment, why not study it axiomatically instead of building models? In particular, if one has a complete proof system, why not declare it to be the semantics? Indeed, why should semantics be founded on model theory rather than proof theory?

Why full-fledged proof theory is not in fashion in Computational Linguistics
Too big an enterprise to be undertaken in one go: FOL is undecidable. Ambitious attempt: FraCaS (DRS). Need to work our way up through decidable logics: Moss's hierarchy. Unfortunately limited to the logical connectives.

Natural Logic? Long tradition: Aristotle, the scholastics, Quine (?), Wittgenstein (?), Davidson, Parsons, ...

Lakoff: (i) We want to understand the relationship between grammar and reasoning. (ii) We require that significant generalizations, especially linguistic ones, be stated. (iii) On the basis of (i) and (ii), we have been led tentatively to the generative semantics hypothesis. We assume that hypothesis to see where it leads. Given these aims, empirical linguistic considerations play a role in determining what the logical forms of sentences can be. Let us now consider certain other aims. (iv) We want a logic in which all the concepts expressible in natural language can be expressed unambiguously, that is, in which all non-synonymous sentences (at least, all sentences with different truth conditions) have different logical forms. (v) We want a logic which is capable of accounting for all correct inferences made in natural language and which rules out incorrect ones. We will call any logic meeting the goals of (i)-(v) a 'natural logic'.

Basic idea: some inferences can be made on the basis of linguistic form alone.
John and Mary danced. ⇒ John danced. Mary danced.
The boys sang beautifully. ⇒ The boys sang.
But: often studied by philosophers interested in a limited number of phenomena, often ignoring the effect of lexical items.

Problem 1: the impact of lexical items is pervasive.
John and Mary carried the piano. ?? John carried the piano.
The boys sang allegedly. ?? The boys sang.
There are no structural inferences without lexical items playing a role. When lexical items are taken into account, the domain of natural logic goes beyond what has been studied under that name up to now.

Problem 2: need for disambiguation; we cannot work on literal strings.
The members of the royal family are visiting dignitaries. Visiting dignitaries can be boring.
a. Therefore, the members of the royal family can be boring.
b. Therefore, what the members of the royal family are doing can be boring.

Advantages of natural logic Lexico-syntactic Incremental

What is doable
Syntactic approaches geared to specific inferences. Examples: MacCartney's approach to Natural Logic; PARC's Bridge.
Textual entailment (minimal world knowledge), geared to existential claims (what happened, where, when).

Existential claims: What happened? Who did what to whom?
Microsoft managed to buy Powerset. ⇒ Microsoft acquired Powerset.
Shackleton failed to get to the South Pole. ⇒ Shackleton did not reach the South Pole.
The destruction of the file was not illegal. ⇒ The file was destroyed.
The destruction of the file was averted. ⇒ The file was not destroyed.

Monotonicity: What happened? Who did what to whom?
Every boy managed to buy a small toy. ⇒ Every small boy acquired a toy.
Every explorer failed to get to the South Pole. ⇒ No experienced explorer reached the South Pole.
No file was destroyed. ⇒ No sensitive file was destroyed.
The destruction of a sensitive file was averted. ⇒ A file was not destroyed.
The creation of a new benefit was averted. ⇒ A benefit was not created.

Recognizing Textual Inferences

MacCartney's Natural Logic (NatLog)
Point of departure: Sánchez Valencia's elaborations of van Benthem's Natural Logic.
Seven relevant relations:
x ≡ y, equivalence: couch ≡ sofa (x = y)
x ⊏ y, forward entailment: crow ⊏ bird (x ⊂ y)
x ⊐ y, reverse entailment: Asian ⊐ Thai (x ⊃ y)
x ^ y, negation: able ^ unable (x ∩ y = 0, x ∪ y = U)
x | y, alternation: cat | dog (x ∩ y = 0, x ∪ y ≠ U)
x ⌣ y, cover: animal ⌣ non-ape (x ∩ y ≠ 0, x ∪ y = U)
x # y, independence: hungry # hippo
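The seven relations can be read off as set-theoretic conditions on denotations. As a minimal illustration (our sketch, not MacCartney's code), they can be decided over a small finite domain:

```python
# A sketch (ours, not MacCartney's): decide which of the seven NatLog
# relations holds between two denotations x, y over a finite domain U.
def basic_relation(x, y, U):
    """Return the NatLog relation symbol holding between x and y (subsets of U)."""
    x, y, U = set(x), set(y), set(U)
    if x == y:
        return "≡"            # equivalence: couch ≡ sofa
    if x < y:
        return "⊏"            # forward entailment: crow ⊏ bird
    if x > y:
        return "⊐"            # reverse entailment: Asian ⊐ Thai
    empty_meet = not (x & y)  # x ∩ y = 0
    full_join = (x | y) == U  # x ∪ y = U
    if empty_meet and full_join:
        return "^"            # negation: able ^ unable
    if empty_meet:
        return "|"            # alternation: cat | dog
    if full_join:
        return "⌣"            # cover: animal ⌣ non-ape
    return "#"                # independence: hungry # hippo

U = set(range(8))
print(basic_relation({0, 1}, {0, 1, 2}, U))   # crow/bird-style case → ⊏
print(basic_relation({0, 1}, U - {0, 1}, U))  # complement case → ^
```

Note that exactly one of the seven conditions holds for any pair of sets, which is what makes the relations usable as an algebra.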

Table of joins for the 7 basic entailment relations: a 7×7 grid giving, for each pair of relations, the possible relations of their composition. Cases with more than one possibility indicate loss of information. The join of # and # is totally uninformative: any of the seven relations is possible.

Entailment relations between expressions differing in atomic edits (substitution, insertion, deletion)
Substitutions, open classes (need to be of the same type):
Synonyms: ≡ relation
Hypernyms: ⊏ relation (crow ⊏ bird)
Antonyms: | relation (hot | cold). Note: not ^ in most cases; no excluded middle.
Other nouns: | (cat | dog)
Other adjectives: # (weak # temporary)
Verbs: ??
Geographic meronyms: ⊏ (in Kyoto ⊏ in Japan), but note: not without the preposition: Kyoto is beautiful # Japan is beautiful

Substitutions, closed classes; example: quantifiers.
all ≡ every
every ⊏ some (non-vacuity assumed)
some ^ no
no | every (non-vacuity assumed)
four or more ⊏ two or more
exactly four | exactly two
at most four ⌣ at least two (overlap at 2, 3, 4)
most # ten or more

Deletions and insertions: the default is ⊏ for deletion (upward-monotone contexts are prevalent), e.g. red car ⊏ car. But this doesn't hold for negation, non-intersective adjectives, implicatives.

Composition: bottom up.
nobody can enter without a bottle of wine ⊏ nobody can enter without a bottle of liquor
(nobody (can (enter (without wine))))
lexical entailment: wine ⊏ liquor
without is downward monotone: without wine ⊐ without liquor
can is upward monotone, nobody is downward monotone:
nobody can enter without wine ⊏ nobody can enter without liquor
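The bottom-up pass can be sketched as a projection function; this is our simplification to the ⊏/⊐ fragment, which suffices for the wine/liquor example (the full system projects all seven relations through each operator's projectivity table):

```python
# Sketch (ours): a lexical entailment relation is flipped by each
# downward-monotone operator on the path from the edited word to the root.
FLIP = {"⊏": "⊐", "⊐": "⊏", "≡": "≡", "#": "#"}

def project(relation, path):
    """Project a lexical relation through operators marked 'up' or 'down',
    from the innermost operator outward."""
    for monotonicity in path:
        if monotonicity == "down":
            relation = FLIP[relation]
    return relation

# wine ⊏ liquor, embedded under without (down), can (up), nobody (down):
print(project("⊏", ["down", "up", "down"]))  # → ⊏
```

The two downward flips cancel, which is why the nobody-sentence with wine entails the one with liquor.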

Composition: projectivity of logical connectives
(input relation: ≡ ⊏ ⊐ ^ | ⌣ #)
Negation (not): ≡ ⊐ ⊏ ^ ⌣ | #
Conjunction (and)/intersection: ≡ ⊏ ⊐ | | # #
Disjunction (or): ≡ ⊏ ⊐ ⌣ # ⌣ #
Conditional antecedent: ≡ ⊐ ⊏ # # # #
Conditional consequent: ≡ ⊏ ⊐ | | # #
Biconditional: ≡ # # # # # #

Composition: projectivity of logical connectives
Conjunction (and)/intersection: ≡ ⊏ ⊐ | | # #
happy ≡ glad; kiss ⊏ touch: kiss and hug ⊏ touch and hug
human ^ nonhuman: living human | living nonhuman
French | German: French wine | Spanish wine
metallic ⌣ nonferrous: metallic pipe # nonferrous pipe
swimming # hungry: #

Composition: projectivity of logical connectives
Disjunction (or): ≡ ⊏ ⊐ ⌣ # ⌣ #
happy ≡ glad: happy or rich ≡ glad or rich
kiss ⊏ touch: kiss or hug ⊏ touch or hug
human ^ nonhuman: human or equine ⌣ nonhuman or equine
French | German: French or Spanish # German or Spanish
more than 4 ⌣ less than 6: 3 or more than 4 ⌣ 3 or less than 6

Composition: projectivity of logical connectives
Negation (not): ≡ ⊐ ⊏ ^ ⌣ | #
happy ≡ glad: not happy ≡ not glad
kiss ⊏ touch: not kiss ⊐ not touch
human ^ nonhuman: not human ^ not nonhuman
French | German: not French ⌣ not German
more than 4 ⌣ less than 6: not more than 4 | not less than 6
swimming # hungry: not swimming # not hungry
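The negation row can be sketched as a simple lookup; the encoding (relations as strings) is ours, the mapping follows the examples on the slide (not kiss ⊐ not touch, not French ⌣ not German, and so on):

```python
# Sketch (ours): how negation projects each of the seven NatLog relations.
NEGATE = {"≡": "≡", "⊏": "⊐", "⊐": "⊏", "^": "^", "|": "⌣", "⌣": "|", "#": "#"}

print(NEGATE["⊏"])  # kiss ⊏ touch  →  not kiss ⊐ not touch, prints ⊐
```

A design point worth noticing: negation is an involution on the relations, so applying it twice returns the original relation, matching the logic of double negation.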

Compositionality: projectivity of quantifiers
(input relation: ≡ ⊏ ⊐ ^ | ⌣ #; first argument / second argument)
some: ≡ ⊏ ⊐ ⌣ # ⌣ # / ≡ ⊏ ⊐ ⌣ # ⌣ #
no: ≡ ⊐ ⊏ | # | # / ≡ ⊐ ⊏ | # | #
every: ≡ ⊐ ⊏ | # | # / ≡ ⊏ ⊐ | | # #
not every: ≡ ⊏ ⊐ ⌣ # ⌣ # / ≡ ⊐ ⊏ ⌣ ⌣ # #
at least two: ≡ ⊏ ⊐ # # # # / ≡ ⊏ ⊐ # # # #
most: ≡ # # # # # # / ≡ ⊏ ⊐ | | # #
exactly one: ≡ # # # # # # / ≡ # # # # # #
all but one: ≡ # # # # # # / ≡ # # # # # #

Compositionality: projectivity of quantifiers: some, first argument (≡ ⊏ ⊐ ⌣ # ⌣ #)
couch ≡ sofa: some couches sag ≡ some sofas sag
finch ⊏ bird: some finches sing ⊏ some birds sing
boy ⊐ small boy: some boys sing ⊐ some small boys sing
human ^ non-human: some humans sing ⌣ some non-humans sing
boy | girl: some boys sing # some girls sing
animal ⌣ non-ape: some animals breathe ⌣ some non-apes breathe

Compositionality: projectivity of quantifiers: some, second argument (≡ ⊏ ⊐ ⌣ # ⌣ #)
beautiful ≡ pretty: some couches are pretty ≡ some couches are beautiful
sing beautifully ⊏ sing: some finches sing beautifully ⊏ some finches sing
sing ⊐ sing beautifully: some finches sing ⊐ some finches sing beautifully
human ^ non-human: some humans sing ⌣ some non-humans sing
late | early: some people were early # some people were late

Compositionality: projectivity of quantifiers: no, first argument (≡ ⊐ ⊏ | # | #)
couch ≡ sofa: no couches sag ≡ no sofas sag
finch ⊏ bird: no finches sing ⊐ no birds sing
boy ⊐ small boy: no boys sing ⊏ no small boys sing
human ^ non-human: no humans sing | no non-humans sing
boy | girl: no boys sing # no girls sing
animal ⌣ non-ape: no animals breathe | no non-apes breathe

Compositionality: projectivity of quantifiers: every, first argument (≡ ⊐ ⊏ | # | #)
couch ≡ sofa: every couch sags ≡ every sofa sags
finch ⊏ bird: every finch sings ⊐ every bird sings
boy ⊐ small boy: every boy sings ⊏ every small boy sings
human ^ non-human: every human sings | every non-human sings
boy | girl: every boy sings # every girl sings
animal ⌣ non-ape: every animal breathes | every non-ape breathes

Compositionality: projectivity of quantifiers: not every, first argument (≡ ⊏ ⊐ ⌣ # ⌣ #)
couch ≡ sofa: not every couch sags ≡ not every sofa sags
finch ⊏ bird: not every finch sings ⊏ not every bird sings
boy ⊐ small boy: not every boy sings ⊐ not every small boy sings
human ^ non-human: not every human sings ⌣ not every non-human sings
boy | girl: not every boy sings # not every girl sings
animal ⌣ non-ape: not every animal breathes ⌣ not every non-ape breathes
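The first-argument behavior of these quantifiers can be collected into one lookup table. The rows below are our reading of the example slides (couch/sofa, finch/bird, etc.), not a reproduction of the original figure:

```python
# Sketch (ours): projectivity of a quantifier's first (restrictor) argument,
# rows reconstructed from the worked examples for some/no/every/not every.
PROJECT_ARG1 = {
    "some":      {"≡": "≡", "⊏": "⊏", "⊐": "⊐", "^": "⌣", "|": "#", "⌣": "⌣", "#": "#"},
    "no":        {"≡": "≡", "⊏": "⊐", "⊐": "⊏", "^": "|", "|": "#", "⌣": "|", "#": "#"},
    "every":     {"≡": "≡", "⊏": "⊐", "⊐": "⊏", "^": "|", "|": "#", "⌣": "|", "#": "#"},
    "not every": {"≡": "≡", "⊏": "⊏", "⊐": "⊐", "^": "⌣", "|": "#", "⌣": "⌣", "#": "#"},
}

# finch ⊏ bird, so: "no finches sing" ⊐ "no birds sing"
print(PROJECT_ARG1["no"]["⊏"])  # → ⊐
```

Note how "not every" is exactly the negation of "every", row by row, which is the pattern the later slides on second arguments also follow.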

Projectivity of verbs
Most verbs are upward monotone and project ^, |, and ⌣ as #:
humans ^ non-humans, but eats humans # eats non-humans
But there are a lot of exceptions. Verbs with sentential complements require special treatment: factives, counterfactives, implicatives... (PARC verb classes)

Factives
(The class label gives the complement polarity under positive and negative host polarity.)
Class ++/-+ (forget that, is odd that):
forget that X ⇒ X; not forget that X ⇒ X
is odd that X ⇒ X; is not odd that X ⇒ X
Class +-/-- (pretend that, pretend to):
pretend that X ⇒ not X; not pretend that X ⇒ not X
pretend to X ⇒ not X; not pretend to X ⇒ not X
Abraham pretended that Sarah was his sister. ⇒ Sarah was not his sister.
Howard did not pretend that it did not happen. ⇒ It happened.

Implicatives
Two-way implicatives:
++/-- manage to: manage to X ⇒ X; not manage to X ⇒ not X
+-/-+ fail to: fail to X ⇒ not X; not fail to X ⇒ X
One-way implicatives:
++ force to: force X to Y ⇒ Y
+- refuse to: refuse to X ⇒ not X
-- be able to: not be able to X ⇒ not X
-+ hesitate to: not hesitate to X ⇒ X

Translating PARC classes into the MacCartney approach (sign, deletion relation, insertion relation)
implicatives ++/--: del ≡, ins ≡ (He managed to escape ≡ he escaped)
++: del ⊏, ins ⊐ (He was forced to sell ⊏ he sold)
--: del ⊐, ins ⊏ (He was permitted to live ⊐ he did live)
+-/-+: del ^, ins ^ (He failed to pay ^ he paid)
+-: del |, ins | (He refused to fight | he fought)
-+: del ⌣, ins ⌣ (He hesitated to ask ⌣ he asked)
factives: ++/-+, +-/--
neutral: del #, ins # (He believed he had won # he had won)
Does not take the presuppositions of the implicatives into account.

T: Ed didn't forget to force Dave to leave. H: Dave left.
0. Ed didn't fail to force Dave to leave
1. Ed didn't force Dave to leave. Edit: DEL(fail), lexical relation ^; context downward monotone, projects as ^. Join so far: ^
2. Ed forced Dave to leave. Edit: DEL(not), lexical relation ^; context upward monotone, projects as ^. Join of ^, ^: ≡
3. Dave left. Edit: DEL(force), lexical relation ⊏; context upward monotone, projects as ⊏. Join of ≡, ⊏: ⊏

t: We were not able to smoke. h: We smoked Cuban cigars.
0. We were not able to smoke
1. We did not smoke. Edit: DEL(able to), lexical relation ⊐; context downward monotone, projects as ⊏. Join so far: ⊏
2. We smoked. Edit: DEL(not), lexical relation ^; context upward monotone, projects as ^. Join of ⊏, ^: |
3. We smoked Cuban cigars. Edit: INS(Cuban cigars), lexical relation ⊐; context upward monotone, projects as ⊐. Join of |, ⊐: |
We end up with a contradiction.
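The two derivations above chain projected atomic-edit relations with join. A minimal sketch (ours): only the join cells these two examples use are filled in, with ≡ as the identity and # (uninformative) as the fallback for cells we have not specified:

```python
# Sketch (ours): fold a sequence of projected atomic-edit relations into one
# relation by repeated joins. Partial join table; unknown cells degrade to "#".
JOIN = {
    ("^", "^"): "≡",   # two negations cancel
    ("⊏", "^"): "|",   # as in the not-able-to-smoke example
    ("|", "⊐"): "|",   # inserting a restricting modifier keeps alternation
}

def chain(relations):
    current = "≡"                      # the empty edit sequence: t relates to itself
    for r in relations:
        if current == "≡":
            current = r                # ≡ is the identity for join
        else:
            current = JOIN.get((current, r), "#")
    return current

# t: We were not able to smoke → h: We smoked Cuban cigars
print(chain(["⊏", "^", "⊐"]))  # → |, i.e. t contradicts h
# T: Ed didn't fail to force Dave to leave → H: Dave left
print(chain(["^", "^", "⊏"]))  # → ⊏, i.e. T entails H
```

Falling back to # on unknown cells is the conservative choice: a join can only lose information, never invent an entailment.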

Why do the factives not work?
MacCartney's system assumes that the inferences switch when a negation is inserted or deleted. But that is not the case with factives and counterfactives; they need special treatment.

Other limitations
De Morgan's laws: Not all birds fly ≡ some birds do not fly
buy/sell, win/lose
These don't work with atomic edits as defined; they need larger units.

Bridge vs NatLog
NatLog:
Symmetrical between t and h
Bottom up
Local edits, more compositional
Surface based
Integrates monotonicity and implicatives tightly
Bridge:
Asymmetrical between t and h
Top down
Global rewrites possible
Normalized input
Monotonicity calculus and implicatives less tightly coupled
We need more power than NatLog allows for, but it needs to be deployed in a more constrained way than the current Bridge system demonstrates.

PARC's BRIDGE System

Anna Szabolcsi 2005: Consider the model theoretic and the natural deduction treatments of the propositional connectives. The two ways of explicating conjunction and disjunction amount to the same thing indeed: if you know the one you can immediately guess the other. Not so with classical negation. The model theoretic definition is in one step: ¬p is true if and only if p is not true. In contrast, natural deduction obtains the same result in three steps. First, elimination and introduction rules for ¬ yield a notion of negation as in minimal logic. Then the rule Ex Falso Sequitur Quodlibet is added to obtain intuitionistic negation, and finally Double Negation Cancellation to obtain classical negation. While it may be a matter of debate which explication is more insightful, it seems clear that the two are intuitively not the same, even though eventually they deliver the same result.

Van Benthem
Dictum de Omni et Nullo: admissible inferences of two kinds: downward monotonic (substituting stronger predicates for weaker ones), upward monotonic (substituting weaker predicates for stronger ones).
Conservativity: Q A B iff Q A (B ∩ A)
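Conservativity can be checked directly for a generalized quantifier given as a relation between sets. A toy sketch (the `every` function and set encoding are ours):

```python
# Sketch (ours): a generalized quantifier Q as a function of two sets,
# and a direct check of conservativity: Q A B iff Q A (B ∩ A).
def every(A, B):
    """every A is B: the restrictor A is contained in the scope B."""
    return A <= B

def conservative(Q, A, B):
    """Does Q give the same verdict on B and on B restricted to A?"""
    return Q(A, B) == Q(A, B & A)

A, B = {1, 2}, {2, 3}
print(conservative(every, A, B))  # → True
```

Intuitively: to evaluate "every linguist smokes" you only need to look at smokers who are linguists, which is what restricting B to B ∩ A expresses.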

Toward NL Understanding: Local Textual Inference
A measure of understanding a text is the ability to make inferences based on the information conveyed by it.
Veridicality reasoning: Did an event mentioned in the text actually occur?
Temporal reasoning: When did an event happen? How are events ordered in time?
Spatial reasoning: Where are entities located and along which paths do they move?
Causality reasoning: Enablement, causation, prevention relations between events

Knowledge about words for access to content
The verb acquire is a hypernym of the verb buy
The verbs get to and reach are synonyms
Inferential properties of manage, fail, avert, not
Monotonicity properties of every, a, no, not: every (↓)(↑), a (↑)(↑), no (↓)(↓), not (↓)
Restrictive behavior of adjectival modifiers: small, experienced, sensitive
The type of temporal modifier associated with prepositional phrases headed by in, for, through, or even nothing (e.g. last week, every day)
Construction of intervals and qualitative relationships between intervals and events based on the meaning of temporal expressions

Local Textual Inference Initiatives
PASCAL RTE Challenge (Ido Dagan, Oren Glickman) 2005, 2006: PREMISE, CONCLUSION, TRUE/FALSE
Rome is in Lazio province and Naples is in Campania. Rome is located in Lazio province. TRUE (= entailed by the premise)
Romano Prodi will meet the US President George Bush in his capacity as the president of the European commission. George Bush is the president of the European commission. FALSE (= not entailed by the premise)

World knowledge intrusion Romano Prodi will meet the US President George Bush in his capacity as the president of the European commission. George Bush is the president of the European commission. FALSE Romano Prodi will meet the US President George Bush in his capacity as the president of the European commission. Romano Prodi is the president of the European commission. TRUE G. Karas will meet F. Rakas in his capacity as the president of the European commission. F. Rakas is the president of the European commission. TRUE

Inference under a particular construal Romano Prodi will meet the US President George Bush in his capacity as the president of the European commission. George Bush is the president of the European commission. FALSE (= not entailed by the premise on the correct anaphoric resolution) G. Karas will meet F. Rakas in his capacity as the president of the European commission. F. Rakas is the president of the European commission. TRUE (= entailed by the premise on one anaphoric resolution)

Compositionality: projectivity of quantifiers: not every, second argument (≡ ⊐ ⊏ ⌣ ⌣ # #)

Compositionality: projectivity of quantifiers: at least two (first and second argument: ≡ ⊏ ⊐ # # # #)

Compositionality: projectivity of quantifiers: most (first argument: ≡ # # # # # #; second argument: ≡ ⊏ ⊐ | | # #)

Compositionality: projectivity of quantifiers: exactly one (both arguments: ≡ # # # # # #)

Compositionality: projectivity of quantifiers: all but one (both arguments: ≡ # # # # # #)


Compositionality: projectivity of quantifiers: no, second argument (≡ ⊐ ⊏ | # | #)

Compositionality: projectivity of quantifiers: every, second argument (≡ ⊏ ⊐ | | # #)

Composition: projectivity of logical connectives: Biconditional (≡ # # # # # #)

Composition: projectivity of logical connectives: Conditional antecedent (≡ ⊐ ⊏ # # # #)
kiss ⊏ touch: If she kissed her, she likes her ⊐ if she touched her, she likes her
human ^ nonhuman

Composition: projectivity of logical connectives: Conditional consequent (≡ ⊏ ⊐ | | # #)
kiss ⊏ touch: If he wins I'll kiss him ⊏ if he wins I'll touch him
human ^ nonhuman: If it does this it shows that it is human | if it does this it shows that it is nonhuman
French | German: If he wins he gets French wine | if he wins he gets German wine