CS250: Intro to AI/Lisp
Lecture 10-1, November 30th, 1999
What do you mean, “What do I mean?” (continued)

Steps in Building a Knowledge Base
– Decide what to talk about
– Decide on a vocabulary
– Encode general rules
– Encode an instance
– Pose queries

General Ontologies
– Categories
– Measures
– Composite Objects
– Time, Space and Change
– Events and Processes
– Physical Objects
– Substances
– Mental Objects and Beliefs

Categories
– Reification: How many people live on Earth?
– Inheritance
– Creating taxonomies: Kentucky Fried Chicken, Dewey decimal, LoC, MeSH

Measures
– Examples: height, mass, cost
– Measure = Units function + a Number

Composite Objects
– Not inheritance: the difference between subclass and member
– General event descriptions: Schema, Script

Using Events to Represent Change
What’s the problem?
– Continuous time
– Multiple agents
– Actions of different durations
Event calculus: reify events

Event Calculus Vocabulary
– Events are splotches in the space-time continuum
– Events have subevents
– Some events are intervals

Examples
Suppose we wish to represent facts about market manias:
∃f f ∈ BulbEating ∧ SubEvent(f, TulipMania) ∧ PartOf(Location(f), Holland)
∃s s ∈ StockFrenzy ∧ SubEvent(s, USBullMarket) ∧ PartOf(Location(s), ??)
∃s s ∈ StockFrenzy ∧ SubEvent(s, USBullMarket) ∧ TradedOn(Exchange(s), NASDAQ)

Place
– How are places like intervals?
– The relation In holds among places
– Location function: maps an object to the smallest place that contains it

Processes
Why do we need processes when we have events?
How can we say:
– Barry Sonnenfeld was flying some time yesterday: E(Flying(Barry), Yesterday)
– Barry was flying all day yesterday: T(Flying(Barry), Yesterday)
Kurt D. Fenstermacher: Sonnenfeld directed Men in Black (1997), Get Shorty (1995) and The Addams Family (1991)

A Logical Blender
Suppose Bill is accused of killing a zucchini. When the cold but efficient Detective Frigerator (known to his pals as simply “Re”) questions the orange juice pitcher in FOPL, the orange juice has no idea how to say: “Bill was in the kitchen with the tomato all day yesterday”

Composite Events
Use And to combine two events, with the usual semantics:
∀p,q,e T(And(p, q), e) ⇔ T(p, e) ∧ T(q, e)
And isn’t so bad, but disjunction is a bit more complicated. How do we say: “I saw the whole thing; the beef or the broccoli stabbed the zucchini all afternoon.”

Time & Intervals
Time is pretty important
– Divvy up time into Moments and ExtendedIntervals
– Define a couple of handy functions: Start, End, Time, Date

When Intervals Get Together
– Meet
– Before
– After
– During
– Overlap
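The relations above can be sketched as predicates over numeric (start, end) pairs. A minimal sketch, not the course's own code; the representation and function names are assumptions, and the relations correspond to a subset of Allen's interval relations:

```python
from collections import namedtuple

# Assumed representation: an interval is a (start, end) pair with start < end.
Interval = namedtuple("Interval", ["start", "end"])

def meet(i, j):      # i ends exactly where j begins
    return i.end == j.start

def before(i, j):    # i ends strictly before j begins
    return i.end < j.start

def after(i, j):     # mirror image of before
    return before(j, i)

def during(i, j):    # i lies strictly inside j
    return j.start < i.start and i.end < j.end

def overlap(i, j):   # i starts first and they share a proper middle stretch
    return i.start < j.start < i.end < j.end
```

For example, `meet(Interval(1, 2), Interval(2, 3))` holds while `before(Interval(1, 2), Interval(2, 3))` does not, since Meet requires the endpoints to coincide exactly.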

Objects in the Space-Time Continuum
– Remember that events are splotches of space-time
– Some events have coherence through time
– Need to capture the idea of an object existing through time

Roman Empire
The Roman Empire spread across much of Europe, North Africa and the Near East, expanding and contracting, from 753 B.C. until the 5th century A.D.

Roman Empire at 218 B.C.

Roman Empire at 117 A.D.

Roman Empire at 395 A.D.

Fluents
The Roman Empire is an event
– Subevents include the First, Second and Third Punic Wars
– One of the first known hammer-and-anvil movements in battle (Cannae, 216 B.C.)
A fluent allows us to capture the notion of the Roman Empire throughout time:
T(Male(Emperor(RomanEmpire)), 1stCenturyAD)
T(In(Gaul, RomanEmpire), AD12)

Fluent Flavors
A fluent is a function f: Situations → Fvalues
– The domain is the set of all situations (states of the world)
– If Fvalues is {TRUE, FALSE}, it’s a propositional fluent
– If Fvalues is the set of all situations, it’s a situational fluent

Substances
– Less vs. fewer
– Intrinsic vs. extrinsic properties
– Substances are those things that are fungible

Going, Like, Totally Mental
What do other agents know, and what are they thinking?
– “Everybody’s looking at me”
– “They’re trying to kill me”
– “You look like someone who knows where I can find extra virgin olive oil”
Start with a Believes predicate: Believes(Agent, x)

Reification & You
A good first pass: treat Flies(Superman) as a propositional fluent
– Relationships like Believes, Knows and When between agents and propositions are propositional attitudes
The problem with Believes(Agent, Flies(Superman)): can Clark fly?

“It is clear.”
Referential transparency
– Any term can be substituted for an equal term
– FOL is referentially transparent

Knowing for Action
– Knowledge preconditions: what do you need to know to do action a?
– Knowledge effects: what effect does performing action a have on an agent’s knowledge?

Replacing that Zucchini
Grocery shopping:
– Percepts
– Actions
– Goals
– Environment

You say you wanna resolution?

Chain of Fools
Forward chaining
– Start with sentences, apply Generalized Modus Ponens (GMP) to derive new conclusions
– Good when adding new facts
Backward chaining
– Start from a goal sentence and work back to its premises
– Got goal?

Forward Chaining
Renaming: two sentences are renamings of one another if they are identical except for their variable names

When a new fact p arrives:
  for each rule with a premise that unifies with p:
    if the rule’s other premises are known:
      add the rule’s conclusion to the KB, and keep on chainin’

Composition
Define COMPOSE(T1, T2) to apply two substitutions in a row:
SUBST(COMPOSE(T1, T2), p) = SUBST(T2, SUBST(T1, p))
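The COMPOSE identity can be checked directly in code. A minimal sketch, not the course's Lisp code; by assumption, a substitution is a dict from variable names to terms, and a term is a string or a nested tuple:

```python
def subst(theta, term):
    # Apply substitution theta to a term; symbols not bound in theta pass through.
    if isinstance(term, tuple):
        return tuple(subst(theta, t) for t in term)
    return theta.get(term, term)

def compose(t1, t2):
    # COMPOSE(t1, t2): applying the result once equals applying t1, then t2.
    theta = {v: subst(t2, t) for v, t in t1.items()}   # push t2 through t1's bindings
    for v, t in t2.items():
        theta.setdefault(v, t)                          # keep t2's own bindings
    return theta

# Check SUBST(COMPOSE(T1, T2), p) = SUBST(T2, SUBST(T1, p)) on a small case.
t1, t2 = {"x": "y"}, {"y": "John"}
p = ("Knows", "x", "y")
assert subst(compose(t1, t2), p) == subst(t2, subst(t1, p))
```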

Forward Chaining in Action
1) American(x) ∧ Weapon(y) ∧ Nation(z) ∧ Hostile(z) ∧ Sells(x, z, y) ⇒ Criminal(x)
2) Owns(Nono, x) ∧ Missile(x) ⇒ Sells(West, Nono, x)
3) Missile(x) ⇒ Weapon(x)
4) Enemy(x, America) ⇒ Hostile(x)

ForwardChain(KB, American(West))
ForwardChain(KB, Nation(Nono))
ForwardChain(KB, Enemy(Nono, America))
ForwardChain(KB, Hostile(Nono))
ForwardChain(KB, Owns(Nono, M1))
ForwardChain(KB, Missile(M1))
ForwardChain(KB, Sells(West, Nono, M1))
ForwardChain(KB, Weapon(M1))
ForwardChain(KB, Criminal(West))
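The trace above can be reproduced with a naive forward chainer. A minimal sketch, not the course's Lisp implementation: by assumption, facts are ground tuples, lowercase strings are variables, and the Criminal rule is written with Sells(x, z, y) so its premise matches the Sells(West, Nono, M1) fact that the second rule derives:

```python
def is_var(t):
    return isinstance(t, str) and t[0].islower()

def subst(theta, term):
    if isinstance(term, tuple):
        return tuple(subst(theta, t) for t in term)
    return theta.get(term, term)

def unify(x, y, theta):
    # Match a (possibly variable-bearing) term x against a ground fact y.
    if theta is None:
        return None
    if x == y:
        return theta
    if is_var(x):
        if x in theta:
            return unify(theta[x], y, theta)
        return {**theta, x: y}
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for a, b in zip(x, y):
            theta = unify(a, b, theta)
        return theta
    return None

def match_all(premises, facts, theta):
    # Yield every substitution that satisfies all premises against the facts.
    if not premises:
        yield theta
        return
    for f in facts:
        t = unify(subst(theta, premises[0]), f, theta)
        if t is not None:
            yield from match_all(premises[1:], facts, t)

def forward_chain(rules, facts):
    # Fire rules until no new facts appear (a fixed point).
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            new = {subst(t, conclusion) for t in match_all(premises, facts, {})}
            if not new <= facts:
                facts |= new
                changed = True
    return facts

rules = [
    ([("American", "x"), ("Weapon", "y"), ("Nation", "z"),
      ("Hostile", "z"), ("Sells", "x", "z", "y")], ("Criminal", "x")),
    ([("Owns", "Nono", "x"), ("Missile", "x")], ("Sells", "West", "Nono", "x")),
    ([("Missile", "x")], ("Weapon", "x")),
    ([("Enemy", "x", "America")], ("Hostile", "x")),
]
facts = [("American", "West"), ("Nation", "Nono"), ("Enemy", "Nono", "America"),
         ("Owns", "Nono", "M1"), ("Missile", "M1")]
```

`forward_chain(rules, facts)` then contains ("Criminal", "West"), along with the intermediate Sells, Weapon and Hostile conclusions from the trace.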

What’s the Problem?
Willy-nilly inferencing

Backward Chaining
Start from what you’re trying to prove, and look for support.

When a query q is asked:
  if a matching fact q′ is known, return the unifier
  for each rule whose consequent q′ matches q:
    attempt to prove each premise of the rule by backward chaining
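The query procedure can be sketched for the ground (propositional) case; full backward chaining also unifies variables and standardizes them apart. A minimal sketch under those assumptions, with illustrative rule and fact names:

```python
def backward_chain(rules, facts, goal):
    # rules: list of (premises, conclusion) over ground atoms; facts: set of atoms.
    # Assumes the rule graph is acyclic: this sketch has no loop detection.
    if goal in facts:
        return True  # a matching fact is known
    return any(
        conclusion == goal and all(backward_chain(rules, facts, p) for p in premises)
        for premises, conclusion in rules
    )

# Illustrative ground rules and facts (names assumed for the example):
rules = [
    (["Missile(M1)"], "Weapon(M1)"),
    (["Weapon(M1)", "Hostile(Nono)"], "Criminal(West)"),
]
facts = {"Missile(M1)", "Hostile(Nono)"}
```

Here `backward_chain(rules, facts, "Criminal(West)")` succeeds: the goal reduces to Weapon(M1) and Hostile(Nono), and Weapon(M1) in turn reduces to the known fact Missile(M1).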

Revisiting Unification
Can we unify Knows(John, x) and Knows(x, Elizabeth)?
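The question can be answered mechanically. A minimal unifier sketch (no occur check), under the assumed convention that lowercase strings are variables and compound terms are tuples:

```python
def is_var(t):
    return isinstance(t, str) and t[0].islower()

def unify(x, y, theta=None):
    # Return a most general unifier as a dict of bindings, or None on failure.
    if theta is None:
        theta = {}
    if x == y:
        return theta
    if is_var(x):
        return unify_var(x, y, theta)
    if is_var(y):
        return unify_var(y, x, theta)
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for a, b in zip(x, y):
            theta = unify(a, b, theta)
            if theta is None:
                return None
        return theta
    return None

def unify_var(v, t, theta):
    # Bind v to t, chasing any existing bindings first.
    if v in theta:
        return unify(theta[v], t, theta)
    if is_var(t) and t in theta:
        return unify(v, theta[t], theta)
    return {**theta, v: t}
```

`unify(("Knows", "John", "x"), ("Knows", "x", "Elizabeth"))` fails (returns None): x would have to be both John and Elizabeth. After standardizing apart, `unify(("Knows", "John", "x"), ("Knows", "y", "Elizabeth"))` succeeds with {y: John, x: Elizabeth}.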

Now what’s wrong? Is this complete?
An inference procedure i is complete iff KB ⊢i α whenever KB |= α

PhD(x) ⇒ HighlyQualified(x)
¬PhD(x) ⇒ EarlyEarnings(x)
HighlyQualified(x) ⇒ Rich(x)
EarlyEarnings(x) ⇒ Rich(x)

Does a Complete Algorithm Exist?
Kurt says yes
– Any sentence that is entailed by a set of sentences can be proved from that set
– In other words: we can find a complete inference procedure
What is it?

Resolution
Remember Chapter 6?
α ∨ β, ¬β ∨ γ ⊢ α ∨ γ
¬α ⇒ β, β ⇒ γ ⊢ ¬α ⇒ γ
Is this an improvement?

Resolution Procedure
Resolution is a refutation procedure: to prove KB |= α, show that KB ∧ ¬α is unsatisfiable
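The refutation procedure can be sketched for propositional clauses. A minimal, unoptimized version assuming a clause is a set of string literals with "~" marking negation; it saturates until either the empty clause appears (contradiction, so the query is entailed) or no new clauses do:

```python
from itertools import combinations

def resolve(c1, c2):
    # All resolvents of two clauses (frozensets of literals).
    out = []
    for lit in c1:
        comp = lit[1:] if lit.startswith("~") else "~" + lit
        if comp in c2:
            out.append(frozenset((c1 - {lit}) | (c2 - {comp})))
    return out

def entails(kb, query):
    # To prove KB |= query, show KB plus the negated query is unsatisfiable.
    neg = query[1:] if query.startswith("~") else "~" + query
    clauses = {frozenset(c) for c in kb} | {frozenset([neg])}
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolve(c1, c2):
                if not r:
                    return True   # empty clause: contradiction found
                new.add(r)
        if new <= clauses:
            return False          # fixed point, no contradiction
        clauses |= new

# The PhD example from the completeness slide, propositionalized:
# (PhD => HQ), (~PhD => EE), (HQ => Rich), (EE => Rich)
kb = [["~PhD", "HQ"], ["PhD", "EE"], ["~HQ", "Rich"], ["~EE", "Rich"]]
```

`entails(kb, "Rich")` returns True even though GMP cannot derive Rich from these non-Horn clauses; that case-analysis step is exactly what resolution adds.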

Resolution Procedure

Canonical Forms
CNF (conjunctive normal form)
– Start with a bunch of disjunctions
– Pretend all of them are joined with one big conjunction
INF (implicative normal form)
– Each sentence is an implication with a conjunction of atoms on the left, and a disjunction of atoms on the right

Out of the Frying Pan?
Created GMP, which needed Horn clauses
– But we can’t always transform sentences into Horn clauses!
– So we need another procedure
Stumble upon resolution, which needs CNF or INF
– Can we always transform into CNF or INF?

CNF vs. Horn
The diff:
– In Horn form, the RHS must be a single atom
– In CNF, the RHS is a disjunction
MP can derive atomic conclusions; what about resolution?
– Recast atomic terms as implications of TRUE (e.g. True ⇒ P(A))

Conversion to CNF
Any FOL KB can be converted into CNF
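The propositional core of the conversion can be sketched in three rewriting passes: eliminate implications, push negations inward, and distribute ∨ over ∧. A minimal sketch; the first-order version additionally standardizes variables apart, Skolemizes existentials, and drops universal quantifiers:

```python
# Formulas as nested tuples: ("~", p), ("&", p, q), ("|", p, q), ("->", p, q);
# atoms are strings.

def to_cnf(f):
    return distribute(nnf(elim_implications(f)))

def elim_implications(f):
    # p -> q  becomes  ~p | q
    if isinstance(f, str):
        return f
    if f[0] == "->":
        return ("|", ("~", elim_implications(f[1])), elim_implications(f[2]))
    return (f[0],) + tuple(elim_implications(a) for a in f[1:])

def nnf(f):
    # Push negations down to the atoms with De Morgan's laws.
    # Assumes implications have already been eliminated.
    if isinstance(f, str):
        return f
    if f[0] == "~":
        g = f[1]
        if isinstance(g, str):
            return f
        if g[0] == "~":
            return nnf(g[1])                      # double negation
        flip = "|" if g[0] == "&" else "&"        # De Morgan
        return (flip, nnf(("~", g[1])), nnf(("~", g[2])))
    return (f[0], nnf(f[1]), nnf(f[2]))

def distribute(f):
    # (a & b) | c  becomes  (a | c) & (b | c)
    if isinstance(f, str) or f[0] == "~":
        return f
    a, b = distribute(f[1]), distribute(f[2])
    if f[0] == "|":
        if isinstance(a, tuple) and a[0] == "&":
            return ("&", distribute(("|", a[1], b)), distribute(("|", a[2], b)))
        if isinstance(b, tuple) and b[0] == "&":
            return ("&", distribute(("|", a, b[1])), distribute(("|", a, b[2])))
    return (f[0], a, b)
```

For instance, P → (Q ∧ R) converts to (¬P ∨ Q) ∧ (¬P ∨ R).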

Skolemization
Remove existential quantifiers by elimination
– Like Existential Elimination (EE), but more general
Replace existentially quantified variables with unique constants
– What happens if there’s a universal quantification hiding inside?
– Example: Everyone has a heart
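The "everyone has a heart" example shows why a plain Skolem constant fails once the existential sits inside a universal quantifier. A sketch in standard FOL notation (the predicate names Person, Heart, Has and the Skolem symbols H, F are illustrative assumptions):

```latex
% Everyone has a heart:
\forall x\, \mathit{Person}(x) \Rightarrow \exists y\, \mathit{Heart}(y) \wedge \mathit{Has}(x, y)

% Wrong: a single Skolem constant H says everyone has the SAME heart.
\forall x\, \mathit{Person}(x) \Rightarrow \mathit{Heart}(H) \wedge \mathit{Has}(x, H)

% Right: a Skolem function F(x) supplies a (possibly different) heart per person.
\forall x\, \mathit{Person}(x) \Rightarrow \mathit{Heart}(F(x)) \wedge \mathit{Has}(x, F(x))
```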

Resolution Proof
To prove α:
– Negate it: ¬α
– Convert ¬α to CNF
– Add it to a CNF KB
– Infer a contradiction

Da Proof