Inference in FOL (Chapter 9, Fall 2004)
CSE 471/598, CBS 598 by H. Liu

Inference with quantifiers
- Previously: rules for propositional logic, namely Modus Ponens (p. 211), And-Elimination plus Fig. 7.11 (p. 210), and Resolution (p. 213)
- What is new now are variables and quantifiers
- How can we reuse the above rules? By introducing more rules to remove variables and quantifiers
- The new rules use SUBST(θ, α), where θ is a binding list and α is a sentence
- Example: SUBST({x/Sam}, Sibling(x, John)) = Sibling(Sam, John)
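As a concrete illustration of SUBST (not part of the original slides), here is a minimal Python sketch that applies a binding list to terms represented as nested tuples; the term representation and function names are assumptions made for this example.

# Minimal sketch of SUBST. Terms are variables (lowercase strings),
# constants (capitalized strings), or compound terms (tuples whose
# first element is a predicate/function symbol).
def is_variable(t):
    return isinstance(t, str) and t[0].islower()

def subst(theta, term):
    """Apply the binding list theta (a dict) to a term."""
    if is_variable(term):
        return theta.get(term, term)
    if isinstance(term, tuple):
        return tuple(subst(theta, t) for t in term)
    return term  # constant symbol, left unchanged

# SUBST({x/Sam}, Sibling(x, John)) = Sibling(Sam, John)
print(subst({'x': 'Sam'}, ('Sibling', 'x', 'John')))
# -> ('Sibling', 'Sam', 'John')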

Inference rules for quantifiers
- Universal Instantiation (UI): for any sentence α, variable v, and ground term g, from ∀v α infer SUBST({v/g}, α)
  - E.g., from ∀x King(x) ∧ Greedy(x) ⇒ Evil(x) (all greedy kings are evil), infer King(John) ∧ Greedy(John) ⇒ Evil(John)
- Existential Instantiation (EI): for any sentence α, variable v, and constant symbol k new to the KB, from ∃v α infer SUBST({v/k}, α)
  - E.g., from ∃x Crown(x) ∧ OnHead(x, John) (John has a crown on his head), infer Crown(C1) ∧ OnHead(C1, John) for a new constant C1
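To make the flavor of Universal Instantiation concrete (an illustration, not from the slides), the sketch below substitutes each ground term known to a KB for the universally quantified variable of the rule; the string-template representation is an assumption.

# Sketch of Universal Instantiation: substitute every ground term in
# the KB for the universally quantified variable of a rule. The rule
# is kept as a string template purely for illustration.
ground_terms = ['John', 'Richard']                 # ground terms in the KB
rule = 'King({v}) ^ Greedy({v}) => Evil({v})'      # for all v: ...

for g in ground_terms:
    print(rule.format(v=g))
# King(John) ^ Greedy(John) => Evil(John)
# King(Richard) ^ Greedy(Richard) => Evil(Richard)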

UI and EI
- The new constant k in EI names a specific object and is called a Skolem constant in logic
- If k is not new to the KB, what would happen?
- UI and EI behave differently in inference: UI can be applied many times, while EI can be applied once, after which the existentially quantified sentence can be discarded
- Is the new KB logically equivalent to the old KB? No, they are only inferentially equivalent

Generalized Modus Ponens (GMP)
- Raises Modus Ponens from propositional logic to FOL; this is called lifting
- Takes bigger steps in inference
- It is focused: it does not randomly try UIs
- Example: if we know King(John), ∀y Greedy(y), and ∀x King(x) ∧ Greedy(x) ⇒ Evil(x), we can apply GMP with θ = {x/John, y/John} and q = Evil(x), giving SUBST(θ, q) = Evil(John)

Unification
- UNIFY(p, q) = θ (the unifier, a binding list) where SUBST(θ, p) = SUBST(θ, q)
- Some examples: unifying Knows(Jo, x) with Knows(Jo, Ja), Knows(y, Bi), Knows(y, Mother(y)), and Knows(x, Elizabeth)
- Standardizing apart: rename the variables of one sentence to avoid name clashes (needed for the last example)
- Most general unifier (MGU): the algorithm for finding the MGU is in Fig. 9.1
- Occur check: test whether a variable occurs inside the complex term it is being bound to; omitting it can result in unsound inferences
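Below is a minimal Python sketch of unification with an occur check, in the spirit of the MGU algorithm of Fig. 9.1 but not the book's code; the term representation and helper names are assumptions for illustration.

# Sketch of UNIFY with an occur check. Variables are lowercase strings,
# constants are capitalized strings, compound terms are tuples of the
# form (symbol, arg1, arg2, ...). Returns a binding dict or None (failure).
def is_variable(t):
    return isinstance(t, str) and t[0].islower()

def occurs_in(var, term, theta):
    """Does var occur inside term, under the bindings in theta?"""
    if var == term:
        return True
    if is_variable(term) and term in theta:
        return occurs_in(var, theta[term], theta)
    if isinstance(term, tuple):
        return any(occurs_in(var, t, theta) for t in term[1:])
    return False

def unify(x, y, theta):
    if theta is None:
        return None
    if x == y:
        return theta
    if is_variable(x):
        return unify_var(x, y, theta)
    if is_variable(y):
        return unify_var(y, x, theta)
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for xi, yi in zip(x, y):
            theta = unify(xi, yi, theta)
        return theta
    return None  # mismatched constants, symbols, or arities

def unify_var(var, x, theta):
    if var in theta:
        return unify(theta[var], x, theta)
    if is_variable(x) and x in theta:
        return unify(var, theta[x], theta)
    if occurs_in(var, x, theta):
        return None  # occur check: omitting it makes inference unsound
    new_theta = dict(theta)
    new_theta[var] = x
    return new_theta

# The slide's examples, unifying Knows(Jo, x) with each sentence:
print(unify(('Knows', 'Jo', 'x'), ('Knows', 'Jo', 'Ja'), {}))   # {'x': 'Ja'}
print(unify(('Knows', 'Jo', 'x'), ('Knows', 'y', 'Bi'), {}))    # {'y': 'Jo', 'x': 'Bi'}
print(unify(('Knows', 'Jo', 'x'), ('Knows', 'y', ('Mother', 'y')), {}))
#   {'y': 'Jo', 'x': ('Mother', 'y')}, i.e. x = Mother(Jo) once bindings compose
print(unify(('Knows', 'Jo', 'x'), ('Knows', 'x', 'Elizabeth'), {}))
#   None: fails until the second sentence is standardized apart (rename its x)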

An example proof
- The law says that it is a crime for an American to sell weapons to hostile nations. The country Nono, an enemy of America, has some missiles, and all of its missiles were sold to it by Colonel West, who is American. (p. 280)
- Goal: prove (infer) that West is a criminal
- Represent in FOL using these predicates: American(x), Weapon(x), Sells(x, y, z), Hostile(x), Criminal(x), Owns(x, y), Missile(x), Enemy(x, America)
- Datalog knowledge bases: first-order definite clauses with no function symbols
- How do you prove it? (A formalization is sketched below.)
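As a sketch of the formalization the slide asks for, following the standard textbook encoding of the story (so treat the exact constant names, such as M1 for the missile, as assumptions):

- American(x) ∧ Weapon(y) ∧ Sells(x, y, z) ∧ Hostile(z) ⇒ Criminal(x)  (the law)
- Owns(Nono, M1) and Missile(M1)  (Nono has some missiles; M1 names one of them, via Existential Instantiation)
- Missile(x) ∧ Owns(Nono, x) ⇒ Sells(West, x, Nono)  (all of its missiles were sold to it by West)
- Missile(x) ⇒ Weapon(x)  (background knowledge: missiles are weapons)
- Enemy(x, America) ⇒ Hostile(x)  (background knowledge: an enemy of America counts as hostile)
- American(West) and Enemy(Nono, America)
- Query: Criminal(West)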

Example (continued)
- The proof can be very long for such a simple problem if we use UI (substituting with ground terms); the branching factor increases as the KB grows, and Universal Instantiation has an enormous branching factor
- We need more principled ways to find proofs: combine atomic sentences into conjunctions, instantiate universal rules to match them, then apply Generalized Modus Ponens
- Two ways to proceed: forward chaining and backward chaining

Forward chaining
- A first-order definite clause is either atomic or an implication p1 ∧ p2 ∧ … ∧ pn ⇒ q, where the pi and q are positive literals
- Start with the KB and generate new conclusions using (Generalized) Modus Ponens
- FC is usually used when a new fact is added to the KB, to check whether there are any new consequences
- Algorithm: FOL-FC-ASK (Fig. 9.3)
- An example of answering who is a criminal, Criminal(x) (Fig. 9.4); a simplified sketch of the chaining loop follows below
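The sketch referenced above: a deliberately simplified forward-chaining loop in Python over ground instances of the crime example. The real FOL-FC-ASK unifies rule premises with known facts instead of requiring pre-instantiated ground rules, so the data structures here are assumptions for illustration only.

# Simplified forward chaining over ground definite clauses (a
# propositionalized fragment of the crime example).
facts = {'American(West)', 'Missile(M1)', 'Owns(Nono,M1)',
         'Enemy(Nono,America)'}

rules = [  # (premises, conclusion), already instantiated to ground terms
    ({'Missile(M1)'}, 'Weapon(M1)'),
    ({'Missile(M1)', 'Owns(Nono,M1)'}, 'Sells(West,M1,Nono)'),
    ({'Enemy(Nono,America)'}, 'Hostile(Nono)'),
    ({'American(West)', 'Weapon(M1)', 'Sells(West,M1,Nono)',
      'Hostile(Nono)'}, 'Criminal(West)'),
]

def forward_chain(facts, rules):
    """Repeat until no new facts can be added (a fixed point)."""
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print('Criminal(West)' in forward_chain(set(facts), rules))  # True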

Backward chaining
- Start with something to be proved and find implication sentences that would allow us to conclude it
- The list of goals is a stack waiting to be worked on; when all goals are satisfied, the proof succeeds
- BC is used when there is a specific goal to prove
- Algorithm: FOL-BC-ASK (Fig. 9.6)
- An example of answering who is a criminal, Criminal(x) (Fig. 9.7); a simplified sketch follows below
- BC is a depth-first search algorithm, so it suffers from problems with repeated states and incompleteness
- Which chaining method should we use? Any suggestions?
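The corresponding simplified backward-chaining sketch over the same ground clauses, again an illustration rather than FOL-BC-ASK itself (which handles variables via unification); the visited-set guard hints at the repeated-state problem mentioned above.

# Simplified backward chaining: try to prove the goal by proving all
# premises of some rule that concludes it (depth-first).
facts = {'American(West)', 'Missile(M1)', 'Owns(Nono,M1)',
         'Enemy(Nono,America)'}

rules = [
    ({'Missile(M1)'}, 'Weapon(M1)'),
    ({'Missile(M1)', 'Owns(Nono,M1)'}, 'Sells(West,M1,Nono)'),
    ({'Enemy(Nono,America)'}, 'Hostile(Nono)'),
    ({'American(West)', 'Weapon(M1)', 'Sells(West,M1,Nono)',
      'Hostile(Nono)'}, 'Criminal(West)'),
]

def backward_chain(goal, visited=frozenset()):
    """Depth-first proof search; 'visited' guards against repeated goals."""
    if goal in facts:
        return True
    if goal in visited:          # avoid looping on repeated states
        return False
    for premises, conclusion in rules:
        if conclusion == goal:
            if all(backward_chain(p, visited | {goal}) for p in premises):
                return True
    return False

print(backward_chain('Criminal(West)'))  # True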

Completeness
- With an incomplete proof procedure, there are sentences entailed by the KB that the procedure cannot find (Fig. 9.10)
- Significance of completeness for mathematics: all conjectures could be established mechanically; we would only need a set of fundamental axioms
- Significance for AI: a machine could solve any problem that can be stated in FOL
- So we are looking for a complete proof procedure

Completeness theorem
- If KB ⊨ α, then KB ⊢R α for some inference procedure R (Gödel's completeness theorem)
- What is the procedure R? "There exists one" does not tell us which one
- The resolution algorithm is one such procedure
- Entailment in FOL is semidecidable: we can show that sentences follow if they do, but we cannot always show that they do not

Resolution
- A refutation-complete inference procedure
- Algorithm (Fig. 7.12): the same for both propositional and first-order logic
- Resolution proves KB ⊨ α by proving KB ∧ ¬α unsatisfiable, i.e., by deriving the empty clause
- Requires conjunctive normal form for FOL
- The resolution inference rule resolves two clauses on complementary literals li and mj, where UNIFY(li, ¬mj) = θ
- First-order literals are complementary if one unifies with the negation of the other
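Here is a minimal propositional sketch of the binary resolution rule on clauses represented as sets of literals; the full first-order rule additionally computes UNIFY(li, ¬mj) = θ and applies SUBST(θ, .) to the resolvent. The representation is an assumption for illustration.

# Propositional sketch of binary resolution. A clause is a frozenset of
# literals; negation is marked with a leading '~'.
def negate(lit):
    return lit[1:] if lit.startswith('~') else '~' + lit

def resolve(c1, c2):
    """Return all resolvents of clauses c1 and c2."""
    resolvents = []
    for lit in c1:
        if negate(lit) in c2:
            resolvents.append((c1 - {lit}) | (c2 - {negate(lit)}))
    return resolvents

# Resolving {~King(John), Evil(John)} with {King(John)} gives {Evil(John)}
print(resolve(frozenset({'~King(John)', 'Evil(John)'}),
              frozenset({'King(John)'})))
# Deriving the empty clause signals a contradiction (refutation)
print(resolve(frozenset({'P'}), frozenset({'~P'})))  # [frozenset()]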

Canonical forms for resolution
- Conjunctive normal form (CNF): a conjunction of disjunctions of literals; the KB is one big, implicit conjunction of clauses
- Implicative normal form (INF); CNF and INF are notational variants
- Skolemization: eliminating existential quantifiers
  - It is not always as straightforward as replacing the variable with a new constant
  - Skolem functions: the arguments of the Skolem function are all the enclosing universally quantified variables
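For example (a standard illustration rather than something on the slide): ∀x ∃y Loves(y, x) Skolemizes to Loves(S(x), x), where S is a fresh Skolem function whose argument is the enclosing universally quantified variable x. Simply replacing y with a new constant C would instead assert that one fixed individual loves everyone, which is too strong.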

Conversion to normal form
1. Eliminate implications
2. Move negation (¬) inwards
3. Standardize variables apart
4. Skolemize: remove existential quantifiers by introducing a function of the enclosing universally quantified variables
5. Drop universal quantifiers
6. Distribute ∨ over ∧
An example, built from these sentences (worked below):
- Everyone is loved by someone
- Everyone loves all animals
- Everyone who loves all animals is loved by someone
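A worked sketch of the conversion on the last sentence, "Everyone who loves all animals is loved by someone", following the standard textbook derivation:

Start: ∀x [∀y Animal(y) ⇒ Loves(x, y)] ⇒ [∃y Loves(y, x)]
1. Eliminate implications: ∀x ¬[∀y ¬Animal(y) ∨ Loves(x, y)] ∨ [∃y Loves(y, x)]
2. Move ¬ inwards: ∀x [∃y Animal(y) ∧ ¬Loves(x, y)] ∨ [∃y Loves(y, x)]
3. Standardize variables: ∀x [∃y Animal(y) ∧ ¬Loves(x, y)] ∨ [∃z Loves(z, x)]
4. Skolemize (y becomes F(x), z becomes G(x)): ∀x [Animal(F(x)) ∧ ¬Loves(x, F(x))] ∨ Loves(G(x), x)
5. Drop universal quantifiers: [Animal(F(x)) ∧ ¬Loves(x, F(x))] ∨ Loves(G(x), x)
6. Distribute ∨ over ∧: [Animal(F(x)) ∨ Loves(G(x), x)] ∧ [¬Loves(x, F(x)) ∨ Loves(G(x), x)]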

Proof revisited: the resolution proof of Criminal(West)

Another example proof
- Everyone who loves all animals is loved by someone.
- Anyone who kills an animal is loved by no one.
- Jack loves all animals.
- Either Jack or Curiosity killed the cat, who is named Tuna.
- Did Curiosity kill the cat? (A sketch of the FOL encoding follows below.)
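A sketch of the FOL encoding of these sentences (this follows the standard textbook formalization, so treat the exact predicate names as assumptions):

- ∀x [∀y Animal(y) ⇒ Loves(x, y)] ⇒ [∃y Loves(y, x)]
- ∀x [∃z Animal(z) ∧ Kills(x, z)] ⇒ [∀y ¬Loves(y, x)]
- ∀x Animal(x) ⇒ Loves(Jack, x)
- Kills(Jack, Tuna) ∨ Kills(Curiosity, Tuna)
- Cat(Tuna)
- ∀x Cat(x) ⇒ Animal(x)  (background knowledge: cats are animals)
- Negated goal for the refutation: ¬Kills(Curiosity, Tuna)
Converting to CNF and applying resolution derives the empty clause, so Curiosity did kill the cat.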

Resolution strategies (to guide refutation-based proof search toward a fast proof)
- Unit preference: prefer inferences that produce shorter sentences (resolve with unit clauses)
- Set of support: every resolution step involves a clause from the set of support, e.g., start with the negated query as the set of support
- Input resolution: every resolution step combines one of the input sentences (from the KB or the query)
- Subsumption: keep the KB small by eliminating all sentences that are subsumed by an existing sentence

Completeness of resolution
- Resolution is refutation-complete: if a set of sentences is unsatisfiable, then resolution will always be able to derive a contradiction (the empty clause)

Summary
- Proofs using simple FOL inference (instantiation alone) are complex and long
- Generalized Modus Ponens is natural and powerful, used in forward or backward chaining
- Refutation proof by resolution is complete
- There are strategies to guide the search