Methods of Proof Chapter 7, second half.

Proof methods

Proof methods divide into (roughly) two kinds:
– Application of inference rules: legitimate (sound) generation of new sentences from old.
  – Resolution
  – Forward and backward chaining
– Model checking: searching through truth assignments.
  – Improved backtracking: Davis-Putnam-Logemann-Loveland (DPLL)
  – Heuristic search in model space: WalkSAT

Normal Form

We would like to prove KB ⊨ α, which is equivalent to showing that KB ∧ ¬α is unsatisfiable. We first rewrite KB ∧ ¬α into conjunctive normal form (CNF): a “conjunction of disjunctions”, e.g. (A ∨ ¬B) ∧ (B ∨ ¬C ∨ ¬D). Each disjunction is a clause; each (possibly negated) proposition symbol is a literal. Any KB can be converted into CNF. In fact, any KB can be converted into CNF-3, using clauses with at most 3 literals.

Example: Conversion to CNF

B1,1 ⇔ (P1,2 ∨ P2,1)

1. Eliminate ⇔, replacing α ⇔ β with (α ⇒ β) ∧ (β ⇒ α):
   (B1,1 ⇒ (P1,2 ∨ P2,1)) ∧ ((P1,2 ∨ P2,1) ⇒ B1,1)
2. Eliminate ⇒, replacing α ⇒ β with ¬α ∨ β:
   (¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬(P1,2 ∨ P2,1) ∨ B1,1)
3. Move ¬ inwards using de Morgan’s rules and double negation:
   (¬B1,1 ∨ P1,2 ∨ P2,1) ∧ ((¬P1,2 ∧ ¬P2,1) ∨ B1,1)
4. Apply the distributive law (∨ over ∧) and flatten:
   (¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬P1,2 ∨ B1,1) ∧ (¬P2,1 ∨ B1,1)
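The conversion can also be checked mechanically. Below is a minimal sketch using the sympy library (an assumption: sympy is not part of the lecture material); B11, P12, P21 stand in for B1,1, P1,2, P2,1.

```python
# Sketch: verify the CNF conversion of B1,1 <=> (P1,2 v P2,1) with sympy.
from sympy import symbols
from sympy.logic.boolalg import Equivalent, to_cnf

B11, P12, P21 = symbols('B11 P12 P21')

sentence = Equivalent(B11, P12 | P21)    # B1,1 <=> (P1,2 v P2,1)
cnf = to_cnf(sentence, simplify=False)   # eliminate <=> and =>, push negations in, distribute
print(cnf)
# Expected, up to clause ordering:
# (B11 | ~P12) & (B11 | ~P21) & (P12 | P21 | ~B11)
```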

Resolution

Resolution is an inference rule for CNF that is sound and complete. The simplest case: “If A or B or C is true, but not A, then B or C must be true,” i.e. from A ∨ B ∨ C and ¬A we conclude B ∨ C. In general, from ¬A ∨ B ∨ C and A ∨ D ∨ E we conclude B ∨ C ∨ D ∨ E: if A is false then B or C must be true, and if A is true then D or E must be true; since A is either true or false, B or C or D or E must be true. Duplicate literals in the resulting clause are merged (simplification).

Resolution Algorithm

The resolution algorithm tries to prove KB ⊨ α by showing that KB ∧ ¬α is unsatisfiable. It generates all new sentences obtainable by resolution from KB and the negated query. One of two things can happen:
1. We derive the empty clause, i.e. KB ∧ ¬α is unsatisfiable, and we can entail the query.
2. We find no contradiction and no new clauses can be added: there is a model that satisfies the sentence (non-trivial), and hence we cannot entail the query.
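A compact sketch of this refutation loop, with clauses represented as frozensets of literal strings ('A' positive, '-A' negated); the representation and function names are illustrative, not the textbook code. The usage example is the wumpus KB from the resolution example below.

```python
from itertools import combinations

def negate(lit):
    return lit[1:] if lit.startswith('-') else '-' + lit

def resolvents(ci, cj):
    """All clauses obtainable by resolving ci with cj on one complementary pair of literals."""
    return [(ci - {l}) | (cj - {negate(l)}) for l in ci if negate(l) in cj]

def pl_resolution(kb_clauses, query_lit):
    """Return True iff the KB entails the query literal, by resolution refutation."""
    clauses = set(kb_clauses) | {frozenset({negate(query_lit)})}   # add the negated query
    while True:
        new = set()
        for ci, cj in combinations(clauses, 2):
            for r in resolvents(ci, cj):
                if not r:                    # empty clause: KB ∧ ¬query is unsatisfiable
                    return True
                new.add(r)
        if new.issubset(clauses):            # nothing new and no contradiction
            return False
        clauses |= new

# KB = (B1,1 <=> (P1,2 v P2,1)) ∧ ¬B1,1, already in CNF; query α = ¬P1,2.
kb = {frozenset({'-B11', 'P12', 'P21'}), frozenset({'B11', '-P12'}),
      frozenset({'B11', '-P21'}), frozenset({'-B11'})}
print(pl_resolution(kb, '-P12'))             # True: KB entails ¬P1,2
```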

Resolution example

KB = (B1,1 ⇔ (P1,2 ∨ P2,1)) ∧ ¬B1,1, and the query is α = ¬P1,2. Resolving the clauses of KB ∧ ¬α eventually produces the empty clause, which is false in all worlds, so KB ∧ ¬α is unsatisfiable and KB ⊨ α: the query is true.

Horn Clauses

Resolution can be exponential in space and time. If we can reduce all clauses to Horn clauses, inference becomes linear in space and time. A Horn clause is a clause with at most 1 positive literal, e.g. ¬A ∨ ¬B ∨ C. Every Horn clause can be rewritten as an implication with a conjunction of positive literals as the premises and a single positive literal as the conclusion, e.g. (A ∧ B) ⇒ C. A Horn clause with exactly 1 positive literal is a definite clause; one with 0 positive literals is an integrity constraint, e.g. ¬A ∨ ¬B (A and B cannot both hold). Forward chaining and backward chaining are sound and complete for Horn KBs and run in time and space linear in the size of the KB.
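A small sketch that classifies a clause by its number of positive literals and rewrites a definite clause into its implication form; the literal-string representation ('-X' for ¬X) is an assumption for illustration.

```python
# Clauses are sets of literal strings; '-X' means negated X.
def horn_kind(clause):
    positives = [l for l in clause if not l.startswith('-')]
    if len(positives) == 1:
        return 'definite clause (1 positive literal)'
    if len(positives) == 0:
        return 'integrity constraint (0 positive literals)'
    return 'not a Horn clause'

def as_implication(clause):
    """Rewrite a definite clause ¬P1 ∨ ... ∨ ¬Pk ∨ Q as (P1 ∧ ... ∧ Pk) ⇒ Q."""
    premises = sorted(l[1:] for l in clause if l.startswith('-'))
    (conclusion,) = [l for l in clause if not l.startswith('-')]
    return f"({' ∧ '.join(premises)}) ⇒ {conclusion}" if premises else conclusion

clause = {'-A', '-B', 'C'}                                # ¬A ∨ ¬B ∨ C
print(horn_kind(clause), ':', as_implication(clause))     # definite clause (1 positive literal) : (A ∧ B) ⇒ C
```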

Forward chaining

Idea: fire any rule whose premises are satisfied in the KB and add its conclusion to the KB, until the query is found. Forward chaining is sound and complete for Horn KBs. (Figure: the KB drawn as an AND-OR graph, with AND gates and OR gates.)
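A sketch of count-based forward chaining over definite clauses, in the spirit of the algorithm above; the rule/fact representation is an assumption, and the example KB is the standard chaining example (P ⇒ Q, L ∧ M ⇒ P, B ∧ L ⇒ M, A ∧ P ⇒ L, A ∧ B ⇒ L, with facts A and B).

```python
from collections import deque

def forward_chaining(rules, facts, query):
    """rules: list of (premises, conclusion) pairs for definite clauses.
       facts: iterable of known-true symbols. Returns True iff query is derived."""
    count = {i: len(prem) for i, (prem, _) in enumerate(rules)}   # unsatisfied premises per rule
    inferred = set()
    agenda = deque(facts)
    while agenda:
        p = agenda.popleft()
        if p == query:
            return True
        if p in inferred:
            continue
        inferred.add(p)
        for i, (prem, concl) in enumerate(rules):
            if p in prem:
                count[i] -= 1
                if count[i] == 0:            # all premises satisfied: fire the rule
                    agenda.append(concl)
    return False

rules = [({'P'}, 'Q'), ({'L', 'M'}, 'P'), ({'B', 'L'}, 'M'),
         ({'A', 'P'}, 'L'), ({'A', 'B'}, 'L')]
print(forward_chaining(rules, ['A', 'B'], 'Q'))   # True
```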

Forward chaining example (figure: the example KB as an AND-OR graph, with “AND” gates and “OR” gates).

Forward chaining example

Backward chaining

Idea: work backwards from the query q: check if q is known already, or prove by backward chaining all premises of some rule concluding q. Hence BC maintains a stack of sub-goals that need to be proved to get to q.
Avoid loops: check whether a new sub-goal is already on the goal stack.
Avoid repeated work: check whether a new sub-goal
1. has already been proved true, or
2. has already failed.
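A matching sketch of backward chaining with a goal stack for loop avoidance; proved goals are cached, while caching of failed goals is omitted for brevity. The representation and example KB are the same assumptions as in the forward-chaining sketch.

```python
def backward_chaining(rules, facts, query):
    """rules: list of (premises, conclusion) pairs; facts: set of known-true symbols.
       Returns True iff query can be proved."""
    facts = set(facts)

    def prove(goal, stack):
        if goal in facts:
            return True
        if goal in stack:                    # goal already pending: a loop, abandon this branch
            return False
        for prem, concl in rules:
            if concl == goal and all(prove(p, stack | {goal}) for p in prem):
                facts.add(goal)              # cache proved sub-goals to avoid repeated work
                return True
        return False

    return prove(query, frozenset())

rules = [({'P'}, 'Q'), ({'L', 'M'}, 'P'), ({'B', 'L'}, 'M'),
         ({'A', 'P'}, 'L'), ({'A', 'B'}, 'L')]
print(backward_chaining(rules, {'A', 'B'}, 'Q'))   # True
```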

Backward chaining example

Note the loop: we need P to prove L and L to prove P. The sub-goal P is already on the goal stack, so this branch is abandoned.

Backward chaining example

Forward vs. backward chaining

FC is data-driven: automatic, unconscious processing, e.g., object recognition, routine decisions. It may do lots of work that is irrelevant to the goal.
BC is goal-driven: appropriate for problem solving, e.g., Where are my keys? How do I get into a PhD program?
The complexity of BC can be much less than linear in the size of the KB.

Model Checking

Two families of efficient algorithms:
– Complete backtracking search algorithms: the DPLL algorithm
– Incomplete local search algorithms: the WalkSAT algorithm

The DPLL algorithm

Determines whether an input propositional logic sentence (in CNF) is satisfiable. This is just backtracking search for a CSP, with three improvements:
1. Early termination: a clause is true if any of its literals is true; the sentence is false if any clause is false.
2. Pure symbol heuristic: a pure symbol always appears with the same “sign” in all clauses. E.g., in the three clauses (A ∨ ¬B), (¬B ∨ ¬C), (C ∨ A), A and B are pure and C is impure. Make a pure symbol's literal true (if there is a model for S, then making a pure symbol true also gives a model).
3. Unit clause heuristic: a unit clause contains only one literal, and that literal must be true.
Note: literals can become pure symbols or unit clauses as other literals obtain truth values, e.g. the clause (¬B ∨ C) becomes the unit clause (C) once B is assigned true.
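A stripped-down sketch of DPLL with the three improvements above; the clause representation and the naive symbol-choice strategy for branching are simplifications rather than the textbook pseudocode.

```python
def dpll(clauses, assignment=None):
    """clauses: list of sets of literal strings ('A' or '-A').
       Returns a satisfying assignment (dict symbol -> bool) or None."""
    if assignment is None:
        assignment = {}

    def neg(l): return l[1:] if l.startswith('-') else '-' + l
    def sym(l): return l.lstrip('-')
    def val(l): return not l.startswith('-')

    # Early termination: drop satisfied clauses, strip falsified literals.
    remaining = []
    for c in clauses:
        if any(sym(l) in assignment and assignment[sym(l)] == val(l) for l in c):
            continue                                   # clause already true
        rest = {l for l in c if sym(l) not in assignment}
        if not rest:
            return None                                # clause false under this assignment
        remaining.append(rest)
    if not remaining:
        return assignment                              # every clause satisfied

    literals = {l for c in remaining for l in c}
    for l in literals:                                 # pure symbol heuristic
        if neg(l) not in literals:
            return dpll(clauses, {**assignment, sym(l): val(l)})
    for c in remaining:                                # unit clause heuristic
        if len(c) == 1:
            (l,) = c
            return dpll(clauses, {**assignment, sym(l): val(l)})
    s = sym(next(iter(literals)))                      # otherwise branch on some symbol
    for value in (True, False):
        result = dpll(clauses, {**assignment, s: value})
        if result is not None:
            return result
    return None

# (A ∨ ¬B) ∧ (B ∨ ¬C ∨ ¬D): prints some satisfying assignment, e.g. {'A': True, 'B': True}
print(dpll([{'A', '-B'}, {'B', '-C', '-D'}]))
```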

The WalkSAT algorithm

An incomplete, local search algorithm. Evaluation function: the min-conflict heuristic of minimizing the number of unsatisfied clauses. It balances greediness and randomness.
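A minimal WalkSAT sketch under the same clause representation as the DPLL sketch; the noise probability p and the flip limit are arbitrary illustrative defaults.

```python
import random

def walksat(clauses, p=0.5, max_flips=10000):
    """clauses: list of sets of literal strings. Returns a satisfying model or None on failure."""
    def sym(l): return l.lstrip('-')
    def clause_true(c, model):
        return any(model[sym(l)] == (not l.startswith('-')) for l in c)

    symbols = {sym(l) for c in clauses for l in c}
    model = {s: random.choice([True, False]) for s in symbols}
    for _ in range(max_flips):
        unsatisfied = [c for c in clauses if not clause_true(c, model)]
        if not unsatisfied:
            return model
        clause = random.choice(unsatisfied)
        if random.random() < p:
            s = sym(random.choice(sorted(clause)))     # random walk: flip a random symbol in the clause
        else:
            def satisfied_count(s):                    # greedy: flip the symbol that satisfies most clauses
                flipped = {**model, s: not model[s]}
                return sum(clause_true(c, flipped) for c in clauses)
            s = max((sym(l) for l in clause), key=satisfied_count)
        model[s] = not model[s]
    return None                                        # failure: unsatisfiable, or just unlucky

# (A ∨ ¬B) ∧ (B ∨ ¬C ∨ ¬D)
print(walksat([{'A', '-B'}, {'B', '-C', '-D'}]))
```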

Hard satisfiability problems

Consider random 3-CNF sentences, e.g.
(¬D ∨ ¬B ∨ C) ∧ (B ∨ ¬A ∨ ¬C) ∧ (¬C ∨ ¬B ∨ E) ∧ (E ∨ ¬D ∨ B) ∧ (B ∨ E ∨ ¬C)
m = number of clauses (5), n = number of symbols (5).
Hard problems seem to cluster near m/n = 4.3 (the critical point).
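A small sketch for generating random 3-CNF sentences at a chosen clause/symbol ratio, e.g. to probe the hardness peak near m/n ≈ 4.3 with the DPLL or WalkSAT sketches above; the symbol naming is arbitrary.

```python
import random

def random_3cnf(n_symbols, n_clauses, rng=random):
    """Generate a random 3-CNF sentence as a list of sets of literal strings."""
    symbols = [f'X{i}' for i in range(1, n_symbols + 1)]
    sentence = []
    for _ in range(n_clauses):
        picked = rng.sample(symbols, 3)                            # three distinct symbols
        sentence.append({s if rng.random() < 0.5 else '-' + s for s in picked})
    return sentence

# Near the critical point m/n ≈ 4.3: n = 50 symbols, m = 215 clauses.
cnf = random_3cnf(50, 215)
print(len(cnf), 'clauses over 50 symbols')
```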

Hard satisfiability problems

Median runtime for 100 satisfiable random 3-CNF sentences, n = 50 (figure).

Inference-based agents in the wumpus world

A wumpus-world agent using propositional logic:
¬P1,1 (no pit in square [1,1])
¬W1,1 (no wumpus in square [1,1])
Bx,y ⇔ (Px,y+1 ∨ Px,y-1 ∨ Px+1,y ∨ Px-1,y) (breeze next to a pit)
Sx,y ⇔ (Wx,y+1 ∨ Wx,y-1 ∨ Wx+1,y ∨ Wx-1,y) (stench next to the wumpus)
W1,1 ∨ W1,2 ∨ … ∨ W4,4 (at least 1 wumpus)
¬W1,1 ∨ ¬W1,2, ¬W1,1 ∨ ¬W1,3, … (at most 1 wumpus: one such clause for every pair of squares)
In total: 64 distinct proposition symbols, 155 sentences.
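A sketch that generates these sentences as plain strings for a 4x4 grid; the helper names are made up, but the output reproduces the sentence count quoted above (155).

```python
from itertools import combinations

def wumpus_kb(size=4):
    """Generate the propositional wumpus-world sentences listed above, as strings."""
    def neighbors(x, y):
        return [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                if 1 <= x + dx <= size and 1 <= y + dy <= size]

    kb = ['-P1,1', '-W1,1']                                         # no pit / wumpus in [1,1]
    for x in range(1, size + 1):
        for y in range(1, size + 1):
            kb.append(f'B{x},{y} <=> (' + ' v '.join(f'P{i},{j}' for i, j in neighbors(x, y)) + ')')
            kb.append(f'S{x},{y} <=> (' + ' v '.join(f'W{i},{j}' for i, j in neighbors(x, y)) + ')')
    squares = [(x, y) for x in range(1, size + 1) for y in range(1, size + 1)]
    kb.append(' v '.join(f'W{x},{y}' for x, y in squares))          # at least one wumpus
    for (a, b) in combinations(squares, 2):                         # at most one wumpus (pairwise)
        kb.append(f'-W{a[0]},{a[1]} v -W{b[0]},{b[1]}')
    return kb

print(len(wumpus_kb()))   # 2 + 32 + 1 + 120 = 155 sentences
```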

Expressiveness limitation of propositional logic

The KB contains “physics” sentences for every single square. For every time t and every location [x,y]:
Lx,y^t ∧ FacingRight^t ∧ Forward^t ⇒ Lx+1,y^(t+1)
where Lx,y^t means the agent is at position (x,y) at time t. This leads to a rapid proliferation of clauses. First-order logic is designed to deal with this through the introduction of variables.
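To make the proliferation concrete, a tiny sketch that enumerates just the FacingRight/Forward location sentences for a 4x4 grid over 10 time steps; the string notation L{x},{y}_{t} is made up to mirror the Lx,y^t notation above.

```python
def location_axioms(size=4, horizon=10):
    """One 'physics' sentence per (square, time step) for a single move direction."""
    axioms = []
    for t in range(horizon):
        for x in range(1, size + 1):
            for y in range(1, size + 1):
                if x + 1 <= size:                     # moving right must stay on the board
                    axioms.append(f'(L{x},{y}_{t} ^ FacingRight_{t} ^ Forward_{t}) => L{x + 1},{y}_{t + 1}')
    return axioms

print(len(location_axioms()))   # 120 sentences for one move direction alone (4x4 grid, 10 steps)
```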

Summary

Logical agents apply inference to a knowledge base to derive new information and make decisions.
Basic concepts of logic:
– syntax: formal structure of sentences
– semantics: truth of sentences with respect to models
– entailment: necessary truth of one sentence given another
– inference: deriving sentences from other sentences
– soundness: derivations produce only entailed sentences
– completeness: derivations can produce all entailed sentences
The wumpus world requires the ability to represent partial and negated information, reason by cases, etc.
Resolution is complete for propositional logic.
Forward and backward chaining are linear-time and complete for Horn clauses.
Propositional logic lacks expressive power.