
1
Logic
CPSC 386 Artificial Intelligence
Ellen Walker, Hiram College

2
Knowledge-Based Agents
Representation
–How is the knowledge stored by the agent?
–Procedural vs. declarative
Reasoning
–How is the knowledge used… to solve a problem? To generate more knowledge?
Generic functions
–TELL (add a fact to the knowledge base)
–ASK (get the next action based on info in the KB)

3
Hunt the Wumpus
The game is played on an M×N grid: one player, one wumpus, one or more pits.
Goal: find the gold while avoiding the wumpus and the pits.
Percepts:
–Glitter (gold is in this square)
–Stench (wumpus is within 1 square N, E, S, W)
–Breeze (pit is within 1 square N, E, S, W)

4
Wumpus Example
[Grid figure: the wumpus square is flanked by stench squares, each pit square is flanked by breeze squares, the gold square shows glitter, and the player begins at the start square, which percieves both stench and breeze from its neighbors.]

5
Examples of Reasoning
If the player is in square (1,0) and the percept is breeze, then there must be a pit in (0,0), (2,0), or (1,1).
If the player is in (0,0) [and still alive], there is no pit in (0,0).
If there is no breeze in (0,0), there is no pit in (0,1).
If there is also no breeze in (0,1), there is no pit in (1,1).
Therefore, there must be a pit in (2,0).

6
Formalizing Reasoning
Information is represented in sentences, which must have correct syntax: ( 1 + 2 ) * 7 = 21 vs. 2 ) + 7 = * ( 1 21
The meaning of a sentence (its semantics) defines its truth in each model (possible world).
One sentence entails another sentence if the second follows logically from the first; i.e., every model in which the first is true also makes the second true.
Inference is the process of deriving a specific sentence from a KB (where the sentence must be entailed by the KB).

7
Desirable Properties of an Inference Algorithm
Soundness
–Only sentences that are entailed by a KB will be derived by the inference algorithm.
Completeness
–Every sentence that is entailed by a KB will be derived by the inference algorithm (eventually).
If both properties hold, then every sentence derived from a true KB will be true in the world.
–All reasoning can be done in a model, not the world!

8
Propositional Logic
Syntax:
Sentence → true | false | P, Q, R, … (primitive sentences)
Sentence → ¬Sentence (not)
Sentence → (Sentence ∧ Sentence) (and)
Sentence → (Sentence ∨ Sentence) (or)
Sentence → (Sentence → Sentence) (implies / if)
Sentence → (Sentence ↔ Sentence) (if and only if / iff)
Note: propositional logic can be directly implemented in hardware, using logic gates for the operations.
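One way to make this grammar concrete is a tiny evaluator. This is an illustrative sketch, not from the slides: a sentence is either a proposition symbol (a string) or a nested tuple whose first element names the connective, and a model is a dict mapping symbols to truth values.

```python
# Illustrative sketch (not from the slides): a sentence is either a
# proposition symbol (a string) or a tuple (connective, operand, ...).

def holds(sentence, model):
    """Recursively evaluate a propositional sentence in a model
    (a dict mapping proposition symbols to True/False)."""
    if isinstance(sentence, str):        # primitive sentence: P, Q, R, ...
        return model[sentence]
    op, *args = sentence
    if op == "not":
        return not holds(args[0], model)
    if op == "and":
        return holds(args[0], model) and holds(args[1], model)
    if op == "or":
        return holds(args[0], model) or holds(args[1], model)
    if op == "implies":                  # A -> B  ==  (not A) or B
        return (not holds(args[0], model)) or holds(args[1], model)
    if op == "iff":                      # A <-> B: same truth value
        return holds(args[0], model) == holds(args[1], model)
    raise ValueError(f"unknown connective: {op!r}")

# P11 -> B10 is true in any model where P11 is false
print(holds(("implies", "P11", "B10"), {"P11": False, "B10": False}))  # True
```

Because each connective maps directly to a boolean operation, this same structure is what "implemented in hardware using logic gates" means: one gate per tuple node.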

9
Propositional Logic Sentences
If there is a pit at [1,1], there is a breeze at [1,0]: P11 → B10
There is a breeze at [2,2] if and only if there is a pit in the neighborhood: B22 ↔ (P21 ∨ P23 ∨ P12 ∨ P32)
There is no breeze at [2,2]: ¬B22

10
Propositional Logic Inference
Question: does KB entail S?
Method 1: truth-table entailment
–Construct a truth table whose columns are all the propositions used in the sentences of the KB.
–If S is true everywhere all sentences in the KB are true, then KB entails S (otherwise not).
Method 2: proof
–Proof by deduction
–Proof by contradiction
–Etc.

11
Truth Table Entailment
A   B   C   A∧B   A∧C   B∧C
F   F   F    F     F     F
F   F   T    F     F     F
F   T   F    F     F     F
F   T   T    F     F     T
T   F   F    F     F     F
T   F   T    F     T     F
T   T   F    T     F     F
T   T   T    T     T     T
{A, B} entails A ∧ B; A ∧ C does not entail B ∧ C.
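Method 1 can be sketched mechanically: enumerate all models over the symbols and check that the query holds in every model where all KB sentences hold. A minimal illustration (the representation is invented for this sketch; only ¬, ∧, and ∨ are supported):

```python
from itertools import product

def holds(s, model):
    """Evaluate a sentence: a string is a symbol; a tuple is
    ('not' | 'and' | 'or', operand, ...)."""
    if isinstance(s, str):
        return model[s]
    op, *args = s
    if op == "not":
        return not holds(args[0], model)
    vals = [holds(a, model) for a in args]
    return all(vals) if op == "and" else any(vals)

def tt_entails(kb, query, symbols):
    """KB entails query iff query is true in every model where
    every sentence in KB is true."""
    for values in product([False, True], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if all(holds(s, model) for s in kb) and not holds(query, model):
            return False                 # found a counterexample model
    return True

syms = ["A", "B", "C"]
print(tt_entails(["A", "B"], ("and", "A", "B"), syms))           # True
print(tt_entails([("and", "A", "C")], ("and", "B", "C"), syms))  # False
```

The second call fails on the model A=T, C=T, B=F: the KB sentence A ∧ C holds there, but B ∧ C does not, which is exactly the counterexample row in the table above.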

12
Truth Table Entailment Is…
Sound
–By definition, since it directly implements the definition of entailment.
Complete
–Only when the KB (and therefore the truth table) is finite.
Time consuming
–The truth table has 2^n rows, where n is the number of propositions, and we have to check every row!

13
Rules for Deductive Proofs
Modus ponens
–Given: S1 → S2 and S1, derive S2
And-elimination
–Given: S1 ∧ S2, derive S1
–Given: S1 ∧ S2, derive S2
DeMorgan's laws
–Given: ¬(A ∨ B), derive ¬A ∧ ¬B
–Given: ¬(A ∧ B), derive ¬A ∨ ¬B
More on p. 210

14
Example Proof by Deduction
Knowledge
S1: B22 ↔ (P21 ∨ P23 ∨ P12 ∨ P32) [rule]
S2: ¬B22 [observation]
Inferences
S3: (B22 → (P21 ∨ P23 ∨ P12 ∨ P32)) ∧ ((P21 ∨ P23 ∨ P12 ∨ P32) → B22) [S1, biconditional elim]
S4: (P21 ∨ P23 ∨ P12 ∨ P32) → B22 [S3, and elim]
S5: ¬B22 → ¬(P21 ∨ P23 ∨ P12 ∨ P32) [S4, contrapositive]
S6: ¬(P21 ∨ P23 ∨ P12 ∨ P32) [S2, S5, MP]
S7: ¬P21 ∧ ¬P23 ∧ ¬P12 ∧ ¬P32 [S6, DeMorgan]

15
Evaluation of Deductive Inference (using p. 110 rules)
Sound
–Yes, because the inference rules themselves are sound. (This can be proven using a truth-table argument.)
Complete
–If we allow all possible inference rules, we're searching an infinite space, hence not complete.
–If we limit the inference rules, we run the risk of leaving out the necessary one…
Monotonic
–If we have a proof, adding information to the KB will not invalidate the proof.

16
Resolution
Resolution allows a complete inference mechanism (search-based) using only one rule of inference.
Resolution rule:
–Given: P1 ∨ P2 ∨ P3 ∨ … ∨ Pn and ¬P1 ∨ Q1 ∨ … ∨ Qm
–Conclude: P2 ∨ P3 ∨ … ∨ Pn ∨ Q1 ∨ … ∨ Qm
The complementary literals P1 and ¬P1 "cancel out."
Why it works:
–Consider two cases: P1 is true, and P1 is false.
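The rule itself is easy to state in code. In this hypothetical sketch (representation invented, not from the slides) a clause is a frozenset of literals, a literal is a (symbol, polarity) pair, and resolve returns every clause obtainable by cancelling one complementary pair:

```python
def resolve(c1, c2):
    """All resolvents of two clauses. A clause is a frozenset of
    literals; a literal is a (symbol, polarity) pair."""
    resolvents = []
    for sym, pol in c1:
        if (sym, not pol) in c2:         # complementary pair cancels out
            rest = (c1 - {(sym, pol)}) | (c2 - {(sym, not pol)})
            resolvents.append(frozenset(rest))
    return resolvents

# (P21 v P23 v P12 v P32) resolved with ~P21 gives (P23 v P12 v P32)
c1 = frozenset({("P21", True), ("P23", True), ("P12", True), ("P32", True)})
c2 = frozenset({("P21", False)})
print(resolve(c1, c2))   # one resolvent: P23 v P12 v P32
```

Using frozensets makes duplicate literals and clause comparison automatic, which matters when resolution is run inside a search loop.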

17
Resolution in Wumpus World
There is a pit at (2,1), (2,3), (1,2), or (3,2):
–P21 ∨ P23 ∨ P12 ∨ P32
There is no pit at (2,1):
–¬P21
Therefore (by resolution) the pit must be at (2,3), (1,2), or (3,2):
–P23 ∨ P12 ∨ P32

18
Proof Using Resolution
To prove a fact P, repeatedly apply resolution until either:
–No new clauses can be added (KB does not entail P), or
–The empty clause is derived (KB does entail P).
This is proof by contradiction: if we prove that KB ∧ ¬P derives a contradiction (the empty clause), and we know KB is true, then ¬P must be false, so P must be true!
To apply resolution mechanically, facts need to be in Conjunctive Normal Form (CNF).
To carry out the proof, we need a search mechanism that will enumerate all possible resolutions.

19
Conjunctive Normal Form
For B22 ↔ (P21 ∨ P23 ∨ P12 ∨ P32):
1. Eliminate ↔, replacing it with two implications:
(B22 → (P21 ∨ P23 ∨ P12 ∨ P32)) ∧ ((P21 ∨ P23 ∨ P12 ∨ P32) → B22)
2. Replace each implication A → B by ¬A ∨ B:
(¬B22 ∨ (P21 ∨ P23 ∨ P12 ∨ P32)) ∧ (¬(P21 ∨ P23 ∨ P12 ∨ P32) ∨ B22)
3. Move ¬ inwards (unnecessary parentheses removed):
(¬B22 ∨ P21 ∨ P23 ∨ P12 ∨ P32) ∧ ((¬P21 ∧ ¬P23 ∧ ¬P12 ∧ ¬P32) ∨ B22)
4. Apply the distributive law:
(¬B22 ∨ P21 ∨ P23 ∨ P12 ∨ P32) ∧ (¬P21 ∨ B22) ∧ (¬P23 ∨ B22) ∧ (¬P12 ∨ B22) ∧ (¬P32 ∨ B22)
(The final result has 5 clauses.)

20
Resolution Example
Given B22, ¬P21, ¬P23, and ¬P32, prove P12.
(¬B22 ∨ P21 ∨ P23 ∨ P12 ∨ P32) ; ¬P12
(¬B22 ∨ P21 ∨ P23 ∨ P32) ; ¬P21
(¬B22 ∨ P23 ∨ P32) ; ¬P23
(¬B22 ∨ P32) ; ¬P32
(¬B22) ; B22
[empty clause]
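The whole refutation can be mechanized. This sketch (representation invented for illustration) adds the negated goal, resolves until saturation, and succeeds if the empty clause ever appears, reproducing the derivation above:

```python
def resolve(c1, c2):
    """All resolvents of two clauses (frozensets of (symbol, polarity) literals)."""
    out = []
    for sym, pol in c1:
        if (sym, not pol) in c2:
            out.append(frozenset((c1 - {(sym, pol)}) | (c2 - {(sym, not pol)})))
    return out

def resolution_entails(kb_clauses, goal):
    """Proof by contradiction: add the negated goal, resolve until
    saturation, succeed if the empty clause is ever derived."""
    sym, pol = goal
    clauses = set(kb_clauses) | {frozenset({(sym, not pol)})}
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                if a is not b:
                    for r in resolve(a, b):
                        if not r:        # empty clause: contradiction found
                            return True
                        new.add(r)
        if new <= clauses:               # saturated with no contradiction
            return False
        clauses |= new

kb = [
    # ~B22 v P21 v P23 v P12 v P32   (one clause of the CNF rule)
    frozenset({("B22", False), ("P21", True), ("P23", True),
               ("P12", True), ("P32", True)}),
    frozenset({("B22", True)}),          # B22
    frozenset({("P21", False)}),         # ~P21
    frozenset({("P23", False)}),         # ~P23
    frozenset({("P32", False)}),         # ~P32
]
print(resolution_entails(kb, ("P12", True)))  # True: the pit is at [1,2]
```

Saturation terminates because only finitely many clauses can be built from a finite set of symbols, but as the next slide notes, that number is worst-case exponential.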

21
Mechanical Approach to Resolution
Use DFS in "clause space":
–The initial state is ¬(whatever is being proven).
–The goal state is the empty clause.
–The next-state generator finds all clauses in the KB that include negations of one or more propositions in the current state; the next states are the resolutions of those clauses with the current state.
Like all DFS, this is worst-case exponential (it may search all possible clauses).

22
Evaluation of Resolution
Resolution is sound
–Because the resolution rule is true in all cases.
Resolution is complete
–Provided a complete search method is used to find the proof; if a proof can be found, it will be.
–Note: you must know what you're trying to prove in order to prove it!
Resolution is exponential
–The number of clauses that we must search grows exponentially…
–If it didn't, we could use resolution to solve 3-SAT in polynomial time!

23
Horn Clauses
A Horn clause is a CNF clause with exactly one positive literal.
–The positive literal is called the head.
–The negative literals are called the body.
–Prolog: head :- body1, body2, body3, …
–English: "To prove the head, prove body1, …"
–Implication: if (body1 ∧ body2 ∧ …) then head
Horn clauses form the basis of forward and backward chaining.
The Prolog language is based on Horn clauses.
Deciding entailment with Horn clauses is linear in the size of the knowledge base.

24
Reasoning with Horn Clauses
Forward chaining
–For each new piece of data, generate all new facts, until the desired fact is generated.
–Data-directed reasoning
Backward chaining
–To prove the goal, find a clause that contains the goal as its head, and prove the body recursively.
–(Backtrack when you choose the wrong clause.)
–Goal-directed reasoning
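Forward chaining can be sketched with the usual count-based bookkeeping (in the style of AIMA's PL-FC-Entails procedure; the rule and fact names below are invented for illustration):

```python
def forward_chain(rules, facts, goal):
    """Data-directed reasoning over Horn clauses: rules are
    (body_set, head) pairs; facts are known-true symbols."""
    count = [len(body) for body, _ in rules]   # unsatisfied premises per rule
    known = set(facts)
    agenda = list(facts)
    processed = set()
    while agenda:
        p = agenda.pop()
        if p == goal:
            return True
        if p in processed:                     # handle each symbol only once
            continue
        processed.add(p)
        for i, (body, head) in enumerate(rules):
            if p in body:
                count[i] -= 1
                if count[i] == 0 and head not in known:
                    known.add(head)            # fire the rule, infer its head
                    agenda.append(head)
    return goal in known

# Hypothetical wumpus-flavored Horn rules, invented for illustration:
rules = [
    ({"stench"}, "wumpus_nearby"),
    ({"wumpus_nearby", "alive"}, "danger"),
]
print(forward_chain(rules, {"stench", "alive"}, "danger"))  # True
```

Each symbol is processed once and each rule premise decremented once, which is why Horn-clause entailment is linear in the size of the knowledge base.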

25
First-Order Logic (ch. 8)
Has greater expressive power than propositional logic.
–We no longer need a separate rule for each square to say which other squares are breezy or have pits.
Allows for facts, objects, and relations.
–In programming terms: allows classes, functions, and variables.
