
1 Logical Agents ECE457 Applied Artificial Intelligence Spring 2007 Lecture #6

2 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 2 Outline Logical reasoning Propositional Logic Wumpus World Inference Russell & Norvig, chapter 7

3 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 3 Logical Reasoning Recall: Game-playing with imperfect information Partially-observable environment Need to make inferences about hidden information Two new challenges How to represent the information we have (knowledge representation) How to use the information we have to infer new information and make decisions (knowledge reasoning)

4 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 4 Knowledge Representation Represent facts about the environment Many ways: ontologies, mathematical functions, … Statements that are either true or false Language To write the statements Syntax: symbols (words) and rules to combine them (grammar) Semantics: meaning of the statements Expressiveness vs. efficiency Knowledge base (KB) Contains all the statements Agent can TELL it new statements (update) Agent can ASK it for information (query)

5 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 5 Knowledge Representation Example: Language of arithmetic Syntax describes well-formed formulas (WFF) X + Y > 7 (WFF) X 7 @ Y + (not a WFF) Semantics describes meanings of formulas “X + Y > 7” is true if and only if the value of X and the value of Y summed together is greater than 7

6 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 6 Knowledge Reasoning Inference Discovering new facts and drawing conclusions based on existing information During ASK or TELL “All humans are mortal” “Socrates is human” Entailment A sentence α is inferred from sentences β: α is true given that the β are true. β entails α: β ⊨ α

7 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 7 Propositional Logic Sometimes called “Boolean Logic” Sentences are true (T) or false (F) Words of the syntax include propositional symbols… P, Q, R, … P = “I’m hungry”, Q = “I have money”, R = “I’m going to a restaurant” … and logical connectives: ¬ negation (NOT), ∧ conjunction (AND), ∨ disjunction (OR), → implication (IF-THEN), ↔ biconditional (IF AND ONLY IF)

8 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 8 Propositional Logic Atomic sentences Propositional symbols True or false Complex sentences Groups of propositional symbols joined with connectives, and parentheses if needed (P ∧ Q) → R Well-formed formulas following grammar rules of the syntax

9 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 9 Propositional Logic Complex sentences evaluate to true or false, using truth tables. Semantics:
P Q R | P ∧ Q | (P ∧ Q) → R
T T T |   T   |     T
F T T |   F   |     T
T F T |   F   |     T
F F T |   F   |     T
T T F |   T   |     F
F T F |   F   |     T
T F F |   F   |     T
F F F |   F   |     T

10 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 10 Propositional Logic Semantics Truth tables for all connectives. Given each possible truth value of each propositional symbol, we can get the possible truth values of the expression.
P Q | ¬P | P ∧ Q | P ∨ Q | P → Q | P ↔ Q
T T |  F |   T   |   T   |   T   |   T
F T |  T |   F   |   T   |   T   |   F
T F |  F |   F   |   T   |   F   |   F
F F |  T |   F   |   F   |   T   |   T
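These tables can be reproduced mechanically. A minimal Python sketch (not part of the original slides; the helper names are illustrative), treating each connective as a function over Booleans:

```python
from itertools import product

# Each connective as a function over Python booleans.
def NOT(a):        return not a
def AND(a, b):     return a and b
def OR(a, b):      return a or b
def IMPLIES(a, b): return (not a) or b     # false only when a is T and b is F
def IFF(a, b):     return a == b

# Reproduce the connective truth table for two symbols P and Q.
print("P      Q      ¬P     P∧Q    P∨Q    P→Q    P↔Q")
for P, Q in product([True, False], repeat=2):
    row = (P, Q, NOT(P), AND(P, Q), OR(P, Q), IMPLIES(P, Q), IFF(P, Q))
    print("  ".join(f"{str(v):<5}" for v in row))
```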

11 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 11 Propositional Logic Example Propositional symbols: A = “The car has gas”, B = “I can go to the store”, C = “I have money”, D = “I can buy food”, E = “The sun is shining”, F = “I have an umbrella”, G = “I can go on a picnic”
If the car has gas, then I can go to the store: A → B
I can buy food if I can go to the store and I have money: (B ∧ C) → D
If I can buy food and either the sun is not shining or I have an umbrella, I can go on a picnic: (D ∧ (¬E ∨ F)) → G

12 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 12
D E F G | ¬E | ¬E ∨ F | D ∧ (¬E ∨ F) | (D ∧ (¬E ∨ F)) → G
T T T T |  F |   T    |      T       |         T
F T T T |  F |   T    |      F       |         T
T F T T |  T |   T    |      T       |         T
F F T T |  T |   T    |      F       |         T
T T F T |  F |   F    |      F       |         T
F T F T |  F |   F    |      F       |         T
T F F T |  T |   T    |      T       |         T
F F F T |  T |   T    |      F       |         T
T T T F |  F |   T    |      T       |         F
F T T F |  F |   T    |      F       |         T
T F T F |  T |   T    |      T       |         F
F F T F |  T |   T    |      F       |         T
T T F F |  F |   F    |      F       |         T
F T F F |  F |   F    |      F       |         T
T F F F |  T |   T    |      T       |         F
F F F F |  T |   T    |      F       |         T
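The 16-row table can be generated the same way; a short sketch for the sentence (D ∧ (¬E ∨ F)) → G (again illustrative Python, reusing an IMPLIES helper):

```python
from itertools import product

def IMPLIES(a, b):
    return (not a) or b

# D = "I can buy food", E = "the sun is shining", F = "I have an umbrella",
# G = "I can go on a picnic"; sentence: (D ∧ (¬E ∨ F)) → G.
print("D      E      F      G      (D∧(¬E∨F))→G")
for D, E, F, G in product([True, False], repeat=4):
    value = IMPLIES(D and ((not E) or F), G)
    print("  ".join(f"{str(v):<5}" for v in (D, E, F, G)), str(value))
```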

13 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 13 Wumpus World 2D cave divided into rooms Gold Glitters Agent has to pick it up Pits Agent falls in and dies Agent feels breeze near pit Wumpus Agent gets eaten and dies if Wumpus is alive Agent can kill Wumpus with arrow Agent smells stench near Wumpus (alive or dead)

14 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 14 Wumpus World Initial state: (1,1) Goal: Get the gold and get back to (1,1) Actions: Turn 90°, move forward, shoot arrow, pick up gold Cost: +1000 for getting gold, -1000 for dying, -1 per action, -10 for shooting the arrow

15 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 15 Exploring the Wumpus World [4×4 grid figure: visited squares marked OK; adjacent unvisited squares marked Pit? and Wumpus? as possible hazards]

16 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 16 Wumpus World Logic Propositional symbols P i,j = “there is a pit at (i,j)” B i,j = “there is a breeze at (i,j)” S i,j = “there is a stench at (i,j)” W i,j = “there is a Wumpus at (i,j)” K i,j = “(i,j) is ok” Rules
P i,j → (B i+1,j ∧ B i-1,j ∧ B i,j+1 ∧ B i,j-1)
W i,j → (S i+1,j ∧ S i-1,j ∧ S i,j+1 ∧ S i,j-1)
B i,j ↔ (P i+1,j ∨ P i-1,j ∨ P i,j+1 ∨ P i,j-1)
S i,j ↔ (W i+1,j ∨ W i-1,j ∨ W i,j+1 ∨ W i,j-1)
K i,j ↔ (¬W i,j ∧ ¬P i,j)
Have to be written out for every (i,j)
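Because these rules must be written out for every square, they are typically generated programmatically. A sketch that produces the breeze/pit biconditionals for a 4×4 grid (the function names and string format are assumptions for illustration):

```python
def neighbours(i, j, size=4):
    """Adjacent squares of (i,j) that lie inside the grid."""
    candidates = [(i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)]
    return [(a, b) for a, b in candidates if 1 <= a <= size and 1 <= b <= size]

def breeze_rules(size=4):
    """One sentence B_i,j <-> (disjunction of pits in neighbours) per square."""
    rules = []
    for i in range(1, size + 1):
        for j in range(1, size + 1):
            pits = " ∨ ".join(f"P{a},{b}" for a, b in neighbours(i, j, size))
            rules.append(f"B{i},{j} ↔ ({pits})")
    return rules

for rule in breeze_rules():
    print(rule)
# First rule printed: "B1,1 ↔ (P2,1 ∨ P1,2)", matching the KB on the next slide.
```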

17 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 17 Wumpus World KB
1. K 1,1
2. ¬B 1,1
3. ¬S 1,1
a. B 1,1 ↔ (P 2,1 ∨ P 1,2)
b. S 1,1 ↔ (W 2,1 ∨ W 1,2)
c. K 2,1 ↔ (¬W 2,1 ∧ ¬P 2,1)
d. K 1,2 ↔ (¬W 1,2 ∧ ¬P 1,2)

18 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 18 Wumpus World Inference
B 1,1  P 1,2  P 2,1 | ¬B 1,1 | P 1,2 ∨ P 2,1 | B 1,1 ↔ (P 1,2 ∨ P 2,1)
T T T | F | T | T
T F T | F | T | T
T T F | F | T | T
T F F | F | F | F
F T T | T | T | F
F F T | T | T | F
F T F | T | T | F
F F F | T | F | T
KB before: 1. K 1,1  2. ¬B 1,1  3. ¬S 1,1
The only row consistent with 2. ¬B 1,1 and the rule B 1,1 ↔ (P 1,2 ∨ P 2,1) is the last one, so the KB becomes: 1. K 1,1  2. ¬B 1,1  3. ¬S 1,1  4. ¬P 1,2  5. ¬P 2,1

19 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 19 Wumpus World Inference
S 1,1  W 1,2  W 2,1 | ¬S 1,1 | W 1,2 ∨ W 2,1 | S 1,1 ↔ (W 1,2 ∨ W 2,1)
T T T | F | T | T
T F T | F | T | T
T T F | F | T | T
T F F | F | F | F
F T T | T | T | F
F F T | T | T | F
F T F | T | T | F
F F F | T | F | T
Likewise, the only row consistent with 3. ¬S 1,1 and the rule is the last one, so the KB becomes: 1. K 1,1  2. ¬B 1,1  3. ¬S 1,1  4. ¬P 1,2  5. ¬P 2,1  6. ¬W 1,2  7. ¬W 2,1

20 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 20 Wumpus World Inference
P 1,2  W 1,2  K 1,2 | ¬P 1,2 | ¬W 1,2 | ¬W 1,2 ∧ ¬P 1,2 | K 1,2 ↔ (¬W 1,2 ∧ ¬P 1,2)
T T T | F | F | F | F
F T T | T | F | F | F
T F T | F | T | F | F
F F T | T | T | T | T
T T F | F | F | F | T
F T F | T | F | F | T
T F F | F | T | F | T
F F F | T | T | T | F
Given 4. ¬P 1,2 and 6. ¬W 1,2, the rule K 1,2 ↔ (¬W 1,2 ∧ ¬P 1,2) forces K 1,2 (and K 2,1 follows the same way), so the KB becomes: 1. K 1,1  2. ¬B 1,1  3. ¬S 1,1  4. ¬P 1,2  5. ¬P 2,1  6. ¬W 1,2  7. ¬W 2,1  8. K 1,2  9. K 2,1

21 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 21 Wumpus World KB
1. K 1,1  2. ¬B 1,1  3. ¬S 1,1  4. ¬P 1,2  5. ¬P 2,1  6. ¬W 1,2  7. ¬W 2,1  8. K 1,2  9. K 2,1
New facts and inferences:
10. B 2,1  11. P 2,2 ∨ P 3,1  12. ¬S 2,1  13. ¬W 2,2  14. ¬W 3,1  15. ¬B 1,2  16. ¬P 1,3  17. ¬P 2,2  18. S 1,2  19. W 1,3 ∨ W 2,2
Using 17. ¬P 2,2 and 13. ¬W 2,2: 11 becomes P 3,1, 19 becomes W 1,3, and 20. K 2,2 is added.

22 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 22 Inference with Truth Tables Sound Only infers true conclusions from true premises Complete Finds all facts entailed by KB Time complexity = O(2^n) Checks all truth values of all symbols Space complexity = O(n)
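A sketch of this truth-table entailment check in Python (representing the KB and query as Boolean functions is an assumption for illustration); the O(2^n) time cost shows up directly as the loop over all assignments:

```python
from itertools import product

def tt_entails(kb, query, symbols):
    """Return True if kb entails query, by checking every assignment.

    kb and query map a dict {symbol: bool} to a bool.
    O(2^n) time for n symbols, O(n) space per assignment.
    """
    for values in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if kb(model) and not query(model):
            return False          # found a model of KB where the query is false
    return True

# Example: KB = {B1,1 <-> (P1,2 ∨ P2,1), ¬B1,1}; query = ¬P1,2
kb = lambda m: (m["B11"] == (m["P12"] or m["P21"])) and not m["B11"]
query = lambda m: not m["P12"]
print(tt_entails(kb, query, ["B11", "P12", "P21"]))   # True
```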

23 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 23 Inference with Rules Speed up inference by using inference rules Use along with logical equivalences No need to enumerate and evaluate every truth value

24 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 24 Rules and Equivalences Logical equivalences
(α ∧ β) ≡ (β ∧ α)
(α ∨ β) ≡ (β ∨ α)
((α ∧ β) ∧ γ) ≡ (α ∧ (β ∧ γ))
((α ∨ β) ∨ γ) ≡ (α ∨ (β ∨ γ))
¬(¬α) ≡ α
(α → β) ≡ (¬β → ¬α)
(α → β) ≡ (¬α ∨ β)
(α ↔ β) ≡ ((α → β) ∧ (β → α))
¬(α ∧ β) ≡ (¬α ∨ ¬β)
¬(α ∨ β) ≡ (¬α ∧ ¬β)
(α ∧ (β ∨ γ)) ≡ ((α ∧ β) ∨ (α ∧ γ))
(α ∨ (β ∧ γ)) ≡ ((α ∨ β) ∧ (α ∨ γ))
Inference rules
Modus Ponens: from (α → β) and α, infer β
And-Elimination: from (α ∧ β), infer α
And-Introduction: from α and β, infer (α ∧ β)
Unit resolution: from (α ∨ β) and ¬β, infer α
Resolution: from (α ∨ β) and (¬β ∨ γ), infer (α ∨ γ)

25 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 25 Wumpus World & Inference Rules KB: ¬B 1,1
1. B 1,1 ↔ (P 2,1 ∨ P 1,2)
   ⇓ Biconditional elimination
2. (B 1,1 → (P 2,1 ∨ P 1,2)) ∧ ((P 2,1 ∨ P 1,2) → B 1,1)
   ⇓ And elimination
3. (P 2,1 ∨ P 1,2) → B 1,1
   ⇓ Contraposition
4. ¬B 1,1 → ¬(P 2,1 ∨ P 1,2)
   ⇓ Modus Ponens (with ¬B 1,1 from the KB)
5. ¬(P 2,1 ∨ P 1,2)
   ⇓ De Morgan’s Rule
6. ¬P 2,1 ∧ ¬P 1,2
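A derivation like this can be double-checked with a satisfiability test: the conclusion is entailed exactly when KB ∧ ¬conclusion is unsatisfiable. A sketch using sympy (the choice of sympy is an assumption; any SAT checker would do):

```python
from sympy import symbols
from sympy.logic.boolalg import Equivalent, And, Or, Not
from sympy.logic.inference import satisfiable

B11, P21, P12 = symbols("B11 P21 P12")

kb = And(Equivalent(B11, Or(P21, P12)), Not(B11))
conclusion = And(Not(P21), Not(P12))

# KB entails the conclusion iff KB ∧ ¬conclusion has no satisfying model.
print(satisfiable(And(kb, Not(conclusion))))   # False -> entailment holds
```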

26 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 26 Resolution Inference with rules is sound, but only complete if we have all the rules Resolution rule is both sound and complete: from (α ∨ β) and (¬β ∨ γ), infer (α ∨ γ) But it only works on disjunctions! Conjunctive normal form (CNF)
1. Eliminate biconditionals: (α ↔ β) ≡ ((α → β) ∧ (β → α))
2. Eliminate implications: (α → β) ≡ (¬α ∨ β)
3. Move/Eliminate negations: ¬(¬α) ≡ α, ¬(α ∧ β) ≡ (¬α ∨ ¬β), ¬(α ∨ β) ≡ (¬α ∧ ¬β)
4. Distribute ∨ over ∧: (α ∨ (β ∧ γ)) ≡ ((α ∨ β) ∧ (α ∨ γ))

27 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 27 CNF Example
1. B 1,1 ↔ (P 2,1 ∨ P 1,2)
   ⇓ Eliminate biconditionals
2. (B 1,1 → (P 2,1 ∨ P 1,2)) ∧ ((P 2,1 ∨ P 1,2) → B 1,1)
   ⇓ Eliminate implications
3. (¬B 1,1 ∨ P 2,1 ∨ P 1,2) ∧ (¬(P 2,1 ∨ P 1,2) ∨ B 1,1)
   ⇓ Move/Eliminate negations
4. (¬B 1,1 ∨ P 2,1 ∨ P 1,2) ∧ ((¬P 2,1 ∧ ¬P 1,2) ∨ B 1,1)
   ⇓ Distribute ∨ over ∧
5. (¬B 1,1 ∨ P 2,1 ∨ P 1,2) ∧ (¬P 2,1 ∨ B 1,1) ∧ (¬P 1,2 ∨ B 1,1)
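The same conversion can be reproduced with a library routine; a sketch using sympy's to_cnf (an illustrative choice, not part of the lecture):

```python
from sympy import symbols
from sympy.logic.boolalg import Equivalent, Or, to_cnf

B11, P21, P12 = symbols("B11 P21 P12")

sentence = Equivalent(B11, Or(P21, P12))      # B1,1 <-> (P2,1 ∨ P1,2)
print(to_cnf(sentence))
# Three clauses equivalent to step 5, e.g. (B11 | ~P12) & (B11 | ~P21) & (P12 | P21 | ~B11)
```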

28 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 28 Resolution Algorithm Given a KB Need to answer a query α: does KB ⊨ α? Proof by contradiction Show that (KB ∧ ¬α) is unsatisfiable i.e. leads to a contradiction If (KB ∧ ¬α) is unsatisfiable, then (KB ⊨ α) must be true

29 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 29 Resolution Algorithm Convert (KB ∧ ¬α) into CNF For every pair of clauses that contain complementary symbols Apply resolution to generate a new clause Add new clause to sentence End when Resolution gives the empty clause (KB ⊨ α) No new clauses can be added (fail)

30 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 30 Wumpus World & Resolution
(¬B 1,1 ∨ P 1,2 ∨ P 2,1) ∧ (¬P 1,2 ∨ B 1,1) ∧ (¬P 2,1 ∨ B 1,1)   CNF form of B 1,1 ↔ (P 2,1 ∨ P 1,2)
¬B 1,1
Query: ¬P 1,2
Add the negated query and resolve:
(¬B 1,1 ∨ P 1,2 ∨ P 2,1) ∧ (¬P 1,2 ∨ B 1,1) ∧ (¬P 2,1 ∨ B 1,1) ∧ ¬B 1,1 ∧ P 1,2
(¬B 1,1 ∨ P 1,2 ∨ P 2,1) ∧ (¬P 1,2 ∨ B 1,1) ∧ ¬P 2,1 ∧ P 1,2
(¬B 1,1 ∨ P 1,2 ∨ P 2,1) ∧ (¬P 1,2 ∨ B 1,1) ∧ Empty clause!
KB ⊨ ¬P 1,2
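A compact sketch of this refutation procedure, with clauses represented as frozensets of signed literal strings (the encoding and function names are assumptions for illustration):

```python
from itertools import combinations

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """All resolvents of two clauses (frozensets of literals)."""
    resolvents = []
    for lit in c1:
        if negate(lit) in c2:
            resolvents.append(frozenset((c1 - {lit}) | (c2 - {negate(lit)})))
    return resolvents

def resolution_entails(kb_clauses, query_literal):
    """Return True if the clause set entails the literal, by refutation."""
    clauses = set(kb_clauses) | {frozenset([negate(query_literal)])}
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for resolvent in resolve(c1, c2):
                if not resolvent:          # empty clause: contradiction found
                    return True
                new.add(resolvent)
        if new <= clauses:                 # no new clauses: cannot prove query
            return False
        clauses |= new

# Clauses from the slide: CNF of B1,1 <-> (P2,1 ∨ P1,2), plus ¬B1,1
kb = [frozenset(["~B11", "P12", "P21"]),
      frozenset(["~P12", "B11"]),
      frozenset(["~P21", "B11"]),
      frozenset(["~B11"])]
print(resolution_entails(kb, "~P12"))      # True: KB ⊨ ¬P1,2
```

The blind pairing of every two clauses also illustrates why plain resolution is sound and complete but not efficient, as the next slide notes.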

31 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 31 Resolution Algorithm Sound Complete Not efficient

32 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 32 Horn Clauses Resolution algorithm can be further improved by using Horn clauses Disjunction clause with at most one positive symbol ¬α ∨ ¬β ∨ γ Can be rewritten as implication (α ∧ β) → γ Inference in linear time! Using Modus Ponens Forward or backward chaining

33 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 33 Forward Chaining Data-driven reasoning Start with known symbols Infer new symbols and add to KB Use new symbols to infer more new symbols Repeat until query proven or no new symbols can be inferred Work forward from known data, towards proving goal 1. KB: α, β, δ, ε 2. (α ∧ β) → γ 3. (δ ∧ ε) → λ 4. (λ ∧ γ) → q
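A sketch of forward chaining over Horn rules stored as (premise set, conclusion) pairs, with the Greek symbols from the example spelled out as strings (the data layout is an assumption for illustration):

```python
def forward_chain(facts, rules, query):
    """Infer new symbols from known facts until the query is proven
    or nothing new can be added. rules: list of (set_of_premises, conclusion)."""
    known = set(facts)
    changed = True
    while changed:
        if query in known:
            return True
        changed = False
        for premises, conclusion in rules:
            if premises <= known and conclusion not in known:
                known.add(conclusion)      # fire the rule, add its conclusion
                changed = True
    return query in known

facts = {"alpha", "beta", "delta", "epsilon"}
rules = [({"alpha", "beta"}, "gamma"),
         ({"delta", "epsilon"}, "lambda"),
         ({"lambda", "gamma"}, "q")]
print(forward_chain(facts, rules, "q"))    # True
```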

34 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 34 Backward Chaining Goal-driven reasoning Start with query, try to infer it If there are unknown symbols in the premise of the query, infer them first If there are unknown symbols in the premises of those symbols, infer those first Repeat until query proven or its premise cannot be inferred Work backwards from goal, to prove needed information 1. KB: α, β, δ, ε 2. (λ ∧ γ) → q 3. (δ ∧ ε) → λ 4. (α ∧ β) → γ
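The goal-driven counterpart as a sketch, with the same rule representation and a seen set to avoid looping on circular rules (again an illustrative layout, not from the lecture):

```python
def backward_chain(facts, rules, goal, seen=None):
    """Prove goal if it is a known fact, or if some rule concludes it and
    all of that rule's premises can themselves be proven."""
    seen = set() if seen is None else seen
    if goal in facts:
        return True
    if goal in seen:                       # already in progress: avoid cycles
        return False
    seen.add(goal)
    for premises, conclusion in rules:
        if conclusion == goal and all(
                backward_chain(facts, rules, p, seen) for p in premises):
            seen.discard(goal)             # proven; allow reuse elsewhere
            return True
    return False

facts = {"alpha", "beta", "delta", "epsilon"}
rules = [({"lambda", "gamma"}, "q"),
         ({"delta", "epsilon"}, "lambda"),
         ({"alpha", "beta"}, "gamma")]
print(backward_chain(facts, rules, "q"))   # True
```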

35 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 35 Forward vs. Backward Forward chaining Proves everything Goes to work as soon as new information is available Expands the KB a lot Improves understanding of the world Typically used for proving a world model Backward chaining Proves only what is needed for the goal Does nothing until a query is asked Expands the KB as little as needed More efficient Typically used for proofs by contradiction

36 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 36 Assumptions Utility-based agent Environment Fully observable / Partially observable (approximation) Deterministic / Strategic / Stochastic Sequential Static / Semi-dynamic Discrete / Continuous Single agent / Multi-agent

37 ECE457 Applied Artificial Intelligence R. Khoury (2007)Page 37 Assumptions Updated Learning agent Environment Fully observable / Partially observable Deterministic / Strategic / Stochastic Sequential Static / Semi-dynamic Discrete / Continuous Single agent / Multi-agent

