
1  Unification, Forward Chaining, Backward Chaining

2  Seven inference rules for propositional logic

R(1) Modus Ponens: from α ⇒ β and α, infer β
R(2) And-Elimination: from α1 ∧ α2 ∧ … ∧ αn, infer any αi
R(3) And-Introduction: from α1, α2, …, αn, infer α1 ∧ α2 ∧ … ∧ αn
R(4) Or-Introduction: from αi, infer α1 ∨ α2 ∨ … ∨ αn
R(5) Double-Negation Elimination: from ¬¬α, infer α
R(6) Unit Resolution: from α ∨ β and ¬β, infer α
R(7) Resolution: from α ∨ β and ¬β ∨ γ, infer α ∨ γ

Logic connectives: ¬, ∧, ∨, ⇒, ⇔
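Two of these rules can be sketched directly in code. Below is a minimal Python illustration (not part of the lecture) that represents propositional sentences as tuples such as ("=>", p, q), ("or", p, q), and ("not", p):

```python
# Illustrative tuple encoding of propositional sentences (an assumption
# of this sketch, not the lecture's notation).

def modus_ponens(implication, fact):
    """R(1): from (p => q) and p, infer q; otherwise return None."""
    op, premise, conclusion = implication
    if op == "=>" and premise == fact:
        return conclusion
    return None

def unit_resolution(disjunction, negated):
    """R(6): from (p or q) and (not q), infer p; otherwise return None."""
    op, p, q = disjunction
    if op == "or" and negated == ("not", q):
        return p
    if op == "or" and negated == ("not", p):
        return q
    return None

print(modus_ponens(("=>", "rain", "wet"), "rain"))       # wet
print(unit_resolution(("or", "a", "b"), ("not", "b")))   # a
```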

3  The three new inference rules

R(8) Universal Elimination: for any sentence α, variable v, and ground term g,
    from ∀v α infer SUBST({v/g}, α).
  e.g., from ∀x Likes(x, IceCream) we can use the substitution {x/Rose} and infer Likes(Rose, IceCream).

R(9) Existential Elimination: for any sentence α, variable v, and constant symbol k that does not appear elsewhere in the knowledge base,
    from ∃v α infer SUBST({v/k}, α).
  e.g., from ∃x Kill(x, Victim) we can infer Kill(Murderer, Victim), as long as Murderer does not appear elsewhere in the knowledge base.

R(10) Existential Introduction: for any sentence α, variable v that does not occur in α, and ground term g that does occur in α,
    from α infer ∃v SUBST({g/v}, α).
  e.g., from Likes(Rose, IceCream) we can infer ∃x Likes(x, IceCream).

(A ground term is a term that contains no variables.)
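The SUBST operation used by R(8) to R(10) is easy to sketch. The following Python snippet is an illustration only, under the assumption that lowercase names are variables and capitalized names are constants:

```python
def is_variable(term):
    # Assumed convention: lowercase names (x, y) are variables,
    # capitalized names (Rose, IceCream) are constants.
    return isinstance(term, str) and term[0].islower()

def subst(theta, sentence):
    """SUBST({v/g}, alpha) for an atomic sentence given as a tuple."""
    predicate, *args = sentence
    return (predicate, *(theta.get(a, a) if is_variable(a) else a
                         for a in args))

# R(8) Universal Elimination on  forall x Likes(x, IceCream)  with {x/Rose}:
print(subst({"x": "Rose"}, ("Likes", "x", "IceCream")))
# ('Likes', 'Rose', 'IceCream')
```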

4  Example of proof

Bob is a buffalo                 | 1. Buffalo(Bob)  --f1
Pat is a pig                     | 2. Pig(Pat)  --f2
Buffaloes run faster than pigs   | 3. ∀x,y Buffalo(x) ∧ Pig(y) ⇒ Faster(x,y)  --r1
----------------------------------------------------------------------
To prove: Bob runs faster than Pat
----------------------------------------------------------------------
Apply R(3) to f1 and f2          | 4. Buffalo(Bob) ∧ Pig(Pat)  --f3 (And-Introduction)
Apply R(8) to r1 {x/Bob, y/Pat}  | 5. Buffalo(Bob) ∧ Pig(Pat) ⇒ Faster(Bob,Pat)  --f4 (Universal Elimination)
Apply R(1) to f3 and f4          | 6. Faster(Bob,Pat)  --f5 (Implication Elimination)
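This proof can be replayed mechanically. The Python sketch below (the tuple encoding and helper names are our own, not the lecture's) applies the substitution {x/Bob, y/Pat} to r1's premises and, since both instantiated premises are known facts, adds the conclusion:

```python
def apply_subst(theta, s):
    """Apply a substitution to a (possibly nested) tuple sentence."""
    if isinstance(s, tuple):
        return tuple(apply_subst(theta, t) for t in s)
    return theta.get(s, s)

# r1: Buffalo(x) and Pig(y) => Faster(x, y)
r1_premises = (("Buffalo", "x"), ("Pig", "y"))
r1_conclusion = ("Faster", "x", "y")

facts = {("Buffalo", "Bob"), ("Pig", "Pat")}   # f1, f2
theta = {"x": "Bob", "y": "Pat"}               # the R(8) substitution

# R(3)/R(1): if every instantiated premise is a known fact, add the conclusion.
if all(apply_subst(theta, p) in facts for p in r1_premises):
    facts.add(apply_subst(theta, r1_conclusion))   # f5: Faster(Bob, Pat)

print(("Faster", "Bob", "Pat") in facts)  # True
```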

5  Search with primitive inference rules

Operators are inference rules; states are sets of sentences; the goal test checks whether a state contains the query sentence.
R(1) to R(10) are common inference patterns.
Problem: the branching factor is huge, especially for R(8).
Idea: find a substitution that makes the rule premise match some known facts.

Search trace: start with sentences 1-3 (the rule stored in the rule base, the facts in working memory); apply R(3) to 1 & 2 to get 4; apply R(8) to 3 to get 5; apply R(1) to 4 & 5 to get 6.

6  A Reasoning System

[Diagram: an inference engine interacts with a rule base and working memory in a cycle of matching rules against working memory, selecting a rule, and firing it (acting).]

7  Unify

The unification function Unify takes two atomic sentences p and q and returns a substitution θ that would make p and q look the same. A substitution θ unifies atomic sentences p and q if SUBST(θ, p) = SUBST(θ, q). For example:

p               | q                    | θ
Knows(John, x)  | Knows(John, Jane)    | {x/Jane}
Knows(John, x)  | Knows(y, OJ)         | {y/John, x/OJ}
Knows(John, x)  | Knows(y, Mother(y))  | {y/John, x/Mother(John)}

e.g., Unify(Knows(John, x), Knows(John, Jane)) = {x/Jane}

Idea: unify rule premises with known facts, then apply the unifier to the conclusion. E.g., if we know the facts in column q and the rule Knows(John, x) ⇒ Likes(John, x), then we can conclude Likes(John, Jane), Likes(John, OJ), and Likes(John, Mother(John)).
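A minimal Unify can be written directly from this definition. The sketch below is illustrative (lowercase strings are variables, compound terms are tuples, and there is no occurs check); note it leaves the third example's binding chained as x/Mother(y) with y/John rather than composing it into x/Mother(John):

```python
def is_var(t):
    # Assumption: lowercase names (x, y) are variables; capitalized
    # names (John, OJ) are constants; compound terms are tuples.
    return isinstance(t, str) and t[0].islower()

def unify(p, q, theta=None):
    """Return a substitution unifying p and q, or None on failure."""
    if theta is None:
        theta = {}
    if p == q:
        return theta
    if is_var(p):
        return unify_var(p, q, theta)
    if is_var(q):
        return unify_var(q, p, theta)
    if isinstance(p, tuple) and isinstance(q, tuple) and len(p) == len(q):
        for a, b in zip(p, q):          # unify argument lists pairwise
            theta = unify(a, b, theta)
            if theta is None:
                return None
        return theta
    return None

def unify_var(var, x, theta):
    if var in theta:                    # follow an existing binding
        return unify(theta[var], x, theta)
    return {**theta, var: x}

print(unify(("Knows", "John", "x"), ("Knows", "John", "Jane")))
# {'x': 'Jane'}
print(unify(("Knows", "John", "x"), ("Knows", "y", "OJ")))
# {'y': 'John', 'x': 'OJ'}
```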

8  String matching: string1 = string2

Plain strings match when they are equal (string1.equals(string2)): e.g. "rose" = "rose", "I am Rose" = "I am Rose".
Names prefixed with ? act as variables:
  "I am ?x" = "I am Rose" gives ?x = Rose.
  "I am ?x" = "?y am Rose" gives ?y = I, am = am, ?x = Rose.
  "?x is ?y and ?x" = "Rose is rose and ?y" gives ?x = Rose, ?y = rose, and requires ?x = ?y.
  husband(father(?x), Mike) = husband(father(John), Mike) gives ?x = John.

9  Forward chaining

The GMP (Generalized Modus Ponens) rule can be used in two ways. If we start with the sentences in the knowledge base and generate new conclusions that in turn can allow more inferences to be made, this is called forward chaining. That is to say, when a new fact p is added (TELLed) to the KB:
  for each rule such that p unifies with a premise,
    if the other premises are known,
      then add the conclusion to the KB and continue chaining.
Forward chaining is usually used when a new fact is added to the database and we want to generate its consequences. It is data driven: percepts => properties or categories.
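The loop above can be sketched as runnable Python (the tuple encoding of facts and rules, and all helper names, are assumptions of this sketch, not the lecture's code):

```python
def is_var(t):
    return isinstance(t, str) and t[0].islower()

def substitute(theta, s):
    if isinstance(s, tuple):
        return tuple(substitute(theta, t) for t in s)
    return theta.get(s, s) if is_var(s) else s

def match_atom(pattern, fact, theta):
    """One-way unification of a premise pattern against a ground fact."""
    if len(pattern) != len(fact):
        return None
    theta = dict(theta)
    for p, f in zip(pattern, fact):
        p = substitute(theta, p)
        if is_var(p):
            theta[p] = f
        elif p != f:
            return None
    return theta

def match_all(premises, facts, theta):
    """Yield every substitution making all premises known facts."""
    if not premises:
        yield theta
        return
    for fact in facts:
        t = match_atom(premises[0], fact, theta)
        if t is not None:
            yield from match_all(premises[1:], facts, t)

def forward_chain(rules, facts):
    """Fire rules until no new conclusions can be added."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            for theta in list(match_all(premises, facts, {})):
                new = substitute(theta, conclusion)
                if new not in facts:
                    facts.add(new)
                    changed = True
    return facts

rules = [
    ((("Buffalo", "x"), ("Pig", "y")), ("Faster", "x", "y")),              # r1
    ((("Pig", "y"), ("Slug", "z")), ("Faster", "y", "z")),                 # r2
    ((("Faster", "x", "y"), ("Faster", "y", "z")), ("Faster", "x", "z")),  # r3
]
derived = forward_chain(rules, {("Buffalo", "Bob"), ("Pig", "Pat"), ("Slug", "Steve")})
print(("Faster", "Bob", "Steve") in derived)  # True
```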

10  Forward chaining example

Add rules r1, r2, r3 and facts f1, f2, f3 in turn to the KB:

r1. Buffalo(x) ∧ Pig(y) ⇒ Faster(x,y)
r2. Pig(y) ∧ Slug(z) ⇒ Faster(y,z)
r3. Faster(x,y) ∧ Faster(y,z) ⇒ Faster(x,z)

f1. Buffalo(Bob)  unifies with r1's first premise under {x/Bob}.
f2. Pig(Pat)      unifies with r1's second premise under {y/Pat}; with f1, r1 fires => f4. Faster(Bob, Pat).
f3. Slug(Steve)   unifies with r2's second premise under {z/Steve}; with f2 ({y/Pat}), r2 fires => f5. Faster(Pat, Steve).
f4 and f5 then fire r3 under {x/Bob, y/Pat, z/Steve} => f6. Faster(Bob, Steve).

11  Backward chaining

We can use the GMP rule in a different way: start with something we want to prove, find implication sentences that would allow us to conclude it, and then attempt to establish their premises in turn. This is called backward chaining. That is to say, when a query q is ASKed:
  if a matching fact q' is known, return the unifier;
  for each rule whose consequent q' matches q,
    attempt to prove each premise of the rule by backward chaining.
Two versions: find one solution, or find all solutions. A sophisticated backward chaining algorithm is able to keep track of the unifiers and to avoid infinite loops.
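A simplified backward chainer matching this description can be sketched as follows. It is an illustration under two stated assumptions: the goal is ground once substitutions are applied, and there is no loop detection, so it suits simple rule sets like r1 from the earlier proof:

```python
def is_var(t):
    return isinstance(t, str) and t[0].islower()

def substitute(theta, s):
    if isinstance(s, tuple):
        return tuple(substitute(theta, t) for t in s)
    return theta.get(s, s) if is_var(s) else s

def match(pattern, ground, theta):
    """One-way unification: variables occur only in the pattern."""
    if len(pattern) != len(ground):
        return None
    theta = dict(theta)
    for p, g in zip(pattern, ground):
        p = substitute(theta, p)
        if is_var(p):
            theta[p] = g
        elif p != g:
            return None
    return theta

def prove(goal, rules, facts, theta=None):
    """Backward chaining: can the goal be derived from rules and facts?"""
    goal = substitute(theta or {}, goal)
    if goal in facts:                      # a matching fact is known
        return True
    for premises, conclusion in rules:     # rules whose consequent matches
        t = match(conclusion, goal, {})
        if t is not None and all(prove(p, rules, facts, t) for p in premises):
            return True
    return False

# r1: Buffalo(x) and Pig(y) => Faster(x, y), with facts f1 and f2:
rules = [((("Buffalo", "x"), ("Pig", "y")), ("Faster", "x", "y"))]
facts = {("Buffalo", "Bob"), ("Pig", "Pat")}
print(prove(("Faster", "Bob", "Pat"), rules, facts))  # True
```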

12  Backward chaining example

Bob is a buffalo                 | 1. Buffalo(Bob)  --f1
Pat is a pig                     | 2. Pig(Pat)  --f2
Buffaloes run faster than pigs   | 3. ∀x,y Buffalo(x) ∧ Pig(y) ⇒ Faster(x,y)  --r1

Goal: prove Faster(Bob, Pat).
[Proof tree: the goal Faster(Bob, Pat) unifies with r1's conclusion Faster(x, y) under {x/Bob, y/Pat}; the instantiated premises Buffalo(Bob) and Pig(Pat) then unify with facts f1 and f2 under the empty substitution {}.]

13  Forward Chaining: another example

[Diagram of rules]

14  Backward Chaining: another example

[Diagram of rules]

15  Vehicles Rule Base

Rule Base:
bicycle:    IF vehicleType=cycle AND num_wheels=2 AND motor=no THEN vehicle=Bicycle
tricycle:   IF vehicleType=cycle AND num_wheels=3 AND motor=no THEN vehicle=Tricycle
motorcycle: IF vehicleType=cycle AND num_wheels=2 AND motor=yes THEN vehicle=Motorcycle
sportsCar:  IF vehicleType=automobile AND size=small AND num_doors=2 THEN vehicle=Sports_Car
sedan:      IF vehicleType=automobile AND size=medium AND num_doors=4 THEN vehicle=Sedan
miniVan:    IF vehicleType=automobile AND size=medium AND num_doors=3 THEN vehicle=MiniVan
SUV:        IF vehicleType=automobile AND size=large AND num_doors=4 THEN vehicle=Sports_Utility_Vehicle
Cycle:      IF num_wheels<4 THEN vehicleType=cycle
Automobile: IF num_wheels=4 AND motor=yes THEN vehicleType=automobile

Working memory (all variables initially null): vehicleType, size, num_wheels, num_doors, motor, vehicle. For the demo, size is set to medium, num_wheels to 4, motor to yes, and num_doors to 3.

Forward/backward chaining demo: ../../agentSoft/ciagent/part1/rule/RuleApplet.html
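The behaviour of this rule base can be reproduced with a tiny attribute=value forward chainer. The Python sketch below is a reconstruction, not the RuleApplet code, and encodes only the subset of rules needed for the demo values:

```python
# Each rule: (name, [(attribute, op, value), ...], (attribute, value)).
RULES = [
    ("Cycle",      [("num_wheels", "lt", 4)],
                   ("vehicleType", "cycle")),
    ("Automobile", [("num_wheels", "eq", 4), ("motor", "eq", "yes")],
                   ("vehicleType", "automobile")),
    ("sedan",      [("vehicleType", "eq", "automobile"), ("size", "eq", "medium"),
                    ("num_doors", "eq", 4)],
                   ("vehicle", "Sedan")),
    ("miniVan",    [("vehicleType", "eq", "automobile"), ("size", "eq", "medium"),
                    ("num_doors", "eq", 3)],
                   ("vehicle", "MiniVan")),
]

def holds(cond, wm):
    attr, op, val = cond
    if attr not in wm:
        return False
    return wm[attr] < val if op == "lt" else wm[attr] == val

def run(rules, wm):
    """Fire any rule whose conditions hold and whose target is unset."""
    fired = True
    while fired:
        fired = False
        for name, conds, (attr, val) in rules:
            if attr not in wm and all(holds(c, wm) for c in conds):
                wm[attr] = val
                fired = True
    return wm

wm = {"size": "medium", "num_wheels": 4, "num_doors": 3, "motor": "yes"}
run(RULES, wm)
print(wm["vehicle"])  # MiniVan
```

As in the trace on the next slide, the Automobile rule fires first (setting vehicleType), which then enables miniVan to fire.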

16  Forward chaining trace log

--- Setting all Vehicles Rule Base variables to null ---
--- Starting Inferencing Cycle ---
vehicleType = null, size = medium, num_wheels = 4, num_doors = 3, motor = yes, vehicle = null
Testing rules bicycle, tricycle, motorcycle, sportsCar, sedan, miniVan, SUV, Cycle, Automobile
-- Rules in conflict set: Automobile(2)
Firing rule Automobile
Testing rules bicycle, tricycle, motorcycle, sportsCar, sedan, miniVan, SUV, Cycle, Automobile
-- Rules in conflict set: miniVan(3)
Firing rule miniVan
Testing rules bicycle, tricycle, motorcycle, sportsCar, sedan, miniVan, SUV
-- Rules in conflict set: (empty)
vehicleType = automobile, size = medium, num_wheels = 4, num_doors = 3, motor = yes, vehicle = MiniVan
--- Ending Inferencing Cycle ---

17  Backward chaining trace log

--- Starting Demo BackwardChain ---
vehicleType = null, size = medium, num_wheels = 4, num_doors = 3, motor = yes, vehicle = null
Evaluating rule bicycle
  Evaluating rule Cycle: Rule Cycle is false, can't set vehicleType
  Evaluating rule Automobile: Rule Automobile is true, setting vehicleType = automobile
Rule bicycle is false, can't set vehicle
Evaluating rule tricycle: false, can't set vehicle
Evaluating rule motorcycle: false, can't set vehicle
Evaluating rule sportsCar: false, can't set vehicle
Evaluating rule sedan: false, can't set vehicle
Evaluating rule miniVan: Rule miniVan is true, setting vehicle = MiniVan
+++ Found Solution for goal: vehicle
--- Stopping Demo BackwardChain! ---
vehicleType = automobile, size = medium, num_wheels = 4, num_doors = 3, motor = yes, vehicle = MiniVan

The goal is the vehicle variable; backward chaining derives vehicle = MiniVan.

18  Question time

Understand the forward chaining and backward chaining algorithms, and think of examples for each algorithm. Ask me questions!

