Artificial Intelligence Knowledge Representation Problem 2.


1 Artificial Intelligence Knowledge Representation Problem 2

2 Reverse translation
Translate the following into English.
∀x hesitates(x) → lost(x)
  He who hesitates is lost.
¬∃x business(x) ∧ like(x,Showbusiness)
  There is no business like show business.
¬∀x glitters(x) → gold(x)
  Not everything that glitters is gold.
∃x ∀t person(x) ∧ time(t) → canfool(x,t)
  You can fool some of the people all the time.

3 Translating English to FOL
Every gardener likes the sun.
  ∀x gardener(x) → likes(x,Sun)
You can fool some of the people all of the time.
  ∃x ∀t person(x) ∧ time(t) → can-fool(x,t)
You can fool all of the people some of the time.
  ∀x ∃t (person(x) → time(t) ∧ can-fool(x,t))
  ∀x (person(x) → ∃t (time(t) ∧ can-fool(x,t)))   Equivalent
All purple mushrooms are poisonous.
  ∀x (mushroom(x) ∧ purple(x)) → poisonous(x)
No purple mushroom is poisonous.
  ¬∃x purple(x) ∧ mushroom(x) ∧ poisonous(x)
  ∀x (mushroom(x) ∧ purple(x)) → ¬poisonous(x)   Equivalent
There are exactly two purple mushrooms.
  ∃x ∃y mushroom(x) ∧ purple(x) ∧ mushroom(y) ∧ purple(y) ∧ ¬(x=y) ∧ ∀z (mushroom(z) ∧ purple(z)) → ((x=z) ∨ (y=z))
Clinton is not tall.
  ¬tall(Clinton)
X is above Y iff X is directly on top of Y or there is a pile of one or more other objects directly on top of one another starting with X and ending with Y.
  ∀x ∀y above(x,y) ↔ (on(x,y) ∨ ∃z (on(x,z) ∧ above(z,y)))

4 Resolution for first-order logic
for all x: (NOT(Knows(John, x)) OR IsMean(x) OR Loves(John, x))
  John loves everything he knows, with the possible exception of mean things.
for all y: (Loves(Jane, y) OR Knows(y, Jane))
  Jane loves everything that does not know her.
What can we unify? What can we conclude?
Use the substitution: {x/Jane, y/John}
Get: IsMean(Jane) OR Loves(John, Jane) OR Loves(Jane, John)
Resolution is complete (i.e., if the set of sentences is not satisfiable, it will find a proof of this), provided we remove literals that are duplicates after unification.
We also need to put everything in canonical form first.
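The unification step used above can be sketched in Python. This is a minimal illustration, not a full unifier (the occurs check is omitted); terms are nested tuples, and the convention that lowercase strings are variables is an assumption of this sketch.

```python
def is_var(t):
    # sketch's convention: lowercase strings are variables, capitalized are constants
    return isinstance(t, str) and t[0].islower()

def unify(a, b, s):
    """Return a substitution dict extending s that unifies a and b, or None."""
    if s is None:
        return None
    if a == b:
        return s
    if is_var(a):
        return unify_var(a, b, s)
    if is_var(b):
        return unify_var(b, a, s)
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):      # unify argument lists pairwise
            s = unify(x, y, s)
        return s
    return None

def unify_var(v, t, s):
    if v in s:
        return unify(s[v], t, s)    # variable already bound: unify its value
    # a full implementation would also do an occurs check here
    return {**s, v: t}
```

Running it on the slide's two literals reproduces the substitution {x/Jane, y/John}.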

5 Resolution

6 Converting sentences to CNF
1. Eliminate all ↔ connectives: (P ↔ Q) ⇒ ((P → Q) ∧ (Q → P))
2. Eliminate all → connectives: (P → Q) ⇒ (¬P ∨ Q)
3. Reduce the scope of each negation symbol to a single predicate:
   ¬¬P ⇒ P
   ¬(P ∧ Q) ⇒ ¬P ∨ ¬Q
   ¬(P ∨ Q) ⇒ ¬P ∧ ¬Q
   ¬(∀x)P ⇒ (∃x)¬P
   ¬(∃x)P ⇒ (∀x)¬P
4. Standardize variables: rename all variables so that each quantifier has its own unique variable name
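Steps 1–3 can be sketched for the propositional connectives as two rewriting functions. This is an illustrative sketch only (the quantifier cases of step 3 are omitted); formulas are nested tuples whose operator tags ("iff", "imp", "not", "and", "or") are this sketch's own convention.

```python
def eliminate(f):
    """Steps 1-2: rewrite <-> and -> using and/or/not.
    (The <-> case folds both steps together.)"""
    if isinstance(f, str):          # atom
        return f
    op = f[0]
    if op == "iff":
        p, q = eliminate(f[1]), eliminate(f[2])
        return ("and", ("or", ("not", p), q), ("or", ("not", q), p))
    if op == "imp":
        return ("or", ("not", eliminate(f[1])), eliminate(f[2]))
    return (op,) + tuple(eliminate(x) for x in f[1:])

def push_neg(f):
    """Step 3: push each negation inward onto a single atom
    (double negation and De Morgan's laws)."""
    if isinstance(f, str):
        return f
    if f[0] == "not":
        g = f[1]
        if isinstance(g, str):
            return f                # negation already at an atom
        if g[0] == "not":
            return push_neg(g[1])
        if g[0] == "and":
            return ("or", push_neg(("not", g[1])), push_neg(("not", g[2])))
        if g[0] == "or":
            return ("and", push_neg(("not", g[1])), push_neg(("not", g[2])))
    return (f[0],) + tuple(push_neg(x) for x in f[1:])
```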

7 Converting sentences
5. Eliminate existential quantification by introducing Skolem constants/functions:
   (∃x)P(x) ⇒ P(c)
     c is a Skolem constant (a brand-new constant symbol that is not used in any other sentence).
   (∀x)(∃y)P(x,y) ⇒ (∀x)P(x, f(x))
     Since ∃ is within the scope of a universally quantified variable, use a Skolem function f to construct a new value that depends on the universally quantified variable.
     f must be a brand-new function name not occurring in any other sentence in the KB.
   E.g., (∀x)(∃y)loves(x,y) ⇒ (∀x)loves(x,f(x))
     In this case, f(x) specifies the person that x loves.

8 Generalized Modus Ponens

9 Modus Ponens - special case of Resolution
p → q, p ⊢ q
Sunday → Dr Yasser is teaching AI
Sunday
∴ Dr Yasser is teaching AI
Using the tricks: p → q ≡ ¬p ∨ q, and resolving p against ¬p ∨ q leaves q, i.e. q

10 Sound rules of inference
Each can be shown to be sound using a truth table.

Rule               Premise         Conclusion
Modus Ponens       A, A → B        B
And Introduction   A, B            A ∧ B
And Elimination    A ∧ B           A
Double Negation    ¬¬A             A
Unit Resolution    A ∨ B, ¬B       A
Resolution         A ∨ B, ¬B ∨ C   A ∨ C
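The truth-table check the slide mentions can be automated: enumerate every truth assignment and verify that whenever a rule's premises all hold, its conclusion holds too. A small sketch; the encoding of each rule as a (premises-hold, conclusion) pair is this sketch's own format.

```python
from itertools import product

def implies(p, q):
    return (not p) or q

def sound(rule, symbols=("A", "B", "C")):
    """Truth-table soundness check: the rule must never have all
    premises true while its conclusion is false."""
    for values in product([False, True], repeat=len(symbols)):
        env = dict(zip(symbols, values))
        premises_hold, conclusion = rule(env)
        if premises_hold and not conclusion:
            return False
    return True

# Two rows of the table, encoded as (premises all hold, conclusion holds)
modus_ponens = lambda e: (e["A"] and implies(e["A"], e["B"]), e["B"])
resolution = lambda e: ((e["A"] or e["B"]) and (not e["B"] or e["C"]),
                        e["A"] or e["C"])
```

An unsound rule such as affirming the consequent (from B and A → B conclude A) fails the same check.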

11 An example
(∀x)(P(x) → ((∀y)(P(y) → P(f(x,y))) ∧ ¬(∀y)(Q(x,y) → P(y))))
2. Eliminate →
(∀x)(¬P(x) ∨ ((∀y)(¬P(y) ∨ P(f(x,y))) ∧ ¬(∀y)(¬Q(x,y) ∨ P(y))))
3. Reduce scope of negation
(∀x)(¬P(x) ∨ ((∀y)(¬P(y) ∨ P(f(x,y))) ∧ (∃y)(Q(x,y) ∧ ¬P(y))))
4. Standardize variables
(∀x)(¬P(x) ∨ ((∀y)(¬P(y) ∨ P(f(x,y))) ∧ (∃z)(Q(x,z) ∧ ¬P(z))))
5. Eliminate existential quantification
(∀x)(¬P(x) ∨ ((∀y)(¬P(y) ∨ P(f(x,y))) ∧ (Q(x,g(x)) ∧ ¬P(g(x)))))
6. Drop universal quantification symbols
(¬P(x) ∨ ((¬P(y) ∨ P(f(x,y))) ∧ (Q(x,g(x)) ∧ ¬P(g(x)))))

12 Forward chaining Proofs start with the given axioms/premises in KB, deriving new sentences until the goal/query sentence is derived This defines a forward-chaining inference procedure because it moves “forward” from the KB to the goal [eventually]

13 Forward chaining Idea: fire any rule whose premises are satisfied in the KB, add its conclusion to the KB, until query is found

14 Forward chaining example

15 Backward chaining
Proofs start with the goal query, find rules with that conclusion, and then prove each of the antecedents in the implication.
Keep going until you reach premises.
Avoid loops: check whether a new sub-goal is already on the goal stack.
Avoid repeated work: check whether a new sub-goal
  has already been proved true, or
  has already failed.
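The procedure above, with the loop check, can be sketched for propositional definite clauses. This is a simplification: a first-order version would also need unification, and the "already proved / already failed" caches are omitted here.

```python
def backward_chain(goal, rules, facts, stack=frozenset()):
    """Prove goal by finding rules that conclude it and recursively
    proving their premises; 'stack' is the goal stack for loop checks."""
    if goal in facts:
        return True
    if goal in stack:               # sub-goal already on the goal stack:
        return False                # abandon this branch to avoid a loop
    for premises, conclusion in rules:
        if conclusion == goal and all(
                backward_chain(p, rules, facts, stack | {goal})
                for p in premises):
            return True
    return False
```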

16 Forward chaining example
KB:
  allergies(X) → sneeze(X)
  cat(Y) ∧ allergic-to-cats(X) → allergies(X)
  cat(Felix)
  allergic-to-cats(Lise)
Goal: sneeze(Lise)
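A naive forward chainer for this KB can be sketched as follows. For brevity the two rules are grounded by hand with X=Lise and Y=Felix; handling the variables properly would require unification.

```python
# Hand-grounded rules (X=Lise, Y=Felix); each rule is (premises, conclusion)
rules = [
    ({"allergies(Lise)"}, "sneeze(Lise)"),
    ({"cat(Felix)", "allergic-to-cats(Lise)"}, "allergies(Lise)"),
]
facts = {"cat(Felix)", "allergic-to-cats(Lise)"}

def forward_chain(rules, facts, goal):
    """Fire any rule whose premises are all in the KB, add its
    conclusion, and repeat until the goal appears or nothing new fires."""
    changed = True
    while changed and goal not in facts:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return goal in facts
```

The second rule fires first, adding allergies(Lise); the first then fires, deriving the goal sneeze(Lise).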

17 Reduction to propositional inference
Suppose the KB contains just the following:
  ∀x King(x) ∧ Greedy(x) → Evil(x)
  King(Ali)
  Greedy(Ali)
  Brother(Saad, Ali)
Instantiating the universal sentence in all possible ways, we have:
  King(Ali) ∧ Greedy(Ali) → Evil(Ali)
  King(Saad) ∧ Greedy(Saad) → Evil(Saad)
  King(Ali)
  Greedy(Ali)
  Brother(Saad, Ali)
The new KB is propositionalized: the proposition symbols are King(Ali), Greedy(Ali), Evil(Ali), King(Saad), etc.
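The instantiation step can be sketched as a small helper that grounds a rule template over the constants appearing in the KB. The template syntax here is an informal convention of this sketch, not a standard notation.

```python
from itertools import product

def propositionalize(template, variables, constants):
    """Ground a universally quantified rule template in every
    possible way over the KB's constants."""
    ground = []
    for values in product(constants, repeat=len(variables)):
        subst = dict(zip(variables, values))
        ground.append(template.format(**subst))
    return ground

# The slide's rule, grounded over the constants appearing in the KB
rules = propositionalize("King({x}) & Greedy({x}) -> Evil({x})",
                         ["x"], ["Ali", "Saad"])
```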

18 An example 1. Sameh is a lawyer. 2. Lawyers are rich. 3. Rich people have big houses. 4. Big houses are a lot of work. We would like to conclude that Sameh’s house is a lot of work.

19 Axiomatization 1
1. lawyer(Sameh)
2. ∀x lawyer(x) → rich(x)
3. ∀x rich(x) → ∃y house(x,y)
4. ∀x,y rich(x) ∧ house(x,y) → big(y)
5. ∀x,y (house(x,y) ∧ big(y) → work(y))
3 and 4 say that rich people do have at least one house and all their houses are big.
Conclusion we want to show: house(Sameh, S_house) ∧ work(S_house)
Or, do we want to conclude that Sameh has at least one house that needs a lot of work? I.e. ∃y house(Sameh,y) ∧ work(y)

20 Hassan and the cat Everyone who loves all animals is loved by someone. Anyone who kills an animal is loved by no one. Mustafa loves all animals. Either Mustafa or Hassan killed the cat, who is named SoSo. Did Hassan kill the cat?

21 Practice example
Mustafa owns a dog. Every dog owner is an animal lover. No animal lover kills an animal. Either Hassan or Mustafa killed the cat, who is named SoSo. Did Hassan kill the cat?
These can be represented as follows:
A. (∃x) Dog(x) ∧ Owns(Mustafa,x)
B. (∀x) ((∃y) Dog(y) ∧ Owns(x, y)) → AnimalLover(x)
C. (∀x) AnimalLover(x) → ((∀y) Animal(y) → ¬Kills(x,y))
D. Kills(Mustafa,SoSo) ∨ Kills(Hassan,SoSo)
E. Cat(SoSo)
F. (∀x) Cat(x) → Animal(x)
G. Kills(Hassan, SoSo)   GOAL

22 Convert to clause form
A1. (Dog(D))
A2. (Owns(Mustafa,D))
B. (¬Dog(y), ¬Owns(x, y), AnimalLover(x))
C. (¬AnimalLover(a), ¬Animal(b), ¬Kills(a,b))
D. (Kills(Mustafa,SoSo), Kills(Hassan,SoSo))
E. Cat(SoSo)
F. (¬Cat(z), Animal(z))
Add the negation of the query:
¬G: (¬Kills(Hassan, SoSo))

23 The resolution refutation proof
R1: ¬G, D, {}                     (Kills(Mustafa,SoSo))
R2: R1, C, {a/Mustafa, b/SoSo}    (¬AnimalLover(Mustafa), ¬Animal(SoSo))
R3: R2, B, {x/Mustafa}            (¬Dog(y), ¬Owns(Mustafa, y), ¬Animal(SoSo))
R4: R3, A1, {y/D}                 (¬Owns(Mustafa, D), ¬Animal(SoSo))
R5: R4, A2, {}                    (¬Animal(SoSo))
R6: R5, F, {z/SoSo}               (¬Cat(SoSo))
R7: R6, E, {}                     FALSE
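The refutation can be reproduced mechanically with a small propositional resolution procedure, shown here as a sketch: clauses are sets of string literals with "~" for negation, and the first-order clauses are grounded by hand (applying {x/Mustafa, y/D, a/Mustafa, b/SoSo, z/SoSo}), since this sketch has no unification.

```python
def resolve(c1, c2):
    """All resolvents of two clauses; literals are strings, '~' negates."""
    out = []
    for lit in c1:
        comp = lit[1:] if lit.startswith("~") else "~" + lit
        if comp in c2:
            out.append((c1 - {lit}) | (c2 - {comp}))
    return out

def refute(clauses):
    """Saturate under resolution; True iff the empty clause is derived."""
    clauses = set(map(frozenset, clauses))
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                if a != b:
                    for r in resolve(a, b):
                        if not r:           # empty clause: contradiction found
                            return True
                        new.add(r)
        if new <= clauses:
            return False                    # saturated without deriving FALSE
        clauses |= new

# The SoSo KB, grounded by hand, plus the negated goal ~G
kb = [
    {"Dog(D)"},
    {"Owns(Mustafa,D)"},
    {"~Dog(D)", "~Owns(Mustafa,D)", "AnimalLover(Mustafa)"},
    {"~AnimalLover(Mustafa)", "~Animal(SoSo)", "~Kills(Mustafa,SoSo)"},
    {"Kills(Mustafa,SoSo)", "Kills(Hassan,SoSo)"},
    {"Cat(SoSo)"},
    {"~Cat(SoSo)", "Animal(SoSo)"},
    {"~Kills(Hassan,SoSo)"},
]
```

Running refute(kb) derives the empty clause, matching steps R1–R7 of the proof.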

24 The proof tree
[Diagram: the resolution proof tree, resolving ¬G successively with D, C, B, A1, A2, F, and E under the substitutions of the previous slide to derive R1 through R7 = FALSE.]

25 Example knowledge base
The law says that it is a crime for an American to sell weapons to hostile nations. The country Nono, an enemy of America, has some missiles, and all of its missiles were sold to it by Colonel West, who is American.
Prove that Col. West is a criminal.

26 Example knowledge base
... it is a crime for an American to sell weapons to hostile nations:
  American(x) ∧ Weapon(y) ∧ Sells(x,y,z) ∧ Hostile(z) → Criminal(x)
Nono … has some missiles, i.e., ∃x Owns(Nono,x) ∧ Missile(x):
  Owns(Nono,M1) and Missile(M1)
… all of its missiles were sold to it by Colonel West:
  Missile(x) ∧ Owns(Nono,x) → Sells(West,x,Nono)
Missiles are weapons:
  Missile(x) → Weapon(x)
An enemy of America counts as "hostile":
  Enemy(x,America) → Hostile(x)
West, who is American …
  American(West)
The country Nono, an enemy of America …
  Enemy(Nono,America)

27 Resolution proof: definite clauses 

28 Rule-Based Systems
Also known as "production systems" or "expert systems".
Rule-based systems are one of the most successful AI paradigms.
Used for synthesis (construction) type systems.
Also used for analysis (diagnostic or classification) type systems.

29 Rule-Based Reasoning
The advantages of the rule-based approach:
  Ease of use
  Good performance
  Good explanation
The disadvantages are:
  Cannot handle missing information
  Knowledge tends to be very task-dependent

30 Other Reasoning
There exist some other approaches, such as:
  Case-Based Reasoning
  Model-Based Reasoning
  Hybrid Reasoning
    Rule-based + case-based
    Rule-based + model-based
    Model-based + case-based

31 Expert System
Uses domain-specific knowledge to provide expert-quality performance in a problem domain.
It is a practical program that uses heuristic strategies developed by human experts to solve a specific class of problems.

32 Expert System Functionality
  Replaces human expert decision making when the expert is not available
  Assists the human expert when integrating various decisions
  Provides an ES user with an appropriate hypothesis
  Provides a methodology for knowledge storage and reuse
Expert system: a software system simulating expert-like decision making while keeping knowledge separate from the reasoning mechanism.

33 Expert System
[Architecture diagram: the User interacts through the User Interface (question & answer, natural language, or graphical interface) with the Inference Engine, which draws on General Knowledge and Case-specific data; an Explanation module and a Knowledge editor are attached.]

34 Expert System Components
Global Database: content of working memory (WM)
Production Rules: knowledge base for the system
Inference Engine: rule interpreter and control subsystem

35 Rule-Based System
Knowledge is in the form of "if condition then effect" (production) rules.
Reasoning algorithm:
  (i) FR ← detect(WM)    find the fireable rules whose conditions hold in working memory
  (ii) R ← select(FR)    resolve conflicts among the fireable rules, picking one to fire
  (iii) WM ← apply R
  (iv) goto (i)
Examples: CLIPS (OPS/5), Prolog
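The (i)–(iv) loop can be sketched directly. Conflict resolution here just picks the first fireable rule (real systems such as CLIPS use strategies like salience and recency), and representing rules as (condition set, action set) pairs is an assumption of this sketch.

```python
def production_cycle(rules, wm, max_cycles=100):
    """The slide's recognize-act loop: (i) FR <- detect(WM),
    (ii) R <- select(FR), (iii) WM <- apply R, (iv) goto (i)."""
    for _ in range(max_cycles):
        fr = [(cond, act) for cond, act in rules
              if cond <= wm and not act <= wm]      # (i) detect fireable rules
        if not fr:
            return wm                               # quiescence: no rule fires
        cond, act = fr[0]                           # (ii) trivial conflict resolution
        wm = wm | act                               # (iii) apply the rule to WM
    return wm

# Toy rules: (condition set, action set)
rules = [({"wet"}, {"slippery"}), ({"rain"}, {"wet"})]
```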

36 Inference Engine
It applies the knowledge to the solution of the actual problem.
It is an interpreter for the knowledge base.
It performs the recognize-act control cycle.

37 Weaknesses of Expert Systems
  Require a lot of detailed knowledge
  Restricted knowledge domain
  Not all domain knowledge fits the rule format
  Expert consensus must exist
  Knowledge acquisition is time consuming
  Truth maintenance is hard
  Forgetting bad facts is hard

38 Expert Systems in Practice
MYCIN
  example of a medical expert system
  old, well-known reference
  great use of the Stanford Certainty Algebra
  problems with legal liability and knowledge acquisition
Prospector
  geological system
  knowledge encoded in semantic networks
  Bayesian model of uncertainty handling
  saved much money

39 Expert Systems in Practice
XCON/R1
  classical rule-based system
  configured DEC computer systems
  commercial application, well used, followed by XSEL, XSITE
  failed operating after 1700 rules in the knowledge base
FelExpert
  rule-based, Bayesian model, taxonomised, used in a number of applications
ICON
  configuration expert system
  uses a proof-planning structure of methods


