1 EXPERT SYSTEMS AND KNOWLEDGE REPRESENTATION
Ivan Bratko
Faculty of Computer and Information Science, University of Ljubljana
2 EXPERT SYSTEMS
An expert system behaves like an expert in some narrow area of expertise.
Typical examples:
- Diagnosis in an area of medicine
- Allocating seats between economy class and business class for future flights
- Diagnosing paper problems in rotary printing
A very fashionable area of AI for a period.
3 SOME FAMOUS EXPERT SYSTEMS
- MYCIN (Shortliffe, Feigenbaum, late seventies): diagnosis of bacterial infections
- PROSPECTOR (Duda, Hart, et al., ~1980): ore and oil exploration
- INTERNIST (Pople, mid eighties): internal medicine
5 FEATURES OF EXPERT SYSTEMS
- Problem solving in the area of expertise
- Interaction with the user during and after problem solving
- Heavy reliance on domain knowledge
- Explanation: the ability to explain results to the user
6 REPRESENTING KNOWLEDGE WITH IF-THEN RULES
Traditionally the most popular form of knowledge representation in expert systems.
Examples of rules:
- if condition P then conclusion C
- if situation S then action A
- if conditions C1 and C2 hold then condition C does not hold
7 DESIRABLE FEATURES OF RULES
- Modularity: each rule defines a relatively independent piece of knowledge
- Incrementability: new rules can be added relatively independently of other rules
- Support for explanation
- Ability to represent uncertainty
8 TYPICAL TYPES OF EXPLANATION
- How explanation: answers user's questions of the form "How did you reach this conclusion?"
- Why explanation: answers user's questions of the form "Why do you need this information?"
9 RULES CAN ALSO HANDLE UNCERTAINTY
if condition A then conclusion B with certainty F
10 EXAMPLE RULE FROM MYCIN

if
  1 the infection is primary bacteremia, and
  2 the site of the culture is one of the sterile sites, and
  3 the suspected portal of entry of the organism is the gastrointestinal tract
then
  there is suggestive evidence (0.7) that the identity of the organism is bacteroides.
11 EXAMPLE RULES FROM AL/X
Diagnosing equipment failure on oil platforms (Reiter 1980)

if
  the pressure in V-01 reached relief valve lift pressure
then
  the relief valve on V-01 has lifted [N = 0.005, S = 400]

if
  NOT the pressure in V-01 reached relief valve lift pressure, and
  the relief valve on V-01 has lifted
then
  the V-01 relief valve opened early (the set pressure has drifted) [N = 0.001, S = 2000]
12 EXAMPLE RULE FROM AL3
Game playing (Bratko 1982)

if
  1 there is a hypothesis, H, that a plan P succeeds, and
  2 there are two hypotheses,
      H1, that a plan R1 refutes plan P, and
      H2, that a plan R2 refutes plan P, and
  3 there are facts: H1 is false, and H2 is false
then
  1 generate the hypothesis, H3, that the combined plan 'R1 or R2' refutes plan P, and
  2 generate the fact: H3 implies not(H)
13 A TOY KNOWLEDGE BASE: WATER IN FLAT
[Figure: flat floor plan]
[Figure: inference network]
14 IMPLEMENTING THE 'LEAKS' EXPERT SYSTEM
Possible directly as Prolog rules:

leak_in_bathroom :-
    hall_wet,
    kitchen_dry.

Employ the Prolog interpreter as the inference engine.
Deficiencies of this approach:
- Limited syntax
- Limited inference (Prolog's backward chaining)
- Limited explanation
15 TAILOR THE SYNTAX WITH OPERATORS
Rules:

if
    hall_wet and kitchen_dry
then
    leak_in_bathroom.

Facts:

fact( hall_wet).
fact( bathroom_dry).
fact( window_closed).
16 BACKWARD CHAINING RULE INTERPRETER

is_true( P) :-
    fact( P).

is_true( P) :-
    if Condition then P,        % A relevant rule
    is_true( Condition).        % whose condition is true

is_true( P1 and P2) :-
    is_true( P1),
    is_true( P2).

is_true( P1 or P2) :-
    is_true( P1)
    ;
    is_true( P2).
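The same backward-chaining idea can be sketched in Python, assuming a simplified rule format (each rule is a list of conjunctive condition atoms plus a conclusion; the `or` case of the Prolog interpreter is omitted for brevity, and the fact and rule names below are just the toy 'leaks' example):

```python
# Minimal Python sketch of backward chaining over the 'leaks' toy knowledge base.
# Rule format is illustrative: (list of conjunctive conditions, conclusion).

FACTS = {"hall_wet", "kitchen_dry"}

RULES = [
    (["hall_wet", "kitchen_dry"], "leak_in_bathroom"),
]

def is_true(goal):
    """A goal holds if it is a known fact, or if some rule concludes it
    and all of that rule's conditions recursively hold."""
    if goal in FACTS:
        return True
    return any(conclusion == goal and all(is_true(c) for c in conditions)
               for conditions, conclusion in RULES)
```

With the facts above, `is_true("leak_in_bathroom")` succeeds because both conditions of the single rule are facts.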
17 A FORWARD CHAINING RULE INTERPRETER

forward :-
    new_derived_fact( P),               % A new fact
    !,
    write( 'Derived: '), write( P), nl,
    assert( fact( P)),
    forward                             % Continue
    ;
    write( 'No more facts').            % All facts derived
18 FORWARD CHAINING INTERPRETER, CTD.

new_derived_fact( Concl) :-
    if Cond then Concl,                 % A rule
    not fact( Concl),                   % Rule's conclusion not yet a fact
    composed_fact( Cond).               % Condition true?

composed_fact( Cond) :-
    fact( Cond).                        % Simple fact

composed_fact( Cond1 and Cond2) :-
    composed_fact( Cond1),
    composed_fact( Cond2).              % Both conjuncts true

composed_fact( Cond1 or Cond2) :-
    composed_fact( Cond1)
    ;
    composed_fact( Cond2).
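The forward-chaining loop above can likewise be mirrored in Python as a fixpoint iteration, again with an illustrative rule format (the second rule and its conclusion `problem_in_bathroom` are hypothetical, added only to show chained derivation):

```python
# Sketch of forward chaining: keep firing rules whose conditions are all
# known and whose conclusion is new, until no new fact can be derived.

facts = {"hall_wet", "kitchen_dry"}
rules = [
    (["hall_wet", "kitchen_dry"], "leak_in_bathroom"),
    (["leak_in_bathroom"], "problem_in_bathroom"),   # hypothetical extra rule
]

def forward(facts, rules):
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conclusion not in derived and all(c in derived for c in conditions):
                derived.add(conclusion)   # a new fact, like assert(fact(P))
                changed = True
    return derived
```

Starting from the observed facts, both `leak_in_bathroom` and then `problem_in_bathroom` are derived in successive passes, which is the "data driven" behavior contrasted with backward chaining on the next slides.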
19 FORWARD VS. BACKWARD CHAINING
Inference chains connect various types of information:
- data → goals
- evidence → hypotheses
- findings, observations → explanations, diagnoses
- manifestations → diagnoses, causes
Backward chaining: "goal driven"
Forward chaining: "data driven"
20 FORWARD VS. BACKWARD CHAINING
What is better?
- Checking a given hypothesis: backward chaining is more natural
- If there are many possible hypotheses: forward chaining is more natural (e.g. monitoring: data-driven)
- Expert reasoning often combines both, e.g. medical diagnosis
- Water in flat: observe hall_wet, infer leak_in_bathroom forward, check kitchen_dry backward
21 GENERATING EXPLANATION
How explanation: How did you find this answer?
Explanation = proof tree of the final conclusion
E.g.:
System: There is a leak in the kitchen.
User: How did you get this?
22
% is_true( P, Proof): Proof is a proof that P is true

:- op( 800, xfx, <=).

is_true( P, P) :-
    fact( P).

is_true( P, P <= CondProof) :-
    if Cond then P,
    is_true( Cond, CondProof).

is_true( P1 and P2, Proof1 and Proof2) :-
    is_true( P1, Proof1),
    is_true( P2, Proof2).

is_true( P1 or P2, Proof) :-
    is_true( P1, Proof)
    ;
    is_true( P2, Proof).
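The proof-tree idea can be sketched in Python: instead of returning a boolean, the prover returns a structure recording how the goal was derived, analogous to the `<=` terms above (the nested-tuple encoding and the fact and rule names are illustrative, not part of the original interpreter):

```python
# Sketch of how-explanations: backward chaining that returns a proof tree.
# A fact proves itself (a leaf); a rule conclusion is proved by the proofs
# of all its conditions, recorded with a "<=" marker like the Prolog version.

FACTS = {"hall_wet", "kitchen_dry"}
RULES = [(["hall_wet", "kitchen_dry"], "leak_in_bathroom")]

def prove(goal):
    """Return a proof tree for goal, or None if it cannot be proved."""
    if goal in FACTS:
        return goal                              # leaf: the fact itself
    for conditions, conclusion in RULES:
        if conclusion == goal:
            subproofs = [prove(c) for c in conditions]
            if all(p is not None for p in subproofs):
                return (goal, "<=", subproofs)   # conclusion <= condition proofs
    return None
```

Here `prove("leak_in_bathroom")` yields `("leak_in_bathroom", "<=", ["hall_wet", "kitchen_dry"])`, which a front end could render as the answer to the user's "How did you get this?" question.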
23 WHY EXPLANATION
Exploring "leak in kitchen", the system asks: Was there rain?
User (not sure) asks: Why do you need this?
System: To explore whether water came from outside.
Why explanation = chain of rules between the current goal and the original (main) goal
24 UNCERTAINTY
Propositions are qualified with "certainty factors", "degrees of belief", "subjective probabilities", ...
Proposition: CertaintyFactor
if Condition then Conclusion: Certainty
25 A SIMPLE SCHEME FOR HANDLING UNCERTAINTY
c( P1 and P2) = min( c(P1), c(P2))
c( P1 or P2) = max( c(P1), c(P2))
Rule "if P1 then P2: C" implies: c(P2) = c(P1) * C
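Written out in Python, the scheme is just three one-liners (function names are ours, chosen for readability):

```python
# The min/max certainty scheme from the slide above.

def c_and(c1, c2):
    """c(P1 and P2) = min(c(P1), c(P2))"""
    return min(c1, c2)

def c_or(c1, c2):
    """c(P1 or P2) = max(c(P1), c(P2))"""
    return max(c1, c2)

def apply_rule(c_condition, rule_certainty):
    """Rule 'if P1 then P2: C' gives c(P2) = c(P1) * C."""
    return c_condition * rule_certainty
```

For example, with c(a) = 0.5 and the MYCIN-style rule certainty 0.7, the conclusion gets certainty 0.35; and `c_or(0.5, 0.0)` equals `c_or(0.5, 0.5)`, the insensitivity the next slide criticizes.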
26 DIFFICULTIES IN HANDLING UNCERTAINTY
In our simple scheme, for example:
- Let c(a) = 0.5, c(b) = 0; then c( a or b) = 0.5
- Now suppose that c(b) increases to 0.5
- Still: c( a or b) = 0.5
This is counterintuitive!
Proper probability calculus would fix this.
Problem with probability calculus: it needs conditional probabilities, possibly too many.
27 TWO SCHOOLS RE. UNCERTAINTY
School 1: Probabilities are impractical! Humans don't think in terms of probability calculus anyway!
School 2: Probabilities are mathematically sound and are the right way!
Historical development of this debate: School 2 eventually prevailed with Bayesian networks; see e.g. Russell & Norvig 2003 (Artificial Intelligence: A Modern Approach), Bratko 2001 (Prolog Programming for Artificial Intelligence).