
© C. Kemke Inexact Reasoning 1 COMP 4200: Expert Systems Dr. Christel Kemke Department of Computer Science University of Manitoba

© C. Kemke Inexact Reasoning 2 Inexact Reasoning
References:
• Jackson, Chapter 19, Truth Maintenance Systems
• Giarratano and Riley, Chapters 4 and 5
• Luger and Stubblefield, 'Artificial Intelligence', Addison-Wesley, 2002, Chapter 7

© C. Kemke Inexact Reasoning 3 Knowledge & Inexact Reasoning
• inexact knowledge (truth of P not clear)
• incomplete knowledge (lack of knowledge about P)
• defaults, beliefs (assumption about the truth of P)
• contradictory knowledge (P true and false)
• vague knowledge (truth of P not 0/1)

© C. Kemke Inexact Reasoning 4 Inexact Reasoning
• CF Theory - uncertainty: uncertainty about facts and conclusions
• Fuzzy - vagueness: truth not 0 or 1 but graded (membership function)
• Truth Maintenance - beliefs, defaults: assumptions about facts, can be revised
• Probability Theory - likelihood of events: statistical model of knowledge

© C. Kemke Inexact Reasoning 5 Inexact Reasoning not necessary...
Inexact reasoning is NOT necessary when assuming:
• complete knowledge about the "world"
• no contradictory facts or rules
• everything is either true or false
This corresponds formally to a complete, consistent theory in First-Order Logic, i.e.
• everything you have to model is contained in the theory, i.e. your theory or domain model is complete
• facts are true or false (assuming your rules are true)
• your sets of facts and rules contain no contradiction (are consistent)

© C. Kemke Inexact Reasoning 6 Exact Reasoning: Theories in First-Order Predicate Logic
The theory (Knowledge Base) is given as a set of well-formed formulae. Formulae include facts like mother(Mary, Peter) and rules like mother(x, y) → child(y, x).
Reasoning is based on applying rules of inference of first-order predicate logic, like Modus Ponens: if p and p → q are given, then q can be inferred (proven):
p, p → q ⊢ q
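The Modus Ponens step above can be sketched as a tiny forward chainer. The facts and the mother/child rule come from the slide; the tuple encoding and helper names are illustrative choices, not CLIPS or Prolog syntax:

```python
# Minimal forward-chaining sketch of Modus Ponens: from p and p -> q, infer q.
# Facts are tuples; a rule maps a matching fact to its conclusion.

facts = {("mother", "Mary", "Peter")}

def mother_rule(fact):
    # mother(x, y) -> child(y, x)
    if fact[0] == "mother":
        _, x, y = fact
        return ("child", y, x)
    return None

def forward_chain(facts, rules):
    """Apply every rule to every fact until no new facts are derived."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for fact in list(derived):
            for rule in rules:
                conclusion = rule(fact)
                if conclusion is not None and conclusion not in derived:
                    derived.add(conclusion)
                    changed = True
    return derived

result = forward_chain(facts, [mother_rule])
print(result)  # contains ("child", "Peter", "Mary")
```

Running the chainer derives child(Peter, Mary) from mother(Mary, Peter) by one application of the rule, which is exactly the Modus Ponens inference p, p → q ⊢ q.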

© C. Kemke Inexact Reasoning 7 Forms of Inexact Knowledge
• uncertainty (truth not clear)
  - probabilistic models, multi-valued logic (true, false, don't know, ...), certainty factor theory
• incomplete knowledge (lack of knowledge)
  - P true or false not known (→ defaults)
• defaults, beliefs (assumptions about truth)
  - assume P is true, as long as there is no counter-evidence (i.e. no evidence that ¬P is true)
  - assume P is true with a Certainty Factor
• contradictory knowledge (true and false)
  - inconsistent fact base; somehow P and ¬P are both true
• vague knowledge (truth value not 0/1; not crisp sets)
  - graded truth; fuzzy sets

© C. Kemke Inexact Reasoning 8 Inexact Knowledge - Example
Person A walks on Campus towards the bus stop. A few hundred yards away A sees someone and is quite sure that it's his next-door neighbor B, who usually goes by car to the University. A screams B's name.
Q: Which forms of inexact knowledge and reasoning are involved here?
• default - A wants to take a bus
• belief, (un)certainty - it's the neighbor B
• probability, default, uncertainty - the neighbor goes home by car
• default - A wants to get a lift
• default - A wants to go home

© C. Kemke Inexact Reasoning 9 Examples of Inexact Knowledge
Person A walks on Campus towards the bus stop. A few hundred yards away A sees someone and is quite sure that it's his next-door neighbor B, who usually goes by car to the University. A screams B's name.
• Fuzzy - "a few hundred yards": define a mapping from "#hundreds" to 'few', 'many', ...; not uncertain or incomplete but graded, vague
• Probabilistic - the neighbor usually goes by car: probability based on a measure of how often he takes the car; always satisfies p(F) = 1 - p(¬F)
• Belief - it's his next-door neighbor B: "reasoned assumption", assumed to be true
• Default - A wants to take a bus: assumption based on commonsense knowledge

© C. Kemke Inexact Reasoning 10 Dealing with Inexact Knowledge
Methods for representing and handling:
1. incomplete knowledge: defaults, beliefs → Truth Maintenance Systems (TMS); non-monotonic reasoning
2. contradictory knowledge: contradictory facts or different conclusions, based on defaults or beliefs → TMS, Certainty Factors, ..., multi-valued logics
3. uncertain knowledge: hypotheses, statistics → Certainty Factors, Probability Theory
4. vague knowledge: "graded" truth → Fuzzy sets, rough sets
5. inexact knowledge and reasoning: involves 1-4; a clear 0/1 truth value cannot be assigned

© C. Kemke Inexact Reasoning 11 Truth Maintenance Systems

© C. Kemke Inexact Reasoning 12 Truth Maintenance  Necessary when changes in the fact-base lead to inconsistency / incorrectness among the facts  non-monotonic reasoning  A Truth Maintenance System tries to adjust the Knowledge Base or Fact Base upon changes to keep it consistent and correct.  A TMS uses dependencies among facts to keep track of conclusions and allow revision / retraction of facts and conclusions.

© C. Kemke Inexact Reasoning 13 Non-monotonic Reasoning non-monotonic reasoning  The set of currently valid (believed) facts does NOT increase monotonically.  Adding a new fact might lead to an inconsistency which requires the removal of one of the contradictory facts.  Thus, the set of true (or: believed as true) facts can shrink and grow with reasoning.  This is why it’s called “non-monotonic reasoning”.  In classical logic (first-order predicate logic) this does not happen. Once a fact is asserted, it’s forever true.

© C. Kemke Inexact Reasoning 14 Non-monotonic Reasoning - Example
Example: non-monotonic reasoning
You are a student, it's 8am, you are in bed. You slip out of your dreams and think: Today is Sunday. No classes today. I don't have to get up. You go back to sleep. You wake up again. It's 9:30am now and it is slowly coming to your mind: Today is Tuesday. What an unpleasant surprise.
P1 = today-is-Tuesday    P2 = today-is-Sunday
P3 = have-class-at-10am  P4 = no-classes
P5 = have-to-get-up      P6 = can-stay-in-bed

© C. Kemke Inexact Reasoning 15 P1  P3  P5 P2  P4  P6 assume P2; conclude  P1 ; P4 ;  P3 ; P6 ;  P5 assume P1; conclude  P2 ; P3 ;  P4 ; P5 ;  P6 Non-monotonic Reasoning - Example P1 = today-is-TuesdayP2 = today-is-Sunday P3 = have-class-at-10amP4 = no-classes P5 = have-to-get-up P6 = can-stay-in-bed Assume: P1 and P2, P3 and P4, P5 and P6 are mutually exclusive, i.e. P1   P2, P3   P4, P5   P6 

© C. Kemke Inexact Reasoning 16 Truth Maintenance Theories
• TMSs are often based on dependency-directed backtracking to the point in reasoning where a wrong assumption was used.
• McAllester (1978, 1980): "propositional constraint propagation" employs a dependency network which reflects the justification of conclusions of new facts.
• Doyle (1979): justification-based Truth Maintenance System.

© C. Kemke Inexact Reasoning 17 Truth Maintenance Theories - McAllester
McAllester "propositional constraint propagation":
• a network representing conclusions, where
• proposition-nodes are connected if one of the nodes is a reason for concluding the other node.
Example: ¬p ∨ q (i.e. p → q). If p is known to be true, q can be concluded. Connections from p and p → q to q mean that p and p → q are reasons to conclude q.

© C. Kemke Inexact Reasoning 18 Truth Maintenance Theories - McAllester
McAllester (1980): proposition-nodes are connected if one of the nodes is a reason for concluding the other node (simplified version).
Example: Connections from p and p → q to a combination node, and from there to q, represent the justification for q.

© C. Kemke Inexact Reasoning 19 Truth Maintenance Theories - Doyle
Doyle (1979) deals with beliefs as justified assumptions. As long as there is no counter-evidence for a fact (belief), we can assume that it is true.
IN_P: facts which support P; OUT_P: facts which prevent P. Distinguishes:
• Premises - always true (IN_P = OUT_P = ∅)
• Deductions - derived (IN_P ≠ ∅; OUT_P = ∅)
• Assumptions - depend on the absence of beliefs (OUT_P ≠ ∅)

© C. Kemke Inexact Reasoning 20 Truth Maintenance Theories - Doyle Doyle (1979) As long as there is no contra-evidence for a fact (belief) we can assume that it is true. Theory is based on the concept of Support-Lists (SL). A Support-List of a Fact (Belief) P specifies Facts (Beliefs) which support the conclusion of the Fact P or prevent its conclusion. The TMS maintains and updates the set of current Facts/Beliefs if changes occur. Uses justification networks, similar to McAllester’s dependency networks.

© C. Kemke Inexact Reasoning 21 Truth Maintenance in CLIPS 1
logical
• establishes a logical connection between the condition part and the action part of a rule
• if the logical part of the condition is no longer true, the consequent fact asserted in the action part is retracted
When fire-present is true, alarm-on can be concluded. When fire-present is retracted, alarm-on will also be retracted.
(defrule fire-reaction
  (logical (fire-present))
  =>
  (assert (alarm-on)))

© C. Kemke Inexact Reasoning 22 Truth Maintenance in CLIPS 2
Dependencies
• (dependents <fact-index>) prints all current facts which depend on the indexed fact (are concluded from that fact)
• (dependencies <fact-index>) prints all current facts on which the indexed fact depends (from which the indexed fact was concluded)
For the example rule above: the dependents of fire-present are alarm-on; the dependencies of fire-present are none.

© C. Kemke Inexact Reasoning 23 Certainty Factor Theory

© C. Kemke Inexact Reasoning 24 Certainty Factor Theory
• Certainty Factor CF of Hypothesis H
  - ranges between -1 (denial of H) and +1 (confirmation of H)
  - allows the ranking of hypotheses
• Based on measures of belief MB and disbelief MD
• MB expresses the belief that H is true
• MD expresses the belief that H is not true
• MB is not 1-MD - CFs do not behave like probabilities
• Experts determine values for MB, MD of H based on given evidence E → subjective

© C. Kemke Inexact Reasoning 25 Stanford Certainty Factor Theory
• The Certainty Factor CF of Hypothesis H is based on the difference between the Measure of Belief MB and the Measure of Disbelief MD in hypothesis H, given evidence E.
• Certainty Factor of hypothesis H given evidence E:
  CF(H|E) = MB(H|E) - MD(H|E), with -1 ≤ CF(H|E) ≤ 1
• Can integrate different experts' assessments.
• Basis to combine support/rejection for H within one rule and across different rules.

© C. Kemke Inexact Reasoning 26 Stanford Certainty Factor Theory
• Remember the base rule for the Certainty Factor CF(H|E):
  CF(H|E) = MB(H|E) - MD(H|E), with -1 ≤ CF(H|E) ≤ 1
• Integrate Certainty Factors into reasoning.
• CF-value for H calculated from the CFs of the premises P in a rule:
  CF(H) = CF(P1 and P2) = min(CF(P1), CF(P2))
  CF(H) = CF(P1 or P2) = max(CF(P1), CF(P2))
• CF-value for H combined from different rules, experts, ...:
  CF(H) = CF1 + CF2 - CF1·CF2   if both CF1, CF2 > 0
  CF(H) = CF1 + CF2 + CF1·CF2   if both CF1, CF2 < 0
  CF(H) = (CF1 + CF2) / (1 - min(|CF1|, |CF2|))   otherwise
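The combination rules above translate directly into code. This is a sketch of the standard MYCIN-style formulas as given on the slide; the function names are illustrative:

```python
# Certainty-factor combination: min/max for premises within one rule, and the
# parallel-combination formula for evidence about the same hypothesis coming
# from different rules or experts.

def cf_and(cf1, cf2):
    """CF of a conjunction of premises."""
    return min(cf1, cf2)

def cf_or(cf1, cf2):
    """CF of a disjunction of premises."""
    return max(cf1, cf2)

def cf_combine(cf1, cf2):
    """Combine two CFs for the same hypothesis H from different rules."""
    if cf1 > 0 and cf2 > 0:
        return cf1 + cf2 - cf1 * cf2
    if cf1 < 0 and cf2 < 0:
        return cf1 + cf2 + cf1 * cf2
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

print(cf_combine(0.6, 0.4))    # 0.76: two supporting rules reinforce each other
print(cf_combine(-0.5, -0.5))  # -0.75: two denials reinforce each other
print(cf_combine(0.6, -0.4))   # conflicting evidence is dampened toward 0
```

Note that the result always stays in [-1, 1], and combining with CF = 0 (no evidence) leaves the other CF unchanged.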

© C. Kemke Inexact Reasoning 27 Characteristics of Certainty Factors

Aspect            (Believed) Probability   MB   MD   CF
Certainly true    P(H|E) = 1                1    0    1
Certainly false   P(¬H|E) = 1               0    1   -1
No evidence       P(H|E) = P(H)             0    0    0

Ranges: measure of belief 0 ≤ MB ≤ 1; measure of disbelief 0 ≤ MD ≤ 1; certainty factor -1 ≤ CF ≤ +1

© C. Kemke Inexact Reasoning 28 Probability Theory

© C. Kemke Inexact Reasoning 29 Basics of Probability Theory
• mathematical approach to processing uncertain information
• sample space (event set): S = {x1, x2, ..., xn}
  - the collection of all possible events
• probability p(xi) is the likelihood that the event xi ∈ S occurs
  - non-negative values in [0, 1]
  - the total probability of the sample space is 1: Σ p(xi) = 1, summed over all xi ∈ S
• experimental probability
  - based on the frequency of events
• subjective probability (CF theories, like Dempster-Shafer, ...)
  - based on expert assessment

© C. Kemke Inexact Reasoning 30 Compound Probabilities
• for independent events
  - events that do not affect each other in any way
  - example: cards and the events "hearts" and "queen"
• joint probability of independent events A and B:
  P(A ∩ B) = |A ∩ B| / |S| = P(A) * P(B), where |S| is the number of elements in S
• union probability of independent events A and B:
  P(A ∪ B) = P(A) + P(B) - P(A ∩ B) = P(A) + P(B) - P(A) * P(B)
  The situation in which either event occurs. Subtract the probability of their co-occurrence: P(A ∩ B) is already included in P(A) + P(B) and would otherwise be counted twice.
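The card example can be checked by enumerating the 52-card sample space. This sketch uses exact fractions to show that the product and inclusion-exclusion formulas hold:

```python
# Verify the independence formulas on the slide's card example:
# A = "hearts", B = "queen" over a standard 52-card deck.
from fractions import Fraction

ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["hearts", "diamonds", "clubs", "spades"]
deck = [(r, s) for r in ranks for s in suits]  # the sample space S, |S| = 52

hearts = {c for c in deck if c[1] == "hearts"}  # |A| = 13
queens = {c for c in deck if c[0] == "Q"}       # |B| = 4

def p(event):
    return Fraction(len(event), len(deck))

# Joint probability: P(A ∩ B) = |A ∩ B| / |S| = P(A) * P(B) for independent events
print(p(hearts & queens), p(hearts) * p(queens))       # both 1/52
# Union probability: P(A ∪ B) = P(A) + P(B) - P(A) * P(B)
print(p(hearts | queens))                              # 16/52 = 4/13
```

The queen of hearts is the single card in A ∩ B, so the product rule holds exactly: (13/52)·(4/52) = 1/52.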

© C. Kemke Inexact Reasoning 31 Compound Probabilities
• for mutually exclusive events
  - events that cannot occur together at the same time
  - examples: one die and the events "1" and "6"; one coin and the events "heads" and "tails"
• joint probability of two mutually exclusive events A and B:
  P(A ∩ B) = 0
  A single die throw cannot show both "1" and "6".
• union probability of two mutually exclusive events A and B:
  P(A ∪ B) = P(A) + P(B)
  A single coin toss shows either "heads" or "tails". This is also called "special addition".

© C. Kemke Inexact Reasoning 32 Conditional Probabilities
• describe dependent events, which affect each other in some way
• example: throw a die twice; the second throw has to give a larger value than the first throw
• conditional probability of event A given that event B has already occurred:
  P(A|B) = P(A ∩ B) / P(B)
• example: B = throw(x); A = throw(y) with y > x → see next slide

© C. Kemke Inexact Reasoning 33 Conditional Probabilities
• Example: B = throw(x); A = throw(y) with y > x
• P(A|B) = P(second throw y beats the first throw x) = P(A ∩ B) / P(B)
• P(A ∩ B) = P(throw x) · P(throw y, y > x) = 1/6 · (1/6 · (6 - x))
  - If x = 5 then P(A ∩ B) = 1/6 · 1/6 · (6 - 5) = 1/36
  - If x = 1 then P(A ∩ B) = 1/6 · 1/6 · 5 = 5/36
• P(B) = P(throw x) = 1/6
• P(A|B) = P(A ∩ B) / P(B)
  - If x = 1 then P(A|B) = (5/36) · 6 = 5/6 ≈ 0.83
  - If x = 5 then P(A|B) = (1/36) · 6 = 1/6 ≈ 0.17
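The dice example can be verified by brute force over the 36 equally likely pairs of throws; this sketch computes P(A|B) by counting outcomes:

```python
# Brute-force check of the conditional-probability dice example:
# P(second throw beats the first | first throw was x) = (6 - x) / 6.
from fractions import Fraction

# All 36 equally likely (first, second) throw pairs.
outcomes = [(x, y) for x in range(1, 7) for y in range(1, 7)]

def p_second_beats_first_given(x):
    b = [o for o in outcomes if o[0] == x]       # B: first throw is x
    a_and_b = [o for o in b if o[1] > o[0]]      # A ∩ B: second throw beats first
    return Fraction(len(a_and_b), len(b))        # P(A|B) = P(A ∩ B) / P(B)

print(p_second_beats_first_given(1))  # 5/6
print(p_second_beats_first_given(5))  # 1/6
```

Counting within the restricted sample space B gives the same values as the formula on the slide, since P(A ∩ B)/P(B) = |A ∩ B|/|B| when all outcomes are equally likely.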

© C. Kemke Inexact Reasoning 34 Bayesian Approaches  derive the probability of a cause given a symptom  has gained importance recently due to advances in efficiency  more computational power available  better methods  especially useful in diagnostic systems  medicine, computer help systems  inverse or a posteriori probability  inverse to conditional probability of an earlier event given that a later one occurred

© C. Kemke Inexact Reasoning 35 Bayes' Rule for a Single Event
• single hypothesis H, single event E
• P(H|E) = P(E|H) · P(H) / P(E), or
• P(H|E) = P(E|H) · P(H) / (P(E|H) · P(H) + P(E|¬H) · P(¬H))

© C. Kemke Inexact Reasoning 36 Example

© C. Kemke Inexact Reasoning 37 Fred and the Cookie Bowls
• Suppose there are two bowls full of cookies.
• Bowl #1 has 10 chocolate chip cookies and 30 plain cookies, while bowl #2 has 20 of each.
• Fred picks a bowl at random, and then picks a cookie at random. We may assume there is no reason to believe Fred treats one bowl differently from another, and likewise for the cookies.
• The cookie turns out to be a plain one.
• How probable is it that Fred picked it out of bowl #1?

© C. Kemke Inexact Reasoning 38 The Cookie Bowl Problem
"What's the probability that Fred picked bowl #1, given that he has a plain cookie?"
• Event A is that Fred picked bowl #1.
• Event B is that Fred picked a plain cookie.
• Compute P(A|B). We need:
• P(A) - the probability that Fred picked bowl #1 regardless of any other information. Since Fred is treating both bowls equally, it is 0.5.
• P(B) - the probability of getting a plain cookie regardless of any information on the bowls. It is computed as the sum, over the bowls, of the probability of getting a plain cookie from a bowl multiplied by the probability of selecting that bowl. We know that the probability of getting a plain cookie from bowl #1 is 0.75, and the probability of getting one from bowl #2 is 0.5. Since Fred is treating both bowls equally, the probability of selecting either bowl is 0.5 (see next slide). Thus, the probability of getting a plain cookie overall is 0.75×0.5 + 0.5×0.5 = 0.625.
• P(B|A) - the probability of getting a plain cookie given that Fred has selected bowl #1. From the problem statement, we know this is 0.75, since 30 out of 40 cookies in bowl #1 are plain.

© C. Kemke Inexact Reasoning 39 The Cookie Bowls

Number of cookies in each bowl, by type:
            Bowl #1   Bowl #2   Totals
Chocolate      10        20        30
Plain          30        20        50
Total          40        40        80

Relative frequency of cookies in each bowl, by type (each entry of the left table divided by the total, 80):
            Bowl #1   Bowl #2   Totals
Chocolate    0.125      0.25     0.375
Plain        0.375      0.25     0.625
Total        0.5        0.5      1.0

© C. Kemke Inexact Reasoning 40 Fred and the Cookie Bowl
• Given all this information, we can compute the probability of Fred having selected bowl #1 (event A) given that he got a plain cookie (event B), as:
  P(A|B) = P(B|A) · P(A) / P(B) = (0.75 × 0.5) / 0.625 = 0.6
• As we expected, it is more than half.
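The whole computation can be reproduced in a few lines. The `bayes` helper below is an illustrative implementation of the single-event rule from the earlier slide, with the denominator expanded by total probability:

```python
# The cookie-bowl computation, using Bayes' rule:
# P(H|E) = P(E|H)P(H) / (P(E|H)P(H) + P(E|not H)P(not H)).

def bayes(p_e_given_h, p_h, p_e_given_not_h):
    """Posterior P(H|E) for a single hypothesis H and single event E."""
    p_not_h = 1 - p_h
    return (p_e_given_h * p_h) / (p_e_given_h * p_h + p_e_given_not_h * p_not_h)

p_bowl1 = 0.5                  # Fred picks either bowl with equal probability
p_plain_given_bowl1 = 30 / 40  # 0.75: 30 of 40 cookies in bowl #1 are plain
p_plain_given_bowl2 = 20 / 40  # 0.5:  20 of 40 cookies in bowl #2 are plain

# Total probability of drawing a plain cookie at all.
p_plain = p_plain_given_bowl1 * p_bowl1 + p_plain_given_bowl2 * (1 - p_bowl1)
print(p_plain)  # 0.625

# Posterior probability that the plain cookie came from bowl #1.
print(bayes(p_plain_given_bowl1, p_bowl1, p_plain_given_bowl2))  # 0.6
```

The posterior 0.6 matches the slide: seeing a plain cookie shifts belief toward bowl #1, which holds a larger share of the plain cookies.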

© C. Kemke Inexact Reasoning 41 Fuzzy Set Theory

© C. Kemke Inexact Reasoning 42 Fuzzy Set Theory (Zadeh)
Aims to model and formalize "vague" Natural Language terms and expressions. Example: Peter is relatively tall.
Define a set of fuzzy sets (predicates or categories), like tall and small. Each fuzzy subset has an associated membership function mapping (exact) domain values to a (graded) membership value.
tall would be one fuzzy subset, defined by such a function which takes a height (e.g. in inches) as input and determines the fuzzy membership value (between 0 and 1) for tall as output; small would be defined analogously.

© C. Kemke Inexact Reasoning 43 Fuzzy Set Membership Function
If Peter is 6' tall, and the fuzzy membership value of tall for 6' is 0.9, then Peter is quite tall.
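A membership function like the one described above can be written as a simple piecewise-linear curve. The breakpoints below (5'3" maps to 0, 6'1" maps to 1) are assumed values, chosen only so that tall(6') = 0.9 matches the slide's example:

```python
# Piecewise-linear membership function for the fuzzy set "tall".
# The breakpoints are illustrative assumptions, not values from the lecture.

def mu_tall(height_inches):
    """Degree of membership in the fuzzy set 'tall', in [0, 1]."""
    lo, hi = 63.0, 73.0  # assumed: fully "not tall" at 5'3", fully "tall" at 6'1"
    if height_inches <= lo:
        return 0.0
    if height_inches >= hi:
        return 1.0
    return (height_inches - lo) / (hi - lo)  # linear ramp between breakpoints

print(mu_tall(72))  # 0.9 -- Peter at 6' is quite tall
print(mu_tall(60))  # 0.0
print(mu_tall(78))  # 1.0
```

Unlike a crisp set, membership here is graded: a 5'9" person is tall to degree 0.6 rather than simply "tall" or "not tall".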

© C. Kemke Inexact Reasoning 44 Review Inexact Reasoning
• uncertain reasoning - uncertainty about facts and/or rules - CF Theory
• vagueness - truth not 0 or 1 - Fuzzy sets and Fuzzy logic
• beliefs, defaults - assumptions about truth, can be revised - non-monotonic reasoning, Truth Maintenance Systems
• likelihood of events - statistical model of knowledge - Probability Theory

© C. Kemke Inexact Reasoning 45 Other Forms of Representing and Reasoning with Inexact Knowledge
• Logics
  - explicit modeling of Belief and Knows operators in Modal Logic or Autoepistemic Logic
• Probabilistic Reasoning
  - Bayes' Theory
  - Dempster-Shafer Theory
