CSNB234 ARTIFICIAL INTELLIGENCE

CSNB234 ARTIFICIAL INTELLIGENCE
Chapter 8.2 Certainty Factors (CF)
Instructor: Alicia Tang Y. C.
UNIVERSITI TENAGA NASIONAL

Uncertainty: Introduction

In expert systems, we must often attempt to draw correct conclusions from poorly formed and uncertain evidence using unsound inference rules. This is not an impossible task; we do it successfully in almost every aspect of our daily lives.

Uncertainty: Introduction

Doctors deliver correct medical treatment for ambiguous symptoms; we understand natural language statements that are incomplete or ambiguous; and so on. One characteristic of the information available to human experts is its imperfection: information can be incomplete, inconsistent, or uncertain. Nevertheless, we are good at drawing valid conclusions from such information.

So, how do we define the term "uncertainty"?

Uncertainty can be defined as the lack of the exact knowledge that would enable us to reach a perfectly reliable conclusion. This is because the information available to us can be imperfect: inconsistent, incomplete, or unsure, or all three. Examples: unknown data, or imprecise language.

There are many approaches to representing uncertainty in AI.

Uncertainty Handling Methods

Abductive reasoning
Property inheritance
Fuzzy logic
Certainty Factor (CF)
Bayes theorem
Dempster-Shafer theory

E.g. suppose the rule: IF x is a bird THEN x flies. Abductive reasoning would say that "all flying things are birds". By property inheritance, "all birds can fly"; but remember the case that a penguin cannot fly?

Evaluation Criteria for Uncertainty Handling Methods

Expressive power
Logical correctness
Computational efficiency of inference

Schemes Used by Expert Systems in Handling Uncertainty

MYCIN uses Certainty Factors. The CF can be used to rank hypotheses in order of importance. For example, if a patient has certain symptoms that suggest several possible diseases, then the disease with the highest CF would be the one investigated first.

REVEAL uses fuzzy logic.
PROSPECTOR uses Bayes theorem.
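The ranking idea can be sketched in a few lines of Python. This is only an illustration: the disease names and CF values below are made-up, not MYCIN data.

```python
# Hypothetical certainty factors attached to competing diagnoses.
cf = {"flu": 0.7, "cold": 0.4, "measles": 0.1}

# Investigate hypotheses in order of decreasing CF.
ranked = sorted(cf, key=cf.get, reverse=True)
print(ranked[0])  # prints the top-ranked hypothesis, here "flu"
```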

Certainty Factors: Purpose, Design Element Semantics, and Formulas

Certainty Factor (CF)

When experts put together the rule base, they must agree on a CF to go with each rule. This CF reflects their confidence in the rule's reliability. Certainty measures may be adjusted to tune the system's overall performance, although slight variations in this confidence measure tend to have little effect on the overall running of the system.

Certainty Factor

Certainty factors measure the confidence that is placed on a conclusion based on the evidence known so far. A certainty factor is the difference between the following two components:

CF = MB[h:e] - MD[h:e]

A positive CF means the evidence supports the hypothesis, since MB > MD.

Certainty Factor Computation

CF[h:e] = MB[h:e] - MD[h:e] .............. (I)

CF[h:e] is the certainty of a hypothesis h given the evidence e.
MB[h:e] is the measure of belief in h given e.
MD[h:e] is the measure of disbelief in h given e.

CFs can range from -1 (completely false) to +1 (completely true), with fractional values in between and zero representing ignorance. MBs and MDs can range only from 0 to 1.

More equations for CF computation use:

MB(P1 AND P2) = MIN(MB(P1), MB(P2)) ....... (II)
MB(P1 OR P2) = MAX(MB(P1), MB(P2)) ........ (III)

The MB in the negation of a fact can be derived as:

MB(NOT P1) = 1 - MB(P1) ................... (IV)

Each rule can have a credibility (attenuation): a number from 0 to 1 which indicates its reliability. The credibility is multiplied by the MB of the rule's conditions to give the MB of its conclusion:

MB(conclusion) = MB(conditions) * credibility ..... (V)

and belief in the same hypothesis from two pieces of evidence is combined incrementally:

MB[h:e1,e2] = MB[h:e1] + MB[h:e2] * (1 - MB[h:e1]) ..... (VI)
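Formulas (II) to (VI) translate directly into small Python helpers. This is a sketch; the function names are mine, not from the slides.

```python
def mb_and(mb1, mb2):    # (II): belief in a conjunction of conditions
    return min(mb1, mb2)

def mb_or(mb1, mb2):     # (III): belief in a disjunction of conditions
    return max(mb1, mb2)

def mb_not(mb):          # (IV): belief in the negation of a fact
    return 1 - mb

def mb_rule(mb_conditions, credibility):
    # (V): attenuate the conditions' MB by the rule's credibility
    return mb_conditions * credibility

def mb_combine(mb1, mb2):
    # (VI): combine belief from two independent pieces of evidence;
    # the result stays in [0, 1] and the operation is commutative.
    return mb1 + mb2 * (1 - mb1)
```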

A CF Calculation Example

The set of three rules for deducing the chances that "KL people will vote for BN":

Rule 1: IF X drives a Myvi AND X reads the Berita Harian THEN X will vote Barisan Nasional
Rule 2: IF X loves the Setia song OR X supports Vision 2020 THEN X will vote Barisan Nasional
Rule 3: IF X uses unleaded petrol OR X does not support Vision 2020 THEN X will not vote Barisan Nasional

Let us assume that the individual MBs for the conditions are as follows:

X drives a Myvi 0.9
X reads the Berita Harian 0.7
X loves the Setia song 0.8
X supports Vision 2020 0.6
X uses unleaded petrol 0.7

While the credibility of each rule is as follows:

Rule 1: 0.7
Rule 2: 0.8
Rule 3: 0.6

The hypothesis (i.e. what we want to test):

To determine: CF[X votes BN: Rule 1, Rule 2, Rule 3]

Rules 1 and 2 give the MB in the proposition "X votes BN", while Rule 3 gives the MD (its conclusion is that X will not vote BN):

MB[X votes BN: Rule 1] = MIN(0.9, 0.7) * 0.7 = 0.49      -- using (II) and (V)
MB[X votes BN: Rule 2] = MAX(0.8, 0.6) * 0.8 = 0.64      -- using (III) and (V)
MD[X votes BN: Rule 3] = MAX(0.7, 1 - 0.6) * 0.6 = 0.42  -- using (III), (IV) and (V)

Combining Rule 1 and Rule 2:

MB[X votes BN: Rule 1, Rule 2]
  = MB[X votes BN: Rule 1] + MB[X votes BN: Rule 2] * (1 - MB[X votes BN: Rule 1])  -- using (VI)
  = 0.49 + 0.64 * (1 - 0.49)
  = 0.82

Combining the three rules:

CF[X votes BN: Rule 1, Rule 2, Rule 3]
  = MB[X votes BN: Rule 1, Rule 2] - MD[X votes BN: Rule 3]
  = 0.82 - 0.42
  = 0.4

(The MD here reads "I disbelieve you will vote", i.e. "I believe you won't vote".)

After we obtain the CF for the hypothesis, what do you think is the answer to the question: "Will someone in KL vote for the BN party?"
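As a sanity check, the whole calculation can be reproduced in a few lines of Python. A sketch: the variable names are mine, and note that the slides' 0.82 is the rounded value of 0.8164, so the exact final CF is 0.3964 rather than 0.4.

```python
# Individual MBs for the conditions (values from the slides).
myvi, paper, song, vision, petrol = 0.9, 0.7, 0.8, 0.6, 0.7
cred1, cred2, cred3 = 0.7, 0.8, 0.6   # rule credibilities

mb_rule1 = min(myvi, paper) * cred1          # (II) + (V): 0.49
mb_rule2 = max(song, vision) * cred2         # (III) + (V): 0.64
md_rule3 = max(petrol, 1 - vision) * cred3   # (III) + (IV) + (V): 0.42

mb_12 = mb_rule1 + mb_rule2 * (1 - mb_rule1)  # (VI): 0.8164, shown as 0.82
cf = mb_12 - md_rule3                         # 0.3964, i.e. about 0.4
```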

In an expert system that implements uncertainty handling, the answer is "maybe" (and not a "yes" or a "no"). Isn't that exactly the way you and I would say it!

Certainty Factors have been criticised as excessively ad hoc: the semantics of the certainty value can be subjective and relative. But the human expert's confidence in his reasoning is also approximate, heuristic and informal.

Advantages of the CF scheme:

It is a simple computational model that permits experts to estimate their confidence in conclusions.
It permits the expression of belief and disbelief in each hypothesis (so combining multiple sources of evidence is allowed).
Gathering CF values is easier than gathering the data required by other methods.

Bayesian Approach (I)

The Bayesian approach (or Bayes theorem) is based on formal probability theory. It provides a way of computing the probability of a hypothesis (without sampling) following from a particular piece of evidence, given only the probabilities with which the evidence follows from actual causes.

To use this approach, reliable statistical data that define the prior probabilities for each hypothesis must be available. As these requirements are rarely satisfied in real-world problems, only a few systems have been built based on Bayesian reasoning.

Bayesian approach (II)

p(Hi | E) = p(E | Hi) * p(Hi) / [ Σ (k=1..n) p(E | Hk) * p(Hk) ]

where E is the evidence and Hi a hypothesis. As you can see, this requires a number of assumptions (e.g. independence of evidence) which cannot be made for many applications (such as medical cases).

Bayes theorem (III)

where:
p(Hi | E) is the probability that Hi is true given evidence E.
p(Hi) is the probability that Hi is true overall.
p(E | Hi) is the probability of observing evidence E when Hi is true.
n is the number of possible hypotheses.

Example: "You will get an 'A' if you study every night for one week before the exam." Here p(E | Hi) is estimated from those who obtained an 'A' and indeed studied every night before the exam. If there are not many cases of people who obtained an 'A' by studying hard, then your chances of getting an 'A' through hard work are also lower!
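The formula can be evaluated directly. The numbers below are made-up illustrations for two competing hypotheses, not data from the slides:

```python
# Priors p(Hi) and likelihoods p(E | Hi) for two hypotheses H1, H2.
p_h = [0.3, 0.7]
p_e_given_h = [0.8, 0.1]

# Denominator: total probability of the evidence over all hypotheses.
p_e = sum(pe * ph for pe, ph in zip(p_e_given_h, p_h))

# Posterior p(Hi | E) by Bayes theorem; the posteriors sum to 1.
posterior = [pe * ph / p_e for pe, ph in zip(p_e_given_h, p_h)]
```

Here H1 has the lower prior (0.3), but the evidence favours it strongly, so its posterior (0.24/0.31, about 0.77) overtakes H2's.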

Advantages:

The most significant is the sound theoretical foundation in probability theory.
It is the most mature of the uncertainty reasoning methods.
It has well-defined semantics for decision making.

Main disadvantage:

It requires a significant amount of probability data to construct a knowledge base.

Dempster-Shafer theory (Dempster 1967; Shafer 1976)

This theory was designed as a mathematical theory of evidence, where a value between 0 and 1 is assigned to some fact as its degree of support. It is similar to the Bayesian method but more general, as the belief in a fact and its negation need not sum to 1. Both values can be zero (reflecting that no information is available to make a judgement).

Dempster-Shafer theory

It has a belief function, Bel(x). The belief function measures the likelihood that the evidence supports x. It is also used to compute the probability that the evidence supports a proposition.
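A minimal sketch of how two independent pieces of evidence for the same proposition x combine under Dempster's rule of combination. Each source is modelled as a "simple support function" assigning mass s to {x} and the remaining 1 - s to the whole frame (ignorance); the numbers 0.6 and 0.5 are assumed for illustration:

```python
def combine_support(s1, s2):
    # Dempster's rule for two simple support functions that both
    # support the same proposition x: there is no conflicting mass,
    # so the combined mass (and belief) on {x} is 1 - (1-s1)*(1-s2).
    return 1 - (1 - s1) * (1 - s2)

bel_x = combine_support(0.6, 0.5)   # Bel(x) = 0.8
# Bel(not x) remains 0: belief in x and its negation need not sum to 1.
```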

Reasoning from first principles

It is normally supported by having a system's structural and behavioural properties described declaratively. Model-based diagnosis is an example of a system that reasons from first principles.