AI – CS364 Uncertainty Management: Introduction to Uncertainty Management, 21st September 2006, Dr Bogdan L. Vrusias


Contents: Defining Uncertainty; Basic probability theory; Bayesian reasoning; Bias of the Bayesian method; Certainty factors theory and evidential reasoning.

Defining Uncertainty: Information can be incomplete, inconsistent, uncertain, or all three. In other words, information is often unsuitable for solving a problem. Uncertainty is defined as the lack of the exact knowledge that would enable us to reach a perfectly reliable conclusion. Classical logic permits only exact reasoning. It assumes that perfect knowledge always exists and the law of the excluded middle can always be applied: IF A is true THEN A is not false; IF A is false THEN A is not true.

Sources of Uncertain Knowledge: Weak implications. Domain experts and knowledge engineers have the painful task of establishing concrete correlations between the IF (condition) and THEN (action) parts of the rules. Therefore, expert systems need the ability to handle vague associations, for example by accepting the degree of correlation as a numerical certainty factor.

Sources of Uncertain Knowledge: Imprecise language. Our natural language is ambiguous and imprecise. We describe facts with such terms as often and sometimes, frequently and hardly ever. As a result, it can be difficult to express knowledge in the precise IF-THEN form of production rules. However, if the meaning of the facts is quantified, it can be used in expert systems. In 1944, Ray Simpson asked 355 high school and college students to place 20 terms like often on a scale between 1 and 100. In 1968, Milton Hakel repeated this experiment.

Sources of Uncertain Knowledge: Imprecise language (the slide's table of mean scale values assigned to quantifying terms in the Simpson and Hakel studies is not reproduced in this transcript).

Sources of Uncertain Knowledge: Unknown data. When the data is incomplete or missing, the only solution is to accept the value "unknown" and proceed to approximate reasoning with this value. Combining the views of different experts. Large expert systems usually combine the knowledge and expertise of a number of experts. Unfortunately, experts often have contradictory opinions and produce conflicting rules. To resolve the conflict, the knowledge engineer has to attach a weight to each expert and then calculate the composite conclusion. But no systematic method exists to obtain these weights.

Basic Probability Theory: The concept of probability has a long history that goes back thousands of years, to when words like "probably", "likely", "maybe", "perhaps" and "possibly" were introduced into spoken languages. However, the mathematical theory of probability was formulated only in the 17th century. The probability of an event is the proportion of cases in which the event occurs. Probability can also be defined as a scientific measure of chance.

Basic Probability Theory: Probability can be expressed mathematically as a numerical index ranging from zero (an absolute impossibility) to unity (an absolute certainty). Most events have a probability index strictly between 0 and 1, which means that each event has at least two possible outcomes: a favourable outcome, or success, and an unfavourable outcome, or failure.

Basic Probability Theory: If s is the number of times success can occur, and f is the number of times failure can occur, then

p = P(success) = s / (s + f), q = P(failure) = f / (s + f), and p + q = 1.

If we throw a coin, the probability of getting a head will be equal to the probability of getting a tail. In a single throw, s = f = 1, and therefore the probability of getting a head (or a tail) is 0.5.
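As a quick illustration, here is this definition in Python (a minimal sketch; the function name is mine, not from the slides):

    def probability_of_success(s: int, f: int) -> float:
        """p = s / (s + f): favourable outcomes over all possible outcomes."""
        return s / (s + f)

    # Single coin throw: s = f = 1, so p(head) = p(tail) = 0.5.
    p = probability_of_success(1, 1)
    q = 1 - p          # p + q = 1
    print(p, q)        # 0.5 0.5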

Conditional Probability: Let A be an event in the world and B be another event. Suppose that events A and B are not mutually exclusive, but occur conditionally on the occurrence of the other. The probability that event A will occur if event B occurs is called the conditional probability. Conditional probability is denoted mathematically as p(A|B), in which the vertical bar represents "given"; the complete probability expression is read as "the conditional probability of event A occurring given that event B has occurred".

Conditional Probability: The number of times A and B can occur, or the probability that both A and B will occur, is called the joint probability of A and B. It is represented mathematically as p(A ∩ B). The number of ways B can occur is the probability of B, p(B), and thus

p(A|B) = p(A ∩ B) / p(B)

Similarly, the conditional probability of event B occurring given that event A has occurred equals

p(B|A) = p(B ∩ A) / p(A)

Conditional Probability: Hence

p(A ∩ B) = p(A|B) × p(B) and p(B ∩ A) = p(B|A) × p(A)

Since p(A ∩ B) = p(B ∩ A), substituting the last equation into the equation for p(A|B) yields the Bayesian rule:

p(A|B) = p(B|A) × p(A) / p(B)
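A small numerical check of these identities in Python (the counts below are made up for illustration, not taken from the lecture):

    # Joint counts over 100 hypothetical trials.
    n_total = 100
    n_a = 25           # trials where A occurred
    n_b = 40           # trials where B occurred
    n_a_and_b = 10     # trials where both A and B occurred

    p_a = n_a / n_total
    p_b = n_b / n_total
    p_a_and_b = n_a_and_b / n_total

    p_a_given_b = p_a_and_b / p_b    # p(A|B) = p(A ∩ B) / p(B)  -> 0.25
    p_b_given_a = p_a_and_b / p_a    # p(B|A) = p(B ∩ A) / p(A)  -> 0.4

    # The Bayesian rule recovers p(A|B) from p(B|A):
    assert abs(p_a_given_b - p_b_given_a * p_a / p_b) < 1e-12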

Bayesian Rule: In the rule p(A|B) = p(B|A) × p(A) / p(B): p(A|B) is the conditional probability that event A occurs given that event B has occurred; p(B|A) is the conditional probability of event B occurring given that event A has occurred; p(A) is the probability of event A occurring; p(B) is the probability of event B occurring.

The Joint Probability: If event A depends on a number of mutually exclusive and exhaustive events B1, B2, ..., Bn, the joint probabilities combine as

p(A) = Σk=1..n p(A ∩ Bk) = Σk=1..n [ p(A|Bk) × p(Bk) ]

The Joint Probability: If the occurrence of event A depends on only two mutually exclusive events, B and NOT B, we obtain:

p(A) = p(A|B) × p(B) + p(A|¬B) × p(¬B)

where ¬ is the logical function NOT. Similarly,

p(B) = p(B|A) × p(A) + p(B|¬A) × p(¬A)

Substituting this equation into the Bayesian rule yields:

p(A|B) = p(B|A) × p(A) / [ p(B|A) × p(A) + p(B|¬A) × p(¬A) ]

Bayesian Reasoning: Suppose all rules in the knowledge base are represented in the following form: IF E is true THEN H is true {with probability p}. This rule implies that if event E occurs, then the probability that event H will occur is p. In expert systems, H usually represents a hypothesis and E denotes evidence to support this hypothesis.

Bayesian Reasoning: The Bayesian rule expressed in terms of hypotheses and evidence looks like this:

p(H|E) = p(E|H) × p(H) / [ p(E|H) × p(H) + p(E|¬H) × p(¬H) ]

where: p(H) is the prior probability of hypothesis H being true; p(E|H) is the probability that hypothesis H being true will result in evidence E; p(¬H) is the prior probability of hypothesis H being false; p(E|¬H) is the probability of finding evidence E even when hypothesis H is false.
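A minimal Python sketch of this update (the function and variable names are mine, not from the slides):

    def posterior(p_h: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
        """p(H|E) via Bayes' rule, expanding p(E) over H and not-H."""
        p_not_h = 1.0 - p_h
        p_e = p_e_given_h * p_h + p_e_given_not_h * p_not_h
        return p_e_given_h * p_h / p_e

    # Illustrative numbers only: a 30% prior, evidence twice as likely under H.
    print(posterior(0.3, 0.8, 0.4))   # ~0.46: observing E raises belief in H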

Bayesian Reasoning: In expert systems, the probabilities required to solve a problem are provided by experts. An expert determines the prior probabilities for possible hypotheses, p(H) and p(¬H), and also the conditional probabilities for observing evidence E if hypothesis H is true, p(E|H), and if hypothesis H is false, p(E|¬H). Users provide information about the evidence observed, and the expert system computes p(H|E) for hypothesis H in light of the user-supplied evidence E. The probability p(H|E) is called the posterior probability of hypothesis H upon observing evidence E.

Bayesian Reasoning: We can take into account both multiple hypotheses H1, H2, ..., Hm and multiple evidences E1, E2, ..., En. The hypotheses as well as the evidences must be mutually exclusive and exhaustive. For a single evidence E and multiple hypotheses:

p(Hi|E) = p(E|Hi) × p(Hi) / Σk=1..m [ p(E|Hk) × p(Hk) ]

For multiple evidences and multiple hypotheses:

p(Hi|E1 E2 ... En) = p(E1 E2 ... En|Hi) × p(Hi) / Σk=1..m [ p(E1 E2 ... En|Hk) × p(Hk) ]

Bayesian Reasoning: This requires obtaining the conditional probabilities of all possible combinations of evidences for all hypotheses, and thus places an enormous burden on the expert. Therefore, in expert systems, conditional independence among different evidences is assumed. Thus, instead of the unworkable equation, we obtain:

p(Hi|E1 E2 ... En) = p(E1|Hi) × p(E2|Hi) × ... × p(En|Hi) × p(Hi) / Σk=1..m [ p(E1|Hk) × p(E2|Hk) × ... × p(En|Hk) × p(Hk) ]
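A sketch of this conditionally independent combination in Python (a naive-Bayes-style computation; the names and the example values are mine, not code from the lecture):

    def posteriors(priors, likelihoods):
        """priors[i] = p(Hi); likelihoods[i][j] = p(Ej|Hi) for each observed evidence.
        Returns p(Hi|E1..En) under conditional independence of the evidences."""
        scores = []
        for p_h, p_e_given_h in zip(priors, likelihoods):
            score = p_h
            for p in p_e_given_h:
                score *= p
            scores.append(score)
        total = sum(scores)
        return [s / total for s in scores]

    # Illustrative values only: two hypotheses, two observed evidences.
    print(posteriors([0.6, 0.4], [[0.2, 0.9], [0.7, 0.5]]))   # evidence favours H2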

Ranking Potentially True Hypotheses: Let us consider a simple example. Suppose an expert, given three conditionally independent evidences E1, E2 and E3, creates three mutually exclusive and exhaustive hypotheses H1, H2 and H3, and provides prior probabilities for these hypotheses, p(H1), p(H2) and p(H3), respectively. The expert also determines the conditional probabilities of observing each evidence for all possible hypotheses.

The Prior and Conditional Probabilities: Assume that we first observe evidence E3. The expert system computes the posterior probabilities for all hypotheses as:

p(Hi|E3) = p(E3|Hi) × p(Hi) / Σk=1..3 [ p(E3|Hk) × p(Hk) ], for i = 1, 2, 3.

The Prior and Conditional Probabilities: After evidence E3 is observed, belief in hypothesis H2 increases and becomes equal to belief in hypothesis H1. Belief in hypothesis H3 also increases, and even nearly reaches the beliefs in hypotheses H1 and H2.

The Prior and Conditional Probabilities: Suppose now that we also observe evidence E1. The posterior probabilities are calculated as:

p(Hi|E1 E3) = p(E1|Hi) × p(E3|Hi) × p(Hi) / Σk=1..3 [ p(E1|Hk) × p(E3|Hk) × p(Hk) ], for i = 1, 2, 3.

Hypothesis H2 has now become the most likely one.

The Prior and Conditional Probabilities: After observing evidence E2 as well, the final posterior probabilities for all hypotheses are calculated as:

p(Hi|E1 E2 E3) = p(E1|Hi) × p(E2|Hi) × p(E3|Hi) × p(Hi) / Σk=1..3 [ p(E1|Hk) × p(E2|Hk) × p(E3|Hk) × p(Hk) ], for i = 1, 2, 3.

Although the initial ranking was H1, H2 and H3, only hypotheses H1 and H3 remain under consideration after all evidences (E1, E2 and E3) were observed.
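A self-contained Python sketch of this ranking process. The priors and conditional probabilities below are hypothetical stand-ins (the slide's actual numbers are not reproduced in this transcript); the structure of the computation is what matters:

    # Hypothetical priors and likelihoods; likelihood[E][H] = p(E|H).
    priors = {"H1": 0.5, "H2": 0.3, "H3": 0.2}
    likelihood = {
        "E1": {"H1": 0.3, "H2": 0.8, "H3": 0.5},
        "E2": {"H1": 0.9, "H2": 0.1, "H3": 0.7},
        "E3": {"H1": 0.6, "H2": 0.7, "H3": 0.9},
    }

    belief = dict(priors)
    for e in ["E3", "E1", "E2"]:                  # evidences observed one at a time
        belief = {h: likelihood[e][h] * b for h, b in belief.items()}
        total = sum(belief.values())
        belief = {h: b / total for h, b in belief.items()}    # renormalise
        print(f"after {e}:", {h: round(b, 3) for h, b in belief.items()})

Renormalising after each evidence gives the same final result as applying the combined formula once at the end, precisely because the evidences are assumed conditionally independent.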

Exercise: From which bowl is the cookie? To illustrate, suppose there are two bowls full of cookies. Bowl #1 has 10 chocolate chip and 30 plain cookies, while bowl #2 has 20 of each. Our friend Fred picks a bowl at random, and then picks a cookie at random. We may assume there is no reason to believe Fred treats one bowl differently from another, and likewise for the cookies. The cookie turns out to be a plain one. How probable is it that Fred picked it out of bowl #1? (Taken from wikipedia.org.)

Solution: Let H1 correspond to bowl #1, and H2 to bowl #2. It is given that the bowls are identical from Fred's point of view, thus P(H1) = P(H2), and the two must add up to 1, so both are equal to 0.5. The datum D is the observation of a plain cookie. From the contents of the bowls, we know that P(D|H1) = 30/40 = 0.75 and P(D|H2) = 20/40 = 0.5. Bayes' formula then yields:

P(H1|D) = P(D|H1) × P(H1) / [ P(D|H1) × P(H1) + P(D|H2) × P(H2) ] = (0.75 × 0.5) / (0.75 × 0.5 + 0.5 × 0.5) = 0.375 / 0.625 = 0.6

Before observing the cookie, the probability that Fred chose bowl #1 is the prior probability, P(H1), which is 0.5. After observing the cookie, we revise the probability to P(H1|D), which is 0.6.
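The same computation as a quick Python check (variable names are mine):

    p_h1, p_h2 = 0.5, 0.5            # Fred picks either bowl at random
    p_d_given_h1 = 30 / 40           # share of plain cookies in bowl #1
    p_d_given_h2 = 20 / 40           # share of plain cookies in bowl #2

    p_d = p_d_given_h1 * p_h1 + p_d_given_h2 * p_h2
    p_h1_given_d = p_d_given_h1 * p_h1 / p_d
    print(p_h1_given_d)              # 0.6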

Closing: Questions??? Remarks??? Comments!!! Evaluation!