Artificial Intelligence CIS 342 The College of Saint Rose David Goldschmidt, Ph.D.

Uncertainty Uncertainty is the lack of exact knowledge that would enable us to reach a fully reliable solution – Classical logic assumes perfect knowledge exists: IF A is true THEN B is true – Describing uncertainty: If A is true, then B is true with probability P

Probability Theory The probability of an event is the proportion of cases in which the event occurs – Numerically ranges from zero to unity (i.e., 0 to 1): p(success) + p(failure) = 1
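The proportion-of-cases view of probability can be illustrated with a minimal Python sketch; the 0.7 success rate below is a hypothetical value, not from the slides:

```python
import random

random.seed(42)

# Estimate p(success) as the proportion of trials in which the event occurs,
# simulating an event with a hypothetical true success rate of 0.7.
trials = 100_000
successes = sum(1 for _ in range(trials) if random.random() < 0.7)

p_success = successes / trials
p_failure = 1 - p_success

# The two proportions always sum to unity: p(success) + p(failure) = 1
print(round(p_success + p_failure, 10))  # 1.0
```

With enough trials the estimated proportion converges on the underlying rate, which is why probabilities can be elicited from observed frequencies.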

Conditional Probability Suppose events A and B are not mutually exclusive, but occur conditionally on the occurrence of the other – The probability that event A will occur if event B occurs is called the conditional probability of A given B, written p(A | B)

Conditional Probability The probability that both A and B occur is called the joint probability of A and B, written p(A ∩ B): p(A ∩ B) = p(A | B) × p(B)

Conditional Probability Similarly, the conditional probability that event B will occur if event A occurs can be written as: p(B | A) = p(A ∩ B) / p(A)

Conditional Probability Equating the two expressions for the joint probability p(A ∩ B) gives: p(A | B) × p(B) = p(B | A) × p(A)
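Joint and conditional probability can be checked numerically from a frequency table. The counts of 100 cases below are hypothetical, chosen only for illustration:

```python
# Hypothetical counts over 100 observed cases.
n_total = 100
n_b = 40          # cases in which event B occurs
n_a_and_b = 20    # cases in which both A and B occur

p_a_and_b = n_a_and_b / n_total   # joint probability p(A ∩ B) = 0.2
p_b = n_b / n_total               # p(B) = 0.4

# Conditional probability of A given B: p(A | B) = p(A ∩ B) / p(B)
p_a_given_b = p_a_and_b / p_b
print(p_a_given_b)  # 0.5
```

Of the 40 cases in which B occurs, A occurs in 20, so p(A | B) = 0.5: conditioning simply restricts the proportion to the cases where B holds.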

The Bayesian rule (named after Thomas Bayes, an 18th-century British mathematician): p(A | B) = p(B | A) × p(A) / p(B) The Bayesian Rule

If event A depends on exactly two mutually exclusive events, B and ¬B, we obtain: p(A) = p(A | B) × p(B) + p(A | ¬B) × p(¬B) Similarly, if event B depends on exactly two mutually exclusive events, A and ¬A, we obtain: p(B) = p(B | A) × p(A) + p(B | ¬A) × p(¬A) The Bayesian Rule

Substituting p(B) into the Bayesian rule yields: p(A | B) = p(B | A) × p(A) / [ p(B | A) × p(A) + p(B | ¬A) × p(¬A) ] The Bayesian Rule
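The substituted form of the rule can be sketched directly; the function takes the three probabilities an expert would supply, and the example values are hypothetical:

```python
def bayes(p_b_given_a: float, p_a: float, p_b_given_not_a: float) -> float:
    """p(A | B) via the Bayesian rule, with p(B) expanded over A and not-A:
    p(A|B) = p(B|A)p(A) / [p(B|A)p(A) + p(B|~A)p(~A)]."""
    p_not_a = 1.0 - p_a
    numerator = p_b_given_a * p_a
    denominator = numerator + p_b_given_not_a * p_not_a
    return numerator / denominator

# Hypothetical values: p(B|A) = 0.9, p(A) = 0.1, p(B|~A) = 0.2
print(round(bayes(0.9, 0.1, 0.2), 4))  # 0.3333
```

Note how a strong likelihood p(B | A) = 0.9 still yields a modest posterior when the prior p(A) is small: the denominator is dominated by the ¬A cases.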

Expert systems use the Bayesian rule to rank potentially true hypotheses based on evidences The Bayesian Rule

If event E occurs, then the probability that event H will occur is p ( H | E ) IF E ( evidence ) is true THEN H ( hypothesis ) is true with probability p The Bayesian Rule

Expert identifies prior probabilities for hypotheses p ( H ) and p ( ¬ H ) Expert identifies conditional probabilities for: – p ( E | H ): Observing evidence E if hypothesis H is true – p ( E | ¬ H ): Observing evidence E if hypothesis H is false Can be burdensome to the expert.... The Bayesian Rule

Experts provide p ( H ), p ( ¬ H ), p ( E | H ), and p ( E | ¬ H ) Users describe observed evidence E – Expert system calculates p ( H | E ) using Bayesian rule – p ( H | E ) is the posterior probability that hypothesis H occurs upon observing evidence E What about multiple hypotheses and evidences? The Bayesian Rule

p(H | E) = p(E | H) × p(H) / [ p(E | H) × p(H) + p(E | ¬H) × p(¬H) ] The Bayesian Rule

Expand the Bayesian rule to work with multiple hypotheses (H 1... H m ) and evidences (E 1... E n ) – Assuming conditional independence among evidences E 1... E n : p(H k | E 1 E 2... E n ) = p(E 1 | H k ) × p(E 2 | H k ) × ... × p(E n | H k ) × p(H k ) / Σ i [ p(E 1 | H i ) × p(E 2 | H i ) × ... × p(E n | H i ) × p(H i ) ] The Bayesian Rule
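The multi-hypothesis form can be sketched as follows. The priors match the example on the following slides (0.40, 0.35, 0.25), but the conditional probabilities p(E i | H k ) are hypothetical placeholders, since the expert's table did not survive in this transcript (this particular table happens to reproduce the final ranking shown at the end):

```python
def posteriors(priors, likelihoods, observed):
    """Posterior p(Hk | observed evidences), assuming the evidences are
    conditionally independent given each hypothesis.
    priors[k] = p(Hk); likelihoods[k][i] = p(Ei | Hk);
    observed = indices of the evidences actually seen."""
    numerators = []
    for p_h, row in zip(priors, likelihoods):
        num = p_h
        for i in observed:
            num *= row[i]          # multiply in each observed evidence
        numerators.append(num)
    total = sum(numerators)        # normalizing constant (the denominator)
    return [n / total for n in numerators]

priors = [0.40, 0.35, 0.25]        # p(H1), p(H2), p(H3) from the slides
likelihoods = [                    # hypothetical p(Ei | Hk) values
    [0.3, 0.9, 0.6],               # p(E1|H1), p(E2|H1), p(E3|H1)
    [0.8, 0.0, 0.7],               # p(E1|H2), p(E2|H2), p(E3|H2)
    [0.5, 0.7, 0.9],               # p(E1|H3), p(E2|H3), p(E3|H3)
]

post = posteriors(priors, likelihoods, observed=[0, 1, 2])
print([round(p, 2) for p in post])  # [0.45, 0.0, 0.55]
```

Because p(E 2 | H 2 ) = 0 in this table, observing E 2 eliminates H 2 entirely, which is how a posterior of exactly zero can arise despite a sizable prior.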

Expert is given three conditionally independent evidences E 1, E 2, and E 3 – Expert creates three mutually exclusive and exhaustive hypotheses H 1, H 2, and H 3 – Expert provides prior probabilities p ( H 1 ), p ( H 2 ), p ( H 3 ) – Expert identifies conditional probabilities for observing each evidence E i for all possible hypotheses H k Bayesian Rule Example

Expert data: Bayesian Rule Example

Bayesian Rule Example user observes E 3 expert system computes posterior probabilities

Bayesian Rule Example user observes E 1 expert system computes posterior probabilities

Bayesian Rule Example user observes E 2 expert system computes posterior probabilities

Bayesian Rule Example Initial expert-based ranking: – p ( H 1 ) = 0.40; p ( H 2 ) = 0.35; p ( H 3 ) = 0.25 Expert system ranking after observing E 1, E 2, E 3 : – p ( H 1 ) = 0.45; p ( H 2 ) = 0.0; p ( H 3 ) = 0.55 Success hinges on the expert defining all probabilities! – Also dependent on the knowledge engineer interpreting and programming expert data