Probabilistic Reasoning
CIS 479/579
Bruce R. Maxim, UM-Dearborn


Uncertainty
Dealing with incomplete and uncertain data is an important part of many AI systems. Approaches:
–Ad Hoc Uncertainty Factors
–Classical Probability Theory
–Fuzzy Set Theory
–Dempster-Shafer Theory of Evidence

Using Probabilistic Reasoning
Relevant world or domain is random
–more knowledge does not allow us to describe the situation more precisely
Relevant domain is not random, but we rarely have access to enough data
–experiments too costly or too dangerous
Domain is not random, just not described in sufficient detail
–need to get more knowledge into the system

Certainty Factor Questions
How are certainties associated with rule inputs (e.g. antecedents)?
How does the rule translate input certainty to output certainty (i.e. how deterministic is the rule)?
How do you determine the certainty of facts supported by several rules?

Ad Hoc Approach
The minimum of the values on the interval [0,1] associated with a rule's antecedents is the rule's input certainty
An attenuation factor associated with each rule is used as a multiplier to map input certainty to output certainty
When several rules support the same fact, the maximum of the rule output certainties is the overall certainty of the fact

Ad Hoc Example

Ad Hoc Example
Rule A1 translates input to output: (0.9) * (1.0) = 0.9
Rule A2 translates input to output: (0.25) * (1.0) = 0.25
Fact supported by A1 and A2: max(0.9, 0.25) = 0.9
Input to Rule A7: min(0.9, 0.25) = 0.25
Rule A7 translates input to output: (0.25) * (0.8) = 0.2
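The combination scheme above can be sketched in a few lines of Python. The rule structure and names here are illustrative, inferred from the example rather than taken from the slides:

```python
# Ad hoc certainty combination: antecedents combine with min, each rule
# attenuates its input certainty by a multiplier, and parallel rules
# supporting the same fact combine with max.

def rule_output(antecedent_certainties, attenuation):
    """Input certainty is the min of the antecedent certainties;
    the rule scales it by its attenuation factor."""
    return min(antecedent_certainties) * attenuation

# Rules A1 and A2 from the example, each with a single antecedent
a1 = rule_output([0.9], 1.0)        # 0.9
a2 = rule_output([0.25], 1.0)       # 0.25

# Both rules support the same fact: take the max
fact = max(a1, a2)                  # 0.9

# Rule A7 combines two antecedents (0.9 and 0.25), attenuation 0.8
a7 = rule_output([0.9, 0.25], 0.8)  # 0.25 * 0.8 = 0.2
```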

Probability Axioms
P(E) = number of desired outcomes / total number of outcomes = |event| / |sample space|
P(not E) = P(~E) = 1 - P(E)

Additive Laws
P(A or B) = P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
If A and B are mutually exclusive, A ∩ B = ∅, so P(A ∩ B) = 0 and
P(A or B) = P(A) + P(B)

Multiplicative Laws
P(A and B) = P(A ∩ B) = P(A) * P(B|A) = P(B) * P(A|B)
For independent events:
P(B|A) = P(B)
P(A|B) = P(A)
P(A ∩ B) = P(A) * P(B)
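These laws can be checked numerically on a single fair die roll; the events A and B below are illustrative choices, not from the slides:

```python
from fractions import Fraction

# One roll of a fair die. A = "roll is even", B = "roll is at least 4".
space = range(1, 7)
A = {x for x in space if x % 2 == 0}   # {2, 4, 6}
B = {x for x in space if x >= 4}       # {4, 5, 6}

def p(event):
    # P(E) = |event| / |sample space|
    return Fraction(len(event), 6)

# Additive law: P(A or B) = P(A) + P(B) - P(A and B)
p_or = p(A | B)                        # {2, 4, 5, 6} -> 2/3
assert p_or == p(A) + p(B) - p(A & B)

# Multiplicative law: P(A and B) = P(A) * P(B|A)
p_b_given_a = Fraction(len(A & B), len(A))   # 2/3
assert p(A & B) == p(A) * p_b_given_a
```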

Bayes Theorem
P(H_i|E) – probability H_i is true given evidence E
P(E|H_i) – probability E is observed given H_i
P(H_i) – probability H_i is true regardless of evidence
P(H_i|E) = P(E|H_i) * P(H_i) / P(E) = P(E|H_i) * P(H_i) / Σ_{k=1..n} P(E|H_k) * P(H_k)

Bayes Example
Prior probability it will rain: P(H) = 0.8
Conditional probabilities:
Geese on the lake, given rain tomorrow: P(E|H) = 0.02
Geese on the lake, with no rain tomorrow: P(E|~H) = 0.025

Bayes Example
Evidence:
P(E) = P(E|H) * P(H) + P(E|~H) * P(~H)
     = (0.02)*(0.8) + (0.025)*(0.2)
     = 0.016 + 0.005 = 0.021
Posterior probability of rain, given geese on the lake:
P(H|E) = (P(E|H) * P(H)) / P(E) = 0.016 / 0.021 ≈ 0.762

Bayes Example
Posterior probability of no rain, given geese on the lake:
P(~H|E) = (P(E|~H) * P(~H)) / P(E) = 0.005 / 0.021 ≈ 0.238
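The whole geese-on-the-lake example can be reproduced in a few lines of Python, using the numbers from the slides:

```python
# Bayes' rule on the rain/geese example.
p_h = 0.8            # prior: rain tomorrow
p_e_given_h = 0.02       # P(geese | rain)
p_e_given_not_h = 0.025  # P(geese | no rain)

# Total probability of the evidence
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)  # 0.016 + 0.005 = 0.021

# Posteriors via Bayes' rule
p_h_given_e = p_e_given_h * p_h / p_e                  # rain | geese, ~0.762
p_not_h_given_e = p_e_given_not_h * (1 - p_h) / p_e    # no rain | geese, ~0.238

# Sanity check: the two posteriors are complementary
assert abs(p_h_given_e + p_not_h_given_e - 1.0) < 1e-12
```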

Weaknesses of the Bayes Approach
Difficult to get all the a priori conditional and joint probabilities required
The database of probabilities is hard to modify because of the large number of interactions
Lots of calculations required
Outcomes must be disjoint
Accuracy depends on a complete set of hypotheses

Problems Which Can Make Use of Probabilistic Inference
Information available is of varying certainty or completeness
Need nearly optimal solutions
Need to justify decisions in favor of alternate decisions
General rules of inference are known or can be found for the problem

Fuzzy Set Theory
In ordinary set theory, every element x from a given universe is either in or out of a set S:
x ∈ S or x ∉ S
In fuzzy set theory, set membership is not so easily determined

When is a pile of chalk big?
If we have three pieces of chalk in the room, is that considered a big pile of chalk? Some people might say yes, that is a big pile, and some would not. Somewhere between those three pieces of chalk and a whole room full of chalk, the pile turns from a small pile into a big pile. This spot could be different for different people.

Membership Function
A membership function assigns each element a degree of membership in S:
F : [0,1]^n → [0,1]
Rather than being simply in or out of S, an element x belongs to S to the degree f(x).
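A common way to realize such a membership function for the chalk-pile example is a sigmoid; the midpoint and spread parameters below are invented for illustration and are not from the slides:

```python
import math

# Illustrative fuzzy membership function for "big pile of chalk":
# membership rises smoothly from 0 toward 1 as the piece count grows.
def mu_big_pile(pieces, midpoint=50, spread=15):
    """Degree to which `pieces` counts as a big pile.
    midpoint: count at which membership is 0.5 (assumed value).
    spread:   controls how gradually membership rises (assumed value)."""
    return 1.0 / (1.0 + math.exp(-(pieces - midpoint) / spread))

# Three pieces are barely a big pile; five hundred clearly are.
low = mu_big_pile(3)       # close to 0
high = mu_big_pile(500)    # close to 1
```

Different people's intuitions correspond to different midpoint and spread choices, which is exactly the point of the chalk example.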

Possibilistic vs. Probabilistic Logic

Expression                      Possibilistic Logic        Probabilistic Logic
                                (dependent events)         (independent events)
A                               a                          a
B                               b                          b
not A                           1 - a                      1 - a
A and B                         min(a, b)                  a * b
A or B                          max(a, b)                  a + b - a*b
A → B = not A or (A and B)      max(1 - a, b)              (1 - a) + a*b
A xor B                         max(min(a, 1 - b),         a + b - 2ab + a²b + ab² - a²b²
                                    min(1 - a, b))

Possibilistic Example
Assume P(X) = 0.5, P(Y) = 0.1, P(Z) = 0.2
Determine P(X → (Y or Z))
P(Y or Z) = max(P(Y), P(Z)) = max(0.1, 0.2) = 0.2
P(X → (Y or Z)) = max(1 - P(X), P(Y or Z)) = max(1 - 0.5, 0.2) = max(0.5, 0.2) = 0.5

Probabilistic Example
Assume P(X) = 0.5, P(Y) = 0.1, P(Z) = 0.2
Determine P(X → (Y or Z))
P(Y or Z) = P(Y) + P(Z) - P(Y) * P(Z) = 0.1 + 0.2 - 0.1 * 0.2 = 0.3 - 0.02 = 0.28
P(X → (Y or Z)) = (1 - P(X)) + P(X) * P(Y or Z) = (1 - 0.5) + (0.5 * 0.28) = 0.5 + 0.14 = 0.64
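Both worked examples can be computed with the combination rules from the table; the helper function names below are illustrative:

```python
# Possibilistic combinators (dependent events)
def poss_or(a, b): return max(a, b)
def poss_implies(a, b): return max(1 - a, b)     # A -> B = not A or (A and B)

# Probabilistic combinators (independent events)
def prob_or(a, b): return a + b - a * b
def prob_implies(a, b): return (1 - a) + a * b

px, py, pz = 0.5, 0.1, 0.2

# Possibilistic: max(1 - 0.5, max(0.1, 0.2)) = 0.5
poss = poss_implies(px, poss_or(py, pz))

# Probabilistic: P(Y or Z) = 0.28, then (1 - 0.5) + 0.5 * 0.28 = 0.64
prob = prob_implies(px, prob_or(py, pz))
```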

Bayesian Inference

Bayesian Inference
Symptoms:
S1: Clanking sound
S2: Low pickup
S3: Starting problem
S4: Parts are hard to find
Conclusion:
C1: Repair estimate > $250

Bayesian Inference
Intermediate hypotheses:
H1: Thrown connecting rod
H2: Loose wrist pin
H3: Car out of tune
Secondary hypotheses:
H4: Replace or rebuild engine
H5: Tune engine

Bayesian Inference
These must be known in advance:
P(H1), P(H2), P(H3)
P(S|Hi) for i = 1, 2, 3
Computed using Bayes formula:
P(S) = Σ P(Hi) * P(S|Hi)
P(Hi|S) for i = 1, 2, 3

Bayesian Inference
H4: Replace or rebuild engine
P(H4) = P(H1 or H2) = max(P(H1|S), P(H2|S))
H5: Tune engine
P(H5) = P(not (H1 or H2) and H3) = min(1 - max(P(H1|S), P(H2|S)), P(H3))
C1: Repair estimate > $250
P(C1) = P(H4 or (H5 and S4)) = max(P(H4|S), min(P(H5|S), V))
note: V = 1 if S4 is true and 0 otherwise
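As a sketch, the combination rules on this last slide can be evaluated once numeric posteriors are in hand. The P(Hi|S) values below are assumed for illustration only; the slides leave them symbolic:

```python
# Car-repair inference using the slide's possibilistic combinators
# (or -> max, and -> min, not -> 1 - x). Posterior values are assumed.
p_h1_s = 0.6   # P(thrown connecting rod | symptoms), assumed
p_h2_s = 0.3   # P(loose wrist pin | symptoms), assumed
p_h3 = 0.7     # P(car out of tune), assumed
s4_true = True # S4: parts are hard to find

# H4: replace or rebuild engine = H1 or H2
p_h4 = max(p_h1_s, p_h2_s)                       # 0.6

# H5: tune engine = not (H1 or H2) and H3
p_h5 = min(1 - max(p_h1_s, p_h2_s), p_h3)        # min(0.4, 0.7) = 0.4

# C1: repair estimate > $250 = H4 or (H5 and S4)
v = 1.0 if s4_true else 0.0
p_c1 = max(p_h4, min(p_h5, v))                   # 0.6
```

With these assumed numbers the engine-rebuild hypothesis dominates, so the repair-estimate conclusion inherits its certainty of 0.6.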