1 Probabilistic Reasoning
CIS 479/579
Bruce R. Maxim, UM-Dearborn

2 Uncertainty
Dealing with incomplete and uncertain data is an important part of many AI systems
Approaches:
– Ad Hoc Uncertainty Factors
– Classical Probability Theory
– Fuzzy Set Theory
– Dempster-Shafer Theory of Evidence

3 Using Probabilistic Reasoning
The relevant world or domain is random
– more knowledge does not allow us to describe the situation more precisely
The relevant domain is not random, but we rarely have access to enough data
– experiments are too costly or too dangerous
The domain is not random, just not described in sufficient detail
– we need to get more knowledge into the system

4 Certainty Factor Questions
How are certainties associated with rule inputs (e.g., antecedents)?
How does a rule translate input certainty to output certainty (i.e., how deterministic is the rule)?
How do you determine the certainty of facts supported by several rules?

5 Ad Hoc Approach
The minimum of the values on the interval [0,1] associated with each rule antecedent is the rule's input certainty
An attenuation factor, reflecting how deterministic the rule is, serves as a multiplier mapping input certainty to output certainty
When several rules support the same fact, the maximum of the rule output certainties is the overall certainty of the fact

6 Ad Hoc Example

7 Ad Hoc Example
Rule A1 translates input to output: (0.9) * (1.0) = 0.9
Rule A2 translates input to output: (0.25) * (1.0) = 0.25
Fact supported by A1 and A2: max(0.25, 0.9) = 0.9
Input to Rule A7: min(0.9, 0.25) = 0.25
Rule A7 translates input to output: (0.25) * (0.8) = 0.2
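A minimal sketch of this ad hoc scheme in Python, reproducing the numbers above; the rule_output helper and the two-antecedent structure of rule A7 are assumptions made for illustration, not course code.

    def rule_output(antecedent_certainties, attenuation):
        # Input certainty is the minimum of the antecedent certainties;
        # the attenuation factor maps input certainty to output certainty.
        return min(antecedent_certainties) * attenuation

    a1 = rule_output([0.9], 1.0)          # 0.9
    a2 = rule_output([0.25], 1.0)         # 0.25
    fact = max(a1, a2)                    # 0.9: several rules -> take the max
    a7 = rule_output([fact, 0.25], 0.8)   # min(0.9, 0.25) * 0.8 = 0.2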

8 Probability Axioms
P(E) = (number of desired outcomes) / (total number of outcomes) = |event| / |sample space|
P(not E) = P(~E) = 1 - P(E)

9 Additive Laws
P(A or B) = P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
If A and B are mutually exclusive, A ∩ B = ∅ and P(A ∩ B) = 0, so
P(A or B) = P(A) + P(B)

10 Multiplicative Laws
P(A and B) = P(A ∩ B) = P(A) * P(B|A) = P(B) * P(A|B)
For independent events, P(B|A) = P(B) and P(A|B) = P(A), so
P(A ∩ B) = P(A) * P(B)
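Both laws can be sanity-checked by brute-force enumeration. A short sketch, assuming a two-dice sample space that is not part of the slides:

    from itertools import product

    # Sample space: all ordered rolls of two fair dice (assumed example).
    space = list(product(range(1, 7), repeat=2))

    def prob(event):
        # Counting definition: P(E) = |event| / |sample space|.
        return sum(1 for w in space if event(w)) / len(space)

    A = lambda w: w[0] == 6  # first die shows 6
    B = lambda w: w[1] == 6  # second die shows 6

    # Additive law: P(A or B) = P(A) + P(B) - P(A and B)
    lhs = prob(lambda w: A(w) or B(w))
    rhs = prob(A) + prob(B) - prob(lambda w: A(w) and B(w))
    assert abs(lhs - rhs) < 1e-12
    # Multiplicative law for independent events: P(A and B) = P(A) * P(B)
    assert abs(prob(lambda w: A(w) and B(w)) - prob(A) * prob(B)) < 1e-12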

11 Bayes Theorem
P(Hi | E): probability Hi is true given evidence E
P(E | Hi): probability E is observed given Hi
P(Hi): probability Hi is true regardless of evidence
P(Hi | E) = P(E | Hi) * P(Hi) / P(E)
          = P(E | Hi) * P(Hi) / Σ (k = 1 to n) P(E | Hk) * P(Hk)
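A minimal sketch of this computation in Python; the posterior function name and the list-based interface are assumptions for illustration:

    def posterior(priors, likelihoods):
        # priors[i] = P(Hi); likelihoods[i] = P(E | Hi).
        joint = [p * l for p, l in zip(priors, likelihoods)]
        p_e = sum(joint)                 # P(E) = sum over k of P(E|Hk) * P(Hk)
        return [j / p_e for j in joint]  # P(Hi | E) for each i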

12 Bayes Example
Prior probability it will rain: P(H) = 0.8
Conditional probabilities:
Geese on the lake, given rain tomorrow: P(E | H) = 0.02
Geese on the lake, given no rain tomorrow: P(E | ~H) = 0.025

13 Bayes Example
Evidence:
P(E) = P(E | H) * P(H) + P(E | ~H) * P(~H)
     = (0.02)*(0.8) + (0.025)*(0.2) = 0.016 + 0.005 = 0.021
Posterior probability of rain given geese on the lake:
P(H | E) = P(E | H) * P(H) / P(E) = 0.016 / 0.021 = 0.7619

14 Bayes Example
Posterior probability of no rain given geese on the lake:
P(~H | E) = P(E | ~H) * P(~H) / P(E) = 0.005 / 0.021 = 0.2381
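The rain/geese numbers can be reproduced in a few lines; this repeats the posterior sketch from above so the snippet runs on its own:

    def posterior(priors, likelihoods):
        joint = [p * l for p, l in zip(priors, likelihoods)]
        return [j / sum(joint) for j in joint]

    # H = rain, ~H = no rain; numbers taken from the slides.
    rain, no_rain = posterior([0.8, 0.2], [0.02, 0.025])
    print(round(rain, 4), round(no_rain, 4))  # 0.7619 0.2381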

15 Weaknesses of the Bayes Approach
Difficult to obtain all the a priori conditional and joint probabilities required
The database of priors is hard to modify because of the large number of interactions
Lots of calculations are required
Outcomes must be disjoint
Accuracy depends on having a complete set of hypotheses

16 Problems Which Can Make Use of Probabilistic Inference
The information available is of varying certainty or completeness
Nearly optimal solutions are needed
Decisions must be justified against alternative decisions
General rules of inference are known or can be found for the problem

17 Fuzzy Set Theory
In ordinary set theory, every element "x" from a given universe is either in or out of a set S: x ∈ S or x ∉ S
In fuzzy set theory, set membership is not so easily determined

18 When is a pile of chalk big?
If we have three pieces of chalk in the room, is that a big pile of chalk? Some people might say yes, and some would not. Somewhere between those three pieces of chalk and a whole room full of chalk, the pile turns from a small pile into a big pile. That point can be different for different people.

19 Membership Function
F: [0,1]^n → [0,1]
x ∈ S to degree f(x)
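A hedged sketch of a membership function for the chalk-pile question; the thresholds of 3 and 50 pieces are invented for illustration and do not come from the slides:

    def big_pile_membership(pieces, small=3, big=50):
        # Degree to which `pieces` counts as a "big pile":
        # 0 at or below `small`, 1 at or above `big`, linear ramp in between.
        if pieces <= small:
            return 0.0
        if pieces >= big:
            return 1.0
        return (pieces - small) / (big - small)

    for n in (3, 10, 25, 50):
        print(n, round(big_pile_membership(n), 2))  # 0.0, 0.15, 0.47, 1.0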

20 Possibilistic Logic (Dependent Events) vs. Probabilistic Logic (Independent Events)

Expression                      Possibilistic                    Probabilistic
A                               a                                a
B                               b                                b
not A                           1 - a                            1 - a
A and B                         min(a, b)                        a * b
A or B                          max(a, b)                        a + b - a*b
A → B = not A or (A and B)      max(1 - a, b)                    (1 - a) + a*b
A xor B                         max(min(a, 1-b), min(1-a, b))    a + b - 2ab + a²b + ab² - a²b²

21 Possibilistic Example
Assume P(X) = 0.5, P(Y) = 0.1, P(Z) = 0.2
Determine P(X → (Y or Z))
P(Y or Z) = max(P(Y), P(Z)) = max(0.1, 0.2) = 0.2
P(X → (Y or Z)) = max(1 - P(X), P(Y or Z)) = max(1 - 0.5, 0.2) = max(0.5, 0.2) = 0.5
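The possibilistic column of the table translates directly into Python; the helper names (p_or, p_implies, etc.) are assumptions:

    # Possibilistic (dependent-events) connectives from the table above.
    def p_not(a): return 1 - a
    def p_and(a, b): return min(a, b)
    def p_or(a, b): return max(a, b)
    def p_implies(a, b): return max(1 - a, b)  # A -> B

    x, y, z = 0.5, 0.1, 0.2
    print(p_implies(x, p_or(y, z)))  # max(0.5, 0.2) = 0.5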

22 Probabilistic Example
Assume P(X) = 0.5, P(Y) = 0.1, P(Z) = 0.2
Determine P(X → (Y or Z))
P(Y or Z) = P(Y) + P(Z) - P(Y) * P(Z) = 0.1 + 0.2 - 0.1 * 0.2 = 0.3 - 0.02 = 0.28
P(X → (Y or Z)) = (1 - P(X)) + P(X) * P(Y or Z) = (1 - 0.5) + 0.5 * 0.28 = 0.5 + 0.14 = 0.64
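The probabilistic column works the same way; again the helper names are illustrative:

    # Probabilistic (independent-events) connectives from the table above.
    def pr_or(a, b): return a + b - a * b
    def pr_implies(a, b): return (1 - a) + a * b  # not A or (A and B)

    x, y, z = 0.5, 0.1, 0.2
    yz = pr_or(y, z)          # 0.1 + 0.2 - 0.02 = 0.28
    print(pr_implies(x, yz))  # 0.5 + 0.5 * 0.28 = 0.64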

23 Bayesian Inference

24 Bayesian Inference
Symptoms
S1: Clanking sound
S2: Low pickup
S3: Starting problem
S4: Parts are hard to find
Conclusion
C1: Repair estimate > $250

25 Bayesian Inference
Intermediate Hypotheses
H1: Thrown connecting rod
H2: Wrist pin loose
H3: Car out of tune
Secondary Hypotheses
H4: Replace or rebuild engine
H5: Tune engine

26 Bayesian Inference
These must be known in advance:
P(H1), P(H2), P(H3)
P(S | Hi) for i = 1, 2, 3
Computed using Bayes formula:
P(S) = Σ P(Hi) * P(S | Hi)
P(Hi | S) for i = 1, 2, 3

27 Bayesian Inference
H4: Replace or rebuild engine
P(H4) = P(H1 or H2) = max(P(H1 | S), P(H2 | S))
H5: Tune engine
P(H5) = P(not (H1 or H2) and H3) = min(1 - max(P(H1 | S), P(H2 | S)), P(H3 | S))
C1: Repair estimate > $250
P(C1) = P(H4 or (H5 and S4)) = max(P(H4 | S), min(P(H5 | S), V))
note: V = 1 if S4 is true and 0 otherwise
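A sketch of this combination step in Python; the posterior values passed in at the end are made-up placeholders, not slide data:

    def repair_estimate(p_h1, p_h2, p_h3, s4_present):
        p_h4 = max(p_h1, p_h2)                 # replace or rebuild engine
        p_h5 = min(1 - max(p_h1, p_h2), p_h3)  # tune engine
        v = 1.0 if s4_present else 0.0         # S4: parts are hard to find
        return max(p_h4, min(p_h5, v))         # P(C1): estimate > $250

    print(repair_estimate(0.1, 0.05, 0.7, True))  # max(0.1, min(0.7, 1)) = 0.7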

