
1 Incorporating New Information to Decision Trees (posterior probabilities)
MGS3100 - Chapter 6 Part 3

2 How We Will Use Bayes' Theorem
Prior information can be based on the results of previous experiments or on expert opinion, and can be expressed as probabilities. If it is desirable to improve on this state of knowledge, an experiment can be conducted. Bayes' Theorem is the mechanism used to update the state of knowledge with the results of the experiment to provide a posterior distribution.

3 Bayes' Theorem
Used to revise probabilities based upon new data: prior probabilities are updated to posterior probabilities.

4 How Bayes' Theorem Works
Let the experiment be A and the prediction be B, and assume that both have occurred. The probability of both A and B together is P(A∩B), or simply P(AB). The law of conditional probability says that this probability can be found as the product of the conditional probability of one, given the other, times the probability of the other. That is:
P(A|B) * P(B) = P(AB) = P(B|A) * P(A)
Simple algebra shows that:
P(B|A) = P(A|B) * P(B) / P(A)
This is Bayes' Theorem.
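As a quick illustration, here is a minimal Python sketch of that identity (the function name and arguments are ours, not from the slides):

```python
def bayes(p_a_given_b, p_b, p_a):
    """Bayes' Theorem: P(B|A) = P(A|B) * P(B) / P(A)."""
    return p_a_given_b * p_b / p_a

# Example: if P(A|B) = 0.6, P(B) = 0.45, and P(A) = 0.435,
# then P(B|A) = 0.27 / 0.435, roughly 0.62.
print(bayes(0.6, 0.45, 0.435))
```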

5 Sequential Decisions
Would you hire a market research group or a consultant (or a psychic) to get more information about the states of nature? How would additional information cause you to revise your probabilities of the states of nature occurring? Draw a new tree depicting the complete problem.

6 Problem: Marketing Cellular Phones
The design and product-testing phase has just been completed for Sonorola's new line of cellular phones. Three alternatives are being considered for a marketing/production strategy for this product:
1. Aggressive (A)
2. Basic (B)
3. Cautious (C)
Management decides to categorize the level of demand as either strong (S) or weak (W).

7 Here, we reproduce the last slide of the Sonorola problem from lecture slides part 2. Of the three expected values, choose 12.85, the branch associated with the Basic strategy. This decision is indicated in the TreePlan by the number 2 in the decision node.

8 Marketing Department Reports
The marketing department reports on the state of the market as either Encouraging (E) or Discouraging (D).

9 First, find out the reliability of the source of information (in this case, the marketing research group). Find the conditional probability based on its prior track record: for two events A and B, the conditional probability P(A|B) is the probability of event A given that event B will occur. For example, P(E|S) is the conditional probability that marketing gives an encouraging report given that the market is in fact going to be strong.

10 If marketing were perfectly reliable, P(E|S) = 1. However, marketing has the following "track record" in predicting the market:
P(E|S) = 0.6   P(D|S) = 1 - P(E|S) = 0.4
P(D|W) = 0.7   P(E|W) = 1 - P(D|W) = 0.3
Here is the same information displayed in tabular form:

             Encouraging (E)   Discouraging (D)
Strong (S)         0.6               0.4
Weak (W)           0.3               0.7

11 Calculating the Posterior Probabilities
Suppose that marketing has come back with an encouraging report. Knowing this, what is the probability that the market is in fact strong, P(S|E)? Note that probabilities such as P(S) and P(W) are initial estimates called prior probabilities. Conditional probabilities such as P(S|E) are called posterior probabilities. The prior probabilities have already been estimated as P(S) = 0.45 and P(W) = 0.55. Now, use Bayes' Theorem (see appendix for a formal description) to determine the posterior probabilities.
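Using the figures above, the arithmetic works out as follows:
P(E) = P(E|S)*P(S) + P(E|W)*P(W) = 0.6*0.45 + 0.3*0.55 = 0.435
P(S|E) = P(E|S)*P(S) / P(E) = 0.27 / 0.435 ≈ 0.621
P(W|E) = 1 - P(S|E) ≈ 0.379
Similarly, P(D) = 0.4*0.45 + 0.7*0.55 = 0.565, so P(S|D) = 0.18/0.565 ≈ 0.319 and P(W|D) ≈ 0.681.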

12 Spreadsheet Calculation of the Posterior Probabilities
The worksheet lays out the track-record probabilities P(E|S), P(E|W), P(D|S), and P(D|W), computes each joint probability as prior times conditional (formulas like =B3*B$8), totals them (formulas like =SUM(B12:C12) and =SUM(B12:B13)), and obtains each posterior by dividing a joint probability by its total (formulas like =B12/$D12).
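For readers who prefer code to spreadsheet formulas, here is a minimal Python sketch of the same joint/marginal/posterior calculation (the variable names are ours; the numbers are from the slides):

```python
# Posterior probabilities for the Sonorola problem:
# joint = prior * conditional, marginal = sum over states,
# posterior = joint / marginal.

priors = {"S": 0.45, "W": 0.55}                 # prior P(state)
track = {("E", "S"): 0.6, ("D", "S"): 0.4,      # P(report | state)
         ("E", "W"): 0.3, ("D", "W"): 0.7}

# Joint probabilities P(report and state) = P(report | state) * P(state)
joint = {(r, s): track[(r, s)] * priors[s] for (r, s) in track}

# Marginal probability of each report: P(E) and P(D)
marginal = {r: sum(joint[(r, s)] for s in priors) for r in ("E", "D")}

# Posterior probabilities P(state | report) = joint / marginal
posterior = {(s, r): joint[(r, s)] / marginal[r] for (r, s) in joint}

for (s, r), p in sorted(posterior.items()):
    print(f"P({s}|{r}) = {p:.4f}")
```

Running this prints the four posterior probabilities, matching the worked figures on the previous slide (e.g., P(S|E) ≈ 0.6207 and P(S|D) ≈ 0.3186).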

13 Appendix: Bayes' Theorem
Bayes' theorem is a result in probability theory that gives the conditional probability distribution of a random variable A given B in terms of the conditional probability distribution of variable B given A and the marginal probability distribution of A alone. In the context of Bayesian probability theory and statistical inference, the marginal probability distribution of A alone is usually called the prior probability distribution, or simply the prior. The conditional distribution of A given the "data" B is called the posterior probability distribution, or just the posterior.
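In symbols, with A the quantity of interest and B the data:
P(A|B) = P(B|A) * P(A) / P(B)
where P(A) is the prior and P(A|B) is the posterior.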

