Phil 148: Probability
The importance of understanding chances: A great many injustices are perpetrated upon people who have a poor understanding of mathematical truths by those who do understand them. People have a financial incentive to fool you out of your money, and since money comes in numerical increments, a lack of skill with numbers inevitably translates into a lack of skill with money.
Common fallacies of probability: The Gambler’s Fallacy – assuming that the odds of a single truly random event are affected in any way by previous iterations of the same (or any other) truly random event. Ignoring the Law of Large Numbers – assuming there must be some explanation other than chance for very improbable events. In many cases there may be explanations other than luck, but remember that it might just be like the truckload of pennies mentioned in the text.
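A quick simulation (an illustrative sketch, not part of the text) makes the Gambler’s Fallacy vivid: in a long run of fair coin flips, the flips that come immediately after a tail land heads at the same roughly 50% rate as any other flip.

```python
import random

random.seed(1)

# Simulate fair coin flips (True = heads, False = tails).
flips = [random.random() < 0.5 for _ in range(100_000)]

# Collect only the outcomes that immediately follow a tail.
after_tail = [flips[i] for i in range(1, len(flips)) if not flips[i - 1]]

# The heads rate after a tail stays near 0.5: prior flips do not
# "owe" the gambler a heads.
rate = sum(after_tail) / len(after_tail)
print(round(rate, 2))
```

The same check works for flips following a run of several tails; the rate stays near one half, which is exactly what the fallacy denies.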
Heuristics (the common language of probability) People, in order to quickly make sense of a vast array of information, rely on simple decisional gimmicks known in the research as heuristics. These heuristics are generally useful, but prone to break down in systematic ways. The tendency of a heuristic to supply the wrong answer is referred to as a bias. Some examples follow:
Representativeness Which is the more likely hand in poker? Hand A: (A, K, Q, J, 10) Hand B: (3, J, 2, 7, 9)
Representativeness bias Both hands are actually equally likely, though most people say that Hand B is more likely. People quickly recognize that Hand A represents a winning hand, which is a much less common kind of outcome than the nothing-hand that Hand B represents. The heuristic that picks out success-indicating patterns has thus contributed to a misjudgment of probability, and so counts as a bias.
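The point can be checked by counting (a sketch, not from the text): any *specific* five-card hand is one outcome among the equally likely ways of dealing five cards from a 52-card deck, so Hands A and B have exactly the same probability.

```python
from math import comb

# Number of equally likely five-card hands from a standard 52-card deck.
total_hands = comb(52, 5)

# Each fully specified hand (cards and suits fixed) is one such outcome,
# so Hand A and Hand B share the same probability.
p_specific_hand = 1 / total_hands

print(total_hands)        # 2598960
```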
Other Examples: See the text, pp. 280-281, for discussion questions.
a priori probability: We generalize the probability of any random event in the following way: the probability of a hypothesis h, written Pr(h), = (number of outcomes that confirm the hypothesis) / (total number of possible outcomes).
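The definition translates directly into a small helper (an illustrative sketch; the function name is mine, not the text’s), using exact fractions so the ratio is not rounded away.

```python
from fractions import Fraction

def a_priori(confirming, total):
    """Pr(h) = confirming outcomes / total possible outcomes."""
    return Fraction(confirming, total)

# Example: a fair six-sided die shows an even number --
# three confirming outcomes (2, 4, 6) out of six possible.
print(a_priori(3, 6))  # 1/2
```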
a priori probability and other chance events: It is easy to misapply a priori probability; it is important to limit its use to events that are actually random. For one example, consider the case of a batter in a baseball game: – Only two outcomes are available in each at bat: an out, or the batter reaching base. This means that the batter has a 50% a priori chance of getting on base. – However, this a priori probability is of little interest, because the batter getting on base is not at heart a random event. It depends on many interdependent skills, actions, and mental states of both the batter and the pitcher. In real practice, batters fail to get on base much more than 50% of the time. – It is true that luck of some kinds has some influence in baseball, but a priori probabilities don’t tell much of a useful story.
Rules of Probability (1) 1. Negation: The probability that an event will not occur is 1 minus the probability that it will occur: Pr(not-h) = 1 – Pr(h)
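In code, the negation rule is a one-liner (a sketch with a name of my choosing, not the text’s notation):

```python
from fractions import Fraction

def pr_not(pr_h):
    """Negation rule: Pr(not-h) = 1 - Pr(h)."""
    return 1 - pr_h

# Example: the chance a fair die does NOT show a six.
print(pr_not(Fraction(1, 6)))  # 5/6
```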
Rules of Probability (2i) 2i. Conjunction with Independence: Given two independent events (events that do not affect each other’s a priori probabilities), the probability of their both occurring is the product of their individual probabilities: Pr(h1 & h2) = Pr(h1) * Pr(h2)
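A sketch of the independence rule (function name assumed, not from the text):

```python
from fractions import Fraction

def pr_and_independent(pr_h1, pr_h2):
    """Conjunction with independence: Pr(h1 & h2) = Pr(h1) * Pr(h2)."""
    return pr_h1 * pr_h2

# Example: two fair dice both show a six.
print(pr_and_independent(Fraction(1, 6), Fraction(1, 6)))  # 1/36
```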
Rules of Probability (2g) 2g. Conjunction (general, dependent): Given two events, the probability of their both occurring is the probability of the first occurring times the probability of the second occurring given that the first has occurred: Pr(h1 & h2) = Pr(h1) * Pr(h2|h1)
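The general rule differs only in that the second factor is a conditional probability. A sketch (the drawing-two-aces example is mine, not the text’s):

```python
from fractions import Fraction

def pr_and(pr_h1, pr_h2_given_h1):
    """General conjunction: Pr(h1 & h2) = Pr(h1) * Pr(h2 | h1)."""
    return pr_h1 * pr_h2_given_h1

# Example: drawing two aces in a row without replacement.
# Pr(first ace) = 4/52; Pr(second ace | first ace) = 3/51.
print(pr_and(Fraction(4, 52), Fraction(3, 51)))  # 1/221
```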
Rules of Probability (3e) 3e. Disjunction with Exclusivity: The probability that at least one of two mutually exclusive events will occur is the sum of the probabilities that each will occur: Pr(h1 or h2) = Pr(h1) + Pr(h2), where h1 and h2 are mutually exclusive
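A sketch of the exclusive case (function name assumed):

```python
from fractions import Fraction

def pr_or_exclusive(pr_h1, pr_h2):
    """Disjunction with exclusivity: Pr(h1 or h2) = Pr(h1) + Pr(h2)."""
    return pr_h1 + pr_h2

# Example: a single die shows a one or a two -- the outcomes
# cannot both occur, so their probabilities simply add.
print(pr_or_exclusive(Fraction(1, 6), Fraction(1, 6)))  # 1/3
```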
Rules of Probability (3g) 3g. Disjunction (general, inclusive): The probability that at least one of two events will occur is the sum of the probabilities that each of them will occur, minus the probability that they will both occur: Pr(h1 or h2) = [Pr(h1) + Pr(h2)] – Pr(h1 & h2)
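A sketch of the general rule; the subtraction keeps the overlap from being counted twice (the heart-or-ace example is mine, not the text’s):

```python
from fractions import Fraction

def pr_or(pr_h1, pr_h2, pr_both):
    """General disjunction: Pr(h1 or h2) = Pr(h1) + Pr(h2) - Pr(h1 & h2)."""
    return pr_h1 + pr_h2 - pr_both

# Example: a drawn card is a heart or an ace. The ace of hearts
# belongs to both events, so it is subtracted once.
print(pr_or(Fraction(13, 52), Fraction(4, 52), Fraction(1, 52)))  # 4/13
```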
Rules of Probability (4) 4. Series with Independence: The probability that an event will occur at least once in a series of n independent trials is 1 minus the probability that it will not occur in any of those trials: Pr(h at least once in n trials) = 1 – Pr(not-h)^n
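The series rule combines negation with repeated independent conjunction. A sketch (function name assumed):

```python
from fractions import Fraction

def pr_at_least_once(pr_h, n):
    """Series with independence: 1 - Pr(not-h)^n."""
    return 1 - (1 - pr_h) ** n

# Example: at least one six in four rolls of a fair die.
# 1 - (5/6)^4 = 671/1296, a bit better than even odds.
print(pr_at_least_once(Fraction(1, 6), 4))  # 671/1296
```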