2 Sample Space
The possible outcomes of a random experiment are called the basic outcomes, and the set of all basic outcomes is called the sample space. The symbol S will be used to denote the sample space.
3 Sample Space - An Example
What is the sample space for a roll of a single six-sided die?
S = [1, 2, 3, 4, 5, 6]
4 Mutually Exclusive
If the events A and B have no common basic outcomes, they are mutually exclusive, and their intersection A ∩ B is said to be the empty set, indicating that A ∩ B cannot occur.
More generally, the K events E1, E2, . . ., EK are said to be mutually exclusive if every pair of them is a pair of mutually exclusive events.
5 Venn Diagrams
Venn diagrams are drawings, usually using geometric shapes, used to depict basic concepts in set theory and the outcomes of random experiments.
6 Intersection of Events A and B
[Venn diagrams: (a) A ∩ B is the striped area; (b) A and B are mutually exclusive, so A ∩ B is empty]
7 Collectively Exhaustive
Given the K events E1, E2, . . ., EK in the sample space S, if E1 ∪ E2 ∪ . . . ∪ EK = S, these events are said to be collectively exhaustive.
8 Complement
Let A be an event in the sample space S. The set of basic outcomes of a random experiment belonging to S but not to A is called the complement of A and is denoted by Ā.
10 Unions, Intersections, and Complements
A die is rolled. Let A be the event "Number rolled is even" and B be the event "Number rolled is at least 4." Then
A = [2, 4, 6] and B = [4, 5, 6]
so A ∩ B = [4, 6], A ∪ B = [2, 4, 5, 6], and Ā = [1, 3, 5].
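The die example above can be sketched with Python sets, which support intersection, union, and complement (as set difference) directly:

```python
# Die-roll example from the slide, using Python set operations (a sketch).
S = {1, 2, 3, 4, 5, 6}          # sample space
A = {2, 4, 6}                    # "number rolled is even"
B = {4, 5, 6}                    # "number rolled is at least 4"

print(A & B)                     # intersection A ∩ B -> {4, 6}
print(A | B)                     # union A ∪ B -> {2, 4, 5, 6}
print(S - A)                     # complement of A -> {1, 3, 5}
```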
11 Classical Probability
The classical definition of probability is the proportion of times that an event will occur, assuming that all outcomes in a sample space are equally likely to occur. The probability of an event is determined by counting the number of outcomes in the sample space that satisfy the event and dividing by the number of outcomes in the sample space.
12 Classical Probability
The probability of an event A is
P(A) = NA / N
where NA is the number of outcomes that satisfy the condition of event A and N is the total number of outcomes in the sample space. The important idea here is that one can develop a probability from fundamental reasoning about the process.
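A minimal sketch of the counting definition, applied to the die example from the earlier slide:

```python
# Classical probability P(A) = NA / N for the die example (a sketch).
S = {1, 2, 3, 4, 5, 6}          # N = 6 equally likely outcomes
A = {2, 4, 6}                    # "number rolled is even", NA = 3
p_A = len(A) / len(S)            # NA / N
print(p_A)                       # 0.5
```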
13 Combinations
The counting process can be generalized by using the following equation to compute the number of combinations of n things taken k at a time:
C(n, k) = n! / (k!(n - k)!)
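Python's standard library exposes this count directly as `math.comb`; the sketch below checks it against the factorial formula:

```python
# C(n, k) = n! / (k!(n-k)!), computed two ways (a sketch).
from math import comb, factorial

n, k = 6, 2
by_formula = factorial(n) // (factorial(k) * factorial(n - k))
assert comb(n, k) == by_formula
print(comb(n, k))                # 15 ways to choose 2 items from 6
```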
14 Relative Frequency
The relative frequency definition of probability is the limit of the proportion of times that an event A occurs in a large number of trials, n:
P(A) = nA / n
where nA is the number of A outcomes and n is the total number of trials or outcomes in the population. The probability is the limit as n becomes large.
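The limiting behavior can be illustrated by simulation: the proportion of sixes in many simulated die rolls settles near 1/6 (the seed below is an arbitrary choice for reproducibility):

```python
# Relative-frequency sketch: proportion of sixes in n simulated die rolls.
import random

random.seed(42)                  # arbitrary seed, for reproducibility
n = 100_000
n_A = sum(1 for _ in range(n) if random.randint(1, 6) == 6)
print(n_A / n)                   # close to 1/6 ≈ 0.1667 for large n
```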
15 Subjective Probability
The subjective definition of probability expresses an individual's degree of belief about the chance that an event will occur. These subjective probabilities are used in certain management decision procedures.
16 Probability Postulates
Let S denote the sample space of a random experiment, Oi the basic outcomes, and A an event. For each event A of the sample space S, we assume that a number P(A) is defined and we have the postulates:
1. If A is any event in the sample space S, then 0 ≤ P(A) ≤ 1.
2. Let A be an event in S, and let Oi denote the basic outcomes. Then P(A) = Σ P(Oi), where the summation extends over all the basic outcomes in A.
3. P(S) = 1.
17 Probability Rules
Let A be an event and Ā its complement. The complement rule is:
P(Ā) = 1 - P(A)
18 Probability Rules
The Addition Rule of Probabilities: Let A and B be two events. The probability of their union is
P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
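The addition rule can be verified numerically on the die events from the earlier slide (a sketch; the tolerance guards against floating-point rounding):

```python
# Addition rule check: P(A ∪ B) = P(A) + P(B) - P(A ∩ B) on the die example.
S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}                    # "even"
B = {4, 5, 6}                    # "at least 4"
p = lambda E: len(E) / len(S)    # classical probability

lhs = p(A | B)
rhs = p(A) + p(B) - p(A & B)
print(abs(lhs - rhs) < 1e-9)     # True
```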
19 Probability Rules
Conditional Probability: Let A and B be two events. The conditional probability of event A, given that event B has occurred, is denoted by the symbol P(A|B) and is found to be
P(A|B) = P(A ∩ B) / P(B)
provided that P(B) > 0.
20 Probability Rules
Conditional Probability: Let A and B be two events. The conditional probability of event B, given that event A has occurred, is denoted by the symbol P(B|A) and is found to be
P(B|A) = P(A ∩ B) / P(A)
provided that P(A) > 0.
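On the die example, conditioning on B = "at least 4" restricts attention to three outcomes, two of which are even; the sketch below computes P(A|B) from the definition:

```python
# Conditional probability P(A | B) = P(A ∩ B) / P(B) on the die example.
S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}                    # "even"
B = {4, 5, 6}                    # "at least 4"
p = lambda E: len(E) / len(S)    # classical probability

p_A_given_B = p(A & B) / p(B)    # (2/6) / (3/6)
print(p_A_given_B)               # 2/3: of the outcomes {4, 5, 6}, two are even
```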
21 Probability Rules
The Multiplication Rule of Probabilities: Let A and B be two events. The probability of their intersection can be derived from the conditional probability as
P(A ∩ B) = P(A|B) P(B)
Also,
P(A ∩ B) = P(B|A) P(A)
22 Statistical Independence
Let A and B be two events. These events are said to be statistically independent if and only if
P(A ∩ B) = P(A) P(B)
From the multiplication rule it also follows that
P(A|B) = P(A) (if P(B) > 0) and P(B|A) = P(B) (if P(A) > 0)
More generally, the events E1, E2, . . ., Ek are mutually statistically independent if and only if
P(E1 ∩ E2 ∩ . . . ∩ Ek) = P(E1) P(E2) . . . P(Ek)
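A quick numerical check of independence on the die: take A = "even" and a hypothetical second event C = "at most 2". Then P(A ∩ C) = P({2}) = 1/6 equals P(A) P(C) = (1/2)(1/3), so the pair is independent:

```python
# Independence check: P(A ∩ C) == P(A) · P(C) for two die events (a sketch).
S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}                    # "even"
C = {1, 2}                       # "at most 2" (hypothetical second event)
p = lambda E: len(E) / len(S)    # classical probability

independent = abs(p(A & C) - p(A) * p(C)) < 1e-9
print(independent)               # True: A and C are statistically independent
```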
24 Joint and Marginal Probabilities
In the context of bivariate probabilities, the intersection probabilities P(Ai ∩ Bj) are called joint probabilities. The probabilities for individual events P(Ai) and P(Bj) are called marginal probabilities. Marginal probabilities are at the margin of a bivariate table and can be computed by summing the corresponding row or column.
25 Probabilities for the Television Viewing and Income Example

Viewing Frequency   High Income   Middle Income   Low Income   Totals
Regular             0.04          0.13            0.04         0.21
Occasional          0.10          0.11            0.06         0.27
Never               0.13          0.17            0.22         0.52
Totals              0.27          0.41            0.32         1.00
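Computing marginals from the joint table above can be sketched with a dictionary keyed by (viewing, income); summing over the other attribute recovers the row and column totals:

```python
# Joint probabilities from the viewing/income table; marginals by summation.
joint = {
    ("Regular",    "High"): 0.04, ("Regular",    "Middle"): 0.13, ("Regular",    "Low"): 0.04,
    ("Occasional", "High"): 0.10, ("Occasional", "Middle"): 0.11, ("Occasional", "Low"): 0.06,
    ("Never",      "High"): 0.13, ("Never",      "Middle"): 0.17, ("Never",      "Low"): 0.22,
}

# Marginal probabilities: sum the joint probabilities across the other attribute.
p_regular = sum(p for (view, inc), p in joint.items() if view == "Regular")
p_high    = sum(p for (view, inc), p in joint.items() if inc == "High")
print(round(p_regular, 2))       # 0.21, the "Regular" row total
print(round(p_high, 2))          # 0.27, the "High Income" column total
```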
27 Probability Rules
Rule for Determining the Independence of Attributes: Let A and B be a pair of attributes, each broken into mutually exclusive and collectively exhaustive event categories denoted by labels A1, A2, . . ., Ah and B1, B2, . . ., Bk. If every event Ai is statistically independent of every event Bj, then the attributes A and B are independent.
28 Odds Ratio
The odds in favor of a particular event are given by the ratio of the probability of the event divided by the probability of its complement. The odds in favor of A are
odds = P(A) / P(Ā) = P(A) / (1 - P(A))
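A one-line sketch of the odds formula, using the die event "number rolled is even" with P(A) = 0.5:

```python
# Odds in favor of A: P(A) / (1 - P(A)) (sketch; die example from the text).
p_A = 0.5                        # P("number rolled is even")
odds_A = p_A / (1 - p_A)
print(odds_A)                    # 1.0, i.e. odds of 1 to 1
```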
29 Overinvolvement Ratio
The probability of event A1 conditional on event B1, divided by the probability of A1 conditional on event B2, is defined as the overinvolvement ratio:
P(A1|B1) / P(A1|B2)
An overinvolvement ratio greater than 1 implies that event A1 increases the conditional odds ratio in favor of B1:
P(A1|B1) / P(A1|B2) > 1 implies P(B1|A1) / P(B2|A1) > P(B1) / P(B2)
30 Bayes' Theorem
Let A and B be two events. Then Bayes' theorem states that
P(B|A) = P(A|B) P(B) / P(A)
and
P(A|B) = P(B|A) P(A) / P(B)
31 Bayes' Theorem (Alternative Statement)
Let E1, E2, . . ., Ek be mutually exclusive and collectively exhaustive events and let A be some other event. The conditional probability of Ei given A can be expressed as Bayes' theorem:
P(Ei|A) = P(A|Ei) P(Ei) / (P(A|E1) P(E1) + P(A|E2) P(E2) + . . . + P(A|Ek) P(Ek))
32 Bayes' Theorem - Solution Steps
1. Define the subset events from the problem.
2. Define the probabilities for the events defined in step 1.
3. Compute the complements of the probabilities.
4. Apply Bayes' theorem to compute the probability for the problem solution.
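The four steps above can be sketched on a hypothetical diagnostic-test problem (all numbers below are assumptions chosen for illustration: 95% sensitivity, 90% specificity, 2% prevalence):

```python
# Bayes' theorem following the four solution steps (hypothetical numbers).
# Step 1: define the events - D = "has condition", A = "test is positive".
# Step 2: define the probabilities for those events (all assumed values).
p_D = 0.02                       # P(D), prevalence
p_A_given_D = 0.95               # P(A | D), sensitivity
p_A_given_notD = 0.10            # P(A | not D) = 1 - specificity
# Step 3: compute the complement of the prior.
p_notD = 1 - p_D
# Step 4: apply Bayes' theorem, expanding P(A) over D and not-D.
p_A = p_A_given_D * p_D + p_A_given_notD * p_notD
p_D_given_A = p_A_given_D * p_D / p_A
print(round(p_D_given_A, 3))     # 0.162: a positive test raises P(D) from 2% to ~16%
```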