
Chapter 4 Probability.

Sample Space The possible outcomes of a random experiment are called the basic outcomes, and the set of all basic outcomes is called the sample space. The symbol S will be used to denote the sample space.

Sample Space - An Example -
What is the sample space for a roll of a single six-sided die? S = [1, 2, 3, 4, 5, 6]

Mutually Exclusive If the events A and B have no common basic outcomes, they are mutually exclusive and their intersection A ∩ B is said to be the empty set, indicating that A ∩ B cannot occur. More generally, the K events E1, E2, . . ., EK are said to be mutually exclusive if every pair of them is a pair of mutually exclusive events.

Venn Diagrams Venn Diagrams are drawings, usually using geometric shapes, used to depict basic concepts in set theory and the outcomes of random experiments.

Intersection of Events A and B
AB B A B (a) AB is the striped area (b) A and B are Mutually Exclusive

Collectively Exhaustive
Given the K events E1, E2, . . ., EK in the sample space S, if E1 ∪ E2 ∪ . . . ∪ EK = S, these events are said to be collectively exhaustive.

Complement Let A be an event in the sample space S. The set of basic outcomes of a random experiment belonging to S but not to A is called the complement of A and is denoted by Ā.

Venn Diagram for the Complement of Event A

Unions, Intersections, and Complements
A die is rolled. Let A be the event “Number rolled is even” and B be the event “Number rolled is at least 4.” Then A = [2, 4, 6] and B = [4, 5, 6]
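The example above can be sketched with Python sets (the variable names are mine):

```python
# Die-roll events as Python sets.
S = {1, 2, 3, 4, 5, 6}   # sample space
A = {2, 4, 6}            # "number rolled is even"
B = {4, 5, 6}            # "number rolled is at least 4"

print(A | B)             # union A ∪ B: {2, 4, 5, 6}
print(A & B)             # intersection A ∩ B: {4, 6}
print(S - A)             # complement of A: {1, 3, 5}
```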

Classical Probability
The classical definition of probability is the proportion of times that an event will occur, assuming that all outcomes in a sample space are equally likely to occur. The probability of an event is determined by counting the number of outcomes in the sample space that satisfy the event and dividing by the number of outcomes in the sample space.

Classical Probability
The probability of an event A is P(A) = NA / N, where NA is the number of outcomes that satisfy the condition of event A and N is the total number of outcomes in the sample space. The important idea here is that one can develop a probability from fundamental reasoning about the process.
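A minimal sketch of the counting definition, using the die example:

```python
# Classical probability P(A) = N_A / N on an equally likely sample space.
S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}            # event: "even number"

P_A = len(A) / len(S)    # N_A / N
print(P_A)               # 0.5
```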

Combinations The counting process can be generalized by using the following equation to compute the number of combinations of n things taken k at a time: C(n, k) = n! / (k!(n − k)!).
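In Python the formula can be checked directly against the standard library:

```python
import math

# Combinations of n things taken k at a time: n! / (k! (n - k)!).
n, k = 6, 2
print(math.comb(n, k))                                                    # 15
print(math.factorial(n) // (math.factorial(k) * math.factorial(n - k)))  # 15
```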

Relative Frequency The relative frequency definition of probability is the limit of the proportion of times that an event A occurs in a large number of trials, n: P(A) = nA / n as n becomes large, where nA is the number of A outcomes and n is the total number of trials or outcomes in the population.
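A small simulation sketch of the relative-frequency idea (the seed and trial count are arbitrary choices of mine): the proportion of even rolls approaches the classical value 0.5 as n grows.

```python
import random

# Relative-frequency estimate of P(even) from simulated die rolls.
random.seed(42)          # fixed seed so the run is reproducible
n = 100_000
n_A = sum(1 for _ in range(n) if random.randint(1, 6) % 2 == 0)
print(n_A / n)           # close to the classical value 0.5
```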

Subjective Probability
The subjective definition of probability expresses an individual’s degree of belief about the chance that an event will occur. These subjective probabilities are used in certain management decision procedures.

Probability Postulates
Let S denote the sample space of a random experiment, Oi the basic outcomes, and A an event. For each event A of the sample space S, we assume that a number P(A) is defined and we have the postulates:
1. If A is any event in the sample space S, then 0 ≤ P(A) ≤ 1.
2. Let A be an event in S, and let Oi denote the basic outcomes. Then P(A) = Σ P(Oi), where the summation extends over all the basic outcomes in A.
3. P(S) = 1.

Probability Rules The Complement Rule:
Let A be an event and Ā its complement. Then P(Ā) = 1 − P(A).

Probability Rules The Addition Rule of Probabilities:
Let A and B be two events. The probability of their union is P(A ∪ B) = P(A) + P(B) − P(A ∩ B).

Probability Rules Conditional Probability:
Let A and B be two events. The conditional probability of event A, given that event B has occurred, is denoted by the symbol P(A|B) and is found to be P(A|B) = P(A ∩ B) / P(B), provided that P(B) > 0.

Probability Rules Conditional Probability:
Let A and B be two events. The conditional probability of event B, given that event A has occurred, is denoted by the symbol P(B|A) and is found to be P(B|A) = P(A ∩ B) / P(A), provided that P(A) > 0.

Probability Rules The Multiplication Rule of Probabilities:
Let A and B be two events. The probability of their intersection can be derived from the conditional probability as P(A ∩ B) = P(A|B)P(B). Also, P(A ∩ B) = P(B|A)P(A).
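The conditional-probability and multiplication rules can be verified on the earlier die example (event names are mine):

```python
# Conditional probability and the multiplication rule on the die example.
S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}            # "even"
B = {4, 5, 6}            # "at least 4"

def P(E):
    return len(E) / len(S)

P_A_given_B = P(A & B) / P(B)    # P(A|B) = P(A ∩ B) / P(B)
print(P_A_given_B)               # 2/3

# Multiplication rule: P(A ∩ B) = P(A|B) P(B)
print(P_A_given_B * P(B))        # equals P(A ∩ B) = 1/3
```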

Statistical Independence
Let A and B be two events. These events are said to be statistically independent if and only if P(A ∩ B) = P(A)P(B). From the multiplication rule it also follows that P(A|B) = P(A) (if P(B) > 0) and P(B|A) = P(B) (if P(A) > 0). More generally, the events E1, E2, . . ., Ek are mutually statistically independent if and only if P(E1 ∩ E2 ∩ . . . ∩ Ek) = P(E1)P(E2) · · · P(Ek).
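A quick numerical check of the independence condition on the die sample space (the event choices are mine): "even" is independent of "at most 4" but not of "at least 4".

```python
from math import isclose

# Checking statistical independence: does P(A ∩ B) equal P(A) P(B)?
S = {1, 2, 3, 4, 5, 6}

def P(E):
    return len(E) / len(S)

A = {2, 4, 6}        # "even"
B = {1, 2, 3, 4}     # "at most 4"
print(isclose(P(A & B), P(A) * P(B)))   # True: 2/6 equals (3/6)(4/6)

C = {4, 5, 6}        # "at least 4"
print(isclose(P(A & C), P(A) * P(C)))   # False: 2/6 is not (3/6)(3/6)
```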

Bivariate Probabilities
        B1            B2            . . .   Bk
A1      P(A1 ∩ B1)    P(A1 ∩ B2)    . . .   P(A1 ∩ Bk)
A2      P(A2 ∩ B1)    P(A2 ∩ B2)    . . .   P(A2 ∩ Bk)
.       .             .                     .
Ah      P(Ah ∩ B1)    P(Ah ∩ B2)    . . .   P(Ah ∩ Bk)
Figure 4.1 Outcomes for Bivariate Events

Joint and Marginal Probabilities
In the context of bivariate probabilities, the intersection probabilities P(Ai ∩ Bj) are called joint probabilities. The probabilities for individual events P(Ai) and P(Bj) are called marginal probabilities. Marginal probabilities are at the margin of a bivariate table and can be computed by summing the corresponding row or column.

Probabilities for the Television Viewing and Income Example
Viewing Frequency    High Income    Middle Income    Low Income    Totals
Regular              0.04           0.13             0.04          0.21
Occasional           0.10           0.11             0.06          0.27
Never                0.13           0.17             0.22          0.52
Totals               0.27           0.41             0.32          1.00
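The marginal probabilities in the table can be recovered by summing the joint probabilities along a row or column; here is a sketch using a plain dict keyed by (viewing, income):

```python
# Joint probabilities from the television viewing and income example.
joint = {
    ("Regular",    "High"): 0.04, ("Regular",    "Middle"): 0.13, ("Regular",    "Low"): 0.04,
    ("Occasional", "High"): 0.10, ("Occasional", "Middle"): 0.11, ("Occasional", "Low"): 0.06,
    ("Never",      "High"): 0.13, ("Never",      "Middle"): 0.17, ("Never",      "Low"): 0.22,
}

# Marginals: sum the row (viewing) or the column (income).
P_regular = sum(p for (view, _), p in joint.items() if view == "Regular")
P_high    = sum(p for (_, inc), p in joint.items() if inc == "High")
print(round(P_regular, 2))              # 0.21
print(round(P_high, 2))                 # 0.27
print(round(sum(joint.values()), 2))    # 1.0 — the joint probabilities sum to 1
```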

Tree Diagrams P(S) = 1 branches into the marginal probabilities P(A1) = .21, P(A2) = .27, P(A3) = .52, which branch in turn into the joint probabilities:
P(A1 ∩ B1) = .04, P(A1 ∩ B2) = .13, P(A1 ∩ B3) = .04
P(A2 ∩ B1) = .10, P(A2 ∩ B2) = .11, P(A2 ∩ B3) = .06
P(A3 ∩ B1) = .13, P(A3 ∩ B2) = .17, P(A3 ∩ B3) = .22

Probability Rules Rule for Determining the Independence of Attributes
Let A and B be a pair of attributes, each broken into mutually exclusive and collectively exhaustive event categories denoted by labels A1, A2, . . ., Ah and B1, B2, . . ., Bk. If every Ai is statistically independent of every event Bj, then the attributes A and B are independent.

Odds Ratio The odds in favor of a particular event are given by the ratio of the probability of the event divided by the probability of its complement. The odds in favor of A are P(A) / P(Ā) = P(A) / (1 − P(A)).
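A one-line sketch of the odds computation (the probability value is hypothetical):

```python
# Odds in favor of A: P(A) / (1 - P(A)).
P_A = 0.75                  # hypothetical probability of A
odds = P_A / (1 - P_A)
print(odds)                 # 3.0, i.e. odds of 3 to 1 in favor of A
```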

Overinvolvement Ratio
The probability of event A1 conditional on event B1, divided by the probability of A1 conditional on event B2, is defined as the overinvolvement ratio: P(A1|B1) / P(A1|B2). An overinvolvement ratio greater than 1 implies that event A1 increases the conditional odds in favor of B1: P(B1|A1) / P(B2|A1) > P(B1) / P(B2).
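A minimal sketch of the ratio with hypothetical conditional probabilities:

```python
# Overinvolvement ratio P(A1|B1) / P(A1|B2), hypothetical values.
P_A1_given_B1 = 0.60
P_A1_given_B2 = 0.30
ratio = P_A1_given_B1 / P_A1_given_B2
print(ratio)    # 2.0 — greater than 1, so A1 raises the odds in favor of B1
```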

Bayes' Theorem Let A and B be two events. Then Bayes' Theorem states that: P(B|A) = P(A|B)P(B) / P(A) and P(A|B) = P(B|A)P(A) / P(B).

Bayes’ Theorem (Alternative Statement)
Let E1, E2, . . ., Ek be mutually exclusive and collectively exhaustive events and let A be some other event. The conditional probability of Ei given A can be expressed as Bayes' Theorem: P(Ei|A) = P(A|Ei)P(Ei) / [P(A|E1)P(E1) + P(A|E2)P(E2) + . . . + P(A|Ek)P(Ek)].

Bayes’ Theorem - Solution Steps -
1. Define the subset events from the problem.
2. Define the probabilities for the events defined in step 1.
3. Compute the complements of the probabilities.
4. Apply Bayes' theorem to compute the probability for the problem solution.
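The steps above can be sketched on a hypothetical screening-test problem (all probability values are assumptions of mine, not from the text): let E be "condition present" and A be "test positive".

```python
# Step 1: events E ("condition present") and A ("test positive").
# Step 2: assumed probabilities.
P_E      = 0.01      # prior P(E)
P_A_E    = 0.95      # P(A | E), test sensitivity
P_A_notE = 0.05      # P(A | not E), false-positive rate

# Step 3: complement.
P_notE = 1 - P_E

# Step 4: Bayes' theorem, with P(A) from the total probability rule.
P_A = P_A_E * P_E + P_A_notE * P_notE
P_E_A = P_A_E * P_E / P_A
print(round(P_E_A, 3))    # 0.161 — a positive test leaves P(E|A) at about 16%
```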
