Published by Trevor Doyle. Modified over 4 years ago.
Chapter 4 Probability and Probability Distributions

Now that you have learned to describe a data set, how can you use sample data to draw conclusions about the sampled populations? The technique involves a statistical tool called probability. To use this tool correctly, you must first understand how it works. The first part of this chapter will teach you the new language of probability, presenting the basic concepts with simple examples. ©1998 Brooks/Cole Publishing/ITP
The variables that we measured in Chapters 1 and 2 can now be redefined as random variables, whose values depend on the chance selection of the elements in the sample. Using probability as a tool, you can develop probability distributions that serve as models for discrete random variables, and you can describe these random variables using a mean and standard deviation similar to those in Chapter 2.
Specific Topics
1. Experiments and events
2. Relative frequency definition of probability
3. Counting rules (combinations)
4. Intersections, unions, and complements
5. Conditional probability and independence
6. Additive and Multiplicative Rules of Probability
7.
8. Random variables
9. Probability distributions for discrete random variables
10. The mean and standard deviation for a discrete random variable
4.1 and 4.2 The Role of Probability in Statistics; Events and the Sample Space

When a population is known, probability is used to describe the likelihood of observing a particular sample outcome, e.g., a 50% or .5 chance of getting a head or a tail in a fair toss of a coin. When the population is unknown and only a sample from that population is available, probability is used in making statements about the makeup of the population, that is, in making statistical inferences. We use the term experiment to describe either uncontrolled events in nature or controlled situations in a laboratory.
Definition: An experiment is the process by which an observation (or measurement) is obtained.
Definition: An event is an outcome of an experiment. See Example 4.1 for a listing of some events.
Definition: Two events are mutually exclusive if, when one event occurs, the other cannot, and vice versa, e.g., a head or a tail in a toss of a coin.
Definition: An event that cannot be decomposed is called a simple event, e.g., a head or a tail in the toss of a coin.
Definition: The set of all simple events is called the sample space, e.g., a head and a tail in the toss of a coin.
Definition: An event is a collection of one or more simple events, e.g., the toss of two heads in a row.
Example 4.1 Experiment: Toss a die and observe the number that appears on the upper face.
Solution
Event A: Observe an odd number
Event B: Observe a number less than 4
Event E1: Observe a 1
Event E2: Observe a 2
Event E3: Observe a 3
Event E4: Observe a 4
Event E5: Observe a 5
Event E6: Observe a 6
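The die-toss events above can be enumerated directly. A minimal sketch in Python, with sets standing in for events and equally likely simple events:

```python
from fractions import Fraction

# Sample space for one toss of a fair die: six equally likely simple events.
sample_space = {1, 2, 3, 4, 5, 6}

# Event A: observe an odd number; Event B: observe a number less than 4.
A = {x for x in sample_space if x % 2 == 1}
B = {x for x in sample_space if x < 4}

def prob(event):
    """P(event) = (# simple events in the event) / (# simple events in S)."""
    return Fraction(len(event), len(sample_space))

print(prob(A))  # 1/2
print(prob(B))  # 1/2
```

Sets make the later event relations (intersection, union, complement) one-liners: `A & B`, `A | B`, and `sample_space - A`.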
Figure 4.1 Venn diagram for die tossing
Venn diagram: The outer box represents the sample space, which contains all of the simple events; the inner circles represent events and contain simple events. See Figure 4.1 for a Venn diagram for die tossing. Also see Examples 4.2 and 4.3 for examples of experiments.
Tree diagram: For displaying the sample space of an experiment, each successive level of branching in the tree corresponds to a step required to generate the final outcome. See Example 4.4 for an example of a tree diagram.
Figure 4.2 Tree diagram for Example 4.4
4.3 Calculating Probabilities Using Simple Events

Relative frequency = frequency / n
Requirements for simple-event probabilities:
- Each probability must lie between 0 and 1.
- The sum of the probabilities for all simple events in S equals 1.
Definition: The probability of an event A is equal to the sum of the probabilities of the simple events contained in A.
Example 4.5 Toss two fair coins and record the outcome. Find the probability of observing exactly one head in the two tosses.
Solution The sample space contains four equally likely simple events: HH, HT, TH, and TT, each with probability 1/4. Exactly one head occurs in HT and TH, so
P(exactly one head) = 1/4 + 1/4 = 1/2
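Example 4.5 can be verified by listing the simple events programmatically. A small sketch (not from the original slides) that enumerates the two-coin sample space:

```python
from fractions import Fraction
from itertools import product

# All simple events for tossing two fair coins: HH, HT, TH, TT.
sample_space = list(product("HT", repeat=2))

# Each simple event is equally likely, so P(event) = favorable / total.
one_head = [s for s in sample_space if s.count("H") == 1]
p = Fraction(len(one_head), len(sample_space))
print(p)  # 1/2
```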
Calculating the probability of an event:
1. List all simple events in the sample space.
2. Assign an appropriate probability to each simple event.
3. Determine which simple events result in the event of interest.
4. Sum the probabilities of the simple events that result in the event of interest.
Be careful to satisfy two conditions in your calculation:
- Include all simple events in the sample space.
- Assign realistic probabilities to the simple events.
Counting Rule for Combinations: The number of distinct combinations of n objects taken r at a time is
C(n, r) = n! / [r!(n − r)!]
See Examples 4.14 and 4.15 for examples of the counting rules, including the use of counting rules to solve a probability problem.
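As a sketch of the counting rule, Python's `math.comb` computes C(n, r) directly. The two-ace draw below is a hypothetical illustration of using a counting rule in a probability calculation, not one of the book's numbered examples:

```python
from fractions import Fraction
from math import comb

# C(n, r) = n! / (r! (n - r)!): ways to choose r items from n, order ignored.
print(comb(5, 2))  # 10

# Hypothetical illustration: probability that a 2-card hand dealt from a
# standard 52-card deck contains both cards as aces.
p_two_aces = Fraction(comb(4, 2), comb(52, 2))
print(p_two_aces)  # 1/221
```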
4.5 Event Composition and Event Relations

Compound events can be formed by unions or intersections of other events.
Definition: The intersection of events A and B, denoted by A ∩ B, is the event that both A and B occur.
Definition: The union of events A and B, denoted by A ∪ B, is the event that A or B or both occur.
See Figures 4.7 and 4.8 for Venn diagrams illustrating union and intersection.
Figure 4.7 Venn diagram of A ∩ B
Example 4.16 illustrates the use of a Venn diagram to determine probabilities.
Definition: When two events A and B are mutually exclusive, it means that when A occurs, B cannot, and vice versa. Mutually exclusive events are also referred to as disjoint events.
Figure 4.9 Two disjoint events
When events A and B are mutually exclusive:
P(A ∩ B) = 0 and P(A ∪ B) = P(A) + P(B)
If P(A) and P(B) are known, we do not need to break A ∪ B down into simple events; we can simply sum P(A) and P(B). See Example 4.17.
Definition: The complement of an event A, denoted A^C, consists of all the simple events in the sample space S that are not in A.
Figure The complement of an event
4.6 Conditional Probability and Independence

The conditional probability of A, given that B has occurred, is denoted as P(A | B), where the vertical bar is read "given" and the events appearing to the right of the bar are those that you know have occurred.
Definition: The conditional probability of B, given that A has occurred, is
P(B | A) = P(A ∩ B) / P(A)
The conditional probability of A, given that B has occurred, is
P(A | B) = P(A ∩ B) / P(B)
Definition: Two events A and B are said to be independent if and only if either
P(A | B) = P(A) or P(B | A) = P(B);
otherwise, the events are said to be dependent. Two events are independent if the occurrence or nonoccurrence of one of the events does not change the probability of the occurrence of the other event.
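The independence check can be sketched with the die-toss events from Example 4.1, computing P(A | B) = P(A ∩ B) / P(B) and comparing it with P(A):

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
A = {1, 3, 5}   # Event A: observe an odd number
B = {1, 2, 3}   # Event B: observe a number less than 4

def P(event):
    return Fraction(len(event), len(S))

# Conditional probability: P(A | B) = P(A ∩ B) / P(B)
p_A_given_B = P(A & B) / P(B)
print(p_A_given_B)           # 2/3
print(p_A_given_B == P(A))   # False: knowing B changes P(A), so A and B are dependent
```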
Additive Rule of Probability: Given two events, A and B, the probability of their union, A ∪ B, is equal to
P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
See Figure 4.12 for a representation of the Additive Rule.
Figure 4.12
Multiplicative Rule of Probability: The probability that both of the two events, A and B, occur is
P(A ∩ B) = P(A) P(B | A) = P(B) P(A | B)
If A and B are independent,
P(A ∩ B) = P(A) P(B)
Similarly, if A, B, and C are mutually independent events, then the probability that A, B, and C all occur is
P(A ∩ B ∩ C) = P(A) P(B) P(C)
See Example 4.21 for an example of the Multiplicative Rule of probability.
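Both rules can be checked numerically by brute-force enumeration. A sketch using two independent tosses of a fair die (an illustrative setup, not a numbered example from the text):

```python
from fractions import Fraction
from itertools import product

# Two tosses of a fair die; A = first toss shows 6, B = second toss shows 6.
S = list(product(range(1, 7), repeat=2))
A = [s for s in S if s[0] == 6]
B = [s for s in S if s[1] == 6]
both = [s for s in S if s[0] == 6 and s[1] == 6]
either = [s for s in S if s[0] == 6 or s[1] == 6]

def P(event):
    return Fraction(len(event), len(S))

# Multiplicative Rule for independent events: P(A ∩ B) = P(A) P(B)
print(P(both) == P(A) * P(B))              # True: 1/36 = (1/6)(1/6)
# Additive Rule: P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
print(P(either) == P(A) + P(B) - P(both))  # True: 11/36
```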
Example 4.24 Consider the experiment in which three coins are tossed. Let A be the event that the toss results in at least one head. Find P(A).
Solution A^C is the collection of simple events implying the event "three tails," and because A^C is the complement of A,
P(A) = 1 − P(A^C)
Then, applying the Multiplicative Rule to the three independent tosses, we have
P(A^C) = P(T) P(T) P(T) = (1/2)³ = 1/8
and
P(A) = 1 − 1/8 = 7/8
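The complement shortcut in Example 4.24 can be confirmed by enumerating all eight simple events:

```python
from fractions import Fraction
from itertools import product

# Toss three fair coins; A = at least one head, A^C = {TTT}.
S = list(product("HT", repeat=3))
at_least_one_head = [s for s in S if "H" in s]

p_complement = Fraction(1, 8)          # P(A^C) = (1/2)^3 by the Multiplicative Rule
p_A = Fraction(len(at_least_one_head), len(S))
print(p_A)                             # 7/8
print(p_A == 1 - p_complement)         # True
```

Counting the complement (one outcome) is far easier than counting the seven outcomes in A directly, which is the point of the complement rule.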
4.8 Discrete Random Variables and Their Probability Distributions

Definition: A variable x is a random variable if the value that it assumes, corresponding to the outcome of an experiment, is a chance or random event.
Definition: The probability distribution for a discrete random variable is a formula, table, or graph that provides p(x), the probability associated with each of the values of x.
Requirements for a Discrete Probability Distribution:
- 0 ≤ p(x) ≤ 1 for each value of x.
- Σ p(x) = 1, where the summation is over all values of x.
See Example 4.25 for an example involving probability distributions.
Definition: Let x be a discrete random variable with probability distribution p(x). The mean or expected value of x is given as
μ = E(x) = Σ x p(x)
where the elements are summed over all values of the random variable x.
Definition: Let x be a discrete random variable with probability distribution p(x) and mean μ. The variance of x is
σ² = E[(x − μ)²] = Σ (x − μ)² p(x)
where the summation is over all values of the random variable x.
Definition: The standard deviation σ of a random variable x is equal to the square root of its variance.
See Examples 4.26, 4.27, and 4.28 for examples of the calculation of the mean, variance, and standard deviation.
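The definitions of μ, σ², and σ translate directly into code. A sketch over a hypothetical distribution (the values below are illustrative, not taken from Examples 4.26–4.28):

```python
from math import sqrt

# Hypothetical discrete distribution: each value x mapped to its probability p(x).
dist = {0: 0.25, 1: 0.50, 2: 0.25}

mu = sum(x * p for x, p in dist.items())               # mean: E(x) = Σ x p(x)
var = sum((x - mu) ** 2 * p for x, p in dist.items())  # variance: Σ (x − μ)² p(x)
sigma = sqrt(var)                                      # standard deviation

print(mu)     # 1.0
print(var)    # 0.5
print(sigma)  # 0.7071067811865476
```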