Review of Statistics I: Probability and Probability Distributions (Chapter Two). Copyright © 2006 The McGraw-Hill Companies, Inc. All rights reserved. McGraw-Hill/Irwin.

2-2 Notation and Definitions Summation Sign Σ The sum of the variable X from the first value, i = 1, to the last value, i = n, where n is the number of values being summed.
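Written out (a standard rendering of the summation notation, not shown on the slide):
\[ \sum_{i=1}^{n} X_i = X_1 + X_2 + \cdots + X_n \]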

2-3 Definitions Statistical or Random Experiment: any process of observation or measurement that has more than one possible outcome and for which there is uncertainty about the outcome. Examples: tossing a coin, drawing cards from a deck. Sample Space or Population: the set of all possible outcomes of an experiment. What is the sample space for tossing two fair coins? Sample Point: each individual outcome or member of the sample space.

2-4 Definitions Event: a particular collection of outcomes, that is, a subset of the sample space. Events are mutually exclusive if the occurrence of one prevents the occurrence of another; equally likely if one is just as likely to occur as another; and collectively exhaustive if together they exhaust all possible outcomes of an experiment.

2-5 Example Suppose the MTSU baseball team is playing a doubleheader. In terms of winning or losing each game, there are four possible outcomes for MTSU: WW, WL, LW, LL. Note that each outcome is an event. What is the sample space? Are the events mutually exclusive? Equally likely? Collectively exhaustive?

2-6 Figure 2-1 Venn diagram.

2-7 Venn Diagram The rectangle is the sample space; the circles represent events. Fig. 2-1(a): outcomes that belong to A and to its complement A’. Fig. 2-1(b): the union of A and B, A U B. Fig. 2-1(c): the intersection of A and B, A ∩ B. Fig. 2-1(d): A and B are mutually exclusive. How can you represent the number of MTSU baseball wins?

2-8 Random Variables A random variable is a variable whose numerical value is determined by the outcome of an experiment. Example: toss two coins and observe the number of heads. The possible outcomes and the corresponding values are TT → 0, TH → 1, HT → 1, HH → 2. The number of heads is a random variable (as is MTSU’s number of wins). These are discrete random variables, which take a countable number of values. Continuous r.v.’s can take any value within a range (such as the height or weight of students in this class).
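A minimal Python sketch (not part of the slides; the names are illustrative) showing the random variable as a rule that assigns a number to each outcome:

outcomes = ["TT", "TH", "HT", "HH"]                 # sample space for two coin tosses
num_heads = {s: s.count("H") for s in outcomes}     # X = number of heads for each outcome
print(num_heads)                                    # TT -> 0, TH -> 1, HT -> 1, HH -> 2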

2-9 Classical Probability Classical or A Priori definition: if an experiment can result in n mutually exclusive and equally likely outcomes, and if m outcomes are favorable to event A, then the probability of A, P(A), is m/n: P(A) = (number of outcomes favorable to A) / (total number of outcomes). Examples: coin toss, dice roll, card draw.
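A quick worked instance of the card-draw example (an illustration; the specific event is not on the slide): for A = “draw an ace” from a standard 52-card deck,
\[ P(A) = \frac{4}{52} = \frac{1}{13} \approx 0.077. \]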

2-10 Empirical Probability Relative Frequency or Empirical Definition The number of occurrences of a given event divided by the total number of occurrences. Or, relative frequency = absolute frequency divided by total number of occurrences. If in n trials, m are favorable to A, then P(A) = m/n, if n is sufficiently large. “Large” depends on context. Example: frequency distribution of eye color among MTSU students.
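A minimal Python simulation (an illustration, not from the slides) of the relative-frequency idea: estimate P(heads) for a fair coin and compare it with the classical value 0.5.

import random

random.seed(1)                                       # reproducible example
n = 10_000                                           # number of trials
m = sum(random.random() < 0.5 for _ in range(n))     # trials favorable to "heads"
print("relative frequency of heads:", m / n)         # close to 0.5 when n is large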

2-11 Properties of Probabilities 0 ≤ P(A) ≤ 1. If A, B, C, … are mutually exclusive events, P(A + B + C + …) = P(A) + P(B) + P(C) + …; the probability that any one of these events occurs is the sum of their individual probabilities. If A, B, C, … are mutually exclusive and collectively exhaustive, P(A + B + C + …) = P(A) + P(B) + P(C) + … = 1.

2-12 Some Rules of Probability The events A, B, C,…are said to be statistically independent if the probability that they occur together is the product of their individual probabilities. P(ABC…) = P(A)P(B)P(C)…. P(ABC…) is the probability of events ABC… occurring simultaneously or jointly, called a joint probability. P(A), P(B), P(C),…are called unconditional, marginal or individual probabilities. Example: Toss two coins. What is the probability of a head on the first coin and a head on the second coin?
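A worked answer to the two-coin question (assuming fair coins, so the tosses are independent):
\[ P(H_1 \text{ and } H_2) = P(H_1)\,P(H_2) = \tfrac{1}{2} \times \tfrac{1}{2} = \tfrac{1}{4}. \]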

2-13 Some Rules of Probability If events A and B are not mutually exclusive, P(A + B) = P(A) + P(B) – P(AB), where P(AB) is the joint probability that A and B occur together. For every event A there is an event A’, the complement of A, with P(A + A’) = 1 and P(AA’) = 0.
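A worked instance (an illustration, not from the slide): draw one card from a standard deck with A = “ace” and B = “heart”, which are not mutually exclusive because the ace of hearts belongs to both:
\[ P(A + B) = P(A) + P(B) - P(AB) = \frac{4}{52} + \frac{13}{52} - \frac{1}{52} = \frac{16}{52} = \frac{4}{13}. \]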

2-14 Conditional Probability The probability that event A occurs given that event B has occurred. The conditional probability of A, conditional on event B occurring, is P(A|B) = P(AB)/P(B) for P(B) > 0, and P(B|A) = P(AB)/P(A) for P(A) > 0. Example: 300 males and 200 females take an accounting class. Of these, 100 males and 60 females are accounting majors. If a student chosen at random from the class is an accounting major, what is the probability that the student is male?
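A worked answer to the accounting example (500 students in all, 160 of whom are accounting majors):
\[ P(\text{male} \mid \text{accounting}) = \frac{P(\text{male and accounting})}{P(\text{accounting})} = \frac{100/500}{160/500} = \frac{100}{160} = 0.625. \]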

2-15 Conditional Probability Conditional and unconditional probabilities are generally different. If the two events are independent, however, P(A|B) = P(AB)/P(B) = [P(A)P(B)]/P(B) = P(A), since P(AB) = P(A)P(B) when the two events are independent.

2-16 Bayes’ Theorem Use the knowledge that an event B has occurred to update the probability that an event A has occurred: P(A|B) = P(B|A)P(A) / [P(B|A)P(A) + P(B|A’)P(A’)], where A’ is the complement of A. P(A) is called the prior probability and P(A|B) is called the posterior probability.

2-17 Bayes’ Theorem Example: Suppose a woman has two coins in her purse; one is fair and one is two-headed. She takes a coin at random from her purse and tosses it. A head shows up. What is the probability that the coin is two-headed?
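A worked answer (not on the slide), with A = “the coin is two-headed”, so P(A) = P(A’) = 1/2, and B = “a head shows up”:
\[ P(A \mid B) = \frac{P(B \mid A)P(A)}{P(B \mid A)P(A) + P(B \mid A')P(A')} = \frac{1 \times \tfrac{1}{2}}{1 \times \tfrac{1}{2} + \tfrac{1}{2} \times \tfrac{1}{2}} = \frac{2}{3}. \]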

2-18 Random Variables Probability: Random variables are numerical representations of the outcomes or events of a sample space. Since we can assign probabilities to outcomes and events, we can assign probabilities to random variables. Probability Distributions: The possible values taken by a random variable and the probabilities of occurrence of those values.

2-19 Discrete Random Variables Probability Mass Function (PMF) For a discrete r.v. X taking values x1, x2, …, the function f(X = xi) = P(X = xi), i = 1, 2, …, is the PMF or probability function (PF). Properties of the PMF or PF: 0 ≤ f(xi) ≤ 1 and Σi f(xi) = 1.

2-20 Example: Two tosses of a coin Possible outcomes: TT, TH, HT, HH. Let X = number of heads.
Number of heads (X)    PF f(x)
0                      1/4
1                      1/2
2                      1/4
Sum                    1.0
A graph of the PMF (PF) is shown in Fig. 2-2.
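A minimal Python check of this PMF (an illustration, not part of the slides), enumerating the four equally likely outcomes:

from collections import Counter
from fractions import Fraction
from itertools import product

outcomes = ["".join(p) for p in product("HT", repeat=2)]          # HH, HT, TH, TT
counts = Counter(s.count("H") for s in outcomes)                  # value of X -> frequency
pmf = {x: Fraction(c, len(outcomes)) for x, c in sorted(counts.items())}
print(pmf)                  # f(0) = 1/4, f(1) = 1/2, f(2) = 1/4
print(sum(pmf.values()))    # probabilities sum to 1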

2-21 Figure 2-2 The probability mass function (PMF) of the number of heads in two tosses of a coin (Example 2.13).

2-22 Continuous Random Variables Probability Density Function (PDF) A continuous r.v. can take an uncountably infinite number of values. The probability that a continuous r.v. takes any particular value is always zero; probability for a continuous r.v. is always measured over an interval: P(x1 ≤ X ≤ x2) = ∫ from x1 to x2 of f(x) dx. See Fig. 2-3.

2-23 Figure 2-3 The PDF of a continuous random variable.
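A minimal Python sketch (an illustration; the density and interval are chosen for the example) that approximates P(x1 ≤ X ≤ x2) by numerically integrating a uniform density f(x) = 1 on [0, 1]:

def f(x):
    return 1.0 if 0.0 <= x <= 1.0 else 0.0           # uniform(0, 1) density

def prob(x1, x2, steps=100_000):
    # Approximate the integral of f from x1 to x2 with a midpoint Riemann sum.
    width = (x2 - x1) / steps
    return sum(f(x1 + (i + 0.5) * width) for i in range(steps)) * width

print(prob(0.2, 0.5))   # approximately 0.3
print(prob(0.4, 0.4))   # 0.0 -- a single point has zero probability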

2-24 Continuous Random Variables Since P(X = x1) = 0 for a continuous r.v., the following are all equal: P(x1 ≤ X ≤ x2) = P(x1 < X ≤ x2) = P(x1 ≤ X < x2) = P(x1 < X < x2).

2-25 Cumulative Distribution Function (CDF) F(x) = P(X ≤ x), the probability that X takes a value less than or equal to x. F(–∞) = 0 and F(∞) = 1. F(x) is nondecreasing: if x2 > x1, then F(x2) ≥ F(x1). P(X > k) = 1 – F(k). P(x1 < X ≤ x2) = F(x2) – F(x1).

2-26 Example: Number of Heads in 4 Tosses
Value of X    PMF f(x)    CDF F(x) = P(X ≤ x)
0             1/16        1/16
1             4/16        5/16
2             6/16        11/16
3             4/16        15/16
4             1/16        1
The CDF is a step function: F(x) = 0 for x < 0, 1/16 for 0 ≤ x < 1, 5/16 for 1 ≤ x < 2, 11/16 for 2 ≤ x < 3, 15/16 for 3 ≤ x < 4, and 1 for x ≥ 4.
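A minimal Python check of this table (an illustration, not from the slides), using the binomial probabilities f(x) = C(4, x)(1/2)^4 for the number of heads in 4 fair tosses:

from fractions import Fraction
from math import comb

n = 4                                                     # number of tosses
pmf = {x: Fraction(comb(n, x), 2**n) for x in range(n + 1)}
cdf, running = {}, Fraction(0)
for x in range(n + 1):                                    # F(x) = P(X <= x) is a running sum
    running += pmf[x]
    cdf[x] = running
print(pmf)   # 1/16, 4/16, 6/16, 4/16, 1/16
print(cdf)   # 1/16, 5/16, 11/16, 15/16, 1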

2-27 Figure 2-4 The cumulative distribution function (CDF) of a discrete random variable (Example 2.15).

2-28 Figure 2-5 The CDF of a continuous random variable.

2-29 Multivariate PDFs The previous examples involved a single r.v. and thus single-variable, or univariate, PDFs. The outcomes of some experiments may be described by more than one r.v.; these involve multivariate PDFs. The simplest of these is the bivariate, or two-variable, PDF. Table 2-2 is an example of a joint frequency distribution for two variables.

2-30 Table 2-2: Absolute Frequencies The frequency distribution of two random variables: Number of PCs sold (X) and Number of Printers sold (Y).

2-31 Table 2-3: Measures of Joint Probabilities The bivariate probability distribution of number of PCs sold (X) and number of printers sold (Y).

2-32 Bivariate or Joint PMF f(X, Y) = P(X = x and Y = y) is the joint PMF, the probability that X and Y take certain values simultaneously; f(X, Y) = 0 when X ≠ x and Y ≠ y. f(X, Y) ≥ 0 for all pairs of X and Y, and ∑x ∑y f(X, Y) = 1 (the joint probabilities sum to one). This leads to marginal and conditional PFs.

2-33 Marginal Probability Distributions f(X) and f(Y) are called univariate, unconditional, individual, or marginal PMFs or PDFs relative to the joint or bivariate PF f(X, Y). For example, the marginal PMF of X is the probability that X takes a given value regardless of the values taken by Y. Tables 2-3 and 2-4 show how the marginal PMF is derived.

2-34 Table 2-4 Marginal probability distributions of X (number of PCs sold) and Y (number of printers sold).

2-35 Conditional Probability Functions What is the probability that Y = 4, conditional on X = 4 (Table 2-3)? This is known as a conditional probability and can be found from the conditional PMF: f(Y | X) = P(Y = y | X = x) and f(X | Y) = P(X = x | Y = y). These can be computed as f(Y | X) = f(X, Y)/f(X) and f(X | Y) = f(Y, X)/f(Y); that is, conditional = joint / marginal of the conditioning r.v.

2-36 Statistical Independence Two variables, X and Y, are statistically independent if and only if their joint PMF or PDF can be expressed as the product of their individual or marginal PMFs or PDFs for all combinations of X and Y. Example: A bag contains three balls numbered 1, 2, 3. Two balls are drawn at random with replacement. X is the number on the first ball and Y is the number on the second. Consider f(X = 1, Y = 1), f(X = 1), and f(Y = 1) from Table 2-5. Are the number of PCs and printers sold in Table 2-3 independent random variables?

2-37 Table 2-5 Statistical independence of two random variables.
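A minimal Python sketch of the two-ball example from slide 2-36 (the names are illustrative): with replacement, every pair (x, y) has joint probability 1/9, each marginal probability is 1/3, and f(x, y) = f(x) f(y) for every pair, so X and Y are statistically independent.

from fractions import Fraction
from itertools import product

# Joint PMF: two draws with replacement from balls numbered 1, 2, 3.
joint = {(x, y): Fraction(1, 9) for x, y in product([1, 2, 3], repeat=2)}

# Marginal PMFs: sum the joint PMF over the other variable.
fx = {x: sum(p for (a, _), p in joint.items() if a == x) for x in [1, 2, 3]}
fy = {y: sum(p for (_, b), p in joint.items() if b == y) for y in [1, 2, 3]}

# Conditional PMF: f(y | x) = f(x, y) / f(x); here it equals the marginal 1/3.
f_y_given_x1 = {y: joint[(1, y)] / fx[1] for y in [1, 2, 3]}
print(f_y_given_x1)

# Independence check: the joint probability equals the product of the marginals for every pair.
print(all(joint[(x, y)] == fx[x] * fy[y] for (x, y) in joint))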