Transforming and Combining Random Variables


1 Transforming and Combining Random Variables
Section 6.2. Reference Text: The Practice of Statistics, Fourth Edition, by Starnes, Yates, Moore. Lesson 6.1.1

2 Objectives
Multiplying or Dividing by a constant
Adding or Subtracting by a constant
Putting it Together: Adding, Subtracting, Multiplying, or Dividing → Linear transformation!
Mean of the Sum of Random Variables
Independent Random Variables
Variance of the Sum of Independent Random Variables
Mean of the Difference of Random Variables
Variance of the Differences of Random Variables

3 Multiplying or Dividing by a Constant
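The standard rule: multiplying (or dividing) each value of a random variable X by a constant b multiplies (divides) measures of center and location (mean, median, quartiles, percentiles) by b, multiplies (divides) measures of spread (range, IQR, standard deviation) by |b|, and does not change the shape of the distribution. In symbols, for Y = bX: μ_Y = bμ_X and σ_Y = |b|σ_X.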

4 Adding or Subtracting by a Constant
If we add a constant a (which could be negative) to, or subtract it from, every value of a random variable…
Adds (or subtracts) a to measures of center and location (mean, median, quartiles, percentiles).
Does not change measures of spread.
Does not change the shape of the distribution.
Shall we look at Pete’s Jeep Tours?
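(In symbols, for Y = X + a: μ_Y = μ_X + a and σ_Y = σ_X; quartiles and percentiles also shift by a.)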

5 Putting It All Together: Linear Transformation
What happens if we transform a random variable by both adding or subtracting a constant and multiplying or dividing by a constant? We could have gone directly from the number of passengers X on Pete’s Jeep Tours to the profit V = 150X - 100, where we multiplied by 150 and subtracted 100. This is a linear transformation! In general, it can be written in the form Y = a + bX, where a and b are constants. Let’s generalize on the next slide…

6 Effects of Linear Transformation on the mean and SD
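For a linear transformation Y = a + bX: μ_Y = a + bμ_X and σ_Y = |b|σ_X, and the shape of the distribution does not change. Applied to Pete’s profit V = 150X - 100: μ_V = 150μ_X - 100 and σ_V = 150σ_X.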

7 Mean of the Sum of Random Variables
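For any two random variables X and Y, the mean of the sum is the sum of the means: E(X + Y) = E(X) + E(Y), i.e. μ_T = μ_X + μ_Y where T = X + Y. This holds whether or not X and Y are independent.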

8 Independent Random Variables
If knowing whether any event involving X alone has occurred tells us nothing about the occurrence of any event involving Y alone, and vice versa, then X and Y are independent random variables. But we already knew this! Just restating the idea of being independent!

9 Independent Random Variables
Probability models often assume independence when the random variables describe outcomes that appear unrelated to each other. You should always ask whether the assumption of independence seems reasonable. For instance, it’s reasonable to treat the random variables X = number of passengers on Pete’s trip and Y = number of passengers on Erin’s trip on a randomly chosen day as independent, since the siblings operate their trips in different parts of the country.

10 Variance of the Sum of Independent Random Variables
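For independent random variables X and Y, the variances add: Var(X + Y) = Var(X) + Var(Y), i.e. σ²_T = σ²_X + σ²_Y where T = X + Y. To get the standard deviation of the sum, take the square root of the combined variance: σ_T = √(σ²_X + σ²_Y).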

11 By the Way… You might be wondering whether there’s a formula for computing the variance of the sum of two random variables that are not independent. There is, but it’s beyond the scope of this course. Just remember: you can add variances only if the two random variables are independent, and you can never add standard deviations.
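A quick check with made-up numbers (not from the slides): if X and Y are independent with σ_X = 3 and σ_Y = 4, then the variance of the sum is 9 + 16 = 25, so its standard deviation is 5, not 3 + 4 = 7. Adding the standard deviations directly would overstate the spread.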

12 Mean of the Difference of Random Variables
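For any two random variables X and Y, the mean of the difference is the difference of the means: E(X - Y) = E(X) - E(Y), i.e. μ_D = μ_X - μ_Y where D = X - Y. As with the sum, this holds whether or not X and Y are independent.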

13 Variance of the Differences of Random Variables
Earlier, we saw that the variance of the sum of two independent random variables is the sum of their variances. Can you guess what the variance of the difference of two independent random variables will be? If you guessed that you subtract the variances: WRONG! THINK AGAIN! MUHAHAHA!

14 Variance of the Differences of Random Variables
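For independent random variables X and Y, the variances still add when you subtract: Var(X - Y) = Var(X) + Var(Y), i.e. σ²_D = σ²_X + σ²_Y where D = X - Y. Subtracting Y does not remove its variability, so σ_D = √(σ²_X + σ²_Y), the same standard deviation as for the sum.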

15 Objectives
Multiplying or Dividing by a constant
Adding or Subtracting by a constant
Putting it Together: Adding, Subtracting, Multiplying, or Dividing → Linear transformation!
Mean of the sum of random variables
Independent random variables
Variance of the sum of independent random variables
Mean of the difference of random variables
Variance of the differences of random variables

16 Homework Worksheet: I'm going to post the homework online. However, the deal is: if I don’t post the homework online by Tuesday 11/25/14, then there is no homework over break. Deal?!

