**Chapter 3 Properties of Random Variables**

Moments and Expectation

**Review**

Experiment, Random Variable, Observation, Realization, Parameter, Sample, Statistic

**Moments**

One way to quantify the location and some measures of the shape of the pdf.

**First moment about the origin**

$\mu_1' = E[X] = \int_{-\infty}^{\infty} x\, f_X(x)\,dx$

**ith moment about the origin**

$\mu_i' = \int_{-\infty}^{\infty} x^i f_X(x)\,dx$ (continuous random variable)

$\mu_i' = \sum_x x^i\, p_X(x)$ (discrete random variable)

**ith central moment about the mean, μ**

$\mu_i = \int_{-\infty}^{\infty} (x - \mu)^i f_X(x)\,dx$ (X continuous)

$\mu_i = \sum_x (x - \mu)^i\, p_X(x)$ (X discrete)

**Expected value of a random variable X**

$E[X] = \int_{-\infty}^{\infty} x\, f_X(x)\,dx$ (X continuous)

$E[X] = \sum_x x\, p_X(x)$ (X discrete)

**Expected value of a function of X, g(X)**

$E[g(X)] = \int_{-\infty}^{\infty} g(x)\, f_X(x)\,dx$ (X continuous)

$E[g(X)] = \sum_x g(x)\, p_X(x)$ (X discrete)

**Expected value and the first moment about the origin**

Comparing the two definitions, you can see that the expected value of the random variable X is the first moment about the origin: $E[X] = \mu_1'$.

**Rules for finding expected values**

For constants a and b and random variables X and Y:

- $E[a] = a$
- $E[aX] = a\,E[X]$
- $E[aX + b] = a\,E[X] + b$
- $E[X + Y] = E[X] + E[Y]$
- If X and Y are independent, $E[XY] = E[X]\,E[Y]$
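As a quick sketch (with a made-up pmf), the definition of expectation for a discrete random variable and the linearity rule $E[aX + b] = a\,E[X] + b$ can be checked numerically:

```python
# Check the expectation rules on a small hypothetical discrete pmf.
# E[g(X)] = sum over x of g(x) * p(x); linearity: E[aX + b] = a*E[X] + b.
pmf = {1: 0.2, 2: 0.5, 3: 0.3}  # hypothetical values and probabilities of X

def expect(g, pmf):
    """Expected value of g(X) for a discrete pmf."""
    return sum(g(x) * p for x, p in pmf.items())

EX = expect(lambda x: x, pmf)            # first moment about the origin
lhs = expect(lambda x: 4 * x + 10, pmf)  # E[4X + 10] computed directly
rhs = 4 * EX + 10                        # 4*E[X] + 10 via the linearity rule
print(EX, lhs, rhs)
```

Both routes give the same number, which is the content of the linearity rule.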

**Measures of central tendency**

- Arithmetic mean
- Geometric mean
- Median
- Mode
- Weighted mean

**Mean, μ_x, or average value**

The mean of a r.v. X is its expected value: $\mu_X = E[X]$. The sample estimate is the arithmetic average, $\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i$. Arithmetic mean of grouped data: $\bar{x} = \frac{1}{n}\sum_{i=1}^{k} n_i x_i$, where k is the number of groups, n the total number of observations, $n_i$ the number of observations in group i, and $x_i$ the class mark of the ith group.
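A minimal sketch of the grouped-data mean, using made-up class marks and counts:

```python
# Arithmetic mean of grouped data: xbar = (1/n) * sum(n_i * x_i),
# where x_i is the class mark and n_i the count in group i.
marks  = [5.0, 15.0, 25.0]   # hypothetical class marks x_i
counts = [3, 5, 2]           # hypothetical group counts n_i

n = sum(counts)  # total number of observations
grouped_mean = sum(ni * xi for ni, xi in zip(counts, marks)) / n
print(grouped_mean)
```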

**Geometric mean**

Used when the ratio of two consecutive observations is either constant or nearly constant: $\bar{x}_g = \left(\prod_{i=1}^{n} x_i\right)^{1/n}$. The logarithm of the population geometric mean is the expected value of the logarithm of X: $\log \mu_g = E[\log X]$.
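A short sketch (sample values are made up) showing the two equivalent routes to the geometric mean, the nth root of the product and the exponential of the mean log:

```python
import math

# Geometric mean: (x1 * x2 * ... * xn) ** (1/n).
# Equivalently, exp of the arithmetic mean of the logs, mirroring the
# statement that log(geometric mean) = mean of log X.
data = [2.0, 8.0]  # hypothetical sample

gm_direct = math.prod(data) ** (1 / len(data))
gm_logs = math.exp(sum(math.log(x) for x in data) / len(data))
print(gm_direct, gm_logs)
```

Working in logs is the numerically safer route for long samples, since the raw product can overflow.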

**Median, x_md**

The observation such that half of the values in the sample lie on either side of $x_{md}$. An exact sample median may not exist. The population median $\mu_{md}$ is the value satisfying:

$\int_{-\infty}^{\mu_{md}} f_X(x)\,dx = 0.5$ (X continuous)

$P(X \le \mu_{md}) \ge 0.5$ and $P(X \ge \mu_{md}) \ge 0.5$ (X discrete)

**Mode, x_mo**

The most frequently occurring value. The sample or population may have none, one, or more than one mode. The population mode is the value of x maximizing $f_X(x)$ (X continuous) or $p_X(x)$ (X discrete).

**Weighted mean**

Used for describing the central tendency of grouped data: $\bar{x}_w = \dfrac{\sum_i w_i x_i}{\sum_i w_i}$.
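A minimal sketch of the weighted mean with made-up values and weights; for grouped data the weights would be the group counts $n_i$:

```python
# Weighted mean: xbar_w = sum(w_i * x_i) / sum(w_i).
values  = [70.0, 80.0, 90.0]  # hypothetical group values x_i
weights = [1, 2, 1]           # hypothetical weights w_i

weighted_mean = sum(w * x for w, x in zip(weights, values)) / sum(weights)
print(weighted_mean)
```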

**Measures of Dispersion**

Measures of the spread of the data:

- Range
- Variance

**Range**

The difference between the largest and smallest sample values. For a population this interval often extends from −∞ to ∞ or from 0 to ∞. The sample range is a function of only two of the sample values, but does convey some idea of the spread of the data. Disadvantage of the range: it does not reflect the frequency or magnitude of values that deviate from the mean. Occasionally the relative range is used: relative range = range / mean.

**Variance, σ²**

Defined as the second moment about the mean: the average squared deviation from the mean. For a discrete population of size n:

$\sigma_x^2 = \frac{1}{n}\sum_{i=1}^{n} (x_i - \mu)^2$

The sample estimate of $\sigma_x^2$ is

$s_x^2 = \frac{1}{n-1}\sum_{i=1}^{n} (x_i - \bar{x})^2$

**Variance: two basic differences between population and sample variance**

- $\bar{x}$ is used instead of μ.
- n − 1 is used as the denominator rather than n, to avoid a biased estimate of $\sigma_x^2$.

Variance of grouped data: $s_x^2 = \frac{1}{n-1}\sum_{i=1}^{k} n_i (x_i - \bar{x})^2$.
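A short sketch (sample data are made up) of the n versus n − 1 denominator, which is the only computational difference between the two forms:

```python
# Population variance divides the sum of squared deviations by n;
# the sample variance divides by n - 1 to give an unbiased estimate
# of sigma^2.
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]  # hypothetical sample

n = len(data)
xbar = sum(data) / n
ss = sum((x - xbar) ** 2 for x in data)  # sum of squared deviations
pop_var = ss / n           # divide by n (population form)
samp_var = ss / (n - 1)    # divide by n - 1 (sample estimate)
print(pop_var, samp_var)
```

The sample form is always slightly larger; the gap shrinks as n grows.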

**Rules for finding the variance**

For constants a and b and random variables X and Y:

- $\operatorname{Var}(a) = 0$
- $\operatorname{Var}(aX) = a^2 \operatorname{Var}(X)$
- $\operatorname{Var}(aX + b) = a^2 \operatorname{Var}(X)$
- If X and Y are independent, $\operatorname{Var}(X + Y) = \operatorname{Var}(X) + \operatorname{Var}(Y)$

**Units of Variance**

The units of the variance are the same as the units of X². The units of its positive square root, the standard deviation $\sigma_x$, are the same as the units of the random variable X. A dimensionless measure of dispersion is the coefficient of variation, $C_v = \sigma_x / \mu_x$.

**Measures of Symmetry**

Many distributions are not symmetrical; tailing off to the right or the left skews the distribution. Tailing to the right: positively skewed. Tailing to the left: negatively skewed.

**Skewness**

The third moment about the mean: $\mu_3 = E[(X - \mu)^3]$.

**Practical measurements of skewness**

One measure of absolute skew is the difference between the mean and the mode. It is not meaningful for comparison because it depends on the units of measure.

**Pearson's first coefficient of skewness**

A relative (dimensionless) measure of skewness is more useful for comparison. Pearson's first coefficient: $S_k = (\mu - \mu_{mo}) / \sigma$. Population coefficient of skewness: $\gamma = \mu_3 / \sigma^3$. Sample skewness: $g = m_3 / s^3$, where $m_3$ is the sample third central moment.
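A minimal sketch of the sample coefficient of skewness $g = m_3 / s^3$ on a made-up right-tailed sample (divisor conventions for the sample moments vary between texts; a plain n divisor is used here):

```python
# Sample coefficient of skewness: g = m3 / m2**1.5, where m2 and m3 are
# the second and third sample central moments (n divisor).
data = [1.0, 2.0, 2.0, 3.0, 10.0]  # hypothetical right-tailed sample

n = len(data)
xbar = sum(data) / n
m2 = sum((x - xbar) ** 2 for x in data) / n  # second central moment
m3 = sum((x - xbar) ** 3 for x in data) / n  # third central moment
g = m3 / m2 ** 1.5
print(g)  # positive: the long tail is to the right
```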

**Measures of Peakedness (Flatness)**

Kurtosis refers to the extent of peakedness of a probability distribution in comparison to the normal distribution. Kurtosis is based on the 4th moment about the mean, $\mu_4 = E[(X - \mu)^4]$. The coefficient of kurtosis is $k = \mu_4 / \sigma^4$; for a normal distribution, k = 3.
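A short sketch of the coefficient of kurtosis on the same kind of made-up sample, using n-divisor sample moments in place of the population moments:

```python
# Coefficient of kurtosis: k = m4 / m2**2 (fourth central moment divided
# by the squared variance). For a normal distribution k = 3.
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]  # hypothetical sample

n = len(data)
xbar = sum(data) / n
m2 = sum((x - xbar) ** 2 for x in data) / n  # variance (n divisor)
m4 = sum((x - xbar) ** 4 for x in data) / n  # fourth central moment
k = m4 / m2 ** 2
print(k)  # below 3: flatter-tailed than a normal distribution
```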


**Covariance**

A measure of the linear relationship between two jointly distributed random variables, X and Y. Covariance is the 1,1 central moment:

$\operatorname{Cov}(X, Y) = E[(X - \mu_X)(Y - \mu_Y)]$

**Covariance**

If X and Y are independent, $\operatorname{Cov}(X, Y) = 0$. Sample statistic:

$s_{xy} = \frac{1}{n-1}\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})$

**Correlation Coefficient**

The normalized covariance: $\rho_{XY} = \dfrac{\operatorname{Cov}(X, Y)}{\sigma_X \sigma_Y}$. If X and Y are independent, $\rho_{XY} = 0$.
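A minimal sketch of the sample covariance and correlation coefficient on made-up paired data where y is an exact positive linear function of x, so r should come out as 1:

```python
import math

# Sample covariance s_xy = sum((x - xbar)*(y - ybar)) / (n - 1), and
# correlation coefficient r = s_xy / (s_x * s_y), the normalized covariance.
xs = [1.0, 2.0, 3.0, 4.0]  # hypothetical paired observations
ys = [2.0, 4.0, 6.0, 8.0]  # y = 2x, a perfect positive linear relation

n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
s_xy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / (n - 1)
s_x = math.sqrt(sum((x - xbar) ** 2 for x in xs) / (n - 1))
s_y = math.sqrt(sum((y - ybar) ** 2 for y in ys) / (n - 1))
r = s_xy / (s_x * s_y)
print(s_xy, r)
```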

**Correlation Coefficient**

A measure of how two variables vary together. A value of ρ equal to positive one implies that X and Y are perfectly related by Y = a + bX with b > 0. Positive values indicate that large (small) values of X tend to be paired with large (small) values of Y. Negative values indicate that large (small) values of X tend to be paired with small (large) values of Y. Two variables are uncorrelated only if ρ(X, Y) = 0. Correlation does NOT imply cause and effect.

**Correlation coefficient: linear dependence and functional dependence**

The correlation coefficient measures only linear dependence: Y can be an exact function of X and still have ρ = 0 if the relationship is not linear.
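A short sketch (values are made up) of the distinction between functional and linear dependence: here Y = X² is completely determined by X, yet the sample covariance, and hence the correlation coefficient, is zero:

```python
# Uncorrelated does not mean independent: Y = X**2 is a deterministic
# function of X, but the symmetric spread of x around zero makes the
# positive and negative cross-products cancel exactly.
xs = [-2.0, -1.0, 0.0, 1.0, 2.0]
ys = [x ** 2 for x in xs]  # perfect functional dependence on x

n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
s_xy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / (n - 1)
print(s_xy)  # 0.0, so r = 0 despite Y being a function of X
```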

**Other Properties of Moments**
