Continuous Distributions The Uniform distribution from a to b.


1 Continuous Distributions The Uniform distribution from a to b: f(x) = 1/(b − a) for a ≤ x ≤ b, and f(x) = 0 otherwise.

2 The Normal distribution (mean μ, standard deviation σ): f(x) = (1/(σ√(2π))) e^(−(x − μ)²/(2σ²)).

3 The Exponential distribution (parameter λ): f(x) = λe^(−λx) for x ≥ 0, and f(x) = 0 otherwise.

4 The Weibull distribution with parameters α and β.

5 The Weibull density f(x), plotted for (α, β) = (0.5, 2), (0.7, 2) and (0.9, 2).

6 The Gamma distribution Let the continuous random variable X have density function: f(x) = (λ^α/Γ(α)) x^(α−1) e^(−λx) for x > 0, and f(x) = 0 otherwise. Then X is said to have a Gamma distribution with parameters α and λ.

7 Expectation

8 Let X denote a discrete random variable with probability function p(x) (probability density function f(x) if X is continuous). The expected value of X, E(X), is defined to be: E(X) = Σ_x x p(x) if X is discrete, and E(X) = ∫ x f(x) dx if X is continuous.
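Both cases of the definition above can be sketched numerically. A minimal illustration (the fair die and the integration grid are illustrative choices, not from the slides):

```python
# Discrete case: E(X) = sum of x * p(x).  Example: a fair six-sided die.
die = {x: 1 / 6 for x in range(1, 7)}
e_discrete = sum(x * p for x, p in die.items())          # 3.5

# Continuous case: E(X) = integral of x * f(x) dx, approximated
# here by a midpoint Riemann sum on [a, b].
def expected_value(f, a, b, n=100_000):
    h = (b - a) / n
    return sum((a + (i + 0.5) * h) * f(a + (i + 0.5) * h) for i in range(n)) * h

# Uniform(0, 1) density f(x) = 1 gives E(X) close to 0.5.
e_uniform01 = expected_value(lambda x: 1.0, 0.0, 1.0)
```

The midpoint rule is only one way to approximate the integral; any quadrature rule works for a smooth density on a bounded interval.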

9 Interpretation of E(X) 1. The expected value of X, E(X), is the centre of gravity of the probability distribution of X. 2. The expected value of X, E(X), is the long-run average value of X (shown later: the Law of Large Numbers).

10 Example: The Uniform distribution Suppose X has a uniform distribution from a to b, so f(x) = 1/(b − a) for a ≤ x ≤ b. The expected value of X is: E(X) = ∫_a^b x/(b − a) dx = (a + b)/2.
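The uniform mean (a + b)/2 can be sanity-checked by simulation; a quick Monte Carlo sketch (the seed, endpoints and sample size are illustrative):

```python
import random

# For X ~ Uniform(a, b) the sample mean of many draws should
# approach the theoretical mean (a + b) / 2.
random.seed(0)
a, b = 2.0, 10.0
n = 200_000
sample_mean = sum(random.uniform(a, b) for _ in range(n)) / n
# sample_mean should be close to (a + b) / 2 = 6.0
```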

11 Example: The Normal distribution Suppose X has a Normal distribution with parameters μ and σ. The expected value of X is: E(X) = ∫ x (1/(σ√(2π))) e^(−(x − μ)²/(2σ²)) dx. Make the substitution: z = (x − μ)/σ.

12 Hence E(X) = ∫ (μ + σz)(1/√(2π)) e^(−z²/2) dz. Now ∫ (1/√(2π)) e^(−z²/2) dz = 1 and ∫ z (1/√(2π)) e^(−z²/2) dz = 0 (odd integrand), so E(X) = μ.

13 Example: The Gamma distribution Suppose X has a Gamma distribution with parameters α and λ. Then: ∫_0^∞ x^(α−1) e^(−λx) dx = Γ(α)/λ^α. Note: This is a very useful formula when working with the Gamma distribution.

14 The expected value of X is: E(X) = ∫_0^∞ x (λ^α/Γ(α)) x^(α−1) e^(−λx) dx = (λ^α/Γ(α)) · Γ(α + 1)/λ^(α+1) = α/λ. The rewritten integrand is a Gamma(α + 1, λ) density, so its integral is now equal to 1.

15 Thus if X has a Gamma(α, λ) distribution then the expected value of X is: E(X) = α/λ. Special Cases: 1. Exponential(λ) distribution: α = 1, λ arbitrary, so E(X) = 1/λ. 2. Chi-square(ν) distribution: α = ν/2, λ = 1/2, so E(X) = ν.
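The mean α/λ (in the rate parameterization used in these slides) can be checked by simulation. A sketch with illustrative values; note that Python's `random.gammavariate` takes (shape, scale) with scale = 1/rate:

```python
import random

# Check E(X) = alpha / lam for X ~ Gamma(alpha, lam), rate parameterization.
random.seed(1)
alpha, lam = 3.0, 2.0
n = 200_000
mean = sum(random.gammavariate(alpha, 1.0 / lam) for _ in range(n)) / n
# mean should be close to alpha / lam = 1.5.
# Special cases: alpha = 1 recovers the Exponential mean 1/lam;
# alpha = nu/2, lam = 1/2 recovers the chi-square mean nu.
```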

16 The Gamma distribution

17 The Exponential distribution

18 The Chi-square (χ²) distribution

19 Expectation of functions of Random Variables

20 Definition Let X denote a discrete random variable with probability function p(x) (probability density function f(x) if X is continuous). The expected value of g(X), E[g(X)], is defined to be: E[g(X)] = Σ_x g(x) p(x) if X is discrete, and E[g(X)] = ∫ g(x) f(x) dx if X is continuous.

21 Example: The Uniform distribution Suppose X has a uniform distribution from 0 to b, so f(x) = 1/b for 0 ≤ x ≤ b. Find the expected value of A = X². E(A) = E(X²) = ∫_0^b x²/b dx = b²/3. If X is the length of a side of a square (chosen at random from 0 to b), then A is the area of the square, and its expected value is 1/3 of the maximum area b².
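The area example can be verified with the E[g(X)] definition directly; a midpoint-rule sketch (the helper name and b = 4 are illustrative choices):

```python
# E[g(X)] = integral of g(x) * f(x) dx, approximated by a midpoint Riemann sum.
def expected_g(g, f, a, b, n=100_000):
    h = (b - a) / n
    total = 0.0
    for i in range(n):
        x = a + (i + 0.5) * h
        total += g(x) * f(x) * h
    return total

b = 4.0
# g(x) = x**2 and f(x) = 1/b for Uniform(0, b).
expected_area = expected_g(lambda x: x * x, lambda x: 1.0 / b, 0.0, b)
# expected_area should be close to b**2 / 3, one third of the maximum area.
```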

22 Example: The Geometric distribution Suppose X (discrete) has a geometric distribution with parameter p, so p(x) = p(1 − p)^(x−1) for x = 1, 2, 3, …. Find the expected value of X and the expected value of X².

23 Recall: the sum of a geometric series is Σ_{x=0}^∞ r^x = 1/(1 − r) for |r| < 1. Differentiating both sides with respect to r we get: Σ_{x=1}^∞ x r^(x−1) = 1/(1 − r)².

24 Thus E(X) = Σ_{x=1}^∞ x p(1 − p)^(x−1) = p · 1/(1 − (1 − p))² = 1/p. This formula could also be developed by noting that E(X) = Σ_{x=1}^∞ P(X ≥ x).

25 This formula can be used to calculate E(X) = 1/p for the geometric distribution.

26 To compute the expected value of X², we need to find a formula for Σ_{x=1}^∞ x² r^(x−1). Note that Σ_{x=0}^∞ r^x = 1/(1 − r). Differentiating with respect to r we get: Σ_{x=1}^∞ x r^(x−1) = 1/(1 − r)².

27 Differentiating again with respect to r we get: Σ_{x=2}^∞ x(x − 1) r^(x−2) = 2/(1 − r)³. Thus, writing x² = x(x − 1) + x, Σ_{x=1}^∞ x² r^(x−1) = 2r/(1 − r)³ + 1/(1 − r)² = (1 + r)/(1 − r)³.

28 With r = 1 − p this implies Σ_{x=1}^∞ x² p(1 − p)^(x−1) = p(2 − p)/p³. Thus E(X²) = (2 − p)/p².
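The two geometric results can be confirmed by summing the series directly until the terms are negligible; a sketch with illustrative values of p and the truncation point:

```python
# X ~ Geometric(p) on {1, 2, ...} with p(x) = p * (1 - p)**(x - 1).
p = 0.3
q = 1 - p
N = 2_000                    # truncation point; q**N is negligible here
e_x  = sum(x * p * q ** (x - 1) for x in range(1, N))
e_x2 = sum(x * x * p * q ** (x - 1) for x in range(1, N))
# e_x  should be close to 1 / p
# e_x2 should be close to (2 - p) / p**2
```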

30 Moments of Random Variables

31 Definition Let X be a random variable (discrete or continuous); then the k-th moment of X is defined to be: μ_k = E(X^k). The first moment of X, μ = μ₁ = E(X), is the center of gravity of the distribution of X. The higher moments give different information regarding the distribution of X.

32 Definition Let X be a random variable (discrete or continuous); then the k-th central moment of X is defined to be: μ_k⁰ = E[(X − μ)^k], where μ = μ₁ = E(X) = the first moment of X.

33 The central moments describe how the probability distribution is distributed about the centre of gravity, μ. The 2nd central moment, μ₂⁰ = E[(X − μ)²], is called the variance of X and is denoted by the symbol var(X). It depends on the spread of the probability distribution of X about μ.

34 The square root of the variance, √var(X), is called the standard deviation of X and is denoted by the symbol σ.

35 The third central moment, μ₃⁰ = E[(X − μ)³], contains information about the skewness of a distribution. Measure of skewness: γ₁ = μ₃⁰/σ³.

36 Positively skewed distribution

37 Negatively skewed distribution

38 Symmetric distribution

39 The fourth central moment, μ₄⁰ = E[(X − μ)⁴], also contains information about the shape of a distribution. The property of shape that is measured by the fourth central moment is called kurtosis. The measure of kurtosis: γ₂ = μ₄⁰/σ⁴.

40 Mesokurtic distribution

41 Platykurtic distribution

42 Leptokurtic distribution

43 Example: The uniform distribution from 0 to 1. Finding the moments: μ_k = E(X^k) = ∫_0^1 x^k dx = 1/(k + 1).

44 Finding the central moments: μ_k⁰ = E[(X − ½)^k] = ∫_0^1 (x − ½)^k dx = [(½)^(k+1) − (−½)^(k+1)]/(k + 1), which is 0 when k is odd and (½)^k/(k + 1) when k is even.

45 Thus var(X) = μ₂⁰ = 1/12. The standard deviation: σ = 1/√12 ≈ 0.289. The measure of skewness: γ₁ = μ₃⁰/σ³ = 0. The measure of kurtosis: γ₂ = μ₄⁰/σ⁴ = (1/80)/(1/144) = 9/5.
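These values for Uniform(0, 1) are easy to reproduce numerically; a midpoint-integration sketch (the grid size is an illustrative choice):

```python
# k-th central moment of Uniform(0, 1): integral of (x - 1/2)**k on [0, 1].
def central_moment(k, n=100_000):
    h = 1.0 / n
    return sum(((i + 0.5) * h - 0.5) ** k for i in range(n)) * h

variance = central_moment(2)       # should be close to 1/12
mu3      = central_moment(3)       # should be close to 0 (skewness 0)
mu4      = central_moment(4)       # should be close to 1/80
kurtosis = mu4 / variance ** 2     # should be close to 9/5
```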

46 Rules for expectation

47 Rules: Proof The proof for discrete random variables is similar.

48 Proof The proof for discrete random variables is similar.

49 Proof The proof for discrete random variables is similar.

50 Proof

51 Moment generating functions

52 Definition Let X denote a random variable. Then the moment generating function of X, m_X(t), is defined by: m_X(t) = E(e^(tX)). Recall that E[g(X)] = Σ_x g(x) p(x) if X is discrete, and ∫ g(x) f(x) dx if X is continuous.

53 Examples The moment generating function of X, m_X(t), is: 1. The Binomial distribution (parameters p, n): m_X(t) = (p e^t + 1 − p)^n.

54 The moment generating function of X, m_X(t), is: 2. The Poisson distribution (parameter λ): m_X(t) = e^(λ(e^t − 1)).

55 The moment generating function of X, m_X(t), is: 3. The Exponential distribution (parameter λ): m_X(t) = λ/(λ − t) for t < λ.
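The exponential MGF can be checked against the defining integral E(e^(tX)); a numeric sketch (λ, t, the truncation L and grid size are illustrative, with the tail beyond L negligible):

```python
import math

# Approximate m_X(t) = integral of e^(t x) * lam * e^(-lam x) dx on [0, L]
# by the midpoint rule; valid for t < lam so the integrand decays.
def exponential_mgf(lam, t, L=60.0, n=200_000):
    h = L / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        total += math.exp(t * x) * lam * math.exp(-lam * x) * h
    return total

approx = exponential_mgf(2.0, 0.5)
# approx should be close to lam / (lam - t) = 2.0 / 1.5
```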

56 The moment generating function of X, m_X(t), is: 4. The Standard Normal distribution (μ = 0, σ = 1): m_X(t) = e^(t²/2).

57 We will now complete the square in the exponent: tx − x²/2 = t²/2 − (x − t)²/2. Hence m_X(t) = ∫ e^(tx) (1/√(2π)) e^(−x²/2) dx = e^(t²/2) ∫ (1/√(2π)) e^(−(x − t)²/2) dx = e^(t²/2), since the remaining integral of a normal density is 1.

58 The moment generating function of X, m_X(t), is: 5. The Gamma distribution (parameters α, λ): m_X(t) = (λ/(λ − t))^α for t < λ.

59 We use the fact that ∫_0^∞ ((λ − t)^α/Γ(α)) x^(α−1) e^(−(λ−t)x) dx is equal to 1, since the integrand is a Gamma(α, λ − t) density. Hence m_X(t) = (λ^α/Γ(α)) ∫_0^∞ x^(α−1) e^(−(λ−t)x) dx = (λ/(λ − t))^α.

60 Properties of Moment Generating Functions

61 1. m_X(0) = E(e^(0·X)) = E(1) = 1. Note: the moment generating functions of the distributions above all satisfy the property m_X(0) = 1.

62 2. We use the expansion of the exponential function: e^u = 1 + u + u²/2! + u³/3! + ⋯. Hence m_X(t) = E(e^(tX)) = 1 + μ₁t + μ₂t²/2! + μ₃t³/3! + ⋯, where μ_k = E(X^k) is the k-th moment of X.

63 Now 3. Differentiating the series term by term, the k-th derivative at t = 0 picks out the k-th moment: m_X^(k)(0) = μ_k = E(X^k).

64 Property 3 is very useful in determining the moments of a random variable X. Examples

65 For the Exponential distribution, m_X(t) = λ/(λ − t), so m_X′(t) = λ/(λ − t)², m_X″(t) = 2λ/(λ − t)³, and in general m_X^(k)(t) = k! λ/(λ − t)^(k+1).

66 To find the moments we set t = 0: μ_k = m_X^(k)(0) = k! λ/λ^(k+1) = k!/λ^k.

69 The moments for the exponential distribution can also be calculated in an alternative way. This is done by expanding m_X(t) in powers of t and equating the coefficients of t^k to the coefficients in m_X(t) = 1 + μ₁t + μ₂t²/2! + ⋯. Since λ/(λ − t) = 1/(1 − t/λ) = 1 + (t/λ) + (t/λ)² + ⋯, equating the coefficients of t^k we get: μ_k/k! = 1/λ^k, i.e. μ_k = k!/λ^k.
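The result μ_k = k!/λ^k can also be confirmed by integrating x^k against the exponential density directly; a sketch (λ, the truncation L and grid size are illustrative):

```python
import math

# Approximate mu_k = integral of x**k * lam * e^(-lam x) dx on [0, L]
# by the midpoint rule, then compare with k! / lam**k.
def exponential_moment(k, lam, L=80.0, n=400_000):
    h = L / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        total += x ** k * lam * math.exp(-lam * x) * h
    return total

lam = 1.5
approx = [exponential_moment(k, lam) for k in range(1, 5)]
exact  = [math.factorial(k) / lam ** k for k in range(1, 5)]
# each approx[k-1] should be close to exact[k-1]
```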

70 The moments for the standard normal distribution. We use the expansion of e^u: m_X(t) = e^(t²/2) = Σ_{j=0}^∞ (t²/2)^j/j! = Σ_j t^(2j)/(2^j j!). We now equate the coefficients of t^k in this series with those in m_X(t) = Σ_k μ_k t^k/k!.

71 If k is odd: μ_k = 0. For even k = 2j: μ_(2j)/(2j)! = 1/(2^j j!), so μ_(2j) = (2j)!/(2^j j!).
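A quick check of the even-moment formula: it equals the odd double factorial (2j − 1)!! = 1 · 3 · 5 · ⋯ · (2j − 1), a standard closed form for the even normal moments (the helper names below are illustrative):

```python
import math

# mu_{2k} = (2k)! / (2**k * k!) for the standard normal distribution.
def even_normal_moment(k):
    return math.factorial(2 * k) // (2 ** k * math.factorial(k))

# (2k - 1)!! = 1 * 3 * 5 * ... * (2k - 1)
def odd_double_factorial(k):
    out = 1
    for j in range(1, 2 * k, 2):
        out *= j
    return out

moments = [even_normal_moment(k) for k in range(1, 6)]
# moments should equal [1, 3, 15, 105, 945]
```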

