1 Continuous Distributions ch3

2 A random variable X of the continuous type has a support or space S that is an interval (possibly unbounded) or a union of intervals, instead of a countable set of real numbers (discrete points).
- The probability density function (p.d.f.) of X is an integrable function f(x) satisfying:
  - f(x) > 0, x ∈ S;
  - ∫_S f(x) dx = 1;
  - P(X ∈ A) = ∫_A f(x) dx, where A ⊂ S.
- Ex. 3.2-1: Given the p.d.f. of X, find P(X > 20).
- The (cumulative) distribution function (c.d.f.) of X is F(x) = P(X ≤ x) = ∫_{-∞}^{x} f(t) dt. If the derivative F'(x) exists, then f(x) = F'(x).
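As a quick numerical check of these defining properties, here is a minimal Python sketch (assuming numpy and scipy are available). The p.d.f. used is a hypothetical stand-in, since the slide's own formula did not survive transcription:

```python
import numpy as np
from scipy import integrate

# Hypothetical p.d.f., standing in for the slide's formula:
# f(x) = (1/100) e^(-x/100) on S = [0, infinity).
f = lambda x: (1 / 100) * np.exp(-x / 100)

total, _ = integrate.quad(f, 0, np.inf)   # must equal 1 for a valid p.d.f.
p, _ = integrate.quad(f, 20, np.inf)      # P(X > 20) = P(X in A), A = (20, inf)
print(total, p)                           # 1.0  0.8187... (= e^(-0.2))
```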

3 Zero Probability of Points
- For a random variable of the continuous type, the probability of any single point is zero; namely, P(X = b) = 0.
- Thus P(a ≤ X ≤ b) = P(a < X < b) = P(a ≤ X < b) = P(a < X ≤ b) = F(b) - F(a).
- For instance, let X be the time between successive calls; observations are made to construct a relative frequency histogram h(x), which is compared with the exponential model in the accompanying example.

4 (Dis)continuity, Differentiability, Integration
- Ex. 3.2-4: Let Y be a continuous random variable with p.d.f. g(y) = 2y, 0 < y < 1.
  - The distribution function of Y is G(y) = 0 for y < 0; G(y) = ∫_0^y 2t dt = y² for 0 ≤ y < 1; G(y) = 1 for y ≥ 1.
  - P(½ ≤ Y ≤ ¾) = G(¾) - G(½) = 9/16 - 4/16 = 5/16.
  - P(¼ ≤ Y < 2) = G(2) - G(¼) = 1 - 1/16 = 15/16.
- Properties of a continuous random variable:
  - The area between the p.d.f. f(x) and the x-axis must equal 1.
  - f(x) is possibly unbounded (say, > 1).
  - f(x) can be a discontinuous function (defined over a set of intervals);
  - however, its c.d.f. F(x) is always continuous, since it is defined by integration.
  - It is possible that F'(x) does not exist at some point x = x₀.
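The c.d.f. and the two probabilities above can be verified symbolically; a minimal sketch using Python's sympy (an assumption of this transcript, not part of the original slides):

```python
import sympy as sp

t, y = sp.symbols('t y', positive=True)
G = sp.integrate(2 * t, (t, 0, y))        # c.d.f. G(y) = y**2 on 0 < y < 1
print(G)                                   # y**2

# P(1/2 <= Y <= 3/4) = G(3/4) - G(1/2)
print(G.subs(y, sp.Rational(3, 4)) - G.subs(y, sp.Rational(1, 2)))  # 5/16
# P(1/4 <= Y < 2): beyond the support G(2) = 1, so this is 1 - G(1/4)
print(1 - G.subs(y, sp.Rational(1, 4)))                             # 15/16
```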

5 Mean, Variance, Moment-generating Function
- Suppose X is a continuous random variable.
  - The expected value (mean) of X is μ = E(X) = ∫_{-∞}^{∞} x f(x) dx.
  - The variance of X is σ² = Var(X) = E[(X - μ)²] = E(X²) - μ².
  - The standard deviation of X is σ = √Var(X).
  - The moment-generating function of X, if it exists, is M(t) = E(e^{tX}) = ∫_{-∞}^{∞} e^{tx} f(x) dx, -h < t < h.
- If the r-th moment E(X^r) exists and is finite, then so do E(X^{r-1}), ..., E(X¹). The converse is not true.
- If E(e^{tX}) exists and is finite for -h < t < h, then all the moments exist. The converse is not necessarily true.
- Ex. 3.2-5: For the random variable Y of Ex. 3.2-4 with p.d.f. g(y) = 2y, 0 < y < 1: μ = ∫_0^1 2y² dy = 2/3, E(Y²) = ∫_0^1 2y³ dy = 1/2, so σ² = 1/2 - 4/9 = 1/18.
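The moments of Ex. 3.2-5 can be checked by numerical integration; a minimal Python sketch (scipy assumed), including a finite-difference check that M'(0) recovers the mean:

```python
import numpy as np
from scipy import integrate

g = lambda y: 2 * y                                   # p.d.f. on (0, 1)
mu = integrate.quad(lambda y: y * g(y), 0, 1)[0]      # E(Y) = 2/3
m2 = integrate.quad(lambda y: y**2 * g(y), 0, 1)[0]   # E(Y^2) = 1/2
print(mu, m2 - mu**2)                                 # 0.6667, 1/18 = 0.0556

# The m.g.f. M(t) = E(e^{tY}); its derivative at t = 0 recovers the mean.
M = lambda t: integrate.quad(lambda y: np.exp(t * y) * g(y), 0, 1)[0]
h = 1e-6
print((M(h) - M(-h)) / (2 * h))                       # ≈ E(Y) = 2/3
```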

6 Percentiles (π_p) and Quartiles
- Ex. 3.2-6: X has p.d.f. f(x) = x e^{-x}, 0 ≤ x < ∞.
- The (100p)-th percentile is a number π_p such that the area under f(x) to the left of π_p is p, i.e., F(π_p) = p.
- The 50th percentile is called the median: m = π_{.5}.
- The 25th and 75th percentiles are called the first and third quartiles: π_{.25} = q₁, π_{.75} = q₃, and m = π_{.5} = q₂ is the second quartile.
- Ex: X has p.d.f. f(x) = 1 - |x - 1|, 0 ≤ x < 2.
  - To find the 32nd percentile π_{.32}, solve F(π_{.32}) = .32. Since F(1) = .5 > .32, π_{.32} < 1, where F(x) = x²/2; thus π_{.32}²/2 = .32, giving π_{.32} = 0.8.
  - To find the 92nd percentile π_{.92}, solve F(π_{.92}) = .92. Since F(1) = .5 < .92, π_{.92} > 1, where F(x) = 2x - x²/2 - 1; solving gives π_{.92} = 1.6.
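Percentiles can also be found numerically by root-finding on F(x) - p = 0; a minimal Python sketch for the triangular example above (scipy assumed):

```python
from scipy import integrate, optimize

f = lambda x: 1 - abs(x - 1)                # p.d.f. on [0, 2]
F = lambda x: integrate.quad(f, 0, x)[0]    # c.d.f. by numerical integration

# Solve F(pi_p) = p for the requested percentiles.
for p in (0.32, 0.5, 0.92):
    print(p, optimize.brentq(lambda x: F(x) - p, 0, 2))
# 0.32 -> 0.8,  0.5 -> 1.0 (median),  0.92 -> 1.6
```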

7 More Examples
- Ex. 3.2-8: X has the p.d.f. f(x) = e^{-x-1}, -1 < x < ∞.
  - The c.d.f. is F(x) = 1 - e^{-(x+1)} for x > -1, so the median m = π_{.5} solves 1 - e^{-(m+1)} = 1/2, giving m = ln 2 - 1 ≈ -0.31.

8 Uniform Distribution
- A random variable X has a uniform distribution if its p.d.f. equals a constant on its support.
- If the support is the interval [a, b], the p.d.f. is f(x) = 1/(b - a), a ≤ x ≤ b.
- This distribution, denoted U(a, b), is also called rectangular because of the shape of f(x).
- Mean: μ = (a + b)/2. Variance: σ² = (b - a)²/12. Distribution function: F(x) = 0 for x < a, F(x) = (x - a)/(b - a) for a ≤ x < b, F(x) = 1 for x ≥ b. Moment-generating function: M(t) = (e^{tb} - e^{ta}) / [t(b - a)] for t ≠ 0, and M(0) = 1.
- Pseudo-random number generator: a program applies simple arithmetic operations to a seed (starting number) to deterministically generate a sequence of numbers whose distribution approximates U(0, 1). Table IX on p. 695 lists such random numbers (multiplied by 10⁴).
- Ex. 3.3-1: X has p.d.f. f(x) = 1/100, 0 < x < 100, i.e., X is U(0, 100).
  - The mean and variance are μ = (0 + 100)/2 = 50 and σ² = 10000/12.
  - The standard deviation is σ = 100/√12 ≈ 28.87, 100 times that of U(0, 1).
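A minimal Python sketch of such a generator, using a toy linear congruential scheme (the constants below are illustrative glibc-style values, not the generator behind Table IX), together with a check of the U(0, 100) moments:

```python
import numpy as np

def lcg(seed, n, a=1103515245, c=12345, m=2**31):
    """Toy linear congruential generator producing U(0,1)-like values."""
    out, x = [], seed
    for _ in range(n):
        x = (a * x + c) % m      # simple arithmetic on the previous value
        out.append(x / m)        # rescale to (0, 1)
    return np.array(out)

u = lcg(seed=12345, n=100_000)
print(u.mean(), u.var())         # ≈ 0.5 and ≈ 1/12 ≈ 0.0833 for U(0, 1)

x = 100 * u                      # rescale to U(0, 100)
print(x.mean(), x.std())         # ≈ 50 and ≈ 100/sqrt(12) ≈ 28.87
```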

9 Exponential Distribution
- Let the number of changes X in a given interval follow a Poisson process with rate λ. Then the waiting (inter-change) time W between successive changes has an exponential distribution.
- Such a time is nonnegative, so the distribution function F(w) = 0 for w < 0.
- For w ≥ 0, F(w) = P(W ≤ w) = 1 - P(W > w) = 1 - P(no changes in [0, w]) = 1 - e^{-λw}.
- For w > 0, the p.d.f. is f(w) = F'(w) = λe^{-λw}. Writing θ = 1/λ, f(w) = (1/θ)e^{-w/θ}, 0 < w < ∞, with mean θ and variance θ².
- Suppose λ = 7 is the mean number of changes per minute; then θ = 1/7 is the mean waiting time for the first (next) change.
- Ex. 3.3-2: X has an exponential distribution with mean θ = 20.

10 Examples
- Ex. 3.3-3: Customer arrivals follow a Poisson process with rate 20 per hour.
  - What is the probability that the shopkeeper will have to wait more than 5 minutes for the arrival of the first (next) customer?
  - Let X be the waiting time in minutes until the next customer arrives; X is exponential with θ = 3, so P(X > 5) = e^{-5/3} ≈ 0.189.
  - Having already waited 6 minutes, what is the probability that the shopkeeper must wait more than 3 additional minutes for the next arrival? P(X > 9 | X > 6) = P(X > 3) = e^{-1} ≈ 0.368: the memoryless (forgetfulness) property.
- Percentiles:
  - To examine how close an empirical collection of data is to the exponential distribution, a q-q plot (y_r, π_p) can be constructed from the order statistics, where p = r/(n + 1), r = 1, 2, ..., n.
  - If θ is unknown, π_p = -ln(1 - p) (the standard exponential percentile) can be used in the plot instead.
  - If the resulting curve is approximately a straight line, the fit is good, and the slope is an estimate of 1/θ.
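Both calculations, including the memoryless property, are easy to confirm by simulation; a minimal Python sketch (numpy assumed):

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 3.0                               # mean wait in minutes (20 per hour)
x = rng.exponential(theta, 1_000_000)

print((x > 5).mean(), np.exp(-5 / 3))     # P(X > 5): simulated vs exact
# Memoryless property: P(X > 6 + 3 | X > 6) should equal P(X > 3)
print((x > 9).sum() / (x > 6).sum(), (x > 3).mean(), np.exp(-1))
```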

11 Gamma Distribution
- Generalizing the exponential distribution, the gamma distribution describes the waiting time W until the α-th change occurs, α ≥ 1.
- The distribution function of W is F(w) = P(W ≤ w) = 1 - P(fewer than α changes in [0, w]) = 1 - Σ_{k=0}^{α-1} (λw)^k e^{-λw} / k!, for w ≥ 0.
- Differentiating (term by term, via Leibniz's rule) gives the p.d.f. f(w) = F'(w) = λ(λw)^{α-1} e^{-λw} / (α - 1)!, 0 < w < ∞.
- The gamma function (a generalized factorial) is defined by Γ(t) = ∫_0^∞ y^{t-1} e^{-y} dy, t > 0.
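The identity between F(w) and the Poisson tail can be checked directly with scipy's distribution objects; a minimal sketch with illustrative parameter values:

```python
from scipy import stats

lam, alpha, w = 2.0, 3, 1.7               # illustrative rate, shape, and time
# P(W <= w) = 1 - P(fewer than alpha changes in [0, w]),
# where the number of changes in [0, w] is Poisson(lam * w).
print(1 - stats.poisson.cdf(alpha - 1, lam * w))
# ... which equals the gamma(alpha, theta = 1/lam) c.d.f. at w.
print(stats.gamma.cdf(w, a=alpha, scale=1 / lam))
```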

12 Fundamental Calculus
- Some formulas from calculus are needed:
- Generalized integration by parts (also ref. p. 666): ∫ u dv = uv - ∫ v du, applied repeatedly.
- Formula used for the gamma distribution: ∫_0^∞ z^{α-1} e^{-z} dz = Γ(α) = (α - 1)! for a positive integer α.
- Proof by induction: for α = 1, Γ(1) = ∫_0^∞ e^{-z} dz = 1 = 0!. Suppose Γ(m) = (m - 1)!. Then integration by parts gives Γ(m + 1) = ∫_0^∞ z^m e^{-z} dz = [-z^m e^{-z}]_0^∞ + m ∫_0^∞ z^{m-1} e^{-z} dz = m Γ(m) = m!. By the induction hypothesis, the equation holds.
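The factorial formula can be confirmed numerically for small integer α; a minimal Python sketch (scipy assumed):

```python
import math
from scipy import integrate

for alpha in range(1, 7):
    # Gamma(alpha) evaluated by direct numerical integration ...
    gamma_val = integrate.quad(lambda z: z**(alpha - 1) * math.exp(-z),
                               0, math.inf)[0]
    # ... should match (alpha - 1)! exactly.
    print(alpha, round(gamma_val, 6), math.factorial(alpha - 1))
```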

13 Chi-square Distribution
- A gamma distribution with θ = 2 and α = r/2 (r ∈ N) is a chi-square distribution with r degrees of freedom, denoted χ²(r).
- Its p.d.f. is f(x) = x^{r/2-1} e^{-x/2} / [Γ(r/2) 2^{r/2}], 0 < x < ∞.
- The mean is μ = r and the variance is σ² = 2r.
- The mode, the point at which the p.d.f. is maximal, is x = r - 2 (for r ≥ 2).
- Ex. 3.4-3: X has a chi-square distribution with r = 5. Using Table IV on p. 685: P(X > 15.09) = 1 - F(15.09) = 1 - 0.99 = 0.01, and P(1.145 ≤ X ≤ 12.83) = F(12.83) - F(1.145) = 0.975 - 0.05 = 0.925.
- Ex. 3.4-4: X is χ²(7). Suppose there are two constants a and b such that P(a < X < b) = 0.95. One of many possible choices is a = 1.69 and b = 16.01.
- Percentiles: the 100(1 - α)-th percentile is χ²_α(r), the point with probability α to its right; the 100α-th percentile is χ²_{1-α}(r).
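The table lookups in both examples can be reproduced with scipy's chi-square distribution; a minimal sketch:

```python
from scipy import stats

chi5 = stats.chi2(df=5)
print(1 - chi5.cdf(15.09))                    # P(X > 15.09) ≈ 0.01
print(chi5.cdf(12.83) - chi5.cdf(1.145))      # ≈ 0.975 - 0.05 = 0.925

chi7 = stats.chi2(df=7)
# Split the remaining 5% evenly into the two tails (one of many choices).
print(chi7.ppf(0.025), chi7.ppf(0.975))       # a ≈ 1.69, b ≈ 16.01
print(chi7.cdf(16.01) - chi7.cdf(1.69))       # ≈ 0.95
```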

14 Distributions of Functions of a Random Variable
- From a known random variable X, we may form another random variable Y = u(X), a function of X, and ask for Y's distribution function.
- Distribution function technique:
  - Directly find G(y) = P(Y ≤ y) = P[u(X) ≤ y], and then g(y) = G'(y).
  - E.g., finding the gamma distribution from the Poisson distribution; also N(μ, σ²) ⇒ N(0, 1), and N(μ, σ²) ⇒ χ²(1).
  - It requires knowledge of the related probability models.
- Change-of-variable technique:
  - Find the inverse function X = v(Y) from Y = u(X), and
  - find the mapping: boundaries, one-to-one, two-to-one, etc.
  - It requires knowledge of calculus and the like.

15 Distribution Function Technique
- Ex: [Lognormal] X is N(μ, σ²) and W = e^X. Then G(w) = P(W ≤ w) = P(X ≤ ln w) = Φ[(ln w - μ)/σ], so g(w) = G'(w) = e^{-(ln w - μ)²/(2σ²)} / (wσ√(2π)), w > 0.
- Ex. 3.5-2: Let W be the smallest angle between the y-axis and a spinner mounted at the point (0, 1), with W uniform on (-π/2, π/2); the spinner points at the x-axis at X = tan W.
  - G(x) = P(X ≤ x) = P(tan W ≤ x) = P(W ≤ arctan x) = 1/2 + (arctan x)/π, so
  - g(x) = G'(x) = 1/[π(1 + x²)], -∞ < x < ∞: the Cauchy p.d.f.
  - Both tail integrals of ∫ x g(x) dx diverge, so E(X) does not exist.
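The non-existence of E(X) shows up vividly in simulation: the running sample mean of tan W never settles down. A minimal Python sketch (numpy assumed):

```python
import numpy as np

rng = np.random.default_rng(1)
w = rng.uniform(-np.pi / 2, np.pi / 2, 1_000_000)  # spinner angle ~ U(-pi/2, pi/2)
x = np.tan(w)                                      # Cauchy-distributed hit point

# Unlike a distribution with a finite mean, the sample mean keeps drifting.
for n in (10**3, 10**4, 10**5, 10**6):
    print(n, x[:n].mean())
```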

16 Change-of-variable Technique
- Suppose X is a random variable with p.d.f. f(x) on the support c₁ < x < c₂.
- Let Y = u(X), with inverse X = v(Y), on the support d₁ < y < d₂.
- If u and v are continuous increasing functions, then d₁ = u(c₁), d₂ = u(c₂), and G(y) = P(Y ≤ y) = P[X ≤ v(y)] = F[v(y)], so g(y) = G'(y) = f[v(y)] v'(y).
- If u and v are continuous decreasing functions, the endpoints reverse (x near c₁ maps to y near d₂), and G(y) = P(Y ≤ y) = P[X ≥ v(y)] = 1 - F[v(y)], so g(y) = -f[v(y)] v'(y).
- Generally, g(y) = f[v(y)] |v'(y)| on the support mapped from (c₁, c₂). Ex. 4.5-1 is an example.

17 Conversions: Any Continuous Distribution ⇔ U(0, 1)
- Thm. 3.5-2: Suppose X has a c.d.f. F(x) that is strictly increasing on S_X = {x : a < x < b}. Then the random variable Y defined by Y = F(X) has a U(0, 1) distribution.
- Proof: For 0 < y < 1, the distribution function of Y is P(Y ≤ y) = P[F(X) ≤ y] = P[X ≤ F⁻¹(y)] = F[F⁻¹(y)] = y, which is the U(0, 1) c.d.f.
- The requirement that F(x) be strictly increasing can be dropped, though it then takes tedious derivations to exclude the intervals where f(x) = 0.
- The change-of-variable technique can also be applied to a discrete-type random variable: if Y = u(X) and X = v(Y) form a one-to-one mapping, the p.m.f. of Y is g(y) = P(Y = y) = P[u(X) = y] = P[X = v(y)] = f[v(y)], y ∈ S_Y.
- There is no |v'(y)| term, since f(x) already represents probability (not density).
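Thm. 3.5-2 and its converse (the inverse-transform method from the earlier agenda) can be checked by simulation; a minimal Python sketch using the exponential c.d.f. F(x) = 1 - e^{-x/θ}:

```python
import numpy as np

rng = np.random.default_rng(2)
theta = 2.0
x = rng.exponential(theta, 1_000_000)   # X with F(x) = 1 - exp(-x/theta)
y = 1 - np.exp(-x / theta)              # Y = F(X)
print(y.mean(), y.var())                # ≈ 0.5 and ≈ 1/12, i.e. U(0, 1)

# Conversely, F^{-1}(U) with U ~ U(0, 1) has c.d.f. F (inverse transform).
u = rng.uniform(size=1_000_000)
print((-theta * np.log(1 - u)).mean())  # ≈ theta = 2.0
```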

18 Examples
- Ex. 3.5-6: X is Poisson with λ = 4. If Y = X^{1/2}, i.e., X = Y², then the p.m.f. of Y is g(y) = f(y²) = 4^{y²} e^{-4} / (y²)!, y = 0, 1, √2, √3, 2, ....
- When the transformation Y = u(X) is not one-to-one, say V = Z², each interval (case) is considered individually.
  - For instance, Z is N(0, 1) with -∞ < z < ∞ and 0 ≤ v < ∞: a two-to-one mapping.
  - G(v) = P(Z² ≤ v) = P(-√v ≤ Z ≤ √v) = 2Φ(√v) - 1, so g(v) = G'(v) = v^{-1/2} e^{-v/2} / √(2π), 0 < v < ∞: the χ²(1) p.d.f.
- Ex. 3.5-7: X has p.d.f. f(x) = x²/3, -1 < x < 2. If X = Y^{1/2}, Y = X², then 0 ≤ y < 4:
  - -1 < x₁ < 0 ⇔ 0 ≤ y₁ < 1 and 0 ≤ x₂ < 1 ⇔ 0 ≤ y₂ < 1: these two branches overlap, so the mapping is two-to-one on 0 < y < 1, giving g(y) = [f(√y) + f(-√y)] / (2√y) = √y/3;
  - 1 ≤ x₃ < 2 ⇔ 1 ≤ y₃ < 4: one-to-one, giving g(y) = f(√y) / (2√y) = √y/6.
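The two-to-one case V = Z² can be verified by comparing the empirical c.d.f. of simulated values with the χ²(1) c.d.f.; a minimal Python sketch:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
z = rng.standard_normal(1_000_000)
v = z**2                                  # two-to-one mapping V = Z^2

# Empirical c.d.f. of V against the chi-square(1) c.d.f. at a few points.
for c in (0.5, 1.0, 2.0, 4.0):
    print(c, (v <= c).mean(), stats.chi2.cdf(c, df=1))
```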

19 How to Find E(X) and Var(X)
- Ex. 3.6-6: Find the mean and variance of X in the previous example, using μ = E(X) and σ² = E(X²) - μ².
- Ex. 3.6-7: Reinsurance companies may agree to cover wind damages that range between $2 and $10 million.
  - X is the loss in millions and has a given distribution function F(x).
  - If losses beyond $10 million are capped at $10 million, the cases x > 10 are all attributed to x = 10, giving a point mass P(X = 10) = 1/8.
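The expectation of such a capped (mixed-type) loss combines an integral over the continuous part with the point mass at 10. A minimal Python sketch of the pattern; the loss model here is a hypothetical Pareto-type stand-in (the slide's actual F(x) did not survive transcription), with its exponent chosen so that P(X > 10) = 1/8 as stated above:

```python
import numpy as np
from scipy import integrate

# Hypothetical stand-in for the slide's loss model: survival function
# S(x) = (2/x)**a on x >= 2, with a chosen so that P(X > 10) = 1/8.
a = np.log(8) / np.log(5)
S = lambda x: (2 / x)**a                  # P(X > x)
f = lambda x: a * 2**a * x**(-a - 1)      # p.d.f. = -dS/dx on [2, inf)

# Capped loss Y = min(X, 10): continuous part on [2, 10) plus the
# point mass 10 * P(X = 10) = 10 * (1/8).
cont = integrate.quad(lambda x: x * f(x), 2, 10)[0]
print(cont + 10 * S(10))                  # E(Y) in millions
```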