5. Functions of a Random Variable

Functions of a Random Variable

Let X be a r.v defined on the model (Ω, F, P), and suppose g(x) is a function of the variable x. Define

    Y = g(X).                                                        (5-1)

Is Y necessarily a r.v? If so, what are its PDF F_Y(y) and pdf f_Y(y)? If X is a r.v, so is Y, and for every Borel set B

    P(Y ∈ B) = P(X ∈ g⁻¹(B)),                                        (5-2)

with g⁻¹(B) = { x : g(x) ∈ B }. In particular,

    F_Y(y) = P(Y ≤ y) = P(g(X) ≤ y) = P(X ∈ g⁻¹((-∞, y])).           (5-3)

To obtain the distribution function of Y, we must determine the Borel set on the x-axis where g(x) ≤ y for every given y, and the probability of that set.
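
As a quick numerical illustration of (5-1)-(5-3), added here and not part of the original slides, the sketch below estimates F_Y(y) = P(g(X) ≤ y) by Monte Carlo for an assumed g(x) = x² and an assumed standard normal X, and compares it with the probability of the Borel set g⁻¹((-∞, y]) = [-√y, √y]. The sample size and seed are arbitrary.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
x = rng.standard_normal(200_000)     # samples of X (assumed N(0,1) for illustration)
g = lambda t: t**2                   # an illustrative choice of g
y = 1.5

# Empirical F_Y(y) = P(g(X) <= y), estimated from samples of Y = g(X)
F_empirical = np.mean(g(x) <= y)

# Same probability via the Borel set g^{-1}((-inf, y]) = [-sqrt(y), sqrt(y)]
F_from_set = norm.cdf(np.sqrt(y)) - norm.cdf(-np.sqrt(y))

print(F_empirical, F_from_set)       # the two numbers should agree closely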

Example 5.1

Let

    Y = aX + b.                                                      (5-4)

Find F_Y(y) and f_Y(y).

Solution: Suppose a > 0. Then

    F_Y(y) = P(Y ≤ y) = P(aX + b ≤ y) = P(X ≤ (y - b)/a) = F_X((y - b)/a),   (5-5)

and hence

    f_Y(y) = (1/a) f_X((y - b)/a).                                   (5-6)

On the other hand, if a < 0, then {aX + b ≤ y} = {X ≥ (y - b)/a}, and hence

    F_Y(y) = P(X ≥ (y - b)/a) = 1 - F_X((y - b)/a),                  (5-7)

so that

    f_Y(y) = -(1/a) f_X((y - b)/a).                                  (5-8)

From (5-6) and (5-8), we obtain (for all a ≠ 0)

    f_Y(y) = (1/|a|) f_X((y - b)/a).                                 (5-9)
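
A minimal simulation check of (5-9), added here (not from the slides): for illustrative values a = -2, b = 3 and an assumed standard normal X, the empirical density of Y = aX + b near a point y0 is compared with f_X((y0 - b)/a) / |a|.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
a, b = -2.0, 3.0                          # illustrative constants (a != 0)
x = rng.standard_normal(200_000)          # X assumed N(0,1) for the check
y = a * x + b                             # samples of Y = aX + b

# Empirical density of Y near y0, versus (5-9): f_Y(y) = f_X((y - b)/a) / |a|
y0, h = 1.0, 0.05
empirical = np.mean(np.abs(y - y0) < h) / (2 * h)
exact = norm.pdf((y0 - b) / a) / abs(a)
print(empirical, exact)                   # should be close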

Example 5.2

Let

    Y = X².                                                          (5-10)

If y < 0, then the event {X² ≤ y} = ∅, and hence

    F_Y(y) = 0,   y < 0.                                             (5-11)

For y > 0, from Fig. 5.1, the event {Y ≤ y} = {X² ≤ y} is equivalent to

    { -√y < X ≤ √y }.                                                (5-12)

Fig. 5.1

Example 5.2 (Cont.)

Hence

    F_Y(y) = P(-√y < X ≤ √y) = F_X(√y) - F_X(-√y),   y > 0.          (5-13)

By direct differentiation, we get

    f_Y(y) = [ f_X(√y) + f_X(-√y) ] / (2√y),   y > 0;   0 otherwise.   (5-14)

If f_X(x) represents an even function, then (5-14) reduces to

    f_Y(y) = (1/√y) f_X(√y) U(y).                                    (5-15)

Example 5.2 (Cont.)

In particular, if X ~ N(0, 1), so that

    f_X(x) = (1/√(2π)) e^{-x²/2},                                    (5-16)

then substituting this into (5-14) or (5-15), we obtain the p.d.f of Y = X² to be

    f_Y(y) = (1/√(2πy)) e^{-y/2} U(y).                               (5-17)

Notice that (5-17) represents a Chi-square r.v with n = 1, since Γ(1/2) = √π. Thus, if X is a Gaussian r.v with μ = 0, then Y = X² represents a Chi-square r.v with one degree of freedom (n = 1).
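
As an aside (added; not in the original slides), (5-17) can be checked numerically by squaring standard normal samples and comparing their empirical distribution with scipy's chi-square distribution with one degree of freedom.

import numpy as np
from scipy.stats import chi2, kstest

rng = np.random.default_rng(2)
x = rng.standard_normal(100_000)
y = x**2                                  # Y = X^2 with X ~ N(0, 1)

# Kolmogorov-Smirnov comparison against the chi-square(1) CDF, per (5-17)
stat, pvalue = kstest(y, chi2(df=1).cdf)
print(stat, pvalue)                       # small statistic expected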

Example 5.3

Let

    Y = g(X),  where  g(x) = x - c for x > c,  0 for -c < x ≤ c,  x + c for x ≤ -c.   (5-18)

In this case

    P(Y = 0) = P(-c < X ≤ c) = F_X(c) - F_X(-c).                     (5-19)

For y > 0, we have X > c and Y = X - c, so that

    F_Y(y) = P(Y ≤ y) = P(X ≤ y + c) = F_X(y + c),   y > 0.          (5-20)

Example 5.3 (Cont.)

Similarly, if y < 0, then X ≤ -c and Y = X + c, so that

    F_Y(y) = P(Y ≤ y) = P(X ≤ y - c) = F_X(y - c),   y < 0.          (5-21)

Thus

    f_Y(y) = f_X(y + c) for y > 0,
             [F_X(c) - F_X(-c)] δ(y) at y = 0,
             f_X(y - c) for y < 0.

Fig. 5.2

Example 5.4 (Half-wave rectifier)

Suppose

    Y = g(X),  where  g(x) = x for x > 0,  0 for x ≤ 0               (5-22)

(Fig. 5.3). In this case

    P(Y = 0) = P(X ≤ 0) = F_X(0),                                    (5-23)

and for y > 0, since Y = X,

    F_Y(y) = P(Y ≤ y) = P(X ≤ y) = F_X(y),   y > 0.                  (5-24)

Thus

    f_Y(y) = f_X(y) U(y) + F_X(0) δ(y).                              (5-25)

Fig. 5.3
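
A short sketch (added here; the standard normal input is an assumption) illustrating the mixed nature of (5-25): the simulated rectifier output carries a point mass F_X(0) at zero plus the continuous part f_X(y) for y > 0.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
x = rng.standard_normal(200_000)          # input X, assumed N(0,1) for illustration
y = np.maximum(x, 0.0)                    # half-wave rectifier: Y = X for X > 0, else 0

mass_at_zero = np.mean(y == 0.0)          # should be close to F_X(0) = 0.5
print(mass_at_zero)

# Continuous part: for y > 0 the density is f_X(y); check P(0 < Y <= 1)
print(np.mean((y > 0) & (y <= 1)), norm.cdf(1.0) - norm.cdf(0.0))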

Continuous Functions of a Random Variable

The previous examples proceeded case by case; we now describe a general approach. Let g(x) be a continuous function with g'(x) nonzero at all but a finite number of points, having only a finite number of maxima and minima, and eventually becoming monotonic as |x| → ∞. Consider a specific y on the y-axis and a positive increment Δy, as shown in Fig. 5.4.

Fig. 5.4

Continuous Functions of a Random Variable (Cont.)

For Y = g(X), where X is of continuous type, we can write

    P(y < Y ≤ y + Δy) ≈ f_Y(y) Δy.                                   (5-26)

Referring back to Fig. 5.4, the equation y = g(x) has three solutions x1, x2, x3 there, and the event {y < Y ≤ y + Δy} occurs when the r.v X is in any one of the three mutually exclusive intervals

    {x1 < X ≤ x1 + Δx1},  {x2 + Δx2 < X ≤ x2},  {x3 < X ≤ x3 + Δx3}.

Hence the probability of the event in (5-26) is the sum of the probabilities of the above three events, i.e.,

    P(y < Y ≤ y + Δy) = P(x1 < X ≤ x1 + Δx1) + P(x2 + Δx2 < X ≤ x2) + P(x3 < X ≤ x3 + Δx3).   (5-27)

Continuous Functions of a Random Variable (Cont.)

For small Δy and Δx_i, making use of the approximation in (5-26), we get

    f_Y(y) Δy ≈ f_X(x1) Δx1 + f_X(x2) (-Δx2) + f_X(x3) Δx3.          (5-28)

In this case Δx1 > 0, Δx2 < 0 and Δx3 > 0, so that (5-28) can be rewritten as

    f_Y(y) = Σ_i f_X(x_i) |Δx_i| / Δy,                               (5-29)

and as Δy → 0, (5-29) can be expressed as

    f_Y(y) = Σ_i f_X(x_i) / |dy/dx|_{x_i} = Σ_i f_X(x_i) / |g'(x_i)|.   (5-30)

The summation index i in (5-30) depends on y, and for every y the equation y = g(x_i) must be solved to obtain the total number of solutions at that y, and the actual solutions x1, x2, ..., all in terms of y.

Continuous Functions of a Random Variable (Cont.)

For example, if Y = X², then for all y > 0, x1 = -√y and x2 = +√y represent the two solutions for each y. Notice that the solutions are all in terms of y, so that the right side of (5-30) is only a function of y. Referring back to Example 5.2, here for each y > 0 there are two solutions, given by x1 = -√y and x2 = +√y (f_Y(y) = 0 for y < 0). Moreover dy/dx = 2x, so that |dy/dx|_{x_i} = 2√y, and using (5-30) we get

    f_Y(y) = (1/(2√y)) [ f_X(√y) + f_X(-√y) ],   y > 0;   0 otherwise,   (5-31)

which agrees with (5-14).

Fig. 5.5
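
The rule (5-30) is straightforward to put in code once the roots x_i(y) and the derivative g'(x) are known. The helper below is a sketch added by the editor; the function name and the Y = X² example are illustrative, not from the slides.

import numpy as np
from scipy.stats import norm

def pdf_of_g_of_x(y, roots, g_prime, f_X):
    """Evaluate f_Y(y) = sum_i f_X(x_i) / |g'(x_i)|, as in (5-30).

    roots(y)   -> iterable of solutions x_i of y = g(x)
    g_prime(x) -> derivative g'(x)
    f_X(x)     -> density of X
    """
    return sum(f_X(xi) / abs(g_prime(xi)) for xi in roots(y))

# Example: Y = X^2 with X ~ N(0,1); the roots are +/- sqrt(y) and g'(x) = 2x
value = pdf_of_g_of_x(
    y=2.0,
    roots=lambda y: (np.sqrt(y), -np.sqrt(y)),
    g_prime=lambda x: 2.0 * x,
    f_X=norm.pdf,
)
print(value)   # matches (5-31)/(5-17): exp(-1) / sqrt(4 * pi)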

Example 5.5

Let

    Y = 1/X.   Find f_Y(y).                                          (5-32)

Solution: Here for every y, x1 = 1/y is the only solution, and

    dy/dx = -1/x²,   so that   |dy/dx|_{x1} = 1/x1² = y²,

and substituting this into (5-30), we obtain

    f_Y(y) = (1/y²) f_X(1/y).                                        (5-33)
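
As an added numerical check of (5-33) (the Cauchy input is an illustrative assumption), the reciprocal of a Cauchy r.v with scale α is again Cauchy, with scale 1/α, since (1/y²) f_X(1/y) works out to that density.

import numpy as np
from scipy.stats import cauchy, kstest

rng = np.random.default_rng(6)
alpha = 2.0                               # illustrative Cauchy scale parameter
x = cauchy(scale=alpha).rvs(size=100_000, random_state=rng)
y = 1.0 / x                               # Y = 1/X

# By (5-33), f_Y(y) = (1/y^2) f_X(1/y), i.e. Cauchy with scale 1/alpha
stat, pvalue = kstest(y, cauchy(scale=1.0 / alpha).cdf)
print(stat, pvalue)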

Example 5.6

Suppose f_X(x) = 2x/π², 0 < x < π, and Y = sin X. Determine f_Y(y).

Solution: Since X has zero probability of falling outside the interval (0, π), y = sin x has zero probability of falling outside the interval (0, 1). Clearly f_Y(y) = 0 outside this interval. For any 0 < y < 1, from Fig. 5.6(b), the equation y = sin x has an infinite number of solutions ..., x1, x2, x3, ..., where x1 = sin⁻¹(y) is the principal solution. Moreover, using the symmetry we also get x2 = π - x1, etc. Further,

    dy/dx = cos x = √(1 - sin²x) = √(1 - y²),

so that |dy/dx|_{x_i} = √(1 - y²).

Example 5.6 (Cont.)

Using this in (5-30), we obtain for 0 < y < 1

    f_Y(y) = Σ_i f_X(x_i) / √(1 - y²).                               (5-36)

But from Fig. 5.6(a), in this case f_X(x_{-1}) = f_X(x_3) = f_X(x_4) = ... = 0 (except for f_X(x1) and f_X(x2), the rest are all zeros).

Fig. 5.6

Example 5.6 (Cont.)

Thus (Fig. 5.7)

    f_Y(y) = [ f_X(x1) + f_X(x2) ] / √(1 - y²)
           = (2x1/π² + 2(π - x1)/π²) / √(1 - y²)
           = 2 / (π √(1 - y²)),   0 < y < 1;   0 otherwise.          (5-37)

Fig. 5.7
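
A quick sanity check of (5-37) by simulation, added here; inverse-CDF sampling of f_X(x) = 2x/π² on (0, π) is the assumed detail, since F_X(x) = x²/π² gives X = π√U for U uniform on (0, 1).

import numpy as np

rng = np.random.default_rng(4)
u = rng.uniform(size=200_000)
x = np.pi * np.sqrt(u)                    # inverse-CDF sampling: F_X(x) = x^2 / pi^2 on (0, pi)
y = np.sin(x)                             # Y = sin X

# Compare the empirical density near y0 with (5-37): 2 / (pi * sqrt(1 - y^2))
y0, h = 0.5, 0.01
empirical = np.mean(np.abs(y - y0) < h) / (2 * h)
exact = 2.0 / (np.pi * np.sqrt(1.0 - y0**2))
print(empirical, exact)                   # should be close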

Example 5.7

Let Y = tan X, where X ~ U(-π/2, π/2). Determine f_Y(y).

Solution: As x moves over (-π/2, π/2), y moves over (-∞, +∞). From Fig. 5.8(b), the function y = tan x is one-to-one for -π/2 < x < π/2. For any y, x1 = tan⁻¹(y) is the principal solution. Further,

Example 5.7 (Cont.)

    dy/dx = sec²x = 1 + tan²x = 1 + y²,

so that, using (5-30),

    f_Y(y) = f_X(x1) / |dy/dx|_{x1} = (1/π) · 1/(1 + y²),   -∞ < y < ∞,   (5-38)

which represents a Cauchy density function with parameter equal to unity (Fig. 5.9).

Fig. 5.8; Fig. 5.9
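
To see (5-38) numerically (an added sketch, not part of the slides), the tangent of a uniform angle on (-π/2, π/2) can be compared against scipy's standard Cauchy distribution.

import numpy as np
from scipy.stats import cauchy, kstest

rng = np.random.default_rng(5)
theta = rng.uniform(-np.pi / 2, np.pi / 2, size=100_000)
y = np.tan(theta)                         # Y = tan X with X uniform on (-pi/2, pi/2)

# Compare against the unit-parameter Cauchy distribution, per (5-38)
stat, pvalue = kstest(y, cauchy.cdf)
print(stat, pvalue)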

Functions of a Discrete-Type r.v

Suppose X is a discrete-type r.v with

    P(X = x_i) = p_i,   x = x1, x2, ..., x_i, ...,                   (5-39)

and Y = g(X). Clearly Y is also of discrete type, and when x = x_i, y_i = g(x_i), and for those y_i

    P(Y = y_i) = P(X = x_i) = p_i,   y = y1, y2, ..., y_i, ....      (5-40)

Example 5.8

Suppose X is Poisson with parameter λ, so that

    P(X = k) = e^{-λ} λ^k / k!,   k = 0, 1, 2, ....

Define Y = X² + 1. Find the p.m.f of Y.

Solution: X takes the values 0, 1, 2, ..., k, ..., so that Y only takes the values 1, 2, 5, ..., k² + 1, ..., and P(Y = k² + 1) = P(X = k), so that for j = k² + 1

    P(Y = j) = P(X = √(j - 1)) = e^{-λ} λ^{√(j-1)} / (√(j - 1))!,   j = 1, 2, 5, ..., k² + 1, ....   (5-42)
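
A small code sketch of (5-39)-(5-42), added by the editor; λ = 2 is an arbitrary illustrative choice. The p.m.f of Y = X² + 1 is simply the Poisson p.m.f re-indexed at j = k² + 1.

from scipy.stats import poisson

lam = 2.0                                 # illustrative Poisson parameter
# P(Y = k^2 + 1) = P(X = k), per (5-40) and (5-42)
pmf_Y = {k**2 + 1: poisson.pmf(k, lam) for k in range(10)}

for j, p in sorted(pmf_Y.items()):
    print(j, p)
# e.g. P(Y = 5) equals P(X = 2) = e^{-2} * 2^2 / 2!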