Handout Ch 4 Recitation

Calculus Review, Round 2 (1): Example. Jia-Ying Chen

Calculus Review, Round 2 (2): Change of Variables. Example ("What on earth is this?"). Jia-Ying Chen

Calculus Review, Round 2 (3): Supplementary Integration for Ch 4. Jia-Ying Chen

Find x Jia-Ying Chen

A new cancellation method? Jia-Ying Chen

Another way to expand an equation Jia-Ying Chen

歸去來析 (a pun on 歸去來兮 — roughly "just go and die," in Taiwanese). Jia-Ying Chen

Expectation of a Random Variable
Discrete distribution: E(X) = Σ x·p(x), the sum taken over all possible values x. Continuous distribution: E(X) = ∫ x·f(x) dx.
E(X) is called the expected value, mean, or expectation of X. E(X) can be regarded as the center of gravity of the distribution.
E(X) exists if and only if Σ |x|·p(x) < ∞ (discrete case), or if and only if ∫ |x|·f(x) dx < ∞ (continuous case). Whenever X is a bounded random variable, E(X) must exist.
Jia-Ying Chen

The Expectation of a Function
Let Y = g(X); then E(Y) = Σ g(x)·p(x) in the discrete case and E(Y) = ∫ g(x)·f(x) dx in the continuous case. More generally, for Y = g(X1, ..., Xn), E(Y) = ∫ ... ∫ g(x1, ..., xn)·f(x1, ..., xn) dx1 ... dxn.
For a given p.d.f. of X and a given function g, E[g(X)] can be computed directly from these formulas, as in the sketch below.
Jia-Ying Chen
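
As a numeric illustration of the formula above, here is a minimal Python sketch; the p.d.f. f(x) = 2x on (0, 1) and the function g(x) = x² are assumptions chosen for illustration, not the specific example from the slide.

import numpy as np

# Minimal sketch: E[g(X)] = integral of g(x) * f(x) dx over (0, 1).
dx = 1e-5
xs = np.arange(dx / 2, 1.0, dx)   # midpoints of small subintervals
f = 2.0 * xs                      # assumed p.d.f. f(x) = 2x
g = xs ** 2                       # assumed function g(x) = x^2

print(np.sum(g * f) * dx)         # ~ 0.5, the exact value of E[g(X)] here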

Example 1 (4.1.3)
In a class of 50 students, the number of students n_i of each age i is shown in the following table:

Age i:  18  19  20  21  25
n_i:    20  22   4   3   1

If a student is to be selected at random from the class, what is the expected value of his age?
Jia-Ying Chen

Solution
E[X] = 18*0.4 + 19*0.44 + 20*0.08 + 21*0.06 + 25*0.02 = 18.92

Age i:  18    19    20    21    25
n_i:    20    22     4     3     1
P_i:    0.40  0.44  0.08  0.06  0.02

Jia-Ying Chen
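
A quick Python check of the arithmetic above, using the counts from the table:

ages   = [18, 19, 20, 21, 25]
counts = [20, 22, 4, 3, 1]
total  = sum(counts)                            # 50 students
probs  = [c / total for c in counts]            # 0.40, 0.44, 0.08, 0.06, 0.02
print(sum(a * p for a, p in zip(ages, probs)))  # 18.92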

Properties of Expectations
If there exists a constant a such that Pr(X ≥ a) = 1, then E(X) ≥ a; if there exists a constant b such that Pr(X ≤ b) = 1, then E(X) ≤ b.
If X1, ..., Xn are n random variables such that each E(Xi) exists, then E(X1 + ... + Xn) = E(X1) + ... + E(Xn).
For all constants a1, ..., an and b, E(a1·X1 + ... + an·Xn + b) = a1·E(X1) + ... + an·E(Xn) + b.
Usually E[g(X)] ≠ g(E[X]); only linear functions g satisfy E[g(X)] = g(E[X]).
If X1, ..., Xn are n independent random variables such that each E(Xi) exists, then E(X1·X2·...·Xn) = E(X1)·E(X2)·...·E(Xn).
Jia-Ying Chen
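
A small simulation sketch (Python) of the linearity and independent-product properties; the exponential and uniform distributions are arbitrary illustrative choices:

import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(2.0, size=1_000_000)    # E(X) = 2
y = rng.uniform(0.0, 1.0, size=1_000_000)   # E(Y) = 0.5, independent of X

print(np.mean(3 * x + 5 * y + 1))                # ~ 3*2 + 5*0.5 + 1 = 9.5 (linearity)
print(np.mean(x * y), np.mean(x) * np.mean(y))   # both ~ 1.0 (product of independent RVs)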

Example 2 (4.2.6) Suppose that a particle starts at the origin of the real line and moves along the line in jumps of one unit. For each jump, the probability is p (0<=p<=1) that the particle will jump one unit to the left and the probability is 1-p that the particle will jump one unit to the right. Find the expected value of the position of the particle after n jumps. Jia-Ying Chen

Solution
Let Xi denote the i-th jump, so Xi = -1 with probability p and Xi = +1 with probability 1 - p, and E(Xi) = (-1)·p + (1)·(1 - p) = 1 - 2p. The position after n jumps is X1 + ... + Xn, so its expected value is E(X1 + ... + Xn) = E(X1) + ... + E(Xn) = n(1 - 2p).
Jia-Ying Chen
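
A simulation sketch of Example 2; the values p = 0.3 and n = 100 are arbitrary illustrative choices:

import numpy as np

rng = np.random.default_rng(1)
p, n, trials = 0.3, 100, 200_000

# Each jump is -1 (left) with probability p and +1 (right) with probability 1 - p.
jumps = rng.choice([-1, 1], size=(trials, n), p=[p, 1 - p])
positions = jumps.sum(axis=1)

print(positions.mean())    # close to the theoretical value
print(n * (1 - 2 * p))     # n(1 - 2p) = 40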

Properties of the Variance
Var(X) = 0 if and only if there exists a constant c such that Pr(X = c) = 1.
For constants a and b, Var(aX + b) = a²·Var(X).
Proof: E(aX + b) = a·E(X) + b, so Var(aX + b) = E[(aX + b - a·E(X) - b)²] = E[a²·(X - E(X))²] = a²·Var(X).
Jia-Ying Chen

Properties of the Variance
If X1, ..., Xn are independent random variables, then Var(X1 + ... + Xn) = Var(X1) + ... + Var(Xn).
If X1, ..., Xn are independent random variables, then for all constants a1, ..., an and b, Var(a1·X1 + ... + an·Xn + b) = a1²·Var(X1) + ... + an²·Var(Xn).
Jia-Ying Chen
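
A quick simulation check of Var(aX + b) = a²·Var(X) and of the additivity property; the distributions are illustrative choices:

import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(0.0, 2.0, size=1_000_000)     # Var(X) = 4
y = rng.exponential(3.0, size=1_000_000)     # Var(Y) = 9, independent of X

print(np.var(3 * x + 7), 9 * np.var(x))      # Var(aX + b) = a^2 Var(X): both ~ 36
print(np.var(x + y), np.var(x) + np.var(y))  # additivity for independent X, Y: both ~ 13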

Example 3 (4.3.6)
Suppose that X and Y are independent random variables with finite variances such that E(X) = E(Y).
Show that E[(X - Y)²] = Var(X) + Var(Y).
Jia-Ying Chen

Solution
Since E(X) = E(Y), we have E(X - Y) = 0, so E[(X - Y)²] = Var(X - Y) + [E(X - Y)]² = Var(X - Y). Because X and Y are independent, Var(X - Y) = Var(X) + Var(Y), which gives E[(X - Y)²] = Var(X) + Var(Y).
Jia-Ying Chen
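
A simulation sketch of Example 3 using two independent distributions with equal means (the particular distributions are illustrative choices):

import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(2.0, 1.5, size=1_000_000)   # E(X) = 2, Var(X) = 2.25
y = rng.exponential(2.0, size=1_000_000)   # E(Y) = 2, Var(Y) = 4, independent of X

print(np.mean((x - y) ** 2))               # ~ Var(X) + Var(Y) = 6.25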

Moment Generating Functions
Consider a given random variable X, and for each real number t let ψ(t) = E(e^(tX)). The function ψ is called the moment generating function (m.g.f.) of X.
Suppose that the m.g.f. of X exists for all values of t in some open interval around t = 0. Then ψ'(0) = E(X). More generally, the n-th derivative at 0 satisfies ψ^(n)(0) = E(X^n) for n = 1, 2, ....
Jia-Ying Chen
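
A numerical sketch of how derivatives of the m.g.f. at t = 0 recover moments; the Bernoulli(p) m.g.f. below is an assumed example, not one from the slides:

import numpy as np

p = 0.3
def psi(t):
    return p * np.exp(t) + (1 - p)    # E(e^{tX}) for X ~ Bernoulli(p)

h = 1e-3
print((psi(h) - psi(-h)) / (2 * h))              # ~ psi'(0)  = E(X)   = 0.3
print((psi(h) - 2 * psi(0.0) + psi(-h)) / h**2)  # ~ psi''(0) = E(X^2) = 0.3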

Properties of Moment Generating Functions
Let X have m.g.f. ψ1 and let Y = aX + b have m.g.f. ψ2. Then for every value of t such that ψ1(at) exists, ψ2(t) = e^(bt)·ψ1(at).
Proof: ψ2(t) = E(e^(tY)) = E(e^(t(aX + b))) = e^(bt)·E(e^(atX)) = e^(bt)·ψ1(at).
Suppose that X1, ..., Xn are n independent random variables, and for i = 1, ..., n let ψi denote the m.g.f. of Xi. Let Y = X1 + ... + Xn, and let the m.g.f. of Y be denoted by ψ. Then for every value of t such that each ψi(t) exists, we have ψ(t) = ψ1(t)·ψ2(t)·...·ψn(t).
Jia-Ying Chen

The m.g.f. for the Binomial Distribution
Suppose that a random variable X has a binomial distribution with parameters n and p. We can represent X as the sum of n independent Bernoulli random variables X1, ..., Xn, each with Pr(Xi = 1) = p and Pr(Xi = 0) = 1 - p.
Each Xi has m.g.f. ψi(t) = E(e^(tXi)) = p·e^t + (1 - p), so the m.g.f. of X = X1 + ... + Xn is ψ(t) = (p·e^t + 1 - p)^n.
Jia-Ying Chen
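
A numerical check that the closed form (p·e^t + 1 - p)^n matches E(e^(tX)) computed directly from the binomial p.m.f.; n, p, and t are arbitrary choices:

import numpy as np
from math import comb

n, p, t = 10, 0.4, 0.7

# E(e^{tX}) computed directly from the binomial p.m.f.
direct = sum(np.exp(t * k) * comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))
# Closed form derived above.
closed = (p * np.exp(t) + 1 - p) ** n

print(direct, closed)   # the two values agree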

Uniqueness of Moment Generating Functions
If the m.g.f.s of two random variables X1 and X2 are identical for all values of t in an open interval around t = 0, then the probability distributions of X1 and X2 must be identical.
The additive property of the binomial distribution: suppose X1 and X2 are independent random variables with binomial distributions with parameters n1 and p and n2 and p, respectively. Let the m.g.f. of X1 + X2 be denoted by ψ. Then ψ(t) = (p·e^t + 1 - p)^(n1)·(p·e^t + 1 - p)^(n2) = (p·e^t + 1 - p)^(n1 + n2), which is the m.g.f. of a binomial distribution with parameters n1 + n2 and p. By uniqueness, the distribution of X1 + X2 must be the binomial distribution with parameters n1 + n2 and p.
Jia-Ying Chen
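
A small check of the additive property: convolving the p.m.f.s of two binomial distributions with the same p reproduces the Binomial(n1 + n2, p) p.m.f. (parameters are illustrative):

from math import comb

def binom_pmf(n, p):
    return [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

n1, n2, p = 4, 6, 0.35
pmf1, pmf2 = binom_pmf(n1, p), binom_pmf(n2, p)

# p.m.f. of X1 + X2 by discrete convolution of the two p.m.f.s
conv = [sum(pmf1[i] * pmf2[k - i] for i in range(n1 + 1) if 0 <= k - i <= n2)
        for k in range(n1 + n2 + 1)]

# Largest difference from the Binomial(n1 + n2, p) p.m.f. is ~ 0.
print(max(abs(a - b) for a, b in zip(conv, binom_pmf(n1 + n2, p))))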

Example 4 (4.4.10)
Suppose that the random variables X and Y are i.i.d. and that the m.g.f. of each is ψ(t).
Find the m.g.f. of Z = 2X - 3Y + 4.
Jia-Ying Chen

Solution
Since X and Y are independent, ψ_Z(t) = E(e^(tZ)) = e^(4t)·E(e^(2tX))·E(e^(-3tY)) = e^(4t)·ψ(2t)·ψ(-3t); substituting the given m.g.f. ψ then yields the answer in closed form.
Jia-Ying Chen
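
A numerical sketch of the identity ψ_Z(t) = e^(4t)·ψ(2t)·ψ(-3t); since the slide's specific m.g.f. is not reproduced above, the code assumes X and Y are i.i.d. normal(1, 1) purely for illustration:

import numpy as np

rng = np.random.default_rng(4)

# Illustrative assumption: X, Y i.i.d. normal with mean 1 and s.d. 1,
# whose m.g.f. is psi(t) = exp(t + t^2 / 2) (not the m.g.f. from the slide).
def psi(t):
    return np.exp(t + t**2 / 2)

x = rng.normal(1.0, 1.0, size=1_000_000)
y = rng.normal(1.0, 1.0, size=1_000_000)
z = 2 * x - 3 * y + 4

t = 0.2
print(np.mean(np.exp(t * z)))                    # Monte Carlo estimate of psi_Z(t)
print(np.exp(4 * t) * psi(2 * t) * psi(-3 * t))  # e^{4t} * psi(2t) * psi(-3t), same value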

Properties of Variance and Covariance
If X and Y are random variables such that Var(X) < ∞ and Var(Y) < ∞, then Var(X + Y) = Var(X) + Var(Y) + 2·Cov(X, Y).
Correlation only measures linear relationship: two variables can be dependent and yet have correlation 0.
Example: Suppose that X can take only the three values -1, 0, and 1, and that each of these three values has the same probability. Let Y = X². Then X and Y are dependent, but E(XY) = E(X³) = E(X) = 0, so Cov(X, Y) = E(XY) - E(X)·E(Y) = 0 (uncorrelated).
Jia-Ying Chen
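
A direct Python check of the dependent-but-uncorrelated example above:

values = [-1, 0, 1]
probs  = [1/3, 1/3, 1/3]

e_x  = sum(p * v      for v, p in zip(values, probs))   # E(X)  = 0
e_y  = sum(p * v**2   for v, p in zip(values, probs))   # E(Y)  = E(X^2) = 2/3
e_xy = sum(p * v**3   for v, p in zip(values, probs))   # E(XY) = E(X^3) = 0

print(e_xy - e_x * e_y)   # Cov(X, Y) = 0: uncorrelated, although Y = X^2 depends on X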

Example 5 (4.6.12)
Suppose that X and Y have a continuous joint distribution for which the joint p.d.f. is as follows:
Determine the value of Var(2X - 3Y + 8).
Jia-Ying Chen

Solution
Var(2X - 3Y + 8) = 4·Var(X) + 9·Var(Y) + 2·(2)·(-3)·Cov(X, Y) = 4·Var(X) + 9·Var(Y) - 12·Cov(X, Y), where Var(X), Var(Y), and Cov(X, Y) = E(XY) - E(X)·E(Y) are computed from the given joint p.d.f.
Jia-Ying Chen
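
A numerical sketch of the formula above; the joint p.d.f. f(x, y) = x + y on the unit square is an assumption for illustration only, not the p.d.f. from the slide:

import numpy as np

# Assumed joint p.d.f. for illustration: f(x, y) = x + y on 0 <= x, y <= 1.
d = 1e-3
xs = np.arange(d / 2, 1.0, d)
X, Y = np.meshgrid(xs, xs, indexing="ij")
F = X + Y

def E(g):
    # double integral of g(x, y) * f(x, y) over the unit square (midpoint rule)
    return np.sum(g * F) * d * d

var_x  = E(X**2) - E(X)**2
var_y  = E(Y**2) - E(Y)**2
cov_xy = E(X * Y) - E(X) * E(Y)

print(4 * var_x + 9 * var_y - 12 * cov_xy)           # Var(2X - 3Y + 8)
print(E((2*X - 3*Y + 8)**2) - E(2*X - 3*Y + 8)**2)   # direct check, same value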