Conditional Expectation


Conditional Expectation For X, Y discrete random variables, the conditional expectation of Y given X = x is E(Y | X = x) = Σ_y y P(Y = y | X = x), and the conditional variance of Y given X = x is Var(Y | X = x) = E(Y² | X = x) − [E(Y | X = x)]², where these are defined only if the sums converge absolutely. In general, E(g(Y) | X = x) = Σ_y g(y) P(Y = y | X = x). week 4
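As an illustration, the discrete definitions above can be computed directly from a joint pmf. The joint distribution below is a made-up example, not one from the notes:

```python
# Conditional expectation and variance from a joint discrete distribution.
# The joint pmf below is an illustrative, made-up example.
joint = {  # (x, y) -> P(X = x, Y = y)
    (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
    (1, 0): 0.15, (1, 1): 0.25, (1, 2): 0.20,
}

def cond_expectation(joint, x, g=lambda y: y):
    """E[g(Y) | X = x] = sum over y of g(y) * P(Y = y | X = x)."""
    px = sum(p for (xi, _), p in joint.items() if xi == x)  # marginal P(X = x)
    return sum(g(yi) * p / px for (xi, yi), p in joint.items() if xi == x)

def cond_variance(joint, x):
    """Var(Y | X = x) = E[Y^2 | X = x] - E[Y | X = x]^2."""
    m = cond_expectation(joint, x)
    return cond_expectation(joint, x, g=lambda y: y * y) - m * m

e = cond_expectation(joint, 1)  # E[Y | X = 1]
v = cond_variance(joint, 1)     # Var(Y | X = 1)
print(e, v)
```

The sums here are finite, so the absolute-convergence condition is automatic.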

For X, Y continuous random variables, the conditional expectation of Y given X = x is E(Y | X = x) = ∫ y f(y | x) dy, and the conditional variance of Y given X = x is Var(Y | X = x) = E(Y² | X = x) − [E(Y | X = x)]², where f(y | x) is the conditional density of Y given X = x. In general, E(g(Y) | X = x) = ∫ g(y) f(y | x) dy.

Example Suppose X, Y are continuous random variables with a given joint density function f(x, y). Find E(X | Y = 2).

More about Conditional Expectation Assume that E(Y | X = x) exists for every x in the range of X. Then E(Y | X) is a random variable: it takes the value E(Y | X = x) when X = x. The expectation of this random variable is E[E(Y | X)]. Theorem E[E(Y | X)] = E(Y). This is called the “Law of Total Expectation”. Proof (discrete case): E[E(Y | X)] = Σ_x E(Y | X = x) P(X = x) = Σ_x Σ_y y P(Y = y | X = x) P(X = x) = Σ_y y Σ_x P(X = x, Y = y) = Σ_y y P(Y = y) = E(Y).

Example Suppose we roll a fair die; whatever number N comes up, we toss a fair coin that many times. What is the expected number of heads? By the law of total expectation, E(heads) = E[E(heads | N)] = E(N/2) = 3.5/2 = 1.75.
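The expected number of heads can be checked exactly by conditioning on the six die outcomes:

```python
from fractions import Fraction

# Roll a fair die to get N, then toss a fair coin N times.
# E[heads] = E[E[heads | N]] = E[N / 2], computed by enumerating N = 1..6.
expected_heads = sum(Fraction(1, 6) * Fraction(n, 2) for n in range(1, 7))
print(expected_heads)  # 7/4
```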

Theorem (Law of Total Variance) For random variables X, Y, V(Y) = V[E(Y | X)] + E[V(Y | X)]. Proof: V(Y) = E(Y²) − [E(Y)]² = E[E(Y² | X)] − (E[E(Y | X)])² = E[V(Y | X) + (E(Y | X))²] − (E[E(Y | X)])² = E[V(Y | X)] + V[E(Y | X)].

Example Let Y be the number of customers entering a CIBC branch in a day. It is known that Y has a Poisson distribution with some unknown mean λ. Suppose that 1% of the customers entering the branch in a day open a new CIBC bank account. Find the mean and variance of the number N of customers who open a new CIBC bank account in a day. Here N | Y ~ Binomial(Y, 0.01), so E(N) = E[E(N | Y)] = E(0.01Y) = 0.01λ, and V(N) = E[V(N | Y)] + V[E(N | Y)] = E(0.01 · 0.99 · Y) + V(0.01Y) = 0.0099λ + 0.0001λ = 0.01λ.
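The two conditioning laws give the answer in closed form; the sketch below checks the algebra numerically, using an illustrative value λ = 50 (λ is unknown in the example):

```python
# Branch example: Y ~ Poisson(λ) customers, each opens an account with
# probability p = 0.01 independently, so N | Y ~ Binomial(Y, p).
# λ = 50 is an assumed illustrative value, not from the notes.
lam, p = 50.0, 0.01

mean_N = p * lam                            # E(N) = E[E(N|Y)] = E(pY) = pλ
var_N = p * (1 - p) * lam + (p ** 2) * lam  # E[V(N|Y)] + V[E(N|Y)] = pλ

print(mean_N, var_N)  # both equal pλ, as for a Poisson(pλ) variable
```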

Minimum Variance Unbiased Estimator The MVUE for θ is the unbiased estimator with the smallest possible variance: among all unbiased estimators, we look for the one with the smallest variance.

The Rao-Blackwell Theorem Let θ̂ be an unbiased estimator for θ such that V(θ̂) < ∞. If T is a sufficient statistic for θ, define θ* = E(θ̂ | T). Then, for all θ, E(θ*) = θ and V(θ*) ≤ V(θ̂). Proof: Unbiasedness follows from the law of total expectation, E(θ*) = E[E(θ̂ | T)] = E(θ̂) = θ. The variance inequality follows from the law of total variance: V(θ̂) = V[E(θ̂ | T)] + E[V(θ̂ | T)] = V(θ*) + E[V(θ̂ | T)] ≥ V(θ*).
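A standard illustration (not one from the notes): for Bernoulli(p) data, θ̂ = X₁ is unbiased for p, T = ΣXᵢ is sufficient, and E(X₁ | T) = T/n, the sample mean, which has much smaller variance:

```python
# Rao-Blackwell in miniature: improving θ̂ = X₁ for Bernoulli(p) data.
# p and n are illustrative values chosen for this sketch.
p, n = 0.3, 20

var_crude = p * (1 - p)   # Var(X₁), the crude unbiased estimator
var_rb = p * (1 - p) / n  # Var(T/n), where T/n = E(X₁ | T) and T = ΣXᵢ

print(var_crude, var_rb)  # the theorem guarantees var_rb <= var_crude
```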

How to find estimators? There are two main methods for finding estimators: 1) the method of moments, and 2) the method of maximum likelihood. Sometimes the two methods give the same estimator.

Method of Moments The method of moments is a very simple procedure for finding an estimator for one or more parameters of a statistical model, and it is one of the oldest methods for deriving point estimators. Recall: the k-th moment of a random variable X is μ_k = E(X^k). These will very often be functions of the unknown parameters. The corresponding k-th sample moment is the average m_k = (1/n) Σ_{i=1}^n X_i^k. The method-of-moments estimators are the solutions to the equations μ_k = m_k, one equation for each unknown parameter.
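As a sketch (an illustrative example, not one from the notes): for a Normal(μ, σ²) sample, μ₁ = μ and μ₂ = σ² + μ², so equating population and sample moments gives μ̂ = m₁ and σ̂² = m₂ − m₁²:

```python
import random

# Method of moments for Normal(μ, σ²): solve μ1 = m1 and μ2 = m2.
# The true values μ = 10, σ = 3 below are illustrative.
random.seed(42)
data = [random.gauss(10.0, 3.0) for _ in range(100_000)]

m1 = sum(data) / len(data)                 # first sample moment
m2 = sum(x * x for x in data) / len(data)  # second sample moment

mu_hat = m1                # from μ1 = μ = m1
sigma2_hat = m2 - m1 ** 2  # from μ2 = σ² + μ² = m2
print(mu_hat, sigma2_hat)  # close to 10 and 9
```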

Examples

Maximum Likelihood Estimators In the likelihood function, different values of θ attach different probabilities to a particular observed sample. The likelihood function L(θ | x1, …, xn) can be maximized over θ to give the parameter value that attaches the highest possible probability to the observed sample. We can therefore maximize the likelihood function to find an estimator of θ. This estimator is a statistic, i.e., a function of the sample data; it is denoted by θ̂.

The log likelihood function l(θ) = ln(L(θ)) is the log likelihood function. Both the likelihood function and the log likelihood function attain their maximum at the same value of θ, because ln is strictly increasing, and it is often easier to maximize l(θ).
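For instance (an illustrative example, not from the notes), for an Exponential(λ) sample the log likelihood is l(λ) = n ln λ − λ Σxᵢ, which is maximized at λ̂ = 1/x̄; a grid search over l(λ) lands at the same place:

```python
import math
import random

# Maximize the Exponential(λ) log likelihood by grid search and compare
# with the closed-form MLE λ̂ = 1 / x̄. Data are simulated with true λ = 2.
random.seed(7)
data = [random.expovariate(2.0) for _ in range(10_000)]

def log_likelihood(lam, xs):
    # l(λ) = n log λ − λ Σ x_i for the density f(x) = λ e^{-λx}
    return len(xs) * math.log(lam) - lam * sum(xs)

grid = [0.5 + 0.001 * k for k in range(3001)]  # λ values in [0.5, 3.5]
lam_grid = max(grid, key=lambda l: log_likelihood(l, data))
lam_closed = len(data) / sum(data)             # λ̂ = 1 / x̄
print(lam_grid, lam_closed)
```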

Examples

Important Comment Some MLEs cannot be determined using calculus. This occurs whenever the support of the distribution is a function of the parameter θ. These are best solved by inspecting (or graphing) the likelihood function. Example:
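The classic instance of this (an illustrative choice, not necessarily the slide's example) is X₁, …, Xₙ ~ Uniform(0, θ): the likelihood is L(θ) = θ⁻ⁿ for θ ≥ max xᵢ and 0 otherwise, so the maximum sits at the boundary θ̂ = max xᵢ, where the derivative is not zero:

```python
import random

# Support depends on θ: Uniform(0, θ), with an assumed θ = 5 for illustration.
random.seed(3)
theta = 5.0
data = [random.uniform(0, theta) for _ in range(5_000)]

# L(θ) = θ^(-n) on [max(data), ∞) and 0 below it; it is strictly decreasing
# in θ there, so the likelihood is maximized at the left endpoint.
theta_hat = max(data)
print(theta_hat)  # just below the true θ = 5
```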

Properties of MLE The MLE is invariant, i.e., the MLE of g(θ) is equal to the function g evaluated at the MLE: the MLE of g(θ) is g(θ̂). Proof: Examples:
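For example (an illustrative case, not from the notes): with Exponential(λ) data the MLE of the rate λ is 1/x̄, so by invariance the MLE of the mean g(λ) = 1/λ is simply x̄:

```python
import random

# Invariance of the MLE: the MLE of g(λ) = 1/λ equals g(λ̂).
# Data simulated with an illustrative true rate λ = 0.5 (mean = 2).
random.seed(11)
data = [random.expovariate(0.5) for _ in range(10_000)]

lam_hat = len(data) / sum(data)  # MLE of the rate: λ̂ = 1 / x̄
mean_mle = 1.0 / lam_hat         # MLE of the mean g(λ) = 1/λ, by invariance
xbar = sum(data) / len(data)

print(mean_mle, xbar)  # identical: invariance just re-parameterizes λ̂
```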