CLASSICAL NORMAL LINEAR REGRESSION MODEL (CNLRM)


Chapter Four
CLASSICAL NORMAL LINEAR REGRESSION MODEL (CNLRM)

4.2 THE NORMALITY ASSUMPTION FOR ui
The assumptions given above can be stated more compactly as

ui ~ N(0, σ²)        (4.2.4)

where the symbol ~ means "distributed as" and N stands for the normal distribution, the terms in the parentheses representing the two parameters of the normal distribution, namely, the mean and the variance.
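As a concrete illustration (not part of the original slides), the following Python sketch generates data that satisfy the CNLRM assumptions, drawing the disturbances as ui ~ N(0, σ²). The particular parameter values and the uniform choice of X are my own assumptions.

```python
import numpy as np

# A minimal sketch of a data-generating process satisfying u_i ~ N(0, sigma^2).
# The parameter values below are illustrative assumptions, not from the slides.
rng = np.random.default_rng(0)

n, beta1, beta2, sigma = 100, 2.0, 0.5, 1.0   # hypothetical true values
X = rng.uniform(0, 10, size=n)                # regressor, treated as fixed in repeated samples
u = rng.normal(loc=0.0, scale=sigma, size=n)  # u_i ~ N(0, sigma^2), independent across i
Y = beta1 + beta2 * X + u                     # Y_i = beta1 + beta2 * X_i + u_i
```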

Therefore, with the normality assumption, (4.2.4) means that ui and uj are not only uncorrelated but also independently distributed. Therefore, we can write (4.2.4) as

ui ~ NID(0, σ²)        (4.2.5)

where NID stands for normally and independently distributed.

Why the Normality Assumption?
Why do we employ the normality assumption? There are several reasons:
1. By the celebrated central limit theorem (CLT) of statistics, it can be shown that if there are a large number of independent and identically distributed random variables, then, with a few exceptions, the distribution of their sum tends to a normal distribution as the number of such variables increases indefinitely. It is the CLT that provides a theoretical justification for the assumption of normality of ui (a rough numerical illustration follows below).
2. A variant of the CLT states that, even if the number of variables is not very large or if these variables are not strictly independent, their sum may still be normally distributed.
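The CLT argument in reason 1 can be illustrated numerically. The sketch below (my own illustration, not part of the slides) builds each disturbance as the sum of many independent, non-normal shocks and checks that the standardized sum has approximately normal moments; the uniform shock distribution and the sample sizes are assumptions.

```python
import numpy as np

# Rough CLT illustration: each u is a sum of many iid uniform shocks,
# yet the distribution of the sum is approximately normal.
rng = np.random.default_rng(1)

n_shocks, n_draws = 50, 10_000
shocks = rng.uniform(-1, 1, size=(n_draws, n_shocks))
u = shocks.sum(axis=1)                       # each u is the sum of 50 iid shocks

# Standardize and compare moments with N(0, 1): skewness ~ 0, kurtosis ~ 3.
z = (u - u.mean()) / u.std()
print("skewness:", np.mean(z**3))            # close to 0
print("kurtosis:", np.mean(z**4))            # close to 3
```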

4. The normal distribution is a comparatively simple distribution involving only two parameters (the mean and the variance); it is very well known, and its theoretical properties have been studied extensively in mathematical statistics. Besides, many phenomena seem to follow the normal distribution. Moreover, if the sample size is reasonably large, we may be able to relax the normality assumption.

4.3 PROPERTIES OF OLS ESTIMATORS UNDER THE NORMALITY ASSUMPTION
With the assumption that the ui follow the normal distribution as in (4.2.5), the OLS estimators have the following properties (Appendix A provides a general discussion of the desirable statistical properties of estimators):
1. They are unbiased.
2. They have minimum variance. Combined with 1, this means that they are minimum-variance unbiased, or efficient, estimators.
3. They are consistent; that is, as the sample size increases indefinitely, the estimators converge to their true population values.
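A small Monte Carlo experiment (my own sketch, not from the slides) illustrates properties 1 and 3: the OLS slope estimator stays centered on the true value at every sample size, while its sampling variability shrinks as n grows. The true parameter values used are assumptions.

```python
import numpy as np

# Monte Carlo check of unbiasedness and consistency of the OLS slope estimator.
# True parameter values are illustrative assumptions.
rng = np.random.default_rng(2)
beta1, beta2, sigma = 2.0, 0.5, 1.0

def ols_slope(n):
    X = rng.uniform(0, 10, size=n)
    u = rng.normal(0, sigma, size=n)
    Y = beta1 + beta2 * X + u
    x = X - X.mean()
    return np.sum(x * Y) / np.sum(x**2)      # OLS slope: sum(x_i * Y_i) / sum(x_i^2), x_i = X_i - X_bar

for n in (20, 200, 2000):
    draws = np.array([ols_slope(n) for _ in range(5000)])
    print(f"n={n:5d}  mean={draws.mean():.4f}  sd={draws.std():.4f}")
# The mean stays near beta2 = 0.5 (unbiasedness) while the sd shrinks (consistency).
```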

Or, more compactly,

β̂1 ~ N(β1, σ²_β̂1)

Then by the properties of the normal distribution the variable Z, which is defined as

Z = (β̂1 − β1) / σ_β̂1        (4.3.3)

follows the standard normal distribution, that is, a normal distribution with zero mean and unit (= 1) variance, or

Z ~ N(0, 1)

Or, more compactly,

β̂2 ~ N(β2, σ²_β̂2)

Then, as in (4.3.3),

Z = (β̂2 − β2) / σ_β̂2

also follows the standard normal distribution.
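This standardization claim can be checked by simulation. The sketch below (my own, not from the slides) repeatedly draws samples with a fixed X, computes β̂2, and standardizes it using the true σ; the resulting Z draws have mean near 0 and variance near 1, consistent with Z ~ N(0, 1). Parameter values are assumptions.

```python
import numpy as np

# Simulation check that Z = (beta2_hat - beta2) / sigma_{beta2_hat} is N(0, 1)
# when the true error standard deviation sigma is used.  Values are assumptions.
rng = np.random.default_rng(3)
beta1, beta2, sigma, n = 2.0, 0.5, 1.0, 30

X = rng.uniform(0, 10, size=n)               # keep X fixed across repeated samples
x = X - X.mean()
se_beta2 = sigma / np.sqrt(np.sum(x**2))     # sigma_{beta2_hat} = sigma / sqrt(sum x_i^2)

z_draws = []
for _ in range(10_000):
    u = rng.normal(0, sigma, size=n)
    Y = beta1 + beta2 * X + u
    beta2_hat = np.sum(x * Y) / np.sum(x**2)
    z_draws.append((beta2_hat - beta2) / se_beta2)

z = np.array(z_draws)
print("mean ≈ 0:", z.mean(), "  variance ≈ 1:", z.var())
```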

More neatly, we can write

which is the density function of a normally distributed variable with the given mean and variance. The method of maximum likelihood, as the name indicates, consists in estimating the unknown parameters in such a manner that the probability of observing the given Y’s is as high (or maximum) as possible. Therefore, we have to find the maximum of the function (4).
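As a rough numerical counterpart to this idea (not part of the slides), the sketch below writes the negative log of the likelihood for the two-variable normal regression model and maximizes the likelihood by minimizing it with a generic optimizer. The data-generating values and the use of scipy.optimize are my own assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Sketch: estimate (beta1, beta2, sigma) by maximizing the normal likelihood,
# implemented as minimizing the negative log-likelihood.  Data values are assumptions.
rng = np.random.default_rng(4)
n, b1_true, b2_true, sigma_true = 200, 2.0, 0.5, 1.0
X = rng.uniform(0, 10, size=n)
Y = b1_true + b2_true * X + rng.normal(0, sigma_true, size=n)

def neg_log_likelihood(params):
    b1, b2, log_sigma = params                # optimize log(sigma) so that sigma stays positive
    sigma = np.exp(log_sigma)
    resid = Y - b1 - b2 * X
    # negative log of the joint normal density of the Y's
    return 0.5 * n * np.log(2 * np.pi * sigma**2) + np.sum(resid**2) / (2 * sigma**2)

result = minimize(neg_log_likelihood, x0=np.zeros(3))
b1_ml, b2_ml, sigma_ml = result.x[0], result.x[1], np.exp(result.x[2])
print(b1_ml, b2_ml, sigma_ml)                 # close to the OLS estimates of beta1 and beta2
```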

After simplifying, Eqs. (9) and (10) yield the two normal equations

ΣYi = n β̃1 + β̃2 ΣXi
ΣYi Xi = β̃1 ΣXi + β̃2 ΣXi²

where β̃1 and β̃2 denote the ML estimators. These are precisely the normal equations of ordinary least squares, so the ML estimators of β1 and β2 are identical to the OLS estimators.
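To make this conclusion concrete, the following sketch (my own, not from the slides) solves the two normal equations directly as a 2×2 linear system and verifies that the solution coincides with the usual OLS formulas; the simulated data are assumptions.

```python
import numpy as np

# Check that the normal equations from the likelihood first-order conditions
# give the same beta estimates as the OLS formulas.  Data values are assumptions.
rng = np.random.default_rng(5)
n = 100
X = rng.uniform(0, 10, size=n)
Y = 2.0 + 0.5 * X + rng.normal(0, 1.0, size=n)

# Normal equations:  sum(Y)   = n*b1      + b2*sum(X)
#                    sum(X*Y) = b1*sum(X) + b2*sum(X^2)
A = np.array([[n, X.sum()], [X.sum(), (X**2).sum()]])
b = np.array([Y.sum(), (X * Y).sum()])
b1_ml, b2_ml = np.linalg.solve(A, b)

# OLS formulas for comparison
x, y = X - X.mean(), Y - Y.mean()
b2_ols = np.sum(x * y) / np.sum(x**2)
b1_ols = Y.mean() - b2_ols * X.mean()
print(np.allclose([b1_ml, b2_ml], [b1_ols, b2_ols]))   # True: ML betas equal OLS betas
```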