
Lecturer: Ing. Martina Hanová, PhD.

 How do we evaluate a model?  How do we know whether the model we are using is good?  The assumptions relate to the (population) prediction errors.  We study them through the (sample) estimated errors, the residuals.

The four conditions of the SLR model What can go wrong with our model? 1. Yi is a Linear function of Xi. 2. Errors εi are Independent. 3. Errors εi are Normally distributed. 4. Errors εi have Equal variances (denoted σ2).
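The residual-based checking these slides describe can be sketched in plain Python. This is an illustrative simulation, not the lecture's data: the true intercept (2) and slope (3) are assumed, and the OLS estimates are computed with the closed-form simple-regression formulas.

```python
import random

# Simulated data (assumed true model y = 2 + 3x + e, e ~ N(0, 1))
random.seed(42)
n = 50
x = [random.uniform(0, 10) for _ in range(n)]
y = [2 + 3 * xi + random.gauss(0, 1) for xi in x]

# Closed-form OLS estimates for simple linear regression
x_bar = sum(x) / n
y_bar = sum(y) / n
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) \
     / sum((xi - x_bar) ** 2 for xi in x)
b0 = y_bar - b1 * x_bar

# The residuals estimate the unobservable errors; the L-I-N-E
# conditions are checked on these, as the slide says.
residuals = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]

print(round(b0, 2), round(b1, 2))          # estimates near 2 and 3
print(round(sum(residuals), 10))           # OLS residuals sum to ~0
```

With an intercept in the model, the residuals sum to zero by construction, which is why diagnostics focus on their pattern rather than their mean.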

 all of the estimates, intervals, and hypothesis tests arising in a regression analysis have been developed assuming that the model is correct.  all the formulas depend on the model being correct!  if the model is incorrect, then the formulas and methods (OLS) we use are at risk of being incorrect.

Model is linear in parameters
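"Linear in parameters" permits nonlinear transformations of the regressors. A minimal sketch, with an assumed model y = 2 + 0.5·x² + e: regressing y on the transformed variable z = x² is still ordinary linear regression.

```python
import random

# Assumed data-generating process: y = 2 + 0.5 * x**2 + e
random.seed(7)
x = [random.uniform(0, 4) for _ in range(200)]
z = [xi ** 2 for xi in x]                      # transformed regressor
y = [2 + 0.5 * zi + random.gauss(0, 0.5) for zi in z]

# Usual OLS slope formula applied to (z, y): linear in parameters
z_bar = sum(z) / len(z)
y_bar = sum(y) / len(y)
b1 = sum((zi - z_bar) * (yi - y_bar) for zi, yi in zip(z, y)) \
     / sum((zi - z_bar) ** 2 for zi in z)
print(round(b1, 2))   # recovers the coefficient on x**2
```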

LRM (linear regression model) vs. NRM (nonlinear regression model)

Zero mean value of the disturbances ui: E(ui | Xi) = 0.

No autocorrelation between the disturbances. The data are a random sample of the population.

Equal variance of the disturbances ui. Homoskedasticity: the errors have constant variance. Heteroskedasticity: the errors have non-constant variance.
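The contrast can be simulated. In this illustrative sketch (assumed error processes, not the lecture's data), homoskedastic errors have the same spread everywhere, while the heteroskedastic errors' standard deviation grows with x:

```python
import random, statistics

random.seed(0)
x = [i / 10 for i in range(1, 101)]               # x from 0.1 to 10.0
homo = [random.gauss(0, 1.0) for xi in x]         # constant sigma = 1
hetero = [random.gauss(0, 0.3 * xi) for xi in x]  # sigma rises with x

# Compare the error spread in the low-x half vs the high-x half
homo_lo, homo_hi = statistics.stdev(homo[:50]), statistics.stdev(homo[50:])
het_lo, het_hi = statistics.stdev(hetero[:50]), statistics.stdev(hetero[50:])

print(round(homo_lo, 2), round(homo_hi, 2))   # similar spreads
print(round(het_lo, 2), round(het_hi, 2))     # spread grows with x
```

This is the pattern behind the "predicting food expenditure at low vs high incomes" example: under heteroskedasticity, predictions are more precise where the error variance is smaller.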

Construction of the variance-covariance matrix: the vector of residuals e multiplied by its transpose, e e′.
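A minimal sketch of that outer product, using a tiny assumed residual vector: the diagonal entries of e e′ are the squared residuals, and the off-diagonal entries are the cross-products.

```python
# Assumed 3-element residual vector (illustrative only)
e = [1.0, -2.0, 0.5]

# Outer product e e' : an n x n matrix
outer = [[ei * ej for ej in e] for ei in e]

# Diagonal: squared residuals; off-diagonal: residual cross-products.
for row in outer:
    print(row)
```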

The errors are normally distributed (checked with a normal probability plot).
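A numerical stand-in for the normal probability plot (with assumed simulated residuals): sort the residuals, pair them with theoretical normal quantiles, and check that the two line up almost perfectly, i.e. their correlation is close to 1.

```python
import random, statistics

# Assumed residuals: 200 draws from a standard normal
random.seed(3)
n = 200
resid = sorted(random.gauss(0, 1) for _ in range(n))

# Theoretical normal quantiles at plotting positions (i + 0.5) / n
theo = [statistics.NormalDist().inv_cdf((i + 0.5) / n) for i in range(n)]

# Pearson correlation between sorted residuals and normal quantiles
r_bar = sum(resid) / n
t_bar = sum(theo) / n
num = sum((r - r_bar) * (t - t_bar) for r, t in zip(resid, theo))
den = (sum((r - r_bar) ** 2 for r in resid)
       * sum((t - t_bar) ** 2 for t in theo)) ** 0.5
corr = num / den
print(round(corr, 3))   # near 1 for normally distributed residuals
```

Strong departures from normality (skewness, heavy tails) would pull this correlation visibly below 1, just as they bend the points away from the line on the plot.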

The errors, εi, are independent normal random variables with mean zero and constant variance σ2.

 The population regression function is not linear.  The error terms are not independent.  The error terms are not normally distributed.  The error terms do not have equal variance.

Zero covariance between ui and Xi: cov(ui, Xi) = 0.

The number of observations >= the number of explanatory variables.

 The mean of the response, E(Yi), at each set of values of the predictors, (x1i, x2i, …), is a Linear function of the predictors.  The errors, εi, are Independent.  The errors, εi, at each set of values of the predictors, (x1i, x2i, …), are Normally distributed.  The errors, εi, at each set of values of the predictors, (x1i, x2i, …), have Equal variances (denoted σ2).

 All tests and intervals are very sensitive to even minor departures from independence.  All tests and intervals are moderately sensitive to departures from equal variance.  The hypothesis tests and confidence intervals for βi are fairly "robust" (that is, forgiving) against departures from normality.  Prediction intervals are quite sensitive to departures from normality.

The Gauss-Markov theorem  When the first four assumptions of the simple regression model are satisfied, the parameter estimates are unbiased and have the smallest variance among all linear unbiased estimators.  The OLS estimators are therefore called BLUE: Best Linear Unbiased Estimators.
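The unbiasedness half of the theorem can be illustrated by Monte Carlo. In this sketch (assumed true slope of 3, assumed N(0, 1) errors satisfying the Gauss-Markov conditions), the OLS slope varies from sample to sample but averages to the true value:

```python
import random

# Assumed true model: y = 1 + 3x + e, with Gauss-Markov errors
random.seed(1)
true_b1 = 3.0
x = [i / 5 for i in range(1, 31)]        # fixed regressor values
x_bar = sum(x) / len(x)
sxx = sum((xi - x_bar) ** 2 for xi in x)

# Re-estimate the slope on 2000 independent samples
estimates = []
for _ in range(2000):
    y = [1.0 + true_b1 * xi + random.gauss(0, 1) for xi in x]
    y_bar = sum(y) / len(y)
    b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / sxx
    estimates.append(b1)

mean_b1 = sum(estimates) / len(estimates)
print(round(mean_b1, 2))   # close to the true slope of 3
```

Individual estimates scatter around 3, but their average converges to it: that is unbiasedness. The "best" (minimum-variance) half of BLUE is what fails first when the equal-variance or no-autocorrelation assumptions break.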