1 Hypothesis Testing Under General Linear Model
• Previously we derived the sampling property results assuming normality:
• Y = Xβ + e, where e_t ~ N(0, σ²) → Y ~ N(Xβ, σ²I_T)
• β_S = (X'X)⁻¹X'Y, E(β_S) = β
• Cov(β_S) = Σ_β = σ²(X'X)⁻¹
• β_S ~ N(β, σ²(X'X)⁻¹)
• σ²_U is an unbiased estimate of σ²
• An estimate of Cov(β_S) is Σ̂_βS = σ²_U(X'X)⁻¹, with residuals e = Y − Xβ_S
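The sampling results above can be illustrated numerically. A minimal sketch with simulated data (all variable names and values here are ours, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
T, K = 30, 3
X = np.column_stack([np.ones(T), rng.normal(size=(T, K - 1))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.5, size=T)

XtX_inv = np.linalg.inv(X.T @ X)
beta_s = XtX_inv @ X.T @ y      # least squares estimator (X'X)^-1 X'Y
e = y - X @ beta_s              # residuals
sigma2_u = e @ e / (T - K)      # unbiased estimate of sigma^2
cov_beta = sigma2_u * XtX_inv   # estimated Cov(beta_s)
```

By the least squares first-order conditions, X'e = 0 holds exactly (up to floating point) for the fitted residuals.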

2 Hypothesis Testing Under General Linear Model
• Single Parameter (β_k) Hypothesis Test
• β_k,S ~ N(β_k, Var(β_k)), where Var(β_k) is the kth diagonal element of Σ_βS
• When σ² is known: Z = (β_k,S − β_k) / [σ²(X'X)⁻¹_kk]^(1/2) ~ N(0, 1), where β_k is the unknown true coefficient
• When σ² is not known, replace it with σ²_U (i.e., use Σ̂_βS = σ²_U(X'X)⁻¹): t = (β_k,S − β_k)/se(β_k,S) ~ t_(T−K)

3 Hypothesis Testing Under General Linear Model
• We can obtain a (1−α) CI for β_k: β_k,S ± t_(α/2, T−K) · se(β_k,S)
• There is a (1−α) probability that the true unknown value of β_k is within this range
• Does this interval contain our hypothesized value? If it does, then we cannot reject H₀
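A sketch of the confidence-interval check described above, on simulated data (the setup and hypothesized value are ours; the t critical value comes from scipy):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
T, K = 30, 2
X = np.column_stack([np.ones(T), rng.normal(size=T)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=T)

XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ y
e = y - X @ b
sigma2_u = e @ e / (T - K)
se = np.sqrt(sigma2_u * np.diag(XtX_inv))   # standard errors

alpha = 0.05
t_crit = stats.t.ppf(1 - alpha / 2, df=T - K)
ci_low, ci_high = b[1] - t_crit * se[1], b[1] + t_crit * se[1]
# Do not reject H0: beta_1 = c if c lies inside [ci_low, ci_high]
```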

4 Hypothesis Testing Under General Linear Model
• Testing More Than One Linear Combination of Estimated Coefficients
• Assume we have a priori information about the value of β
• We can represent this information via a set of J linear hypotheses (or restrictions)
• In matrix notation: Rβ = r

5 Hypothesis Testing Under General Linear Model
• Rβ = r, where R is a (J×K) matrix of known coefficients and r is a (J×1) vector of known constants

6 Hypothesis Testing Under General Linear Model
• Assume we have a model with 5 parameters to be estimated
• Joint hypotheses: β₁ = 8 and β₂ = β₃ (i.e., β₂ − β₃ = 0)
• J = 2, K = 5
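For this J = 2, K = 5 example, the restrictions can be stacked into R and r. A sketch that assumes the parameter vector is ordered (β₁, …, β₅)'; the slide's image of R is not recoverable, so this ordering is our assumption:

```python
import numpy as np

# Joint hypotheses: beta_1 = 8 and beta_2 - beta_3 = 0,
# assuming beta = (beta_1, ..., beta_5)' (ordering is our assumption)
R = np.array([[1.0, 0.0,  0.0, 0.0, 0.0],
              [0.0, 1.0, -1.0, 0.0, 0.0]])
r = np.array([8.0, 0.0])

# Any beta satisfying both hypotheses gives R @ beta == r
beta = np.array([8.0, 3.0, 3.0, -1.0, 0.5])
print(np.allclose(R @ beta, r))  # True
```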

7 Hypothesis Testing Under General Linear Model
• How do we obtain parameter estimates if the J hypotheses are true?
• Constrained (Restricted) Least Squares
• β_R is the β that minimizes S = (Y−Xβ)'(Y−Xβ) = e'e subject to Rβ = r (i.e., we act as if H₀ is true)
• Lagrangian: S* = (Y−Xβ)'(Y−Xβ) + λ'(r−Rβ)
• λ is a (J×1) vector of Lagrange multipliers associated with the J joint hypotheses
• We want to choose β_R so as to minimize the SSE while also satisfying the J constraints (hypotheses)

8 Hypothesis Testing Under General Linear Model
• Min S* = (Y−Xβ)'(Y−Xβ) + λ'(r−Rβ)
• What and how many FOCs? K+J FOCs: K from ∂S*/∂β and J from ∂S*/∂λ

9 Hypothesis Testing Under General Linear Model
• What are the FOCs? From S* = (Y−Xβ)'(Y−Xβ) + λ'(r−Rβ):
• ∂S*/∂β = −2X'(Y−Xβ_R) − R'λ = 0 (K equations)
• ∂S*/∂λ = r − Rβ_R = 0 (J equations)
• From the first set, β_R = β_S − (X'X)⁻¹R'(λ/2), where β_S = (X'X)⁻¹X'Y is the unrestricted CRM estimator; substitute this into the 2nd set

10 Hypothesis Testing Under General Linear Model
• Substituting the 1st FOC into the 2nd set and solving gives λ/2 = [R(X'X)⁻¹R']⁻¹(Rβ_S − r)
• Substitute this expression for λ/2 back into the 1st FOC: β_R = β_S − (X'X)⁻¹R'[R(X'X)⁻¹R']⁻¹(Rβ_S − r)
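The restricted least squares formula can be checked numerically. A sketch on simulated data with a single restriction (the data and restriction choices are ours); the resulting β_R satisfies the constraint exactly:

```python
import numpy as np

rng = np.random.default_rng(2)
T, K = 40, 3
X = np.column_stack([np.ones(T), rng.normal(size=(T, K - 1))])
y = X @ np.array([1.0, 2.0, 2.0]) + rng.normal(size=T)

XtX_inv = np.linalg.inv(X.T @ X)
b_s = XtX_inv @ X.T @ y                 # unrestricted LS estimator

R = np.array([[0.0, 1.0, -1.0]])        # one restriction: beta_2 = beta_3
r = np.array([0.0])

# beta_R = beta_S - (X'X)^-1 R' [R (X'X)^-1 R']^-1 (R beta_S - r)
middle = np.linalg.inv(R @ XtX_inv @ R.T)
b_r = b_s - XtX_inv @ R.T @ middle @ (R @ b_s - r)
print(np.allclose(R @ b_r, r))  # True: the restriction holds exactly
```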

11 Hypothesis Testing Under General Linear Model
• β_R is the restricted LS estimator of β as well as the restricted ML estimator
• Properties of the Restricted Least Squares Estimator:
• E(β_R) ≠ β if Rβ ≠ r, where β is the true but unknown value
• V(β_R) ≤ V(β_S): [V(β_S) − V(β_R)] is positive semi-definite
• diag(V(β_R)) ≤ diag(V(β_S))

12 Hypothesis Testing Under General Linear Model
• From the above, if Y is multivariate normal and H₀ is true: β_R ~ N(β, σ²M*(X'X)⁻¹M*') = N(β, σ²M*(X'X)⁻¹), where M* = I_K − (X'X)⁻¹R'[R(X'X)⁻¹R']⁻¹R
• From previous results, if r − Rβ ≠ 0 (e.g., not all of H₀ is true), the estimate of β is biased, E(β_R) ≠ β, if we continue to assume r − Rβ = 0

13 Hypothesis Testing Under General Linear Model
• The variance is the same regardless of the correctness of the restrictions and the biasedness of β_R
• → β_R has a variance that is smaller than that of β_S, which uses only the sample information

14 Hypothesis Testing Under General Linear Model
• Beer Consumption Example (log-log demand model):
• q_B ≡ quantity of beer purchased; P_B ≡ price of beer; P_L ≡ price of other alcoholic beverages; P_O ≡ price of other goods; INC ≡ household income
• Real prices matter? If all prices and INC rise by 10%, quantity demanded should not change: β₁ + β₂ + β₃ + β₄ = 0
• Equal price impacts for liquor and other goods? β₂ = β₃
• Unitary income elasticity? β₄ = 1

15 Hypothesis Testing Under General Linear Model
• Given the above, what do the R matrix and r vector look like for these joint tests?
• Let's develop a test statistic to test these joint hypotheses
• We are going to use the Likelihood Ratio (LR) to test the joint hypotheses
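One possible answer, sketched in code: assuming the parameter vector is ordered (β₀, β₁, β₂, β₃, β₄)' with β₀ the intercept (the ordering is our assumption), the three joint hypotheses stack into a (3×5) R and a (3×1) r:

```python
import numpy as np

# J = 3 joint hypotheses for the beer model, beta = (b0, b1, b2, b3, b4)':
#   b1 + b2 + b3 + b4 = 0,   b2 = b3,   b4 = 1
R = np.array([[0.0, 1.0, 1.0,  1.0, 1.0],
              [0.0, 0.0, 1.0, -1.0, 0.0],
              [0.0, 0.0, 0.0,  0.0, 1.0]])
r = np.array([0.0, 0.0, 1.0])

# A beta that satisfies all three hypotheses, for a quick check
beta = np.array([2.0, -1.5, 0.25, 0.25, 1.0])
print(np.allclose(R @ beta, r))  # True
```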

16 Hypothesis Testing Under General Linear Model
• LR = l_U*/l_R*
• l_U* = max over θ of l(θ|y₁,…,y_T), θ = (β, σ²) ∈ Ω: the "unrestricted" maximized likelihood function
• l_R* = max over θ of l(θ|y₁,…,y_T), θ = (β, σ²) ∈ Ω subject to Rβ = r: the "restricted" maximized likelihood function
• Again, because the null hypotheses (possibly) restrict the parameter space, LR ≥ 1

17 Hypothesis Testing Under General Linear Model
• If l_U* is large relative to l_R*, the data show evidence that the restrictions (hypotheses) are not true (e.g., reject the null hypothesis)
• How much should LR exceed 1 before we reject H₀?
• We reject H₀ when LR ≥ LR_C, where LR_C is a constant chosen on the basis of the relative costs of Type I vs. Type II errors
• When implementing the LR test you need to know the PDF of the dependent variable, which determines the distribution of the test statistic

18 Hypothesis Testing Under General Linear Model
• For the LR test, assume Y has a normal distribution → e ~ N(0, σ²I_T)
• This implies the following LR test statistic (LR*): LR* = [(SSE_R − SSE_U)/J] / [SSE_U/(T−K)]
• What are the distributional characteristics of LR*? We will address this in a bit

19 Hypothesis Testing Under General Linear Model
• We can derive alternative specifications of the LR test statistic:
• LR* = (SSE_R − SSE_U)/(J σ̂²_U) (ver. 1)
• LR* = (Rβ_S − r)'[R(X'X)⁻¹R']⁻¹(Rβ_S − r)/(J σ̂²_U) (ver. 2)
• LR* = (β_R − β_S)'(X'X)(β_R − β_S)/(J σ̂²_U) (ver. 3), where β_S is the unrestricted LS (= ML) estimator
• Distributional characteristics of LR* (JHGLL p. 255): under H₀, LR* ~ F_(J,T−K)
• J = number of hypotheses; K = number of parameters (including the intercept)
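The algebraic equivalence of the three versions can be verified numerically. A sketch on simulated data with one restriction (the setup is ours):

```python
import numpy as np

rng = np.random.default_rng(3)
T, K, J = 40, 3, 1
X = np.column_stack([np.ones(T), rng.normal(size=(T, K - 1))])
y = X @ np.array([1.0, 0.5, 0.5]) + rng.normal(size=T)

XtX_inv = np.linalg.inv(X.T @ X)
b_s = XtX_inv @ X.T @ y
e_u = y - X @ b_s
sse_u = e_u @ e_u
sigma2_u = sse_u / (T - K)

R = np.array([[0.0, 1.0, -1.0]])    # H0: beta_2 = beta_3
r = np.array([0.0])
middle = np.linalg.inv(R @ XtX_inv @ R.T)
b_r = b_s - XtX_inv @ R.T @ middle @ (R @ b_s - r)   # restricted LS
e_r = y - X @ b_r
sse_r = e_r @ e_r

lr1 = (sse_r - sse_u) / (J * sigma2_u)                          # ver. 1
lr2 = (R @ b_s - r) @ middle @ (R @ b_s - r) / (J * sigma2_u)   # ver. 2
lr3 = (b_r - b_s) @ (X.T @ X) @ (b_r - b_s) / (J * sigma2_u)    # ver. 3
print(np.allclose([lr1, lr2], lr3))  # True: all three versions coincide
```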

20 Hypothesis Testing Under General Linear Model
• Proposed Test Procedure
• Choose α = P(reject H₀ | H₀ true) = P(Type I error)
• Calculate the test statistic LR* based on sample information
• Find the critical value LR_crit in an F table such that α = P(F_(J,T−K) ≥ LR_crit)
[Figure: density f(LR*) under H₀, with the rejection region of area α to the right of LR_crit]

21 Hypothesis Testing Under General Linear Model
• Proposed Test Procedure
• Choose α = P(reject H₀ | H₀ true) = P(Type I error)
• Calculate the test statistic LR* based on sample information
• Find the critical value LR_crit in an F table such that α = P(F_(J,T−K) ≥ LR_crit)
• Reject H₀ if LR* ≥ LR_crit
• Do not reject H₀ if LR* < LR_crit

22 Hypothesis Testing Under General Linear Model
• Beer Consumption Example
• Does the regression do a better job of explaining variation in beer consumption than if we assumed the mean response across all observations?
• Remember SSE = (T−K)σ̂²_U
• Under H₀, all slope coefficients = 0
• Under H₀, TSS = SSE, given that there is no RSS and TSS = RSS + SSE

23 Hypothesis Testing Under General Linear Model
[Table: Log-Log Beer Consumption Model, 30 observations. Unconstrained model: coefficient, standard error, and t-statistic for Intercept, lnP_B, lnP_L, lnP_O, and ln(INC), with R², adjusted R², and σ̂_U reported; SSE = σ̂²_U × 25. Constrained model: intercept only, i.e., the mean of ln(Beer); SSE_R = σ̂²_(U,R) × 29; TSS = SSE_R. Numeric entries not recoverable from the transcript.]

24 Hypothesis Testing Under General Linear Model
• Results of our test of the overall significance of the regression model
• GAUSS command: CDFFC(29.544, 4, 25) = 3.799e−009
• CDFFC computes the complement of the CDF of the F distribution (1 − F_(df1,df2))
• This is an unlikely value of F if the null hypothesis (no impact of the exogenous variables on beer consumption) is true → reject the null hypothesis
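A present-day equivalent of the GAUSS call, sketched with scipy (same arguments as CDFFC above):

```python
from scipy import stats

# Upper-tail probability P(F_{4,25} >= 29.544), the complement of the F cdf
p = stats.f.sf(29.544, 4, 25)
print(p < 1e-8)  # True: far below any conventional significance level
```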

25 Hypothesis Testing Under General Linear Model
• Beer Consumption Example: the three joint hypotheses
• Price and income elasticities sum to 0 (e.g., β₁ + β₂ + β₃ + β₄ = 0)
• Other liquor and other goods price elasticities are equal (e.g., β₂ = β₃)
• Income elasticity = 1 (e.g., β₄ = 1)
• cdffc(0.84, 3, 25) = 0.4848 → we cannot reject the joint null hypothesis

26 Hypothesis Testing Under General Linear Model
[Figure: PDF of the F_(3,25) distribution; the right-tail rejection region has area α, and the location of our calculated test statistic F* lies to the left of the critical value]

27 Hypothesis Testing Under General Linear Model
• A side note: how do you estimate the variance of an elasticity, and therefore test H₀ about this elasticity?
• Suppose you have the following model: FDX_t = β₀ + β₁Inc_t + β₂Inc²_t + e_t
• FDX ≡ food expenditure; Inc ≡ household income
• We want to estimate the impact of a change in income on expenditures, using an elasticity measure evaluated at the mean of the data. That is: Γ = (∂FDX/∂Inc)(Inc̄/FDX̄) = (β₁ + 2β₂Inc̄)(Inc̄/FDX̄), where Inc̄ and FDX̄ are the sample means

28 Hypothesis Testing Under General Linear Model
• The income elasticity (Γ) is: Γ = (β₁ + 2β₂Inc̄)(Inc̄/FDX̄)
• How do you calculate the variance of Γ?
• We know that Var(α'Z) = α'Var(Z)α, where Z is a column vector of RVs and α is a column vector of constants
• Treat β₀, β₁, and β₂ as the RVs; Γ is a linear combination of Z = (β₀, β₁, β₂)' with α = (0, Inc̄/FDX̄, 2Inc̄²/FDX̄)'

29 Hypothesis Testing Under General Linear Model
• This implies Var(Γ) = α'[σ²(X'X)⁻¹]α, which is (1×3)(3×3)(3×1) = (1×1)
• The terms involving β₀ drop out because the first element of α is 0

30 Hypothesis Testing Under General Linear Model
• Writing C₁ = Inc̄/FDX̄ and C₂ = 2Inc̄²/FDX̄, this implies: Var(Γ) = C₁²Var(β₁) + C₂²Var(β₂) + 2C₁C₂Cov(β₁, β₂)
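The variance formula can be sketched in code; the coefficient estimates, covariance matrix, and sample means below are hypothetical numbers chosen only to illustrate the calculation:

```python
import numpy as np

# Hypothetical estimates of (b0, b1, b2), their covariance matrix,
# and sample means of Inc and FDX (illustrative values, not real data)
b = np.array([50.0, 0.8, -0.002])
cov_b = np.array([[ 4.0,   -0.1,     0.001  ],
                  [-0.1,    0.04,   -0.0005 ],
                  [ 0.001, -0.0005,  0.00001]])
inc_bar, fdx_bar = 100.0, 120.0

# Gamma = (b1 + 2*b2*inc_bar) * inc_bar / fdx_bar is alpha'beta with
# alpha = (0, inc_bar/fdx_bar, 2*inc_bar^2/fdx_bar)'
alpha = np.array([0.0, inc_bar / fdx_bar, 2 * inc_bar**2 / fdx_bar])
gamma = alpha @ b
var_gamma = alpha @ cov_b @ alpha      # Var(alpha'Z) = alpha' Var(Z) alpha
t_stat = gamma / np.sqrt(var_gamma)    # e.g., to test H0: Gamma = 0
```

The first element of alpha is 0, so the β₀ row and column of the covariance matrix contribute nothing, matching the slide's remark.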