
Published by Alfred Kelman. Modified about 1 year ago.

1 Hypothesis Testing

2 Greene, App. C. Statistical test: divide the parameter space (Ω) into two disjoint sets, Ω0 and Ω1, with Ω0 ∩ Ω1 = ∅ and Ω0 ∪ Ω1 = Ω.
Based on sample evidence, does the estimated parameter (Θ*), and therefore the true parameter, fall into one of these sets? We answer this question using a statistical test.

3 Hypothesis Testing
{y1, y2, …, yT} is a random sample providing information on the (K × 1) parameter vector Θ, where Θ ∈ Ω.
R(Θ) = [R1(Θ), R2(Θ), …, RJ(Θ)]′ is a (J × 1) vector of restrictions (i.e., hypotheses) on the K parameters Θ.
For this class: R(Θ) = 0, Θ ∈ Ω.
Ω0 = {Θ | Θ ∈ Ω, R(Θ) = 0}
Ω1 = {Θ | Θ ∈ Ω, R(Θ) ≠ 0}

4 Hypothesis Testing
Null hypothesis (H0): Θ ∈ Ω0. Alternative hypothesis (H1): Θ ∈ Ω1.
Hypothesis testing divides the sample space into two portions pertaining to H0 and H1.
The region where we reject H0 is referred to as the critical region of the test.

5 Hypothesis Testing
Test whether Θ* ∈ Ω0 or Θ* ∈ Ω1 (Θ* an estimate of Θ) based on a test statistic with a known distribution under H0 and some other distribution if H1 is true.
Transform Θ* into the test statistic.
The critical region of the hypothesis test is the set of values for which H0 would be rejected (i.e., values of the test statistic unlikely to occur if H0 is true).
If the test statistic falls into the critical region, that is evidence that H0 is not true.

6 Hypothesis Testing
General test procedure:
Develop a null hypothesis (H0) that will be maintained until there is evidence to the contrary.
Develop an alternative hypothesis (H1) that will be adopted if H0 is not accepted.
Estimate the appropriate test statistic.
Identify the desired critical region.
Compare the calculated test statistic to the critical region.
Reject H0 if the test statistic falls in the critical region.

7 Hypothesis Testing
Definition of the rejection region: P(cvL ≤ Θ* ≤ cvU) = 1 − Pr(Type I error).
[Figure: density f(Θ*|H0) with "Do Not Reject H0" between the critical values cvL and cvU, and "Reject H0" in the tails; the tail area is the probability of rejecting H0 even though it is true.]

8 Hypothesis Testing
Defining the critical region: select a region that identifies parameter values that are unlikely to occur if the null hypothesis is true.
Pr(Type I error) = Pr{rejecting H0 | H0 true}
Pr(Type II error) = Pr{accepting H0 | H1 true}
We never know with certainty whether we are correct → positive Pr(Type I error).
Example: the standard normal.

9 Hypothesis Testing
Standard normal distribution: P(−1.96 ≤ z ≤ 1.96) = 0.95, so α = 0.05 = P(Type I error), with 0.025 in each tail.
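These tail areas can be checked numerically. A minimal sketch (standard library only): the standard normal CDF written via the error function.

```python
import math

def std_normal_cdf(z):
    # Phi(z) = 0.5 * (1 + erf(z / sqrt(2)))
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Coverage of the interval [-1.96, 1.96] under H0
coverage = std_normal_cdf(1.96) - std_normal_cdf(-1.96)
alpha = 1.0 - coverage                   # Pr(Type I error)
tail = 1.0 - std_normal_cdf(1.96)        # area in each tail

print(round(coverage, 3), round(alpha, 3), round(tail, 3))  # 0.95 0.05 0.025
```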

10 Hypothesis Testing
Example of testing a mean. Assume the RV is normally distributed: yt ~ N(β, σ²).
H0: β = 1; H1: β ≠ 1.
What is the distribution of the sample mean under H0? Assume σ² = 10 and T = 10 → β* ~ N(1, σ²/T) = N(1, 1).

11 Hypothesis Testing
β* ~ N(1, 1) if H0 is true: P(−0.96 ≤ β* ≤ 2.96) = 0.95.
Equivalently, with z = (β* − 1)/1: P(−1.96 ≤ z ≤ 1.96) = 0.95 (i.e., transform the distribution of β* into an RV with a standard normal distribution). α = 0.05.
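A minimal sketch of this z-test; the sample below is hypothetical, chosen only for illustration, with σ² = 10 and T = 10 as on the slide.

```python
import math

# Hypothetical sample of T = 10 observations (illustration only)
y = [2.1, -0.5, 1.8, 0.9, 3.2, -1.1, 1.4, 2.5, 0.3, 1.9]
T = len(y)
sigma2 = 10.0        # assumed known, as on the slide
beta0 = 1.0          # H0: beta = 1

ybar = sum(y) / T                  # beta* = sample mean
se = math.sqrt(sigma2 / T)         # std. dev. of the mean = 1 here
z = (ybar - beta0) / se            # transform beta* into a standard normal RV

reject = abs(z) > 1.96             # alpha = 0.05, two-sided
print(ybar, z, reject)             # 1.25 0.25 False
```

Here β* = 1.25 lies well inside (−0.96, 2.96), so H0 is not rejected.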

12 Hypothesis Testing
Standard normal distribution: P(−1.96 ≤ z ≤ 1.96) = 0.95, so α = 0.05 = P(Type I error), with 0.025 in each tail.

13 Hypothesis Testing

14 Hypothesis Testing
Again, this assumes we know σ. When σ is unknown, replace it with its estimate and use the t distribution:
P(−t(T−1, α/2) ≤ t ≤ t(T−1, α/2)) = 1 − α
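A sketch of the unknown-σ case, reusing the same hypothetical sample; the critical value t(9, 0.025) = 2.262 is the standard two-sided 5% value for 9 degrees of freedom.

```python
import math

# Same hypothetical sample; now sigma^2 is treated as unknown
y = [2.1, -0.5, 1.8, 0.9, 3.2, -1.1, 1.4, 2.5, 0.3, 1.9]
T = len(y)
beta0 = 1.0

ybar = sum(y) / T
s2 = sum((yi - ybar) ** 2 for yi in y) / (T - 1)   # unbiased variance estimate
t_stat = (ybar - beta0) / math.sqrt(s2 / T)

t_crit = 2.262           # t(T-1, alpha/2) for T-1 = 9 d.f., alpha = 0.05
reject = abs(t_stat) > t_crit
print(t_stat, reject)
```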

15 Hypothesis Testing

16 Hypothesis Testing
Likelihood ratio test:

17 Hypothesis Testing
Likelihood ratio test: compare the value of the likelihood function under the null hypothesis [l(Ω0)] vs. its value with unrestricted parameter choice [l*(Ω)].
The null hypothesis can only reduce the set of admissible parameter values. What does this do to the maximized likelihood function value?
If the two resulting maximized likelihood values are close enough, we cannot reject H0.

18 Hypothesis Testing
Is this difference in likelihood function values large? Likelihood ratio: λ = l*(Ω)/l(Ω0).
λ is a random variable since it depends on the yt's.
What are the possible values of λ? Since Ω0 ⊂ Ω, we have l*(Ω) ≥ l(Ω0), so λ ≥ 1.

19 Hypothesis Testing
Likelihood ratio principle: the null hypothesis defining Ω0 is rejected if λ > 1. (Why 1? Because λ = 1 when the restriction does not reduce the maximized likelihood at all.)
We need to establish a critical level of λ, λC, that is unlikely to occur under H0 (e.g., is 1.1 far enough away from 1.0?).
Reject H0 if the estimated value of λ is greater than λC.
λ ≈ 1 → the null hypothesis does not significantly reduce the parameter space, and H0 is not rejected.
The result is conditional on the sample.

20 Hypothesis Testing
General likelihood ratio test procedure:
Choose the probability of Type I error, α (i.e., the test significance level).
Given α, find the value of λC that satisfies P(λ > λC | H0 is true) = α.
Evaluate the test statistic λ based on sample information.
Reject (fail to reject) the null hypothesis if λ > λC (λ < λC).

21 Hypothesis Testing
LR test of the mean of a normal distribution (µ), with σ² known and with σ² not known.
This implies the following test procedures: F-test and t-test.
LR test of a hypothesized value of σ² (on class website).

22 Asymptotic Tests
The previous tests were based on finite samples. Use asymptotic tests when an appropriate finite-sample test statistic is unavailable.
Three tests are commonly used: the asymptotic likelihood ratio test, the Wald test, and the Lagrange multiplier (score) test.
References: Greene; Buse article (on website).

23 Asymptotic Tests
Asymptotic likelihood ratio test: y1, …, yT are iid with E(yt) = β and var(yt) = σ².
T^(1/2)(β* − β) converges in distribution to N(0, σ²). As T → ∞, use the normal pdf to generate the likelihood function.
λ ≡ l*(Ω)/l(Ω0), or equivalently l(θl)/l(θ0)
l*(Ω) = max[l(θ | y1, …, yT) : θ ∈ Ω] (unrestricted)
l(Ω0) = max[l(θ | y1, …, yT) : θ ∈ Ω0] (restricted likelihood given H0)

24 Asymptotic Tests
Asymptotic likelihood ratio (LR): LR ≡ 2 ln(λ) = 2[L*(Ω) − L(Ω0)], where L(∙) = ln l(∙).
LR ~ χ²(J) asymptotically, where J is the number of joint null hypotheses (restrictions).
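A sketch of the LR statistic for the running example, H0: µ = 1 for yt ~ N(µ, σ²) with σ² known. The data are hypothetical; for this model the maximized log-likelihoods have a closed form, so LR reduces to T(µ̂ − µ0)²/σ².

```python
import math

# LR test of H0: mu = mu0 for y_t ~ N(mu, sigma^2) with sigma^2 known
y = [2.1, -0.5, 1.8, 0.9, 3.2, -1.1, 1.4, 2.5, 0.3, 1.9]  # hypothetical data
T = len(y)
sigma2 = 10.0
mu0 = 1.0

def log_lik(mu):
    # Normal log-likelihood L(mu) with known variance
    return (-T / 2.0) * math.log(2.0 * math.pi * sigma2) \
           - sum((yi - mu) ** 2 for yi in y) / (2.0 * sigma2)

mu_hat = sum(y) / T                        # unrestricted MLE
LR = 2.0 * (log_lik(mu_hat) - log_lik(mu0))

chi2_crit = 3.841                          # chi^2(1) 5% critical value; J = 1
reject = LR > chi2_crit
print(LR, reject)
```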

25 Asymptotic Tests
Asymptotic likelihood ratio test: LR ≡ 2 ln(λ) = 2[L(θl) − L(θ0)]; LR ~ χ²(J) asymptotically (Greene, p. 851).
L ≡ log-likelihood function; θl generates the unrestricted maximum of L(∙); L(θ0) is the maximized value obtained under H0. L(∙) is evaluated at both θl and θ0.
[Figure: log-likelihood curve with the vertical distance between L(θl) and L(θ0) equal to 0.5·LR.]

26 Asymptotic Tests
Asymptotic likelihood ratio test (cont.): Greene defines LR as −2[L(θ0) − L(θl)]; the result is the same (Buse, p. 153).
Given H0 true, LR has an approximate χ² distribution with J degrees of freedom (the number of joint hypotheses).
Reject H0 when LR > χ²c, where χ²c is the predefined critical value of the distribution given J degrees of freedom.

27 Asymptotic Tests
Impact of Curvature on LR Shows Need For Wald Test
Suppose θ consists of 1 element. Two samples may generate different estimates of the likelihood function with the same value of θ that maximizes it.
0.5·LR will depend on: the distance between θl and θ0 (+), and the curvature of the likelihood function (+).
C(θ) represents the curvature of the likelihood function (don't forget the "−" sign); this role is played by the information matrix.

28 Asymptotic Tests
Impact of Curvature on LR Shows Need For Wald Test
H0: θ = θ0
W = (θl − θ0)² C(θ)|θ=θl = (θl − θ0)² I(θ)|θ=θl
W ~ χ²(J) asymptotically. Note: evaluated at θl.
[Figure: two samples' log-likelihood curves with their maximum at the same θl but different curvature, giving different 0.5·LR values for the same H0.]
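A sketch of the Wald statistic for the same hypothetical example. For yt ~ N(µ, σ²) with σ² known, the curvature (information) is constant, I(µ) = T/σ², so W weights the squared distance (µ̂ − µ0)² by T/σ².

```python
# Wald test of H0: mu = mu0 for y_t ~ N(mu, sigma^2), sigma^2 known.
# W = (theta_l - theta_0)^2 * I(theta)|theta=theta_l
y = [2.1, -0.5, 1.8, 0.9, 3.2, -1.1, 1.4, 2.5, 0.3, 1.9]  # hypothetical data
T = len(y)
sigma2 = 10.0
mu0 = 1.0

mu_hat = sum(y) / T            # unrestricted estimate theta_l
info = T / sigma2              # I(theta): curvature of the log-likelihood
W = (mu_hat - mu0) ** 2 * info

reject = W > 3.841             # chi^2(1) 5% critical value; J = 1
print(W, reject)
```

Because this log-likelihood is exactly quadratic in µ, W equals the LR statistic computed earlier, previewing the equivalence noted in the summary slides.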

29 Asymptotic Tests
Impact of Curvature on LR Shows Need For Wald Test
The Wald statistic weights the squared distance (θl − θ0)² by the curvature of the likelihood function instead of using the difference in likelihood values, as in the LR test.
Two sets of data may produce the same (θl − θ0)² value but give different LR values because of curvature.
The more curvature, the more likely H0 is not true (i.e., the test statistic is larger).
Greene gives an alternative motivation (careful of notation); see also Buse.

30 Asymptotic Tests
Impact of Curvature on LR Shows Need For Wald Test
Extending this to J simultaneous hypotheses and K parameters:
W = R(θl)′ [d(θl) I(θl)⁻¹ d(θl)′]⁻¹ R(θl)
where d(∙) is the matrix of derivatives of the restrictions R(∙). Note that R(∙), d(∙), and I(∙) are evaluated at θl.
When Rj(θ) is of the form θj = θj0, j = 1, …, K: d(θ) = IK and W = (θl − θ0)′ I(θ)|θ=θl (θl − θ0).

31 Asymptotic Tests
Summary of the Lagrange multiplier (score) test: based on the curvature of the log-likelihood function (L).
Score of the likelihood function: S(θ) ≡ dL/dθ. At the unrestricted maximum, S(θl) = 0.

32 Asymptotic Tests
Summary of the Lagrange multiplier (score) test:
How much does S(θ) depart from 0 when evaluated at the hypothesized value θ0?
Weight the squared slope by the curvature: weighting by C(θ)⁻¹ gives a smaller test statistic the more curvature there is.
The greater the curvature, the closer θ0 will be to the value that maximizes the likelihood.
Small values of the test statistic LM will be generated if L(θ0) is close to the maximum value L(θl), i.e., the slope S(θ0) is closer to 0.

33 Asymptotic Tests
Summary of the Lagrange multiplier (score) test:
S(θ) ≡ dL/dθ, with S(θ0) = dL/dθ|θ=θ0 and I(θ0) = −d²L/dθ²|θ=θ0
LM = S(θ0)² I(θ0)⁻¹; LM ~ χ²(J) asymptotically. At the unrestricted maximum, S(θ) = 0.
[Figure: two samples' log-likelihood curves LA and LB, showing the slope S(θ0) at the hypothesized value θ0.]
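A sketch of the LM (score) statistic for the same hypothetical example. For yt ~ N(µ, σ²) with σ² known, the score is S(µ) = Σ(yt − µ)/σ² and the information is I(µ) = T/σ², both evaluated here at the hypothesized value µ0.

```python
# Score (LM) test of H0: mu = mu0 for y_t ~ N(mu, sigma^2), sigma^2 known.
# LM = S(theta_0)^2 * I(theta_0)^(-1)
y = [2.1, -0.5, 1.8, 0.9, 3.2, -1.1, 1.4, 2.5, 0.3, 1.9]  # hypothetical data
T = len(y)
sigma2 = 10.0
mu0 = 1.0

score = sum(yi - mu0 for yi in y) / sigma2   # S(theta_0): slope of L at mu0
info = T / sigma2                            # I(theta_0): curvature at mu0
LM = score ** 2 / info

reject = LM > 3.841                          # chi^2(1) 5% critical value
print(LM, reject)
```

Note that only the restricted value µ0 is needed; the unrestricted estimate never has to be computed, which is the practical appeal of the LM test.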

34 Asymptotic Tests
Summary of the Lagrange multiplier (score) test:
Small values of the test statistic LM should be generated when L(∙) has greater curvature when evaluated at θ0.
The test statistic is smaller when θ0 is nearer the value that generates the maximum likelihood value (i.e., S(θ0) is closer to zero).

35 Asymptotic Tests
Summary of the Lagrange multiplier (score) test:
Extending this to multiple parameters:
LM = S(θ0)′ I(θ0)⁻¹ S(θ0)
References: Buse; Greene.

36 Asymptotic Tests
Summary: LR, W, and LM differ in the type of information required.
LR requires both restricted and unrestricted parameter estimates.
W requires only unrestricted estimates.
LM requires only restricted estimates.
If the log-likelihood is quadratic with respect to θ, the 3 tests result in the same numerical values for large samples.

37 Asymptotic Tests
Summary: all test statistics are distributed asymptotically χ²(J) with J d.f. (the number of joint hypotheses).
In finite samples, W ≥ LR ≥ LM; this implies that, for a common critical value, W rejects most often and LM is the most conservative.
Example: with σ² known, a test of a parameter value (e.g., β = β0) results in LR = W = LM, one case where the three coincide in finite samples.
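The ordering W ≥ LR ≥ LM can be illustrated numerically. A sketch with the same hypothetical data, now treating σ² as unknown (maximum-likelihood variance estimates), testing H0: µ = 1:

```python
import math

# Finite-sample ordering W >= LR >= LM for y_t ~ N(mu, sigma^2), sigma^2 UNKNOWN
y = [2.1, -0.5, 1.8, 0.9, 3.2, -1.1, 1.4, 2.5, 0.3, 1.9]  # hypothetical data
T = len(y)
mu0 = 1.0

mu_hat = sum(y) / T
s2_u = sum((yi - mu_hat) ** 2 for yi in y) / T   # unrestricted MLE of sigma^2
s2_r = sum((yi - mu0) ** 2 for yi in y) / T      # restricted MLE under H0

LR = T * math.log(s2_r / s2_u)                   # 2*[L(unrestricted) - L(restricted)]
W  = T * (mu_hat - mu0) ** 2 / s2_u              # curvature at unrestricted estimate
LM = T * (mu_hat - mu0) ** 2 / s2_r              # curvature at restricted estimate

print(W, LR, LM)
print(W >= LR >= LM)    # True
```

The ordering follows from x − 1 ≥ ln x ≥ 1 − 1/x applied to x = s2_r/s2_u ≥ 1.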

38 Asymptotic Tests
Summary: example of asymptotic tests.
Buse presents the same example but assumes σ = 1.
