Regresi Linier Berganda (Multiple Linear Regression), Cont. (RLB). Presented by: Dudi Barmana, M.Si.

Agenda. Tests of the model parameters: overall F-test, individual t-test, sequential test (partial F-test). Measures of model adequacy/fit: adjusted R-square (R²), Mallows' Cp statistic.

Today's Quote: "If you are born into poverty, it is not your fault; but if you die in poverty, it is your fault." ---Bill Gates---

Tests of the model parameters: overall F-test, individual t-test, sequential test (partial F-test).

Hypothesis Testing in Multiple Linear Regression. Questions: What is the overall adequacy of the model? Which specific regressors seem important? Assume the errors are independent and follow a normal distribution with mean 0 and variance σ².

Overall F-Test (Test for Significance of Regression). Determine whether there is a linear relationship between y and the regressors x_j, j = 1, 2, ..., k. The hypotheses are H0: β1 = β2 = ... = βk = 0 versus H1: βj ≠ 0 for at least one j. ANOVA decomposition: SS_T = SS_R + SS_Res, where SS_R/σ² ~ χ²(k), SS_Res/σ² ~ χ²(n - k - 1), and SS_R and SS_Res are independent.

ANOVA table:

Source of Variation | SS     | df        | MS     | F0
Regression          | SS_R   | k         | MS_R   | MS_R / MS_Res
Residual            | SS_Res | n - k - 1 | MS_Res |
Total               | SS_T   | n - 1     |        |

Conclusion: if F0 ≤ F(1 - α; k, n - k - 1), do not reject H0; if F0 > F(1 - α; k, n - k - 1), reject H0.
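To make the overall F-test concrete, here is a minimal NumPy/SciPy sketch (not part of the original slides): it simulates a small data set, fits the model by least squares, and builds the ANOVA quantities above. The sample size n = 50, the k = 3 regressors, and the coefficient values are purely illustrative.

```python
# Minimal sketch of the overall F-test for significance of regression (illustrative data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, k = 50, 3                                              # n observations, k regressors (assumed values)
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])  # design matrix with an intercept column
beta_true = np.array([1.0, 2.0, 0.0, -1.5])               # illustrative true coefficients
y = X @ beta_true + rng.normal(scale=1.0, size=n)

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)              # least-squares estimates
y_hat = X @ beta_hat

SS_T   = np.sum((y - y.mean()) ** 2)                      # total (corrected) sum of squares
SS_Res = np.sum((y - y_hat) ** 2)                         # residual sum of squares
SS_R   = SS_T - SS_Res                                    # regression sum of squares

MS_R, MS_Res = SS_R / k, SS_Res / (n - k - 1)
F0 = MS_R / MS_Res
F_crit = stats.f.ppf(0.95, k, n - k - 1)                  # F(1 - alpha; k, n - k - 1), alpha = 0.05
print(f"F0 = {F0:.2f}, F(0.95; {k}, {n - k - 1}) = {F_crit:.2f}, reject H0: {F0 > F_crit}")
```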

Under H1, F0 follows a noncentral F distribution with k and n - k - 1 degrees of freedom; its noncentrality parameter increases with the magnitude of the nonzero regression coefficients, so large values of F0 favour rejecting H0.

Tests on Individual Regression Coefficients (Individual t-Test). For an individual regression coefficient, the hypotheses are H0: βj = 0 versus H1: βj ≠ 0. Let C_jj be the j-th diagonal element of (X'X)⁻¹. The test statistic is t0 = β̂_j / sqrt(σ̂² C_jj). Conclusion: if |t0| ≤ t(1 - α/2; n - p), do not reject H0; otherwise reject H0. This is a partial or marginal test, because the estimate of βj depends on all of the other regressors in the model: it tests the contribution of x_j given the other regressors.
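A short continuation of the previous sketch (reusing the illustrative X, y, beta_hat, SS_Res, n, and k) showing the individual t statistics built from the diagonal of (X'X)⁻¹, as on the slide.

```python
# Minimal sketch: individual t-tests t0 = beta_hat_j / sqrt(sigma2_hat * C_jj),
# continuing the X, y, beta_hat, SS_Res, n, k defined in the previous sketch.
import numpy as np
from scipy import stats

p = k + 1                                     # number of estimated coefficients, including the intercept
sigma2_hat = SS_Res / (n - p)                 # MS_Res, the usual estimate of sigma^2
C = np.linalg.inv(X.T @ X)                    # C_jj is the j-th diagonal element
t0 = beta_hat / np.sqrt(sigma2_hat * np.diag(C))
t_crit = stats.t.ppf(1 - 0.05 / 2, n - p)     # t(1 - alpha/2; n - p), alpha = 0.05

for j, t in enumerate(t0):
    print(f"beta_{j}: t0 = {t:.2f}, reject H0 (beta_{j} = 0): {abs(t) > t_crit}")
```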

Sequential Test (Partial F-Test). Partition the coefficient vector of the full model y = Xβ + ε as β = (β1', β2')', where β1 contains p - r coefficients and β2 contains the r coefficients whose joint contribution we want to test, and partition X = [X1 X2] accordingly. The hypotheses are H0: β2 = 0 versus H1: β2 ≠ 0.

For the full model, the regression sum of squares is SS_R(β) = β̂'X'y, with p degrees of freedom. Under the null hypothesis the reduced model is y = X1β1 + ε, with regression sum of squares SS_R(β1) = β̂1'X1'y and p - r degrees of freedom. The regression sum of squares due to β2 given β1 is SS_R(β2 | β1) = SS_R(β) - SS_R(β1); this is called the extra sum of squares due to β2 and has p - (p - r) = r degrees of freedom. The test statistic is F0 = [SS_R(β2 | β1) / r] / MS_Res, where MS_Res is the residual mean square of the full model.

If β2 ≠ 0, F0 follows a noncentral F distribution whose noncentrality parameter grows with the size of β2. Under severe multicollinearity between X1 and X2 this test has essentially no power; it has maximal power when X1 and X2 are orthogonal to one another. In short, the partial F-test measures the contribution of the regressors in X2 given that the regressors in X1 are already in the model.
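Continuing the same illustrative setup, here is a minimal sketch of the partial F-test of H0: β2 = 0. It uses the drop in residual sum of squares when the X2 columns are added, which equals the extra sum of squares SS_R(β2 | β1) described above. Taking r = 1 and treating the last column of X as X2 are arbitrary choices for illustration.

```python
# Minimal sketch: partial (sequential) F-test of H0: beta_2 = 0 via the extra sum of squares,
# computed as the drop in residual SS between the reduced and full models.
# Continues the illustrative X, y, n, k from the earlier sketches; here r = 1.
import numpy as np
from scipy import stats

p, r = k + 1, 1
X1 = X[:, :p - r]                                   # reduced model: drop the last r regressor columns

def rss(Xm, y):
    """Residual sum of squares of an ordinary least-squares fit."""
    b = np.linalg.lstsq(Xm, y, rcond=None)[0]
    resid = y - Xm @ b
    return resid @ resid

SS_Res_full, SS_Res_red = rss(X, y), rss(X1, y)
SS_extra = SS_Res_red - SS_Res_full                 # SS_R(beta_2 | beta_1), r degrees of freedom
MS_Res_full = SS_Res_full / (n - p)

F0 = (SS_extra / r) / MS_Res_full
F_crit = stats.f.ppf(0.95, r, n - p)
print(f"partial F0 = {F0:.2f}, F(0.95; {r}, {n - p}) = {F_crit:.2f}, reject H0: {F0 > F_crit}")
```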

Measures of model adequacy/fit: adjusted R-square (R²), Mallows' Cp statistic.

R² and Adjusted R². R² always increases when a regressor is added to the model, regardless of how little that variable contributes. The adjusted R² is R²_adj = 1 - [SS_Res / (n - p)] / [SS_T / (n - 1)]. The adjusted R² increases when a variable is added to the model only if the addition reduces the residual mean square.
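As a small illustration (again reusing SS_Res, SS_T, n, and k from the earlier sketches), both quantities can be computed directly:

```python
# Minimal sketch: R^2 and adjusted R^2 from the sums of squares computed earlier.
p = k + 1                                            # number of coefficients, including the intercept
R2     = 1 - SS_Res / SS_T
R2_adj = 1 - (SS_Res / (n - p)) / (SS_T / (n - 1))
print(f"R^2 = {R2:.3f}, adjusted R^2 = {R2_adj:.3f}")
```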

Mallows' Cp. Suppose we have a candidate model with p regression coefficients. Mallows' Cp provides an estimate of how well the model predicts new data and is given by Cp = SS_Res(p) / MS_Res(FULL) - n + 2p, where the subscript FULL refers to the "full model" with all k regressors. Small values of Cp, with Cp close to p, are good. Warning: C_{k+1} = k + 1 always, so do not take this as evidence that the full model is good unless all the other Cp values are larger.

Mallows' Cp. If the p-coefficient model contains all the important explanatory variables, then SS_Res(p) is approximately (n - p)σ². Moreover, MS_Res(FULL) will also be approximately σ². Thus Cp ≈ (n - p)σ²/σ² - n + 2p = p, which is why Cp close to p indicates an adequate model.
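A minimal sketch (reusing X, y, n, k, and the rss() helper from the partial F-test example) that computes Cp for every subset of the illustrative regressors; the exhaustive subset enumeration is only for illustration. Note that the full model reproduces the warning above, since its Cp equals k + 1 by construction.

```python
# Minimal sketch: Mallows' Cp = SS_Res(p)/MS_Res(FULL) - n + 2p for all candidate subsets,
# reusing X, y, n, k and the rss() helper defined in the partial F-test sketch.
from itertools import combinations

MS_Res_full = rss(X, y) / (n - (k + 1))              # residual mean square of the full model

for size in range(k + 1):
    for subset in combinations(range(1, k + 1), size):
        cols = [0, *subset]                          # always keep column 0, the intercept
        p_sub = len(cols)                            # number of coefficients in this candidate model
        Cp = rss(X[:, cols], y) / MS_Res_full - n + 2 * p_sub
        print(f"regressors {subset or '(none)'}: p = {p_sub}, Cp = {Cp:.2f}")
```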

Questions?