I271b Quantitative Methods


I271b Quantitative Methods Regression Part II

Data points and Regression http://www.math.csusb.edu/faculty/stanton/m262/regress/

Power Transformations
Power transformations help to correct for skew and non-symmetric distributions.
Y^q (for q > 1): reduces negative skew
log Y or -(Y^-q): reduces positive skew
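
As a quick illustration (not from the slides), here is a minimal Python sketch using numpy and scipy with simulated data, showing how a log transform pulls in a long right tail and reduces positive skew:

    import numpy as np
    from scipy.stats import skew

    # Hypothetical positively skewed variable; the log transform reduces the skew
    rng = np.random.default_rng(42)
    y = rng.lognormal(mean=0.0, sigma=1.0, size=1000)
    print(skew(y))          # large positive skew on the original scale
    print(skew(np.log(y)))  # close to zero after the log transform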

Interpreting a Power Transformation
Power transformations are not a magic wand: they only help you achieve symmetry and/or normality, and often symmetry is all you can hope to achieve. And you still need to interpret the outcome!
We must convert the transformed Y back to its original units using the inverse transformation:
If Y* = Y^q, then Y = (Y*)^(1/q)
If Y* = log_e Y, then Y = e^(Y*)
So, when you calculate the predicted value from the regression equation, take the inverse transformation of the result. For example, with Y^2 = 2.5 + 3.2X and X = 1, the predicted Y^2 = 5.7, so Y = 5.7^(1/2) ≈ 2.39. Thus, when X = 1, the predicted Y is 2.39 in its original units.
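
A minimal Python sketch of the same workflow, assuming simulated data and the statsmodels OLS API: fit the regression on the transformed outcome Y* = Y^2 (mirroring the slide's example), then apply the inverse transformation to the prediction:

    import numpy as np
    import statsmodels.api as sm

    # Simulated data roughly matching the slide's example: the model is fit on the
    # transformed outcome Y* = Y^2, so predictions must be back-transformed.
    rng = np.random.default_rng(271)
    x = rng.uniform(0, 5, 200)
    y = np.sqrt(2.5 + 3.2 * x + rng.normal(0, 0.5, 200))  # outcome in original units

    y_star = y ** 2                      # power transformation: Y* = Y^2
    X = sm.add_constant(x)
    fit = sm.OLS(y_star, X).fit()

    # Predicted Y* at X = 1, then the inverse transformation back to original units
    pred_star = fit.predict([[1.0, 1.0]])[0]   # roughly 2.5 + 3.2*1 = 5.7
    pred_y = pred_star ** 0.5                  # Y = (Y*)^(1/2), about 2.39
    print(round(pred_star, 2), round(pred_y, 2))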

Multivariate Regression
Same basic idea as bivariate regression, but allows us to look at more than two variables at one time. Each variable has its own independent effect (slope), given the effects of the other variables in the same model.
Control Variables: should one or more variables be 'controlled' so that we can examine the effect of our main IV?
Alternate Predictor Variables: perhaps we have more than one IV that might have a linear relationship with Y?
Nested Models: we may want to examine more than one model and see which one is a better fit, e.g. X1 + X2 → Y versus X1 + X3 → Y.
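
For example, a minimal sketch (simulated data, hypothetical variable names) of a multiple regression with a main IV and one control variable, using the statsmodels formula API:

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical data: y depends on the main IV x1 and a control variable x2
    rng = np.random.default_rng(1)
    n = 300
    x1 = rng.normal(size=n)
    x2 = 0.5 * x1 + rng.normal(size=n)          # control correlated with the main IV
    y = 1.0 + 2.0 * x1 + 1.5 * x2 + rng.normal(size=n)
    df = pd.DataFrame({"y": y, "x1": x1, "x2": x2})

    # Each coefficient is that variable's effect holding the other variable constant
    model = smf.ols("y ~ x1 + x2", data=df).fit()
    print(model.summary())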

Nested Models
Model 1: Control Variable 1
Model 2: Control Variable 1 + Explanatory Variable 1
Model 3: Control Variable 1 + Explanatory Variable 1 + Explanatory Variable 2
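
A sketch of comparing nested models with incremental F-tests, again with simulated data and hypothetical variable names (x1 as the control, x2 and x3 as the explanatory variables):

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    # Hypothetical data: one control variable and two candidate explanatory variables
    rng = np.random.default_rng(2)
    n = 300
    df = pd.DataFrame({"x1": rng.normal(size=n),
                       "x2": rng.normal(size=n),
                       "x3": rng.normal(size=n)})
    df["y"] = 1.0 + 1.0 * df.x1 + 2.0 * df.x2 + 0.5 * df.x3 + rng.normal(size=n)

    # Nested models: each adds one variable to the previous model
    m1 = smf.ols("y ~ x1", data=df).fit()             # Model 1: control variable only
    m2 = smf.ols("y ~ x1 + x2", data=df).fit()        # Model 2: adds explanatory variable 1
    m3 = smf.ols("y ~ x1 + x2 + x3", data=df).fit()   # Model 3: adds explanatory variable 2

    # Incremental F-tests: does each added variable significantly improve the fit?
    print(anova_lm(m1, m2, m3))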

Issue with Multiple Independent Variables: Multicollinearity
Multicollinearity occurs when an IV is very highly correlated with one or more of the other IVs. It can be caused by many things, including variables computed from other variables in the same equation, using different operationalizations of the same concept, etc.
Consequences: For OLS regression, multicollinearity does not violate the assumptions, but the standard errors will be much, much larger than normal (confidence intervals become wider, t-statistics become smaller).
Detecting the problem: We often use VIF (variance inflation factor) scores to detect multicollinearity. Generally, a VIF of 5-10 signals a problem, and higher values are considered more serious.
Solving the problem: Typically, regressing each IV on the other IVs is a way to find the problem variable(s). Once we find IVs that are collinear, we should eliminate one of them from the analysis.
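
A minimal sketch of checking VIF scores with statsmodels, using simulated data where one hypothetical predictor (x2) is nearly a copy of another (x1):

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.stats.outliers_influence import variance_inflation_factor

    # Hypothetical predictors: x2 is almost a linear function of x1, so the two are collinear
    rng = np.random.default_rng(3)
    n = 300
    x1 = rng.normal(size=n)
    x2 = x1 + rng.normal(scale=0.1, size=n)    # highly correlated with x1
    x3 = rng.normal(size=n)
    X = sm.add_constant(pd.DataFrame({"x1": x1, "x2": x2, "x3": x3}))

    # VIF for each predictor (skipping the constant); values above roughly 5-10 flag a problem
    for i, name in enumerate(X.columns):
        if name != "const":
            print(name, variance_inflation_factor(X.values, i))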