Understanding Multivariate Research (Berry & Sanders)

Regression Assumptions

1. The independent variables are measured at the interval level or are dichotomous (1/0).
2. The dependent variable is continuous.
3. The variables in the model are measured perfectly (no measurement error).
4. The effect of the independent variable, X, on the dependent variable, Y, is linear.
5. The error or disturbance term is completely uncorrelated with the independent variables.
6. The effects of all independent variables on the dependent variable are additive.

Multivariate Regression

Yi = b0 + b1X1i + b2X2i + b3X3i + ei

Each slope coefficient (bj) measures the responsiveness of the dependent variable to a one-unit change in its associated independent variable, holding the other independent variables constant.
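The mechanics of estimating this model can be sketched with ordinary least squares. The data below are entirely hypothetical, chosen only to illustrate how the coefficients b0 through b3 are computed:

```python
import numpy as np

# Hypothetical data: 6 observations of Y and three regressors X1, X2, X3.
Y = np.array([150.0, 180.0, 165.0, 200.0, 140.0, 175.0])
X1 = np.array([2000, 2600, 2300, 2900, 1900, 2500], dtype=float)  # e.g. food intake
X2 = np.array([300, 200, 250, 150, 350, 220], dtype=float)        # e.g. exercise
X3 = np.array([0, 1, 0, 1, 0, 1], dtype=float)                    # dummy variable (1/0)

# Design matrix with a leading column of 1s for the intercept b0.
X = np.column_stack([np.ones_like(Y), X1, X2, X3])

# Ordinary least squares: choose b to minimize the sum of squared errors ei.
b, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(b)  # estimates in the order [b0, b1, b2, b3]
```

The least-squares solution makes the residuals orthogonal to every column of the design matrix, which is what "best fit" means here.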

Example

Y: body weight (lbs)
X1: food intake (average daily calories)
X2: exercise (average daily calorie expenditure)
X3: gender (1 = male, 0 = female)

Table 3.1 Regression Model of Body Weight

Variable    Coefficient
Intercept   152.0
FOOD        .028
EXERCISE
MALE        35.00

Interpretation

Intercept: A female who eats no food and does not exercise weighs 152 pounds.
FOOD: A one-calorie increase in average daily food intake increases a person's weight by .028 pounds; a 100-calorie increase raises weight by 2.8 pounds (100 × .028).
EXERCISE: A one-calorie increase in calories expended through exercise decreases a person's weight by pounds.

Interpretation (continued)

MALE (dichotomous): The coefficient is the difference in the expected value of Y between a case for which X = 1 and a case for which X = 0, holding all other independent variables constant. For two individuals with identical food intake and exercise, a man can expect to weigh 35 pounds more than a woman.
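A quick sketch of how the Table 3.1 coefficients generate predictions. The intercept, FOOD, and MALE coefficients come from the table; the EXERCISE coefficient is not preserved in the source, so the value -0.045 below is a purely hypothetical stand-in used only to make the function runnable:

```python
# Predicted weight under the Table 3.1 model.
# b_exercise = -0.045 is HYPOTHETICAL (the table's value is not shown).
def predicted_weight(food, exercise, male, b_exercise=-0.045):
    # intercept + b_food*FOOD + b_exercise*EXERCISE + b_male*MALE
    return 152.0 + 0.028 * food + b_exercise * exercise + 35.0 * male

# Holding food and exercise fixed, the male/female difference is the
# MALE coefficient itself: 35 pounds.
diff = predicted_weight(2000, 300, 1) - predicted_weight(2000, 300, 0)
print(diff)
```

Whatever the EXERCISE coefficient turns out to be, the gender gap in predicted weight stays at 35 pounds, because the EXERCISE terms cancel when everything else is held constant.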

Elements of a Regression Model

1. Measuring the fit of the model: based on comparing the actual and predicted values of Y. The further the data points fall from the regression line, the worse the fit.
2. R²: the proportion of the variation in Y that is explained by the independent variables, or equivalently the squared correlation between the actual and predicted values. R² ranges from 0 to 1, with 1 indicating a perfect fit (all points on the regression line).

Elements (continued)

3. Statistical significance:
H0: βi = 0
H1: βi > 0 (or βi < 0, or βi ≠ 0)
t = bi / s.e.(bi)
Rule of thumb: if t > 2 or t < -2, the coefficient is statistically significant (we reject the null hypothesis that the coefficient is zero).
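The rule of thumb above is simple enough to state as a one-line check. The coefficient and standard-error values passed in below are made up for illustration:

```python
# Rule-of-thumb significance check: t = coefficient / standard error,
# significant when |t| exceeds the threshold (roughly 2 at the 5% level).
def is_significant(b, se, threshold=2.0):
    t = b / se
    return abs(t) > threshold

print(is_significant(0.028, 0.010))  # t = 2.8 -> True
print(is_significant(35.0, 20.0))    # t = 1.75 -> False
```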

Elements (continued)

4. Confidence level (95%): We calculate a partial slope coefficient (bi) from a sample. We can then construct a confidence interval around this estimate within which we would expect the true (population) coefficient (βi) to fall in 95 of 100 samples.

Potential Problems

Multicollinearity: a high (or perfect) correlation among any of the independent variables (e.g., education and income).
Heteroskedasticity: non-constant variance in the error terms.
Autocorrelation (or serial correlation): the error terms are correlated with one another; very common in time-series data.

All of these problems create inefficiency (inflating standard errors), but they do not affect the slope coefficients, which remain unbiased.
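A quick first screen for multicollinearity is to inspect pairwise correlations among the independent variables. The data below are hypothetical, with the income column constructed to track education closely:

```python
import numpy as np

# Columns: education (years), income (thousands), exercise (calories).
# Income is deliberately close to a linear function of education.
X = np.column_stack([
    [12, 16, 14, 18, 10, 15],
    [40, 55, 48, 62, 33, 52],
    [300, 200, 250, 150, 350, 220],
]).astype(float)
names = ["education", "income", "exercise"]

# Pairwise correlations among the regressors (rowvar=False: columns are variables).
corr = np.corrcoef(X, rowvar=False)

for i in range(len(names)):
    for j in range(i + 1, len(names)):
        if abs(corr[i, j]) > 0.9:
            print(f"possible multicollinearity: {names[i]} vs {names[j]}")
```

Pairwise correlations miss multicollinearity that involves three or more variables jointly, so this is a screen, not a full diagnostic.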