Multiple Regression BPS chapter 28 © 2006 W.H. Freeman and Company

Parallel regression lines
What is always true about two parallel regression lines?
a) The slopes are approximately the same.
b) The intercepts are approximately the same.
c) Both the slopes and the intercepts are approximately the same.
d) None of the above.

Parallel regression lines (answer)
What is always true about two parallel regression lines?
a) The slopes are approximately the same. (correct)
b) The intercepts are approximately the same.
c) Both the slopes and the intercepts are approximately the same.
d) None of the above.

Indicator variable
When do we use an indicator variable in a regression equation?
a) When we have a quantitative variable with two possible answers, 0 and 1.
b) When we have a categorical variable with two possible answers, one we assign the code “0” and the other we assign the code “1”.

Indicator variable (answer)
When do we use an indicator variable in a regression equation?
a) When we have a quantitative variable with two possible answers, 0 and 1.
b) When we have a categorical variable with two possible answers, one we assign the code “0” and the other we assign the code “1”. (correct)
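As an illustration that is not part of the original slides, here is a minimal Python sketch (using pandas and statsmodels, with made-up data) of coding a two-level categorical variable as a 0/1 indicator. Fitting this model also produces the parallel-lines situation from the previous question: one common slope, with the indicator coefficient shifting the intercept.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Made-up example data: a response, one quantitative predictor,
# and a categorical variable with two possible values.
df = pd.DataFrame({
    "score": [62, 70, 75, 81, 66, 74, 80, 88],
    "hours": [2, 4, 5, 7, 2, 4, 5, 7],
    "group": ["A", "A", "A", "A", "B", "B", "B", "B"],
})

# Code the categorical variable as an indicator: 0 for one category, 1 for the other.
df["group_ind"] = df["group"].map({"A": 0, "B": 1})

# Parallel-lines model: a single slope for hours, with the intercept
# shifted by the group_ind coefficient for the "B" group.
fit = smf.ols("score ~ hours + group_ind", data=df).fit()
print(fit.params)
```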

Regression vocabulary
The formula “observed y – predicted y” is the
a) Correlation
b) Regression
c) R²
d) Residual
e) Measure of Normality

Regression vocabulary (answer)
The formula “observed y – predicted y” is the
a) Correlation
b) Regression
c) R²
d) Residual (correct)
e) Measure of Normality
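A residual is simply observed y minus predicted y. A tiny sketch with made-up numbers (not from the slides):

```python
import numpy as np

# Made-up observed responses and the corresponding predictions from a fitted model.
observed_y = np.array([10.0, 12.5, 15.2, 18.1])
predicted_y = np.array([10.4, 12.1, 15.8, 17.6])

# residual = observed y - predicted y
residuals = observed_y - predicted_y
print(residuals)  # approximately [-0.4  0.4 -0.6  0.5]
```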

Parameters
The parameters for multiple regression are:
a) X and Y.
b) The β’s.
c) The α and β.
d) The correlation and standard deviation.
e) The β’s and σ.

Parameters (answer)
The parameters for multiple regression are:
a) X and Y.
b) The β’s.
c) The α and β.
d) The correlation and standard deviation.
e) The β’s and σ. (correct)
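Not part of the slides: a brief sketch of where estimates of these parameters show up in software output, assuming statsmodels and simulated data. The fitted coefficients estimate the β’s, and the regression standard error estimates σ.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate data from a model with beta_0 = 2, beta_1 = 1.5, beta_2 = -0.8, sigma = 1.
rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 2 + 1.5 * x1 - 0.8 * x2 + rng.normal(scale=1.0, size=n)
df = pd.DataFrame({"y": y, "x1": x1, "x2": x2})

fit = smf.ols("y ~ x1 + x2", data=df).fit()
print(fit.params)          # b0, b1, b2: estimates of the beta parameters
print(np.sqrt(fit.scale))  # s: the regression standard error, estimating sigma
```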

ANOVA
If we reject the null hypothesis for the ANOVA F-test, what does that tell us about our multiple regression model?
a) All of our β parameters are 0.
b) All of our β parameters are not 0.
c) One of our β parameters is 0.
d) One of our β parameters is not 0.
e) At least one of our β parameters is not 0.

ANOVA (answer)
If we reject the null hypothesis for the ANOVA F-test, what does that tell us about our multiple regression model?
a) All of our β parameters are 0.
b) All of our β parameters are not 0.
c) One of our β parameters is 0.
d) One of our β parameters is not 0.
e) At least one of our β parameters is not 0. (correct)
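A sketch (my own, with simulated data) of reading the ANOVA F-test from a fitted multiple regression in statsmodels: a small P-value means at least one β other than the intercept is not 0.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 60
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 3 + 2 * x1 + rng.normal(size=n)  # x2 has no real effect in this simulation
df = pd.DataFrame({"y": y, "x1": x1, "x2": x2})

fit = smf.ols("y ~ x1 + x2", data=df).fit()
# Overall F-test: H0 says beta_1 = beta_2 = 0; rejecting H0 says at least one is not 0.
print(fit.fvalue, fit.f_pvalue)
```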

Significance
How do you know which coefficients are significant?
a) Perform a t-test for each coefficient, and any with small P-values are significant.
b) Perform a t-test for each coefficient, and any with large P-values are significant.
c) Perform an F-test for all coefficients, and if the P-value is small, all coefficients are significant.
d) Perform an F-test for all coefficients, and if the P-value is large, all coefficients are significant.

Significance (answer)
How do you know which coefficients are significant?
a) Perform a t-test for each coefficient, and any with small P-values are significant. (correct)
b) Perform a t-test for each coefficient, and any with large P-values are significant.
c) Perform an F-test for all coefficients, and if the P-value is small, all coefficients are significant.
d) Perform an F-test for all coefficients, and if the P-value is large, all coefficients are significant.
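Continuing in the same spirit (again my own sketch with simulated data, not from the slides): the individual t-tests come from the fitted model's t statistics and P-values, one per coefficient.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 80
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1 + 2 * x1 + rng.normal(size=n)  # only x1 truly matters here
df = pd.DataFrame({"y": y, "x1": x1, "x2": x2})

fit = smf.ols("y ~ x1 + x2", data=df).fit()
print(fit.tvalues)  # t statistic for each coefficient
print(fit.pvalues)  # a small P-value marks that coefficient as significant
```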

Interaction
Which of the following is FALSE if you have interaction between two explanatory variables, x1 and x2?
a) The individual regression lines for each explanatory variable will be parallel.
b) The interaction term can be expressed as x1x2 in the model.
c) The relationship between the mean response and one explanatory variable changes when we change the value of the other explanatory variable.
d) The interaction term changes the slope of the full model from the slope of either of the simple (one x-variable) regression models.

Interaction (answer)
Which of the following is FALSE if you have interaction between two explanatory variables, x1 and x2?
a) The individual regression lines for each explanatory variable will be parallel. (correct)
b) The interaction term can be expressed as x1x2 in the model.
c) The relationship between the mean response and one explanatory variable changes when we change the value of the other explanatory variable.
d) The interaction term changes the slope of the full model from the slope of either of the simple (one x-variable) regression models.
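A sketch of adding an interaction term, assuming statsmodels formulas and simulated data (not from the slides): in the formula, x1 * x2 expands to both main effects plus the x1:x2 product term, so the slope for one variable depends on the value of the other.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 100
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1 + 2 * x1 - x2 + 1.5 * x1 * x2 + rng.normal(size=n)  # true model has interaction
df = pd.DataFrame({"y": y, "x1": x1, "x2": x2})

no_interaction = smf.ols("y ~ x1 + x2", data=df).fit()    # main effects only
with_interaction = smf.ols("y ~ x1 * x2", data=df).fit()  # adds the x1:x2 term
print(with_interaction.params)  # the x1:x2 coefficient captures the interaction
```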

Multiple regression models
True or false: When considering which model is the best one for your setting, you should assume you have parallel regression lines (no interaction) in your model before considering a model with an interaction term.
a) True
b) False

Multiple regression models (answer)
True or false: When considering which model is the best one for your setting, you should assume you have parallel regression lines (no interaction) in your model before considering a model with an interaction term.
a) True
b) False (correct)

Multiple regression
True or false: The relationship between y and any explanatory variable can change greatly depending on which other explanatory variables are present in the model.
a) True
b) False

Multiple regression (answer)
True or false: The relationship between y and any explanatory variable can change greatly depending on which other explanatory variables are present in the model.
a) True (correct)
b) False
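This is easy to demonstrate when the explanatory variables are correlated. The sketch below is my own (simulated data): the estimated slope for x1 changes dramatically once x2 enters the model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.5, size=n)          # x2 is correlated with x1
y = 1 + 0.5 * x1 + 2.0 * x2 + rng.normal(size=n)
df = pd.DataFrame({"y": y, "x1": x1, "x2": x2})

alone = smf.ols("y ~ x1", data=df).fit()
together = smf.ols("y ~ x1 + x2", data=df).fit()
print(alone.params["x1"])     # slope for x1 when it is the only predictor (about 2.5 here)
print(together.params["x1"])  # a very different slope once x2 is in the model (about 0.5)
```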

Residual plots
What does it mean if you see a quadratic pattern in your residual plot?
a) All the regression assumptions were met.
b) There are many outliers.
c) The Normality assumption was not met.
d) An x² term may need to be added to the model.

Residual plots (answer)
What does it mean if you see a quadratic pattern in your residual plot?
a) All the regression assumptions were met.
b) There are many outliers.
c) The Normality assumption was not met.
d) An x² term may need to be added to the model. (correct)
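A sketch (my own, with simulated data) of diagnosing a quadratic pattern in a residual plot and refitting with an x² term, written with the I(x**2) notation that statsmodels formulas accept.

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 100
x = rng.uniform(-3, 3, size=n)
y = 2 + x + 1.5 * x**2 + rng.normal(size=n)  # the true relationship is quadratic
df = pd.DataFrame({"y": y, "x": x})

linear_fit = smf.ols("y ~ x", data=df).fit()
plt.scatter(linear_fit.fittedvalues, linear_fit.resid)  # residuals show a curved pattern
plt.axhline(0)
plt.show()

quadratic_fit = smf.ols("y ~ x + I(x**2)", data=df).fit()  # adding the x^2 term removes it
print(quadratic_fit.params)
```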

Multiple regression models
Which of the following is NOT an important indication of a good model?
a) The ANOVA F-test rejected the null hypothesis.
b) R² is close to 100%.
c) The β0 coefficient is significant.
d) The βi coefficients in the model (not counting β0) are significant.
e) The residual plot shows a random scattering of points.

Multiple regression models (answer)
Which of the following is NOT an important indication of a good model?
a) The ANOVA F-test rejected the null hypothesis.
b) R² is close to 100%.
c) The β0 coefficient is significant. (correct)
d) The βi coefficients in the model (not counting β0) are significant.
e) The residual plot shows a random scattering of points.
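To close, a sketch (my own, simulated data) gathering those checks from one fitted model: the overall F-test, R², t-tests for the predictor coefficients (the intercept's test is not a sign of model quality), and a residual plot.

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 120
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 4 + 1.2 * x1 - 0.7 * x2 + rng.normal(size=n)
df = pd.DataFrame({"y": y, "x1": x1, "x2": x2})

fit = smf.ols("y ~ x1 + x2", data=df).fit()
print(fit.f_pvalue)                       # ANOVA F-test for the model as a whole
print(fit.rsquared)                       # R-squared: closer to 1 is better
print(fit.pvalues[["x1", "x2"]])          # t-tests for the predictors (not the intercept)
plt.scatter(fit.fittedvalues, fit.resid)  # should look like random scatter around 0
plt.axhline(0)
plt.show()
```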