Stat 512 – Lecture 18 Multiple Regression (Ch. 11)

Projects
- Guidelines handout (one per group)
  - Project Do's and Don'ts
- Presentations
  - 5 minutes; will take volunteers for Tuesday
  - Hit the highlights; tons more detail for me in your paper
  - Sample hypothetical presentation
  - Plan for technology in advance!
- Group evaluation form next week

Last Time – Inference for Regression
Use residual plots to check the technical conditions (a sketch of these checks in code follows below):
- Linearity: If the plot of residuals vs. explanatory variable (EV)/fitted values shows no pattern (e.g., curvature), we will assume the original relationship was linear.
- Independence: If we have a random sample or randomization, we will assume independence.
- Normality: If the histogram of residuals is reasonably normal, we will assume the conditional distributions of the response at each x are all normal. (We will be a bit more forgiving on this condition with large n.)
- Equal standard deviation: If the plot of residuals vs. EV/fitted values shows equal vertical spread across all EV values, we will assume the conditional distributions at each x have the same SD.
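A minimal sketch of these residual checks in Python, using statsmodels and matplotlib; the data below are simulated stand-ins, not the class dataset.

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import statsmodels.formula.api as smf

# Simulated stand-in data (not the class dataset)
rng = np.random.default_rng(0)
df = pd.DataFrame({"x": rng.uniform(0, 10, 60)})
df["y"] = 3 + 2 * df["x"] + rng.normal(0, 2, 60)

fit = smf.ols("y ~ x", data=df).fit()

fig, axes = plt.subplots(1, 2, figsize=(10, 4))

# Linearity and equal SD: residuals vs. fitted values should show
# no curvature and roughly constant vertical spread
axes[0].scatter(fit.fittedvalues, fit.resid)
axes[0].axhline(0, color="gray")
axes[0].set_xlabel("fitted values")
axes[0].set_ylabel("residuals")

# Normality: histogram of residuals should look roughly bell-shaped
axes[1].hist(fit.resid, bins=15)
axes[1].set_xlabel("residuals")

plt.tight_layout()
plt.show()
```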

Last Time – Inference for Regression
Null hypothesis
- H0: no association between the response variable (RV) and the explanatory variable (EV) (identify them)
- H0: population slope β = 0
- H0: no treatment effect of the EV on the RV (identify them)
Minitab/SAS output
- Assumes a two-sided alternative
- t = (observed slope − hypothesized slope) / SE(observed slope), with d.f. = n − 2 (a hand computation is sketched below)
- Equivalent to the p-value Minitab reports with the correlation coefficient
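A minimal sketch of this slope test computed by hand with scipy, on simulated (x, y) data rather than the class example; the two-sided p-value matches the one linregress reports.

```python
import numpy as np
from scipy import stats

# Simulated (x, y) data standing in for the class example
rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 40)
y = 1 + 0.8 * x + rng.normal(0, 2, 40)

res = stats.linregress(x, y)           # least-squares slope, intercept, SE(slope)
n = len(x)
t_stat = (res.slope - 0) / res.stderr  # (observed slope - hypothesized slope) / SE
df_resid = n - 2
p_two_sided = 2 * stats.t.sf(abs(t_stat), df_resid)

print(f"slope = {res.slope:.3f}, SE = {res.stderr:.3f}")
print(f"t = {t_stat:.2f} on {df_resid} df, two-sided p = {p_two_sided:.4f}")
# Agrees with the two-sided p-value linregress reports
print(f"linregress p-value: {res.pvalue:.4f}")
```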

PP – Money Making Movies box office = score

PP – Money Making Movies

Consequence: restrict population to movies earning less than $200 million

PP – Money Making Movies
Is the relationship statistically significant?
- Highly significant (p-value < .001)
- But not all that useful (r² = 8.9%)
- Not a cause-and-effect relationship
- Not clear what population this represents

Can we improve on these models?
Adding predictor variables to the model (a sketch of fitting such a model follows below)
- Average response = β0 + β1x1 + β2x2 + …
- Graphical displays
- Interpreting coefficients
- Interpreting R²
- Inference for the model and for the coefficients
- Checking technical conditions (the same!)
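The following is a rough sketch of fitting such a model in Python with statsmodels. The airfare-style data and the column names (cost, miles, stops) are simulated stand-ins for the class dataset, not the real data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in for the LAX airfare data; column names are assumed
rng = np.random.default_rng(2)
n = 100
df = pd.DataFrame({
    "miles": rng.uniform(200, 3000, n),
    "stops": rng.integers(0, 3, n),
})
df["cost"] = 100 + 0.05 * df["miles"] + 40 * df["stops"] + rng.normal(0, 30, n)

# Average cost = b0 + b1*miles + b2*stops
fit = smf.ols("cost ~ miles + stops", data=df).fit()

print(fit.params)     # estimated coefficients b0, b1, b2
print(fit.rsquared)   # R^2: fraction of variability in cost explained
print(fit.summary())  # t tests for each slope, overall F test, etc.
```

In fit.params, the coefficient on miles is read as the estimated change in average cost per additional mile, holding the number of stops fixed.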

Three variables…

Summary
- Both mileage and number of stops appear useful in predicting cost, even after controlling for the other variable.
  - If number of stops is held constant, each additional mile costs about 5 cents…
- The regression on mileage and number of stops allows us to explain 45.5% of the variability in airfares from LAX.

Summary
- Can restrict the population to "deal with" an extreme outlier.
- A predictor that is statistically significant individually may not be significant when added to a model.
- The overall F test just tells you that at least one of the slopes is nonzero; use t tests to examine the predictors individually (see the sketch below).
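A small simulated illustration of these points (the variable names and data are invented): x2 is strongly correlated with x1, so it looks significant on its own but typically is not once x1 enters the model, and the overall F test only says that some slope is nonzero.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated example: x1 and x2 are strongly correlated, and only x1
# actually drives y, so x2 looks significant alone but not with x1 present
rng = np.random.default_rng(3)
n = 80
x1 = rng.normal(0, 1, n)
x2 = x1 + rng.normal(0, 0.3, n)    # highly correlated with x1
y = 2 + 3 * x1 + rng.normal(0, 1, n)
df = pd.DataFrame({"y": y, "x1": x1, "x2": x2})

alone = smf.ols("y ~ x2", data=df).fit()
both = smf.ols("y ~ x1 + x2", data=df).fit()

print(alone.pvalues["x2"])   # significant on its own
print(both.pvalues["x2"])    # typically not significant once x1 is in the model

# Overall F test: at least one slope is nonzero; use the t tests above
# to judge the predictors individually
print(f"F = {both.fvalue:.2f}, p = {both.f_pvalue:.4g}")
```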

Summary
- If you want to remove variables from the model, do so one at a time, as the p-values will change each time.
- Can use a 0-1 variable in the model (see the sketch below). Interpret its slope coefficient as the average change in response between group 0 and group 1 (assuming the same relationship between response and explanatory variable in both groups).
  - Otherwise consider interactions…
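A minimal sketch of using a 0-1 indicator variable, and an interaction term, in Python with statsmodels; the data and names (group, x, y) are simulated and made up for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data with a 0/1 group indicator (names are made up)
rng = np.random.default_rng(4)
n = 60
df = pd.DataFrame({
    "x": rng.uniform(0, 10, n),
    "group": rng.integers(0, 2, n),   # 0 or 1
})
# Group 1's line is shifted up by 5 but has the same slope as group 0
df["y"] = 2 + 1.5 * df["x"] + 5 * df["group"] + rng.normal(0, 1, n)

fit = smf.ols("y ~ x + group", data=df).fit()

# The coefficient on 'group' estimates the average difference in y between
# group 1 and group 0 at any fixed x (parallel-lines assumption)
print(fit.params["group"])

# If the slopes might differ between groups, add an interaction term:
fit_int = smf.ols("y ~ x * group", data=df).fit()   # adds the x:group term
print(fit_int.params)
```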

Multicollinearity
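One common diagnostic for multicollinearity is the variance inflation factor (VIF). A minimal sketch with simulated, deliberately redundant predictors follows; this is offered as an illustration and may not be the exact diagnostic discussed in class.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Simulated predictors where x1 and x2 are nearly redundant
rng = np.random.default_rng(5)
n = 80
x1 = rng.normal(0, 1, n)
x2 = x1 + rng.normal(0, 0.2, n)
x3 = rng.normal(0, 1, n)
X = pd.DataFrame({"x1": x1, "x2": x2, "x3": x3})

# VIF is computed from each predictor regressed on the others;
# values well above roughly 5-10 are often taken as a warning sign
X_const = sm.add_constant(X)
for i, name in enumerate(X_const.columns):
    if name == "const":
        continue
    print(name, variance_inflation_factor(X_const.values, i))
```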

For Tuesday
- Have a great Thanksgiving!
- Check Final Exam Schedule on Web
  - One Wed person back to Friday
- Submit PP (choice of procedure)
- HW 8
For Thursday
- Submit last PP (review questions)
- Review sheet will be posted online
- Presentations!