Stat 112 Notes 6 Today: –Chapter 4.1 (Introduction to Multiple Regression)

Multiple Regression In multiple regression analysis, we consider more than one explanatory variable, X1,…,XK. We are interested in the conditional mean of Y given X1,…,XK, E(Y|X1,…,XK). Two motivations for multiple regression: –We can obtain better predictions of Y by using information on X1,…,XK rather than just X1. –We can control for lurking variables.

Automobile Example A team charged with designing a new automobile is concerned about the gas mileage (gallons per 1000 miles on a highway) that can be achieved. The design team is interested in two things: (1) Which characteristics of the design are likely to affect mileage? (2) A new car is planned to have the following characteristics: weight – 4000 lbs, horsepower – 200, length – 200 inches, seating – 5 adults. Predict the new car’s gas mileage. The team has available information about gallons per 1000 miles and four design characteristics (weight, horsepower, length, seating) for a sample of cars; the data are in car04.JMP.

Best Single Predictor To obtain the correlation matrix and pairwise scatterplots, click Analyze, Multivariate Methods, Multivariate. If we use simple linear regression with each of the four explanatory variables, which provides the best predictions?

Best Single Predictor Answer: The simple linear regression with the highest R2 gives the best predictions, because R2 = 1 – SSE/SST and SST does not depend on the explanatory variable, so the highest R2 corresponds to the smallest sum of squared prediction errors. Weight gives the best predictions of GPM1000Hwy based on simple linear regression. But we can obtain better predictions by using more than one of the independent variables.
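The "pick the highest R2" rule can be sketched in a few lines of pure Python. The mini-dataset below is made up for illustration (it stands in for car04.JMP, which is not reproduced here); the variable names mirror the slide but the numbers are hypothetical.

```python
# Hypothetical mini-dataset standing in for car04.JMP (numbers are
# invented for illustration): gpm = gallons per 1000 highway miles.
cars = {
    "Weight":     [2.5, 3.0, 3.5, 4.0, 4.5],   # 1000s of lbs
    "Horsepower": [110, 150, 130, 220, 180],
    "Length":     [180, 170, 195, 185, 200],   # inches
}
gpm = [30.0, 36.0, 40.0, 47.0, 51.0]

def r_squared(x, y):
    """R^2 for the simple linear regression of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    b1 = sxy / sxx                      # least-squares slope
    b0 = my - b1 * mx                   # least-squares intercept
    sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
    sst = sum((yi - my) ** 2 for yi in y)
    return 1 - sse / sst                # R^2 = 1 - SSE/SST

# Fit y on each X separately and keep the one with the highest R^2.
best = max(cars, key=lambda name: r_squared(cars[name], gpm))
for name, x in cars.items():
    print(f"{name:>10}: R^2 = {r_squared(x, gpm):.3f}")
print("Best single predictor:", best)
```

In this made-up data, as in the slide's real data, Weight comes out on top; with the actual car04.JMP file the comparison would be done through JMP's Multivariate platform instead.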

Multiple Linear Regression Model The mean of Y is modeled as a linear function of the explanatory variables: E(Y|X1,…,XK) = β0 + β1X1 + … + βKXK. Equivalently, Y = β0 + β1X1 + … + βKXK + e, where the errors e have mean 0 and standard deviation σ and are independent across observations.

Point Estimates for Multiple Linear Regression Model We use the same least squares procedure as for simple linear regression. Our estimates of β0, β1,…, βK are the coefficients b0, b1,…, bK that minimize the sum of squared prediction errors: Σi (yi – b0 – b1xi1 – … – bKxiK)2. Least Squares in JMP: Click Analyze, Fit Model, put the dependent variable into Y and add the independent variables to the Construct Model Effects box.
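The least-squares minimization has a closed-form answer: the coefficients solve the normal equations (XᵀX)b = Xᵀy. A minimal pure-Python sketch, on invented numbers standing in for car04.JMP (two predictors so the linear system stays small):

```python
# Least-squares sketch on synthetic data (not the real car04.JMP values):
# find b0, b1, b2 minimizing sum_i (y_i - b0 - b1*x1_i - b2*x2_i)^2
# by solving the normal equations (X'X) b = X'y.
x1 = [2.5, 3.0, 3.5, 4.0, 4.5]          # e.g. weight in 1000s of lbs
x2 = [110, 150, 130, 220, 180]          # e.g. horsepower
y  = [30.0, 36.0, 40.0, 47.0, 51.0]     # gallons per 1000 miles

X = [[1.0, a, b] for a, b in zip(x1, x2)]   # design matrix with intercept

def solve(A, v):
    """Solve A b = v by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [v[i]] for i, row in enumerate(A)]   # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    b = [0.0] * n
    for r in range(n - 1, -1, -1):       # back substitution
        b[r] = (M[r][n] - sum(M[r][c] * b[c] for c in range(r + 1, n))) / M[r][r]
    return b

# Build X'X and X'y, then solve for the coefficients.
XtX = [[sum(row[r] * row[c] for row in X) for c in range(3)] for r in range(3)]
Xty = [sum(X[i][r] * y[i] for i in range(len(X))) for r in range(3)]
coefs = solve(XtX, Xty)                  # [b0, b1, b2]
print("least-squares coefficients:", coefs)
```

JMP's Fit Model platform does this same computation (for any number of predictors) behind the scenes; a defining property of the solution is that the residuals are orthogonal to every column of X.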

Root Mean Square Error Estimate of σ: RMSE = sqrt(sum of squared residuals / (n – K – 1)), reported as Root Mean Square Error in JMP. For simple linear regression of GP1000MHWY on Weight, the RMSE is larger than for the multiple linear regression of GP1000MHWY on weight, horsepower, length, and seating, for which the RMSE is about 3.08. The multiple regression improves the predictions.

Residuals and Root Mean Square Errors Residual for observation i = prediction error for observation i = yi – ŷi. Root mean square error ≈ typical size of the absolute value of the prediction error. As with the simple linear regression model, if the multiple linear regression model holds: –About 95% of the observations will be within two RMSEs of their predicted value. For the car data, about 95% of the time, the actual GP1000M will be within 2×3.08 = 6.16 GP1000M of the predicted GP1000M based on the car’s weight, horsepower, length, and seating.
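The residual and RMSE computations can be sketched directly. The fitted values below are made up (they stand in for the output of a fitted car regression with K = 2 predictors); the point is the formulas, not the numbers:

```python
# Residual / RMSE sketch on invented fitted values (not real car data):
# residual_i = y_i - yhat_i, RMSE = sqrt(sum residual_i^2 / (n - K - 1)).
import math

y    = [30.0, 36.0, 40.0, 47.0, 51.0, 33.0, 44.0]   # observed
yhat = [30.5, 35.2, 40.8, 46.1, 51.4, 33.6, 43.4]   # hypothetical fitted values
K = 2                                               # number of predictors

residuals = [yi - yh for yi, yh in zip(y, yhat)]
rmse = math.sqrt(sum(r * r for r in residuals) / (len(y) - K - 1))

# If the model holds, roughly 95% of observations fall within 2 RMSEs
# of their predicted value.
within = sum(abs(r) <= 2 * rmse for r in residuals) / len(y)
print(f"RMSE = {rmse:.3f}; fraction within 2 RMSE = {within:.2f}")
```

In this tiny example every residual lands within two RMSEs; in real data with the model holding, the fraction is typically about 0.95.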

Residual Example

Interpretation of Regression Coefficients Gas mileage regression from car04.JMP. In multiple regression, the coefficient on an explanatory variable is the change in the mean of Y associated with a one-unit increase in that variable, holding the other explanatory variables fixed.