Summarizing Bivariate Data: Non-linear Regression Example
Copyright © 2005 Brooks/Cole, a division of Thomson Learning, Inc.


Slide 1: Summarizing Bivariate Data, Non-linear Regression Example

Slide 2: The Greyhound problem with additional data. The sample of fares and mileages from Rochester was extended to cover a total of 20 cities throughout the country. The resulting data and a scatterplot are given on the next few slides.

Slide 3: Extended Greyhound Fare Example

Slide 4: Extended Greyhound Fare Example

Slide 5: Extended Greyhound Fare Example. Minitab reports the correlation coefficient r = 0.921, R² = 0.849, s_e = $17.42, and the least-squares line relating Standard Fare to Distance. Notice that even though the correlation coefficient is reasonably high and 84.9% of the variation in the fare is explained, the linear model is not very usable.
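The quantities on this slide can be computed directly from the least-squares formulas. The sketch below fits a line by hand and reports r, R², and s_e; the fares and distances are made up for illustration and are NOT the actual Greyhound data.

```python
# Least-squares fit of fare on distance, computing r, R^2, and s_e.
# The data below are hypothetical, not the Greyhound sample from the slides.
import math

distance = [100.0, 300.0, 700.0, 1500.0, 2500.0]
fare = [25.0, 45.0, 70.0, 100.0, 115.0]
n = len(fare)

x_bar = sum(distance) / n
y_bar = sum(fare) / n
s_xy = sum((x - x_bar) * (y - y_bar) for x, y in zip(distance, fare))
s_xx = sum((x - x_bar) ** 2 for x in distance)
s_yy = sum((y - y_bar) ** 2 for y in fare)

slope = s_xy / s_xx
intercept = y_bar - slope * x_bar

residuals = [y - (intercept + slope * x) for x, y in zip(distance, fare)]
ss_resid = sum(e ** 2 for e in residuals)

r = s_xy / math.sqrt(s_xx * s_yy)    # correlation coefficient
r_squared = 1 - ss_resid / s_yy      # fraction of variation explained
s_e = math.sqrt(ss_resid / (n - 2))  # "typical error" of prediction

print(f"r = {r:.3f}, R^2 = {r_squared:.3f}, s_e = {s_e:.2f}")
```

For a simple linear regression with an intercept, R² always equals r², which is a quick consistency check on output like the slide's r = 0.921 and R² = 0.849.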

Slide 6: Nonlinear Regression Example

Slide 7: Nonlinear Regression Example. From the previous slide we can see that the plot does not look linear; it appears to have a curved shape. We sometimes replace one or both of the variables with a transformation of that variable and then perform a linear regression on the transformed variables. This can sometimes lead to a useful prediction equation. For this particular data set, the shape of the curve is almost logarithmic, so we might try replacing the distance with log10(distance), the logarithm of the distance to the base 10.

Slide 8: Nonlinear Regression Example. Minitab provides the following output.

Regression Analysis: Standard Fare versus Log10(Distance)

The regression equation is
  Standard Fare = Log10(Distance)

Predictor    Coef  SE Coef  T  P
Constant
Log10(Di

S = 7.87   R-Sq = 96.9%   R-Sq(adj) = 96.7%

Annotations on the slide: high r; 96.9% of the variation is attributed to the model; typical error = $7.87, which is reasonably good.
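The transformation step can be sketched in a few lines: regress fare on log10(distance) instead of distance, and compare R² for the two fits. The data below are synthetic (generated exactly from a logarithmic curve), so the improvement is guaranteed by construction; they are not the Greyhound fares.

```python
# Compare R^2 for fare-vs-distance and fare-vs-log10(distance) fits.
# Synthetic data generated from a logarithmic curve, for illustration only.
import math

def fit_r_squared(xs, ys):
    """Least-squares fit of ys on xs; returns (intercept, slope, R^2)."""
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    s_xy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
    s_xx = sum((x - x_bar) ** 2 for x in xs)
    s_yy = sum((y - y_bar) ** 2 for y in ys)
    slope = s_xy / s_xx
    intercept = y_bar - slope * x_bar
    ss_resid = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))
    return intercept, slope, 1 - ss_resid / s_yy

distance = [100.0, 250.0, 500.0, 1000.0, 2000.0, 3000.0]
# Fares generated from a logarithmic curve, so the log model fits exactly.
fare = [10.0 + 30.0 * math.log10(d) for d in distance]

_, _, r2_linear = fit_r_squared(distance, fare)
_, _, r2_log = fit_r_squared([math.log10(d) for d in distance], fare)

print(f"R^2 (fare vs distance)        = {r2_linear:.3f}")
print(f"R^2 (fare vs log10(distance)) = {r2_log:.3f}")
```

This mirrors what the Minitab run does: the regression machinery is unchanged; only the predictor column is transformed before fitting.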

Slide 9: Nonlinear Regression Example. The rest of the Minitab output follows.

Analysis of Variance
Source          DF  SS  MS  F  P
Regression
Residual Error
Total

Unusual Observations
Obs  Log10(Di  Standard  Fit  SE Fit  Residual  St Resid
                                                       R
R denotes an observation with a large standardized residual.

The only outlier is Orlando and, as you'll see from the next two slides, it is not too bad.

Slide 10: Nonlinear Regression Example. Looking at the plot of the residuals against distance, we see some problems. The model overestimates fares for middle distances (1000 to 2000 miles) and underestimates them for longer distances (more than 2000 miles).
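The residual check on this slide, and the "R" flag in the earlier Minitab output, can be sketched as follows: compute the residuals from a simple linear fit, standardize each one using its leverage, and flag points with |standardized residual| > 2. The data are synthetic with a deliberate outlier, not the Greyhound fares.

```python
# Residuals and standardized residuals for a simple linear fit, flagging
# large ones the way Minitab marks them with "R". Hypothetical data.
import math

x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
y = [2.1, 3.9, 6.2, 8.0, 9.9, 16.0]  # last point is a deliberate outlier

n = len(x)
x_bar = sum(x) / n
y_bar = sum(y) / n
s_xx = sum((xi - x_bar) ** 2 for xi in x)
slope = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / s_xx
intercept = y_bar - slope * x_bar

residuals = [yi - (intercept + slope * xi) for xi, yi in zip(x, y)]
s_e = math.sqrt(sum(e ** 2 for e in residuals) / (n - 2))

for xi, e in zip(x, residuals):
    # Leverage of point xi in a simple linear regression.
    leverage = 1 / n + (xi - x_bar) ** 2 / s_xx
    std_resid = e / (s_e * math.sqrt(1 - leverage))
    flag = " R" if abs(std_resid) > 2 else ""
    print(f"x = {xi:4.1f}  residual = {e:7.3f}  st resid = {std_resid:6.2f}{flag}")
```

Plotting these residuals against the predictor (as the slide does) is what reveals the systematic over- and under-estimation at different distance ranges.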

Slide 11: Nonlinear Regression Example. When we plot the prediction curve on axes of Standard Fare versus log10(Distance), the result looks reasonably linear.

Slide 12: Nonlinear Regression Example. When we plot the prediction curve on axes of Standard Fare versus Distance, the curve tracks the data well. By and large, this prediction model for the fares appears to work reasonably well.
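Using the fitted curve for prediction on the original axes is just a matter of plugging the distance into the transformed model. A minimal sketch, with hypothetical coefficients (the actual intercept and slope from the slides' Minitab output are not reproduced here):

```python
# Predicting fares on the original Distance scale from a log10 model.
# The coefficients a and b below are illustrative placeholders, not the
# values fitted to the Greyhound data.
import math

a, b = -100.0, 80.0  # hypothetical intercept and slope for fare vs log10(distance)

def predicted_fare(distance_miles):
    """Predicted Standard Fare from the fare = a + b*log10(distance) model."""
    return a + b * math.log10(distance_miles)

for d in (200, 500, 1000, 2000, 3000):
    print(f"{d:5d} miles -> predicted fare ${predicted_fare(d):7.2f}")
```

Because the model is linear in log10(distance), equal ratios of distance (say, doubling it) add a constant amount to the predicted fare, which is what gives the curve its flattening shape at long distances.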