Copyright © 2006 The McGraw-Hill Companies, Inc. Permission required for reproduction or display. ~ Curve Fitting ~ Least Squares Regression, Chapter 17


~ Curve Fitting ~ Least Squares Regression, Chapter 17

Curve Fitting
The goal is to fit the best curve to a discrete data set and obtain estimates for other data points. There are two general approaches:
– If the data exhibit a significant degree of scatter, find a single curve that represents the general trend of the data.
– If the data are very precise, pass a curve (or a series of curves) exactly through each of the points.
Two common applications in engineering:
– Trend analysis: predicting values of the dependent variable, either extrapolation beyond the data points or interpolation between them.
– Hypothesis testing: comparing an existing mathematical model with measured data.

Simple Statistics
In the sciences, if several measurements are made of a particular quantity, additional insight can be gained by summarizing the data in one or more well-chosen statistics:
– Arithmetic mean: the sum of the individual data points y_i divided by the number of points n: ybar = (Σ y_i) / n.
– Standard deviation: the most common measure of spread for a sample (the square root of the variance): s_y = sqrt( Σ (y_i − ybar)² / (n − 1) ).
– Coefficient of variation: quantifies the spread of the data relative to the mean (similar to relative error): c.v. = (s_y / ybar) × 100%.
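The three statistics above translate directly into code; a minimal sketch (the function name simple_stats is my own, not from the chapter):

```python
import math

def simple_stats(y):
    """Arithmetic mean, sample standard deviation, and
    coefficient of variation (in percent) of a data set."""
    n = len(y)
    mean = sum(y) / n
    # Sample variance: sum of squared deviations over (n - 1)
    variance = sum((yi - mean) ** 2 for yi in y) / (n - 1)
    sd = math.sqrt(variance)
    cv = 100.0 * sd / mean  # spread relative to the mean, in percent
    return mean, sd, cv
```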

Linear Regression
Fitting a straight line to a set of paired observations: (x_1, y_1), (x_2, y_2), …, (x_n, y_n).
Line equation: y = a_0 + a_1 x, where a_1 is the slope and a_0 the intercept.
Model for each measured value y_i: y_i = a_0 + a_1 x_i + e
so the error (residual) is e = y_i − a_0 − a_1 x_i.

Choosing a Criterion for a "Best" Fit
– Minimize the sum of the residual errors for all available data? Inadequate: positive and negative residuals can cancel each other.
– Minimize the sum of the absolute values of the residuals? Inadequate: it does not yield a unique best-fit line.
– Minimize the maximum distance that an individual point falls from the line? This does not work either: it gives undue influence to a single outlying point.

The best strategy is to minimize the sum of the squares of the residuals between the measured y and the y calculated with the linear model:
S_r = Σ e_i² = Σ (y_i − a_0 − a_1 x_i)²
This criterion yields a unique line for a given set of data. We need to compute a_0 and a_1 such that S_r is minimized.

Least-Squares Fit of a Straight Line
Setting the partial derivatives of S_r with respect to a_0 and a_1 to zero gives the normal equations, which can be solved simultaneously:
n a_0 + (Σ x_i) a_1 = Σ y_i
(Σ x_i) a_0 + (Σ x_i²) a_1 = Σ x_i y_i

Least-Squares Fit of a Straight Line
Solving the normal equations simultaneously gives:
a_1 = (n Σ x_i y_i − Σ x_i Σ y_i) / (n Σ x_i² − (Σ x_i)²)
a_0 = ybar − a_1 xbar
where ybar and xbar are the mean values of y and x.
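These closed-form expressions can be coded directly; a sketch (fit_line is a hypothetical helper name):

```python
def fit_line(x, y):
    """Least-squares fit of y = a0 + a1*x using the solved normal equations."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(xi * xi for xi in x)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    # Slope from the normal equations, intercept from the mean values
    a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a0 = sy / n - a1 * sx / n
    return a0, a1
```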

"Goodness" of Our Fit
S_r = sum of the squares of the residuals around the regression line: S_r = Σ (y_i − a_0 − a_1 x_i)²
S_t = total sum of the squares around the mean: S_t = Σ (y_i − ybar)²
(S_t − S_r) quantifies the improvement, or error reduction, due to describing the data in terms of a straight line rather than as an average value.
r² = (S_t − S_r) / S_t is the coefficient of determination; r is the correlation coefficient.
For a perfect fit, S_r = 0 and r = r² = 1, signifying that the line explains 100 percent of the variability of the data.
For r = r² = 0, S_r = S_t and the fit represents no improvement.
(Figure: the spread of the data (a) around the mean and (b) around the best-fit line; note the reduction in error due to linear regression.)
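S_t, S_r, and r² follow directly from these definitions; a sketch (r_squared is my own helper name):

```python
def r_squared(x, y, a0, a1):
    """Coefficient of determination r^2 = (St - Sr) / St
    for the fitted line y = a0 + a1*x."""
    ybar = sum(y) / len(y)
    st = sum((yi - ybar) ** 2 for yi in y)                      # spread around the mean
    sr = sum((yi - a0 - a1 * xi) ** 2 for xi, yi in zip(x, y))  # spread around the line
    return (st - sr) / st
```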

Linearization of Nonlinear Relationships
(Figure: (a) data that are ill-suited for linear least-squares regression; (b) an indication that a parabola may be more suitable.)
Exponential equation: y = α1 e^(β1 x). Taking the natural logarithm linearizes it: ln y = ln α1 + β1 x, so a plot of ln y versus x is a straight line with slope β1 and intercept ln α1.

Linearization of Nonlinear Relationships
Saturation-growth-rate equation: y = α3 x / (β3 + x). Inverting gives 1/y = 1/α3 + (β3/α3)(1/x), so a plot of 1/y versus 1/x is a straight line.
Power equation: y = α2 x^β2. Taking base-10 logarithms gives log y = log α2 + β2 log x, so a log-log plot is a straight line with slope β2 and intercept log α2.

Example: Fitting the Power Equation
Data (x, y pairs; the table is not reproduced here) are to be fit to the power equation y = α2 x^β2.
After the (log y)–(log x) plot is obtained, find β2 and α2 from the slope and intercept of the linear regression, which yields the result:
log y = 1.75 log x − 0.301
β2 = slope = 1.75
log α2 = intercept = −0.301, so α2 = 0.5
Hence y = 0.5 x^1.75. See Exercises.xls.
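The same computation in Python, regressing log10 y against log10 x (fit_power is my own name; the test data are synthetic points generated from y = 0.5 x^1.75, since the slide's table did not survive extraction):

```python
import math

def fit_power(x, y):
    """Fit y = alpha * x**beta by linear regression in log10-log10 space."""
    lx = [math.log10(v) for v in x]
    ly = [math.log10(v) for v in y]
    n = len(lx)
    sx, sy = sum(lx), sum(ly)
    sxx = sum(v * v for v in lx)
    sxy = sum(a * b for a, b in zip(lx, ly))
    beta = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope of the log-log line
    log_alpha = sy / n - beta * sx / n                # intercept = log10(alpha)
    return 10.0 ** log_alpha, beta
```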

Polynomial Regression
Some engineering data are poorly represented by a straight line; a curve (polynomial) may be better suited to fit the data. The least-squares method can be extended to fit the data to higher-order polynomials.
As an example, consider a second-order polynomial fit to the data points:
y = a_0 + a_1 x + a_2 x² + e
with S_r = Σ (y_i − a_0 − a_1 x_i − a_2 x_i²)².

Polynomial Regression
To fit the data to an m-th order polynomial, we need to solve a system of linear equations: (m+1) equations in the (m+1) unknowns a_0, …, a_m, obtained by setting the derivatives of S_r with respect to each coefficient to zero. In matrix form, for m = 2:
| n       Σx_i     Σx_i²  |   | a_0 |   | Σy_i      |
| Σx_i    Σx_i²    Σx_i³  | · | a_1 | = | Σx_i y_i  |
| Σx_i²   Σx_i³    Σx_i⁴  |   | a_2 |   | Σx_i² y_i |
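A sketch of the general case: build the (m+1)×(m+1) normal equations and solve them by Gaussian elimination (fit_poly and the pure-Python solver are my own; in practice a library solver would be used):

```python
def fit_poly(x, y, m):
    """Least-squares polynomial y = a0 + a1*x + ... + am*x**m.
    Builds the normal equations A[i][j] = sum(x**(i+j)), b[i] = sum(x**i * y)
    and solves them by Gaussian elimination with partial pivoting."""
    n = m + 1
    A = [[sum(xi ** (i + j) for xi in x) for j in range(n)] for i in range(n)]
    b = [sum(xi ** i * yi for xi, yi in zip(x, y)) for i in range(n)]
    # Forward elimination with partial pivoting
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(A[r][k]))
        A[k], A[p] = A[p], A[k]
        b[k], b[p] = b[p], b[k]
        for r in range(k + 1, n):
            f = A[r][k] / A[k][k]
            for c in range(k, n):
                A[r][c] -= f * A[k][c]
            b[r] -= f * b[k]
    # Back substitution
    a = [0.0] * n
    for i in range(n - 1, -1, -1):
        a[i] = (b[i] - sum(A[i][j] * a[j] for j in range(i + 1, n))) / A[i][i]
    return a  # coefficients a0..am
```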

Multiple Linear Regression
A useful extension of linear regression is the case where y is a linear function of two or more independent variables. For example:
y = a_0 + a_1 x_1 + a_2 x_2
For this two-dimensional case, the regression "line" becomes a plane (figure not reproduced here).

Multiple Linear Regression
Minimizing S_r = Σ (y_i − a_0 − a_1 x_1,i − a_2 x_2,i)² again leads to a set of normal equations, here a 3×3 symmetric linear system in a_0, a_1, a_2. Which method would you use to solve this linear system of equations?

Multiple Linear Regression, Example 17.6
The following data (table not reproduced here) were calculated from the equation y = 5 + 4x_1 − 3x_2. Use multiple linear regression to fit this data.
Solution: the normal-equation system can be solved using Gauss elimination. The result is a_0 = 5, a_1 = 4, and a_2 = −3, i.e. y = 5 + 4x_1 − 3x_2, recovering the generating equation exactly.
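A sketch reproducing the example's result; since the slide's data table did not survive extraction, the points below are my own, generated from y = 5 + 4x1 − 3x2:

```python
def fit_plane(x1, x2, y):
    """Fit y = a0 + a1*x1 + a2*x2 by solving the 3x3 normal equations
    with Gaussian elimination, as in the example's solution."""
    n = len(y)
    A = [
        [float(n), sum(x1), sum(x2)],
        [sum(x1), sum(u * u for u in x1), sum(u * v for u, v in zip(x1, x2))],
        [sum(x2), sum(u * v for u, v in zip(x1, x2)), sum(v * v for v in x2)],
    ]
    b = [sum(y),
         sum(u * w for u, w in zip(x1, y)),
         sum(v * w for v, w in zip(x2, y))]
    # Gaussian elimination with partial pivoting, then back substitution
    for k in range(3):
        p = max(range(k, 3), key=lambda r: abs(A[r][k]))
        A[k], A[p] = A[p], A[k]
        b[k], b[p] = b[p], b[k]
        for r in range(k + 1, 3):
            f = A[r][k] / A[k][k]
            for c in range(k, 3):
                A[r][c] -= f * A[k][c]
            b[r] -= f * b[k]
    a = [0.0, 0.0, 0.0]
    for i in range(2, -1, -1):
        a[i] = (b[i] - sum(A[i][j] * a[j] for j in range(i + 1, 3))) / A[i][i]
    return a

# Synthetic data from y = 5 + 4*x1 - 3*x2 (illustrative, not the slide's table)
x1 = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
x2 = [0.0, 2.0, 1.0, 4.0, 3.0, 5.0]
y = [5.0 + 4.0 * u - 3.0 * v for u, v in zip(x1, x2)]
```

Because the data are generated exactly from the model, the fit recovers a_0 = 5, a_1 = 4, a_2 = −3 to within round-off.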