Chapter 12 Curve Fitting : Fitting a Straight Line Gab-Byung Chae


Curve Fitting: finding a function that passes through, or close to, a set of discrete data points. Curve fitting may be needed to estimate values at points between the discrete values, or to obtain a simplified version of a complicated function.

12.1 Two General Approaches for Curve Fitting. Least-squares regression: used when the data exhibits "scatter"; the fitted curve follows the general trend of the data without passing through every point. Interpolation: used where the data is known to be very precise; the basic approach is to fit a curve or a series of curves that pass directly through each of the points.

12.2 Statistics Review. Given n data points $y_1, y_2, \dots, y_n$: the mean is $\bar{y} = \frac{1}{n}\sum_{i=1}^{n} y_i$; the total sum of the squares of the residuals about the mean is $S_t = \sum_{i=1}^{n} (y_i - \bar{y})^2$; the standard deviation is $s_y = \sqrt{S_t/(n-1)}$; the variance is $s_y^2 = S_t/(n-1)$; the coefficient of variation is $\text{c.v.} = (s_y/\bar{y}) \times 100\%$; and the number of degrees of freedom is $n-1$.
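A minimal MATLAB sketch of these summary statistics (the vector y uses the data of Example 12.2 below purely as sample values; all variable names are illustrative):

```matlab
% Summary statistics for a sample vector y
y = [25 70 380 550 610 1220 830 1450];   % sample data (from Example 12.2)
n = length(y);

ybar = sum(y)/n;                 % mean
St   = sum((y - ybar).^2);       % sum of squares of residuals about the mean
sy   = sqrt(St/(n - 1));         % standard deviation
vary = St/(n - 1);               % variance
cv   = sy/ybar*100;              % coefficient of variation, in percent
```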

The Normal Distribution. As n increases, the histogram of the measurements often approaches the normal distribution; in that case roughly 68% of the total measurements fall within the range $\bar{y} \pm s_y$.

12.3 Least Squares Regression. The idea is to minimize some measure of the difference between the approximating function and the given data points. In the least-squares method, the error is measured as the sum of the squared residuals, $E = \sum_{i=1}^{n} \left[y_i - f(x_i)\right]^2$.

Linear Least Squares Regression. Here f(x) has the linear form $f(x) = ax + b$, so the residual at each data point is $e_i = y_i - a x_i - b$. Would it work to minimize simply the sum of the residuals, $\sum_{i=1}^{n} (y_i - a x_i - b)$? No: positive and negative residuals cancel, so, for instance, any straight line passing through the midpoint of the line connecting two data points makes this sum zero, and the criterion does not single out a unique best line.

Instead, the sum of the squared residuals, $E = \sum_{i=1}^{n} (y_i - a x_i - b)^2$, is minimized. The minimum of E occurs when the partial derivatives of E with respect to each of the unknowns are zero:
$$\frac{\partial E}{\partial a} = -2\sum_{i=1}^{n} x_i\,(y_i - a x_i - b) = 0, \qquad \frac{\partial E}{\partial b} = -2\sum_{i=1}^{n} (y_i - a x_i - b) = 0.$$
Solving this pair of normal equations, Eqs. (12.15) and (12.16), gives
$$a = \frac{n\sum x_i y_i - \sum x_i \sum y_i}{n\sum x_i^2 - \left(\sum x_i\right)^2}, \qquad b = \bar{y} - a\,\bar{x}.$$

Example 12.2. Find the straight line that fits the points (10, 25), (20, 70), (30, 380), (40, 550), (50, 610), (60, 1220), (70, 830), (80, 1450) in the least-squares sense.
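A minimal MATLAB sketch of this fit using the slope and intercept formulas above (the plot call is only for illustration):

```matlab
% Least-squares straight line for the Example 12.2 data
x = [10 20 30 40 50 60 70 80];
y = [25 70 380 550 610 1220 830 1450];
n = length(x);

a = (n*sum(x.*y) - sum(x)*sum(y)) / (n*sum(x.^2) - sum(x)^2);  % slope
b = mean(y) - a*mean(x);                                       % intercept

plot(x, y, 'o', x, a*x + b, '-');   % data points and fitted straight line
```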

Quantification of Error. The sum of the squares of the residuals about the regression line is $S_r = \sum_{i=1}^{n} (y_i - a x_i - b)^2$.

If the spread of the points around the line is of similar magnitude along the entire range of the data, and the distribution of these points about the line is normal, then the least-squares estimates of a and b are the best estimates. This is called the maximum likelihood principle in statistics.

Standard Error of the Estimate: $s_{y/x} = \sqrt{S_r/(n-2)}$. The sum is divided by $n-2$ because two degrees of freedom were used up in estimating a and b. Just as $s_y$ quantifies the spread of the data around the mean, $s_{y/x}$ quantifies the spread around the regression line.

Coefficient of Determination: $r^2 = (S_t - S_r)/S_t$, whose square root r is the correlation coefficient. r = 1 means the line explains 100% of the variability of the data; r = 0 means the fit represents no improvement over simply using the mean.
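A minimal MATLAB sketch, continuing Example 12.2, of the error measures just defined ($S_r$, $S_t$, the standard error of the estimate, and the coefficient of determination):

```matlab
x = [10 20 30 40 50 60 70 80];
y = [25 70 380 550 610 1220 830 1450];
n = length(x);

% least-squares slope and intercept (normal equations)
a = (n*sum(x.*y) - sum(x)*sum(y)) / (n*sum(x.^2) - sum(x)^2);
b = mean(y) - a*mean(x);

Sr  = sum((y - a*x - b).^2);    % squared residuals about the regression line
St  = sum((y - mean(y)).^2);    % squared residuals about the mean
syx = sqrt(Sr/(n - 2));         % standard error of the estimate
r2  = (St - Sr)/St;             % coefficient of determination
```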

Example 12.3. For the straight-line fit of Example 12.2, these results indicate that 88.05% of the original uncertainty has been explained by the linear model, i.e., $r^2 \approx 0.8805$.


12.3.4 Linearization of Nonlinear Relationships. Three nonlinear models that can be linearized are the exponential equation $y = \alpha_1 e^{\beta_1 x}$, the power equation $y = \alpha_2 x^{\beta_2}$, and the saturation-growth-rate equation $y = \alpha_3\, x/(\beta_3 + x)$.
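Each model becomes a straight line after a standard change of variables (the $\alpha_i$, $\beta_i$ symbols follow the notation above):
$$\ln y = \ln \alpha_1 + \beta_1 x, \qquad \log y = \log \alpha_2 + \beta_2 \log x, \qquad \frac{1}{y} = \frac{1}{\alpha_3} + \frac{\beta_3}{\alpha_3}\,\frac{1}{x}.$$
In each case a linear least-squares fit in the transformed variables yields the model parameters.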

Example 12.4. Fit the data (10, 25), (20, 70), (30, 380), (40, 550), (50, 610), (60, 1220), (70, 830), (80, 1450) with the power equation $y = \alpha_2 x^{\beta_2}$. Taking logarithms of both sides gives $\log y = \log \alpha_2 + \beta_2 \log x$, which is again of the linear form $f(x) = ax + b$ in the transformed variables, so linear least squares can be applied to the points $(\log x_i, \log y_i)$.
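A sketch of this power-equation fit in MATLAB, using polyfit (introduced in Section 12.4) on the log-transformed data; the plotting call is only for illustration:

```matlab
x = [10 20 30 40 50 60 70 80];
y = [25 70 380 550 610 1220 830 1450];

p     = polyfit(log10(x), log10(y), 1);  % straight-line fit in log-log space
beta  = p(1);                            % exponent of the power equation
alpha = 10^p(2);                         % coefficient of the power equation

xx = linspace(min(x), max(x));
loglog(x, y, 'o', xx, alpha*xx.^beta, '-');   % data and fitted power curve
```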


12.4 Implementation of Linear Regression

MATLAB's built-in functions: z = polyfit(x, y, n) returns the coefficients of the degree-n polynomial that fits the data (x, y) in the least-squares sense (n = 1 gives a straight line), and yy = polyval(z, xx) evaluates that polynomial at the points xx.
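For instance, a sketch of the Example 12.2 straight-line fit using these built-in functions:

```matlab
x = [10 20 30 40 50 60 70 80];
y = [25 70 380 550 610 1220 830 1450];

z  = polyfit(x, y, 1);          % z(1) = slope, z(2) = intercept
xx = linspace(min(x), max(x));
yy = polyval(z, xx);            % evaluate the fitted line at the points xx
plot(x, y, 'o', xx, yy, '-');   % data points and fitted straight line
```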