Curve Fitting and Regression EEE 244

Descriptive Statistics in MATLAB

MATLAB has several built-in commands to compute and display descriptive statistics. Assuming some column vector s:
- mean(s), median(s), mode(s): calculate the mean, median, and mode of s. mode is part of the Statistics Toolbox.
- min(s), max(s): calculate the minimum and maximum values in s.
- var(s), std(s): calculate the variance (the square of the standard deviation) and the standard deviation of s.

Note: if a matrix is given, the statistics are returned for each column.

Example: Drain Current Measurements

Measurements of drain current, in mA, taken at intervals from t = 10 am to 10 pm. Find the mean, median, mode, minimum, maximum, variance, and standard deviation using the appropriate MATLAB functions.
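
A minimal sketch of these functions in action; the hourly current readings below are hypothetical stand-ins for the slide's measurements.

    % Hypothetical drain current readings (mA), one per hour, 10 am - 10 pm
    s = [4.2; 4.5; 4.4; 4.8; 5.1; 5.0; 4.9; 5.3; 5.2; 5.0; 4.7; 4.6; 4.4];
    mean(s)     % arithmetic mean
    median(s)   % middle value of the sorted data
    mode(s)     % most frequently occurring value
    min(s)      % smallest reading
    max(s)      % largest reading
    var(s)      % variance (square of the standard deviation)
    std(s)      % standard deviation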

Histograms in MATLAB

- [n, x] = hist(s, x): determines the number of elements of s in each bin, where the input x is a vector containing the center values of the bins.
- [n, x] = hist(s, m): determines the number of elements of s in each of m bins; the output x contains the bin centers. The default is m = 10.
- hist(s) with no output arguments produces a histogram plot.

Open the MATLAB help and run the example with 1000 randomly generated values for s; then repeat it with m = 5.
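
A short sketch of the help-style example described above, assuming normally distributed random data:

    % 1000 normally distributed random values
    s = randn(1000, 1);
    [n, x] = hist(s);       % default 10 bins: n = counts, x = bin centers
    [n5, x5] = hist(s, 5);  % repeat with m = 5 bins
    hist(s)                 % no output arguments: plot the histogram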

REGRESSION EEE 244

Linear Regression

Fitting a straight line to a set of paired observations $(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)$:

$y = a_0 + a_1 x + e$

- $a_1$: slope
- $a_0$: intercept
- $e$: error, or residual, between the model and the observations

Linear Least-Squares Regression

Linear least-squares regression is a method to determine the "best" coefficients in a linear model for a given data set. "Best" for least-squares regression means minimizing the sum of the squares of the estimate residuals. For a straight-line model, this gives:

$S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} \left(y_i - a_0 - a_1 x_i\right)^2$

Least-Squares Fit of a Straight Line

Using the model $y = a_0 + a_1 x$, the slope and intercept producing the best fit can be found using:

$a_1 = \dfrac{n \sum x_i y_i - \sum x_i \sum y_i}{n \sum x_i^2 - \left(\sum x_i\right)^2}, \qquad a_0 = \bar{y} - a_1 \bar{x}$

Example

Fit a straight line to force F (N) versus velocity v (m/s) data, tabulated with columns $i$, $x_i$, $y_i$, $x_i^2$, and $x_i y_i$ and a row of sums $\Sigma$ — exactly the terms needed in the formulas above.
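
A hedged sketch of the tabulated computation in MATLAB; the velocity and force values below are hypothetical stand-ins for the slide's data.

    % Hypothetical v-F data standing in for the slide's table
    x = [10 20 30 40 50 60 70 80];          % velocity v (m/s)
    y = [25 70 380 550 610 1220 830 1450];  % force F (N)
    n = length(x);
    sx = sum(x);  sy = sum(y);
    sxy = sum(x .* y);  sxx = sum(x .^ 2);
    a1 = (n*sxy - sx*sy) / (n*sxx - sx^2);  % slope
    a0 = mean(y) - a1*mean(x);              % intercept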

Standard Error of the Estimate

The standard error of the estimate, $s_{y/x} = \sqrt{S_r / (n - 2)}$, quantifies the spread of the data around the regression line. Regression data showing (a) the spread of the data around the mean of the dependent variable and (b) the spread of the data around the best-fit line: the reduction in spread represents the improvement due to linear regression.
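
Continuing the sketch above (same hypothetical x, y, a0, and a1), the standard error can be computed as:

    Sr  = sum((y - (a0 + a1*x)).^2);   % sum of squared residuals
    syx = sqrt(Sr / (length(x) - 2));  % standard error of the estimate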

MATLAB Functions

MATLAB has a built-in function polyfit that fits a least-squares nth-order polynomial to data:

- p = polyfit(x, y, n)
  - x: independent data
  - y: dependent data
  - n: order of polynomial to fit
  - p: coefficients of the polynomial $f(x) = p_1 x^n + p_2 x^{n-1} + \cdots + p_n x + p_{n+1}$

MATLAB's polyval command can be used to compute a value using the coefficients:

- y = polyval(p, x)

Polyfit Function

polyfit can be used to perform regression when the number of data points is much larger than the number of coefficients:

- p = polyfit(x, y, n)
  - x: independent data (Vce, 10 data points)
  - y: dependent data (Ic)
  - n: order of polynomial to fit; n = 1 gives a linear fit
  - p: coefficients of the polynomial (two coefficients for n = 1), $f(x) = p_1 x^n + p_2 x^{n-1} + \cdots + p_n x + p_{n+1}$
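
A minimal sketch of the linear fit described above; the Vce and Ic values are hypothetical stand-ins for the slide's 10 measurements.

    % Hypothetical transistor data: 10 points of Ic (mA) vs. Vce (V)
    Vce = linspace(0.5, 5, 10);              % independent data
    Ic  = 1.8*Vce + 0.4 + 0.1*randn(1, 10);  % dependent data with noise
    p = polyfit(Vce, Ic, 1);                 % n = 1: p = [slope, intercept]
    Ic_fit = polyval(p, Vce);                % evaluate the fitted line
    plot(Vce, Ic, 'o', Vce, Ic_fit, '-')     % compare data and fit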

Polynomial Regression

The least-squares procedure can be readily extended to fit data to a higher-order polynomial. Again, the idea is to minimize the sum of the squares of the estimate residuals. The figure shows the same data fit with:
a) a first-order polynomial
b) a second-order polynomial

Process and Measures of Fit

For a second-order polynomial, the best fit would mean minimizing:

$S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} \left(y_i - a_0 - a_1 x_i - a_2 x_i^2\right)^2$

In general, for an mth-order polynomial, this would mean minimizing:

$S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} \left(y_i - a_0 - a_1 x_i - a_2 x_i^2 - \cdots - a_m x_i^m\right)^2$
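
In MATLAB this reduces to calling polyfit with a higher order; a brief sketch with hypothetical data:

    % Second-order polynomial regression with hypothetical data
    x = 0:5;
    y = [2.1 7.7 13.6 27.2 40.9 61.1];  % hypothetical observations
    p = polyfit(x, y, 2);               % minimizes Sr for a 2nd-order fit
    Sr = sum((y - polyval(p, x)).^2);   % sum of squared residuals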

INTERPOLATION EEE 244

Polynomial Interpolation

You will frequently have occasions to estimate intermediate values between precise data points. The function you use to interpolate must pass through the actual data points; this makes interpolation more restrictive than fitting. The most common method for this purpose is polynomial interpolation, where a unique (n-1)th-order polynomial is determined that passes through n data points:

$f(x) = a_1 + a_2 x + a_3 x^2 + \cdots + a_n x^{n-1}$

Determining Coefficients Using Polyfit

MATLAB's built-in polyfit and polyval commands can also be used for interpolation; all that is required is making sure the order of the fit for n data points is n-1, so the polynomial passes exactly through every point.
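
A small sketch under that rule, using four hypothetical data points:

    % Interpolating polynomial through 4 hypothetical points
    x = [1 2 3 4];
    y = [3.2 2.7 4.5 6.1];             % hypothetical values
    p = polyfit(x, y, length(x) - 1);  % order n-1 = 3: exact fit
    yi = polyval(p, 2.5);              % estimate an intermediate value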

Newton Interpolating Polynomials

Another way to express a polynomial interpolation is to use Newton's interpolating polynomial. The differences between a simple polynomial and Newton's interpolating polynomial for first- and second-order interpolations are:

First order: $f_1(x) = a_1 + a_2 x$ versus $f_1(x) = b_1 + b_2 (x - x_1)$

Second order: $f_2(x) = a_1 + a_2 x + a_3 x^2$ versus $f_2(x) = b_1 + b_2 (x - x_1) + b_3 (x - x_1)(x - x_2)$

Newton Interpolating Polynomials (contd.)

The first-order Newton interpolating polynomial may be obtained from linear interpolation and similar triangles. The resulting formula, based on known points $x_1$ and $x_2$ and the values of the dependent function at those points, is:

$f_1(x) = f(x_1) + \dfrac{f(x_2) - f(x_1)}{x_2 - x_1} (x - x_1)$
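
A minimal sketch of the first-order formula; the two points below (values of ln x) are hypothetical illustration data:

    % First-order Newton interpolation between two known points
    x1 = 1; f1 = log(1);   % ln(1) = 0
    x2 = 6; f2 = log(6);   % ln(6)
    xx = 2;                % point to estimate
    fx = f1 + (f2 - f1)/(x2 - x1) * (xx - x1);  % linear estimate of ln(2)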

Newton Interpolating Polynomials (contd.)

The second-order Newton interpolating polynomial introduces some curvature to the line connecting the points, but still goes through the first two points. The resulting formula, based on known points $x_1$, $x_2$, and $x_3$ and the values of the dependent function at those points, is:

$f_2(x) = b_1 + b_2 (x - x_1) + b_3 (x - x_1)(x - x_2)$

where

$b_1 = f(x_1), \qquad b_2 = \dfrac{f(x_2) - f(x_1)}{x_2 - x_1}, \qquad b_3 = \dfrac{\dfrac{f(x_3) - f(x_2)}{x_3 - x_2} - \dfrac{f(x_2) - f(x_1)}{x_2 - x_1}}{x_3 - x_1}$
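
Extending the previous sketch to second order with a third hypothetical point:

    % Second-order Newton interpolation through three known points
    x = [1 4 6];  f = log(x);   % hypothetical points on ln(x)
    b1 = f(1);
    b2 = (f(2) - f(1)) / (x(2) - x(1));
    b3 = ((f(3) - f(2))/(x(3) - x(2)) - b2) / (x(3) - x(1));
    xx = 2;
    f2 = b1 + b2*(xx - x(1)) + b3*(xx - x(1))*(xx - x(2));  % estimate of ln(2)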

Lagrange Interpolating Polynomials

Another method that uses shifted values to express an interpolating polynomial is the Lagrange interpolating polynomial. The differences between a simple polynomial and the Lagrange interpolating polynomial for first- and second-order polynomials are:

First order: $f_1(x) = L_1 f(x_1) + L_2 f(x_2)$

Second order: $f_2(x) = L_1 f(x_1) + L_2 f(x_2) + L_3 f(x_3)$

where the $L_i$ are weighting coefficients that are functions of x.

Lagrange Interpolating Polynomials (contd.)

The first-order Lagrange interpolating polynomial may be obtained from a weighted combination of two linear interpolations. The resulting formula, based on known points $x_1$ and $x_2$ and the values of the dependent function at those points, is:

$f_1(x) = \dfrac{x - x_2}{x_1 - x_2} f(x_1) + \dfrac{x - x_1}{x_2 - x_1} f(x_2)$

As with Newton's method, the Lagrange version has an estimated error of:

$R \approx f[x, x_n, x_{n-1}, \ldots, x_1] \prod_{i=1}^{n} (x - x_i)$


Lagrange Interpolating Polynomials (contd.)

In general, the Lagrange polynomial interpolation for n points is:

$f_{n-1}(x) = \sum_{i=1}^{n} L_i(x)\, f(x_i)$

where $L_i$ is given by:

$L_i(x) = \prod_{\substack{j=1 \\ j \neq i}}^{n} \dfrac{x - x_j}{x_i - x_j}$
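
A self-contained sketch of this general formula as a MATLAB function; the function name and test values are illustrative, not from the slides.

    % Lagrange interpolation of n points (x, y) at location xx
    % (save as lagrange_interp.m)
    function fx = lagrange_interp(x, y, xx)
        n = length(x);
        fx = 0;
        for i = 1:n
            L = 1;                      % weighting coefficient Li(xx)
            for j = 1:n
                if j ~= i
                    L = L * (xx - x(j)) / (x(i) - x(j));
                end
            end
            fx = fx + L * y(i);
        end
    end

For example, lagrange_interp([1 4 6], log([1 4 6]), 2) reproduces the second-order estimate of ln(2) from the Newton sketch above, since both methods construct the same interpolating polynomial.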