CpE-310B Engineering Computation and Simulation, Dr. Manal Al-Bzoor


CpE-310B Engineering Computation and Simulation, Dr. Manal Al-Bzoor, Yarmouk University, Computer Engineering Department. Chapter 3: Interpolation and Curve Fitting. (Math 685/CSI 700 Spring 08, George Mason University, Department of Mathematical Sciences.)

Interpolation Basic problem: given data points (xi, yi), i = 1, 2, …, m, with x1 < x2 < … < xm, determine a function f such that f(xi) = yi for i = 1, 2, …, m; f is called the interpolating function for the given data.

Purposes of Interpolation Plotting a smooth curve through discrete data points; reading between the lines of a table; differentiating or integrating tabular data; replacing a complicated function by a simple one.

Interpolation vs Approximation An interpolating function fits the given data points exactly. Interpolation is inappropriate if the data points are subject to significant errors; approximation is usually preferable for smoothing noisy data.

Interpolating Functions Families of functions commonly used for interpolation include polynomials, piecewise polynomials, trigonometric functions, exponential functions, and rational functions. We will focus on interpolation by polynomials and piecewise polynomials for now.

Polynomial Interpolation The simplest type of interpolation uses polynomials. A unique polynomial of degree at most n-1 passes through n data points (xi, yi), i = 1, …, n, where the xi are distinct. There are many ways to represent or compute this polynomial, but in theory all must give the same result.

Lagrangian Polynomials Example We choose 4 points for the third-degree polynomial P3(x) = a x^3 + b x^2 + c x + d. We need to find the coefficients a, b, c, d. They can be found using the methods of the previous chapter, by formulating 4 equations for a, b, c, and d from the given points.

Lagrangian Polynomials A simpler way is to use Lagrangian polynomials. For the cubic polynomial case, 4 points should be available: (x0, f0), (x1, f1), (x2, f2), (x3, f3). The interpolating polynomial is then
P3(x) = (x - x1)(x - x2)(x - x3) / [(x0 - x1)(x0 - x2)(x0 - x3)] f0 + (x - x0)(x - x2)(x - x3) / [(x1 - x0)(x1 - x2)(x1 - x3)] f1 + (x - x0)(x - x1)(x - x3) / [(x2 - x0)(x2 - x1)(x2 - x3)] f2 + (x - x0)(x - x1)(x - x2) / [(x3 - x0)(x3 - x1)(x3 - x2)] f3.
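As a concrete illustration, the following Python sketch evaluates the Lagrange form directly; the function name and the sample points are illustrative only and do not come from the slides.

def lagrange_interpolate(xs, ys, x):
    """Evaluate the Lagrange interpolating polynomial through (xs[i], ys[i]) at x."""
    total = 0.0
    for i in range(len(xs)):
        # Build the i-th Lagrange term ys[i] * L_i(x)
        term = ys[i]
        for j in range(len(xs)):
            if j != i:
                term *= (x - xs[j]) / (xs[i] - xs[j])
        total += term
    return total

# Cubic case with four illustrative points:
print(lagrange_interpolate([1.0, 2.0, 4.0, 5.0], [3.2, 4.8, 9.1, 12.0], 3.0))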

Lagrangian Polynomials Example Find the interpolated value for x = 3.0 using a cubic polynomial fitting the first 4 data points of the table in the previous slides.

Divided Difference Polynomial With the Lagrange form, the interpolating polynomial must be recomputed whenever a data point is added or removed. The divided-differences method avoids this problem and uses fewer arithmetic operations. Divided differences give the same polynomial as Lagrangian interpolation.

Divided Difference Consider the interpolating polynomial written as Pn(x) = a0 + a1(x - x0) + a2(x - x0)(x - x1) + … + an(x - x0)(x - x1)…(x - x_{n-1}). If we choose the ai so that Pn(xi) = f(xi) at the points (xi, fi), i = 0, …, n, then Pn(x) is an interpolating polynomial. The ai's are determined by the divided differences of the tabulated data.

Divided Difference Given data points (xi, yi), i = 0, …, n, the divided differences, denoted f[ ], are defined recursively by f[xi, xi+1, …, xi+k] = ( f[xi+1, …, xi+k] - f[xi, …, xi+k-1] ) / (xi+k - xi), where the zeroth-order divided difference is f[xi] = yi.

Divided Difference Using the standard notation, the lowest-order divided differences can be written as f[x0, x1] = (f1 - f0) / (x1 - x0) and f[x0, x1, x2] = ( f[x1, x2] - f[x0, x1] ) / (x2 - x0).
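A minimal Python sketch of this recursion (illustrative, not taken from the slides) builds the Newton coefficients in place:

def divided_differences(xs, ys):
    """Return the Newton coefficients f[x0], f[x0,x1], ..., f[x0,...,xn]."""
    n = len(xs)
    coef = list(ys)                       # zeroth-order differences: f[xi] = yi
    for k in range(1, n):                 # k-th order differences
        for i in range(n - 1, k - 1, -1):
            coef[i] = (coef[i] - coef[i - 1]) / (xs[i] - xs[i - k])
    return coef                           # coef[k] = f[x0, x1, ..., xk]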

Divided Difference Example

Divided Difference In the Newton-form equation for Pn(x), write the polynomial equations with x = x0, x = x1, x = x2, …, x = xn. Solving them successively gives a0 = f0, a1 = f[x0, x1], a2 = f[x0, x1, x2], and in general ai = f[x0, x1, …, xi].

Divided Difference If Pn(x) is the interpolating polynomial, then it must match the table at all n + 1 points.

Divided Difference Pn(x) can now be written in terms of divided differences: Pn(x) = f[x0] + f[x0, x1](x - x0) + f[x0, x1, x2](x - x0)(x - x1) + … + f[x0, x1, …, xn](x - x0)(x - x1)…(x - x_{n-1}).
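Assuming the divided_differences sketch above, this Newton form can be evaluated with nested multiplication; the following is a hedged sketch, not the slides' own code.

def newton_eval(xs, coef, x):
    """Evaluate Pn(x) = c0 + c1*(x-x0) + ... + cn*(x-x0)...(x-x_{n-1}) in nested form."""
    result = coef[-1]
    for k in range(len(coef) - 2, -1, -1):
        result = result * (x - xs[k]) + coef[k]
    return result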

Divided Difference Using the coefficients obtained in the divided-difference table, the interpolating polynomial of degree 3, P3(x), uses the first four of them. The degree-4 polynomial is found by adding one more term, f[x0, x1, …, x4](x - x0)(x - x1)(x - x2)(x - x3), to P3(x).
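For example, reusing the two sketches above with purely hypothetical data (not the table from the slides), the degree-3 coefficients are simply the first four entries of the degree-4 coefficient list, so adding a fifth point adds one term without recomputing the others:

xs = [1.0, 2.0, 3.0, 5.0, 6.0]            # hypothetical data points
ys = [4.8, 7.5, 12.0, 30.2, 45.1]
coef4 = divided_differences(xs, ys)        # coefficients of the degree-4 polynomial
coef3 = coef4[:4]                          # the degree-3 polynomial reuses the first four
print(newton_eval(xs, coef3, 2.4), newton_eval(xs, coef4, 2.4))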

Divided Differences For a polynomial, the divided-difference table has a characteristic structure: for an nth-degree polynomial Pn(x) whose highest-power term has the coefficient an, the nth divided differences will always be equal to an, and all higher-order differences are zero.

Error of Polynomial Interpolation Interpolation works better for x within the range of the xi's, and the error is smaller if x is centered among the xi. The error term of polynomial interpolation is En(x) = (x - x0)(x - x1)…(x - xn) f^(n+1)(ξ) / (n + 1)!, with ξ in the smallest interval that contains {x, x0, x1, x2, …, xn}. This is not very useful for computing the real error, since f is usually unknown; but if the function is "smooth," a low-degree polynomial should work satisfactorily.

Error Estimation: Next Term Rule The error of the interpolates for f(1.75) using polynomials of degrees one, two, and three can be bounded by taking the derivatives of the original function and evaluating their minimum and maximum within the interval, using the error term above.

Error Estimation: Next Term Rule En(x) is approximately the value of the next term that would be added to Pn(x), i.e., En(x) ≈ f[x0, x1, …, x_{n+1}](x - x0)(x - x1)…(x - xn). For the previous example, this estimate uses the next entry of the divided-difference table.
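A sketch of the rule, assuming the divided_differences helper defined earlier (the function name here is illustrative):

def next_term_error(xs, ys, x, n):
    """Estimate the error of the degree-n Newton polynomial at x by the next term."""
    coef = divided_differences(xs[:n + 2], ys[:n + 2])   # needs n + 2 points
    term = coef[n + 1]                                   # f[x0, ..., x_{n+1}]
    for k in range(n + 1):
        term *= (x - xs[k])
    return term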

Evenly Spaced Data If data are given at evenly spaced intervals, arrange the data with the x values in ascending order. The difference table is then calculated "without dividing by the x differences," using ordinary forward differences: Δfi = f_{i+1} - fi, Δ^2 fi = Δf_{i+1} - Δfi, and so on.

Evenly Spaced Data: difference table Each column of the difference table holds the forward differences of the column to its left, so the row for x0 lists f0, Δf0, Δ^2 f0, …

Polynomial for Evenly Spaced Data The Newton-Gregory forward polynomial passes through equispaced points with a distance h between consecutive points: Pn(x) = f0 + s Δf0 + s(s - 1)/2! Δ^2 f0 + … + s(s - 1)…(s - n + 1)/n! Δ^n f0, where s = (x - x0)/h.
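A minimal Python sketch of this formula, assuming evenly spaced xs (names are illustrative, not from the slides):

def newton_gregory_forward(xs, ys, x):
    """Evaluate the Newton-Gregory forward polynomial for evenly spaced xs at x."""
    n = len(xs)
    h = xs[1] - xs[0]
    s = (x - xs[0]) / h
    # Forward-difference table: diffs[k][0] is the k-th forward difference at x0
    diffs = [list(ys)]
    for k in range(1, n):
        prev = diffs[-1]
        diffs.append([prev[i + 1] - prev[i] for i in range(len(prev) - 1)])
    result = ys[0]
    s_product = 1.0
    factorial = 1.0
    for k in range(1, n):
        s_product *= s - (k - 1)          # s(s-1)...(s-k+1)
        factorial *= k                    # k!
        result += s_product / factorial * diffs[k][0]
    return result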

Polynomial for Evenly Spaced Data For the data in the difference table, write a Newton-Gregory forward polynomial of degree 3 that fits the four points from x = 0.4 to x = 1.0, and use it to interpolate f(0.73). To make the polynomial fit as specified, we must index the x's so that x0 = 0.4; the remaining terms then follow from the forward differences at x0.

Polynomial for Evenly Spaced Data

Least Square Approximation Given a set of (x, y) data points, approximation is the process of finding a function (usually a line or a polynomial) that comes "closest" to the data points. When the data contain "noise," it is not appropriate to look for a line that interpolates them exactly.

Least Square Approximation: Linear Data Assume we have experimental data for the effect of temperature on resistance. The graph of the data suggests a linear relationship.

Least Square Approximation: Linear Data The criterion used to find a and b is to minimize the sum of the squares of the errors, the "least-squares" principle. Let Yi represent an experimental value, and let yi be the value from the equation yi = a xi + b. The least-squares criterion requires minimizing S = Σ (Yi - yi)^2 = Σ (Yi - a xi - b)^2.

Least Square Approximation To find the minimum of S, the partial derivatives ∂S/∂a and ∂S/∂b should be zero. Reducing, we get the normal equations a Σ xi^2 + b Σ xi = Σ xi Yi and a Σ xi + b n = Σ Yi, where n is the number of data points.
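Solving these two normal equations in closed form is straightforward; the following Python sketch is illustrative (the function name is not from the slides):

def linear_least_squares(xs, Ys):
    """Fit y = a*x + b by solving the two normal equations."""
    n = len(xs)
    sum_x = sum(xs)
    sum_y = sum(Ys)
    sum_xx = sum(x * x for x in xs)
    sum_xy = sum(x * y for x, y in zip(xs, Ys))
    # a*sum_xx + b*sum_x = sum_xy  and  a*sum_x + b*n = sum_y
    a = (n * sum_xy - sum_x * sum_y) / (n * sum_xx - sum_x ** 2)
    b = (sum_y - a * sum_x) / n
    return a, b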

Least Square Approximation For the temperature data, Y is R and x is T. Solving the normal equations gives a = 3.395 and b = 702.2, so the fitted line is R = 3.395 T + 702.2.

Least Square Approximation: Nonlinear Data Nonlinear data can be fitted using exponential-type functions. Perform linearization by taking logarithms, and rebuild the table to represent ln y and ln x instead of y and x.
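As a sketch of the linearization step, assume a power-law model y = a*x**b, which is what an ln y versus ln x table corresponds to (for an exponential model y = a*e**(b*x) one would use ln y against x instead). This reuses the linear_least_squares sketch above; nothing here is the slides' own code.

from math import log, exp

def power_law_fit(xs, ys):
    """Fit y = a * x**b by linear least squares on (ln x, ln y)."""
    ln_x = [log(x) for x in xs]
    ln_y = [log(y) for y in ys]
    b, ln_a = linear_least_squares(ln_x, ln_y)   # slope is b, intercept is ln(a)
    return exp(ln_a), b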

Least Square Approximation: Nonlinear Data Polynomial approximation is the most common method used to approximate nonlinear data. We assume the functional relationship to be y = a0 + a1 x + a2 x^2 + … + an x^n. The error at each point is defined as ei = Yi - (a0 + a1 xi + … + an xi^n), and the sum of squares is S = Σ ei^2.

Least Square Approximation: Polynomial approximation of nonlinear data At the minimum, all of the partial derivatives ∂S/∂a0, ∂S/∂a1, …, ∂S/∂an should be zero.

Least Square Approximation: Polynomial approximation of nonlinear data Dividing each equation by -2 and rearranging gives the n + 1 normal equations to be solved simultaneously: a0 N + a1 Σ xi + … + an Σ xi^n = Σ Yi; a0 Σ xi + a1 Σ xi^2 + … + an Σ xi^(n+1) = Σ xi Yi; …; a0 Σ xi^n + a1 Σ xi^(n+1) + … + an Σ xi^(2n) = Σ xi^n Yi, where N is the number of data points.

Least Square Approximation: Polynomial approximation of nonlinear data Putting the previous equations in matrix notation, the normal equations become A a = b, where the (j, k) entry of A is Σ xi^(j+k), the jth entry of b is Σ xi^j Yi, and a = (a0, a1, …, an).
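A minimal Python sketch of forming and solving this matrix system (using NumPy for the linear solve; the function name is illustrative):

import numpy as np

def polynomial_least_squares(xs, Ys, degree):
    """Fit a0 + a1*x + ... + an*x**n by solving the normal equations."""
    xs = np.asarray(xs, dtype=float)
    Ys = np.asarray(Ys, dtype=float)
    # A[j, k] = sum(x**(j + k)),  b[j] = sum(x**j * Y)
    A = np.array([[np.sum(xs ** (j + k)) for k in range(degree + 1)]
                  for j in range(degree + 1)])
    b = np.array([np.sum((xs ** j) * Ys) for j in range(degree + 1)])
    return np.linalg.solve(A, b)               # coefficients a0, a1, ..., an

In practice a numerically safer routine (for example a QR-based fit) is preferred for higher degrees, since the normal-equation matrix can be ill-conditioned.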

Least Square Approximation: Polynomial approximation of nonlinear data Use a quadratic polynomial to fit the data in the following table. We first need to calculate the normal-equation sums Σ xi, Σ xi^2, Σ xi^3, Σ xi^4, Σ Yi, Σ xi Yi, and Σ xi^2 Yi.

Least Square Approximation: Polynomial approximation of nonlinear data Applying these sums in the normal equations gives a 3x3 linear system. Solving this system for the coefficients a0, a1, a2 gives the least-squares quadratic polynomial y = a0 + a1 x + a2 x^2.