Chapter 9 Function Approximation


Chapter 9 Function Approximation
Korea University, Department of Computer Science, Speech Information Processing Lab
조영규 (2003010594), 방규섭 (2004020594), 정재연 (2005020594)

Contents
9.1 Least Squares Approximation
9.2 Continuous Least Squares
9.3 Function Approximation at a Point
9.4 Using Matlab’s Functions

9.1 Least Squares Approximation
Some of the most common methods of approximating data are based on minimizing some measure of the difference between the approximating function and the given data points. The method of least squares minimizes the sum of the squares of the differences between the function values and the data values.
Advantages of using the square of the difference at each point:
- Positive differences do not cancel negative differences.
- Differentiation is not difficult.
- Small differences become smaller and large differences are magnified.
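
As a sketch of the criterion being minimized (the symbols g, N, and E are notation chosen here for illustration, not taken from the slides): if g(x) is the approximating function and (x_i, y_i) are the data points, the least squares error is

E = \sum_{i=1}^{N} \bigl( g(x_i) - y_i \bigr)^2 .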

9.1.1 Linear Least Squares Approximation
Example 9.1 Linear Approximation to Four Points

9.1.1 Linear Least Squares Approximation - Discussion (1/2)
Normal equations for linear least squares approximation (1/2)

9.1.1 Linear Least Squares Approximation - Discussion (2/2)
Normal equations for linear least squares approximation (2/2)
Matlab function for linear least squares approximation
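
The textbook's Matlab function is not reproduced in the transcript; the following is a minimal sketch (the function name and interface are placeholders, not the book's) of forming and solving the 2-by-2 normal equations for the best-fit line y = a*x + b:

function c = linear_lsq(x, y)
% LINEAR_LSQ  Least squares straight-line fit (illustrative sketch).
%   c = linear_lsq(x, y) returns c = [a; b] such that a*x + b fits y
%   in the least squares sense.
x = x(:);  y = y(:);
n = numel(x);
A = [sum(x.^2)  sum(x);          % normal equations A*c = rhs
     sum(x)     n      ];
rhs = [sum(x.*y); sum(y)];
c = A \ rhs;
end

For example, with the hypothetical data x = [1 2 3 4], y = [2 3 5 6], the call linear_lsq(x, y) returns [1.4; 0.5], i.e. the line 1.4x + 0.5.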

Example 9.2 Least Squares Straight Line to Fit Four Data Points

Example 9.3 Noisy Straight-Line Data

9.1.2 Quadratic Least Squares Approximation - Discussion (1/2)
Normal equations for quadratic least squares approximation

9.1.2 Quadratic Least Squares Approximation - Discussion (2/2)
Matlab function for quadratic least squares approximation
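
Again the book's function is not reproduced here; a minimal sketch (names are placeholders) that builds and solves the 3-by-3 normal equations for the best-fit parabola a*x^2 + b*x + c:

function z = quadratic_lsq(x, y)
% QUADRATIC_LSQ  Least squares parabola fit (illustrative sketch).
%   z = quadratic_lsq(x, y) returns z = [a; b; c] so that
%   a*x.^2 + b*x + c fits y in the least squares sense.
x = x(:);  y = y(:);
n = numel(x);
A = [sum(x.^4)  sum(x.^3)  sum(x.^2);
     sum(x.^3)  sum(x.^2)  sum(x);
     sum(x.^2)  sum(x)     n        ];
b = [sum(x.^2 .* y); sum(x .* y); sum(y)];
z = A \ b;
end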

Example 9.6 Oil Reservoir
Using the Matlab function for quadratic least squares, we find that the normal equations are Az = b; solving them gives the coefficients z of the quadratic.

9.1.3 Cubic Least Squares Approximation

Example 9.7 Cubic Least Squares

Example 9.8 Cubic Least Squares, Continued

9.1.4 Least Squares Approximation for Other Functional Forms
If the data are best fit by an exponential function, it is convenient instead to fit the logarithm of the data with a straight line.
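
As an illustration of this idea (assuming a model of the form y = C*exp(k*x), one common choice; the data and variable names below are hypothetical, not from the text), taking logarithms gives log(y) = log(C) + k*x, a straight line that can be fit by linear least squares:

% Fit y = C*exp(k*x) by a straight-line fit to log(y) (illustrative sketch)
x = 0 : 0.5 : 3;                                      % hypothetical data
y = 2.0 * exp(0.8 * x) .* (1 + 0.02*randn(size(x)));  % noisy exponential
p = polyfit(x, log(y), 1);    % p(1) = slope k, p(2) = intercept log(C)
k = p(1);
C = exp(p(2));
yfit = C * exp(k * x);        % fitted values for comparison with y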

Example 9.10 Least-Squares Approximation of a Reciprocal Relation
The plot of the following data suggests that they could be fit by a function of the form y = 1/(ax + b):
x = [0 0.5 1 1.5 2]
y = [1.00 0.50 0.30 0.20 0.20]
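
Since y = 1/(ax + b) is equivalent to 1/y = ax + b, the reciprocal of the data can be fit with a straight line; a minimal sketch using the data above:

% Fit y = 1/(a*x + b) by fitting 1./y with a straight line
x = [0 0.5 1 1.5 2];
y = [1.00 0.50 0.30 0.20 0.20];
p = polyfit(x, 1 ./ y, 1);    % p(1) = a, p(2) = b
a = p(1);  b = p(2);
yfit = 1 ./ (a * x + b);      % fitted values at the data points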

9.2 Continuous Least Squares
To approximate the value of a function at all points in an interval, the summations are replaced by the corresponding integrals. To approximate a given function by a quadratic function on the interval [0, 1] or [-1, 1], we minimize the following quantity.
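
In generic notation (the coefficient names c0, c1, c2 are chosen here for illustration), the quantity minimized is the integrated squared error over the interval [a, b]:

E(c_0, c_1, c_2) = \int_{a}^{b} \bigl( f(x) - (c_0 + c_1 x + c_2 x^2) \bigr)^2 \, dx .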

Example 9.12 Continuous Least Squares
To find the continuous least squares quadratic approximation to the exponential function on [-1, 1], we set up the normal equations; the coefficient matrix contains the integrals of products of the basis functions 1, x, and x^2 over [-1, 1].
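
A minimal numerical sketch of this computation (not the book's code): the coefficient matrix entries are the integrals of x^(i+j) over [-1, 1], the right-hand side contains the integrals of x^i * exp(x), and solving the 3-by-3 system gives the coefficients of c0 + c1*x + c2*x^2.

% Continuous least squares quadratic approximation to exp(x) on [-1, 1]
f = @(x) exp(x);
A = zeros(3);  b = zeros(3, 1);
for i = 0:2
    for j = 0:2
        A(i+1, j+1) = integral(@(x) x.^(i+j), -1, 1);   % basis inner products
    end
    b(i+1) = integral(@(x) x.^i .* f(x), -1, 1);        % <x^i, f>
end
c = A \ b;          % c(1) + c(2)*x + c(3)*x^2 is the approximation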

Example 9.12 Continuous Least Squares (cont.)
[Figure: exponential function]

9.2.1 Continuous Least Squares with Orthogonal Polynomials
The set of functions {f0, f1, f2, ..., fn} is linearly independent on the interval [a, b] if a linear combination of the functions is the zero function only if all of the coefficients are zero. In other words, if c0*f0(x) + c1*f1(x) + c2*f2(x) + ... + cn*fn(x) = 0 for all x in [a, b], then c0 = c1 = c2 = ... = cn = 0.
Examples of linearly independent sets of functions:
- f0 = 1, f1 = x, f2 = x^2, ..., fn = x^n
- {p0, p1, p2, ..., pn}, where pj is a polynomial of degree j
- {1, sin(x), cos(x), sin(2x), cos(2x), ..., sin(nx), cos(nx)}
The set of functions {f0, f1, f2, ..., fn} is orthogonal on [a, b] if the integral of fi(x)*fj(x) over [a, b] is zero whenever i ≠ j and equals some dj > 0 when i = j. In particular, if the functions are orthogonal and dj = 1 for all j, the functions are called orthonormal.

9.2.2 Gram-Schmidt Process
How to construct a sequence of polynomials that are orthogonal on the interval [a, b]. Conditions (for n = 0, 1, 2, ...):
1. pn is a polynomial of degree n.
2. The coefficient of x^n in pn is positive.
3. Each pn is normalized so that the integral of pn(x)^2 over [a, b] equals 1.
We start by taking p0(x) to be a positive constant, chosen to satisfy condition 3.
To construct p1(x), we begin by letting q1(x) = x + c1,0*p0(x), where c1,0 is chosen so that q1(x) is orthogonal to p0.

9.2.2 Gram-Schmidt Process (cont.)
To satisfy condition 3, we normalize q1(x) to obtain p1(x).
To construct p2(x), we begin by letting q2(x) = x^2 + c2,1*p1(x) + c2,0*p0(x), where the coefficients are chosen so that q2(x) is orthogonal to p0(x) and p1(x). Normalizing q2(x) gives p2(x).

9.2.2 Gram-Schmidt Process (cont.)
The process continues in the same manner as we construct each higher-degree polynomial in turn, with
qn(x) = x^n + cn,0*p0(x) + cn,1*p1(x) + ... + cn,n-1*pn-1(x),
where the coefficients cn,0, ..., cn,n-1 are found so that qn(x) is orthogonal to each of the previously generated polynomials.
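
A minimal numerical sketch of this construction (illustrative only, not the textbook's code): polynomials are stored as Matlab coefficient vectors, inner products are integrals of products over [a, b] computed with integral, and each q_n is x^n minus its projections onto the previously built polynomials, then normalized.

% Gram-Schmidt construction of polynomials orthonormal on [a, b] (sketch)
a = -1;  b = 1;                 % interval (here [-1, 1] for illustration)
N = 3;                          % highest degree to construct
P = cell(N+1, 1);               % P{n+1} holds the coefficients of p_n
ip = @(p, q) integral(@(x) polyval(p, x) .* polyval(q, x), a, b);
for n = 0:N
    q = [1, zeros(1, n)];       % coefficients of x^n
    for k = 0:n-1               % subtract projections onto p_0 .. p_{n-1}
        c = ip(q, P{k+1});      % p_k is normalized, so <q, p_k> is the projection coefficient
        q = q - c * [zeros(1, n-k), P{k+1}];   % pad p_k to degree n and subtract
    end
    P{n+1} = q / sqrt(ip(q, q));               % normalize so that <p_n, p_n> = 1
end
% For [a, b] = [-1, 1], the p_n are scalar multiples of the Legendre polynomials.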

9.2.3 Legendre Polynomials
A function of x defined on any finite interval a ≤ x ≤ b can be transformed to a function of t defined on -1 ≤ t ≤ 1, so the continuous least squares approximation can be carried out in terms of Legendre polynomials. The Legendre polynomials form an orthogonal set on [-1, 1]; they can be normalized, and an alternative definition of the Legendre polynomials is also available.
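
For reference (standard facts, written here in LaTeX since the slide's equations were not captured): the change of variable mapping [a, b] to [-1, 1], the first few Legendre polynomials, and their orthogonality relation are

t = \frac{2x - a - b}{b - a}, \qquad
P_0(t) = 1, \quad P_1(t) = t, \quad
P_2(t) = \tfrac{1}{2}(3t^2 - 1), \quad
P_3(t) = \tfrac{1}{2}(5t^3 - 3t),
\qquad
\int_{-1}^{1} P_m(t)\, P_n(t)\, dt = \frac{2}{2n+1}\,\delta_{mn}.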

9.2.4 Least Squares Approximation with Legendre Polynomials
To find the quadratic least squares approximation to f(x) on the interval [-1, 1], we need to determine the coefficients c0, c1, c2 that minimize the integrated squared error, setting its partial derivative with respect to c0 equal to zero.

9.2.4 Least Squares Approximation with Legendre Polynomials (cont.)
In a similar manner, equations are formed by setting the partial derivatives with respect to c1 and c2 equal to zero. The integrations on the left were performed earlier; because the Legendre polynomials are orthogonal, each equation involves only one coefficient.
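
In standard form (using the classical Legendre polynomials P_j normalized by P_j(1) = 1, which may differ by constant factors from the book's p_j), the resulting coefficients are

c_j = \frac{2j+1}{2} \int_{-1}^{1} f(x)\, P_j(x)\, dx, \qquad j = 0, 1, 2.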

Example 9.13 Least Squares Approximation Using Legendre Polynomials
To find the quadratic least squares approximation to f(x) = e^x on the interval [-1, 1] in terms of the Legendre polynomials, write g(x) = c0*p0(x) + c1*p1(x) + c2*p2(x), where p0(x) = 1, p1(x) = x, and p2(x) = x^2 - 1/3.

Example 9.13 Least Squares Approximation Using Legendre Polynomials (cont.)
[Figure: exponential function]
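
A minimal sketch of the computation (illustrative, using the orthogonal polynomials p0 = 1, p1 = x, p2 = x^2 - 1/3 from the example; each coefficient is the ratio of inner products <f, p_j>/<p_j, p_j>, evaluated here with numerical integration):

% Quadratic least squares approximation of exp(x) on [-1, 1]
% using the orthogonal polynomials p0 = 1, p1 = x, p2 = x^2 - 1/3
f = @(x) exp(x);
p = {@(x) ones(size(x)), @(x) x, @(x) x.^2 - 1/3};
c = zeros(1, 3);
for j = 1:3
    num  = integral(@(x) f(x) .* p{j}(x), -1, 1);   % <f, p_j>
    den  = integral(@(x) p{j}(x).^2,      -1, 1);   % <p_j, p_j>
    c(j) = num / den;
end
g = @(x) c(1)*p{1}(x) + c(2)*p{2}(x) + c(3)*p{3}(x);   % the approximation
% Quick check of the fit at a few points:
disp([f(-1 : 0.5 : 1); g(-1 : 0.5 : 1)])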

Example 9.14 Padé Approximation of the Runge Function
For the Runge function f(x) = 1/(1 + 25x^2), the value and first three derivatives at x = 0 are f(0) = 1, f′(0) = 0, f″(0) = -50, f‴(0) = 0, which give the Taylor polynomial of the function at x = 0. We seek a rational-function representation using k = 3, m = 1, and n = 2. For k = 3, the linear system of equations determines the numerator and denominator coefficients.
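
A minimal sketch of how such a system can be set up and solved (a generic Padé construction, not necessarily the book's exact arrangement): the rational function R(x) = (a0 + a1*x)/(1 + b1*x + b2*x^2) is matched to the Taylor coefficients c0..c3 of f.

% Pade approximation of the Runge function from its Taylor coefficients
c = [1 0 -25 0];              % Taylor coefficients c0..c3 of 1/(1+25x^2) at x = 0
m = 1;  n = 2;                % numerator degree m, denominator degree n
% Denominator coefficients b1..bn from the matching conditions for x^(m+1)..x^(m+n):
%   c(k) + sum_j b_j * c(k-j) = 0
A = zeros(n);  rhs = zeros(n, 1);
for row = 1:n
    k = m + row;                         % power of x being matched
    for j = 1:n
        if k - j >= 0
            A(row, j) = c(k - j + 1);    % coefficient multiplying b_j
        end
    end
    rhs(row) = -c(k + 1);
end
b = A \ rhs;                             % b = [b1; b2]
% Numerator coefficients a0..am from the conditions for x^0..x^m
a = zeros(m+1, 1);
for k = 0:m
    a(k+1) = c(k+1);
    for j = 1:min(k, n)
        a(k+1) = a(k+1) + b(j) * c(k - j + 1);
    end
end
% Here the result is a = [1; 0], b = [0; 25], i.e. R(x) = 1/(1 + 25x^2),
% which reproduces the Runge function exactly.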

9.4 Using Matlab’s Functions

% script for polynomial regression
% generate data
x = -3 : 0.1 : 3;
y1 = sin(x);
y2 = cos(2*x);
yy = y1 + y2;
y = 0.01*round(100*yy);        % round the data to two decimal places
% find least squares 6th-degree polynomial to fit data
z = polyfit(x, y, 6)
% evaluate the polynomial
p = polyval(z, x);
% plot the polynomial
plot(x, p)
hold on
% plot the data
plot(x, y, '+')
hold off