ECE3340 Numerical Fitting, Interpolation and Approximation


ECE3340 Numerical Fitting, Interpolation and Approximation. Prof. Han Q. Le. Note: this PPT file is the main outline of the chapter topic; the associated Mathematica file(s) contain details and assignments.

Overview

A problem-centric perspective for this chapter.
The problem:
- too complex for analytic solution
- non-linear relationships (not reducible)
- phenomenological (empirical model) with limited or discrete data
Approaches:
- linearization
- fitting, interpolation, approximation
- finite element methods (FEM)
- finite difference methods (e.g. FDTD)

Data (empirical, simulation, or by design):
- known model, but unknown parameters -> regression fit. Objective: estimate parameters. Key considerations: confidence of model, ANOVA.
- unknown model -> interpolation / approximation (parameterization). Objective: find approximate solutions. Key considerations: sanity check; assess model validity.

Example of a model fit: find the slope and intercept of the data.
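As a minimal sketch of such a fit (Python/NumPy here, though the course materials use Mathematica; the data values are made up), a least-squares straight-line fit recovers the slope and intercept:

```python
import numpy as np

# Hypothetical data roughly following y = 2x + 1
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Least-squares fit of a straight line y = a*x + b
a, b = np.polyfit(x, y, 1)
print(a, b)  # slope near 2, intercept near 1
```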

Example of interpolation/extrapolation: from data, construct an interpolating function (or an extrapolating function, if evaluated outside the data range) as a smooth approximation. When we solve a DE numerically, we already create an interpolation function (remember this?).

Example: graphical design

Linear Regression

Model fit and regression
- Linear regression: single variable; multiple variables
- Linearization of non-linear models: linear-exponential (log-linear); power relationship (log-log); general non-linear
- General model fit: least squares of a linear combination of basis functions
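The last item generalizes linear regression: any model that is a linear combination of basis functions can be fit by linear least squares, even when the basis functions are nonlinear in x. A sketch with an assumed basis {1, x, sin x} and made-up coefficients:

```python
import numpy as np

# Hypothetical model: y = c0 + c1*x + c2*sin(x), noise-free for illustration
x = np.linspace(0.0, 6.0, 25)
y = 0.5 + 1.5 * x + 2.0 * np.sin(x)

# Design matrix whose columns are the basis functions {1, x, sin(x)}
A = np.column_stack([np.ones_like(x), x, np.sin(x)])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coeffs)  # recovers approximately [0.5, 1.5, 2.0]
```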

Linear regression with a single variable

Key concepts
- Model parameters: coefficients; correlation R^2; standard error; covariance matrix, correlation matrix; confidence ellipsoid
- Linear regression statistics: residuals; parameter t-statistics, P-value
- Analysis of variance (ANOVA): dof, sum of squares, mean squares, F-statistic
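Most of these statistics come directly from standard routines. A sketch using `scipy.stats.linregress` on made-up data, reporting the coefficients, R^2, P-value, and standard error:

```python
import numpy as np
from scipy import stats

# Hypothetical data roughly following y = 2x
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 11.9])

res = stats.linregress(x, y)
print(res.slope, res.intercept)  # fitted coefficients
print(res.rvalue**2)             # correlation R^2
print(res.pvalue, res.stderr)    # t-test P-value, standard error of the slope
```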

Covariance matrix of parameters – confidence ellipsoid. Joint distribution of the a and b estimates: the estimates for a and b are not independent; they are related through the means of x and y, so the distribution of their values is not independent. In-class demo: if we know one coefficient by any other means, this changes the estimate for the other coefficient (move the planes).
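The covariance matrix of the fitted parameters can be obtained numerically; a sketch with NumPy's `polyfit(..., cov=True)` on made-up data, where the nonzero off-diagonal entry shows the slope and intercept estimates are correlated:

```python
import numpy as np

# Hypothetical noisy straight-line data
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([0.8, 3.1, 4.9, 7.2, 8.9, 11.1])

# cov=True returns the covariance matrix of the (slope, intercept) estimates
coef, cov = np.polyfit(x, y, 1, cov=True)
print(cov)        # 2x2 covariance matrix
print(cov[0, 1])  # off-diagonal: slope and intercept are (negatively) correlated
```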


Find the growth rate using log-scale plotting
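The log-scale idea can be sketched directly (made-up, noise-free data): for exponential growth y = y0*exp(r*t), taking logarithms gives ln y = ln y0 + r*t, so a linear fit on the log scale yields the growth rate as the slope.

```python
import numpy as np

# Hypothetical exponential growth with rate r = 0.7 and y0 = 3
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 3.0 * np.exp(0.7 * t)

# Straight-line fit of ln(y) vs t: slope = growth rate, intercept = ln(y0)
r, ln_y0 = np.polyfit(t, np.log(y), 1)
print(r, np.exp(ln_y0))  # recovers r ~ 0.7 and y0 ~ 3
```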

Find the log-log correlation
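Similarly for a power relationship (sketch with made-up data): y = c*x^p gives log y = log c + p*log x, so the log-log slope is the exponent p.

```python
import numpy as np

# Hypothetical power-law data with exponent p = 1.5 and prefactor c = 5
x = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
y = 5.0 * x**1.5

# Straight-line fit on log-log scale: slope = exponent, intercept = log(c)
p, log_c = np.polyfit(np.log(x), np.log(y), 1)
print(p, np.exp(log_c))  # recovers p ~ 1.5 and c ~ 5
```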

Another example of correlation http://theincidentaleconomist.com/wordpress/most-important-chart-in-health-policy/


Multivariable linear model example. Consider this example: a water + glucose + fat simulator (intralipid), with an incident laser and measured absorption. Guo B, Wang Y, Wang Y, Le HQ. Mid-infrared laser measurements of aqueous glucose. J Biomed Opt. 2007 Mar-Apr;12(2):024005.

Example of two-variable linear regression: the plane represents the regression fit of the absorption coefficient versus glucose concentration and fat concentration; the residuals are the deviations of the data points from that plane. Guo B, Wang Y, Wang Y, Le HQ. Mid-infrared laser measurements of aqueous glucose. J Biomed Opt. 2007 Mar-Apr;12(2):024005.
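A sketch of a two-variable fit of the same form (the concentrations and coefficients below are invented for illustration, not the paper's data): absorption = b0 + b1*glucose + b2*fat, solved by linear least squares.

```python
import numpy as np

# Hypothetical concentrations and a noise-free two-variable linear model
glucose = np.array([0.0, 1.0, 2.0, 0.0, 1.0, 2.0])
fat     = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])
absorp  = 0.2 + 0.05 * glucose + 0.30 * fat

# Design matrix [1, glucose, fat]; solve for (b0, b1, b2)
A = np.column_stack([np.ones_like(glucose), glucose, fat])
b, *_ = np.linalg.lstsq(A, absorp, rcond=None)
residuals = absorp - A @ b
print(b)  # recovers approximately [0.2, 0.05, 0.3]
```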

Interpolation/Extrapolation

Estimate the function value at a point not in the data set; this is extrapolation if the point is outside the data range. It is never a good idea to extrapolate if there is no information or guiding model for how the function should behave outside the known range.

Common methods for interpolating data:
- Global (over all points) polynomial: Newton, Lagrange, Chebyshev, Hermite. These have limited use because of potentially large errors.
- Piecewise (by segments) adaptive: spline. Splines avoid the errors of the polynomial method and have grown rapidly and spread widely in numerous fields, especially with the emergence of computer graphics.
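The "potentially large errors" of global polynomial interpolation can be seen on Runge's classic example 1/(1 + 25x^2); a SciPy sketch comparing a global Lagrange interpolant against a cubic spline at the same nodes:

```python
import numpy as np
from scipy.interpolate import lagrange, CubicSpline

# Runge's function sampled at 11 equally spaced points on [-1, 1]
x = np.linspace(-1.0, 1.0, 11)
y = 1.0 / (1.0 + 25.0 * x**2)

poly = lagrange(x, y)        # single global degree-10 polynomial
spline = CubicSpline(x, y)   # piecewise cubic interpolant

xt = 0.95                    # a point near the edge of the interval
exact = 1.0 / (1.0 + 25.0 * xt**2)
err_poly = abs(poly(xt) - exact)
err_spline = abs(spline(xt) - exact)
print(err_poly, err_spline)  # polynomial error is far larger (Runge phenomenon)
```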

Spline introduction

Model building - Spline and smooth curve design

How do we draw with common computer software like PPT? By controlling the knots and the one-sided derivative at each knot; a smooth spline curve is generated between the knots.

General concepts. A spline is an interpolating approximation that:
- is efficient and avoids the error and deficiency of the polynomial methods (especially the high-order oscillatory behavior);
- is inspired by the old tried-and-true draftsman's spline technique (controllable derivatives).
Key ideas:
- piecewise interpolation: each segment between "knots" (data points) is approximated independently of knots far away;
- a constraint to limit oscillatory behavior: a roughness penalty forces the fit to minimize highly oscillatory or rough behavior, equivalent to minimizing the spline strain energy. The objective combines a term for least-squares data fitting with a term for least "roughness".
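The "controllable derivatives" idea can be sketched with SciPy's `CubicSpline` using clamped end slopes (knot positions and slope values below are made up): the curve passes through every knot while the endpoint derivatives are forced to chosen values.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical knots
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([0.0, 2.0, 1.0, 3.0])

# Clamp the first derivative to 0 at both ends (bc_type: (order, value) per end)
cs = CubicSpline(x, y, bc_type=((1, 0.0), (1, 0.0)))
print(cs(x))                      # interpolates: passes through every knot
print(cs(x[0], 1), cs(x[-1], 1))  # endpoint first derivatives are 0 as imposed
```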

Applications of spline
- Computer graphics, including curve drawing.
- Data fitting and unknown model building, with an objective combining least-squares fit and least "roughness": select a criterion for balancing the two terms (can be subjective). For pure interpolation and design, the first (least-squares) term = 0.
- Criterion on the power order of each segment: cubic (3rd order) is the most tried-and-true spline function.
- Smoothness criterion: the order of derivative continuity at each knot.
- A basis function for each segment, with the total function a linear combination of basis functions: B-spline.
- Many types of splines have been developed for different requirements, especially for design (e.g. Bezier spline functions).
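The balance between the least-squares term and the roughness term can be sketched with SciPy's smoothing spline, whose parameter s bounds the allowed sum of squared residuals (the data below are synthetic):

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Noisy samples of sin(x) (synthetic data, fixed seed for reproducibility)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = np.sin(x) + 0.1 * rng.standard_normal(50)

interp = UnivariateSpline(x, y, s=0)    # s=0: pure interpolation through all points
smooth = UnivariateSpline(x, y, s=0.5)  # s>0: roughness penalized, smoother fit

print(interp.get_residual())  # ~0: no data-fit error, all roughness kept
print(smooth.get_residual())  # > 0: some fidelity traded for smoothness
```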

B-spline basis functions. Basic concept: approximate a function defined on a sequence of knots {x0, x1, x2, …, xn} by treating it as a linear combination of a set of basis functions. Each basis function is localized on a sub-sequence of a certain number of knots (which defines a partition) within the global knot sequence. The basis functions are designed for smoothness, with increasing order, where the partitions join.
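A single such localized basis function can be inspected with SciPy (the knot sub-sequence [0, 1, 2, 3, 4] is an illustrative choice): the cubic B-spline is a smooth bump supported only on those knots.

```python
import numpy as np
from scipy.interpolate import BSpline

# One cubic B-spline basis function on the local knot sub-sequence [0, 1, 2, 3, 4];
# it is nonzero only inside this local support.
b = BSpline.basis_element([0, 1, 2, 3, 4])

print(b(2.0))          # peak value: 2/3 for the uniform cubic B-spline
print(b(1.0), b(3.0))  # 1/6 at the interior knots on either side
```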

B-spline basis – illustration for orders m = 2, 3, 4

Spline applications

Summary (takeaways)
- Least-squares regression is used to fit experimental data to models. Key metric: confidence of the model parameter estimates (fit-coefficient statistics).
- When data are not available, or the solutions are known only at a limited number of points, interpolation/extrapolation can be used to fill in the gaps.
- Interpolation: polynomial and cubic spline are the two main methods. Splines allow constraints on derivatives.
- Extrapolation is intrinsically risky, but can be applied if there is sufficient understanding of the problem.
- For designing a solution, the spline method can be used with considerations of smoothness and minimized derivatives to mimic physically reasonable solutions.