6.7 Practical Problems with Curve Fitting


6.7 Practical Problems with Curve Fitting

- simple conceptual problems
- more complex conceptual problems
- unequal coefficient contributions to chi-square space
- false minima in chi-square space
- "bumpy" chi-square space
- false minima with identical functional forms
- false minima with similar functions
- nearly impossible fits due to noise blurring false minima

6.7 : 1/9

Simple Conceptual Problems

- Curve fitting will not work if the function does not represent the data (look at the residuals!).
- Having too few data points.
- Having too many parameters.
- Minimization of chi-square requires that the noise on y be adequately represented by a normal pdf.
- Poor coefficient estimates when searching chi-square space.
- Not checking the adequacy of the fit by comparing the regression line to the data, or not comparing the standard deviation of the fit to the expected error of the y measurement.
- Using a normal least-squares approach when the x-axis has error [see Mandel, "The Statistical Analysis of Experimental Data," Wiley Interscience, 1964, Chapter 12, p. 288].
- Fitting a straight line to a log-log graph when a slope different from unity makes no physical sense (instead fit the data to an equation of the form y = a0 + x).
- Using curve fitting to enhance the signal-to-noise ratio of poorly designed experiments.

6.7 : 2/9
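The first pitfall above (a function that does not represent the data) is easiest to catch by inspecting the residuals. A minimal sketch, using a made-up example of a deliberately wrong straight-line fit to quadratic data: structured (non-random) residuals signal the wrong model.

```python
import numpy as np

# Hypothetical illustration: fit a straight line to data that are actually
# quadratic, then inspect the residuals for structure.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 1.0 + 0.5 * x**2 + rng.normal(0, 0.5, x.size)   # true model is quadratic

# Normal least-squares straight-line fit (error assumed on y only)
coeffs = np.polyfit(x, y, 1)
residuals = y - np.polyval(coeffs, x)

# Structured residuals: positive at the ends, negative in the middle,
# which indicates the straight-line model is wrong.
print(residuals[:5].mean(), residuals[20:30].mean())
```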

More Complex Conceptual Problems

- Using a function that worked with one range of values to describe a different range of values. As an example, fluorescence is linear at low concentrations but non-linear at high concentrations.
- Using an x-axis spacing so coarse that the measured points miss higher-resolution features.
- Forgetting that assigning data to x-axis bins convolves the original function with the bin width. This makes a fitting function that would be correct in the absence of binning incorrect in the presence of binning.
- Not recognizing that the data have an offset (in either x or y). The presence of an offset must be explicitly taken into account by the use of an additional equation coefficient.
- Not recognizing that the instrument response is convolved with the measured data. As an example, an RC electronic time constant can modify the shape of a Gaussian peak by skewing it.
- Trying to use advanced fitting of non-linear functions when there are too many equation coefficients for the quality of the data.

6.7 : 3/9
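The offset pitfall can be sketched numerically. The model and values here are assumptions for illustration (an exponential decay of amplitude 5 and time constant 1.5 sitting on a baseline of 2): omitting the offset coefficient biases the fitted decay constant high, because the exponential must stretch to cover the baseline.

```python
import numpy as np
from scipy.optimize import curve_fit

# Assumed example: exponential decay on a constant baseline (offset = 2.0).
rng = np.random.default_rng(1)
x = np.linspace(0, 10, 200)
y = 2.0 + 5.0 * np.exp(-x / 1.5) + rng.normal(0, 0.05, x.size)

def no_offset(x, a1, tau):
    # Model that wrongly ignores the baseline
    return a1 * np.exp(-x / tau)

def with_offset(x, a0, a1, tau):
    # Model with an explicit offset coefficient a0
    return a0 + a1 * np.exp(-x / tau)

p_bad, _ = curve_fit(no_offset, x, y, p0=[5, 1.5])
p_good, _ = curve_fit(with_offset, x, y, p0=[1, 5, 1.5])

# The no-offset fit inflates the decay constant; the offset model recovers it.
print(p_bad[1], p_good[2])
```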

Unequal Contributions to Chi-Square Space

For an exponentially damped cosine (a free induction decay, FID), an error in the cosine period has a far larger effect on chi-square than an equal-sized error in the decay constant. This makes mathematical sense, since an error in the period causes the trial cosine to "walk off" the data cosine.

Solution: normalize all parameters by their guess values, so that every normalized parameter is close to unity at the minimum.

6.7 : 4/9
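A sketch of the suggested normalization, using an assumed damped cosine with made-up parameter values: the fit searches dimensionless scale factors that all sit near unity, so the coefficients contribute to chi-square space on comparable footing.

```python
import numpy as np
from scipy.optimize import curve_fit

# Assumed example: damped cosine, true decay constant 2.0 and period 0.8.
rng = np.random.default_rng(2)
t = np.linspace(0, 5, 400)
y = np.exp(-t / 2.0) * np.cos(2 * np.pi * t / 0.8) + rng.normal(0, 0.02, t.size)

guess = {"tau": 2.2, "period": 0.78}   # initial estimates of the parameters

def model_scaled(t, s_tau, s_period):
    # s_tau and s_period are dimensionless scale factors near 1
    return np.exp(-t / (s_tau * guess["tau"])) * np.cos(
        2 * np.pi * t / (s_period * guess["period"]))

p, _ = curve_fit(model_scaled, t, y, p0=[1.0, 1.0])
tau, period = p[0] * guess["tau"], p[1] * guess["period"]
print(tau, period)
```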

False Minima in Chi-Square Space

Some functional forms create false minima far from the true minimum. This puts even more emphasis on the need for exceedingly good initial estimates of the coefficients. In this particular contour graph, violet is the minimum; all of the green "islands" are false minima.

6.7 : 5/9
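The start-dependence can be demonstrated with a small, hypothetical example (a cosine-frequency fit, not the slide's function): two local least-squares searches launched from different guesses converge to different minima.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical demo: the chi-square surface of a cosine-frequency fit has
# false minima away from the true frequency, so the converged answer
# depends on the starting guess.
x = np.linspace(0, 10, 500)
y = np.cos(2 * np.pi * 1.0 * x)          # noiseless data, true frequency 1.0

def resid(p):
    return np.cos(2 * np.pi * p[0] * x) - y

good = least_squares(resid, x0=[0.98]).x[0]  # starts inside the true basin
bad = least_squares(resid, x0=[1.6]).x[0]    # typically trapped near its start

print(good, bad)
```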

False Minima with Identical Functions

Chi-square space for two exponential functions with the same amplitude has two equally deep minima, which differ only by the interchange of the parameters. Here m0 = 3 and m1 = 6. Without a good guess, programs that search chi-square space can become confused.

6.7 : 7/9
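The interchange symmetry can be verified directly. Assuming the model is a sum of two equal-amplitude exponentials (an assumption; the slide does not spell out the functional form), swapping m0 and m1 leaves every model value, and hence chi-square, unchanged:

```python
import numpy as np

# Assumed model: sum of two equal-amplitude exponential decays.
x = np.linspace(0, 10, 100)

def model(x, m0, m1):
    return np.exp(-x / m0) + np.exp(-x / m1)

y1 = model(x, 3.0, 6.0)   # m0 = 3, m1 = 6 (values from the slide)
y2 = model(x, 6.0, 3.0)   # parameters interchanged

# Identical curves -> identical chi-square -> two equally deep minima
print(np.allclose(y1, y2))  # True
```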

False Minima with Similar Functions

With similar functions there will be one true minimum and one or more false minima. As the functions become less similar, the false minima become less deep. In this case it matters which function is associated with which guess: the false minimum will trap ANY method of searching chi-square space.

6.7 : 8/9
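Since a single local search of any kind can be trapped, one common mitigation is a multi-start strategy: run the fit from several guesses and keep the lowest chi-square. A sketch, reusing a hypothetical cosine-frequency fit (not the slide's function):

```python
import numpy as np
from scipy.optimize import least_squares

# Multi-start sketch: launch local searches from several guesses and keep
# the result with the lowest sum of squared residuals.
x = np.linspace(0, 10, 500)
y = np.cos(2 * np.pi * 1.0 * x)   # true frequency 1.0

def resid(p):
    return np.cos(2 * np.pi * p[0] * x) - y

starts = [0.5, 0.8, 1.0, 1.3, 1.6]
fits = [least_squares(resid, x0=[f]) for f in starts]
best = min(fits, key=lambda r: r.cost)   # r.cost = half the squared residual norm

print(best.x[0])
```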

Nearly Impossible Fits

In the presence of noise it becomes extremely difficult to distinguish between the true minimum and a false minimum. The result is high coefficient uncertainties. The contour graphs are for the function shown on the slide, with a0 = 4 and a1 = 5 and two levels of added noise, σ = 0.001 and σ = 0.01.

6.7 : 9/9
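The growth of coefficient uncertainties with noise can be sketched numerically. The slide's function is not reproduced in this transcript, so as a stand-in assume a sum of two unit-amplitude exponentials with a0 = 4 and a1 = 5 (nearly degenerate decay constants, which is what makes the fit so hard); the covariance-matrix uncertainties grow with the noise level.

```python
import numpy as np
from scipy.optimize import curve_fit

# Stand-in model (assumed): two unit-amplitude exponentials with similar
# decay constants a0 = 4 and a1 = 5, fitted at two noise levels.
rng = np.random.default_rng(3)
x = np.linspace(0, 10, 200)

def model(x, a0, a1):
    return np.exp(-x / a0) + np.exp(-x / a1)

uncertainties = {}
for s in (0.001, 0.01):
    y = model(x, 4.0, 5.0) + rng.normal(0, s, x.size)
    p, cov = curve_fit(model, x, y, p0=[3.5, 5.5], maxfev=5000)
    # 1-sigma coefficient uncertainties from the covariance diagonal
    uncertainties[s] = np.sqrt(np.diag(cov))

print(uncertainties)
```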