R. Kass/SP07 P416 Lecture 4 — Propagation of Errors (Chapter 3, Taylor)

Introduction
Example: Suppose we measure the current (I) and resistance (R) of a resistor. Ohm's law relates V and I: V = IR. If we know the uncertainties (e.g. standard deviations) in I and R, what is the uncertainty in V?

More formally, given a functional relationship between several measured variables (x, y, z), Q = f(x, y, z), what is the uncertainty in Q if the uncertainties in x, y, and z are known? To answer this question we use a technique called Propagation of Errors.

Usually when we talk about uncertainties in a measured variable such as x, we write x ± σ_x. In most cases we assume that the uncertainty is "Gaussian" in the sense that 68% of the time we expect the "true" (but unknown) value of x to lie in the interval [x − σ_x, x + σ_x]. BUT not all measurements can be represented by Gaussian distributions (more on that later)!

Propagation of Error Formula
To calculate the variance in Q as a function of the variances in x and y we use the following:

    σ_Q² = (∂Q/∂x)² σ_x² + (∂Q/∂y)² σ_y² + 2 (∂Q/∂x)(∂Q/∂y) σ_xy

If the variables x and y are uncorrelated, then σ_xy = 0 and the last term in the above equation is zero.
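A minimal numerical sketch of the uncorrelated formula applied to the Ohm's-law question above. The lecture gives no values for I and R at this point, so the numbers below are borrowed from the power example later in the lecture and are purely illustrative.

```python
import numpy as np

def propagate_uncorrelated(partials, sigmas):
    """sigma_Q = sqrt( sum_i (dQ/dx_i)^2 * sigma_i^2 ) for uncorrelated inputs."""
    return np.sqrt(sum((p * s) ** 2 for p, s in zip(partials, sigmas)))

# Ohm's law: V = I*R, so dV/dI = R and dV/dR = I.
# Illustrative values (not given on this slide):
I, sigma_I = 1.0, 0.1     # amps
R, sigma_R = 10.0, 1.0    # ohms
V = I * R
sigma_V = propagate_uncorrelated(partials=[R, I], sigmas=[sigma_I, sigma_R])
print(f"V = {V:.1f} +/- {sigma_V:.1f} V")   # V = 10.0 +/- 1.4 V
```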

We can derive the above formula as follows. Assume we have several measurements of the quantities x (x_1, x_2, ..., x_N) and y (y_1, y_2, ..., y_N), with averages

    μ_x = (1/N) Σ_i x_i    and    μ_y = (1/N) Σ_i y_i

Define Q_i = Q(x_i, y_i), and evaluate Q and its derivatives at the average values. Expand Q_i about the average values:

    Q_i = Q(μ_x, μ_y) + (x_i − μ_x)(∂Q/∂x) + (y_i − μ_y)(∂Q/∂y) + higher-order terms

Assume the measured values are close to the average values, and neglect the higher-order terms. Since the derivatives are evaluated at the average values (μ_x, μ_y) we can pull them out of the summation:

    σ_Q² = (1/N) Σ_i (Q_i − Q(μ_x, μ_y))²
         = (1/N) Σ_i [ (x_i − μ_x)(∂Q/∂x) + (y_i − μ_y)(∂Q/∂y) ]²
         = (∂Q/∂x)² σ_x² + (∂Q/∂y)² σ_y² + 2 (∂Q/∂x)(∂Q/∂y) (1/N) Σ_i (x_i − μ_x)(y_i − μ_y)

If the measurements are uncorrelated, the cross-term summation in the above equation is zero, leaving the uncorrelated-errors formula:

    σ_Q² = (∂Q/∂x)² σ_x² + (∂Q/∂y)² σ_y²
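The derivation above keeps only the first-order terms in the deviations from the averages. A quick Monte Carlo sketch (not part of the lecture; the choice f(x, y) = x·y and the numbers are arbitrary assumptions) shows that the propagated σ_Q agrees well with the observed spread of Q when the fractional uncertainties are small.

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw uncorrelated Gaussian x and y, form Q = f(x, y) = x*y, and compare the
# sample standard deviation of Q with the first-order propagation prediction.
mu_x, sigma_x = 10.0, 0.2
mu_y, sigma_y = 5.0, 0.1
N = 1_000_000

x = rng.normal(mu_x, sigma_x, N)
y = rng.normal(mu_y, sigma_y, N)
Q = x * y

# First-order prediction: sigma_Q^2 = (dQ/dx)^2 sigma_x^2 + (dQ/dy)^2 sigma_y^2,
# with dQ/dx = y and dQ/dy = x evaluated at the average values.
sigma_Q_prop = np.sqrt((mu_y * sigma_x) ** 2 + (mu_x * sigma_y) ** 2)
print(f"propagated sigma_Q = {sigma_Q_prop:.3f}, Monte Carlo spread = {Q.std():.3f}")
```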

If x and y are correlated (correlated errors), define σ_xy as:

    σ_xy = (1/N) Σ_i (x_i − μ_x)(y_i − μ_y)

Example: Power in an electric circuit, P = I²R.
Let I = 1.0 ± 0.1 amp and R = 10.0 ± 1.0 Ω, so P = 10 watts. Calculate the variance in the power using propagation of errors, assuming I and R are uncorrelated:

    σ_P² = (∂P/∂I)² σ_I² + (∂P/∂R)² σ_R² = (2IR)² σ_I² + (I²)² σ_R² = (20)²(0.1)² + (1)²(1.0)² = 5 W²

so P = 10 ± 2 watts. If the true value of the power was 10 W and we measured it many times with an uncertainty (σ) of ± 2 W, and Gaussian statistics apply, then 68% of the measurements would lie in the range [8, 12] watts.

Sometimes it is convenient to put the above calculation in terms of relative errors:

    (σ_P/P)² = 4(σ_I/I)² + (σ_R/R)² = 4(0.1)² + (0.1)² = 0.04 + 0.01

In this example the uncertainty in the current dominates the uncertainty in the power! Thus the current must be measured more precisely if we want to reduce the uncertainty in the power.
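A short sketch reproducing the power-example numbers above, assuming (as the slide does) that I and R are uncorrelated.

```python
import numpy as np

# P = I^2 * R with I = 1.0 +/- 0.1 A and R = 10.0 +/- 1.0 ohm
I, sigma_I = 1.0, 0.1
R, sigma_R = 10.0, 1.0
P = I**2 * R                                           # 10 W

# Propagation of errors (uncorrelated): dP/dI = 2IR, dP/dR = I^2
sigma_P = np.sqrt((2 * I * R * sigma_I) ** 2 + (I**2 * sigma_R) ** 2)
print(f"P = {P:.0f} +/- {sigma_P:.1f} W")              # P = 10 +/- 2.2 W

# Same calculation in relative errors: (sigma_P/P)^2 = 4(sigma_I/I)^2 + (sigma_R/R)^2
current_term = 4 * (sigma_I / I) ** 2                  # 0.04 -> dominates
resistance_term = (sigma_R / R) ** 2                   # 0.01
print(f"(sigma_P/P)^2 = {current_term + resistance_term:.2f} "
      f"(current: {current_term:.2f}, resistance: {resistance_term:.2f})")
```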

Example: The error in the average ("error in the mean").
The average of several measurements, each with the same uncertainty (σ), is given by:

    μ = (1/N)(x_1 + x_2 + ... + x_N)

Applying propagation of errors with ∂μ/∂x_i = 1/N gives the "error in the mean":

    σ_μ² = Σ_i (∂μ/∂x_i)² σ² = N (1/N)² σ² = σ²/N,    i.e.  σ_μ = σ/√N

We can determine the mean better by combining measurements, but the precision only increases as the square root of the number of measurements.
- Do not confuse σ_μ with σ!
- σ is related to the width of the pdf (e.g. Gaussian) that the measurements come from.
- σ does not get smaller as we combine measurements.

It can be shown that if a function is of the form f(x, y, z) = x^a y^b z^c, then the relative variance of f(x, y, z) is (a generalization of Eq. 3.18 in Taylor):

    (σ_f/f)² = a²(σ_x/x)² + b²(σ_y/y)² + c²(σ_z/z)²

The problem with the Propagation of Errors technique: in calculating the variance using propagation of errors we usually assume the error in a measured variable (e.g. x) is Gaussian. BUT even if x is described by a Gaussian distribution, f(x) may not be described by a Gaussian distribution!
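A numerical sketch of the 1/√N behaviour of the error in the mean. The parent mean and σ below are illustrative choices, not values from the lecture.

```python
import numpy as np

rng = np.random.default_rng(1)

sigma = 2.0          # width of the parent pdf; this does NOT shrink with N
for N in (4, 16, 100):
    # many repeated "experiments", each consisting of N measurements
    means = rng.normal(10.0, sigma, size=(100_000, N)).mean(axis=1)
    print(f"N = {N:3d}: observed spread of the mean = {means.std():.3f}, "
          f"sigma/sqrt(N) = {sigma / np.sqrt(N):.3f}")
```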

What does the standard deviation that we calculate from propagation of errors mean?

Example: The new distribution is Gaussian.
Let y = Ax, with A a constant and x a Gaussian variable. Then μ_y = A μ_x and σ_y = A σ_x. Let the probability distribution for x be Gaussian:

    p(x, μ_x, σ_x) = (1/(σ_x √(2π))) exp[−(x − μ_x)²/(2σ_x²)]

Substituting x = y/A, the new probability distribution for y, p(y, μ_y, σ_y), is also described by a Gaussian.

[Figure: dN/dy for y = 2x with x = 10 ± 2. Start with a Gaussian with μ = 10, σ = 2; get another Gaussian with μ_y = 2μ_x = 20, σ_y = 2σ_x = 4.]
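A sketch of the linear case shown in the figure (y = 2x with x = 10 ± 2), checking numerically that the transformed distribution is again Gaussian with μ_y = 20 and σ_y = 4. The shape checks via skewness and kurtosis are my addition, not part of the lecture.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

x = rng.normal(10.0, 2.0, 1_000_000)   # Gaussian x with mu = 10, sigma = 2
y = 2.0 * x                            # linear transformation y = Ax, A = 2

print(f"mean(y) = {y.mean():.2f}, std(y) = {y.std():.2f}")   # ~20, ~4
# Skewness and excess kurtosis of a Gaussian are both zero; y stays Gaussian.
print(f"skew(y) = {stats.skew(y):+.3f}, excess kurtosis = {stats.kurtosis(y):+.3f}")
```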

Example: When the new distribution is non-Gaussian: y = 2/x.
The transformed probability distribution function for y does not have the form of a Gaussian.

[Figure: dN/dy for y = 2/x with x = 10 ± 2; propagation of errors gives σ_y = (2/μ_x²)σ_x. Start with a Gaussian with μ = 10, σ = 2; DO NOT get another Gaussian! Get a pdf with μ ≈ 0.2, σ ≈ 0.04. This new pdf has longer tails than a Gaussian pdf: Prob(y > μ_y + 5σ_y) = 5×10⁻³, whereas a Gaussian gives ≈ 3×10⁻⁷.]

Unphysical situations can arise if we use the propagation of errors results blindly!
Example: Suppose we measure the volume of a cylinder: V = πR²L. Let R = 1 cm exactly, and L = 1.0 ± 0.5 cm. Using propagation of errors: σ_V = πR²σ_L = π/2 cm³, so V = π ± π/2 cm³. If the error on V (σ_V) is to be interpreted in the Gaussian sense, then there is a finite probability (≈ 3%) that the volume V is < 0, since V is only 2σ away from 0! Clearly this is unphysical. Care must be taken in interpreting the meaning of σ_V.
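A sketch of the two cautionary examples above: the long tail of y = 2/x compared with the Gaussian expectation, and the negative-volume probability implied by a naive Gaussian reading of V = π ± π/2 cm³. The value σ_y = 0.04 below is the propagation-of-errors estimate for x = 10 ± 2.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)

# y = 2/x with x Gaussian, x = 10 +/- 2. Propagation of errors predicts
# sigma_y = (2/mu_x**2) * sigma_x = 0.04, but the pdf of y is not Gaussian.
x = rng.normal(10.0, 2.0, 2_000_000)
y = 2.0 / x
mu_y, sigma_y = 0.2, 0.04
tail = np.mean(y > mu_y + 5 * sigma_y)
print(f"Prob(y > mu_y + 5*sigma_y) ~ {tail:.1e}  (a Gaussian would give ~3e-7)")

# Cylinder volume: V = pi*R^2*L with R = 1 cm exact, L = 1.0 +/- 0.5 cm,
# so V = pi +/- pi/2 cm^3, i.e. only 2 sigma above zero. A naive Gaussian
# interpretation of sigma_V assigns a few-percent probability to V < 0:
print(f"naive Gaussian Prob(V < 0) = {norm.cdf(-2.0):.3f}")
```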