
Copyright 2011 by Coleman & Steele. Absolutely no reproduction of any portion without explicit written permission. 1 UNCERTAINTY ANALYSIS: A BASIC OVERVIEW.


1 UNCERTAINTY ANALYSIS: A BASIC OVERVIEW presented at CAVS by GLENN STEELE www.uncertainty-analysis.com August 31, 2011

2 EXPERIMENTAL UNCERTAINTY REFERENCES The ISO GUM: the de facto international standard

3 EXPERIMENTAL UNCERTAINTY REFERENCES http://www.oiml.org/publications/?publi=3&publi_langue=en

4 VALIDATION REFERENCES

5 VALIDATION REFERENCES

6 “Degree of Goodness” When we use experimental results (such as property values) in an analytical solution, we should consider “how good” the data are and what influence that degree of goodness has on the interpretation and usefulness of the solution. When we compare model predictions with experimental data, as in a validation process, we should consider the degree of goodness of the model results and the degree of goodness of the data.

7 Typical comparison of predictions and data, considering no uncertainties (figure: result C_D vs. set point Re)

8 Comparison of predictions and data considering only the likely uncertainty in the experimental result (figure: result C_D vs. set point Re). Uncertainties set the resolution at which meaningful comparisons can be made.

9 Validation comparison considering all uncertainties (figure: result C_D vs. set point Re, with uncertainty bars U_Re, U_S, and U_CD):
S = value from the simulation
D = data value from experiment
E = comparison error: E = S − D = δ_S − δ_D, where δ_S = δ_model + δ_input + δ_num

10 “Degree of Goodness” and Uncertainty Analysis When an experimental approach to solving a problem is to be used, the question of “how good must the results be?” should be answered at the very beginning of the effort. This required degree of goodness can then be used as guidance in the planning and design of the experiment. We use the concept of uncertainty to describe the “degree of goodness” of a measurement or an experimental result.

11 ERRORS & UNCERTAINTIES

12 An error δ is a quantity with a sign and magnitude. (We assume any error whose sign and magnitude are known has been corrected for, so the errors that remain are of unknown sign and magnitude.) An uncertainty u is an estimate of an interval ±u that should contain δ.

13 Consider making a measurement of a steady variable X (whose true value is designated as X_true) that is influenced by errors δ_i from 5 elemental error sources. Postulate that errors δ_1 and δ_2 do not vary as measurements are made, and δ_3, δ_4, and δ_5 do vary during the measurement period (figure).

14 The total error (δ) is the sum of
– β (= δ_1 + δ_2), the systematic, or fixed, error
– ε (= δ_3 + δ_4 + δ_5), the random, or repeatability, error
δ = β + ε (figure: ε varies; β = δ_1 + δ_2 does not vary)

15 The k-th measurement of X then appears as X_k = X_true + β + ε_k. The total error (δ_k) is the sum of
– β, the systematic, or fixed, error
– ε_k, the random, or repeatability, error

16 Central Limit Theorem: ε → statistics; β → ???

17 Histogram of temperatures read from a thermometer by 24 students

18 Now consider again making the measurements of X (figure: ε varies; β = δ_1 + δ_2 does not vary)

19 We can calculate the standard deviation s_X of the distribution of N measurements of X, and that will correspond to a standard uncertainty (u) estimate of the range of the ε_i's. We will call s_X the random standard uncertainty.
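This calculation is simply the sample standard deviation of the repeated measurements; a minimal sketch (the readings are made-up illustration values):

```python
import statistics

# The random standard uncertainty s_X is the sample standard deviation
# of N repeated measurements of X (N - 1 divisor).
readings = [10.2, 9.8, 10.1, 10.0, 9.9]
s_x = statistics.stdev(readings)
```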

20 (figure)

21 We will estimate systematic standard uncertainties corresponding to the elemental systematic errors β_i and use the symbol b_i to denote such an uncertainty. Thus ±b_1 will be an uncertainty interval that should contain β_1, ±b_2 will be an uncertainty interval that should contain β_2, and so on.... The systematic standard uncertainty b_i is understood to be an estimate of the standard deviation of the parent population from which the systematic error β_i is a single realization.

22 The standard uncertainty in X, denoted u_X, is defined such that the interval ±u_X contains the (unknown) combination (β + ε) and, in accordance with the GUM, is given by u_X = (b_1² + b_2² + s_X²)^1/2
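A minimal sketch of this root-sum-square combination (the helper name and values are illustrative, not from the slides):

```python
import math

def combined_standard_uncertainty(b_elems, s_x):
    """u_X = sqrt(sum(b_i^2) + s_X^2): root-sum-square of the elemental
    systematic standard uncertainties and the random standard
    uncertainty. Hypothetical helper name."""
    return math.sqrt(sum(b * b for b in b_elems) + s_x ** 2)

u_x = combined_standard_uncertainty([0.3, 0.4], 1.2)  # approx. 1.3
```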

23 Categorizing and Estimating Uncertainties in the Measurement of a Variable
GUM categorization by method of evaluation:
– Type A: “method of evaluation of uncertainty by the statistical analysis of series of observations”
– Type B: “method of evaluation of uncertainty by means other than the statistical analysis of series of observations”
Traditional U.S. categorization by effect on measurement:
– Random (component of) uncertainty: estimate of the effect of the random errors on the measured value
– Systematic (component of) uncertainty: estimate of the effect of the systematic errors on the measured value
Both are useful, and they are not inconsistent. Use of both will be illustrated in the examples in this course.

24 An Additional Uncertainty Categorization
In the fields of Risk Analysis, Reliability Engineering, Systems Safety Assessment, and others, uncertainties are often categorized as
Aleatory
– Variability
– Due to a random process
Epistemic
– Incertitude
– Due to lack of knowledge

25 Uncertainty Categorization The key is to identify the significant errors and estimate the corresponding uncertainties. Whether one divides them into categories of Random vs. Systematic, Type A vs. Type B, Aleatory vs. Epistemic, or Lemons vs. Chipmunks for convenience should make no difference in the overall estimate u if one proceeds properly.

26 OVERALL UNCERTAINTY OF A MEASUREMENT
At the standard deviation level:
– Systematic standard uncertainty: b_X = (b_1² + b_2²)^1/2 (for 2 elemental systematic errors)
– Random standard uncertainty: s_X (or s_X̄)
– Combined standard uncertainty: u_X = (b_X² + s_X²)^1/2
Overall or expanded uncertainty at C% confidence: U_C = k_C u_X

27 For large samples, assuming the total errors in the measurements have a roughly Gaussian distribution, and using a 95% confidence level, k_95 = 2 and U_95 = 2u_X. The true value of the variable will then be within the limits (measured value) ± U_95 about 95 times out of 100. To obtain a value of the coverage factor k, an assumption about the form of the distribution of the total errors (the δ's) in X is necessary.
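A sketch of the large-sample expanded uncertainty (the b_X and s_X values are illustrative):

```python
import math

# Large-sample, roughly Gaussian case: coverage factor k_95 = 2,
# so U_95 = 2 * u_X = 2 * sqrt(b_X^2 + s_X^2).
b_x, s_x = 0.3, 0.4              # illustrative standard uncertainties
u_x = math.sqrt(b_x**2 + s_x**2)
U95 = 2 * u_x
```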

28 RESULT DETERMINED FROM MULTIPLE MEASURED VARIABLES

29 We usually combine several variables using a Data Reduction Equation (DRE) to determine an experimental result. These have the general DRE form r = r(X_1, X_2, …, X_J). There are two approaches used for propagating uncertainties through the DREs:
– the Taylor Series Method (TSM)
– the Monte Carlo Method (MCM)

30 TAYLOR SERIES METHOD OF UNCERTAINTY PROPAGATION
For the case where the result r is a function of two variables x and y, r = f(x, y), the combined standard uncertainty of the result, u_r, is given by
u_r² = (∂r/∂x)² b_x² + (∂r/∂y)² b_y² + s_r²
where s_r is calculated from multiple result determinations and the b_x and b_y systematic standard uncertainties are determined from the combination of elemental systematic uncertainties that affect x and y as
b_x² = Σ_i b_{x,i}² and b_y² = Σ_i b_{y,i}²
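The TSM propagation for r = f(x, y) can be sketched with numerically estimated sensitivity coefficients; the helper name, test function, and values are illustrative assumptions:

```python
import math

def tsm_uncertainty(f, x, y, b_x, b_y, s_r, h=1e-6):
    """Taylor Series Method sketch for r = f(x, y):
        u_r^2 = (df/dx)^2 b_x^2 + (df/dy)^2 b_y^2 + s_r^2
    with the partial derivatives estimated by central differences.
    Hypothetical helper name."""
    dfdx = (f(x + h, y) - f(x - h, y)) / (2 * h)
    dfdy = (f(x, y + h) - f(x, y - h)) / (2 * h)
    return math.sqrt((dfdx * b_x) ** 2 + (dfdy * b_y) ** 2 + s_r ** 2)

# For r = x*y the exact sensitivities are y and x
u_r = tsm_uncertainty(lambda x, y: x * y, 2.0, 3.0, 0.1, 0.2, 0.05)
```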

31 Monte Carlo Method of Uncertainty Propagation
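A sketch of the MCM, assuming a Gaussian error distribution for each input (the helper name, sample size, and values are illustrative):

```python
import random
import statistics

def mcm_uncertainty(f, x, y, u_x, u_y, n=100_000, seed=1):
    """Monte Carlo Method sketch: perturb the inputs with draws from
    assumed Gaussian error distributions and take the sample standard
    deviation of the results as u_r. Hypothetical helper name."""
    rng = random.Random(seed)
    results = [f(rng.gauss(x, u_x), rng.gauss(y, u_y)) for _ in range(n)]
    return statistics.stdev(results)

# For r = x + y the exact combined result is sqrt(0.3^2 + 0.4^2) = 0.5
u_r = mcm_uncertainty(lambda x, y: x + y, 2.0, 3.0, 0.3, 0.4)
```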

32 Applying General Uncertainty Analysis – Experimental Planning Phase

33 GENERAL UNCERTAINTY ANALYSIS
For a result given by a data reduction equation (DRE) r = r(X_1, X_2, …, X_J), the uncertainty is given by
U_r² = (∂r/∂X_1)² U_{X_1}² + (∂r/∂X_2)² U_{X_2}² + … + (∂r/∂X_J)² U_{X_J}²
Note that (assuming the large sample approximation) the U in the propagation equation can be interpreted as the 95% confidence U_95 = 2u or as the standard uncertainty u as long as each term in the equation is treated consistently.

34 Example It is proposed that the shear modulus, M_S, be determined for an alloy by measuring the angular deformation θ produced when a torque T is applied to a cylindrical rod of the alloy with radius R and length L. The expression relating these variables is M_S = 2LT/(πR⁴θ). We wish to examine the sensitivity of the experimental result to the uncertainties in the variables that must be measured before we proceed with a detailed experimental design. The physical situation shown below (figure, where torque T is given by aF) is described by the data reduction equation for the shear modulus M_S = 2LaF/(πR⁴θ).
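Assuming the torsion relation M_S = 2LaF/(πR⁴θ) for this example, the general uncertainty analysis of a pure power-law DRE weights each relative uncertainty by its exponent, so R enters with a factor of 4. A planning-phase sketch (the 1% values are illustrative assumptions):

```python
import math

# (U_M / M)^2 = (U_L/L)^2 + (U_a/a)^2 + (U_F/F)^2
#             + (4 U_R/R)^2 + (U_theta/theta)^2
# Illustrative planning-phase relative uncertainties (1% each):
rel = {"L": 0.01, "a": 0.01, "F": 0.01, "R": 0.01, "theta": 0.01}
U_M_rel = math.sqrt(
    rel["L"] ** 2 + rel["a"] ** 2 + rel["F"] ** 2
    + (4 * rel["R"]) ** 2 + rel["theta"] ** 2
)
# The R term dominates: a 1% radius uncertainty contributes 4% by itself.
```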

35 (figure)

36 ESTIMATING RANDOM UNCERTAINTIES

37 Data sets for determining estimates of standard deviations and random uncertainties should be acquired over a time period that is large relative to the time scales of the factors that have a significant influence on the data and that contribute to the random errors.

38 Direct Calculation Approach for Random Uncertainty
For a result that is determined M times, the mean value of the result is
r̄ = (1/M) Σ_{k=1..M} r_k
and
s_r = [ (1/(M−1)) Σ_{k=1..M} (r_k − r̄)² ]^1/2
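A minimal sketch of this direct calculation with made-up result values; the final s_r̄ = s_r/√M step for the uncertainty of the mean result is an assumption based on standard practice:

```python
import math
import statistics

# Direct calculation: determine the result M times, then use the mean
# and the sample standard deviation of the M determinations.
results = [4.0, 6.0, 5.0, 5.0]      # illustrative result determinations
M = len(results)
r_bar = statistics.mean(results)
s_r = statistics.stdev(results)
s_r_bar = s_r / math.sqrt(M)        # random std. uncertainty of the mean
```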

39 ESTIMATING SYSTEMATIC UNCERTAINTIES

40 Propagation of systematic errors into an experimental result (figure)

41 The systematic standard uncertainties for the elemental error sources are estimated in a variety of ways that were discussed in some detail in the course. Among the ways used to obtain estimates are: use of previous experience, manufacturer’s specifications, calibration data, results from specially designed “side” experiments, results from analytical models, and others.

42 Recall the definition of a systematic standard uncertainty, b. It is not the most likely value of β, nor the maximum value. It is the standard deviation of the assumed parent population of possible values of β.

43 SYSTEMATIC STANDARD UNCERTAINTY

44 (figure)

45 Correlated Systematic Errors
Typically occur when different measured variables share one or more elemental error sources:
– multiple variables measured with the same transducer (probe traversed across a flow field; multiple pressures ported sequentially to the same transducer (scanivalve))
– multiple transducers calibrated against the same standard (electronically scanned pressure (ESP) systems in use in aerospace ground test facilities)
Examples: q = m C_p (T_o − T_i); u′v′

46 Using the TSM, there is a term in the b_r² equation for each pair of variables in the DRE that might share an error source:
b_r² = Σ_{i=1..J} (∂r/∂X_i)² b_{X_i}² + 2 Σ_{i=1..J−1} Σ_{k=i+1..J} (∂r/∂X_i)(∂r/∂X_k) b_{X_i X_k}
(applied, for example, for q = m C_p (T_o − T_i) and for u′v′)
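For the q = m C_p (T_o − T_i) example, a sketch of the covariance term under the assumption that the two temperature measurements share a perfectly correlated calibration error (the helper name and values are illustrative):

```python
import math

def b_q_correlated(m, cp, b_To, b_Ti, b_ToTi):
    """For q = m * cp * (To - Ti), the sensitivities are
    dq/dTo = m*cp and dq/dTi = -m*cp, so the shared-error term
    2 * (dq/dTo) * (dq/dTi) * b_ToTi is negative and reduces b_q.
    Hypothetical helper name."""
    th_To = m * cp      # dq/dTo
    th_Ti = -m * cp     # dq/dTi
    b_sq = (th_To * b_To) ** 2 + (th_Ti * b_Ti) ** 2 \
           + 2 * th_To * th_Ti * b_ToTi
    return math.sqrt(b_sq)

# Fully correlated case (b_ToTi = b_To * b_Ti): the shared error cancels
b_q = b_q_correlated(1.0, 1.0, 0.5, 0.5, 0.25)
```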

47 (figure)

48 Some Final Practical Points on Estimating Systematic Uncertainties
When estimating b, we are not trying to estimate the most probable value nor the maximum possible value of β.
Always remember to view and use estimates with common sense. For example, a “% of full scale” b should not apply near zero if the instrument is nulled.
Resources should not be wasted on obtaining good uncertainty estimates for insignificant sources, a practice we have observed too many times.

49 “V&V” – Verification & Validation: The Process
Preparation
– Specification of validation variables, validation set points, etc. (This specification determines the resource commitment that is necessary.)
– It is critical for modelers and experimentalists to work together in this phase. The experimental and simulation results to be compared must be conceptually identical.
Verification
– Are the equations solved correctly? (MMS for code verification; grid convergence studies, etc., for solution verification to estimate u_num.)
Validation
– Are the correct equations being solved? (Compare with experimental data and attempt to assess δ_model.)
Documentation

50 A Validation Comparison

51 V&V Overview – Sources of Error Shown in Ovals

52 Strategy of the Approach
Isolate the modeling error, having a value or uncertainty for everything else:
E = S − D = δ_model + (δ_input + δ_num − δ_D)
δ_model = E − (δ_input + δ_num − δ_D)
If ±u_val is an interval that includes (δ_input + δ_num − δ_D), then δ_model lies within the interval E ± u_val.

53 Uncertainty Estimates Necessary to Obtain the Validation Uncertainty u_val
– Uncertainty in the simulation result due to numerical solution of the equations, u_num (code and solution verification)
– Uncertainty in the experimental result, u_D
– Uncertainty in the simulation result due to uncertainties in code inputs, u_input; propagation by (A) Taylor Series or (B) Monte Carlo
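The validation comparison can be sketched as follows, assuming the uncorrelated root-sum-square combination u_val² = u_num² + u_input² + u_D² (the ASME V&V 20 form); the helper name and numbers are illustrative:

```python
import math

def validation_comparison(S, D, u_num, u_input, u_D):
    """Sketch: comparison error E = S - D and, for uncorrelated
    contributions, u_val = sqrt(u_num^2 + u_input^2 + u_D^2),
    so delta_model lies within E +/- u_val. Hypothetical helper name."""
    E = S - D
    u_val = math.sqrt(u_num ** 2 + u_input ** 2 + u_D ** 2)
    return E, u_val

# Illustrative values: simulation 10.5, data 10.0
E, u_val = validation_comparison(10.5, 10.0, 0.3, 0.4, 1.2)
```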

54 Methodology – Simulation Uncertainty
Modeling error for an uncalibrated model used to make calculations between validation points, where u_sp = uncertainty contribution from the uncertainty of input parameters at the simulation calculation point and u_E = uncertainty in E at the calculation point from the interpolation process.

55 Uncertainty of Calibrated Models

56 Methodology – Instrument Calibration Analogy
Uncalibrated instrumentation system: u² = u_t² + u_m², where u_t = uncertainty of the transducer and u_m = uncertainty of the meter.
Calibrated instrumentation system: u = u_c, where u_c is the calibration uncertainty.

57 Methodology – Instrument Calibration Analogy
If a curve-fit is used to develop a relationship between the meter reading and the calibrated output value, then u² = u_c² + u_cf², where u_cf = the curve-fit uncertainty.
If the meter used in testing (m_2) is different from the meter used in calibration (m_1), then the uncertainties of both meters enter the combination: u² = u_c² + u_cf² + u_{m1}² + u_{m2}².

58 Methodology – Instrument Calibration Analogy
The uncertainties, u, in the previous expressions are standard uncertainties, at the standard deviation level. To express the uncertainty at a given confidence level, such as 95%, the standard uncertainty is multiplied by an expansion factor. For most engineering applications, the expansion factor is 2 for 95% confidence.

59 Methodology – Calibrated Model
To calibrate a model, the simulation results are compared with a set of data and corrections are applied to the model to make it match the data. As in the curve-fit uncertainty in the calibration of a transducer, there will be additional uncertainty in the calibrated model based on the error between the corrected simulation results and the data; the simulation uncertainty then includes this contribution.

60 Methodology – Calibrated Model
The calibrated-model uncertainty would apply for simulation results over the range of the input parameter values used in the calibration of the model, with the assumption that the input parameters in the simulation have the same uncertainties that they had in the calibration process. If the input parameter sources or transducers change for a simulation result, then the uncertainty must be re-evaluated to account for the new sources.

