Presentation transcript: Calibration with discrepancy

1. Calibration with discrepancy

Major references
–The calibration lecture is not in the book.
–Kennedy, M. C., and A. O'Hagan. "Bayesian calibration of computer models." Journal of the Royal Statistical Society: Series B (2001).
–Campbell, K. "Statistical calibration of computer simulations." Reliability Engineering & System Safety 91.10 (2006): 1358-1363.
–Higdon, D., et al. "Combining field data and computer simulations for calibration and prediction." SIAM Journal on Scientific Computing 26.2 (2004).
–Bayarri, M. J., et al. "A framework for validation of computer models." Technometrics 49.2 (2007).
–Bayarri, M. J., et al. "Predicting vehicle crashworthiness: Validation of computer models for functional and hierarchical data." JASA (2009).
–Loeppky, J. L., D. Bingham, and W. J. Welch. "Computer model calibration or tuning in practice." (2006).
–McFarland, J., et al. "Calibration and uncertainty analysis for computer simulations with multivariate output." AIAA Journal 46.5 (2008).

2. Calibration with discrepancy

Motivation
–A computer model approximates reality, but it often falls short because of incomplete knowledge, so some degree of discrepancy exists.
–Accounting for this bias is the central issue in calibration.
–If we ignore it, we get unwantedly large bounds in the prediction.
How to model the discrepancy?
–Gaussian process regression (GPR) is employed to express the discrepancy in an approximate manner.
–Estimation includes not only the calibration parameters but also the associated GPR parameters.
–The discrepancy term serves two purposes:
1. Close the gap between the model and reality, further improving the calibration.
2. Validate the model accuracy: if the discrepancy is small, the model is good.

3. Calibration with discrepancy

Formulation
–Computer model with calibration parameters: y^M = m(x|θ)
–Reality = model + discrepancy: y^R = m(x|θ) + b(x)
–Field data = reality + observation error: y^F = y^R + ε
Unknowns to be estimated
–Calibration parameter θ in the computer model y^M = m(x|θ)
–GPR parameters β, σ_b in the discrepancy b(x) ~ N(Fβ, σ_b²Q)
–Standard deviation σ of the observation error ε ~ N(0, σ²)
[Figure: field data y^F = model m(x|θ) + bias b(x) + error ε ~ N(0, σ²)]
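The formulation above can be sketched by simulating synthetic field data. This is a minimal sketch: the model form m(x|θ) = 5 exp(−θx) is the example used later in this lecture, while the bias shape and noise level here are purely hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x, theta):
    # Computer model m(x | theta); the 5*exp(-theta*x) form is the
    # example model used later in the lecture.
    return 5.0 * np.exp(-theta * x)

theta_true = 0.6223            # calibration parameter
sigma = 0.1                    # observation-error standard deviation

x = np.linspace(0.0, 3.0, 10)
bias = 0.3 * np.sin(x)         # hypothetical discrepancy b(x)

# Field data = model + discrepancy + observation error
y_field = model(x, theta_true) + bias + rng.normal(0.0, sigma, x.size)
```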

4. Calibration with discrepancy

Practice the process in increasing order of complexity.
Estimate the bias function only.
–Estimate the GPR parameters β, σ_b in the bias function.
–The calibration parameter θ in the model is assumed known.
–The observation error ε is assumed to be zero.
Estimate the bias function plus error.
–The calibration parameter θ in the model is assumed known.
Estimate everything: bias function, error, and calibration parameter.

5. Estimate bias function only

Problem statement
–Computer model with a single parameter. The calibration parameter is known: θ = 0.6223.
–Field data are given with one observation per point, i.e., with no replication. Assume there is no observation error ε, i.e., σ = 0.
–The problem then reduces to carrying out GPR for the bias function:
field data = model + bias, so bias data = field data − model.
Estimate the GPR parameters using the bias data.
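Forming the bias data is a one-line subtraction, sketched below; the exponential model form is assumed from later in the lecture, and the field values are made up for illustration.

```python
import numpy as np

def model(x, theta):
    # Example computer model (assumed form: 5*exp(-theta*x), as used later).
    return 5.0 * np.exp(-theta * x)

theta = 0.6223                                     # known calibration parameter
x = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
y_field = np.array([5.1, 3.75, 2.9, 2.2, 1.75])    # hypothetical field data

# Bias data b^F = field data - model prediction; this is what the GPR fits.
b_field = y_field - model(x, theta)
```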

6. Gaussian process regression: review

–Review from lecture note 10, Bayesian correlated regression.ppt.
Concept of GPR
–A GP for a function b(x) is defined by the multivariate normal distribution b(x) ~ N(Fβ, σ_b²Q), where β, σ_b are the parameters estimated from the data.
–The mean Fβ is responsible for global variation, as in polynomial regression. It is usually taken to be constant by setting F = 1.
–The correlation matrix, responsible for smoothness of the connection between points, is given by the Gaussian form Q_ij = exp(−(x_i − x_j)²/h²), where h is the parameter that controls the smoothness of the GPR.
–If the correlation matrix is replaced by the identity, Q = I, the covariance becomes σ_b²I and the function reduces to ordinary regression with mean Fβ (a regression fit to a constant β) and iid normal error N(0, σ_b²).
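The correlation matrix above can be sketched directly; the squared-exponential form is an assumption here, chosen to match the described behaviour of h (large h gives strong correlation, small h approaches the identity).

```python
import numpy as np

def corr_matrix(x, h):
    """Gaussian (squared-exponential) correlation: Q_ij = exp(-(x_i - x_j)^2 / h^2).

    h controls smoothness: large h gives high correlation between nearby
    points; small h makes Q approach the identity (iid behaviour).
    """
    d = x[:, None] - x[None, :]
    return np.exp(-(d / h) ** 2)

x = np.array([0.0, 0.1, 1.0])
Q_small = corr_matrix(x, 0.01)   # nearly the identity matrix
Q_large = corr_matrix(x, 5.0)    # nearly all ones (strong correlation)
```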

7. Gaussian process regression: review

Correlation matrix
–In the correlation of two points, if h is large we have high correlation between the y's at those points, i.e., y_i and y_j will not differ much. If h is small, y_i and y_j behave independently.
–Likewise, at a point close to an observation, the GPR y will not differ much from the observed y, which means we have small uncertainty. At an observed point itself, the uncertainty goes to zero and the GPR equals the observed y. This is why GPR is an interpolation.
–If h is large, high correlation extends over a large distance and leads to a smooth connection between the observed y's.
–If h is small, the correlation quickly dies off, and the y's behave like iid normal error. The mean becomes close to Fβ (which is just a constant, the mean of the observed data y).

8. Gaussian process regression: review

Process of GPR
–Likelihood of the data y^F
–Posterior distribution of the parameters β, σ_b
–Analytical formula for the point estimates
–Posterior predictive distribution
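The point estimates and the posterior predictive distribution can be sketched as follows, assuming a constant mean (F = 1) and the Gaussian correlation. This is a simplified sketch: it uses the standard generalized-least-squares estimates and omits the extra predictive-variance term arising from uncertainty in β.

```python
import numpy as np

def corr(x1, x2, h):
    # Gaussian correlation between two point sets (assumed kernel form).
    return np.exp(-((x1[:, None] - x2[None, :]) / h) ** 2)

def gpr_fit_predict(x, b, xp, h):
    """Point estimates of (beta, sigma_b^2) and the predictive mean/variance
    at points xp, with constant mean F = 1. Simplified: the variance omits
    the contribution from uncertainty in beta."""
    n = x.size
    Q = corr(x, x, h) + 1e-10 * np.eye(n)     # small jitter for stability
    Qi = np.linalg.inv(Q)
    F = np.ones(n)
    beta = (F @ Qi @ b) / (F @ Qi @ F)        # GLS estimate of the mean
    resid = b - F * beta
    sigma_b2 = (resid @ Qi @ resid) / n       # point estimate of sigma_b^2
    r = corr(xp, x, h)
    mean = beta + r @ Qi @ resid              # predictive mean (interpolates)
    var = sigma_b2 * (1.0 - np.einsum('ij,jk,ik->i', r, Qi, r))
    return beta, sigma_b2, mean, np.maximum(var, 0.0)

x = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
b = np.array([0.1, -0.05, 0.12, 0.0, -0.08])  # hypothetical bias data
beta, s2, m, v = gpr_fit_predict(x, b, x, h=0.2)
```

At the data points the predictive mean reproduces the data and the variance collapses to (numerically) zero, which is the interpolation property discussed on the previous slide.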

9. Estimate bias function only

Results (treat the bias data b^F = y^F − y^M as the y^F in GPR)
–As expected, GPR behaves differently with respect to the parameter h.
–With small h = 0.01, GPR gets close to a constant at the mean of the data.
–At h = 0.2, MCMC is run several times and compared to the point estimates. The results are stable and agree closely with the point-estimated values.
–An arbitrary h can be avoided by using MLE to obtain the optimum h, which gives h = 0.416. However, with h > 0.2, singularity (a near-singular correlation matrix) occurs.
–With the larger h = 0.2, GPR connects smoothly over the data, and the uncertainty bound is decreased.

             β        σ_b
MCMC run 1   -0.0154  0.7380
MCMC run 2   -0.0954  0.7490
MCMC run 3   -0.0052  0.7432
MCMC run 4   -0.0356  0.7416
point est.   -0.0445  0.7383

10. Estimate bias function plus error

Remark
–What if the data include observation error? Then the GPR parameters and the observation error are estimated simultaneously.
–Recall that in classical regression, we estimate the regression model and the error ε simultaneously, each obtained as a distribution:
y = y^R + ε = Σ_i β_i f_i(x) + ε, where Σ_i β_i f_i(x) is the regression and ε the error.
As a result, we get confidence bounds for the regression, and predictive bounds once the error is added.
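The classical-regression review can be illustrated with ordinary least squares; the basis functions and data below are made up for this sketch.

```python
import numpy as np

# Classical regression: y = sum_i beta_i * f_i(x) + eps, fitted by least
# squares; hypothetical data, with basis f_0(x) = 1 and f_1(x) = x.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 2.9, 5.2, 6.8])

F = np.column_stack([np.ones_like(x), x])        # design matrix
beta, *_ = np.linalg.lstsq(F, y, rcond=None)     # estimates of beta_i

resid = y - F @ beta
sigma2 = resid @ resid / (x.size - F.shape[1])   # error-variance estimate
```

Both the coefficients β and the error variance σ² come out of the same fit, which is the analogy the slide draws to estimating the GPR parameters and the observation error together.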

11. Estimate bias function plus error

Process of GPR including error
–Likelihood of the bias data b^F = y^F − y^M, where y^M = 5 exp(−θx) is a given function.
–Posterior distribution of the bias parameters β, σ_b and the error σ.
–Posterior predictive distribution.
–Once we obtain the predicted bias b_p, we recover the original response y by adding back y^M.
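The likelihood of the bias data with both the GP covariance and the observation error can be sketched as a negative log-likelihood, usable for MLE or as the core of a posterior. The constant mean, Gaussian kernel, and log-parameterisation are assumptions of this sketch.

```python
import numpy as np

def neg_log_like(params, x, b):
    """Negative log-likelihood of bias data b^F under
    b ~ N(beta * 1, sigma_b^2 * Q(h) + sigma^2 * I).
    The log-parameterisation keeps the scale parameters positive."""
    beta, log_sb, log_s, log_h = params
    sb2, s2, h = np.exp(2 * log_sb), np.exp(2 * log_s), np.exp(log_h)
    d = (x[:, None] - x[None, :]) / h
    C = sb2 * np.exp(-d ** 2) + s2 * np.eye(x.size)  # GP + iid error covariance
    r = b - beta
    _, logdet = np.linalg.slogdet(C)
    return 0.5 * (logdet + r @ np.linalg.solve(C, r) + x.size * np.log(2 * np.pi))

x = np.array([0.0, 0.5, 1.0, 1.5])
b = np.array([0.05, -0.02, 0.08, 0.01])              # hypothetical bias data
nll = neg_log_like(np.array([0.0, -1.0, -1.0, np.log(0.2)]), x, b)
```

Minimising this over the four parameters (for example with `scipy.optimize.minimize`) gives joint point estimates of β, σ_b, σ, and h; in the Bayesian treatment the same quantity enters the posterior.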

12. Estimate bias function plus error

Results
–GPR results are given for the parameter h = 0.01, 0.1, and 0.2, respectively. The first figures show b(x) only; the second figures show y^M(x) + b(x) + error. The results are compared to those without error and bias.

13. Calibration with discrepancy

GPR including the discrepancy
–The whole process is the same, except that the calibration parameter θ is now an unknown to be estimated.
–Likelihood of the field data y^F.
–Posterior distribution of all the parameters.
–Posterior predictive distribution. Remember b^F = y^F − y^M.
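The joint estimation can be sketched with a random-walk Metropolis sampler over (θ, β, log σ_b, log σ). Flat priors, the kernel, the fixed h, and the tuning constants are assumptions of this sketch, not the lecture's exact choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def model(x, theta):
    # Example computer model from the lecture (assumed form).
    return 5.0 * np.exp(-theta * x)

def log_post(p, x, y_field, h=0.2):
    """Unnormalised log-posterior of (theta, beta, log sigma_b, log sigma)
    with flat priors; the bias data b^F = y^F - y^M depend on theta."""
    theta, beta, log_sb, log_s = p
    sb2, s2 = np.exp(2 * log_sb), np.exp(2 * log_s)
    bf = y_field - model(x, theta)
    d = (x[:, None] - x[None, :]) / h
    C = sb2 * np.exp(-d ** 2) + s2 * np.eye(x.size)
    r = bf - beta
    _, logdet = np.linalg.slogdet(C)
    return -0.5 * (logdet + r @ np.linalg.solve(C, r))

def metropolis(x, y, n_iter=2000, step=0.05):
    p = np.array([0.5, 0.0, -1.0, -1.0])    # initial (theta, beta, log sb, log s)
    lp = log_post(p, x, y)
    chain = []
    for _ in range(n_iter):
        q = p + rng.normal(0.0, step, 4)    # random-walk proposal
        lq = log_post(q, x, y)
        if np.log(rng.random()) < lq - lp:  # Metropolis accept/reject
            p, lp = q, lq
        chain.append(p.copy())
    return np.array(chain)

x = np.linspace(0.0, 2.0, 8)
y = model(x, 0.6223) + 0.05 * rng.normal(size=8)   # synthetic field data
chain = metropolis(x, y)
```

The resulting chain gives the traces, histograms, and scatter plots discussed on the next slides.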

14. Calibration with discrepancy

Results
–MCMC results are given for h = 0.2. The left and right figures are the traces and histograms of θ, β, and σ_b, respectively.
–The following left and right figures are scatter plots of (θ, β) and (θ, σ_b), respectively.

15. Calibration with discrepancy

Results
–The two parameters (θ, β) are severely correlated. This means the computer model y^M = m(x|θ) and the bias function are confounded through the relation y^R = y^M + b(x). Because of this, they are not statistically identifiable.
–Nevertheless, the prediction is the same whatever the combination. In this sense, interpretation of the calibrated parameter values is not important.

16. Calibration with discrepancy

Results
–Result of the 1st attempt: 0.4930, 0.3365, -0.4059, 0.4711
–Result of the 2nd attempt: 1.2829, 0.3511, 0.3468, 0.5540

17. Calibration with discrepancy

Results
–Result of the 3rd attempt: 0.7660, 0.3361, -0.0652, 0.5013
–Result comparison

