Bayesian calibration and comparison of process-based forest models Marcel van Oijen & Ron Smith (CEH-Edinburgh) Jonathan Rougier (Durham Univ.)
Contents
1. Bayesian calibration of a forest model
2. What measurements to take?
3. Bayesian comparison of forest models
1. Bayesian calibration of a forest model
Process-based forest models: environmental scenarios, initial values and parameters enter the model, which outputs variables such as NPP, height and soil C.
Using data. We need a method that:
1. Quantifies how uncertainty about inputs and model structure causes output uncertainty
2. Efficiently uses data, on inputs & outputs, to reduce uncertainties
BASic FORest model (BASFOR): 39 parameters, 12 output variables.
BASFOR: Inputs (model diagram showing the inputs feeding the 12 output variables).
BASFOR: Prior pdf for parameters
Example: simulating growth of Norway spruce at Skogaby.
BASFOR: Prior predictive uncertainty for height and biomass at Skogaby.
BASFOR: Predictive uncertainty. High input uncertainty (39 parameters) causes high output uncertainty (12 output variables). Data, i.e. measurements of the output variables, are used for calibration of the parameters.
Bayes' Theorem: P(θ|D) = P(θ) P(D|θ) / P(D) ∝ P(θ) P(D|f(θ)), where P(θ|D) is the posterior distribution of the parameters, P(θ) the prior distribution, P(D|f(θ)) the likelihood of the data given the mismatch with model output, and f is the model (e.g. BASFOR).
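As a numeric illustration of the likelihood term (hypothetical observations and predictions, not the BASFOR data; independent Gaussian measurement errors assumed):

```python
import math

def gaussian_likelihood(data, model_output, sigma):
    """P(D|f(theta)) for independent Gaussian measurement errors:
    the product of N(d; m, sigma) densities over all data points."""
    return math.prod(
        math.exp(-0.5 * ((d - m) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))
        for d, m in zip(data, model_output)
    )

# Hypothetical tree heights (m) and model predictions for one parameter vector
data = [10.2, 14.8, 18.1]
pred = [10.0, 15.0, 18.5]
like = gaussian_likelihood(data, pred, sigma=0.5)
```

A parameter vector whose predictions sit closer to the data gets a higher likelihood, and hence more posterior weight.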
Finding the posterior: MCMC. MCMC walks through parameter space so that the set of visited points approaches the posterior parameter distribution P(θ|D) ∝ P(θ) P(D|f(θ)) [e.g. using a Metropolis-Hastings random walk]. The result is a sample of 10⁴-10⁵ parameter vectors from the posterior distribution P(θ|D) for the parameters.
MCMC: Metropolis-Hastings random walk (Metropolis et al., 1953):
1. Start anywhere in parameter space: p(i=0) for the 39 parameters
2. Randomly choose p(i+1) = p(i) + δ
3. IF [P(p(i+1)) P(D|f(p(i+1)))] / [P(p(i)) P(D|f(p(i)))] > Random[0,1] THEN accept p(i+1), ELSE reject p(i+1) (keep p(i)); set i = i+1
4. IF i < 10⁴, GOTO 2
This yields a sample of 10⁴-10⁵ parameter vectors from the posterior distribution P(θ|D) for the parameters.
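The four steps above can be sketched in code. This is a toy one-dimensional version: a standard normal log-posterior stands in for log[P(θ)P(D|f(θ))] instead of the 39-parameter BASFOR posterior:

```python
import math
import random

def metropolis(log_post, theta0, step, n_steps, seed=0):
    """Metropolis random walk: propose theta + delta, accept when the
    posterior ratio exceeds a Uniform[0,1] draw, else keep the current point."""
    rng = random.Random(seed)
    theta, lp = theta0, log_post(theta0)
    chain = []
    for _ in range(n_steps):
        proposal = theta + rng.gauss(0.0, step)      # step 2: p(i) + delta
        lp_prop = log_post(proposal)
        if lp_prop - lp > math.log(rng.random()):    # step 3, in log space
            theta, lp = proposal, lp_prop            # accept
        chain.append(theta)                          # on rejection, p(i) repeats
    return chain

# Toy posterior: standard normal, so the sample mean should approach 0
chain = metropolis(lambda t: -0.5 * t * t, theta0=5.0, step=1.0, n_steps=20000)
sample = chain[5000:]                                # discard burn-in
mean = sum(sample) / len(sample)
var = sum((x - mean) ** 2 for x in sample) / len(sample)
```

Rejected proposals leave the current point repeated in the chain; that repetition is what makes the visited points approach P(θ|D).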
Forest data from Skogaby (Sweden). Planted: 1966 (2300 trees ha⁻¹). Weather data: 1987-1995. Soil data: C, N, mineralisation rate. Tree data: biomass, NPP, height, [N], LAI.
BASFOR: Prior predictive uncertainty for height and biomass, with the Skogaby data (data: Göran Ågren).
MCMC parameter trace plots: 10,000 steps (parameter value vs. step in the MCMC).
Posterior marginal distributions for parameters
Parameter correlations among the 39 parameters.
Bayesian calibration: overview (data entering the calibration).
Prior & posterior predictive uncertainty for height and biomass (posterior using the Skogaby data).
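The uncertainty bands on such plots can be read off directly from the MCMC output sample: run the model once per sampled parameter vector and take percentiles of the outputs. A sketch with hypothetical height predictions (random numbers standing in for BASFOR runs):

```python
import random

def credible_band(samples, lo=0.05, hi=0.95):
    """Percentile band (default 5th-95th) from a sample of model outputs,
    one output value per sampled parameter vector."""
    s = sorted(samples)
    n = len(s)
    return s[int(lo * (n - 1))], s[int(hi * (n - 1))]

# Stand-in for 1000 BASFOR height predictions (m), one per parameter vector
rng = random.Random(1)
heights = [15.0 + rng.gauss(0.0, 1.2) for _ in range(1000)]
low, high = credible_band(heights)
```

Repeating this with the prior and the posterior parameter samples gives the prior and posterior bands; calibration should narrow the band.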
Partial correlation coefficients (PCC) between the 39 parameters and the 12 output variables.
2. What kind of measurements would have reduced uncertainty the most?
Prior predictive uncertainty & height data: prior predictive uncertainty for height and biomass, with the Skogaby height data.
Prior & posterior uncertainty: use of height data. Posterior uncertainty (using the Skogaby height data) vs. prior predictive uncertainty, for height and biomass.
Prior & posterior uncertainty: use of height data. Posterior uncertainty (using hypothetical height data) vs. prior predictive uncertainty, for height and biomass.
Prior & posterior uncertainty: use of height data. Posterior uncertainty using height data vs. using precision height data, compared with the prior predictive uncertainty.
Summary of procedure: the prior P(θ), the model f, and data D ± σ (via an "error function", e.g. N(0, σ), giving the likelihood P(D|f(θ))) enter MCMC, which yields samples of θ (10⁴-10⁵) from the posterior P(θ|D), i.e. calibrated parameters with covariances. The corresponding samples of f(θ) give the uncertainty of model output, and PCC gives a sensitivity analysis of the model parameters.
3. Bayesian comparison of forest models
Uncertainty regarding model structure: besides environmental scenarios, initial values and parameters, imperfect understanding makes the model structure itself uncertain.
Bayesian comparison of two models. Bayes' Theorem for model probability: P(M|D) = P(M) P(D|M) / P(D). With equal prior model probabilities P(M₁) = P(M₂) = ½, P(M₂|D) / P(M₁|D) = P(D|M₂) / P(D|M₁). This "Bayes factor" P(D|M₂) / P(D|M₁) quantifies how the data D change the odds of M₂ over M₁. The "integrated likelihood" P(D|Mᵢ) can be approximated from the MCMC sample of outputs for model Mᵢ, as the harmonic mean of the likelihoods in the MCMC sample (Kass & Raftery, 1995).
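The harmonic-mean approximation mentioned above can be sketched as follows; the log-likelihood traces here are hypothetical stand-ins for the MCMC output of two models:

```python
import math

def log_integrated_likelihood(log_likes):
    """Harmonic-mean estimator (Kass & Raftery, 1995):
    P(D|M) ~ N / sum_i (1 / L_i), computed in log space for stability."""
    shift = max(-ll for ll in log_likes)
    log_sum_inv = shift + math.log(sum(math.exp(-ll - shift) for ll in log_likes))
    return math.log(len(log_likes)) - log_sum_inv

# Hypothetical log-likelihoods sampled during each model's MCMC run
log_pD_m1 = log_integrated_likelihood([-35.2, -34.8, -36.0, -35.5])
log_pD_m2 = log_integrated_likelihood([-33.1, -33.6, -32.9, -33.4])
bayes_factor = math.exp(log_pD_m2 - log_pD_m1)   # odds shift toward model 2
```

The harmonic mean is dominated by the smallest likelihoods in the sample and can be unstable; it is used here because it is the estimator the slides cite.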
Bayes factor for two big forest models: MCMC (5000 steps) for the calculation of P(D|BASFOR) and P(D|BASFOR+). (Rajec data: Emil Klimo.)
Bayes factor for two big forest models: the MCMC (5000 steps) gives P(D|M₁) = 7.2×10⁻¹⁶ for BASFOR and P(D|M₂) = 5.8×10⁻¹⁵ for BASFOR+, a Bayes factor of 7.8, so BASFOR+ is supported by the data. (Rajec data: Emil Klimo.)
Summary of procedure: for each model, the prior P(θᵢ) and data D enter MCMC, yielding samples of θᵢ (10⁴-10⁵) from the posterior P(θᵢ|D), i.e. updated parameters, together with the integrated likelihoods P(D|M₁) and P(D|M₂); their ratio, the Bayes factor, gives the updated model odds.
Conclusions
Bayesian calibration using MCMC: improves model predictive capacity by updating the parameters; quantifies uncertainty in parameters and output.
Forest model calibration: benefits from high-precision tree height measurement.
Bayesian model comparison: uses the same probabilistic approach as Bayesian calibration; the Bayes factor shows how new data change the odds of the models; an aid in model development.
Appendices
Bayesian calibration of big models. P(θ|D) ∝ P(θ) P(D|f(θ)). Calculating P(θ|D) costs much time:
1. Sample parameter space representatively
2. For each sampled set of parameter values: (a) calculate P(θ); (b) run the model to calculate the likelihood P(D|f(θ))
Solutions: the sampling problem is addressed by Markov chain Monte Carlo (MCMC) methods; the computing problem by increased processor speed.
Bayes factor for two big forest models: BASFOR (39 parameters) vs. BASFOR+ (41 parameters; Penman equation, corrections), each calibrated by MCMC with 10,000 steps.
Bayesian methods:
Bayes, T. (1763): Bayes' Theorem
Metropolis, N. (1953): MCMC
Green, E.J. / MacFarlane, D.W. / Valentine, H.T. / Strawderman, W.E. (1996, 1998, 1999, 2000): forest models
Jansen, M. (1997): crop models
Jaynes, E.T. (2003): probability theory