Model structure uncertainty: A matter of (Bayesian) belief?
Peter Janssen & Arthur Petersen | 16 March 2011



All Models Are Wrong... but how to proceed?

Selecting an 'appropriate' model (structure)?
- Fitness for purpose
- Reflecting current knowledge of the most important processes
- Being close to observations

Problematic aspects:
- How to express appropriateness?
- What if the knowledge base is weak?
- What if the data are limited in availability, quality and scope?

What uncertainty is involved in the selected model (structure)s?

Model selection
- Multiple hypothesis testing
- Penalty-function-based model selection (AIC, BIC, FIC, MDL)
- Cross-validation methods
- Bayesian Model Averaging or Frequentist Model Averaging
- ...?

But how to deal with the uncertainty involved in the selected model (structure)s?
- Some examples from climate change
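The penalty-function criteria named above trade goodness of fit against model complexity. As a minimal illustration (hypothetical data, Gaussian errors assumed; not part of the original slides), AIC and BIC for least-squares fits of two candidate model structures can be compared as follows:

```python
import numpy as np

def gaussian_log_likelihood(residuals):
    """Maximized Gaussian log-likelihood of a least-squares fit."""
    n = len(residuals)
    sigma2 = np.mean(residuals**2)  # ML estimate of the error variance
    return -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)

def aic_bic(y, y_hat, k):
    """AIC and BIC for a model with k fitted parameters."""
    n = len(y)
    ll = gaussian_log_likelihood(y - y_hat)
    return 2 * k - 2 * ll, k * np.log(n) - 2 * ll

# Toy data with a linear trend; compare linear vs. quadratic structures.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.1, x.size)

for degree in (1, 2):
    coeffs = np.polyfit(x, y, degree)
    k = degree + 2  # polynomial coefficients + error variance
    aic, bic = aic_bic(y, np.polyval(coeffs, x), k)
    print(f"degree {degree}: AIC={aic:.1f}, BIC={bic:.1f}")
```

Lower values are preferred; BIC penalizes extra parameters more heavily than AIC for n > 7, so it tends to select the sparser structure.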

The gospel of Thomas: in the face of uncertainty, your Bayesian belief comes to the rescue.

[Figure: Giorgi, 2005]

Issue of model structure uncertainty (12 different AR4 models)

MME: multi-model ensemble (CMIP-3 in IPCC-AR4)
- Model ensemble (carefully selected; all plausible models):
  Y_1 = F_1(X*_1), Y_2 = F_2(X*_2), ..., Y_n = F_n(X*_n)
  - Multi-model averages are shown;
  - The inter-model standard deviation serves as a measure of uncertainty.
- But what uncertainties are characterized by the model ensemble, and what is left out (e.g. surprises)?
- Is the sample representative of the 'true' uncertainties involved?
  Issue: the scope and relatedness of X*_i ~ X*_j and F_i ~ F_j
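The ensemble statistics mentioned above are straightforward to compute; what they mean is the contentious part. A minimal sketch with invented projection values (not CMIP-3 data):

```python
import numpy as np

# Hypothetical projections Y_i = F_i(X*_i) from 12 ensemble members,
# e.g. projected regional temperature change (deg C) for one period.
projections = np.array([2.1, 2.8, 1.9, 3.2, 2.5, 2.2,
                        2.9, 3.0, 2.4, 2.6, 2.0, 2.7])

mme_mean = projections.mean()
mme_std = projections.std(ddof=1)  # sample std across the n models

print(f"multi-model mean: {mme_mean:.2f} deg C")
print(f"inter-model std:  {mme_std:.2f} deg C")
```

The std treats the members as if they were independent draws from the space of plausible models, which is exactly the assumption the next slide questions.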

Does the MME underestimate uncertainty?
- There is no true model; the ensemble is a collection of best guesses.
- Mutual dependence: models share components and ideas, and even share systematic and common errors.
- The sample is neither random, nor systematic, nor complete.


Performance of models in this MME
- What metrics of performance should be used?
- Is it justified to treat all models equally? Or should we apply a weighting that accounts for their 'performance', by confronting them with the observed climate? E.g. evaluating the posterior probability of a model given the observations:
  P(M_j | Y_obs) ∝ exp[-1/2 BIC(M_j)]  (cf. BMA)
- But what does past performance tell us about future performance?
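The BIC-based posterior weighting above can be sketched in a few lines (BIC values invented for illustration; equal prior model probabilities assumed):

```python
import numpy as np

def bic_weights(bics):
    """Approximate posterior model probabilities P(M_j | Y_obs),
    proportional to exp(-BIC_j / 2) under equal model priors."""
    bics = np.asarray(bics, dtype=float)
    delta = bics - bics.min()  # subtract min BIC for numerical stability
    w = np.exp(-0.5 * delta)
    return w / w.sum()

# Hypothetical BIC scores for three climate models confronted
# with observed climate.
weights = bic_weights([1012.3, 1015.1, 1011.8])
print(weights)
```

The lowest-BIC model receives the largest weight; note how a difference of a few BIC units already shifts the weights substantially, which is one reason past-performance weighting is contested.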

[Figure slides: Natural variability]

UKCP09: Towards more complete probabilistic climate projections (David Sexton)
- Three different emission scenarios
- Seven different timeframes
- 25 km grid, 16 admin regions, 23 river basins and 9 marine regions
- Uncertainty including information from models other than HadCM3
- Variables and months: this is what users requested

Why we cannot be certain... (David Sexton)
- Internal variability (initial-condition uncertainty)
- Modelling uncertainty
  - Parametric uncertainty (land/atmosphere and ocean perturbed-physics ensembles, PPE)
  - Structural uncertainty (multi-model ensembles)
  - Systematic errors common to all current climate models
- Forcing uncertainty
  - Different emission scenarios
  - Carbon cycle (perturbed-physics ensembles, PPE)
  - Aerosol forcing (perturbed-physics ensembles, PPE)

Production of UKCP09 predictions (schematic, David Sexton): an equilibrium PPE, observations and other models yield an equilibrium PDF; four time-dependent Earth System PPEs (atmosphere, ocean, carbon, aerosol) and a simple climate model turn this into a time-dependent PDF, which a 25 km regional climate model downscales to the UKCP09 25 km PDF.

Some remarks and innovative aspects (UKCP09)
- Probability is considered a 'measure of the degree to which a particular outcome is consistent with the evidence'.
- Use of 'emulators' (metamodels) to replace more computer-intensive models.
- Explicit incorporation of model discrepancy, estimated from the differences between the multi-model ensemble and HadCM3.
- Explicit incorporation of the uncertainty linked to downscaling and timescaling.
- Also accounts for uncertainties in forcing due to the carbon cycle and aerosol forcing.
- Honest in explicitly stating the underlying assumptions.
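The emulator idea is to replace an expensive simulator with a cheap statistical surrogate, commonly a Gaussian process. A minimal numpy-only sketch (toy 1-D simulator and illustrative kernel settings of my own choosing; UKCP09's actual emulators are far richer):

```python
import numpy as np

def rbf_kernel(a, b, length_scale=0.25):
    """Squared-exponential covariance between 1-D input points."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length_scale**2)

def gp_emulate(x_train, y_train, x_new, noise=1e-6):
    """Gaussian-process mean and variance at new inputs,
    conditioned on a small set of simulator runs."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_new, x_train)
    mean = K_s @ np.linalg.solve(K, y_train)
    # Predictive variance: prior variance (1.0) minus explained part.
    var = 1.0 - np.sum(K_s * np.linalg.solve(K, K_s.T).T, axis=1)
    return mean, var

# "Expensive simulator" stand-in, evaluated at a handful of design points...
simulator = lambda x: np.sin(2 * np.pi * x)
x_design = np.linspace(0.0, 1.0, 6)
y_design = simulator(x_design)

# ...then the cheap emulator can be queried anywhere in the input space,
# returning both a prediction and an uncertainty about that prediction.
mean, var = gp_emulate(x_design, y_design, np.array([0.17, 0.62]))
print(mean, var)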

Assumptions explicitly stated; realistic? (sometimes questionable)
- That known sources of uncertainty not included in UKCP09 are not likely to contribute much extra uncertainty.
- That structural uncertainty across the current state-of-the-art models is a good proxy for structural error.
- That models that simulate recent climate, and its recent trends, well are more accurate at simulating future climate.
- That single results from other global climate models are equally credible.
- That projected changes in climate are equally probable across a given 30-year time period.
- That local carbon-cycle feedbacks are small compared to the impact of the carbon cycle via the change in global temperature.

But... (caveats)
Probabilistic projections are always conditional on the underlying assumptions and the methods used.
- How far should we go in this?
- How realistic is this sketch of uncertainty?
  - The discrepancy is assessed on the basis of comparison with other models;
  - What about systematic errors common to all state-of-the-art climate models, missing processes, potential surprises, unknown unknowns?
- How robust, representative and relevant are the results?
- What about non-quantifiable aspects of knowledge quality and underpinning?

Going beyond quantification of uncertainty: an integral perspective

Problem typology (consensus on knowledge × consensus on values):
- Structured problem (consensus on both): statistical uncertainty
- Moderately structured problem (consensus on means): value-ladenness
- Moderately structured problem (consensus on goals): methodological unreliability; recognized ignorance
- Unstructured problem (consensus on neither): recognized ignorance; methodological unreliability; value-ladenness

The agnostic gospel: Funtowicz and Ravetz, 'Science for the Post-Normal Age', Futures, 1993


Graphical display of qualitative uncertainty

Pedigree matrix for assumptions

Conclusions (I)
1. The conditional character of probabilistic projections requires being clear about assumptions and their potential consequences (e.g. robustness, things left out).
2. There is room for further development in probabilistic uncertainty projections: how to deal decently with model ensembles, accounting for model discrepancies.
   - Beyond the Bayesian paradigm: e.g. Dempster-Shafer
   - Second-order uncertainty: imprecise probabilities
3. There is a role to be played by knowledge quality assessment, complementary to more quantitative uncertainty assessment.
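The imprecise-probabilities route mentioned in point 2 replaces a single Bayesian weighting of models with a whole set of admissible weightings, and reports bounds instead of one number. A minimal sketch with invented figures: three model projections, and a credal set containing every weight vector whose components lie in [0.2, 0.5]:

```python
import numpy as np

def expectation_bounds(values, lo, hi):
    """Lower and upper expectation of `values` over all weight vectors w
    with lo[i] <= w[i] <= hi[i] and sum(w) = 1 (a box-shaped credal set).
    The extremes are attained by greedily loading weight onto the
    smallest (resp. largest) values."""
    values = np.asarray(values, dtype=float)

    def extreme(order):
        w = np.array(lo, dtype=float)
        spare = 1.0 - w.sum()
        for i in order:
            add = min(hi[i] - w[i], spare)
            w[i] += add
            spare -= add
        return float(w @ values)

    order = np.argsort(values)
    return extreme(order), extreme(order[::-1])

# Hypothetical warming projections (deg C) from three models; instead of
# committing to one weight vector, allow any weights in [0.2, 0.5].
low, high = expectation_bounds([2.0, 2.6, 3.3],
                               lo=[0.2, 0.2, 0.2], hi=[0.5, 0.5, 0.5])
print(f"expected warming between {low:.2f} and {high:.2f} deg C")
```

The width of the resulting interval is itself a crude expression of second-order uncertainty: the less we are willing to pin down the weights, the wider the bounds.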

Conclusions (II)
4. Recognizing ignorance is often more important than characterizing statistical uncertainty.
5. Communicate uncertainty in terms of societal/political risks.