
1 The Earth System Model Evaluation Tool (ESMValTool) Axel Lauer 1, Veronika Eyring 1, Mattia Righi 1, Alexander Löw 2, Martin Evaldsson 3 1 Deutsches Zentrum für Luft- und Raumfahrt e.V. (DLR), Oberpfaffenhofen, Germany 2 Department of Geography, University of Munich (LMU), Germany 3 Swedish Meteorological and Hydrological Institute (SMHI), Norrköping, Sweden 25 November 2015

2 Motivation for the development of the Earth System Model Evaluation Tool (ESMValTool)
- Facilitate the evaluation of complex Earth System Models: quick look at standard diagnostic plots and output diagnostic variables; easy comparison of new simulations (e.g. sensitivity runs or runs with new model versions) with existing runs and with observations
- Raise the standard for model evaluation: include additional diagnostics from ongoing evaluation activities so that we do not have to start from scratch each time; implement more observations and account for observational uncertainties
- Quick assessment of where we stand with a new set of model simulations, via standard namelists that reproduce specific papers, reports, etc.
- Traceability and reproducibility
- Facilitate participation in and analysis of Model Intercomparison Projects: easy comparison of models participating in CMIP and the CMIP6-Endorsed MIPs
- Easy expandability: synergies with ongoing projects to expand the tool (e.g. NCAR CVDP); useful for model groups, for those analyzing models, and for model development

3 CRESCENDO
Overarching goal: Improve the process realism and the reliability of future climate projections of (European) climate models.
Objective: Develop and apply an evaluation tool for routine model benchmarking and for more advanced analysis of feedbacks and future projections.
Strategy:
- Develop a community benchmarking system (ESMValTool): comparing with observations and earlier model versions; benchmarking key aspects of simulated variability and trends; including additional biogeochemical and aerosol/trace gas metrics; adding process-level diagnostics
- Implement emergent constraints developed in CRESCENDO into the ESMValTool
- Make the ESMValTool available to the wider research community

4 Overview ESMValTool
- Routine benchmarking and evaluation of single or multiple ESMs, either against predecessor versions, a wider set of climate models, or observations
- Current implementations include sea ice assessments and other Essential Climate Variables (ECVs), tropical variability, the atmospheric CO2 and NO2 budget, ...; can easily be extended with additional analyses
- Community development under a Subversion-controlled repository allows multiple developers from different institutions to contribute and join
Goals:
- Enhance and improve routine benchmarking and evaluation of ESMs
- Routinely run the tool on CMIP6 model output alongside the Earth System Grid Federation (ESGF)
- Support and facilitate the analysis of the ongoing CMIP Diagnostic, Evaluation and Characterization of Klima (DECK) and CMIP6 simulations with an evaluation tool developed internationally by multiple institutions
The climate community is encouraged to contribute to this effort over the coming years.

5 Current Status: Contributing Institutions (currently ~60 developers from 22 institutions)
1. Deutsches Zentrum für Luft- und Raumfahrt (DLR), Institut für Physik der Atmosphäre, Germany
2. Swedish Meteorological and Hydrological Institute (SMHI), Norrköping, Sweden
3. Agenzia nazionale per le nuove tecnologie, l’energia e lo sviluppo economico sostenibile (ENEA), Italy
4. British Atmospheric Data Centre (BADC), UK
5. Centre for Australian Weather and Climate Research (CAWCR), Bureau of Meteorology, Australia
6. Deutsches Klimarechenzentrum (DKRZ), Germany
7. ETH Zurich, Switzerland
8. Finnish Meteorological Institute, Finland
9. Geophysical Fluid Dynamics Laboratory (GFDL), NOAA, USA
10. Institut Pierre Simon Laplace, France
11. Ludwig Maximilian University of Munich, Germany
12. Max Planck Institute for Meteorology, Germany
13. Met Office Hadley Centre, UK
14. Météo-France, Toulouse, France
15. Nansen Environmental and Remote Sensing Center, Norway
16. National Center for Atmospheric Research (NCAR), USA
17. New Mexico Tech, USA
18. Royal Netherlands Meteorological Institute (KNMI), The Netherlands
19. University of East Anglia (UEA), UK
20. University of Exeter, Exeter, UK
21. University of Reading, UK
22. Wageningen University, The Netherlands

6 Development of an Earth System Model Evaluation Tool
- Within EMBRACE: DLR, SMHI and EMBRACE partners, in collaboration with NCAR, PCMDI and GFDL
- Open source: a Python script that calls NCL (NCAR Command Language) and other languages (R, Python)
- Input: CF-compliant netCDF model output (CMIP standards)
- Observations: can be easily added
- Extensible: easy to (a) read models, (b) process output [diagnostic] with observations, and (c) use a standard plot type (e.g. lat-lon map)
Current developments include:
- Essential Climate Variables, e.g. sea ice, temperatures and water vapor, radiation, CO2, ozone
- Tropical variability (incl. monsoon, ENSO, MJO)
- Southern Ocean
- Continental dry biases and soil-hydrology-climate interactions (e.g., Standardized Precipitation Index)
- Atmospheric CO2 and NO2 budget
- More observations (e.g., obs4MIPs, ESA CCI)
- Statistical measures of agreement
Goal: standard namelists to reproduce certain reports or papers (e.g., IPCC AR5 Chapter 9; Massonnet et al., 2012; Anav et al., 2012; Cox et al., 2013; Eyring et al., 2013)

7 Schematic Overview of the ESMValTool Structure
- main.py: the ESMValTool main driver, controlled by a workflow-manager namelist (namelist_XYZ.xml); inputs are model data and observations
- reformat.py (reformat_default, reformat_EMAC, reformat_obs): checks/reformats the input according to CF/CMOR, producing the processed data
- variable_defs/ (derive_var.ncl): calculates derived variables
- launchers.py: calls the diagnostic scripts; different languages (typ) are supported: NCL, Python, R
- diag_scripts/*.typ: the diagnostics, with configurations in cfg_XYZ/ and common libraries and reformat routines in diag_scripts/lib
- plot_scripts/typ/*: outputs are plots (ps, eps, png, pdf), output data (NetCDF) and a log file (references)
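The workflow above is driven by an XML namelist. A minimal sketch of such a namelist is shown below; the element names, the model-line fields and the file names are reconstructed for illustration and are not a verbatim copy of a shipped namelist:

```xml
<namelist>
  <GLOBAL>
    <!-- global switches for plot and data output (illustrative) -->
    <write_plots type="boolean">True</write_plots>
    <write_netcdf type="boolean">True</write_netcdf>
  </GLOBAL>
  <MODELS>
    <!-- project, model, MIP table, experiment, ensemble, start/end year, data path -->
    <model> CMIP5 MPI-ESM-LR Amon historical r1i1p1 2000 2004 /path/to/data </model>
  </MODELS>
  <DIAGNOSTICS>
    <diag>
      <variable> ta </variable>
      <field_type> T3M </field_type>
      <diag_script cfg="cfg_example.ncl"> example_diag.ncl </diag_script>
    </diag>
  </DIAGNOSTICS>
</namelist>
```

main.py reads such a namelist, reformats the listed model data and then launches each diagnostic script in its own language via launchers.py.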

8 Software requirements
- Python 2.* (www.python.org)
- NCL 6.2 or higher (www.ncl.ucar.edu)
- CMIP5-style datasets (e.g. from esgf-data.dkrz.de/esgf-web-fe)
- ESMValTool (not yet officially released; contact the PIs): tarball or from the svn repository
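A simple way to check the version requirements programmatically is a small helper like the following; this is an illustrative sketch, not part of the ESMValTool itself:

```python
def version_tuple(version_string):
    """Turn a dotted version string like "6.2.0" into a comparable tuple."""
    return tuple(int(part) for part in version_string.split("."))

def meets_minimum(found, minimum):
    """True if the installed version satisfies the minimum requirement.

    Tuple comparison handles differing lengths: (6, 3, 0) >= (6, 2).
    """
    return version_tuple(found) >= version_tuple(minimum)

# NCL 6.2 or higher is required; Python must be a 2.x release.
ncl_ok = meets_minimum("6.3.0", "6.2")
```

The output of `ncl -V` (for NCL) or `python --version` could be fed into such a check before launching the tool.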

9 Examples of Diagnostics Implemented
[Figure panels: monsoon precipitation intensity and domain; CMIP5 multi-model mean (MMM) and CMIP5 MMM minus observations; panels similar to Figures 9.5, 9.7, 9.24, 9.32 and 9.45 of AR5]

10 Examples of Diagnostics Implemented
- Cloud Regime Error Metric (CREM; Williams and Webb, 2009)
- Tropospheric ozone (Righi et al., 2015)
- Interannual variability in surface pCO2 (Rödenbeck et al., 2014)
- Atlantic Meridional Overturning Circulation (AMOC) streamfunction

11 http://www.pa.op.dlr.de/ESMValTool/index.html
Eyring et al., Geosci. Model Dev. Discuss., 8, 7541–7661, 2015

12 Summary
- The ESM Evaluation Tool will facilitate the complex evaluation of ESMs and their simulations (e.g., those submitted to international Model Intercomparison Projects such as CMIP, C4MIP, CCMI)
- The tool is developed under a Subversion-controlled repository
  - Allows multiple developers from different institutions to join the development team
  - Broad community-wide and international participation in the development is envisaged
  - Collaboration with the WGNE/WGCM metrics panel
- Current extensions
  - Atmospheric dynamics, biogeochemical processes, cryosphere and ocean
  - Need for a wide range of observational data to be used with the tool
  - Observations should ideally be provided with uncertainty estimates and technical documentation (e.g. similar to obs4MIPs) and in CF-compliant netCDF
  - Improving statistical comparison and visualization
- Regular releases to the wider international community
  - Further develop and share the diagnostic tool and routinely run it on CMIP DECK output and corresponding observations (obs4MIPs) alongside the Earth System Grid Federation (ESGF)
  - Release of the ESMValTool to the public will contribute to the metrics panel code repository
  - Work towards a full community tool developed by multiple institutions and countries

13 CRESCENDO WP3.1 – Objectives
- Further develop an ESM benchmarking and evaluation tool (ESMValTool): new Essential Climate Variables (ECVs) and standard diagnostics
- Make use of observations from projects such as ESA CCI and obs4MIPs: new climate diagnostics, e.g., surface radiation, turbulent energy and water fluxes, precipitation variability
- Include new key diagnostics: biogeochemical and aerosol processes
- Extend the ESMValTool with new diagnostics for operational analysis of multi-model ESM projections, e.g. IPCC AR5 Ch. 9 diagnostics
- Provide user guidelines for other partners to include new diagnostics and performance metrics for process-based model evaluation (RT2) and emergent constraints (WP3.2) into the ESMValTool
- Develop interfaces to existing diagnostic packages such as the UK Met Office Auto-Assess package

14 Thank you!

15 Considering Observational Uncertainty in the ESMValTool
- Observational uncertainty needs to be considered in the assessment of model performance and should be an integral part of the evaluation tools
- The ESMValTool is quite flexible and can account for observational uncertainty in several ways, e.g.:
  - using more than one observational dataset to evaluate the models
  - showing the differences between the observational datasets in addition to the difference between the model and the reference dataset
  - easy use of an observational ensemble spanning the observed uncertainty space, consistent with its statistics
Example: HadCRUT4 surface temperatures, 5 (out of 100) realizations
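The ensemble-based approach can be sketched as follows; `bias_with_uncertainty` is a hypothetical helper (using Python 3's statistics module for brevity), not a function from the ESMValTool:

```python
from statistics import median

def bias_with_uncertainty(model_value, obs_realizations):
    """Compare a model value against an ensemble of observational realizations.

    Returns the bias relative to the ensemble median together with the
    ensemble range, so the bias can be judged against observational
    uncertainty. Hypothetical helper, not ESMValTool code.
    """
    obs_mid = median(obs_realizations)
    spread = (min(obs_realizations), max(obs_realizations))
    return model_value - obs_mid, spread
```

Applied to, say, the HadCRUT4 realizations mentioned above, the reported bias comes with the range spanned by the realizations, making it immediately visible whether a model-observation difference exceeds the observational uncertainty.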

16 Options to contribute
International cooperation for community development. Join the core development team with full access to:
- Subversion repository
- Mantis bug tracker
- Teamsite & Wiki

17 Automated testing
Any change to code risks introducing unwanted side effects in other parts of the code, as well as introducing bugs. Routine, automated testing is therefore essential to maximize code quality and ensure the integrity of all diagnostics implemented within the ESMValTool. Automated testing within the ESMValTool is implemented on two complementary levels:
- Unit tests are used to verify that small code units (e.g. functions/subroutines) provide the expected results.
- Integration testing is used to verify that a diagnostic integrates well into the ESMValTool framework and provides the expected results. This is verified by comparing the results against a set of reference data generated during the implementation of the diagnostic.
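A unit test for a small code unit might look as follows; `area_weighted_mean` is a hypothetical example function, not code taken from the ESMValTool:

```python
import unittest
from math import cos, radians

def area_weighted_mean(values, lats):
    """Latitude-weighted mean: a typical small unit a diagnostic relies on.

    Hypothetical example function, not from the ESMValTool code base.
    """
    weights = [cos(radians(lat)) for lat in lats]
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

class TestAreaWeightedMean(unittest.TestCase):
    def test_uniform_field(self):
        # A constant field must average to that constant, whatever the weights
        self.assertAlmostEqual(
            area_weighted_mean([3.0, 3.0, 3.0], [0.0, 45.0, 60.0]), 3.0)

    def test_equator_dominates(self):
        # cos-latitude weighting gives equatorial points more influence
        self.assertGreater(area_weighted_mean([10.0, 0.0], [0.0, 80.0]), 5.0)

# Run with: python -m unittest (or via a nose/nosetests test collector)
```

Such tests run quickly and catch regressions in the building blocks before integration testing compares a full diagnostic run against reference data.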

18 Automated testing: Nosetests (Python)
To test the results of a diagnostic, a special namelist file is executed by the ESMValTool which runs the diagnostic on a limited set of test data. A small test data set is chosen to minimize the execution time of the tests while still ensuring that the diagnostic produces the correct results. The following general tests are implemented at the moment:
- File availability check: verifies that all required output data have been successfully generated by the diagnostic.
- File checksum: currently the MD5 checksum is used to verify that the contents of two files are identical.
- Graphics check (planned extension): for graphics files, an additional test will be implemented which verifies that two graphical outputs are identical. This is particularly useful to verify that the outputs of a diagnostic remain the same after code changes.
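The checksum comparison can be implemented entirely with Python's standard library; the following is a minimal sketch of the idea, not the ESMValTool's own test code:

```python
import hashlib

def md5sum(path):
    """Return the MD5 checksum of a file, reading it in chunks."""
    digest = hashlib.md5()
    with open(path, "rb") as handle:
        for chunk in iter(lambda: handle.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def files_identical(path_a, path_b):
    """Compare diagnostic output against reference data via checksums."""
    return md5sum(path_a) == md5sum(path_b)
```

An integration test would call `files_identical` on each output file of the test run and its stored reference counterpart.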

19 Log-files
Aim: transparency and reproducibility. Log-files contain:
- Creation date
- Host and user
- Version number of the ESMValTool
- List of namelists/diagnostics run
- Variables and models processed
- List of all model files that have been used, including full pathnames and tracking IDs (read from the metadata if available)
- Patches applied to the model data (if any)
- List of all observations used, including references
- Contributing authors and acknowledgement of projects
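Writing such a provenance record needs only the standard library. The sketch below follows the spirit of the list above; the exact field layout is an illustrative assumption, not the ESMValTool's actual log format:

```python
import datetime
import getpass
import socket

def write_log(filename, version, namelists, model_files, observations):
    """Write a provenance log covering the fields listed above (illustrative)."""
    try:
        user = getpass.getuser()
    except Exception:
        user = "unknown"
    with open(filename, "w") as log:
        log.write("Creation date: %s\n" % datetime.datetime.now().isoformat())
        log.write("Host/user: %s/%s\n" % (socket.gethostname(), user))
        log.write("ESMValTool version: %s\n" % version)
        log.write("Namelists run: %s\n" % ", ".join(namelists))
        log.write("Model files used (full pathnames):\n")
        for model_file in model_files:
            log.write("  %s\n" % model_file)
        log.write("Observations used (incl. references):\n")
        for obs in observations:
            log.write("  %s\n" % obs)
```

Recording full pathnames and tracking IDs in this way is what makes a published figure reproducible from the log alone.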

20 Earth System Grid Federation (ESGF) Coupling: Overview
Aim: integration of the ESMValTool with the ESGF. In practice this means the ESMValTool is:
- aware of the ESGF directory structure
- able to use the "Solr" index search to locate remote datasets
- helping the user/administrator download remote datasets

21 ESGF Coupling: What has been achieved while the ESGF was offline
The ESMValTool can read files from the ESGF node cache:
- New ESGF-enabled project classes locate datasets in the node cache, so there is no need to specify a data directory
- The path structure of each ESGF node cache is described in a new, XML-based ESGF config file
- The structure of the node cache differs slightly between ESGF nodes; the ESMValTool takes this into account so that namelists are easily ported between ESGF sites
- The new ESGF project classes and the config file are fully documented

22 ESGF Coupling: What is envisaged once the ESGF is back online
The ESMValTool will find remote files using the "Solr" index search:
- If a dataset is not in the ESGF node cache, or there is no access to the ESGF node cache, the ESMValTool can search a custom local cache
- If a dataset is not available locally, the ESMValTool will perform a Solr index search on the ESGF using the information provided in the model line
The ESMValTool will assist with the download of remote files:
- If a remote dataset is found, the ESMValTool will provide the URLs of the required files so that the user/administrator can download them
- It may be possible to provide the user with a text file containing configuration information for Synda, making downloading the files even easier
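Building such a search request is a matter of translating the model-line fields into query parameters of the public ESGF search API (the `esg-search` endpoint backed by Solr). The sketch below is illustrative: the node URL and the particular set of facets are assumptions, not the ESMValTool's actual query code:

```python
try:
    from urllib.parse import urlencode   # Python 3
except ImportError:
    from urllib import urlencode         # Python 2, as used by the tool

def build_search_url(node, project, model, experiment, ensemble, variable):
    """Construct an esg-search query URL for locating remote files."""
    params = {
        "project": project,
        "model": model,
        "experiment": experiment,
        "ensemble": ensemble,
        "variable": variable,
        "type": "File",                        # search file records, not datasets
        "format": "application/solr+json",     # machine-readable response
        "limit": 100,
    }
    # Sort for a deterministic query string
    return "https://%s/esg-search/search?%s" % (node, urlencode(sorted(params.items())))
```

The JSON response lists matching file records with their download URLs, which the tool could then hand to the user/administrator (or to Synda) for retrieval.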

23 http://www.pa.op.dlr.de/ESMValTool/index.html
Eyring et al., Geosci. Model Dev. Discuss., 8, 7541–7661, 2015
Coming soon!

