The PRISM infrastructure for Earth system models
Eric Guilyardi, CGAM/IPSL and the PRISM Team
Background and drivers – PRISM project achievements – The future

The PRISM infrastructure
Why a common software infrastructure? Earth system modelling expertise is widely distributed, both geographically and thematically.

The PRISM infrastructure
Why a common software infrastructure? Earth system modelling expertise is widely distributed:
– Scientific motivation: facilitate sharing of scientific expertise and of models
– Technical motivation: the technical challenges are large compared with the available effort
Need to keep scientific diversity while increasing efficiency, both scientific and technical.
Need for a concerted effort in view of initiatives elsewhere:
– The Frontier Project, Japan
– The Earth System Modelling Framework (ESMF), US

The PRISM infrastructure
The PRISM concept: « Share Earth System Modelling software infrastructure across the community », in order to:
– share development, maintenance and support
– aid performance on a variety of platforms
– standardize the model software environment
– ease the use of different climate model components

The PRISM infrastructure
Expected benefits: high-performance ESM software, developed by dedicated IT experts, available to institutes and teams at low cost:
– helps scientists focus on science
– helps keep scientific diversity (survival of smaller groups)
Easier to assemble ESMs based on community models; a shared infrastructure means increased scientific exchanges.
Computer manufacturers are more inclined to contribute:
– efficiency (porting, optimisation) on a variety of platforms
– next-generation platforms optimized for ESM needs
– easier procurements and benchmarking
– reduced computing costs

The PRISM infrastructure
Software structure of an Earth System Model (diagram): the scientific core is surrounded by the supporting software, the coupling infrastructure and the running environment, shared at the source (i.e. Fortran 90) level.
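To make the layering concrete, here is a toy sketch, not PRISM code: the module and program names, the namelist and the heat-budget formula are invented for illustration. It shows a scientific core written in plain Fortran 90 that knows nothing about its environment, while an outer program plays the role of the supporting software (run control) and running environment.

  module ocean_core                          ! scientific core: pure computation
    implicit none
  contains
    subroutine step_mixed_layer(sst, heat_flux, dt, depth)
      real, intent(inout) :: sst(:,:)        ! sea surface temperature [K]
      real, intent(in)    :: heat_flux(:,:)  ! surface heat flux [W m-2]
      real, intent(in)    :: dt, depth       ! time step [s], layer depth [m]
      ! crude mixed-layer heat budget: dT = Q dt / (rho cp h), rho*cp ~ 4e6 J m-3 K-1
      sst = sst + dt * heat_flux / (4.0e6 * depth)
    end subroutine step_mixed_layer
  end module ocean_core

  program run_ocean                          ! running environment + supporting software
    use ocean_core
    implicit none
    real    :: sst(10,10), flux(10,10), dt, depth
    integer :: n, nsteps, ios
    namelist /run_nml/ dt, depth, nsteps     ! run control read from a namelist file
    dt = 3600.0; depth = 50.0; nsteps = 24   ! defaults if no namelist is found
    open(10, file='run.nml', status='old', action='read', iostat=ios)
    if (ios == 0) then
       read(10, nml=run_nml)
       close(10)
    end if
    sst = 290.0; flux = 100.0
    do n = 1, nsteps
       call step_mixed_layer(sst, flux, dt, depth)
    end do
    print *, 'final mean SST [K]:', sum(sst) / size(sst)
  end program run_ocean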

The PRISM infrastructure
The long-term view (diagram). Today: the modeller and the IT expert build the full Earth System model (science + support + environment) directly on the hardware and Fortran compiler. Tomorrow: a standard support library (including the environment) sits between the hardware/compiler and an Earth System model reduced to its science, so the climate science work concentrates on the science itself. Towards standard ESM support library(ies).

The PRISM infrastructure
The PRISM project: PRogramme for Integrated earth System Modelling
– 22 partners
– 3 years, from Dec 2001 to Nov 2004
– 5 M€ funding under FP5 of the EC (~80 person-years)
– Coordinators: G. Brasseur and G. Komen

The PRISM infrastructure
System specifications. Let's NOT re-invent the wheel!
The science – the community models: atmosphere, atmospheric chemistry, ocean, ocean biogeochemistry, sea-ice, land surface, …
The technical developments – the PRISM infrastructure: general principles, constraints from physical interfaces, …, coupler and I/O, compile/run environment, GUI, visualisation and diagnostics
The modellers/users: requirements, beta testing, feedback

The PRISM infrastructure
System specifications – the people:
Reinhard Budich - MPI, Hamburg
Andrea Carril - INGV, Bologna
Mick Carter - Hadley Centre, Exeter
Patrice Constanza - MPI/M&D, Hamburg
Jérôme Cuny - UCL, Louvain-la-Neuve
Damien Declat - CERFACS, Toulouse
Ralf Döscher - SMHI, Stockholm
Thierry Fichefet - UCL, Louvain-la-Neuve
Marie-Alice Foujols - IPSL, Paris
Veronika Gayler - MPI/M&D, Hamburg
Eric Guilyardi* - CGAM, Reading and LSCE
Rosalyn Hatcher - Hadley Centre, Exeter
Miles Kastowsky - MPI/BGC, Jena
Luis Kornblueh - MPI, Hamburg
Claes Larsson - ECMWF, Reading
Stefanie Legutke - MPI/M&D, Hamburg
Corinne Le Quéré - MPI/BGC, Jena
Angelo Mangili - CSCS, Zurich
Anne de Montety - UCL, Louvain-la-Neuve
Serge Planton - Météo-France, Toulouse
Jan Polcher - LMD/IPSL, Paris
René Redler - NEC CCRLE, Sankt Augustin
Martin Stendel - DMI, Copenhagen
Sophie Valcke - CERFACS, Toulouse
Peter van Velthoven - KNMI, De Bilt
Reiner Vogelsang - SGI, Grasbrunn
Nils Wedi - ECMWF, Reading
* Chair

The PRISM infrastructure
PRISM achievements (so far):
– A well co-ordinated network of expertise
– Community buy-in and trust-building
– Adaptation of community Earth System component models (GCMs) and demonstration coupled configurations
– A software environment (the tool box):
  1. a standard coupler and I/O software, OASIS3 (CERFACS) and OASIS4
  2. a standard compiling environment (SCE) at the scripting level
  3. a standard running environment (SRE) at the scripting level
  4. a Graphical User Interface (GUI) to the SCE (PrepIFS, ECMWF)
  5. a GUI to the SRE for monitoring the coupled model run (SMS, ECMWF)
  6. standard diagnostic and visualisation tools

The PRISM infrastructure
The PRISM shells (diagram): the outer shells (Standard Running Environment, Standard Compile Environment) and the inner shell (PSMILe coupling and I/O, historic I/O) wrap the scientific core.

The PRISM infrastructure
Adapting Earth System Components to PRISM (diagram): levels of adaptation range from using the SCE, the SRE and the user interface to adding the PSMILe (PRISM Model Interface Library) and a PMIOD (Potential Model I/O Description).
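As an illustration of what adapting a component involves, below is a minimal, hedged sketch of a component instrumented with PSMILe calls. The prism_*_proto routine names follow the OASIS3 PSMILe Fortran interface, but the component name 'toyatm', the field names, the grid size, the real kind and the science kernel are invented for illustration; argument details and constants should be checked against the OASIS3 user guide, and the coupling fields must match the namcouple configuration of the run.

  program toyatm
    ! Sketch of a component instrumented for PRISM coupling via PSMILe (OASIS3)
    use mod_prism_proto                  ! PSMILe constants (PRISM_Out, PRISM_In, PRISM_Real, ...)
    use mod_prism_def_partition_proto
    use mod_prism_put_proto
    use mod_prism_get_proto
    implicit none

    integer, parameter :: im = 96, jm = 72             ! illustrative grid size
    integer, parameter :: wp = selected_real_kind(12)  ! assumed double-precision PSMILe build
    integer :: comp_id, local_comm, part_id, ierror
    integer :: var_id_flux, var_id_sst, istep, isec
    integer :: ig_paral(3), var_nodims(2), var_shape(4)
    real(wp) :: heat_flux(im,jm), sst(im,jm)

    ! 1. Initialisation: declare the component and get its local communicator
    call prism_init_comp_proto(comp_id, 'toyatm', ierror)
    call prism_get_localcomm_proto(local_comm, ierror)

    ! 2. Definition phase: a serial partition and two coupling fields
    ig_paral   = (/ 0, 0, im*jm /)       ! serial decomposition of im*jm points
    call prism_def_partition_proto(part_id, ig_paral, ierror)
    var_nodims = (/ 2, 1 /)              ! rank-2 field, one bundle
    var_shape  = (/ 1, im, 1, jm /)      ! local index bounds
    call prism_def_var_proto(var_id_flux, 'HEATFLUX', part_id, var_nodims, &
                             PRISM_Out, var_shape, PRISM_Real, ierror)
    call prism_def_var_proto(var_id_sst,  'SSTOCEAN', part_id, var_nodims, &
                             PRISM_In,  var_shape, PRISM_Real, ierror)
    call prism_enddef_proto(ierror)

    ! 3. Time loop: PSMILe exchanges wrap the untouched scientific core;
    !    the namcouple decides at which dates data are actually transferred
    sst = 290.0_wp
    do istep = 1, 240
       isec = (istep - 1) * 3600                                   ! model time in seconds
       call prism_get_proto(var_id_sst, isec, sst, ierror)         ! receive SST
       call atmosphere_step(sst, heat_flux)                        ! scientific core
       call prism_put_proto(var_id_flux, isec, heat_flux, ierror)  ! send heat flux
    end do

    ! 4. Termination
    call prism_terminate_proto(ierror)

  contains

    subroutine atmosphere_step(sst, heat_flux)
      ! Stand-in for the scientific core: a crude bulk surface heat flux
      real(wp), intent(in)  :: sst(:,:)
      real(wp), intent(out) :: heat_flux(:,:)
      heat_flux = 10.0_wp * (288.0_wp - sst)
    end subroutine atmosphere_step

  end program toyatm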

The PRISM infrastructure
Configuration management and deployment (diagram): user interface, SCE and SRE, transfer driver, binary executables and disks.

The PRISM infrastructure
PRISM GUI remote functionality (diagram): the user reaches the PRISM repositories (CSCS, MPI) over the Internet via the PrepIFS/SMS web services; configurations are deployed through transfer drivers to instrumented sites on architectures A, B and C.

The PRISM infrastructure
Standard scripting environments: the Standard Compiling Environment (SCE) and the Standard Running Environment (SRE).

The PRISM infrastructure
Data processing and visualisation

The PRISM infrastructure
Demonstration experiments (table): assembled coupled models and the platforms on which they were run; CGAM contribution (Jeff Cole).

The PRISM infrastructure
Development coordinators:
– The coupler and I/O: Sophie Valcke (CERFACS)
– The standard environments: Stephanie Legutke (MPI)
– The user interface and web services: Claes Larsson (ECMWF)
– Analysis and visualisation: Mick Carter (Hadley Centre)
– The assembled models: Stephanie Legutke (MPI)
– The demonstration experiments: Andrea Carril (INGV)

The PRISM infrastructure
Community buy-in: growing!
– Workshops and seminars
– 15 pioneer models adapted (with institutes' involvement)
– 9 test supercomputers instrumented
– Models distributed under the PRISM environment (ECHAM5, OPA 9.0)
– Community programmes relying on the PRISM framework (ENSEMBLES, COSMOS, MERSEA, GMES, NERC, …)
To go further:
– PRISM perspective: maintain and develop the tool box
– Institute perspective: get the timing of and involvement in the next steps right

The PRISM infrastructure
Active collaborations:
– ESMF (supporting software, PMIOD, MOM4)
– FLUME (PRISM software)
– PCMDI (visualisation, PMIOD)
– CF group (CF names)
– NERC (BADC & CGAM) (meta-data, PMIOD)
– M&D, MPI (data)
– Earth Simulator (install PRISM system V.0)
PRISM has put Europe in the loop for community-wide convergence on basic standards in ES modelling.

The PRISM infrastructure
PRISM Final Project Meeting, De Bilt, October 7-8, 2004

The PRISM infrastructure
The future: PRISM has delivered a tool box, a network of expertise and demonstrations, and community buy-in is growing.
Key need for sustainability of:
– tool box maintenance and development (new features)
– the network of expertise
PRISM sustained initiative: set-up meeting held in Paris, Oct