
MSC Computing Update
Angèle Simard
Canadian Meteorological Centre, Meteorological Service of Canada

2 Canadian Meteorological Centre
• National Meteorological Centre for Canada
• Provides services to National Defence and emergency organizations
• International WMO commitments (emergency response, telecommunications, data, etc.)
• Supercomputer used for NWP, climate, air quality, etc.; operations and R&D

3 Outline
• Current Status
• Supercomputer Procurement
• Conversion Experience
• Scientific Direction

Current Status

5 MSC’s Supercomputing History


7 Supercomputers: NEC
• 10 nodes
• 80 PEs
• 640 GB memory
• 2 TB disks
• 640 Gflops (peak)

8 Archive Statistics

9 Automated Archiving System
• 22,000 tape mounts/month
• 11 TB growth/month
• maximum of 214 TB

Supercomputer Procurement

11 RFP/Contract Process
• Competitive process
• Formal RFP released in mid-2002
• Bidders needed to meet all mandatory requirements to advance to the next phase
• Bids above the funding envelope were rejected
• Bidders were rated on performance (90%) and price (10%)
• IBM came first
• Live preliminary benchmark at IBM facility
• Contract awarded to IBM in November 2002

12 IBM Committed Sustained Performance in MSC TFlops

13 On-going Support
• The systems must achieve a minimum of 99% availability.
• Remedial maintenance 24/7: 30-minute response time between 8 A.M. and 5 P.M. on weekdays; one-hour response outside those periods.
• Preventive maintenance or engineering changes: maximum of eight (8) hours a month, in blocks of time not exceeding two (2) or four (4) hours per day (subject to certain conditions).
• Software support: provision of emergency and non-emergency assistance.
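As a rough illustration of what the 99% availability target implies, the sketch below converts an availability fraction into a monthly downtime budget. The 30-day month and the handling of scheduled maintenance are assumptions; the contract's exact measurement rules are not given here.

```python
# Rough downtime budget implied by an availability target (illustrative only;
# the contract's measurement window and exclusions are not specified here).

def allowed_downtime_hours(availability: float, period_hours: float = 30 * 24) -> float:
    """Maximum downtime, in hours, consistent with a given availability fraction."""
    return (1.0 - availability) * period_hours

if __name__ == "__main__":
    budget = allowed_downtime_hours(0.99)  # 99% over an assumed 30-day month
    print(f"99% availability allows about {budget:.1f} h of downtime per month")
    # ~7.2 h/month, the same order as the 8 h/month cap quoted for
    # preventive maintenance on this slide.
```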


15 Supercomputers: Now and Then…
• NEC: 10 nodes, 80 PEs, 640 GB memory, 2 TB disks, 640 Gflops (peak)
• IBM: 112 nodes, 928 PEs, 2.19 TB memory, 15 TB disks, Tflops (peak)

Conversion Experience

17 Transition to IBM
Prior to conversion:
• Developed and tested code on multiple platforms (MPP and vector)
• Where possible, avoided proprietary tools
• When required, hid proprietary routines under a local API to centralize changes
• Following contract award, obtained early access to the IBM (thanks to NCAR)
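The "hide proprietary routines under a local API" point is a general portability pattern. The sketch below only illustrates that pattern in Python with invented names; MSC's operational codes are Fortran, and the vendor calls shown as comments are hypothetical.

```python
# Minimal sketch of hiding vendor-specific routines behind a local API so
# that only this one module changes when the platform changes. All names
# here are hypothetical; MSC's actual codes are Fortran, not Python.
import platform
import numpy as np

def _fft_portable(data):
    """Portable reference implementation used on any platform."""
    return np.fft.fft(data)

def local_fft(data):
    """Site-local FFT entry point: callers never see the vendor library."""
    system = platform.uname().system
    if system == "SUPER-UX":            # NEC SX series (hypothetical branch)
        # return _nec_vendor_fft(data)  # vendor routine, not shown
        pass
    elif system == "AIX":               # IBM pSeries (hypothetical branch)
        # return _ibm_essl_fft(data)    # vendor routine, not shown
        pass
    return _fft_portable(data)          # fallback keeps results reproducible
```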

18 Transition to IBM
Application Conversion Plan:
• Plan was made to comply with a tight schedule
• Key element: a rapid implementation of an operational “core” (to test mechanics)
• Phased approach for optimized versions of the models
• End-to-end testing scheduled to begin mid-September

19 Scalability Rule of Thumb
• To obtain the required wall-clock timings, roughly 4 times more IBM processors are needed to match the performance of the NEC SX-6

20 Preliminary Results: timing (min) and configuration, NEC vs. IBM
• GEMDM regional, 48-hour: NEC MPI; IBM 8 MPI x 4 OpenMP (32 total)
• GEMDM global, 240-hour: NEC MPI; IBM 4 MPI x 4 OpenMP (16 total)
• 3D-Var (R1), without AMSU-B: NEC CPU; IBM OpenMP
• 3D-Var (G2), with AMSU-B: NEC CPU; IBM OpenMP
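A small arithmetic sketch tying the 4x rule of thumb on slide 19 to the hybrid decompositions listed above. The mapping back to NEC SX-6 processor counts is a rough estimate for illustration, not a benchmark result.

```python
# Illustrative arithmetic: total processors used by the hybrid MPI x OpenMP
# jobs above, and the NEC SX-6 PE count they would roughly correspond to
# under the "4 times more IBM processors" rule of thumb.

def total_pes(mpi_tasks: int, omp_threads: int) -> int:
    """Total processors for a hybrid MPI + OpenMP job."""
    return mpi_tasks * omp_threads

ibm_jobs = {
    "GEMDM regional (48 h)": (8, 4),  # 8 MPI tasks x 4 OpenMP threads
    "GEMDM global (240 h)":  (4, 4),  # 4 MPI tasks x 4 OpenMP threads
}

for name, (mpi, omp) in ibm_jobs.items():
    ibm = total_pes(mpi, omp)
    print(f"{name}: {mpi} MPI x {omp} OpenMP = {ibm} IBM PEs "
          f"(~{ibm // 4} NEC SX-6 PEs under the 4x rule of thumb)")
```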

SCIENTIFIC DIRECTIONS

22 Global System
Now:
• Uniform resolution of 100 km (400 x 200 x 28)
• 3D-Var at T108 on model levels, 6-hr cycle
• Use of raw radiances from AMSU-A, AMSU-B and GOES
2004:
• Resolution to 35 km (800 x 600 x 80)
• Top at 0.1 hPa (instead of 10 hPa), with additional AMSU-A and AMSU-B channels
• 4D-Var assimilation, 6-hr time window, with 3 outer loops at full model resolution and inner loops at T108 (CPU cost equivalent to a 5-day forecast of the full-resolution model)
• New datasets: profilers, MODIS winds, QuikSCAT
2005+:
• Additional datasets (AIRS, MSG, MTSAT, IASI, GIFTS, COSMIC)
• Improved data assimilation
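To give a sense of scale for the 2004 upgrade, the sketch below compares the grid-point counts of the two global configurations. It is a back-of-the-envelope measure only: it ignores the shorter time step, the deeper model top and the extra cost of 4D-Var, so it understates the real increase in work.

```python
# Back-of-the-envelope size comparison of the current and planned global grids.

def grid_points(nx: int, ny: int, nz: int) -> int:
    return nx * ny * nz

current = grid_points(400, 200, 28)   # 100 km, 28 levels
planned = grid_points(800, 600, 80)   # 35 km, 80 levels

print(f"current grid: {current:,} points")                          # 2,240,000
print(f"2004 grid   : {planned:,} points")                          # 38,400,000
print(f"ratio       : {planned / current:.1f}x more grid points")   # ~17.1x
```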

23 Regional System
Now:
• Variable resolution, uniform region at 24 km (354 x 415 x 28)
• 3D-Var assimilation on model levels at T108, 12-hr spin-up
2004:
• Resolution to 15 km in the uniform region (576 x 641 x 58)
• Inclusion of AMSU-B and GOES data in the assimilation cycle
• Four model runs a day (instead of two)
• New datasets: profilers, MODIS winds, QuikSCAT
2006:
• Limited-area model at 10 km resolution (800 x 800 x 60)
• LAM 4D-Var data assimilation
• Assimilation of radar data

24 Ensemble Prediction System
Now:
• 16-member global system (300 x 150 x 28)
• Forecasts up to 10 days, once a day at 00Z
• Optimal Interpolation assimilation system, 6-hr cycle, use of derived radiance data (SATEMs)
2004:
• Ensemble Kalman filter assimilation system, 6-hr cycle, use of raw radiances from AMSU-A, AMSU-B and GOES
• Two forecast runs per day (12Z run added)
• Forecasts extended to 15 days (instead of 10)
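The 2004 plan replaces Optimal Interpolation with an ensemble Kalman filter. The sketch below is a toy, perturbed-observation EnKF analysis step on a small random problem, intended only to show the basic update; the operational MSC system adds covariance localization, model-error treatment, radiance observation operators and far larger dimensions.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_analysis(Xf, y, H, R):
    """One stochastic (perturbed-observation) EnKF analysis step.

    Xf : (n, N) forecast ensemble (n state variables, N members)
    y  : (p,)   observations
    H  : (p, n) linear observation operator
    R  : (p, p) observation-error covariance
    """
    N = Xf.shape[1]
    Xp = Xf - Xf.mean(axis=1, keepdims=True)        # ensemble perturbations
    Pf = Xp @ Xp.T / (N - 1)                        # sample forecast covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)  # Kalman gain
    # Perturb the observations so the analysis ensemble keeps realistic spread
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=N).T
    return Xf + K @ (Y - H @ Xf)

# Toy demo: 40-variable state, 16 members (as in the current global EPS),
# observing every other variable with error variance 0.5.
n, N = 40, 16
truth = rng.standard_normal(n)
Xf = truth[:, None] + rng.standard_normal((n, N))   # crude "forecast" ensemble
H = np.eye(n)[::2]
R = 0.5 * np.eye(H.shape[0])
y = H @ truth + rng.multivariate_normal(np.zeros(H.shape[0]), R)
Xa = enkf_analysis(Xf, y, H, R)
print("ensemble-mean RMSE before:", np.sqrt(np.mean((Xf.mean(1) - truth) ** 2)))
print("ensemble-mean RMSE after :", np.sqrt(np.mean((Xa.mean(1) - truth) ** 2)))
```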

25 …Ensemble Prediction System
2005:
• Resolution increased to 100 km (400 x 200 x 58)
• Members increased to 32
• Additional datasets, as in the global deterministic system
2007:
• Prototype regional ensemble
• 10-member LAM (500 x 500 x 58)
• No distinct data assimilation; initial and boundary conditions from the global EPS

26 Mesoscale System
Now:
• Variable resolution, uniform region at 10 km (290 x 371 x 35)
• Two windows; no data assimilation
2004:
• Prototype limited-area model at 2.5 km (500 x 500 x 45) over one area
2005:
• Five limited-area model windows at 2.5 km (500 x 500 x 45)
2008:
• 4D data assimilation

27 Coupled Models
Today:
• In R&D: coupled atmosphere, ocean, ice, wave, hydrology, biology and chemistry
• In production: storm surge, wave
Future:
• Global coupled model for both prediction and data assimilation

28 Estuary & Ice Circulation

29 Merchant Ship Routing
[Figure: power vs. time, southern route vs. northern route, “…2 days…”]

30 Challenges
• IT security
• Accelerated storage needs
• Replacement of front-end and storage infrastructure