Presentation transcript:

Petascale
– LLNL Appro AMD: 9K processors [today]
– TJ Watson Blue Gene/L: 40K processors [today]
– NY Blue Gene/L: 32K processors
– ORNL Cray XT3/4: 44K processors [Jan 2008]
– TACC Sun: 55K processors [Jan 2008]
– ANL Blue Gene/P: 160K processors [Jan 2008]

CCSM and Component Models
– POP (Ocean)
– CICE (Sea Ice)
– CLM (Land Model)
– CPL (Coupler) (see the communicator sketch below)
– CAM (Atmosphere)
– CCSM
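The slide above only names the components; how they are stitched together is not shown in the transcript. As a purely illustrative sketch (not the actual CCSM driver or coupler code), the snippet below shows the common MPI idiom of splitting MPI_COMM_WORLD into per-component communicators so that the atmosphere, ocean, ice, land, and coupler can each run concurrently on their own subset of processors. The rank fractions assigned to each component are made-up placeholders.

```c
/* Hypothetical sketch: assign MPI ranks to CCSM-like components and give
 * each component its own communicator.  Rank ranges are illustrative only. */
#include <mpi.h>
#include <stdio.h>

enum { ATM = 0, OCN = 1, ICE = 2, LND = 3, CPL = 4 };

int main(int argc, char **argv)
{
    int world_rank, world_size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &world_rank);
    MPI_Comm_size(MPI_COMM_WORLD, &world_size);

    /* Made-up layout: first 50% of ranks to CAM, next 25% to POP,
     * then 10% CICE, 10% CLM, and the remainder to the coupler. */
    int color;
    double f = (double)world_rank / world_size;
    if      (f < 0.50) color = ATM;
    else if (f < 0.75) color = OCN;
    else if (f < 0.85) color = ICE;
    else if (f < 0.95) color = LND;
    else               color = CPL;

    MPI_Comm comp_comm;                       /* communicator for this component */
    MPI_Comm_split(MPI_COMM_WORLD, color, world_rank, &comp_comm);

    int comp_rank, comp_size;
    MPI_Comm_rank(comp_comm, &comp_rank);
    MPI_Comm_size(comp_comm, &comp_size);
    printf("world rank %d -> component %d, local rank %d of %d\n",
           world_rank, color, comp_rank, comp_size);

    /* Each component would now time-step on comp_comm and exchange
     * boundary fields with the coupler ranks. */
    MPI_Comm_free(&comp_comm);
    MPI_Finalize();
    return 0;
}
```

In practice the components would exchange surface fluxes and boundary states through the coupler ranks rather than printing as this toy program does.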

Status of POP (John Dennis)
– 17K Cray XT4 processors [12.5 years/day]
– 29K IBM Blue Gene/L processors [8.5 years/day] (BG Ready in Expedition Mode)
– Parallel I/O [Underway]
– Land causes load imbalance at 0.1-degree resolution (see the block-elimination sketch below)
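The last bullet refers to the fact that at 0.1° many of the horizontal blocks in the ocean decomposition contain only land and therefore contribute no ocean work. A common mitigation in POP-style decompositions is to discard all-land blocks before handing blocks out to tasks; the snippet below is a minimal, hypothetical illustration of that idea, assuming a precomputed count of wet (ocean) points per block. The block and task counts are toy values.

```c
/* Hypothetical sketch: eliminate all-land blocks before assigning blocks
 * to MPI tasks, so no task is left holding only idle (land) blocks.
 * ocean_points[b] is assumed to hold the number of wet points in block b. */
#include <stdio.h>

#define NBLOCKS 16   /* illustrative block count */
#define NTASKS  4    /* illustrative task count  */

int main(void)
{
    /* Toy mask: several blocks are entirely land (0 wet points). */
    int ocean_points[NBLOCKS] =
        { 120, 0, 98, 0, 0, 110, 87, 0, 64, 0, 130, 55, 0, 72, 0, 91 };

    int owner[NBLOCKS];
    int active = 0;

    for (int b = 0; b < NBLOCKS; ++b) {
        if (ocean_points[b] == 0) {
            owner[b] = -1;                 /* all-land block: assigned to no one */
        } else {
            owner[b] = active % NTASKS;    /* round-robin over active blocks only */
            ++active;
        }
    }

    for (int b = 0; b < NBLOCKS; ++b)
        printf("block %2d -> task %d\n", b, owner[b]);

    /* Without the elimination step, tasks that own mostly land blocks finish
     * early every time step and wait, which is the imbalance noted above. */
    return 0;
}
```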

Status of CAM (John Dennis)
– CAM HOMME in Expedition Mode
– Standard CAM “may be” run at 1-degree resolution or slightly higher on BG

[Figure: simulation rate for HOMME on the Held-Suarez test at 1/2°, 1/3°, and 1/4° resolution]

CAM & CCSM BG/L Expedition not from climate scientists Parallel I/O is the biggest bottleneck

Cloud Resolving Models/LES
– Active Tracer High-resolution Atmospheric Model (ATHAM):
  – modularized
  – parallel-ready (MPI)
– Goddard Cloud Ensemble Model (GCE):
  – well-established (1970s–present)
  – parallel-ready (MPI)
  – scales linearly (99% up to 256 tasks; see the efficiency sketch below)
  – comprehensive
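The "99% up to 256 tasks" figure for GCE reads as a parallel-efficiency number; the small sketch below shows how such a figure is conventionally computed from measured run times (the timings used here are made up purely to reproduce a value near 0.99).

```c
/* Hypothetical sketch: conventional speedup / parallel-efficiency arithmetic.
 * efficiency = T(1) / (p * T(p));  "scales linearly" means efficiency close to 1. */
#include <stdio.h>

static double efficiency(double t1, double tp, int p)
{
    return t1 / (p * tp);
}

int main(void)
{
    double t1   = 2560.0;  /* made-up single-task time (seconds) */
    double t256 = 10.1;    /* made-up 256-task time (seconds)    */

    printf("speedup    = %.1f\n", t1 / t256);                   /* ~253x  */
    printf("efficiency = %.2f\n", efficiency(t1, t256, 256));   /* ~0.99  */
    return 0;
}
```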

Implementations
– Done (NERSC IBM SP, GSFC):
  – ATHAM: 2D & 3D bulk cloud physics
  – GCE: 3D bulk cloud physics; 2D size-bin cloud physics
– Being done & to be done (Blue Gene):
  – GCE (ATHAM): 3D size-bin cloud physics, larger domain, longer simulation period, finer resolution, …

From: John Michalakes, NCAR

Parallelism in WRF: Multi-level Decomposition (slide courtesy: NCAR)
Model domains are decomposed for parallelism on two levels (a hybrid MPI/OpenMP sketch follows below):
– Patch: section of the model domain allocated to a distributed-memory node.
– Tile: section of a patch allocated to a shared-memory processor within a node; this is also the scope of a model-layer subroutine.
Distributed-memory parallelism is over patches; shared-memory parallelism is over tiles within patches.
A single version of the code runs efficiently on:
– distributed-memory machines
– shared-memory machines
– clusters of SMPs
– vector and microprocessors
[Figure: logical domain; one patch divided into multiple tiles; inter-processor communication]
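As a minimal, hypothetical illustration of the patch/tile scheme described above (not WRF's actual source), the sketch below gives each MPI rank one patch of a 1-D decomposed domain and lets OpenMP threads sweep tile-sized chunks within that patch; the halo exchange between patches is only indicated by a comment. Domain and tile sizes are arbitrary.

```c
/* Hypothetical sketch of a WRF-style two-level decomposition:
 * distributed-memory parallelism over patches (MPI ranks),
 * shared-memory parallelism over tiles within a patch (OpenMP threads). */
#include <mpi.h>
#include <omp.h>
#include <stdlib.h>

#define GLOBAL_NY 1024        /* illustrative global domain size */
#define TILE_NY   16          /* illustrative tile height        */

static void tile_compute(double *patch, int j0, int j1, int ny)
{
    /* Stand-in for a model-layer subroutine operating on one tile. */
    for (int j = j0; j < j1 && j < ny; ++j)
        patch[j] += 1.0;
}

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Patch: this rank's contiguous slice of the global domain. */
    int ny = GLOBAL_NY / size;
    double *patch = calloc(ny, sizeof *patch);

    for (int step = 0; step < 10; ++step) {
        /* ...halo exchange with neighbouring patches would go here
         *    (e.g. MPI_Sendrecv of boundary rows)... */

        /* Tiles: OpenMP threads each take tile-sized chunks of the patch. */
        #pragma omp parallel for schedule(dynamic)
        for (int t = 0; t < ny; t += TILE_NY)
            tile_compute(patch, t, t + TILE_NY, ny);
    }

    free(patch);
    MPI_Finalize();
    return 0;
}
```

The same loop structure runs whether the shared-memory level is one processor or a full SMP node, which is how a single source tree covers the machine classes listed on the slide.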

NCAR WRF Issues with Blue Gene/L (from John Michalakes)
– Relatively slow I/O
– Limited memory per node (see the footprint sketch below)
– Relatively poor processor performance
– “Lots of little gotchas, mostly related to immaturity, especially in the programming environment.”
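To make the "limited memory per node" and "slow I/O" points concrete together, the sketch below does the back-of-the-envelope arithmetic for one plausible failure mode: gathering a single global 3-D field onto one task for serial output. All of the grid sizes are assumptions for illustration, not figures from the presentation.

```c
/* Hypothetical illustration: the memory cost of gathering one global field
 * onto a single task for serial I/O.  Grid sizes are assumptions, not
 * figures from the presentation. */
#include <stdio.h>

int main(void)
{
    const long nx = 3600, ny = 2400, nz = 42;  /* assumed high-resolution grid */
    const int  bytes = 8;                      /* double precision             */

    double field_gb = (double)nx * ny * nz * bytes / 1e9;
    printf("one global 3-D field: %.2f GB\n", field_gb);   /* ~2.9 GB */

    /* If each node has well under a gigabyte of memory, no single task can
     * hold the gathered field, so write-from-rank-0 I/O breaks down and a
     * parallel I/O path (each task writing its own slice) becomes necessary. */
    return 0;
}
```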