Thermospheric General Circulation Models (TGCMs)
Brief History of Software Development
Current State of the Codes
Software Challenges and Development Goals
Ben Foster, NCAR/HAO
August 17, 2010
Three-Dimensional, Time-Dependent Numeric Simulation Models of the Earth's Neutral Upper Atmosphere and Ionosphere

Model Lineage
TGCM [1970s], TIGCM [1989], TIEGCM [1991], TIMEGCM [1990s to present], TIEGCM [1990s to present]
Principal developers: R.G. Roble, R.E. Dickinson, E.C. Ridley
Community model, open source (CCMC)
Related models and frameworks: WACCM, CAM, MOZART, CMIT, LFM

Software Milestones
Rewrote legacy code in standard-compliant f90 and ported it to a variety of platforms.
1-d, then 2-d domain decomposition with MPI (see the sketch below).
NetCDF history file I/O.
Coupled with CCM at the lower boundary (TIMEGCM).
Double resolution (2.5 deg, 4 grid points per scale height).
Mars and Venus adaptations (Bougher et al.).
Coupled with LFM at CISM (TIEGCM).
Community release of TIEGCM to the CCMC (open source with academic license agreement).
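The 2-d decomposition above assigns each MPI task a rectangular block of the global latitude x longitude grid. The following is a minimal sketch of that index arithmetic only, written in illustrative Python rather than the model's f90; the task counts and function name are invented, and the halo points a real finite-difference code would carry are omitted.

```python
# Sketch of a 2-d (lat x lon) block decomposition across MPI tasks.
# Grid sizes correspond to the 5-degree configuration (36 lat x 72 lon);
# function and variable names are illustrative, not from the TGCM code.

def decompose(nlat=36, nlon=72, ntask_lat=4, ntask_lon=8):
    """Return the (lat, lon) index ranges owned by each of the
    ntask_lat * ntask_lon tasks, distributing any remainder points
    among the first tasks."""
    subdomains = {}
    task = 0
    for jt in range(ntask_lat):
        lat0 = jt * (nlat // ntask_lat) + min(jt, nlat % ntask_lat)
        lat1 = lat0 + nlat // ntask_lat + (1 if jt < nlat % ntask_lat else 0)
        for it in range(ntask_lon):
            lon0 = it * (nlon // ntask_lon) + min(it, nlon % ntask_lon)
            lon1 = lon0 + nlon // ntask_lon + (1 if it < nlon % ntask_lon else 0)
            subdomains[task] = ((lat0, lat1), (lon0, lon1))
            task += 1
    return subdomains

if __name__ == "__main__":
    # 32 tasks on the 5-degree grid: print each task's owned index ranges.
    for task, (lats, lons) in decompose().items():
        print(f"task {task:3d}: lats {lats}, lons {lons}")
```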

Current Model Configuration and Capabilities

TIEGCM
Approx 56K lines of f90, 84 source files under SVN control
Lower boundary at 97 km
5 deg, dz=0.5 (72K cells), or 2.5 deg, dz=0.25 (580K cells)
1-5 minute timestep
MPI only, lat x lon decomposition, geographic and magnetic grids
NetCDF history, restart, and diagnostic output files (a minimal read sketch follows below)
Platforms: IBM/AIX, Linux desktop, SGI
Fields: T, U, V, O2, O, N4S, NO, O+, N2D, TI, TE, NE, O2+, W, Z, POTEN

TIMEGCM
Approx 75K lines of f90, 107 source files under SVN control
Lower boundary at 30 km
5 deg, dz=0.5 (124K cells), or 2.5 deg, dz=0.25 (995K cells)
0.5-3 minute timestep
MPI only, lat x lon decomposition, geographic and magnetic grids
NetCDF history, restart, and diagnostic output files
Platforms: IBM/AIX, Linux desktop, SGI
Fields: T, U, V, O2, OX, N4S, NOZ, CO, CO2, H2O, H2, HOX, O+, CH4, O21D, NO2, NO, O3, O, OH, HO2, H, N2D, TI, TE, NE, O2+, W, Z, POTEN

Infrastructure support: csh and perl scripts, IDL and f90 post-processing and visualization
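Because the history files above are standard NetCDF, they can be opened with any NetCDF tool. Below is a minimal, hedged read sketch in Python: the file name is hypothetical, and the exact variable names (e.g., TN vs. T for neutral temperature) should be verified against a real history file with ncdump -h.

```python
# Minimal sketch of inspecting a TIEGCM/TIMEGCM NetCDF history file.
# The file name is hypothetical; verify actual variable names with "ncdump -h".
from netCDF4 import Dataset

with Dataset("tiegcm_history.nc") as nc:
    print("dimensions:", {name: len(dim) for name, dim in nc.dimensions.items()})
    print("variables: ", sorted(nc.variables))

    # Try a couple of candidate names for neutral temperature.
    for name in ("TN", "T"):
        if name in nc.variables:
            field = nc.variables[name][:]   # typically (time, lev, lat, lon)
            units = getattr(nc.variables[name], "units", "unknown")
            print(f"{name}: shape {field.shape}, units {units}")
            break
```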

Software Issues and Challenges: Model/Regime Coupling
Loose coupling: separate executables exchanging data on disk (GSWM in the TGCMs); a toy sketch of this pattern follows below.
Tight coupling: separate executables under control of a mediary (TIEGCM in CMIT).
Host model incorporation: single executable, new module (Heelis in TIEGCM).
Transfer of regime physics and dynamics into the software structure of a host model (TIMEGCM to WACCM).
Disparate temporal scales and spatial geometries (TIEGCM in CMIT).
Regime boundary transitions (spatial, chemical, dynamic).
Efficient use of the hardware (e.g., load balancing, memory structure).
Seamless in science, separate in software.
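To make the loose-coupling idea concrete, here is a toy sketch only, not the GSWM, CMIT, or TGCM interfaces: two components, which in practice would be separate executables, exchange a lower-boundary field through a file on disk each coupling interval. All file, function, and field names are invented.

```python
# Toy illustration of loose coupling: two components exchange a lower-boundary
# field through a file on disk each coupling step. Names are invented.
import numpy as np

BOUNDARY_FILE = "lbc_exchange.npz"

def driver_writes_boundary(step, nlat=36, nlon=72):
    """Component A (e.g., a tidal/wave model) writes its output field to disk."""
    field = np.sin(np.linspace(0.0, 2.0 * np.pi, nlon))[None, :] * np.ones((nlat, 1))
    np.savez(BOUNDARY_FILE, step=step, lbc=field)

def gcm_reads_boundary():
    """Component B (e.g., a TGCM) reads the field as its lower-boundary forcing."""
    data = np.load(BOUNDARY_FILE)
    return int(data["step"]), data["lbc"]

for step in range(3):   # coupling loop; in practice two separate executables
    driver_writes_boundary(step)
    got_step, lbc = gcm_reads_boundary()
    print(f"coupling step {got_step}: lower-boundary field shape {lbc.shape}")
```

Tight coupling replaces the disk exchange with in-memory transfers under a coupler's control, trading this simplicity for performance and tighter synchronization between components.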

Software Issues and Challenges: Parallelization and Performance
Distributed and/or shared memory (MPI, PVM, or OpenMP).
Domain decomposition (TGCMs: lat x lon in geographic and magnetic coordinates).
Scaling (the TGCMs scale only to 32 or 64 processors, due to the serial dynamo).
Amdahl's law: speedup on N processors = 1 / ((1 - P) + P/N), where P is the parallel fraction; the maximum speedup as N grows is 1 / (1 - P). A code that is 3% serial can never achieve better than about 33x speedup, so the serial fraction must be driven as close to zero as possible (see the sketch below).
Hardware architecture is trending toward large numbers of commodity chips and away from small numbers of specialized vector processors. The software must be able to scale up in this environment. GPUs are also worth considering, but they have limitations as well.
Significant software effort is required to optimize a problem for different hardware configurations while maintaining portability.
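A quick numerical check of the Amdahl's-law point above, in illustrative Python (not part of the model code):

```python
# Amdahl's law: speedup(N) = 1 / ((1 - P) + P / N), with P = parallel fraction.
# Shows why a 3% serial fraction caps speedup near 33x regardless of processor count.

def amdahl_speedup(parallel_fraction, nprocs):
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / nprocs)

P = 0.97  # 97% of the runtime parallelizes; 3% (e.g., a serial dynamo) does not
for n in (16, 32, 64, 256, 1024):
    print(f"{n:5d} processors: speedup = {amdahl_speedup(P, n):6.1f}x")
print(f"limit as N -> infinity: {1.0 / (1.0 - P):.1f}x")
```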

Figure: Speedup curve (top) for 5-day runs of TIMEGCM at 2.5-deg resolution, and the percentage of wall-clock time spent in the serial dynamo (bottom), demonstrating how Amdahl's law limits scalability and the importance of fully parallel code.

Software Issues and Challenges: Data Assimilation and Data Drivers
Observations are used to provide initial conditions or to dynamically drive global models, but they are inherently limited in space and time.
Measurements of, or proxies for, physical properties guide the primitive equations and parameterizations used to advance the simulation.
Integration of observational data with the background state of numerical models.
TIMEGCM optionally uses NCEP and ECMWF data for lower boundary conditions, and solar flux and solar wind data to drive ionospheric dynamics (a sketch of time-interpolating such drivers follows below).

Model Benchmarking and Community Support
Use SVN to provide timely updates, bug fixes, and developer team coordination.
Standard control runs per revision provide restart files and benchmark simulations (TIEGCM1.93 benchmark runs).
Providing model source, supporting scripts, data files, documentation, and analysis tools to the scientific community requires a considerable ongoing personnel effort.
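A data driver like those above ultimately has to be sampled at the model timestep. The following is a minimal sketch under assumed inputs: the 3-hourly cadence, index values, and names are invented for illustration and do not represent the actual TIMEGCM input format.

```python
# Sketch of time-interpolating an externally supplied driver (e.g., a 3-hourly
# geomagnetic or solar index) onto the model timestep. Cadence, values, and
# names are illustrative only.
import numpy as np

driver_times = np.arange(0.0, 24.0, 3.0)                    # hours, 3-hourly cadence
driver_vals  = np.array([2., 3., 5., 4., 3., 2., 2., 1.])   # e.g., a Kp-like index

def driver_at(t_hours):
    """Linear interpolation of the driver to an arbitrary model time (hours)."""
    return np.interp(t_hours, driver_times, driver_vals)

dt_minutes = 2.0                                 # a typical TGCM timestep
model_times = np.arange(0.0, 24.0, dt_minutes / 60.0)
forcing = driver_at(model_times)
print(f"{len(model_times)} timesteps, forcing range {forcing.min()}..{forcing.max()}")
```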

Software Issues and Challenges: High-Resolution Modeling
High spatial and temporal resolution is required to resolve sub-grid-scale ionospheric variability, gravity waves, thermal tides, and auroral processes.
Increasing resolution requires tuning of parameterizations.
Implications for performance, memory scalability, storage, and post-processing and analysis.
Parallel I/O: reorganize memory for efficient I/O (a sketch of the gather pattern follows below).

Libraries vs. Source Code
Keep the number of large library dependencies down (TGCMs: netCDF, MPI, ESMF).
Incorporate source code for targeted purposes (e.g., FFT); this raises portability and maintenance issues.
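The memory reorganization mentioned above usually means assembling each task's subdomain into a contiguous global buffer (or handing decomposed data directly to a parallel I/O library) before a history record is written. Below is a minimal serial sketch of the gather step, with invented sizes and no actual MPI calls.

```python
# Sketch of reassembling per-task lat x lon subdomains into one contiguous
# global array before writing a history record. Sizes are invented; a real
# run would use MPI gathers or a parallel NetCDF layer instead of this loop.
import numpy as np

nlat, nlon = 36, 72
ntask_lat, ntask_lon = 4, 8
global_field = np.empty((nlat, nlon))

# Pretend each "task" owns a block and has computed its local field.
for jt in range(ntask_lat):
    lat0, lat1 = jt * nlat // ntask_lat, (jt + 1) * nlat // ntask_lat
    for it in range(ntask_lon):
        lon0, lon1 = it * nlon // ntask_lon, (it + 1) * nlon // ntask_lon
        local = np.full((lat1 - lat0, lon1 - lon0), fill_value=jt * ntask_lon + it)
        # The copy into the global buffer is the "memory reorganization" step.
        global_field[lat0:lat1, lon0:lon1] = local

print("global field ready for output:", global_field.shape)
```

In a production run the per-task blocks would arrive via MPI gathers or be written collectively through a parallel I/O library rather than copied in a serial loop.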

WACCM-X neutral species density above about 100 km is made possible through transfer of TIMEGCM physics, dynamics and chemistry into the WACCM software structure.

WACCM-X ionosphere is made possible through careful transfer of TIMEGCM physics, dynamics and chemistry into the WACCM software structure.

Figure (WACCM-X, MSIS, and TIEGCM compared at approximately 125 km and 300 km): Wave structure of thermal tides in the lower thermosphere is resolved in WACCM-X by introduction of TIEGCM/TIMEGCM physics and dynamics of the MLT system.