
Running the Community Climate System Model (CCSM) at NERSC
Adrianne Middleton
National Center for Atmospheric Research, Boulder, Colorado
CAM T340 - Jim Hack

[Figure: 1870 control run, held at constant 1870 conditions; globally averaged surface temperature (TS) plotted against model years]

[Figure: historical ensemble, members A-E, each branched from the control run (years 360, 380, 400, 420, and 440); globally averaged surface temperature (TS) plotted against years]

CCSM IPCC Graph

Climate of the Last Millennium: Model vs. Observations (Caspar Ammann, NCAR/CGD)

CCSM 3.0 T85 on bassi

- Atmosphere: 128 PEs
- Ocean: 40 PEs
- Coupler: 8 PEs
- Sea Ice: 16 PEs
- Land: 16 PEs

Small number of processors (208 total), but long run time (16 days). We want 5 model years/day, but the queue is generally 3 days long. The model runs roughly 1 model year per 3 hours, or at most 4 years per run slot.
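To see why the 5 years/day target is hard to hit, here is a back-of-the-envelope sketch in Python. The 3 hours/model-year rate, the 4 years/slot limit, and the 3-day queue wait come from the slide; the 100-year campaign length is an illustrative assumption.

```python
# Turnaround estimate for CCSM 3.0 T85 on bassi, using numbers from the
# slide above. TARGET_YEARS is a hypothetical campaign length.

HOURS_PER_MODEL_YEAR = 3.0   # ~1 model year per 3 wall-clock hours
YEARS_PER_SLOT = 4           # max years per run slot (implies ~12 h slots)
QUEUE_WAIT_DAYS = 3.0        # typical wait before each slot starts
TARGET_YEARS = 100           # assumed campaign length

slot_hours = YEARS_PER_SLOT * HOURS_PER_MODEL_YEAR
slots_needed = TARGET_YEARS / YEARS_PER_SLOT
# Each slot costs its compute time plus the typical queue wait.
calendar_days = slots_needed * (QUEUE_WAIT_DAYS + slot_hours / 24.0)

print(f"slots needed:        {slots_needed:.0f}")
print(f"calendar days:       {calendar_days:.0f}")
print(f"model years per day: {TARGET_YEARS / calendar_days:.2f}")
```

With these numbers the effective throughput comes out closer to 1 model year per calendar day than to the 5 years/day target: queue wait, not compute time, dominates turnaround at this processor count.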

HPC dimensions of Climate Prediction:

- Data Assimilation (decadal prediction / initial-value forecasts)
- New Science (new processes and interactions not previously included)
- Better Science (parameterization → explicit model)
- Spatial Resolution (simulate finer details, regions, and transients)
- Ensemble size (quantify statistical properties of the simulation)
- Timescale (length of simulations)

Lawrence Buja (NCAR) / Tim Palmer (ECMWF)

[Figure: HPC dimensions of climate prediction, plotted along spatial resolution (x*y*z), ensemble size, and timescale (years * timestep). Today's terascale climate model: 1.4° (160 km), 100 yr at a 20-min timestep. Petascale Earth System Model: 0.2° (22 km), 1000 yr at a 3-min timestep. Exascale (ESM + multiscale GCRM, with data assimilation and new/better science; AMR or regular km-scale grids): 1000 yr at a still-open timestep. Each jump carries a code-rewrite cost multiplier. Lawrence Buja (NCAR)]
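The resolution jump is what makes the terascale-to-petascale cost multiplier so steep. A rough scaling sketch follows (my arithmetic, not from the slide; it assumes cost scales with the two horizontal grid dimensions times the number of timesteps, and holds vertical levels, ensemble size, and I/O fixed):

```python
# Rough cost multiplier for the terascale -> petascale step in the figure.
# Assumption (not from the slide): cost ~ horizontal grid points x timesteps.

res_tera, res_peta = 1.4, 0.2        # grid spacing in degrees
dt_tera, dt_peta = 20.0, 3.0         # timestep in minutes
yrs_tera, yrs_peta = 100, 1000       # simulated years

horizontal = (res_tera / res_peta) ** 2   # ~49x more grid columns
steps = dt_tera / dt_peta                 # ~6.7x more steps per model year
length = yrs_peta / yrs_tera              # 10x longer simulation

print(f"cost per simulated year: ~{horizontal * steps:,.0f}x")        # ~327x
print(f"total campaign cost:     ~{horizontal * steps * length:,.0f}x")
```

The product is a factor of several hundred per simulated year, and a few thousand for the whole campaign, which is roughly the terascale-to-petascale gap the figure implies.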

[Figure: science targets across scales and machines. Spatial scales: global general circulation; continental large-scale flow; regional/local. Science targets: paleoclimate and BGC/carbon-cycle spin-ups; MJO convergence; resolving hurricanes. Compute milestones run from terascale (TF) IPCC AR simulations to the petascale (PF) CCSM Grand Challenge.]

Community Climate System Model (CCSM)

Current Configuration:
- Hub-and-spoke design with 5 executables (sketched in the toy example below)
- Components exchange boundary information through the coupler
- Each code is quite large: many thousands of lines per code
- Need 5 simulated years/day, so the model must run at "low" resolution
- The standard configuration runs at a scaling sweet spot of O(200) processors

Petascale Configuration:
- Single executable
- 5 years per wall-clock day
- Targeting 10K-120K processors per simulation
- 0.25° (30 km, L66) atmosphere; 0.1° ocean and sea ice
- 0.1° demonstrated at 8.5 years/day on a 28K-processor Blue Gene
- 0.1° demonstrated at 42 years/day on a 32K-processor Blue Gene
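To make the hub-and-spoke idea concrete, here is a minimal toy sketch in Python: four stand-in components plus a coupler hub, echoing the five-executable layout. This mirrors the design in shape only, not CCSM's actual coupler; the component names, the exchanged field, and the "physics" are all simplified assumptions.

```python
# Toy hub-and-spoke coupler: components never talk to each other directly;
# all boundary information flows through the central coupler.

class Component:
    def __init__(self, name):
        self.name = name
        self.state = 0.0

    def step(self, boundary):
        # Stand-in "physics": relax toward the boundary forcing.
        self.state += 0.5 * (boundary - self.state)
        return self.state


class Coupler:
    """The hub: gathers each component's export and hands every component
    the average of the others' exports as its boundary condition."""

    def __init__(self, components):
        self.components = components
        self.exports = {c.name: c.state for c in components}

    def step(self):
        for c in self.components:
            others = [v for n, v in self.exports.items() if n != c.name]
            boundary = sum(others) / len(others)
            self.exports[c.name] = c.step(boundary)


spokes = [Component(n) for n in ("atm", "ocn", "ice", "lnd")]
spokes[0].state = 1.0            # perturb the atmosphere
hub = Coupler(spokes)
for _ in range(10):
    hub.step()
print({c.name: round(c.state, 3) for c in spokes})
```

The point of the hub: each spoke can be developed and scheduled independently as long as it honors the coupler's exchange contract, which is what lets the petascale redesign collapse 5 executables into one without changing the data flow.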

Data Processing

- Ample disk space (we use up to 600 GB at a time)
- Ample memory (we use 2-4 GB)
- Long serial interactive jobs are fine
- Software we use includes the netCDF Operators (NCO), the NCAR Command Language (NCL), Ferret 5.81, and more

"It's dependable, reliable, robust, and fits my requirements quite well." -- Gary Strand
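For flavor, here is a minimal post-processing sketch in Python with the netCDF4 and numpy packages, an alternative route to the NCO/NCL tools named above. The file name and the TS/lat/lon variable names are illustrative assumptions, though CCSM history files use very similar conventions.

```python
# Area-weighted global-mean surface temperature (TS) from a history file.
import netCDF4
import numpy as np

with netCDF4.Dataset("ccsm_history.nc") as nc:   # hypothetical file name
    ts = nc.variables["TS"][:]                   # dims: (time, lat, lon)
    lat = nc.variables["lat"][:]

# Weight latitude bands by cos(lat) so grid cells count by their area.
weights = np.cos(np.deg2rad(lat))
zonal_mean = ts.mean(axis=-1)                            # average over lon
global_mean = np.average(zonal_mean, axis=-1, weights=weights)

print(global_mean.shape)   # one value per time step
print(global_mean[:5])     # first few global means
```

This is the quantity plotted as TS in the control-run and ensemble figures earlier in the deck.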

Thanks! Any Questions?