Running the Community Climate System Model (CCSM) at NERSC
Adrianne Middleton, National Center for Atmospheric Research, Boulder, Colorado
(Title image: CAM T340, Jim Hack)
[Figure: 1870 control run (constant-1870-conditions): globally averaged surface temperature (TS) vs. years]
[Figure: five-member historical ensemble (members A-E): globally averaged surface temperature (TS) vs. years]
CCSM IPCC Graph
Climate of the Last Millennium: Model vs. Observations (Caspar Ammann, NCAR/CGD)
CCSM 3.0 T85 on bassi
- Atmosphere: 128 PEs
- Ocean: 40 PEs
- Coupler: 8 PEs
- Sea Ice: 16 PEs
- Land: 16 PEs
- Small number of processors (208), but long run time (16 days)
- Want 5 model years/day
- Queue is generally 3 days long
- Model runs roughly 1 model year per 3 hours, or at most 4 model years per run slot (see the throughput sketch below)
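To make the numbers above concrete, here is a rough throughput sketch. It is an illustration only: the 12-hour batch window is an assumption inferred from the "1 model year per 3 hours" and "4 years per run slot" figures, and the 3-day queue wait is treated as a fixed delay before every slot.

```python
# Rough throughput arithmetic for the CCSM 3.0 T85 case above (illustration only).
# Assumption not stated on the slide: a 12-hour batch window, inferred from
# "1 model year / 3 hours" together with "max 4 years/run slot".

HOURS_PER_MODEL_YEAR = 3.0      # "roughly 1 model year / 3 hours"
RUN_SLOT_HOURS = 12.0           # assumed batch window -> 4 model years per slot
QUEUE_WAIT_DAYS = 3.0           # "queue is generally 3 days long"
TARGET_YEARS_PER_DAY = 5.0      # stated goal

years_per_slot = RUN_SLOT_HOURS / HOURS_PER_MODEL_YEAR          # 4 model years
calendar_days_per_slot = QUEUE_WAIT_DAYS + RUN_SLOT_HOURS / 24  # wait + run

def calendar_days(simulation_years: float) -> float:
    """Calendar days to finish a simulation, counting queue wait before each slot."""
    slots = simulation_years / years_per_slot
    return slots * calendar_days_per_slot

for years in (20, 100, 500):
    days = calendar_days(years)
    print(f"{years:4d} model years -> {days:6.1f} calendar days "
          f"({years / days:.2f} model years/day vs. target {TARGET_YEARS_PER_DAY})")
```

Under these assumptions the effective rate is only a little over 1 model year per calendar day, which is why queue wait, not raw compute speed, dominates the 5 years/day goal.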
HPC dimensions of Climate Prediction (Lawrence Buja, NCAR / Tim Palmer, ECMWF)
- Spatial resolution: simulate finer details, regions & transients
- Ensemble size: quantify statistical properties of the simulation
- Timescale: length of simulations
- Data assimilation: decadal prediction / initial-value forecasts
- New science: new processes/interactions not previously included
- Better science: parameterization → explicit model
HPC dimensions of Climate Prediction, terascale to exascale (Lawrence Buja, NCAR)
- Spatial resolution (x*y*z): today/terascale 1.4° (160 km); petascale 0.2° (22 km); exascale AMR, km-scale
- Timescale (years * timestep): terascale 100 yr * 20 min; petascale 1000 yr * 3 min; exascale 1000 yr * ?
- Model: terascale regular climate model; petascale Earth System Model; exascale ESM + multiscale GCRM
- Further dimensions: ensemble size, data assimilation, new science, better science, and a code-rewrite cost multiplier
(A rough cost-multiplier estimate follows this list.)
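A back-of-the-envelope check on the terascale-to-petascale jump in the table above. This is a sketch under a simple assumption: cost scales as horizontal grid points times the number of timesteps times simulated years, ignoring vertical levels, ensembles, I/O, and any new physics.

```python
# Back-of-the-envelope cost multiplier for the terascale -> petascale jump.
# Sketch only: cost is assumed proportional to (horizontal grid points)
# x (timesteps per run) and nothing else.

def relative_cost(resolution_deg: float, timestep_min: float, years: float) -> float:
    horizontal_points = (1.0 / resolution_deg) ** 2   # points scale as 1/dx^2
    timesteps = years * 365 * 24 * 60 / timestep_min  # total number of steps
    return horizontal_points * timesteps

terascale = relative_cost(resolution_deg=1.4, timestep_min=20.0, years=100.0)
petascale = relative_cost(resolution_deg=0.2, timestep_min=3.0, years=1000.0)

print(f"Petascale / terascale cost multiplier: ~{petascale / terascale:.0f}x")
# -> roughly 3000x, i.e. about three orders of magnitude, consistent with the
#    terascale -> petascale labels on the slide.
```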
[Figure: CCSM Grand Challenge: science targets (paleoclimate, BGC/carbon-cycle spin-ups, MJO convergence, resolving hurricanes) and spatial scales (global general circulation, continental large-scale flow, regional/local) against computing capability across successive IPCC Assessment Reports, from teraflops (TF) to petaflops (PF)]
Community Climate System Model (CCSM)
Current configuration
- Hub-and-spoke design with 5 executables
- Components exchange boundary information through the coupler
- Each code is quite large (many thousands of lines per code)
- Need 5 simulated years/day, so must run at "low" resolution
- Standard configuration runs at a scaling sweet spot of O(200) processors
Petascale configuration
- Single executable
- 5 simulated years per wall-clock day
- Targeting 10K - 120K processors per simulation
- Resolution: 0.25° (30 km, L66) and 0.1°
- 0.1°: demonstrated 8.5 years/day on 28K Blue Gene processors
- 0.1°: demonstrated 42 years/day on 32K Blue Gene processors (see the core-hour comparison below)
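One way to compare the demonstrated 0.1° runs above is cost per simulated year. This is an illustration only: "28K" and "32K" are taken literally as 28,000 and 32,000 cores (the exact Blue Gene partition sizes are not given on the slide), and the two demonstrations are treated independently.

```python
# Rough cost-per-simulated-year comparison for the demonstrated 0.1-degree runs.
# Illustration only; core counts are rounded readings of "28K" and "32K".

demos = [
    ("0.1 deg, 28K Blue Gene", 28_000, 8.5),   # (label, cores, simulated years/day)
    ("0.1 deg, 32K Blue Gene", 32_000, 42.0),
]

for label, cores, years_per_day in demos:
    core_hours_per_year = cores * 24.0 / years_per_day
    print(f"{label}: ~{core_hours_per_year:,.0f} core-hours per simulated year")
```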
Data Processing
- Ample disk space (we use up to 600 GB at a time)
- Ample memory (we use 2-4 GB)
- Long serial interactive jobs are fine
- Software we use includes the netCDF Operators (NCO), the NCAR Command Language (NCL), Ferret 5.81, and more (a small post-processing sketch follows)
- "It's dependable, reliable, robust, and fits my requirements quite well." -- Gary Strand
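As a minimal Python sketch of the kind of reduction behind the globally averaged TS curves shown earlier: an area-weighted global mean of surface temperature from a monthly history file. The file name "ccsm_history.nc" is hypothetical, and TS/gw are the variable names typically written by the atmosphere component (gw being Gaussian latitude weights); in practice the same step is often done with the NCO or NCL tools listed above.

```python
# Area-weighted global-mean surface temperature from a CCSM history file.
# File name and variable names (TS, gw) are illustrative assumptions.

import netCDF4
import numpy as np

with netCDF4.Dataset("ccsm_history.nc") as nc:           # hypothetical file name
    ts = nc.variables["TS"][0, :, :]                      # surface temperature (lat, lon)
    gw = nc.variables["gw"][:]                            # Gaussian latitude weights

weights = np.broadcast_to(gw[:, np.newaxis], ts.shape)    # weight each latitude row
global_mean_ts = np.average(ts, weights=weights)
print(f"Globally averaged TS: {global_mean_ts:.2f} K")
```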
Thanks! Any Questions?