IDC HPC USER FORUM
Weather & Climate Panel
September 2009, Broomfield, CO

Panel questions: 1 response per question; limit length to 1 slide.

Panel Format
- Sequence: alphabetical
- A few bullet points for each question; each participant can address/supplement them
- After each panel member has finished, we move on to the next question
- Moderators can adjust depending on discussion and time constraints

Panel Members
Moderators: Steve Finn & Sharan Kalwani

Panel Participant   | Affiliation
Jim Doyle           | NRL
Jim Hack            | ORNL
John Michalakes     | NCAR
Henry Tufo          | University of Colorado

Q1. Relative importance of data / resolution / micro-physics
- To a certain degree, data assimilation, resolution, and physics are all important (non-linear system of equations)
- Metric dependent: quantitative precipitation skill, low-level winds, clouds, forecast length (nowcast vs. climate)
- For typical Navy metrics (winds, visibility, waves, clouds):
  - Data assimilation is essential (accurate synoptic and mesoscale initial state, spin-up of physics)
  - Physics: boundary layer, surface layer, cloud/convection
  - Resolution:
    - Sufficient to capture key geographic features
    - High enough to avoid bulk parameterizations: convection (Δx ~ 2–4 km), turbulence (Δx ~ … m)
    - Predictability: tradeoffs between ensembles & Δx (see the cost sketch below)
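As a rough illustration of the ensemble-vs-Δx tradeoff (our sketch, not the panel's; it uses the standard scaling argument): halving Δx roughly quadruples the number of grid columns and, via the CFL-limited time step, roughly doubles the number of steps, so cost grows like Δx⁻³ for a fixed domain, fixed vertical resolution, and fixed forecast length.

```python
# Hypothetical cost model for the ensemble-vs-resolution tradeoff (Q1).
# Assumes cost ~ n_members * (1/dx)^2 * (1/dt), with dt proportional to dx
# (CFL limit); vertical levels and domain size held fixed.

def relative_cost(n_members, dx_km, ref_members=1, ref_dx_km=4.0):
    """Cost relative to one member run at the reference grid spacing."""
    horizontal = (ref_dx_km / dx_km) ** 2   # more columns as dx shrinks
    timestep = ref_dx_km / dx_km            # more steps as dt shrinks (CFL)
    return (n_members / ref_members) * horizontal * timestep

# Same budget, two choices: refine the mesh or grow the ensemble.
print(relative_cost(1, 2.0))   # halve dx:   8x the cost
print(relative_cost(8, 4.0))   # 8 members:  8x the cost
```

Under these assumptions, one halving of Δx costs as much as an eight-member ensemble at the original resolution, which is the tradeoff the slide points at.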

Q2. Adaptive mesh or embedded grids: their impact
Please discuss the use of adaptive mesh or embedded grids (urban areas, terrain impact on advection, ...) and how future increased use of these would impact system requirements such as system interconnects.

- Adaptive meshes are challenging:
  - Time step and run-time issues (for operations)
  - Physical parameterizations (resolution dependence)
  - Mesh refinement needs to consider complex multi-scale interactions (difficulty in determining high-resolution areas)
- Nested grids are currently used in the Navy mesoscale model
  - Moving meshes to follow features (hurricanes), ships
- Impact on system requirements (interconnects):
  - Load balance may be an issue (decomposition)
  - Cores as a function of grid points (communication); see the halo-exchange sketch below
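To make the "cores as a function of grid points" point concrete, here is a back-of-the-envelope sketch (our illustration, not the panel's): under a 2-D domain decomposition each core computes on its own patch but must exchange halo rows/columns with its neighbors every step, so the communication share grows as patches shrink.

```python
# Hypothetical halo-exchange estimate for a 2-D domain decomposition (Q2).
# Each core owns an n-by-n patch of grid columns; every step it exchanges a
# halo of width h with its four neighbors. The ratio of halo points to owned
# points is a crude proxy for communication vs. computation per step.

def halo_ratio(n, halo_width=3):
    """Halo points exchanged per owned point for an n x n patch."""
    halo_points = 4 * n * halo_width        # four edges, ignoring corners
    owned_points = n * n
    return halo_points / owned_points

for n in (512, 128, 32):   # strong scaling: same domain spread over more cores
    print(f"{n:4d} x {n:<4d} patch: {halo_ratio(n):.2f} halo points per owned point")
# Shrinking the per-core patch edge 16x raises the communication share 16x,
# which is why interconnect performance dominates at high core counts.
```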

Q3. Ratio of data to compute: background
What are your bytes per FLOP for future requirements? (* This is a relatively *low* ratio compared to some other benchmarks, per Geerd Hoffmann of DWD at the Oct 2007 HPC User Forum in Stuttgart.)

Problem description                        | Memory/core (bytes) | Peak perf/core (FLOPS) | Sustained bytes/FLOP | Peak bytes/FLOP
Current dimensions, run today              | 1.5x10^8            | 3x10^…                 | …                    | …
Future dimensions, run today               | 1.5x10^…            | …                      | …                    | …
Future dimensions, on today's petascale    | 1.5x10^…            | …                      | …                    | …
Future dimensions, on a future petascale   | 1.5x10^…            | …                      | …                    | …

Problem dimension: nNest · nx · ny · nz · nVariables · nTimeLevels · Precision
Today:  5 x 100 x 100 x 50 x 50 x 3 x 4
Future: 5 x 100 x 100 x 100 x 100 x 3 x 8
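The problem-dimension formula above can be evaluated directly. A small sketch (ours; it uses only the slide's numbers, and the core count and per-core peak in the last step are made-up placeholders):

```python
# Worked example of the Q3 problem-dimension formula:
# memory = nNest * nx * ny * nz * nVariables * nTimeLevels * precision_bytes

def state_bytes(nNest, nx, ny, nz, nVars, nTimeLevels, precision):
    return nNest * nx * ny * nz * nVars * nTimeLevels * precision

today  = state_bytes(5, 100, 100, 50,  50,  3, 4)   # 1.5e9 bytes total
future = state_bytes(5, 100, 100, 100, 100, 3, 8)   # 1.2e10 bytes total

print(f"today:  {today:.1e} bytes")    # 1.5e+09
print(f"future: {future:.1e} bytes")   # 1.2e+10

# Hypothetical bytes-per-FLOP: spread the state over an assumed core count
# and divide by an assumed per-core peak (both placeholders, not from the slide).
cores, peak_flops_per_core = 10, 3e9
print(f"bytes/FLOP: {today / cores / peak_flops_per_core:.3f}")  # 0.05
```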

Q4. Open source codes in the community
What is the importance and impact of open source / community code applications such as CCSM, WRF, ...?

- Information assurance issues for DoD:
  - Open source may be problematic for operations
- For the Navy, open source code can be useful:
  - Physics (from other institutions, agencies)
  - Framework (Earth System Modeling Framework)
  - Post-processing, graphics, etc.
  - COAMPS code (has > 350 users)
- Fosters collaboration and leverages expertise (within and beyond CWO) among agencies and universities

Q5. Data and collaboration, formats, future needs
What is the level of collaboration and standardization of data management and of observational & results databases (e.g., use of common file formats, web-based data)? What is needed in the future?

- Common file standards for exchange among agencies (GRIB2 for operations, some netCDF for research); see the netCDF sketch below
- Static databases (land characteristics, etc.) are commonly shared, but often not standardized
- Standardized observational databases (a common format with other agencies is being considered)
- Future:
  - Much larger databases
  - Larger need for standardized output (and input) for community shared projects (TIGGE, HFIP, etc.)
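As a concrete illustration of the netCDF side of that exchange (our sketch, not the panel's; the file and variable names are hypothetical), the Python netCDF4 package writes self-describing files that any netCDF-aware tool can read without out-of-band documentation:

```python
# Minimal netCDF writing sketch (Q5); names and sizes are hypothetical.
from netCDF4 import Dataset
import numpy as np

with Dataset("forecast_sample.nc", "w") as ds:
    ds.createDimension("lat", 73)
    ds.createDimension("lon", 144)
    t2m = ds.createVariable("t2m", "f4", ("lat", "lon"))
    t2m.units = "K"                       # self-describing metadata
    t2m.long_name = "2-metre temperature"
    t2m[:] = 288.0 + np.zeros((73, 144), dtype="f4")

# Any consumer can now discover the structure from the file itself --
# the point of agreeing on a common format.
with Dataset("forecast_sample.nc") as ds:
    print(ds.variables["t2m"].units, ds.variables["t2m"].shape)
```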

Q6. Ensemble models: your experiences
Has the use of ensemble models had any positive or negative impact in reducing the code scaling requirements?

- The limiting factor is how well the deterministic model scales
- Ensembles are embarrassingly parallel and should perform well on large multi-core clusters (see the sketch below)
- Need efficient I/O to exchange state information between the model output and post-processing (DA)
- Ensemble approaches present some challenges for post-processing (archival) and file management
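A minimal sketch of why ensembles are embarrassingly parallel (our illustration; member_forecast is a hypothetical stand-in for launching one real model run): members share no state during integration, so they map onto independent processes or job-array tasks with essentially no inter-member communication.

```python
# Embarrassingly parallel ensemble sketch (Q6). member_forecast is a
# hypothetical stand-in for one deterministic model integration.
from concurrent.futures import ProcessPoolExecutor

def member_forecast(member_id, seed):
    """Run one ensemble member; no communication with other members."""
    # A real system would perturb initial conditions / physics with `seed`
    # and exec the model; here we just return a token result.
    return member_id, f"perturbation-{seed}"

if __name__ == "__main__":
    with ProcessPoolExecutor(max_workers=4) as pool:
        futures = [pool.submit(member_forecast, m, 1000 + m) for m in range(8)]
        for f in futures:
            print(f.result())
    # The scaling pressure moves to I/O: each member writes its own output,
    # and post-processing / DA must gather those files afterwards.
```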

Q7. Obligatory question (no pun intended!): cloud computing, your views (unfiltered)
What is your current / future interest in grid or cloud computing?

- Grid / cloud computing may potentially work well for ensembles, although there are obvious challenges (I/O)
- Domain decomposition across the grid could present big challenges
- Models require huge input datasets and produce large output datasets (persistent data storage)
- The model paradigm may have to be revisited (communication latency between nodes might not be consistent)
- Information assurance could be an issue (particularly for DoD operations)