
1 IDC HPC USER FORUM Weather & Climate Panel, September 2009, Broomfield, CO. Panel questions: 1 response per question; limit length to 1 slide.

2 Panel Format
Sequence: alphabetical
A few bullet points for each question; each participant can address or supplement it
After each panel member has finished, we move on to the next question
Moderators can adjust depending on discussion and time constraints

3 Panel Members
Steve Finn & Sharan Kalwani

Panel Participant / Affiliation:
Jim Doyle, NRL
Jim Hack, ORNL
John Michelakes, NCAR
Henry Tufo, University of Colorado

4 Q1. Relative importance of data/resolution/micro-physics
To a certain degree, data assimilation, resolution, and physics are all important (non-linear system of equations)
Metric dependent: quantitative precipitation skill, low-level winds, clouds, forecast length (nowcast vs. climate)
For typical Navy metrics (winds, visibility, waves, clouds):
- Data assimilation is essential (accurate synoptic and mesoscale initial state, spin-up of physics)
- Physics: boundary layer, surface layer, cloud/convection
- Resolution:
  - Sufficient to capture key geographic features
  - High enough to avoid bulk parameterizations: convection (Δx ~ 2-4 km), turbulence (Δx ~ 20-200 m)
  - Predictability: tradeoffs between ensembles & Δx (see the cost sketch below)
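As a rough illustration of the resolution/ensemble tradeoff above (an assumption of this write-up, not part of the slide): refining Δx by a factor r multiplies horizontal grid points by r^2 and, through the CFL time-step limit, the number of time steps by r, so total work grows roughly as r^3.

```python
# Back-of-the-envelope sketch (not from the slide): cost of refining horizontal
# resolution. Vertical refinement and more expensive physics are ignored here.

def relative_cost(dx_coarse_km: float, dx_fine_km: float) -> float:
    r = dx_coarse_km / dx_fine_km
    return r ** 3   # r^2 more columns, roughly r more time steps

print(relative_cost(4.0, 2.0))   # ~8x:    4 km -> 2 km convection-permitting
print(relative_cost(2.0, 0.2))   # ~1000x: 2 km -> 200 m, toward turbulence-resolving
```

This is why the slide frames predictability as a tradeoff between spending cycles on finer Δx versus on more ensemble members.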

5 Q2. Adaptive mesh or embedded grids: their impact…
Please discuss the use of adaptive mesh or embedded grids (urban areas, terrain impact on advection, ...) and how future increased use of these would impact system requirements such as system interconnects.
Adaptive meshes are challenging:
- Time step and run time (for operations) issues
- Physical parameterizations (resolution dependence)
- Mesh refinement needs to consider complex multi-scale interactions (difficulty in determining high-resolution areas)
Nested grids currently used in Navy mesoscale model:
- Moving meshes to follow features (hurricanes), ships
Impact on system requirements (interconnects):
- Load balance may be an issue (decomposition)
- Cores as a function of grid points (communication); see the sketch below
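The "cores as a function of grid points" point can be illustrated with a small, purely hypothetical sketch (numbers and halo width are assumptions, not from the slide): under a 2-D domain decomposition, halo exchange scales with the subdomain perimeter while computation scales with its area, so communication overhead grows as each core owns fewer grid points.

```python
# Illustrative sketch only: communication-to-computation ratio for a square
# per-core subdomain under 2-D decomposition.

def halo_to_compute_ratio(nx_local: int, ny_local: int, halo_width: int = 3) -> float:
    interior = nx_local * ny_local                 # points this core computes
    halo = 2 * halo_width * (nx_local + ny_local)  # points it must exchange
    return halo / interior

for pts in (200, 50, 12):   # fewer grid points per core as core counts grow
    print(f"{pts}x{pts} points/core -> halo/compute ~ {halo_to_compute_ratio(pts, pts):.2f}")
```

This is one reason interconnect bandwidth and latency requirements tighten as the same grid is spread over more cores.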

6 Q3. Ratio of Data to Compute: Background…
What are your bytes per FLOP for future requirements? (This is a relatively low ratio compared to some other benchmarks; Geerd Hoffman of DW said 4 bytes/FLOP at the Oct 2007 HPC User Forum in Stuttgart.)

Problem description / Memory requirement per core (bytes) / Peak performance per core (FLOPS) / Sustained bytes per FLOP / Peak bytes per FLOP
Current dimensions run today / 1.5x10^8 / 3x10^9 / 0.5 / 0.05
Future dimensions run today / 1.5x10^10 / 3x10^9 / 50 / 5
Future dimensions run on today's petascale / 1.5x10^10 / 1x10^10 / 15 / 1.5
Future dimensions run on future's petascale / 1.5x10^10 / 5x10^10 / 3 / 0.3

Problem dimension: nNest·nx·ny·nz·nVariables·nTimeLevels·Precision
Today: 5 x 100 x 100 x 50 x 50 x 3 x 4
Future: 5 x 100 x 100 x 100 x 100 x 3 x 8
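A minimal sketch of the arithmetic behind this slide, assuming the sustained rate is roughly 10% of peak (an assumption of this sketch that reproduces the table's sustained/peak spread); function and variable names are placeholders.

```python
# Two small illustrations of the numbers above. The problem dimensions and the
# per-core memory/performance figures come from the slide; the 10% sustained
# efficiency is an assumption of this sketch.

def state_size_bytes(n_nest, nx, ny, nz, n_vars, n_time_levels, precision):
    """Problem dimension: nNest*nx*ny*nz*nVariables*nTimeLevels*Precision, in bytes."""
    return n_nest * nx * ny * nz * n_vars * n_time_levels * precision

print(f"today:  {state_size_bytes(5, 100, 100, 50, 50, 3, 4):.1e} bytes")
print(f"future: {state_size_bytes(5, 100, 100, 100, 100, 3, 8):.1e} bytes")

def bytes_per_flop(mem_per_core_bytes, peak_flops_per_core, sustained_fraction=0.1):
    """Return (peak, sustained) bytes per FLOP for one core."""
    peak = mem_per_core_bytes / peak_flops_per_core
    return peak, peak / sustained_fraction

# "Future dimensions run today" row of the table: 1.5e10 bytes/core, 3e9 FLOPS/core
peak_bf, sustained_bf = bytes_per_flop(1.5e10, 3e9)
print(f"peak {peak_bf:.2f} B/FLOP, sustained {sustained_bf:.0f} B/FLOP")   # ~5 and ~50
```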

7 Q4. Open source codes in the community…
What is the importance and impact of open source / community code applications such as CCSM, WRF, …?
Information assurance issues for DoD:
- Open source may be problematic for operations
Navy open source code can be useful:
- Physics (from other institutions, agencies)
- Framework (Earth System Modeling Framework)
- Post-processing, graphics, etc.
- COAMPS code (has > 350 users)
Fosters collaboration and leverages expertise (within and beyond CWO) among agencies and universities.

8 Q5. Data and collaboration, formats, future needs…
What is the level of collaboration and standardization of data management, observational and results databases, such as use of common file formats, web-based data, etc.? What is needed in the future?
Common file standards for exchange among agencies (GRIB2 for operations, some NetCDF for research; see the sketch below)
Static databases (land characteristics, etc.) are commonly shared, but often not standardized
Standardized observational databases (a common format with other agencies is being considered)
Future:
- Much larger databases
- Larger need for standardized output (input) for community shared projects (TIGGE, HFIP, etc.)
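As a small, hedged illustration of what a common file format buys in practice, the sketch below lists the contents of a NetCDF file with the netCDF4 Python package; the file name is a hypothetical placeholder, not a file referenced by the slide.

```python
# Hypothetical example of consuming a community-shared NetCDF result file.
from netCDF4 import Dataset

with Dataset("forecast_output.nc") as ds:      # placeholder file name
    print(ds.data_model)                        # underlying format, e.g. NETCDF4
    for name, var in ds.variables.items():      # self-describing variable metadata
        print(name, var.dimensions, var.shape)
```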

9 Q6. Ensemble models: your experiences…
Has the use of ensemble models had any positive or negative impact in reducing the code scaling requirements?
Limiting factor is how well the deterministic model scales
Ensembles are embarrassingly parallel and should perform well on large multi-core clusters (see the sketch below)
Need efficient I/O to exchange state information between the model output and post-processing (DA)
Ensemble approaches present some challenges for post-processing (archival) and file management
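To make the "embarrassingly parallel" point concrete, here is a minimal, purely illustrative sketch that launches independent ensemble members concurrently; run_member, the placeholder command, and the member count are assumptions, not the operational workflow.

```python
# Illustrative only: ensemble members are independent runs, so they can be launched
# concurrently; each call here stands in for submitting one member's model job.
import subprocess
import sys
from concurrent.futures import ProcessPoolExecutor

def run_member(member_id: int) -> int:
    # Placeholder command; a real member would launch the model executable (e.g. via MPI).
    return subprocess.call([sys.executable, "-c", f"print('member {member_id:02d} done')"])

if __name__ == "__main__":
    with ProcessPoolExecutor(max_workers=4) as pool:
        codes = list(pool.map(run_member, range(16)))   # 16 hypothetical members
    print("all members finished, return codes:", codes)
```

The members themselves need no inter-member communication; the pressure falls instead on I/O and post-processing, as the slide notes.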

10 Q7. Obligatory question (no pun intended!): cloud computing, your views (unfiltered)…
What is your current / future interest in grid or cloud computing?
Grid / cloud computing may potentially work well for ensembles, although there are obvious challenges (I/O)
Domain decomposition across the grid could present big challenges
Models require huge input datasets and produce large output datasets (persistent data storage)
Model paradigm may have to be revisited (communication and latency between nodes might not be consistent)
Information assurance could be an issue (particularly for DoD operations)

