Mesoscale & Microscale Meteorological Division / NCAR
WRF Modeling System “Short” Tutorial
John Michalakes, Jimy Dudhia, Wei Wang, Cindy Bruyere, Michael Duda, Dave Gill, William Skamarock
NCAR/MMM

Mesoscale & Microscale Meteorological Division / NCAR
Outline
Part I
– Overview of the WRF Modeling System
– Work through an example from the on-line tutorial on NYBlue
Part II
– WRF Software Overview and Usage

Mesoscale & Microscale Meteorological Division / NCAR
What is WRF?
WRF: Weather Research and Forecasting Model
– Used for both research and operational forecasting
– A supported “community model”, i.e. a free and shared resource with distributed development and centralized support
– Development is led by NCAR, NOAA/GSD, and NOAA/NCEP/EMC, with partnerships at AFWA, FAA, and NRL, and collaborations with universities and other government agencies in the US and overseas

Mesoscale & Microscale Meteorological Division / NCAR
WRF Dynamical Cores
The Advanced Research WRF (ARW) and the Nonhydrostatic Mesoscale Model (NMM) are dynamical cores
– A dynamical core includes mostly advection, pressure gradients, Coriolis, buoyancy, filters, diffusion, and time-stepping
Both are Eulerian mass dynamical cores with terrain-following vertical coordinates
ARW support and development are centered at NCAR/MMM
NMM development is centered at NCEP/EMC and support is provided by NCAR/DTC
Both are downloadable in the same WRF tar file
Physics, the software framework, and parts of data pre- and post-processing are shared between the dynamical cores

Mesoscale & Microscale Meteorological Division / NCAR
Modeling System Components
WRF Pre-processing System (WPS)
– Real-data interpolation for NWP runs
– Replaces the old Standard Initialization (SI), which is still maintained
WRF-Var (3D-Var)
WRF Model
Graphics tools

Mesoscale & Microscale Meteorological Division / NCAR
WRF 3DVAR
Supported data types
– Conventional surface and upper-air observations, wind profiler
– Remote sensing data: cloud-track winds, ATOVS thickness, ground-based GPS TPW, SSM/I, SSM/T1, SSM/T2, SSM/I brightness temperature, QuikSCAT ocean surface winds, radar radial velocity
Two background error covariance models
– NCEP model
– UK / NCAR

Mesoscale & Microscale Meteorological Division / NCAR
ARW Details
Key features:
– Fully compressible, non-hydrostatic (with a hydrostatic option)
– Mass-based terrain-following coordinate η, where p_h is the hydrostatic pressure and μ is the column mass
– Arakawa C-grid staggering (u and v staggered around T on the grid)
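
The coordinate definition itself was lost in this transcript; for reference, the standard ARW mass coordinate, added here for clarity rather than taken verbatim from the slide, is

  \eta = (p_h - p_{ht}) / \mu, \qquad \mu = p_{hs} - p_{ht}

where p_{hs} and p_{ht} are the hydrostatic pressures at the surface and at the model top.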

Mesoscale & Microscale Meteorological Division / NCAR
ARW Details
Key features:
– 3rd-order Runge-Kutta time-split integration
– High-order positive-definite advection
– Two-way interacting, telescoping, moving nests
– Grid nudging and obs nudging (FDDA)

Mesoscale & Microscale Meteorological Division / NCAR
WRF as a Community Model
Version 1.0 of WRF was released in December 2000
Version 2.0: May 2004 (NMM added, EM nesting released)
– point releases in Jun 2004, Jun 2004, and Dec 2004
Version 2.1: August 2005 (EM becomes ARW)
– point releases in Nov 2005 (NMM released) and Jan 2006
Version 2.2: December 2006 (WPS released)
– NMM nesting released in 2007
– 2.2.1 released in Nov 2007
Version 3.0: released in April 2008

Mesoscale & Microscale Meteorological Division / NCAR
Version 3
– Nesting & moving nests
– Global WRF
– Variable time step
– New physics, CAM3 lw/sw radiation
– Chemistry
– Grid and obs. nudging (FDDA)
– WPS (SI replacement)
– WRF Domain Wizard (screenshot at right)
– New/enhanced support
  – IBM (+ Blue Gene), SGI, Cray, NEC, Apple, Linux, Windows CCS, Sun, NVIDIA
  – PGI, Intel, Pathscale, g95, gfortran, vendor

WRF-Var in the WRF Modeling System

Mesoscale & Microscale Meteorological Division / NCAR
User Support
User Web pages:
– Model
– Var system
– Web-based WRF Users’ Forum
Documentation
– Users’ Guide
– Technical Note
– Software documentation
– Online tutorial

Mesoscale & Microscale Meteorological Division / NCAR
Practical Session
– Installation of the WRF system on NYBlue
– Work through a real-data end-to-end case
– Note: the tutorial notes haven’t been updated to WRFV3 yet, but we will note differences as we encounter them

Mesoscale & Microscale Meteorological Division / NCAR
Blue Gene Specifics
BlueGene/L has two types of node:
– Front-end nodes for interactive use
– Compute nodes under batch system control
– Programs must be compiled and run differently depending on where they run
Front end:
– WPS and WRF utilities
– NetCDF and NCAR Graphics utilities
Compute nodes:
– Components of the modeling system itself
All software installations in this tutorial will be compute-node installations. These may be run on the compute nodes if you wish; they won’t work on the front end.
I have installed a complete set of front-end apps. Add /gpfs/home2/WRF/BG-Install/fe_tools/gnu32/bin to your path.

Mesoscale & Microscale Meteorological Division / NCAR
Blue Gene Specifics
NYBlue shell environment
– bash:
  export NETCDF=/gpfs/home2/WRF/BG-Install/netcdf
  export PNETCDF=/gpfs/home2/WRF/BG-Install/pnetcdf
  export NETCDF2=/apps/nco/netcdf-final/netcdf-3.6.2/bin/cd
  export JASPERLIB=/gpfs/home2/WRF/BG-Install/JASPER/lib
  export JASPERINC=/gpfs/home2/WRF/BG-Install/JASPER/include
  export PATH=/opt/ibmcmp/xlf/bg/10.1/bin:/gpfs/home2/WRF/BG-Install/fe_tools/gnu32/bin:$PATH
– csh or tcsh (note: setenv takes a space, not an “=”):
  setenv NETCDF /gpfs/home2/WRF/BG-Install/netcdf
  setenv PNETCDF /gpfs/home2/WRF/BG-Install/pnetcdf
  setenv NETCDF2 /apps/nco/netcdf-final/netcdf-3.6.2/bin/cd
  setenv JASPERLIB /gpfs/home2/WRF/BG-Install/JASPER/lib
  setenv JASPERINC /gpfs/home2/WRF/BG-Install/JASPER/include
  set path = ( /opt/ibmcmp/xlf/bg/10.1/bin /gpfs/home2/WRF/BG-Install/fe_tools/gnu32/bin $path )
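
A small usage sketch (the file name wrf_env.sh is hypothetical; the checks assume only the paths and tools named on these slides):
  # put the bash export lines above into a file and source it in each login session
  source ~/wrf_env.sh
  echo $NETCDF                     # should print /gpfs/home2/WRF/BG-Install/netcdf
  which ncdump plotgrids.exe       # front-end utilities should now resolve via fe_tools/gnu32/bin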

Mesoscale & Microscale Meteorological Division / NCAR
Blue Gene Specifics
– The WRF model and its preprocessors can be run as non-MPI jobs; however, on Blue Gene we always run using MPI
  – Standard error and standard output from each MPI task appear as files rsl.error.<task> and rsl.out.<task>. “Error” in the file name is a misnomer; these files always appear and contain useful data.
– Since node memory is constrained, single-reader/writer I/O is not recommended. Use parallel I/O instead (pNetCDF is format number 11 in namelist.input – more later)
– 32-bit addressing is a limitation for very large problems, where the size of an individual field is greater than 2 GB
– WRF (like most atmospheric models) is memory-bandwidth hungry, so it is typically run in “CO” (not “VN”) mode on Blue Gene
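
A hedged illustration of the two I/O points above (a sketch only; it assumes a run has already produced output in the current working directory):
  # each MPI task writes its own rsl.out/rsl.error pair; task 0000 usually carries the run summary
  ls rsl.out.* rsl.error.* | head
  tail -n 20 rsl.out.0000      # a successful wrf.exe run normally ends with a success message here
  # parallel I/O (pNetCDF) is requested by setting the io_form_* switches to 11 in namelist.input
  grep io_form namelist.input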

Mesoscale & Microscale Meteorological Division / NCAR
NYBlue-specific notes to the Online Tutorial: Download, Configure, and Build
1. Before starting, review the previous “Blue Gene Specifics” slides in this presentation (this takes care of the environment question under Configure WRF in the tutorial).
2. Location of the on-line tutorial: (URL given on the original slide)
3. Get source code: the source code is already downloaded in ~michalak/Tutorial_downloads (see the README in that directory). If you use these files you may skip some of the steps below, since they already contain these NYBlue-specific mods.
4. Configure WRF: note there will only be one option: “Linux ppc64 BG blxlf compiler with blxlc (dmpar)”. If you are not using the tar files from step 3, modify the Blue Gene section of the arch/configure_new.defaults file so that it reads (change shown in red on the slide):
   CPP = /opt/ibmcmp/xlf/bg/10.1/exe/cpp -C -P
   Also, to run on 10,000 nodes or more, the following changes must be made (a shell sketch of these edits follows this list):
   – In external/RSL_LITE/rsl_lite.h, change the value of RSL_MAXPROC to cover all the NY Blue nodes.
   – In external/RSL_LITE/c_code.c, change all instances of rsl.out.%04 and rsl.error.%04 to rsl.out.%08 and rsl.error.%08.
5. Configure WPS: make the same change as in step 4 in the Blue Gene section of arch/configure.defaults for WPS if you are not using the tar files from step 3. Also, if the environment is set up correctly (previous slides) you should be able to select configure option 1 (with GRIB2 support).
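
The edits in step 4 can of course be made in any editor; purely as a hedged sketch, assuming GNU sed on the front end and the source tree unpacked as WRFV3:
  cd WRFV3
  # widen the MPI-task index in the rsl log file names from 4 to 8 digits (needed for >= 10,000 tasks)
  sed -i 's/rsl\.out\.%04/rsl.out.%08/g; s/rsl\.error\.%04/rsl.error.%08/g' external/RSL_LITE/c_code.c
  # RSL_MAXPROC in rsl_lite.h must also be raised to at least the number of NY Blue nodes in use
  # (the exact value is not reproduced here; edit it by hand as the slide instructs)
  grep -n RSL_MAXPROC external/RSL_LITE/rsl_lite.h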

Mesoscale & Microscale Meteorological Division / NCAR
NYBlue-specific notes to the Online Tutorial: Run the default case end-to-end
1. Skip to the January 2000 Case in the on-line tutorial. The geog_10m.tar.gz, geog_5m.tar.gz, and geog_general.tar.gz terrestrial data files, plus other files including JAN00.TAR.gz, are in ~michalak/Downloads (see the README in that directory).
2. The installed versions of all this data, both non-case-specific and case-specific for the Jan 2000 case, are in ~michalak/DATA.
3. Front-end executable versions of commands like plotgrids.exe, g1print.exe, and ncdump should work if the environment is set correctly (see previous slides). Ignore relative paths to these commands in the on-line tutorial; e.g., ./util/plotgrids.exe should be entered as just plotgrids.exe.
4. Set up the model domain (geogrid.exe). A sample namelist.wps and LoadLeveler batch scripts run_geogrid, run_ungrib, and run_metgrid are in ~michalak/Build/WRF/WPS.
5. Interpolate the input data onto the model domain (metgrid.exe). Add “nocolons = .true.” (without quotes) to the &share block of namelist.wps, and then run metgrid.
6. Run the model (real.exe & wrf.exe). In addition to the settings shown for the namelist.input file, add “nocolons = .true.” (without quotes) to the &time_control part of namelist.input. Change every value of io_form_* from 2 to 11 (to specify parallel NetCDF), and add “io_form_auxinput1 = 11” (no quotes) to the other io_form settings. There is a sample namelist.input in ~michalak/Build/WRF/WRFV3/test/em_real, as well as sample LoadLeveler scripts run_real and run_wrf (a sketch of these namelist edits follows this list).
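
A minimal sketch of the namelist and job-submission steps in item 6 (assuming GNU sed, a stock namelist with io_form_* = 2, and LoadLeveler’s llsubmit for the sample scripts; treat the sample namelist.input named above as authoritative, not this sketch):
  cd ~/your_run_directory          # hypothetical run directory containing namelist.input
  # switch every io_form_* entry from 2 (netCDF) to 11 (parallel netCDF / pNetCDF)
  sed -i '/io_form/ s/= *2/= 11/' namelist.input
  # nocolons = .true. and io_form_auxinput1 = 11 must also be added inside &time_control by hand
  grep -n -e nocolons -e io_form namelist.input   # verify the edits before submitting
  llsubmit run_real                # then, after real.exe completes successfully:
  llsubmit run_wrf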