Real-data WRF: Setup and run (ATM 419, Spring 2016, Fovell)

References
– ARW Users' Guide (PDF available; see the V3/contents.html page on the WRF users site)
– Technical description of WRF (PDF)
– NetCDF Operators (NCO) home page

Terms
– Parent model = gridded data used for initialization and boundary conditions: GFS/FNL, NAM, RAP/HRRR, reanalyses (NARR, CFSR, NNRP, ERA-Interim, etc.), other WRF runs
– WPS = WRF Preprocessing System (consisting of the geogrid, ungrib, and metgrid programs)

Case study
– One 36-km resolution, 54 x 48 point domain centered over Kansas
– 48-h simulation, initialized from GFS at 0000 UTC 13 March 2016
– Verify near-surface fields (T, Td, RH at 2 m; 10-m wind; SLP) against ASOS stations using the Model Evaluation Tools (MET) package
– See the provided script for implementing this case study

Steps in a real-data WRF run (see the sketch after this list)
– Geogrid (geogrid.exe): sets up the domain (and nests, if applicable); only redone if the domain is altered
– Ungrib (ungrib.exe): unpacks the parent model data; requires the correct variable table ("Vtable") translator
– Metgrid (metgrid.exe): interpolates the pressure-level parent model data to the WRF horizontal grid
– Real (real.exe): creates initial and boundary condition files for WRF on a model vertical grid of your choice
– WRF (wrf.exe): compiled as em_real
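
As orientation, here is a minimal sketch of the order in which the executables run. The working directory, the GRIB path, and the by-hand invocations are assumptions; in this class the provided script and batch files perform these steps.

  # Minimal sketch of the real-data workflow (paths are placeholders;
  # the provided class script performs these steps for you)
  cd KANSAS                      # working directory with namelists and links
  ./geogrid.exe                  # 1. define the domain -> geo_em.d01.nc
  ./link_grib.csh /path/to/parent/model/grib/files   # link GRIB input files
  ln -sf Vtable.GFS Vtable       # select the Vtable for the parent model
  ./ungrib.exe                   # 2. unpack GRIB -> intermediate FILE:* files
  ./metgrid.exe                  # 3. horizontal interpolation -> met_em* files
  sbatch -p snow submit_real     # 4. real.exe -> wrfinput_d01, wrfbdy_d01
  sbatch -p snow submit_wrf      # 5. wrf.exe -> wrfout_d01* history files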

Preliminaries
– WRF-ARW needs to be compiled as em_real
– Namelists are namelist.wps (used for geogrid.exe, ungrib.exe, metgrid.exe) and namelist.input (used for real.exe, wrf.exe)
– Create a directory called KANSAS, then copy into it and unpack this file (see the sketch below): /network/rit/lab/atm419lab/KANSAS/SETUP.TAR
– Allocate 1 CPU on snow
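
A minimal sketch of that setup step; the tar flags are an assumption (any equivalent unpack command works):

  mkdir KANSAS && cd KANSAS
  cp /network/rit/lab/atm419lab/KANSAS/SETUP.TAR .
  tar xvf SETUP.TAR        # unpack namelists, scripts, and support files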

make_all_links.csh
In addition to linking to needed programs and support files, this shell script also (illustrative commands below):
– Creates directories called geogrid and metgrid, and places *.TBL files in them (we do not need to alter those files at this time)
– Links to a newer version of NCL (ncl62)
– Copies several variable translation tables, called Vtable.*
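
For a sense of what such a link script does, here are illustrative bash equivalents; the actual make_all_links.csh is a csh script and its exact paths and contents may differ (the WPS location below is inferred from the Variable_Tables path given later):

  # Hypothetical excerpt: the kind of commands a link script issues
  WPS=/network/rit/home/atm419/WPSV371_ATM419     # assumed WPS build location
  ln -sf $WPS/geogrid.exe $WPS/ungrib.exe $WPS/metgrid.exe .
  mkdir -p geogrid metgrid
  cp $WPS/geogrid/GEOGRID.TBL geogrid/            # static-field interpolation table
  cp $WPS/metgrid/METGRID.TBL metgrid/            # met-field interpolation table
  cp $WPS/ungrib/Variable_Tables/Vtable.* .       # variable translation tables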

Geogrid
– Do the geogrid section of the script; this creates geo_em.d01.nc (commands sketched below)
[Figure: domain map, output from plotgrids.ncl]
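
A minimal sketch of this step run by hand; the class script does the equivalent, and the location of plotgrids.ncl in your working directory is an assumption:

  ./geogrid.exe              # reads &share and &geogrid in namelist.wps
  ncl62 plotgrids.ncl        # draw the domain outline (figure above)
  ncview geo_em.d01.nc       # or browse the static fields directly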

namelist.wps (&share)

  &share
   wrf_core = 'ARW',
   max_dom = 1,
   start_date = '2016-03-13_00:00:00', '2016-03-13_00:00:00',
   end_date   = '2016-03-15_00:00:00', '2016-03-15_00:00:00',
   interval_seconds = 10800,
   io_form_geogrid = 2,
   opt_output_from_geogrid_path = './',
   debug_level = 0
  /

NOTES:
– One domain (max_dom = 1), so the second column of start_date and end_date does not matter
– interval_seconds is the time resolution of the parent model data in seconds (for GFS, we have 3-hourly data, so 10800 sec; NAM is hourly to 36 h, 3-hourly thereafter; RAP is hourly to 18 h)

namelist.wps (&geogrid)

  &geogrid
   parent_id         = 1, 1, 2,
   parent_grid_ratio = 1, 3, 3,
   i_parent_start    = 1, 82, 100,
   j_parent_start    = 1, 82, 36,
   s_we              = 1, 1, 1,
   e_we              = 54, 214, 772,
   s_sn              = 1, 1, 1,
   e_sn              = 48, 196, 610,
   geog_data_res     = 'usgs_lakes+30s', 'usgs_lakes+30s', 'usgs_lakes+30s',
   dx = 36000,
   dy = 36000,
   map_proj  = 'lambert',
   ref_lat   = 38.,
   ref_lon   = -100.,
   truelat1  = 38.,
   truelat2  = 38.,
   stand_lon = -100.,
   geog_data_path = '/network/rit/lab/fovelllab_rit/GEOG_V371',
   opt_geogrid_tbl_path = 'geogrid/'
  /

NOTES:
– Again, in this case only the first column matters
– Our domain is 54 x 48 points at 36-km resolution (dx = dy = 36000 m)
– We're using the USGS land-use database; "30s" (30 arc seconds, about 1 km) is its resolution
– The Lambert projection is standard for modest-sized domains in the midlatitudes. Use polar stereographic for high latitudes, Mercator for tropical domains.

Lambert conformal projection (figure from Wikipedia): shape and accuracy depend somewhat on the "true latitudes" (standard parallels)

  ref_lat = 38., ref_lon = -100., truelat1 = 38., truelat2 = 38., stand_lon = -100.

– At the "true latitude" there is no map distortion; i.e., the map factor is 1.0
– For relatively small domains, the two true latitudes can be the same (as here)
– Map factor = (horizontal grid size)/(actual distance), so when the factor is > 1 your real grid size is < ∆x, ∆y

Map factors
– You specify ∆x (and ∆y) in namelist.input
– The map factor m determines the actual grid spacing
– When m > 1.0, your actual grid spacing is smaller than the specified ∆x. This puts stress on your time step.
– When m < 1.0, you have less resolution than you thought you had
– Stay as close to 1.0 as possible

Use NetCDF operators (NCO) to look inside the geo_em.d01.nc file:

  ncwa -y max -v MAPFAC_M geo_em.d01.nc junk.nc
  ncdump junk.nc      [prints the domain maximum, here about 1.01; i.e., the smallest ∆x is 35.7 km]
  ncwa -y min -v MAPFAC_M geo_em.d01.nc junk2.nc
  ncdump junk2.nc     [prints the domain minimum, here about 1.0]

Use ncview (or IDV) to peek at the geo_em.d01.nc file:

  ncview geo_em.d01.nc      [plot the 2D variable MAPFAC_M]

MAPFAC_M increases from ~1 at the central (true) latitude (ref_lat = truelat1 = truelat2 = 38.) to ~1.01 away from it. You need to keep the map factors close to 1.0.
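
A minimal sketch automating that check, assuming NCO is in your PATH; the sed pattern and bc arithmetic are illustrative conveniences, not part of the class scripts:

  # Extract the domain-max map factor and convert it to the smallest
  # effective grid spacing (assumes dx = 36 km, as in this case)
  ncwa -O -y max -v MAPFAC_M geo_em.d01.nc junk.nc
  mmax=$(ncdump junk.nc | sed -n 's/.*MAPFAC_M = \([0-9.]*\).*/\1/p')
  echo "max map factor: $mmax"
  echo "smallest effective dx (km): $(echo "36.0 / $mmax" | bc -l)"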

[Figure: MAPFAC_M viewed in IDV]

Ungrib
In this step, we link to the parent model grids and unpack them into "intermediate format" files. It is crucial to select the proper variable table (Vtable):
– There is a different Vtable for each parent model
– make_all_links.csh copies a few Vtable versions
– The file must be named "Vtable"
– Other variable tables are found in /network/rit/home/atm419/WPSV371_ATM419/ungrib/Variable_Tables
Follow the ungrib part of the script (a by-hand sketch is below).
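
A minimal by-hand sketch of this step for GFS input; the GRIB directory is a placeholder:

  ln -sf Vtable.GFS Vtable                   # the translator MUST be named "Vtable"
  ./link_grib.csh /path/to/gfs/grib2/files   # creates links GRIBFILE.AAA, .AAB, ...
  ./ungrib.exe                               # writes FILE:2016-03-13_00, FILE:2016-03-13_03, ...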

wgrib2 GRIBFILE.AAA | more

  1:0:d=2016031300:UGRD:planetary boundary layer:anl:
  2:…:d=2016031300:VGRD:planetary boundary layer:anl:
  3:…:d=2016031300:VRATE:planetary boundary layer:anl:
  4:…:d=2016031300:GUST:surface:anl:
  5:…:d=2016031300:HGT:10 mb:anl:
  6:…:d=2016031300:TMP:10 mb:anl:
  7:…:d=2016031300:RH:10 mb:anl:
  8:…:d=2016031300:UGRD:10 mb:anl:
  9:…:d=2016031300:VGRD:10 mb:anl:
  10:…:d=2016031300:ABSV:10 mb:anl:
  11:…:d=2016031300:O3MR:10 mb:anl:
  12:…:d=2016031300:HGT:20 mb:anl:
  13:…:d=2016031300:TMP:20 mb:anl:
  14:…:d=2016031300:RH:20 mb:anl:
  15:…:d=2016031300:UGRD:20 mb:anl:
  16:…:d=2016031300:VGRD:20 mb:anl:
  17:…:d=2016031300:ABSV:20 mb:anl:

GFS model grids in GRIB2 format, on pressure levels.
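
One quick way to count the pressure levels in such a file (a sketch; it assumes wgrib2 is available and that temperature records on isobaric levels are representative of the level set):

  # Count isobaric levels carrying temperature; a way to anticipate
  # num_metgrid_levels, though the exact correspondence depends on the Vtable
  wgrib2 GRIBFILE.AAA | grep ':TMP:' | grep -c ' mb:'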

namelist.wps (complete)

  &share
   wrf_core = 'ARW',
   max_dom = 1,
   start_date = '2016-03-13_00:00:00', '2016-03-13_00:00:00',
   end_date   = '2016-03-15_00:00:00', '2016-03-15_00:00:00',
   interval_seconds = 10800,
   io_form_geogrid = 2,
   opt_output_from_geogrid_path = './',
   debug_level = 0
  /

  &ungrib
   out_format = 'WPS',
   prefix = 'FILE',
  /

  &metgrid
   fg_name = 'FILE',
   io_form_metgrid = 2,
  /

NOTES:
– Execution of ungrib.exe unpacks the parent model grids into a set of files named by the prefix (here, 'FILE:…')
– The program looks for files between the start and end dates, at the interval specified as interval_seconds

Plotting intermediate-format files:

  ncl62 plotfmt.ncl 'filename="FILE:2016-03-13_00"'

Metgrid
– Follow the metgrid portion of the script
– In this step, we interpolate the intermediate-format files onto the WRF horizontal grid, creating files called met_em*
– Files may be viewed with ncview (poorly) or IDV (better)
– Use ncdump on any of the met_em* files to get the number of vertical levels and the number of soil levels (see the next slide and the sketch below)
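
A minimal sketch of that check; the grep targets are the dimension names shown on the next slide, and running metgrid.exe by hand (rather than via the class script) is an assumption:

  ./metgrid.exe                  # horizontal interpolation -> met_em* files
  ls met_em.d01.*                # one file per interval_seconds of parent data
  # Pull the level counts needed later in namelist.input (&domains)
  ncdump -h met_em.d01.2016-03-13_00:00:00.nc | grep -E 'num_metgrid_levels|num_s[tm]_layers'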

ncdump met_em.d01.2016-03-13_00:00:00.nc | more

  netcdf met_em.d01.2016-03-13_00\:00\:00 {
  dimensions:
   Time = UNLIMITED ; // (1 currently)
   DateStrLen = 19 ;
   west_east = 53 ;
   south_north = 47 ;
   num_metgrid_levels = 27 ;
   num_st_layers = 4 ;
   num_sm_layers = 4 ;
   south_north_stag = 48 ;
   west_east_stag = 54 ;
   z-dimension0132 = 132 ;
   z-dimension0012 = 12 ;
   z-dimension0016 = 16 ;
   z-dimension0028 = 28 ;

This parent model data source has 27 vertical atmospheric levels and 4 soil temperature and soil moisture layers (st and sm). These counts vary among parent model sources.

[Figure: SLP at the initial time in the domain, as seen with IDV]

Running real.exe and wrf.exe: Batch scripts

Batch scripts
– Running real-data WRF (real.exe and wrf.exe) is often too resource-intensive to execute with srun from the command line. As an alternative, we'll run them as batch jobs on the snow cluster.
– SETUP.TAR provided two files: submit_real and submit_wrf
– Both are presently configured to request 8 CPUs on a single node
– No need to edit these scripts at this time

submit_real

  #!/bin/bash
  # Job name:
  #SBATCH --job-name=atm419
  #SBATCH -n 8
  #SBATCH -N 1
  #SBATCH --mem-per-cpu=7G
  #SBATCH -p snow
  #SBATCH -o sbatch.out
  #SBATCH -e sbatch.err.out

  source /network/rit/home/atm419/.bash_profile   # DO NOT CHANGE
  st_tm="$(date +%s)"
  echo "running real"
  srun -N 1 -n 8 -o real.srun.out ./real.exe

NOTE: the -N and -n values on the srun line need to match the #SBATCH -N and -n requests above.

Steps for running real.exe (a sanity-check sketch follows)
– Submit the job to snow: sbatch -p snow submit_real
– To check on your job status, use: squeue -u yournetid
– When the job disappears from the queue, check the tail of the rsl.out.0000 file with 'trsl'
– Result of real.exe: creation of the files wrfbdy_d01 and wrfinput_d01
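
A sketch of that post-run check; 'trsl' is the class alias for tailing the rsl file, and the exact success-message wording may vary with WRF version:

  # What the 'trsl' alias does, plus a check that real.exe produced its outputs
  tail rsl.out.0000                  # expect a line like "SUCCESS COMPLETE REAL_EM INIT"
  ls -l wrfinput_d01 wrfbdy_d01      # initial and lateral boundary condition files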

Steps for running wrf.exe
– Submit the job to snow: sbatch -p snow submit_wrf
– To check on your job status, use: squeue -u yournetid
– When the job disappears from the queue, check the tail of the rsl.out.0000 file with 'trsl'
– Result of wrf.exe: creation of the wrfout_d01* files
[We can combine the real and wrf jobs in a single batch file, as sketched below.]
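
A minimal sketch of such a combined batch file, modeled on submit_real above; the guard that runs wrf.exe only if real.exe produced its outputs is an added safeguard, not necessarily what the class scripts do:

  #!/bin/bash
  #SBATCH --job-name=atm419
  #SBATCH -n 8
  #SBATCH -N 1
  #SBATCH --mem-per-cpu=7G
  #SBATCH -p snow
  #SBATCH -o sbatch.out
  #SBATCH -e sbatch.err.out
  source /network/rit/home/atm419/.bash_profile

  echo "running real"
  srun -N 1 -n 8 -o real.srun.out ./real.exe
  # Proceed only if real.exe produced its initial/boundary files
  if [ -s wrfinput_d01 ] && [ -s wrfbdy_d01 ]; then
    echo "running wrf"
    srun -N 1 -n 8 -o wrf.srun.out ./wrf.exe
  fi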

[Figure: output of terrain.gs]

Look inside namelist.input

namelist.input (&time_control)

  &time_control
   run_days = 2,
   run_hours = 0,
   run_minutes = 0,
   run_seconds = 0,
   start_year = 2016, 2016,
   start_month = 03, 03,
   start_day = 13, 13,
   start_hour = 00, 00,
   start_minute = 00, 00, 00,
   start_second = 00, 00, 00,
   end_year = 2016, 2016,
   end_month = 03, 03,
   end_day = 15, 15,
   end_hour = 00, 00,
   end_minute = 00, 00, 00,
   end_second = 00, 00, 00,
   interval_seconds = 10800,
   input_from_file = .true., .true.,
   history_interval = 60, 60,
   frames_per_outfile = 1, 1,
  /

NOTES:
– We will run for 2 days, starting and ending at the times shown
– interval_seconds should match the namelist.wps setting
– One history file per history time (frames_per_outfile = 1)
– AGAIN, only the first column matters since max_dom is 1

namelist.input (&domains)

  &domains
   time_step = 180,
   time_step_fract_num = 0,
   time_step_fract_den = 1,
   max_dom = 1,
   e_we = 54, 214,
   e_sn = 48, 196,
   e_vert = 57, 57,
   p_top_requested = 5000,
   num_metgrid_levels = 27,
   num_metgrid_soil_levels = 4,
   dx = 36000., 4000.,
   dy = 36000., 4000.,
   grid_id = 1, 2,
   parent_id = 0, 1,
   i_parent_start = 1, 82,
   j_parent_start = 1, 82,
   parent_grid_ratio = 1, 3,
   parent_time_step_ratio = 1, 3,
   feedback = 1,
  /

NOTES:
– Domain size must match namelist.wps!
– We are requesting 57 vertical levels in real.exe
– Get the num_metgrid* info from the met_em* files via ncdump
– AGAIN, only the first column matters since max_dom is 1

namelist.input (&physics)

  &physics
   mp_physics = 4,
   ra_lw_physics = 4,
   ra_sw_physics = 4,
   radt = 20,
   sf_surface_physics = 2,
   sf_sfclay_physics = 1,
   bl_pbl_physics = 1,
   bldt = 0,
   num_soil_layers = 4,
   num_land_cat = 28,
   cu_physics = 1,
   cudt = 5,
   cugd_avedx = 1,
   isfflx = 1,
   ifsnow = 0,
   icloud = 1,
   do_radar_ref = 1,
   surface_input_source = 1,
   mp_zero_out = 2,
   mp_zero_out_thresh = 1.e-8,
  /

NOTES:
– Many microphysics options are available
– Many options are available for the surface, surface layer, and PBL schemes; the surface layer (sfclay) and PBL (bl_pbl) schemes usually come as pairs
– Here sf_surface_physics = 2 is the Noah LSM, sf_sfclay_physics = 1 is the Monin-Obukhov surface layer, and bl_pbl_physics = 1 is the YSU PBL
– AGAIN, only the first column matters since max_dom is 1

How PBL and surface layer schemes can mix & match
(pbl = bl_pbl_physics, sfclay = sf_sfclay_physics, surface = sf_surface_physics, soil = num_soil_layers)

Some available PBL schemes:
– YSU: pbl = 1, sfclay = 1
– MYJ: pbl = 2, sfclay = 2
– MYNN: pbl = 5, sfclay = 1, 2, or 5
– ACM2: pbl = 7, sfclay = 7

Some land surface models:
– Noah: surface = 2, soil = 4
– NoahMP: surface = 4, soil = 4
– TD: surface = 1, soil = 5
– RUC: surface = 3, soil = 6
– PX: surface = 7, soil = 2
– CLM: surface = 5, soil = 10

[Figure: output of wind.gs (t = 13)]