WRF Tutorial
For Version 3.0.1.1 / Synoptic Lab
January 2009
Robert Fovell
rfovell@ucla.edu
http://www.atmos.ucla.edu/~fovell/WRF/wrf_tutorial_2009.ppt.htm

Background on the WRF model
“Weather Research and Forecasting”
Co-developed by the research and operational communities
ARW core: “Advanced Research WRF”
NMM core: “Nonhydrostatic Mesoscale Model”
Supersedes the MM5 and Eta models
Current version: 3.0.1.1
Platforms include Linux and Mac OS X

WRF advantages
Better numerics than MM5: Arakawa C grid, Runge-Kutta scheme, odd-order advection with implicit diffusion
Much less diffusive, larger effective resolution, permits longer time steps
Better handling of topography than Eta (original NAM); the NAM model is now WRF-NMM
Fortran 95 and C (MM5 was F77)
NetCDF, GRIB1 and GRIB2

Further advantages
MPI from the ground up
Allows real-data and idealized simulations in the same framework
Plug-in architecture (different groups will supply WRF “cores”)
Recently added: moving nests and nudging
NetCDF output - many great tools exist, such as the NetCDF operators (NCO): http://nco.sourceforge.net/ (example below)
GRIB2 output now possible as well
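For illustration, a minimal NCO sketch (assuming the standard ARW surface fields T2, U10, V10 and the wrfout file names from the case used later in this tutorial):
> ncks -v T2,U10,V10 wrfout_d01_2006-04-02_18:00:00 sfc_2006-04-02_18.nc
 * extracts a few surface fields into a much smaller file
> ncra wrfout_d01_2006-04-03_0* timemean.nc
 * averages these single-time history files over their record (time) dimension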

WRF disadvantages
Bleeding edge
Smaller range of physics choices (changing quickly…)
Software design is unintuitive for physical scientists
Can take hours to compile, but does not need frequent recompiling
Comparatively slower than MM5
NetCDF files can be huge

WRF and related software
WRF Preprocessing System (WPS): replaces/supersedes the WRF SI
WRF-ARW model: single node, OpenMP and MPI
WRF postprocessing software: RIP (read/interpolate/plot), GrADS, NCL
Specific to the “hurricane” Synoptic Lab environment
Neglecting for now: ARWpost, WRF Chem, WRF Var, MET (verification software), VAPOR, Vis5D

Web resources
WRF model users site: http://www.mmm.ucar.edu/wrf/users
ARW users’ guide: http://www.mmm.ucar.edu/wrf/users/docs/user_guide/contents.html
WRF-ARW/WPS online tutorial: http://www.mmm.ucar.edu/wrf/users/docs/user_guide_V3/contents.html
WRF namelist description: http://www.mmm.ucar.edu/wrf/users/docs/user_guide_V3/users_guide_chap5.htm#_Description_of_Namelist
Tutorial presentations: http://www.mmm.ucar.edu/wrf/users/supports/tutorial.html

My resources
This presentation (PPT format): http://www.atmos.ucla.edu/~fovell/WRF/wrf_tutorial_2009.ppt
WRF on Mac OS X: http://macwrf.blogspot.com

Setup on “hurricane” machines
Presumed:
• tcsh environment
• Intel Fortran compiler (64-bit)
• my environment setup employed
• precompiled versions of WRF, WPS, RIP and wrf_to_grads

Environment setup
> …is the command line prompt
If you don’t have a .cshrc file (worth saving) - recommended:
> cp /home/fovell/.cshrc .
> source .cshrc
If you want to keep your present .cshrc:
> cp /home/fovell/cshrc_fovell.csh .
> source cshrc_fovell.csh
[you would need to do this every time, in every window]

This environment uses my versions of netcdf, mpich, grads, RIP
# RGF additions [abridged]
setenv RIP_ROOT /home/fovell/RIP4
setenv GADDIR /home/fovell/lib/grads
setenv GASCRP /home/fovell/gradslib
# alias lsm 'ls -alt | more'
alias rm 'rm -i'
alias cp 'cp -i'
alias mv 'mv -i'
alias trsl 'tail -f rsl.out.0000'
alias mpirun 'nohup time /home/fovell/mpich-1.2.7p1/bin/mpirun'
alias w2g '/home/fovell/WRF2GrADS/wrf_to_grads'
setenv P4_GLOBMEMSIZE 4096000
setenv P4_SOCKBUFSIZE 65536
unlimit
limit coredumpsize 0

Set up a run directory
> cd
> mkdir SQUALL
> cd SQUALL
> cp /home/fovell/WRFtutorial/make_all_links.csh .
> make_all_links.csh
> cp /home/fovell/WRFtutorial/namelist.* .
[copies namelist.input, namelist.wps]

WRF for a real-data run
A 2006 squall line
[This example uses data that may not remain online]

Surface map 00Z April 3, 2006

Radar 00Z April 3, 2006

WPS overview
Tasks:
(1) set up a domain (can be reused): geogrid.exe
(2) unpack parent model data (e.g., from GFS, NAM, etc.): ungrib.exe
(3) prepare unpacked data for WRF: metgrid.exe
Controlled by namelist.wps
Input data source: FNL analysis
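Putting the pieces together, the whole WPS pass for this case looks roughly like the following sketch (each step is detailed on the next slides; the executables and link scripts are assumed to be in the run directory via make_all_links.csh):
> geogrid.exe
 * localizes the domains; writes geo_em.d0*.nc
> link_grib.csh /home/fovell/FNL2006/fnl
 * links the parent-model GRIB files
> ln -sf /home/fovell/WRFtutorial/Vtable.GFS Vtable
> ungrib.exe
 * decodes the GRIB data into intermediate files
> metgrid.exe
 * interpolates to the WRF grid; writes the met_em.d0* files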

namelist.wps
Setup for 3 telescoping domains
&share
 wrf_core = 'ARW',
 max_dom = 3,
 start_date = '2006-04-02_18:00:00','2006-04-02_18:00:00','2006-04-02_18:00:00',
 end_date   = '2006-04-04_00:00:00','2006-04-04_00:00:00','2006-04-04_00:00:00',
 interval_seconds = 21600
 io_form_geogrid = 2,
/
For start_date, end_date, need one column for each domain
interval_seconds is the parent model data frequency (here, 6 h for FNL data)

namelist.wps (cont.)
&geogrid
 parent_id         = 1, 1, 2,
 parent_grid_ratio = 1, 3, 3,
 i_parent_start    = 1, 15, 30,
 j_parent_start    = 1, 15, 37,
 e_we              = 70, 133, 190,
 e_sn              = 70, 133, 190,
 geog_data_res     = '10m','2m','30s',
 dx = 36000,
 dy = 36000,
 map_proj = 'lambert',
 ref_lat   = 39.,
 ref_lon   = -88.00,
 truelat1  = 30.0,
 truelat2  = 60.0,
 stand_lon = -88.00,
 geog_data_path = '/home/fovell/WPS_GEOG/geog'
/
there is more…

geogrid - create domain
> geogrid.exe
 * creates geo_em.d0*.nc (a NetCDF file)
 * look for “Successful completion of geogrid.”
> plotgrids.exe
 * creates gmeta
> idt gmeta
 * uses NCAR Graphics tool to view domain

> ncdump geo_em.d01.nc | more
netcdf geo_em.d01 {
dimensions:
 Time = UNLIMITED ; // (1 currently)
 DateStrLen = 19 ;
 west_east = 69 ;
 south_north = 69 ;
 south_north_stag = 70 ;
 west_east_stag = 70 ;
 land_cat = 24 ;
 soil_cat = 16 ;
 month = 12 ;
variables:
 char Times(Time, DateStrLen) ;
 float XLAT_M(Time, south_north, west_east) ;
  XLAT_M:FieldType = 104 ;
  XLAT_M:MemoryOrder = "XY " ;
  XLAT_M:units = "degrees latitude" ;
  XLAT_M:description = "Latitude on mass grid" ;
  XLAT_M:stagger = "M" ;

Parent model data issues
Sources include GFS, FNL, NAM, RUC, NARR and NNRP reanalysis data, etc.
Need a different Vtable (variable table) for each source
e.g., Vtable.GFS, Vtable.NAM, Vtable.NARR, etc.
Look in /home/fovell/WRFtutorial

Accessing parent model data
> link_grib.csh /home/fovell/FNL2006/fnl
 * links to where parent model data for the case reside
 ** data files start with ‘fnl*’
> ln -sf /home/fovell/WRFtutorial/Vtable.GFS Vtable
 * specifies appropriate Vtable
> ungrib.exe
 * extracts parent model data
 * look for “Successful completion of ungrib.”

Next step: metgrid
> metgrid.exe
...hopefully you see... “Successful completion of metgrid.”
...Output looks like...
met_em.d01.2006-04-02_18:00:00.nc
met_em.d01.2006-04-03_00:00:00.nc
met_em.d01.2006-04-03_06:00:00.nc
met_em.d01.2006-04-03_12:00:00.nc
met_em.d01.2006-04-03_18:00:00.nc
met_em.d01.2006-04-04_00:00:00.nc
met_em.d02.2006-04-02_18:00:00.nc
met_em.d02.2006-04-03_00:00:00.nc
met_em.d02.2006-04-03_06:00:00.nc
met_em.d02.2006-04-03_12:00:00.nc
met_em.d02.2006-04-03_18:00:00.nc
met_em.d02.2006-04-04_00:00:00.nc
met_em.d03.2006-04-02_18:00:00.nc
met_em.d03.2006-04-03_00:00:00.nc
met_em.d03.2006-04-03_06:00:00.nc
met_em.d03.2006-04-03_12:00:00.nc
met_em.d03.2006-04-03_18:00:00.nc
met_em.d03.2006-04-04_00:00:00.nc

ncdump on a metgrid file
netcdf met_em.d01.2006-04-02_18:00:00 {
dimensions:
 Time = UNLIMITED ; // (1 currently)
 DateStrLen = 19 ;
 west_east = 69 ;
 south_north = 69 ;
 num_metgrid_levels = 27 ;
This data source has 27 vertical levels. This will vary with source - so check a met_em* file.

WRF model steps
Tasks:
Run real.exe (to finish creation of WRF model input data)
Run wrf.exe
Both use namelist.input
Configured separately from namelist.wps, but includes overlapping information
Examples set up to use OpenMP on a single node (i.e., WRF will use 1 or more cores)
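For illustration, the OpenMP thread count is set through an environment variable before launching the executables (4 threads assumed here, matching the check shown a few slides ahead):
> setenv OMP_NUM_THREADS 4
 * real.exe and wrf.exe will then use 4 cores on the node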

namelist.input
&time_control
 run_days = 0,
 run_hours = 12,
 run_minutes = 0,
 run_seconds = 0,
 start_year   = 2006, 2006, 2006,
 start_month  = 04, 04, 04,
 start_day    = 02, 02, 02,
 start_hour   = 18, 18, 18,
 start_minute = 00, 00, 00,
 start_second = 00, 00, 00,
 end_year   = 2006, 2006, 2006,
 end_month  = 04, 04, 04,
 end_day    = 03, 03, 03,
 end_hour   = 06, 06, 06,
 end_minute = 00, 00, 00,
 end_second = 00, 00, 00,
For start_*, end_*, one column per domain

namelist.input (cont.)
 input_from_file = .true., .true., .true.,
 history_interval = 60, 60, 60,
 frames_per_outfile = 1, 1, 1,
 restart = .false.,
 restart_interval = 5000,
interval_seconds matches namelist.wps
input_from_file should normally be ‘true’ for each domain
history_interval - how frequently (in min) output is created
frames_per_outfile - number of writes in each history file
If you wish to restart the model, set restart = .true. (and set the model start_* data to the restart time)
restart_interval = frequency (min) for writing restart files
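For illustration, a restart sketch (assuming restart_interval was small enough that a wrfrst_d01_2006-04-03_00:00:00 restart file was actually written): to restart this run from 00 UTC 3 April, set
 restart = .true.,
 start_day  = 03, 03, 03,
 start_hour = 00, 00, 00,
and rerun wrf.exe.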

namelist.input (cont.)
&domains
 time_step = 60,
 time_step_fract_num = 0,
 time_step_fract_den = 1,
 max_dom = 1,
 s_we = 1, 1, 1,
 e_we = 70, 133, 190,
 s_sn = 1, 1, 1,
 e_sn = 70, 133, 190,
 s_vert = 1, 1, 1,
 e_vert = 51, 51, 51,
 num_metgrid_levels = 27
 dx = 36000, 12000, 4000,
 dy = 36000, 12000, 4000,
 grid_id = 1, 2, 3,
 parent_id = 0, 1, 2,
 i_parent_start = 0, 15, 30,
 j_parent_start = 0, 15, 37,
 parent_grid_ratio = 1, 3, 3,
 parent_time_step_ratio = 1, 3, 3,

namelist.input (cont.)
&physics
 mp_physics = 3, 3, 3,
 ra_lw_physics = 1, 1, 1,
 ra_sw_physics = 1, 1, 1,
 radt = 10, 10, 10,
 cam_abs_freq_s = 21600,
 cam_abs_dim1 = 4,
 cam_abs_dim2 = 51,
 levsiz = 59,
 paerlev = 29,
 sf_sfclay_physics = 1, 1, 1,
 sf_surface_physics = 2, 2, 2,
 bl_pbl_physics = 1, 1, 1,
 bldt = 0, 0, 0,
 cu_physics = 1, 1, 0,
 cudt = 5, 5, 5,
there is more…

Notes on physics
Need to use the SAME microphysics (mp) scheme in each domain, but can use different cumulus (cu) schemes
Some physics combinations work better than others, some don’t work at all -- this is only lightly documented
bldt = 0 means the boundary layer (bl) scheme is called every time step
Standard practice: call bl every time step, cu every 5 min, ra (radiation) every 10 min
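Read against the &physics block above (same values, annotations added for orientation; the 1-minute-per-km guideline for radt is a common rule of thumb, not a requirement):
 bldt = 0, 0, 0,     [0 = call the PBL scheme every model time step]
 cudt = 5, 5, 5,     [call the cumulus scheme every 5 min]
 radt = 10, 10, 10,  [call radiation every 10 min; a common guideline is roughly 1 min per km of dx]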

namelist.input (cont.)
&dynamics
 w_damping = 0,
 diff_opt = 1,            [subgrid turbulence]
 km_opt = 4,              [ “ ]
 diff_6th_opt = 0,        [numerical smoothing]
 diff_6th_factor = 0.12,  [ “ ]
 base_temp = 290.
 damp_opt = 0,
 zdamp = 5000., 5000., 5000.,
 dampcoef = 0.01, 0.01, 0.01
 khdif = 0, 0, 0,
 kvdif = 0, 0, 0,
Only some diff_opt/km_opt combinations make sense, and choices are resolution-dependent.
More info: http://www.mmm.ucar.edu/wrf/users/supports/tutorial.html

http://www.mmm.ucar.edu/wrf/users/tutorial/200901/WRF_Physics_Dudhia.pdf

real.exe
Has changed a lot since version 2.1.2
Number of vertical model levels is specified in this step
num_metgrid_levels comes from the parent model; you set e_vert (number of WRF levels) here
Can reset WRF levels by rerunning real.exe
Can also specify which levels you want
 e_vert = 51, 51, 51,
 num_metgrid_levels = 27

Setting levels in namelist.input (optional)
WRF uses “sigma” or “eta” coordinates (1.0 is the model bottom, 0.0 is the top)
Adding the lines below to &domains in namelist.input (presuming e_vert = 51) requests a model top pressure of 50 mb (5000 Pa) and concentrates vertical resolution in the lower troposphere:
 p_top_requested = 5000
 eta_levels = 1.00,0.9969,0.9935,0.9899,0.9861,0.9821,
              0.9777,0.9731,0.9682,0.9629,0.9573,0.9513,
              0.9450,0.9382,0.9312,0.9240,0.9165,0.9088,
              0.9008,0.8925,0.8840,0.8752,0.8661,0.8567,
              0.8471,0.8371,0.8261,0.8141,0.8008,0.7863,0.7704,
              0.7531,0.7341,0.7135,0.6911,0.6668,0.6406,
              0.6123,0.5806,0.5452,0.5060,0.4630,0.4161,
              0.3656,0.3119,0.2558,0.1982,0.1339,0.0804,0.0362,0.0000,

Run real.exe and wrf.exe
> setenv | grep OMP
OMP_NUM_THREADS=4
> real.exe
[look for…]
d01 2006-04-03_06:00:00 real_em: SUCCESS COMPLETE REAL_EM INIT
• max_dom in namelist.input is set to 1… although we created three domains, only one will be used in the run

Run wrf.exe
Output of real.exe is wrfbdy_d01 and wrfinput_d01 (NetCDF files) - input files created for each domain, up to max_dom
Run the model:
> nohup time wrf.exe > wrf.out &
• Creates wrfout_d0* files keyed by simulation date for each domain
> tail -f wrf.out
[to watch the model run]

WRF model output
Namelist set up to do a 12 h run
Look for at the end of the wrf.out file:
d01 2006-04-03_06:00:00 wrf: SUCCESS COMPLETE WRF
Output files created:
wrfout_d01_2006-04-02_18:00:00
wrfout_d01_2006-04-02_19:00:00
wrfout_d01_2006-04-02_20:00:00
wrfout_d01_2006-04-02_21:00:00
wrfout_d01_2006-04-02_22:00:00
wrfout_d01_2006-04-02_23:00:00
wrfout_d01_2006-04-03_00:00:00
wrfout_d01_2006-04-03_01:00:00
wrfout_d01_2006-04-03_02:00:00
wrfout_d01_2006-04-03_03:00:00
wrfout_d01_2006-04-03_04:00:00
wrfout_d01_2006-04-03_05:00:00
wrfout_d01_2006-04-03_06:00:00
This is because history_interval was 60 min and frames_per_outfile was 1
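As an optional sanity check, ncdump can list the header of a history file, including the grid dimensions and the physics options recorded as global attributes:
> ncdump -h wrfout_d01_2006-04-03_06:00:00 | more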

Postprocessing WRF output: RIP and GrADS (Vis5D, ARWpost and VAPOR also exist)

RIP
RIP operates in batch mode, using input scripts
RIP can overlay fields, do arbitrary cross-sections, calculate trajectories, and create Vis5D output files
RIP tasks include:
 Unpack model output data (ripdp_wrf)
 Create RIP plotting scripts (rip.in files)
 Execute scripts (rip)
RIP can create a LOT of output files

RIP procedure
> cp /home/fovell/WRFtutorial/rip.T2.in .
> ripdp_wrf run1 all wrfout_d01*
[this creates a new dataset called ‘run1’ and uses all wrfout_d01 files created]
> rip run1 rip.T2.in
[the rip.T2.in file is a script containing RIP plotting commands]
[the output file, rip.T2.cgm, is a graphics metafile]
You can view the cgm file using idt or ictrans
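For example (assuming NCAR Graphics is on the path, as in this environment):
> idt rip.T2.cgm
 * steps through the plot frames in an X11 window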

Domain 1 terrain

12 h forecast (2m T - color; 1h precip totals; 10 m vector winds)


RIP script
http://www.mmm.ucar.edu/wrf/users/docs/ripug.htm
===========================================================================
feld=T2; ptyp=hc; vcor=s; levs=1fb; cint=1; cmth=fill;>
 arng; cbeg=273; cend=303; cosq=0,violet,12.5,blue,25,green,37.5,>
 light.green,50,white,62.5,yellow,75,orange,87.5,red,100,brown
feld=U10,V10; ptyp=hv; vcmx=20.0; colr=black; linw=1; intv=2;
feld=slp; ptyp=hc; vcor=s; levs=1fb; cint=4; nohl;colr=blue;linw=2;nolb
feld=map; ptyp=hb; colr=dark.blue; linw=2;
feld=tic; ptyp=hb

GrADS and wrf_to_grads
GrADS produces beautiful graphics
Batch scriptable AND interactive
Interactive: good for overlaying different datasets, computing difference fields [can also be done in RIP]
Doesn’t create huge numbers of intermediate files like RIP
Arbitrary cross-sections are difficult to construct

GrADS procedure
Copy control_file from /home/fovell/WRFtutorial and edit:
 select the variables desired and define the wrfout files to be accessed (next slide)
> w2g control_file run1g
 * creates run1g.ctl, run1g.dat
http://grads.iges.org/grads/head.html

control_file
-3 ! times to put in GrADS file, negative ignores this
0001-01-01_00:00:00
0001-01-01_00:05:00
0001-01-01_00:10:00
end_of_time_list
! 3D variable list for GrADS file
! indent one space to skip
U ! U Component of wind
V ! V Component of wind
UMET ! U Component of wind - rotated (diagnostic)
VMET ! V Component of wind - rotated (diagnostic)
W ! W Component of wind
THETA ! Theta
TK ! Temperature in K
TC ! Temperature in C
List of available 2D fields follows

control_file (cont.)
! All list of files to read here
! Indent not to read
! Full path OK
wrfout_d01_2006-04-02_18:00:00
wrfout_d01_2006-04-02_19:00:00
wrfout_d01_2006-04-02_20:00:00
[more…]
end_of_file_list
! Now we check to see what to do with the data
real ! real (input/output) / ideal / static
1 ! 0=no map background in grads, 1=map background in grads
-1 ! specify grads vertical grid
  ! 0=cartesian,
  ! -1=interp to z from lowest h
  ! 1 list levels (either height in km, or pressure in mb)
1000.0
950.0
900.0
850.0
800.0
750.0

Running GrADS
> gradsnc -l
[GrADS graphics output window opens]
ga-> open run1g
[ga-> is the GrADS environment prompt]
ga-> /home/fovell/WRFtutorial/T2_movie.gs
[executes this GrADS script; hit return to advance a frame]
ga-> quit
[to exit]
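A short interactive sketch (variable names such as TC depend on what was kept in control_file):
ga-> q file
 [lists the variables and dimensions GrADS found in the opened dataset]
ga-> set t 13
ga-> d tc
 [displays temperature in °C at the lowest level for the final output time]
ga-> clear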

12 h forecast (2 m T, 1 h precip totals, 10 m winds)

GrADS ctl file
dset ^run1g.dat
undef 1.e35
pdef 69 69 lcc 27.026 -100.234 1.000 1.000 60. 30. -88.000 36000. 36000.
xdef 228 linear -106.5 0.16216215
ydef 148 linear 26.0 0.16216215
zdef 50 levels
0.02976
0.09550
0.18799
[more…]
tdef 13 linear 18:00z02apr2006 1hr
vars 24
U 50 0 U Component of wind
V 50 0 V Component of wind
W 50 0 W Component of wind
THETA 50 0 Theta
TC 50 0 Temperature in C
Sometimes, wrf_to_grads fails to detect the proper time increment. In this case, I manually changed “19hr” to “1hr”. A bug.
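One possible way to patch that tdef line (assuming GNU sed; otherwise just edit the file by hand):
> sed -i 's/ 19hr/ 1hr/' run1g.ctl
 * then confirm the tdef line reads “tdef 13 linear 18:00z02apr2006 1hr” before reopening the dataset in GrADS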

WRF for idealized cases

Idealized cases
WRF comes with “pre-packaged” idealized case examples (2D squall line, 2D sea breeze, 3D supercell, mountain wave, etc.)
Type compile to see the list of cases
2D examples cannot use MPI or OpenMP
Can be modified
Next example: modified 3D supercell to do a 3D squall line initialized with a line thermal w/ random perturbations
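The general recipe for a bundled idealized case, sketched with the stock 3D supercell case (em_quarter_ss) as an example, starting from the top-level WRF directory (the modified squall-line setup shown next is a variant of this):
> compile em_quarter_ss >& compile.log
 * builds ideal.exe and wrf.exe for the quarter-circle-shear supercell case
> cd test/em_quarter_ss
> ideal.exe
 * creates wrfinput_d01 from namelist.input and the input_sounding file (no WPS, real.exe or wrfbdy needed)
> wrf.exe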

6 h forecast - surface T’ and column max W

= end =