Convection-Permitting Ensemble Forecasts at CAPS for the Hazardous Weather Testbed (HWT)
Ming Xue
Center for Analysis and Prediction of Storms and School of Meteorology, University of Oklahoma
August 2010
(Title image: ARPS-simulated tornado)

NOAA Hazardous Weather Testbed (HWT)
The HWT is a facility jointly managed by NSSL, SPC, and the NWS Norman WFO to accelerate the transition of promising new technologies into forecasting and warning for hazardous weather. The HWT organizes an annual Spring Experiment that attracts about 100 researchers and forecasters each year. It provides forecasters with a first-hand look at the latest research concepts and potential future products, and immerses researchers in the challenges, needs, and constraints of forecasters.

HWT Spring Experiment Daily Discussions (pictures from spring 2007)

Storm-Scale Convection-Permitting Ensemble and Convection-Resolving Deterministic Forecasting
CAPS/OU has provided CONUS-scale 4-km ensemble and 1-2 km high-resolution forecasts for the HWT Spring Experiment since 2007. NSSL, EMC, NCAR, and GSD provided additional 3-4 km deterministic forecasts.

Motivations
- High uncertainty, nonlinearity, and short deterministic predictability make probabilistic forecast information especially desirable at convective scales;
- Optimal ensemble forecasting configurations at the convective scale are little known;
- Post-processing, verification, and evaluation of both deterministic and probabilistic products at the convective scale face many additional challenges;
- Sufficiently accurate control initial conditions may require assimilation of radar and other high-resolution data;
- The value and best use of probabilistic information for improving convective-scale forecast guidance are unknown;
- Forecasters as well as researchers need to be trained in the use of potential future operational forecast products.

Scientific Issues to Address
- The value and cost-benefit of storm-scale ensembles versus coarser-resolution short-range ensembles versus even-higher-resolution deterministic forecasts;
- Suitable perturbation methods for storm-scale ensembles, e.g., IC, physics, and model perturbations;
- Proper handling and use of boundary perturbations;
- The value and impact of assimilating high-resolution data, including those from radars;
- The most effective ensemble post-processing and the most useful products at the convective scales;
- The impact of such unique products on forecasting and warning.

Forecast Configurations of Four Years
- Spring 2007: 10-member WRF-ARW, 4 km, 33 h, 21Z start time, NAM+SREF ICs. 5 members with physics perturbations only, 5 with physics+IC+LBC perturbations. Single 2-km grid. 2/3 of CONUS (Xue et al.; Kong et al.; 2007 NWP Conf.).
- Spring 2008: larger domain, 00Z start, physics+IC+LBC perturbations for all members. Radar Vr and Z data assimilation for the 4- and 2-km grids! (Xue et al.; Kong et al.; SLS Conf.)
- Spring 2009: 20 members, 4 km, 3 models (ARW, NMM, ARPS), mixed physics/IC/LBCs. Single 1-km grid. Radar DA on native grids. 30-h forecasts from 00Z (Xue et al.; Kong et al.; NWP Conf.).
- Spring 2010: 26 4-km forecasts and one 1-km forecast. Full CONUS domain. Some members differ in physics only, and 3 have storm-scale and mesoscale IC perturbations only, for studying error growth and predictability.
- About 1.5 months each spring season, from mid-April through early June.

Configuration of the 2007 Ensemble: WRF-ARW model at 4 km

Average Domain Total Precipitation
[Figure: domain-average precipitation for Stage II observations vs. the physics-only members and all members, for the Ferrier/MYJ, Thompson/YSU, and WSM6/YSU physics combinations, on native grids (Schwartz et al. 2009a,b)]

Areal Coverages
[Figure: areal precipitation coverage for the Ferrier/MYJ, Thompson/YSU, and WSM6/YSU members, on native grids]
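
Both diagnostics reduce each member's hourly precipitation field to a single scalar. A minimal sketch of the two computations in Python (synthetic data; this is not the actual CAPS processing code):

```python
import numpy as np

def domain_average(precip_2d: np.ndarray) -> float:
    """Domain-average precipitation (same units as the input field)."""
    return float(np.mean(precip_2d))

def areal_coverage(precip_2d: np.ndarray, threshold: float) -> float:
    """Fraction of grid points with precipitation >= threshold."""
    return float(np.mean(precip_2d >= threshold))

# Example with a synthetic 1-h precipitation field (inches):
rng = np.random.default_rng(0)
field = np.maximum(rng.normal(0.02, 0.1, size=(120, 160)), 0.0)
print(domain_average(field), areal_coverage(field, 0.1))
```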

Domain-mean ensemble spread, averaged over 38 forecast dates from April 18 to June 7 (local standard time) (Kong et al. 2007)
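
The spread plotted here is essentially the standard deviation across members at each grid point, averaged over the domain. A minimal sketch of that statistic, assuming a simple (member, y, x) array (not the actual CAPS code):

```python
import numpy as np

def domain_mean_spread(members: np.ndarray) -> float:
    """Domain-mean ensemble spread for one field at one time.

    members: array of shape (n_members, ny, nx).  The spread is the
    standard deviation across members at each grid point, averaged
    over the domain.
    """
    pointwise_std = members.std(axis=0, ddof=1)  # ddof=1: sample std dev
    return float(pointwise_std.mean())

# Example: a synthetic 10-member ensemble
rng = np.random.default_rng(1)
print(domain_mean_spread(rng.normal(size=(10, 120, 160))))
```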

Key Findings from the 2007 Experiment
- The Ferrier/MYJ schemes are associated with greater average precipitation;
- The YSU PBL scheme seems to be associated with relatively less precipitation, on average, in combination with WSM6 or Thompson microphysics;
- Physics-only members are under-dispersive for large-scale fields;
- For precipitation, physics perturbations seem to generate as much spread as IC/LBC perturbations;
- There is a significant high bias for most members, especially on the second day;
- The convection-allowing ensemble clearly outperforms the convection-parameterizing ensemble in propagation, ETS, statistical consistency, ROC, etc. (Adam Clark's talk);
- The 2-km forecasts did not seem to provide much more value than the 4-km forecasts for second-day guidance.

Forecast Configurations: Spring 2008
Larger domain, 00Z start, physics+IC+LBC perturbations for all members. Radar Vr and Z data assimilation for the 4- and 2-km grids! (Xue et al.; Kong et al.; SLS Conf.)

4-km ensemble and 2-km high-resolution domains (3600 × 2688 km)

Movie of the 2-km forecast vs. observations, at 5-minute time intervals

Configuration of 4-km Ensemble

1-h accumulated precipitation: ≥ 0.1 in at t = 12 h, and ≥ 0.01 in at t = 24 h. [figures]
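
Threshold products like these are commonly summarized as exceedance probabilities, i.e., the fraction of members exceeding the threshold at each grid point. A minimal sketch under that assumption (synthetic inputs; not necessarily the exact product shown):

```python
import numpy as np

def exceedance_probability(precip: np.ndarray, threshold: float) -> np.ndarray:
    """Fraction of members with precipitation >= threshold at each point.

    precip: 1-h accumulated precipitation, shape (n_members, ny, nx).
    """
    return (precip >= threshold).mean(axis=0)

# e.g., probability of >= 0.1 in/h from a synthetic 20-member ensemble:
rng = np.random.default_rng(2)
members = np.maximum(rng.normal(0.02, 0.1, size=(20, 120, 160)), 0.0)
prob = exceedance_probability(members, 0.1)   # values in [0, 1]
```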

BIAS comparison: 1-h accumulated precipitation ≥ 0.1 in. [figure]
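
Frequency bias is the standard 2×2 contingency-table score: the ratio of the forecast event area to the observed event area above the threshold. A minimal sketch (hypothetical inputs; not the CAPS verification code):

```python
import numpy as np

def frequency_bias(fcst: np.ndarray, obs: np.ndarray, threshold: float) -> float:
    """Bias = (hits + false alarms) / (hits + misses).

    Values > 1 mean the event (precip >= threshold) is forecast over a
    larger area than observed; < 1 means under-forecast coverage.
    """
    f, o = fcst >= threshold, obs >= threshold
    hits = np.sum(f & o)
    false_alarms = np.sum(f & ~o)
    misses = np.sum(~f & o)
    return float((hits + false_alarms) / (hits + misses))
```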

Bias correction is based on the first 12 days' bias, computed by rank for each forecast hour: (a) sorted 1-h accumulated precipitation, and (b) the differences between members and the observations (bias) for the 24-h forecast, averaged over the 12-day period from April 16 to May 7 (Kong et al., SLS).
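
One plausible reading of this rank-based correction is sketched below: sort each training day's forecast and observed values, average the forecast-minus-observation difference at each rank, then subtract that rank-dependent bias from later forecasts while keeping each value at its original grid point. The exact procedure is in Kong et al.; this sketch assumes a single forecast hour and equal numbers of points:

```python
import numpy as np

def rank_bias(train_fcst: np.ndarray, train_obs: np.ndarray) -> np.ndarray:
    """Mean forecast-minus-observation difference at each rank.

    train_fcst, train_obs: shape (n_days, n_points), 1-h precipitation
    for one forecast hour.  Values are sorted point-wise for each day,
    and the sorted differences are averaged over the training days.
    """
    return (np.sort(train_fcst, axis=1) - np.sort(train_obs, axis=1)).mean(axis=0)

def apply_rank_bias_correction(fcst: np.ndarray, bias_by_rank: np.ndarray) -> np.ndarray:
    """Subtract the rank-dependent bias while keeping each corrected
    value at its original grid point; negative precip is clipped to 0."""
    flat = fcst.ravel()
    order = np.argsort(flat)                       # positions, smallest value first
    corrected = np.empty_like(flat)
    corrected[order] = np.maximum(flat[order] - bias_by_rank, 0.0)
    return corrected.reshape(fcst.shape)
```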

Bias-corrected results (for the later 15 days), with probability matching (Ebert 2001). [Figures for thresholds > 0.01, > 0.1, > 0.5, and > 1.0 in/h]
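
Probability matching in the spirit of Ebert (2001) keeps the spatial pattern of the ensemble mean but replaces its amplitude distribution with the pooled distribution of all member values. A minimal sketch (not the operational implementation):

```python
import numpy as np

def probability_matched_mean(members: np.ndarray) -> np.ndarray:
    """Probability matching in the spirit of Ebert (2001).

    members: precipitation fields, shape (n_members, ny, nx).  The
    ensemble-mean field supplies the spatial pattern; its values are
    replaced, rank for rank, by the pooled member distribution thinned
    to the number of grid points.
    """
    n, ny, nx = members.shape
    mean = members.mean(axis=0).ravel()
    pooled = np.sort(members.ravel())[::n]    # every n-th value: ny*nx samples
    pm = np.empty_like(mean)
    pm[np.argsort(mean)] = pooled             # smallest pooled -> smallest mean
    return pm.reshape(ny, nx)
```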

Rank histograms of 1-h accumulated precipitation at 18 h and 24 h, averaged over the 15 bias-corrected dates (not much improvement to the reliability, though). [figures]
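
A rank histogram counts, at each verification point, how many ensemble members fall below the observation; a flat histogram indicates a statistically consistent ensemble. A minimal sketch (ties, common at zero precipitation, are ignored here but are usually broken randomly in practice):

```python
import numpy as np

def rank_histogram(members: np.ndarray, obs: np.ndarray) -> np.ndarray:
    """Rank histogram (Talagrand diagram) counts.

    members: shape (n_members, n_points); obs: shape (n_points,).
    Counts how many members fall below the observation at each point.
    """
    ranks = (members < obs).sum(axis=0)          # each in 0 .. n_members
    return np.bincount(ranks, minlength=members.shape[0] + 1)
```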

ETS comparison: 1-h accumulated precipitation ≥ 0.1 in. [figure]
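
ETS, also called the Gilbert skill score, is the threat score with the number of hits expected by chance removed. A minimal sketch built from the same contingency counts as the bias example above (hypothetical inputs):

```python
import numpy as np

def equitable_threat_score(fcst: np.ndarray, obs: np.ndarray, threshold: float) -> float:
    """ETS (Gilbert skill score) for the event precip >= threshold.

    hits_random is the hit count expected by chance; ETS is the threat
    score with that chance component removed (1 = perfect, <= 0 = no skill).
    """
    f, o = fcst >= threshold, obs >= threshold
    hits = np.sum(f & o)
    false_alarms = np.sum(f & ~o)
    misses = np.sum(~f & o)
    hits_random = (hits + false_alarms) * (hits + misses) / f.size
    return float((hits - hits_random) / (hits + false_alarms + misses - hits_random))
```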

Forecast Configurations: Spring 2009
20 members, 4 km, 3 models (ARW, NMM, ARPS), mixed physics/IC/LBCs. Single 1-km grid. Radar DA on native grids. 30-h forecasts from 00Z (Xue et al.; Kong et al.; NWP Conf.)

Forecast grids: the ARPS 3DVAR analysis grid; the WRF-ARW (4 and 1 km) and ARPS forecast grid, which is also the common post-processing grid; and the WRF-NMM forecast grid. The 1-km grid is 3603 × 2691 × 51 points.

ETS for 3-hourly precipitation ≥ 0.5 in from the HWT Spring Forecast Experiments of 2008 (32 days) and 2009 (26 days), comparing members with radar DA, without radar DA, and the 12-km NAM:
- The probability-matched score is generally better than that of any ensemble member;
- The 2-km score is no better than that of the best 4-km ensemble member, which may be due to physics;
- The 1-km score is better than that of any 4-km member and better than the 4-km PM score.

BIAS for 1-h precipitation ≥ 0.1 in/h in 2009 (24-day average). [figure]

Reliability diagram for precipitation probability forecasts: 12-h forecast of 1-h accumulated precipitation ≥ 0.1 in. Reliability is improved by using multiple models.
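
A reliability diagram bins the forecast probabilities and plots the observed event frequency in each bin against the mean forecast probability in that bin; a well-calibrated forecast lies on the diagonal. A minimal sketch (assumed flattened inputs; not the CAPS evaluation code):

```python
import numpy as np

def reliability_curve(prob_fcst: np.ndarray, event_obs: np.ndarray, n_bins: int = 10):
    """Points for a reliability diagram.

    prob_fcst: forecast probabilities in [0, 1] at verification points;
    event_obs: boolean event occurrence at the same points.  Returns the
    mean forecast probability and observed frequency per probability bin.
    """
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    idx = np.clip(np.digitize(prob_fcst, edges) - 1, 0, n_bins - 1)
    mean_p, obs_freq = [], []
    for k in range(n_bins):
        in_bin = idx == k
        mean_p.append(prob_fcst[in_bin].mean() if in_bin.any() else np.nan)
        obs_freq.append(event_obs[in_bin].mean() if in_bin.any() else np.nan)
    return np.array(mean_p), np.array(obs_freq)
```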

Object-oriented precipitation forecast clusters (by Aaron Johnson). [Figure: forecast clusters grouped by model (ARW, ARPS, NMM), radar vs. no-radar DA, microphysics, and PBL scheme]

Forecast Configurations: Spring 2010
26 4-km forecasts and one 1-km forecast. Full CONUS domain. Some members differ in physics only, and 3 have storm-scale and mesoscale IC perturbations only, for studying error growth and predictability. About 1.5 months each spring season, from mid-April through early June.

2010 Spring Experiment domains (full CONUS): 3DVAR analysis grid 1200 × 780; NMM 790 × 999; ARW, ARPS, and verification 1160 × 720.

ARW member configuration (19 members):

member  | IC                       | BC                 | Radar data | Microphysics | LSM  | PBL
arw_cn  | 00Z ARPSa                | 00Z NAMf           | yes        | Thompson     | Noah | MYJ
arw_c0  | 00Z NAMa                 | 00Z NAMf           | no         | Thompson     | Noah | MYJ
arw_m3  | arw_cn + random pert     | 00Z NAMf           | yes        | Thompson     | Noah | MYJ
arw_m4  | arw_cn + RF pert         | 00Z NAMf           | yes        | Thompson     | Noah | MYJ
arw_m5  | arw_cn + em-p1 + RF pert | 21Z SREF em-p1     | yes        | Morrison     | RUC  | YSU
arw_m6  | arw_cn + em-p1_pert      | 21Z SREF em-p1     | yes        | Morrison     | RUC  | YSU
arw_m7  | arw_cn + em-p2_pert      | 21Z SREF em-p2     | yes        | Thompson     | Noah | QNSE
arw_m8  | arw_cn – nmm-p1_pert     | 21Z SREF nmm-p1    | yes        | WSM6         | RUC  | QNSE
arw_m9  | arw_cn + nmm-p2_pert     | 21Z SREF nmm-p2    | yes        | WDM6         | Noah | MYNN
arw_m10 | arw_cn + rsmSAS-n1_pert  | 21Z SREF rsmSAS-n1 | yes        | Ferrier      | RUC  | YSU
arw_m11 | arw_cn – etaKF-n1_pert   | 21Z SREF etaKF-n1  | yes        | Ferrier      | Noah | YSU
arw_m12 | arw_cn + etaKF-p1_pert   | 21Z SREF etaKF-p1  | yes        | WDM6         | RUC  | QNSE
arw_m13 | arw_cn – etaBMJ-n1_pert  | 21Z SREF etaBMJ-n1 | yes        | WSM6         | Noah | MYNN
arw_m14 | arw_cn + etaBMJ-p1_pert  | 21Z SREF etaBMJ-p1 | yes        | Thompson     | RUC  | MYNN
arw_m15 | 00Z ARPSa                | 00Z NAMf           | yes        | WDM6         | Noah | MYJ
arw_m16 | 00Z ARPSa                | 00Z NAMf           | yes        | WSM6         | Noah | MYJ
arw_m17 | 00Z ARPSa                | 00Z NAMf           | yes        | Morrison     | Noah | MYJ
arw_m18 | 00Z ARPSa                | 00Z NAMf           | yes        | Thompson     | Noah | QNSE
arw_m19 | 00Z ARPSa                | 00Z NAMf           | yes        | Thompson     | Noah | MYNN

For all ARW members: ra_lw_physics = RRTM; ra_sw_physics = Goddard; cu_physics = none.

NMM member configuration (5 members):

member | IC                   | BC              | Radar data | Microphysics | LW/SW radiation | Sfc physics
nmm_cn | 00Z ARPSa            | 00Z NAMf        | yes        | Ferrier      | GFDL            | Noah
nmm_c0 | 00Z NAMa             | 00Z NAMf        | no         | Ferrier      | GFDL            | Noah
nmm_m3 | nmm_cn + nmm-n1_pert | 21Z SREF nmm-n1 | yes        | Thompson     | RRTM/Dudhia     | Noah
nmm_m4 | nmm_cn + nmm-n2_pert | 21Z SREF nmm-n2 | yes        | WSM 6-class  | RRTM/Dudhia     | RUC
nmm_m5 | nmm_cn + em-n1_pert  | 21Z SREF em-n1  | yes        | Ferrier      | GFDL            | RUC

For all NMM members: pbl_physics = MYJ; cu_physics = none.

ARPS member configuration (2 members):

member  | IC        | BC       | Radar data | Microphysics | Radiation   | Sfc physics
arps_cn | 00Z ARPSa | 00Z NAMf | yes        | Lin          | Chou/Suarez | Force-restore
arps_c0 | 00Z NAMa  | 00Z NAMf | no         | Lin          | Chou/Suarez | Force-restore

For all ARPS members: no cumulus parameterization. Members in red (in the original slide) contribute to the 15-member sub-ensemble for post-processed products.

12–18Z accumulated precipitation, 18-h forecast (June 14, 2010, the OKC flood day). [HWT images: SSEF mean, SSEF probability-matched mean, SREF mean, SREF probability-matched mean, QPE, NCEP 12-km NAM]

12–18Z accumulated precipitation, 18-h forecast (May 19, 2010). [HWT images: SSEF mean, SSEF probability-matched mean, SREF mean, SREF probability-matched mean, QPE, NAM]

Gilbert Skill Scores (ETSs) from the 2010 Spring Experiment: CAPS SSEF (4 and 1 km), ESRL/GSD 3-km HRRR, and NCEP 12-km NAM. [figure]

Refereed publications from the data
Schwartz, C., J. Kain, S. Weiss, M. Xue, D. Bright, F. Kong, K. Thomas, J. Levit, and M. Coniglio, 2009: Next-day convection-allowing WRF model guidance: A second look at 2 vs. 4 km grid spacing. Mon. Wea. Rev., 137.
Schwartz, C. S., J. S. Kain, S. J. Weiss, M. Xue, D. R. Bright, F. Kong, K. W. Thomas, J. J. Levit, M. C. Coniglio, and M. S. Wandishin, 2010: Toward improved convection-allowing ensembles: Model physics sensitivities and optimizing probabilistic guidance with small ensemble membership. Wea. Forecasting, 25.
Clark, A. J., W. A. Gallus, Jr., M. Xue, and F. Kong, 2009: A comparison of precipitation forecast skill between small near-convection-permitting and large convection-parameterizing ensembles. Wea. Forecasting, 24.
Clark, A. J., W. A. Gallus, Jr., M. Xue, and F. Kong, 2010: Growth of spread in convection-allowing and convection-parameterizing ensembles. In press.
Clark, A. J., W. A. Gallus, Jr., M. Xue, and F. Kong, 2010: Convection-allowing and convection-parameterizing ensemble forecasts of a mesoscale convective vortex and associated severe weather. Wea. Forecasting, accepted.
Coniglio, M. C., K. L. Elmore, J. S. Kain, S. Weiss, and M. Xue, 2009: Evaluation of WRF model output for severe-weather forecasting from the 2008 NOAA Hazardous Weather Testbed Spring Experiment. Wea. Forecasting, accepted.
Kain, J. S., M. Xue, M. C. Coniglio, S. J. Weiss, F. Kong, T. L. Jensen, B. G. Brown, J. Gao, K. Brewster, K. W. Thomas, Y. Wang, C. S. Schwartz, and J. J. Levit, 2010: Assessing advances in the assimilation of radar data within a collaborative forecasting-research environment. Wea. Forecasting, accepted.

Web links to papers and realtime products
Xue, M., F. Kong, K. W. Thomas, J. Gao, Y. Wang, K. Brewster, K. K. Droegemeier, X. Wang, J. Kain, S. Weiss, D. Bright, M. Coniglio, and J. Du, 2009: CAPS realtime multi-model convection-allowing ensemble and 1-km convection-resolving forecasts for the NOAA Hazardous Weather Testbed 2009 Spring Experiment. 23rd Conf. Wea. Anal. Forecasting/19th Conf. Num. Wea. Pred., Omaha, NE, Amer. Meteor. Soc., Paper 16A.2.
Kong, F., M. Xue, K. W. Thomas, J. Gao, Y. Wang, K. Brewster, K. K. Droegemeier, J. Kain, S. Weiss, D. Bright, M. Coniglio, and J. Du, 2009: A realtime storm-scale ensemble forecast system: 2009 spring experiment. 23rd Conf. Wea. Anal. Forecasting/19th Conf. Num. Wea. Pred., Omaha, NE, Amer. Meteor. Soc., Paper 16A.3.

Resources
- $125K/year CSTAR funding!
- NSF supercomputers: an 18,000-core Cray XT-4 at NICS, ~5 hours a day in 2010.
- All data archived (TBs/day); the archive needs to be fully exploited.
- Collaboration in analyzing the data is welcome.

Resources
- The $125K/year funding from the NWS CSTAR program was the only specific funding supporting all of the above efforts. Of course, it took much more than $125K/year to do all of this (CAPS is an entirely soft-funded center at the university).
- There was no specific funding for post-analysis and evaluation, or for optimal ensemble system design.
- NSF supercomputing resources were used: a dedicated 18,000-core Cray XT-4 at NICS/University of Tennessee, about 5 hours a day in spring 2010.
- Over 5 million CPU-hours were used each year in 2009 and 2010.
- Gold mines of data await full exploitation.

Future Plan (in the CSTAR Renewal Proposal)
General direction: more emphasis on aviation weather (e.g., 3 weeks in June plus May), more runs per day, shorter forecast ranges, and fine-tuning of the ensemble design:
- Multi-scale IC perturbations, ETKF perturbations, EnKF-based perturbations;
- Land-surface perturbations;
- Possible additional LBC perturbations;
- More intelligent choices of physics suites;
- Addition of COAMPS;
- Improved initial conditions via more advanced data assimilation;
- Possible GSI analyses with the target HRRR setup, and other more experimental configurations/schemes;
- Possible hybrid ensemble-GSI analysis;
- Possible EnKF analysis;
- Post-analysis and probabilistic products: e.g., calibration, bias removal, detailed performance evaluation, cost-benefit/trade-off assessment, and effective products for end users (e.g., for aviation weather and severe storms);
- Integration/coordination with national mesoscale ensemble efforts (DTC/DET collaborations).

Probabilistic Warn-on-Forecast for Tornadoes: the Ultimate Challenge (needs ~100 m resolution)
An ensemble of storm-scale NWP models predicts the path of a potentially tornadic supercell during the next hour, and the ensemble is used to create probabilistic tornado guidance (70%, 50%, and 30% most-likely-tornado-path contours at 10-minute intervals out to 2200 CST). In the example, the forecast looks on track: at 2130 CST the storm circulation (hook echo) is tracking along the centerline of the highest tornado probabilities. [Panels: a developing thunderstorm; radar and the initial forecast at 2100 CST; radar at 2130 CST showing an accurate forecast] (NSSL Warn-on-Forecast briefing, March 5, 2007; Stensrud, Xue, et al., BAMS 2009)

The Computers Used
NICS Kraken (~99K cores) and PSC (4K cores). For 2010: exclusive use of an 18,000-core Cray XT-4 at NICS/University of Tennessee, 6 hours a day.
Thanks!