Exploring the Impact of Rapid-scan Radar Data on NWS Warnings. Pam Heinselman, NOAA National Severe Storms Laboratory. Warn on Forecast Workshop, 8 Feb 2012.

Presentation transcript:

Exploring the Impact of Rapid-scan Radar Data on NWS Warnings. Pam Heinselman, NOAA National Severe Storms Laboratory; Daphne LaDue, OU CAPS; Heather Lazrus, NCAR. Warn on Forecast Workshop, 8 Feb 2012.

Motivation. Stakeholders' needs: faster updates (62%). Source: Radar Operations Center.

Motivation

Motivation. “To enhance the ability to monitor rapid tornadogenesis, the NWS should develop and implement additional Volume Coverage Pattern strategies that allow for more continuous sampling near the surface (e.g., 1-min lowest elevation sampling).”

Motivation: Benefit Assessment
- JDOP (Burgess et al. 1979; Whiton et al. 1998): performance improvement resulting from adding Doppler capability.
- JPOLE (Scharfenberg et al. 2005): case examples illustrating how polarization aided operations; understanding of storm severity, warning decisions, wording in follow-up statements, confidence.

Objective: Explore how improvements in the depiction of storm development from rapid sampling may benefit forecasters' decision-making process.

Background: NWS Decision Making
Andra et al. (2002): description of 5 warning decision factors exemplified during the 3 May 1999 tornado outbreak:
1. Scientifically based conceptual models
2. Two primary data sets: Doppler radar and ground truth
3. Workstations and software
4. Strategy
5. Expertise

Background: NWS Decision Making
Morss and Ralph (2007): participant observation and structured interviews during PACJET and CALJET; assessed the operational benefit of forecaster use of offshore, gap-filling observations provided during CALJET. Use of the additional data appeared to help forecasters:
-- In some cases, the data improved the specificity of the forecast
-- When the initial forecast was fairly accurate, use of the data increased forecaster confidence

Temporal Resolution Experiment: 12 forecasters, April 2010
- Tuesday afternoon: Introduction to PAR and WDSS-II training
- Tuesday evening and Wednesday: Gain experience interrogating PAR data and issuing warnings using WDSS-II WARNGEN
- Thursday

Temporal Resolution Experiment
Paired forecasters with similar radar analysis skills. Each pair worked a tropical supercell event that produced an EF1 tornado (unwarned). Pair 1: 43-s updates. Pair 2: 4.5-min updates.

[Radar imagery comparison: 43-s updates vs. 4.5-min updates, 19 Aug 2007]

Data We Collected
- What they did: audio of the teams working through situation awareness and the case; video of computer screens; products issued
- What we saw: two observers took notes in each room

Data We Collected
- What they thought they did: teams debriefed individually; joint debrief to compare across teams; each individual ranked factors in their warning decision; each individual completed a confidence continuum

Coding and Thematic Analysis
Understanding the decision process: cognitive actions, emotions, experiment design and software, data used.

Example Analysis: 43-s Team Decision Process (PAR data, ~01:37 to ~01:47 UTC)
- Interrogate SRM to “make sure there is nothing as pertinent as far as rotation there” before issuing an SVS on the south storm's warning; identify a circulation on the south storm with strong inbound but weak outbound velocity
- Examine shape of reflectivity
- Issue SVS; no changes are made to the warning
- Notice an increase in reflectivity magnitude in the north storm (0.5°)
- Interrogate base velocity and detect mesocyclones in both storms; velocity in the south storm is stronger aloft
- Interrogate top height and vertical reflectivity structure of both storms
- Interrogate base velocity of the north storm: “We've got that persistent meso. I've got 34 kts inbound now on this, 25 outbound”
- Interrogate reflectivity for a notch signature
- Interrogate base velocity: “usually at 35 kts you need to start considering tor”; compare the location of the velocity couplet with the location of the reflectivity notch
- Interrogate the updated SRM and find “41 inbound, 64 out…I don't think I can ignore that”; open WarnGen
- Issue TOR warning (~01:47 UTC)

43-s Team Warnings
[Figure: time series of 0.5° LLSD azimuthal shear (s-1) vs. time (UTC), ~01:13 to ~01:56, for Comparisons A, B, and C. Warnings shown include SVR and TOR warnings on the north and south storms; negative and positive lead times relative to the EF1 tornado are marked (e.g., -3.2 min, +6 min).]
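
The vertical axis on this and the following warning-timeline slide is 0.5° LLSD (linear least squares derivative) azimuthal shear. Below is a minimal sketch of the idea behind that quantity, assuming a hypothetical llsd_azimuthal_shear function that operates on a simple polar velocity array with a purely azimuthal fitting window; the operational LLSD algorithm uses a 2-D kernel with its own weighting and quality control, so this is only an illustration of the least-squares slope concept.

```python
import numpy as np

def llsd_azimuthal_shear(velocity, ranges_m, az_spacing_deg, half_width=1):
    """Rough LLSD-style azimuthal shear estimate (s^-1).

    velocity       : 2-D array [n_azimuths, n_gates] of Doppler velocity (m/s)
    ranges_m       : 1-D array of gate-center ranges (m)
    az_spacing_deg : radial spacing in degrees
    half_width     : half-width of the azimuthal fitting window (radials)

    For each gate, fit velocity vs. azimuthal arc length (range * dtheta) over a
    small window of radials and take the least-squares slope as dV/ds.
    """
    n_az, n_gates = velocity.shape
    dtheta = np.deg2rad(az_spacing_deg)
    shear = np.full_like(velocity, np.nan, dtype=float)

    for j in range(n_gates):
        arc = ranges_m[j] * dtheta                 # arc length between adjacent radials (m)
        for i in range(half_width, n_az - half_width):
            window = velocity[i - half_width:i + half_width + 1, j]
            if np.any(np.isnan(window)):
                continue                           # skip windows with missing data
            s = arc * np.arange(-half_width, half_width + 1)
            shear[i, j] = np.polyfit(s, window, 1)[0]   # slope = azimuthal shear
    return shear

if __name__ == "__main__":
    # Tiny synthetic example: 8 radials x 5 gates of zero velocity plus a velocity couplet.
    vel = np.zeros((8, 5))
    vel[3, 2], vel[5, 2] = 20.0, -20.0             # opposing velocities around radial 4
    shear = llsd_azimuthal_shear(vel, ranges_m=np.arange(1, 6) * 1000.0, az_spacing_deg=1.0)
    print(shear[4, 2])                             # nonzero shear at the couplet center
```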

4.5-min Team Warnings
[Figure: time series of 0.5° LLSD azimuthal shear (s-1) vs. time (UTC), ~01:13 to ~01:56, for Comparisons A, B, and C. TOR warnings on the north storm are shown; lead times relative to the EF1 tornado of +4.6 min, -0.7 min, and -1.6 min are marked.]
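
For both teams, lead time is simply the signed difference between the tornado time and the warning issuance time, negative when the warning came after the tornado. A small sketch of that bookkeeping follows; the timestamps in the usage lines are placeholders chosen only to reproduce the +6 min and -3.2 min magnitudes shown above, not the experiment's logged times.

```python
from datetime import datetime

def lead_time_minutes(warning_issued, event_time, fmt="%H:%M:%S"):
    """Lead time in minutes: positive if the warning preceded the event."""
    issued = datetime.strptime(warning_issued, fmt)
    event = datetime.strptime(event_time, fmt)
    return (event - issued).total_seconds() / 60.0

# Illustrative placeholder times only.
print(round(lead_time_minutes("01:44:19", "01:50:19"), 1))   # 6.0  (positive lead time)
print(round(lead_time_minutes("01:47:28", "01:44:16"), 1))   # -3.2 (negative lead time)
```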

What we've learned
6 teams interrogated similar radar signatures but came to different conclusions about whether and when to warn.
Decision (confounding) factors (Hahn et al. 2003; Hoffman et al. 2006; Pliske et al. 1997): experience, conceptual models, confidence, tolerance for missing an event, perceived threats, software.

Forecasters' Conceptual Model During Case (43-s Team / 4.5-min Team)
- Weaker couplet strength: 66% / 83%
- Trend in circulation strength: 100%
- Update time detrimental: 0% / 100%
- Environment: 66%
- Reflectivity notch: 100%

[Figure: confidence continuums. Teams A, B, and C plotted from More Confident to Less Confident relative to Usual Confidence on two scales: Understanding of Supercell in Tropical Environment and Understanding of NWRT PAR Data.]

What we've learned
[Figure: warning lead times on a 0 to 20 min scale for the 43-s teams* and 4.5-min teams; values shown are 11.5 min, 4.6 min, 18.6 min, and 6 min.]
*43-s teams issued 50% more warnings: 3 hits, 1 miss, 2 false alarms
- Data analysis was time intensive; exploring other methods
- Warning decision process is complex
- Some decision factors were similar across groups, others were not
- Update time likely had a positive impact on warning lead time
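
The hit/miss/false-alarm counts above map directly onto the standard contingency-table verification scores (POD, FAR, CSI) used in NWS warning verification. A minimal sketch of that arithmetic follows, using the counts from the slide as example input; the helper name is ours, not part of any NWS verification software.

```python
def verification_scores(hits, misses, false_alarms):
    """Standard 2x2 contingency-table scores for warnings."""
    pod = hits / (hits + misses)                    # probability of detection
    far = false_alarms / (hits + false_alarms)      # false alarm ratio
    csi = hits / (hits + misses + false_alarms)     # critical success index
    return pod, far, csi

# Counts reported for the 43-s teams: 3 hits, 1 miss, 2 false alarms.
pod, far, csi = verification_scores(3, 1, 2)
print(f"POD={pod:.2f}  FAR={far:.2f}  CSI={csi:.2f}")   # POD=0.75  FAR=0.40  CSI=0.50
```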