Ed Tollerud
HMT-DTC Task
In collaboration with ESRL, RFCs, WFOs, and CDWR


DTC Goals and the HMT
- Provide an early step up for front-line forecasters vis-à-vis ensemble usage
- Demonstrate the usefulness of prototype ensemble forecast systems, thereby providing long-range guidance for ensemble forecast systems
- Enhance and motivate MET development
HMT-West ensemble QPF evaluation is the current focus for attaining these goals

Major accomplishments - past 6 months
- Online WRF ensemble verification in near-real-time:
  - Regionalization of HMT domain
  - 30-day summaries
  - Observational data options
- Object-based (MODE/MET) verification of historical atmospheric rivers
- Direct comparison between GFS and WRF ensemble mean
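MODE's object-based approach identifies precipitation features by smoothing the raw field, thresholding the result, and treating connected regions as objects whose attributes can then be matched between forecast and observation. The following is a minimal conceptual sketch of the identification step only, not MET's actual code; the smoothing radius and threshold values are illustrative:

```python
import numpy as np

def box_smooth(field, radius):
    """Mean filter over a (2*radius+1)^2 box, zero-padded at the edges."""
    f = np.asarray(field, float)
    padded = np.pad(f, radius)
    out = np.zeros_like(f)
    size = 2 * radius + 1
    for dr in range(size):
        for dc in range(size):
            out += padded[dr:dr + f.shape[0], dc:dc + f.shape[1]]
    return out / size**2

def find_objects(field, radius, threshold):
    """Smooth, threshold, and label 4-connected regions as objects."""
    mask = box_smooth(field, radius) >= threshold
    labels = np.zeros(mask.shape, dtype=int)
    n_objects = 0
    for i, j in zip(*np.nonzero(mask)):
        if labels[i, j]:
            continue
        n_objects += 1                  # start a new object; flood-fill it
        stack = [(i, j)]
        labels[i, j] = n_objects
        while stack:
            r, c = stack.pop()
            for nr, nc in ((r-1, c), (r+1, c), (r, c-1), (r, c+1)):
                if (0 <= nr < mask.shape[0] and 0 <= nc < mask.shape[1]
                        and mask[nr, nc] and not labels[nr, nc]):
                    labels[nr, nc] = n_objects
                    stack.append((nr, nc))
    return labels, n_objects

# Two well-separated rain areas on a toy grid -> two objects
precip = np.zeros((20, 20))
precip[2:7, 2:7] = 50.0
precip[12:17, 12:17] = 50.0
_, count = find_objects(precip, radius=2, threshold=5.0)
print(count)  # 2
```

In MET itself the smoothing is a circular convolution and objects carry attributes (centroid, area, orientation) used for matching; this sketch covers only the labeling idea.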

Online QPF verification for HMT-West
- Equitable Threat Scores (ETS) for ensemble members
- Forecasts initiated at 1200 UTC 1/
- Ensemble mean (black)
- GFS (green)
- 30-day summary boxplots, ETS
- January 2010
- Large HMT domain
- Intensity thresholds: 0.01, 0.1, 0.5, 1, 3 in
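For reference, ETS measures the hit rate for exceeding a threshold relative to the hits expected by chance. A minimal sketch of the computation for one threshold (illustrative only, not the MET implementation; the toy arrays are invented):

```python
import numpy as np

def equitable_threat_score(forecast, observed, threshold):
    """Equitable Threat Score (Gilbert Skill Score) for one intensity threshold."""
    f = np.asarray(forecast) >= threshold
    o = np.asarray(observed) >= threshold
    hits = np.sum(f & o)
    false_alarms = np.sum(f & ~o)
    misses = np.sum(~f & o)
    total = f.size
    # Hits expected by random chance, given forecast and observed frequencies
    hits_random = (hits + false_alarms) * (hits + misses) / total
    denom = hits + false_alarms + misses - hits_random
    return (hits - hits_random) / denom if denom else 0.0

# Toy 24-h accumulations (inches) at six grid points, 0.5-in threshold
fcst = np.array([0.0, 0.3, 0.7, 1.2, 0.6, 0.1])
obs = np.array([0.0, 0.6, 0.8, 1.0, 0.2, 0.0])
print(round(equitable_threat_score(fcst, obs, 0.5), 3))  # 0.2
```

ETS is 1 for a perfect forecast and 0 for a forecast no better than chance, which is why it is a common headline score for threshold-based QPF verification.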

Planned activities - next 6 months
- Revise and expand winter exercise verification demonstration website
- Prepare and present preliminary results from winter exercise verification
- Develop probabilistic capabilities for QPF in the HMT
- Continue retrospective study of MET/MODE applications to atmospheric river forecasts

Anticipated major accomplishments - next 6 months
The following features added to the HMT-West winter verification exercise:
- 1-2 additional baseline models (SREF, NAM,…)
- Time series display of MODE attributes
- Displays of reliability and ROC diagrams
- 6h gage accumulation options
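The reliability and ROC displays mentioned above plot, respectively, observed frequency against forecast probability, and probability of detection (POD) against probability of false detection (POFD) as the decision threshold varies. A hedged sketch of how ROC points could be computed from ensemble-derived probabilities (an illustrative helper, not MET code; the toy data are invented):

```python
import numpy as np

def roc_points(prob_fcst, obs_event, thresholds):
    """(POFD, POD) pairs for a set of probability decision thresholds."""
    p = np.asarray(prob_fcst, float)
    o = np.asarray(obs_event, bool)
    pts = []
    for t in thresholds:
        warn = p >= t                                  # issue a "yes" forecast
        pod = np.sum(warn & o) / max(np.sum(o), 1)     # probability of detection
        pofd = np.sum(warn & ~o) / max(np.sum(~o), 1)  # prob. of false detection
        pts.append((float(pofd), float(pod)))
    return pts

# Toy exceedance probabilities vs. whether the event actually occurred
probs = np.array([0.9, 0.8, 0.6, 0.4, 0.2, 0.1])
event = np.array([True, True, False, True, False, False])
for pofd, pod in roc_points(probs, event, [0.25, 0.5, 0.75]):
    print(round(pofd, 2), round(pod, 2))
```

Plotting the pairs (plus the corners (0,0) and (1,1)) traces the ROC curve; a curve above the diagonal indicates a forecast with discrimination skill.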

Plans for AOP 2011
- Incorporate spatial verification statistics for ensembles
- Implement METviewer online displays
- Collaborate with DET to apply MET-based probabilistic verification techniques
- Develop procedures to assess gridded QPE products and their impact on verification
- Extend verification to additional meteorological fields
- Implement prototype AR verification utility

Anticipated major accomplishments – AOP 2011
- Revised and expanded verification options appropriate for HMT (West and East)
- Comprehensive suite of probabilistic verification utilities applied to QPF for HMT
- Object-based verification techniques implemented for AR forecasting
- Reports and presentations from HMT-West verification activities

Resource requirements for AOP 2011

Staff Category       FTE
Scientists           1.0
Software Engineers   0.5
Students             0.1

Scientists           Software Engineers
Ed Tollerud          John Halley Gotway
Tara Jensen          Paul Oldenburg
CIRES associate      Randy Bullock
RAL statistician     Brent Ertz
Tressa Fowler        Chris Harrop
Barb Brown

Non-salary           Cost
Travel               $7 K
Publications         $2 K

Challenges (Science and Resources)
- Establishing synergistic roles of DET and DTC vis-à-vis ensemble verification
- Determining the potential use and advantages of regional and episodic verification
- Developing appropriate downscaling methods for baselining and comparison
- Improving forecaster access to results of evaluation (ALPS? AWIPS?)
- Validating long-range point precipitation forecast for Tillamook for September 2011

Tara Jensen
HWT-DTC Collaboration
Collaborating with:
- NOAA: SPC, NSSL, HPC, AWC, ESRL, EMC
- Universities: OU/CAPS
- NCAR: MMM
- UCAR: COMET

HWT-DTC Collaboration Goals
- Explore a different R2O paradigm: the "O" is forecasters, WFOs, and prediction centers
- Evaluate convection-allowing models on the leading edge of NWP
- Gather a robust dataset for ensemble and deterministic studies; datasets and algorithms leveraged by DET
- Demonstrate the utility of MET-based objective evaluation in a forecast environment
- Provide near real-time evaluation during the Spring Experiment; application of MET feeds back into MET development
- Facilitate faster R2O through focused evaluation of key topics identified during the Spring Experiment
- Perform retrospective studies using the DTC's METviewer

Major accomplishments - past 6 months
- Enormous expansion of near real-time evaluation capability:
  - Evaluation of 30 models during the Spring Experiment (CAPS ensemble + 3 operational baselines)
  - 10 deterministic and 4 ensemble products evaluated using traditional and spatial verification methods
  - Three additional research models available for retrospective studies
- DTC staff participation in each week of Spring Experiment 2010
- Manuscript accepted by AMS Weather and Forecasting (Kain et al. 2010, on the impact of radar assimilation on short-term forecasts)

2010 HWT Spring Experiment – 30 models
[Figure: skill comparison of CAPS SSEF Ensemble PM Mean; CAPS 1 km model; CAPS SSEF ARW-CN (control w/ radar assimilation); CAPS SSEF ARW-C0 (control w/o radar assimilation); 3 km HRRR; 12 km NAM]
- Assimilation continues to show greater skill than no assimilation
- CAPS Ensemble Mean generally has higher skill than the 1 km deterministic run and operational baselines

Planned activities - next 6 months
- Complete re-run of Spring Experiment evaluation
- Add additional probabilistic measures (Brier Skill Score, reliability diagrams, and ROC curves)
- Perform retrospective evaluation of all members/products
[Figure: Probabilistic verification of ensemble products (for a single run) – Brier score and decomposition, comparing CAPS Simple, CAPS Neighborhood, and SREF Simple]
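A compact sketch of the Brier score and its standard reliability/resolution/uncertainty decomposition (Murphy 1973), as one way the planned measures could be computed; this is illustrative, not MET's implementation, and the binning choice is an assumption. The identity BS = reliability - resolution + uncertainty holds exactly here because all forecasts within a bin share one probability value:

```python
import numpy as np

def brier_decomposition(p, o, n_bins=10):
    """Brier score plus its reliability/resolution/uncertainty decomposition,
    binning the forecast probabilities into n_bins equal-width bins."""
    p = np.asarray(p, float)
    o = np.asarray(o, float)              # observed event as 0/1
    n = p.size
    bs = np.mean((p - o) ** 2)
    obar = o.mean()                       # climatological event frequency
    bins = np.clip((p * n_bins).astype(int), 0, n_bins - 1)
    rel = res = 0.0
    for k in range(n_bins):
        in_bin = bins == k
        nk = in_bin.sum()
        if nk:
            pk, ok = p[in_bin].mean(), o[in_bin].mean()
            rel += nk * (pk - ok) ** 2    # reliability (penalty)
            res += nk * (ok - obar) ** 2  # resolution (credit)
    rel, res = rel / n, res / n
    unc = obar * (1.0 - obar)             # uncertainty
    return bs, rel, res, unc

# Toy PQPF-style example: four forecasts, event observed three times
p = np.array([0.1, 0.1, 0.7, 0.7])
o = np.array([0.0, 1.0, 1.0, 1.0])
bs, rel, res, unc = brier_decomposition(p, o)
print(round(bs, 4), round(rel - res + unc, 4))
```

The decomposition is what the reliability diagram visualizes: the reliability term penalizes bins where forecast probability and observed frequency disagree, while resolution rewards bins that depart from climatology.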

Anticipated major accomplishments - next 6 months
- Analysis for presentations (Severe Local Storms and AMS)
- Investigate impact of physics perturbations on accumulated precipitation and reflectivity
- Evaluate performance of probability products for QPF
- Determine meaningful metrics from object-oriented evaluation of radar echo top
- Report on impact of radar assimilation on 0-12 hr forecasts
- Targeted analysis using METviewer to facilitate 2011 Spring Experiment planning
- Quality-checked and complete dataset available to collaborators for their own investigations

Plans for AOP 2011
- Contribute to R2O into SPC, HPC, and AWC(?) by providing objective evaluation for their Spring Experiment foci:
  - Severe Convective Initiation/Lightning
  - QPF/Heavy Rain
  - Ensemble Product Evaluation
- Perform retrospective studies
- Provide feedback to researchers on ensemble configuration and product generation
- Leverage DET Verification Module
- Help develop link between DET and HWT

Anticipated major accomplishments – AOP 2011
- Near real-time objective evaluation for:
  - Simulated satellite lightning product
  - QPF and probabilistic (PQPF) fields
  - Reflectivity (if desired)
- Evaluation products available within 30 min – 2 hours of valid time (rather than 6-18 hours)
- DTC staff member participation during each week of Spring Experiment
- Submit journal article on QPF/PQPF evaluation

Resource requirements for AOP 2011

Staff Category       FTE
Scientists           0.54
Software Engineers   0.31
Students             0.15

Scientists           Software Engineers
Tara Jensen          {System Engineer}
Michelle Harrold     Paul Oldenburg
Jamie Wolff*         John Halley Gotway
Isidora Jankov*      Brent Ernst
Tressa Fowler*
Lisa Coco

Non-salary           Cost
Travel               $9 K
Workshops            $1 K
Publications         $1.5 K
Other                (see below)

Compute resources: an HPC facility like the NOAA Jet system or NCAR Bluefire system; goal is to use the DET Verification Module.
* Spring Experiment participant

Challenges
- How to balance expectations with budget:
  - The HWT collaboration started small; its success has garnered more requests and expectations, but the budget has not grown
- How to serve the needs of additional testbeds:
  - AWC is planning to start a separate Aviation Weather Testbed and has indicated interest in DTC participation – how do we fund that activity?
  - HPC is planning to continue using the HWT facility – what if their evaluation needs expand?
- Defining the HWT-DTC Collaboration and DET interaction: what is the appropriate evolution of these complementary tasks?

Zoltan Toth
DTC Ensemble Testbed
Collaborators: NOAA/NCEP/EMC
DET website:

Goals for setting up new DET task
- Develop and maintain DET infrastructure
- Support its use by the community
- Establish NCEP operational system as benchmark
- Test and evaluate new community methods
- Transition successful methods to NCEP and other agencies
- Link up with ensemble work in other testbeds

DET MODULES
- Module 1: Configuration
- Module 2: Initial Perturbations
- Module 3: Model Perturbations
- Module 4: Statistical Post-Processing
- Module 5: Product Generation
- Module 6: Verification
External input: HMT, HWT, HFIP, etc.

Major accomplishments - past 6 months
- Initial plans developed for:
  - Overall architecture of DET infrastructure
  - Each of the 6 modules
  - Test and Evaluation
- Initial tests for establishing basic capabilities for 2 modules:
  - Configuration
  - Initial perturbations
- Collaboration with other DTC tasks/projects:
  - HMT – joint plans for testing DET & HMT ensemble generation
  - HWT – joint plans for evaluation of 2011 HWT ensemble
- Outreach: organized DET Workshop and engaged with WRF Ensemble WG
- Activities coordinated with NCEP/EMC via regular meetings

DET Workshop
- Organized by DTC with input from WRF Ensemble WG
- Aug , Boulder
- Excellent opportunity for connecting DET with the community
- 50+ participants from community & agencies
- Total of 6 breakout sessions focused on:
  - Configuration
  - Preliminary plans for testing
- Extensive set of important recommendations, to be used for updating DET plans
- WRF Ensemble WG meeting, Aug 20: additional recommendations

Planned activities - next 6 months
- Complete planning for overall design and 6 modules
- Develop and test initial perturbation methods; collaborate closely with NCEP/EMC for their next implementation
- Complete protocol for T&E
- Continue coordination/collaboration with HMT & HWT:
  - Leverage winter 2010/11 HMT ensemble for DET testing
  - Prepare for joint evaluation of 2011 HWT ensemble

Anticipated major accomplishments - next 6 months
- Detailed plan for infrastructure – overall design plus 6 modules
- Basic capability for modules 1-2
- Initial tests contributing to next NCEP implementation, in collaboration with NCEP
- Protocol for T&E & basic capability for verification module (#6)
- Joint plans/activities with HMT & HWT:
  - Testing of generation & verification of ensemble (HMT)
  - Evaluation plan for ensemble (HWT)
Definitions: Basic capability – functionality tested/supported; Benchmark – methods used in operations can be replicated

Plans for AOP 2011
- Benchmark for module 2: contribute to next NCEP implementation; replicate next NCEP configuration
- T&E for module 2: use T&E protocol and verification module
- Contribute to NCEP implementation for module 2 (FY11-12): test improved method; contribute algorithm & software
- Basic capability for module 3: initial tests with NEMS-compatible and WRF models
- Contribute to design and evaluation of HMT & HWT ensembles (collaborative effort)

Anticipated major accomplishments – AOP 2011
- Benchmark for module 2. Milestone: build initial perturbation module replicating next NCEP SREF implementation
- T&E for module 2. Milestone: complete tests of new initial perturbation configuration
- Contribute to NCEP implementation for module 2 (FY11-12). Milestone: collaborate with NCEP scientists and contribute to development of new initial perturbation method
- Basic capability for module 3. Milestone: demonstrate basic capabilities for model-related uncertainty (NEMS/WRF)
- Contribute to design and evaluation of HMT & HWT ensembles. Milestones: test DET benchmark configuration for initial perturbation module in HMT ensemble; contribute to evaluation of HWT ensemble

Resource requirements for AOP 2011

Staff Category       FTE
Scientists           2.1
Software Engineers   0.8

Scientists           Software Engineers
Barbara Brown        Paula McCaslin
Isidora Jankov       Linda Wharton
Tara Jensen          Eugene Mirvis
Ed Tollerud          Ann Holmes
New Hire (DET Lead)

Non-salary           Cost
Travel               $8 K
Workshops            $15 K
Publications         $2 K

Challenges
- Identify CPU resources for compute-intensive DET work:
  - Internal GSD & NCAR resources very limited
  - Prepare & submit proposals for external resources
- Agencies and community desire quick setup of, and extensive testing by, DET:
  - Mismatch between funding and expectations
  - How can DTC build benchmarks faster?
- How can DTC engage other agencies/programs/projects? HFIP, NUOPC, NEXTGEN, FAA, AFWA, etc.
- Contingency plan assuming additional funding: accelerate integrated development of, and community support for, modules 4-6:
  - Basic capability for modules 4-5 – AOP11
  - Benchmark for modules 4-6 – AOP12

DET TIMELINE – STARTING MARCH 2011
AOP11 deliverables – basic capabilities, benchmarks:
- Basic capability for module 3
- Benchmark + T&E for module 2
- Contribute to NCEP implementation for module 2 (FY11-12)
- Contribute to design and evaluation of HMT & HWT ensembles
AOP12 deliverables – benchmarks, T&E:
- Benchmark + T&E for module 3
- Basic capability for modules 4-5
- T&E of community methods for module 2
AOP13 deliverables:
- Benchmark for modules 4 & 6
- Contribute to NCEP implementation for module 4 (FY13-14)
- T&E of community methods for all modules
- Update benchmarks

Laurie Carson
DTC NEMS
Collaborators: NOAA/NCEP/EMC, NOAA/ESRL

DTC NEMS Goals
- Define community software goals for the NEMS modeling framework and operational model configurations
- Gain expertise in the NEMS framework by participating in and assisting the on-going development, with a community support focus
- Develop an O2R and R2O ops-concept that supports the needs of both the research and the operational user

Progress to date - past 6 months
- Explore approaches and ideas to help define DTC NEMS community software goals and timeline:
  - On-going internal (DTC) and external (EMC, NCAR, GSD) discussions
  - NEMS software status and near-term plans at EMC; 1-2 year plans including FIM, Rapid Refresh, SREF
  - Technical expert meeting planned to focus on the research and operations connections between WRF and NEMS
- Develop DTC expertise in NEMS software:
  - On-going technical discussions with NEMS developers; code repository
  - Task/work plan for Software Engineer hire
  - Software Engineer hire to focus on NEMS development (at EMC) starting Oct 4, 2010
- Technical Meeting: October 5, 2010

Planned activities - next 6 months
- DTC Software Engineer:
  - Will reside at EMC to further facilitate the information exchange and DTC contributions
  - Add in-depth knowledge of NEMS to the DTC
  - Contribute to the development of NEMS software
  - Begin work on a draft code management plan
- Hold a technical information exchange meeting, Fall 2010 (DTC, NOAA-EMC, NOAA-GSD, NCAR-MMM):
  - Overlap between WRF and NEMS with respect to community support
  - Maintain a connection with the broad research community
  - Use this as a path-finder to explore R2O and O2R mechanisms

Anticipated major accomplishments - next 6 months
- Refine work plan for DTC-NEMS software engineer:
  - Add I/O layer capabilities to NEMS that will benefit research communities
  - Computing platform portability (systems, compilers)
  - Inter-operability with other community software
  - Users guides, sample datasets, and examples of use
- Technical meeting Oct 5, 2010 in Boulder, focused on upcoming interactions between WRF and NEMS components

Plans for AOP 2011
- Strengthen the foundation of DTC expertise with the NEMS software
- Continue to contribute to the development process for NEMS, with a community user focus
- Refine DTC NEMS community software goals and timeline
- Develop, with EMC, a NEMS code management plan to support community software and operational system goals

Anticipated major accomplishments – AOP 2011
- DTC NEMS code management plan
- Continued technical interchange meetings
- DTC NEMS community software plan, including R2O and O2R paths

Resource requirements for AOP 2011

Staff Category       FTE
Scientists           0.1
Software Engineers   0.75

Scientists           Software Engineers
Jamie Wolff          Eugene Mirvis
                     Laurie Carson

Non-salary           Cost
Travel               $12 K
Workshops            $300

Challenges
- Engagement of the NWP research community is a critical component of making contributions to operational NWP capabilities
- NEMS is still under active development, so any decision to package a software release for the research community must be made with careful coordination between EMC and DTC
- A fully supported community software package requires considerable resources from DTC. Is this the best use of DTC resources?