
1 HMT-DTC Task. Ed Tollerud, in collaboration with ESRL, RFCs, WFOs, and CDWR

2 DTC Goals and the HMT
- Provide an early step up for front-line forecasters vis-à-vis ensemble usage
- Demonstrate the usefulness of prototype ensemble forecast systems, thereby providing long-range guidance for ensemble forecast systems
- Enhance and motivate MET development
- HMT-West ensemble QPF evaluation is the current focus for attaining these goals

3 Major accomplishments - past 6 months
- Online WRF ensemble verification in near-real time:
  - Regionalization of the HMT domain
  - 30-day summaries
  - Observational data options
- Object-based (MODE/MET) verification of historical atmospheric rivers
- Direct comparison between GFS and the WRF ensemble mean

4 Online QPF verification for HMT-West
- [Figure: Equitable Threat Scores (ETS) for ensemble members; forecasts initiated at 1200 UTC 17 January 2010; ensemble mean (black), GFS (green)]
- [Figure: 30-day summary boxplots of ETS for January 2010 over the large HMT domain; intensity thresholds 0.01, 0.1, 0.5, 1, and 3 in]
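For reference, the ETS plotted on the verification website is the standard Gilbert Skill Score computed from a 2x2 contingency table at each precipitation threshold. The sketch below only illustrates that calculation; the function name and the synthetic forecast/observation grids are invented for the example, and the operational HMT-West statistics come from MET rather than standalone code like this.

import numpy as np

def equitable_threat_score(forecast, observed, threshold):
    """ETS (Gilbert Skill Score) for one precipitation threshold.

    forecast, observed: arrays of accumulated precipitation in the same
    units as `threshold` (e.g. inches). Returns a value in (-1/3, 1].
    """
    fcst_yes = forecast >= threshold
    obs_yes = observed >= threshold

    hits = np.sum(fcst_yes & obs_yes)
    misses = np.sum(~fcst_yes & obs_yes)
    false_alarms = np.sum(fcst_yes & ~obs_yes)
    total = forecast.size

    # Hits expected by chance, given the forecast and observed event counts
    hits_random = (hits + misses) * (hits + false_alarms) / total

    denom = hits + misses + false_alarms - hits_random
    return np.nan if denom == 0 else (hits - hits_random) / denom

# Illustrative use at the HMT thresholds (synthetic grids, not HMT data)
rng = np.random.default_rng(0)
fcst = rng.gamma(0.5, 0.4, size=10_000)   # stand-in 24-h QPF values (in)
obs = rng.gamma(0.5, 0.4, size=10_000)    # stand-in gridded QPE values (in)
for thr in (0.01, 0.1, 0.5, 1.0, 3.0):
    print(f"ETS at {thr:4.2f} in: {equitable_threat_score(fcst, obs, thr):.3f}")

ETS ranges from -1/3 to 1, with 0 indicating no skill relative to a random forecast, which is why it is a common headline score for the threshold plots above.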

5 Planned activities - next 6 months
- Revise and expand the winter exercise verification demonstration website
- Prepare and present preliminary results from the 2009-2010 winter exercise verification
- Develop probabilistic capabilities for QPF in the HMT
- Continue the retrospective study of MET/MODE applications to atmospheric river forecasts

6 Anticipated major accomplishments - next 6 months
The following features added to the HMT-West winter verification exercise:
- 1-2 additional baseline models (SREF, NAM, ...)
- Time series displays of MODE attributes
- Displays of reliability and ROC diagrams
- 6-h gage accumulation options
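As background for the planned reliability and ROC displays, both diagrams are built from the same ingredients: forecast probabilities paired with binary observations, either binned (reliability) or thresholded (ROC). A minimal sketch under those assumptions follows; the function names and synthetic data are illustrative and are not the MET or website implementation.

import numpy as np

def reliability_points(prob_fcst, obs_event, n_bins=10):
    """Mean forecast probability vs. observed frequency in each probability bin."""
    bins = np.clip((prob_fcst * n_bins).astype(int), 0, n_bins - 1)
    fcst_mean, obs_freq = [], []
    for b in range(n_bins):
        mask = bins == b
        if mask.any():
            fcst_mean.append(prob_fcst[mask].mean())
            obs_freq.append(obs_event[mask].mean())
    return np.array(fcst_mean), np.array(obs_freq)

def roc_points(prob_fcst, obs_event, thresholds=np.linspace(0.0, 1.0, 11)):
    """(POFD, POD) pairs as the yes/no probability threshold is varied."""
    pofd, pod = [], []
    for t in thresholds:
        yes = prob_fcst >= t
        hits = np.sum(yes & (obs_event == 1))
        misses = np.sum(~yes & (obs_event == 1))
        false_alarms = np.sum(yes & (obs_event == 0))
        corr_neg = np.sum(~yes & (obs_event == 0))
        pod.append(hits / max(hits + misses, 1))                       # probability of detection
        pofd.append(false_alarms / max(false_alarms + corr_neg, 1))    # probability of false detection
    return np.array(pofd), np.array(pod)

# Synthetic example: pairs to plot against the 1:1 line (reliability) and the diagonal (ROC)
rng = np.random.default_rng(0)
obs = (rng.random(5000) < 0.25).astype(int)
prob = np.clip(0.25 + 0.4 * (obs - 0.25) + rng.normal(0, 0.2, 5000), 0, 1)
print(reliability_points(prob, obs))
print(roc_points(prob, obs))

A perfectly reliable forecast falls on the 1:1 line of the reliability diagram, and ROC points above the diagonal indicate discrimination skill.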

7 Plans for AOP 2011
- Incorporate spatial verification statistics for ensembles
- Implement METviewer online displays
- Collaborate with DET to apply MET-based probabilistic verification techniques
- Develop procedures to assess gridded QPE products and their impact on verification
- Extend verification to additional meteorological fields
- Implement a prototype AR verification utility

8 Anticipated major accomplishments - AOP 2011
- Revised and expanded verification options appropriate for HMT (West and East)
- Comprehensive suite of probabilistic verification utilities applied to QPF for HMT
- Object-based verification techniques implemented for AR forecasting
- Reports and presentations from HMT-West verification activities

9 Resource requirements for AOP 2011

Staff Category       FTE
Scientists           1.0
Software Engineers   0.5
Students             0.1

Scientists           Software Engineers
Ed Tollerud          John Halley Gotway
Tara Jensen          Paul Oldenburg
CIRES associate      Randy Bullock
RAL statistician     Brent Ertz
Tressa Fowler        Chris Harrop
Barb Brown

Non-salary           Cost
Travel               $7K
Publications         $2K

10 Challenges (Science and Resources)
- Establishing synergistic roles of DET and DTC vis-à-vis ensemble verification
- Determining the potential use and advantages of regional and episodic verification
- Developing appropriate downscaling methods for baselining and comparison
- Improving forecaster access to results of the evaluation (ALPS? AWIPS?)
- Validating the long-range point precipitation forecast for Tillamook for September 2011

11 HWT-DTC Collaboration. Tara Jensen.
Collaborating with:
- NOAA: SPC, NSSL, HPC, AWC, ESRL, EMC
- Universities: OU/CAPS
- NCAR: MMM
- UCAR: COMET

12 HWT-DTC Collaboration Goals
- Explore a different R2O paradigm, where the "O" is forecasters, WFOs, and prediction centers
- Evaluate convection-allowing models on the leading edge of NWP
- Gather a robust dataset for ensemble and deterministic studies; datasets and algorithms leveraged by DET
- Demonstrate the utility of MET-based objective evaluation in a forecast environment
  - Provide near-real-time evaluation during the Spring Experiment
  - Application of MET feeds back into MET development
- Facilitate faster R2O through focused evaluation of key topics identified during the Spring Experiment
  - Perform retrospective studies using DTC's METviewer

13 Major accomplishments - past 6 months
- Enormous expansion of near-real-time evaluation capability
- Evaluation of 30 models during the Spring Experiment (CAPS ensemble + 3 operational baselines)
- 10 deterministic and 4 ensemble products evaluated using traditional and spatial verification methods
- Three additional research models available for retrospective studies
- DTC staff participation in each week of Spring Experiment 2010
- Manuscript accepted by AMS Weather and Forecasting (Kain et al., 2010, on the impact of radar assimilation on short-term forecasts)

14 2010 HWT Spring Experiment - 30 models
- [Figure: models compared include the CAPS SSEF ensemble PM mean, CAPS 1-km model, CAPS SSEF ARW-CN (control with radar assimilation), 3-km HRRR, 12-km NAM, and CAPS SSEF ARW-C0 (control without radar assimilation)]
- Radar assimilation continues to show greater skill than no assimilation
- The CAPS ensemble mean generally has higher skill than the 1-km deterministic run and the operational baselines

15 Planned activities - next 6 months
- Complete the re-run of the Spring Experiment evaluation
- Add additional probabilistic measures (Brier Skill Score, reliability diagrams, and ROC curves)
- Perform retrospective evaluation of all members/products
- [Figure: probabilistic verification of ensemble products (for a single run); Brier score and decomposition for CAPS simple, CAPS neighborhood, and SREF simple probabilities]
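For context on the Brier score and its decomposition referenced above, the score for a set of probability forecasts can be split into reliability, resolution, and uncertainty terms using probability bins. The sketch below is illustrative only: the function name and synthetic data are assumptions, and the actual HWT evaluation would rely on MET's probabilistic statistics rather than this code.

import numpy as np

def brier_decomposition(prob_fcst, obs_event, n_bins=10):
    """Brier score plus its reliability/resolution/uncertainty components.

    prob_fcst: forecast probabilities in [0, 1]; obs_event: 0/1 outcomes.
    Within binning error, BS = reliability - resolution + uncertainty.
    """
    n = prob_fcst.size
    base_rate = obs_event.mean()

    brier = np.mean((prob_fcst - obs_event) ** 2)

    bins = np.clip((prob_fcst * n_bins).astype(int), 0, n_bins - 1)
    reliability = resolution = 0.0
    for b in range(n_bins):
        mask = bins == b
        if mask.any():
            n_k = mask.sum()
            p_k = prob_fcst[mask].mean()   # mean forecast probability in bin
            o_k = obs_event[mask].mean()   # observed relative frequency in bin
            reliability += n_k * (p_k - o_k) ** 2 / n
            resolution += n_k * (o_k - base_rate) ** 2 / n

    uncertainty = base_rate * (1.0 - base_rate)
    return brier, reliability, resolution, uncertainty

# Synthetic example, e.g. probabilities derived as ensemble relative frequencies
rng = np.random.default_rng(1)
obs = (rng.random(5000) < 0.3).astype(int)
prob = np.clip(0.3 + 0.4 * (obs - 0.3) + rng.normal(0, 0.15, 5000), 0, 1)
bs, rel, res, unc = brier_decomposition(prob, obs)
print(f"BS={bs:.3f}  REL={rel:.3f}  RES={res:.3f}  UNC={unc:.3f}")

In this decomposition, a smaller reliability term and a larger resolution term both lower (improve) the Brier score, which is what distinguishes the simple and neighborhood probability products in a comparison like the one planned here.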

16 Anticipated major accomplishments - next 6 months
- Analysis for presentations (Severe Local Storms and AMS)
- Investigate physics perturbations on accumulated precipitation and reflectivity
- Evaluate performance of probability products for QPF
- Determine meaningful metrics from object-oriented evaluation of radar echo top
- Report on the impact of radar assimilation on 0-12 h forecasts
- Targeted analysis using METviewer to facilitate 2011 Spring Experiment planning
- Quality-checked and complete dataset available to collaborators for their own investigations

17 Plans for AOP 2011
- Contribute to R2O into SPC, HPC, and AWC(?) by providing objective evaluation for their Spring Experiment foci:
  - Severe convective initiation/lightning
  - QPF/heavy rain
  - Ensemble product evaluation
- Perform retrospective studies
- Provide feedback to researchers on ensemble configuration and product generation
- Leverage the DET Verification Module
- Help develop the link between DET and HWT

18 Anticipated major accomplishments - AOP 2011
- Near-real-time objective evaluation for:
  - Simulated satellite lightning product
  - QPF and probabilistic (PQPF) fields
  - Reflectivity (if desired)
- Evaluation products available within 30 minutes to 2 hours of valid time (rather than 6-18 hours)
- DTC staff member participation during each week of the Spring Experiment
- Submit a journal article on QPF/PQPF evaluation

19 Resource requirements for AOP 2011

Staff Category       FTE
Scientists           0.54
Software Engineers   0.31
Students             0.15

Scientists           Software Engineers
Tara Jensen          {System Engineer}
Michelle Harrold     Paul Oldenburg
Jamie Wolff*         John Halley Gotway
Isidora Jankov*      Brent Ernst
Tressa Fowler*
Lisa Coco

Non-salary           Cost
Travel               $9K
Workshops            $1K
Publications         $1.5K
Other                (see below)

Compute resources: an HPC facility such as the NOAA Jet system or the NCAR Bluefire system. The goal is to use the DET Verification Module.
* Spring Experiment participant

20 Challenges
- How to balance expectations with the budget
  - The HWT collaboration started small
  - Success has garnered more requests and expectations
  - The budget has not grown
- How to serve the needs of additional testbeds
  - AWC is planning to start a separate Aviation Weather Testbed and has indicated interest in DTC participation; how do we fund that activity?
  - HPC is planning to continue using the HWT facility; what if their evaluation needs expand?
- Defining the HWT-DTC collaboration and DET interaction
  - What is the appropriate evolution of these complementary tasks?

21 DTC Ensemble Testbed (DET). Zoltan Toth.
Collaborators: NOAA/NCEP/EMC
DET website: http://www.dtcenter.org/det

22 Goals for setting up the new DET task
- Develop and maintain the DET infrastructure
- Support its use by the community
- Establish the NCEP operational system as a benchmark
- Test and evaluate new community methods
- Transition successful methods to NCEP and other agencies
- Link up with ensemble work in other testbeds

23 DET Modules (with external input from HMT, HWT, HFIP, etc.)
- Module 1: Configuration
- Module 2: Initial Perturbations
- Module 3: Model Perturbations
- Module 4: Statistical Post-Processing
- Module 5: Product Generation
- Module 6: Verification

24 Major accomplishments - past 6 months
- Initial plans developed for:
  - Overall architecture of the DET infrastructure
  - Each of the 6 modules
  - Test and evaluation (T&E)
- Initial tests establishing basic capabilities for 2 modules:
  - Configuration
  - Initial perturbations
- Collaboration with other DTC tasks/projects:
  - HMT: joint plans for testing DET and HMT ensemble generation
  - HWT: joint plans for evaluation of the 2011 HWT ensemble
- Outreach: organized the DET Workshop and engaged with the WRF Ensemble WG
- Activities coordinated with NCEP/EMC via regular meetings

25 DET Workshop
- Organized by the DTC with input from the WRF Ensemble WG; Aug. 18-19, Boulder
- Excellent opportunity for connecting DET with the community; 50-plus participants from the community and agencies
- Six breakout sessions focused on configuration and preliminary plans for testing
- Extensive set of important recommendations, to be used for updating DET plans
- WRF Ensemble WG meeting, Aug. 20: additional recommendations

26 Planned activities - next 6 months
- Complete planning for the overall design and the 6 modules
- Develop and test initial perturbation methods
  - Collaborate closely with NCEP/EMC on their next implementation
- Complete the protocol for T&E
- Continue coordination/collaboration with HMT and HWT
  - Leverage the winter 2010/11 HMT ensemble for DET testing
  - Prepare for joint evaluation of the 2011 HWT ensemble

27 Anticipated major accomplishments - next 6 months
- Detailed plan for the infrastructure: overall design plus the 6 modules
- Basic capability for modules 1-2
- Initial tests contributing to the next NCEP implementation, in collaboration with NCEP
- Protocol for T&E and basic capability for the verification module (#6)
- Joint plans/activities with HMT and HWT:
  - Testing of generation and verification of the ensemble (HMT)
  - Evaluation plan for the ensemble (HWT)
Definitions: Basic capability = functionality tested/supported; Benchmark = methods used in operations can be replicated

28 Plans for AOP 2011
- Benchmark for module 2: contribute to the next NCEP implementation; replicate the next NCEP configuration
- T&E for module 2: use the T&E protocol and the verification module
- Contribute to the NCEP implementation for module 2 (FY11-12): test an improved method, contribute algorithms and software
- Basic capability for module 3: initial tests with NEMS-compatible and WRF models
- Contribute to the design and evaluation of HMT and HWT ensembles: collaborative effort

29 Anticipated major accomplishments - AOP 2011
- Benchmark for module 2. Milestone: build an initial perturbation module replicating the next NCEP SREF implementation
- T&E for module 2. Milestone: complete tests of the new initial perturbation configuration
- Contribute to the NCEP implementation for module 2 (FY11-12). Milestone: collaborate with NCEP scientists and contribute to development of the new initial perturbation method
- Basic capability for module 3. Milestone: demonstrate basic capabilities for model-related uncertainty (NEMS/WRF)
- Contribute to the design and evaluation of HMT and HWT ensembles. Milestones: test the DET benchmark configuration for the initial perturbation module in the HMT ensemble; contribute to evaluation of the HWT ensemble

30 Resource requirements for AOP 2011

Staff Category       FTE
Scientists           2.1
Software Engineers   0.8

Scientists            Software Engineers
Barbara Brown         Paula McCaslin
Isidora Jankov        Linda Wharton
Tara Jensen           Eugene Mirvis
Ed Tollerud           Ann Holmes
New hire (DET lead)

Non-salary           Cost
Travel               $8K
Workshops            $15K
Publications         $2K

31 Challenges
- Identify CPU resources for compute-intensive DET work
  - Internal GSD and NCAR resources are very limited
  - Prepare and submit proposals for external resources
- Agencies and the community desire quick setup of, and extensive testing by, DET
  - Mismatch between funding and expectations
  - How can the DTC build benchmarks faster?
  - How can the DTC engage other agencies/programs/projects (HFIP, NUOPC, NEXTGEN, FAA, AFWA, etc.)?
- Contingency plan assuming additional funding:
  - Accelerate integrated development of, and community support for, modules 4-6
  - Basic capability for modules 4-5 in AOP11
  - Benchmark for modules 4-6 in AOP12

32 DET Timeline - starting March 2011
- AOP11 deliverables (basic capabilities, benchmarks):
  - Basic capability for module 3
  - Benchmark + T&E for module 2
  - Contribute to the NCEP implementation for module 2 (FY11-12)
  - Contribute to the design and evaluation of HMT and HWT ensembles
- AOP12 deliverables (benchmarks, T&E):
  - Benchmark + T&E for module 3
  - Basic capability for modules 4-5
  - T&E of community methods for module 2
- AOP13 deliverables:
  - Benchmark for modules 4 and 6
  - Contribute to the NCEP implementation for module 4 (FY13-14)
  - T&E of community methods for all modules
  - Update benchmarks

33 DTC NEMS. Laurie Carson.
Collaborators: NOAA/NCEP/EMC, NOAA/ESRL

34 DTC NEMS Goals
- Define community software goals for the NEMS modeling framework and operational model configurations
- Gain expertise in the NEMS framework by participating in and assisting the ongoing development, with a community support focus
- Develop an O2R and R2O operations concept that supports the needs of both research and operational users

35 Progress to date - past 6 months
- Explore approaches and ideas to help define DTC NEMS community software goals and timeline
  - Ongoing: internal (DTC) and external (EMC, NCAR, GSD) discussions
  - NEMS software status and near-term plans at EMC; 1-2 year plans including FIM, Rapid Refresh, and SREF
  - Technical expert meeting planned to focus on the research and operations connections between WRF and NEMS
- Develop DTC expertise in NEMS software
  - Ongoing: technical discussions with NEMS developers; code repository
  - Task/work plan for the Software Engineer hire
  - Software Engineer hire focusing on NEMS development (at EMC) starting Oct. 4, 2010
- Technical meeting: October 5, 2010

36 Planned activities - next 6 months
- DTC Software Engineer
  - Will reside at EMC to further facilitate information exchange and DTC contributions
  - Add in-depth knowledge of NEMS to the DTC
  - Contribute to the development of NEMS software
  - Begin work on a draft code management plan
- Hold a technical information exchange meeting (Fall 2010: DTC, NOAA-EMC, NOAA-GSD, NCAR-MMM)
  - Overlap between WRF and NEMS with respect to community support
  - Maintain a connection with the broad research community
  - Use this as a pathfinder to explore R2O and O2R mechanisms

37 Anticipated major accomplishments - next 6 months
- Refine the work plan for the DTC-NEMS software engineer:
  - Add I/O layer capabilities to NEMS that will benefit research communities
  - Computing platform portability (systems, compilers)
  - Interoperability with other community software
  - Users' guides, sample datasets, and examples of use
- Technical meeting Oct. 5, 2010 in Boulder, focused on upcoming interactions between WRF and NEMS components

38 Plans for AOP 2011
- Strengthen the foundation of DTC expertise with the NEMS software
- Continue to contribute to the development process for NEMS, with a community user focus
- Refine DTC NEMS community software goals and timeline
- Develop, with EMC, a NEMS code management plan to support community software and operational system goals

39 Anticipated major accomplishments - AOP 2011
- DTC NEMS code management plan
- Continued technical interchange meetings
- DTC NEMS community software plan, including paths for R2O and O2R

40 Resource requirements for AOP 2011

Staff Category       FTE
Scientists           0.1
Software Engineers   0.75

Scientists           Software Engineers
Jamie Wolff          Eugene Mirvis
                     Laurie Carson

Non-salary           Cost
Travel               $12K
Workshops            $300

41 Challenges
- Engagement of the NWP research community is a critical component of making contributions to operational NWP capabilities
- NEMS is still under active development, so any decision to package a software release for the research community must be made with careful coordination between EMC and the DTC
- A fully supported community software package requires considerable resources from the DTC. Is this the best use of DTC resources?

