
1 Bill Kuo DTC AOP 2011 and Challenges

2 Outline
- Review of DTC AOP 2010 tasks and budgets
- Highlights of DTC accomplishments
- Recommendations of SAB and DTC responses
- Proposed DTC AOP 2011
- Future direction and challenges

3 DTC Organization & AOP 2010 Tasks
The DTC Director's Office oversees the task areas (WRF, WRF for Hurricanes, GSI, MET, HMT, HWT, DET, NEMS) and the DTC Visitor Program.
- WRF: WRF modeling system
- WRF for Hurricanes: HWRF, HFIP
- GSI: Grid-point Statistical Interpolation data assimilation system
- MET: Model Evaluation Tools
- HMT: Hydrometeorology Testbed collaboration
- HWT: Hazardous Weather Testbed collaboration
- DET: DTC Ensemble Testbed
- NEMS: NOAA Environmental Modeling System
Two major functions of DTC:
A. Provide support for community systems
B. Conduct testing and evaluation of community systems for research and operations

4 DTC FY 2010 Budget Allocations (in $K)
By task: Dir. Office 623, Visitor Prog. 222, WRF support 698, HWRF 1,123, GSI 622, MET 487, HMT-DTC 323, HWT-DTC 148, DET 651, NEMS 173.
By source: NOAA Core 2,900; HFIP 346; USWRP 300; AFWA 825; NCAR 300; GSD 300; NSF 100.
Total: 5,071

5 DTC Tasking and FY10 Funding
Task areas (with leads):
- DTC Director's office: $623K
- Visitor program: $222K
- WRF (Wolff): $698K
- HWRF (Bernardet): $1,123K
- GSI (Huang): $622K
- MET (Fowler): $487K
- HMT (Tollerud): $323K
- HWT (Jensen): $148K
- DET (Toth): $651K
- NEMS (Carson): $173K
Funding sources:
- NOAA Core: $2,900K
- HFIP: $346K
- USWRP: $300K
- AFWA: $825K
- NCAR: $300K
- GSD: $300K
- NSF: $100K
Total budget: $5,071K
Collaborations with other testbeds

6 Major DTC milestones:
- Feb 8-9, 2010: DRAFT AOP 2010 reviewed by DTC MB
- Feb 28, 2010: Received DTC EC approval for AOP 2010
- March 1, 2010: Begin AOP 2010 execution
- May 25, 2010: DTC MB teleconference meeting
- Aug 26, 2010: DTC EC meeting in WDC
- Sep 21-23, 2010: Joint DTC SAB/MB Meeting
- Oct 18, 2010: Received SAB written report
- Jan 11-12, 2011: DTC MB to review DRAFT AOP 2011

7 What we will discuss in this meeting:
SAB recommendations and DTC responses:
- Do we agree with all of their recommendations?
- For those we agree with, how do we incorporate their recommendations into AOP 2011 (with a potentially reduced budget)?
Continuing resolution and its impact on the DTC AOP 2011 budget and DTC operation:
- We don't have final numbers from all DTC sponsors.
- We may not know the numbers until well into AOP 2011.
- How should the DTC operate under these budget uncertainties?

8 Highlights of DTC Activities
- Hired DTC software engineer (Eugene Mirvis) to work on the NEMS task at EMC, in collaboration with EMC staff.
- Hired DTC scientist (Mrinal Kanti Biswas) to work on the hurricane task.
- Established the Community GSI repository, and a process for the community to contribute to GSI development.
- Started the development of DET modules.
- Tested WRF community code for 2011 operational hurricane prediction at NCEP.
Additional highlights will be presented by task leads.

9 GSI R2O Transition Procedure (draft)
Community research code becomes a development candidate, then:
1. GSI Review Committee - initial scientific review
2. DTC - developer code merging and testing
3. GSI Review Committee - code and commitment review
4. DTC->EMC GSI code commitment
5. EMC->DTC GSI repository syncing
(Work flows between the DTC branch/trunk and the EMC branch/trunk of the GSI repository.)

10 GSI Transition: O2R and R2O
Past (two years ago?):
- Distributed development
- Different applications and operational requirements (GFS, RTMA, NAM, RR, AFWA, ...)
- Manual version control
- No system documentation; limited system support (for GSI partners and community)
- Non-portable system
Current:
- GSI Review Committee (DTC, NCEP/EMC, NASA/GMAO, NOAA/ESRL, NCAR/MMM-AFWA)
- Multi-platform GSI system supported by DTC
- SVN version control (dual repositories with synced trunks and localized development branches)
- Completed GSI User's Guide and website
- Annual community release and residential tutorial
- Community user support by DTC
- R2O infrastructure and applications
Support by EMC (John Derber) was essential!

11 HWRF code management: atmospheric model
[Timeline figure: the operational HWRF versions (oper HWRF 2009, oper HWRF 2010 "R2") and the community WRF releases (V2, V3.1.1, V3.2, V3.2.1, ..., through 04/2011) converge through baselines "R1" and "R2" toward HWRF for operations (2011). Milestones include a tutorial and HWRF beta release, contributions from EMC and NOAA Research (preliminarily tested 3rd nest, new nest motion), and the baseline "R2 final" HWRF release.]

12 Community Code Accepted for 2011 Operations
Tests (with EMC) towards acceptance of community code:
- Sanity check: community HWRF against the 2008/2009 model
- R1: community WRF code vs. 2010 baseline (H050)
- R2: community WRF code vs. 2010 operational configuration
- R2prime: community WRF & vortex codes vs. 2010 operational
- R2-final: 2011 operational baseline (on NCEP computer) - in progress
- 2011 baseline (on jet) - 400 runs undergoing analysis
- Upcoming: 2011 baseline + HYCOM
DTC exposed and/or fixed several bugs in this process.

13 Functionally-equivalent T&E suite
From the DTC "HWRF 2011 Baseline Test Plan" (point of contact: Ligia Bernardet; December 15, 2010): The DTC will be performing testing and evaluation for the Hurricane WRF system, known as HWRF (Gopalakrishnan et al. 2010). HWRF will be configured as closely as possible to the operational HWRF model, employing the same domains, physics, coupling, and initialization procedures as the model used at NOAA NCEP Central Operations and by the model developers at NCEP EMC. The configuration to be tested matches the 2011 HWRF Baseline, which served as control for all developments at EMC geared towards the 2011 operational implementation.
Suite components:
- Pre-processing (including the ability to read binary GFS in spectral coordinates)
- Cycled HWRF vortex initialization and relocation
- GSI data assimilation
- Coupled (POM + WRF) model
- Post-processing
- Tracking
- NHC verification & confidence intervals
- Display
- Archival

14 SAB Recommendations and DTC responses

15 General Conclusions:
SAB: The SAB believes the DTC currently risks becoming spread over too many projects and directions. While most activities are relevant, some tasks appear to be peripheral to the DTC and critical U.S. NWP goals.
Response:
- We restructured the DTC tasks for AOP 2011 into five focused areas.
- We examined all tasks to ensure their relevance to the core missions of the DTC.
- The new approach should improve the way we communicate DTC activities to the outside community.
SAB: The SAB believes that these accomplishments, although substantial, have not been sufficient to realize the DTC's core objectives, as noted in the introduction. For example, a functionally equivalent operational environment available for extended-period tests is not available, although progress has been made in that direction.
Response:
- AFWA considers the DTC to have a functionally equivalent test environment for AFWA-related work.
- The DTC now has a functionally equivalent test environment for HWRF (including full cycling capability).
- Efforts on a "functionally equivalent test environment" are needed in mesoscale modeling (e.g., NAM, NEMS, and NMM-B).

16 General Conclusions:
SAB: To date, a substantial amount of DTC effort has been placed on the long-term development of infrastructure in a range of areas, such as the MET software package. However, it is time to place increased emphasis on infrastructure utilization that directly addresses the DTC's central goals.
Response: The SAB asked the DTC to increase testing and evaluation efforts that directly contribute to the core missions of the DTC. With a flat budget, the effort on community support will have to decrease.

17 Specific Recommendations and Conclusions
(1) The DTC should give priority to building a functionally equivalent operational environment to test and evaluate new NWP methods for extended retrospective periods and significant events using advanced verification tools. Such an environment should include operations-like data assimilation/cycling. Sufficient resources should be provided to ensure this is in place within 12 months.
Response: The DTC will place greater effort here, particularly for mesoscale modeling. Assembling a functionally equivalent environment will require enhanced collaboration among several task areas, including the mesoscale modeling, hurricane, data assimilation, and ensemble efforts.
(2) The SAB was in general agreement that a central role for the DTC is to be the nation's model "scorekeeper" that would track mesoscale model forecast improvement and serve as the key benchmark center for U.S. mesoscale NWP.
Response: The DTC will develop a test plan for Reference Configuration (RC) testing that can help track mesoscale model forecast improvement. RC testing will include EMC and AFWA operational configurations. The concept provides a framework for tracking the improvement of WRF systems across releases of new versions. The test plan will be reviewed by SAB representatives, EMC, and MMM. Increased effort on T&E will require additional resources at the expense of community support services.

18 Specific Recommendations and Conclusions
(3) The establishment of a suite of model verification tools by the DTC (including, but not limited to, the MET package) was appropriate, although several enhancements are still required. For example, plan-view spatial verification is a critical tool that is currently missing from MET. Furthermore, DTC verification should support the full range of mesoscale assets, such as ACARS, NEXRAD, and profiler data. However, the current lack of comprehensive, actionable verification statistics for major contemporary U.S. modeling systems (see 2 above) is of some concern, and higher priority must be given to making this information available more rapidly, even if that requires redirecting some of the resources currently being provided to construct and support more specialized MET capabilities.
Response: Some of these features are already available in MET; further enhancement will require additional resources. The DTC will contribute to actionable verification statistics for major contemporary U.S. modeling systems through Reference Configuration testing.

19 Specific Recommendations and Conclusions
(4) The board notes that the DTC has supported workshops on some modeling systems that are not widely used in the community (e.g., the NMM model). Such activities may be justifiable, but it may be worthwhile to review whether their current frequency reflects optimal use of DTC resources.
Response: The DTC does not hold NMM workshops. The DTC and EMC need to decide on NMM-B (and NEMS) tutorials, as well as joint ARW-NMM tutorials with MMM, for the next AOP.
(5) The committee believes that although the DTC itself must maintain competency and knowledge in the new NWS model infrastructure (NEMS), for the immediate future there is little requirement for any significant NEMS outreach activities to the research community.
Response: The DTC has no plan to start a tutorial on the NEMS framework. The DTC hired a staff member to work at EMC to gain expertise in NEMS, and will have the ability to implement components of other modeling systems into the NEMS-based EMC operational system for testing and evaluation. Louis Uccellini believes that the most effective R2O is for the research community to use operational systems for their research.

20 Specific Recommendations and Conclusions
(6) The general sentiment of the SAB is that, although the NOAA Hydrometeorological Testbed (HMT) and Hazardous Weather Testbed (HWT) are important national endeavors, DTC participation in these activities may be a secondary priority with respect to the DTC's core mission.
Response: Testbed collaborations are important to NOAA. HWT and HMT collaborations are no longer identified as DTC task areas; rather, they will be DTC special projects that are directly linked to a few DTC focused areas. Efforts are made to ensure that these projects contribute to DTC core missions.
(7) The committee believes that the DTC can potentially play a unique role in bringing the research and operational communities together to examine and address key national NWP problems. Dealing with existing deficiencies in NWP physics parameterizations is one such problem.
Response: For AOP 2011, the DTC will organize a Physics Workshop, in collaboration with EMC and the academic community, to be held in the Washington, DC area in August 2011.

21 Specific Recommendations and Conclusions
(8) ... there remained concern on the board that, because DET is inevitably a costly long-term project, it may limit the DTC's ability to demonstrate sufficient short-term value and relevancy to its sponsoring agencies and the NWP community at large. Thus, some difficult resource decisions may become necessary.
Response: DET is an important activity for the NWS to realize the goals outlined in the "White Paper." The DTC will continue to explore additional resources to augment and support this effort. DET needs to quickly demonstrate short-term value and relevancy to the sponsoring agencies and the NWP community at large.
(9) ... Based on this experience and a review of previous cycles of the program, the board believes that the visitor program is an appropriate and important element of DTC activities. However, the board believes that the current program casts too wide a net in its request for proposals... Future calls should be crafted to clearly identify one or two mission-critical areas where collaborations with applicants having particularly relevant expertise can significantly augment DTC capabilities and hasten transition of effective NWP solutions to the broader community.
Response: The DTC will improve the process for the visitor program. Input from the MB will be an important part of this process!

22 Specific Recommendations and Conclusions
(10) Regarding hurricane-related activities, the board felt that there is a need to establish and maintain a reference configuration and operational testbed for the Hurricane WRF (HWRF). Such a testbed must include a data assimilation component. Furthermore, there is a need for the development and support of relevant diagnostic tools for hurricanes.
Response: The HWRF system maintained by the DTC includes a data assimilation component. The DTC will include HWRF as a Reference Configuration for testing, with fully cycled data assimilation. The DTC will also develop and support relevant diagnostic tools, in collaboration with HFIP partners, pending HFIP support.

23 AOP 2011 Tasks
Under the DTC Director's Office: Mesoscale Modeling, Hurricanes, Data Assimilation, Ensembles, and Verification, plus the DTC Visitor Program and testbed collaborations (HWT & HMT).
1. DTC activities are distilled into five focused areas.
2. HWT & HMT are cross-DTC special projects, with contributions to DTC focused areas identified.
3. NEMS is included as part of mesoscale modeling.

24 Required budget for proposed AOP 2011 activities
Proposed task budgets:
- Director's Office + Visitor Program: $873,436
- Mesoscale modeling (including NEMS): $1,032,363
- Verification: $464,989
- Data assimilation: $633,394
- Hurricanes: $1,262,758
- Ensembles: $723,442
- HWT: $206,636
- HMT: $321,798
- Total: $5,517,818
Total actual funding for AOP 2010 was $5,070K; the AOP 2011 total is roughly an 8.8% increase.

25 Future Directions and Challenges
Mesoscale modeling, NMM-B, and NEMS:
- NCEP is moving forward with NMM-B and NEMS, and expects the DTC to entrain the community into these operational systems.
- The SAB does not recommend that the DTC provide community support for NMM-B and NEMS.
- The DTC needs a stronger linkage to mesoscale modeling at NCEP (one that will contribute directly to NCEP operations). The physics workshop is a start.
- The future NCEP short-range ensemble will include NMM-B and ARW in the NEMS framework.
- In 2012, the DTC needs to decide whether to continue joint ARW-NMM tutorials (and whether to hold a NEMS tutorial).

26 Future Directions and Challenges
Data assimilation:
- NCEP is moving toward GSI-EnKF hybrid data assimilation for global modeling.
- Multiple EnKF systems are being developed under HFIP sponsorship.
- AFWA asked the DTC to examine a few regional EnKF systems for possible operational use.
- The DTC needs to work with EMC and AFWA to decide on a community hybrid GSI-EnKF system for both regional and global NWP applications.
- The DTC should not support multiple EnKF systems for the community.

27 Future Directions and Challenges
Hurricanes:
- The DTC made significant progress in merging the operational HWRF code into the WRF repository, and in testing the community HWRF code for operations.
- Need to assess the impact of EnKF on hurricane prediction.
- Need to facilitate moving improvements made by the HFIP community into operations (i.e., the Hurricane Test Plan).
- Future of the operational hurricane model at NCEP: Will HWRF migrate toward NMM-B on NEMS? Should we consider migrating toward AHW?

28 Future Directions and Challenges
Ensembles:
- Very good start on the development of the DET infrastructure and the first few modules during AOP 2010.
- Need to demonstrate sufficient short-term value to operations and the NWP community:
  - EMC: next-generation SREF
  - HWT: collaboration with CAPS and SPC on the cloud-scale ensemble
  - HMT: ensemble verification
  - Making the DET modules available to the community
- DET test and evaluation activities will require significant compute resources.
- Additional resources are required to accelerate development.

29 THANK YOU!

30 Louisa Nance DTC Director’s Office

31 Director's Office Responsibilities
Internal coordination:
- Manage and coordinate DTC tasks
- Planning, budgeting, execution, and reporting
External communication:
- Conduct or assist with workshops and tutorials
- Interact with DTC partners on collaborative efforts
- Create and maintain the DTC website
- Provide administrative support for DTC EC, MB & SAB meetings
- Host the DTC Visitor Program

32 Director's Office Staff
Management:
- Bill Kuo - Director (0.50 FTE)
- Louisa Nance - Assistant Director (0.50 FTE)
- Barb Brown - JNT Director (0.10 FTE)
- Bonny Strong - JNT DTC Manager (0.25 FTE)
- Steve Koch - GSD Director (0.10 FTE)
- TBD - GSD DTC Manager (0.25 FTE)
Administrative support:
- Pam Johnson (0.50 FTE)
Total: 2.20 FTEs

33 Major accomplishments for AOP 2010
Internal coordination:
- DTC staff retreat (Apr 2010)
- Monthly staff / task lead meetings
External communication - DTC management meetings:
- 1st face-to-face meeting of the DTC Executive Committee (Aug 2010)
- 1st SAB/MB meeting (Sept 2010)
DTC Visitor Program:
- Announcement of Opportunity (Jun 2010)
- Selected 6 proposals from 22 submissions for funding (Sept 2010)
- Selected hosts from DTC staff for each project
- Hosted initial visits for 3 projects

34 DTC Visitor Program
- Brian Ancell (Texas Tech University; PI + student): Development of operational Weather Research and Forecasting model ensemble sensitivity/data assimilation tools
- Michael E. Baldwin / Kimberly Elmore (Purdue University / University of Oklahoma; PI): Incorporating spatial analysis of forecast performance into the MET package at the DTC
- Vincent E. Larson (University of Wisconsin - Milwaukee; PI): A generalized parameterization for clouds and turbulence in WRF
- Sarah-Jane Lock (University of Leeds; PI): Cut-cell representation of orography: Exploring an alternative method for dealing with flows over steep and complex terrain
- Don Morton (University of Alaska Fairbanks; PI): Alaska High Resolution Rapid Refresh (HRRR-AK) - Verification and study of model configuration for operational deployment
- Di Wu & Xiquan Dong (University of North Dakota; PI + student): Evaluation of WRF microphysics schemes with observations in 3D environment [Withdrawn]

35 Proposed activities for AOP 2011
Internal communication:
- Implement tools to assist with task and staff coordination
- Provide a framework for cross-coordination between overlapping activities of major task areas
- Coordinate strategic planning activities
External communication:
- Continue making improvements to the DTC website
- Host MB, EC & SAB meetings and keep open channels of communication
DTC Visitor Program:
- Host visitors for 5 funded projects
- Prepare a new AO & select the next round of visitor projects
- Provide support for upcoming workshops and tutorials

36 Challenges for Director's Office
- Meeting the needs/interests of the NWP community during a tight funding period
- Cross-coordination between task areas:
  - Verification, HMT & HWT
  - EnKF for DA, Hurricanes & Ensembles
- Determining the appropriate focus and scope for the next DTC Visitor Program Announcement of Opportunity

37 Jamie Wolff Mesoscale Modeling Collaborators: NOAA/NCEP/EMC NOAA/ESRL/GSD NCAR/NESL/MMM

38 Mesoscale Modeling Goals
Community support (in collaboration with MMM and EMC):
- Maintain, support, and facilitate contributions to the community-released code repositories, currently including WRF and the post-processing software
- Support community outreach events
Testing and evaluation:
- Extensively test and evaluate a variety of model configurations that will provide benefits to both the operational and research communities - Reference Configurations (RCs)
- Provide a functionally equivalent operational environment for testing and evaluating new techniques or capabilities

39 Major Accomplishments for AOP 2010
Community outreach and support (in collaboration with MMM, EMC, GSD, and AFWA):
- WRF v3.2 release, WRF Workshop, WRF Tutorials, wrfhelp
- Significant progress on the transition from WPP to UPP; established the community-UPP code repository
- Published/updated online docs for past & current activities on the DTC T&E webpage
Testing and evaluation:
- Designated several DTC RCs; retested two with WRF v
- Performed evaluation of the GFS/NAM precipitation forecast comparison; presented results at relevant conferences
NEMS:
- Began to develop DTC expertise in NEMS software; software engineer hire focused on NEMS development (at EMC)
- Held a technical information exchange meeting (October 2010) with EMC, GSD, MMM, and DTC representatives

40 Highlight: GFS/NAM Precip Forecast Comparison
- FSS: NAM15 is consistently higher than GFS60 across multiple thresholds (12-h lead time shown).
- MODE: Counts (left) and size distribution (right) for all objects defined within the NAM4 forecast are more consistent with the observation field than the GFS4 forecast.
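FSS appears on the slide without definition. As background only (not the MET implementation), here is a minimal pure-Python sketch of the standard fractions skill score: both fields are thresholded into binary exceedance fields, averaged over an n x n neighborhood, and compared. The grid values, the 25 mm threshold, and the 3x3 window in the test below are illustrative assumptions.

```python
def neighborhood_fraction(binary, n):
    """Fraction of grid points in each n x n window that exceed the threshold.

    Windows are clipped at the domain edges (points outside count as zero).
    """
    pad = n // 2
    rows, cols = len(binary), len(binary[0])
    out = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            hits = 0
            for di in range(-pad, pad + 1):
                for dj in range(-pad, pad + 1):
                    r, c = i + di, j + dj
                    if 0 <= r < rows and 0 <= c < cols and binary[r][c]:
                        hits += 1
            out[i][j] = hits / (n * n)
    return out


def fss(fcst, obs, threshold, n):
    """Fractions Skill Score: 1 = perfect at this scale, 0 = no skill."""
    pf = neighborhood_fraction([[v >= threshold for v in row] for row in fcst], n)
    po = neighborhood_fraction([[v >= threshold for v in row] for row in obs], n)
    cells = [(i, j) for i in range(len(pf)) for j in range(len(pf[0]))]
    mse = sum((pf[i][j] - po[i][j]) ** 2 for i, j in cells) / len(cells)
    ref = sum(pf[i][j] ** 2 + po[i][j] ** 2 for i, j in cells) / len(cells)
    return 1.0 - mse / ref if ref > 0 else 1.0
```

A perfect forecast scores 1.0 at any neighborhood size, and displaced but otherwise similar precipitation features score progressively better as the window grows, which is why FSS is compared across thresholds and scales for forecasts like NAM15 vs. GFS60.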

41 Community Support - AOP 2011
Proposed activities (in collaboration with MMM and EMC):
- Maintain/enhance WRF regression testing
- Continue ongoing efforts in community support
- Finish preparing the community-UPP software package for distribution: extensively test, write documentation, and provide community support upon release
Anticipated major accomplishments:
- WRF release (v3.3, April 2011)
- Community-UPP version 1.0 release (April 2011)
- Bi-annual WRF Tutorials (July 2011, January 2012)
- Annual WRF Users Workshop (June 2011)

42 Testing and Evaluation - AOP 2011
Proposed activities:
- Continue expansion of RC testing and evaluation
- Strengthen the foundation of DTC expertise with the NEMS software; continue to contribute to the development process in select areas (e.g., portability, interoperability, I/O layer capabilities)
- Co-host a physics workshop for mesoscale modeling
- Work on publications of DTC methodology and results from select test activities
Anticipated major accomplishments:
- New RC designations or retests, as appropriate
- Functionally equivalent operational environment established at the DTC to assess new NWP methods
- White paper outlining short-term and longer-term approaches for making significant progress toward improving physics parameterizations
- Manuscript(s) submitted to appropriate peer-reviewed journals

43 Resource requirements for AOP 2011
FTEs (contributions from testbed collaborations in parentheses; that proposed work is presented separately): Scientists 3.83 (0.43), Software Engineers 2.21 (0.05), Students 0.68 (0.18)
Scientists: Jimy Dudhia, Tressa Fowler, Eric Gilleland, Michelle Harrold, Tara Jensen, Louisa Nance, Ed Tollerud, Wei Wang, Jamie Wolff, Chunhua Zhou
Software Engineers: Laurie Carson, Dave Gill, John Halley Gotway, Chris Harrop, Eugene Mirvis, Paul Oldenburg, Tricia Slovacek, Brent Ertz, Lara Ziady
Students: Stanislav Stoytchev (ASII), Zach Trabold
Non-salary costs: Travel $15K, Workshops $25K, Publications $2K

44 Ligia Bernardet Hurricanes External collaborators: NOAA Environmental Modeling Center NOAA Geophysical Fluid Dynamics Laboratory NOAA Atlantic Oceanographic and Meteorological Laboratory NCAR Mesoscale and Microscale Meteorology Division University of Rhode Island

45 Hurricanes Goals
- Facilitate transfer of research to operations by creating a framework for NCEP and the research community to collaborate
- Support the community in using operational hurricane models
- Develop and maintain a hurricane testing and evaluation infrastructure at the DTC
- Perform tests to assure the integrity of community code and evaluate new developments for potential operational implementation

46 Major Accomplishments for AOP 2010
- Release of HWRF to the community: code management, documentation, support for 150 registered users
- Transition of community code to EMC to serve as the baseline for the 2011 operational implementation
- Creation of a functionally-equivalent infrastructure to run HWRF on jet (including data assimilation and cycling)
- Testing and evaluation: routine extended regression tests to evaluate code integrity; ran 400 cases from the 2008 and 2009 hurricane seasons in preparation for designating a DTC RC

47 Functionally-equivalent T&E suite
From the DTC "HWRF 2011 Baseline Test Plan" (point of contact: Ligia Bernardet; December 15, 2010): The DTC will be performing testing and evaluation for the Hurricane WRF system, known as HWRF (Gopalakrishnan et al. 2010). HWRF will be configured as closely as possible to the operational HWRF model, employing the same domains, physics, coupling, and initialization procedures as the model used at NOAA NCEP Central Operations and by the model developers at NCEP EMC. The configuration to be tested matches the 2011 HWRF Baseline, which served as control for all developments at EMC geared towards the 2011 operational implementation.
Suite components:
- Pre-processing (including the ability to read binary GFS in spectral coordinates)
- Cycled HWRF vortex initialization and relocation
- GSI data assimilation
- Coupled (POM + WRF) model
- Post-processing
- Tracking
- NHC verification & confidence intervals
- Display
- Archival
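The "NHC verification & confidence intervals" component generally means pairing track-error statistics with resampling-based intervals, so that differences between model configurations are not over-interpreted. A minimal percentile-bootstrap sketch; the function name and the sample of 48-h track errors below are hypothetical illustrations, not the DTC verification code:

```python
import random
import statistics


def bootstrap_mean_ci(sample, n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap (1 - alpha) confidence interval for the mean."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    means = sorted(
        statistics.fmean(rng.choices(sample, k=len(sample)))  # resample with replacement
        for _ in range(n_boot)
    )
    lo = means[int(n_boot * (alpha / 2))]
    hi = means[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi


# Hypothetical 48-h track errors (n mi), for illustration only
errors = [45.0, 60.0, 38.0, 72.0, 55.0, 80.0, 41.0, 66.0, 59.0, 48.0]
low, high = bootstrap_mean_ci(errors)
```

When the intervals for two configurations overlap substantially, the mean-error difference alone is weak evidence that one model is better.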

48 Proposed activities for AOP 2011
- Continue HWRF community support
- Continue HWRF code management, keeping the evolving community and EMC versions of HWRF connected
- Upgrade the DTC testing suite by adding HYCOM, UPP, and the ability to run at high resolution
- Perform extensive testing of HWRF; actual tests are TBD, but could include changes in resolution, alternate physics, and initialization (EnKF)
- Diagnostic activities*: evaluate HWRF to understand weaknesses and sources of error; begin assembling a diagnostic toolbox for the DTC and the community
* Pending HFIP funding

49 Anticipated major accomplishments for AOP 2011
- Hurricane Prediction Test Plan, in collaboration with EMC and HFIP, describing plans for tests to be conducted by the DTC in 2011 and a protocol for how future tests will be determined
- Hurricane tutorial in April
- Publication of an HWRF Reference Configuration
- Expanded WRF community code with the addition of new developments for HWRF
- Expanded HWRF testing infrastructure on jet with the addition of new components (HYCOM and high resolution)
- Input to NCEP pre-implementation decisions through HWRF testing and evaluation at the DTC

50 Resource requirements for AOP 2011
FTEs: Scientists 3.7*, Software Engineers 2.3*, Students: none
Scientists: Shaowu Bao, Ligia Bernardet, Mrinal Biswas, Louisa Nance, Jamie Wolff, new AS II*, new AS III*
Software Engineers: Timothy Brown, Laurie Carson, Christopher Harrop, Donald Stark, Tricia Slovacek, Bonny Strong
Non-salary costs: Travel $8K, Workshops $5K, Publications $1.5K
* Includes funding for the hurricane diagnostic toolbox

51 Xiang-Yu (Hans) Huang (presented by Ming Hu) - Data Assimilation
Collaborators: AFWA, NCEP/EMC, NASA/GMAO, NOAA/GSD, NCAR/MMM

52 Data Assimilation Goals
Community code:
- Provide current operational GSI capability to the research community (O2R) and a pathway for the research community to contribute to operational GSI (R2O)
- Provide a framework to enhance collaboration among distributed developers
T&E:
- Provide a rational basis to operational centers and the research community for selecting a GSI configuration for their NWP system
- Explore an alternative data assimilation method - EnKF

53 Major Accomplishments for AOP 2010
GSI community code:
- Established the GSI Review Committee
- Community support (v2 release, tutorial, & helpdesk)
- Developer support
- Maintained the Community GSI repository and conducted appropriate testing
- R2O applications
GSI T&E:
- Configuration testing
- Month-long data impact tests
- Investigated several special issues

54 GSI R2O applications
GSI R2O Transition Procedure:
1. GSI Review Committee - initial scientific review
2. DTC - merge code with the latest GSI trunk following the GSI coding standard; perform the DTC regression test; perform an impact study of the code changes
3. GSI Review Committee - code and commitment review
4. DTC->EMC GSI code commitment
5. EMC->DTC GSI repository syncing
Applications:
1. GSD cloud analysis package for Rapid Refresh operations
2. Assimilation of surface PM2.5 observations in GSI for the CMAQ regional air quality model
3. Portability issues from repository testing

55 Proposed activities for AOP 2011
Community code:
- Continue to provide support for the Community GSI package
- Continue to maintain the Community GSI repository
- Coordinate GSI Review Committee meetings and activities
- Test (and adjust if necessary) the procedure for managing the R2O transition, with input from the GSI Review Committee
- Explore alternative DA methods/systems, including collaboration and community support
Testing and evaluation:
- Conduct extensive tests of the end-to-end system
- GSI baseline tests (including comparison with WRF-Var)
- Regional EnKF system
- Reassess the long-term strategy for the DA task

56 Anticipated major accomplishments for AOP 2011
GSI community code:
- Annual Community GSI Tutorial/Workshop
- GSI v3.0 release
- GSI community contribution (R2O) procedure and implementation
Testing and evaluation:
- Final report summarizing results of the GSI and WRF-Var comparison experiments
- Final report summarizing the DART-EnKF test results and a recommendation for a regional EnKF testbed

57 Resource requirements for AOP 2011
FTEs: Scientists 3.2, Software Engineers 0.5
Scientists: Kathryn Crosby, Hans Huang, Ming Hu, Hui Shao, Chunhua Zhou
Software Engineers: Don Stark
Non-salary costs: Travel $18K

58 Tressa L. Fowler Verification In collaboration with AFWA, NASA, HWT, HMT

59 Verification Goals
- MET development: provide complete, quality verification tools to the NWP community
- MET support: provide instruction for and demonstration of those tools

60 Major Accomplishments for AOP 2010
MET support:
- New expanded tutorials
- MET-related talks at AMS
- HWT and HMT collaborations
MET development:
- Workshop
- Release of MET v3.0
- Adaptation and demonstration of METViewer for the NCEP database


62 Proposed activities for AOP 2011
MET support:
- Semi-annual tutorials, expanded and improved
MET development:
- Annual workshop
- Improved methods for verifying clouds
- Research on methods for verifying through time
- Improvements in ensemble methods, in collaboration with DET
- Expanded capabilities of METViewer
- Initial capabilities to verify hurricanes

63 Anticipated major accomplishments for AOP 2011 MET Support Improved Tutorials MET Development Informative workshop MET software release Improved ensemble, cloud, and time verification. METViewer database and display release

64 Resource requirements for AOP 2011
FTEs (testbed-collaboration contributions in parentheses): Scientists 1.40 (0.65), Software Engineers 2.50 (0.73), Students 0.40 (0.40)
Scientists: Tressa Fowler*, Tara Jensen*, Eric Gilleland, Michelle Harrold, Ed Tollerud
Software Engineers: Randy Bullock*, John Halley Gotway*, Paul Oldenburg*, Anne Holmes, Bonny Strong*
Students: Lisa Coco
Non-salary costs: Travel $11K ($5K), Workshops $26K ($1K)
* Contributions from testbed collaborations - proposed work presented separately

65 Zoltan Toth Ensembles DET website: External collaborators: NOAA Environmental Modeling Center Hazardous Weather Testbed Hydrometeorological Testbed Hurricane Forecast Improvement Project

66 Ensembles Goals Develop & maintain DET infrastructure: six modules; user interface (DET Portal). Establish benchmark: functionally reproduce the NCEP operational system; first benchmark is NCEP's upcoming implementation(s). Test & evaluate community-developed methods: collaborative work with the research community. Link with other testbeds and programs/projects: support ensemble activities (HMT, HWT, HFIP, etc.). Link with user community

67 Major Accomplishments for AOP 2010 Plans developed for: overall architecture of the DET infrastructure; each of the 6 modules; test and evaluation. Established and tested basic capabilities for 2 modules: configuration; initial perturbations. Collaboration with other DTC tasks/projects: HMT (joint plans for testing DET & HMT ensemble generation); HWT (joint plans for evaluation of 2011 HWT ensemble products). Outreach: organized DET Workshop and engaged with WRF Ensemble WG; activities coordinated with NCEP/EMC via regular meetings

68 DET MODULES: Module 1: Configuration; Module 2: Initial Perturbations; Module 3: Model Perturbations; Module 4: Statistical Post-Processing; Module 5: Product Generation; Module 6: Verification. External Input (HMT, HWT, HFIP, etc.)

69 Ongoing work on Modules 1-2 Establishing basic capabilities, closely coordinated with EMC. 6-member ensemble test: ARW with various physics; NA SREF domain; 22 km grid-spacing; to be expanded S & E for HFIP; GFS initial conditions; Dec 2010. Initial perturbations from GEFS are cycled (dynamically downscaled) and tested against interpolated GEFS initial perturbations; GEFS lateral boundary conditions

70 Proposed activities for AOP 2011 Establish benchmark for the initial perturbation module: testing & evaluation to contribute to the next NCEP SREF implementation. Establish basic capability for the model perturbation module: capability of using different versions of NMM and possibly ARW under NEMS. Interface with other testbeds & projects: evaluation of the HMT ensemble; products & evaluation for the HWT ensemble; joint planning for HFIP ensemble development & testing. Continued engagement with the community: co-organize the 5th Ensemble User Workshop with NCEP & possibly NUOPC

71 Anticipated major accomplishments for AOP 2011 Test results and software for cycling initial perturbations. ARW & NMM incorporated into DET under NEMS. Improved user interface: basic capability for the DET Portal. Report on HMT & HWT ensemble product evaluation: hydromet & hazardous weather forecast products; verification metric packages identified for hydromet & hazardous weather, in collaboration with HMT, OHD, NCEP, etc. 5th Ensemble User Workshop

72 Resource requirements for AOP 2011 Staff: Scientists 1.90 FTE (0.40); Software Engineers 1.40 FTE (0.25); Students 0.10 FTE (0.10). Scientists: Barbara Brown, Michelle Harrold, Isidora Jankov, Tara Jensen*, Ed Tollerud, Zoltan Toth, new DET Task Lead. Software Engineers: Brent Ertz, Anne Holmes, Eugene Mirvis, Paula McCaslin, Paul Oldenburg, Linda Wharton, SEII new hire. Students: Lisa Coco. Non-salary: Travel $15K ($6K); Workshops $12K; Publications $3K. *Contributions from testbed collaborations; proposed work presented separately

73 Issues Computational resources for DET: collaboration with HFIP established (access to Jet in Boulder); run the HFIP regional ensemble embedded into the DET NA ensemble. TeraGrid start-up allocation secured for testing portability; a full allocation will be requested in spring. NOAA Site B research computer: how to request an allocation for DET for FY12 & beyond? New NCAR facility: how to request an allocation for DET? Real-time testing of the DET ensemble: NA domain, with embedded HMT, HWT, HFIP, etc. (movable) nests? Subject to availability of additional resources. Accelerate development of benchmarks for statistical post-processing & products, subject to availability of additional resources

74 Ed Tollerud HMT/DTC Collaboration External Collaborators: NOAA Earth System Research Laboratory NOAA Environmental Modeling Center California Nevada River Forecast Center California NWS forecast offices

75 HMT/DTC Collaboration Goals Work toward the DTC Mission Statement: improvement through verification/evaluation of EMC operational models. Improve forecasting methods for extreme precipitation events: model techniques, data impacts, and physical parameterizations. Demonstrate usefulness of prototype ensemble forecast systems, thereby providing long-range guidance for operational ensemble forecast systems in development at EMC. Collaborate on and advance the missions of the cross-cutting tasks in the DTC: model development, verification tools, ensemble methods. HMT-West ensemble QPF evaluation is the current focus for attaining these goals; HMT-West is an effective testbed for real-life forecasting method evaluation

76 Major Accomplishments for AOP 2010 HMT-West winter exercise: real-time demonstration website for QPF verification. Mesoscale modeling: operational EMC model QPF verification. Ensemble modeling (DET): verification for the WRF regional ensemble model. Data impacts analyses: verification uncertainty due to observational data-stream choices and data quality. Verification methods: MODE and the assessment of Atmospheric River forecasts


78 Proposed activities for AOP 2011 Evaluate the impact of microphysical schemes on operational and research model performance using the HMT-West observation base. Perform QPF verification for heavy rain events for EMC operational and research models, the HMT-West WRF ensemble, and others. Expand ensemble/probabilistic content of the HMT-West verification demonstration. Investigate use of moisture flux for MODE object identification and its value to model AR forecast assessment

79 Anticipated major accomplishments for AOP 2011 Written evaluation of QPF for several EMC operational models and regional ensemble systems using 1-2 year statistics from HMT winter exercises Development of new techniques to verify microphysical forecasts in time and space domains An expanded and interactive verification website with new aggregation, regionalization, and probabilistic scoring options Identification of effective MODE-based methods to evaluate leading edge, moisture flux, and other AR attributes

80 Resource requirements for AOP 2011 Staff: Scientists 0.83 FTE; Software Engineers 0.53 FTE; Students 0.48 FTE. Scientists: Ed Tollerud, Tara Jensen, Tressa Fowler. Software Engineers: John Halley Gotway, Paul Oldenburg, Randy Bullock, Brent Ertz. Students: Lisa Coco, Stanislav Stoytchev. Non-salary: Travel $8K; Publications $2K. Compute resources: NOAA Jet system

81 Tara Jensen HWT-DTC Collaboration Collaborating with: NOAA (SPC, NSSL, EMC, HPC, AWC, ESRL); universities (OU/CAPS); NCAR (MMM); UCAR (COMET)

82 HWT-DTC Collaboration Goals Support the DTC mission by evaluating convection-allowing models/ensembles on the cutting edge of NWP. Gather robust datasets for ensemble and deterministic studies: datasets and algorithms can be leveraged by DET and the Mesoscale focus area. Demonstrate the utility of MET-based objective evaluation in a forecast environment: provide near real-time evaluation during the Spring Experiment; application of new MET tools feeds back to MET development

83 Major Accomplishments for AOP 2010 Enormous expansion of near real-time evaluation capability: evaluation of 30 models during the Spring Experiment (CAPS ensemble + 3 operational baselines); 10 deterministic and 4 ensemble products evaluated using traditional and spatial verification methods; three additional research models available for retrospective studies. DTC staff participation in each week of Spring Experiment 2010. Papers and presentations: Kain et al., October 2010: Assessing Advances in the Assimilation of Radar Data and Other Mesoscale Observations within a Collaborative Forecasting–Research Environment, MWR. Presentations at the 11th WRF Users' Workshop, 25th AMS Conf. on Severe Local Storms, FAA Interagency Meeting, 24th AMS Conf. on Weather and Forecasting, & 25th AMS Conf. on Hydrology

84 Example of Real-time Evaluation: Radar Echo Tops. Ensemble products are not always useful. [Figure panels: Observed RETOP, CAPS PM Mean, Thompson, WSM6, WDM6, Morrison]

85 Quick Glance at HWT-DTC Evaluation Results (CAPS ensemble mean). Frequency bias (RETOP > 35 kft) indicates the CAPS ensemble mean field exhibits a large over-forecast of areal coverage of cloud complexes. The ratio of MODE forecast objects to observed objects implies a 2-4x over-prediction of CAPS ensemble mean convective cells, whereas the HRRR and CAPS 1-km ratios are near 1 for the first 15 hours
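The two scores on this slide are simple ratios, and a minimal sketch may help readers outside the verification community. This is an illustrative computation only: the counts below are hypothetical, not results from the HWT-DTC evaluation, and MET computes these scores internally from gridded fields and MODE objects.

```python
# Hedged sketch of the two scores referenced above: frequency bias from a
# 2x2 contingency table, and a MODE-style forecast/observed object ratio.
# All numbers are made up for illustration.

def frequency_bias(hits, false_alarms, misses):
    """Frequency bias = forecast yes-count / observed yes-count.
    Values > 1 indicate over-forecast of areal coverage."""
    return (hits + false_alarms) / (hits + misses)

def object_ratio(n_forecast_objects, n_observed_objects):
    """Ratio of MODE forecast objects to observed objects.
    Near 1 is ideal; 2-4 suggests over-prediction of discrete cells."""
    return n_forecast_objects / n_observed_objects

# Hypothetical counts for a RETOP > 35 kft threshold:
bias = frequency_bias(hits=120, false_alarms=240, misses=60)        # 2.0
ratio = object_ratio(n_forecast_objects=30, n_observed_objects=10)  # 3.0
```

A bias of 2.0 would mean the forecast flags twice the observed area above the threshold, which is the kind of over-forecast the slide describes for the ensemble mean.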

86 Proposed activities for AOP 2011 Ensembles focus area: help build the foundation for the Product Generation module by identifying promising ensemble product algorithms through evaluation of NCEP and CAPS SSEF ensemble products (NCEP ensembles: SREF and HREF; variables: reflectivity, accumulated precipitation, and synthetic satellite, if available), and by working with EMC and CAPS to incorporate promising algorithms into the module. Verification focus area: demonstrate MODE-Time Dimension (MODE-TD) for the convective initiation forecast problem.

87 Anticipated major accomplishments for AOP 2011 Report describing the evaluation of real-time CAPS SSEF and NCEP products. Begin incorporation of select NCEP and CAPS ensemble product algorithms into the DET Product Generation module. Demonstration of MODE-TD during the Spring Experiment. DTC staff participation during each week of the Spring Experiment.

88 Resource requirements for AOP 2011 Staff: Scientists 0.70 FTE; Software Engineers 0.45 FTE; Students 0.20 FTE. Scientists: Tara Jensen, Michelle Harrold, Tressa Fowler. Software Engineers: Paul Oldenburg, John Halley Gotway, Brent Ertz, Randy Bullock, Bonnie Strong. Students: Lisa Coco. Non-salary: Travel $11K; Fall Meeting $1K. Compute resources: 8-12 dedicated processors from 1 May to 30 June

