1 Report of Inclusion of FNMOC Ensemble into NAEFS
S. Lord (NCEP/EMC), Andre Methot (MSC), Yuejian Zhu and Zoltan Toth (NOAA)
Acknowledgements: Bo Cui (EMC), Stephane Beauregard (MSC), Mike Sestak (FNMOC), Rebecca Cosgrove (NCEP/NCO)
October 20, 2009

2 Overview
Background & Testing Procedure
Results
Conclusions
Issues
Recommendation and outlook

3 Background & Testing Procedure
North American Ensemble Forecast System (NAEFS)
– Collaboration between NCEP, the Meteorological Service of Canada (MSC), FNMOC and the Mexico Weather Service
– Elements:
  Demonstrate the value of a Multi-Model Ensemble (MME)
  Engage in collaborative software development, focused on post-processing products from an arbitrary number of forecast systems
  Establish operational data transfer
  Apply shared software to operational products
  Continue to monitor the value added with the MME strategy
Global ensemble products
– NCEP – operational, 20 members, 16 days
– CMC – operational, 20 members, 16 days
– FNMOC – experimental, 16 members, 10 days

4 Background & Testing Procedure (cont)
Forecast data
– 9 months of data collected (offline); communications pathway established with FNMOC
– Raw forecasts: Fall 2008 (September 1 – November 30, 2008), Winter 2008/2009 (December 1, 2008 – February 28, 2009), Spring 2009 (March 1 – May 31, 2009)
– Bias-corrected forecasts – all ensembles bias corrected against the NCEP analysis (see the sketch below): Winter 2008/2009 (December 1, 2008 – February 28, 2009), Spring 2009 (March 1 – May 31, 2009)
Verification methods
– Reference analysis: each center's own analysis for the individual ensembles; the NCEP analysis for the combined ensembles
– Scores: NCEP standard probabilistic verification package – AC and RMS for the ensemble mean, spread, histogram; CRPS, RPSS, ROC, BSS (resolution and reliability)
– Variables: 500 hPa and 1000 hPa height; 850 hPa and 2-meter temperature; 10-m U and V; precipitation (limited scores, CONUS only)
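The operational NAEFS bias correction is based on a decaying-average estimate of the forecast bias against the NCEP analysis. The sketch below is a minimal, assumed illustration of that idea only; the weight, grid shapes and function names are illustrative and not the operational code.

# Minimal, assumed sketch of a decaying-average bias correction against the
# NCEP analysis; weight, shapes and names are illustrative, not operational.
import numpy as np

def update_bias(prev_bias, ens_mean_fcst, analysis, w=0.02):
    # Decaying-average bias estimate for one variable and lead time:
    # new_bias = (1 - w) * old_bias + w * (forecast - analysis)
    return (1.0 - w) * prev_bias + w * (ens_mean_fcst - analysis)

def bias_correct(members, bias):
    # Subtract the current bias estimate from every ensemble member
    return members - bias

# Toy example: 20 members on a 10 x 10 grid with a +2 K systematic error
rng = np.random.default_rng(0)
members = 2.0 + rng.normal(size=(20, 10, 10))
analysis = rng.normal(size=(10, 10))
bias = np.zeros((10, 10))
bias = update_bias(bias, members.mean(axis=0), analysis)
corrected = bias_correct(members, bias)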

5 2-meter temperature: 120-hour forecast (ini: )
Shaded: left – uncorrected; right – after bias correction
Bias reduced by approximately 50% at early lead times; RMS errors improved by 9% for d0–d3

6 [Figure: value added from NCEP/GEFS raw forecast to NAEFS final products – bias correction (NCEP, CMC), dual-resolution (NCEP only), down-scaling (NCEP, CMC), combination of NCEP and CMC; 8+ days gain]

7 NH 500 hPa height, Fall 2008 (anomaly correlation)
FNMOC is about 12 h behind CMC and NCEP (7.3 d vs 7.8 d)
E20s – NCEP 20-member raw ensemble mean; E20m – CMC 20-member raw ensemble mean; E16f – FNMOC 16-member raw ensemble mean

8 Value added by including the FNMOC ensemble in NAEFS – T2m, against analysis (NCEP's evaluation, 1 of 4)
Raw NCEP ensemble has modest skill (3.4 d at the 0.5 CRPS skill level)

9 Value added by including the FNMOC ensemble in NAEFS – T2m, against analysis (NCEP's evaluation, 2 of 4)
Raw NCEP ensemble has modest skill (3.4 d at the 0.5 CRPS skill level)
Statistically corrected NCEP ensemble has improved skill (4.8 d)

10 Value added by including the FNMOC ensemble in NAEFS – T2m, against analysis (NCEP's evaluation, 3 of 4)
Raw NCEP ensemble has modest skill (3.4 d at the 0.5 CRPS skill level)
Statistically corrected NCEP ensemble has improved skill (4.8 d)
Combined NCEP–CMC (NAEFS) shows a further increase in skill (6.2 d)

11 Value added by including the FNMOC ensemble in NAEFS – T2m, against analysis (NCEP's evaluation, 4 of 4)
Raw NCEP ensemble has modest skill (3.4 d at the 0.5 CRPS skill level)
Statistically corrected NCEP ensemble has improved skill (4.8 d)
Combined NCEP–CMC (NAEFS) shows a further increase in skill (6.2 d)
Adding FNMOC to NAEFS leads to a modest further improvement (6.7 d)

12 Preliminary Results from CMC (raw forecast) – Verification Against Observations

13 Preliminary Results from CMC (bias-corrected forecast) – Verification Against Observations

14 Preliminary Conclusions
Individual ensemble systems (individual centers' forecasts)
– NCEP and CMC have similar performance
– FNMOC performance is similar to NCEP & CMC for near-surface variables, including precipitation
– FNMOC is less skillful than NCEP and CMC for upper-atmosphere variables (500 hPa)
Combined ensemble system (without bias correction)
– Multi-model ensembles have higher skill than any single system
– Adding the FNMOC ensemble to the current NAEFS (NCEP+CMC) adds value for most forecast variables: noticeable improvement for surface variables, minimal improvement for the upper atmosphere
Combined ensemble system (with operational NAEFS bias correction)
– Near-surface variables improve with the FNMOC ensemble (NCEPbc + CMCbc + FNMOCbc)
– Less improvement for the upper atmosphere (e.g. 500 hPa height); some degradation at short lead times (related to the large spread in the FNMOC ensemble)
CMC evaluation against observations
– Preliminary results combining raw ensembles are mixed
– Results with bias-corrected data are still mixed

15 Issues
1. Data flow
– FNMOC processing at NCEP must be completed by the time NAEFS processing begins
– Currently NAEFS processing begins at 0730 and 1930 Z; processing of the FNMOC data takes 30 minutes, and FNMOC delivery to NCO is at 0730 and 1930 Z
– An overall gain of 30 minutes is required for timely availability of the FNMOC ensemble for NAEFS (0730 and 1930 Z) processing: processing time at NCEP can be reduced by ~10 minutes, so arrival at NCEP by 0710 and 1910 Z is required (if the NCEP speedup is 10 minutes), meaning data delivery needs to be accelerated by 20 minutes (see the arithmetic sketched below)
2. FNMOC ensemble upgrades
– Extend the forecast from 10 days to 16, and add 4 members
– Expand variables from 52 to 80
– Reduce the initial spread in ensemble generation
– Receive data in GRIB2 format
3. FNMOC use of the MSC ensemble
– Optional
– There may be security issues
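As a quick check of the timing requirement above, the arithmetic can be written out explicitly. Times and offsets are taken from the slide; the 12Z cycle uses the same offsets 12 hours later.

# Delivery-time arithmetic from the slide, in minutes after 00Z
naefs_start = 7 * 60 + 30        # NAEFS processing begins at 0730Z
fnmoc_processing = 30            # current FNMOC processing time at NCEP
ncep_speedup = 10                # achievable reduction in NCEP processing
current_delivery = 7 * 60 + 30   # FNMOC data currently reaches NCO ~0730Z

required_arrival = naefs_start - (fnmoc_processing - ncep_speedup)
acceleration_needed = current_delivery - required_arrival
print(divmod(required_arrival, 60), acceleration_needed)  # (7, 10) 20 -> 0710Z, 20 minutes earlier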

16 Recommendation and Outlook
NCEP plans to include the FNMOC ensemble in NAEFS based on
– Preliminary evaluations (shown here)
– Future improvements: NOGAPS 4-D Var (recently implemented); ensemble system upgrade (reduced initial ensemble spread for variables related to 500 hPa height, forecast extended from the current 10 d to 16 d, 4 additional members (16 to 20), variables increased from 52 to 80, exchange data format upgraded to GRIB2 for reduced data flow)
– Earlier data delivery from FNMOC
– Final real-time parallel evaluation (Q3FY10) with all partners (NCEP, FNMOC, MSC) for 3 months, including the above improvements; MSC reserves the right not to include the FNMOC data, but no decision has been made yet
Proposed data flow
– NCEP data: NCEP to FNMOC and CMC directly
– FNMOC data: FNMOC to NCEP, then NCEP to CMC
– CMC data: CMC to NCEP, then NCEP to FNMOC (?)
Anticipated implementation: Q4FY10
– Address new issues as they arise

17 Backup

18 Standard Probabilistic Scores
Continuous Ranked Probability Skill Score (CRPSS)
– Ability of the ensemble to forecast the observed (climatological) distribution of values
– Maximum value is 1.0; values above 0 are more skillful than climatology
Brier Skill Score (BSS)
– Ability of the ensemble to predict the spatial and temporal variability of observed events (e.g. T2 > 10 K) skillfully, relative to the climatological probability
– BSS = 1 for a perfect forecast, BSS = 0 for no skill
Relative Operating Characteristic (ROC)
– Ability of the ensemble membership to distinguish "hits" from "false alarms" (hit rate vs false alarm rate)
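For reference, the sketch below shows one common way to compute these scores from an ensemble and matching observations. It is an illustrative assumption, not the NCEP standard verification package; the threshold and sample sizes are made up.

# Illustrative CRPS, Brier score and generic skill score for an ensemble;
# not the NCEP verification package.
import numpy as np

def crps_ensemble(members, obs):
    # CRPS of one ensemble forecast against one observed value:
    # E|X - y| - 0.5 * E|X - X'|
    t1 = np.mean(np.abs(members - obs))
    t2 = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))
    return t1 - t2

def brier_score(prob, occurred):
    # Mean squared error of event probabilities against 0/1 outcomes
    return float(np.mean((prob - occurred) ** 2))

def skill_score(score, score_clim):
    # 1 = perfect, 0 = no better than climatology (used for CRPSS and BSS)
    return 1.0 - score / score_clim

# Toy example: probability of exceeding a threshold from a 20-member ensemble
rng = np.random.default_rng(1)
ens = rng.normal(8.0, 3.0, size=(100, 20))     # 100 cases, 20 members
obs = rng.normal(8.0, 3.0, size=100)
prob = (ens > 10.0).mean(axis=1)               # ensemble relative frequency
occurred = (obs > 10.0).astype(float)
bss = skill_score(brier_score(prob, occurred),
                  brier_score(np.full(100, occurred.mean()), occurred))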

19

20 Construction of Optimum Forecast Guidance from Multi-Model Ensembles
[Figure: Gaussian kernels; "frequentist" methods; "Bayesian" methods]
1. Multiple independent realizations
2. Historical "reforecast" data set
3. Optimal post-processing to produce "the best" forecast
4. Compact information dissemination
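A minimal sketch of the Gaussian-kernel idea named on this slide, under the assumption that each member of the pooled multi-model ensemble is dressed with a Gaussian kernel of fixed width and the kernels are averaged into a forecast distribution; member counts, the dressing width and the threshold are illustrative, not the method actually used.

# Assumed illustration of Gaussian-kernel dressing of a multi-model ensemble;
# member counts, kernel width and threshold are made up for the example.
import numpy as np
from scipy.stats import norm

def kernel_cdf(members, sigma, x):
    # Equal-weight mixture of Gaussian kernels centred on the members
    return norm.cdf(x[:, None], loc=members[None, :], scale=sigma).mean(axis=1)

# Pool members from two systems (e.g. 20 NCEP + 20 CMC = 40 realizations)
rng = np.random.default_rng(2)
ncep = rng.normal(0.0, 1.0, 20)
cmc = rng.normal(0.5, 1.2, 20)
pooled = np.concatenate([ncep, cmc])

x = np.linspace(-5.0, 5.0, 201)
cdf = kernel_cdf(pooled, sigma=0.8, x=x)
prob_above_2 = 1.0 - np.interp(2.0, x, cdf)    # P(value > +2) from the mixture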

21 Ensemble Spread – 500 hPa height (example)
[Figure: spread (too much / too little) and anomaly correlation, 45-day mean]
NCEP spread will increase substantially after the 2009 NCEP/GEFS implementation (due to the introduction of a stochastic scheme, a higher-resolution model and higher-order horizontal diffusion)
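One common way to read "too much / too little" spread is to compare the mean ensemble spread with the RMS error of the ensemble mean, which should roughly match in a well-calibrated system. The sketch below is an assumed illustration of that check, not necessarily the diagnostic plotted on the slide.

# Assumed spread-vs-error consistency check (not necessarily what the slide plots)
import numpy as np

def spread_and_rmse(members, obs):
    # members: (cases, n_members); obs: (cases,)
    ens_mean = members.mean(axis=1)
    spread = np.sqrt(members.var(axis=1, ddof=1).mean())   # mean ensemble spread
    rmse = np.sqrt(((ens_mean - obs) ** 2).mean())         # error of the mean
    return spread, rmse

rng = np.random.default_rng(3)
center = rng.normal(size=200)
members = center[:, None] + rng.normal(scale=1.0, size=(200, 20))
obs = center + rng.normal(scale=1.0, size=200)   # exchangeable with the members
print(spread_and_rmse(members, obs))             # roughly equal when well calibrated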

22 Examples of plumes – NCEP/GEFS and FNMOC/GEFS spread

23 Precipitation, raw forecast, individual ensembles
[Panels: NCEP and CMC; NCEP and FNMOC]

24 NEXT NAEFS exchange pgrba files (28 new variables)

Variable | pgrba file levels/fields | Total 80 (28 new)
GHT | Surface, 10, 50, 100, 200, 250, 500, 700, 850, 925, 1000 hPa | 11 (3)
TMP | 2m, 2mMax, 2mMin, 10, 50, 100, 200, 250, 500, 700, 850, 925, 1000 hPa | 13 (3)
RH | 2m, 10, 50, 100, 200, 250, 500, 700, 850, 925, 1000 hPa | 11 (3)
UGRD | 10m, 10, 50, 100, 200, 250, 500, 700, 850, 925, 1000 hPa | 11 (3)
VGRD | 10m, 10, 50, 100, 200, 250, 500, 700, 850, 925, 1000 hPa | 11 (3)
VVEL | 850 hPa | 1 (1)
PRES | Surface, PRMSL | 2 (0)
PRCP (types) | APCP, CRAIN, CSNOW, CFRZR, CICEP | 5 (0)
FLUX (surface) | LHTFL, SHTFL, DSWRF, DLWRF, USWRF, ULWRF | 6 (6)
FLUX (top) | ULWRF (OLR) | 1 (1)
PWAT | Total precipitable water for the atmospheric column | 1 (0)
TCDC | Total cloud cover for the atmospheric column | 1 (0)
CAPE and CIN | Convective available potential energy, convective inhibition | 2 (1)
SOIL | SOILW (0-10 cm), WEASD (water equivalent of accumulated snow depth), SNOD (surface), TMP (0-10 cm down) | 4 (4)

Notes: Surface GHT is only in the analysis file and the first pgrb file when the resolution changed. 25 of the 28 new variables come from pgrbb files; 10 and 50 hPa RH and SNOD are new variables.

25 NEXT NAEFS pgrba_bc files (bias correction; 14 new variables)

Variable | pgrba_bc file levels/fields | Total 49 (14 new)
GHT | 10, 50, 100, 200, 250, 500, 700, 850, 925, 1000 hPa | 10 (3)
TMP | 2m, 2mMax, 2mMin, 10, 50, 100, 200, 250, 500, 700, 850, 925, 1000 hPa | 13 (3)
UGRD | 10m, 10, 50, 100, 200, 250, 500, 700, 850, 925, 1000 hPa | 11 (3)
VGRD | 10m, 10, 50, 100, 200, 250, 500, 700, 850, 925, 1000 hPa | 11 (3)
VVEL | 850 hPa | 1 (1)
PRES | Surface, PRMSL | 2 (0)
FLUX (top) | ULWRF (TOA - OLR) | 1 (1)

26 Data Flow
– NCEP receives 00Z and 12Z cycle data
– Data path runs from FNMOC to the NWS/TOC and then to the NCEP/CCS
– April 2009 requirements study: NCO, TOC and FNMOC examined data delivery; offline delivery time (for evaluation) is 11Z and 23Z; for operations, NCO requires the data at NCEP, packaged appropriately, by 0730Z (1930Z for the 12Z cycle) to meet the current start time of NAEFS processing
– NCO currently receives FNMOC ensemble data between 0720 and 0740Z for the 00Z cycle (1930 to 2000Z for the 12Z cycle)
– Processing takes 30 minutes
– Delivery by 0710 and 1910Z is required (if the NCEP speedup is 10 minutes)