1 Review of EMC regional ensembles: SREF, NARRE-TL, HRRRE-TL/HREF
Jun Du, Geoff DiMego, Binbin Zhou, Dusan Jovic, Bo Yang, Matt Pyle, Geoff Manikin, Brad Ferrier, Stan Benjamin (GSD) and Brian Etherton (DET)
Acknowledgements: Ensemble Team: Yuejian Zhu, Yan Luo and Bo Cui; Mesoscale Branch: Eric Rogers, Perry Shafran and Ying Lin; DTC/NCAR: Jamie Wolff; AWC: David Bright; HPC: Dave Novak; IBM: James Abeles

2 Content
- 2011 implementations
- Coming implementations
- Plans

3 2011 implementations
- Added North American domain (grid 221) SREF data on NOMADS
- Added CONUS domain (grid 212) bias-corrected SREF data on NOMADS
- Added 4km HREF ensemble products (combining the 32km SREF with the 4km Hires Window runs; Du 2004)

4 Examples of HREF

5 HREF ensemble products (7 surface and 7 upper air variables)
Surface variables (Mean / Spread / Probability):
- 01h-, 03h-, 06h-, 12h-, 24h-APCP: mean, spread, prob (>0.01", 0.05", 0.1", 0.25", 0.5", 1.0", 1.5", 2", 4", 6")
- 2m T: mean, spread, prob (25.5C threshold)
- 10m U, 10m V: mean, spread
- 10m Wind: mean, spread, prob (>25kt, 34kt, 50kt)
- 2m RH: mean, spread
- CAPE: mean, spread, prob (>500, 1000, 2000, 3000, 4000)
- CIN: mean, spread, prob (<-50, -100, -200, -300, -400)
- SLP: mean, spread
Upper air variables (Mean / Spread / Probability):
- 850 T: mean, spread, prob (<0C)
- 850 RH, 850 U, 850 V, 850 Wind: mean, spread
- 500 T, 500 H: mean, spread
- 250 T, 250 U, 250 V, 250 Wind: mean, spread
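Each product type above (ensemble mean, spread, and threshold-exceedance probability) is a simple reduction over the member dimension. A minimal sketch in Python follows; the member count, grid shape, and threshold list are illustrative assumptions, not the operational HREF code.

```python
import numpy as np

def ensemble_products(members, thresholds):
    """Ensemble mean, spread, and exceedance probabilities.

    members    : array of shape (n_members, ny, nx), e.g. 24-h APCP in inches
    thresholds : exceedance thresholds in the same units as members
    """
    mean = members.mean(axis=0)                    # ensemble mean
    spread = members.std(axis=0, ddof=1)           # ensemble spread (std. dev.)
    # Exceedance probability = fraction of members above each threshold
    probs = {t: (members > t).mean(axis=0) for t in thresholds}
    return mean, spread, probs

# Illustrative call with random data standing in for the combined HREF members
rng = np.random.default_rng(0)
fake_apcp = rng.gamma(shape=0.5, scale=0.3, size=(21, 50, 60))  # hypothetical 21 members
mean, spread, probs = ensemble_products(fake_apcp, thresholds=[0.01, 0.25, 1.0])
```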

6 Coming implementations ~January 2012: NARRE-TL (North America Rapid Refresh Ensemble – Time Lagged) ~April 2012: SREF (Short Range Ensemble Forecast)

7 NARRE-TL (formerly VSREF)
- To be implemented together with the Rapid Refresh (RR), which replaces the RUC (Stan Benjamin's review).
- Background: NCEP and GSD agreed to implement a "3 WRF-ARW and 3 WRF-NMMB member, hourly-updated North American RR Ensemble (6-member NARRE)".
- Until enough computing resources are available to run the 6-member NARRE operationally, a time-lagged NARRE system (NARRE-TL) is being built from single RR and NAM runs as a first step, to (1) fully utilize the existing hourly RR output; (2) provide probabilistic guidance for aviation weather (similar to the idea of SPC's "Probability of Opportunity"); (3) require very little computing resource (post-processing only); and (4) provide a baseline for future real NARRE/HRRRE development and evaluation.

8 NARRE-TL configuration
[Map panels: RR model domain; NARRE-TL CONUS domain (grid#130); NARRE-TL Alaska domain (grid#242)]
- 10 weighted time-lagged members collected from 6 RR members and 4 NAM members (next slide)
- Two output grids: CONUS (grid#130) and Alaska (grid#242)
- Forecast length: 12 hours
- Model resolution: 12 km

9 Member weighting = 1 - forecast age (hr)/30: weight 1.0 for the most current forecast and 0.0 for a 30-hr-old forecast. Because the NAM members are always older than the RR members, this gives more weight to the RR members. Only the RR's first 6 hr of forecasts are used.
[Timeline diagram: NAM and RR cycles contributing to the 06Z cycle's NARRE-TL members, covering a 6-hr time lag and a 12-hr forecast]
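A minimal sketch of the time-lagged weighting above (weight = 1 - forecast age/30). The member list, cycle ages, and the normalization step are illustrative assumptions rather than the exact operational configuration.

```python
def tl_weight(forecast_age_hr, max_age_hr=30.0):
    """Weight = 1 - age/30: 1.0 for the current cycle, 0.0 for a 30-hr-old forecast."""
    return max(0.0, 1.0 - forecast_age_hr / max_age_hr)

# Hypothetical members for a 06Z NARRE-TL: (model, forecast age in hours)
members = [("RR", age) for age in range(0, 6)] + [("NAM", age) for age in (6, 12, 18, 24)]
raw = {(model, age): tl_weight(age) for model, age in members}
total = sum(raw.values())
weights = {key: w / total for key, w in raw.items()}  # normalized so weights sum to 1
```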

10 NARRE-TL ensemble products (Field: product)
1. Icing: occurrence prob on 8 flight levels
2. Turbulence (CAT): 3-severity occurrence prob on 9 flight levels
3. Ceiling height (cloud base): mean/spread/prob of 4 ranges
4. Visibility: mean/spread/prob of 4 ranges
5. Low-level wind shear: mean/spread/occurrence prob
6. Jet stream: prob on 3 levels
7. Fog (light/dense): mean/spread/prob
8. Convection: prob of occurrence (Steve W. of GSD)
9. Reflectivity: prob of 4 thresholds
10. Freezing height: mean/spread
11. Precipitation type: prob of rain and snow types
12. Accumulated precip: prob of 3- and 6-hr accumulated precip
13. Lightning: prob of occurrence (D. Bright of SPC/AWC)
14. Severe thunderstorm: prob of occurrence (D. Bright of SPC/AWC)

11 Example products from NARRE-TL (may need spatial filtering/GSD)
[Four panels: Icing, Turbulence, LLWS, Visibility]

12 Importance to user community
Subject: VSREF missing
From: xxx.xxx@noaa.gov
Date: Tue, 06 Dec 2011 10:29:59 -0700
To: Jun.Du@noaa.gov
Jun, Are you guys having troubles with the VSREF data? We have not received anything since 12/05 at 09Z. Thanks.

13 Coming SREF upgrade (Spring 2012)
Model changes:
1. Model adjustment (eliminate the Eta and RSM models and add the new NEMS-based NMMB model)
2. Model upgrade (two existing WRF cores from v2.2 to v3.3)
3. Resolution increase (from 32km/35km to 16km/17km)
IC diversity improvements:
1. More control ICs (NDAS for NMMB, GFS for NMM, and RR blended with GFS for ARW)
2. More IC perturbation diversity (regional breeding, downscaled ETR, and a blend of the two)
3. Diversity in land-surface initial states (NDAS, GFS, and RR)
Physics diversity improvement:
1. More diversity of physics schemes (NAM, GFS, HWRF, NCAR and RR)
New post-processing capabilities:
1. Precipitation bias correction (individual members and ensemble mean)
2. Clustering and associated mean/prob/spread within a cluster
3. Member performance ranking (different weights for different members)
4. Downscaling to 5km using RTMA (same as NAEFS, evaluated by DET for SREF)
New ensemble products:
1. Max/min, mode, and 10-25-50-75-90% forecasts (similar to NAEFS; see the sketch below)
2. Probabilities of severe thunderstorm, lightning, dry lightning and fire weather (SPC)
3. Addition of hourly ensemble product output from 1-39 hr
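Of the new products above, the percentile-based forecasts (max/min and 10-25-50-75-90%) are the most mechanical to illustrate. The sketch below is a minimal example under assumed array shapes; the mode product is omitted, and this is not the operational SREF post-processor.

```python
import numpy as np

def percentile_products(members, pcts=(10, 25, 50, 75, 90)):
    """Min/max and percentile forecasts from an ensemble.

    members : array of shape (n_members, ny, nx)
    Returns a dict mapping product name to a 2-D grid.
    """
    out = {"min": members.min(axis=0), "max": members.max(axis=0)}
    for p in pcts:
        out[f"p{p}"] = np.percentile(members, p, axis=0)  # p-th percentile across members
    return out
```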

14 Evaluation of SLP (opl SREF vs. par SREF, Oct. 23 – Nov. 26, 2011)
[Four panels: ens mean fcst RMSE; ens spread rank histogram; prob fcst RPSS (12km NAM as reference); prob fcst reliability diagram]

15 Evaluation of 250U (opl SREF vs. par SREF, Oct. 23 – Nov. 26, 2011)
[Four panels: ens mean fcst RMSE; ens spread rank histogram; prob fcst RPSS (12km NAM as reference); prob fcst reliability diagram]

16 Evaluation of 500H (opl SREF vs. par SREF, Oct. 23 – Nov. 26, 2011)
Outliers (truth missed by the ensemble envelope): opl SREF = 21.1%, par SREF = 13.4%
[Four panels: ens mean fcst RMSE; ens spread rank histogram; prob fcst RPSS (12km NAM as reference); prob fcst reliability diagram]
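The outlier percentage quoted on these evaluation slides is the frequency with which the verifying value falls outside the ensemble envelope. A minimal sketch, with input shapes as assumptions:

```python
import numpy as np

def outlier_frequency(members, truth):
    """Fraction of cases where the truth lies outside the ensemble envelope.

    members : array of shape (n_members, n_cases)
    truth   : array of shape (n_cases,)
    """
    below = truth < members.min(axis=0)   # truth lower than every member
    above = truth > members.max(axis=0)   # truth higher than every member
    return float(np.mean(below | above))
```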

17 Evaluation of 2mT (opl SREF vs. par SREF, Oct. 23 – Nov. 26, 2011)
Outliers (truth missed by the ensemble envelope): opl SREF = 17.2%, par SREF = 14.3%
[Four panels: ens mean fcst RMSE; ens spread rank histogram; prob fcst RPSS (12km NAM as reference); prob fcst reliability diagram]

18 Evaluation of 2mTd (opl SREF vs. par SREF, Oct. 23 – Nov. 26, 2011)
Outliers (truth missed by the ensemble envelope): opl SREF = 17.8%, par SREF = 12.0%
[Four panels: ens mean fcst RMSE; ens spread rank histogram; prob fcst RPSS (12km NAM as reference); prob fcst reliability diagram]

19 Evaluation of 10m-U (opl SREF vs. par SREF, Oct. 23 – Nov. 26, 2011)
Outliers (truth missed by the ensemble envelope): opl SREF = 19.1%, par SREF = 12.9%
[Four panels: ens mean fcst RMSE; ens spread rank histogram; prob fcst RPSS (12km NAM as reference); prob fcst reliability diagram]

20 Ensemble mean precipitation forecast: ETS of 24hr-accumulation (against CCPA, Oct. 23 – Nov. 26, 2011)

21 Ens spread: rank histogram of 24hr-accumulated precipitation at F87 (Oct. 23 – Nov. 26, 2011)
Too much light precipitation at the dry end, but less chance of missing heavier precipitation events at the high-value end.
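A minimal sketch of how a rank histogram like this one can be computed (input shapes are assumptions; ties are not handled specially):

```python
import numpy as np

def rank_histogram(members, truth):
    """Rank histogram (Talagrand diagram) counts.

    members : array of shape (n_members, n_cases)
    truth   : array of shape (n_cases,)
    Returns counts of the truth's rank among the members (n_members + 1 bins).
    """
    # Rank of the observation = number of members below it for each case
    ranks = np.sum(members < truth[np.newaxis, :], axis=0)
    return np.bincount(ranks, minlength=members.shape[0] + 1)
```

An over-populated lowest bin would be consistent with the comment above: many members predicting light precipitation where the analysis stays dry.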

22 SREF mean forecasts of 24h-accumulated precipitation at F87 (21Z, Nov. 18, 2011)
[Two panels: 32km SREF mean (opl); 16km SREF mean (par)]

23 Bias correction can effectively remove the over-predicted light-precipitation area! (frequency-matching method similar to that used in GEFS)
[Two panels: 16km SREF mean (raw); 16km SREF mean (bias corrected)]
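The slide credits a frequency-matching method similar to the one used in GEFS. The sketch below illustrates the general idea of matching forecast precipitation frequencies to the analysis via empirical quantiles; the inputs and the exact mapping are assumptions, not the operational algorithm.

```python
import numpy as np

def frequency_match(fcst, fcst_climo, obs_climo):
    """Quantile-mapping style frequency matching.

    fcst       : forecast field to correct (any shape)
    fcst_climo : 1-D sample of past forecast values
    obs_climo  : 1-D sample of matching analyses (e.g. CCPA), same variable/units
    Each forecast value is replaced by the analysis value at the same cumulative
    frequency, so the corrected forecast frequencies match the observed ones.
    """
    fcst_sorted = np.sort(fcst_climo)
    obs_sorted = np.sort(obs_climo)
    # Empirical CDF position of each forecast value in the forecast climatology
    cdf = np.searchsorted(fcst_sorted, fcst, side="right") / fcst_sorted.size
    cdf = np.clip(cdf, 0.0, 1.0)
    # Map that frequency onto the observed distribution
    return np.quantile(obs_sorted, cdf)
```

In this form, an over-forecast frequency of light precipitation gets mapped onto the (more often dry) observed distribution, shrinking the spurious light-precipitation area.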

24 Precipitation bias correction verification (against CCPA, using 12km NAM as reference for RPSS, Nov. 10-26, 2011)
[Three panels: ens mean fcst areal bias (>=0.01"); ens mean fcst ETS (>=0.01"); prob fcst RPSS]
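The areal bias and equitable threat score (ETS) shown here are standard contingency-table scores; a minimal sketch for a single threshold (the >=0.01" threshold matches the slide, everything else is an assumption):

```python
import numpy as np

def ets_and_areal_bias(fcst, obs, threshold=0.01):
    """Equitable threat score and areal (frequency) bias for one threshold."""
    f = fcst >= threshold
    o = obs >= threshold
    hits = np.sum(f & o)
    false_alarms = np.sum(f & ~o)
    misses = np.sum(~f & o)
    n = f.size
    # Hits expected by chance given the forecast and observed frequencies
    hits_random = (hits + false_alarms) * (hits + misses) / n
    ets = (hits - hits_random) / (hits + false_alarms + misses - hits_random)
    bias = (hits + false_alarms) / (hits + misses)
    return float(ets), float(bias)
```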

25 Clustering: 500H and SLP

26 Clustering: 2mT and precipitation
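The clustering algorithm behind these cluster-mean products is not specified on the slides, so the sketch below uses a generic k-means grouping of members on a flattened field (e.g. 500H) purely as an illustrative assumption.

```python
import numpy as np

def cluster_members(members, k=4, iters=50, seed=0):
    """Group ensemble members into k clusters with plain k-means.

    members : array of shape (n_members, ny, nx); each member is flattened to a vector
    Returns (labels, cluster_means), cluster_means having shape (k, ny, nx).
    """
    n, ny, nx = members.shape
    x = members.reshape(n, -1).astype(float)
    rng = np.random.default_rng(seed)
    centers = x[rng.choice(n, size=k, replace=False)].copy()
    for _ in range(iters):
        dists = np.linalg.norm(x[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = x[labels == j].mean(axis=0)
    return labels, centers.reshape(k, ny, nx)
```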

27 32km SREF downscaled to the 5km NDFD grid using RTMA
[Verification panels: against RTMA; against observations]
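"Downscaling using RTMA (same as NAEFS)" presumably adds analysis-based fine-scale detail to an interpolated coarse forecast. The sketch below shows that general idea only, under stated assumptions (a simple interpolation stand-in and a single static increment); it is not the operational NAEFS/SREF algorithm.

```python
import numpy as np
from scipy.ndimage import zoom  # simple stand-in for a proper regridding step

def downscale_with_rtma(coarse_fcst, coarse_analysis, rtma_analysis, factor):
    """Interpolate a coarse forecast to a fine grid and add an RTMA-based increment.

    coarse_fcst     : coarse-grid forecast field (e.g. 32-km SREF mean 2-m T)
    coarse_analysis : coarse analysis valid at the RTMA time
    rtma_analysis   : high-resolution RTMA analysis on the target (e.g. 5-km) grid
    factor          : grid-spacing ratio between coarse and target grids
    """
    fcst_hi = zoom(coarse_fcst, factor, order=1)          # forecast on the fine grid
    analysis_hi = zoom(coarse_analysis, factor, order=1)  # coarse analysis on the fine grid
    increment = rtma_analysis - analysis_hi               # fine-scale detail from RTMA
    return fcst_hi + increment
```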

28 How to disseminate extra SREF data and products to forecasters and users?
- Prob of thunderstorm, lightning, dry lightning and fire weather (may be held off because SPC needs to update a calibration table), bias-corrected ensemble mean precipitation, and cluster means: on the official NCO web site
- Clusters and member-ranking information such as weights: on the ftp site
- Max, min, mode, 10-25-50-75-90% forecasts: on the ftp site (more suitable for point forecasts)
- Hourly ensemble products (mean, spread and prob): remain on the NCEP CCS, since they are mainly for AWC, which can access the CCS (may need to be converted to GEMPAK)
- Bias-corrected precipitation forecast as a new variable: on NOMADS
- If downscaled 2.5/5km data become available (pending at DET), they will be on the ftp site for the time being
- Collect new required variables from field forecasters for planning future AWIPS/AWIPS2

29 Plans for SREF
SREF system:
(1) Improve IC perturbations (3D mask, EMC/DET)
(2) Explore EnKF perturbations from the hybrid EnKF/GSI DA system
(3) Improve physics perturbations (stochastic parameterization physics/EMC and stochastic kinetic energy backscatter scheme/DET)
Post-processing:
(4) Test the BMA method for post-processing/DET
(5) Calibrate ensemble spread
(6) Expand bias correction to the North America domain (e.g., grid#221)
Products:
(7) Produce SREF mean BUFR soundings
(8) Build an SREF model climate
(9) Explore NAEFS_LAM (combining with the Canadian regional ensemble)
Data:
(10) Add an ~12km output domain over NA
(11) Add new fields to AWIPS/AWIPS2
Verification:
(12) Verify ensemble mean cyclone track (Guangping Lou)

30 Plans for HRRRE-TL
Similar to NARRE-TL, develop a new suite of ensemble products, "HRRRE-TL (Hi-Res Rapid Refresh Ensemble – Time Lagged)", based on existing high-resolution single-model runs:
- 4km NAM-nest
- 4km HRW-ARW
- 4km HRW-NMM
- 4km Matt Pyle run (for SPC)
- 3km HRRR (GSD runs)

