
Slide 1: Ensemble-4DWX update: focus on calibration and verification
ATEC-4DWX IPR, 21–22 April 2009
National Security Applications Program, Research Applications Laboratory
© 2009 UCAR. All rights reserved.

Slide 2: Summary of progress
- Upgraded from WRF V2.2.1 to V3.0.1.1
- Replaced three physics members (slab LSM, Grell-Devenyi Cu, and Betts-Miller Cu schemes); added RUC LSM, no horizontal diffusion, and the Thompson scheme with positive-definite advection
- Configured E-4DWX for ATC and operated it for a two-month period (1 December to 30 January)
- Configured and ran E-4DWX to support the UT Dept of Air Quality
- Improved MYJ and YSU PBL height diagnosis and PBL mixing
- Added new graphics, improved post-processing flexibility (installation for GMOD and plotting of the historical case archive), and improved computing parallelism
- Presented the system at a number of AMS and other conferences
- Continued R&D of on-line verification and calibration

Slide 3: Ongoing work and plans
- E-4DWX science paper for MWR; technology brief for BAMS
- Evaluate member performance for February and March 2009, compare with the results obtained for February and March 2008, and look for improvements
- Member-based evaluation of the ATC E-4DWX run to identify the best WRF model configuration
- ATC cold-air damming study using the E-4DWX archive
- Begin development of 4D-EnKF

Slide 4: New product examples

Slide 5: New product examples (figure)

Slide 6: New product examples (figure)

Slide 7: Calibration and verification
- Ensemble calibration corrects the predicted distribution.
- Calibration is needed for users who make decisions with probabilistic guidance, and it will be needed for the foreseeable future.
- Verification of different ensemble characteristics is easily completed while performing calibration.

Slide 8: What do we mean by "calibration" or "post-processing"?
[Figure: forecast PDFs of temperature (probability vs. temperature in K) before and after calibration, with the observation marked]
Post-processing has corrected the "on average" bias as well as the under-representation of the 2nd moment of the empirical forecast PDF (i.e., corrected its "dispersion" or "spread"). A sketch of these two corrections follows below.
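The slide's figures are not preserved in the transcript, but the two corrections it names (mean bias and spread) can be illustrated with a minimal sketch. This is illustrative, not the operational E-4DWX code; inflating the spread to match the RMSE of the debiased ensemble mean is one common choice, assumed here.

```python
import numpy as np

def calibrate_ensemble(train_fcst, train_obs, fcst):
    """Minimal bias/spread correction of an ensemble forecast.

    train_fcst : (n_days, n_members) past ensemble forecasts
    train_obs  : (n_days,) matching observations
    fcst       : (n_members,) today's raw ensemble
    """
    # "On average" bias: mean error of the ensemble mean over training.
    bias = (train_fcst.mean(axis=1) - train_obs).mean()

    # Spread correction: inflate members about their mean so that the
    # ensemble spread matches the RMSE of the debiased ensemble mean.
    debiased_mean = train_fcst.mean(axis=1) - bias
    rmse = np.sqrt(np.mean((debiased_mean - train_obs) ** 2))
    inflation = rmse / train_fcst.std(axis=1).mean()

    center = fcst.mean() - bias
    return center + inflation * (fcst - fcst.mean())
```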

Slide 9: Benefits of post-processing
- Essential for tailoring to the local application: NWP provides spatially and temporally averaged gridded forecast output, so applying gridded forecasts to point locations requires location-specific calibration to account for spatial and temporal variability (which means increasing ensemble dispersion).
- Relatively inexpensive!

Slide 10: Example of Quantile Regression (QR)
Our application: fitting temperature quantiles using QR conditioned on:
1) reforecast ensemble
2) ensemble mean
3) ensemble median
4) ensemble standard deviation
5) persistence
6) logistic regression quantile
(A sketch of the fitting step follows below.)
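A minimal sketch of quantile regression with statsmodels, using synthetic data in place of the reforecast archive. The regressors mirror items 2)-4) above; the variable names, training set, and quantile grid are assumptions for illustration.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.regression.quantile_regression import QuantReg

rng = np.random.default_rng(0)

# Synthetic stand-in for a reforecast archive: 15-member temperature
# ensembles (K) and matching observations that are biased relative to them.
ens = 280.0 + 3.0 * rng.standard_normal((500, 15))
obs = ens.mean(axis=1) + 1.5 + rng.standard_normal(500)

# Regressors 2)-4) from the slide: ensemble mean, median, and stdev.
X = sm.add_constant(np.column_stack(
    [ens.mean(axis=1), np.median(ens, axis=1), ens.std(axis=1)]))

# One QR fit per target quantile of the temperature distribution.
fits = {q: QuantReg(obs, X).fit(q=q) for q in (0.1, 0.5, 0.9)}
print({q: f.params.round(2) for q, f in fits.items()})
```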

Slide 11: Summary of progress: calibration (and verification)
- Scheme fully automated
- Calibrating temperature and dew-point temperature
- Calibration specific to the unique set of models available for each cycle
- Utilizes "persistence" if available
- 26 sites over DPG
- Full calibration for all sites takes ~10 hr per weather variable; using lookup tables, ~1 hr
- Tables updated once per week

Slide 12: Calibration procedure
1) Fit logistic regression (LR) ensembles:
   - calibrate the CDF over a prescribed set of climatological quantiles
   - for each forecast, resample a 15-member ensemble set
2) Perform a "climatological" fit to the data.
3) Starting with the full regressor set, iteratively select the best subset using "step-wise cross-validation" (sketched below):
   - fitting done using QR
   - selection done by (a) minimizing the QR cost function and (b) satisfying the binomial distribution
(2nd pass: segregate forecasts into differing ranges of ensemble dispersion and refit the models.)
Regressors for each quantile: 1) reforecast ensemble, 2) ensemble mean, 3) ensemble median, 4) ensemble stdev, 5) persistence, 6) logistic regression quantile
[Figure: forecast PDF vs. observation, and a time series of forecasts and observations]
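A sketch of the step-wise selection idea in step 3: greedy forward selection of regressors, minimizing a cross-validated QR (pinball) cost. The binomial-consistency check in 3(b) is omitted, and the `fit` callable and fold scheme are assumptions for illustration, not the operational implementation.

```python
import numpy as np

def pinball_loss(y, yhat, q):
    """QR cost function (pinball/check loss) for quantile q."""
    e = y - yhat
    return np.mean(np.maximum(q * e, (q - 1) * e))

def stepwise_select(X, y, q, fit, n_folds=5):
    """Greedy forward selection of regressor columns by cross-validated
    pinball loss. `fit(Xtr, ytr, q)` must return a predict(X) callable."""
    selected, best = [], np.inf
    remaining = list(range(X.shape[1]))
    folds = np.array_split(np.arange(len(y)), n_folds)
    while remaining:
        scores = {}
        for j in remaining:
            cols = selected + [j]
            loss = 0.0
            for f in folds:
                tr = np.setdiff1d(np.arange(len(y)), f)
                model = fit(X[tr][:, cols], y[tr], q)
                loss += pinball_loss(y[f], model(X[f][:, cols]), q)
            scores[j] = loss / n_folds
        j_best = min(scores, key=scores.get)
        if scores[j_best] >= best:
            break  # no remaining regressor improves the CV cost
        best = scores[j_best]
        selected.append(j_best)
        remaining.remove(j_best)
    return selected
```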

Slide 13: Verifying ensemble forecasts
Measures used in automated model selection:
1) Rank histogram
2) Root mean square error (RMSE)
3) Brier score
4) Ranked Probability Score (RPS)
5) Relative Operating Characteristic (ROC) curve

Slide 14: 6-hr temperature time series, before and after calibration (figures)

Slide 15: 36-hr temperature time series, before and after calibration (figures)

Slide 16: Raw versus calibrated PDFs
[Figure: raw (blue) and calibrated (black) ensemble PDFs with the observed value (red)]
Notice the significant change in both the "bias" and the dispersion of the final PDF (also notice the PDF asymmetries).

Slide 17: Troubled rank histograms (slide from Matt Pocernic)
[Figure: two example rank histograms, counts (0-30) vs. ensemble member rank (1-10)]
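For reference, a rank histogram like those on this slide can be computed in a few lines. This is a sketch; ties between members and the observation are ignored here.

```python
import numpy as np

def rank_histogram(ens, obs):
    """Counts of the observation's rank within each ensemble.

    ens : (n_cases, n_members) ensemble forecasts
    obs : (n_cases,) verifying observations
    Returns counts over n_members + 1 rank bins; a flat histogram
    indicates a statistically consistent (well-dispersed) ensemble,
    while U-shapes or slopes signal under-dispersion or bias.
    """
    ranks = (ens < obs[:, None]).sum(axis=1)  # rank 0 .. n_members
    return np.bincount(ranks, minlength=ens.shape[1] + 1)
```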

Slide 18: 6-hr temperature rank histograms (figures)

Slide 19: 36-hr temperature rank histograms (figures)

Slide 20: Verifying ensemble forecasts
Measures used:
1) Rank histogram
2) Root mean square error (RMSE)
3) Brier score
4) Ranked Probability Score (RPS)
5) Relative Operating Characteristic (ROC) curve
These measures are used for automated calibration model selection.

Slide 21: Ranked Probability Score (RPS), for multi-category or continuous variables
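The formula on this slide did not survive the transcript; the standard definitions for K categories, and the continuous limit (the CRPS), are:

```latex
\mathrm{RPS} = \frac{1}{K-1} \sum_{k=1}^{K} \left( F_k - O_k \right)^2,
\qquad
\mathrm{CRPS} = \int_{-\infty}^{\infty} \bigl[ F(x) - H(x - x_{\mathrm{obs}}) \bigr]^2 \, dx
```

Here F_k is the cumulative forecast probability through category k, O_k the cumulative observation indicator (0 or 1), F(x) the forecast CDF, and H the Heaviside step function.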

Slide 22: Skill scores
- A single value to summarize performance
- Reference forecast: the best naive guess, e.g. persistence or climatology
- A perfect forecast implies that the object can be perfectly observed
- Positively oriented: positive is good
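The generic skill score against a reference forecast (not shown on the slide as transcribed) is:

```latex
\mathrm{SS} = \frac{S - S_{\mathrm{ref}}}{S_{\mathrm{perf}} - S_{\mathrm{ref}}},
\qquad
\text{e.g. for RMSE } (S_{\mathrm{perf}} = 0):\quad
\mathrm{SS} = 1 - \frac{\mathrm{RMSE}}{\mathrm{RMSE}_{\mathrm{ref}}}
```

so SS = 1 for a perfect forecast, SS = 0 for no skill over the reference, and SS < 0 when worse than the reference.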

Slide 23: 36-hr temperature time series, RMSE skill score and CRPS skill score (figures)
Reference forecasts: black = raw ensemble, blue = persistence

Slide 24: RMSE of models, 6-hr and 36-hr lead times (figures)

Slide 25: Significant calibration regressors, 6-hr and 36-hr lead times (figures)

Slide 26: Future plans
- Correct over-dispersion of the calibration
- Implement the procedure for (a) wind speed, (b) wind direction, (c) precipitation, (d) pressure
- Diagnose the most informative model set to use operationally
- Develop a scheme for model points without surface observations, i.e. over the whole gridded model domain

Slide 27: Continuous scores: MSE (slide from Barbara Casati)
- The average of the squared errors: it measures the magnitude of the error but does not indicate its direction
- Quadratic rule, so large errors carry large weight:
  - good if you wish to penalize large errors
  - sensitive to large values (e.g. precipitation) and outliers; sensitive to large variance (high-resolution models); encourages conservative forecasts (e.g. climatology)
- Attribute: measures accuracy
- For an ensemble forecast, use the ensemble mean
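The formula itself did not survive the transcript; for forecasts f_i and observations o_i over n samples it is:

```latex
\mathrm{MSE} = \frac{1}{n} \sum_{i=1}^{n} (f_i - o_i)^2
```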

Slide 28: Brier score (slide from Barbara Casati)
[Figure: scatter-plot and contingency table; does the forecast correctly detect temperatures above 18 degrees?]
With y_i the forecast probability of the event, o_i the observed occurrence (0 or 1), and i indexing the n samples:
\mathrm{BS} = \frac{1}{n} \sum_{i=1}^{n} (y_i - o_i)^2
Note the similarity to MSE.
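A minimal numpy sketch of the score just defined (the example values are made up):

```python
import numpy as np

def brier_score(y_prob, o_event):
    """Brier score: MSE between forecast probabilities and 0/1 outcomes."""
    y_prob = np.asarray(y_prob, dtype=float)
    o_event = np.asarray(o_event, dtype=float)
    return np.mean((y_prob - o_event) ** 2)

# e.g. forecast probabilities that T exceeds 18 degrees, vs. what happened
print(brier_score([0.9, 0.2, 0.6], [1, 0, 0]))  # ~0.137
```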

Slide 29: Conditional distributions: conditional histogram and conditional box-plot (slide from Barbara Casati)

Slide 30: Scatter-plot and contingency table (slide from Barbara Casati)
Does the forecast correctly detect temperatures above 18 degrees? Does it correctly detect temperatures below 10 degrees?

Slide 31: Discrimination plot (slide from Matt Pocernic)
[Figure: forecast distributions for outcome = no (false alarms) and outcome = yes (hits), separated by a decision threshold]

Slide 32: Receiver Operating Characteristic (ROC) curve
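As a sketch, the ROC curve can be traced by sweeping a decision threshold over the forecast probabilities and computing hit rate against false-alarm rate at each threshold; the function name and threshold grid below are illustrative.

```python
import numpy as np

def roc_points(y_prob, o_event, thresholds=np.linspace(0, 1, 11)):
    """Hit rate vs. false-alarm rate at each decision threshold."""
    o_event = np.asarray(o_event, dtype=bool)
    y_prob = np.asarray(y_prob, dtype=float)
    pts = []
    for t in thresholds:
        yes = y_prob >= t
        hit_rate = (yes & o_event).sum() / max(o_event.sum(), 1)
        far = (yes & ~o_event).sum() / max((~o_event).sum(), 1)
        pts.append((far, hit_rate))
    return pts  # plotting FAR (x) vs. hit rate (y) traces the ROC curve
```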

