
1 NWS TAF Verification
Brandi Richardson, NWS Shreveport, LA

2 Do we care how our forecasts verify? NO!

3 Do we care how our forecasts verify? Yes!
The NWS measures verification by many means:
–Probability of Detection (POD)
–False Alarm Ratio (FAR)
–Critical Success Index (CSI)
–Percent Improvement
Set goals for verification
Local offices add own flavor
Total IFR (IFR, LIFR, VLIFR)

4 Why is verification important?
Need to know what to improve
–Lose credibility if too many forecasts are wrong
  Lose customers
  Lose jobs
–Additional training
–New techniques
–Improved model guidance
Need to know what we are doing well

5 NWS TAF Verification
TAFs are evaluated 12 times per hour (every five minutes), or 288 times per 24-hour period
TAFs are compared to ASOS five-minute observations
–ASOS = Automated Surface Observing System, located at TAF airports
Stats are calculated by flight category
–i.e., VFR, MVFR, IFR, LIFR, VLIFR
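To make the category comparison concrete, here is a minimal sketch of how a ceiling/visibility pair (from a five-minute ASOS observation, or from the TAF valid at that time) could be bucketed into a flight category. The helper name is hypothetical and the thresholds, particularly the VLIFR breakpoint, are assumptions for illustration; the presentation does not define them.

    def flight_category(ceiling_ft, visibility_mi):
        """Bucket a ceiling (ft AGL) and visibility (statute miles) into a
        flight category. Thresholds are illustrative assumptions; the VLIFR
        breakpoint in particular is not defined in the presentation."""
        if ceiling_ft is None:
            ceiling_ft = float("inf")  # no ceiling reported: treat as unlimited
        if ceiling_ft < 200 or visibility_mi < 0.5:
            return "VLIFR"
        if ceiling_ft < 500 or visibility_mi < 1:
            return "LIFR"
        if ceiling_ft < 1000 or visibility_mi < 3:
            return "IFR"
        if ceiling_ft <= 3000 or visibility_mi <= 5:
            return "MVFR"
        return "VFR"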

6 Probability of Detection
How often did we correctly forecast a particular flight category to occur?
–Also known as “Accuracy”
POD = V/(V+M)
–V = forecasted and verified events
  Ex: IFR conditions forecasted…IFR conditions occurred
–M = missed events
  Ex: VFR conditions forecasted…IFR conditions occurred
Ranges from 0 to 1, with 1 being perfect
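As a minimal sketch (not NWS verification software), POD for one category can be computed from a list of (forecast category, observed category) pairs, one per five-minute evaluation time. The function name and inputs are hypothetical.

    def pod(pairs, category):
        """POD = V / (V + M) for one flight category.
        `pairs` is a list of (forecast_category, observed_category) tuples,
        one per five-minute evaluation time."""
        V = sum(1 for fcst, obs in pairs if fcst == category and obs == category)  # hits
        M = sum(1 for fcst, obs in pairs if fcst != category and obs == category)  # misses
        return V / (V + M) if (V + M) else None  # undefined if the category never occurred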

7 False Alarm Ratio
How often did we forecast a particular flight category to occur that did not occur?
–i.e., how often did we “cry wolf”?
FAR = U/(U+V)
–U = forecasted and unverified events
  Ex: IFR conditions forecasted…VFR conditions occurred
–V = forecasted and verified events
  Ex: IFR conditions forecasted…IFR conditions occurred
Ranges from 0 to 1, with 0 being perfect
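Continuing the same hypothetical sketch, FAR counts forecasts of the category that did not verify:

    def far(pairs, category):
        """FAR = U / (U + V) for one flight category."""
        V = sum(1 for fcst, obs in pairs if fcst == category and obs == category)  # verified forecasts
        U = sum(1 for fcst, obs in pairs if fcst == category and obs != category)  # false alarms
        return U / (U + V) if (U + V) else None  # undefined if the category was never forecast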

8 Critical Success Index
CSI = V/(V+M+U)
–V = forecasted and verified events
  Ex: IFR conditions forecasted…IFR conditions occurred
–M = missed events
  Ex: VFR conditions forecasted…IFR conditions occurred
–U = forecasted and unverified events
  Ex: IFR conditions forecasted…VFR conditions occurred
Ranges from 0 to 1, with 1 being perfect
Incorporates both POD and FAR
Overall score of performance
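In the same hypothetical sketch, CSI combines hits, misses, and false alarms into a single score:

    def csi(pairs, category):
        """CSI = V / (V + M + U): hits penalized by both misses and false alarms."""
        V = sum(1 for fcst, obs in pairs if fcst == category and obs == category)  # hits
        M = sum(1 for fcst, obs in pairs if fcst != category and obs == category)  # misses
        U = sum(1 for fcst, obs in pairs if fcst == category and obs != category)  # false alarms
        return V / (V + M + U) if (V + M + U) else None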

9 Percent Improvement
Forecaster CSI vs. Model Guidance CSI
–Did we beat the model?
(Slide cartoon: guidance insists “IFR will prevail…”; the forecaster replies “IFR?! It’s July and dew points are in the 20s! Take that!”)
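The slide does not spell out the percent-improvement formula; one common convention, assumed here for illustration, is the relative change of the forecaster's CSI over the guidance CSI:

    def percent_improvement(forecaster_csi, guidance_csi):
        """Assumed definition: 100 * (forecaster CSI - guidance CSI) / guidance CSI.
        Positive values mean the forecaster beat the model guidance."""
        if guidance_csi == 0:
            return None  # undefined when guidance never scores
        return 100.0 * (forecaster_csi - guidance_csi) / guidance_csi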

10 2009 NWS Goals
The NWS has set goals for TAF forecasts
–For total IFR (includes IFR, LIFR, and VLIFR):
  POD ≥ 0.640 (64%)
  FAR ≤ 0.430 (43%)
How do we measure up?...
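A quick sketch of checking a station's total-IFR statistics against these goals; the two stat values below are placeholders, not real verification numbers:

    POD_GOAL = 0.640  # goal: POD >= 0.640 for total IFR
    FAR_GOAL = 0.430  # goal: FAR <= 0.430 for total IFR

    pod_total_ifr = 0.70  # placeholder value, not an actual statistic
    far_total_ifr = 0.40  # placeholder value, not an actual statistic

    meets_goals = pod_total_ifr >= POD_GOAL and far_total_ifr <= FAR_GOAL
    print(f"Meets 2009 total-IFR goals: {meets_goals}")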

11 Examples of Local TAF Verification

12 Examples of Local TAF Verification

13 Examples of Local TAF Verification

14 The Bottom Line
Sometimes we do get the forecast wrong. Examining TAF verification statistics helps us find our weaknesses and identify ways to improve our forecasts. The NWS strives to provide quality products and services to our aviation customers and partners.

