Slide 1
Probabilistic Forecasts of Near-Gale Force Winds in the Baltic, Applying ECMWF EPS and Other Methods
Pertti Nurmi, Juha Kilpinen, Annakaisa Sarkanen (Finnish Meteorological Institute)
ECMWF Forecast Products User Meeting, 15-17 June 2005
pertti.nurmi@fmi.fi

Slide 2: Introduction
A study with two frameworks:
i. Develop warning criteria / guidance methods to forecast the probability of near-gale force winds in the Baltic
- A joint Scandinavian research undertaking: e.g. Finland and Sweden issue near-gale and storm-force wind warnings for the same areas using different criteria => homogenise!
ii. Evaluation of ECMWF products
- Deterministic and probabilistic forecasts; two (maybe three) calibration methods
- Here, only ECMWF data is applied; later, HIRLAM too
- Here, 6 Finnish coastal stations; later, c. 15-20 stations from Sweden, Denmark and Norway
Goal: a common Scandinavian operational practice (?)

Slide 3: Data
- ECMWF MARS: u and v components at 10 m => wind speed at 10 m
- Forecast lead times: +12 h to +144 h
- Data retrieval: 0.5 x 0.5 degree resolution; Operational, Control and EPS data (interpolated to 0.5 x 0.5 degrees)
- Nearest grid point used
- Forecasts / observations valid at 00, 06, 12, 18 UTC
- Observations: 10-minute mean wind speed
- Data coverage: 1/10/04 - 30/4/05 (212 days)
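The data handling above can be sketched in a few lines. This is only an illustration: the function names are mine, not MARS API calls, and the 0.5 degree snapping simply mimics the "nearest grid point" matching described on the slide.

```python
import math

def wind_speed_10m(u10, v10):
    """Scalar 10 m wind speed from the u and v components: sqrt(u^2 + v^2)."""
    return math.hypot(u10, v10)

def nearest_grid_point(lat, lon, res=0.5):
    """Snap a station location to the nearest point of a res x res degree
    grid, as done for the station/forecast matching (a sketch)."""
    return round(lat / res) * res, round(lon / res) * res

# e.g. a station near Utö would be matched to the (60.0, 21.5) grid point
print(nearest_grid_point(59.89, 21.37))
```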

Slide 4: Potential observation problems
We may have problems:
- with the height of the instrumentation?
- with the observing-site surroundings and obstacles: the coast, nearby islands, barriers, installations?
- with low-level stability?

Slide 5: Observing stations (6 out of 39)
- 02_873 Hailuoto
- 02_910 Valassaaret
- 02_980 Nyhamn
- 02_979 Bogskär
- 02_981 Utö
- 02_987 Kalbådagrund

Slide 6: Heights of the instrumentation (in red, the 6 out of 39)
[Figure: bar chart of instrument heights for the 39 stations, 5-55 m]

Slide 7: 873 Hailuoto (46 / 8 m)

Slide 8: 910 Valassaaret (22 / 18 m)

Slide 9: 979 Bogskär (32 / 4 m)

Slide 10: 980 Nyhamn (25 / 8 m)

Slide 11: 981 Utö (31 / 9 m)

Slide 12: 987 Kalbådagrund (32 / 23 m)

Slide 13: Wind speed dependence: the logarithmic wind profile (979 Bogskär)
[Figure: wind profiles for unstable, neutral and stable stratification; 15.5 m/s measured at 32 m corresponds to about 14 m/s (the near-gale threshold) at 10 m]
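The height reduction in the figure can be written down for the neutral case. The open-sea roughness length z0 = 2e-4 m is an assumed typical value, not a number taken from the presentation; with it, the slide's 15.5 m/s at 32 m reduces to roughly the 14 m/s threshold at 10 m.

```python
import math

def reduce_wind_to_10m(speed, z_obs, z0=2e-4):
    """Neutral logarithmic wind profile: u(z) ~ ln(z / z0), so a wind speed
    observed at height z_obs scales to the standard 10 m level as
    u10 = u_obs * ln(10 / z0) / ln(z_obs / z0).
    z0 = 2e-4 m is an assumed open-sea roughness length."""
    return speed * math.log(10.0 / z0) / math.log(z_obs / z0)

# Bogskär example from the slide: 15.5 m/s measured at 32 m
print(round(reduce_wind_to_10m(15.5, 32.0), 1))  # ~14.0 m/s
```

Stable or unstable stratification bends the profile away from this neutral curve, which is exactly why the slide shows three separate profiles.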

Slide 14: Methods for producing probabilistic forecasts
1. EPS (51 members): probability of wind speed > 14 m/s
2. Kalman filtering: various approaches; no details given here
3. Deterministic forecasts, adjusted by an a-posteriori estimate of the observed error distribution of the dependent sample ("sample error" method)
- Error distribution of the original sample (212 cases), approximated with a Gaussian fit (mean, standard deviation)
- Yields a probability distribution of near-gale
- Gives an estimate of the upper limit of the probabilistic predictability
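Methods 1 and 3 can be sketched directly from their descriptions; the 14 m/s threshold is from the slides, while the function names and the exact form of the error model (observed = forecast - error, error Gaussian) are my assumptions.

```python
import math

THRESHOLD = 14.0  # m/s, the near-gale threshold used throughout

def eps_probability(members, thr=THRESHOLD):
    """Method 1: raw EPS probability = fraction of members above the threshold."""
    return sum(m > thr for m in members) / len(members)

def gaussian_error_probability(det_fc, err_mean, err_std, thr=THRESHOLD):
    """Method 3 sketch: model observed = forecast - error with
    error ~ N(err_mean, err_std) fitted to the dependent sample, so the
    implied observation is N(det_fc - err_mean, err_std) and
    P(obs > thr) follows from the Gaussian CDF."""
    z = (thr - (det_fc - err_mean)) / err_std
    return 1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

For example, a deterministic forecast sitting exactly on the threshold with an unbiased unit-variance error distribution gives a probability of 0.5.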

Slide 15: Methods for producing probabilistic forecasts (cont.)
4. Deterministic forecasts, adjusted with a Gaussian distribution fitted to the model-forecast stability (temperature forecasts at two adjacent model levels): probability distribution of near-gale ("stability" method)
- Scheme used at SMHI (H. Hultberg)
5. Neighbourhood method
- Both spatial and temporal neighbours; c. 25-75 members
- Applicable primarily for hi-res models?
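The neighbourhood idea (method 5) can be sketched for the spatial case: the probability at a grid point is the fraction of its deterministic neighbours exceeding the threshold. The radius and the grid layout here are illustrative assumptions; temporal neighbours could be pooled into the same count.

```python
def neighbourhood_probability(field, i, j, thr=14.0, radius=2):
    """Method 5 sketch: probability at (i, j) = fraction of deterministic
    grid-point values in a (2*radius+1)^2 spatial neighbourhood exceeding
    the threshold. A radius of 2 gives 25 members, matching the lower end
    of the c. 25-75 quoted on the slide."""
    count = total = 0
    for di in range(-radius, radius + 1):
        for dj in range(-radius, radius + 1):
            ii, jj = i + di, j + dj
            if 0 <= ii < len(field) and 0 <= jj < len(field[0]):
                total += 1
                count += field[ii][jj] > thr
    return count / total
```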

Slide 16: Calibration of EPS forecasts
- Traditionally, calibration of the EPS is done by re-labelling the probabilities using the information in the reliability diagram (a large sample of past forecasts and observations is needed)
- Here, Kalman filtering is used to calibrate the EPS mean (as well as the operational and control forecasts); each EPS member is then transformed with the same relationship (state vector)
  - This calibrates the mean of the distribution, and hopefully also the spread
- Kalman filtering is also used in the traditional way to correct the deterministic forecasts, and then to estimate the probabilities using the observed error distribution
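A minimal scalar Kalman filter illustrates the idea of tracking the forecast bias and then shifting every EPS member with the same estimate. This is a sketch under assumed noise variances (q, r), not the FMI scheme, whose state vector is richer.

```python
class BiasKalman:
    """Track the forecast bias as a random-walk state x with process
    variance q and observation-error variance r (both assumed values)."""
    def __init__(self, q=0.01, r=1.0):
        self.x, self.p = 0.0, 1.0   # bias estimate and its variance
        self.q, self.r = q, r

    def update(self, forecast, observed):
        self.p += self.q                   # predict step (random walk)
        k = self.p / (self.p + self.r)     # Kalman gain
        self.x += k * ((forecast - observed) - self.x)
        self.p *= (1.0 - k)
        return self.x

def calibrate_members(members, bias):
    """Shift every EPS member with the same bias estimate, as on the slide."""
    return [m - bias for m in members]
```

Because all 51 members get the same shift, the ensemble mean is calibrated while the spread is left untouched, which is why the slide only "hopefully" expects the spread to improve.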

Slide 17: Sample climatological characteristics
[Figure: observed wind-speed climatology at the six stations; instrument heights 46, 22, 32, 25, 31 and 32 m]

Slide 18: Sample climatological characteristics, ref. ECMWF
[Figure: station climatology compared with ECMWF; instrument heights 46, 22, 32, 25, 31 and 32 m]

Slide 19: Sample climatological characteristics, ref. ECMWF (cont.)
[Figure: station climatology compared with ECMWF; instrument heights 46, 22, 32, 25, 31 and 32 m]

Slide 20: Sample climatological characteristics, ref. ECMWF (cont.)
[Figure: station climatology compared with ECMWF]

Slide 21: Deterministic FCs: Bias and RMSE vs. forecast lead time (981 Utö)
[Figure: ME (bias), RMSE and ensemble spread as a function of forecast lead time]

Slide 22: Deterministic FCs: Bias and RMSE vs. forecast lead time (987 Kalbådagrund)
[Figure: ME (bias) and RMSE as a function of forecast lead time]

Slide 23: Probabilistic FCs: Brier skill vs. forecast lead time (987 Kalbådagrund)
Brier Score: BS = (1/n) Σ (p_i - o_i)^2 (range 0 to 1; perfect score = 0)
- A common accuracy measure of probability forecasts
- o_i is binary (0 or 1)
- Analogous to the MSE in probability space
- A quadratic scoring rule: sensitive to large forecast errors! Be careful with limited datasets!
- Influenced by the climatological frequency of the sample: different samples should not be compared
Brier Skill Score: BSS = [1 - BS / BS_ref] * 100 (range -∞ to 100; perfect score = 100)
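Both scores follow directly from the definitions above. Using the sample climatology as the reference forecast BS_ref is a common choice, but an assumption here (the slide does not say which reference was used).

```python
def brier_score(probs, outcomes):
    """BS = (1/n) * sum((p_i - o_i)^2); outcomes o_i are binary 0/1."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

def brier_skill_score(probs, outcomes):
    """BSS = (1 - BS / BS_ref) * 100, here with the sample climatological
    frequency as the constant reference forecast (an assumed choice)."""
    clim = sum(outcomes) / len(outcomes)
    bs_ref = brier_score([clim] * len(outcomes), outcomes)
    return (1.0 - brier_score(probs, outcomes) / bs_ref) * 100.0
```

Note how the quadratic form penalises confident misses: a single p = 1 forecast of a non-event contributes a full 1/n to the BS, which is why limited datasets need care.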

Slide 24: Probabilistic FCs: ROC (Relative Operating Characteristic)
- Determines the ability of a forecasting system to discriminate situations when a signal is present (here, the occurrence of near-gale) from no-signal cases (noise)
- Tests model performance relative to a specific threshold
- Applicable to probability forecasts and also to categorical deterministic forecasts, allowing their comparison
- Has gained popularity in forecast verification in recent years

Slide 25: Probabilistic FCs: ROC curve
- Graphical representation, in a unit square, of the Hit Rate H = a / (a + c) (y-axis) against the False Alarm Rate F = b / (b + d) (x-axis) for different potential decision thresholds
- The curve is plotted from a binned set of probability forecasts by stepping (or sliding) a decision threshold (e.g. 10% probability intervals) through the forecasts; each probability decision threshold generates a separate 2x2 contingency table
- The probability forecast is thereby transformed into a set of categorical yes/no forecasts, and the resulting (F, H) value pairs form the curve
- It is desirable that H be high and F be low, i.e. the closer a point is to the upper left-hand corner, the better the forecast
- A perfect forecast system, with only correct forecasts and no false alarms (regardless of the threshold chosen), has a curve that rises from (0,0) along the y-axis to the upper left-hand corner (F = 0, H = 1) and then runs straight to (1,1)
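The threshold-stepping procedure above can be sketched directly: each decision threshold yields one 2x2 contingency table (a = hits, b = false alarms, c = misses, d = correct rejections) and hence one (F, H) point.

```python
def roc_points(probs, outcomes, thresholds=None):
    """Step a decision threshold through probability forecasts; each
    threshold converts them into categorical yes/no forecasts and
    produces one (F, H) pair of the ROC curve."""
    if thresholds is None:
        thresholds = [t / 10 for t in range(11)]  # 10% probability intervals
    points = []
    for thr in thresholds:
        a = sum(1 for p, o in zip(probs, outcomes) if p >= thr and o)
        b = sum(1 for p, o in zip(probs, outcomes) if p >= thr and not o)
        c = sum(1 for p, o in zip(probs, outcomes) if p < thr and o)
        d = sum(1 for p, o in zip(probs, outcomes) if p < thr and not o)
        h = a / (a + c) if a + c else 0.0   # Hit Rate
        f = b / (b + d) if b + d else 0.0   # False Alarm Rate
        points.append((f, h))
    return points
```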

Slide 26: Probabilistic FCs: ROC curve generation
[Figure: worked example of the binning, with a + c = 1920 observed events and b + d = 5351 non-events; H = a / (a + c), F = b / (b + d)]
To learn more about ROC and Signal Detection Theory, see: http://wise.cgu.edu/

Slide 27: Probabilistic FCs: ROC area (ROC_A)
- Area under the ROC curve (range 0 to 1; perfect system = 1); decreases from 1 as the curve moves down from the ideal top-left corner
- A useless forecast system lies along the diagonal, where H = F and the area is 0.5; such a system cannot discriminate between occurrences and non-occurrences of the event
- ROC_A-based skill score: ROC_SS = 2 * ROC_A - 1 (range -1 to 1; perfect score = 1)
  - Negative below the diagonal; at its minimum, ROC_SS = -1 when ROC_A = 0
- ROC is applicable to deterministic categorical forecasts
  - ROC_SS then translates into the KSS / TSS (= H - F)
  - With only one single decision threshold, only a single ROC point results; typically this lies inside the ROC area, i.e. indicating worse quality
- ROC, ROC_A and ROC_SS are directly related to a decision-theoretic approach
  - Can be related to the economic value of probability forecasts to end users, allowing for the assessment of the costs of false alarms
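The area and its skill score can be computed from the (F, H) points by trapezoidal integration, a standard approximation (the slides do not state how ROC_A was integrated):

```python
def roc_area(points):
    """Trapezoidal area under a ROC curve of (F, H) points; the (0,0) and
    (1,1) endpoints are appended so the integral spans the full square."""
    pts = sorted(points + [(0.0, 0.0), (1.0, 1.0)])
    return sum((f2 - f1) * (h1 + h2) / 2.0
               for (f1, h1), (f2, h2) in zip(pts, pts[1:]))

def roc_skill_score(area):
    """ROC_SS = 2 * ROC_A - 1: 0 on the no-skill diagonal (ROC_A = 0.5),
    1 for a perfect system, -1 at ROC_A = 0."""
    return 2.0 * area - 1.0
```

With no interior points the curve is the diagonal (area 0.5, skill 0); a single point at the top-left corner (0, 1) gives area 1 and skill 1.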

Slide 28: Probabilistic FCs: ROC curve/area at T+48 h (987 Kalbådagrund)
[Figure: ROC curves for the raw EPS (ROC_A = 0.73) and the Kalman-calibrated EPS (ROC_A = 0.85)]

Slide 29: Probabilistic FCs: ROC curve/area at T+24 h (981 Utö)
[Figure: ROC curves for the stability and neighbourhood methods (ROC_A = 0.96 and 0.88)]

Slide 30: Probabilistic FCs: ROC area vs. forecast lead time
[Figure: EPS ROC area at 873 Hailuoto and 910 Valassaaret]

Slide 31: Probabilistic FCs: ROC area vs. forecast lead time (cont.)
[Figure: ROC area at 981 Utö and 987 Kalbådagrund]

Slide 32: Conclusions / Future
- So far we've just scratched the (sea) surface: much more experimentation with the various methods is needed
- Different methods for different time/space scales, e.g. very short range vs. medium range?
- Biases and other scores depend on the station (e.g. on the observation height); is (statistical) adjustment of the original observations required? Finland has an operational scheme for this!
- EPS forecasts are slightly underdispersive
- Kalman filtering reduces the biases and produces better probability forecasts for most stations in terms of the ROC curve/area
- Future: apply to data from the other counterparts; reach the goal…!!!

Slide 33: Almost finnish(ed), but one advertisement…
Forecast Quality Project 2005
The Royal Meteorological Society, at the behest of the UK weather-forecasting industry and their customers, has undertaken a project to establish methodologies and metrics by which the quality of weather forecast services can be assessed from a user perspective, on a basis that is clear, scientifically well founded, relevant to the users' needs, and easily applied and understood.
UK forecast user and provider input is NOW needed! www.rmets.org/survey
