Update on SVS-LRF Lead Centre http://www.bom.gov.au/wmo/lrfvs/index.html Dr David Jones (1) & Bertrand Denis (2). (1) Australian Bureau of Meteorology; (2) Environment Canada. CBS Expert Team on Extended and Long Range Forecasting, Geneva, 26-30 March 2012
Role of the Lead Centre http://www.bom.gov.au/wmo/lrfvs The role of the Lead Centre is to facilitate the exchange of seasonal and longer-range forecast verification results, as specified in the Standardised Verification System (SVS) for Long Range Forecasts (LRF) defined in the WMO Manual on the Global Data-Processing System (new Attachment II.8). The Lead Centre will ensure that clear and concise information explaining the verification scores, graphics and data is available and kept up to date on the SVSLRF website. Links to the participating Global Producing Centres will also be listed on the site. The Lead Centre will provide and maintain software to calculate the verification scores, as well as recommended datasets for use in assessing the forecasts.
Lead Centre - Web Site The Lead Centre web site is fully functional. It contains a Disclaimer, Documentation, a Users Guide and Verification Maps. As experts and potential users, we value your feedback. http://www.bom.gov.au/wmo/lrfvs Link to Lead Centre for MME site
Lead Centre – Data Presentation http://www.bom.gov.au/wmo/lrfvs A series of Korn shell scripts, using GrADS (the COLA Grid Analysis and Display System), is used for displaying data. GrADS was chosen for its portability and ease of use, and because it is free and publicly available: http://grads.iges.org/grads/grads.html This means ease of relocation if the Lead Centre responsibilities changed hands.
Lead Centre – Files submitted http://www.bom.gov.au/wmo/lrfvs
Examples of submitted data http://www.bom.gov.au/cgi-bin/climate/wmo.cgi
Number of Web Hits

Page                            Jun-11  Jul-11  Aug-11  Sep-11  Oct-11  Nov-11  Dec-11  Jan-12  Total for 8 months
Main page                           63      83      74      83     133     132      96      77       741
Users guide                         56      21      53       -      82      52      24      39       380
Datasets                            24      15      18      40      43      42      20      16       218
Documentation                        4       4      10      11      22      30      14      19       114
roc info                            27      44      49      38      40      54      44      40       336
reliability info                    65      50      72      65      98      85      37      35       507
gpc info                             8       8       4      13      10      12      14       9        78
Scores                               8      11      13      15      30      35      21      31       164
Attachment II-8                     63      43      32       5       -      33      32      20       208
msss info                           13      21      32      23      44      49      31      28       241
maps: /cgi-bin/climate/wmo.cgi     196     234     282     434     724     328     371     648      3217

Measuring Success?
Measuring Success? (last ET meeting) A spike coincided with the announcement of the Exeter meeting. Otherwise a generally low level of use – expected or unexpected?
Some Points to Consider
- The original recommended hindcast period was 1981-2001, but every centre seems to verify over a different period. Move to 1981-2010 as a priority.
- Some models correct for bias and variance (i.e. MSSS2 and MSSS3 are zero everywhere), which gives an inflated overall MSSS score compared with models that do not bias correct.
- Do we move to dynamical mapping (see http://poama.bom.gov.au/experimental/pasap/), noting that the current pages look a bit tired? Consider the consequences for download speed.
- ENSO stratification is not implemented yet.
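The inflation effect noted above can be seen in the standard Murphy-style decomposition of the mean-squared-error skill score against climatology, where MSSS splits into a correlation (phase) term minus a conditional-bias penalty and an overall-bias penalty; bias and variance correction drive the two penalty terms to zero, leaving only the phase term. A minimal sketch (the function name is illustrative, and the mapping of the two penalty terms onto the SVS "MSSS2"/"MSSS3" labels is an assumption):

```python
import numpy as np

def msss_decomposition(f, x):
    """Decompose the MSE skill score of forecasts f against observations x,
    verified against climatology:
        MSSS = r^2 - (r - s_f/s_x)^2 - ((fbar - xbar)/s_x)^2
    Returns (msss, phase, conditional_bias, overall_bias)."""
    f = np.asarray(f, dtype=float)
    x = np.asarray(x, dtype=float)
    fbar, xbar = f.mean(), x.mean()
    sf, sx = f.std(), x.std()            # population standard deviations
    r = np.corrcoef(f, x)[0, 1]
    phase = r**2                          # correlation (phase) term
    cond_bias = (r - sf / sx)**2          # conditional-bias (amplitude) penalty
    bias = ((fbar - xbar) / sx)**2        # overall-bias penalty
    return phase - cond_bias - bias, phase, cond_bias, bias
```

A model that removes its mean bias and rescales its variance zeroes `bias` and `cond_bias` by construction, so its MSSS equals r² and looks better than an uncorrected model with identical correlation.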
Some Points to Consider
- Some GPC skill maps appear to have errors; GPCs need to check periodically.
- Ability to assess forecast skill in addition to hindcast skill? Putting forecasts alongside verification.
- How to verify multi-model ensembles.
- Role of the Lead Centre for extended-range forecasts.
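One generic route to verifying a multi-model ensemble is to pool the members, issue event probabilities as the fraction of members forecasting the event, and score them with the hit-rate/false-alarm-rate sweep underlying the ROC score already used on the SVSLRF site. A sketch under that assumption (function names are illustrative; the SVS applies ROC to tercile categories, whereas this shows only the generic computation):

```python
import numpy as np

def hit_and_false_alarm_rates(probs, occurred, threshold):
    """Hit rate and false-alarm rate when an event is forecast 'yes'
    whenever the issued probability (e.g. the fraction of pooled
    multi-model ensemble members predicting the event) meets the
    threshold."""
    probs = np.asarray(probs, dtype=float)
    occurred = np.asarray(occurred, dtype=bool)
    yes = probs >= threshold
    hit_rate = yes[occurred].mean()            # hits / observed events
    false_alarm_rate = yes[~occurred].mean()   # false alarms / non-events
    return hit_rate, false_alarm_rate

def roc_area(probs, occurred, n_thresholds=11):
    """Area under the ROC curve: sweep the probability threshold from
    high to low, collect (false-alarm, hit) points, and integrate with
    the trapezoidal rule. 1.0 = perfect discrimination, 0.5 = no skill."""
    thresholds = np.linspace(1.0, 0.0, n_thresholds)
    h, f = zip(*(hit_and_false_alarm_rates(probs, occurred, t)
                 for t in thresholds))
    h = np.concatenate(([0.0], h, [1.0]))      # anchor the curve at (0, 0)
    f = np.concatenate(([0.0], f, [1.0]))      # ... and at (1, 1)
    return float(np.sum((f[1:] - f[:-1]) * (h[1:] + h[:-1]) / 2.0))
```

Pooling members weights every model equally; whether to pool, to average per-model probabilities, or to weight models by hindcast skill is exactly the open question on the slide.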