
1 WP 17 Product Quality
MyOcean2 First Annual Meeting – Cork, 16-17 April 2013

2 Work Package Main Objectives
Ensure that:
• the accuracy of MyOcean products is adequately monitored
• changes to the MyOcean system are justified from a product quality point of view
• information on product quality is scientifically sound and consistent, is useful, and is communicated effectively
• product quality assessment takes into account user requirements and feedback

3 Background: Validation categories and who does what
All categories are governed by guidelines developed in WP17.2; operational monitoring and offline validation interact where needed.
• Calibration / pre-upgrade tests – PC x.5/6, WP18
• Routine validation (1-10 days behind production) – PC x.2
• Offline validation with new metrics (1-3 months behind production) – WP17.4
• QuID – static document for each product containing all quality information; updated when the product changes
• Quarterly validation report – single document covering all real-time products, with results for each quarter
• User validation (by users, for themselves) – coordinated by WP3

4 Partnership

5 Achievements wrt workplan
Template guidance for the following slides:
• identify the main subjects in your work plan
• for each subject: what was planned and what has been achieved in the first period, the contributions, main results and illustrations, difficulties encountered and improvements identified
Planning notes:
• Guidelines – published (topics), challenges
• QuARG – rationalised documentation, V3 QuIDs (status/numbers), challenges (timeline)
• New metrics – summarise results from the 3 strands (work in early stages); what to use for categorical statistics?! Same old slide?
• Reports – show some snapshots; defining common formats was a challenge, but has been quite successful

6 Achievements with respect to WP17 workplan

7 Main achievements in year 1
• CalVal guidelines published (WP17.2)
• Quality Assurance Review Group initiated and performed its first review – V3 QuIDs more complete and more consistent (WP17.3)
• Initial results from development of new metrics (WP17.4)
• Prototype quarterly validation reports produced (WP17.5)

8 Achievements: CalVal guidelines

9 Achievements: CalVal guidelines – planned process
[Diagram: WP17.2 sub-teams draft proposals; all PC representatives in WP17.2 turn them into guidelines, published once per year; PC liaisons carry the guidelines back to their PCs.]

10 Achievements: CalVal guidelines
In April we chose topics to focus on for phase 1 (we kept the scope limited to make it achievable in a short period).
In December phase 1 of the guidelines was published, containing the 3 topics which were mature enough:
o Conventions for Estimated Accuracy Numbers
o Guidelines for sharing QC flags between TACs and MFCs
o Conventions for providing statistics for quarterly validation reports
PCs are requested to adopt the guidelines by June 2013.

11 CalVal guidelines: challenges
• It was difficult to keep the working-group discussions moving between face-to-face meetings (April 2012, November 2012, April 2013)
• For the next phase, we propose scheduling regular teleconferences to maintain momentum

12 CalVal guidelines: next steps
We have agreed to work on the following topics for phase 2 of the guidelines:
• TAC-MFC links: putting more concrete processes in place for QC feedback
• Defining a framework for process-focussed model-observation intercomparisons
• Defining the quality thresholds which should be used to trigger user alerts when quality is degraded
• EAN: agreeing a long-term strategy for BGC metrics (linking up with the GODAE OceanView MEP task team where appropriate)
• EAN: seeking common metrics for sea-ice products in the short to medium term

13 Achievements: improvements in review process for PQ

14 The Quality Assurance Review Group (QuARG) was formed: Fabrice Hernandez (chair), Henning Wehde, Laurent Bertino, Gilles Larnicol, Alistair Sellar
• Following discussions with the PMO, the PQ documentation requirements for PCs were (slightly) simplified
• A brief summary of the review process was written to give PCs clear guidance on what is required of them

15 QuID status at V3
• QuARG reviewed all QuIDs submitted for V3 to increase consistency and readability – each document was reviewed by at least 2 members
• For V3 there are QuIDs covering 87 products – a big effort by many people
• The remaining 20 should be ready soon – by V3.1?

16 PQ review process: challenges
Timescales to review and correct QuIDs were very tight, and many QuIDs were submitted late. Reviewing the final changes therefore required a big effort from the QuARG chair in the final weeks of V3 integration.

17 Achievements: metrics development

18 Three partners (MUMM, UK Met Office, Mercator-Ocean) are taking novel validation techniques from numerical weather prediction (NWP) and adapting them for ocean products. Initial results follow...

19 Accuracy of site-specific forecasts (UK Met Office)
Comparing single-site timeseries observations with gridded products, focusing on critical thresholds and categorical statistics (a sketch of the basic scores follows this slide):
– hit/miss rates and derivatives (CSI, ETS, REV, …)
– use time/space windows on the gridded product to understand the best method of generating site-specific forecasts
– user-focused metrics such as "good weather windows": e.g. periods with currents below a given threshold
Anticipated information for users:
– "The hit rate for site-specific forecasts is..."
– "The best method to produce site-specific forecasts is to take the mean/max of X neighbouring points in space/time."
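
As an illustration of the categorical scores named above, here is a minimal sketch of POD, FAR and CSI computed from a contingency table; the function and its interface are ours for illustration, not the Met Office code.

```python
import numpy as np

def categorical_scores(model, obs, threshold):
    """Contingency-table scores for exceedance of a threshold.
    POD = hits / (hits + misses)                  (probability of detection)
    FAR = false_alarms / (hits + false_alarms)    (false alarm ratio)
    CSI = hits / (hits + misses + false_alarms)   (critical success index)
    """
    m = np.asarray(model) > threshold   # event forecast by the model
    o = np.asarray(obs) > threshold     # event actually observed
    hits = np.sum(m & o)
    misses = np.sum(~m & o)
    false_alarms = np.sum(m & ~o)
    pod = hits / (hits + misses) if hits + misses else float("nan")
    far = false_alarms / (hits + false_alarms) if hits + false_alarms else float("nan")
    csi = hits / (hits + misses + false_alarms) if hits + misses + false_alarms else float("nan")
    return pod, far, csi

# Example: scores for wind speeds above 10 m/s, as on the next two slides.
# pod, far, csi = categorical_scores(model_series, obs_series, 10.0)
```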

20 Accuracy of site-specific forecasts (UK Met Office)
Applied to hourly data, the method gives poor results.
[Figure: POD, FAR and CSI for hourly model vs hourly observations; Met Office marine verification scores for winds >10.0 m s⁻¹ (mid-strong wind speed).]

21 Accuracy of site-specific forecasts (UK Met Office)
Applied to weekly averaged data, the method gives better results.
[Figure: POD, FAR and CSI for weekly averaged model vs weekly averaged observations.]

22 Metrics development – biogeochemistry (MUMM)
• Checking different terminologies and calculation methods
• Fine-tuning the calculation method (time windows)
[Figure: NWS chlorophyll comparisons – satellite 26 July, model 26 July, model 31 July.]

23 Metrics development – biogeochemistry (MUMM)
Also investigating:
• a spatial neighborhood method (see the sketch below)
• added research areas:
– Mediterranean Sea
– Atlantic Ocean
– …
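
The slide does not name the specific neighborhood method, so as a hedged illustration here is one widely used option, the fractions skill score (FSS), which compares the fraction of above-threshold grid points within a spatial window rather than demanding point-by-point matches.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def fractions_skill_score(model, obs, threshold, window):
    """FSS: 1 is a perfect match of event fractions, 0 is no skill.
    `model` and `obs` are 2-D fields; `window` is the box size in cells."""
    fm = uniform_filter((np.asarray(model) > threshold).astype(float), size=window)
    fo = uniform_filter((np.asarray(obs) > threshold).astype(float), size=window)
    mse = np.mean((fm - fo) ** 2)
    mse_ref = np.mean(fm ** 2) + np.mean(fo ** 2)  # worst-case reference
    return 1.0 - mse / mse_ref if mse_ref > 0 else float("nan")

# e.g. fractions_skill_score(model_chl, satellite_chl, threshold=5.0, window=5)
# scores a chlorophyll event within 5x5-cell neighborhoods (illustrative values).
```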

24 Triple collocation to determine relative error ranges (Mercator-Ocean)
• The TC method is used to compare data sets of the same geophysical field (see the sketch after this slide): wind (Stoffelen 1998), waves (Caires et al. 2003), SST (O'Carroll et al. 2008), soil moisture (Zwieback et al. 2012).
• TC gives an estimate of the error associated with each data set and allows the data sets to be calibrated. The errors are generally assumed to be uncorrelated with each other and with the geophysical field.
• Application: one year of SST data (Aug 2011-Sep 2012) in the Bay of Biscay region. Data sets: multi-sensor L3 observations (taken as reference) and the IBI and NWS forecasts (1-D averaged). The calculation is performed for grid points with at least 200 valid observations. Several error models were tested; the illustration uses that of O'Carroll.
[Figure: variance of errors for the L3 observations, the IBI MFC and the NWS MFC.]
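
For readers unfamiliar with the technique, here is a minimal covariance-based triple-collocation sketch under the uncorrelated-error assumption stated above; calibration scaling between the data sets is omitted for brevity.

```python
import numpy as np

def triple_collocation(x, y, z):
    """Estimate the error variance of each of three collocated series
    (x, y, z) of the same geophysical field, assuming their errors are
    uncorrelated with each other and with the true field."""
    C = np.cov(np.vstack([x, y, z]))  # 3x3 sample covariance matrix
    var_ex = C[0, 0] - C[0, 1] * C[0, 2] / C[1, 2]
    var_ey = C[1, 1] - C[0, 1] * C[1, 2] / C[0, 2]
    var_ez = C[2, 2] - C[0, 2] * C[1, 2] / C[0, 1]
    return var_ex, var_ey, var_ez

# e.g. x = L3 SST observations, y = IBI forecast, z = NWS forecast at one
# grid point with at least 200 valid collocated values, as on the slide.
```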

25 Achievements: quarterly validation reports

26 We will produce online reports displaying validation statistics for the real-time MyOcean products, updated quarterly. These will be published as a set of web pages to make it easy to browse the results.

27 Quarterly reports: planned process
[Diagram: operations (PC x.2/3) and the new metrics (WP17.4) supply quarterly statistics (netCDF) to UKMO (WP17.5), which produces the plots; PC representatives (WP17.5) add the interpretation (style must be consistent) to give the completed report.]

28 Quarterly reports: status and challenges
• We have a prototype containing results from most PCs
• Currently working on defining conventions for the summary text to ensure consistency
• Challenges:
– Defining a data format which accommodates results from diverse PCs – but we seem to have managed it (a hypothetical example follows this slide)
– Defining conventions for the summary text will be harder
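
To make the data-format idea concrete, here is a sketch of what one such statistics file could look like in netCDF; every dimension, variable and attribute name below is hypothetical, not the agreed WP17 convention.

```python
import numpy as np
from netCDF4 import Dataset

with Dataset("quarterly_stats_example.nc", "w") as nc:
    nc.createDimension("time", None)     # one entry per validation date
    nc.createDimension("area", 4)        # e.g. full domain, on/off shelf, ...
    nc.createDimension("forecast", 5)    # forecast lead times
    time = nc.createVariable("time", "f8", ("time",))
    time.units = "days since 2013-01-01"
    rmse = nc.createVariable("rmse_sst", "f4", ("time", "area", "forecast"))
    rmse.units = "K"
    rmse.long_name = "RMS error of SST against observations"
    time[:] = np.arange(0, 91, 7)                # weekly dates in a quarter
    rmse[:] = np.zeros((13, 4, 5), dtype="f4")   # placeholder values
```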

29 Prototype pages
[Screenshot: navigation from production centres, to products, to variables or datasets, with an overview of the results for each product.]

30 Prototype pages
[Screenshot: RMS errors (RMSE, K) for each area and forecast lead time – full domain, on shelf, off shelf, Norwegian Trench.]

31 Prototype pages
[Screenshot: selecting an area gives timeseries statistics for that area.]

32 Achievements in Production Centres

33 Validation highlights from production centres
Some of these were presented at the MyOcean Science Days, which had a session devoted to validation. Thanks to all contributors.

34 Assessing the Armor global observational analysis using velocity computation
• Improvement of the T/S field resolution: assessment of the density-gradient shear.
• Absolute zonal geostrophic velocity (Mulet et al., 2012) = zonal velocity relative to the surface + Argo drift at 1000 m from ANDRO (Ollitrault and Rannou, 2010).
• Comparison with independent data in the Gulf Stream area (section at 60°W): the new field shows higher shear and a better estimate than the old one.
[Figure: depth-latitude sections of zonal velocity and u(z = 0) against latitude; standard deviations of 6 cm s⁻¹, 7 cm s⁻¹ and … cm s⁻¹ are quoted.]

35 Armor global observational analysis
• Comparison with the T/S in-situ TAC
• Statistics over 3 months (Jan-Feb-Mar 2013): RMS and mean difference from in-situ observations, and the number of profiles (see the sketch below)
[Figure: temperature (°C) and salinity (psu) statistics.]
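
A minimal sketch of the statistics quoted above, assuming the analysis values have already been interpolated to the observed profile positions; the function name and array layout are ours for illustration.

```python
import numpy as np

def profile_stats(analysis_at_obs, obs):
    """Per-depth mean and RMS (analysis - observation) difference.
    Both inputs have shape (n_profiles, n_depths), with NaN where missing."""
    diff = analysis_at_obs - obs
    mean_diff = np.nanmean(diff, axis=0)             # bias at each depth
    rms_diff = np.sqrt(np.nanmean(diff ** 2, axis=0))
    n_profiles = np.sum(~np.isnan(diff), axis=0)     # sample size per depth
    return mean_diff, rms_diff, n_profiles
```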

36 ARC-MFC: Validation of MIZ sea ice concentration
[Figures: observed vs modelled SIC (low to high on both axes) near the sea ice minimum and near the sea ice maximum; character size codes: 1-3%, 4-8%, 9-14%, ≥15%.]
We find:
1. Somewhat low SIC values in the model near the sea ice minimum
2. Modelling sea ice categories exactly is not easy! (low values on the diagonal)
ARC-MFC validation results are available from …

37 Baltic MFC: sea level validation
• Timeseries of sea level data at 56 tide gauges, hourly data
• Satellite-borne altimetry has severe limitations in semi-enclosed seas due to limited accuracy and spatial resolution

38 BAL MFC: ice metrics experiments
• Ice edge RMS distance is highly dependent on the reference data chosen (model or obs) – see the sketch below
• The minimum ice edge length needs to be tuned to suppress false alarms
• The score can be misleading at the start/end of the ice period
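
To show why the score depends on the choice of reference, here is a hedged sketch of one simple ice-edge RMS distance, assuming the edges have been extracted as point sets (e.g. from a concentration contour) in projected coordinates; the exact BAL MFC definition may differ.

```python
import numpy as np

def ice_edge_rms_distance(edge_ref, edge_other):
    """RMS, over points of the reference edge, of the distance to the
    nearest point of the other edge. Inputs are (n, 2) arrays (e.g. km).
    Swapping the arguments changes the score, which is the reference
    dependence noted on the slide."""
    d2 = ((edge_ref[:, None, :] - edge_other[None, :, :]) ** 2).sum(axis=2)
    nearest = np.sqrt(d2.min(axis=1))  # distance from each reference point
    return np.sqrt(np.mean(nearest ** 2))

# ice_edge_rms_distance(model_edge, obs_edge) generally differs from
# ice_edge_rms_distance(obs_edge, model_edge).
```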

39 IBI validation improvements
• Continuous development of NARVAL. Working lines:
1. Include new observation sources: SMOS, Argo (in D-M: automatic generation of nc files for WP17)
[Figure: RMSE and bias maps, February 2012.]

40 MED-MFC biogeochemistry, Oct 2012-Mar 2013
• Timeseries of product quality statistics and EAN
• Product quality is estimated over sub-regions; two examples are reported: North-Western Mediterranean (NWM) and Levantine (LEV)
• Known sources of error in the forecasts:
– Western regions: patchy bloom events and high spatial variability, which are not completely resolved
– Satellite coverage decreases in winter, reducing the statistical comparison
– Eastern regions: oligotrophic conditions at the surface are not fully captured by satellite observations or models

41 Black Sea MFC quality monitoring for sea level and currents
Root mean square deviation and correlation coefficient between modelled and satellite sea levels, January-September 2012.
[Table: number of observations, RMSD and correlation coefficient for All Sat, Envisat-N, Jason-1N, Jason, CryoSat and Jason-1G.]
[Figures: example quality control of the modelled sea level and of the modelled current velocities; RMSD histogram (frequency vs RMSD, cm) for all satellites and tracks; standard deviations between modelled and observed sea level per track and cycle for Jason-2.]

42 Validation highlights TAC-SL
Degrees of Freedom of Signal: evolution of the mean contribution of the different altimeters to the merged product.
• Jason-1 unavailable 1 to 15 March
• Jason-2 reference mission unavailable for 10 days just after the Jason-1 recovery
Any change in the constellation can impact the quality of the products (improved/degraded sampling).

43 InSitu validation: Key Performance Indicators
• KPI 1: Data availability
• KPI 2: Input data coverage
• KPI 3: Metadata quality
• KPI 4: Output data quality
[Figures: results for the Arctic, Global and Mediterranean regions.]

44 First period summary and next steps
The outcomes promised in the plan:
• CalVal guidelines
– Phase 1 published; phase 2 will be published in Autumn 2013
• Operation of QuARG (Quality Assurance Review Group)
– Completed reviews for V3; QuIDs very complete; lessons learned...
• Metrics development
– Initial results are promising; investigations continue
• Quarterly validation reporting
– Prototype developed; expect publication in July
• Establishing links and processes (service desk, users, reanalysis WP, between PCs)
– Some progress via the guidelines; need to do more next year (top priority)

45 Thank you

46 Proposed links – production mode
[Diagram: real-time operations (PC x.2/3, WP18) provide the results of routine validation for the last quarter, and WP17.4 provides the results from the new metrics for the last quarter, to the WP17.5 quarterly report; the service desk contributes quality incidents and user queries (a list of recent quality incidents?); the report is disseminated via the web portal to users, PCs and others.]

47 Proposed links – offline mode
[Diagram: PC representatives are part of WP17 for all of these activities, and PC reps in WP17.2 liaise with the rest of their PC. WP17.2 (CalVal guidelines), WP17.3 (QuARG) and WP17.5 (quarterly reports) link to the PCs and WP18 (discuss impact on development priorities; provide evidence for the benefit of upgrades – PC x.5, WP18), to WP3 users (discuss impact on usefulness of products; understand user needs for accuracy information) and to R&D (WP19).]

