Presentation transcript:

Slide 1: Data Quality Control and Quality Monitoring. Jitze van der Meulen, 2013-09-11. WMO AMDAR Panel.

Slide 2: Workshop on Aircraft Observing System Data Management (2012)

Recommendations of the workshop:
- Recognize comparisons of aircraft observations to NWP model background fields as a critical component of Aircraft Observations Quality Control (AO QC).
- Consider whether or not such comparisons should be done before AO data are exchanged on the GTS.
- Semi-automatic near-real-time monitoring information (data counts, missing data, higher-than-normal rejects by the assimilation system, etc.) should be exchanged regularly (monthly, or more frequently as required and agreed) between designated centres and data managers and/or producers. This could include an alarm/event system.
- Consideration should be given to the designation of centres to carry out international QC of aircraft observations (WMO and ICAO), possibly before insertion on the GTS, to flag the data.
- Distribution of ICAO automated aircraft observations on the GTS should use a WMO-approved format (BUFR) with an appropriate template (similar to the AMDAR ones) for clear identification of the source of the data (ADS, Mode-S, aircraft ID, etc.).

Slide 3: The AMDAR Panel has identified 20 key aspects for further developing Aircraft Observations Quality Control (AO QC):
- Third-party data: ADS (ICAO), other new data sources (Mode-S)
- Archiving (data and metadata)
- Delivery (level II data; also profile data for local use), in relation to time/place resolution
- Optimization of observations: data targeting (additional, for applications), data coverage (global), provision (e.g. Africa), programme extensions
- Developing countries, special constraints (data communication issues)
- Data format (incl. resolution)
- Code issues (incl. data header)
- Data display
- Data access
- Data transfer
- Typical data: atmospheric composition data
- Phenomena: icing, turbulence; use of data (e.g. direct input, verification)
- Timeliness (taking into account QC processes)
- Data checking, filtering, flagging (relation with rules, Manual on the GDPFS)
- Excluding aircraft (how to manage)
- Quality control: monitoring (availability), techniques (NWP), stages (real time, off-line); flagging principles; archiving; logistics; feedback
- Metadata (definition, use, archive)

Slide 4: Outline for the Definition of the Global AO DM Framework

Slide 5: Use of NWP

NWP model forecast background fields are regarded as the most appropriate references for (near) real-time quality control. For operational practice it will be necessary to evaluate these references to define the most appropriate choices (update interval, forecast interval) and the time and space interpolation techniques or algorithms (see the sketch below). NWP background fields used as references require sufficient information on their uncertainties. Traceability to objective observations is required, providing information on their uncertainties (time- and place-related) and possible seasonal variations or daily characteristics (daytime/night-time). In particular, altitude-related bias behaviour is relevant.
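
To make the space/time interpolation concrete, here is a minimal sketch assuming a regular 2D grid and two background fields bracketing the observation time (function and argument names are hypothetical; operational systems interpolate in three spatial dimensions and in model coordinates):

    import numpy as np

    def interpolate_background(field_t0, field_t1, w_time, i, j, fy, fx):
        """Bilinear interpolation in space, linear interpolation in time.

        field_t0, field_t1: 2D numpy arrays of the background field at the
        two forecast valid times bracketing the observation; w_time is the
        weight of field_t1 (0..1); (i, j) indexes the lower-left grid point
        of the enclosing cell; (fy, fx) is the fractional position inside it.
        """
        def bilinear(f):
            return ((1 - fy) * (1 - fx) * f[i, j] + (1 - fy) * fx * f[i, j + 1]
                    + fy * (1 - fx) * f[i + 1, j] + fy * fx * f[i + 1, j + 1])
        return (1 - w_time) * bilinear(field_t0) + w_time * bilinear(field_t1)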

Slide 6: Temperature difference statistics (all observations / quarter) (figure)

Slide 7: Background references

For reference:
- positions: interpolation of grid points (3D)
- time: nearest time stamp of run time (analysis) or forecast (+nH), called the background (time matching sketched after this list)
- HIRLAM/HARMONIE (short range, high resolution): 00 (run), 03 (00+3H), 06 (run), 09 (06+3H), 12 (run), etc.
- ECMWF (medium range): 00 (run), 03 (00+3H), 06 (00+6H), 09 (00+9H), 12 (run, or 00+12H), etc.
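
A sketch of that time matching, selecting the run and forecast step whose valid time is nearest to the observation, following the 3-hourly HIRLAM/HARMONIE pattern on the slide (function and defaults are assumptions):

    def nearest_background(obs_hour, runs=(0, 6, 12, 18), step_h=3):
        """Return (run_hour, forecast_step) for the background valid time
        nearest to obs_hour: analyses at the run hours, +3H forecasts in
        between, e.g. an observation at 08:40 UTC maps to run 06 + 3H."""
        valid = (round(obs_hour / step_h) * step_h) % 24
        run = max(r for r in runs if r <= valid)  # latest run at or before valid
        return run, valid - run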

Slide 8: HIRLAM/HARMONIE error standard deviation (figure)

Slide 9: Interpretation of differences
- Should be based on rules defined by the international metrology organizations (ICSU, BIPM)
- Statistical analysis first, then definition of the parameter used for further interpretation and requirements
- Expressed in terms of uncertainty (not STD or RMSE), preferably a 95% confidence interval (see the sketch below)
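
A sketch of the preferred statistic from the last bullet: the bias of the O-B differences together with an expanded uncertainty U = k*s at 95% confidence (coverage factor k of about 1.96, assuming near-normal differences; names are assumptions):

    import numpy as np

    def bias_and_expanded_uncertainty(obs, background, k=1.96):
        """Mean O-B bias and expanded uncertainty U = k * s, an approximate
        95% confidence half-width, rather than a bare STD or RMSE."""
        d = np.asarray(obs, dtype=float) - np.asarray(background, dtype=float)
        return d.mean(), k * d.std(ddof=1)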

Slide 10: (figure only)

Slide 11: Observed wind vector (diagram). Labels: uncertainty in FF, uncertainty in V, uncertainty in U, uncertainty in F (95% conf.)
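
Reading the diagram's labels together, the speed uncertainty follows from the component uncertainties by first-order propagation of uncertainty (a hedged reconstruction; the slide shows only the diagram). In LaTeX notation, with $FF = \sqrt{U^2 + V^2}$:

    u(FF)^2 \approx \left(\frac{U}{FF}\right)^2 u(U)^2
                  + \left(\frac{V}{FF}\right)^2 u(V)^2,
    \qquad U_{95}(FF) \approx 1.96\, u(FF)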

Slide 12: Wind vector difference, expressed as a scalar (figure)
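
A minimal sketch of that scalar, assuming it is the length of the vector difference between observed and background wind components (names hypothetical):

    import math

    def wind_vector_difference(u_obs, v_obs, u_bg, v_bg):
        """Length of the observed-minus-background wind vector difference:
        one scalar that penalises both speed and direction errors."""
        return math.hypot(u_obs - u_bg, v_obs - v_bg)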

Slide 13: Wind vector difference (median of daily means), expressed as a scalar (figure)

Slide 14: Profiles: requirements, level of detail

Slide 15: Observation data = processed data: AMDAR sensor data processing

Slide 16: TA bias by aircraft type (figure)

Slide 17: TA bias by aircraft type (figure)

Slide 18: TA bias by aircraft type (figure)

Slide 19: TA bias by aircraft type (figure)

Slide 20: TA bias by aircraft type (figure)

Slide 21: Types of errors
- Observations (air temperature, wind, humidity) are incorrectly measured or derived (i.e. not conforming to the required measurement uncertainties)
- Incorrect position or time of observation
- Incorrect encoding (e.g. for altitude)

Incorrectly reported positions may currently affect NWP most seriously, especially when reports are placed at (virtual) positions in areas with few observations (data-sparse areas), such as over oceans and seas, and especially at lower altitudes. Using NWP background fields only, usually no distinction can be made between these three types, and it is assumed that the quantities are incorrectly measured only. Appropriate tools to detect horizontal and vertical position errors are not yet part of standard QC practice, although some methods are implemented (e.g. a distance check between observations; see the sketch below). The same holds for detecting inaccurate date and time stamps and latency in on-board data processing (Mode-S data comparisons may be useful here).
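
The distance check mentioned above can be as simple as comparing the great-circle distance between consecutive reports of the same aircraft against what a plausible ground speed allows (a sketch covering the horizontal position only; the speed threshold is an assumption):

    import math

    def position_plausible(lat1, lon1, t1, lat2, lon2, t2, max_speed=300.0):
        """Return False when the implied ground speed between two consecutive
        reports of one aircraft exceeds max_speed (m/s); times in seconds."""
        r = 6371e3  # mean Earth radius, metres
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = (math.sin(dp / 2) ** 2
             + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
        dist = 2 * r * math.asin(math.sqrt(a))  # haversine distance
        dt = abs(t2 - t1)
        return dt > 0 and dist / dt <= max_speed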

Slide 22: The altitude issue (continued)
- Typically, some FM 42 and BUFR reports show PALT/FL below the pressure altitude of the runway
- Typically, some reports show only FL > altitude[RWY], or only FL > 0

Slide 23: Pressure altitude versus altitude

Figure: Delta(ISA-T) in K as a function of PALT in m and P in hPa; differences range from about -1.4 K to +1.2 K; bias +0.1 K (median), U = 1.3 K.
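
Pressure altitude (PALT) on this slide is the ICAO Standard Atmosphere altitude corresponding to the measured static pressure; a sketch of the standard ISA conversion (troposphere only, constants from the ISA definition):

    def pressure_altitude_m(p_hpa):
        """ICAO Standard Atmosphere pressure altitude in metres from static
        pressure in hPa; valid below the tropopause (about 11 km)."""
        t0, lapse, p0 = 288.15, 0.0065, 1013.25  # sea-level T (K), lapse rate (K/m), pressure (hPa)
        return (t0 / lapse) * (1.0 - (p_hpa / p0) ** 0.190263)  # exponent = R*L/(g0*M)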

Slide 24: Coverage (figure)

Slide 25: PALT < 200 m (figure)

Slide 26: Timeliness (figure)

Slide 27: BUFR templates: confusion all around

FM 42 (only one):
UD AS01 RJTD RRX AMDAR 0111 ASC JP9ZX N 13949E //// PS /012 TB/ S F002 VG///=

BUFR upper-air aircraft (multiple): IUA C01 RJTD, followed by the binary BUFR payload (unreadable as text), ending in 7777.

Source: Manual on the GTS, not Manual on Codes

Slide 28: BUFR templates: confusion all around

There are 11 templates for Class 11 reports (single-level report sequences, conventional data). The recommended template is the BUFR template for AMDAR, version 7 (ref. table), optionally extended with the IAGOS template for a single observation, version 2. However, the BUFR originating centres all produce different (obsolete) templates, and profile reports are not generated. One way to check which template a message uses is sketched below.
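
One way to see which template a given message actually carries is to read its unexpanded descriptor sequence, e.g. with the ecCodes Python bindings (a sketch; the file name is hypothetical):

    import eccodes

    # List the unexpanded descriptor sequence of every BUFR message in a
    # file; an AMDAR message should show the corresponding 3-11-xxx
    # Table D sequence.
    with open("amdar.bufr", "rb") as f:
        while True:
            bufr = eccodes.codes_bufr_new_from_file(f)
            if bufr is None:
                break
            print(eccodes.codes_get_array(bufr, "unexpandedDescriptors"))
            eccodes.codes_release(bufr)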

Slide 29: BUFR templates: confusion all around (figure). Source: Manual on Codes

Slide 30: BUFR templates: confusion all around (figure)

Slide 31: BUFR templates: confusion all around (figure)

Slide 32: (closing slide)