
1 Quality assurance and data quality in relation to trend assessments in EMEP
Kjetil Tørseth, Wenche Aas and Jan Schaug
Documenting changes in atmospheric chemical composition is one of the main objectives of the measurement programme. This requires consistent measurements over long periods and knowledge of the factors which may influence the results. This presentation gives a brief overview of the quality assurance and the data quality for the early years of the programme. Some experiences from the re-evaluation of data in relation to the “assessment report” will also be presented.

2 Data availability and quality in 1999

3 History
EMEP is an extension of the OECD project LRTAP ( ). The preparatory phase was initiated in 1977, with regular measurements from 1978. The initial focus was on acid rain and sulphur compounds.
The programme had a mandatory part and an extended voluntary part (to include as many participants as possible).
Initially there were no fixed Data Quality Objectives; all data of a reasonable quality were welcomed.
Different efforts for data quality improvement and harmonisation have been in place from the start:
  Manual for sampling and chemical analysis (1978)
  Laboratory intercomparisons
  Data checking in cooperation with countries, and flagging
  Training and audits
In addition, a series of workshops and expert meetings have been arranged.

4 Documentation (available from CCC)
OECD and SNSF reports
Acidifying and eutrophying compounds (53 reports)
Ozone (11)
VOC (8)
Heavy metals and persistent organic compounds (8)
Particulate matter (3)
Laboratory intercomparisons (20)
Field intercomparisons (3)
Expert meetings and workshops (23)
Phase reports (7)
Quality assurance (5)
Other (11)
More than 150 reports in total
Contributions to joint reports and peer-reviewed papers
Technical notes

5 Problems encountered
Countries used different methodologies
  Time series already existed
  Countries were unwilling or unable to change methods
Not all precipitation components were analysed (only at a few sites prior to 1987)
Insufficient information about the methods used
  Metadata not known
  Lack of experience or documentation
Inadequate qualifications in a number of countries
Lack of efficient tools for quality control
Problems in data reporting
Proposals for a series of field intercomparisons were put forward in 1983, but were not funded before 1986

6 Field Intercomparisons
Field intercomparison of measuring methods for sulphur dioxide and particulate sulphate in ambient air (Langenbrügge) (EMEP/CCC-Report 2/86)
Field Intercomparison of Samplers for Sulphur Dioxide and Sulphate in Air (Vavihill) (EMEP/CCC-Report 4/91)
Comparison of Measuring Methods for Nitrogen Dioxide in Ambient Air (Kleiner Feldberg) (EMEP/CCC-Report 2/86)
Non-EMEP intercomparisons (with EMEP participation, but not published by EMEP)
Field comparisons co-located with reference instrumentation

7 Schauinsland

8-14 No transcript text for these slides (slide 11: SO4 and SO2)

15 List of flags used in the EMEP database
All flags are grouped in two categories: V (valid measurement) or I (invalid measurement). Flags are sorted according to severity. Flags above 250 indicate an exception that has invalidated or reduced the quality of the data element. Flags below 250 indicate that the element is valid, even if it may fail simple validation tests. The flag 100 is used to indicate that a value is valid even if an exception in the range has also been flagged; in this case the 100 flag must appear before the other flags. In all other cases, the most severe flag should appear first if more than one flag is needed.
Group 9: Missing flags
Group 8: Flags for undefined data elements
Group 7: Flags used when the value is unknown
Group 6: Mechanical problems
Group 5: Chemical problems
Group 4: Extreme or inconsistent values
Group 2: Exception flags assigned by the database co-ordinator
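To illustrate the convention above, here is a minimal sketch in Python. It assumes a flag is represented as a (number, category) pair and that a higher flag number means higher severity; the helper names, and the rule that any I-category flag invalidates an element unless flag 100 is present, are assumptions rather than EMEP specifications.

```python
# Minimal sketch of the flag ordering and validity convention described above.
# Assumptions: a flag is a (number, category) tuple with category "V" or "I",
# and a higher flag number is treated as more severe.

def order_flags(flags):
    """Return flags with 100 first if present, otherwise most severe first."""
    if any(num == 100 for num, _ in flags):
        hundreds = [f for f in flags if f[0] == 100]
        rest = sorted((f for f in flags if f[0] != 100), key=lambda f: -f[0])
        return hundreds + rest
    return sorted(flags, key=lambda f: -f[0])

def element_is_valid(flags):
    """Treat a data element as valid if flag 100 is set, or if none of the
    attached flags belongs to the I (invalid measurement) category."""
    if any(num == 100 for num, _ in flags):
        return True
    return all(cat == "V" for _, cat in flags)

# Example: an extreme but valid value (458 EXH) flagged together with 100.
print(order_flags([(458, "V"), (100, "V")]))   # [(100, 'V'), (458, 'V')]
print(element_is_valid([(478, "I")]))          # False: invalid ion balance
```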

16 Detailed example: Group 4
Group 4: Extreme or inconsistent values
499 INU V Inconsistent with another unspecified measurement
478 IBA I Invalid due to inconsistency discovered through ion balance calculations
477 ICO I Invalid due to inconsistency between measured and estimated conductivity
476     V Inconsistency discovered through ion balance calculations, but considered valid
475     V Inconsistency between measured and estimated conductivity, but considered valid
460 ISC I Contamination suspected
459 EUE I Extreme value, unspecified error
458 EXH V Extremely high value, outside four times standard deviation in a lognormal distribution
457 EXL V Extremely low value, outside four times standard deviation in a lognormal distribution
456 IDO I Invalidated by data originator
451     I Invalid due to large sea salt contribution
450     V Considerable sea salt contribution, but considered valid
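The Group 4 entries can also be read as a small lookup table. The sketch below (Python, same hypothetical representation as above) shows one way such a table could be used to drop invalid values before a trend calculation; the example series and the treatment of flags outside Group 4 are assumptions.

```python
# Group 4 flags ("extreme or inconsistent values") as listed on this slide.
# The table maps flag number -> (mnemonic, category); entries without a
# mnemonic on the slide are stored as "".
GROUP4 = {
    499: ("INU", "V"), 478: ("IBA", "I"), 477: ("ICO", "I"),
    476: ("",    "V"), 475: ("",    "V"), 460: ("ISC", "I"),
    459: ("EUE", "I"), 458: ("EXH", "V"), 457: ("EXL", "V"),
    456: ("IDO", "I"), 451: ("",    "I"), 450: ("",    "V"),
}

def keep_valid(values_with_flags, flag_table=GROUP4):
    """Return only the values whose flags are all in the V category.
    Flags unknown to this table are treated as V here; a real screening
    would consult the full EMEP flag list."""
    kept = []
    for value, flags in values_with_flags:
        categories = [flag_table.get(f, ("", "V"))[1] for f in flags]
        if all(cat == "V" for cat in categories):
            kept.append(value)
    return kept

# Hypothetical daily sulphate concentrations with their quality flags:
series = [(0.42, []), (3.10, [458]), (0.05, [478]), (0.38, [450])]
print(keep_valid(series))   # [0.42, 3.1, 0.38] - the IBA-flagged value is dropped
```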

17 Should be considered
The TCM method is not appropriate, particularly in the summer
Sweden and Finland changed from absorbing solutions after the Vavihill exercise
Norway has used filterpacks over the whole period
Denmark also started with filterpacks early
UK data are consistent and of good quality
Romanian data (from the WMO BAPMON network) should be discarded
Old data from Yugoslavia and Hungary should be re-examined

18 Current status with respect to re-evaluation of data
Many countries have reported; others have indicated that they will do so soon (the deadline was 1 October 2001)
Most countries have found some results deviating from what is presented on the internet, but in general the differences are small
Reasons for the deviations vary:
  errors in mirroring to the web
  conversion factors
  corrected or rejected data
  errors in the submitted files, punching errors, etc.
Very few countries have really re-assessed their data
National documentation on reporting and corrections is sparse
Focus should be on systematic changes
No updates to the data have been made yet!

19 Proposal for further actions
The CCC should, in close cooperation with the countries, re-assess the data, adding the quality flags as presented at TFMM2. This will ensure a consistent database with validated and invalidated data.
The DQOs for the basic data should be used, and data that comply with the DQOs are valid without reservations. It is further proposed that data considered to deviate by more than 25 per cent from the “true value”, judged from the relevant laboratory comparisons, are considered invalid.
Ion balances and statistical tests, graphical inspection of time series, and the results from the Langenbrügge, Kleiner Feldberg and Vavihill field comparisons are important elements in a re-evaluation, together with results from more recent field exercises; a sketch of such screening checks is given below.
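As an illustration of two of the screening elements mentioned above, the sketch below (Python) outlines an ion balance check for a precipitation sample and the proposed 25 per cent deviation criterion against a reference value from a laboratory comparison. The function names, the ion list, and the ion balance acceptance threshold are assumptions, not EMEP specifications.

```python
# Illustrative screening checks; the thresholds and ion list are assumptions,
# not values taken from the EMEP manual.

# Equivalent weights (mg per milli-equivalent) for major precipitation ions.
EQ_WEIGHT = {
    "SO4": 48.0, "NO3": 62.0, "Cl": 35.5,          # anions
    "NH4": 18.0, "Na": 23.0, "K": 39.1,            # cations
    "Ca": 20.0, "Mg": 12.2, "H": 1.0,
}
ANIONS = ("SO4", "NO3", "Cl")
CATIONS = ("NH4", "Na", "K", "Ca", "Mg", "H")

def ion_balance(sample_mg_per_l):
    """Relative ion balance (anions - cations) / (anions + cations),
    with concentrations converted from mg/l to micro-equivalents/l."""
    ueq = {ion: 1000.0 * c / EQ_WEIGHT[ion] for ion, c in sample_mg_per_l.items()}
    anions = sum(ueq.get(i, 0.0) for i in ANIONS)
    cations = sum(ueq.get(i, 0.0) for i in CATIONS)
    return (anions - cations) / (anions + cations)

def balance_ok(sample_mg_per_l, threshold=0.20):
    # threshold is an assumed value, not the EMEP acceptance criterion
    return abs(ion_balance(sample_mg_per_l)) <= threshold

def within_dqo(measured, reference, max_rel_dev=0.25):
    """Proposed criterion: invalid if more than 25 % from the 'true value'."""
    return abs(measured - reference) <= max_rel_dev * abs(reference)

sample = {"SO4": 2.1, "NO3": 1.3, "Cl": 0.5, "NH4": 0.6, "Na": 0.3, "Ca": 0.2, "H": 0.02}
print(round(ion_balance(sample), 3), balance_ok(sample))
print(within_dqo(measured=1.15, reference=1.00))   # True: within 25 %
```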

