Slide 1: United Nations Oslo City Group on Energy Statistics
OG7, Helsinki, Finland, October 2012
ESCM Chapter 8: Data Quality and Metadata

Slide 2: Introduction
 IRES Chapter 9 deals with data quality assurance and metadata
 Under IRES, countries are encouraged to:
  - Develop national quality assurance programs
  - Document these programs
  - Develop measures of data quality
  - Make these available to users

Slide 3: Prerequisites of Data Quality
 Institutional and organizational conditions, including:
  - Legal basis for the compilation of data
  - Adequate data sharing and coordination between partners
  - Assurance of confidentiality and security of data
  - Adequacy of resources: human, financial, technical
  - Efficient management of resources
  - Quality awareness

Slide 4: Promoting Data Quality
 Make quality a stated goal of the organization
 Establish standards for data quality
 Track quality indicators
 Conduct regular quality assurance reviews
 Develop centres of expertise to promote quality
 Deliver quality assurance training

Slide 5: What Is a Quality Assurance Framework?
 All planned activities to ensure that the data produced are adequate for their intended use
 Includes standards, practices and measures
 Allows for:
  - Comparisons with other countries
  - Self-assessment
  - Technical assistance
  - Reviews by international and other users
 See Figure 8.1 for examples of quality frameworks

Slide 6: Quality Assurance Framework
 Six dimensions of data quality, based on ensuring "fitness for use":
  1. Relevance
  2. Accuracy
  3. Timeliness
  4. Accessibility
  5. Interpretability
  6. Coherence

Slide 7: Quality Measures and Indicators
 Should cover all elements of the Quality Assurance Framework
 Methodology should be well established and credible
 Must be easy to interpret and use
 Should be practical: reasonable, not an overburden
 For key indicators, see Chapter 8, Table 8.2

Slide 8: Sample Quality Indicators
 From IRES Table 9.2, linked to the QA Framework:
  - Relevance: user feedback on satisfaction, utility of products and data
  - Accuracy: response rate, weighted response rate, number and size of revisions
  - Timeliness: time lag between the reference period and the release of data
  - Accessibility: number of hits, number of requests
  - Interpretability: amount of background information available
  - Coherence: validation of data from other sources
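To make the accuracy and timeliness indicators above concrete, here is a minimal sketch of how they might be computed. The record layout (responded flag, design weight, dates, preliminary and final figures) is illustrative, not something prescribed by IRES or the ESCM.

```python
# Sketch of a few accuracy and timeliness indicators from IRES Table 9.2.
from datetime import date

units = [
    # (responded?, design_weight) -- hypothetical survey units
    (True, 120.0), (True, 80.0), (False, 150.0), (True, 95.0), (False, 60.0),
]

response_rate = sum(r for r, _ in units) / len(units)
weighted_response_rate = sum(w for r, w in units if r) / sum(w for _, w in units)

# Timeliness: lag between the end of the reference period and the release date.
reference_period_end = date(2012, 12, 31)
release_date = date(2013, 3, 15)
timeliness_lag_days = (release_date - reference_period_end).days

# Accuracy: relative size of a revision between preliminary and final figures.
preliminary, final = 1520.0, 1498.0
relative_revision = abs(final - preliminary) / final

print(f"response rate:          {response_rate:.1%}")
print(f"weighted response rate: {weighted_response_rate:.1%}")
print(f"timeliness lag:         {timeliness_lag_days} days")
print(f"relative revision:      {relative_revision:.2%}")
```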

Slide 9: Quality Assurance Framework
 Quality assurance must be built into all stages of the survey process
 Survey stages:
  1. Specify needs
  2. Design
  3. Build
  4. Collect
  5. Process
  6. Analyze
  7. Disseminate
  8. Archive
  9. Evaluate

Slide 10: 1. Specify Needs
Activities:
 Determine needs: define objectives, uses, users
 Identify concepts and variables
 Identify data sources and availability
 Prepare the business case
Quality assurance:
 Consult with users and key stakeholders
 Clearly state objectives and concepts
 Establish quality targets
 Check sources for quality, comparability and timeliness
 Gather input and support from respondents

Slide 11: 2. Design
Activities:
 Determine outputs
 Define concepts and variables
 Design the data collection methodology
 Determine the frame and sampling strategy
 Design production processes
Quality assurance:
 Consult users on outputs
 Select, test and maintain the frame
 Design and test the questionnaire and instructions
 Use established standards
 Develop processes for error detection (see the sketch below)
 Develop and test imputation
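As an illustration of the "develop processes for error detection" item, the sketch below applies a small set of edit rules to one energy questionnaire record. The field names, the supply/consumption consistency rule and the 5% threshold are invented for the example; real edit rules would come from the national methodology.

```python
def run_edits(record: dict) -> list[str]:
    """Return a list of edit failures for a single record."""
    failures = []
    # Validity edit: quantities must be non-negative.
    for field in ("production_tj", "imports_tj", "exports_tj", "consumption_tj"):
        if record.get(field, 0) < 0:
            failures.append(f"{field} is negative")
    # Consistency edit: apparent supply should roughly cover reported consumption.
    supply = record["production_tj"] + record["imports_tj"] - record["exports_tj"]
    if supply < record["consumption_tj"] * 0.95:
        failures.append("consumption exceeds apparent supply by more than 5%")
    return failures

record = {"production_tj": 400.0, "imports_tj": 50.0,
          "exports_tj": 120.0, "consumption_tj": 390.0}
print(run_edits(record))  # -> ['consumption exceeds apparent supply by more than 5%']
```

Edits like these are designed and tested at this stage so that, by the collection and processing stages, they can run automatically inside the collection instrument and processing system.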

Slide 12: 3. Build
Activities:
 Build the collection instrument
 Build the processing system
 Design workflows
 Finalize production systems
Quality assurance:
 Focus-test the questionnaire with respondents
 Test systems for functionality
 Test workflows; train staff
 Document
 Develop quality measures

Slide 13: 4. Collect
Activities:
 Select the sample
 Set up collection
 Run collection
 Finalize collection
Quality assurance:
 Maintain the frame
 Train collection staff
 Use technology with built-in edits
 Implement verification procedures
 Monitor response rates, error rates, follow-up rates and reasons for non-response

Slide 14: 5. Process
Activities:
 Integrate data from all sources
 Classify and code data
 Review, validate and edit
 Impute for missing or problematic data
 Create and apply weights
 Derive variables
Quality assurance:
 Monitor edits
 Implement follow-ups
 Focus on the most important respondents
 Analyze and correct outliers
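A short sketch of two of the processing-stage steps listed above: simple ratio imputation for a missing value and an outlier flag based on year-over-year change. The data, the imputation method and the 50% threshold are assumptions for illustration only; the chapter does not prescribe a specific technique.

```python
previous_year = {"plant_a": 210.0, "plant_b": 95.0, "plant_c": 180.0}
current_year = {"plant_a": 225.0, "plant_b": None, "plant_c": 310.0}  # plant_b missing

# Ratio imputation: grow last year's value by the ratio observed among respondents.
reported = [(current_year[k], previous_year[k])
            for k in current_year if current_year[k] is not None]
growth = sum(c for c, _ in reported) / sum(p for _, p in reported)
for unit, value in current_year.items():
    if value is None:
        current_year[unit] = round(previous_year[unit] * growth, 1)

# Outlier detection: flag units whose value moved by more than 50% year over year.
outliers = [u for u, v in current_year.items()
            if abs(v - previous_year[u]) / previous_year[u] > 0.5]

print(current_year)   # plant_b imputed from its previous value and the growth ratio
print(outliers)       # ['plant_c'] (310 vs 180 is roughly a +72% change)
```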

Slide 15: 6. Analyze
Activities:
 Transform data into outputs
 Validate data
 Scrutinize and explain data
 Apply disclosure controls
 Finalize outputs
Quality assurance:
 Track all indicators
 Calculate quality indicators
 Compare data with previous cycles
 Carry out coherence analysis
 Validate against expectations and subject-matter intelligence
 Document all findings
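A hedged sketch of the coherence-analysis step: compare a survey aggregate against the same aggregate from an independent source and against the previous cycle. The sources, figures and the 2% tolerance are assumptions made for the example.

```python
def coherence_report(name: str, survey: float, reference: float,
                     tolerance: float = 0.02) -> str:
    """Flag aggregates whose relative difference from the reference exceeds the tolerance."""
    diff = (survey - reference) / reference
    status = "OK" if abs(diff) <= tolerance else "INVESTIGATE"
    return f"{name}: survey={survey:,.0f} reference={reference:,.0f} diff={diff:+.1%} [{status}]"

print(coherence_report("Crude oil imports (kt), vs customs data", 24_850, 25_600))
print(coherence_report("Electricity output (GWh), vs previous cycle", 142_300, 140_900))
```

Differences above the tolerance are not necessarily errors, but they should be explained and documented before the outputs are finalized.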

Slide 16: 7. Disseminate
Activities:
 Load data into output systems
 Release products
 Link to metadata
 Provide quality indicators
 Provide user support
Quality assurance:
 Format and review outputs
 Verify that tools do not introduce errors
 Verify disclosure control
 Ensure all metadata and quality indicators are available
 Provide contact names for user support
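An illustrative pre-release checklist covering two of the items above: a primary disclosure-control rule (suppress cells built from too few contributors) and a check that required metadata fields are attached. The minimum-contributor threshold and the field names are assumptions, not rules from the chapter.

```python
REQUIRED_METADATA = {"title", "reference_period", "methodology", "contact", "quality_indicators"}

def suppress_small_cells(cells: dict[str, tuple[float, int]], min_units: int = 3) -> dict[str, object]:
    """Replace any cell built from fewer than min_units reporting units with 'x'."""
    return {cell: (value if n >= min_units else "x") for cell, (value, n) in cells.items()}

def missing_metadata(metadata: dict) -> set[str]:
    """Return the required metadata fields that are not attached to the release."""
    return REQUIRED_METADATA - metadata.keys()

cells = {"coal_production": (512.0, 12), "lignite_production": (48.0, 2)}
metadata = {"title": "Annual coal statistics", "reference_period": "2011",
            "methodology": "Census of mines", "contact": "energy@example.org"}

print(suppress_small_cells(cells))   # lignite cell suppressed: only 2 contributors
print(missing_metadata(metadata))    # {'quality_indicators'}
```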

Slide 17: 8. Archive
Activities:
 Create rules and procedures for archiving and disposal
 Maintain catalogues, formats and systems
Quality assurance:
 Periodically test processes and systems
 Ensure metadata are attached

Slide 18: 9. Evaluate
Activities:
 Conduct post-mortem reviews to assess performance and identify issues
 Take corrective actions or make new investments, as required
Quality assurance:
 Consult with clients about needs and concerns
 Monitor key quality indicators
 Conduct periodic data quality reviews
 Perform ongoing coherence analysis
 Compare with best practices elsewhere

Slide 19: Metadata
 Important for assessing "fitness for use" and ensuring interpretability
 Required at every step of the survey process
 Critical for enabling comparisons with other data
 Should include the results of data quality reviews
 Figure 8.4 presents a generic set of metadata requirements
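A minimal sketch of how a dataset-level metadata record might be structured so it can travel with the data at every step. The fields shown are illustrative; the actual requirements are those listed in ESCM Figure 8.4.

```python
from dataclasses import dataclass, field, asdict

@dataclass
class DatasetMetadata:
    title: str
    reference_period: str
    source: str                      # survey, administrative data, estimate, ...
    methodology: str                 # concepts, classifications, compilation method
    contact: str
    quality_review_findings: list[str] = field(default_factory=list)

meta = DatasetMetadata(
    title="Annual electricity generation",
    reference_period="2011",
    source="Census of generators",
    methodology="ESCM/IRES definitions; gross generation in GWh",
    contact="energy.statistics@example.org",
    quality_review_findings=["2010 cycle: 1.2% downward revision after late responses"],
)
print(asdict(meta))
```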

Slide 20: Future of Metadata
 Should become a driver of survey design
 Can be used proactively to prescribe definitions, concepts, variables and standards
 Can support the harmonization of international surveys and data
 Efforts are underway to create an integrated approach to producing and recording metadata

Slide 21: Thank You!
Andy Kohut, Director
Manufacturing and Energy Division
Statistics Canada
Section B-8, 11th Floor, Jean Talon Building
Ottawa, Ontario, Canada K1A 0T6
Telephone: 613-951-5858
E-mail: andy.kohut@statcan.gc.ca
www.statcan.gc.ca

