High quality official statistics: benchmarking as an integral part of a quality management system
David Fenwick, UNECE/ILO Joint Meeting on CPIs, Geneva, 2014

Background: what is benchmarking?
Benchmarking is:
– The process of comparing with, and learning from, others about what you do and how well you do it, with the aim of creating improvements.
– Part of a continuous process of measuring products, services and practices against those of organisations recognised as industry leaders.

Background: general benefits of benchmarking, for any process
– Establishes a baseline for performance.
– Compares performance and practice with others.
– Captures best/new ideas from tested and proven practices.
– Identifies methods of achieving improved performance.
– Facilitates greater involvement and motivation of staff in change programmes.
– Helps to identify/clarify goals.
– Identifies management information gaps, e.g. relative costs.

Background: benefits of benchmarking, NSIs and the European context
From the perspective of a National Statistical Institute:
– Leads to a better understanding of the 'big picture' and a broader perspective on the factors (or enablers) that facilitate the implementation of good practice.
– Facilitates the sharing of solutions to common problems and builds consensus.
– Facilitates collaboration and understanding between organisations.
In Europe:
– Benchmarking is being used as a tool for improving performance in both public sector and commercial organisations, as a means of increasing the competitiveness of the European economy as a whole.
– The European Commission has spearheaded a number of benchmarking initiatives in recent years in which member states lead projects that enable them to learn from shared experiences and good practice.

Benchmarking & Eurostat's Peer Review Process
The European Statistics Code of Practice:
– Sets the standard for developing, producing and disseminating European statistics.
– Builds upon a common European Statistical System definition of quality in statistics.
– Contains fifteen principles covering the institutional environment in which national and EU statistical authorities operate, as well as the production and dissemination of European statistics.
– Associated with the fifteen principles is a set of indicators of good practice.

Benchmarking & Eurostat's Peer Review Process
The institutional environment:
– Institutional and organisational factors have a significant influence on the effectiveness and credibility of a statistical authority and its statistics.
– Relevant issues are professional independence; mandate for data collection; adequate resources; quality commitment; statistical confidentiality; and impartiality and objectivity.
For example, the first principle relates to the institutional environment and states: "Professional independence of statistical authorities from other policy, regulatory or administrative departments and bodies, as well as from private sector operators, ensures the credibility of European Statistics".

Benchmarking & Eurostat's Peer Review Process
The first principle has eight indicators:
– Independence from political and other external interference.
– Heads of Statistics have senior-level access to policy authorities and administrative public bodies, and are of the highest professional calibre.
– Heads of Statistics have responsibility for ensuring that statistics are independent.
– Heads of Statistics decide the methodology, content and timing of releases.
– Work programmes are published and periodic progress reports made.
– Statistical releases are kept separate from political/policy statements.
– Public comment on statistical issues.
– The appointment of the Head of Statistics is based on professional competence.

Benchmarking & Eurostat's Peer Review Process
Principles relating to production and dissemination deal with:
– Whether the available statistics meet users' needs, comply with European quality standards, and serve the needs of European institutions, governments, research institutions, business concerns and the public generally.
– The extent to which the statistics are relevant, accurate and reliable, timely, coherent, comparable across regions and countries, and readily accessible by users.

Benchmarking & Eurostat's Peer Review Process
For instance, Principle 12, covering outputs, relates to whether European Statistics accurately and reliably portray reality. It is assessed on the basis of three indicators:
– Source data, intermediate results and statistical outputs are regularly assessed and validated.
– Sampling errors and non-sampling errors are measured and systematically documented according to European standards (illustrated below).
– Revisions are regularly analysed in order to improve statistical processes.
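To make the second indicator concrete, one conventional way of reporting the sampling error of an estimated index is the coefficient of variation, the estimated standard error expressed relative to the estimate. This is a standard survey-methodology measure, offered here as an illustration; it is not a formula prescribed by the Code itself:

```latex
% Coefficient of variation of an estimated index \hat{I}
\[
  \operatorname{cv}(\hat{I})
    = \frac{\sqrt{\widehat{\operatorname{Var}}(\hat{I})}}{\hat{I}} \times 100\%
\]
```

A small coefficient of variation indicates that sampling error is modest relative to the size of the estimate, which is why it is commonly documented alongside published indices.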

Benchmarking & Eurostat's Peer Review Process
Compliance with the Code of Practice principles is assessed through a self-regulatory approach. Peer reviews in 2006/07 focused on the institutional environment and dissemination:
– Comprised self-assessments and peer reviews conducted by experts from other NSIs and Eurostat.
– Three-day on-site visits with NSI representatives and main stakeholders.
– The exercise resulted in reports covering all reviewed countries, improvement actions by the NSIs and a set of good practices.
– A final report was presented to the European Parliament and the Council.
– Eurostat's monitoring was considered effective and proportionate.

Benchmarking & Eurostat's Peer Review Process
Four characteristics contributed to the success of Eurostat's peer reviews:
– Professionalism: competence and objectivity.
– Helpfulness: making useful and beneficial suggestions. It is usually much harder to suggest how to fix problems than to identify them.
– Realism: not setting standards that are too high, whilst not compromising compliance with Eurostat regulations.
– Engagement: engaging in a dialogue with the National Statistical Institute being reviewed. The reviewers were able to put themselves in the shoes of the reviewed.

Benchmarking: the ONS/ABS joint exercise – background
The exercise was initiated by the Australian Bureau of Statistics (ABS), with the ONS invited to participate:
– One motivation was data collection and ONS' use of handheld computers, but the two were also felt to be 'similar' organisations with the possibility of learning from each other's methodologies – quality CPIs compiled cost-effectively.
– Eventually just two partners took part: ABS and ONS. Ideally, a benchmarking exercise should involve three or more.
It is important to:
– Determine the purpose and boundaries at an early stage.
– Operate within agreed operational guidelines and objectives, including the mode of working.
– Focus on a pre-determined set of performance indicators.
Hence the importance of a memorandum of understanding. Knowledge accrues over time, so it may be necessary to review, e.g., the performance indicators.

The ONS/ABS Benchmarking Process: an overview
An iterative process:
1. Set up the benchmarking team.
2. Agree the purpose and the mode of working.
3. Identify activities/processes which will benefit from benchmarking, and plan the project.
4. Map the process and identify areas for improvement.
5. Collect data and arrange/organise visits.
6. Analyse the data to produce an action plan aimed at delivering performance improvements.
7. Identify the best practices.
8. Win support for, and implement, the action plan.
9. Monitor results and develop a strategy of continuous improvement.

The ONS/ABS Benchmarking Process: initial discussions
An initial face-to-face meeting took place to agree the basic scope of the benchmarking and the method of working for the benchmarking of consumer price indices. This meeting was critical:
– It determined the future direction of the project.
– It laid down the foundations for co-operation.
– It also secured an understanding of the mutual commitment involved, subsequently formalised in a memorandum of understanding.

The ONS/ABS Benchmarking Process: initial discussions – the objectives
The agreed objectives were:
– To evaluate outcomes and comparative value for money, and to identify best practices.
– To gain a proper understanding of each party's own performance, including cost-effectiveness.
– To improve performance through learning from others.

The ONS/ABS Benchmarking Process: initial discussions – the principles determining the scope of the benchmarking
Criteria for the inclusion of issues included:
– Aspects of data collection and index construction which were seen by at least one party as a potential issue, i.e. where there was a prima facie case for believing that sub-optimal solutions were being pursued.
– Potential issues that were common to both parties, or where there was scope for sharing experience, e.g. where the approaches taken by the two parties differed, thereby providing scope for comparison of different solutions.
– Issues with a broad focus that went beyond the bounds of methodology, e.g. value for money, where data collection and index compilation costs are compared with the quality of the statistical output.

The ONS/ABS Benchmarking Process: initial discussions – the principles determining the scope of the benchmarking
Purely methodological reviews were not seen as part of the scope of this particular exercise, as effective mechanisms for these already existed, e.g. the rolling programme of methodological audits of major statistical outputs conducted by ONS. Rules of engagement were also determined, including the free sharing of information and the respecting of confidentiality.
OBSERVATIONS: The initial meeting:
– Demonstrated the importance of conducting sufficiently detailed preliminary background work before entering into bilateral discussions.
– Highlighted the realisation that pre-conceived ideas about the outcomes could seriously prejudice the objectivity of the benchmarking exercise, and hence the importance of expressing things in a neutral way: e.g. "the relative merits of recording prices using handheld computers compared with pen and paper" rather than "the case for moving from pen and paper to handheld computers for the collection of prices".

The ONS/ABS Benchmarking Process: collection & analysis of information
Generating relevant and comprehensive material was a major task:
– The drawing of process maps to give comparative overviews of the statistical processes.
– The design and completion of a standard initial questionnaire to establish basic facts, moderated by use of readily available metadata.
– An initial comparative analysis by ABS, followed by discussion and agreement on the main issues raised. This stage also identified additional information that needed to be collected, e.g. information on costs.
– Repetition of the above stage until issues had been satisfactorily resolved.
– The drafting of the final report, plus a post-hoc evaluation of the benchmarking process.
OBSERVATION: The process acted as a focus for inward reflection and generated issues that had not been identified in the initial scoping exercise, but the collection of additional information led to diminishing returns (particularly on costs).

The ONS/ABS Benchmarking Process: Process map (UK CPI)

The ONS/ABS Benchmarking Process: collection & analysis of financial information
Different financial recording systems were in place, so comparative information at a detailed level could not be extracted easily:
– The crude breakdown of financial information that resulted added very little value.
– Care was needed not to take cost information out of context; comparisons of cost differentials needed to be undertaken with caution.
– Apart from a handful of key processes, broad-based estimates of very specific cost elements had only a limited influence on the final recommendations.
It was also felt that the process would have benefited from more consideration, at an early stage, of qualitative rather than quantitative performance indicators, such as the benefits of ONS' quality management system for consumer prices based on ISO 9000.

The ONS/ABS Benchmarking Process: report writing
One organisation took the lead in writing the report:
– It went through a number of drafts.
– All key recommendations had been agreed during earlier discussion; comments were exchanged, mainly relating to points of clarification and presentation.
Recommendations covered a broad range of issues, including:
– Editing and validation of primary data; the use of handheld computers for collecting prices in shops; general quality assurance procedures and longer-term index development; contracting out of price collection; and a number of statistical design issues, including the frequency of price collection.
– They ranged from the very specific and concrete to recommendations that an issue be investigated further.
– Some went wider than day-to-day operational issues and addressed perceived weaknesses in performance indicators and other management information, such as the identification of costs, which was considered important in determining future directions.

The ONS/ABS Benchmarking Process: post-evaluation
An essential part of the benchmarking, undertaken in two stages.
An initial evaluation, in the context of the National Statistical Agencies Benchmarking Network Evaluation Workshop:
– Undertaken in large part to provide an input into discussions on possible future collaborations on other benchmarking projects.
– It was done before the benchmarking report and the finalisation of recommendations, but was still useful: short presentations and general discussion.
– General lessons learnt for future co-operation included: better guidance on cost information; the need for a standard methodology, e.g. in the questionnaire; and the value of qualitative indicators.
A final evaluation about a year after completion of the benchmarking:
– The main evaluation. More systematic: a review of follow-up actions and value for money.

The ONS/ABS Benchmarking Process: some resulting action points
Develop better performance indicators of the quality of the CPI collection, processing, analysis and output; consider qualitative as well as quantitative indicators.
– ACTION: The agencies worked together to develop a "standard" set of performance indicators.
– ACTION: Performance indicators are reviewed annually to ensure standards and targets are appropriate. Each standard is monitored and reported on monthly; trends are identified and any remedial action taken.

The ONS/ABS Benchmarking Process: some resulting action points
Incorporate the improved performance information into the collection management information systems and monitor performance on a regular basis.
– ACTION: The standard criteria for quality (as in the earlier slide) can be summarised as accuracy, timeliness, efficiency and relevance.
– ACTION: These were converted into measurable performance indicators (see the sketch below).
– ACTION: Each criterion is monitored monthly.
The external contractor who surveys prices was set a financially-driven standard and target to achieve:
– Achievement of the target results in a performance payment to the contractor.
– Failure to meet the standard incurs a penalty fee.
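By way of illustration only, the following minimal sketch shows how such criteria might be turned into simple monthly indicators. The record structure and field names are hypothetical; they are not taken from ONS or ABS systems.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Hypothetical price-quote record: the fields are illustrative only,
# not drawn from actual ONS/ABS collection systems.
@dataclass
class Quote:
    item_id: str
    collected_on: Optional[date]  # None if the quote was not obtained
    deadline: date                # scheduled collection date
    failed_validation: bool       # True if the quote failed editing checks

def monthly_indicators(quotes: list[Quote]) -> dict[str, float]:
    """Summarise one month's collection against simple quality criteria."""
    obtained = [q for q in quotes if q.collected_on is not None]
    on_time = [q for q in obtained if q.collected_on <= q.deadline]
    clean = [q for q in obtained if not q.failed_validation]
    return {
        # coverage of the intended sample (a crude accuracy proxy)
        "response_rate": len(obtained) / max(len(quotes), 1),
        # share of obtained quotes collected by the scheduled date
        "timeliness": len(on_time) / max(len(obtained), 1),
        # share of obtained quotes passing editing/validation checks
        "validation_pass_rate": len(clean) / max(len(obtained), 1),
    }

# Example month with three quotes:
quotes = [
    Quote("bread-01", date(2014, 5, 13), date(2014, 5, 15), False),
    Quote("bread-02", None, date(2014, 5, 15), False),
    Quote("milk-01", date(2014, 5, 16), date(2014, 5, 15), True),
]
print(monthly_indicators(quotes))
# {'response_rate': 0.666..., 'timeliness': 0.5, 'validation_pass_rate': 0.5}
```

In practice such ratios would be computed per month from the collection management system and compared against agreed standards and targets, with trends triggering remedial action as described above.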

The ONS/ABS Benchmarking Process: conclusions
Initial discussions, prior to the process itself, provided a useful tool through informal self-evaluation.
– The value of the benchmarking was not restricted by the limited range of performance indicators: discussion extended beyond the directly measurable to why different approaches were adopted to some aspects of index construction, e.g. different formulae for elementary aggregation (see the formulae below).
– Direct benefits can continue to accrue beyond the benchmarking exercise, e.g. from follow-up action points. Benchmarking is the beginning rather than the end of the process of improvement.
– Further exploration, in greater detail, of issues raised during the initial benchmarking can pay dividends. The work became more focused on specific issues as the benchmarking progressed and the issues of concern became more apparent.
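For reference, the elementary aggregation formulae most commonly compared in CPI practice, for n matched price quotes with base-period price p_i^0 and current price p_i^t, are the Carli, Dutot and Jevons indices (the presentation does not record which of these the two offices discussed):

```latex
% Standard elementary aggregate price indices over n matched quotes
\[
  P_{\text{Carli}} = \frac{1}{n}\sum_{i=1}^{n}\frac{p_i^t}{p_i^0}
  \qquad
  P_{\text{Dutot}} = \frac{\sum_{i=1}^{n} p_i^t}{\sum_{i=1}^{n} p_i^0}
  \qquad
  P_{\text{Jevons}} = \prod_{i=1}^{n}\left(\frac{p_i^t}{p_i^0}\right)^{1/n}
\]
```

Differences between these formulae can be material at the elementary aggregate level, which is why formula choice is a natural subject for cross-office comparison.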

The ONS/ABS Benchmarking Process: conclusions
Longer-term benefits also included the subsequent opportunity for networking.
– Financial and management information compiled specifically for benchmarking can be useful management information in its own right. Performance indicators are a necessary ingredient of the process of continuous improvement, not just short-term management tools.
– A stage can be reached where there are diminishing returns from the collection of additional data; greater benefits can accrue from further analysis and investigation of the data already accumulated. E.g. the resources put into producing some of the more detailed financial information added limited value and certainly didn't influence the recommendations. The challenge is in recognising when this stage is reached.

Benchmarking: some general observations
The initial benchmark provides a useful tool for the identification of potential issues.
– The initial scoping exercise will set out the parameters and identify where problems and issues are the same and/or where one organisation has already identified solutions.
Hidden objectives and lack of trust will reduce effectiveness.
– The exercise will often have been done for a specific purpose beyond that of pure comparison for improvement. All partners need to be clear about what they hope to get out of the exercise!

Benchmarking: some general observations
Following up initial actions in greater detail can be useful.
– This is where real benefits to individual members are realised: a significant number of improvements to the ONS quality management systems were made as a result of the initial findings.
Some data, financial information for example, can be useful in its own right.
– A benchmarking exercise will often reveal information, not directly related to the initial objectives, that can be used to identify areas for improvement.

Benchmarking: some afterthoughts
Direct benefits usually accrue from follow-up action points. Longer-term benefits include the opportunity for ongoing discussion and communication.
– Quality is an ongoing process.

High quality official statistics: benchmarking as an integral part of a quality management system
End of presentation. Thank you for listening.
David Fenwick