Performance Reporting: Insights from International Practice
Richard Boyle
December 2009
© Copyright IBM Corporation 2009
Overview
- Analysis of Reported Performance Indicators
- Understanding the Breadth of the Data
- Ensuring the Quality of Reported Indicators
- Varying Practices
- Recommendations for Improvement
- Resources and Contact Information

"The purpose of this report is to examine the reporting of outputs and outcomes in four countries": Australia, Canada, Ireland, and the United States. "What types of indicators are actually being reported?"
Analysis of Reported Performance Indicators
How many indicators did each country report on?
- U.S. performance reports focused on a small number of indicators per report.
- Australian performance reports had an average of 100 indicators each.
- Canada and Ireland fell in between.

The U.S. also reported a much higher proportion of quantitative indicators, and fewer qualitative or discrete-event indicators.
Analysis of Reported Performance Indicators (cont.)
What was the focus of the performance reports?
- 80 percent of indicators in U.S. reports focused on outcome performance.
- The other countries examined a mix of program and agency performance; 50 percent of their reported indicators focused on outputs.
Ensuring the Quality of Reported Indicators
Each country's indicators were rated against the SMART criteria to assess quality:
- Specific: the nature and the required level of performance can be identified.
- Measurable: the required performance can be measured.
- Achievable: the required performance associated with the indicator can be accomplished.
- Relevant: the required performance will contribute to the organization's goals.
- Time-bound: there is a deadline or specified time frame.

Varying practices:
- United States: focus was strongly on outcomes, quantitative and quality indicators, and targets and baseline data.
- Australia and Ireland: focus was more on output and activity indicators.
- Canada: falls in between, with a greater focus on outcome and quantitative indicators.
The U.S. tends to use higher-quality indicators in its reports.
Recommendations on Developing a Good System for Reporting on Outputs and Outcomes
Recommendations to improve report preparation:
- Use a consistent, comparable, and structured approach.
- Include a good performance story to accompany the indicators.
- Specify outcome indicators and explain the results against each indicator.

Recommendations to improve report presentation:
- Provide both target and baseline data to guide performance assessment over time.
- Ensure effective use of technology in presenting the performance data collected.
- Present performance information that includes output and activity indicators in addition to outcome indicators when discussing agency performance.
Resources
"Performance Reporting: Insights from International Practice"
For free copies of this report, visit the IBM Center for The Business of Government website.
Author: Richard Boyle, PhD, Head of Research, Institute of Public Administration, Lansdowne Road, Ballsbridge, Dublin 4
Prepared by Consueline Yaba