Impact Assessment of United Nations System Fellowship Programmes: An Evaluation Framework
Discussion paper prepared for the WHO Department of Human Resources for Health on behalf of the UN Task Force on Impact Assessment of Fellowships, by Arie Rotem with the support of Michael A. Zinovieff

Fellowships in the United Nations System
"A Fellowship in the United Nations System is a specially tailored or selected training activity that provides a monetary grant to a qualified individual for the purpose of fulfilling special learning objectives; such training should be in response to nationally approved human resources policies and plans, and should aim at impact and relevance for all stakeholders involved."

Observations concerning fellowship programs
- Variability in design and implementation
- Large investment of effort and resources
- Insufficient evidence to support accountability
- Need to ascertain contribution beyond the benefit to individuals
- Need to coordinate approaches
- Potential for strengthening the fellowship system

Evaluation assumptions
- Evaluation aims to support decision making
- It is an integral part of all phases of planning and implementation
- Its main purpose is to improve rather than to prove
- Engagement of stakeholders is essential for success

Key Challenges
- Conceptual logic: why do we expect the fellowship program to contribute?
- Attribution: what else may have caused the observed outcomes?
- Fidelity: did we follow the necessary steps?
- Modalities of fellowships: what kind of impact could we reasonably expect?
- Time frame: how long is it likely to take before impact can be observed?

Mapping the pathway
A reliable "map" of the fellowship pathway is required to:
- Monitor progress towards the desired destination
- Manage roadblocks and unintended consequences
- Replicate successes and prevent mistakes

Theory-based evaluation
"Theory or program based approaches map out the channels through which the activities, inputs, and outputs are expected to result in the expected outcomes. It also allows for the identification of unintended effects." (Inder Jit Ruprah, Office of Evaluation and Oversight, IADB)

Logic Pathway and Benefit Chain
INPUT: Policies | Plans | Agreements | Fellows | Funds
ACTIVITY: Selection | Placement | Training | Mentoring | M&E
OUTPUTS: Satisfaction | Gained knowledge and skills | Behaviour change
USAGE: Use of gained skills | Contribution to performance | Bridging gaps
IMPACT: Improved capacity | Improved services | Improved outcomes for clients
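The five-column benefit chain above can be sketched as a simple data structure. The sketch below is illustrative only (it is not part of the paper's framework); the stage and element names follow the slide.

```python
# Illustrative sketch: the benefit chain as an ordered mapping from each
# stage to the elements named on the slide (dicts preserve insertion order).
BENEFIT_CHAIN = {
    "INPUT": ["Policies", "Plans", "Agreements", "Fellows", "Funds"],
    "ACTIVITY": ["Selection", "Placement", "Training", "Mentoring", "M&E"],
    "OUTPUTS": ["Satisfaction", "Gained knowledge and skills", "Behaviour change"],
    "USAGE": ["Use of gained skills", "Contribution to performance", "Bridging gaps"],
    "IMPACT": ["Improved capacity", "Improved services", "Improved outcomes for clients"],
}

def pathway(chain):
    """Render the stages of the chain in causal order."""
    return " -> ".join(chain)

print(pathway(BENEFIT_CHAIN))  # INPUT -> ACTIVITY -> OUTPUTS -> USAGE -> IMPACT
```

Keeping the chain as data rather than prose makes it easy for a monitoring tool to attach indicators and evidence to each stage.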

Contribution analysis
An analysis of activities and interim outcomes that could reasonably be expected to contribute to a positive impact for the beneficiary institution, confirming the logical link among these outcomes and excluding other known influences (conceptual logic).

Reducing uncertainty
- Identifying outside factors, not related to the fellowship, that may have an effect (attribution)
- Ascertaining that the fellowship program has been implemented properly within a logical conceptual framework (fidelity)

Verification of fidelity
- Clear objectives for the fellowship, addressing agreed priorities
- Selection of appropriate fellows and training programs
- Appropriate support during training
- Opportunity to apply knowledge and skills
- Utilisation and support after return home
- Opportunity to contribute and to "echo" learning

Assess the existing evidence on results
- Set indicators and obtain evidence to verify achievement of the desired results at each level (monitoring key milestones)
- Identify weak links between elements of the results chain and obtain further evidence to enable adjustments
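The "weak links" step above amounts to a simple check: any stage of the results chain with no evidence yet attached is flagged for further data collection. The stage names and evidence entries below are hypothetical examples, not findings from the paper.

```python
# Hypothetical sketch: flag stages of the results chain that have no
# supporting evidence yet, so further data collection can target them.
evidence = {
    "INPUT": ["signed fellowship agreements"],
    "ACTIVITY": ["placement records", "course completion reports"],
    "OUTPUTS": ["end-of-course assessments"],
    "USAGE": [],   # no evidence yet that gained skills are applied
    "IMPACT": [],  # no evidence yet of institutional benefit
}

# Weak links: stages with an empty evidence list.
weak_links = [stage for stage, items in evidence.items() if not items]
print(weak_links)  # ['USAGE', 'IMPACT']
```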

World Bank Indicators
- Relevance: consistent with organizational objectives and HRD plans
- Efficiency: value for money and adequate administrative and financial procedures
- Effectiveness: major objectives were achieved
- Sustainability: continuation of benefit once the fellowship investment is completed
- Impact: improved performance of the organization and benefit to the community

Extended Kirkpatrick framework
- Planning, design and implementation: evidence of the appropriateness, relevance, efficiency and sustainability of the program
- Reaction: evidence of satisfaction (what the trainees/fellows thought and felt about the training)
- Learning: evidence of learning (the resulting increase in knowledge or capability, as reflected in end-of-course assessment)

Extended Kirkpatrick measures of training
- Behaviour: evidence of behaviour change or capability improvement, as reflected in job performance
- Results: evidence of contribution to the institution resulting from the fellows' performance
- Mega impact: evidence of the long-term benefit of the improved performance of the institution and the services it provides to specific communities or target groups
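Because the extended Kirkpatrick levels form an ordered sequence, one way to make them operational is to record how far along the chain the available evidence reaches. The enum below is a hypothetical illustration, not part of the framework; the level names follow the slides.

```python
from enum import IntEnum

# Hypothetical sketch: the extended Kirkpatrick levels as an ordered enum,
# so a monitoring tool can compare how deep the available evidence goes.
class EvaluationLevel(IntEnum):
    PLANNING = 0     # appropriateness, relevance, efficiency, sustainability
    REACTION = 1     # fellows' satisfaction with the training
    LEARNING = 2     # gained knowledge or capability
    BEHAVIOUR = 3    # behaviour change reflected in job performance
    RESULTS = 4      # contribution to the institution
    MEGA_IMPACT = 5  # long-term benefit to communities and target groups

def deepest_evidence(levels):
    """Return the deepest level for which evidence exists."""
    return max(levels)

print(deepest_evidence([EvaluationLevel.REACTION, EvaluationLevel.BEHAVIOUR]).name)
# BEHAVIOUR
```

A program whose deepest evidence stops at REACTION or LEARNING has, in the paper's terms, not yet demonstrated contribution to institutional results.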

Evidence of Contribution
- Self-perception of enhanced capacity/contribution
- Others' perceptions of enhanced capacity/contribution (360 degrees)
- Continuing to study and develop professionally
- Recognition by professional bodies of the gained qualification (eligibility for higher studies and duties)
- Change in behaviour/attitude/performance

Evidence of Contribution
- Change of work practices/procedures/ways of doing things associated with learning
- Initiation of a new program or aspects of a program
- Attainment of specific institutional objectives related to added capacity
- Evidence of bridging a performance gap in the institution/program related to added capacity
- Evidence of contribution to the institution's key success factors

Evidence of Contribution
- Utilization of knowledge/skills by the home institution
- Passing on knowledge/skills to others (dissemination)
- Assignment to higher or different duties that reflect use of learning
- Improved career path/progression
- Demonstration of leadership
- Improved prospect of retention due to higher motivation/morale

Use of all plausible evidence
- Use any plausible evidence, regardless of the design, method or source used to obtain it
- Use both qualitative and quantitative evidence (triangulation of findings)
- Use information obtained through current monitoring and evaluation approaches and techniques (less costly and less biased)

Assemble the performance story
Document a performance story, based on the available evidence, showing the attainment of intermediate outcomes (critical milestones) and the logical links among them.

Performance story
The performance story describes the journey from the inception of a fellowship program to the attainment of its immediate and long-term goals. The important events and experiences along the way are identified as milestones that can be monitored to ascertain that we are moving in the right direction and, ultimately, that we have reached our destination.

Performance story
The performance story should enable 'reasonable' observers to determine whether it is plausible that particular interventions led to certain results.

"Plausible association"
"Plausible association" implies that reasonable people, with adequate information concerning what has occurred or is occurring in the programme, agree that the programme contributed or is likely to contribute to the desired outcomes.

Seek out additional evidence
Where an alternative explanation cannot be discounted, or the program cannot be shown to be a more likely contributor, the program logic should be reviewed and/or additional data gathered and evaluated. Where this cannot be done, further evaluation is required, or it should be recognised that the program is not the key contributor to the outcomes.

Work in progress
The approach is well suited to development programs, where data are likely to reflect 'progress toward results' rather than a definitive statement of final outcomes.

A credible performance story includes:
- The program context (including the results chain)
- Planned and actual accomplishments
- Lessons learnt
- The approach for assuring the quality of information
- The main alternative explanations for the outcomes occurring, including reasons why they had limited or no influence
(Mayne, 2003)

Engagement of stakeholders
- Requires collaboration of stakeholders
- Provides better contextual understanding and engenders ownership
- Partnership and collective responsibility for success

Next Steps
- Construct the benefit chain for selected programs
- Review existing evidence in relation to milestones
- Identify weak links
- Seek additional evidence
- Construct the performance story
- Confirm the conceptual logic and fidelity of programs

Next steps
- Assess the "plausibility" of contribution to impact
- Derive lessons and strengthen fellowship programs
- Disseminate findings and establish benchmarks and guidelines
- Engage stakeholders in all steps (particularly fellowship authorities, training institutions, beneficiary institutions, and fellows)

Recommendation
Recommend to the 17th Meeting of Senior Fellowship Officers to allocate existing resources and/or seek additional support for the conduct of pilot studies in selected countries and/or sectors, to test the appropriateness of the contribution analysis approach in the context of impact evaluation of UN system fellowship programmes.