Presentation transcript:

© 2002, CARE USA. All rights reserved.
Applying Quality Standards in Impact Evaluation: Case of CARE Program Quality Framework and Evaluation Policy
Ahmed Ag Aboubacrine, Josephine Kainessie, Bockarie Sesay, Dr. Moses Lahai, Patrick Robin
DME Unit – CARE Sierra Leone
5th AFREA/NONIE/3IE Conference – Cairo, 31st March – 2nd April 2009

CARE International Program Quality Framework

EVALUATION POLICY

CARE International Principles
- Relevance (focus on what is important)
- Participation (of community representatives)
- Focused on impact on the lives of people (significance)
- Credibility (objective and reliable methods)
- Integrity (ethical standards)
- Transparency (willingness to share findings)
- Independence (of evaluators)

DAC Principles
- Purpose of evaluation
- Impartiality & independence
- Credibility
- Usefulness
- Participation of donors and recipients
- Donor co-operation
- Evaluation programming
- Design and implementation of evaluations
- Reporting, dissemination and feedback

AFREA Guidelines
- Utility
- Feasibility
- Propriety
- Accuracy
- Evaluation accountability

EVALUATION POLICY

CARE Evaluation Policy Lines
1. Responsibility of COs
2. Consistent with CI Principles (3 & 6) and Standards (10)
3. Test the relationship with CI’s Vision and Mission and the MDGs
4. Analysis of the degree and consequences of implementation of the CI PQF (SP, UF)
5. Follow professional inter-agency standards (“speak a common language”)
6. Significant participation and high level of influence of participants and stakeholders
7. Evaluation completeness
8. Conducted openly and in a transparent manner
9. Follow-up and accountability
10. Evaluation is a priority → CB + Rigor + Use
11. Generating the resources required for the EP

DAC Standards
1. Rationale, purpose and objectives of an evaluation
2. Evaluation scope
3. Context
4. Evaluation methodology
5. Information sources
6. Independence
7. Evaluation ethics
8. Quality assurance
9. Relevance of the evaluation results
10. Completeness

And …? Other evaluation standards and guidelines (DFID, Sphere, etc.)

Lessons Learnt in Practice
- The Evaluation Policy as the main guide when designing both the intervention and its evaluation (ToRs)
- Operationalize the policy requirements in the evaluation design (in the technical offer) – use a checklist; a sketch of this idea follows below
- Adherence to the standards (by staff, consultants, donors)
- Mixed methods (no single method!) applied by separate experts working as a team (a quantitative study followed by an in-depth qualitative assessment)
- Impact measurement vs. the participation principle
- Evidence vs. ownership / sustainability
- Seeking impact vs. inventing impact
- Independence (internal / external)?
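To make the “use a checklist” bullet concrete, here is a minimal, purely illustrative Python sketch of how the DAC quality standards listed on the previous slide could be turned into a yes/no review grid run against a draft ToR or technical offer. The function name, scoring logic and example answers are assumptions for illustration, not CARE’s actual tool.

# Hypothetical checklist sketch: score a draft evaluation ToR against the
# DAC quality standards from the previous slide. Illustrative only.

DAC_STANDARDS = [
    "Rationale, purpose and objectives",
    "Evaluation scope",
    "Context",
    "Evaluation methodology",
    "Information sources",
    "Independence",
    "Evaluation ethics",
    "Quality assurance",
    "Relevance of the evaluation results",
    "Completeness",
]

def review_tor(answers: dict[str, bool]) -> list[str]:
    """Return the standards that the draft ToR does not yet address."""
    return [std for std in DAC_STANDARDS if not answers.get(std, False)]

# Example use: a reviewer marks each standard as addressed (True) or not (False).
draft_answers = {std: True for std in DAC_STANDARDS}
draft_answers["Independence"] = False       # e.g. evaluator reports to the project manager
draft_answers["Quality assurance"] = False  # e.g. no peer-review step planned

for gap in review_tor(draft_answers):
    print(f"Gap: draft ToR does not yet address '{gap}'")

Running the sketch would list “Independence” and “Quality assurance” as the gaps to resolve before the ToR is finalized.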

Influencing Factors
- Capacity constraints
- Human factors (agendas, skills, competencies, etc.)
- Data collection and analysis methods
- Analysis of priorities (felt / normative / relative needs)
- Analysis of impact (measured vs. perceived)
- Ad hoc external evaluation vs. action research throughout the lifetime of the intervention

Rethinking the Standards – New Challenges
- Evolution of thinking on impact evaluation (IE)
- Strategic Impact Inquiries
- Project-to-program shift (P2P)
- Choosing appropriate impact measurement methods
- Review of the Program Quality Framework?

Use of Evaluation Standards in a Post-Conflict Context

Opportunities
- Alignment is still possible (evaluation policies, the Paris Declaration, the Accra Agenda for Action on aid effectiveness)
- Emerging trend of evaluation and accountability among aid agencies

Constraints
- Contextual limits to evaluation
- Utilization of evaluation to influence decision makers
- Capacity development
- Persistence of an emergency culture (dependency)
- Lack of process orientation

“Do” and “Don’t” in Using Evaluation Standards

DO
- Question your design through the lens of your evaluation policies (EPs)
- Select what would be mandatory
- Contextualize (set a level of compliance for each principle / standard; see the sketch below)
- Promote attitudes (thinking evaluatively)
- Work with qualified academic / research people and/or institutions
- Allocate resources and time

DON’T
- Think that everything is feasible
- Wait for the evaluator to apply the EP
- Say “Oh!... that’s the job of the M&E Officer”
- Think that your evaluation should always be perfect (there are always limits!)
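Following the “contextualize” item above, here is a minimal sketch, assuming hypothetical compliance levels and reusing principle names from the earlier Evaluation Policy slide, of how a country office might record the agreed level of compliance per principle and flag only the mandatory gaps in a proposed design.

# Hypothetical sketch: record an agreed compliance level per evaluation principle
# and flag only the mandatory gaps. Illustrative, not an actual CARE tool.

from enum import Enum

class Level(Enum):
    MANDATORY = "mandatory"            # must be met in every evaluation
    CONTEXTUALIZED = "contextualized"  # met to the extent the context allows
    ASPIRATIONAL = "aspirational"      # desirable, but not required this time

# Levels agreed for one (hypothetical) post-conflict evaluation.
compliance_plan = {
    "Credibility": Level.MANDATORY,
    "Integrity": Level.MANDATORY,
    "Transparency": Level.MANDATORY,
    "Relevance": Level.MANDATORY,
    "Participation": Level.CONTEXTUALIZED,
    "Independence": Level.CONTEXTUALIZED,
}

# What the proposed evaluation design actually covers (hypothetical self-assessment).
design_meets = {"Credibility", "Transparency", "Relevance", "Participation"}

mandatory_gaps = [
    principle
    for principle, level in compliance_plan.items()
    if level is Level.MANDATORY and principle not in design_meets
]
print("Mandatory principles not yet met:", mandatory_gaps)
# -> Mandatory principles not yet met: ['Integrity']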

For more resources, visit:
THANKS! Questions?