Impact Measurement and You: Understanding Impact Measurement, Practical Tools, Key Questions

Presentation transcript:

Impact Measurement and You: Understanding Impact Measurement, Practical Tools, Key Questions

PI’s Paper on IM
Locates IM within the Programme Approach; impact is therefore measured at programme level. It focuses on three key elements of impact measurement:
– Are we achieving the high-level goals we set out to achieve? [tracking change]
– Is our theory of change holding true? [testing the ToC]
– What is changing in the context that may influence the above? [on-going contextual analysis]
The paper seeks to provide a mix of the conceptual and the practical, but errs on the side of the conceptual.

Some Key Principles
– IM is reliant on (and can help build) a strong culture and processes for learning and knowledge management.
– Prior to work on IM, COs need to have a clear vision and understanding of the Theory of Change for their long-term programmes.
– IM focuses on contribution analysis at higher levels, drawn from strong M&E at lower levels.
– IM requires strong internal knowledge, but also partnership with researchers, think tanks, collegial organisations, etc.
– Emphasis has been placed on quantitative data and analysis, but we need to build more support for qualitative work as well.

Elements of IM: a bit more detail on the key focus areas

Tracking Change
– Understanding, and then quantifying, the changes expressed in the Domains of Change and Impact Goal of each programme.
– High-level changes, which should show contribution to national and international standard indicators, e.g. the MDI plus list.
– This is a process of contribution analysis: seeking to understand our (CARE + Partners) contribution to population-level changes.
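As a purely illustrative sketch, and not part of PI’s paper or any existing CARE system, the Python snippet below shows one minimal way a CO might record programme-level change against a standard indicator list such as an "MDI plus"-style set. The class name, field names, indicator names and figures are all hypothetical.

# Illustrative sketch only: recording programme-level change against
# standard indicators. All names and numbers are hypothetical examples.
from dataclasses import dataclass

@dataclass
class IndicatorRecord:
    indicator: str   # standard indicator the programme reports against
    baseline: float  # population-level value at programme start
    latest: float    # most recent measured value
    source: str      # where the measurement comes from (survey, national data, ...)

    def change(self) -> float:
        # Absolute change since baseline. The CARE + Partners share of this
        # change still has to be argued through contribution analysis.
        return self.latest - self.baseline

records = [
    IndicatorRecord("% of target households food secure", 42.0, 51.0, "household survey"),
    IndicatorRecord("% of women reporting control over household income", 30.0, 38.0, "panel study"),
]
for r in records:
    print(f"{r.indicator}: {r.change():+.1f} points (source: {r.source})")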

Testing the Theory of Change
– The programme ToC is a set of hypotheses which, if they hold true, should give rise to the kinds of social change articulated in the Domains of Change and Impact Goal.
– Our hypotheses are based on evidence, but also on intuition. Some are foundational and, if they haven’t already been “proven”, will need to be; otherwise the ToC is a house of cards.
– Hypotheses for testing are based on critical assumptions which, if they do not hold true, threaten the overall ToC. They inform programming strategy rather than being part of the strategy.
– There are many ways to test key hypotheses: through new or existing initiatives; through targeted research; through literature reviews; etc.
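As an illustrative sketch only, the snippet below shows one way a programme team might keep its ToC hypotheses and their critical assumptions explicit and reviewable. The Hypothesis structure, its fields and the example hypothesis are assumptions made for illustration, not CARE terminology.

# Illustrative sketch only: a simple register of ToC hypotheses.
from dataclasses import dataclass, field

@dataclass
class Hypothesis:
    statement: str            # the "if X then Y" claim underpinning the ToC
    critical: bool            # would the ToC collapse if this fails?
    evidence_basis: str       # evidence, intuition, or a mix
    test_methods: list = field(default_factory=list)  # research, literature review, pilot...
    status: str = "untested"  # untested / supported / not supported

toc = [
    Hypothesis(
        statement="If women join savings groups, their self-efficacy increases",
        critical=True,
        evidence_basis="partly evidence, partly intuition",
        test_methods=["targeted research", "literature review"],
    ),
]

# Flag the 'house of cards' risk: critical hypotheses that remain untested.
for h in toc:
    if h.critical and h.status == "untested":
        print("Test before relying on it:", h.statement)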

On-Going Context Analysis
– The Theory of Change is located within evolving contexts, which must be reviewed on an on-going basis and integrated into the ToC and discussions about change.
– This is an opportunity to integrate the Programme Approach, IM and EPP processes.
– Timing will vary across COs: Egypt’s rapidly changing context, for example, will require closer monitoring and meaning-making than some other countries; the same is true for “emergency-prone” COs.

Knowledge Systems and Sub-Systems (with thanks to Tom Barton and ECARMU)
The importance of “holding systems”

Sub-Systems from Tom Barton
The Country Office Knowledge System comprises:
– Design, Monitoring & Evaluation (DM&E) sub-system
– Impact Measurement & Tracking (IM) sub-system
– Knowledge Management sub-system
– Learning sub-system
– Portfolio Coordination & Management sub-system

DM&E System
– Design: aims at the formulation (and on-going modification) of useful, significant efforts toward the alleviation of poverty and social injustice and the enhancement of human dignity.
– M&E: the coordinated set of interlinked activities for gathering and analysing information, reporting, and supporting decision-making and the implementation of improvements.
– Monitoring: the regular collection (plus analysis and use) of information about progress within the programme and its projects.
– Evaluation: periodic reviews of, and reflective practice on, information from within, as well as about, programmes and projects and their performance.
– Reflective practice: the process of challenging ourselves, based on the information we get, by asking key questions such as why, so what, and now what.
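A minimal sketch, assuming a simple log of monitoring findings, of how the reflective-practice questions above might be applied to each finding. The log entry and field names are hypothetical placeholders; the questions themselves come from the definition above.

# Illustrative sketch only: applying why / so what / now what to monitoring data.
from datetime import date

monitoring_log = [
    {"date": date(2015, 3, 1),
     "project": "VSLA pilot",
     "observation": "Attendance at savings groups dropped after harvest"},
]

REFLECTIVE_QUESTIONS = ("Why?", "So what?", "Now what?")

def reflect(entry: dict) -> None:
    # Prompt the reflective-practice questions against one monitoring finding.
    print(f"{entry['date']} | {entry['project']}: {entry['observation']}")
    for question in REFLECTIVE_QUESTIONS:
        print("  ", question)

for entry in monitoring_log:
    reflect(entry)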

Knowledge Management System
– Purpose: turning information into knowledge and evidence; making knowledge accessible.
– Many COs have very weak and dispersed knowledge management systems. Some innovative ideas: SharePoint sites, M&E and Learning Units, etc.

Learning System
Purpose: to support on-going improvements in personal and organisational practice; to track and reassess what’s “right”.
Learning Systems are based on:
– Reflective Practice
– Communication
– Application
Learning happens by doing, and by reading and engaging others.

Synthesis
– COs generate data and knowledge through monitoring systems.
– This knowledge (and data) needs to be aligned to the relevant CO programme, and may form part of a learning agenda or a process toward exploring hypotheses.
– Learning systems need to support the interpretation of data and the use of knowledge, at both initiative and programme levels.
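A minimal sketch, assuming a hypothetical programme and learning agenda, of how initiative-level monitoring findings might be aligned to the programme-level questions they speak to. The programme name, question and finding are placeholders, not CARE data.

# Illustrative sketch only: aligning monitoring findings to a programme's
# learning agenda so interpretation happens at programme level.
programme = {
    "name": "Women's Empowerment programme",
    "learning_agenda": ["Does VSLA membership build self-efficacy?"],
}

monitoring_findings = [
    {"initiative": "VSLA pilot",
     "finding": "Members report greater confidence in household decisions",
     "relates_to": "Does VSLA membership build self-efficacy?"},
]

# Group evidence under the learning-agenda question it speaks to.
for question in programme["learning_agenda"]:
    print(question)
    for f in monitoring_findings:
        if f["relates_to"] == question:
            print("  -", f["initiative"], ":", f["finding"])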

Thoughts to Ponder

General Questions
– How best to create dialogue about IM and supporting systems?
– What support is there for IM and supporting systems?
– How is the link between IM and PPLA’s current work on knowledge management articulated?
– In times of tight budgets, which are the most important processes/elements?
– Where can a repository of good practice be housed?

WE-IM Specific Emerging Questions
– WE: a means to an end? An end in itself? And is there space/appetite for this discussion?
– Common hypotheses to test in multiple contexts, e.g. the contribution of WE to other outcomes? The role of VSLAs in building self-efficacy? Others?
– Experimental pilots for measuring WE at impact level (Pathways, perhaps…?)

Other thoughts, questions, ideas, missing points???