Outputs, outcomes and impacts: Using Theory of Change to plan, develop and evaluate your initiative

Presentation transcript:

Slide 1: Outputs, outcomes and impacts. Using Theory of Change to plan, develop and evaluate your initiative

Slide 2: Aims of the session
- To make explicit the link between vision, planning and evaluation
- To introduce you to Theory of Change as a tool for understanding, developing and evaluating your initiative
- To help you prepare for your team planning time this afternoon

Slide 3: Introduction to Theory of Change
Theory of Change is:
- an outcomes-based, participatory method for planning, evaluation and organisational capacity-building; it defines all the building blocks required to bring about a given long-term goal
- a development of programme theory and logic models, adapted to complex change where both the path and the destination are evolving (Gamble 2008)
- useful as a bridging tool that creates provisional stability, linking planning and evaluation (Saunders et al. 2005)

Slide 4: Purpose of Theory of Change
- Plan: formulate your Theory of Change
- Evaluate: test your Theory of Change
- Improve: develop your initiative

Slide 5: Developing your Theory of Change
1. Identify your long-term goal.
2. Map backwards to identify the intermediate outcomes (or preconditions) necessary to achieve that goal.
3. Identify the activities that your initiative will undertake to create these outcomes.
At the same time, consider contextual factors, assumptions, resources needed and so on.
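One way to keep the elements of this backwards mapping together is to record the goal, each intermediate outcome, and the activities, assumptions and indicators attached to it in a single structure. The sketch below does this with Python dataclasses; the class names, field names and example entries are illustrative only, not a prescribed Theory of Change schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Outcome:
    """An intermediate outcome (precondition) on the path to the long-term goal."""
    description: str
    activities: List[str] = field(default_factory=list)   # what the initiative will do to create this outcome
    assumptions: List[str] = field(default_factory=list)  # why we expect the activities to work
    indicators: List[str] = field(default_factory=list)   # how we would test whether it happened

@dataclass
class TheoryOfChange:
    long_term_goal: str
    outcomes: List[Outcome] = field(default_factory=list)  # listed in backwards-mapping order

# Illustrative (hypothetical) initiative
toc = TheoryOfChange(
    long_term_goal="Students engage more actively with assessment feedback",
    outcomes=[
        Outcome(
            description="Staff redesign feedback practice in pilot modules",
            activities=["Run feedback-design workshops", "Share exemplars across departments"],
            assumptions=["Staff have time and incentive to take part"],
            indicators=["Number of redesigned modules", "Workshop attendance"],
        ),
    ],
)
print(toc.long_term_goal)
```

Writing the assumptions and indicators alongside each outcome makes it easier to check the "plausible" and "testable" criteria introduced on the next slide.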

Slide 6: Developing your Theory of Change
A good Theory of Change should be:
- Plausible: the logic of the theory is credible
- Doable: achievable with the resources (and time) available
- Meaningful: stakeholders see the goals as important and worthwhile
- Testable: there are credible ways of discovering whether the predicted results occur (and how/why)

Slide 7: Activity
In teams, develop a simple Theory of Change for your initiative using the template provided (40 mins). Consider:
- Is your goal meaningful?
- Does the logic of your theory make sense?
- What assumptions are you making?
- What resources will you need?
- How will you test your theory?

Slide 8: Testing your Theory of Change
- Outputs (sphere of control): 'simple' short-term evaluation; mainly quantitative data/methods
- Outcomes (sphere of influence): 'complicated' medium-term evaluation; mainly qualitative data/methods (how and why)
- Impacts (sphere of interest): 'complex' long-term evaluation; quantitative and qualitative data, synthetic methods, multiple perspectives

Slide 9: Sources of data
1. Data from evaluation activities: surveys, focus groups, interviews, case studies, etc. Time-consuming to collect and limited in scope.
2. Data generated by the initiative itself: planning documents, decision logs, network maps, meeting notes, emails, etc. Freely available but of limited worth externally.
3. Naturally occurring institutional data: NSS results, module feedback, retention/achievement data, etc. Valued as impact data but difficult to show causality.

Slide 10: Developing your evaluation strategy
1. What aspects of your initiative will you evaluate? Not everything can or should be evaluated: what is both important and testable?
2. Why are you evaluating? To meet performance indicators? To show impact? To understand, inform and develop your initiative?
3. Who are you evaluating for? Senior managers? Funders? Staff? Students? The sector? Yourselves?
4. How will you evaluate? What sources of data will you use? What methods will you use to collect and analyse the data?
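These four questions can also serve as the skeleton of a written evaluation plan. The snippet below sketches such a plan as a plain Python dictionary; the keys mirror the questions and the values are placeholder examples rather than requirements.

```python
# Hypothetical evaluation plan keyed by the four questions above;
# entries are placeholders, not requirements.
evaluation_plan = {
    "what": [
        "Uptake of redesigned feedback practice in pilot modules",
    ],
    "why": [
        "Understand, inform and develop the initiative",
        "Show impact to funders",
    ],
    "who_for": ["Project team", "Senior managers", "Funders"],
    "how": {
        "sources": ["Evaluation surveys and interviews", "Meeting notes and decision logs", "Module feedback"],
        "methods": ["Thematic analysis of qualitative data", "Descriptive statistics on institutional data"],
    },
}

for question, answer in evaluation_plan.items():
    print(question, "->", answer)
```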

Slide 11: Summary and next steps
- Theory of Change makes explicit what is assumed or tacitly understood about your initiative.
- Theory of Change is a planning tool that helps you evaluate and an evaluation tool that helps you plan.
- Your Theory of Change is just that, a theory: it should be revisited and revised throughout your initiative.
- Use the team planning time this afternoon to develop your ToC further and to consider your evaluation strategy (purposes, audiences, data and methods).

Slide 12: References and further reading
Dozois, E., Langlois, M. & Blanchet-Cohen, N. (2010) DE 201: A practitioner's guide to developmental evaluation. J.W. McConnell Family Foundation.
Gamble, J. (2008) A developmental evaluation primer. J.W. McConnell Family Foundation.
Rogers, P. J. (2008) Using programme theory for complicated and complex programmes. Evaluation, 14(1).
Saunders, M., Charlier, B. & Bonamy, J. (2005) Using evaluation to create provisional stabilities: Bridging innovation in higher education change processes. Evaluation, 11(1).
Theory of Change Community.