MEASURING IMPACT
Kristy Muir & Stephen Bennett
ENACTUS, November 2013

WHY MEASURE?
Do your (or other) social enterprises really make a difference? How do you know? How can you prove it? Why bother?
Purpose of social enterprise: balancing commercial strategy with social, environmental or other public benefits.
Other benefits from measuring impact:
- To understand your impact: determine the 'merit, worth or significance' of something (Scriven 1967)
- To determine the difference between model/enterprise/idea failure/success and implementation failure/success
- To inform and improve services/ideas/enterprises

MEASUREMENT THEORIES
There is a range of ways to measure programs/policies/interventions, and there are hierarchies of quality.
Definitions of evaluation:
- "the systematic application of social research procedures for assessing the conceptualization, design, implementation, and utility of... programs" (Rossi and Freeman, 1993)
- 'The systematic collection of information about the activities, characteristics, and outcomes of programs that specific people use to reduce uncertainties, improve effectiveness, and make decisions regarding what those programs are doing and affecting' (York 2000) [our emphasis]
Mullen et al. (2005)

MEASUREMENT THEORIES (cont.)
Types of evaluation (Mullen et al. 2005):
- Outcome evaluations study the effects of the program on participants.
- Process evaluations assess how the program works: establishment and implementation; facilitators of, and barriers to, effective ways of working.
- Formative and summative evaluation. Formative: helps businesses, policy makers and practitioners refine and develop the process through which the program is implemented. Summative: measures the impact of the program.
- Economic evaluation, for example: social return on investment (SROI), cost-benefit analysis, cost-effectiveness analysis (a worked sketch follows below).
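To make the economic-evaluation options concrete, here is a minimal sketch that is not part of the original slides: it computes a discounted benefit-cost/SROI ratio and a cost-per-participant figure. All amounts, outcome values and the discount rate are hypothetical assumptions for illustration; a real SROI would value outcomes with financial proxies agreed with stakeholders.

```python
# Minimal sketch of the economic-evaluation measures listed above.
# All figures, outcome values and the discount rate are hypothetical.

def present_value(cash_flows, discount_rate):
    """Discount a list of yearly values back to today."""
    return sum(v / (1 + discount_rate) ** t for t, v in enumerate(cash_flows, start=1))

# Hypothetical programme: a social enterprise training 50 participants.
investment = 120_000                              # total cost of delivering the programme
yearly_social_value = [60_000, 55_000, 50_000]    # estimated value of outcomes, years 1-3

discounted_value = present_value(yearly_social_value, discount_rate=0.035)

# Cost-benefit analysis and SROI both compare value created to money invested;
# SROI is usually read as "$X of social value per $1 invested".
benefit_cost_ratio = discounted_value / investment
cost_per_participant = investment / 50            # a simple cost-effectiveness style metric

print(f"Discounted social value: ${discounted_value:,.0f}")
print(f"Benefit-cost / SROI ratio: {benefit_cost_ratio:.2f} : 1")
print(f"Cost per participant: ${cost_per_participant:,.0f}")
```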

ETHICS AND MORALS OF EVALUATION
- Is your approach to measuring impact ethical?
- How have you incorporated research ethics – consent and data collection?
- Impartial evaluator
- Operational challenges
- Data availability
- Attribution

CONTEXT
What contexts do you need to consider in measuring your impact? What is the best approach for collecting evidence, given the contextual factors for your organisation?
- Social enterprise models
- Geographic location
- Diverse client demographics
- What data is available?
- Policy/legislative framework
- Social, cultural, political, religious context
- Comparative programs
- Existing evidence of the intervention
(Coatsworth, 2002, reproduced in Lippman 2004)

THEORY TO PRACTICE
Evaluators of complex social programs should:
- Develop a theory (e.g. the marshmallow test and impulsive behaviour / impulse control)
- Develop a hypothesis based on that theory (what might work, for whom, under what circumstances), e.g. if we intervene early to teach children impulse control, we can decrease the likelihood they will end up in gaol; the intervention will involve…
- Conduct research to test the hypothesis (see the sketch below)
- Draw conclusions based on the hypothesis and the findings
- Feed the results back into the development of theory
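The test-the-hypothesis step can be sketched in code. The example below is an assumption of this transcript, not part of the original deck: it compares an outcome rate between a hypothetical intervention group and a matched comparison group using a simple two-proportion z-test.

```python
# Sketch of testing an intervention hypothesis on hypothetical follow-up data.
# H0: the outcome rate is the same with and without the early intervention.
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-proportion z-statistic for the difference in outcome rates."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical data: participants achieving the desired outcome at follow-up.
intervention = (78, 100)   # 78 of 100 children in the programme
comparison   = (61, 100)   # 61 of 100 children in a matched comparison group

z = two_proportion_z(*intervention, *comparison)
print(f"z = {z:.2f}")  # |z| > 1.96 suggests a difference at the 5% level,
                       # which would then feed back into refining the theory
```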

LOGIC FRAME APPROACH
What? A 'program logic' is a systematic, visual representation of the underlying assumptions of a planned program. A results logic illustrates why and how a program is presumed to work. These models present a sequential, interactive account of how inputs will lead to outputs, which will in turn result in the desired outcomes.
Designing a log frame/theory/hypothesis:
1. What is the major outcome anticipated? Describe the problem your social enterprise is attempting to solve. What is your social mission?
2. What other goals or sub-outcomes are anticipated or desired (which will help work towards the major outcome)? What are the near- and long-term desired results?
3. What is the theory behind why the program will work? What broad systems and structures [or programs] are in place to achieve these outcomes? What successful strategies have helped other organisations achieve the outcomes you desire? What other evidence is available?
4. What outputs from these systems and structures might contribute to the outcomes?

A LOGIC MODEL
- Inputs/Resources (what goes in): resources, equipment, knowledge/expertise
- Activities (what happens): what does your organisation/enterprise do? How much activity occurs?
- Outputs (immediate results): for each activity, what are the results?
- Outcomes (short- and long-term results): for example, improved living conditions, increased income
- Impact (effects on root causes, sustained change): for example, change in poverty, change in social norms and attitudes
W.K. Kellogg Foundation (2004), Ebrahim & Rangan (2010)
Logic model evaluation method:
- Map appropriate methods against the program logic (one way of recording this is sketched below)
- What information could or needs to be collected?
- Client segmentation
- What population-level information is available?
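A minimal sketch, using an assumed structure rather than anything from the slides, of how the five logic-model columns could be recorded alongside the measurement method mapped against each stage; all example entries are hypothetical.

```python
# Sketch: recording a program logic and the measurement method mapped to each stage.
# The stages follow the logic-model columns above; the entries are hypothetical.
from dataclasses import dataclass, field

@dataclass
class LogicStage:
    name: str                                            # e.g. "Outputs"
    description: str                                      # what this stage covers
    indicators: list[str] = field(default_factory=list)   # what to collect
    method: str = ""                                       # how the evidence is gathered

program_logic = [
    LogicStage("Inputs/Resources", "Funding, equipment, staff expertise",
               ["budget spent", "staff hours"], "financial and HR records"),
    LogicStage("Activities", "Training workshops delivered",
               ["workshops run"], "activity logs"),
    LogicStage("Outputs", "Participants completing training",
               ["completion numbers"], "enrolment data"),
    LogicStage("Outcomes", "Increased income, improved living conditions",
               ["employment status", "income change"], "participant surveys at 6 and 12 months"),
    LogicStage("Impact", "Change in poverty, social norms and attitudes",
               ["population-level poverty rates"], "census / administrative data"),
]

for stage in program_logic:
    print(f"{stage.name}: measure {', '.join(stage.indicators)} via {stage.method}")
```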

Model [WHAT]
Supports... [WHO is being supported]
By providing... [WHAT AND HOW]
And through... [WHAT OTHER ASSUMPTIONS ARE IMPORTANT RE PROGRAM STRUCTURE]
Which in turn will result in [OUTPUTS]
Lead to [OUTCOMES]
Lead to [MAJOR OUTCOME]

For example…
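The worked example from the original slide is not reproduced in this transcript. As an illustration only, the sketch below fills the chain for a hypothetical social enterprise (a café employing long-term unemployed young people); every entry is an assumption made for this sketch.

```python
# Hypothetical illustration of the chain above (not the example from the original slide).
chain = {
    "Model [WHAT]": "A café run as a social enterprise",
    "Supports [WHO]": "Long-term unemployed young people",
    "By providing [WHAT AND HOW]": "Paid hospitality placements plus mentoring",
    "And through [OTHER ASSUMPTIONS]": "Employer partners willing to hire graduates",
    "Which in turn will result in [OUTPUTS]": "Placements completed, qualifications gained",
    "Lead to [OUTCOMES]": "Sustained employment and increased income",
    "Lead to [MAJOR OUTCOME]": "Reduced youth unemployment in the local area",
}

for step, example in chain.items():
    print(f"{step}: {example}")
```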

WHAT DOES THIS MEAN FOR ME?
What is the business model for your social enterprise?
- Purpose
- Objectives, strategy & goals
- Theory/research
- Finance model
How will you measure the social impact of your social enterprise?

Q&A