Slide 1

Strong Evaluation Designs for Programs with Unexpected Consequences

Jonathan A. Morell, Ph.D.
Director of Evaluation, Fulcrum Corporation
jamorell@jamorell.com
http://evaluationuncertainty.com
(734) 646-8622

Presented to the United Nations Development Programme, February 20th, 2014

© 2012 Jonathan Morell
Slide 2: The Essence of the Problem

Complex system behavior drives unexpected outcomes:
- Network effects
- Power law distributions
- Ignoring bifurcation points (a toy illustration follows this slide)
- State changes and phase shifts
- Uncertain and evolving environments
- Feedback loops with different latencies
- Self-organization and emergent behavior
- Ignoring the full range of stable and unstable conditions in a system
- Etc.

The guaranteed evaluation solution:
- Post-test only
- Treatment group only
- Unstructured data collection

But we lose many evaluation tools:
- Time series data
- Comparison groups
- Specially developed surveys and interview protocols
- Qualitative and quantitative data collection at specific times in a project's life cycle
- Etc.

Why the loss? Because establishing evaluation mechanisms requires:
- Time
- Effort
- Money
- Negotiations with program participants, stakeholders, and other parties
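Why do bifurcation points matter for evaluation? A minimal sketch, not from the deck: the logistic map is a textbook toy model in which a small change to one parameter (r) flips long-run behavior from a steady state to oscillation to chaos. A post-test-only design records a single snapshot and cannot distinguish these regimes.

```python
# Toy model of a bifurcation point: the logistic map x -> r * x * (1 - x).
# Below r = 3 the system settles to one stable value; past it, the long-run
# behavior splits into cycles and eventually turns chaotic. A small change
# in r produces a qualitatively different outcome.

def long_run_values(r, x0=0.5, warmup=500, keep=8):
    """Iterate past transients; return the distinct values the map settles into."""
    x = x0
    for _ in range(warmup):
        x = r * x * (1 - x)
    seen = set()
    for _ in range(keep):
        x = r * x * (1 - x)
        seen.add(round(x, 4))
    return sorted(seen)

for r in (2.8, 3.2, 3.5, 3.9):
    print(f"r = {r}: settles to {long_run_values(r)}")
# r = 2.8 -> one fixed point; r = 3.2 -> a 2-cycle; r = 3.5 -> a 4-cycle;
# r = 3.9 -> no settling at all (chaotic regime).
```

The evaluation moral is the one the slide draws: only repeated measurement over time can reveal which regime a system is in.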
Slide 3: Some Examples of the Kinds of Problems We May Run Into

Program: Free and reduced fees for post-natal services
- Outcomes the evaluation is looking for (survey/interview): health indicators for mother and child; child development indicators
- Possible unexpected outcomes: drug and supply hoarding; new sets of informal fees; lower than expected use of the service
- Evaluation design weakness: no interview or observation to estimate the amount of fees; no way to correlate fees with attendance or client characteristics

Program: Improve agricultural yield
- Outcomes the evaluation is looking for (records, interviews, observations): yield; new system cost; profit
- Possible unexpected outcomes: perverse effects of increased wealth disparities
- Evaluation design weakness: no other communities to check on other reasons for disparity; no interviews to check on the consequences of disparities

Program: Improve access to primary education
- Outcomes the evaluation is looking for (records, surveys): attendance; graduation; life trajectory
- Possible unexpected outcomes: interaction with other civil society development projects; networking effects of connections
- Evaluation design weakness: no census of other civil society projects; no data on interaction among projects; no data on the consequences of interaction
Slide 4: Adding "Surprise" to Evaluation Planning

- Funding
- Deadlines
- Logic models
- Measurement
- Program theory
- Research design
- Information use plans
- Defining the role of the evaluator
- Logistics of implementation
- Planning to anticipate and respond to surprise
Slide 5: Overall Summary: Methods
Slide 6

Let's look at this one.
Slide 7: Example: Improve Access to Primary Education

- Outcomes evaluated for (records, surveys): attendance; graduation; life trajectory
- Possible unexpected outcomes: interaction with other civil society development projects; networking effects of connections
- Evaluation design weakness: no census of other civil society projects; no data on interaction among projects; no data on the consequences of interaction

A relevant theory: we know about phase shifts when network connections increase.

Evaluation redesign:
- Identify other civil society programs
- Measure connections
- Ignore the details of which programs are connected
- Collect data frequently to detect the timing of change (a toy simulation follows this slide)
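The theory the redesign leans on is a standard result about random networks: once the average number of connections per node passes roughly one, a single giant connected cluster appears abruptly rather than gradually. Below is a minimal sketch in plain Python (invented numbers, no network library) of why frequent measurement waves matter: only closely spaced waves can catch when the jump happens.

```python
import random

# Simulate "measuring connections" among n civil society projects in waves.
# The largest connected cluster stays small until the average degree passes
# ~1, then jumps -- a phase shift an annual survey could easily miss.

def largest_component(n, edges):
    """Size of the largest connected component, via union-find."""
    parent = list(range(n))
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path compression
            a = parent[a]
        return a
    for a, b in edges:
        parent[find(a)] = find(b)
    sizes = {}
    for v in range(n):
        root = find(v)
        sizes[root] = sizes.get(root, 0) + 1
    return max(sizes.values())

random.seed(1)
n = 1000
edges = []
for wave in range(1, 13):  # each wave = one data-collection round
    edges += [(random.randrange(n), random.randrange(n)) for _ in range(100)]
    avg_degree = 2 * len(edges) / n
    print(f"wave {wave:2d}: avg degree {avg_degree:.1f}, "
          f"largest cluster {largest_component(n, edges)} of {n}")
```

Note how the sketch mirrors the slide's design choice: it tracks only cluster size, ignoring which specific projects are connected.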
Slide 8

Let's look at this one.
Slide 9: Example: Agricultural Yield

- Outcomes evaluated for (records, interviews, observations): yield; new system cost; profit
- Possible unexpected outcomes: perverse effects of increased wealth disparities
- Evaluation design weakness: no other communities to check on other reasons for disparity; no interviews to check on the consequences of disparities

Evaluation methodology: expand monitoring outside the borders of the agriculture program.

Evaluation redesign:
- Adopt a "whole community" perspective
- Identify a wide range of social indicators (one candidate indicator is sketched after this slide)
- Identify a diverse set of key informants
- Conduct regular open-ended interviewing
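"A wide range of social indicators" can be made concrete. As a hypothetical example, not specified in the deck: repeated household surveys could track a Gini coefficient, so that growing wealth disparity shows up as a trend during the program rather than as a post-hoc surprise. All income figures below are invented.

```python
# Track wealth disparity across survey waves with a Gini coefficient
# (0 = perfect equality; values near 1 = extreme concentration).

def gini(incomes):
    """Gini coefficient from sorted incomes: G = 2*sum(i*x_i)/(n*sum(x)) - (n+1)/n."""
    xs = sorted(incomes)
    n, total = len(xs), sum(xs)
    if total == 0:
        return 0.0
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * weighted) / (n * total) - (n + 1) / n

# Invented household incomes: before the program, and after a few
# early adopters of the new agricultural system pull ahead.
before = [90, 100, 100, 110, 120, 95, 105]
after = [90, 100, 100, 110, 300, 95, 320]
print(f"Gini before: {gini(before):.2f}, Gini after: {gini(after):.2f}")
```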
Slide 10: Agile Evaluation

How can an evaluation be designed to change?

Let's look at this one.
Slide 11: Example: Free / Reduced Fees for Post-Natal Services

- Outcomes evaluated for (survey/interview): health indicators for mother and child; child development indicators
- Possible unexpected outcomes: drug and supply hoarding; new sets of informal fees; lower than expected use of the service
- Evaluation design weakness: no interview or observation to estimate the amount of fees; no way to correlate fees with attendance or client characteristics

Add a process component to the evaluation design:
- Survey of mothers to assess the total cost of service (a toy analysis follows this slide)
- Open-ended interviews with clinic staff about the consequences of the new system for their work lives

Nice to say, but agile evaluation can be expensive. Do we want both tactics, or only one? These are the kinds of questions that have to be added to all the other decisions we make when designing an evaluation.
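The design weakness was "no way to correlate fees with attendance or client characteristics." Here is a hypothetical sketch of the repaired analysis, with invented survey numbers: once the process component collects mothers' reported informal fees, relating fees to visit counts is a small computation.

```python
import math

# Relate reported informal fees (from the mothers' survey) to how often
# each mother actually used the post-natal service. A strong negative
# correlation would flag informal fees as a driver of low attendance.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

reported_fees = [0, 5, 2, 8, 0, 10, 3, 7]  # invented: informal fee per visit
visits = [6, 4, 5, 2, 6, 1, 5, 3]          # invented: visits in year one
print(f"fee/attendance correlation: r = {pearson(reported_fees, visits):.2f}")
```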
Slide 12: What Are the Practical and Political Reasons for Surprise?

- Any single organization has limited money, political capital, human capital, authority, and power
- Narrow windows of opportunity
- Competition requires bold claims
- Resource owners have parochial interests
- Design expertise is limited
- Collaboration across agency boundaries is very difficult
- Short-term success is rewarded
- Partial solutions can accrue to major success over time
- Pursuing limited success with limited resources is justifiable

The result: narrow programs, simple program theories, and a small set of outcomes. Planners may know better, but they are doing the best job they can. Evaluators have to follow.

© 2010 Guilford Publications
Slide 13

Where do the surprises in the cases fall in the life cycle scenario?

© 2010 Guilford Publications