
Strong Evaluation Designs for Programs with Unexpected Consequences
Jonathan A. Morell, Ph.D., Director of Evaluation – Fulcrum Corporation (734)
Presented to the United Nations Development Programme, February 20th, 2014
© 2012 Jonathan Morell

The Essence of the Problem

Complex system behavior drives unexpected outcomes:
 Network effects
 Power law distributions
 Ignoring bifurcation points (illustrated in the sketch below)
 State changes and phase shifts
 Uncertain and evolving environments
 Feedback loops with different latencies
 Self-organization and emergent behavior
 Ignoring the full range of stable and unstable conditions in a system
 Etc.

A guaranteed evaluation solution:
 Post-test only
 Treatment group only
 Unstructured data collection

But we lose many evaluation tools:
 Time series data
 Comparison groups
 Specially developed surveys and interview protocols
 Qualitative and quantitative data collection at specific times in a project’s life cycle
 Etc.

Why the loss? Because establishing evaluation mechanisms requires:
 Time
 Effort
 Money
 Negotiations with program participants, stakeholders, and other parties
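To make “bifurcation points” and “state changes” concrete, here is a minimal illustrative sketch. It is my illustration, not part of the original slides; the logistic map is simply the standard textbook example of a system whose long-run behavior changes qualitatively as one parameter crosses bifurcation points.

```python
# Illustrative sketch (not from the slides): the logistic map, a one-line
# rule whose settled behavior changes qualitatively as the growth
# parameter r crosses bifurcation points. Programs embedded in complex
# systems can behave the same way: one mechanism, different regimes.

def logistic_trajectory(r, x0=0.5, warmup=500, keep=8):
    """Iterate x -> r * x * (1 - x); return the long-run values visited."""
    x = x0
    for _ in range(warmup):      # discard the transient
        x = r * x * (1 - x)
    attractor = []
    for _ in range(keep):        # sample the settled behavior
        x = r * x * (1 - x)
        attractor.append(round(x, 4))
    return attractor

for r in (2.8, 3.2, 3.5, 3.9):
    print(f"r={r}: {logistic_trajectory(r)}")
# r=2.8 settles to one value; r=3.2 oscillates between two values;
# r=3.5 cycles among four; r=3.9 never repeats (chaos). Extrapolating
# from one side of a bifurcation point fails on the other side.
```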

Some Examples of the Kinds of Problems We May Run Into

Program: Free and reduced fees for post-natal services
 Outcomes the evaluation is looking for (survey/interview): health indicators for mother and child; child development indicators
 Possible unexpected outcomes: drug and supply hoarding; new sets of informal fees; lower than expected use of service
 Evaluation design weaknesses: no interview or observation to estimate the amount of fees; no way to correlate fees with attendance or client characteristics

Program: Improve agricultural yield
 Outcomes the evaluation is looking for (records, interviews, observations): yield; new system cost; profit
 Possible unexpected outcomes: perverse effects of increased wealth disparities
 Evaluation design weaknesses: no other communities to check on other reasons for disparity; no interviews to check on the consequences of disparities

Program: Improve access to primary education
 Outcomes the evaluation is looking for (records, surveys): attendance; graduation; life trajectory
 Possible unexpected outcomes: interaction with other civil society development projects; networking effects of connections
 Evaluation design weaknesses: no census of other civil society projects; no data on interaction among projects; no data on consequences of interaction

Adding “Surprise” to Evaluation Planning

 Funding
 Deadlines
 Logic models
 Measurement
 Program theory
 Research design
 Information use plans
 Defining the role of the evaluator
 Logistics of implementation
 Planning to anticipate and respond to surprise

Overall Summary: Methods

Let’s look at this one.

Example: Improve Access to Primary Education

 Outcomes the evaluation is looking for (records, surveys): attendance; graduation; life trajectory
 Possible unexpected outcomes: interaction with other civil society development projects; networking effects of connections
 Evaluation design weaknesses: no census of other civil society projects; no data on interaction among projects; no data on consequences of interaction

A relevant theory: we know about phase shifts when network connections increase (see the sketch below).

Evaluation redesign:
 Identify other civil society programs
 Measure connections
 Ignore details of which programs are connected
 Collect data frequently to detect the timing of change
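A minimal sketch of the phase-shift theory the redesign leans on (my illustration; the function name and parameters are assumptions, not the author’s): in a random network, the share of nodes joined into one large cluster jumps abruptly once the average number of connections per node crosses 1, which is why frequent measurement of connection counts can catch the timing of the change.

```python
# Sketch of a network phase shift: in an Erdos-Renyi random graph, the
# largest connected component stays tiny while average degree < 1, then
# abruptly spans much of the network once it passes 1.
import random

def largest_component_share(n, avg_degree, seed=0):
    """Fraction of nodes in the largest connected component of a random
    graph with n nodes and the given average degree."""
    rng = random.Random(seed)
    p = avg_degree / (n - 1)             # edge probability
    adj = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].append(j)
                adj[j].append(i)
    seen = [False] * n
    best = 0
    for start in range(n):               # flood-fill each component
        if seen[start]:
            continue
        stack, size = [start], 0
        seen[start] = True
        while stack:
            node = stack.pop()
            size += 1
            for nb in adj[node]:
                if not seen[nb]:
                    seen[nb] = True
                    stack.append(nb)
        best = max(best, size)
    return best / n

for d in (0.5, 0.9, 1.1, 1.5, 2.0):
    print(f"avg degree {d}: largest cluster = {largest_component_share(800, d):.2f}")
# Small increases in connections near the threshold produce a large,
# sudden change, which is exactly why frequent measurement matters.
```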

Let’s look at this one.

Example: Agricultural Yield

 Outcomes the evaluation is looking for (records, interviews, observations): yield; new system cost; profit
 Possible unexpected outcomes: perverse effects of increased wealth disparities
 Evaluation design weaknesses: no other communities to check on other reasons for disparity; no interviews to check on the consequences of disparities

Evaluation methodology: expand monitoring outside the borders of the agriculture program.

Evaluation redesign: adopt a “whole community” perspective.
 Identify a wide range of social indicators (one candidate is sketched below)
 Identify a diverse set of key informants
 Conduct regular open-ended interviewing
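One concrete way to operationalize the disparity indicator, as an assumption on my part (the slides do not name a metric), is to track a Gini coefficient of household income or assets at each monitoring round:

```python
# Hypothetical disparity indicator: a Gini coefficient computed from
# household income data at each monitoring round. Rising values flag
# the "perverse effects of increased wealth disparities" concern.

def gini(values):
    """Gini coefficient of non-negative values: 0 means perfect
    equality; values near 1 mean extreme concentration."""
    xs = sorted(values)
    n, total = len(xs), sum(xs)
    if n == 0 or total == 0:
        return 0.0
    rank_weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * rank_weighted) / (n * total) - (n + 1) / n

baseline = [40, 45, 50, 55, 60, 65]    # hypothetical incomes, round 1
followup = [40, 42, 48, 55, 90, 140]   # gains concentrated at the top
print(f"baseline Gini:  {gini(baseline):.2f}")   # ~0.09
print(f"follow-up Gini: {gini(followup):.2f}")   # ~0.26, disparity widening
```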

How can an evaluation be designed to change? Agile evaluation.

Let’s look at this one.

Example: Free / Reduced Fees for Post-Natal Services

 Outcomes the evaluation is looking for (survey/interview): health indicators for mother and child; child development indicators
 Possible unexpected outcomes: drug and supply hoarding; new sets of informal fees; lower than expected use of service
 Evaluation design weaknesses: no interview or observation to estimate the amount of fees; no way to correlate fees with attendance or client characteristics

Add a process component to the evaluation design (see the sketch below):
 Survey of mothers to assess the total cost of service
 Open-ended interviews with clinic staff about the consequences of the new system for their work lives

Nice to say, but agile evaluation can be expensive:
 Do we want both tactics?
 Do we want only one of them?
 These are the kinds of questions that have to be added to all the other decisions we make when designing an evaluation.
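As a sketch of how the added survey closes the “no way to correlate fees with attendance” gap (the field names and data here are hypothetical, not from the slides):

```python
# Hypothetical analysis of the mothers' survey: do informal fees
# suppress attendance? Needs Python 3.10+ for statistics.correlation.
from statistics import correlation

records = [                      # invented survey records
    {"informal_fee": 0,  "visits": 6},
    {"informal_fee": 2,  "visits": 5},
    {"informal_fee": 5,  "visits": 4},
    {"informal_fee": 8,  "visits": 3},
    {"informal_fee": 12, "visits": 1},
]
fees = [r["informal_fee"] for r in records]
visits = [r["visits"] for r in records]
print(f"fee/attendance correlation: {correlation(fees, visits):.2f}")
# A strongly negative value would flag informal fees as depressing use
# of the service, one of the unexpected outcomes being watched for.
```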

What are the practical and political reasons for surprise?

 Any single organization has limited money, political capital, human capital, authority, and power
 Narrow windows of opportunity
 Competition requires bold claims
 Resource owners have parochial interests
 Design expertise is limited
 Collaboration across agency boundaries is very difficult
 Short-term success is rewarded
 Partial solutions can accrue into major success over time
 Pursuing limited success with limited resources is justifiable

The result:
 Narrow programs
 Simple program theories
 A small set of outcomes

Planners may know better, but they are doing the best job they can. Evaluators have to follow.

© 2010 Guilford Publications

Where do the surprises in the cases fall in the life cycle scenario?