
Workshop on Using Contribution Analysis to Address Cause-Effect Questions Danish Evaluation Society Conference Kolding, September 2008 John Mayne, Advisor on Public Sector Performance

2 Workshop Objectives Understand the need to address attribution Understand how contribution analysis can help Have enough information to undertake a contribution analysis on your own

3 Outline Dealing with attribution Contribution analysis Working a case Levels of contribution analysis Conclusions

4 The challenge Attribution for outcomes always a challenge Strong evaluations (such as RCTs) not always available or possible A credible performance story needs to address attribution Sensible accountability needs to address attribution Complexity significantly complicates the issue What can be done?

5 The idea Based on the theory of change of the program, Buttressed by evidence validating the theory of change, Reinforced by examination of other influencing factors, Contribution analysis builds a reasonably credible case about the difference the program is making

6 The typical context A program has been funded to achieve intended results The results have occurred, perhaps more or less It is recognized that several factors likely ‘caused’ the results Need to know what was the program’s role in this

7 Two measurement problems Measuring outcomes Linking outcomes to actions (activities and outputs), i.e. attribution Are we making a difference with our actions?

8 Attribution Outcomes not controlled; there are always other factors at play Conclusive causal links don’t exist Are trying to understand better the influence you are having on intended outcomes Need to understand the theory of the program, to establish plausible association Something like contribution analysis can help

9 The need to say something Many evaluations and most public reporting are silent on attribution Credibility greatly weakened as a result In evaluations, in performance reporting and in accountability, something needs to be said about attribution

10 Proving Causality The gold standard debate (RCTs et al.) Intense debate underway, especially in development impact evaluation Some challenges to RCTs (e.g. Scriven) It does appear that RCTs have limited applicability Then what do we do?

11 Proving Causality AEA and EES: many methods capable of demonstrating scientific rigour Methodological appropriateness for given evaluation questions Causal analysis: auto mechanics, air crashes, forensic work, doctors (Scriven’s Modus Operandi approach)

12 Theory-based evaluation Reconstructing the theory of the program Assess/test the credibility of the micro-steps in the theory (links in the results chain) Developing & confirming the results achieved by the program

13 Contribution analysis: the theory There is a postulated theory of change The activities of the program were implemented The theory of change is supported by evidence Other influencing factors have been assessed & accounted for Therefore The program very likely made a contribution

14 Steps in Contribution Analysis 1. Set out the attribution problem to be addressed 2. Develop the postulated theory of change 3. Gather the existing evidence on the ToC 4. Assemble & assess the contribution story 5. Seek out additional evidence 6. Revise & strengthen the contribution story 7. Develop the complex contribution story
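
The steps are not strictly linear: in practice steps 4 to 6 form a loop in which the contribution story is assessed, its weakest points are identified, and new evidence is sought until the story is credible enough for the question at hand. The minimal Python sketch below illustrates that loop; the names (ContributionStory, assess, seek_additional_evidence) and the toy stopping rule are my own illustrations, not part of Mayne's method.

# Illustrative sketch of the iterative core of contribution analysis (steps 4-6).
# All names and the toy "credible when no weaknesses remain" rule are hypothetical.
from dataclasses import dataclass, field

@dataclass
class ContributionStory:
    evidence: list = field(default_factory=list)    # evidence supporting the theory of change
    weaknesses: list = field(default_factory=list)  # contested links in the results chain

def assess(story: ContributionStory) -> bool:
    # Steps 4/6: judge whether the story is credible enough (toy rule).
    return not story.weaknesses

def seek_additional_evidence(weakness: str) -> str:
    # Step 5: gather new evidence aimed at the weakest link (stubbed here).
    return "new evidence addressing: " + weakness

def strengthen(story: ContributionStory, max_rounds: int = 3) -> ContributionStory:
    for _ in range(max_rounds):          # steps 4-6, repeated
        if assess(story):
            break
        weakness = story.weaknesses.pop(0)
        story.evidence.append(seek_additional_evidence(weakness))
    return story

story = ContributionStory(
    evidence=["outputs delivered as planned"],
    weaknesses=["link from outputs to behaviour change", "role of other funders"],
)
print(strengthen(story))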

15 1. Set out the attribution problem Acknowledge the need to address attribution Scope the attribution problem What is really being asked What level of confidence is needed? Explore the contribution expected What are the other influencing factors? How plausible is a contribution?

16 Cause-Effect Questions Traditional attribution questions Has the program caused the outcome? How much of the outcome is caused by the program? Contribution questions Has the program made a difference? How much of a difference?

17 Cause-Effect Questions Management questions Is it reasonable to conclude that the program made a difference? What conditions are needed to make this type of program succeed? Why has the program failed?

18 Building an evaluation office contribution story The evaluation aim is to ‘make a difference’ (an outcome), e.g. improvements in management and reporting, a more cost-effective public service, enhanced accountability, etc. Evaluation products (outputs): evaluations and evaluation reports, advice and assistance Step 1

19 2. Develop the ToC and Risks to It Build the postulated results chain and ToC Identify roles played by other influencing factors Identify the risks to the assumptions Determine how contested the ToC is

20 A results chain (diagram)
Activities (how the program carries out its work): e.g. negotiating, consulting, inspecting, drafting legislation
Outputs (goods and services produced by the program): e.g. checks delivered, advice given, people processed, information provided, reports produced
Immediate outcomes (the first-level effects of the outputs): e.g. actions taken by the recipients, or behaviour changes
Intermediate outcomes (the benefits and changes resulting from the outputs): e.g. satisfied users, jobs found, equitable treatment, illegal entries stopped, better decisions made
End outcomes (the final or long-term consequences): e.g. environment improved, stronger economy, safer streets, energy saved
External factors influence the results all along the chain.

21 Results chain links (diagram: the same results chain and examples as the previous slide, with external factors alongside) For each link in the chain, ask: why will these immediate outcomes come about?

22 Theories of change A results chain with embedded assumptions and risks identified An explanation of why the results chain is expected to work; what has to happen Example: Anti-smoking campaign leading to a reduction in smoking. Assumptions: target is reached, message is heard, message is convincing, no other major influences at work. Risks: target not reached, poor message, peer pressure very strong.
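
A theory of change can also be written down as a simple data structure: each link in the results chain carries its own assumptions and risks. The Python sketch below does this for the anti-smoking example on this slide; the Link class and its field names are my own illustration, not a standard representation.

# A results-chain link with embedded assumptions and risks, using the
# anti-smoking example from the slide. The structure itself is illustrative.
from dataclasses import dataclass
from typing import List

@dataclass
class Link:
    from_result: str
    to_result: str
    assumptions: List[str]   # why the link is expected to work
    risks: List[str]         # what could stop it from working

campaign_to_reduction = Link(
    from_result="Anti-smoking campaign",
    to_result="Reduction in smoking",
    assumptions=[
        "target is reached",
        "message is heard",
        "message is convincing",
        "no other major influences at work",
    ],
    risks=["target not reached", "poor message", "peer pressure very strong"],
)

# A theory of change is then the ordered list of such links, plus the narrative
# explaining why the chain as a whole is expected to work.
theory_of_change = [campaign_to_reduction]
print(theory_of_change)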

23 Figure 1: Enhancing Management Capacity in Agricultural Research Organizations (AROs). Adapted from Horton, Mackay, Anderson and Dupleich (2000).
Results chain:
Outputs: information; training and workshops; facilitation of organizational change
Immediate outcomes: enhanced planning processes, evaluation systems, monitoring systems, and professional PM&E capacities
Intermediate outcomes: institutionalization of integrated PM&E systems and strategic management principles
Final outcomes: strengthened management of agricultural research
Impacts: more effective, efficient and relevant agricultural programs
Theory of change (assumptions and risks for each link):
Outputs to immediate outcomes. Assumptions: intended target audience received the outputs; with hands-on, participatory assistance and training, AROs will try enhanced planning, monitoring and evaluation approaches. Risks: intended reach not met; training and information not convincing enough for AROs to make the investment; only partially adopted to show interest to donors.
Immediate to intermediate outcomes. Assumptions: over time and with continued participatory assistance, AROs will integrate these new approaches into how they do business; the project’s activities complement other influencing factors. Risks: trial efforts do not demonstrate their worth; pressures for greater accountability dissipate; PM&E systems sidelined.
Intermediate to final outcomes. Assumptions: the new planning, monitoring and evaluation approaches will enhance the capacity of the AROs to better manage their resources. Risks: management becomes too complicated; PM&E systems become a burden; information overload; evidence not really valued for managing.
Final outcomes to impacts. Assumptions: better management will result in more effective, efficient and relevant agricultural programs. Risks: new approaches do not deliver (great plans but poor delivery); resource cutbacks affect PM&E first; weak utilization of evaluative information.

24 Figure 2: An initial ‘theory map’ of the public disclosure of health care information. From Pawson et al. (2005).
Theory one: Classification. The quality of particular aspects of health care can be monitored and measured to provide valid and reliable rankings of comparative performance.
Theory two: Disclosure. Information on the comparative performance and the identity of the respective parties is disclosed and publicised through public media.
Theory three: Sanction. Members of the broader health community act on the disclosure in order to influence subsequent performance of named parties.
Theory three a, b, c, d: Alternative sanctions. The sanctions mounted on the basis of differential performance operate through a) ‘regulation’, b) ‘consumer choice’, c) ‘purchasing decisions’, d) ‘shaming’.
Theory four: Response. Parties subject to the public notification measures will react to the sanctions in order to maintain position or improve performance.
Theory five: Ratings resistance. The authority of the performance measures can be undermined by the agents of those measured claiming that the data are invalid and unreliable.
Theory six: Rival framing. The ‘expert framing’ assumed in the performance measure is distorted through the application of the media’s ‘dominant frames’.
Theory seven: Measure manipulation. Response may be made to the measurement rather than its consequences, with attempts to outmanoeuvre the monitoring apparatus.

25 Theory of Change for an Evaluation Office: our contribution story (Step 2)
(Diagram of the office’s results chain, from outputs through immediate, intermediate and final outcomes.) Outputs: evaluation reports (findings & conclusions, recommendations), evaluation studies (participation), and advice. Outcomes along the chain include acceptance and implementation of recommendations & advice, enhanced value of evaluative thinking, better management practices, better designed programs, better data for evaluations, better informed management, informed decision-making, productive operations, more effective and cost-effective programs, and better benefits to citizens. Other influencing factors: line managers’ & organisation initiatives. Assumptions: office has credibility and evidence; changes not planned anyway; recommendations work.

26 3. Gather existing evidence Assess the logical robustness of the ToC Gather available evidence on Results Assumptions Other influencing factors

27 4. Assemble and assess the contribution story Set out the contribution story Assess its strengths and weaknesses Refine the ToC

28 Theory of change analysis Need to identify which of the links in the results chain have the weakest evidence Some may be supported by prior research Some may be well accepted But some may be a large leap of faith, or the subject of debate With limited resources, these contested links are where effort should be focused

29 5. Seek out additional evidence Determine what is needed Gather new evidence

30 Strengthening Techniques Refine the results chain and/or gather additional results data Survey knowledgeable others involved Track program variations and their impacts (time, location, strength) Undertake case studies Identify relevant research or evaluation Use multiple lines of evidence Do a focused mini-evaluation

31 The Agricultural Research Organizations (AROs) evaluation CA done: Theory of change developed Other influencing factors recognized The theory of change was revised based on lessons learned CA that could have been done: A more structured CA approach More analysis of other factors More attention to the risks faced

32 6. Revise and strengthen the contribution story Build the more credible contribution story Reassess its strengths and weaknesses Revisit step 5

33 A CA Case Study Patton (2008). Advocacy Impact Evaluation. JMDE, 5(9). A collaboration of agencies spent over $2M on a campaign to influence a Supreme Court decision Evaluation issue: did it work? Conclusion: the campaign contributed significantly to the Court’s decision

34 Features It was a stealth campaign The evaluation used Scriven’s General Elimination Method, or the modus operandi approach It undertook considerable document review and interviews, and an in-depth case study, which served as the evidence for the evaluation

35 Cause-effect Attribution vs contribution Attribution concepts don’t work well in complex settings Contribution analysis identifies likely influences The case examined two alternative possible influences

36 Levels of contribution analysis Minimalist contribution analysis Contribution analysis of direct influence Contribution analysis of indirect influence

37 Minimalist CA Develop the theory of change Confirm that the expected outputs were delivered Then, based on the strength of the theory of change, conclude the program made a contribution

38 Other influencing factors Literature and knowledgeable others can identify the possible other factors Reflecting on the theory of change may provide some insight on their plausibility Prior evaluation/research may provide insight Relative size compared to the program intervention can be examined Knowledgeable others will have views on the relative importance of other factors

39 CA of direct influence Minimalist CA, plus Verifying the expected direct outcomes occurred Confirming the assumptions associated with the direct outcomes Accounting for other influencing factors

40 CA of indirect influence CA of direct influence, plus Verifying the intermediate and final outcomes occurred Confirming the assumptions associated with these indirect outcomes Accounting for other influencing factors
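
The three levels are cumulative: each level repeats the checks of the level below and adds further verification. The sketch below shows that nesting; the check_* functions are hypothetical stubs standing in for real evidence-gathering work, and the boolean logic is only meant to show which checks belong to which level.

# Illustrative nesting of the three levels of contribution analysis.
# The check_* functions are stubs; in practice each is an evidence-gathering task.
def check_outputs_delivered() -> bool:
    return True  # stub: verify the expected outputs were delivered

def check_direct_outcomes_and_assumptions() -> bool:
    return True  # stub: verify the direct outcomes and their assumptions

def check_indirect_outcomes_and_assumptions() -> bool:
    return True  # stub: verify intermediate/final outcomes and their assumptions

def check_other_influencing_factors() -> bool:
    return True  # stub: assess and account for other influencing factors

def minimalist_ca(theory_of_change_is_sound: bool) -> bool:
    # Conclude a contribution from a sound theory of change plus delivered outputs.
    return theory_of_change_is_sound and check_outputs_delivered()

def ca_of_direct_influence(theory_of_change_is_sound: bool) -> bool:
    return (minimalist_ca(theory_of_change_is_sound)
            and check_direct_outcomes_and_assumptions()
            and check_other_influencing_factors())

def ca_of_indirect_influence(theory_of_change_is_sound: bool) -> bool:
    return (ca_of_direct_influence(theory_of_change_is_sound)
            and check_indirect_outcomes_and_assumptions()
            and check_other_influencing_factors())

print(ca_of_indirect_influence(theory_of_change_is_sound=True))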

41 A credible contribution statement Description of program context and other influencing factors A plausible theory of change Confirmed program activities, outputs and outcomes CA findings: evidence supporting the ToC and assessment of other influencing factors Discussion of the quality of evidence
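
Read as a checklist, these five components can be grouped into a single record. The short sketch below is just such a grouping; the class and field names are mine, for illustration only.

# The five components of a credible contribution statement, as a simple record.
# Field names are illustrative, not taken from the source.
from dataclasses import dataclass

@dataclass
class ContributionStatement:
    context_and_other_factors: str  # program context and other influencing factors
    theory_of_change: str           # the plausible theory of change being claimed
    confirmed_results: str          # confirmed activities, outputs and outcomes
    ca_findings: str                # evidence supporting the ToC; other factors assessed
    evidence_quality: str           # discussion of the quality of the evidence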

42 When is CA useful? Program is not experimental Funding is based on a theory of change Program has been in place for some time No real scope for varying the intervention(s)

43 Contribution analysis Builds evidence on Immediate/intermediate outcomes, the behavioural changes Links in the results chain Other influencing factors at play Other explanations for observed outcomes Contribution Evaluation