A New Technique for Comparison to an Alternative: The Negotiated Alternative. Let's Get Beyond This Tedious Best-Method Logjam.

Comparison & Evaluation
Summative evaluation assesses the merit and worth of an intervention:
– Merit is the net incremental change attributable to the intervention.
– Worth is the value of these changes to those affected.
The comparative method is the core of summative evaluation.

Evaluation Is Use-Inspired Applied Research
Prospects for use are enhanced when the research is:
– Salient: relevant or valuable for use in evaluation,
– Legitimate: fair, unbiased, respectful, and feasible,
– Credible: true, and technically adequate in handling evidence.

Applied Research vs. Evaluation
Applied research: select a method, then find an application.
Evaluation: evaluate an intervention, then select useful methods.

Wise Advice from an Excellent Statistician
"An approximate answer to the right question is worth a good deal more than the exact answer to an approximate problem." (John W. Tukey)

Currently Accepted Options for Comparison (per GAO, AEA)
– Experimental (random-assignment) designs,
– Quasi-experimental (comparison-group) designs,
– Interrupted time-series (pre-post) designs,
– Systematic case studies (see Yin),
– Expert judgment.
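The pre-post (interrupted time-series) option can be sketched in its simplest form: compare mean outcomes before and after the intervention. The data below are invented purely for illustration.

```python
import statistics

# Illustrative (made-up) outcome series around an intervention.
pre = [10.2, 10.5, 9.8, 10.1]    # outcomes observed before the intervention
post = [12.0, 12.4, 11.8, 12.2]  # outcomes observed after the intervention

# Simplest pre-post contrast: the difference in mean outcome.
effect = statistics.mean(post) - statistics.mean(pre)
print(round(effect, 2))  # 1.95
```

A real interrupted time-series analysis would also model the pre-intervention trend rather than just the mean, but the comparison logic is the same.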

What Is Being Evaluated?
In evaluation terms, the evaluand. In resource, environmental, and conservation settings, the evaluand is always located at the intersection of natural and human systems. Comparison must engage the active elements of the two systems that:
– Set the context for the intervention,
– Include the mechanisms of change (e.g., change in human behavior and key levers in the natural system),
– Include the intended and unintended affected elements in both systems,
– Include other elements SALIENT to decision makers and key stakeholders.

Two New Options for Comparison
Natural alternative: when you go to the place of the intervention, you sometimes find a naturally occurring local comparison. This can be true for site-specific as well as policy interventions.
Negotiated alternative: a likely alternative to a decision is discussed and agreed across the key parties to that decision. The negotiated alternative addresses both systems and key sources of variation (time, location), can be used to determine effects in either or both systems, and can be used ex post and ex ante.
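The comparison a negotiated alternative enables is mechanically simple: merit is the observed outcome minus the outcome the parties agreed would have occurred under the alternative. The series below are hypothetical, purely to show the mechanics.

```python
# Hypothetical outcome series (e.g., an indicator tracked over four years).
observed = [120, 135, 150, 170]        # what actually happened under the intervention
negotiated_alt = [120, 118, 115, 110]  # trajectory the parties agreed would have
                                       # occurred under the negotiated alternative

# Merit: the net incremental change attributable to the intervention.
incremental = [obs - alt for obs, alt in zip(observed, negotiated_alt)]
net_change = sum(incremental)
print(incremental, net_change)  # [0, 17, 35, 60] 112
```

The hard work, of course, is not the subtraction but getting the parties to agree on the counterfactual trajectory in the first place.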

Examples of Natural Alternatives
– Indian Ford Creek (Oregon),
– ORV use and Piping Plover in Cape Cod National Seashore.

Examples of Negotiated Alternatives
If PGE had developed the plan on its own, litigation would have followed, delaying the decision by three years. The ensuing development would not have included the transfer of senior instream water rights, the deeding of 1,500 acres of shore lands to the public, or funding for the monitoring program and research on the effects of dam removal.

Testing Negotiated Alternatives
Applications:
– 7 Oregon fish and water cases,
– 2 DOI National Park Service National Seashores,
– 5 EPA enforcement cases (water),
– Currently in approximately 25 EPA Superfund cases.
Results of methods testing, comparing triangulated judgments of three groups using the negotiated alternative (expert panel, science advisory team, and parties):
– Internal reliability: Cronbach's alpha 0.09,
– Internal validity *,
– External validity: consistent with model and other science forecasts.
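Cronbach's alpha, the internal-reliability statistic cited above, can be computed from a raters-by-items score matrix. This is a generic stdlib sketch with invented numbers, not the presenters' actual analysis or data.

```python
from statistics import variance

def cronbach_alpha(ratings):
    """Cronbach's alpha for a list of raters' item scores.

    alpha = k/(k-1) * (1 - sum(per-item variances) / variance(total scores))
    where k is the number of items.
    """
    k = len(ratings[0])          # number of items
    items = list(zip(*ratings))  # transpose: one tuple of scores per item
    item_var_sum = sum(variance(item) for item in items)
    total_var = variance(sum(row) for row in ratings)
    return k / (k - 1) * (1 - item_var_sum / total_var)

# Toy example: three raters scoring four items (invented numbers).
scores = [[3, 4, 3, 4],
          [2, 2, 3, 2],
          [5, 4, 5, 5]]
print(round(cronbach_alpha(scores), 2))  # 0.94
```

Values near 1 indicate the raters' judgments vary together; values near 0 indicate little shared signal, which is worth bearing in mind when reading the 0.09 reported above.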

Negotiated Alternative vs. Experimental Designs
Applied research (select a method, then find an application): salience and legitimacy compromised; credibility enhanced.
Evaluation (evaluate an intervention, then select useful methods): salience and legitimacy high; credibility lower.

Summary
– Evaluation pursues approximate answers to the right questions,
– Choice of method should be guided by use (salience, legitimacy, and credibility),
– Negotiated alternatives have excellent promise to expand our options for comparison.