Incorporating Evaluation into a Clinical Project

Incorporating Evaluation into a Clinical Project
University of Colorado School of Medicine, Department of Family Medicine – Evaluation Hub
Whitney Jones Rhodes, PhD; Janet Corral, PhD; Russell E. Glasgow, PhD

What is the Evaluation Hub?
A service by the Department of Family Medicine for all members of the DFM family! evaluationhub.org

What is evaluation?
“Research seeks to prove, evaluation seeks to improve…” (M.Q. Patton)

Why evaluate?
- To gain insight about a project or QI initiative: What works and what doesn't?
- To improve practice: How can QI efforts be modified or adapted to enhance success?
- To assess effects: How well are objectives and goals being met? How does the QI initiative benefit patients and other stakeholders? Is there evidence of effectiveness? Why were these outcomes (positive and negative) obtained?

“Evaluation should be a regular checkup, not an autopsy…” It is never too early to plan your evaluation!

Deciding What to Evaluate
Evaluations need to focus on the issues that matter most to stakeholders while using time and resources efficiently. (Evaluation is about values.) Be specific about which aspects of your project you want to evaluate and how you plan to use the results. No, you don't have to evaluate every component of your project! Focus on items that are actionable, not just interesting.

Bottom Line and Ultimate Evaluation Question
“What intervention (components), delivered under what conditions, produce what effects for what populations (subgroups), with what resources on which outcomes, and how do they come about?”

Types of Evaluation
Formative evaluation: Is the program/intervention feasible, acceptable, and appropriate? When used: when a program is new or being modified/adapted.
Process evaluation: Has the program been implemented as planned? When used: while the program is being implemented, on an ongoing basis as the program runs (ideally!).
For more info: betterevaluation.org

Types of Evaluation (continued)
Outcome evaluation: To what extent did the program achieve the intended effects (and any unintended effects) in the intended population?
Impact evaluation: Did the program meet its goals (e.g., reaching 80% of patients with hypertension)? A minimal sketch of this kind of goal check follows below.
Economic and resource evaluation: What are the costs and resources associated with operating this program? Can be used to make the case for continuation or future funding.
When used: at regular intervals while the program is running, and at the end of the program (this applies to all types of evaluation other than formative).
For more info: https://www.cdc.gov/std/Program/pupestd/Types%20of%20Evaluation.pdf
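
To make the goal-check idea concrete, here is a minimal sketch in Python; the target, counts, and variable names are hypothetical, not from the presentation.

```python
# Hypothetical goal check for an impact evaluation:
# did the program reach its target of 80% of patients with hypertension?
goal = 0.80
n_hypertension_patients = 250  # hypothetical denominator
n_reached_by_program = 185     # hypothetical numerator

attainment = n_reached_by_program / n_hypertension_patients
status = "met" if attainment >= goal else "not met"
print(f"Goal attainment: {attainment:.0%} (target {goal:.0%}) -> {status}")
```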

A Different Approach: Pragmatic Research
Pragmatic trial: a real-world test in a real-world population. Explanatory trial: a specialized experiment in a specialized population.
Pragmatic designs emphasize:
- Participation or reach
- Adoption by diverse settings
- Ease of implementation
- Maintenance
- Generalizability
Maclure, M. (2009). Explaining pragmatic trials to pragmatic policy-makers. Canadian Medical Association Journal, 180(10), 1001-1003.

Logic Models
“If… then.” Logic models are a very useful way to specify evaluation questions and to sharpen thinking about the questions just discussed.
(Cartoon source: http://www.sciencecartoonsplus.com/pages/gallery.php)

Logic Model
Planned Work:
- Resources/Inputs: In order to accomplish our activities, we will need the following…
- Activities: In order to achieve our goals, we will accomplish the following activities…
Intended Results:
- Outputs: We expect that once activities are accomplished, we will have the following evidence… (direct data resulting from activities)
- Short- and Long-Term Outcomes: We expect that if these activities are accomplished, the following changes will occur… (changes in individuals: attitudes, behaviors, knowledge, skills, status)
- Impact: We expect these activities will create the following “big picture” change… (future social change we are working to create)
Source: The Evaluation Center, School of Education and Human Development, University of Colorado Denver
A code sketch of this if-then chain follows below.
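
The logic model's columns can be read as a chain of if-then statements. Below is a minimal sketch in Python of what that chain might look like for a hypothetical hypertension QI project; every entry in the model is invented for illustration.

```python
# A logic model as structured data. All program specifics below are
# hypothetical, for illustration only.
logic_model = {
    "inputs": ["clinic staff time", "BP monitors", "EHR reports"],
    "activities": ["train MAs on BP re-checks", "flag elevated readings in the EHR"],
    "outputs": ["# of staff trained", "# of flagged visits with a re-check"],
    "short_and_long_term_outcomes": ["more accurate BP documentation",
                                     "improved BP control among patients"],
    "impact": ["fewer cardiovascular events in the clinic population"],
}

# Read the model left to right as "if ... then ...".
columns = list(logic_model)
for upstream, downstream in zip(columns, columns[1:]):
    print(f"IF {'; '.join(logic_model[upstream])}")
    print(f"  THEN {'; '.join(logic_model[downstream])}")
```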

What type of evaluation do you want? What are your key questions? Keep in mind the KISS principle!

What do I need to consider at this stage?
- What is the PRIMARY GOAL of the program or activity to be evaluated?
- What is the current status of the program: is it still evolving, or relatively fixed?
- What KEY QUESTIONS would you like the evaluation to answer?
- How will you know if the program is successful?
- What methods or design are you considering?
- How will you use the information from the evaluation?
- What RESOURCES (and how many) do you have for the evaluation?

What am I trying to evaluate?

RE-AIM Framework
- Helps you understand what your evaluation should “target” and what level of impact you are trying to measure.
- A clear, standardized means of measuring the impact of a program or innovation and its potential for translation into practice.
- Can be useful in evaluating and reporting on a variety of initiatives.
- Five dimensions, some at the individual level and others at the setting level.
- Reminder: you don't have to measure everything!
For more info: re-aim.org

RE-AIM Framework
Reach | Effectiveness | Adoption | Implementation | Maintenance
(re-aim.org)

Reach (individual level)
The absolute number, proportion, and representativeness of individuals (e.g., residents, patients) who participate in a particular program or intervention. A sketch of how Reach might be quantified follows below.
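
For example, Reach could be quantified as the proportion of eligible patients who participate, plus a comparison of participants with the eligible population on characteristics that matter. A minimal sketch in Python, with all counts and values hypothetical:

```python
# Hypothetical counts from a clinic's records, for illustration only.
n_eligible = 400      # patients who met inclusion criteria
n_participants = 130  # patients who actually took part

reach = n_participants / n_eligible
print(f"Reach: {n_participants}/{n_eligible} = {reach:.0%}")

# Representativeness: compare participants with the full eligible
# population on a characteristic you care about (hypothetical mean ages).
mean_age_participants = 61.5
mean_age_eligible = 58.2
diff = mean_age_participants - mean_age_eligible
print(f"Age difference (participants minus eligible): {diff:+.1f} years")
```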

Effectiveness (individual level)
The impact of an intervention on relevant outcomes (including unintended consequences or potential negative effects, quality of life, and economic outcomes). Each project will have its own unique measures of impact.

Adoption (setting level)
The absolute number, proportion, and representativeness of settings and staff (e.g., clinics, clinical staff) who are willing to initiate a program or approve a policy. Who is adopting the intervention? How many?

Implementation (setting level)
How consistently clinical staff follow the program as designed (e.g., How much has the program been adapted? What are the resources and/or costs to deliver it?). A sketch of a simple fidelity score follows below.
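
One simple way to put a number on fidelity is the share of planned protocol steps actually delivered at each visit. A minimal sketch in Python, with the checklist and visit records invented for illustration:

```python
# Hypothetical protocol checklist and per-visit delivery records.
protocol_steps = ["BP re-check", "medication review", "follow-up scheduled"]

visits = [
    {"BP re-check": True, "medication review": True, "follow-up scheduled": False},
    {"BP re-check": True, "medication review": False, "follow-up scheduled": False},
]

for i, visit in enumerate(visits, start=1):
    delivered = sum(visit[step] for step in protocol_steps)
    fidelity = delivered / len(protocol_steps)
    print(f"Visit {i}: fidelity = {fidelity:.0%}")
```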

Maintenance (setting and individual level)
The extent to which the effects of a program are maintained within the organization (setting level) and among participants (individual level) (e.g., How did the intervention change over time in the clinic? What long-term effects did the intervention have on participants?).

Project Scope Considerations

Project Scope Considerations
- How many patients or other stakeholders will you collect data on? How often?
- What data are or will be available as a routine part of the project?
- What additional measures or assessments are you planning to collect? What else would you like to measure?
- What is the timeline for this project? What contextual factors may influence the timeline?

Recommended Project Next Steps
- Complete the logic model.
- Try the RE-AIM self-rating quiz (re-aim.org): http://www.re-aim.hnfe.vt.edu/resources_and_tools/self_rating_screener_and_feedback/quiz.html
- Review the “Considerations for Evaluation Planning” handout.

Thank you!
Whitney Jones Rhodes, PhD (whitney.jones@ucdenver.edu)
Evaluation resources: evaluationhub.org