Incorporating Evaluation into a Clinical Project


1 Incorporating Evaluation into a Clinical Project
University of Colorado School of Medicine Department of Family Medicine – Evaluation Hub Whitney Jones Rhodes, PhD Janet Corral, PhD Russell E. Glasgow, PhD

2 What is the Evaluation Hub?
evaluationhub.org A service by the Department of Family Medicine for all members of the DFM Family!

3 What is evaluation?
“Research seeks to prove, evaluation seeks to improve…” (M.Q. Patton)

4 Why evaluate?
- To gain insight about a project or QI initiative: What works and what doesn’t?
- To improve practice: How can QI efforts be modified or adapted to enhance success?
- To assess effects: How well are objectives and goals being met? How does the QI initiative benefit patients and other stakeholders? Is there evidence of effectiveness? Why are these outcomes (positive and negative) obtained?

5 “Evaluation should be a regular checkup, not an autopsy…”
It is never too early to plan your evaluation!

6 Deciding What to Evaluate
- Evaluations need to be focused to assess the issues that are most meaningful to stakeholders while using time and resources efficiently. (Evaluation is about values.)
- Be specific about what aspects of your project you want to evaluate and how you plan to use the results.
- No, you don’t have to evaluate every component of your project! Focus on items that are actionable, not just interesting.

7 Bottom Line and Ultimate Evaluation Question
“What intervention (components), delivered under what conditions, produce what effects for what populations (subgroups), with what resources on which outcomes, and how do they come about?”

8 Types of Evaluation
- Formative evaluation: Is the program/intervention feasible, acceptable, and appropriate? When used: a program is new or being modified/adapted.
- Process evaluation: Has the program been implemented as planned? When used: while the program is running, on an ongoing basis (ideally!).
For more info: betterevaluation.org

9 Types of Evaluation
- Outcome evaluation: To what extent did the program achieve the intended effects (and any unintended effects) in the intended population?
- Impact evaluation: Did the program meet its goals (e.g. 80% of patients with hypertension)?
- Economic and resource evaluation: What are the costs and resources associated with operating this program? Can be used to make the case for continuation/future funding.
When used: at regular intervals while the program is running and at the end of the program (this applies to all types of evaluation other than formative).

10 A Different Approach: Pragmatic Research
- Pragmatic trial: a real-world test in a real-world population.
- Explanatory trial: a specialized experiment in a specialized population.
Pragmatic designs emphasize:
- Participation or reach
- Adoption by diverse settings
- Ease of implementation
- Maintenance
- Generalizability
Maclure, M. (2009). Explaining pragmatic trials to pragmatic policy-makers. Canadian Medical Association Journal, 180(10).

11 Logic Models: “If…then”
Logic models are a useful way to specify evaluation questions and to sharpen thinking about the questions just discussed.

12 Logic Model
Planned Work → Intended Results
- Resources/Inputs: In order to accomplish our activities, we will need the following.
- Activities: In order to achieve our goals, we will accomplish the following activities.
- Outputs: We expect that once activities are accomplished, we will have the following evidence (direct data resulting from activities).
- Short- and Long-Term Outcomes: We expect that if these activities are accomplished, the following changes will occur (changes in individuals: attitudes, behaviors, knowledge, skills, status).
- Impact: We expect these activities will create the following “big picture” change (future social change we are working to create).
Source: The Evaluation Center, School of Education and Human Development, University of Colorado Denver
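The logic-model columns above can be captured as a simple record for planning purposes. This is a minimal illustrative sketch, not an official tool: the class name, field names, and the hypertension example values are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class LogicModel:
    """One way to record the logic-model chain for a QI project."""
    resources_inputs: list[str]  # what we need to accomplish our activities
    activities: list[str]        # what we will do to achieve our goals
    outputs: list[str]           # direct data resulting from activities
    outcomes: list[str]          # short- and long-term changes in individuals
    impact: str                  # the "big picture" change we are working toward

# Hypothetical example: a hypertension QI initiative
model = LogicModel(
    resources_inputs=["clinic staff time", "BP monitoring equipment"],
    activities=["train staff on BP protocol", "schedule follow-up visits"],
    outputs=["number of staff trained", "follow-up visits completed"],
    outcomes=["improved staff adherence to protocol", "improved BP control"],
    impact="better long-term cardiovascular health in the clinic population",
)

# Reading the record left to right yields the "if...then" chain
for column in ("resources_inputs", "activities", "outputs", "outcomes"):
    print(f"{column}: {getattr(model, column)}")
print(f"impact: {model.impact}")
```

Writing the chain down this explicitly makes gaps visible: an activity with no corresponding output, or an outcome no activity plausibly produces, signals an evaluation question worth asking.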

13 What type of evaluation do you want? What are your key questions?
Keep in mind the KISS principle (keep it simple)!

14 What do I need to consider at this stage?
- What is the PRIMARY GOAL of the program or activity to be evaluated?
- What is the current status of the program? Is it still evolving or relatively fixed?
- What KEY QUESTIONS would you like the evaluation to answer? How will you know if the program is successful?
- What methods or design are you considering?
- How will you use the information from the evaluation?
- How many and what RESOURCES do you have for the evaluation?

15 What am I trying to evaluate?

16 RE-AIM Framework
Helps you understand what your evaluation should ‘target’ and what level of impact you are trying to measure.
- A clear, standardized means of measuring the impact of a program or innovation and its potential for translation into practice.
- Can be useful in evaluating and reporting on a variety of initiatives.
- Five dimensions, some at the individual level and others at the setting level.
- Reminder: you don’t have to measure everything!
re-aim.org

17 RE-AIM Framework
Reach, Effectiveness, Adoption, Implementation, Maintenance
re-aim.org

18 RE-AIM Framework: Reach (Individual level)
The absolute number, proportion, and representativeness of individuals (e.g. residents, patients) who participate in a particular program or intervention.

19 RE-AIM Framework: Effectiveness (Individual level)
The impact of an intervention on relevant outcomes (including unintended consequences or potential negative effects, quality of life, and economic outcomes). Each project will have its own unique measures of impact.

20 RE-AIM Framework: Adoption (Setting level)
The absolute number, proportion, and representativeness of settings and staff (e.g. clinics, clinical staff) who are willing to initiate a program or approve a policy. Who is adopting the intervention? How many?

21 RE-AIM Framework: Implementation (Setting level)
How consistently clinical staff follow the program as designed (e.g. How much has the program been adapted? What are the resources and/or costs to deliver it?).

22 RE-AIM Framework: Maintenance (Setting and Individual level)
The extent to which the effects of a program are maintained within the organization (setting level) and among participants (individual level) (e.g. How did the intervention change over time in the clinic? What long-term effects did the intervention have on participants?).
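The five dimensions just defined can be summarized per project as a handful of simple measures. The sketch below is one illustrative way to do that, assuming Python; the class name, field names, and all the example numbers are hypothetical, and RE-AIM itself does not prescribe these particular metrics.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ReAimSummary:
    """Illustrative numeric summary of the five RE-AIM dimensions."""
    # Individual level
    reach: float           # proportion of eligible individuals who participated
    effectiveness: float   # change on the project's primary outcome measure
    # Setting level
    adoption: float        # proportion of settings/staff willing to initiate
    implementation: float  # proportion of protocol steps delivered as designed
    # Setting and individual level
    maintenance: Optional[float] = None  # often unknown until follow-up

# Hypothetical example for a clinic-based hypertension program
summary = ReAimSummary(
    reach=120 / 400,     # 120 of 400 eligible patients participated
    effectiveness=-5.2,  # e.g. mean change in systolic BP (mmHg)
    adoption=3 / 5,      # 3 of 5 clinics initiated the program
    implementation=0.8,  # 80% of protocol steps delivered as designed
)

print(f"Reach: {summary.reach:.0%}, Adoption: {summary.adoption:.0%}")
```

Leaving `maintenance` optional reflects the point made on the slides: you don't have to measure everything, and maintenance in particular is often only measurable at long-term follow-up.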

23 Project Scope Considerations

24 Project Scope Considerations
- How many patients or other stakeholders would you be collecting data on? How often?
- What data are or will be available as a routine part of the project?
- What additional measures or assessments are you planning to collect? What else would you like to measure?
- What is the timeline for this project? What contextual factors may influence the timeline?

25 Recommended Project Next Steps
- Complete the logic model
- Try answering the RE-AIM self-rating quiz (re-aim.org)
- Review the “Considerations for evaluation planning” handout

26 Thank you!
Whitney Jones Rhodes, PhD
Evaluation resources: evaluationhub.org

