What Can Different Types of Evaluation Add to Individual-Focused Behavior Change Interventions? Margaret Handley, PhD MPH Center for Vulnerable Populations.


What Can Different Types of Evaluation Add to Individual-Focused Behavior Change Interventions?
Margaret Handley, PhD, MPH
Center for Vulnerable Populations, Department of Epidemiology & Biostatistics and Department of Medicine

Homework?

FINAL PROJECT FOR EPI 246, DUE June 3, pm
For the final project for the course, please prepare a 3-5 page document and an abstract ( words) that draws on the materials and lectures from the course. You may choose to develop one of the following, in an area that you are currently involved with or anticipate being involved with in the future:
1. Text for a grant proposal for a behavior change intervention, or to describe behavior, using theory-informed approaches.
2. Text for a program you would develop: its description, setting, and rationale, again linked to materials in class related to theory, tools, or programs.
3. The rationale and sample materials for a training you would develop that would lead to behavior change, and how these programs would link to theory.
Please be creative, but do link the work to the theories, frameworks, and models that have been presented in class, and include at least 3-4 citations from the literature. Also include at least one figure/diagram that relates to the course.

Outline
1. Evaluation Approaches and IDS Relevance
2. DIME and RE-AIM Frameworks
3. Examples:
   - Tai Chi Intervention (RE-AIM)
   - Fidelity vs. Flexibility in Practice-Based Research
   - CBPR and Logic Model for Cancer Screening

Program Evaluation Can Help To…
- Measure an intervention's effectiveness on targeted process or outcome measures
- Determine the most efficient and effective strategy for implementing the intervention
- Verify the mechanisms through which you believe your intervention is working
- Guide/support replication in other settings
- Measure fidelity and adaptation
- Align goals with system or stakeholder goals
- Determine cost-effectiveness and priority

How To Conceptualize Evaluation
Types of evaluation: Outcome Evaluation; Process Evaluation; Resource Evaluation
Relevant perspectives: Clinic/Organization; Public Health; Policy

How do we get what we really want?
“Program evaluation is the systematic collection of data related to a program’s activities and outcomes so that decisions can be made to improve efficiency, effectiveness, or adequacy.” - CDC, Practical Evaluation of Public Health Programs

Goals of Evaluation: Am I making a difference?
Intuitive goal → Evaluation concept
- What have we done? → Describe/Summarize
- How well have we done it? What is the value of it? What had the biggest impact? What could we get rid of? → Quality; Importance; Accountability; Cost-effectiveness
- How effective have we been, and for whom? → Effectiveness; Social equity
- What are we going to do now that we have this info? → Evidence-based planning; Outcomes-focusing

A Range of Evaluation Needs in IDS Research
We know that research implementation is highly dependent on local context and involves interrelated interactions across multiple groups, yet most evaluations focus on measures of individual behavior change… which gives little to go on for successful replication, or for identifying the key pitfalls.

Framework: DIME and Translating Research into Practice (TRIPLaB) (Hanbury et al., Implementation Science 2010)
Designed for the UK National Health Service program on translating research into practice (TRIP) and implementation, via collaborations for applied research (university and NHS partnerships).
1. Selecting the innovation = Develop
2. Implement in local settings
3. Evaluate >>> then conduct a large RCT, etc.

Develop, Implement and Evaluate (DIME), or: Build It with the Evaluation in Mind (Hanbury et al., Implementation Science 2010)
Phases:
1. Selecting the innovation = Develop
2. Implement in local settings
3. Evaluate

Develop, Implement and Evaluate (DIME) (Hanbury et al., Implementation Science 2010)
Develop: stakeholder consultation to identify innovations; conjoint analysis survey; mapping against theory-based characteristics; exploring team/social network culture; synthesizing and ranking.
Implement: review acceptability to stakeholders in their local context; review ‘policy’ cost-effectiveness of different strategies; deliver to relevant groups.
Evaluate: pre-post test change in outcomes; interrupted time series; cost-effectiveness of implementation.
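For planning purposes, the three DIME phases and their activities can be held in a simple checklist structure. A minimal organizational sketch in Python; the dict layout is our own convenience, not part of Hanbury et al.'s framework, and the activity labels follow the slide above:

```python
# DIME phases (Hanbury et al. 2010) as a planning checklist.
# The structure is an organizational convenience only.
DIME_PHASES = {
    "Develop": [
        "Stakeholder consultation to identify innovations",
        "Conjoint analysis survey",
        "Mapping against theory-based characteristics",
        "Explore team/social network culture",
        "Synthesize and rank candidate innovations",
    ],
    "Implement": [
        "Review acceptability to stakeholders in their local context",
        "Review 'policy' cost-effectiveness of different strategies",
        "Deliver to relevant groups",
    ],
    "Evaluate": [
        "Pre-post test change in outcomes",
        "Interrupted time series",
        "Cost-effectiveness of implementation",
    ],
}

def print_checklist(phases: dict) -> None:
    """Print each phase with its activities as an evaluation-planning checklist."""
    for phase, activities in phases.items():
        print(phase)
        for activity in activities:
            print(f"  [ ] {activity}")

print_checklist(DIME_PHASES)
```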

Develop, Implement and Evaluate (DIME), cont. (Hanbury et al., Implementation Science 2010)
Developing/Selecting the Innovation Phase
- Stakeholder consultation to identify innovations to target (e.g., qualitative interviews/focus groups, such as prioritizing maternal mental health within MCH)
- Conjoint analysis survey of stakeholders (e.g., stakeholders rank scenarios built from trade-offs among attribute mixes, such as likely cost per patient, local expertise to implement, and data resources)
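Conjoint analysis estimates how much each scenario attribute contributes to stakeholders' ratings. A minimal sketch, assuming made-up ratings and three illustrative attributes echoing the slide (cost per patient, local expertise, data resources); none of the numbers come from the TRIPLaB work:

```python
import numpy as np

# Hypothetical conjoint data: each row is a scenario described by binary
# attributes (1 = favorable level), rated by stakeholders on a 1-10 scale.
# Columns: low cost per patient, local expertise available, data resources.
scenarios = np.array([
    [1, 1, 1], [1, 1, 0], [1, 0, 1], [0, 1, 1],
    [1, 0, 0], [0, 1, 0], [0, 0, 1], [0, 0, 0],
])
ratings = np.array([9.1, 7.8, 6.9, 6.5, 5.2, 4.8, 3.9, 2.1])

# Estimate part-worth utilities by least squares (intercept + attributes).
X = np.column_stack([np.ones(len(scenarios)), scenarios])
part_worths, *_ = np.linalg.lstsq(X, ratings, rcond=None)

names = ["baseline", "low cost/patient", "local expertise", "data resources"]
for name, w in zip(names, part_worths):
    print(f"{name}: {w:+.2f}")
```

The attributes with the largest part-worths are the ones stakeholders weight most heavily when ranking candidate innovations.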

DIME, cont. (Hanbury et al., Implementation Science 2010)
- Mapping against theory-based characteristics, resulting in a score for each candidate (e.g., strength of evidence for the innovation's constructs, such as self-efficacy)
- ‘Diagnostic analysis’ with semi-quantitative surveys to assess whether the proposed innovation fits the local social networks, teams, and communication channels
- Synthesize to choose the innovation: one that stakeholders have identified as a priority area, that ranks high in the conjoint survey, and that maps well to the evidence and to practical considerations (see the scoring sketch below)
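The synthesis step can be made explicit as weighted scoring across the Develop-phase streams of evidence. A minimal sketch with hypothetical weights, scores, and candidate names (the maternal mental health example echoes the earlier slide; nothing here reproduces actual study data):

```python
# Hypothetical weighted scoring to rank candidate innovations.
# Criteria echo the DIME Develop phase: stakeholder priority, conjoint
# survey rank, strength of theory/evidence, and local network fit.
weights = {
    "stakeholder_priority": 0.30,
    "conjoint_rank": 0.30,
    "evidence_strength": 0.25,
    "network_fit": 0.15,
}

candidates = {
    "maternal mental health screening": {
        "stakeholder_priority": 9, "conjoint_rank": 8,
        "evidence_strength": 7, "network_fit": 8,
    },
    "brief alcohol intervention": {
        "stakeholder_priority": 6, "conjoint_rank": 7,
        "evidence_strength": 9, "network_fit": 5,
    },
}

def weighted_score(scores: dict) -> float:
    """Combine 0-10 criterion scores into one prioritization score."""
    return sum(weights[c] * scores[c] for c in weights)

for name in sorted(candidates, key=lambda n: -weighted_score(candidates[n])):
    print(f"{name}: {weighted_score(candidates[name]):.2f}")
```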

DIME, cont. Implementation Phase
- Piloting different strategies
- Review of ‘policy’ cost-effectiveness of different strategies: costs and practical factors for each option
- Detailing of components for fidelity and uptake (e.g., details of the numbers and types of sessions)
- Deliver to relevant groups

DIME, cont. Evaluation Phase
- Interrupted time series (e.g., snapshots to examine the impact on processes of care and outcomes)
- Pre-post test change in outcomes (surveys of individual behavior-change measures linked to theory-based constructs, team characteristics, and qualitative doer/non-doer analyses, opening up the ‘black box’ of the evaluation)
- Cost-effectiveness of implementation (micro-costs and the extent of behavior change achieved, combined to arrive at implementation cost-effectiveness)
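A minimal sketch of the interrupted-time-series idea: segmented regression with terms for the baseline trend, the level change at the intervention point, and the post-intervention slope change. The data are simulated; a real analysis would also check autocorrelation (e.g., with statsmodels):

```python
import numpy as np

# Simulated monthly process-of-care scores: 12 months before and 12 after
# an intervention introduced at month 12 (illustrative data only).
rng = np.random.default_rng(0)
months = np.arange(24)
post = (months >= 12).astype(float)                  # 1 after intervention
time_since = np.where(months >= 12, months - 12, 0)  # post-intervention slope
y = 50 + 0.2 * months + 6 * post + 0.5 * time_since + rng.normal(0, 1, 24)

# Segmented regression: intercept, baseline trend, level change, slope change.
X = np.column_stack([np.ones(24), months, post, time_since])
(intercept, trend, level_change, slope_change), *_ = np.linalg.lstsq(
    X, y, rcond=None)
print(f"baseline trend: {trend:.2f}/month")
print(f"level change at intervention: {level_change:.2f}")
print(f"slope change after intervention: {slope_change:.2f}/month")
```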

RE-AIM TO HELP PLAN, EVALUATE, AND REPORT STUDIES
R: Increase Reach
E: Increase Effectiveness
A: Increase Adoption
I: Increase Implementation
M: Increase Maintenance
(Glasgow et al., Ann Behav Med 2004;27(1):3-12)

PURPOSES OF RE-AIM
- To broaden the criteria used to evaluate programs to include external validity
- To evaluate issues relevant to program adoption, implementation, and sustainability
- To help close the gap between research studies and practice by:
  - Informing the design of interventions
  - Providing guides for adopters
  - Suggesting standard reporting criteria

RE-AIM Evaluation Questions
Reach: What percent of potentially eligible participants (a) were excluded, (b) took part, and (c) how representative were they?
Efficacy or Effectiveness: What was the impact on (a) all participants who began the program; (b) process, intermediate, and primary outcomes; and (c) both positive and negative (unintended) outcomes, including quality of life?
Adoption: What percent of settings and intervention agents within those settings (e.g., schools/educators, medical offices/physicians) (a) were excluded, (b) participated, and (c) how representative were they?

RE-AIM Evaluation Questions (cont.)
Implementation: Were the intervention components delivered as intended by the participating settings and intervention agents (e.g., schools/educators, medical offices/physicians)?
Maintenance: (a) What were the long-term effects? What was the attrition rate, and were drop-outs representative? (b) Were the different intervention components continued, and how was the original program modified?

Example: Tai Chi Intervention in a Community-Based Falls Prevention Program
Reach: those who qualified for the program divided by those who responded to the promotion materials. Representativeness: demographics of program participants compared with those coming to the center overall, using administrative data.
Effectiveness: change in functional status measures; quality-of-life measure (SF-12).
Adoption: proportion of centers approached that agreed to participate.
Implementation: did the trainers follow key elements of the protocol? Adherence to the plan; frequency of sessions; individuals doing the program at home; attendance level sustained.
Maintenance: plan to continue / actual continuation post-trial.
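The Reach and Adoption rows above reduce to simple proportions. A minimal sketch with invented counts for a program like this one (none of the numbers come from an actual trial):

```python
# Hypothetical counts for a community-based falls-prevention program.
responded_to_promotion = 420   # older adults who responded to materials
qualified = 300                # met program eligibility criteria
enrolled = 240                 # actually began the program
centers_approached = 20
centers_participating = 14

# Reach, as defined on the slide: qualified / responded to promotion.
reach = qualified / responded_to_promotion
participation = enrolled / qualified

# Adoption: proportion of approached centers that agreed to participate.
adoption = centers_participating / centers_approached

print(f"Reach (qualified/responded): {reach:.0%}")
print(f"Participation (enrolled/qualified): {participation:.0%}")
print(f"Adoption (participating/approached centers): {adoption:.0%}")
```

Representativeness would then be assessed by comparing participant demographics against the center's administrative data, as noted above.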

Example: P4H Evaluation
Reach: How did accommodating patients' circumstances change the reach?
Adoption: How were essential intervention components maintained, and how were protocols changed during implementation?
Implementation: How did accommodating personnel costs affect implementation processes? How did research team working relationships impact uptake?

Example: CBPR Approach to Evaluating a Program to Decrease Cancer Disparities in the Southern US
Problem: cancer disparities between African Americans and whites.
Goal: improve early cancer detection and preventive behaviors.
Evaluation methods: use a logic model and a CBPR process to develop, implement, and evaluate interventions that “capture the spirit of change” while maintaining measurable outcomes.
Outcomes at multiple levels:
- Process evaluation: inputs and planning strategies
- Impact evaluation: immediate effects assessed
- Outcome evaluation: medium- and long-term outcomes assessed
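One way to keep the three evaluation levels distinct is to encode the logic model explicitly. A minimal sketch; the indicator names are hypothetical placeholders, not the program's actual measures:

```python
# Logic-model skeleton for the cancer-disparities program, organized by
# the three evaluation levels on the slide. Indicators are hypothetical.
logic_model = {
    "Process (inputs and planning strategies)": [
        "community partners engaged",
        "planning meetings held",
        "outreach events delivered",
    ],
    "Impact (immediate effects)": [
        "change in screening knowledge",
        "screening appointments scheduled",
    ],
    "Outcome (medium- and long-term)": [
        "screening rates by race",
        "stage at diagnosis",
        "disparity gap over time",
    ],
}

for level, indicators in logic_model.items():
    print(f"{level}:")
    for indicator in indicators:
        print(f"  - {indicator}")
```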
