
1 Considerations When Providing Technical Assistance on Using Evidence December 13, 2010 Shawna L. Mercer, MSc, PhD, Director, The Guide to Community Preventive Services, Centers for Disease Control and Prevention (CDC) Cancer Prevention and Control Research Network, Prevention Research Center Program, CDC

2 Acknowledgements The material for this presentation on considerations when providing technical assistance on using evidence was provided by: The Cancer Prevention and Control Research Network of the Prevention Research Center Program, CDC

3 Overview ● Assessing the fit of organizational characteristics to match potential program, practice or policy characteristics ● Readiness to implement ● Adaptation

4 Program Planning Steps ● Planning & Assessment: What’s the problem? ● Setting Objectives: What do we want to achieve? ● Selecting Interventions: What works? ● Implementing: How do we do it? ● Evaluating: Did it work? How well?


7 Before You Hit the Ground Running ● Consider the goals, objectives and target audience for your proposed program, practice or policy (P³) ● Consider the characteristics of the organization or setting ● Then, select the best one to match those goals, objectives, and audience

8 Why it is important to understand evidence-based options Reading about potential options helps you to: ● Consider which matches your goals and audience ● Gauge fit to the community and organization, or the need for adaptation ● Review the program’s, practice’s, or policy’s methods, facilitator’s guide, or implementation protocol to understand the steps for delivery ● Learn the costs of implementation

9 Definition of Fit ● Overall compatibility between a program, practice or policy and:  The audience and community served and/or  The organization that will implement it ● Ideal Match

10 Criteria for Selecting an Evidence-Based Program/Practice/Policy ● Choose an option that is well-matched with:  The health topic  The audience  Setting or organizational capacity (do you have what is needed to implement it?)  Delivery methods that fit your organizational objectives & structure (using computer technology, calling participants for follow up, promoting access, making policy changes)
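The selection criteria on this slide amount to a simple screening rule: a candidate program fits only if it satisfies every criterion. A minimal sketch, assuming illustrative field names and an all-or-nothing rule (this is a demonstration, not a validated selection instrument):

```python
# Fit-screening sketch. CRITERIA names and the all-or-nothing rule are
# hypothetical assumptions for illustration, not part of any CPCRN tool.

CRITERIA = ("matches_health_topic", "matches_audience",
            "capacity_to_implement", "delivery_methods_fit")

def fits(candidate: dict) -> bool:
    """A candidate 'fits' only if it satisfies every criterion."""
    return all(candidate.get(c, False) for c in CRITERIA)

def best_fits(candidates: dict) -> list:
    """Return the names of candidate programs meeting all criteria."""
    return [name for name, c in candidates.items() if fits(c)]
```

In practice each criterion would be a judgment call made with stakeholders, not a boolean, but the screening logic is the same.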

11 Matching P³ to the Organization and Community ● Priorities and values ● Readiness for prevention ● Interface with other delivery methods and strategies in use ● New or existing partnerships ● Availability of technical assistance &/or training

13 Readiness for Implementation ● Define phases of implementation  Pre-implementation  Implementation  Maintenance ● Discuss important factors or tasks in each phase

14 Pre-Implementation ● Hiring staff or recruiting volunteers ● Program staff orientation  Intervention overview (e.g., components, outcomes)  Logic model (big picture of program, practice, or policy)  Materials/resources  Logistics

15 Pre-Implementation ● Training  Intervention (e.g., logic model, core elements)  Logistics for each component  Necessary knowledge about topic  Necessary skills for P³ (e.g., counseling/education, computer, etc.) ● Technical assistance from developers or researchers  Materials/components  Updating or adapting materials/components

16 Pre-Implementation ● Enlist community/stakeholder input  Best outreach/recruitment strategies  Estimate number in target population ● Incorporate previous needs assessment data ● Conduct formative research on any adapted materials  Feedback from expert panel  Focus groups/discussion with target populations  Pilot testing

17 Pre-Implementation ● Plan for program evaluation  Determine evaluation focus and questions (e.g., process, outcome) with stakeholders  Create or modify data collection tools  Discuss dissemination plans

18 Implementation ● Conduct the program, policy, or practice ● Track implementation of core elements of P³ (e.g., which components used, steps taken) ● Collect process measures (e.g., attendance, timeliness of activities, satisfaction with activities, support for policy, etc.) ● Monitor and evaluate activities
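The tracking tasks above (core elements delivered, attendance, timeliness, satisfaction) can be sketched as a minimal implementation log. All class and field names here are hypothetical, invented for illustration; real programs would use whatever tracking forms their protocol specifies:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class SessionRecord:
    """One delivered session, with the process measures named on the slide."""
    session_date: date
    core_elements_delivered: list  # e.g., ["group discussion", "goal setting"]
    attendance: int
    on_schedule: bool              # timeliness of the activity
    satisfaction: float            # mean participant rating, e.g., on a 1-5 scale

@dataclass
class ImplementationLog:
    planned_core_elements: list
    sessions: list = field(default_factory=list)

    def record(self, session: SessionRecord) -> None:
        self.sessions.append(session)

    def core_element_coverage(self) -> float:
        """Share of planned core elements delivered at least once."""
        delivered = set()
        for s in self.sessions:
            delivered.update(s.core_elements_delivered)
        return len(delivered & set(self.planned_core_elements)) / len(
            self.planned_core_elements)
```

A log like this makes the later monitoring and evaluation step concrete: coverage of core elements is one direct input to a fidelity assessment.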

19 Examples of Maintenance Activities If P³ is successful, consider: ● Referrals for further service, if needed ● Seeking additional funding ● Securing a program champion ● Making it a part of the organization’s standards

20 Learning about the Use of Evidence-based Programs, Practices, or Policies ● The field of translation and implementation science is still young  Will a known evidence-based option work in every setting? ● Document what works or does not work in terms of implementation and outcome measures  Activity logs (e.g., core elements, satisfaction)  Staff meeting minutes (e.g., staff burden and satisfaction)  Changes in behaviors, environment or policies

21 Fidelity versus Adaptation

22 Adaptation is… … making  Changes  Additions  Deletions  Substitutions to an evidence-based P³ in order to make it more suitable for a particular population and/or an organization’s capacity.

23 Program Fidelity ● Fidelity: faithfulness to the elements of the program, in the way it was intended to be delivered ● Components of fidelity*:  Adherence to program protocol/implementation guide  Dose or amount of program delivered  Quality of program delivery, and  Participant reaction and acceptance * Rabin, Brownson, Haire-Joshu, Kreuter, Weaver. A glossary for dissemination and implementation research in health. Journal of Public Health Management Practice, 2008, 14(2), 117–123.
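The four fidelity components from Rabin et al. (2008) can be combined into a single summary score. A minimal sketch, assuming each component is expressed as a proportion and weighting them equally; the equal weighting is my assumption, not part of the cited glossary:

```python
# Illustrative fidelity scoring only; function name and weighting are
# assumptions, not an instrument from the cited glossary.

def fidelity_score(adherence: float, dose: float,
                   quality: float, acceptance: float) -> float:
    """Average of the four fidelity components (adherence to protocol,
    dose delivered, quality of delivery, participant acceptance),
    each a proportion between 0 (none) and 1 (full)."""
    components = (adherence, dose, quality, acceptance)
    if not all(0.0 <= c <= 1.0 for c in components):
        raise ValueError("each component must be a proportion in [0, 1]")
    return sum(components) / len(components)

# Example: full adherence, 80% of sessions delivered, good quality,
# strong participant acceptance.
score = fidelity_score(adherence=1.0, dose=0.8, quality=0.9, acceptance=0.85)
# score == 0.8875
```

A composite like this is only a monitoring aid; the next slide's evidence suggests the relationship between fidelity and outcomes matters more than hitting 100%.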

24 Fidelity vs Adaptation: Some Evidence ● Review of over 500 studies showed a relationship between the level of implementation (fidelity) and program outcomes ● However, fidelity was below 100%; some adaptation always occurs, and there is some evidence that it improves outcomes Durlak, J. A. & DuPre, E. P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. Am J Comm Psych, 41.

25 Core Elements & Key Process Steps ● Core elements*:  Required components that represent the theory and internal logic of the intervention and most likely produce the intervention’s effectiveness ● Key process steps:  Required steps conducted to contribute to the intervention’s effectiveness  Critical steps taken in program implementation, as described in the program’s methods section or implementation protocol *Eke, Neumann, Wilkes, Jones. Preparing effective behavioral interventions to be used by prevention providers: the role of researchers during HIV prevention research trials. AIDS Education & Prevention 2006, 18(4 Suppl A):44-58.

26 Green, Yellow & Red Light Adaptations Provides guidance on whether a particular adaptation… …is safe (green) …should be made cautiously (yellow) …should be avoided (red)

27 Things That Can Probably Be Modified ● Names of health care centers or systems ● Pictures of people and places and quotes ● Hard-to-read words that affect reading level ● Wording to be appropriate to audience ● Ways to recruit your audience ● Incentives for participation ● Timeline (based on adaptation guides) ● Cultural preferences based on population

28 Things That Can Probably Be Modified: Proceed with Caution ● Substituting activities ● Adding activities to address other risk factors or behaviors ● Changing the order of the curriculum or steps (sequence)

29 Things That Cannot Be Modified ● The health communication model or theory ● The health topic/behavior ● Deleting core elements or whole sections of the program ● Reduction of program  Timeline  Dosage (e.g., activities, time/ session) ● Putting in strategies that detract from the core elements
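The green/yellow/red guidance on the preceding three slides is essentially a lookup from the type of proposed change to a traffic-light rating. A hypothetical sketch; the category labels paraphrase the slides, and the function and fallback message are illustrative, not an official adaptation tool:

```python
# Traffic-light adaptation lookup; categories paraphrase the slides,
# everything else is an assumption for illustration.

GREEN = {   # can probably be modified
    "names of health care centers", "pictures and quotes",
    "reading level", "wording for audience", "recruitment methods",
    "incentives", "timeline per adaptation guides", "cultural preferences",
}
YELLOW = {  # proceed with caution
    "substituting activities", "adding activities",
    "changing activity sequence",
}
RED = {     # cannot be modified
    "underlying model or theory", "health topic",
    "deleting core elements", "reducing timeline or dosage",
    "strategies that detract from core elements",
}

def adaptation_light(change: str) -> str:
    """Return 'green', 'yellow', or 'red' for a proposed change."""
    if change in GREEN:
        return "green"
    if change in YELLOW:
        return "yellow"
    if change in RED:
        return "red"
    return "unknown: consult the program developer"
```

Encoding the guidance this way also makes the fallback explicit: any change not covered by the slides should go back to the developer or technical assistance provider.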

30 Key Message: If you choose an evidence-based program to adopt, do not change its core elements or key process steps.

32 Example with Policy/Process Steps

33 Cancer Prevention and Control Research Network (CPCRN)
Training Topic | Important Concepts
What do we mean by evidence-based? | Define evidence-based (EB); continuum of evidence; benefits of using EB strategies/programs
Needs assessment & program planning | Community and target audience analysis; determinants of behavior
Finding an evidence-based strategy or program | Sources of EB strategies & programs; selecting a strategy or program (“fit”)
Adapting the evidence-based program to meet your needs | Define adaptation, core elements, fidelity; discuss what can & cannot be changed
Evaluating your program | Process evaluation (e.g., implementation, fidelity, program adaptations); outcome evaluation

34 For more information on how to select an evidence-based program or policy that meets the needs and constraints of your community: Kurt Ribisl, PhD Principal Investigator – CPCRN Coordinating Center at UNC Chapel Hill The findings and conclusions in this presentation are those of the presenters and do not necessarily represent the views of CDC.