Promoting Science-based Approaches: Bridging Research and Practice by Integrating Research to Practice Models and Community-Centered Models (ISF). Abraham Wandersman.


Promoting Science-based Approaches: Bridging Research and Practice by Integrating Research to Practice Models and Community-Centered Models (ISF). Abraham Wandersman, University of South Carolina. April 2010

MAKING A DIFFERENCE

HOW DO WE GET THERE?

THE 2015 TARGET DATE FOR ELIMINATING SUFFERING AND DEATH DUE TO CANCER:

AMBITIOUS GOALS

Dr. von Eschenbach: I believe we are at what I call a strategic inflection in biology, which means we're at a point of unprecedented growth in three key areas related to cancer research: knowledge, technology, and resources. The integration of growth in these three sectors provides an opportunity for exponential progress. To achieve this progress, we must set a clear direction and focus our efforts into a cohesive strategy.

The goal of eliminating suffering and death due to cancer provides this focus. It does not mean "curing" cancer but, rather, it means that we will eliminate many cancers and control the others, so that people can live with -- not die from -- cancer. We can do this by 2015, but we must reach for it. We owe it to cancer patients around the world -- and their families -- to meet this challenge. May 16, 2003 BenchMarks

HEALTHY PEOPLE 2010

Healthy People 2010 Objectives. Target: 1.0 new case per 100,000 persons. Baseline: 19.5 cases of AIDS per 100,000 persons aged 13 years and older in 1998. Data are estimated; adjusted for delays in reporting. Target-setting method: better than the best. Data source: HIV/AIDS Surveillance System, CDC, NCHSTP.
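As a small added illustration (not part of the original slide), the size of the gap the objective implies can be computed directly from the two numbers on the slide:

```python
# Values taken from the Healthy People 2010 slide above;
# the arithmetic itself is an added illustration.

baseline = 19.5  # new AIDS cases per 100,000 persons aged 13+ (1998)
target = 1.0     # target: new cases per 100,000 persons

# Fraction of the baseline rate that must be eliminated to hit the target.
reduction = (baseline - target) / baseline
print(f"Required reduction: {reduction:.1%}")  # -> Required reduction: 94.9%
```

In other words, the "better than the best" target demands eliminating roughly 95% of the baseline incidence, which is part of why the gap between such goals and practice matters.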

[Table: New AIDS cases per 100,000 persons aged 13 years and older, 1998, by gender, race and ethnicity (American Indian or Alaska Native; Asian or Pacific Islander; Black or African American; White; Hispanic or Latino; not Hispanic or Latino), family income level (poor; near poor; middle/high income), and sexual orientation; DNC = data not collected. Numeric values were not preserved in this transcript.] In 2007, there were 42,495 new cases of HIV/AIDS in adults and adolescents.

DATA - EVIDENCE

WHY IS EVIDENCE/SCIENCE NOT USED MORE?

Expanding Research and Evaluation Designs…for QII Carolyn M. Clancy, MD Director, AHRQ September 13, 2005

[Figure: the research-to-practice pipeline. It takes an average of 17 years for 14 percent of original research to reach the benefit of patient care. Findings are lost along the way to negative results, lack of numbers, inconsistent indexing, and reliance on expert opinion, with lags at each stage: original research, acceptance, publication, bibliographic databases, reviews/guidelines/textbooks, and implementation (Dickersin, 1987; Koren, 1989; Balas, 1995; Poynard, 1985; Kumar, 1992; Poyer, 1982; Antman, 1992).]

Treatments Thought to Work but Shown Ineffective: sulphuric acid for scurvy; leeches for almost anything; insulin for schizophrenia; vitamin K for myocardial infarction; HRT to prevent cardiovascular disease; flecainide for ventricular tachycardia; routine blood tests prior to surgery; ABMT for late-stage breast cancer. (BMJ 2002;324:474-5.)

THE GAP BETWEEN SCIENCE AND PRACTICE IN THE DOCTOR’S OFFICE

OVERALL 54.9% RECEIVED RECOMMENDED CARE ASCH ET AL STUDY, NEJM, 2006

POSSIBLE SOLUTION: THE VA MEDICAL SYSTEM HAS 67% RECOMMENDED CARE. THE SYSTEM HAS ELECTRONIC MEDICAL RECORDS, DECISION SUPPORT TOOLS, AUTOMATED ORDER ENTRY, ROUTINE MEASUREMENT AND REPORTING ON QUALITY, AND INCENTIVES FOR PERFORMANCE.

As Yogi Berra supposedly said, "In theory there is no difference between theory and practice, but in practice there is."

* Why is there a gap between science and practice?

* What is the dominant scientific paradigm for developing research evidence and disseminating it?

*Why is this science model necessary but not sufficient?

*What is the responsibility of the practitioner to deliver evidence-based interventions and what is their capacity to do so?

*What is the responsibility of funders to promote the science of evidence-based interventions and to promote the practice of effective interventions in our communities?

How can evaluation help providers, local CBOS and coalitions, health districts, and state agencies reach results-based accountability ?

Two Routes to Getting To Outcomes (GTO): A) Bridging Science and Practice B) Empowerment Evaluation

 Research To Practice   Practice To Research  CLOSING THE GREAT DIVIDE

1. Identify the problem or disorder(s) and review information to determine its extent. 2. With an emphasis on risk and protective factors, review relevant information, both from fields outside prevention and from existing preventive intervention research programs. 3. Design, conduct, and analyze pilot studies and confirmatory and replication trials of the preventive intervention program. 4. Design, conduct, and analyze large-scale trials of the preventive intervention program. 5. Facilitate large-scale implementation and ongoing evaluation of the preventive intervention program in the community. (Feedback loop.)

FIGURE 1.1 The preventive intervention research cycle. Preventive intervention research is represented in boxes three and four. Note that although information from many different fields in health research, represented in the first and second boxes, is necessary to the cycle depicted here, it is the review of this information, rather than the original studies, that is considered part of the preventive intervention research cycle. Likewise, for the fifth box, it is the facilitation by the investigator of the shift from research project to community service program with ongoing evaluation, rather than the service program itself, that is part of the preventive intervention research cycle. Although only one feedback loop is represented here, the exchange of knowledge among researchers and between researchers and community practitioners occurs throughout the cycle.

Gates Foundation example. Preventive intervention: vaccine/drug mechanism, syringes, physician, health system. Support system: medical schools, government funding.

From Research to “Best Practices” in Other Settings and Populations. Larry Green, American Journal of Health Behavior. 1) Process; 2) Control; 3) Self-Evaluation; 4) Tailoring Process and New Technology; 5) Synthesizing Research.

Getting to Outcomes 1) Needs/Resources 2) Goals 3) Best Practice 4) Fit 5) Capacities 6) Plan 7) Process Evaluation 8) Outcome Evaluation 9) CQI 10) Sustain

“Prevention Science” (intervention: basic research, efficacy, effectiveness, services research) to Practice (community organizational systems: 1) schools; 2) health agencies; 3) community coalitions), linked by a Prevention Support System (funders): training, technical assistance, funding. Green characteristics: 1) Process; 2) Control; 3) Self-Evaluation; 4) Tailoring Process and New Technology; 5) Synthesizing Research.

The Interactive Systems Framework (ISF): Distilling the Information (Prevention Synthesis & Translation System: synthesis, translation); Supporting the Work (Prevention Support System: general capacity building, innovation-specific capacity building); Putting It Into Practice (Prevention Delivery System: general capacity use, innovation-specific capacity use). All three systems operate within the macro policy climate, funding, and existing research and theory.

ROUTE B: EMPOWERMENT EVALUATION

What Can Steve Spurrier Teach Us about Loving Evaluation?

Figure 2. Overview of the development of a community coalition. Coalition formation: a lead agency and an ad hoc committee of community leaders form committees (religion, education, business, parents, youth, health, media, grassroots/neighborhood, criminal justice) that conduct a needs assessment; chairpersons consolidate the work of the individual committees, resulting in a comprehensive community plan. Maintenance: the plan results in implementation. Outcomes: implementation results in impact on community health indicators.

Table 1. Evaluation of MPA by Developmental Phases, Ecological Levels, and Stages of Readiness. Measures are mapped against five ecological levels (intrapersonal, interpersonal, organizational, community, public policy) and stages of readiness across three phases. Phase 1, coalition formation (initial mobilization and establishing organizational structure): Forecast; Meeting Effectiveness Inventory; Project Insight Form; committee survey; Needs Assessment Checklist; Plan Quality Index. Phase 2, plan implementation (building capacity for action and implementing): Tracking of Actions; Prevention Plus III; Policy Analysis Case Study. Phase 3, impact (refining and institutionalizing): Key Leader survey; community survey; trend data; Level of Institutionalization Scale.

Outcome Evaluation AB

Ready → Shoot → Aim = Plan → Implement → No Results

Ready → Aim → Shoot → Close = Plan → Implement → CQI

Ready → Aim → Hit = Plan → Implement → Results

Empowerment Evaluation: An evaluation approach that aims to increase the probability of achieving program success by:

a) Providing program stakeholders with tools for assessing the planning, implementation, and self-evaluation of their program, and

b) Mainstreaming evaluation as part of the planning and management of the program/organization.

EE Principles. Core Principles of Empowerment Evaluation: Principle 1: Improvement; Principle 2: Social Justice; Principle 3: Inclusion; Principle 4: Democratic Participation; Principle 5: Capacity Building; Principle 6: Organizational Learning; Principle 7: Community Ownership; Principle 8: Community Knowledge; Principle 9: Evidence-Based Strategies; Principle 10: Accountability.

Accountability Questions (with relevant literatures in brackets):
1. What are the underlying needs and conditions that must be addressed? (NEEDS/RESOURCES) [Needs/resource assessment]
2. What are the goals, target population, and objectives (i.e., desired outcomes)? (GOALS) [Goal setting]
3. What science-based (evidence-based) models and best-practice programs can be used in reaching the goals? (BEST PRACTICE) [Literature on science-based and best-practice programs]
4. What actions need to be taken so the selected program “fits” the community context? (FIT) [Feedback on comprehensiveness and fit of program]
5. What organizational capacities are needed to implement the program? (CAPACITIES) [Assessment of organizational capacities]
6. What is the plan for this program? (PLAN) [Planning]
7. Is the program being implemented with quality? (PROCESS) [Process evaluation]
8. How well is the program working? (OUTCOME EVALUATION) [Outcome and impact evaluation]
9. How will continuous quality improvement strategies be included? (IMPROVE) [Total quality management; continuous quality improvement]
10. If the program is successful, how will it be sustained? (SUSTAIN) [Sustainability and institutionalization]
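As an added sketch (not from the slides), the ten accountability questions lend themselves to a simple checklist data structure; the function and example names below are illustrative only:

```python
# Hypothetical sketch: the ten GTO accountability questions as a checklist.
# Labels and question text follow the table above; the code is an assumption.

GTO_STEPS = [
    ("NEEDS/RESOURCES", "What are the underlying needs and conditions to address?"),
    ("GOALS", "What are the goals, target population, and desired outcomes?"),
    ("BEST PRACTICE", "Which science-based programs can reach the goals?"),
    ("FIT", "What actions make the selected program fit the community context?"),
    ("CAPACITIES", "What organizational capacities are needed to implement it?"),
    ("PLAN", "What is the plan for this program?"),
    ("PROCESS", "Is the program being implemented with quality?"),
    ("OUTCOME EVALUATION", "How well is the program working?"),
    ("IMPROVE", "How will continuous quality improvement be included?"),
    ("SUSTAIN", "If successful, how will the program be sustained?"),
]

def unanswered(answers: dict) -> list:
    """Return the labels of GTO steps that still lack an answer."""
    return [label for label, _question in GTO_STEPS if not answers.get(label)]

# Example: a program that has completed only the first two steps.
progress = {
    "NEEDS/RESOURCES": "County needs assessment completed",
    "GOALS": "Reduce new cases among adolescents",
}
print(len(unanswered(progress)))  # -> 8
```

Working through the list until `unanswered` is empty mirrors the empowerment-evaluation idea of building self-assessment into routine planning rather than treating evaluation as an afterthought.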

What Is Getting To Outcomes? By Matthew Chinman, Pamela Imm & Abraham Wandersman. A system based on ten empowerment evaluation and accountability questions that contain elements of successful programming. Published by the RAND Corporation (quality review). Available at no cost at: A “Best Practice Process” (CSAP).

The Getting To Outcomes Process: #1 Needs/Resources; #2 Goals; #3 Best Practices; #4 Fit; #5 Capacities; #6 Plan; #7 Process; #8 Outcome Evaluation; #9 Improve; #10 Sustain.

GTO-04 Manual

WINNERS Example

GTO-04 Manual: up-to-date model program descriptions

GTO-04 Manual: risk and protective factor based

Uses of GTO Individual Program Level (e.g., WINNERS) –Still a guide to planning, implementation, evaluation –Use data to continually improve –Determine effectiveness in one program Coalition Level (e.g., CDC grant) –Each committee monitors own programs –Direct TA for program improvement –Fulfill the whole coalition’s accountability requirements State/Federal Level (e.g., S.C. SIG grant) –Monitor several similar programs at once across large area –Aggregate program data for state-wide reporting and within state comparisons –Highlight specific technical assistance needs across the state

GTO ® 2009 Using Getting To Outcomes to improve communities' capacity to conduct high-quality prevention programming: A Centers for Disease Control & Prevention empirical example. Chinman et al. (2008), American Journal of Community Psychology

GTO Demonstration & Evaluation. Purpose: evaluate a 2-year GTO intervention to improve prevention capacity and program performance (CDC-funded participatory research grant). Sample: 2 prevention coalitions (SC, CA) involving 10 programs & 268 coalition staff. Design: quasi-experimental, mixed methods. Within each coalition, assignment by program (GTO: 2 SC + 4 CA vs. comparison: 2 SC + 2 CA); intervention: participate in GTO; comparison: usual practice.

Getting To Outcomes Evaluation: Conclusions  GTO improved practitioner capacity & performance of tasks associated with high-quality prevention (planning, evaluation, etc.), and programs that used GTO showed greater outcomes  Those with greater exposure to GTO demonstrated more gains in capacity  TA hours show that practitioners mostly wanted & got help with evaluation activities  GTO can be difficult to absorb without ongoing TA  Organizational issues were a major factor  Conversion to a “learning organization” was not complete  Resources are a significant barrier to adoption, implementation, and sustainability  The incentive structure within which coalitions operate is not aligned with CQI

GTO Evaluation: Conclusions  Technical assistance to use the steps is critical to the success of GTO  Organizational issues can be a major factor  Lack of resources pose significant barriers to adoption, implementation, and sustainability

“Prevention Science” (intervention: basic research, efficacy, effectiveness, services research) to Practice (community organizational systems: 1) schools; 2) health agencies; 3) community coalitions), linked by a Prevention Support System (funders): training, technical assistance, funding. Green characteristics: 1) Process; 2) Control; 3) Self-Evaluation; 4) Tailoring Process and New Technology; 5) Synthesizing Research.

The Interactive Systems Framework (ISF): Distilling the Information (Prevention Synthesis & Translation System: synthesis, translation); Supporting the Work (Prevention Support System: general capacity building, innovation-specific capacity building); Putting It Into Practice (Prevention Delivery System: general capacity use, innovation-specific capacity use). All three systems operate within the macro policy climate, funding, and existing research and theory.

EXAMPLE COMBINING *BRIDGING RESEARCH AND PRACTICE (ISF) AND *GTO

GTO ® 2009 Teen Pregnancy Prevention The Promoting Science Based Approaches Project CDC Adolescent Reproductive Health Team

The Barriers to Use of Science-Based Approaches (SBA) Funding for training and materials Implementation funding Fear of controversy Lack of motivation (why use SBA?) Suitability for own community Ease of implementation Loyalty to current strategies Philliber, Nolte & Schauer, in prep

The Challenge: the teen pregnancy field has a growing number of effective prevention programs. However, programs are not being implemented as widely or as effectively as needed to combat rising teen pregnancy.

The Interactive Systems Framework (ISF): Distilling the Information (Prevention Synthesis & Translation System: synthesis, translation); Supporting the Work (Prevention Support System: general capacity building, innovation-specific capacity building); Putting It Into Practice (Prevention Delivery System: general capacity use, innovation-specific capacity use). All three systems operate within the macro policy climate, funding, and existing research and theory.

PSBA Activities: National Grantees Develop trainings & other tools to build capacity to use SBA Use tools to build capacity of state & regional grantee organizations Disseminate information about SBA to a broad audience GTO

PSBA Activities: State Coalitions and RTCs. More intensive: provide targeted technical assistance to a small number of local organizations (5-10) to increase their capacity to use SBA locally. Less intensive: provide information and resources re: SBA to broad audiences within the state/region through newsletters, websites, etc. GTO

The Interactive Systems Framework (ISF): Distilling the Information (Prevention Synthesis & Translation System: synthesis, translation); Supporting the Work (Prevention Support System: general capacity building, innovation-specific capacity building); Putting It Into Practice (Prevention Delivery System: general capacity use, innovation-specific capacity use). All three systems operate within the macro policy climate, funding, and existing research and theory.

GTO System Model (to achieve desired outcomes): Current Level of Capacity + Training + TA + Tools + QI/QA + GTO Steps ((1) Needs & Resources; (2) Goals & Desired Outcomes; (3) Science-Based Practices; (4) Fit; (5) Capacity; (6) Plan; (7) Implementation & Process Evaluation; (8) Outcome Evaluation; (9) Continuous Quality Improvement; (10) Sustainability) = Actual Outcomes Achieved.

Levels & AIDS Treatment: the ten accountability questions (1. Needs/Resources; 2. Goals; 3. Evidence-Based Practices; 4. Fit; 5. Capacity; 6. Plan; 7. Implementation; 8. Outcome Evaluation; 9. CQI; 10. Sustainability) apply at each level: country, state, health district, FQHC, and provider.

The GTO PANORAMA

GTO Steps: 1. Needs and Resources; 2. Goals and Objectives; 3. Best Practices; 4. Fit; 5. Capacities; 6. Plan; 7. Process Evaluation; 8. Outcome Evaluation; 9. CQI; 10. Sustainability.

GTO Content: the ten GTO steps (Needs and Resources through Sustainability) combined with substance-abuse-specific content.

GTO Content Domains: the ten GTO steps combined with content from the GTO Content Library: substance abuse, systems of care, performance contracting, emergency preparedness, underage drinking, youth development, patient-centered care, and teen pregnancy.

GTO Levels: the same ten GTO steps, each paired with content (e.g., substance abuse) from the GTO Content Library, repeat at every level: individual, organizational, county, state, and national.

GTO Support System: training, technical assistance, QI/QA, and tools support the ten GTO steps and their paired content at every level (individual, organizational, county, state, national), drawing on the GTO Content Library (substance abuse, systems of care, performance contracting, emergency preparedness, underage drinking, youth development, patient-centered care, teen pregnancy).
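The panorama slides above describe a cross-product: steps × levels × content domains. A minimal added sketch (the lists follow the slides; the code itself is an illustration, not part of GTO):

```python
# Illustrative sketch of the GTO "panorama" as a cross-product of
# steps, levels, and content domains named on the slides.
from itertools import product

steps = ["Needs and Resources", "Goals and Objectives", "Best Practices",
         "Fit", "Capacities", "Plan", "Process Evaluation",
         "Outcome Evaluation", "CQI", "Sustainability"]
levels = ["Individual", "Organizational", "County", "State", "National"]
content_library = ["Substance Abuse", "Systems of Care", "Performance Contracting",
                   "Emergency Preparedness", "Underage Drinking",
                   "Youth Development", "Patient Centered Care", "Teen Pregnancy"]

# Each (level, content, step) cell is a point where the support system
# (training, technical assistance, QI/QA, tools) can be applied.
cells = list(product(levels, content_library, steps))
print(len(cells))  # -> 400 (5 levels x 8 domains x 10 steps)
```

The point of the sketch is the scale: the same ten questions recur in 400 different level-by-domain contexts, which is why a dedicated support system, rather than one-off training, is emphasized.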

As Yogi Berra supposedly said: “It’s déjà vu all over again.”

* Why is there a gap between science and practice?

* What is the dominant scientific paradigm for developing research evidence and disseminating it?

*Why is this science model necessary but not sufficient?

*What is the responsibility of the practitioner to deliver evidence-based interventions and what is their capacity to do so?

*What is the responsibility of funders to promote the science of evidence-based interventions and to promote the practice of effective interventions in our communities?

How can evaluation help providers, local CBOs and coalitions, health districts, and state agencies reach results-based accountability ?

As Yogi Berra supposedly said, "If you see a fork in the road, take it."

References

Chinman, M., Hunter, S. B., Ebener, P., Paddock, S. M., Stillman, L., Imm, P., & Wandersman, A. (2008). The Getting To Outcomes demonstration and evaluation: An illustration of the prevention support system. American Journal of Community Psychology, 41.

Wandersman, A. (2003). Community science: Bridging the gap between science and practice with community-centered models. American Journal of Community Psychology, 31(3/4).

Wandersman, A., Duffy, J., Flaspohler, P., Noonan, R., Lubell, K., Stillman, L., et al. (2008). Bridging the gap between prevention research and practice: The Interactive Systems Framework for Dissemination and Implementation. American Journal of Community Psychology, 41.

Lesesne et al. (2008). Promoting science-based approaches to teen pregnancy prevention. American Journal of Community Psychology.

Wandersman, A. (2009). Four keys to success (theory, implementation, evaluation, resource/system support): High hopes and challenges in participation. American Journal of Community Psychology, 43(1/2), 3-21.