National Evaluation of the ATTC Network: Findings and Recommendations
Presentation to ATTC Directors, November 4, 2010

Richard Finkbiner, Ph.D., Project Director
Megan Cummings, B.A.
Margaret Gwaltney, M.B.A., Deputy Project Director
Cori Sheedy, M.A.
Roy M. Gabriel, Ph.D., Principal Investigator
Jeffrey R.W. Knudsen, M.A.

Overview of Today's Presentation
- Evaluation Purpose and Approach (review)
- Evaluation Design and Data Collection (review)
- Final Reporting Approach
- 7 Findings from the National ATTC Evaluation
- 5 Recommendations

Some History: 2005–2010
Planning the evaluation: 2005–2007
- Concept and CSAT intent for a national evaluation of the ATTC Network launched at the November ATTC Directors' Meeting (Portland).
- Abt and RMC staff continued to engage ATTC Directors and stakeholders in evaluation planning, logic model development, and instrument development through the fall.
- OMB instrument package submitted in the fall.
- MANILA, Abt, and RMC competed for and won the CSAT award to implement the evaluation in summer/fall.
Implementing the evaluation: 2007–2010
- ATTC site visits, key informant interviews, and Regional Advisory Board Surveys began winter 2008 and concluded in spring.
- Customer Satisfaction and Benefit Survey began winter 2009 and concluded in spring.
- Change in Practice Study began winter 2008 and concluded in spring.
- Participation in every ATTC Directors' meeting during this period.

CSAT Goals for the Evaluation
- Identify the successes of technology transfer efforts and build upon them in the future.
- Share lessons learned for the enhancement of all regions' activities.
- Distinguish between region-specific and more cross-regional processes and outcomes.

Review of Evaluation Design
- 8 evaluation questions
- 3 interconnected studies
  - Planning, Partnering and Service Delivery Study
  - Customer Satisfaction and Benefit Study
  - Change in Practice (CIP) Study
- 12 data collection activities
  - 3 secondary data sources (GPRA, existing reports, etc.)
  - 4 tied to the CIP study

Logic Model for the National Evaluation of the ATTC Network
Inputs: resources; funder goals and mandates; regional/state needs
Scope: state/regional; multi-region; national
Activities: training; technical assistance; meetings/conferences; product development; research dissemination; partnership development; academic programming
Objectives: awareness raising; skill building; change in practice
Outcomes: program outputs; immediate outcomes; intermediate outcomes; long-term outcomes
Mediating factors include: geographic expanse/density of service population; culture/diversity of target population; longevity of ATTC contractor; longevity of SSA directors; state/regional addiction treatment infrastructure; organizational readiness

Planning, Partnering & Service Delivery Study: Data Collection Activities (Review) Site visits to each regional ATTC and the ATTC National Office, consisting of interviews and focus groups with ATTC staff and consultants ATTC Regional Advisory Board Survey Key informant telephone interviews with ATTC stakeholders within each region Analyses of data from ATTC Event and Activity Reporting Database 7

Customer Satisfaction and Benefit Study: Data Collection Activities (Review)
- Primary data collection activity: Customer Satisfaction and Benefit Survey (CSBS)
- Secondary data: integrating findings from ATTC regional site visits, key informant interviews, Regional Advisory Board Surveys, and GPRA

Change in Practice Study: Data Collection Activities (Review)
- Critical Action Surveys
  - Clinical Supervision (CS)
  - Motivational Interviewing (MI)
  - Treatment Planning MATRS (Tx PM)
- Success Case Interviews
- Audiotape Reviews (MI only)
- Organizational Readiness for Change Survey (Tx PM only)

Completed Data Collection Activities
- 15 site visits: all ATTCs and the National Office
- 14 Regional Advisory Board Surveys (N = 198, 56% response rate)
- N = 162 key informant interviews (86%)
  - Includes 49 SSA/representative interviews (96%)
- GPRA analysis
  - N = 102,000 participant records from 2005–2007
  - N = 140,000 participant records from 2008–2010
- ATTC Event and Activity Reporting Database
  - N = 4,787 event records, 2008–2010

Completed Data Collection Activities (cont.)
- N = 1,954 Customer Satisfaction and Benefit Surveys
- N = 388 Critical Action Surveys (76% response rate)
- N = 67 Success Case Interviews (93% of target)
- N = 9 MI Audiotape Surveys (12% of target)
- N = 16 Organizational Readiness for Change Surveys

Final Reporting Approach
- Final report organized around 7 main evaluation findings (not presented by sub-study, evaluation question, or data collection method) and 5 recommendations
  - Building upon the interconnected nature of the data collections in each of the 3 studies
  - A synthesis of results that increases confidence in the convergent validity and robustness of these findings for program and policy development
- 3 sub-study reports to follow (Jan. 2011)
  - Webinars for ATTCs on these more specific results
- Presentations
- Journal articles

Today's Presentation
- Finding 1: Richard Finkbiner, MANILA
- Findings 2 and 3: Cori Sheedy, Abt
- Findings 4 and 5: Roy Gabriel, RMC
- Findings 6 and 7: Jeff Knudsen, RMC
- Recommendations: Roy Gabriel, RMC

Suggested Ground Rules for Questions and Discussion in Today's Presentation
- During the presentation of each finding, restrict questions to clarification of our data sources, interpretation, etc.
- Reserve more reflective observations and implications until after all 7 findings have been presented.
- Discussion of the 5 recommendations (during the working lunch) can be more dynamic and spontaneous.

FINDING 1
The vast majority of ATTC customers and stakeholders perceive the benefits they receive from ATTC services to be extremely valuable. Respondents who most often participated in ATTC services reported the greatest benefits.
- High levels of satisfaction and perceived benefits of ATTC products and services
- CSBS sample heterogeneous and representative of the addictions treatment workforce
- Frequency of participation in a variety of ATTC events strongly related to perceived benefits
- Motivation to participate in ATTC events
- Participation in and satisfaction with specific types of ATTC services varies by role

Customer Satisfaction and Benefit Survey: Response Rates
- Raw response rate: 48.6%
- "Trimmed" response rate: 54.6%
- Range across regions: 33.6% –
- Median response rate: 58.3%
Source: CSBS (n = 1,954)
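For readers who want to see how rates like these are typically computed, the sketch below shows one way to derive raw, trimmed, and regional response rates in Python. The counts, the per-region breakdown, and the assumption that the "trimmed" rate simply excludes undeliverable surveys from the denominator are all hypothetical; the evaluation's actual sampling and trimming rules are not described in this presentation.

```python
from statistics import median

# Hypothetical per-region survey counts (illustrative only; not the evaluation's data).
# "undeliverable" stands in for whatever exclusions define the "trimmed" denominator.
regions = {
    "Region A": {"sampled": 300, "undeliverable": 30, "completed": 150},
    "Region B": {"sampled": 250, "undeliverable": 20, "completed": 100},
    "Region C": {"sampled": 280, "undeliverable": 25, "completed": 160},
}

def trimmed_rate(counts):
    """Completed surveys divided by the sampled pool minus exclusions."""
    return counts["completed"] / (counts["sampled"] - counts["undeliverable"])

total_completed = sum(r["completed"] for r in regions.values())
raw_rate = total_completed / sum(r["sampled"] for r in regions.values())
trimmed = total_completed / sum(r["sampled"] - r["undeliverable"] for r in regions.values())
regional_rates = [trimmed_rate(r) for r in regions.values()]

print(f"Raw response rate:     {raw_rate:.1%}")
print(f"Trimmed response rate: {trimmed:.1%}")
print(f"Range across regions:  {min(regional_rates):.1%} to {max(regional_rates):.1%}")
print(f"Median regional rate:  {median(regional_rates):.1%}")
```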

Participation in Different Types of ATTC Activities

Beyond Satisfaction: Perceived Benefits of ATTC Technology Transfer

Perceived Benefits by Varying "Dosage" of Participation in ATTC Events

Perceived Benefits by Participants' Motivation for Attending ATTC Events

Group | Percent of CSBS Sample | Overall ATTC Satisfaction | Overall Increase in Knowledge | Overall Improved My Job Skills
High external motivation | 1.8% | 74% | 87% | –
Some external motivation | 7.3% | 88% | 91% | 87%
Neutral | 42.7% | 89% | 94% | 87%
Some internal motivation | 19.3% | 94% | 95% | 89%
High internal motivation | 29.0% | 97% | 98% | 95%

Source: CSBS (n = 1,954)

FINDING 2
ATTCs succeed in balancing the needs, interests, and requirements of diverse stakeholders within available resources. The variety of stakeholders creates a strategic tension, which ATTCs address by delivering services that have a broad reach and are less resource-intensive.
- Leveraging funding resources
- Relationship-building across the ATTC Network
- Mechanisms for identifying workforce priorities and stakeholder needs and interests
- Assessing and evaluating needs to strategically plan services
- Partnering in service delivery
- Bundling of services
- Consequences of ATTCs meeting the diverse array of needs and requirements

Leveraging Funding Resources
- The majority of ATTCs have actively leveraged SAMHSA/CSAT funding, and in some cases this additional funding is considerable.
  - One ATTC has nearly doubled its SAMHSA/CSAT funding, with an operating budget of more than $1,000,000. (Site Visit)
  - "The ATTC [grant] is the springboard to receive these other funds." (ATTC Director)
- Additional funding includes supplemental SAMHSA/CSAT funding and funding from NIDA, States, local organizations, and private foundations.
- ATTCs combine funding streams (ATTC funds, partner resources, program income, etc.) to provide services.
  - A little less than one third (27%) of ATTC activities had 2 or more funding sources. (E&A Database)

Relationship-Building Across the Network
- ATTCs work with and across a broad array of stakeholders, spending considerable time and effort identifying partners and building and maintaining relationships.
- During interviews, stakeholders identified the ATTCs' efforts to collaborate with them and to maintain partnerships as a key factor in their satisfaction with the ATTCs.
- ATTCs also view the extensive effort they devote to building and maintaining relationships as crucial to determining the most appropriate services for the region.
- The methods used to build relationships vary and depend on the partner, preexisting relationships or personal connections with that partner, the partner's location, and the expected outcome of the partnership.

Mechanisms for Identifying Priorities and Assessing and Evaluating Needs
- ATTC mechanisms for identifying priorities and needs:
  - ATTCs continually seek input from customers about regional priorities and needs through both formal and informal channels.
  - Mechanisms typically include Regional Advisory and other boards, regularly scheduled and ad hoc communications and meetings, and participation in external committees and boards.
- Assessing and evaluating needs to strategically plan services:
  - Processes include holding annual ATTC staff retreats to discuss priorities and develop the ATTC's work plan, conducting and analyzing needs assessments, working directly with the SSAs to prioritize the State's needs, and utilizing decision-making tools.
  - In responding to different needs, a single activity will often try to meet the needs of several different stakeholder groups.
  - Notably absent in the planning of services is a discussion of the expected outcomes of the activities planned.

Partnering in Service Delivery
- ATTC partnerships serve as a key platform for effective service delivery and are an effort to extend the ATTCs' reach to more participants and to obtain additional support and funding for their activities.
- ATTCs partner with regional, State, and local organizations and agencies; national and Federal organizations; and other ATTCs, among others, to deliver services. Partners bring different resources and expertise to the relationship.
  - 84% of events between January 1, 2008, and June 30, 2010 (4,031 events) were delivered in partnership with other organizations, and 23% of events had multiple partners. The most frequently cited partners included SSAs (41% of events), another ATTC (24%), State/regional organizations (23%), and community treatment programs (10%). (E&A Database)

Bundling of Services
- "Bundling" of services occurs when a series of ATTC activities is coordinated at the regional and State levels or when services are delivered as a series on a specific topic.
  - More than one third (38%) of ATTC events were delivered as part of a series and in collaboration with other services. (E&A Database)
- Bundling of services has been shown to facilitate successful implementation of EBPs and organizational change (Fixsen & Blase, 2009).[1]
- During interviews, stakeholders said there is a need for the ATTCs to provide services aimed at changing practice and to increase follow-up activities, such as technical assistance provided over a period of time.

[1] Fixsen, D.L., & Blase, K.A. (2009, January). Implementation: The missing link between research and practice. NIRN Implementation Brief No. 1. Chapel Hill: The University of North Carolina, FPG, NIRN.

Consequences of Meeting Diverse Array of Needs and Requirements
- ATTCs are successful at balancing the needs of their stakeholders and providing training and technology transfer activities to meet identified needs within limited funding.
- However, ATTCs encounter challenges in their efforts, e.g.:
  - Having to build and maintain relationships with many stakeholders
  - Delivering services to address diverse workforce needs
  - Coping and operating with limited resources to meet identified needs and other Federal requirements
- The result is that a disproportionate emphasis is placed on services and activities that fall within the earlier phases of the technology transfer continuum, with fewer activities aimed at changing treatment practice.

FINDING 3
The ATTCs have adopted a State-specific approach to service delivery, built upon the strong relationships they have established with SSAs. Additional demands on this relationship will likely arise from recent national policy developments and challenging economic conditions.
- Relationship-building with SSAs
- Longevity, expertise, and shared values of ATTC and SSA staff
- Policy and economic conditions affecting States and the addictions treatment field

Relationship-Building With SSAs
ATTC directors and staff invest significant time and employ multiple strategies to initiate and nurture their relationships with SSAs. These strategies include:
- Including SSA directors on ATTC Regional Advisory Boards.
  - Nearly three fourths (73.3%) of SSAs or their representatives agree or strongly agree that their input is sufficiently considered in this process, virtually the same as that of all Regional Advisory Board members (73.4%). (Regional Advisory Board Survey)
- Scheduling regular calls/meetings between ATTC staff and SSA directors.
- Participating on outside committees or organizations where SSA directors, staff, and other leaders in the addictions field also are represented (e.g., State provider associations or SSA committees).
- Having ad hoc meetings with SSA directors at conferences and other regional or national meetings.

Longevity, Expertise, and Shared Values of ATTC and SSA Staff
- ATTCs have unique expertise that is an asset to the SSAs, and this expertise has helped build, solidify, and sustain the relationship between SSA directors and ATTCs.
- A core function of the SSAs is providing training and assistance to professionals in the field, and SSAs are therefore motivated to form close relationships with their regional ATTCs.
- Although the SSA administrators interviewed have been in their positions an average of only 3.5 years, they and ATTC Directors have similar longevity in the field (ATTC Directors: average of 26.5 years; SSAs: average of 22.3 years), which facilitates establishing and maintaining relationships.

Policy and Economic Conditions Affecting States and the Addictions Treatment Field
Contextual factors also affect the relationship between SSAs and ATTCs and suggest its intensity and importance may increase in the near future. These include:
- The current economic recession, budget constraints, furloughs, travel freezes, and program cuts.
  - "Partnering with the ATTC is even more important in leaner times when the State has fewer resources. We are such a small State and have a declining budget, so that partnering with the ATTC has been invaluable. We wouldn't have been able to do training on returning veterans or MI without the ATTC." (SSA Director)
- The changing profile of individuals with substance use disorders and the environment in which individuals are being treated.
- Continuing high turnover in frontline staff and supervisors.
- Current health care reform and new requirements for parity, which will lead to increasing numbers of insured individuals eligible for health coverage and behavioral health treatment.

FINDING 4
Many stakeholders see a need for the ATTCs to establish a national identity with regard to strengthening the workforce within the national behavioral health infrastructure.
- Shifting federal direction and mandates to the ATTC Network
- Increased collaboration among the regional ATTCs
- Dissemination and coordination by the ATTC National Office
- ATTCs moving toward a leading role in the national behavioral health system

Some History...
- ATTCs initially funded in 1993
  - N = 11 "independent Addiction Training Centers"
  - U.S. not fully covered and no National Office
- Expanded in 1998
  - N = 13 regional ATTCs and a National Office (no clear role)
  - Collaboration: The Change Book, TAP 21
  - Topically focused inter-regional committees
- A few years later
  - ATTCs told to move away from national collaboration and stick to meeting regional needs
  - GPRA "targets"
  - NIH/SAMHSA role demarcation

More Recently...
- Increased collaboration
  - 2007 RFA and direction from GPO
  - Restoration of inter-regional work groups
  - Reduce duplication and capitalize on regional "lessons learned"
  - TAP 21-A, ATTC Technology Transfer Model, ROSC
- Expanded, proactive role of the National Office
  - Consolidated ATTC Website, Event and Activity Reporting Database, Leadership Institutes, Workforce Survey, use of technology, etc.
  - National point of contact with other professional groups (e.g., NAADAC, NASADAD)
  - Funding opportunities for the Network

The Changing Landscape
"Behavioral Health is an Essential Part of Health" (SAMHSA)
- ATTCs' proven proficiency in establishing partnerships across a wide variety of constituencies.
- Data from this evaluation documenting benefits of ATTC work (beyond satisfaction) from a variety of perspectives.
- Establishing partnerships with mental health and primary medical care professionals and organizations:
  - National Association of State Mental Health Program Directors
  - National Council for Community Behavioral Health Care
  - Federally Qualified Health Centers, Community Health Centers
- A consistent presence and voice in SAMHSA internal and inter-agency work groups.
- Providing skill-based technology transfer to medical professionals (e.g., SBIRT).
- Continuing to educate other professionals and disciplines.

Information Source
Not a universal feeling, but among those adding this to their "wish list" for the ATTCs are stakeholders from:
- CSAT
- NASADAD
- Research Society on Alcoholism
- NAADAC
- Advocates for Human Potential
- SSAs
- Addiction educators
- State providers' associations

FINDING 5
Following their participation in ATTC initiatives designed to change clinical practice, practitioners report high levels of implementation of evidence-based practices (EBPs) in addiction treatment settings. Implementation efforts were accompanied by improvements in associated clinical skills and in supervisor/clinician/client relationships.
- High levels of implementation of critical actions
- Improved implementation and EBP-related skills
- Key ingredients to successful implementation
- Improved clinical relationships
- Assurance of fidelity of implementation requires more in-depth study

Methodology and Data Sources (Review)
"Success Case" method
- Critical Action Surveys
  - N = 100 MI participants (70% response rate[1])
  - N = 150 CS participants (79%)
  - N = 138 Tx Planning MATRS participants (79%)
- Success Case Interviews
  - N = 24 MI (100% of target sample size)
  - N = 21 CS (88%)
  - N = 22 Tx Planning MATRS (92%)

[1] Of eligible and consenting participants

"Critical Actions" of EBPs
- To assess changes in practice, we needed to get specific: How would we know that a clinician was putting into practice what they learned via ATTC technology transfer activities?
- Through a review of the research literature and ATTC curricula, we identified "critical actions" associated with each of the 3 topics:
  - 5 for MI (e.g., seeking to understand your client's frame of reference via reflective listening)
  - 6 for CS (e.g., conducting regular supervisory interviews with supervisees)
  - 7 for Tx PM (e.g., producing individualized treatment plans that include goals reflecting what the client wants to achieve)
- These critical actions were the centerpiece of our Critical Action Surveys ("Since participating in the ATTC... to what extent have you been able to implement the following...?")
- We formed a composite of the 5, 6, or 7 critical action implementation responses as a "total score" of implementation of a given EBP.
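As a simple illustration of what such a composite might look like, the sketch below averages a respondent's critical action implementation ratings into a single total score. The item names (other than the reflective listening example given on the slide), the 1–5 response scale, and the averaging rule are hypothetical; the actual survey items and scoring are not detailed in this presentation.

```python
# Minimal sketch of a composite "total score" of EBP implementation.
# Item names and the 1-5 response scale are hypothetical; only the idea of
# averaging the 5 (MI), 6 (CS), or 7 (Tx PM) critical action responses
# comes from the slide above.

MI_CRITICAL_ACTIONS = [
    "reflective_listening",      # grounded in the slide's MI example
    "open_ended_questions",      # hypothetical item
    "eliciting_change_talk",     # hypothetical item
    "rolling_with_resistance",   # hypothetical item
    "supporting_self_efficacy",  # hypothetical item
]

def composite_implementation_score(responses, items):
    """Average the implementation ratings (1 = not at all ... 5 = fully) across items."""
    answered = [responses[item] for item in items if responses.get(item) is not None]
    return sum(answered) / len(answered) if answered else None

respondent = {
    "reflective_listening": 5,
    "open_ended_questions": 4,
    "eliciting_change_talk": 3,
    "rolling_with_resistance": 4,
    "supporting_self_efficacy": 5,
}
print(composite_implementation_score(respondent, MI_CRITICAL_ACTIONS))  # 4.2
```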

High Prevalence of Implementation of Critical Actions after ATTC Technology Transfer Event(s)

Other Improvements Reported after ATTC Technology Transfer Event(s)
- Significant improvement in proficiency with these critical actions (all 18 at p < .001)
- Improvements in related clinical skills or agency operations
  - MI: 5 skills, 97%–99% reporting improvement
  - CS: 4 skills, 91%–98% reporting improvement
  - Tx PM: 4 skills, 61%–83% reporting improvement
- Perception of improved client behaviors, such as client engagement and retention in treatment
  - MI: 62%–83%
  - Tx PM: 50%–60%
- Reported by many interviewees: improved clinical relationships
  - MI and Tx PM: clinician and client
  - CS: clinical supervisor and clinician

Key Ingredients to Successful Implementation
- Individual clinician characteristics
  - Belief that the practice was important to quality care
  - Prior experience/familiarity with the critical actions
- Treatment agency/organizational characteristics
  - Formal support from clinical supervisor(s); informal support and recognition from management
  - Work environment supportive of client-centered approaches to care
- ATTC technology transfer event characteristics
  - Expert trainer, highly interactive format, peer-to-peer exchange
  - Evidence-based content; concrete, user-friendly materials (checklists, charts, forms)

FINDING 6
While the ATTCs deliver high quality services to meet the needs of diverse stakeholders and customers, there is less emphasis on matching participants to services and providing posttraining support to improve technology transfer outcomes.
- ATTC services match the needs and requirements of stakeholders and are highly rated
- Recruiting participants to match the services provided and outcomes desired
- Postevent support enhances technology transfer outcomes

Service Delivery Continuum
Individual, organizational, and event characteristics all play a role in supporting implementation. In considering these data in the context of a full service delivery continuum, multiple lessons emerge.

Clearly Identified Strengths of ATTC Service Delivery
- ATTCs successfully interact with a broad array of customers and stakeholders, resulting in a series of service delivery topics that are consistent with the needs and interests of ATTC stakeholders and customers and the requirements of their funders. (cf. Finding 2)
- Customers report a high level of benefit and satisfaction from the activities, events, and products provided by the ATTCs. (cf. Finding 1)
- Customers described specific characteristics of ATTC events that aided their learning and eventual implementation of critical actions associated with EBPs. (cf. Finding 5)

Importance of Recruitment in Achieving Technology Transfer Outcomes
- Critical Action Surveys and subsequent success case interviews offer compelling evidence that participants' success in postevent implementation is based more on the right match of service and participant than on the length or intensity of the event(s).
  - A participant's prior experience with the practices and the postevent support provided were the strongest determinants of postevent implementation success (regardless of self-reported proficiency).
  - Many ATTC participants reported not being in an appropriate role to implement various critical actions, jeopardizing many potential technology transfer outcomes.
- Different recruitment processes give ATTCs varying levels of control over who attends ATTC events, which is hugely important for events tagged with a "change in practice" objective.
  - ATTC-controlled versus shared or externally controlled recruitment strategies
  - Targeted versus open recruitment

Importance of Post-Service Follow-Up and Support in Achieving Technology Transfer Outcomes
- The data clearly indicate that participants need and want both formal and informal posttraining support to have meaningful, lasting implementation success.
- Specifically, they need further support to assist with (a) making the new practice a priority, (b) getting better at the practice they are implementing, and (c) avoiding the tendency to "fall back into their old ways" of doing things.
- Participants cited multiple organizational- and ATTC-based supports that aided their successful implementation.

FINDING 7
ATTC services produce improvements in the knowledge, skills, and practices of the addictions treatment workforce. However, limited resources appear to be devoted to the specification and measurement of the intended outcomes of these services.
- ATTCs' technology transfer efforts bring about a variety of positive changes
- Defining technology transfer success remains a challenge
- The ATTC Network technology transfer model: boundaries and definitions
- Lack of clarity in intended outcomes of ATTC technology transfer activities
- The benefits of improved clarity in the intent and goals of ATTC technology transfer

The Work of ATTCs Results in Positive Changes in Their Participants
As reported previously (cf. Findings 1 and 5), multiple data sources in this evaluation indicate that ATTC service participants report positive changes in their knowledge, skills, and practices following ATTC events. Specifically:
- Statistical analysis of pretest, posttest, and 30-day follow-up GPRA data over the first 3 years of the current funding period demonstrates that participants perceive a statistically significant increase in their knowledge, skills, and effectiveness from pre- to postevent and from postevent to the 30-day follow-up (both p < .01).
- This perceived improvement is also supported by findings from the CSBS, with 95% of respondents expressing agreement that they increased their knowledge, 91% that they increased their skills, and 90% that they were able to do their job better as a result of services received from the ATTCs. (cf. Finding 1)
- Critical Action Survey and follow-up interview results further confirm the ATTCs' impact on clinical practice, as training participants reported high levels of implementation, increased proficiency, and significant impacts on their clinical practice. (cf. Finding 5)
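To make the pre/post/follow-up comparison concrete, here is a minimal sketch of one way such paired ratings could be tested. The presentation does not state which statistical procedure was used on the GPRA data; the paired t-test, the scores, and the 1–5 scale below are purely illustrative.

```python
from scipy import stats

# Hypothetical self-ratings of knowledge (1-5) for the same ten participants
# at pre-event, post-event, and 30-day follow-up. Illustrative only; not GPRA data.
pre       = [3, 2, 4, 3, 3, 2, 4, 3, 2, 3]
post      = [4, 3, 4, 4, 4, 3, 5, 4, 3, 4]
follow_up = [4, 4, 5, 4, 4, 3, 5, 4, 4, 4]

# Paired comparisons: pre vs. post, then post vs. 30-day follow-up.
for label, before, after in [("pre vs. post", pre, post),
                             ("post vs. follow-up", post, follow_up)]:
    t_stat, p_value = stats.ttest_rel(after, before)
    print(f"{label}: t = {t_stat:.2f}, p = {p_value:.4f}")
```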

Defining Technology Transfer Success Is a Challenge
- Need for common terms and definitions:
  - Several models or frameworks for technology transfer exist
  - Corresponding variety and inconsistency in their terminology related to important concepts such as "success"
- Although the literature leaves little doubt that the ultimate goal when dealing with evidence-based practices is full implementation with fidelity and/or rational adaptation, smaller improvements to practice can be worthwhile.
- Changes may occur in multiple, incremental stages of implementation, distinguishing between installation, initial implementation, and full implementation of practices (Fixsen et al., 2005).[1]
- While it is clear that ATTC technology transfer efforts can elicit incremental changes and the adoption of discrete behaviors associated with EBPs, it is difficult to assess their success without knowing the ultimate goal or intention of service delivery.

[1] Fixsen, D., Naoom, S.F., Blase, K.A., Friedman, R.M., & Wallace, F. (2005). Implementation research: A synthesis of the literature. FMHI Publication No. 231. University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network.

An Example...
Treatment Planning MATRS participants clearly changed their practices with regard to making the client's voice a central part of the treatment planning process, which resulted in significant outcomes related to client-clinician relationships and overall client engagement in the ultimately prescribed treatment program. However, a significant piece of the Treatment Planning MATRS curriculum (using results from standardized assessment tools for the purpose of guiding the treatment planning process) was implemented far less frequently.

Current Framework Offers Promise
- The ATTC Technology Transfer Model, developed in the current year by the ATTC Technology Transfer Workgroup, makes great strides in (a) defining the basic tenets of the technology transfer process, and (b) establishing boundaries as to where the ATTC Network's responsibilities start and end.
  - However, terms such as "promoting adoption," "encouraging success," and "early implementation" have not been accompanied by any form of measurement or success metric, and, at the service delivery level, the issues of identifying explicit goals, defining success, and implementing an assessment strategy remain.
- The ATTC Network most clearly attempts to define the technology transfer intentions of discrete service delivery via 3 technology transfer objectives: awareness-raising, skill-building, and change in practice.
  - These objectives are intended to represent important distinctions in the specific intent of technology transfer activities, but they appear to be unclear or inconsistently applied across the Network.

An Example...
Of the 32 technology transfer events included in the Change in Practice study in this evaluation, none was specified by its sponsoring ATTC as having change in practice as its objective. Further, 15% of these events specified awareness-raising, an objective at the opposite end of the continuum from an intent to change clinical practice.

Benefits to Seeking Further Clarity
Added clarity, accompanied by consistent reporting, not only will help ATTCs define success but also will help:
- Better define expectations for all involved (partners and participants).
- Establish the level of effort needed in planning, delivery, and follow-up of services.
- Identify optimal strategies and resources needed to achieve intended outcomes.
- Increase the evaluability of ATTC efforts, rendering success easier to define and lessons learned easier to interpret.

Next Steps for Evaluation Findings...
- Completion of more detailed reports of the methodology and findings of the 3 evaluation studies (Jan. 2011)
  - Planning, Partnering and Service Delivery — Abt Associates
  - Customer Satisfaction and Benefits — MANILA Consulting
  - Change in Practice Study — RMC Research
- Webinars for ATTC Directors and staff on the more detailed findings of these studies (Jan.–Feb. 2011)
- Dissemination of reports
  - Executive Summary
  - Full Report
  - Region-Specific Results (CSBS, E&A Reporting Database)

Next Steps in Today's Presentation
- Lunch
- Presentation and Discussion of the Evaluation's Recommendations

Recommendations
- Requested of us by CSAT
- The collective sentiments of the National Evaluation Team (NET)
- Not necessarily endorsed by CSAT or all stakeholders
- Implications for both the ATTC Network and its funders

Recommendation #1
SAMHSA/CSAT and the ATTCs should devote increased attention and resources to the Implementation phase of the Network's Technology Transfer Model.
- Demonstrated expertise in planning, partnering, and service delivery
- Requests from stakeholders
- Evidence from the CIP studies
- Current health reform expands these opportunities
- Obvious resource implications for SAMHSA/CSAT and NIDA

ATTC Network Technology Transfer Model (2010)

Evidence From the Change in Practice (CIP) Study
ATTCs have shown they can do this. CIP study event participants reported:
- Very high prevalence of (self-reported) implementation of specific critical actions associated with all 3 CIP study topics
- Statistically significant (self-reported) improvement in proficiency in critical action implementation
- Very high prevalence of (self-reported) improvement in clinical skills and agency operations associated with the CIP study topics
Stakeholders are requesting it:
- "Another need not being met is follow-up after dissemination: are people actually doing what they were trained to do, or did they just go back to their programs and do what they were doing before they came to the trainings?" (Addiction Educator)

New Opportunities
"Behavioral Health is an Essential Part of Health" (SAMHSA)

Recommendation #2
ATTCs, in collaboration with SSAs, other partners, and evaluation experts, should invest more in identifying, clarifying, and measuring the intended outcomes of their services.
- Benefits to ATTCs and their partners
- Provides opportunities to better demonstrate successes and lessons learned
- SSA: "We don't pay for activities, we pay for results."

Clarifying Intended Outcomes of Specific Activities: Benefits to All
To ATTCs
- Guiding the technology transfer approach, format, intensity, and resources allocated to services
- Further clarifying lessons learned, standardizing approaches across regions
- Committing to outcomes, avoiding "scattergun" accomplishments unlikely to be replicated
To their Partners and Customers
- Clarifying their expectations, potentially leading to a better participant/service match
- Keeping planning and implementation activities on the same page
To SAMHSA/CSAT
- Identifying best practices in technology transfer
- Advancing the reputation of ATTCs and the Network

Recommendation #3
ATTCs should not allow their strength in tailoring services to the specific needs of their SSAs and region to preclude further establishing a national, Network-wide identity and responding to opportunities to address national issues.
- A Network-wide strategy
- New opportunities with national health reform
- Working with new disciplines will require more formal needs assessment processes (a continuation and acceleration of an ATTC trend)
- SAMHSA/CSAT: not another "unfunded mandate"

Recommendation #4
To ensure the sustainability of the Network, SAMHSA/CSAT and the ATTCs must plan for "what's next" in evaluation for this highly visible, long-standing program.
- First national evaluation since the program began in 1993
- GPRA ≠ evaluation, and the data are used only minimally
- Evaluation implementation challenges, extending beyond the ATTCs to the treatment community
- Need for evaluation capacity building in some regions

Challenges in Implementing This Evaluation
- Excellent cooperation from the ATTC regions and the addictions treatment community in general. However, there were serious instances of:
  - Inattention to evaluation planning considerations
  - Casual implementation of evaluation activities/procedures (e.g., CSBS sampling, CIP event specification, ORC Survey, MI audiotape administration)
  - Inconsistent translation of data collection instructions (e.g., consent processes in CIP events)
- GPRA system: a poor act to follow
  - Lack of meaningful interpretation and use of data at the federal level
  - Meaningless procedures and ad hoc standards in data collection (e.g., resulting in follow-up response rates > 100%)

What's Next in Evaluation for the ATTC Network?
Needs
- Evaluation investment and capacity building across the Network
- Evaluation as a voice in Network activity planning and implementation
Opportunities
- Secondary analysis of GPRA data
- Use the ATTC Event and Activity Reporting Database
- Target evaluation resources to important, specific Network initiatives (e.g., Leadership Institutes, skill-building and change-in-practice efforts with primary care physicians)

Recommendation #5
SAMHSA/CSAT and the ATTCs should identify and implement mechanisms to expand the resources available to support ATTC events, activities, and initiatives.
- Increasing need and opportunities for ATTCs
- Very difficult for SAMHSA/CSAT to increase base funding of ATTCs
- Other existing discretionary grant programs offer opportunities
- Better documentation and marketing of ATTC successes

Existing Discretionary Grant Opportunities
SAMHSA/CSAT and other Federal agencies:
- When a federal agency identifies ATTCs as a resource in grant announcements, suggest/designate that a portion of the grant budget be allocated to the ATTC as a partner or subcontractor (TA, workforce development), as is often done for evaluation
- Encourage States to dedicate a portion of their SAPT Block Grant to the ATTC for workforce development and training
- Include ATTCs in inter-agency strategic planning or work groups, e.g., implementing the Affordable Care Act or the 2008 Mental Health Parity and Addiction Equity Act
- Other TA providers supported by SAMHSA (e.g., State Systems Development, TA to States programs) can act as referral agents to ATTCs
ATTCs and the National Office:
- Monitor discretionary grant opportunities and approach States with partnering ideas and services

Next Steps for Recommendations
- Submitted to CSAT with the Final Report, Sept. 30
- Distributed to CSAT leadership and staff associated with the ATTC Network
- We're done

And, Finally, From the National Evaluation Team...
Heartfelt thanks to all who supported and contributed to this evaluation:
- To the ATTCs and the National Office, for participation in the initial conceptual planning of the evaluation and for assistance in the implementation of the many data collection activities in this evaluation.
- To CSAT and, specifically:
  - Deepa Avula, our project officer, for her belief in our evaluation design and wise counsel in our implementation of the evaluation activities and interpretation of results
  - Cathy Nugent, ATTC project officer for most of the evaluation, for her unflagging support and encouragement of our efforts
  - Donna Doolin, current ATTC project officer, for picking up where Cathy left off and contributing her insights to the evaluation
- To the many ATTC customers, stakeholders, and partners who contributed their opinions, perceptions, and experiences to this comprehensive evaluation effort.