Overview of Evidence Based Practice

Presentation transcript:

Overview of Evidence Based Practice. Charles Wilson, MSSW, Executive Director of the Chadwick Center, The Sam and Rose Stein Chair on Child Protection, Rady Children's Hospital-San Diego. Sponsored by the California Evidence-Based Clearinghouse for Child Welfare, www.cachildwelfareclearinghouse.org

How Things Change: A Problem is Recognized, then Action (Any Action).

Action: Creation of Orphan Trains. Between 1854 and 1929, 100,000-200,000 children were placed in new families via the Orphan Trains. http://www.orphantraindepot.com Children were taken in small groups of 10 to 40, under the supervision of at least one adult, and traveled on trains to selected stops along the way, where they were taken in by families in that area. http://www.pbs.org/wgbh/amex/orphan/teachers.html

How Things Change: A Problem is Recognized, then Action (Any Action), then a Series of Trial and Error Adjustments (Some Better, Some Worse), leading to Informed Action.

Trial and Error: Family Foster Care, Orphanages and Boarding Schools, the Tennessee Preparatory School for Dependent Children.

How Things Change: A Problem is Recognized, leading to Informed Action (Based on Science).

So how do we know what works vs. mere marketing hyperbole? Let the Buyer Beware.

Thought Field Therapy “Thought field therapy with Callahan techniques® is a powerful therapy exerted through nature's healing system to balance the body's energy system. This therapy promotes stress management and stress relief as well as the reduction or elimination of anxiety and anxiety related problems. This includes help for weight control and weight loss, trauma or sleep difficulties, depression, addictions and the disorders associated with past trauma including nightmares and post traumatic stress disorder.” (underlines added) Roger J. Callahan, PhD Retrieved from http://www.tftrx.com/, November 17, 2006

More Claims for TFT. Q: How Can TFT Benefit You? What Kind of Problems Can Be Helped? Anxiety and stress; personal fears or your children's fears; anger and frustration; eating, smoking, or drinking problems; loss of loved ones; social or public speaking fears; sexual or intimacy problems; travel anxiety, including fear of flying or driving on the freeways; nail biting; cravings; low moods and mood swings. Retrieved from http://www.tftrx.com/profaq.php?PHPSESSID=f4cf66c40b9678b742b82989fee7b377# on November 17, 2006

NPR All Things Considered, March 29, 2006 “According to psychologist Roger Callahan, the creator of thought field therapy, major problems like depression can be cured quickly with this method. He says post-traumatic stress disorder is easily dispatched in 15 minutes, and even the most serious cases of anxiety, addiction and phobias are likewise subject to quarter-hour cures.”

Research on TFT? “Has any research been carried out on TFT? There have been no control (sic) studies on the success of TFT” From the Thought Field Therapy Training Center of La Jolla Retrieved from http://thoughtfield.com/faqs.htm on November 17, 2006

The Problem: All sorts of "interventions" are available out there. Distinguishing groundless marketing claims from reality.

Waiting Room Sign. Ben Saunders, MUSC.

Evidence Based Social Work: "Professional judgments and behaviors should be guided by two interdependent principles: Whenever possible, practice should be grounded on prior findings that demonstrate empirically…that they are likely to produce predictable, beneficial, and effective results. Every client system, over time, should be evaluated." Evidence Based Practice Manual, Oxford University Press, 2004. Albert Roberts, PhD; Kenneth Yeager, PhD, LISW

Global Definition of EBP: The conscientious, explicit and judicious use of current best evidence in making decisions about the care of individual patients, including both the best available clinical evidence from systematic research and individual clinical expertise. -David Sackett

Huge Policy Implications. Should policy makers support adoption of EBP? If so, which ones? When are they "ready for prime time"? What is the standard of evidence? If so, how best can they support adoption? What are the pitfalls of state- or national-level policy adoption of EBP? Impact on innovation; misapplication of good models (one size does not fit all); watering down of empirically based practice (the danger of implementing in name only); ideology vs. science (who is the judge of the science?). Should we limit what we do to EBP?

Parachute use to prevent death and major trauma related to gravitational challenge: systematic review of randomized controlled trials (Gordon C Smith, Jill P Pell, 2003). The perception that parachutes are a successful intervention is based largely on anecdotal evidence. Observational data have shown that their use is associated with morbidity and mortality due to both failure of the intervention and mechanical complications. In addition, "natural history" studies of free fall indicate that failure to take or deploy a parachute does not inevitably result in an adverse outcome... The effectiveness of an intervention has to be judged relative to non-intervention. Understanding the natural history of free fall is therefore imperative. If failure to use a parachute were associated with 100% mortality, then any survival associated with its use might be considered evidence of effectiveness. Therefore, studies are required to calculate the balance of risks and benefits of parachute use.

Why Evidence-Based Practice Now? A growing body of scientific knowledge. Increased interest in consistent application of quality services. Increased interest in outcomes and accountability by funders. Past missteps in spreading untested "best practices" that turned out not to be as effective as advertised. Because they work!

Problems in the Child Abuse Field in the U.S. Empirical evidence of efficacy has not been a common criterion for treatment selection in the child maltreatment field. Lack of outcome research for many commonly used interventions. Ready willingness among some to use, embrace, promote, and staunchly defend practices that have no evidence for their efficacy and questionable theoretical bases. Poor dissemination of the significant clinical outcome research that has been done. Ineffective approaches to continuing education. Poor adoption of empirically supported treatments in real-world clinical settings. Disconnection between current scientific knowledge and practice in the field.

Scared Straight

TF-CBT

Reactive Attachment Disorder and Attachment Therapy …pioneered by psychoanalyst Aaron Lederer, the RAD Consultancy’s creator and director. His methods yield remarkable results within weeks. Retrieved from http://www.radconsultancy.com/, November 17, 2006

Why should we worry about using Evidence Supported Treatments?

Institute of Medicine: Apply the Principles and Methods of Evidence Based Practice. Integration of: Best Research Evidence, Best Clinical Experience, Consistent with Client Values. http://www.shef.ac.uk/scharr/ir/netting/ http://ebmh.bmj.com/ http://cebmh.com/ http://www.cebm.utoronto.ca/

Understand Adoption of Innovation (diffusion curve): Innovators, Early Adopters, Majority, Late, Traditionalists. MTFC 1991.

Common Errors When Deciding about Intervention Effectiveness Reliance solely on individual anecdotes and remembered cases. “That child made such amazing changes during treatment.” Confusing client satisfaction with clinical improvement. “The family just loved coming to therapy. Never missed a session during their 3 years of therapy. Amazing. Too bad they had to move away.” Misattribution of the cause of change. Failure to appreciate resilience and natural recovery. “The family got multiple services and wrap around care.” “With treatment her PTSD resolved in about 3 months after the rape.” Guru effect in training and treatment adoption. “I heard Dr. McDreamy is doing a level II training. And, it’s in San Diego in January!” “Those videos were just so amazing! I have got to try that.” Ben Saunders MUSC

What to Look for in a Practice: A treatment or intervention protocol that has at least some scientific, empirical research evidence for its efficacy with its intended target problems and populations. Evidence may be based on a variety of research designs: randomized clinical trials (RCTs); controlled studies without randomization; open trials, pre-post, or uncontrolled studies; multiple baseline, single case designs. The degree to which we are persuaded that the treatment is effective will vary by the quality of empirical support: the number of RCTs; replication by researchers other than the treatment developers; the sampling, sample size used, comparison treatment, and effect size. Various methods have been developed for classifying the level of empirical support enjoyed by treatment approaches. These should be useful for front-line practitioners.

CEBC Website: www.cachildwelfareclearinghouse.org

Current Data on Visitors to the Website: Total number of visits to the website: 46,635. Percentage of total visitors from over 131 international countries: 14%. Percentage of total visitors from the U.S.: 86%. Percentage of total visitors from California: 33%. Data based on numbers as of September 1, 2007.

CEBC’s Definition of Evidence-Based Practice for Child Welfare Best Research Evidence Best Clinical Experience Consistent with Family/ Client Values (modified from The Institute of Medicine) http://www.iom.edu/

The California Evidence-Based Clearinghouse for Child Welfare (CEBC) In 2004, the California Department of Social Services, Office of Child Abuse Prevention contracted with the Chadwick Center for Children and Families, Rady Children’s Hospital-San Diego in cooperation with the Child and Adolescent Services Research Center to create the CEBC. The CEBC was launched on 6/15/06.

Advisory Committee The Advisory Committee is composed of 15 members drawn from a broad cross-representation of communities and organizations. There are representatives from: California Department of Social Services Child Welfare Departments from California Counties Child Welfare Director’s Association (CWDA) California Child Welfare Training Leaders Public and Private Community Partners Within the State The role of the Advisory Committee is to: Determine the topical areas for the CEBC Ensure the CEBC remains up-to-date with emerging evidence. Assist in disseminating the products of the CEBC. Provide feedback on the utility of the CEBC products.

National Scientific Panel. The National Scientific Panel is composed of five core members and up to 10 selected Topical Experts. Panel members are nationally recognized leaders in child welfare research and practice who are knowledgeable about what constitutes best practice/evidence-based practice. The Panel assists in identifying relevant practices and research and provides guidance on the scientific integrity of the CEBC products.

Scientific Rating Scale and Relevance to Child Welfare Scale. Next we'll 1) discuss how the rating scales were developed, 2) review the scale criteria, and 3) describe the rating process.

Rating Scale Development. Goals: Multiple categories; high standard for top ratings (randomized controlled trials); clearly defined criteria; focus on peer-reviewed research and ability to replicate the program. There were several goals for the Clearinghouse rating system. First, some rating systems we examined had only one or two categories – programs were either on or off the list, with no middle ground or room for comparison. We wanted to have multiple categories so that each program we examined would fit somewhere in the scheme. We also wanted to establish a high standard for the top ratings, and decided to require randomized controlled trials in order to reach the top two levels of the rating scale. More on these in just a minute. We wanted clearly defined criteria, so that program developers would understand exactly what was necessary to achieve each rating, and so that there was no ambiguity in how ratings were assigned. Finally, we wanted to focus on peer-reviewed research and programs that could be replicated.

Gold Standard for Evidence. Randomized controlled trial (RCT): participants are randomly assigned to either an intervention or control group. This allows the effect of the intervention to be studied in groups of people who are the same, except for the intervention being studied. Any differences seen in the groups at the end can be attributed to the difference in treatment alone, and not to bias or chance. Now, to cover some of these terms… Randomized controlled trials, or RCTs, are considered by many to be the gold standard of evidence for a research study. They involve randomly assigning members of the study sample to receive either the intervention or a control treatment (often treatment as usual or usual care services). Randomizing the subjects ensures that the two groups are equal on factors that may influence the study outcomes. Randomly assigning them to groups ensures that group assignment is not biased by the assigner, even unconsciously. Having a control group ensures that the changes between the beginning and end of the study are due to the treatment – you can compare to the group that didn't get the treatment and see what the difference is.

Peer-Reviewed Research. Peer review: a process used to check the quality and importance of research studies. It aims to provide a wider check on the quality and interpretation of a study by having other experts in the field review the research and conclusions. Another term we use in the Clearinghouse is peer-reviewed research. Peer review serves as a reality check on research. Many people find research confusing, especially when statistics are involved. These days, anyone can write and self-publish anything they want on the internet. The peer review process ensures that other people with knowledge and expertise have looked at what the researchers have done and agree with the methods used and conclusions reached from the data. Peer review assures that any studies we include in our ratings have been held to a high standard. For example, most journals use a blinded peer-review process, in which the reviewers don't know whose study they are reviewing and the author doesn't know who reviewed their article. This helps to reduce bias and increase honesty.

Efficacy vs. Effectiveness. Efficacy focuses on whether an intervention works under ideal circumstances and looks at whether the intervention has any impact at all. Effectiveness focuses on whether a treatment works when used in the real world. An effectiveness trial is done after the intervention has been shown to have a positive effect in an efficacy trial. Our final terms for today… Researchers often talk about the efficacy and effectiveness of programs. Let's clarify these: Efficacy looks at whether the program works in a highly controlled setting, like a research lab or university-run mental health clinic. Efficacy means the program works in these ideal circumstances, but doesn't tell us if it will work in more typical settings. That's where effectiveness comes in. Effectiveness means that the program has been tested and works in real world settings, like an outpatient mental health clinic or community setting. It's been tested and shown to work with actual clients in their natural setting, in the way that services are typically delivered.

Scientific Rating Scale Now on to the Scientific Rating Scale… Each program we review is rated on a scale of 1 to 6, where 1 stands for effective practice and 6 stands for concerning practice. Moving between ratings is like moving up or down a step, as shown here. A program rated a 2 is one step below a program rated 1, and so on…

6. Concerning Practice. If multiple outcome studies have been conducted, the overall weight of evidence suggests the intervention has a negative effect upon clients served, and/or there is a reasonable theoretical, clinical, empirical, or legal basis suggesting that, compared to its likely benefits, the practice constitutes a risk of harm to those receiving it. At the bottom of the scale are Concerning Practices. These are programs that actually cause harm to the clients who receive them, or that have been shown by the majority of studies to have a negative effect.

5. Evidence Fails to Demonstrate Effect. Two or more randomized controlled outcome studies (RCTs) have found that the practice has not resulted in improved outcomes when compared to usual care. If multiple outcome studies have been conducted, the overall weight of evidence does not support the efficacy of the practice. One step up from Concerning Practices is Level 5, in which the evidence fails to demonstrate that the program or practice has the desired effect. To get this rating, at least two RCTs have to have demonstrated that the program was no better than standard care. If there have been several studies, the majority of the studies need to show that the program doesn't improve outcomes.

4. Acceptable/Emerging Practice- Effectiveness is Unknown There is no clinical or empirical evidence or theoretical basis indicating that the practice constitutes a substantial risk of harm to those receiving it, compared to its likely benefits. The practice has a book, manual, and/or other available writings that specifies the components of the practice protocol and describes how to administer it. The practice is generally accepted in clinical practice as appropriate for use with children receiving services from child welfare or related systems and their parents/caregivers. The practice lacks adequate research to empirically determine efficacy. One step up from Level 5 is Level 4, in which the program is termed Acceptable, but because it is a relatively new or emerging practice, it has not been widely studied and the program’s effectiveness is unknown. There are several requirements to achieve this rating. First, there must be no evidence or basis to believe that the program is harmful to clients Second, as described earlier, it has to have available written materials so that it can be replicated.

3. Promising Practice. Same basic requirements as Level 4 plus: At least one study utilizing some form of control (e.g., untreated group, placebo group, matched wait list) has established the practice's efficacy over the placebo, or found it to be comparable to or better than an appropriate comparison practice. The study has been reported in published, peer-reviewed literature. Outcome measures must be reliable and valid, and administered consistently and accurately across all subjects. If multiple outcome studies have been conducted, the overall weight of evidence supports the efficacy of the practice. Level 3 keeps the basic Level 4 requirements but also adds three new criteria: The outcome measures used in the studies need to be reliable and valid, which means that the tools measure what they are intended to measure and the measurements are accurate and stable. In addition, the measures have to be used consistently across all subjects. For example, you can't use one measure with the intervention group and a different measure with the control group. Finally, if there have been multiple studies, the majority must show that the practice works.

2. Well Supported - Efficacious Practice. Same basic requirements as Level 3 plus: Randomized controlled trials (RCTs): At least 2 rigorous RCTs in highly controlled settings (e.g., a university laboratory) have found the practice to be superior to an appropriate comparison practice. The RCTs have been reported in published, peer-reviewed literature. The practice has been shown to have a sustained effect at least one year beyond the end of treatment, with no evidence that the effect is lost after this time. Two additional requirements for a well-supported practice are that 2 RCTs in controlled settings have shown that the practice works, and these results have been peer reviewed. Often, these studies are conducted by the program developer, who carefully oversees the studies to ensure that the intervention is delivered correctly. In addition, the positive results of the program have to last for at least one year after the end of treatment. For example, a parent education program may show improved child behavior and decreased parent stress at the end of the 16-week intervention, but are any of these improvements still there one year later?

1. Well Supported - Effective Practice. Same basic requirements as Level 2 plus: Multiple-site replication: At least 2 rigorous randomized controlled trials (RCTs) in different usual care or practice settings have found the practice to be superior to an appropriate comparison practice. The RCTs have been reported in published, peer-reviewed literature. The practice has been shown to have a sustained effect at least one year beyond the end of treatment, with no evidence that the effect is lost after this time. What has been added is the requirement for at least 2 RCTs in real world settings to show an effect, and the study reports have been peer reviewed.

Child Welfare Ratings Not every program that is evidence-based will work in a Child Welfare setting… We also examined each program’s experience and fit with Child Welfare systems and families

Relevance to Child Welfare Scale High: The program was designed or is commonly used to meet the needs of children, youth, young adults, and/or families receiving child welfare services.    Medium: The program was designed or is commonly used to serve children, youth, young adults, and/or families who are similar to child welfare populations (i.e. in history, demographics, or presenting problems) and likely included current and former child welfare services recipients. Low: The program was designed to serve children, youth, young adults, and/or families with little apparent similarity to the child welfare services population.

Child Welfare Outcomes We also examined whether programs had included outcomes from the Child and Family Services Reviews in their peer-reviewed evaluations: Safety Permanency Well-being

Common Continuing Education Dissemination Model: Therapist, book, one-day workshop, then use of the Tx with appropriate clients.

Laying the Groundwork for Implementing Evidence Based Practice

Levels of Implementation (Fixsen et al.): Paper Implementation, Process Implementation, Performance Implementation. Fixsen, D., Naoom, S., Blase, K., Friedman, R., Wallace, F. (2005)

Institute for Healthcare Improvement Model: Environmental Context (Community, Government, Funders); Organizational Context (Organizations); Microsystem (Departments and Programs Within Organizations); Patient and Community (Social Workers, Therapists, Medical Professionals and Families).

Transtheoretical Model of Change: 5 Stages of Change. Precontemplation (compliant, status quo); Contemplation (changes in orientation); Preparation (planning for change, organizational and environmental readiness); Action (training); Maintenance (monitoring/institutionalization). Driven at each stage by self-efficacy and decisional balance. Precontemplation Step: individuals are not intending to take action to change their behavior and are either uninformed or underinformed about the consequences of their behavior. Contemplation Step: people are actively thinking about and may even plan to change their behavior, but may remain in this step for years. Preparation Step: people are actively planning to take action and have initiated preliminary steps, such as self-education or enrolling in classes that will help them make the change.

Components of Implementation: Select a solution that fits the problem; prepare the internal and external environment; secure supervision and leadership buy-in; acquire knowledge and skills; use the practice with support, supervision and consultation; adapt the practice to the environment; monitor fidelity; teach others; institutionalize the practice.

Practice Selection: attributes that can facilitate adoption. Relative advantage: a clear, unambiguous advantage in either effectiveness or cost effectiveness. Costs: training, materials, ongoing consultation, lost productivity during start-up, and costs of delivery. Compatibility: how compatible the practice is with the organization's and workforce's values, norms, and clinical traditions and orientation. Complexity: perceived as simpler to use and to implement. Trialability: can be experimented with on a limited basis. Observability of benefits: outcomes or interim results/measures. Reinvention: adopters can adapt, refine, or otherwise modify it to meet their own needs. Risk: there is higher certainty of outcomes. Task issues: relevant to the performance of intended users' work and improves task performance. Knowledge: knowledge can be codified and transferred from one context to another. Augmentation/support: provided with training/consultation. From Greenhalgh et al.

Organizational Readiness: Organizational culture/traditions/history; leadership; supervision; capacity to evaluate change (knowing if it is working); support of opinion leaders; connections with other supportive organizations/individuals; whether the organization has the technology to support the change; staff readiness.

Staff Readiness: Do staff, both directly and indirectly involved, understand what benefits the adoption of the EBP will bring? Meaning: what does the change mean to the staff? What concerns will staff have about adoption? How congruent are the trainers in orientation and values with the staff? Presence of champions.

Readiness of External Environment: Congruence with community/cultural/family values; referral source understanding and support; funding source support; political support; role of social influence/demand for services; role of social movement theory.

Supportive Implementation Model Administrative Leadership and Support for EBT Obtain client feedback Supervision Technical Assistance Expert Consultation Use EST with appropriate clients Therapist Training Materials Community/Consumer Support for EBT

Finding Evidence Supported Treatments on the Web www.nctsn.org www.cachildwelfareclearinghouse.org/ http://modelprograms.samhsa.gov/template.cfm?CFID=119292&CFTOKEN=55491051 www.strengtheningfamilies.org/ www.ncptsd.va.gov/topics/treatment.html www.childtrends.org www.wsipp.wa.gov http://ebmh.bmjjournals.com/ www.cochrane.org www.campbellcollaboration.org www.colorado.edu/cspv/blueprints/model/overview.html

Contact Information Download reports from: www.chadwickcenter.org E-mail: cwilson@rchsd.org www.cachildwelfareclearinghouse.org