An Experimental Study of Child Welfare Worker Turnover
Nancy S. Dickinson, University of Maryland
John S. Painter
National Human Services Training Evaluation Symposium, Cornell University, June 15, 2011

Child Welfare Staff Recruitment and Retention: An Evidence-Based Training Model

Study Objectives
– Determine the feasibility of using an experimental design to study training outcomes
– Understand the impact of worker perceptions on their intent to leave child welfare employment
– Study the effectiveness of the intervention on worker retention

Influences on Recruitment, Selection and Retention
– External Environment: agency's public image; awareness of jobs
– Agency Climate: shared mission; affirmation & recognition; shared authority; growth & advancement; organizational commitment
– Worker Characteristics: desire to help; self-efficacy; depersonalization; education
– Supervision: practice support; emotional support; team support
– The Work: role clarity; role expectations; workload

Intervention Components
– Recruitment: poster, flyers, custom brochures, two 30-second PSAs, slide presentation, recruitment training
– Selection: realistic job preview DVD ("An Invitation to Choose"), competency-based selection process, selection training
– Retention: A Supervisor's Guide to Retention, A Director's Guide to Retention, retention training, retention toolkit, technical assistance

Research Questions
– Do workers in the intervention counties show statistically significant differences from those in the control counties on relevant survey scales?
– Does child welfare worker retention improve in the intervention counties compared with the control counties?

Procedures
– Random assignment of 34 county child welfare agencies: 17 intervention and 17 control
– 33 project counties participated in data collection activities (1 agency withdrew after a year)

General Design
Intervention:  R  O1  X  O2
Comparison:    R  O1      O2
R = random assignment
O = data collection (or observation)
X = intervention or treatment
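The R step above (random assignment of county agencies to the two arms) can be sketched in a few lines. The county labels and the seed below are placeholders for illustration, not the study's actual counties:

```python
import random

# Randomly split 34 county agencies into 17 intervention and 17
# control counties, mirroring the R step in the design notation.
# County labels are hypothetical placeholders.
counties = [f"County {i:02d}" for i in range(1, 35)]

rng = random.Random(2005)  # fixed seed so the example split is reproducible
assigned = counties[:]
rng.shuffle(assigned)

intervention, control = assigned[:17], assigned[17:]
print(len(intervention), len(control))  # 17 17
```

Randomizing whole agencies (rather than individual workers) matches the study's county-level unit of assignment.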

Instruments
– Online worker survey administered 5 times to all project child welfare workers between 6/1/05 and 6/1/08
– Human Resources Database gathered employment information on all project workers between 12/1/04 and 9/1/08

Worker Survey
– 17 scales validated using reliability analysis and confirmatory factor analysis
– Average response rate of 47% (range 45%–48%) across 5 waves of delivery to an average sample of 831 workers
– Waves 1 & 2 were pre-intervention; waves 4 & 5 were post-intervention

Respondent Demographics
                      Wave 1   Wave 2   Wave 3   Wave 4   Wave 5
Gender
  Female              87.1%    86.7%    88.3%    89.2%    87.5%
  Male                12.9%    13.3%    11.7%    10.8%    12.5%
Race
  African-American    23.9%    17.8%    24.4%    26.1%    24.4%
  European-American   68.3%    70.8%    63.5%    63.7%    66.1%
  Other                7.9%    11.3%    12.2%    10.2%     9.5%
Age
  Average
  St. Dev.

Demographics, Continued
                      Wave 1   Wave 2   Wave 3   Wave 4   Wave 5
Degree Type
  Bachelor            47.2%    45.1%    47.4%    47.5%    46.8%
  Master               7.6%     7.4%     6.9%     7.6%     6.4%
  BSW                 33.4%    33.7%    31.7%    29.4%    30.7%
  MSW                 11.7%    13.8%    14.0%    15.5%    16.1%
  Missing             (4.2%)   (7.6%)   (9.3%)   (6.6%)   (6.4%)
Caseload
  Average Number of Families per Month
  St. Dev.

Data Analyses
– Multi-level regression analysis: scales are compared pre- and post-intervention
– Survival analysis: days employed & status at end of study (exit vs. no exit)
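The survival-analysis data layout described above (days employed plus an exit flag, with non-exits censored at the end of the study) can be illustrated with a small Kaplan-Meier sketch. The study itself fit a Cox regression; the tenure records below are hypothetical:

```python
# Each worker contributes (days_employed, exited), where exited=False
# means the worker was still employed (censored) at the end of the study.

def survival_fraction(records, day):
    """Kaplan-Meier estimate of the fraction still employed at `day`."""
    surv = 1.0
    at_risk = len(records)
    for t, exited in sorted(records):
        if t > day:
            break
        if exited:
            surv *= (at_risk - 1) / at_risk
        at_risk -= 1  # exits and censored workers both leave the risk set
    return surv

# Hypothetical tenure records for illustration only
control = [(120, True), (300, True), (400, False), (500, True), (700, False)]
intervention = [(250, True), (400, False), (600, False), (650, True), (700, False)]

print(round(survival_fraction(control, 365), 2))       # 0.6
print(round(survival_fraction(intervention, 365), 2))  # 0.8
```

A Cox regression generalizes this comparison by modeling the hazard of an undesired exit as a function of group membership and covariates, rather than estimating one curve per group.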

Statistical Comparisons for Survey Scales
Four primary comparisons were made:
– Intervention vs. control, post-training: individual level; county level
– Pre vs. post training, intervention group only: individual level; county level

Overview of Survey Results
Columns: Indiv. Interv. vs. Control | Cnty. Interv. vs. Control | Indiv. Pre vs. Post | Cnty. Pre vs. Post
S1 Depersonalization ******
S2 Desire to Help *****
S3 Self-Efficacy *****
S4 Workload ***
S5 Role Clarity *****
S6 Role Expectations
S7 Supervisor: Practice Support ***
S8 Supervisor: Team Support *
S9 Supervisor: Emotional Support
S10 Organizational Commitment **
S11 Agency's Negative Image
S12 Agency Affirmation ***
S14 Shared Mission ***
S15 Shared Authority
S16 Growth & Advancement Opportunities *
S17 Intent to Leave (lower scores indicate lower intent to leave) *****
* p < .05, ** p < .01, *** p < .001

Impact of Intervention on Turnover: Data and Sample
– HR Database used by all project counties: internet accessible; interactive database application
– In 9/08, analysis file of 877 workers hired after January 1, 2004: 485 workers from control counties; 392 from intervention counties

Worker Demographics for Original and Propensity-Matched Samples
                       Original                    Matched
                       Intervention  Control       Intervention  Control
Degree Type (N, %)
  No SW Degree
  BSW or MSW
Previous Experience
  None                 73 (18.6%)    72 (14.8%)                  56 (17.6%)
  Indirect             66 (16.8%)    67 (13.8%)                  58 (18.2%)
  Direct
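Propensity matching of the kind behind the table above can be sketched as greedy 1:1 nearest-neighbor matching without replacement. The worker IDs and scores below are hypothetical, and the score-estimation step (e.g. a logistic regression of group membership on degree type and prior experience) is assumed to have already happened:

```python
def nearest_neighbor_match(treated, controls):
    """Greedy 1:1 nearest-neighbor matching without replacement.

    treated, controls: dicts mapping worker id -> propensity score.
    Returns a list of (treated_id, control_id) pairs.
    """
    available = dict(controls)
    pairs = []
    # match in ascending score order; one reasonable greedy ordering
    for t_id, t_score in sorted(treated.items(), key=lambda kv: kv[1]):
        if not available:
            break
        # pick the unused control whose score is closest to this worker's
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        pairs.append((t_id, c_id))
        del available[c_id]
    return pairs

# Hypothetical propensity scores for illustration only
treated = {"T1": 0.62, "T2": 0.35, "T3": 0.80}
controls = {"C1": 0.30, "C2": 0.65, "C3": 0.78, "C4": 0.50}
print(nearest_neighbor_match(treated, controls))
```

Matching without replacement, as here, yields balanced groups like the matched columns in the table; production analyses usually also enforce a caliper (maximum allowed score distance) before accepting a pair.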

Analysis and Results
– Cox regression survival analysis assessed the impact of the intervention on undesirable exits
– Effect of the intervention is statistically significant (p < .05): 27% of the control group sample experienced an undesired exit vs. 17% in the intervention group


Type of Exit Post Intervention
Group         Type of Exit     Frequency   Percent
Control       No Exit
              Undesired Exit
              Promoted         2           0.7
              Transferred
              Other            1           0.4
              Total
Intervention  No Exit
              Undesired Exit
              Promoted         6           2.7
              Transferred
              Other            3           1.3
              Total

Summary
– A rigorous research methodology can be used to test the effectiveness of a training intervention.
– Undesired exits by child welfare workers can be reduced significantly through increased skills and behaviors of supervisors and managers.

Limitations
Absence of a statewide employee database limits the quality of the data:
– Some concern that the project database was used inconsistently
– Cannot track workers across counties to determine whether a worker left the profession or the agency

What Worked Well
– Recruiting counties through site visits
– Random assignment
– Providing counties with data on turnover
– Longitudinal design
– Control group
– Lots of personal contact with counties
– HR database data proved key
– Web surveys were very efficient

Think Twice…
– Number of counties in study
– Number of times surveyed
– Web reports

Unexpected Challenges
– Data management! A beast…
– Some counties were inconsistent in their use of the HR database
– Inconsistent response to surveys left gaps in the data
– Collecting baseline data before the intervention was finalized

Acknowledgement
This study was supported by the U.S. Children's Bureau (Grant No. 90CT0114) as part of the project Child Welfare Staff Recruitment and Retention: An Evidence-Based Training Model.
THANKS