
1 Evaluation of Michigan Child Care Expulsion Prevention Program (CCEP), 2007-2010 Michigan State University October 27, 2010

2 Rosalind H. Kirk (a), John S. Carlson (a), Laurie A. Van Egeren (a), Holly Brophy-Herb (a), Stacy L. Bender (a), Betty Tableman (a), Mary A. Mackrain (b), Deb Marciniak (c), Sheri Falvay (d)
(a) Michigan State University; (b) Michigan Child Care Enhancement Program; (c) Michigan Public Health Institute; (d) Michigan Department of Community Health

3 Agenda
- CCEP's research questions (child, provider, program, family, CCEP process & fidelity)
- Evaluation approach
- Evaluation strategies
- Strategies: strengths and challenges
- Use of CCEP evaluation results

4 Child Care Expulsion Prevention Program (CCEP), Michigan
- Began in the late 1990s
- Initiated by MDCH, supported with funding from MDHS
- Plans for statewide coverage
- At the time of the evaluation, 16 programs covering 31 of Michigan's 83 counties
- Approximately 500-600 children per year
- Programmatic consultation also provided
- After T1 data collection ended in 2009, the focus of CCEP changed to ages 0-3
- Along with many other Michigan programs, funding ended on September 30, 2010

5 Research questions: Child outcomes (John)
1. Does the severity of children's challenging behavior decrease from the onset of CCEP services to the conclusion of services?
2. Does children's social and emotional health increase from the onset of CCEP services to the conclusion of services?
3. Does the impact of services on children's behavior last beyond the end of services?
4. Do children receiving CCEP services successfully stay in child care rather than being expelled?

6 Research questions: Parent outcomes (Holly)
5. Do subjective feelings of parental competence in dealing with their child's challenging behavior increase as a result of CCEP services?
6. Are families able to consistently attend work or school?

7 Research questions: Child care provider outcomes (Laurie)
7. Is the child care provider better able to recognize early warning signs of social and emotional challenges in infants, toddlers, and preschoolers?
8. Is the child care provider better able to manage challenging behavior in the child care setting, with all children?

8 Research questions: Child care program outcome (Ros)
9. Has the social and emotional quality of the child care setting receiving CCEP services improved?

9 Research questions: Program fidelity (Laurie)
10. What is the fidelity of the child and family consultation process among CCEP programs?
11. What is the fidelity of the programmatic consultation process among CCEP programs?

10 Evaluation approach
- Collaborative and consultative
- Built upon existing systems
- Mixed methods: mainly quantitative, some qualitative

11 Four overall strategies
1. Cross-sectional (formative): consultant survey
2. Longitudinal study (mainly summative): pre-post data plus a 6-month follow-up from the intervention group, using measures of child, parent, and provider outcomes
3. Quasi-experimental comparison study (summative): comparison group with pre-post data matching the longitudinal intervention group
4. Case studies (formative): perceptions of experiences with CCEP based on interviews

12 1. Cross-sectional strategy: strengths
Online survey of consultants on participation in CCEP and delivery of services, including compliance with the six CCEP cornerstones.
- 'Snapshot' of the program and its processes based on the perceptions of consultants and administrators
- Electronic surveys are accessible, flexible, user-friendly, and can be quick to analyze
- Very collaborative with CCEP in design, data collection, and interpretation
- Provided a wealth of information for program improvement
- Collaboration provided opportunities to share expertise and help develop CCEP's internal monitoring systems

13 Cross-sectional strategy: potential challenges
- Potential factors affecting response rate: organizational change, personal views about evaluation, stress levels, vacations, sickness, staff turnover, workload, length of survey, etc.
- Anonymity can mean that survey data are more likely to be accurate, but non-respondents cannot be targeted to increase the response rate.

14 Cross-sectional strategy: survey of consultants, 2008 (N=29)
Gender: Female 100%
Age: Mean 43 yrs; range 27-60
Race/ethnicity: White 76%; African American 21%; Asian 3%
Endorsement (MI AIMH): Level 2 24%; Level 3 72%
Experience: Child mental health 10 yrs; CCEP 4 yrs
Status with CCEP: Full-time 59%; Part-time 41% (part-time mean 20 hrs)
Educational level: Master's 83%; Bachelor's 17%
Degree/major: Social work 59%; Psychology 17%; Other 24%
State licensure: Yes 83%; No 17%

15 Cross-sectional strategy: survey summaries/research briefs
1. Informing Providers About CCEP Services
2. Child and Family Consultation Processes
3. Programmatic Consultation Processes
4. Reflective Supervision
5. Group Training and Individual Coaching of Providers and Parents
6. Consultants: Experience, Job Satisfaction, and Organizational Support
7. The Most Important Things Consultants Do
8. Collaboration with Michigan Child Care Coordinating Council, MSU Extension, and the Great Start Collaborative
9. State-Level Training and Technical Assistance
Available at http://outreach.msu.edu/cerc/research/ccep.aspx

16 Cross-sectional strategy: other survey results
- Poster: "Preventing Children's Expulsion from Childcare: Variations in Consultation Processes in a Statewide Program," presented with survey summaries/research briefs at the SRCD conference (2009)
- View at: http://outreach.msu.edu/cerc/

17 2. Longitudinal strategy: strengths
- Able to assess child, parent, provider, and program outcomes pre (T1) and post (T2), and whether changes were sustained over 6 months (T3) (an illustrative analysis sketch follows this slide)
- Collaborative at state and local levels, e.g. consultation on the selection, organization, and use of measures; attendance at monthly meetings; electronic Q&A; personal contacts between consultants and the MSU team, especially with new staff; collaborative troubleshooting at the state level
- Built on existing systems, so incorporated measures already used by consultants, e.g. the DECA
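The pre (T1) / post (T2) / follow-up (T3) design described above implies paired comparisons of each outcome measure across time points. The snippet below is only a minimal sketch of that kind of paired pre-post test, written in Python with invented DECA-style scores; it is not the evaluation's actual analysis code, and all variable names are hypothetical.

```python
import pandas as pd
from scipy import stats

# Hypothetical illustration: DECA-style protective-factor scores for a few
# children at the start (T1) and end (T2) of consultation services.
scores = pd.DataFrame({
    "child_id": [1, 2, 3, 4, 5, 6],
    "deca_t1": [38, 42, 35, 40, 31, 44],  # score at onset of services
    "deca_t2": [45, 47, 39, 48, 36, 46],  # score at conclusion of services
})

# Paired t-test: did scores change from T1 to T2 within the same children?
t_stat, p_value = stats.ttest_rel(scores["deca_t2"], scores["deca_t1"])
mean_change = (scores["deca_t2"] - scores["deca_t1"]).mean()
print(f"mean change = {mean_change:.1f}, t = {t_stat:.2f}, p = {p_value:.3f}")
```

The same pattern would extend to T2 vs. T3 scores to check whether gains were sustained at the 6-month follow-up.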

18 Longitudinal study sample size
Child & family cases at T1: 432
Child & family cases at T2: 394
Child & family cases at follow-up (T3): 177
Programmatic cases (all): 55
Sample sizes included in analyses varied depending on the quality of the data collected.

19 Children & families intervention sample (N=361)
Child's age (months): Mean 43.2 (SD 13.2); 0-35 months 25%; 36-60+ months 75%
Gender: Male 75%
Race/ethnicity: African American 15%; White 77%; Other 8%; Hispanic 8%
Household income: Low 34%
Family structure: Two-parent 60%
Provider type: Center 86%; Family home 5%; Group home 7%; Relative 1%; In-home 1%
Previous expulsions: 10%

20 3. Quasi-experimental strategy
- Involved collecting matching data from a sample of children exhibiting challenging behaviors but living in counties where CCEP was unavailable; a matched sample (N=86) was created (an illustrative matching sketch follows this slide)
- Enables comparison with the CCEP intervention group beyond maturation changes
- Ongoing challenges (resources: time, staff, organization, incentives) in recruiting and retaining a comparison group from counties without CCEP
- Limitations: missing data; multiple raters; reliance on self-report measures and interviews; how representative was the intervention group that participated in the evaluation? Were comparison families enough like the CCEP group even with matching? What other services, if any, were comparison families receiving in their own counties? Did counties with CCEP differ from counties without?
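The slide does not say how the matched comparison sample (N=86) was constructed. Purely as an illustration of one common approach, the sketch below performs nearest-neighbor matching on child age and a baseline behavior score; the variable names and scaling are assumptions, not the CCEP evaluation's actual procedure.

```python
import numpy as np
import pandas as pd

# Hypothetical CCEP children and a pool of comparison-county children.
ccep = pd.DataFrame({"id": [1, 2, 3],
                     "age_months": [40, 52, 36],
                     "baseline_problems": [62, 55, 70]})
pool = pd.DataFrame({"id": [101, 102, 103, 104],
                     "age_months": [38, 50, 61, 35],
                     "baseline_problems": [60, 58, 50, 72]})

matches = []
available = pool.copy()
for _, child in ccep.iterrows():
    # Distance on roughly rescaled matching variables (scaling is assumed).
    dist = np.sqrt(((available["age_months"] - child["age_months"]) / 12) ** 2
                   + ((available["baseline_problems"] - child["baseline_problems"]) / 10) ** 2)
    best = dist.idxmin()
    matches.append((child["id"], available.loc[best, "id"]))
    available = available.drop(best)  # match without replacement

print(matches)  # (CCEP child, matched comparison child) pairs
```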

21 Outcomes and measures
Child:
1. Devereux Early Childhood Assessment (DECA; LeBuffe & Naglieri, 1999)
2. DECA Infant-Toddler Version (DECA-IT; Mackrain, LeBuffe, & Powell, 2007)
3. Problem Coding Grid developed by Michigan CMH
4. Subscales from the Behavior Assessment System for Children, Second Edition (BASC-2; Reynolds & Kamphaus, 2004)
5. Retention, placement, and expulsion
Parent:
1. Parenting Stress Index/Short Form (PSI/SF; Abidin, 1990)
2. Skills and Knowledge subscale of the Psychological Empowerment Scale (PES; Akey, 1996)
3. Work productivity
Provider:
1. Early Warning Signs (developed by the MSU team)
2. Goal Achievement Scale (GAS; Alkon, Ramler, & MacLennon, 2003)
3. Teacher Opinion Survey (TOS; Geller & Lynch, 1999)
Consultation process, effectiveness, and acceptability (adaptations and/or subscales of various instruments), including:
1. Parent-Teacher Relationship Scale (PTRS; Vickers & Minke, 1995)
2. Consultation Evaluation Form (CEF; Erchul, 1987)
3. Behavioral Intervention Rating Scale (BIRS; Von Brock & Elliott, 1987)
4. Benefits of Consultation (Sheridan, 1998, 2000a, 2000b), including other subscales from the BIRS
5. Competence of Other (Sheridan, 1998, 2000a, 2000b)

22 Does consultation make a difference to parents?
- Final results (on child, parent, provider, and program outcomes and on perceptions of effectiveness and relationships) are still awaited. With qualifications, trends prior to the final analyses have indicated that:
- Parental competence increased more, and stress decreased more, among parents who used consultation services.
- There were high levels of satisfaction with the consultation process, its perceived effectiveness, and its acceptability among both parents and providers.

23 Interim results (N=129): Change in child outcomes after early childhood mental health consultation (see link to poster)
Before taking dosage of CCEP into account, raw parent and provider data showed:
- Both CCEP and comparison children showed significant improvements in behavior problems and positive behaviors over the study period.
- For parent report in the CCEP group, attention problems and functional communication continued to improve 6 months after consultation; most other outcomes remained level.
Are higher doses of consultation linked to greater improvement in children's challenging and positive behaviors compared to lower doses? (An illustrative regression sketch follows this slide.)
- After taking satisfaction with CCEP into account, more hours of consultation with providers (but not with parents) predicted increases in provider reports of some positive behaviors.
- At 6-month follow-up, more hours of provider consultation were linked to continued improvements in parent-reported attention problems.
- Gains made in behavioral concerns and functional communication were not sustained.
Do children with challenging behavior who receive consultation show more behavior improvement than children with challenging behavior who do not?
- While children in the intervention (N=129) and comparison (N=59) groups both improved over time, probably due to maturation, the CCEP group showed greater improvements in behavior than the comparison group in almost all areas.
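The dosage question above (do more consultation hours predict greater improvement once satisfaction is taken into account?) maps naturally onto a regression of change scores on consultation hours with satisfaction as a covariate. The sketch below, in Python with statsmodels, uses invented data and hypothetical variable names; it illustrates the general form of such a model, not the evaluation's actual analysis.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: change in a provider-reported positive-behavior score,
# hours of provider consultation, and satisfaction with CCEP (1-5 scale).
df = pd.DataFrame({
    "behavior_change": [4, 7, 2, 9, 5, 3, 8, 6],
    "provider_hours":  [5, 12, 3, 15, 8, 4, 14, 10],
    "satisfaction":    [4, 5, 3, 5, 4, 3, 4, 5],
})

# Does consultation dosage predict improvement after controlling for satisfaction?
model = smf.ols("behavior_change ~ provider_hours + satisfaction", data=df).fit()
print(model.params)   # estimated coefficients
print(model.pvalues)  # p-values for each predictor
```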

24 4. Case studies
- Sample: 9 children across 2 programs and 3 consultants
- Method: in-person or phone interviews with the parent, provider(s), and consultant
- Analyses: interviews were coded and content was thematically organized around process and outcomes

25 Case study sample (#I = number of interviewees)
Dylan (M, 60 mo): listless, withdrawn; household: mother, stepfather; #I: 5; outcome: adjusted, in kindergarten
Sophia (F, 40 mo): defiant, aggressive; household: mother, boyfriend, sibling; #I: 2; outcome: mom lost job, withdrawn from child care
Jason (M, 71 mo): head-banging, tantrums; household: single mother; #I: 3; outcome: reduced intensity
Ryan (M, 51 mo): tantrums, screaming; household: 2 biological parents, twins; #I: 3; outcome: reduced intensity, moved on to school
Kayla (F, 41 mo): defiant, hyperactive; household: 2 adoptive parents, sibling; #I: 3; outcome: parent & provider behavior adapted
Nathan (M, 49 mo): developmental delay, aggressive; household: 2 biological parents, sibling; #I: 3; outcome: parent & provider behavior adapted
Madison (F, 60 mo): tantrums, disruptive; household: 2 biological parents; #I: 1; outcome: provider adapted, kindergarten
Hannah (F, 42 mo): aggression; household: single mother; #I: 3; outcome: incomplete consultation, moved out of state
Daniel (M, 48 mo): aggression, sexualized behavior; household: single mother; #I: 4; outcome: expelled

26 Case studies: strengths
- Combine quantitative and qualitative methods
- Illustrate the variation and unique relevance of consultation for individual children
- Add depth to the understanding of the processes that underpin consultation
- Highlight the importance of context and relationships for intervention

27 Case studies: challenges
- Balancing the importance of the case studies with a primarily outcome-focused evaluation
- Self-selection bias in the sample
- Combining case studies meaningfully with quantitative data: using quotes in the body of the report (outcomes), a thematic table about process, and 'stories' about children with standardized scores compared to the mean

28 Program's use of preliminary evaluation results
- Accountability: Was the money being spent as agreed? Was it being spent wisely?
- Planning (program and community): Where should limited resources be focused? Was more needed? Helping others understand the consultants' role and perspective and the contribution they can make to community planning. Grant preparations.
- Quality improvement: How could CCEP build on its strengths? What could CCEP have done better? Ready access to evaluator expertise offered more support, e.g. for internal monitoring systems.
- Advocacy & dissemination: Telling politicians, potential funders, and academics about CCEP successes and challenges; contributing to the ECMH knowledge base.

29 Closing comments from Daniel’s mom “I think it’s (CCEP) an awesome program, I really do. There are a lot of daycares out there that if they come across just the littlest behavior, and the child becomes difficult to take care of, they just give up and say ‘okay, well we can’t have him in the daycare’. So someone like Julie (consultant) that could come out and talk to the caregivers and explain different ways of doing things, I mean, I think that’s awesome because then you know, the kid can stay in the daycare and the mother can continue working. I mean, I think it’s a really good program.”

30 Further information
Principal Investigators:
- John Carlson, PhD, NCSP; Associate Professor, College of Education; carlsoj@msu.edu
- Holly E. Brophy-Herb, PhD; Associate Professor, Human Development & Family Studies; hbrophy@msu.edu
- Laurie A. Van Egeren, PhD; Director, Community Evaluation and Research Center (CERC), University Outreach and Engagement; vanegere@msu.edu

31 Useful links
- MSU CCEP evaluation results referred to here (briefs and posters): http://outreach.msu.edu/cerc/research/ccep.aspx
- Technical Assistance Center for Social Emotional Intervention: http://www.challengingbehavior.org/
- University of Wisconsin-Extension: http://www.uwex.edu/ces/pdande/evaluation/index.html
- NSF Online Evaluation Resource Library: http://www.oerl.sri.com
- Trochim, W. M., The Research Methods Knowledge Base, 2nd edition: http://www.socialresearchmethods.net/kb/

