Cecil J. Picard Center for Child Development University of Louisiana at Lafayette Sessions 22A & 22B Holly Howat Oliver Winston Greg Crandall.




1 Cecil J. Picard Center for Child Development University of Louisiana at Lafayette Sessions 22A & 22B Holly Howat Oliver Winston Greg Crandall

2 PBS in Louisiana: 2006-2007 Evaluation Findings Understanding the power of data-based decisions

3 Cecil J. Picard Center for Child Development The Cecil J. Picard Center for Child Development was established in 2005 at the University of Louisiana at Lafayette. Our mission is to improve Louisiana by focusing on its children. The Center is dedicated to providing high-quality, rigorous evaluation of programs that address learning from birth to adulthood. The Center is proud to partner with many state agencies, including the Department of Education. Our Center's work with the DOE includes evaluating the implementation of Positive Behavior Support.

4 Evaluation Focus School-wide Evaluation Tool Correlation Analysis Behavioral Characteristics Academic Characteristics Risk and Protective Factors Characteristics Qualitative Results for District-Wide Implementation

5 Positive Behavioral Support Schools Trained 2006-2007 School Year

6

7 School-Wide Evaluation Tool The more experience a sampled school has with universal-level PBS, the better it implements it. Most sampled schools showed strengths in monitoring and district support and had difficulty with teaching expectations.
[Chart: Comparison of 2006-07 SET Total Scores Across Cohorts by Years of Experience (1-4 years; percentage scale 0-100). Cohort 1 N=6 schools, Cohort 2 N=15 schools, Cohort 3 N=11 schools, Cohort 4 N=7 schools.]
[Chart: SET Total and Subcategory Mean Scores for All Sampled PBS Schools (percentage scale 0-100): Total Score, Expectations Defined, Expectations Taught, Reward System, Violation System, Monitoring, Management, District Support.]

8 Correlation Analysis This graph indicates a statistically significant correlation between School-wide Evaluation Tool (SET) scores and Benchmarks of Quality scores.
[Scatter plot: SET scores vs. Benchmarks of Quality scores (both 0-100), with a linear trend line fit to the SET-Benchmarks points.]

9 Behavioral Characteristics: Suspension Rates Sampled schools with over two years of PBS implementation had much lower increases in in-school suspension (ISS) rates. A similar pattern existed for out-of-school suspension (OSS) rates.
[Chart: Change in ISS Rates from 2003-04 to 2005-06 by cohort; plotted values include 1.05, 1.04, 6.73, and 6.86 percentage points.]
[Chart: Change in OSS Rates from 2003-04 to 2005-06 by cohort; plotted values include -2.45, 0.02, 0.45, and 2.67 percentage points.]

10 Academic Characteristics: Test Scores and Retention Rates A general pattern of decline in retention rates can be observed in this sample. In the data collected for 2006-2007, there was no discernible correlation between PBS implementation and academic outcomes on test scores.

11 Risk and Protective Factors Protective factors increased in Grades 6 and 8, particularly rewards for pro-social behavior. Risk factors decreased in Grades 6 and 8, particularly low commitment to school.
[Chart: PBS Sample School Results on CCYS Protective Factor, Rewards for Pro-social Behaviors, Grades 6/8/10, 2004 vs. 2006 (N=34 schools each year).]
[Chart: PBS Sample School Results on CCYS Risk Factor, Low Commitment to School, Grades 6/8/10, 2004 vs. 2006.]

12 Qualitative Results for District-Wide Implementation
Component One: District-wide PBS implementation must have continued technical assistance from LDE, LSU, and UL Lafayette in the development, implementation, and evaluation of a district-wide plan.
Component Two: District-wide PBS implementation must include the organization of personnel, resources, and time, as well as set out goals and strategies for sustainability and expansion.
Component Three: District-wide PBS implementation must have superintendent buy-in at the district level as well as principal buy-in at the school level.

13 Qualitative Results for District-Wide Implementation
Component Four: District-wide PBS implementation must have training and technical assistance that is consistent and continual.
Component Five: District-wide PBS implementation must address a systematic method for collecting, analyzing, and using data to make decisions.
Component Six: District-wide PBS implementation must include an evaluation of the implementation across the district.

14 Data Driven Decision Making At the Picard Center for Child Development, we collect and analyze data to inform policy makers so they can make informed decisions. Schools and districts can also collect and analyze data so they can make informed decisions of their own.

15 Data Driven Decision Making PURPOSE: To review critical features & essential practices of data collection and the analysis of data for interventions

16 School-wide Positive Behavior Support Systems
- School-wide Systems
- Non-classroom Setting Systems
- Classroom Setting Systems
- Individual Student Systems

17 Data Collection Examples
An elementary school principal found that over 45% of the school's behavioral incident reports were coming from the playground.
A high school assistant principal reported that over two-thirds of behavior incident reports came from the cafeteria.

18 Data Collection Examples
A middle school secretary reported that she was getting at least one neighborhood complaint daily about student behavior during arrival and dismissal times.
Over 50% of one school's referrals occurred on buses during daily transitions.

19 Data Collection Examples
At least twice per month, police were called to settle arguments between parents and their children in parking lots.
A high school nurse lamented that "too many students were asking to use her restroom" during class transitions.

20 Data Collection Questions
What system does the parish utilize for data collection?
How is the data system being used in each school setting?
How frequently are data collection system reports generated (bi-weekly, monthly, grading-period, and/or semester reports)?

21 Minimal School-Level Data Collection Needs
- Minor referrals
- Major referrals
- Referrals by staff member
- Referrals by infraction
- Referrals by location
- Referrals by time
- Referrals by student
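The minimal fields above can be captured as simple records and tallied. A minimal sketch in Python; the field names and referral data are illustrative assumptions, not drawn from any specific data system:

```python
# Hypothetical referral records using the minimal school-level fields
# listed above (student, staff, infraction, location, time, minor/major).
from collections import Counter

referrals = [
    {"student": "A", "staff": "Smith", "infraction": "disrespect",
     "location": "playground", "time": "12:10", "level": "minor"},
    {"student": "B", "staff": "Jones", "infraction": "fighting",
     "location": "playground", "time": "12:15", "level": "major"},
    {"student": "A", "staff": "Smith", "infraction": "disruption",
     "location": "classroom", "time": "09:40", "level": "minor"},
]

# Tally referrals by location; the same pattern works for any field.
by_location = Counter(r["location"] for r in referrals)
print(by_location.most_common())  # → [('playground', 2), ('classroom', 1)]
```

Tallies like this are how the earlier examples surface, e.g., a playground or cafeteria generating a disproportionate share of incident reports.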

22 Minimal District-Level Data Collection Needs
- Major referrals (ODRs)
- Referrals by incident
- Referrals by infraction
- Times of incidents
- Locations of incidents (which school and where in the school)

23 Data Analysis Questions
How are the data displayed (graphs, tables, etc.), and is the display effective?
What are the outcomes of data review?
Are data-based decisions reached?
How are data-based decisions monitored for effectiveness?

24 Minimal School-Level Data Analysis Needs
- The PBS team should be part of the analysis process.
- Data should be reviewed to determine patterns of problem behaviors.
- Decisions should be based on the data presented.
- Decisions should include an intervention that can be successfully implemented and monitored.
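Reviewing data for patterns of problem behavior can be sketched as a simple share calculation. A hypothetical example; the 40% threshold and the counts are illustrative assumptions, not PBS standards:

```python
# Flag locations whose share of total referrals exceeds a review threshold,
# so a PBS team can focus its discussion on likely hotspots.
def flag_hotspots(counts, threshold=0.40):
    """Return {location: share} for locations above the threshold share."""
    total = sum(counts.values())
    return {loc: n / total for loc, n in counts.items() if n / total > threshold}

# Invented counts echoing the earlier example of a playground producing
# over 45% of a school's incident reports.
counts = {"playground": 45, "classroom": 30, "cafeteria": 25}
print(flag_hotspots(counts))  # → {'playground': 0.45}
```

The threshold is a team judgment call; the point is that the decision ("target the playground") follows from the data rather than from impressions.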

25 Using Data to Make Decisions
What interventions are needed to respond to problem behaviors?
How do we implement the intervention throughout the school?
What is the timetable for the intervention to show a decrease in undesirable behavior?
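The timetable question above implies comparing referral counts before and after the intervention over a stated window. A minimal sketch with invented monthly counts:

```python
# Compare average monthly referrals in a target setting before and after
# an intervention. The counts below are illustrative, not real data.
pre  = [22, 25, 21]   # monthly cafeteria referrals before the intervention
post = [14, 12, 10]   # monthly cafeteria referrals after the intervention

pre_avg, post_avg = sum(pre) / len(pre), sum(post) / len(post)
decreased = post_avg < pre_avg
print(f"avg before={pre_avg:.1f}, after={post_avg:.1f}, decreased={decreased}")
# → avg before=22.7, after=12.0, decreased=True
```

If the average does not drop within the agreed timetable, that is the team's signal to revisit the intervention rather than let it drift.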

26 Contact Information
Dr. Holly Howat, 337-482-1552, holly.howat@louisiana.edu
Mr. Oliver Winston, 337-365-2343, olwinston@iberia.k12.la.us
http://ccd-web.louisiana.edu/




