9 State Leadership Team
[Organizational chart: the State Leadership Team (Training & Coaching, Visibility & Political Support, Assessment & Evaluation) connects through the State PBS Consultant and Regional Coordinators to the LEA Coordinator, External Coach, and Trainer, and on to the school team: Team Leader (In-School Coach), School Administrator, Recorder, Data Manager, Time-keeper, Communication Coordinator, School Staff, Parents, Students, and Community.]
10 State Leadership Team: Vision and Mission
Vision: All schools in North Carolina will implement Positive Behavior Support as an effective and proactive process for improving social competence and academic achievement.
Mission: To provide leadership, professional development, resources, and ongoing support in order for schools to successfully implement Positive Behavior Support.
11 Visibility & Political Support: Goals
- Maintain an up-to-date PBS website.
- Coordinate with and inform all departments/divisions at DPI regarding PBS updates.
- Increase awareness of North Carolina's mission and vision for PBS.
- Facilitate networking among all PBS stakeholders.
12 Training & Coaching: Goals
- Maintain a current registry of trainers/coaches.
- Maintain a current registry of participating LEAs, contact people/coordinators, and schools.
- Provide training, support, and networking opportunities for trainers, coaches, and coordinators.
- Ensure fidelity of training and implementation.
- Include IHEs.
- Include PBS in standards for education leadership candidates and preservice/graduate personnel.
- Support the creation of durable systems.
13 Assessment & Evaluation: Goals
- Determine specific data to be collected statewide.
- Create a plan for obtaining a thorough evaluation of the PBS program in North Carolina.
14 State PBS Consultant
The position is now filled by Heather Reynolds Solone, as a result of legislative action. The PBS consultant is part of the Behavior Support & Special Programs Section of the EC Division, led by Chief Diann Irwin.
15 Regional Coordinators
Regional responsibilities include the following:
- Spend 2/3 of the time working with PBS implementation in the region and state (14 or 15 work days per month).
- Attend PBS coordinator meetings and training.
- Host state and regional meetings for implementing schools.
- Coordinate regional PBS training.
- Provide PBS awareness presentations in the region.
16 Regional Coordinators
Regional responsibilities also include the following:
- Work with PBS LEA trainers to complete School Evaluation Tools.
- Help plan PBS Summer Institute and conference presentations.
- Visit implementing schools in other LEAs, as possible.
- Provide PBS technical assistance and support in the region.
- Coordinate data collection for the region.
- Assist local PBS trainers with using and understanding data.
- Stay informed about national PBS research.
- Coordinate with the Behavior Support Consultant from the region.
17 Regional Coordinators
Expected LEA responsibilities for the position include:
- Coach participating schools in the LEA.
- Help train new schools in the LEA.
- Direct data management and program evaluation.
- Chair the school system PBS Leadership Team.
- Work with PBS trainers and chairpersons in the LEA.
- Serve as the link between schools, the leadership team, and the leadership of the school system.
- See that the School Evaluation Tool is completed for each implementing school in the LEA.
- Manage the school system action plan.
18 LEA Coordinator
- Coordinate with the PBS Regional Coordinator for the region.
- Attend PBS coordinator meetings and training.
- Coordinate LEA PBS training.
- Provide PBS awareness presentations in the LEA.
- Provide PBS technical assistance and support in the LEA.
- Host LEA meetings for implementing schools.
- Visit implementing schools.
- Work with PBS coaches, trainers, and chairpersons in the LEA.
- Assist local PBS teams with using and understanding data.
- Stay informed about national PBS research.
19 LEA Coordinator (continued)
- Direct data management and program evaluation.
- Chair the school system PBS Leadership Team.
- See that the School Evaluation Tool is completed for each implementing school in the LEA.
- Coordinate data collection for the LEA and send it to the Regional Coordinator.
- Serve as the link between schools, the leadership team, and the leadership of the school system.
- Manage the school system action plan.
20 External Coach
- Coordinate with the PBS LEA Coordinator.
- Attend PBS coach meetings and training.
- Attend LEA PBS Leadership Team meetings.
- Coordinate LEA PBS training.
- Provide PBS technical assistance and support in the LEA.
- Facilitate LEA meetings for implementing schools.
- Attend implementing school team meetings.
- Work with PBS trainers and school teams in the LEA.
- Assist local PBS teams with using and understanding data.
- Stay informed about national PBS research.
- Complete the School Evaluation Tool for each implementing school in the LEA.
- Coordinate data collection for school teams and send it to the LEA Coordinator.
- Serve as the link between schools and the LEA Coordinator.
- Assist schools with action planning.
21 Trainer
- Work with the PBS Regional Coordinator and LEA Coordinator to plan trainings.
- Complete the train-the-trainer process:
  - Participate in all 3 modules as a team member.
  - Co-train all 3 modules with an experienced trainer.
  - Achieve competence and train independently.
- Attend trainer refreshers and updates.
- Provide support and technical assistance for school teams.
- Complete annual self-assessment and competency requirements.
22 Team Leader (In-School Coach)
- Coordinate with the LEA/External Coach and LEA Coordinator.
- Attend PBS coach meetings and training.
- Facilitate team meetings for your school.
- Assist teams with using and understanding data.
- Stay informed about national PBS research.
- Coordinate completion of the School Evaluation Tool.
- Coordinate data collection and send it to the LEA Coach.
23 State Leadership Team
[Organizational chart repeated, highlighting the school-level roles: School Administrator, Recorder, Data Manager, Time-keeper, Communication Coordinator, Students, Parents, School Staff, and Community.]
25 NC PBS Data Collection
Presented by Laura Phipps
26 Objectives
Understanding of:
- The NC PBS Data Collection Manual.
- The NC PBS Recognition Program.
- Strategies for using data for effective action planning.
27 NC Data Collection Manual
Section I: History and Purpose
28 Manual History
Historically, we have had inconsistent data collection due to:
- Different-size LEAs.
- Different data collection sources.
- Various levels of data system knowledge.
This led to challenges for schools in assessing data and the effectiveness of PBS implementation, and the state was challenged to draw statewide conclusions about PBS outcomes.
29 Manual Purpose
- Not intended to add work, but to organize the work you are already doing.
- Toolbox: describes different types of data you may want to collect and the rationale for how each will help you.
- Road map: provides guidelines for completing a thorough assessment of PBS implementation.
- Statewide goal: increase consistency of data collection across the state and guide support from DPI and PBS coordinators.
30 Using Data at the School Level
[Cycle diagram: Collect data to determine need → Create action plan and steps → Implement → Assess plan → Make needed revisions]
The steps:
- Get input at all levels.
- Research best practices.
- Create an action plan.
- Implement the plan.
- Assess the plan.
- Revise the plan.
- Repeat as needed.
31 Using Data at the State Level
[Cycle diagram: Collect data from all regions → Analyze and summarize patterns → Create action steps/plan → Implement plan → Determine needed supports]
The steps mirror the school-level cycle: get input at all levels, research best practices, create an action plan, implement it, assess it, revise it, and repeat as needed.
32 Manual Overview
The manual has 5 sections:
1. Purpose of the data collection manual
2. Implementation data
3. System-level outcome data
4. Small group/individual outcome data
5. System-wide implementation data
Each section covers the what and why, and the how-to.
33 Recognition Program
- The goal is to motivate schools to collect data SO THAT we can increase the sustainability of implementation.
- The manual provides specific data requirements for meeting state implementation standards.
34 Recognition Program Components
- Systems: training, team, data
- Data: implementation data, outcome data
- Practices: implementation level, SET score
35 Recognition Program: Documents
- Recognition Program requirements
- Application for state recognition
- Data requirements on the website
- Manual
36 NC Data Collection Manual
Section II: Implementation Data
37 Implementation Data Rationale
Purpose: to ensure that the implementation of PBS at any given school is being done with reliability and accuracy.
How will collecting these data impact:
- School administrators: provides clear information about the fidelity of PBS implementation and guides decision making about the use of time and resources.
- PBS teams: provides specific information about areas for improvement in order to create meaningful action plans.
- Teachers: helps the PBS team move as quickly and efficiently as possible toward a sustainable model, improving school climate and overall student outcomes.
- Students, parents, and communities: highly accurate implementation will quickly transition schools toward a more positive climate.
38 Implementation Data vs. Outcome Data
Implementation data:
- Designed to measure fidelity of implementation.
- The goal is to develop action steps.
Outcome data:
- Measures progress on specific school-wide goals.
- Allows schools to determine the impact of PBS implementation.
- Documents the effectiveness of PBS on overall school climate.
- Used by LEAs to make system-wide decisions.
40 Implementation Data Tools
- Implementation Checklist/Inventory
- School Survey (EBS Survey/Self-Assessment)
- Trainer Report
- SET
- Future Training List
See Data Manual pages.
41 NC Data Collection Manual
Section III: System Level Outcome Data
42 System Level Outcome Data Rationale
Purpose: to determine how prevention and intervention strategies are impacting the school environment.
How will collecting these data impact:
- School administrators: by evaluating system-level outcome data, you can make sure your school resources are being used most efficiently.
- PBS teams: to know what kinds of prevention and intervention strategies are needed based on your specific school population.
- Teachers: helps the PBS team make accurate decisions about practices to use in the school and classroom based on your specific student population.
- Students, parents, and communities: data will help choose or modify strategies to ensure the best academic and behavioral outcomes.
44 System Level Outcome Data Tools
- Achievement data
- Suspension/expulsion data
- Referral data (how to collect using SWIS vs. NC Wise or another system)
- Climate surveys
- Special education referrals/eligibility data
- Staff retention data
- Attendance data
See Data Manual pages.
45 NC Data Collection Manual
Section IV: Small Group and Individual Level Outcome Data
46 Small Group and Individual Level Outcome Data Rationale
- Allows better identification of which students are in need of the most support.
- We need to be able to better assess how interventions are working at the individual and small-group level prior to problem behavior.
- PBS traditionally collects data only after problem behavior has occurred, which prevents us from knowing what interventions will work at the classroom level.
47 Small Group and Individual Level Outcome Data
How will collecting these data impact:
- School administrators: document the educational and behavioral progress of at-risk students, and identify which interventions are most effective in working with at-risk students.
- PBS teams: determine the effectiveness of function-based behavioral supports and address problem areas through a team-based approach.
- Teachers: provides a clear way to focus time and energy on interventions shown to be effective, and a clear way to communicate progress to other staff and parents.
- Students, parents, and communities: improves the quality of interventions for children, and gives teachers and parents a common way to communicate about progress.
48 PBS and RTI
RTI and PBS need to work together:
- PBS is good at universal implementation: whole-school/class-wide assessment, intervention, and implementation.
- RTI is good at small-group and targeted interventions: individual-child and small-group assessment, intervention, and implementation.
- RTI has historically been better at assessment and use of data.
- PBS has historically been better at implementing and getting schools to do something differently.
- We need to learn from each other and adopt both methods.
49 Small Group and Individual Level Outcome Data Tools
- Direct Behavior Ratings (DBR)
- Other options: permanent products, systematic direct observation
See Data Manual pages 33-35.
50 NC Data Collection Manual
Section V: System Wide Implementation
51 Documenting System Wide Implementation
Rationale:
- For large school systems (LEAs) implementing system-wide, it may be helpful to write a single report summarizing progress.
- This allows LEA administrators to assess the overall impact of PBS implementation in order to better provide support and resources.
- A system-wide report should not replace collecting and assessing data on an individual school basis, but it can be an additional tool in creating sustainability of PBS.
52 Documenting System Wide Implementation
Possible components of a system-wide report:
- Executive summary
- Overview of the number of schools implementing (elementary, middle, and high)
- Combined SET scores for each of the seven areas, by elementary, middle, and high
- Combined referral information by elementary, middle, and high:
  - Average per day and quarter
  - By location
  - By problem behavior
- Combined suspension/expulsion data
- Triangle data and analysis by elementary, middle, and high
- Achievement data (PBS schools compared to non-PBS schools)
- Staff impact data (e.g., retention, morale, etc.)
54 Challenges and Solutions
Challenges:
- Time consuming
- Incompatible data collection systems
- Fear
- Data does not feel meaningful
- Beliefs that it is just restating the obvious
- Resistance to technology and numbers
55 Challenges and Solutions
Solutions:
- Reframe thinking about TIME: effective use of data will save time in the long run.
- The PBS team and administrators need to work together to streamline data collection methods.
- Build trust among staff members and administrators through frequent sharing of data. Data is information, NOT judgment.
- Make sure all data collection is CLEARLY connected to tangible action steps.
- Show staff how objective information (data) increases staff investment and makes implementation more meaningful.
- Demystify the word "data."
56 Using Data Effectively
Action Planning
57 Effective Action Plans
Effective action plans are:
- used regularly.
- frequently reviewed and updated.
- accessible to all staff.
- made up of specific, doable action steps with clear timelines.
- generated using data from staff and the team.
58 Using Self Assessment Survey Data to Generate Action Steps
- Once the survey is closed (on pbssurveys.org), you will be able to access reports using the same login number.
- There are three separate reports.
- You can see overall trends as well as specific numbers for each item.
- Once you identify items and areas needing improvement, the team should prioritize action steps.
- Some items can be addressed through information only; others will need revisions or completion of tasks.
59 Sample Action Item
One school's survey data showed that many staff indicated both high priority for improvement and not in place for the item that read "Data on problem behaviors are collected and summarized within an ongoing system." Since the school was using SWIS, the team decided the issue was more one of awareness. They created the following action item:
- Goal: Increase staff awareness of SWIS data
- Steps: 1. Present SWIS Big 5 graphs at the next staff meeting
- Who: Debbie Data
- Resources required: Copies of Big 5 graphs; discussion questions
- By when: March 24
- Evaluation measure: Staff will increase requests for Big 5 graphs for grade-level meetings
60 Using Implementation Inventory to Generate Action Steps
- The team completes the Implementation Inventory.
- After calculating percentages of implementation, focus on the areas scoring below 80%.
- Within each section, look at the items marked 1 or 0 and create a prioritized list.
- Cross-reference the team list with the results from the staff survey.
- Create action steps, starting with the highest priority.
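The percentage review in the steps above can be sketched in code. This is a minimal illustration with hypothetical scores, assuming each inventory item is scored 0, 1, or 2 (the section names and values are made up, not taken from the manual):

```python
def section_summary(scores, max_per_item=2, threshold=80):
    """Percent of possible points for one inventory section, plus the
    1-based numbers of items marked below the maximum (i.e., 1 or 0)."""
    pct = 100 * sum(scores) / (max_per_item * len(scores))
    low_items = [i + 1 for i, s in enumerate(scores) if s < max_per_item]
    # Only sections under the 80% threshold become action-plan priorities.
    return pct, (low_items if pct < threshold else [])

# Hypothetical section results: this section lands well under 80%,
# so its items marked 1 or 0 go onto the prioritized list.
pct, priorities = section_summary([2, 1, 0, 2, 1, 0])
print(f"Universal practices: {pct:.0f}% -- prioritize items {priorities}")
# → Universal practices: 50% -- prioritize items [2, 3, 5, 6]
```

The prioritized item list from each low-scoring section would then be cross-referenced against the staff survey, as described above.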
61 Sample Action Item
After completing the Implementation Inventory, the team found that they scored 61% in universal practices. Upon further review, they found that all the items marked 1 or 0 had to do with specific teaching of school-wide expectations. They also noticed from the staff survey a focus on the need to improve non-classroom-setting routines. The team developed the following action item:
- Goal: Increase consistent use of expected behaviors in the cafeteria
- Steps: 1. Create lesson plans for cafeteria expectations. 2. Create a schedule for all teachers to teach expectations in the cafeteria.
- Who: Lori Lessonplan and cafeteria TA
- Resources required: Time to meet with cafeteria TAs; sample lesson plans from other schools; lesson plan template from the PBS website
- By when: March 24th; first week after spring break
- Evaluation measure: All staff will complete a feedback form after completion of the lesson; reduction of referrals from the cafeteria during spring semester
62 Using SWIS Data to Generate Action Steps
- Regularly review the SWIS Big 5 graphs.
- For each graph, create a list of questions the data generates.
- Create custom graphs to answer those questions.
- Bring the data back to the team and/or staff for discussion of patterns.
- List possible action items.
- Prioritize the list and develop action steps for the highest priority.
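The custom-graph step above boils down to counting referrals by a field of interest. A minimal sketch, using hypothetical referral records in place of a real SWIS export (the field names are illustrative, not the SWIS schema):

```python
from collections import Counter

def tally(referrals, field):
    """Count office discipline referrals by one field,
    e.g. 'location' or 'behavior' -- the basis of a custom graph."""
    return Counter(r[field] for r in referrals)

# Hypothetical records standing in for an exported referral list.
referrals = [
    {"location": "classroom", "behavior": "non-compliance"},
    {"location": "classroom", "behavior": "disrespect"},
    {"location": "cafeteria", "behavior": "disruption"},
]

print(tally(referrals, "location").most_common())
# → [('classroom', 2), ('cafeteria', 1)]
```

The same tally, re-run on different fields, answers the team's follow-up questions (which location, then which behavior within that location).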
63 Sample Action Item
After reviewing the SWIS Big 5, one team discovered that the majority of problem behaviors were occurring in the classroom. They created a custom graph to determine what specific behaviors were occurring in the classroom and found that non-compliance/disrespect was the biggest issue. The team generated the following action item:
- Goal: Increase compliance with requests in classrooms
- Steps: 1. Have the counselor create a social skills lesson on specific skills (compliance, accepting "no," asking for help). 2. Have all teachers schedule the lesson with the counselor.
- Who: Carmine Counselor
- Resources required: Lesson plan template; social skills curriculum
- By when: First week after spring break
- Evaluation measure: Decrease in classroom referrals for non-compliance/disrespect
64 Activity: Action Planning
Using the SAMPLE school data provided (Implementation Inventory results, SWIS Big 5, and PBS Self-Assessment Survey), create at least one action item. Use the blank action plan provided.
65 Data Collection Next Steps
- Review the Data Collection Manual with your LEA administrators and school teams.
- Contact your PBS Regional Coordinators with any questions or concerns.
- Look for upcoming trainings on data collection: PBS Summer Institute, regional trainings.
66 School-Based Behavioral Assessment: Informing Intervention and Instruction
S. Chafouleas, T. Chris Riley-Tillman, G. Sugai
Direct Behavior Rating (DBR) presentation by C. Riley-Tillman
Presented by C. McCamish
67 Section IV: Small Group and Individual Level Outcome Data
68 Small Group and Individual Level Outcome Data:
- Assess effectiveness of interventions
- Document educational and behavioral progress
- Determine effectiveness of function-based behavioral supports
- Communication tool
69 Direct Behavior Rating (DBR) Background
- Effective behavioral assessment and intervention procedures in applied settings require the use of empirically supported yet feasible techniques.
- To date, feasible assessment of behavior skills has focused on ODR data, which may not be sensitive enough to capture all behaviors of interest.
- To date, support for feasible, formative assessment of academic skills is available (e.g., CBM), but attention has not been directed toward social behaviors.
70 Defining Characteristics of the DBR
The DBR involves a brief rating of target behavior over a specified period of time:
- a behavior (or behaviors) is specified
- rating of the behavior(s) typically occurs at least daily
- obtained information is shared across individuals (e.g., parents, teachers, students)
- the card is used to monitor the effects of an intervention and/or as a component of an intervention
(Chafouleas, Riley-Tillman, & McDougal, 2002)
71 Direct Behavior Ratings
- DBRs are a hybrid of assessment tools that combine characteristics of systematic direct observation (SDO) and behavior rating scales.
- SDO is a method of behavioral assessment that requires a trained observer to identify and operationally define a behavior of interest, use a system of observation in a specific time and place, and then score and summarize the data in a consistent manner (Salvia & Ysseldyke, 2004; Riley-Tillman, Kalaber, & Chafouleas, 2006).
- These tools are designed to be used in a formative (repeated) fashion to represent behavior that occurs over a specified period of time (e.g., 4 weeks) and under specific and similar conditions (e.g., 45 min. of morning seat work).
- Using these tools requires rating target behavior on a scale (e.g., rating the degree to which Johnny was actively engaged). So teachers might be asked to rate, on a scale from 1 (not at all) to 5 (almost always), the degree to which Johnny was actively engaged in work activities during independent seat work this morning.
72 Other Names for the DBR
- Home-School Note
- Behavior Report Card
- Daily Progress Report
- Good Behavior Note
- Check-In Check-Out Card
- Performance-based behavioral recording
74 Who already uses the DBR?
- 60% of teachers surveyed already use DBRs to change student behavior; 32% to monitor or observe student behavior.
- 81% use them to identify positive behaviors; 77% to identify negative behaviors.
- 86% use them with individual students, 19% with the whole class, 9% with small groups.
- 32% use DBRs "routinely" as part of a classroom management plan.
(Chafouleas, Riley-Tillman, & Sassu, 2006)
75 Many Potential Uses for the DBR
- Increase communication (teacher-student, home-school)
- As a component of an intervention package, particularly in self-management
- Provide a "quick" assessment of behaviors, especially those not easily captured by other means
- Monitor student behavior over time
- Flexible: K-12; behaviors to increase (+) or decrease (-); 1 student or a larger group; a range of behaviors
77 A systematic DBR possesses the following 4 characteristics:
1. The behavior of interest must be operationally defined.
2. The observations should be conducted under standardized procedures.
3. The DBR should be used in a specific time and place, and at a predetermined frequency.
4. The data must be scored and summarized in a consistent manner.
78 Guiding Questions
1. Why do you need the data?
2. Which tools are the best match to assess the behavior of interest?
3. What decisions will be made using the data?
4. What resources are available to collect the data?
79 Design Flexibility
- What is the target behavior and goal? Focus on a specific behavior (e.g., calling out) or a cluster term for behaviors (e.g., disruption); the goal may be to increase or decrease the behavior.
- Who is the focus of the rating? Individual, small group, or class-wide.
- What is the period for rating? Specific school period, daily, or other.
- What is the setting of observation? Classroom or other location.
- How often will data be collected? Multiple times a day, daily, weekly.
- What scale for rating will be used? Checklist, scale, continuous line.
- Who will conduct the rating? Classroom teacher, aide, or other educational professional.
- Will ratings be tied to consequences? Consequences must be consistently delivered by the person responsible.
80 Considerations When Using a DBR
- Ensure that use is "systematic":
  - Identify and operationally define a behavior of interest.
  - Use a system of observation in a specific time and place.
  - Score and summarize the data in a consistent manner.
  (Similar to the criteria that define systematic direct observation; Salvia & Ysseldyke, 2004)
- Provide checks on integrity and acceptability.
- Understand correspondence with other data sources.
81 How are Direct Behavior Rating data summarized?
- Data can be quantified, compared, combined, and summarized for summative and formative purposes. For example, DBR data on Susie's disruptive behavior over the week can be summarized into an average daily or weekly rating (6 out of 9 points), or the most likely period of high or low disruption if multiple ratings per day are taken (just before lunch).
- Since DBRs involve rating on some scale, data are summarized relative to that scale. For example, a simple yes/no checklist can easily be depicted in a bar chart, whereas rating information might be plotted on a line graph, with the intervals on the y-axis indicating the DBR scale.
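The weekly summary described above is simple averaging. A minimal sketch with hypothetical daily ratings on the example 9-point scale (the values are made up for illustration):

```python
# Hypothetical daily DBR ratings for one student on a 1-9 scale,
# one rating per school day (illustrative values, not real data).
daily_ratings = {"Mon": 6, "Tue": 7, "Wed": 5, "Thu": 6, "Fri": 6}

def weekly_average(ratings):
    """Collapse the daily ratings into one weekly summary number."""
    return sum(ratings.values()) / len(ratings)

print(f"Weekly average: {weekly_average(daily_ratings):.1f} out of 9")
# → Weekly average: 6.0 out of 9
```

With multiple ratings per day, the same idea applies per period, which is how a "just before lunch" pattern would surface.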
82 Summary of Strengths and Weaknesses of Use in Assessment
Strengths:
- Highly flexible
- Highly feasible, acceptable, and familiar
- Minimal cost given the potential amount of, and uses for, data
- Reduced risk of reactivity (atypical behavior)
- Can be used in assessment, intervention, and communication
Weaknesses:
- Rater influence (history)
- Limited response format
- Limited knowledge about psychometric adequacy
83 Case Example
Mr. Cohen is the sole school psychologist in Sunnyvale, a small, rural district. One of the teachers in the elementary school, Ms. Yoon, recently implemented a token economy in her classroom in an effort to increase pro-social behaviors among a small group of her students during cooperative learning activities. Although Ms. Yoon thinks the intervention has been successful (she told Mr. Cohen that "the classroom environment feels more positive"), she would like to know for sure and asks Mr. Cohen to help her collect data to support this belief. Mr. Cohen is pleased that Ms. Yoon has sought him out and certainly wants to help, but his schedule is barely manageable over the next few weeks given other commitments. Thus, Ms. Yoon and Mr. Cohen agree to have Ms. Yoon collect data using a DBR, with Mr. Cohen coming in periodically (i.e., once per week) to supplement the DBR data with systematic direct observations.
84 Points to Consider
- The DBR measures perception of behavior: "a change from 3 to 7," not "he is a 7."
- Academic skills have absolutes; social behavior has no absolutes.
- Rater effect: raters vary in their initial rating and may overestimate consistently, but they respond consistently to changes in behavior.
85 Resources
This website offers an extensive resource on using behavior ratings in the Classroom Behavior Report Card Manual.
- Chafouleas, S. M., Riley-Tillman, T. C., & Sugai, G. (in press). Behavior Assessment and Monitoring in Schools. New York: Guilford Press.
- Crone, D. A., Horner, R. H., & Hawken, L. S. (2004). Responding to Problem Behavior in Schools: The Behavior Education Program. New York: Guilford Press.
- Jenson, W. R., Rhode, G., & Reavis, H. K. (1994). The Tough Kid Tool Box. Longmont, CO: Sopris West.
- Kelley, M. L. (1990). School-Home Notes: Promoting Children's Classroom Success. New York: Guilford Press.
- Shapiro, E. S., & Cole, C. L. (1994). Behavior Change in the Classroom: Self-Management Interventions. New York: Guilford Press.
86 For More Information
Chafouleas, S., Riley-Tillman, T. C., & Sugai, G. (2007). School-Based Behavioral Assessment: Informing Intervention and Instruction.